I don't want any of the pages to be blocked from search engines. Do I need to place a blank robots.txt file?
Please advise.
Thanks
No. Only place a blank robots.txt if you don't want 404 errors showing up in your logs/stats software. Otherwise there is no need for one.
If you want each and every page of your site to be indexed by every spider, you can use a simple two-line rule in your robots.txt:
User-agent: *
Disallow:
Here the * indicates that all spiders are allowed to crawl, and the Disallow line is left empty, which indicates that every portion of the site is free for them to crawl.
... or you can just not have a robots.txt file to achieve the same effect.
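If you want to confirm that the two-line rule really allows everything, Python's standard-library urllib.robotparser can evaluate it directly. This is just a quick sanity-check sketch; the example.com URLs and bot names are placeholders:

```python
from urllib import robotparser

# Build a parser from the allow-all robots.txt shown above
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",
])

# With an empty Disallow, every URL is fetchable for every user agent
print(rp.can_fetch("Googlebot", "http://example.com/any/page.html"))
print(rp.can_fetch("Bingbot", "http://example.com/private/"))
```

Both calls return True, which is the same result you get with no robots.txt file at all.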
A robots.txt file is for blocking content from search engines. This is helpful if you do not want search engines to "waste" their time attempting to index things such as admin areas, scripts, or duplicate/printer-friendly versions of pages.
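For example, a robots.txt that keeps well-behaved spiders out of a couple of hypothetical directories (the paths here are only illustrative) might look like:

```
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
Disallow: /print/
```

Note that robots.txt is advisory: compliant crawlers honor it, but it is not an access-control mechanism and won't keep content private.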