Hello Members,

I am working on a small website (just 8 pages) and all pages are listed in the sitemap. I have successfully submitted the XML sitemap to Google Webmaster Tools. The problem is that Google does not index all of the pages (it indexes only 4 pages out of 8). It does index all pages when I use the "Fetch as Google" option from Google Webmaster Tools, but after a few days Google stops crawling and indexing the pages again. So please let me know what kind of problem this is. Why is Google not able to index all pages of my website?

Thanks in Advance... :)

Hi, crawling and indexing are processes that take some time, and many factors need to be considered. For instance, make sure that you have handled both the "www" version and the non-www version of your domain, e.g. (www.abc.com) and (abc.com). Another possible reason is that the links included in the site are not proper. Check again whether the site complies with the Webmaster Guidelines.
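If you want to check that quickly, a small Python sketch along these lines can show whether one version 301-redirects to the other or whether both answer 200 (abc.com is just the placeholder domain from the example above, not your real site):

```python
# Minimal sketch: check whether the non-www and www versions of a domain
# are consolidated (one should 301-redirect to the other, not both serve 200).
# "abc.com" below is a placeholder; substitute your own domain.
import urllib.request
import urllib.error

def status_and_location(url):
    """Return the first response status and Location header without following redirects."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # stop at the first response instead of following it
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=10)
        return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as e:
        # A 301/302 surfaces here because we refuse to follow it
        return e.code, e.headers.get("Location")

for url in ("http://abc.com/", "http://www.abc.com/"):
    code, location = status_and_location(url)
    print(url, "->", code, location or "")

# If both hosts return 200 with the same content, Google may treat them as
# duplicates and split indexing between them.
```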

Hello Sara Evans

Thanks for your reply.

I understand that crawling and indexing take time, but I already used the "Fetch As Google" option from Google Webmaster Tools, and after 2 days Google suddenly stopped indexing and crawling as well. My site does have a few 404 pages, which are already redirected (301) to the proper ones. Is there a server issue on my site?
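For reference, this is roughly the kind of check I ran on the old URLs to confirm the 301s (the URLs below are just placeholders, and the script uses the third-party requests library):

```python
# Rough sketch: follow each old URL's redirect chain hop by hop, so you can
# confirm the old 404 URLs now answer with a single 301 to a live (200) page.
import requests  # third-party: pip install requests

old_urls = [
    "http://www.example.com/old-page-1",
    "http://www.example.com/old-page-2",
]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [(r.status_code, r.url) for r in resp.history] + [(resp.status_code, resp.url)]
    print(url)
    for code, hop_url in hops:
        print("   ", code, hop_url)
    # Ideally there is exactly one 301 hop followed by a final 200;
    # long redirect chains or a final 404 can discourage crawling.
```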

Maybe you can check for server issues... I suggest that you go for some SEO tool that will help your website.

You have to check the robots.txt file and .htaccess. If those are OK, then fetch again. You could validate your robots.txt. You can look at mine as an example too.
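If you want a quick way to verify robots.txt, something like this standard-library sketch can do it (the domain and page paths are placeholders for your own):

```python
# Small sketch using the standard-library robots.txt parser to confirm that
# Googlebot is allowed to fetch each page listed in your sitemap.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://www.example.com/robots.txt")
rp.read()

pages = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
    "http://www.example.com/contact.html",
]

for page in pages:
    allowed = rp.can_fetch("Googlebot", page)
    print("allowed" if allowed else "BLOCKED", page)
```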

Thanks, Sara Evans and Inzcpa, for your replies.

Sara, I will try an SEO tool as you suggest.

Inzcpa, I have checked my robots.txt and .htaccess files; both are fine. The problem is that Google suddenly stopped crawling and indexing a few pages of my website after I used the "Fetch As Google" option. Is this normal, or is something going wrong with my website? What is the most likely cause of this problem?

Also, I am not blocking any pages in robots.txt or using "nofollow"/"noindex" tags.
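For what it's worth, this is roughly how such a check can be done (the URLs are placeholders and the script uses the third-party requests library):

```python
# Quick sketch to double-check that no page is accidentally sending a
# "noindex" signal, either in a <meta name="robots"> tag or in an
# X-Robots-Tag response header.
import re
import requests  # third-party: pip install requests

pages = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
]

# Assumes the usual name-then-content attribute order in the meta tag.
meta_robots = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for page in pages:
    resp = requests.get(page, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    match = meta_robots.search(resp.text)
    meta = match.group(1) if match else ""
    if "noindex" in (header + " " + meta).lower():
        print("NOINDEX signal found on", page)
    else:
        print("ok", page)
```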

Thanks...

4 of 8 (50%) is a good result.

I have 5000+ of 200,000+ well-linked, well-menued pages in the Google index; the others are similar enough in purpose to need no further indexing.

It is normal:
no site is 100% indexed. Enough is indexed to direct searches appropriately, in the opinion of the software driving the bots.

The bots will return as your site is further referenced by other sites,
unless those links are farmed links, in which case your search ranking will rapidly decline.

Resubmitting does not help; the threshold at which your submissions start to look like link farming, or link spam, is very low.

How Google Search works

If you are planning to use an SEO tool, I suggest this link. It seems to be a good option... you can get opinions and refer to other sites too... I'm just making a suggestion here.

If Google is not indexing all the pages, then there may be an issue with the sitemap. Please check that all your links are properly mentioned in the sitemap. You can also try to insert the webmaster insert into the pages, then try again to fetch as Google. It may help to get your pages indexed.
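If it helps, a rough script like this can list every URL in the sitemap and its HTTP status; the sitemap address below is only a placeholder for your own, and it uses the third-party requests library:

```python
# Sketch: download the XML sitemap, list every <loc> URL in it, and confirm
# each one answers with HTTP 200. A URL that redirects or errors from inside
# the sitemap is a common reason for partial indexing.
import xml.etree.ElementTree as ET
import requests  # third-party: pip install requests

SITEMAP_URL = "http://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
print(len(urls), "URLs in sitemap")

for url in urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url)
    # Every URL listed in the sitemap should be the final, canonical version
    # of the page and return 200, not 301/302/404.
```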

Thanks all for your replies.

"thaversantos" can you please elaborate what is "Insert the webmaster insert into the pages".

Thanks... :)

Check whether there is a canonical issue on your web pages. If yes, then solve it first.
Duplicate content on the same domain may be harmful to your site. Search engines get confused about which page is the original and which is not, so solve it immediately.
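One rough way to spot this is to list what each page declares as its canonical URL (the page URLs below are placeholders; the script uses the third-party requests library):

```python
# Sketch: report the rel="canonical" URL declared on each page, so you can
# spot pages that point to a different URL (or declare none at all) and end
# up competing with each other as duplicates.
import re
import requests  # third-party: pip install requests

pages = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
]

# Assumes the usual rel-then-href attribute order in the link tag.
canonical = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for page in pages:
    html = requests.get(page, timeout=10).text
    match = canonical.search(html)
    print(page, "->", match.group(1) if match else "no canonical declared")
```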
Then check your robots.txt file.
