Hi everyone, this website stuff is all new to me. I have set up my website and read through all the tutorials and how-tos.
Created my keywords, meta tags, description, etc.
Created my sitemap.xml (using Google's Python tool) along with urllist.txt and config.xml, and created my robots.txt.
Now when I use the Google Sitemap tool, I see that Google is working with cached data from March 12, and none of the keywords and tags I have added in the last few weeks have been indexed.
I keep submitting my sitemap to Google, but that does not seem to change anything.
Can someone please give me a clue about what else I need to do?
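In case it helps to diagnose anything, the sitemap the Python tool generated looks roughly like this (I've swapped my real domain for example.com and trimmed it to one entry; the date is just illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-03-12</lastmod>
  </url>
</urlset>

and my robots.txt just allows everything:

User-agent: *
Disallow: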

Thanks in advance

Jay

Hi Jay,

Welcome to Daniweb. Great post!

What you need to do is build links. Googlebot (and the other crawlers) finds your site by following new links. There are many quality resources for link building here on Daniweb. What I would suggest is listing your site in a few directories (such as Yahoo or Best of the Web), which will cost a few bucks and take a few days. Alternatively, you can purchase text link ads. I don't want to go into all the different ways of acquiring links, but a directory listing should be all it takes to get included.

Swap links :) Get your site visible to spiders that are crawling other sites and finding yours. A big misconception is that a Google sitemap will let Google find your site and know about your pages. Simply not true! The only way to get Google to spider your site is by building PageRank through incoming links (known as backlinks). The higher your PageRank, the more frequently Googlebot will return to your site and the more pages it will spider at a time. If you find that Google isn't spidering your most important pages, a sitemap will lend them a helping hand by letting you specify which pages you give higher priority. But remember: you need backlinks to build PageRank, and PageRank dictates how many pages Google will spider. A sitemap just lets you tell Googlebot which of your pages are more important and should get first dibs, as in the sketch below.
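For what it's worth, the sitemap format has optional <priority> and <changefreq> tags for exactly this. A minimal sketch (example.com and the page names are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority> <!-- your most important page -->
  </url>
  <url>
    <loc>http://www.example.com/archive.html</loc>
    <changefreq>monthly</changefreq>
    <priority>0.3</priority> <!-- fine to crawl last -->
  </url>
</urlset>

Keep in mind that priority is relative within your own site (0.0 to 1.0, default 0.5); it doesn't buy you anything against other sites.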

He could also check whether his domain has been banned by Google. The Google Banned Tool is handy for new sites that might have been banned.
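One quick sanity check that doesn't need any tool: search Google for

site:yourdomain.com

(with the actual domain in place of yourdomain.com). If that query returns nothing at all, the site either hasn't been indexed yet or has been dropped.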

Thanks for the tidbit. Not familiar with that :)

Hi,

Very simple :)
Add quality, related one-way links from pages with good PR, and you will be at the top :)

There's no need to worry about when Google indexes your pages, and a sitemap isn't the key either. Just focus on quality content and quality links; Google will like that and do the rest.
Good luck.

Googlebot has two versions, deepbot and freshbot. Deepbot is a deep crawler that tries to follow every link on the web and download as many pages as it can for the Google index. It also examines the internal structure of a site, giving a complete picture for the index.
Freshbot, on the other hand, is a newer bot that crawls the web looking for fresh content. It was introduced to take some of the pressure off deepbot: it revisits pages already in the index and crawls them for new, modified, or updated content. In this way, Google is better equipped to keep up with the ever-changing Web.
This means that the more you update your website with new, quality content, the more often Googlebot will come by to check you out. If you'd like to see Googlebot crawling your web property more often, you need to obtain quality inbound links. There is also one more step you should take: if you haven't already done so, create a Google Sitemap for your site.
Creating a Google Sitemap lets you communicate with Google, telling them about your most important pages, new pages, and updated pages. In return, Google provides you with some valuable information as well: Google Sitemaps will tell you about pages it was unable to crawl and links it was unable to follow. This lets you pinpoint problems and fix them so you can gain increased exposure in the search results.
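On the "telling Google about updated pages" point, Google's Sitemaps documentation also describes an HTTP ping you can send whenever the sitemap changes, instead of resubmitting by hand. A rough sketch in Python (the sitemap URL is a placeholder, and the endpoint is the one from Google's Sitemaps docs, so double-check it against the current documentation):

import urllib.parse
import urllib.request

# Placeholder: replace with your real sitemap URL.
sitemap_url = "http://www.example.com/sitemap.xml"

# Ping endpoint as given in Google's Sitemaps documentation.
ping = ("http://www.google.com/webmasters/sitemaps/ping?sitemap="
        + urllib.parse.quote(sitemap_url, safe=""))

with urllib.request.urlopen(ping) as resp:
    print(resp.status)  # 200 means the ping was received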
I hope this helps.

Jay, you need links, and one quick way to get them is to put a link to your site in your forum signature. Don't ask me how I know; it just came to me... :-)

BB

Googlebot has two versions, deepbot and freshbot. Deepbot is a deep crawler that tries to follow every link on the web and download as many pages as it can for the Google index. It also examines the internal structure of a site, giving a complete picture for the index.
Freshbot, on the other hand, is a newer bot that crawls the web looking for fresh content. It was introduced to take some of the pressure off deepbot: it revisits pages already in the index and crawls them for new, modified, or updated content. In this way, Google is better equipped to keep up with the ever-changing Web.
This means that the more you update your website with new, quality content, the more often Googlebot will come by to check you out. If you'd like to see Googlebot crawling your web property more often, you need to obtain quality inbound links. There is also one more step you should take: if you haven't already done so, create a Google Sitemap for your site.
Creating a Google Sitemap lets you communicate with Google, telling them about your most important pages, new pages, and updated pages. In return, Google provides you with some valuable information as well: Google Sitemaps will tell you about pages it was unable to crawl and links it was unable to follow. This lets you pinpoint problems and fix them so you can gain increased exposure in the search results.
I hope this helps.

I had to leave that all in so I could point at each bit of it individually; sorry for the duping and the bandwidth and everything. Google used to have two bots, true, and now has a couple more actually, I believe, essentially media bots. But, but, but: lately the game's changed. You need a goodly measure of inbound links before you become worth spidering, indexing, and checking, so you can add fresh content till the cows come home and it won't do you a scrap of good unless you have Google's version of provenance, which usually means loads of quality inbound links. And he does have a sitemap; he built it in Python, yet! Dude knows more than I do... :-(

BB

I had a similar problem. I have two sites, and Googlebot tends to visit the one with less content in terms of quality, quantity, and relevance more often. :-O I update the other every day, yet it seems to receive little attention from Google. The content on the first site has been static for a long time, and Googlebot still loves visiting it. What a contradiction! :-/

You may visit this forum for some info:
http://www.websitepublisher.net/forums/showthread.php?t=6987

Jymoo
BloomingBuds

Keep increasing your links, and if you haven't included an XML sitemap, add that as well.

If you want Google to visit more frequently, get more high-quality links and update your pages fairly frequently.
