Ranked #414
Strength to Increase Rep: +5
Strength to Decrease Rep: -1
Quality Score: 100%
Upvotes Received: 3 (on 3 posts, from 1 member)
Downvotes Received: 0 (on 0 posts, from 0 members)
Commented Posts: 2
Endorsements: 9
Ranked #207
People Reached: ~5K

I don't know of any way to directly influence the speed with which Google will crawl and index sites, or update the cache. Setting update frequency in your sitemap or meta tags will have no effect -- Google crawls and indexes your site based on other factors, such as site …
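
For reference, the sitemap setting in question is the optional <changefreq> element from the sitemaps.org protocol. A minimal sketch (example.com is a placeholder) looks like this, and engines treat the value as a hint at most:
[code]
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/page.html</loc>
    <!-- a hint only; crawlers are free to ignore it -->
    <changefreq>daily</changefreq>
  </url>
</urlset>
[/code]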


Subdomains can be very useful for SEO, but only in certain situations. Note that search engines to some extent treat a subdomain as a separate domain (website). Thus, the page rank, link popularity, authority, etc. of your primary domain will not necessarily be transferred to the subdomain. This means that …


You may be getting at two things here: [LIST=1] [*]Internal linking structure for crawling [*]"Sculpting" specific links to flow or not flow page rank [/LIST] The first is critical for any initial site design. Make sure all pages key to SEO on your site can be reached via HTML links …
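
As a rough illustration of the second point (URLs are placeholders), the only markup difference between a link a crawler follows normally and one "sculpted" out of the page rank flow is the rel attribute:
[code]
<!-- crawlable link: passes link equity normally -->
<a href="/products/widgets.html">Blue widgets</a>

<!-- nofollow link: asks engines not to flow page rank through it -->
<a href="/login.html" rel="nofollow">Log in</a>
[/code]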


Don't worry so much about crawling frequency -- focus on quality and relevancy to your business or website purpose. If frequent updates to your content are relevant, then make them. If your content is "evergreen", don't do artificial things to update it. You do want to keep your site from …


Also, don't rely on the hover technique to verify direct links. It is very easy to spoof this, and make it show your URL but have it link elsewhere, use rel="nofollow", use a redirect, etc. The only way to really validate is to look at the page source, find the …
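
To illustrate why the hover/status-bar check isn't enough, here are a few made-up examples (all URLs are placeholders) of links that can look "direct" until you read the source:
[code]
<!-- looks direct, but carries nofollow -->
<a href="http://www.example.com/" rel="nofollow">http://www.example.com/</a>

<!-- shows one URL as the anchor text, but actually points somewhere else -->
<a href="http://other-site.example/out.php?id=123">http://www.example.com/</a>

<!-- goes through a redirect script instead of linking directly -->
<a href="/redirect.php?url=http://www.example.com/">Example Site</a>
[/code]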


Yahoo and Microsoft are also supporting this. This is a very important feature for anyone who has problems with canonical URLs. Rather than a mess of 301 redirects, robots tags, rewrites, etc, you can just state the canonical URL for each page and let the search engines figure it out. …
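
For anyone who hasn't seen it yet, you state the canonical URL with a link element in the <head> of each duplicate or variant page (example.com is just a placeholder):
[code]
<head>
  <!-- tells Google/Yahoo/Microsoft which URL is the preferred (canonical) version -->
  <link rel="canonical" href="http://www.example.com/product.php?item=blue-widget" />
</head>
[/code]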


Keywords in the domain name are VERY important. You will rank much better for a keyword if it is in the domain. The challenge is that you can only have one or two keywords in a domain, since you will get dinged if you try to stuff it full of …


[QUOTE]* Outbound links improve your ranking[/QUOTE] Actually, they can help by improving the quality and relevancy of your site. I've seen rankings improve when quality, relevant links are added. [QUOTE]* Submitting your site to the search engines too many times will get you banned[/QUOTE] Yes it can. [QUOTE]* Links from …


For most websites, there is no difference between a dedicated and a shared IP as far as ranking algorithms are concerned. There are, however, rare cases where a shared IP could make a difference: 1) The shared IP address is hosting sites that are banned for spamming or other reasons (or the IP …


Both Alexa and Compete have thin data and are not really accurate. You can use them to get rough indications on traffic and keywords, but don't take them literally. In many cases, I've found them to be way off the mark. If you are looking for which is "better", I'd …


This varies widely based on what you are capturing or selling. In classic lead generation (e.g., mortgage, real estate, white paper download, etc) a typical range is from 3-8%, but it can be over 20% or less than 1% depending on factors such as value and traffic targeting. The problem …
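
As a purely hypothetical worked example: 1,000 well-targeted visitors converting at 5% is about 50 leads, while the same 1,000 visitors converting at 1% is only about 10 -- the same traffic can produce very different lead counts.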


That is spam, or at least misuse of the tag. I don't think you will face a penalty but I'll bet the search engines will not give you any lift from using <Hn> tags this way. The conventional wisdom is that using <Hn> tags correctly will give a little boost …
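
A sketch of what I mean by correct use versus this kind of misuse (the content is made up):
[code]
<!-- correct: headings outline the page, body copy stays in normal tags -->
<h1>Seattle Real Estate Guide</h1>
<h2>Current Market Trends</h2>
<p>Prices in most neighborhoods have ...</p>

<!-- misuse: wrapping a keyword list in <h1> just hoping for a boost -->
<h1>seattle real estate, seattle homes, seattle condos, cheap houses seattle</h1>
[/code]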


If your site really got banned from Google (no pages in results at all, even when you search on exact text), assuming you have corrected all of your violations you can ask for reconsideration. To do this, see [url]http://www.google.com/support/webmasters/bin/answer.py?answer=35843[/url].


Ultimately there is nothing you can do to hide any source code that is served to the browser. Any attempt to hide it can be bypassed by simply using an HTTP viewer or other tool that reads directly from the webserver, bypassing the browser (e.g., the HTTP viewer at [url]http://www.rexswain.com/httpview.html[/url]). …


Only Google knows for sure. The conventional wisdom is that page rank for a given page is divided up across the links on that page to other pages. Thus, 3 links to one page might get a little more weight to that target page, but at the expense of other …
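
For what it's worth, that conventional wisdom traces back to the formula in the original PageRank paper, where a page's vote is divided evenly across its outbound links:
[INDENT]PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )[/INDENT]
Here T1...Tn are the pages linking to A, C(T) is the number of outbound links on page T, and d is a damping factor (0.85 in the paper). By that math, a page with 10 outbound links passes each target roughly a tenth of what it would pass through a single link. Whether today's Google still behaves exactly this way is anyone's guess.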


Search engines, including Google and Yahoo, will definitely find and index text with display:none and hidden. There are lots of examples out there. For example, do a Google or Yahoo search on [INDENT]seattle real estate science recreation trulia[/INDENT] You will see this page in spot #1: [INDENT][url]http://www.trulia.com/real_estate/Seattle-Washington/[/url][/INDENT] If you view …
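
For anyone testing this themselves, the kind of markup I'm talking about is simply something along these lines (the text here is a made-up example, not any particular site's source):
[code]
<div style="display:none">
  seattle real estate science recreation trulia
</div>
<span style="visibility:hidden">more hidden text here</span>
[/code]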


To answer the specific question "Does W3 validating code help SEO", the answer is no, it does not. However, having invalid code on your page could [B]hurt[/B] your SEO, and that is why it is considered a best practice to validate your code through W3C compliant validators. The reason is …
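
A made-up example of the sort of broken markup that can trip up a parser (not any particular site's code):
[code]
<!-- invalid: the missing closing quote can make a parser treat the rest of the line as part of the attribute -->
<a href="/services.html>Our services</a>

<!-- valid: the same link with the quote closed -->
<a href="/services.html">Our services</a>
[/code]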


Actually, Google will now crawl and index some portions of Flash. See [URL="http://googlewebmastercentral.blogspot.com/2008/06/improved-flash-indexing.html"]http://googlewebmastercentral.blogspot.com/2008/06/improved-flash-indexing.html[/URL] I'm starting to see this work, with some Flash menus being followed and content indexed in Google. Of course, Yahoo, MSN Live and others are not there yet, so Flash is still invisible to them and they …
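
One safeguard while the other engines catch up: keep plain HTML alternative content inside the <object> element, so an engine that can't read the .swf still has something to crawl and index (file names and links here are placeholders):
[code]
<object type="application/x-shockwave-flash" data="menu.swf" width="600" height="100">
  <!-- fallback content for browsers/crawlers that don't render Flash -->
  <ul>
    <li><a href="/about.html">About us</a></li>
    <li><a href="/services.html">Services</a></li>
  </ul>
</object>
[/code]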


Here are a few best practices for creating good title tags: titles that help pages rank better for your target keywords, and that are inviting to users who see them in the search results so they are more likely to click. 1) Use good keywords that are also in …
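
A sketch of what that can look like in practice (business name and keywords are invented):
[code]
<head>
  <!-- roughly 60-70 characters, key phrase first, brand last -->
  <title>Seattle Wedding Photographer | Candid and Studio Portraits | Acme Photo</title>
</head>
[/code]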


Don't bother. Build a site with good content and get links from good quality sites, and you will get crawled. No need to submit to any of these engines, especially meta-engines like dogpile that pull from other search engines. John Erickson << signature snipped >>


I'm not sure what the "problem" is here. It does help a page rank better for a given keyword if inbound link text includes that keyword. However, this is not the only factor, just one of many. So, a page can rank just fine if it does not have backlinks …
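
In markup terms, the difference is just the anchor text (URLs and keywords are placeholders):
[code]
<!-- keyword-rich anchor text: helps the target page for "blue widgets" -->
<a href="http://www.example.com/widgets.html">blue widgets</a>

<!-- generic anchor text: still a valid backlink, just less of a keyword signal -->
<a href="http://www.example.com/widgets.html">click here</a>
[/code]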


You are correct -- you need the town or other location name in the body text of the page. You should also have the name in the page title, meta tags, headings, alt text and other areas (with sufficient but not excessive keyword density), as well as in anchor text …
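
Roughly, that means working the location into spots like these (the town and business are made up):
[code]
<title>Springfield Plumbing Repair | Acme Plumbers</title>
<meta name="description" content="24-hour plumbing repair in Springfield and nearby towns.">

<h1>Plumbing Repair in Springfield</h1>
<img src="van.jpg" alt="Acme Plumbers service van in downtown Springfield">
[/code]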


Putting pages into folders should have very little, if any, impact on rankings. There is some evidence that pages buried many folder levels deep are given less weight than pages at the top level, but the difference is slight, if any, and I've never seen any difference for just one …


Don't fool yourself into thinking that just because you use robots.txt or rel=nofollow to "exclude" a page, Google will not look at that page and evaluate it. Google has separate algorithms for detecting spam, and these do not behave the same as those that actually index pages. I've seen several examples of …
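
Just to be clear about the two mechanisms being discussed (paths are placeholders), and why they are requests rather than walls:
[code]
# robots.txt -- asks crawlers not to fetch these URLs; it does not make them invisible
User-agent: *
Disallow: /private/
[/code]
[code]
<!-- rel=nofollow asks engines not to flow page rank through this link;
     the target page can still be discovered and evaluated by other means -->
<a href="/private/report.html" rel="nofollow">Internal report</a>
[/code]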


First, if the content of your header tags (I assume you mean title, meta description and meta keywords) match the body content of the page, Google will not consider this "fishy", and in fact this is a best practice. Also, having all the title/meta tags duplicated on your inner pages …
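
A quick sketch of the difference (the site and wording are invented): titles and descriptions written for each page's own content, rather than one set copied across every inner page:
[code]
<!-- home page -->
<title>Acme Widgets | Hand-made Widgets and Gadgets</title>
<meta name="description" content="Acme Widgets makes hand-made widgets and gadgets in small batches.">

<!-- inner page: title and description match this page, not the home page -->
<title>Blue Widgets | Acme Widgets</title>
<meta name="description" content="Specifications, sizes and pricing for Acme's blue widget line.">
[/code]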


For the long term, you are better off creating country-specific domains, and building up links for each. Google and other engines will give preference to country/language-specific TLDs (top level domains) for searches from those countries or in those languages, so leverage that advantage. I think over time this will give …


Don't fall for it. Directories don't really work anymore, and doing mass submissions like this can get your web site banned.


LSI is a term that now has multiple meanings. It is related to the term "Latent Semantic Analysis" (LSA), but refers to the practice of using LSA for indexing text. From an academic standpoint, LSA is a set of methodologies for analyzing the relationships of terms used in written text. …


First, the Google link: command only shows a subset of links, and usually only the best quality links. If that is what you are using, I'd suggest you look at the Yahoo linkdomain: command results, and also sign up for Google Webmaster tools and look at the link lists within …


The meta keywords tag is not given much (if any, in some cases) weight by search engines, but since it is easy to populate I think it is still worth doing. Other areas such as title tags, meta description, headings and body text are much more important for SEO. Put …
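
For completeness, here is the tag itself alongside the neighbors that matter more (content is a placeholder):
[code]
<head>
  <!-- the title carries real weight -->
  <title>Blue Widgets | Acme Widgets</title>
  <!-- the description is used for snippets in the results -->
  <meta name="description" content="Hand-made blue widgets in three sizes.">
  <!-- little to no ranking weight, but cheap to populate -->
  <meta name="keywords" content="blue widgets, hand-made widgets">
</head>
[/code]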
