I want to disallow certain query strings with robots.txt, and I want to check that I'm doing this correctly...

I am using:

Disallow: /browse.asp?cat=*-*

I want to check that this rule will still allow these URLs to be indexed:

browse.asp?cat=123/1234/1234-1
browse.asp?cat=123/1234-1

While disallowing these URLs:

browse.asp?cat=1234-1
browse.asp?cat=1234-2

Will this rule work? Or will the wildcard cause it to disallow all the cat query strings?

Thanks.

Neither wildcards nor query strings are mentioned in the original robots.txt spec, so if you're lucky a crawler will do something sensible with that rule, but don't count on it; support varies from crawler to crawler. Perhaps it could be done with URL rewrites instead, but I'm unsure how to get that working (if at all).
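
For what it's worth, here is a minimal sketch of how wildcard-aware crawlers such as Googlebot interpret a '*' in a Disallow rule (Google documents '*' as "any sequence of characters" and '$' as an end anchor; the helper name and the regex translation below are just illustrative):

import re

# Translate a robots.txt rule the way wildcard-aware crawlers
# (e.g. Googlebot) interpret it: '*' matches any characters,
# a trailing '$' anchors the end of the URL, and rules match
# from the start of the path.
def rule_to_regex(rule):
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.compile(pattern)

rule = rule_to_regex("/browse.asp?cat=*-*")

urls = [
    "/browse.asp?cat=123/1234/1234-1",  # you want this indexed
    "/browse.asp?cat=123/1234-1",       # you want this indexed
    "/browse.asp?cat=1234-1",           # you want this blocked
    "/browse.asp?cat=1234-2",           # you want this blocked
]

for url in urls:
    print(url, "->", "disallowed" if rule.match(url) else "allowed")

Running that, all four URLs come out "disallowed": the first '*' happily matches across the slashes, so every URL whose cat value contains a hyphen is caught, including the ones you want indexed. In other words, for crawlers that do support wildcards, your rule blocks everything; for crawlers that don't, it likely blocks nothing.

If you're willing to target Google-style behaviour specifically, an Allow rule might get closer to your intent. This is a sketch too: Allow and the "least restrictive rule wins on a conflict" precedence are Google extensions, and other crawlers may ignore them.

User-agent: *
# Block cat values containing a hyphen...
Disallow: /browse.asp?cat=*-
# ...but let values containing a slash through (on a conflict,
# Google picks the least restrictive rule, i.e. the Allow).
Allow: /browse.asp?cat=*/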
