I have many redirect scripts on my site (they run some PHP and then redirect to the relevant page), but in Google Webmaster Tools they all kept coming up as "Soft-404" errors, which I read are bad for PR. A while ago I restricted Googlebot's access to my /site/ folder, which contains all of these redirect scripts, to stop the errors. That has worked fine, but I'm concerned it might be preventing the crawler from navigating the site to reach other pages.
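For context, each redirect script follows roughly this pattern (simplified; the file name and compute_target() are just placeholders for the real logic):

    <?php
    // /site/goto.php -- simplified sketch; compute_target() stands in for the real PHP
    $target = compute_target($_GET['id']);   // work out which page the visitor should land on

    // send the visitor on to the relevant page
    // header('Location: ...') with no explicit status sends a 302 by default
    header('Location: ' . $target);
    exit;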
Is it safe to keep these redirect scripts restricted? Will Googlebot still be able to crawl and index the rest of the site? Or should I lift the restriction and accept the Soft-404s instead?
I also get Soft-404s on pages that sometimes redirect (the script runs before the doctype and headers), for example if a user supplies an invalid URL variable, even though the pages themselves render fine.
That's not so much of a problem, since I could just move that logic into another redirect script, but I'm curious why Google thinks they're all Soft-404s when they all return 200 OK.
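For reference, the conditional redirect on those pages sits at the very top, before any output, roughly like this (simplified; is_valid_item() and the parameter name are placeholders):

    <?php
    // runs before the doctype or any other output, so headers can still be sent
    if (!isset($_GET['item']) || !is_valid_item($_GET['item'])) {
        // invalid URL variable: bounce the visitor to a sensible page
        header('Location: /index.php', true, 302);
        exit;
    }
    // ...otherwise fall through and render the normal page, which returns 200 OK
    ?>
    <!DOCTYPE html>
    ...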
Any help is much appreciated.