All,
I'm looking for the best practice regarding a CGI shopping cart I am working on. I have a checkout file that sends an email to our generic mail account and shows a thank-you image once someone fills in their information. The problem is that even though the information page checks that the data is filled in, if a search crawler hits the checkout file directly it sends blank info to the generic email box. To remedy this, what code can I put on the checkout page so that the email only fires if the page before it is the user info page?
Would a basic CGI redirect like this work?

    use strict;
    use warnings;
    use CGI;

    my $query = CGI->new;    # indirect "new CGI" syntax is discouraged

    print $query->redirect('http://www.nameofmysite.com/userinfo.cgi');

Hello koneill,

> The problem is that even though the information page checks that the data is filled in, if a search crawler hits the checkout file directly it sends blank info to the generic email box.

Is this script also validating on the server side? Since you are submitting a form, you can verify that the request method is POST. That should block random crawlers, as well-behaved robots usually do not send POST requests.
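A minimal sketch of that gate might look like the following (`should_process_order` is a hypothetical helper, not part of any module):

    use strict;
    use warnings;

    # Only send the order email when the form was actually POSTed.
    # Crawlers fetching the checkout URL send GET requests, which are refused.
    sub should_process_order {
        my ($method) = @_;
        return ( defined $method && $method eq 'POST' ) ? 1 : 0;
    }

    # In the checkout script:
    #
    #   if ( should_process_order( $ENV{'REQUEST_METHOD'} ) ) {
    #       # ...validate the fields, send the email, show the thank-you image...
    #   }
    #   else {
    #       print "Content-type: text/html\n\n";
    #       print "Please submit the order form to check out.";
    #   }
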

You could also add a CSRF token to the session and to the form, and verify that the token in the request matches the one saved in the session. This should also block intentional submissions from remote clients.
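The token round trip could be sketched like this (the helper names are hypothetical; `Digest::SHA` is in the Perl core). Generate a hard-to-guess token, keep one copy server-side, for example in a CGI::Session, embed the other in a hidden form field, and compare the two when the form comes back:

    use strict;
    use warnings;
    use Digest::SHA qw(sha256_hex);

    # Generate a hard-to-guess token for this session.
    sub make_csrf_token {
        return sha256_hex( time() . $$ . rand() );
    }

    # Accept the form only when the submitted token matches the stored one.
    sub csrf_token_valid {
        my ( $from_session, $from_form ) = @_;
        return ( defined $from_session
              && defined $from_form
              && $from_session eq $from_form ) ? 1 : 0;
    }

The hidden field would be something like `<input type="hidden" name="csrf_token" value="...">`, and the checkout script would refuse to send the email whenever `csrf_token_valid` returns false.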


Thanks - the checks are not server side. But taking all this into account, it appears that since I'm using CGI, something like this needs to be implemented on the checkout page:

    use strict;
    use warnings;

    # The CGI environment variable is HTTP_REFERER (one R), and it may
    # be missing entirely, so default to an empty string.
    my $origin = $ENV{'HTTP_REFERER'} // '';

    print "Content-type: text/html\n\n";

    # Block the request unless it came from our own site.
    unless ( $origin =~ m#^http://www\.mysite\.com/# ) {
        print "This page cannot be accessed directly";
        exit;
    }

Or, instead of printing "This page cannot be accessed directly", it could just be a simple redirect back to the user information screen.
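That redirect variant might look like the sketch below. The user-info URL is the one from the earlier example, and `referer_ok` is a hypothetical helper. One caution: `HTTP_REFERER` can be empty or spoofed by the client, so this is best treated as a soft check alongside the POST-method check rather than relied on alone.

    use strict;
    use warnings;

    # Hypothetical helper: accept only requests referred from our own site.
    sub referer_ok {
        my ($referer) = @_;
        return ( defined $referer
              && $referer =~ m#^http://www\.mysite\.com/# ) ? 1 : 0;
    }

    # In the checkout script, bounce stray visitors back to the form.
    # Printing the headers by hand avoids relying on CGI.pm's redirect():
    #
    #   unless ( referer_ok( $ENV{'HTTP_REFERER'} ) ) {
    #       print "Status: 302 Found\r\n";
    #       print "Location: http://www.nameofmysite.com/userinfo.cgi\r\n\r\n";
    #       exit;
    #   }
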
