in reply to Perl Scripts in search engines?
Non-preprocessor CGIs (like those written in C, Perl, server-side Java, etc.) aren't HTML files at all; they're executables that (generally) require some sort of specific input from the user before they present anything meaningful. That is, a Perl CGI often finds out what the user wants (and then displays something interesting) only after the user clicks/selects/types something into a static HTML form. Now, spiders only follow links that are pre-set on web pages, and they aren't smart enough to fill out a form, so the plain old HTML form gets 'botted (it's a .html file), but the CGI behind it that's called when Submit is pressed gets ignored.
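For what it's worth, here's a rough sketch of the kind of script I mean (the 'query' parameter name and the use of CGI.pm are just for illustration). A spider that simply fetches the URL never supplies the parameter, so it never sees the interesting part:

```perl
#!/usr/bin/perl
# Minimal sketch of a form-driven Perl CGI (names are made up).
use strict;
use warnings;
use CGI;

my $q = CGI->new;
print $q->header('text/html');

# A spider just fetches this URL with no parameters, so 'query'
# is empty and the interesting output never gets generated.
my $term = $q->param('query');

if (defined $term && length $term) {
    print "<p>Search results for '$term' would be built here.</p>";
}
else {
    print "<p>Please use the search form to look something up.</p>";
}
```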
Of course, all this is overgeneralized. You can certainly write ASP and PHP that require user input, and you can serve a "static" site from Perl .cgi scripts, but you get the idea.
If you're concerned that your new site gets properly 'botted, just make sure there are nice fat hard-linked .html/.asp/.php files tying your site together. You probably do NOT want your dynamic content 'botted (the CGI stuff), because engines like Google only archive sites about once a month, and you probably don't want someone's search turning up last month's inventory.
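To make the hard-link point concrete, here's a hypothetical front page (catalog.html and search.cgi are made-up names): the spider will follow the plain link and index that page, but it will never submit the form, so the CGI behind it stays out of the index.

```perl
#!/usr/bin/perl
# Hypothetical front page: contrasts a hard link (which a spider will
# follow) with a form action (which it will not).
use strict;
use warnings;

print "Content-type: text/html\n\n";
print <<'HTML';
<html><body>
  <!-- A spider follows this and indexes catalog.html -->
  <a href="catalog.html">Browse the catalog</a>

  <!-- A spider won't fill this in, so search.cgi never gets crawled -->
  <form action="search.cgi" method="get">
    <input type="text" name="query">
    <input type="submit" value="Search">
  </form>
</body></html>
HTML
```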
Hope this helps.
Gary Blackburn
Trained Killer