Perhaps I am overly ambitious, but is there a way to verify that my Perl-based site has been visited by the search engines I've submitted it to?
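One check I can think of, assuming the server keeps a standard access log: grep it for the crawlers' User-Agent strings, which most engines send when they fetch a page. The log path and bot names below are only illustrative (a two-line sample log is created inline so the commands stand on their own); on a real server you would point the grep at something like Apache's access.log instead.

```shell
# Create a small sample access log for illustration only --
# on a live server, use the real log your httpd writes.
cat > access.log <<'EOF'
66.249.66.1 - - [01/Jan/2001:00:00:00] "GET / HTTP/1.0" 200 1024 "-" "Googlebot/2.1"
10.0.0.5 - - [01/Jan/2001:00:01:00] "GET / HTTP/1.0" 200 2048 "-" "Mozilla/4.0"
EOF

# Case-insensitive search for a few well-known crawler names.
# A hit means that engine's robot has actually requested a page.
grep -iE 'googlebot|slurp|scooter|lycos' access.log
```

If the grep comes back empty over a reasonable stretch of log, the robots have not visited at all, which is a different problem from visiting but failing to index.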
I believe I have coverage not found elsewhere, and yet my site does not appear. I use META tags. I use titles. I have carefully chosen keywords that accurately reflect my coverage.
This is not necessarily a Perl question; however, I have used Perl to implement my site. The actual source pages are nothing more than Perl scripts containing variable definitions and content, and the source files are parsed when requested.
The content delivered to the end-user's browser is a proper, correct, and complete web page.
Is it possible that the robots are not parsing my pages? If so, can someone advise me how to correct this?
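One way to rule out a parsing problem on my end would be to capture a page exactly as the server delivers it and confirm the TITLE and META tags survive intact, since that delivered HTML is all a robot ever sees. The page below is a sample created inline purely for illustration; on the real site the HTML would come from the Perl script's output.

```shell
# Sample of what a correctly delivered page should contain --
# created here only so the check is self-contained.
cat > page.html <<'EOF'
<HTML><HEAD>
<TITLE>Sample Page</TITLE>
<META NAME="keywords" CONTENT="perl, dynamic, example">
<META NAME="description" CONTENT="An example delivered page">
</HEAD><BODY>content</BODY></HTML>
EOF

# If the robots can see the tags, so can grep: confirm the
# META lines are present in the delivered HTML.
grep -i '<META' page.html
```

If the tags show up here but the site still is not indexed, the likelier culprits are things outside the HTML itself, such as a robots.txt that blocks the scripts or URLs the engine declines to crawl.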
I confess that I am a Windows programmer with little *nix experience, and I admit that I may have overreached in my ambition.
In reply to Can dynamic sites be parsed by search engine robots? by footpad