in reply to Best use of Template::Toolkit for Search Engine Optimization?

Contrarian that I (sometimes) am, I like the static page scheme (option 3), given the comparatively small number of possible data combinations and the relative rarity of changes you expect in the db.

Your 3rd option can do more for you than provide user- and SE-friendly URLs. If you generate the thousand or fewer pages on a local machine, you can readily (with a fair bit of work up front) tweak the DESCRIPTION and KEYWORDS metas, which may do more for your search engine ranking than many other optimizing techniques.
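
For illustration, here is a minimal sketch of that local generation pass with Template Toolkit; the page data, slugs, and output directory are made-up placeholders, not anything from your actual setup:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Template;

    # Sketch of option 3: pre-generate the static pages locally, with
    # hand-tuned per-page DESCRIPTION and KEYWORDS metas. Assumes an
    # htdocs/ output directory already exists.
    my $template = <<'EOT';
    <html>
    <head>
      <title>[% title %]</title>
      <meta name="description" content="[% description %]">
      <meta name="keywords"    content="[% keywords %]">
    </head>
    <body>[% content %]</body>
    </html>
    EOT

    my @pages = (
        { slug        => 'red-widgets',
          title       => 'Red Widgets',
          description => 'Hand-tuned description for this page',
          keywords    => 'widgets, red',
          content     => '...' },
        # ... one entry per data combination, e.g. pulled from the db ...
    );

    my $tt = Template->new() or die Template->error;

    for my $page (@pages) {
        # Scalar-ref template in, static file out.
        $tt->process( \$template, $page, "htdocs/$page->{slug}.html" )
            or die $tt->error;
    }

Rerun the script whenever the db changes; since changes are rare, that is a cheap price for having every page's metas under your direct control.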

On the other hand, there are so many parameters that go into SE ranking ("How fresh is the 'Latest Update' or 'Last modified' date?", "How well do the keywords match the actual content?", etc.) that this suggestion may be an example of something comparable to "premature optimization."


Re^2: Best use of Template::Toolkit for Search Engine Optimization?
by karavelov (Monk) on Oct 27, 2008 at 21:26 UTC

    Another option is to generate a static "sitemap.xml" file describing all 1000 entries and combinations, and then tell search spiders to look at this file via robots.txt:

        User-Agent: *
        Disallow:
        Sitemap: /sitemap.xml

    For the sitemap syntax, see the Sitemaps protocol documentation at sitemaps.org.
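
    If it helps, a minimal sketch of generating such a sitemap in the same Template Toolkit pass as the parent node; the base URL, slugs and dates below are made-up placeholders:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Template;

        # Sketch: emit sitemap.xml for the same set of pre-generated pages.
        my $template = <<'EOT';
        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        [% FOREACH page IN pages -%]
          <url>
            <loc>[% base %]/[% page.slug %].html</loc>
            <lastmod>[% page.lastmod %]</lastmod>
          </url>
        [% END -%]
        </urlset>
        EOT

        my @pages = (
            { slug => 'red-widgets', lastmod => '2008-10-27' },
            # ... one entry per generated page ...
        );

        my $tt = Template->new() or die Template->error;
        $tt->process( \$template,
                      { base => 'http://www.example.com', pages => \@pages },
                      'htdocs/sitemap.xml' )
            or die $tt->error;

    One detail worth noting: the sitemaps.org protocol expects the Sitemap: line in robots.txt to carry the full URL of the sitemap (e.g. http://www.example.com/sitemap.xml), not just the path.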