Tanktalus has asked for the wisdom of the Perl Monks concerning the following question:

Being the resident computer geek in a local charitable organisation, I have been, well, volunteered to maintain our web site. It's really quite a trivial undertaking, nothing fancy, because we're not even paying for a regular site or anything. We're just using the space afforded by my ISP, which does not, to my knowledge, allow for CGI of any sort. This is not a huge deal because there is no (current) need for forms or anything. It's just that the content changes from time to time, and I'd like to generate as much of the content as possible.

So what I've done in the past is to create an HTML template (yes, using HTML::Template) for the header, the footer, and then the content of each page. The content for each page is based on a number of CSV files (which I access with DBD::CSV) that I pull together as needed. Then I run another Perl script to upload the result, via FTP, to the ISP's site. And it works. More or less.
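To give a rough idea, the generation step for one page looks something like the sketch below (the CSV file, column names, and template are placeholders, not my real data):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;
    use HTML::Template;

    # Pull rows out of a CSV via DBD::CSV (placeholder file/columns).
    my $dbh = DBI->connect('dbi:CSV:f_dir=data', undef, undef, { RaiseError => 1 });
    $dbh->{csv_tables}{events} = { file => 'events.csv' };

    my $events = $dbh->selectall_arrayref(
        'SELECT event_date, title FROM events ORDER BY event_date',
        { Slice => {} },
    );

    # Feed the rows to the page template and write out the static file.
    my $tmpl = HTML::Template->new(filename => 'templates/events.tmpl');
    $tmpl->param(events => $events);

    open my $out, '>', 'site/events.html' or die "events.html: $!";
    print {$out} $tmpl->output;
    close $out;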

On sites where I can use CGI, I just throw in CGI::Application, inherit from there, and I'm basically done. Here, I need some way to loop through all the run modes, grab the output from each, and upload the results (along with any non-generated data, such as PDFs, CSS, or images).
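The loop I have in mind is roughly this (MySite stands in for the real CGI::Application subclass; CGI_APP_RETURN_ONLY makes run() hand back its output instead of printing it, and I'm assuming the default header behaviour):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;
    use MySite;    # placeholder for the real CGI::Application subclass

    # Make run() return the page instead of printing it to STDOUT.
    $ENV{CGI_APP_RETURN_ONLY} = 1;

    my %modes = MySite->new->run_modes;

    for my $rm ( keys %modes ) {
        my $app  = MySite->new( QUERY => CGI->new("rm=$rm") );
        my $page = $app->run;
        $page =~ s/\A.*?\r?\n\r?\n//s;    # strip the HTTP header block

        open my $out, '>', "site/$rm.html" or die "$rm.html: $!";
        print {$out} $page;
        close $out;
    }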

What I have is more or less working. However, I'm wondering if any other monks have a better idea on how to handle this:

I'm thinking of throwing some XML::Twig into the mix during static generation to change any hrefs from "index.cgi?rm=*foo*" to "*foo*.html", to help with the "working as-is in both environments" part (sketched below). Mind you, if someone has a better idea, I figure I should ask now before I go ahead and write anything :-)
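Roughly what I have in mind, nothing written yet, and assuming the generated pages are well-formed enough for XML::Twig to parse:

    use strict;
    use warnings;
    use XML::Twig;

    # Rewrite "index.cgi?rm=foo" links to "foo.html" in one generated page.
    my $twig = XML::Twig->new(
        keep_spaces   => 1,    # leave the original formatting alone
        twig_handlers => {
            'a[@href]' => sub {
                my ( $t, $a ) = @_;
                my $href = $a->att('href');
                if ( $href =~ /^index\.cgi\?rm=(\w+)$/ ) {
                    $a->set_att( href => "$1.html" );
                }
            },
        },
    );

    $twig->parsefile('site/index.html');
    $twig->print_to_file('site/index.html');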


Re: Generation of dynamically-static documents
by CountZero (Bishop) on Sep 11, 2005 at 20:54 UTC
    Template::Toolkit seems better suited for such a venture. The Badger Book has several chapters on making such "static" websites directly from templates: see Chapter 2.

    CountZero

    "If you have four groups working on a compiler, you'll get a 4-pass compiler." - Conway's Law

      I'm not sure I follow how having a script invoke Template::Toolkit via some custom logic to produce the data is any different from HTML::Template. In fact, I'm not sure the choice of templating system (is this the View in MVC?) is significant here, unless it's in conjunction with an underlying Controller that is dependent on some templating system to work.

      I'm even less clear on how TT will help me keep a single set of code, with as little redundancy as possible, that produces the same end result for the browser on both a static site and a CGI-based site. (The href changes I mentioned above still give the same end result in that the links work, not that the markup is identical.)

        TT2 is quite separate from any MVC system (although it plays very nicely with Catalyst). You could run TT2 outside of a web server (there is the ttree.pl script to do so) and it would generate all your HTML pages at once (and would even do so without re-generating the pages which remain unchanged), or run it in your web server (CGI or mod_perl-enabled Apache, for instance) and the requested page(s) will be made on the fly.

        Perhaps that is also possible with HTML::Template, but I like TT2 more. YMMV.

        CountZero

        "If you have four groups working on a compiler, you'll get a 4-pass compiler." - Conway's Law

        I'm not sure I follow how having a script invoke Template::Toolkit via some custom logic to produce the data is any different from HTML::Template

        Sure, you can do it with HTML::Template too - but TT2 has some built-in tools to make the process easier. Tools like Template::Tools::ttree (which comes with TT2) and Template::TT2Site will automatically generate a static copy of a TT2 site if set up appropriately. So you can have dynamically generated content on your dev site, with an appropriate ttree/rsync combo in your makefile to shift a static version of your site to the production server.
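        And if you'd rather drive it from your own script than from ttree, the Template module itself can write straight to files; something like this (the paths and page list are invented for the example):

            use strict;
            use warnings;
            use Template;

            my $tt = Template->new(
                INCLUDE_PATH => 'templates',   # where the page templates live
                OUTPUT_PATH  => 'site',        # static files end up here
            ) or die Template->error;

            # Invented page list; in practice the data would come from elsewhere.
            my %pages = (
                'index.html'  => { template => 'index.tt',  title => 'Home'   },
                'events.html' => { template => 'events.tt', title => 'Events' },
            );

            while ( my ( $file, $vars ) = each %pages ) {
                $tt->process( $vars->{template}, $vars, $file )
                    or die $tt->error;
            }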

Re: Generation of dynamically-static documents
by Zaxo (Archbishop) on Sep 12, 2005 at 00:47 UTC

    This sounds like a job for make. You can run anything you like in the makefile rules, including Perl/CGI scripts. Shell redirection is available, and make will only run what is necessary to update from new data files.

    ExtUtils::MakeMaker is available to help write makefiles, but it may be too specialized towards module building to be helpful here.

    After Compline,
    Zaxo

        You left your withering sarcasm on again. Try to save it for when people do something more unforgivable than being unaware of a column you wrote.

        Caution: Contents may have been coded under pressure.
Re: Generation of dynamically-static documents
by tinita (Parson) on Sep 12, 2005 at 00:54 UTC
    I don't know if this will do what you want; I hacked together a quick and dirty script...
    Generate static pages "at will" (e.g., via cron job, or just when I run the command).
    Set $offline to 1 in the script, and it will generate the run mode's HTML file with every click on your local webserver.
    The only requirement is that you don't put the links in the templates directly, but hand them to the template module as objects (sketched below): $foo = "rm=foo"; bless(\$foo, "Static::Link");
    You might add a spider script that automatically calls every runmode like that.
    The script would probably also run with HTML::Template if you adjust the tmpl_vars.
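    The blessed link could look roughly like this -- the class name comes from the description above, the rest is guesswork about what the stringification would do:

        package Static::Link;
        use strict;
        use warnings;
        use overload '""' => \&as_string;

        # Flip to 1 when generating the static copy of the site.
        our $offline = 0;

        sub new {
            my ( $class, $query ) = @_;    # e.g. "rm=foo"
            return bless \$query, $class;
        }

        sub as_string {
            my $self = shift;
            my ($rm) = $$self =~ /rm=(\w+)/;
            return $offline ? "$rm.html" : "index.cgi?$$self";
        }

        1;

    So the template only ever sees the object; interpolating it gives "index.cgi?rm=foo" when $offline is 0 and "foo.html" when it is 1.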
Re: Generation of dynamically-static documents
by sgifford (Prior) on Sep 12, 2005 at 02:18 UTC
    Apache does a lot of its magic based on file extensions. If you can upload a .htaccess file to change the meaning of an extension, that might be a solution: create a new extension, say .page, and on your local system have it run a Perl script, while on the remote system it's just another name for text/html. That way your links are all the same on both systems.
Re: Generation of dynamically-static documents
by mattr (Curate) on Sep 12, 2005 at 08:18 UTC
    FWIW, I did something similar some years ago. Unfortunately I have been unable to find where I put it. I used it to automatically process 1000 photos from a CD using perl-fu and the Gimp, and then to generate the HTML. It made a 1000-page site, for a server with no database, showing the hotel rooms of different units in a hotel chain. The code was somewhat similar to what would happen if you ran live off a DB and used a template engine (I had HTML::Template). If you did the same, it would probably take you a week or two to write and then about 5 minutes to run. The nice thing was that you could slip new photos in and just run the thing again to rebuild the site. With all the Gimp photo windows opening up, highlighting being added and so on, it looked like an invisible Photoshop operator was going nuts, so I called it Magic Hands.

    So it's a lot of fun and people will freak when you show it to them, but I'd still recommend getting a server that has CGI and DB access. It isn't that expensive these days.

    If you already have a CGI-based system running and you want to go through every state like you mentioned, then maybe WWW::Mechanize or even wget could be your pal... Well, if I do find it I'll give you a copy, but I think it's been consigned to the data dungeons!
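    If the CGI version is reachable on a local web server, a one-level walk with WWW::Mechanize might look like this (the URL and the rm= link pattern are assumptions; a real spider would recurse):

        use strict;
        use warnings;
        use WWW::Mechanize;

        my $base = 'http://localhost/cgi-bin/index.cgi';    # assumed dev URL
        my $mech = WWW::Mechanize->new;

        $mech->get($base);
        save_page( $mech, 'index.html' );

        # Fetch every "index.cgi?rm=foo" link on the front page once,
        # saving it as foo.html.  (Only one level deep -- no recursion.)
        my %seen;
        for my $link ( $mech->find_all_links( url_regex => qr/\brm=\w+/ ) ) {
            my ($rm) = $link->url =~ /rm=(\w+)/;
            next if $seen{$rm}++;
            $mech->get( $link->url_abs );
            save_page( $mech, "$rm.html" );
        }

        sub save_page {
            my ( $mech, $file ) = @_;
            open my $out, '>', "site/$file" or die "$file: $!";
            print {$out} $mech->content;
            close $out;
        }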

Re: Generation of dynamically-static documents
by fmerges (Chaplain) on Sep 12, 2005 at 09:18 UTC

    Hi,

    You can also think about using Mason to generate static pages. The book is available online, and there is also a section describing how to do it.
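    Roughly, you point an HTML::Mason::Interp at a scalar and write the result out yourself; the component root and page names below are made up:

        use strict;
        use warnings;
        use HTML::Mason::Interp;

        # Render Mason components to static files (placeholder paths).
        my $buffer;
        my $interp = HTML::Mason::Interp->new(
            comp_root  => '/home/me/mason',    # must be an absolute path
            out_method => \$buffer,
        );

        for my $page (qw( index about events )) {
            $buffer = '';
            $interp->exec("/$page.html");

            open my $fh, '>', "site/$page.html" or die "$page: $!";
            print {$fh} $buffer;
            close $fh;
        }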

    Regards,

    |fire| at irc.freenode.net
Re: Generation of dynamically-static documents
by abell (Chaplain) on Sep 12, 2005 at 14:15 UTC
    I would set up the dynamic version on my local PC and, when needed, make a static copy of it with wget or some other recursive downloader and upload it to the server.

    Cheers

    Antonio


    The stupider the astronaut, the easier it is to win the trip to Vega - A. Tucket
      Specifically, with wget (if you're on Windows, grab a copy of Cygwin) it's as simple as:
      date=`date +%Y%m%d`
      mkdir static_cut_$date
      (cd static_cut_$date && wget -m -nH http://localwebserver/)
      --
      @/=map{[/./g]}qw/.h_nJ Xapou cets krht ele_ r_ra/; map{y/X_/\n /;print}map{pop@$_}@/for@/