Tanktalus has asked for the wisdom of the Perl Monks concerning the following question:
Being the resident computer geek in a local charitable organisation, I have been, well, volunteered to maintain our web site. It's really quite a trivial undertaking, nothing fancy, because we're not even paying for a regular site or anything. We're just using the space afforded by my ISP - which does not, to my knowledge, allow for CGI of any sort. This is not a huge deal because there is no (current) need for forms or anything. The content does change from time to time, though, and I'd like to automate generating it as much as possible.
So what I've done in the past is to create an HTML template (yes, using HTML::Template) for the header, the footer, and then the content of each page. The content for each page is based on a number of CSVs (which I access via DBD::CSV) that I pull together as needed. Then I run another Perl script to upload the results, via FTP, to the ISP's site. And it works. More or less.
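For the curious, the generation step looks roughly like this. The file and column names here are made up for illustration: a hypothetical templates/events.tmpl with a TMPL_LOOP named events, and an events.csv sitting under ./data.

#!/usr/bin/perl
use strict;
use warnings;

use DBI;
use HTML::Template;

# Treat the directory of CSV files as a database via DBD::CSV.
my $dbh = DBI->connect('dbi:CSV:f_dir=./data', undef, undef,
                       { RaiseError => 1 });

# Pull the rows for one page; 'events' maps to ./data/events.csv.
my $sth = $dbh->prepare('SELECT name, date, location FROM events');
$sth->execute;

my @events;
while (my $row = $sth->fetchrow_hashref) {
    push @events, $row;    # one hashref per <TMPL_LOOP> iteration
}

# Fill the page template and write out static HTML.
my $tmpl = HTML::Template->new(filename => 'templates/events.tmpl');
$tmpl->param(events => \@events);

open my $out, '>', 'site/events.html' or die "site/events.html: $!";
print {$out} $tmpl->output;
close $out;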
On sites where I can use CGI, I just throw in CGI::Application, inherit from there, and I'm basically done. Here, I need some way to loop through all the run modes, grab the output from each, and upload the results (along with any new non-generated data, such as PDFs, CSS, or images).
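To make the question concrete, here is a rough sketch of the sort of loop I mean, assuming a hypothetical CGI::Application subclass called MySite::App. Setting CGI_APP_RETURN_ONLY is a documented way to make run() return its output instead of printing it.

#!/usr/bin/perl
use strict;
use warnings;

use CGI;
use MySite::App;    # hypothetical CGI::Application subclass

# Make run() return headers+body rather than printing to STDOUT.
$ENV{CGI_APP_RETURN_ONLY} = 1;

# One throwaway instance just to discover the configured run modes.
my %modes = MySite::App->new->run_modes;

for my $mode (keys %modes) {
    # Fake a request selecting this run mode ('rm' is the default
    # mode parameter in CGI::Application).
    my $app    = MySite::App->new(QUERY => CGI->new({ rm => $mode }));
    my $output = $app->run;

    # run() returns the HTTP headers plus the body; keep the body.
    $output =~ s/\A.*?\r?\n\r?\n//s;

    # 'site/' is a made-up output directory for illustration.
    open my $fh, '>', "site/$mode.html" or die "site/$mode.html: $!";
    print {$fh} $output;
    close $fh;
}

The generated pages (plus the PDFs, CSS, and images) could then go up with Net::FTP, same as the existing upload script.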
What I have is more or less working, but I'm wondering whether any other monks have a better idea of how to handle this.