Your question is a bit vague, so I'll assume you have all of the Big Boss's articles as text files. For the path of least resistance, I would do the task like this:
- All articles get named yyyy-mm-dd-Title_with_spaces_converted_to_underscores.txt
- A Perl script runs nightly (or on demand) and re-creates the HTML page by reading the article directory. Reverse alphabetical sorting automatically produces the order you want, because ISO dates sort lexically (no, you don't want to sort on file creation/modification time!).
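A quick throwaway demo of that sorting claim (the filenames here are made up): reverse lexical sort of ISO-dated names comes out newest-first, with no date parsing needed.

```shell
# Create three dummy articles in a temporary directory
demo=$(mktemp -d)
touch "$demo/2001-01-02-New_year_memo.txt" \
      "$demo/2001-03-15-Quarterly_report.txt" \
      "$demo/2001-12-24-Holiday_notice.txt"

# Reverse lexical sort puts the newest article first
ls "$demo" | sort -r
# 2001-12-24-Holiday_notice.txt
# 2001-03-15-Quarterly_report.txt
# 2001-01-02-New_year_memo.txt
```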
Here's some really simple code to get you started:
use strict;
use warnings;

my $article_directory = $ARGV[0] || ".";
my $base_url          = $ARGV[1] || "http://your.boss.net/BigBoss/articles/";

print "Reading articles from $article_directory\n";
opendir DIR, $article_directory
    or die "Couldn't find $article_directory: $!\n";
# Note the comma after the grep pattern; reverse sort gives newest-first
my @articles = reverse sort grep /\.txt$/, readdir DIR;
closedir DIR;
print scalar(@articles), " articles found.\n";

# Now we have all articles in order. Let's print them out:
open HTML, "> $article_directory/index.html"
    or die "Couldn't create index.html in $article_directory: $!\n";

# Change the HTML to your taste
print HTML "<html><body>\n";
foreach my $article (@articles) {
    print ".";
    my $title = $article;
    my $date  = "No date given";
    if ($article =~ /^(\d{4})-(\d{2})-(\d{2})-(.*)\.txt$/) {
        $date  = "$3.$2.$1";
        $title = $4;
        $title =~ tr/_/ /;    # turn underscores back into spaces
    }
    print HTML "$date : <a href='$base_url$article'>$title</a><br>\n";
}
print HTML "</body></html>\n";
close HTML;
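Assuming you save the script above as make_index.pl (the name is my choice, as are the example paths), you would run it by hand like this, and a crontab entry takes care of the nightly part:

```shell
# Run once by hand (directory and base URL are examples):
perl make_index.pl /var/www/BigBoss/articles "http://your.boss.net/BigBoss/articles/"

# Nightly at 3:00 AM via cron (add with crontab -e):
# 0 3 * * * perl /path/to/make_index.pl /var/www/BigBoss/articles
```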