The for loop could be rewritten as:

    my $path = "http://www.perlmonks.org/?node_id=1044;next=";
    foreach my $i (0..20) {
        my $url = $path . 15 * $i;
        # ... fetch and save $url here ...
    }
Going over 300 doesn't error out; it returns an empty listing and prompts for a new node. So I'm not sure how you could use a while construct unless you checked the length of the returned page and saw that it was smaller than a page with listings.
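If you did want a while construct, here's a rough sketch along those lines; LWP::Simple and the 5000-character cutoff are my assumptions, and you'd have to tune that number against a page you know has listings:

    use strict;
    use warnings;
    use LWP::Simple qw(get);

    my $path   = "http://www.perlmonks.org/?node_id=1044;next=";
    my $offset = 0;
    while (1) {
        my $page = get( $path . $offset );
        last unless defined $page;          # fetch failed
        last if length($page) < 5000;       # too short to hold listings (made-up cutoff)
        # ... save or process $page here ...
        $offset += 15;
        sleep 2;    # be kind to the server
    }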
You may prefer the print format to the standard layout; just add displaytype to your URL:
http://www.perlmonks.org/?displaytype=print;node_id=1044;next=100
As the others have said, you shouldn't pound on the server, and you really should check that each fetch succeeded (is_success on the response, or at least a defined result from get($url)) before concatenating it to your file. And you're not going to get any of the "Read More..." text from these pages either.
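Something along these lines would cover both points, using the print-layout URL from above. It's only a sketch; the cufp.html filename, the two-second pause, and the switch to LWP::UserAgent are my own choices, not anything from your code:

    use strict;
    use warnings;
    use LWP::UserAgent;

    my $ua   = LWP::UserAgent->new;
    my $path = "http://www.perlmonks.org/?displaytype=print;node_id=1044;next=";

    open my $out, '>>', 'cufp.html' or die "Can't append to cufp.html: $!";
    foreach my $i (0..20) {
        my $res = $ua->get( $path . 15 * $i );
        # only append pages that actually came back OK
        print {$out} $res->decoded_content if $res->is_success;
        sleep 2;    # don't pound on the server
    }
    close $out;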
But it's interesting :-) I read through all the CUFP when I first discovered Perl Monks.
Cheers