I have run a number of these 'monster' processes. For example, last year I converted my complete MP3 collection to a different bitrate by resampling my FLAC files; the process took 3 weeks. My assumption is always that there will be a power cut at the worst possible moment, so I normally control the process with a loop like this (from memory):
    use strict;
    use warnings;
    use IO::File;

    my $results_dir = "results";
    mkdir $results_dir, 0755 if !-d $results_dir;

    foreach my $next_step (list_steps()) {
        # Skip steps already marked done by a previous run
        next if -r "$results_dir/$next_step.done";

        my $res = do_step($next_step);

        # Record completion so a restart picks up where we left off
        my $fh = IO::File->new(">$results_dir/$next_step.done")
            or die "Cannot write marker file: $!";
        print $fh $res;
        $fh->close();
    }
    print "All steps done\n";
Of course, I normally add the date and some other interesting details to the results files, along with a lot more error checking.
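The same marker-file pattern works at the shell level too. Here is a minimal sketch with hypothetical step names (step1..step3 stand in for whatever units of work your job has); each completed step leaves a `.done` file, so re-running the loop after a crash only redoes unfinished steps:

    results_dir=results
    mkdir -p "$results_dir"

    for step in step1 step2 step3; do
        # Skip steps already marked done by a previous run
        [ -r "$results_dir/$step.done" ] && continue

        echo "processing $step"        # stand-in for the real work

        # Record completion (here just a timestamp) for restartability
        date > "$results_dir/$step.done"
    done
    echo "All steps done"

Run the whole thing under `nohup ... &` (or inside `screen`/`tmux`) and it survives both logouts and, thanks to the markers, reboots.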
In reply to Re: running Perl scripts in the background by hawtin,
in thread running Perl scripts in the background by Anonymous Monk