This is on Win2k/ActivePerl 806 using a fast P4 with a 10k rpm hard drive and 1 GB of RAM, so the read is probably all from the disk cache. Check your code and make sure you aren't opening the output file for every line; I've seen people do that and slow things to a crawl.

use strict;
use Time::HiRes qw/ gettimeofday /;

my $starttime = gettimeofday;
open OUTFILE, ">file.txt" or die $!;
for (1 .. 554152) {
    print OUTFILE "X" x 100, "\n";
}
close OUTFILE;
print "Creating file took ", gettimeofday - $starttime, " seconds\n";

$starttime = gettimeofday;
open INFILE, "<file.txt" or die $!;
open OUTFILE, ">file1.txt" or die $!;
while (<INFILE>) {
    chomp;
    print OUTFILE join(",", /(.{1,10})/g), "\n";
}
close INFILE;
close OUTFILE;
print "Splitting file using regexp took ", gettimeofday - $starttime, " seconds\n";

__OUTPUT__
Creating file took 8.515625 seconds
Splitting file using regexp took 1.5 seconds
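For readers more comfortable in another language, the chunking step that the regexp performs can be sketched in Python; this is a rough analogue of the benchmark's inner loop, not the code above, and the function name is my own. `re.findall(r".{1,10}", line)` mirrors Perl's `/(.{1,10})/g`:

```python
import re

def split_into_chunks(line, width=10):
    """Split a line into comma-separated chunks of at most `width`
    characters, mirroring Perl's join(",", /(.{1,10})/g)."""
    pattern = r".{1,%d}" % width
    return ",".join(re.findall(pattern, line))

# A 100-character line splits cleanly into ten 10-character groups.
line = "X" * 100
print(split_into_chunks(line))

# Lines whose length is not a multiple of `width` leave a short tail.
print(split_into_chunks("abcdefghijkl"))  # → abcdefghij,kl
```

Because `.{1,10}` requires at least one character per match, no empty trailing field is produced, which is why the Perl version can `join` the captures directly.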
--
flounder
In reply to Re: What's the most efficient way to write out many lines of data?
by flounder99
in thread What's the most efficient way to write out many lines of data?
by Anonymous Monk