in reply to Slow file creator

You could create the entire file content in advance
and save (possibly many) system calls.

    my $fill_char = " ";
    my $string = $fill_char x $nr_of_bytes;
    open FW, ... ;
    print FW $string;
    close FW;
With this approach you do only one print, one open,
and one close system call. As system calls can take
up much time, this will be a lot faster. You also
give the Perl compiler a better chance of optimizing,
since you are not forcing it to run a loop.
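
For reference, a complete, runnable version of that
sketch could look like the following; the filename
temp.out and the one-megabyte size are placeholders,
not part of the original suggestion.

    use strict;
    use warnings;

    my $nr_of_bytes = 1_000_000;   # placeholder size: 1 MB
    my $fill_char   = " ";
    my $string      = $fill_char x $nr_of_bytes;

    # One open, one print, one close -- a handful of system
    # calls instead of one print per byte.
    open FW, "> temp.out" or die "Cannot open temp.out: $!";
    print FW $string;
    close FW;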

Also, if you are on a machine with a gigabyte
of RAM, who cares about your script taking one
MB for its job? If you're planning to use more
than 50 MB or so, you should check out the tools
that have been built for this kind of job. See the
'dd' manpage if you are on a *NIX box.
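
For instance, an invocation along these lines creates
a 50 MB file of NUL bytes (the output name bigfile and
the size are only illustrative; bs=1M is GNU dd syntax,
so on other systems you may need bs=1048576 instead):

    dd if=/dev/zero of=bigfile bs=1M count=50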

Re: Re: Slow file creator
by exussum0 (Vicar) on Feb 07, 2004 at 23:40 UTC
    The printing is buffered anyhow, so it's not like a loop would kill performance.

    Play that funky music white boy..
      But doing a print call per iteration still adds
      overhead, because the print buffer will be flushed
      much more often.
      use Benchmark::Timer;
      my $t = Benchmark::Timer->new();
      my $size = 1000000;

      $t->start( 'string' );
      open FW, "> temp.out";
      print FW " " x $size;
      close FW;
      $t->stop( 'string' );

      $t->start( 'loop' );
      open FW, "> temp.out";
      for ( 0..$size ) { print FW " " }
      close FW;
      $t->stop( 'loop' );

      $t->report;

      __END__
      Reports:
      1 trial of string (17.594ms total)
      1 trial of loop (733.321ms total)
      This clearly shows that printing in a loop, one
      character at a time, hurts performance.

      Update: I figured that Perl could create the string
      at compile time, so I put the size in a variable. But
      this yields the same results. Perl still optimizes
      the string version in this example. But that was
      also the point I made in the first reply.
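
      If you want to check the compile-time behaviour
      yourself, B::Deparse shows it: with a literal count
      the repetition is folded into a plain string at
      compile time, while with a variable it stays a
      run-time x. That supports the observation above that
      the speedup comes from the single print, not from
      constant folding. (The count of 5 is arbitrary, and
      the exact quoting in the output may vary between
      perl versions.)

          $ perl -MO=Deparse -e 'my $s = " " x 5'
          my $s = '     ';
          -e syntax OK

          $ perl -MO=Deparse -e 'my $n = 5; my $s = " " x $n'
          my $n = 5;
          my $s = ' ' x $n;
          -e syntax OK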