spiderbo has asked for the wisdom of the Perl Monks concerning the following question:

Hi. Is it possible to write the contents of one file directly to another file, or do you need to read the contents of the first file into a variable and then write that variable to the second file? The size of the file that I am reading from (and then writing to a variable) is 21223. If either is possible, a) what is the syntax to read and write at the same time, and b) which is more economical?

Re: Write contents of a file, whilst reading from another file
by jasonk (Parson) on Apr 01, 2003 at 15:55 UTC

    You can read and write as many files as you want at the same time (within your OS's limits, of course).

    open(FILE1, "<input_file");
    open(FILE2, ">output_file");
    while(<FILE1>) {
        do_stuff();
        print FILE2;
    }
    close(FILE2);
    close(FILE1);

    In general, if you can process the file this way, it is better than reading the whole thing into memory (which, of course, takes a lot of memory for large files).
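
    For contrast, here is a minimal (untested) sketch of the slurp approach the question describes, reading the whole file into a variable first; the filenames are only placeholders:

        open(my $in,  '<', 'input_file')  or die "Can't open input_file: $!";
        open(my $out, '>', 'output_file') or die "Can't open output_file: $!";
        my $contents = do { local $/; <$in> };   # undef $/ = slurp the whole file
        print $out $contents;
        close($in);
        close($out);

    This is fine for a small file, but memory use grows with the size of the input.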


    We're not surrounded, we're in a target-rich environment!

      It'll be more efficient if you read the file in blocks instead of line-by-line:

      my $buf;
      while (my $bytes = read(FILE1, $buf, 4096)) {
          do_stuff();
          print FILE2 $buf;
      }

      Though it may not really matter, depending on the file size (the size given in the original post is small enough that you probably won't notice the difference in practice). Also, depending on what is put in do_stuff(), line-by-line might be more convenient.
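
      A complete, self-contained version of that block-copy loop might look like this (untested; the buffer size and filenames are placeholders):

          open(my $in,  '<', 'input_file')  or die "Can't open input_file: $!";
          open(my $out, '>', 'output_file') or die "Can't open output_file: $!";
          binmode($in);
          binmode($out);
          my $buf;
          while (read($in, $buf, 4096)) {
              # munge $buf here if needed
              print $out $buf;
          }
          close($in);
          close($out);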

      ----
      I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
      -- Schemer

      Note: All code is untested, unless otherwise stated

        hardburn,
        Just a minor clarification, as your post is correct but potentially misleading. You should check out Chromatic's reply to my post, which said basically the same thing as you.

        Perl does not stop reading the disk at every newline and go back out to it each time it encounters one. It reads into a buffer, and each readline is taken from that buffer; the disk is not read again until the buffer is exhausted. You are showing one way of changing the buffer size; I used a different method, $/ = \65536;. What data munging goes on between each readline, and how big the file is, determine whether you should use sysreads with a specified buffer as you have shown, change the input record separator $/, or use a combination of both.
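
        For illustration, a minimal (untested) sketch of the $/ approach; the record size and filenames are only placeholders:

            {
                local $/ = \65536;   # readline now returns fixed 64K records instead of lines
                open(my $in,  '<', 'input_file')  or die "Can't open input_file: $!";
                open(my $out, '>', 'output_file') or die "Can't open output_file: $!";
                while (my $chunk = <$in>) {
                    # munge $chunk here if needed
                    print $out $chunk;
                }
                close($in);
                close($out);
            }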

        The only reason I clarify here is that I can see some inexperienced monks taking while (<FILE>) {} to be inefficient, prematurely optimizing, and ending up back here with a headache, asking for advice.

        Cheers - L~R

Re: Write contents of a file, whilst reading from another file
by dga (Hermit) on Apr 01, 2003 at 16:00 UTC

    One way is to open 2 filehandles thus:

    use strict;
    open(my $reader, '<', 'file.to.read');
    open(my $writer, '>', 'file.to.write');   # will be truncated
    while(<$reader>) {
        # do desired mangling here
        print $writer "$_";
    }

    This will copy file.to.read into file.to.write unchanged.

    If you do stuff in the while loop where the comment is, and then print the variable you want into the new file, you have a filter, for example:
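
    Here is a small untested variation where the mangling simply uppercases every line (filenames as above):

        use strict;
        open(my $reader, '<', 'file.to.read')  or die "Can't read file.to.read: $!";
        open(my $writer, '>', 'file.to.write') or die "Can't write file.to.write: $!";
        while (my $line = <$reader>) {
            $line = uc $line;      # the "mangling": uppercase each line
            print $writer $line;
        }
        close($reader);
        close($writer);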

Re: Write contents of a file, whilst reading from another file
by artist (Parson) on Apr 01, 2003 at 22:24 UTC
    Do you mean to copy the file? Or, if not, tell us why copying doesn't work here.
    use File::Copy;
    copy($old_file, $new_file);
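
    copy() returns false on failure and sets $!, so it is worth checking the result, for example:

    use File::Copy;
    copy($old_file, $new_file) or die "Copy failed: $!";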

    artist