in reply to Re^2: Read and write files concurrently in Perl
in thread Read and write files concurrently in Perl
Think of cassette tapes: if you record 3 songs and then want to add a new song at the beginning of the tape, you have to record the new song and then record all 3 of the original songs over again. The disk works the same way.
Far and away your best bet, if these files aren't huge, is to open the file for read, slurp it into an in-memory array, do what you want to that data, open a new file for write, write the data out, close it, and then delete the old file and rename the new file to the old file's name.
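A minimal sketch of that approach, assuming a file called data.txt and an edit that just inserts a line at the top (both are placeholders for your real file and logic):

```perl
use strict;
use warnings;

my $file = 'data.txt';    # placeholder name

# Slurp the whole file into memory as an array of lines.
open my $in, '<', $file or die "Cannot read $file: $!";
my @lines = <$in>;
close $in;

# Do whatever you want to the data in memory;
# here, insert a new line at the beginning.
unshift @lines, "new first line\n";

# Write the result to a new file...
open my $out, '>', "$file.new" or die "Cannot write $file.new: $!";
print {$out} @lines;
close $out or die "Cannot close $file.new: $!";

# ...then replace the original. On some systems rename() won't
# clobber an existing file, so delete the original first.
unlink $file or die "Cannot delete $file: $!";
rename "$file.new", $file or die "Cannot rename $file.new: $!";
```

Writing to a separate file first also means that a crash in the middle of the write leaves the original file untouched.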
Tie::File would do some of the grunt work for you, but this is essentially what it does under the hood, and it's not portable to Windows if that matters to you.
To be frank, you are in way over your head if you intend to write something that can insert things into an existing file by yourself. That is definitely not beginner stuff!
If the idea of reading the file into memory, working on it, and writing it back out doesn't work for you, then it is time to use a Perl DB module or Tie::File (there is a short sketch of the latter below).
Anyway, it works this way because the hardware works that way.
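For completeness, a hedged sketch of the Tie::File route (again with a placeholder file name and edit): the module presents the file as an ordinary Perl array and does the rewriting for you behind the scenes.

```perl
use strict;
use warnings;
use Tie::File;

# Tie the file to an array; each element is one record (line),
# without its trailing newline.
tie my @lines, 'Tie::File', 'data.txt'
    or die "Cannot tie data.txt: $!";

# Inserting at the front of the array inserts at the top of the file;
# Tie::File rewrites the file contents as needed.
unshift @lines, 'new first line';

untie @lines;
```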