in reply to Problem truncating a huge file

There are several possible issues. Try creating a smaller file (e.g. 500 MB) and truncating it to something smaller still (e.g. 300 MB). If that works, then you've got a 32-bit issue.
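A quick test along these lines should do; this is only a sketch with a made-up file name, and it creates the test file by seeking near the end and writing a single byte (sparse on most filesystems), so it doesn't take long to run:

use strict;
use warnings;

# Made-up file name; put it on a filesystem with a few hundred MB free.
my $file = 'truncate_test.bin';

# Create a ~500 MB file by seeking near the end and writing one byte;
# on most filesystems this produces a sparse file, so it is quick.
open my $fh, '>', $file or die "Cannot create $file: $!";
binmode $fh;
seek $fh, 500 * 1024 * 1024 - 1, 0 or die "seek failed: $!";
print {$fh} "\0" or die "write failed: $!";
close $fh or die "close failed: $!";

# Now try the same call that fails on the huge file.
truncate $file, 300 * 1024 * 1024 or die "truncate failed: $!";

print "truncate worked, size is now ", -s $file, " bytes\n";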

Your next step should be to run the following program:

use Config; print "$Config{uselargefiles}\n";
If that prints "define" (i.e. uselargefiles is defined, meaning your Perl was built with large file support), then you've found a Perl bug and should file a perlbug report with an explanation and sample code that reproduces it. (I would suggest sample code that writes the file that you then fail to truncate.)
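A reproduction for the bug report could look roughly like this; the file name and sizes are made up, and you need a filesystem with room for (and support for) a file over 4 GB:

use strict;
use warnings;

# Made-up name and sizes; needs a filesystem that allows files over 4 GB.
my $file = 'truncate_repro.bin';
my $big  = 5 * 1024 * 1024 * 1024;    # ~5 GB
my $want = 3 * 1024 * 1024 * 1024;    # ~3 GB, still past the 32-bit limit

# Write the big file sparsely: seek near the end and write a single byte.
# If your Perl really has large file support, this part should succeed.
open my $fh, '>', $file or die "Cannot create $file: $!";
binmode $fh;
seek $fh, $big - 1, 0 or die "seek failed: $!";
print {$fh} "\0" or die "write failed: $!";
close $fh or die "close failed: $!";

print "created ", -s $file, " byte file\n";

# This is the call that should work but, per the report, does not.
truncate $file, $want or die "truncate failed: $!";

print "truncated to ", -s $file, " bytes\n";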

If uselargefiles is not defined, then it would be best to install a version of Perl that can handle large files. If you want a workaround, rename the large file to something else, then do a read/write loop that writes a copy stopping at the desired size, and delete the original afterwards. Be warned that this workaround will not be as fast as the native truncate.
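A minimal sketch of that workaround, assuming hypothetical file names and a made-up target size (set $want to whatever you would have passed to truncate):

use strict;
use warnings;

# Hypothetical names and size; adjust to your case.
my $original = 'huge_file.dat';
my $backup   = 'huge_file.dat.orig';
my $want     = 3 * 1024 * 1024 * 1024;

rename $original, $backup or die "rename failed: $!";

open my $in,  '<', $backup   or die "Cannot read $backup: $!";
open my $out, '>', $original or die "Cannot write $original: $!";
binmode $in;
binmode $out;

# Copy in 1 MB chunks until the desired number of bytes has been written.
my $left = $want;
while ($left > 0) {
    my $chunk = $left < 1024 * 1024 ? $left : 1024 * 1024;
    my $got   = read $in, my $buf, $chunk;
    die "read failed: $!" unless defined $got;
    last if $got == 0;                 # original was shorter than $want
    print {$out} $buf or die "write failed: $!";
    $left -= $got;
}

close $in;
close $out or die "close failed: $!";

unlink $backup or die "unlink failed: $!";    # delete the original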

Re^2: Problem truncating a huge file
by abhishek_akj (Initiate) on Aug 27, 2009 at 11:43 UTC
    I don't think the size of the file is an issue here, as the same script is reading and writing to the file perfectly. It is just truncate which throws this error.
      The file is large enough that 32-bit interfaces are going to have trouble. When you call truncate, you're hitting a different C function than normal reading and writing does, and it is seldom used. Is it really beyond the bounds of possibility that this case got missed?

      Given that it is easy to check, it is at least worth testing the possibility, even if you think the theory is unlikely.