in reply to Re: Iterating through HUGE FILES
in thread Iterating through HUGE FILES

It is ActiveState's Perl; I think it was not compiled with that parameter. Thank you.

Re^3: Iterating through HUGE FILES
by BrowserUk (Patriarch) on Feb 19, 2006 at 06:26 UTC

    Which version of AS Perl? It must be pretty ancient, as the last 7 or 8 versions (at least) have been built with large file support. On Win32, anyway; it's easy to forget that they also produce binaries for other OSes.
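
    For the record, you can ask a given perl whether it was built with large file support by checking its configuration. A minimal check (the Config keys are standard, but the values reported will of course depend on your particular build):

    # Report the large-file-related build settings of the running perl.
    # On a build with large file support, uselargefiles is 'define'
    # and lseeksize is 8 (64-bit seek offsets).
    use Config;
    print "uselargefiles: $Config{uselargefiles}\n";
    print "lseeksize:     $Config{lseeksize}\n";

    # Or, straight from the command line:
    #   perl -V:uselargefiles -V:lseeksize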

    If you cannot upgrade for any reason, then I second the idea of using a system utility to read the file and pipe it into your script. I'd probably do it using the 'piped open'. If you need to re-write the data, send it to stdout and redirect the output via the command line.

    die "You didn't redirect the output" if -t STDOUT; open BIGFILE, "cmd/c type \path\to\bigfile |" or die $!; while( <BIGFILE> ) { ## do stuff } close BIGFILE; __END__ script bigfile.dat > modified.dat

    Dying if STDOUT hasn't been re-directed is a touch that you'll appreciate after the first time you print a huge binary file to the console by accident. The bells! The bells! :)
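
    If you prefer lexical filehandles, the same idea looks something like this (a sketch only, under the same assumptions as above; the path and the "do stuff" step are placeholders for your real processing):

    die "You didn't redirect the output" if -t STDOUT;

    # Let the external utility do the reading and pipe its output to us.
    open my $bigfile, 'cmd /c type \path\to\bigfile |'
        or die "Can't start reader: $!";

    while ( my $line = <$bigfile> ) {
        ## do stuff with $line, then pass it along to the redirected output
        print $line;
    }

    close $bigfile or die "Reader failed: $!";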


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.