ahoriuch has asked for the wisdom of the Perl Monks concerning the following question:

Monks, I am having trouble reading a large (> 2GB) file. Google searches turn up reports of issues with newer versions of perl but no real solutions. It would be a pain to partition the file into segments, as I have many large files to parse. Any suggestions? I'm on v5.8.5.

open(IN, "zcat %s $opt_file | ") or die "#ERROR> Could not read $op +t_file\n"; EXEC> ${opt_file}.gz: Value too large for defined data type

Thanks in advance!

Replies are listed 'Best First'.
Re: Reading large files
by BrowserUk (Patriarch) on Mar 12, 2015 at 19:47 UTC

    That isn't a "perl problem".

    Your zcat binary isn't built to handle large (>2GB) files.
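
    A quick way to confirm it is to run zcat on the file outside of perl; if it fails with the same "Value too large" message on its own, perl never enters into it. One possible workaround (untested sketch, assuming a GNU gzip with large-file support is on the PATH) is to decompress with gzip -dc instead of zcat:

        # Untested sketch: pipe through "gzip -dc" rather than zcat; on some
        # systems the gzip binary is built with large-file support even when
        # the old zcat is not.
        open(IN, "gzip -dc $opt_file |")
            or die "#ERROR> Could not read $opt_file: $!\n";

        while (my $line = <IN>) {
            # ... process $line here ...
        }

        close(IN) or warn "#WARN> gzip -dc reported a problem for $opt_file\n";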


      You are absolutely right!!! I fixed the problem by switching to zgrep. Thanks!
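
      A zgrep-based open would presumably look something like this (untested sketch; an empty pattern makes zgrep pass every line through):

          # Untested sketch: zgrep with an empty pattern streams every line of
          # the compressed file, much like zcat would.
          open(IN, "zgrep '' $opt_file |")
              or die "#ERROR> Could not read $opt_file: $!\n";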
Re: Reading large files
by pvaldes (Chaplain) on Mar 12, 2015 at 20:41 UTC

    I understand from your example that these are compressed files with the .gz extension.

    Untested, but you could try the module IO::Uncompress::Gunzip instead. Your gunzip version should be > 1.2.4 in any case.
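
    A minimal sketch of that approach (untested; assumes a plain gzip stream and line-oriented data, with $opt_file as in the original post):

        use IO::Uncompress::Gunzip qw($GunzipError);

        # Untested sketch: read the compressed file line by line in pure perl,
        # without shelling out to zcat at all.
        my $z = IO::Uncompress::Gunzip->new($opt_file)
            or die "#ERROR> Could not read $opt_file: $GunzipError\n";

        while (my $line = $z->getline()) {
            # ... process $line here ...
        }

        $z->close();

    Whether that copes with >2GB of compressed input depends on how perl and zlib were built, so treat it as an alternative worth testing rather than a guaranteed fix.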