in reply to Re: reading from a huge file
in thread reading from a huge file

Actually, unless your Perl is built with large file support (USE_LARGE_FILES in perl -V), you may not be able to deal with files larger than 4 GB. I think the OP's basically stuffed unless he can get another program to chop the file into smaller pieces, or install a version of Perl with large file support.
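
One quick way to check a given build, without scanning the full perl -V output, is to query the underlying Configure variable (uselargefiles) directly; it prints 'define' on a build with large file support:

    perl -V:uselargefiles
    uselargefiles='define';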

Replies are listed 'Best First'.
Re^3: reading from a huge file
by BrowserUk (Patriarch) on Mar 21, 2011 at 19:20 UTC
    I think the OP's basically stuffed ...

    This is FUD. Didn't you notice I said (>4GB)?

    Every version of Perl I've used on windows in the last 9+ years has been built with USE_LARGE_FILES.

    Here is perl processing a 12GB/141 million line file using ActiveState 5.10.1 in 88 seconds:

    C:\test>dir dna.txt
     Volume in drive C has no label.
     Volume Serial Number is 8C78-4B42

     Directory of C:\test

    08/11/2010  16:35    12,831,000,000 dna.txt
                   1 File(s)  12,831,000,000 bytes
                   0 Dir(s)  281,938,862,080 bytes free

    C:\test>perl -nE"}{say $." dna.txt
    141000000
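
    (For anyone puzzled by the one-liner: -n wraps the code in an implicit while (<>) { ... } loop, and the }{ trick closes that loop body early, so the say $. runs exactly once, after the last line has been read. Roughly, it expands to:)

        while (<>) {
        }
        {
            say $.;   # $. holds the current input line number, i.e. the total line count here
        }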

    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
      So you've always had Perls compiled with USE_LARGE_FILES. How is that relevant to the OP? And how is what I wrote "fear, uncertainty, and doubt"?
        How is that relevant to the OP?

        Because, by default, every windows perl build is configured with large file support. Building without it requires manual intervention, and there is no good reason to do so. And the vast majority of Perl installations on Windows are AS binary distributions, all of which come with large file support.

        So, whilst it isn't impossible for the OP to be using a perl build that has been deliberately crippled, the likelihood is small.

        This:

        I think the OP's basically stuffed unless he can get another program to chop the file into smaller pieces,

        Is FUD because:

        • it demonstrates a complete lack of understanding of Perl on Windows.
        • it assumes just about the most unlikely cause of the problem, based upon no more information than the mere mention of "huge file".
        • even if your assumption as to the cause of the problem turns out to be a lucky correct guess, you offer the very worst solution to that problem.

        Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
        "Science is about questioning the status quo. Questioning authority".
        In the absence of evidence, opinion is indistinguishable from prejudice.
Re^3: reading from a huge file
by esddew (Initiate) on Mar 21, 2011 at 19:38 UTC
    Thanks, that's helpful. Perl wasn't even on this server when this all started, so I've no idea what the systems guys even put on it. As I said in the original message, I don't actually have access to the box myself, which complicates matters.

    For what it's worth, the error that the person running the script reported to me in email was: Unable to open input file UDB_sessions01252011

    That leads me to believe that the code is failing at this point:

    ## Open input file
    open(INFILE, "$infile") || die "unable to open input file $infile\n";
    The code as written is pretty darn elementary. I am assuming that my user gave me the correct name of the file she was trying to use, though I could double-check that.

      Why would you not print the error message? (also, 3-arg open, lexical handle, yadda, etc)

      open my $fileHandle, '<', $infile or die "Unable to open $infile for read: $!\n";
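
      (For what it's worth, the three-argument form means the filename is never parsed for mode characters like '<' or '|', and a lexical handle is closed automatically when it goes out of scope.)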

        $! could be quite interesting to know. For all we know the OP's script may just not have permission to read the file.
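
        A minimal sketch of the kind of check that would narrow it down (the file name is taken from the OP's error report; the diagnostics are just illustrative):

            my $infile = 'UDB_sessions01252011';
            print "exists:   ", (-e $infile ? "yes" : "no"), "\n";
            print "readable: ", (-r $infile ? "yes" : "no"), "\n";
            open my $fh, '<', $infile
                or die "Unable to open $infile for read: $!\n";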