kripa0 has asked for the wisdom of the Perl Monks concerning the following question:

Hello all,

I need to process a 3GB text file, and create a 3GB output file. I use 5.005_02 on Solaris 5.7. I also tried out 5.6.0.

% unlimit
% perl -ne1 a.large.file
Can't open a.large.file: Value too large for defined data type.
% perl -e 'open FF, "a.large.file" or die; print 0 . <FF>;'
Died at -e line 1.
%
% perl -v | grep 5.6
This is perl, v5.6.0 built for sun4-solaris
% perl -V | grep -i large
useperlio=undef d_sfio=undef uselargefiles=define
Compile-time options: USE_LARGE_FILES
%

I haven't dealt with largefiles before, and would appreciate any info on how to read from and write to them. Thanks.

peace,
--{kr.pA}
--
Thanks to the next, so is this one. This sentence is self-referential.

Replies are listed 'Best First'.
Re: Help: Writing to largefiles
by Snuggle (Friar) on Aug 28, 2002 at 16:58 UTC
    Yeah, with Solaris there are problems reading files over 2GB on older versions of Perl (5.00*), check out this node for more on that.

    As far as general handling of large files goes, I would like to know what you are doing with the file. Following TMTOWTDI, the best approach depends on your task: processing one large file a single time to look for something calls for one approach, while running a small process over a large file many times calls for another. How about a little more info on your application?

    Anyway, no drug, not even alcohol, causes the fundamental ills of society. If we're looking for the source of our troubles, we shouldn't test people for drugs, we should test them for stupidity, ignorance, greed and love of power.

    --P. J. O'Rourke
Re: Help: Writing to largefiles
by Anonymous Monk on Aug 28, 2002 at 23:07 UTC
    If your Perl is built without large file support, the best option is to recompile it with large file support enabled.

    If you don't have root access and cannot get an administrator to fix that though, then you can use the old kludge of opening pipes to cat to and from large files. (This works because cat understands large files and Perl understands endless pipes.)
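    A minimal sketch of that cat-pipe kludge. To keep the example self-contained it creates a small throwaway input file first; the filenames and the uppercasing transform are just placeholders for your real files and processing:

    #!/usr/bin/perl
    use strict;

    # Create a small sample input file (stands in for your large file).
    open(my $seed, '>', 'in.txt') or die "can't create in.txt: $!\n";
    print $seed "hello\nworld\n";
    close $seed;

    # Read through cat, write through cat: cat handles the large file,
    # Perl only ever sees an ordinary pipe.
    open(IN,  "cat in.txt |")    or die "can't read via cat: $!\n";
    open(OUT, "| cat > out.txt") or die "can't write via cat: $!\n";
    while (<IN>) {
        print OUT uc $_;    # do stuff to each line here
    }
    close IN;
    close OUT or die "cat writer failed: $!\n";

    Note that close on the output pipe must be checked: it waits for cat to finish and reports its exit status.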

Re: Help: Writing to largefiles
by bunnyman (Hermit) on Apr 25, 2006 at 15:02 UTC
    I just had this same problem. The answer turned out to be that Perl was incorrectly compiled without "-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64". I was able to see this in the output from perl -V, in the ccflags value. Once I recompiled with a better compiler, the Configure script was able to add these for me, then 64 bit file access worked right.
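    Rather than grepping the full perl -V dump, you can ask Configure for just the relevant values directly (assuming perl is on your PATH); look for uselargefiles='define' and the two -D flags in ccflags:

    # Was this perl built with large file support?
    perl -V:uselargefiles
    # Do the compiler flags include -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64?
    perl -V:ccflags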
Re: Help: Writing to largefiles
by waswas-fng (Curate) on Aug 28, 2002 at 20:19 UTC
    The 5.6 version of perl should work; Solaris 7 (SunOS 5.7) has large file support and so does your perl. As far as how to process the file, we really need more info on what you are trying to do, but if you want to keep memory usage kinda low you can always:
    open (FILE, "a.large.file") or die "can't open a.large.file: $!\n";
    while (<FILE>) {
        # read line by line so as not to slurp all of the file into memory
        # do stuff to each line here
    }


    -Waswas
    PS: upgrade to 5.6.1, it has been much better for me than 5.6.0 was =)