rdww has asked for the wisdom of the Perl Monks concerning the following question:

I am trying to stat a file that is 4.9GB in size and it fails. I tried different file test operators and stat() itself. Once I realized the file was 4.9GB, I wrote a small C program to stat the file, and the stat function failed with errno EOVERFLOW. I modified the C code to use stat64 and it works correctly. Is there a stat64 function in Perl, or a workaround?
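Boiled down, this is roughly what I am doing (the path here is just an example):

    #!/usr/bin/perl -w
    use strict;
    use Errno qw(EOVERFLOW);    # Errno should know EOVERFLOW on Solaris

    my $big = '/nfs/data/bigfile.dat';   # example path; the real file is 4.9GB

    my @st = stat($big);
    if (@st) {
        print "size: $st[7] bytes\n";
    }
    elsif ($! == EOVERFLOW) {
        # same errno the C stat() call reported
        print "stat: EOVERFLOW - the size does not fit in a 32-bit off_t\n";
    }
    else {
        print "stat failed: $!\n";
    }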
My perl version is 5.005_03, and the operating system is Sun Solaris 2.6 on the sun4u architecture. The file in question sits on an NFS server.
Thanks Robert Walkup rdww@ti.com

Replies are listed 'Best First'.
Re: stat and a 2GB+ file
by Abigail-II (Bishop) on Jun 13, 2002 at 14:14 UTC
    You need a perl that has "large file" support. Perl 5.6.0 and later tries to detect whether your system supports large files and, if so, enables that support by default when perl is compiled. You might also need 64-bit integer support to report large file sizes, but I'm not sure about that.

    You'll need to upgrade your perl.
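    Once you have a newer perl, you can confirm that it really was built with large file support by looking at its build configuration, either with perl -V:uselargefiles on the command line or via the Config module. A small sketch (these keys exist from 5.6.0 onwards):

        use Config;
        # 'define' means large file support was compiled in; lseeksize 8 means
        # 64-bit file offsets; ivsize 8 means perl's own integers are 64-bit.
        print "uselargefiles: $Config{uselargefiles}\n";
        print "lseeksize:     $Config{lseeksize}\n";
        print "ivsize:        $Config{ivsize}\n";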

    Abigail

(wil) Re: stat and a 2GB+ file
by wil (Priest) on Jun 13, 2002 at 13:52 UTC
    I believe this is a known limitation in the way older Perls handle large (2GB and over) files. Upgrading to Perl 5.6 (or later) should fix it, as I believe most of the relevant system calls in Perl 5.6 switched to their 64-bit variants.
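    Until you can upgrade, one possible stopgap (just a sketch, and only if you trust the file names involved, since nothing is quoted here) is to get the size from an external tool instead of stat, for example by parsing ls -l output:

        # Stopgap for a perl without large file support: let /bin/ls report
        # the size. It comes back as a decimal string, so no 32-bit stat
        # structure is involved and a 4.9GB value is fine.
        my $big  = '/nfs/data/bigfile.dat';   # example path
        my @ls   = split ' ', `ls -l $big`;
        my $size = $ls[4];                    # fifth field of "ls -l" is the size
        print "size of $big: $size bytes\n";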

    Hope this helps.

    - wil
Re: stat and a 2GB+ file
by Snuggle (Friar) on Jun 13, 2002 at 14:14 UTC
    Yeah, I have definitely heard of this problem before. Trouble always seems to arise around 2GB, and it seems to happen exclusively to Solaris users with older perl versions. An upgrade is definitely in order.
    Anyway, no drug, not even alcohol, causes the fundamental ills of society. If we're looking for the source of our troubles, we shouldn't test people for drugs, we should test them for stupidity, ignorance, greed and love of power.

    --P. J. O'Rourke
      Your "exclusively" is a guess that you made up. Why are you presenting it as if it were authoritative?

      Here is my guess about why you might believe that.

      Solaris has had 64-bit support for a long time, people don't upgrade Perl rapidly there, and people often use Solaris for dealing with large amounts of data. So this combination arises relatively often.

      It does not happen as much on Windows because most people don't throw around files over 2GB on Windows, and those who do are usually running a relatively recent system. It does not happen as much on Linux because Linux added support for files over 2GB relatively recently, so if your kernel is recent enough to handle such files, you probably installed a recent version of Perl around the same time. It does not happen as much on FreeBSD simply because there are fewer FreeBSD users.

      However, on any OS, a Perl built without large file support will fail when asked to deal with large files.