jrich has asked for the wisdom of the Perl Monks concerning the following question:

Getting error 79 (Value too large for defined data type) when trying to open() a file from a perl script on Solaris. When I originally got the error, I did some research and found I was hitting the 2GB file-size limit: I was using an old version of perl that was not built with largefile support...
Characteristics of this binary (from libperl):
  Built under solaris
  Compiled at Dec 22 1999 00:00:57
  @INC:
    /usr/perl5/5.00503/sun4-solaris
    /usr/perl5/5.00503
    /usr/perl5/site_perl/5.005/sun4-solaris
    /usr/perl5/site_perl/5.005
So I tried using a newer version of perl...
Compiler:
  cc='cc', ccflags='-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64', optimize='-O', cppflags=''
  ccversion='WorkShop Compilers 4.2 30 Oct 1996 C 4.2', gccversion='', gccosandvers=''
  intsize=4, longsize=4, ptrsize=4, doublesize=8, byteorder=4321
  d_longlong=define, longlongsize=8, d_longdbl=define, longdblsize=16
  ivtype='long', ivsize=4, nvtype='double', nvsize=8, Off_t='off_t', lseeksize=4
  alignbytes=8, prototype=define
Linker and Libraries:
  ld='cc', ldflags=' '
  libpth=/usr/lib /lib
  libs=-lsocket -lnsl -ldl -lm -lc
  perllibs=-lsocket -lnsl -ldl -lm -lc
  libc=/lib/libc.so, so=so, useshrplib=true, libperl=libperl.so
  gnulibc_version=''
Dynamic Linking:
  dlsrc=dl_dlopen.xs, dlext=so, d_dlsymun=undef, ccdlflags=' -R /opt/perl/5.8.0/lib/sun4-solaris/CORE'
  cccdlflags='-KPIC', lddlflags='-G'
Characteristics of this binary (from libperl):
  Compile-time options: USE_LARGE_FILES
  Built under solaris
  Compiled at Jan 14 2003 23:31:22
  @INC:
    /opt/perl/5.8.0/lib/sun4-solaris
    /opt/perl/5.8.0/lib
    /opt/perl/5.8.0/lib/sun4-solaris
    /opt/perl/5.8.0/lib
    /opt/perl/5.8.0/lib
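Before digging further, it's worth confirming at runtime which perl binary the script is actually executing and how that binary was built; a minimal sketch (the printed values are just what to look for, not guaranteed on any given box):

```perl
#!/usr/bin/perl
# Print this perl's largefile-related build settings.
use strict;
use Config;

print "perl binary:   $^X (version $])\n";
print "uselargefiles: ",
    (defined $Config{uselargefiles} ? $Config{uselargefiles} : 'undef'), "\n";
print "lseeksize:     $Config{lseeksize}\n";   # 8 means 64-bit seek offsets
print "ccflags:       $Config{ccflags}\n";
```

If lseeksize is still 4 at runtime, the script may be picking up the old 5.00503 binary (e.g. via a /usr/bin/perl shebang line) rather than the 5.8.0 build.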
but I'm still getting the same error. Here is the truss output...
brk(0x00052A50)                           = 0
read(3, 0x000352EC, 8192)                 = 0
llseek(3, 0, SEEK_CUR)                    = 17947
close(3)                                  = 0
getcontext(0xFFBEE970)
open("main.sort", O_RDONLY)               Err#79 EOVERFLOW
Cannot open main feed file main.sort at john.pl line 306.
write(2, " C a n n o t   o p e n ".., 58) = 58
getcontext(0xFFBEE6A8)
setcontext(0xFFBEE6A8)
getcontext(0xFFBEE910)
llseek(0, 0, SEEK_CUR)                    = 1798918
_exit(79)
Anybody have any ideas? Thanks.

Replies are listed 'Best First'.
Re: Problem opening large file
by dwm042 (Priest) on Jun 24, 2008 at 20:16 UTC
    Just as an FYI, there can be more than one kind of 2 GB limit on Solaris. Certain versions of vxfs won't handle large files unless they are explicitly set to do so.

Re: Problem opening large file
by pc88mxer (Vicar) on Jun 24, 2008 at 20:03 UTC
    Try opening the file with sysopen(). According to this page, that will open the file with the O_LARGEFILE flag, which should solve your problem.
Re: Problem opening large file
by jrich (Initiate) on Jun 24, 2008 at 21:55 UTC
    OK, I tried the sysopen call and got the same error. I even tried explicitly passing the O_LARGEFILE flag, like...
    sysopen(MAINFEED, $main_file, O_RDONLY | O_LARGEFILE) or die "Cannot open main feed file $main_file";
    As for the file system, it is vxfs but it looks like largefile support is enabled...
    grep AR2 mnttab
    /dev/vx/dsk/optdg/AR2  /AR2  vxfs  rw,suid,delaylog,largefiles,ioerror=mwdisable,dev=449c138  1200983054
    grep AR2 vfstab
    /dev/vx/dsk/optdg/AR2  /dev/vx/rdsk/optdg/AR2  /AR2  vxfs  1  yes  -
    Any other ideas? Thanks for the suggestions so far.
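    One more check worth running from the failing script itself: whether the perl executing it actually exports O_LARGEFILE from Fcntl at all. A hedged sketch (on a pre-largefile build the constant may simply not exist, and the eval catches that):

```perl
#!/usr/bin/perl
use strict;
use warnings;

print "Running under: $^X (perl $])\n";

# O_LARGEFILE is only usable if this perl's Fcntl knows about it.
my $flag = eval {
    require Fcntl;
    Fcntl->import('O_LARGEFILE');
    O_LARGEFILE();
};
if (defined $flag) {
    print "O_LARGEFILE is available (value $flag)\n";
} else {
    print "O_LARGEFILE is NOT available: $@";
}
```

    If the constant is missing, the script is almost certainly running under the old 5.00503 binary regardless of which perl you installed.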
      If it's vxfs, the true test of how the file system is currently configured is the fsadm command (look in /opt/VRTSvxfs/sbin for it). As an example:

      ./fsadm /foo nolargefiles
Re: Problem opening large file
by Tanktalus (Canon) on Jun 25, 2008 at 22:31 UTC

    Hmmm... TMTOWTDI: opening files. There are a few methods in that thread that may let you read this file - even the first, most naive example (i.e., mine), which uses FTP (assuming the ftp server doesn't have the same issue). Better, though, are the pipes from cat. (The entire thread still gives me chuckles...)
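    A minimal sketch of the pipe-from-cat idea (the filename is the one from the original post; adjust to taste): an external cat on a largefile-aware Solaris does the open() with 64-bit offsets, so perl only ever reads a pipe and its own off_t size never comes into play. The list form of open requires perl 5.8, which matches the build shown above.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical path from the original post; pass another file as an argument.
my $main_file = shift(@ARGV) || 'main.sort';

# Let /bin/cat do the open(); perl just consumes the pipe.
open(my $fh, '-|', 'cat', $main_file)
    or die "Cannot fork cat for $main_file: $!";

my $lines = 0;
while (my $record = <$fh>) {
    $lines++;          # process each record here instead of just counting
}
close($fh) or warn "cat exited non-zero for $main_file\n";
print "Read $lines lines from $main_file\n";
```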