in reply to Largefile failure

sk suggested checking with ulimit:

ettest:/usr/local/tmp $ ulimit
unlimited
Lying scum. As BrowserUK pointed out, the failures are happening around the 4GB mark.
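
Given where it breaks, my guess (just an assumption at this point) is that this perl was built without large-file support. That's easy to check from the build configuration; on a largefile-capable build, uselargefiles comes back as 'define' and lseeksize as 8:

$ perl -V:uselargefiles
$ perl -V:lseeksize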

I wrote a bit of code to test seeking and reading:

#!/usr/bin/perl
# try seek to get around my problems with the 11G file
# $have value derived from the total size of the splits I've extracted
use strict;
use warnings;

my $have = 4912152576;

# open() failures are reported in $!, not $@
open BIG, "<", "QAPI.0.log" or die "cannot open QAPI.0.log: $!";

my $ret = sysseek BIG, $have, 0;    # whence 0 == SEEK_SET
print STDERR "sysseeked to $ret\n";

my $data;
$ret = sysread BIG, $data, 1024;
print STDERR "sysread $ret\n";
print $data;
And that fails.
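
In case it helps anyone hitting the same wall, a variant that checks the return values and reports $! should show exactly which call gives up and with what OS error (untested sketch, same file and offset as above):

#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(SEEK_SET);

my $have = 4912152576;

open my $big, '<', 'QAPI.0.log' or die "cannot open QAPI.0.log: $!";

# sysseek returns the new position ("0 but true" at offset zero),
# or undef on failure with the OS error left in $!
my $pos = sysseek $big, $have, SEEK_SET;
defined $pos or die "sysseek to $have failed: $!";
print STDERR "sysseeked to $pos\n";

# sysread returns the byte count (0 at EOF), or undef on error
my $data;
my $got = sysread $big, $data, 1024;
defined $got or die "sysread failed: $!";
print STDERR "sysread $got bytes\n";
print $data;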
Folks, thank you for the suggestions. I'm going to try to compress(1) the file. If that fails, I'll scrub it, redo the process, and regenerate the log as a series of smaller files. I can't take up any more of my employer's time on this one-off problem.
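
If the regeneration route wins, one approach (assuming the process producing the log can write to a pipe rather than a single file) is to filter its output through something like the sketch below, which rotates to a new piece every gigabyte so nothing gets anywhere near the 4GB mark. The QAPI.part naming and the 1GB limit are placeholders, not anything I've actually run:

#!/usr/bin/perl
# read the log stream on STDIN and rotate output files by size
use strict;
use warnings;

my $limit = 1024 * 1024 * 1024;    # 1GB per piece (placeholder)
my ($n, $written) = (0, 0);
my $out;

sub rotate {
    close $out if $out;
    my $name = sprintf "QAPI.part.%03d", $n++;
    open $out, '>', $name or die "cannot open $name: $!";
    $written = 0;
}

rotate();
while (my $got = sysread STDIN, my $buf, 65536) {
    rotate() if $written + $got > $limit;
    print {$out} $buf or die "write failed: $!";
    $written += $got;
}
close $out;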

Now to recreate it at home...

Be Appropriate && Follow Your Curiosity