in reply to Largefile failure
sk suggested checking with ulimit:

    ettest:/usr/local/tmp $ ulimit
    unlimited

Lying scum. As BrowserUK pointed out, the failures are happening around the 4GB mark.
I wrote a bit of code to test seeking and reading:

    #!/usr/bin/perl
    # try seek to get around my problems with 11G file
    # $have value derived from the total size of the splits I've extracted
    $have = 4912152576;
    open BIG, "<QAPI.0.log" or die "cannot open QAPI ... $!";
    $ret = sysseek BIG, $have, 0;
    print STDERR "sysseeked to $ret\n";
    $ret = sysread BIG, $data, 1024;
    print STDERR "sysread $ret\n";
    print $data;

And that fails. (Note: open failures are reported in $!, not $@; $@ is only set by eval.)
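One likely culprit for failures right around the 4GB mark is a perl that was built without large-file support, in which case seek offsets are limited to 32 bits regardless of what ulimit reports. A quick sketch of how to check, using the standard Config module (the exact values vary by build; `lseeksize` must be 8 for an 11G file to be addressable):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Config;

# A perl built without large-file support cannot address offsets
# beyond 32 bits, no matter what the OS or filesystem allow.
print "uselargefiles: ", (defined $Config{uselargefiles} ? $Config{uselargefiles} : 'undef'), "\n";
print "lseeksize:     $Config{lseeksize} bytes\n";
```

If `lseeksize` comes back as 4, the fix is a perl built with `-Duselargefiles` (the default on modern builds), not a shell-limit change.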
Now to recreate it at home...