"I've been crafting workarounds which open new files every time you approach 2GB, but that's getting tiresome."

You may want to look at my module File::LinearRaid, which lets you access multiple (sequential) files seamlessly through a single filehandle. It was conceived to help overcome OS filesize limitations (among other things).
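Roughly, usage looks like this (a minimal sketch; the chunk filenames and sizes are invented for illustration):

    use File::LinearRaid;

    # One logical filehandle spanning several sequential physical files.
    my $fh = File::LinearRaid->new( "+<",
        "data.0" => 1_000_000_000,
        "data.1" => 1_000_000_000,
    );

    # Ordinary filehandle operations address the logical file:
    seek $fh, 1_500_000_000, 0;   # lands at offset 500_000_000 inside data.1
    read $fh, my $buf, 4096;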
One of the ideas I had with F::LR was that you could split an enormous logical file into reasonably-sized physical files and use BigInts as (logical) seek offsets. Since each logical offset resolves to a physical file plus a small offset within it, the underlying (physical) seeks would still be "reasonably" sized, so it should work... in theory! Unfortunately, I'm still stumped as to how to test this out. In fact, what I just outlined may even work in the module's current state -- I just don't know.
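The offset arithmetic itself is straightforward; the untested part is whether a BigInt survives the trip through seek(). A sketch of the mapping (the chunk size here is invented):

    use Math::BigInt;

    my $chunk_size     = Math::BigInt->new("2000000000");      # ~2GB per physical file
    my $logical_offset = Math::BigInt->new("9876543210123");   # far beyond 2**32

    # bdiv in list context returns (quotient, remainder);
    # copy() keeps $logical_offset intact.
    my ($chunk, $local) = $logical_offset->copy->bdiv($chunk_size);

    # $local always fits in a native integer, so the physical seek
    # within chunk number $chunk stays "reasonably" sized:
    printf "chunk %s, local offset %s\n", $chunk, $local;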
Also, right now there is no mechanism to automatically grow the logical file, although there is a manual mechanism to append additional physical files to it, as sketched below.
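If memory serves, appending another chunk by hand looks like this (a sketch only; the method name and calling convention are from my reading of the POD, and the filename/size are invented):

    # Manually grow the logical file by one more physical chunk.
    $fh->append( "data.2" => 1_000_000_000 );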
Anyway, if you think this module could work for you, let me know. I'd be happy to hear your feedback and suggestions.
blokhead
In reply to Re^3: File Size limits
by blokhead
in thread File Size limits
by creamygoodness