If you make an edit near the middle that changes the length of a line, the rest of the file has to be rewritten. Writing a GB of data takes time, and nothing you can do will change that fact. If the files in question are on a shared drive, this will be slow, and you'll also want to be very careful about locking issues, or else you could find yourself losing edits.
I'd strongly suggest looking at the file format and deciding whether you can find some kind of "filler" to even things out. For instance, maybe the format allows for comments somewhere. That would let you rewrite the file once and then deal with it as a fixed-record-length format afterwards. Even with the complexity of having to sanity-check that lines appear to start where they should (someone might edit by hand...), this would make your life amazingly easier.
If you can't do that, then any solution that you come up with will invariably suck. But it won't be your fault, it will be a result of the artificial limitations that you have to live under. Not that that will make you feel much better when people complain...
In reply to Re: How to get fast random access to a large file? by tilly
in thread How to get fast random access to a large file? by gothic_mallard