in reply to split large CSV file >9.1MB into files of equal size that can be opened in excel

You have a choice when splitting a file: split on its physical size, or split on its logical structure. If you wish to treat each half as its own unit, you have to split logically. Splitting blindly on size only works if you intend to reassemble the pieces later.

To split on logical units that closely approximate physical size, read half the physical file size, then continue forward until you hit the next logical break, and split there.
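A minimal sketch of that idea (in Python, for illustration; the filenames are examples, and "logical break" is assumed to be a newline, which holds for simple CSV without embedded newlines in quoted fields):

```python
import os

def split_at_midpoint(path, out1, out2):
    """Split a file into two near-halves, cutting at the first
    newline after the physical midpoint (illustrative sketch)."""
    half = os.path.getsize(path) // 2
    with open(path, "rb") as f:
        f.seek(half)      # jump to the physical midpoint
        f.readline()      # continue forward to the next logical break
        cut = f.tell()    # split here, on a record boundary
    with open(path, "rb") as src, \
         open(out1, "wb") as a, open(out2, "wb") as b:
        a.write(src.read(cut))
        b.write(src.read())
```

The `readline()` after the seek is what advances from an arbitrary byte position to the next record boundary; both halves end up within one record's length of equal size.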

If you choose to read half, then continue forward, you could scan byte by byte until you find a record separator. That's not the most efficient approach, though. It's often more efficient to read in, say, 4KB or 8KB segments, and then split the segment on the first record separator found. Once you perform that split, append the left-hand portion to the first document, and prepend the right-hand portion to the second document. Then read the entire remainder into the second document.
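The buffered variant described above might be sketched like this (Python again; the 8KB segment size, filenames, and newline separator are assumptions for illustration):

```python
import os

CHUNK = 8192  # scan in 8KB segments rather than byte by byte

def split_on_next_separator(path, out1, out2, sep=b"\n"):
    """Copy the first physical half, then scan forward one segment
    at a time for the first record separator (illustrative sketch)."""
    half = os.path.getsize(path) // 2
    with open(path, "rb") as src, \
         open(out1, "wb") as first, open(out2, "wb") as second:
        # Copy the first physical half verbatim into the first document.
        remaining = half
        while remaining:
            buf = src.read(min(CHUNK, remaining))
            if not buf:
                break
            first.write(buf)
            remaining -= len(buf)
        # Read segments until one contains the record separator.
        while True:
            buf = src.read(CHUNK)
            if not buf:                       # EOF before any separator
                break
            i = buf.find(sep)
            if i == -1:
                first.write(buf)              # still mid-record: keep appending
                continue
            first.write(buf[:i + len(sep)])   # left-hand side closes document 1
            second.write(buf[i + len(sep):])  # right-hand side opens document 2
            break
        # Stream the remainder of the file into the second document.
        while True:
            buf = src.read(CHUNK)
            if not buf:
                break
            second.write(buf)
```

Each segment is searched once for the separator, so at most one partially scanned segment is split by hand; everything else moves in bulk reads.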


Dave


Re^2: split large CSV file >9.1MB into files of equal size that can be opened in excel
by pryrt (Abbot) on Sep 28, 2016 at 23:52 UTC

    This would allow chunking my long-line scenario more equally. But it's a matter of whether it's tolerable to the OP to have an input record (line) split across multiple files: if it's tolerable, follow ++davido's advice; otherwise, you will be stuck with potential size imbalances in the output.

    And if you weren't expecting a size imbalance in the lines, you need to look at whatever's generating all_tags2.csv, and figure out why one line is significantly longer than the others.

      Correct, I think. If I was unclear on the tradeoffs, what I intended to convey is that splitting on a delimiter, even a delimiter near the middle of the file, will almost never result in both sides of the split being identical in size. If you're forced to split into identical sizes a file whose physical middle is not guaranteed to fall on a record boundary, then you must either lose record-oriented semantics, or accept that the record spanning the physical middle of the file will be broken.

      I suspect that in this individual's case, it's a reasonable tradeoff for each file to be only approximately equal in size, to the extent that retaining logical record integrity permits. That is, records are more important than exact size.


      Dave