Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi, I have a large file (sometimes more than a million lines). What I would like to do is divide it into chunks in order to update the database, which can't take all million updates in one go. So let's say I divide it into chunks of 1000 or 100 lines, update the database, check whether there is any problem (rolling back if so), and if not, continue. I would like to get some suggestions on how to divide the file into chunks. Thanks, Reva

Re: divide the large files to small
by toolic (Bishop) on Mar 10, 2010 at 17:19 UTC
Re: divide the large files to small
by JavaFan (Canon) on Mar 10, 2010 at 17:19 UTC
    I'd use the command line utility split. It's designed for that task - no need to reinvent the wheel.
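    For example, to break the file into 1000-line pieces (the file name and output prefix below are just placeholders):

        split -l 1000 bigfile.txt chunk_

    This writes the pieces as chunk_aa, chunk_ab, and so on.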

      One CAUTION: I don't think the command line utility split exists on all OSes (e.g., I'm not aware of it on MS OSes).

      But it certainly is easy enough to write a Perl script to either do the equivalent of split or to use toolic's solution in his referenced post.
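      A minimal sketch of such a script, assuming a placeholder input file input.txt and a chunk size of 1000 lines:

          use strict;
          use warnings;

          # Sketch: split input.txt into files of $chunk_size lines each,
          # named chunk_0000, chunk_0001, ... (names chosen for illustration).
          my $chunk_size = 1000;
          open my $in, '<', 'input.txt' or die "Cannot open input.txt: $!";

          my ( $line_no, $suffix, $out ) = ( 0, 0 );
          while ( my $line = <$in> ) {
              if ( $line_no % $chunk_size == 0 ) {    # time to start a new chunk file
                  close $out if $out;
                  my $name = sprintf 'chunk_%04d', $suffix++;
                  open $out, '>', $name or die "Cannot open $name: $!";
              }
              print {$out} $line;
              $line_no++;
          }
          close $out if $out;
          close $in;

      Given the original goal (chunked database updates), one could just as well skip the intermediate files entirely and commit a transaction every $chunk_size lines inside the same loop.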

      ack Albuquerque, NM
        One CAUTION: I don't think the command line utility split exists on all OSes (e.g., I'm not aware of it on MS OSes).
        That depends on what you mean by "existing". If you mean that split isn't available after installing an MS OS, you are probably right. But last time I looked, Perl didn't come with those OSes either. OTOH, I'd be extremely surprised if there weren't a Windows port of split. All common Unix tools have been ported to Windows.