in reply to Best way to read and process a text file with multiline records

Please suggest a fast way!
It's good that you showed your algorithm, but please also show the code you actually wrote that needs improvement.

Vivek
-- In accordance with the prarabdha of each, the One whose function it is to ordain makes each to act. What will not happen will never happen, whatever effort one may put forth. And what will happen will not fail to happen, however much one may seek to prevent it. This is certain. The part of wisdom therefore is to stay quiet.

My Code
by tsk1979 (Scribe) on May 22, 2009 at 12:07 UTC
    Below is the code I have currently settled on.
    The technique of reading a multiline record using $prevline and $prevline2 is not original; I saw a colleague of mine doing it.

    It is very simple and it works, and I am currently writing the complete script with all the processing.
    while (<INDATA>) {
        chomp;
        my $linenum = $.;
        if (/^data_raw/) {
            $indata{INFILE_RAW}{$linenum}{VALUE} = (split /\s+/, $prevline)[-1];
            $indata{INFILE_RAW}{$linenum}{GROUP} = (split /\s+/, $prevline2)[-1];
        }
        $prevline2 = $prevline;
        $prevline  = $_;
    }
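    A self-contained sketch of the same two-lines-of-lookback technique, runnable without a file. The sample records below are hypothetical; in the real script the lines would come from the INDATA filehandle and $. would supply the line number.

    ```perl
    use strict;
    use warnings;

    # Hypothetical sample input: each record is a group line, a value
    # line, then a data_raw marker line.
    my @lines = (
        "group ABC",
        "value 500",
        "data_raw",
        "group DEF",
        "value 800",
        "data_raw",
    );

    my (%indata, $prevline, $prevline2);
    my $linenum = 0;
    for (@lines) {
        chomp;
        $linenum++;
        if (/^data_raw/) {
            # The marker line itself carries no data; the two lines
            # seen just before it hold the value and the group.
            $indata{INFILE_RAW}{$linenum}{VALUE} = (split /\s+/, $prevline)[-1];
            $indata{INFILE_RAW}{$linenum}{GROUP} = (split /\s+/, $prevline2)[-1];
        }
        $prevline2 = $prevline;
        $prevline  = $_;
    }
    ```

    With this input, line 3 records VALUE 500 / GROUP ABC and line 6 records VALUE 800 / GROUP DEF, keyed by the line number of the data_raw line.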

    The main reason I like this approach is that I am reading multiple files, and the governing key across all of them is the line number. In the end the user will be presented with statistics like this:
    Group name | Total Value
    ABC        | 500
    DEF        | 800
    ...

    Another file I will read in contains just the line numbers of invalid data items. The user will then be shown a second table, identical to the one above but with the invalids removed, and a third table of the same form covering only the "invalids" file. Throughout, the key that identifies what to subtract is the line number: specifically, the line number of the data_raw line.
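    The reporting step described above can be sketched as follows. This is only an illustration under assumed data: the %indata structure and the invalid line numbers are hypothetical stand-ins for what the real script would parse from its input files; the line number is the join key between them.

    ```perl
    use strict;
    use warnings;

    # Hypothetical parsed data, keyed by the line number of each
    # data_raw line (as built by the reading loop above).
    my %indata = (
        INFILE_RAW => {
            3 => { GROUP => 'ABC', VALUE => 500 },
            6 => { GROUP => 'DEF', VALUE => 800 },
            9 => { GROUP => 'ABC', VALUE => 100 },
        },
    );

    # Line numbers that the (hypothetical) invalids file flags as bad.
    my %invalid = map { $_ => 1 } (9);

    my (%total, %valid_total, %invalid_total);
    for my $linenum (keys %{ $indata{INFILE_RAW} }) {
        my $rec = $indata{INFILE_RAW}{$linenum};
        $total{ $rec->{GROUP} } += $rec->{VALUE};
        if ($invalid{$linenum}) {
            $invalid_total{ $rec->{GROUP} } += $rec->{VALUE};
        } else {
            $valid_total{ $rec->{GROUP} } += $rec->{VALUE};
        }
    }

    # First table: all records, grouped by group name.
    printf "%-10s | %s\n", 'Group name', 'Total Value';
    printf "%-10s | %d\n", $_, $total{$_} for sort keys %total;
    ```

    The second table (invalids removed) would print %valid_total the same way, and the third table would print %invalid_total.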