tsk1979 has asked for the wisdom of the Perl Monks concerning the following question:
I have an input file of the form:

    Some garbage
    More garbage
    data -start <some string> \
     -intermediate <some string> \
     -intermediate <some string> \
     .
     .
     -end <some string>
    Some garbage
    More garbage
    data -start <some string> \
     -intermediate <some string> \
     -intermediate <some string> \
     .
     .
     -end <some string>
    .
    .
    .

I want the output file to contain:

    data -start <string> -end <string>
    data -start <string> -end <string>
    data -start <string> -end <string>
    .
    .
    .

The catch? After removing the intermediates there will be lots of duplicates, which I also want to remove. In my current flow I read in the file, write the records out to an array, and then unique the array. That two-pass process seems like a waste of time, so a one-pass algorithm would be great!
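One way to get a single pass is to build each collapsed record as its block is read and use a hash to skip records that have already been printed. The sketch below is only an illustration of that idea, not a tested or posted solution: it assumes the blocks look exactly like the sample above, that only the -start and -end values need to be kept, and that the input arrives on STDIN.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %seen;     # collapsed records already printed, for de-duplication
my $start;    # -start value of the block currently being read

while ( my $line = <STDIN> ) {
    chomp $line;

    if ( $line =~ /^data\s+-start\s+(.*?)\s*\\?\s*$/ ) {
        $start = $1;                                  # a new block begins
    }
    elsif ( defined $start && $line =~ /-end\s+(.*)$/ ) {
        my $record = "data -start $start -end $1";
        print "$record\n" unless $seen{$record}++;    # only first occurrence
        undef $start;                                 # block finished
    }
    # -intermediate lines and the garbage lines fall through and are ignored
}
```

Run it as `perl onepass.pl < input.txt > output.txt`. Because %seen is keyed on the collapsed record, the de-duplication happens during the same read that strips the intermediates, so the separate array pass goes away. If the output also needs to be sorted rather than kept in first-seen order, print `sort keys %seen` at the end instead of printing as you go.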
Re: Sorting and subsituting a data file, one pass
by ikegami (Patriarch) on Jun 21, 2010 at 06:05 UTC
by tsk1979 (Scribe) on Jun 21, 2010 at 09:38 UTC
by Anonymous Monk on Jun 21, 2010 at 10:42 UTC
Re: Sorting and subsituting a data file, one pass
by CountZero (Bishop) on Jun 21, 2010 at 06:27 UTC