in reply to Re^2: Ugly variable-length record handling
in thread Ugly variable-length record handling

Kinda reminds me of an IRS "magnetic tape" file...

You're probably better off going with a process-as-you-go model. I usually go out of my way to avoid temp files if I can, and I really like to go through a source file only once when possible.

When I suggested tell/seek, I was thinking you needed to see the end of a record to find its start, but that probably isn't the case here, so process-as-you-go is probably what you want.
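To illustrate, here's a minimal process-as-you-go sketch. The record format is invented for the example (the actual layout isn't shown in the thread): I assume each variable-length record starts with a "HDR" line followed by detail lines, so a record is complete as soon as the next header (or EOF) appears. One pass, no temp file:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical process-as-you-go loop: accumulate lines for the
# current record and hand the finished record to a callback the
# moment the next header line shows up.
sub process_records {
    my ($fh, $callback) = @_;
    my @current;
    while (my $line = <$fh>) {
        chomp $line;
        if ($line =~ /^HDR\b/ && @current) {
            $callback->(@current);    # flush the finished record
            @current = ();
        }
        push @current, $line;
    }
    $callback->(@current) if @current;    # last record at EOF
}
```

The callback can do whatever per-record work is needed and then discard the lines, so memory use stays proportional to one record rather than the whole file.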

-Paul


Replies are listed 'Best First'.
Re^4: Ugly variable-length record handling
by Melly (Chaplain) on Dec 28, 2006 at 15:02 UTC

    Me too (with regards to temp files) - the trouble is that NOT using a temp file would (afaik) double my memory requirements, to twice the size of the largest temp file.

    Basically, the output_data routine steps through the temp file line by line, pushing the data into various arrays, hashes, etc. - so not using a temp file would mean allocating memory to the raw data as well as to the various arrays and hashes that contain the processed data.
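    The thread doesn't show output_data itself, so this is only a hedged sketch of the shape described - stepping through the temp file line by line and pushing parsed fields into hashes and arrays. The comma-separated field layout and the %totals/@rows structures are invented for illustration:

    ```perl
    use strict;
    use warnings;

    # Hypothetical output_data-style routine: one pass over the temp
    # file, so the raw line and the processed structures never need to
    # coexist in memory at full size.
    sub output_data {
        my ($path) = @_;
        my (%totals, @rows);
        open my $fh, '<', $path or die "open $path: $!";
        while (my $line = <$fh>) {
            chomp $line;
            my ($key, $amount) = split /,/, $line, 2;
            $totals{$key} += $amount;          # aggregate per key
            push @rows, [ $key, $amount ];     # keep the detail rows
        }
        close $fh;
        return ( \%totals, \@rows );
    }
    ```

    Whether this reads from a temp file or directly from the source stream, the per-line memory cost is the same; the difference is only where the lines come from.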

    map{$a=1-$_/10;map{$d=$a;$e=$b=$_/20-2;map{($d,$e)=(2*$d*$e+$a,$e**2 -$d**2+$b);$c=$d**2+$e**2>4?$d=8:_}1..50;print$c}0..59;print$/}0..20
    Tom Melly, pm@tomandlu.co.uk