in reply to Re: processing huge files
in thread processing huge files

after reading the perldoc for Tie::File it sounded like a good idea™ ... but it's not behaving per docs.

I'm trying:

use Fcntl 'O_RDONLY';   # provides the O_RDONLY constant used below
use Tie::File;

tie my @array, 'Tie::File', $filename, mode => O_RDONLY;
my $recs = @array;
warn "we have $recs records to parse ";
pop @array;
foreach my $element ( @array ) {
    print "ok, we'd be updating $element \n";
}
the  tie statement is straight out of the perldoc.

i need to skip the first line during processing since it's a "header" line.

the error that comes back is:

we have 280579 records to parse at ./upd_res_data2.pl line 72, <FH> line 280579.
Couldn't write record: Bad file descriptor at /usr/lib/perl5/5.8.0/Tie/File.pm line 665, <FH> line 280580.
and i don't think it *should* be writing, because it's (supposedly) opened read-only. any insight?

Re^3: processing huge files
by izut (Chaplain) on Aug 02, 2005 at 19:31 UTC
    I tried:

    file.txt

    # header
    1
    2
    3
    4
    5
    6
    test.pl
    use Tie::File;
    tie @f, 'Tie::File', "file.txt" or die $!;
    foreach (@f[1..$#f]) {
        print $_, "\n";
    }
    Result is:
    1
    2
    3
    4
    5
    6


    Igor S. Lopes - izut
    surrender to perl. your code, your rules.
      yes, that works fine, but if you try to add the mode option, at least for me it ignored the read-only mode and threw the error.
        The cause of this error is that you opened the file as O_RDONLY. When you pop() the tied array, Tie::File tries to remove that line from the file too. Do not use pop() to skip the header line; iterate starting after the header, as in my first post, or use a regex to ignore unwanted lines.


        Igor S. Lopes - izut
        surrender to perl. your code, your rules.
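Putting the two replies together, a minimal self-contained sketch of the read-only approach looks like this. The file name and contents here are made up for demonstration (the script creates and removes its own sample file); the key point is iterating from index 1 rather than calling pop() on a read-only tie:

```perl
use strict;
use warnings;
use Fcntl 'O_RDONLY';   # O_RDONLY constant for Tie::File's mode option
use Tie::File;

# Build a small sample file (hypothetical name) with a header line.
my $filename = 'sample_data.txt';
open my $fh, '>', $filename or die "Cannot write $filename: $!";
print {$fh} "# header\n", "alpha\n", "beta\n", "gamma\n";
close $fh;

# Tie the file read-only. With O_RDONLY, any operation that would
# modify the array (pop, push, splice, ...) will fail.
tie my @lines, 'Tie::File', $filename, mode => O_RDONLY
    or die "Cannot tie $filename: $!";

# Skip the header by starting at index 1 instead of pop()ing it:
# pop() would try to rewrite the file, which is what triggered
# the "Bad file descriptor" error above.
my @processed;
for my $i (1 .. $#lines) {
    push @processed, $lines[$i];
    print "processing: $lines[$i]\n";
}

untie @lines;
unlink $filename;
```

This processes alpha, beta, and gamma while leaving both the header and the file itself untouched.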