in reply to Re: processing huge files
in thread processing huge files
I'm trying:
the tie statement is straight out of the perldoc:

    tie my @array, 'Tie::File', $filename, mode => O_RDONLY;
    my $recs = @array;
    warn "we have $recs records to parse ";
    pop @array;
    foreach my $element ( @array ) {
        print "ok, we'd be updating $element \n";
    }
i need to skip the first line during processing since it's a "header" line.
the error that comes back is:

    we have 280579 records to parse at ./upd_res_data2.pl line 72, <FH> line 280579.
    Couldn't write record: Bad file descriptor at /usr/lib/perl5/5.8.0/Tie/File.pm line 665, <FH> line 280580.

and i don't think it *should* be writing, because it's (supposedly) opened read-only. any insight?
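For comparison, here is a minimal sketch of a purely read-only version (the $filename value is a stand-in for illustration; the real script presumably sets its own). Skipping the header by iterating from index 1, rather than calling pop or shift, means Tie::File never has to write the file back, which is the operation an O_RDONLY handle rejects:

    use strict;
    use warnings;
    use Fcntl 'O_RDONLY';
    use Tie::File;

    # stand-in filename for illustration; the real script sets its own
    my $filename = 'res_data.txt';

    tie my @array, 'Tie::File', $filename, mode => O_RDONLY
        or die "cannot tie $filename: $!";

    my $recs = @array;
    warn "we have $recs records to parse\n";

    # record 0 is the header, so start at index 1; iterating by index
    # (rather than pop/shift, which rewrite the file, or a slice, which
    # would pull every record into memory at once) leaves the file untouched
    foreach my $i ( 1 .. $#array ) {
        print "ok, we'd be updating $array[$i]\n";
    }

    untie @array;

For a file this size the index loop also matters for memory: Tie::File fetches only the records actually looked at, so the 280579 lines never have to be held in memory at once.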
Replies are listed 'Best First'.
Re^3: processing huge files
by izut (Chaplain) on Aug 02, 2005 at 19:31 UTC
by geektron (Curate) on Aug 02, 2005 at 21:44 UTC
by izut (Chaplain) on Aug 02, 2005 at 22:21 UTC