in reply to File Seeking

Well, I don't want to be critical, but this is a really inefficient way to go about it, and it's only going to get worse as your number of records increases. I would suggest moving your system to a database of one flavor or another. There are several very good open source ones (MySQL and PostgreSQL are two). What you seem to be doing is equivalent to a 'LIKE' operation in a database.
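For comparison, here's roughly what that lookup would look like in SQL. The table name `records` and column name `line` are made up for illustration; your real schema would differ:

    -- match any row whose text contains the search string,
    -- the same substring match the Perl loop performs
    SELECT line
    FROM   records
    WHERE  line LIKE '%some string%';

A database would also let you index the data so repeated lookups don't rescan the whole file.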

Aside from that: you're reading the entire contents of your data file into memory and then doing stuff with the matching rows. Depending on the length of each record, that could be a LOT of memory usage, and may be the source of your woes. Try this instead:

    use strict;      # this will help you catch a LOT of errors
    use warnings;

    my $tail = "some string";
    open(INF, "data.txt") or die "Couldn't open data file: $!";
    my $linenum = 0;
    while (<INF>) {
        print "$linenum\n";
        if ($_ =~ /\Q$tail\E/) {   # \Q..\E so metacharacters in $tail match literally
            # do stuff
        }
        $linenum++;
    }
    close(INF);
You have the same basic functionality, but you're only placing one row in memory at a time.
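To make that concrete, here's a self-contained sketch of the same streaming approach. The sample records and the search string are made up for illustration, and it writes its own throwaway data file (via the core File::Temp module) so you can run it anywhere:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# build a throwaway data file with a few sample records
my ($tmp, $file) = tempfile();
print $tmp "alpha|one\nbravo|two\nalpha|three\n";
close($tmp);

my $tail = "alpha";
my @matches;

open(my $inf, '<', $file) or die "Couldn't open data file: $!";
my $linenum = 0;
while (<$inf>) {
    # \Q..\E quotes any regex metacharacters in $tail
    push @matches, $linenum if /\Q$tail\E/;
    $linenum++;
}
close($inf);

print "@matches\n";   # lines 0 and 2 contain "alpha"
```

Only one line is ever held in memory; everything else is just the (presumably small) list of matching line numbers.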

HTH

/\/\averick
perl -l -e "eval pack('h*','072796e6470272f2c5f2c5166756279636b672');"