in reply to Readdir against large number of files

I also fail to see obvious memory leak issues, so maybe it's time for some paranoia:
  1. use strict; use warnings;
  2. protect against pathological filenames and use the 3-argument form of
    open(UNMODIFIED,"<","$processedDirPath/$matchingFiles")
    (good style, but unlikely to be the problem; see the sketch after this list)
  3. my bet: add a size check like
    -f "$processedDirPath/$matchingFiles" and -s _ < 40960 or do{warn "size or type problem with $processedDirPath/$matchingFiles\n"; next};
  4. idle curiosity: is it certain that STRUC20 never occurs again, alone or as part of something else, after the first occurrence?
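
A minimal, untested sketch of points 1-3 combined; $processedDirPath and @matchingFiles are stand-ins for whatever your readdir loop actually provides:

    use strict;
    use warnings;

    # hypothetical stand-ins for the variables in the original code
    my $processedDirPath = '/path/to/processed';
    my @matchingFiles    = ('file1.rtsd001');

    for my $matchingFiles (@matchingFiles) {
        my $path = "$processedDirPath/$matchingFiles";

        # point 3: skip anything that is not a plain file below ~40 KB
        -f $path and -s _ < 40960
            or do { warn "size or type problem with $path\n"; next };

        # point 2: 3-argument open with a lexical filehandle
        open( my $UNMODIFIED, "<", $path )
            or do { warn "cannot open $path: $!\n"; next };

        while ( my $line = <$UNMODIFIED> ) {
            # ... your existing per-line processing goes here ...
        }
        close $UNMODIFIED;
    }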
cu & HTH, Peter -- hints may be untested unless stated otherwise; use with caution & understanding.

Update:

Re^2: Readdir against large number of files
by learningperl01 (Beadle) on Oct 28, 2009 at 17:53 UTC
    Thanks for the help. I gave your code a try, but it is only printing the line DATA. What I was looking to do is print all lines in the file after STRUC20. Also, all files contain the line STRUC20 only once, and they will all contain that line.
    Thanks again for all the help.

    Example file contents of file1.rtsd001

        date time blah blah
        test test
        STRUC20    #Code should look for this line and print everything after it.
        need lines
        need lines
        print these lines
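
    For the behaviour described above (print every line that comes after the first line containing STRUC20), a minimal, untested sketch could look like the following; the file name is only for illustration:

        use strict;
        use warnings;

        my $file = 'file1.rtsd001';   # hypothetical path, for illustration only
        open( my $fh, "<", $file ) or die "cannot open $file: $!\n";

        my $seen = 0;                 # true once the STRUC20 line has been passed
        while ( my $line = <$fh> ) {
            print $line if $seen;     # print only the lines after the marker
            $seen = 1 if $line =~ /STRUC20/;
        }
        close $fh;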