in reply to working with mulitple files
Are you asking how to do this with up to 20 files at once?
If so, I would agree with the reply that suggested you do it without File::Find (or at least, separate the problem of finding the files from the problem of processing them all). If all the files happen to be in the same directory, don't use File::Find at all -- either of the following methods would be easier/quicker/more maintainable:

    my $folder = "/usr28/users/mpcamp/ZDOCK";
    chdir $folder or die "chdir $folder failed: $!";

    # Method 1: use a glob:
    my @files = grep /^com\d+$/, <com*>;

    # OR Method 2: use opendir/readdir:
    opendir( my $dh, "." ) or die "opendir '.' failed: $!";
    my @files = grep /^com\d+$/, readdir($dh);
    closedir $dh;

Only use File::Find if you have to traverse multiple levels of a directory tree to find all the files you want, and in that case, use it only to create an array of file paths to be used for input. A separate function, not part of the File::Find process, can then do what needs to be done with the array of selected files.
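If you do need File::Find for a nested tree, a minimal sketch of that collect-then-process split might look like the following; process_file() is a placeholder for whatever work each file needs:

    use strict;
    use warnings;
    use File::Find;

    my @files;

    # Use find() only to collect the matching paths into an array.
    find(
        sub {
            push @files, $File::Find::name if -f and /^com\d+$/;
        },
        "/usr28/users/mpcamp/ZDOCK"
    );

    # Do the real work in a separate pass, outside of find().
    for my $file (@files) {
        process_file($file);    # placeholder for your own processing
    }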
BTW, if your file names really are "com1" ... "com20", be aware that a regex like /com1/ (in the sub you passed to find()) will match 11 of those names -- "com1" and "com10" through "com19" -- as well as names like "noncom12345". Read up on perlre.
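For example, anchoring the pattern avoids those false matches (assuming @files holds the names gathered above):

    # Unanchored: matches "com1", "com10".."com19", "noncom12345", ...
    my @loose = grep /com1/, @files;

    # Anchored: matches exactly "com1" and nothing else.
    my @exact = grep /^com1$/, @files;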
Now, if you have 20 files that actually contain correlated lines of data that you need to integrate in some way, there are different approaches for handling this, but the right choice may depend on things like "how big are the files?", "do you really need all 20 first lines, all 20 second lines, etc., at once to do your work?", and "what are you really trying to accomplish in terms of output?"
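For instance, if the files are small enough that holding 20 open handles is no problem, and you really do need line N from every file at the same time, one way is to open them all and read them in lockstep. This is only a sketch, assuming the @files array from above and files of equal length:

    use strict;
    use warnings;

    # Open a handle for each file.
    my @handles;
    for my $file (@files) {
        open my $fh, '<', $file or die "open $file failed: $!";
        push @handles, $fh;
    }

    # Iteration N sees line N from every file.
    # (Assumes all the files have the same number of lines.)
    while ( defined( my $line = readline $handles[0] ) ) {
        my @row = ( $line, map { scalar readline $_ } @handles[ 1 .. $#handles ] );
        chomp @row;
        # @row now holds the Nth line from each file;
        # combine them however your output requires.
    }

    close $_ for @handles;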