in reply to Re: Using threads to process multiple files
in thread Using threads to process multiple files
Note the indirect filehandle (it automatically closes when it goes out of scope) and the failure test on the open:

    sub parseLines {
        my $content = do {
            open my $in, '<', $_[0] or die "Open failed $_[0]: $!";
            local $/;    # slurp mode: read the whole file in one go
            <$in>;
        };
        my %hash;
        for ( split /(?<=\n)/, $content ) {    # one list element per line
            next unless /^\@HWI/;
            my ($header) = split / /, $_;
            $hash{$header} = 1;
        }
        return \%hash;
    }

If your file is very large, I believe (I could be wrong) that the for(split) construct will be abusive of memory, since it holds the slurped content and the full list of lines in memory at the same time. In that case, you could craft it as a streaming parser by replacing the for(split) loop with a /g match that walks the string one line at a time:
    while ( $content =~ /(.*\n?)/g ) {    # scalar-context /g: $1 is the current line
        my $line = $1;
        next unless $line =~ /^\@HWI/;
        my ($header) = split / /, $line;
        $hash{$header} = 1;
    }

It's plausible that slurp mode alone wouldn't be enough to force sequential disk reads when several threads are reading at once, in which case it might make sense to add a shared read lock around the slurp.
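If memory really is the constraint, it may be simpler to skip the slurp entirely and read line by line from the handle. Here is a minimal sketch of that variant (parseLinesStreaming is a hypothetical name I've chosen; it keeps the same argument and return conventions as parseLines above):

    sub parseLinesStreaming {    # hypothetical name, not from the code above
        my ($path) = @_;
        open my $in, '<', $path or die "Open failed $path: $!";
        my %hash;
        while ( my $line = <$in> ) {    # only one line held in memory at a time
            next unless $line =~ /^\@HWI/;
            my ($header) = split / /, $line;
            $hash{$header} = 1;
        }
        return \%hash;    # $in closes when it goes out of scope
    }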
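And a minimal sketch of the shared read lock idea, assuming the files are handed to worker threads via threads and threads::shared ($read_lock and the parse_file wrapper are illustrative names, not part of the original code):

    use strict;
    use warnings;
    use threads;
    use threads::shared;

    my $read_lock :shared;    # one lock variable shared by all worker threads

    sub parse_file {          # hypothetical per-thread wrapper
        my ($path) = @_;
        my $content;
        {
            lock($read_lock);    # only one thread slurps at a time, so each
                                 # file is read from disk sequentially
            open my $in, '<', $path or die "Open failed $path: $!";
            local $/;
            $content = <$in>;
        }    # lock released at the end of the block
        my %hash;
        while ( $content =~ /(.*\n?)/g ) {    # parse outside the lock
            my $line = $1;
            next unless $line =~ /^\@HWI/;
            my ($header) = split / /, $line;
            $hash{$header} = 1;
        }
        return \%hash;
    }

    my @workers = map { threads->create( \&parse_file, $_ ) } @ARGV;
    my @results = map { $_->join } @workers;    # join copies each hashref back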
First ask yourself `How would I do this without a computer?' Then have the computer do it the same way.
Re^3: Using threads to process multiple files
by RichardK (Parson) on Jan 31, 2015 at 00:16 UTC