> at least if possible to reduce the speed of processing large text data

Reducing the speed of your processing of large data is very easy (and does not need any of the counterproductive advice given to you so far by other monks): just add calls to the sleep function. For example (untested code, because I do not have your data):
    my (@new_dat) = ();
    foreach my $line (@loaded_data)   #-- loop thru all data
    {
        chomp($line);
        my (%trec) = &line2rec($line);
        sleep 1;
        if ($trec{'active'}) {
            push(@new_dat, $line);
        }
    }

Serious benchmarking would be needed, but this is likely to reduce the speed by a factor of about 10,000. If this is not enough of an improvement, just pass a larger value to the sleep builtin.
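If a full second per line is too coarse a knob for tuning the slowdown, the core Time::HiRes module lets sleep take fractional seconds. A minimal, self-contained sketch in the same spirit; the sample @loaded_data and the line2rec() stub are hypothetical stand-ins, since the original data and subroutine are not shown:

    use strict;
    use warnings;
    use Time::HiRes qw(sleep time);   # fractional sleep and timing

    # Hypothetical stand-ins for the OP's data and line2rec()
    my @loaded_data = ("1|active\n", "2|inactive\n", "3|active\n");
    sub line2rec {
        my ($line) = @_;
        my (undef, $status) = split /\|/, $line;
        return (active => ($status eq 'active' ? 1 : 0));
    }

    my @new_dat;
    my $t0 = time();
    for my $line (@loaded_data) {
        chomp $line;
        my %trec = line2rec($line);
        sleep 0.1;                    # fractional delay; raise it to slow down further
        push @new_dat, $line if $trec{'active'};
    }
    printf "kept %d of %d lines in %.2f s\n",
        scalar(@new_dat), scalar(@loaded_data), time() - $t0;

The delay per line is then tunable to any positive value, so the slowdown factor can be dialed in precisely rather than in whole-second steps.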
In reply to Re: Process large text data in array
by Laurent_R
in thread Process large text data in array
by hankcoder