"I cannot figure out what would make the program slow down so much..."
Unnecessary work:
    foreach $w1 (sort(keys %kernel)) {
        ...
        foreach $w2 (sort(keys %kernel)) {
            ...
        }
    }
For every word in the hash, you re-sort all of the hash keys (even though you've already sorted them once in the outer loop). Then you compare every pair of words twice, once in each order, including pairs you've already compared.
Instead, sort the hash keys once and assign them to an array. Then process the array one word at a time. Grab the first word, then compare it to the remaining words in the array. Then do the same for the second, and so on, as in the sketch below. Note that the list of words to compare gets shorter with every word you process. This is exactly what you want.
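A minimal sketch of that approach, assuming your %kernel hash and a compare() routine standing in for whatever pairwise work the inner loop actually does:

    use strict;
    use warnings;

    # sort once, up front, outside any loop
    my @words = sort keys %kernel;

    for my $i (0 .. $#words) {
        my $w1 = $words[$i];

        # compare $w1 only to the words after it;
        # every earlier pair has already been handled
        for my $j ($i + 1 .. $#words) {
            my $w2 = $words[$j];
            compare($w1, $w2);    # stand-in for the real work
        }
    }

This does one sort instead of one per outer iteration, and visits each unordered pair exactly once instead of twice.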
Improve your skills with Modern Perl: the free book.
In reply to Re: Mysterious slow down with large data set by chromatic
in thread Mysterious slow down with large data set by jsmagnuson