in reply to Mysterious slow down with large data set
"I cannot figure out what would make the program slow down so much..."
Unnecessary work:
    foreach $w1 (sort(keys %kernel)) {
        ...
        foreach $w2 (sort(keys %kernel)) {
            ...
        }
    }
For every word in the hash, the inner loop re-sorts the entire key list, even though the outer loop already sorted it. Worse, it compares every word against every other word, including pairs you have already compared (and each word against itself).
Instead, sort the hash keys once and assign them to an array. Then process the array one word at a time: grab the first word and compare it to the remaining words in the array, then do the same for the second word, and so on. The list of words to compare gets shorter with each word you process, which is exactly what you want.
Improve your skills with Modern Perl: the free book.
Replies are listed 'Best First'.
Re^2: Mysterious slow down with large data set by jsmagnuson (Acolyte) on Feb 26, 2012 at 23:52 UTC
Re^2: Mysterious slow down with large data set by wwe (Friar) on Feb 27, 2012 at 14:29 UTC