jdlev has asked for the wisdom of the Perl Monks concerning the following question:
Re: Perl Program - Out of memory?
by BrowserUk (Patriarch) on Nov 15, 2013 at 19:44 UTC
It is impossible to suggest anything or even speculate without seeing the code.

With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
by jdlev (Scribe) on Nov 15, 2013 at 20:07 UTC
I love it when a program comes together - jdhannibal
by educated_foo (Vicar) on Nov 15, 2013 at 23:10 UTC
by Anonymous Monk on Nov 15, 2013 at 21:41 UTC
A couple of the more obvious errors were fixed, too. Please check that the code still behaves the same. You really ought to use warnings at the very least; it might provide pertinent clues to the issue. Read more... (9 kB)
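Since the original program is only available behind the "Read more" link, here is a minimal standalone sketch (with invented data, not taken from the posted code) of what enabling strictures and warnings buys you:

```perl
#!/usr/bin/perl
use strict;     # undeclared variables and most barewords become compile-time errors
use warnings;   # undefined values, once-used names, etc. produce diagnostics

my %points = ( alice => 10 );   # hypothetical data, not from the posted code

# With strict in effect, a typo such as %pionts, or an assignment to a
# variable missing its "my", aborts compilation instead of quietly
# creating a brand-new empty variable.
print $points{alice}, "\n";     # prints 10
```

This costs nothing at runtime and routinely turns "my program silently misbehaves" into a precise error message with a line number.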
by jdlev (Scribe) on Nov 15, 2013 at 22:11 UTC
by Anonymous Monk on Nov 15, 2013 at 21:05 UTC
This shouldn't affect anything memory-related, but FL1s is missing a dollar sign there.
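Because the code being corrected is elided here, the following is a hypothetical reconstruction (names invented) of the kind of slip being described, and of why `use strict` catches it:

```perl
use strict;
use warnings;

# Hypothetical example: a hash element needs its dollar sign.
my %FL1 = ( total => 42 );

my $right = $FL1{total};    # correct: a single element is a scalar, so $FL1{total}
# my $wrong = FL1{total};   # sigil missing: under "use strict" this fails
#                           # at compile time instead of yielding garbage

print "$right\n";           # prints 42
```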
by Anonymous Monk on Nov 16, 2013 at 15:27 UTC
Your code is unreadable and you have been begged to reduce the indentation, but you did not react.
by taint (Chaplain) on Nov 15, 2013 at 20:29 UTC
Like I said, a wild guess. Glancing at it here. That was all that jumped out at me. HTH

--Chris

```perl
#!/usr/bin/perl -Tw
use Perl::Always or die;
my $perl_version = (5.12.5);
print $perl_version;
```
by jdlev (Scribe) on Nov 15, 2013 at 20:58 UTC
by BrowserUk (Patriarch) on Nov 16, 2013 at 15:12 UTC
by taint (Chaplain) on Nov 15, 2013 at 21:13 UTC
Re: Perl Program - Out of memory?
by BrowserUk (Patriarch) on Nov 15, 2013 at 20:39 UTC
Please post the output from perl -V.
Re: Perl Program - Out of memory?
by Laurent_R (Canon) on Nov 15, 2013 at 19:50 UTC
Re: Perl Program - Out of memory?
by sundialsvc4 (Abbot) on Nov 15, 2013 at 23:47 UTC
A few things superficially come to mind. Pursuing the last thought, I frankly do suspect that a lot of this “nested loop” logic could, indeed, be expressed as a query which might, indeed, produce many thousands of rows as a “cross-product” between several smaller constituent tables. (“And so what... that’s what SQL servers do for a living...”) But this might then serve to rather drastically reduce the complexity and memory footprint of your code, which now only has to consume a record-set that is presented to it.

Note also that “SQL” doesn’t have to imply “a server.” The SQLite database system, for instance, is built on single files, and it runs quite nicely on everything from mainframes to cell phones. My essential notion here is that maybe you can shove “all that data” out of (virtual...) memory, and into file(s).
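A hedged sketch of that idea, using DBI with DBD::SQLite (both on CPAN); the table and column names are invented, not taken from the original program. The point is that the cross-product is streamed one row at a time, so it never has to fit in Perl's memory:

```perl
use strict;
use warnings;
use DBI;   # requires DBD::SQLite to be installed

# An in-memory database is used here just so the sketch is self-contained;
# for the real data set you would name an on-disk file instead,
# e.g. "dbi:SQLite:dbname=players.db".
my $dbh = DBI->connect( "dbi:SQLite:dbname=:memory:", "", "",
                        { RaiseError => 1, AutoCommit => 1 } );

$dbh->do("CREATE TABLE players (name TEXT, salary INTEGER)");

my $ins = $dbh->prepare("INSERT INTO players (name, salary) VALUES (?, ?)");
$ins->execute(@$_) for [ 'alice', 5000 ], [ 'bob', 4500 ];

# Let the database produce the cross-product; fetch it row by row.
my $sth = $dbh->prepare(
    "SELECT a.name, b.name FROM players a CROSS JOIN players b");
$sth->execute;

my $rows = 0;
while ( my ( $n1, $n2 ) = $sth->fetchrow_array ) {
    $rows++;    # process one combination at a time, then let it go
}
print "$rows combinations\n";   # 2 players x 2 players => 4 combinations
```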
by Laurent_R (Canon) on Nov 16, 2013 at 12:16 UTC
Yes, it is also not clear to me whether sorting the hash keys is necessary, but if it is, you might consider sorting the keys once, before entering the nested loops, storing them into arrays, and walking through those arrays of sorted keys rather than the hash keys. I do not know whether it will reduce memory usage sufficiently, but it will certainly reduce the run time considerably. The hash keys used in the innermost loops might be sorted millions or possibly even billions of times; this is a huge waste of CPU power, as your program is likely to spend most of its running time sorting the same data again and again. Now, to restate, this will certainly also save some memory, but I have no idea whether it will be enough to solve your memory problem. One additional point: how many elements do you have in each of your hashes?
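A small sketch of that presorting idea, with made-up data (the original hashes are not shown in this thread): sort each hash's keys once, up front, then let every level of the nested loops walk the already-sorted arrays.

```perl
use strict;
use warnings;

my %qb = ( brady => 20, brees    => 22 );   # hypothetical names and values
my %rb = ( lynch => 15, peterson => 18 );

my @qb_keys = sort keys %qb;    # sorted exactly once...
my @rb_keys = sort keys %rb;    # ...not once per outer-loop iteration

my $combos = 0;
for my $q (@qb_keys) {          # note: no sort() calls inside the loops
    for my $r (@rb_keys) {
        $combos++;              # real code would work with $qb{$q} and $rb{$r}
    }
}
print "$combos\n";   # 2 x 2 = 4
```

With N nested levels, the naive version re-sorts the inner keys on every pass of every enclosing loop; this version pays each sort cost once.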
Re: Perl Program - Out of memory?
by Laurent_R (Canon) on Nov 16, 2013 at 13:37 UTC
Hi, further to my idea of presorting the hash keys described in Re^2: Perl Program - Out of memory? (rather than sorting them again and again), I would suggest that you try the following modified program. Read more... (10 kB)
I made the following changes: presorting the hash keys into arrays and using those arrays of keys to walk through the data, and adding my statements to declare your variables so that you can enable warnings and strictures (I might have missed some; I can't really test or compile).

Three additional points: (1) I do not see the point of using a delay in your innermost loop; I think you can remove it. (2) The fact that you did not give lexical scope to your $key loop variable is probably a major bug in your program, because I think the $key variable at one level of nesting gets overwritten by the last value taken by $key within the next nested loop (not entirely sure, though; I have not tried in ages to use such a loop variable without lexically scoping it, and maybe Perl can manage this correctly, but I would be surprised). On the other hand, it may have no impact, because starting the inner loop is the last thing you do each time, but this is poor and dangerous design. Anyway, this should not be the case with my version, where each $key variable is lexically scoped within its loop. (But, again, I can't test anything.) (3) Finally, the $salary variable used at the end of the program seems to come from nowhere. Using warnings and strictures would detect this, just as it would the fact that you are using both $maxPoint and $maxPoints.
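Point (2) above can be illustrated with a tiny self-contained example (invented data): when each loop declares its own lexical with `for my $key (...)`, the inner loop's $key is a separate variable and cannot clobber the outer one.

```perl
use strict;
use warnings;

my %outer = ( a => 1 );
my %inner = ( x => 9, y => 8 );

my $seen_after = '';
for my $key ( sort keys %outer ) {       # this $key is scoped to this loop
    for my $key ( sort keys %inner ) {   # a separate, independent $key
        # work with the inner key here
    }
    $seen_after = $key;   # still the %outer key, untouched by the inner loop
}
print "$seen_after\n";    # prints "a"
```

Without the `my` on the loop variable (and without strict), both loops would be fighting over one package variable, which is exactly the overwriting hazard described above.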