jaa has asked for the wisdom of the Perl Monks concerning the following question:
For example, something that can tell me how many scalars exist, and estimate their data size plus Perl overhead, on a per-package basis in bytes?
Or, even better, something that can take a nested data structure and summarise how much data plus overhead (hash buckets, key arrays?) is in use at a specified depth?
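One way to get exactly this kind of per-structure accounting is Devel::Size from CPAN (not a core module, so this assumes it is installed): `size()` reports the footprint of the variable itself, while `total_size()` follows references and counts everything reachable, including hash buckets and keys. A minimal sketch:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Devel::Size (CPAN) reports the memory footprint of a variable.
# size() measures only the top-level structure; total_size()
# recurses through references and includes all contained data.
use Devel::Size qw(size total_size);

# A hash of 1000 array refs, each holding ten elements.
my %data = map { $_ => [ (1) x 10 ] } 1 .. 1000;

printf "top-level hash : %d bytes\n", size(\%data);
printf "hash + contents: %d bytes\n", total_size(\%data);
```

The gap between the two numbers is the cost of the nested arrays, which gives a rough feel for where the overhead lives in a deep structure.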
Or code analysis tools that can flag long-lived, large data structures that are never referenced again?
Does releasing a hash make the recovered memory available to subsequent Perl structures?
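Broadly yes, with a caveat: memory freed by releasing a hash goes back to perl's internal allocator and can be reused by later Perl structures, but it is generally not returned to the operating system, so the process footprint does not shrink. A small sketch of the pattern:

```perl
use strict;
use warnings;

# Build a large transient hash (100,000 keys, ~100-byte values).
my %cache = map { $_ => 'x' x 100 } 1 .. 100_000;

# undef() releases the hash's buckets, keys and values back to
# perl's internal memory pools. The process size typically stays
# the same, but perl can recycle this space for new structures.
undef %cache;

# This hash can reuse the memory freed above rather than growing
# the process further.
my %next = map { $_ => 'y' x 100 } 1 .. 100_000;

print scalar(keys %next), " keys rebuilt\n";
```

For this reason it pays to `undef` (or scope-limit) big intermediate structures as soon as they are no longer needed, so the space is available to the next stage of processing.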
Are there any tips on efficient memory usage? I am not programming in a web environment, but rather a large commercial data processing environment. Current processes sometimes consume up to 1.5 GB of RAM when performing set operations. They run OK, but we would like to increase our capacity to process larger sets.
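For set operations over dense integer keys specifically, one classic core-Perl trick is to replace hashes with bit vectors via `vec()`: each member costs one bit instead of the tens of bytes a hash entry needs, and intersection/union become bitwise string operations. A sketch (the element values here are just illustrative):

```perl
use strict;
use warnings;

# Two sets represented as bit vectors: bit N set means N is a member.
my ( $set_a, $set_b ) = ( '', '' );
vec( $set_a, $_, 1 ) = 1 for 1, 5, 9, 42;
vec( $set_b, $_, 1 ) = 1 for 5, 42, 100;

# Set operations are bitwise ops on the underlying strings.
my $intersection = $set_a & $set_b;   # members in both sets
my $union        = $set_a | $set_b;   # members in either set

# Enumerate members of the intersection.
my @common = grep { vec( $intersection, $_, 1 ) }
             0 .. 8 * length($intersection) - 1;
print "intersection: @common\n";      # prints: intersection: 5 42
```

At one bit per possible key, a universe of ten million integers fits in about 1.2 MB per set, which can shrink hash-based set code dramatically. The trade-off is that keys must map to reasonably dense non-negative integers.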
Seeking wisdom, and offering thanks in advance,
Jeff
Replies are listed 'Best First'.

- Re: How do I dumping Perls memory usage?
  by broquaint (Abbot) on Feb 24, 2003 at 12:02 UTC
- Re: How do I dumping Perls memory usage?
  by BrowserUk (Patriarch) on Feb 24, 2003 at 13:40 UTC
- Re: How do I dumping Perls memory usage?
  by tachyon (Chancellor) on Feb 24, 2003 at 12:04 UTC
- Re: How do I dumping Perls memory usage?
  by davorg (Chancellor) on Feb 24, 2003 at 11:41 UTC