in reply to Out of memory using chart::clicker

Hmm, I don't know anything about Chart::Clicker, but it seems that you have a memory leak. When you do this:
for (my $x=0;$x<10000;$x++) { &mysub(\@x_axis, \@{$a_hash{abc}}); print "."; }
mysub will create 10000 objects, which should not be a problem so long as each object holds only a few hundred numerical values. But if the module also stores the generated graph, then you might be creating objects that are 100 kB or more in size. And you might run into trouble if the created objects are persistent and not released between calls. I would suggest that you try to let each dataset fall out of scope between calls. That way, presumably, Perl will release the memory each time and you should not have any trouble.

Passing lexical copies of the arrays, rather than the references, might be the way to go.
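
For instance, a minimal sketch of that idea (mysub, @x_axis and %a_hash are just stand-ins for whatever your real script uses):

use strict;
use warnings;

my @x_axis = (1 .. 100);                            # sample data
my %a_hash = (abc => [ map { $_ ** 2 } 1 .. 100 ]);

sub mysub {                                         # stand-in for the real charting sub
    my ($x_ref, $y_ref) = @_;
    # ... build the chart from @$x_ref and @$y_ref ...
}

for my $i (0 .. 9_999) {
    # Lexical copies exist only inside this block, so they (and anything
    # built from them in mysub) can be released at the end of each pass.
    my @x = @x_axis;
    my @y = @{ $a_hash{abc} };
    mysub(\@x, \@y);
    print '.';
}

With the copies declared inside the loop body, they go out of scope at the end of every iteration, so Perl can reuse that memory.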

Re^2: Out of memory using chart::clicker
by Peterpion (Acolyte) on Feb 21, 2014 at 13:55 UTC
    Thanks, I think you hit the nub of my confusion here. Since mysub is a sub and I am calling it, I thought that when it returns everything it created itself (like, I assume, a large bitmap of the graph in an array) gets released? I guess that's dependent on everything inside that sub being lexical. But if Chart::Clicker complies with use strict, surely it would not compile if it contained non-lexical variables (unless there is a no strict somewhere in Chart::Clicker, but I can't find it if there is).

    So I added use strict to the test script, realised I hadn't declared $x and $y inside the sub as lexical, fixed that and reran the test, and got the same result (out of memory).

    I don't want to copy each array if possible, as there's a lot of data to be copied (actually, now I realise copying won't really help). I can see that there is a different way to code this by not using a large multidimensional hash to sort my data in the first place, but using hashes makes the coding much easier (at least with my present mindset; maybe if I had originally thought of the problem differently it would be fine). But surely it should be possible to use hashes in this way?

    Re letting the dataset fall out of scope, is there a way of doing this without copying the array each iteration? I don't need the array once the sub has finished with it. I tried undefing it after the call to mysub (in the original code each array ref passed into mysub was a different hash element, i.e. @{$hash{x}{y}{z}}), so I used
    mysub(\@{$hash{x}{y}{z}}); undef @{$hash{x}{y}{z}};
    but that didn't seem to work, though maybe my method was incorrect.

    I guess memory leak questions are common, but are there any obvious things I am doing wrong here? Is this a leak in the chart module, meaning I simply can't use it this many times in the same execution?

      Yes, you are most probably right: since you are calling Chart::Clicker in a sub, what you do there should have lexical scope, and the allocated memory should be reclaimed when you exit the sub as the references to the objects fall out of scope. So I somewhat overlooked part of what you are doing in your program and, as far as I can tell, you do not seem to be doing anything wrong. It would appear, then, that the memory leak (if there is one, but it definitely looks like it) probably occurs in the module you are calling, or possibly in another module called by it.

      The solution (or rather workaround) that I have used a couple of times in the past in vaguely similar situations is to have your Perl program process, say, a couple of hundred datasets and then exit after re-launching itself with parameters for the next group of datasets (using the exec function, or maybe forking before dying, or possibly doing it in a shell loop or some similar means; I just remember I tried various possibilities, I don't remember exactly which one worked, and I can't check the final program before returning to work on Monday), and so on until you are done. This approach is far from pretty, but it seems to work.
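
      A rough sketch of that idea (the batch size, the argument handling and process_dataset are all made up for illustration):

      #!/usr/bin/perl
      use strict;
      use warnings;

      my $batch_size = 200;                  # datasets handled per process (arbitrary)
      my $total      = 10_000;               # total number of datasets
      my $start      = shift(@ARGV) // 0;    # where this invocation begins

      for my $n ($start .. $start + $batch_size - 1) {
          last if $n >= $total;
          process_dataset($n);               # stand-in for the Chart::Clicker work
      }

      # Replace the current (bloated) process with a fresh one for the next
      # batch, so any leaked memory is handed back to the operating system.
      my $next = $start + $batch_size;
      exec($^X, $0, $next) if $next < $total;

      sub process_dataset {
          my ($n) = @_;
          print "dataset $n\n";
      }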

      Update:

      Actually, I had not yet seen it when I wrote the above, but davido said it before me and I can only agree with him: yes, forking and exiting immediately afterwards seems to be a good temporary fix.
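
      In case it helps, a bare-bones sketch of the fork-per-dataset variant (draw_chart is just a placeholder for the Chart::Clicker work):

      use strict;
      use warnings;

      for my $n (0 .. 9_999) {
          my $pid = fork();
          die "fork failed: $!" unless defined $pid;
          if ($pid == 0) {
              # Child: do the leaky chart work here, then exit so the
              # operating system reclaims everything it allocated.
              draw_chart($n);
              exit 0;
          }
          waitpid($pid, 0);    # parent waits, then starts the next child
      }

      sub draw_chart {
          my ($n) = @_;
          print "chart $n\n";
      }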

        Yes, I temporarily worked around the Chart::Clicker memory leak by calling
        `perl mysub.pl arg1 arg2...`
        from the parent Perl script. Here mysub.pl is a command-line Perl script which works with the @ARGV array. Each array argument is serialized into a single scalar for the command line, like
        $arg1 = join(',', @arg1_array);
        which can be converted back into an array inside mysub.pl like
        @arg1_array = split(/,/, $arg1);
        for use in
        ... values => \@arg1_array ...
        for example.
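
        A minimal sketch of that split (mysub.pl, the data and the chart call are placeholders, and it assumes the values themselves never contain a comma):

        # parent.pl -- start a fresh perl process per chart so its memory
        # is returned to the OS when the child exits
        use strict;
        use warnings;

        my @x = (1 .. 50);
        my @y = map { $_ * 2 } @x;

        system($^X, 'mysub.pl', join(',', @x), join(',', @y)) == 0
            or die "mysub.pl failed: $?";

        # mysub.pl -- rebuild the arrays from @ARGV and draw one chart
        use strict;
        use warnings;

        my @x = split(/,/, $ARGV[0]);
        my @y = split(/,/, $ARGV[1]);
        # ... pass them to the charting code, e.g.
        # ... values => \@y ...

        Since each chart runs in its own short-lived process, any memory the module leaks is reclaimed by the operating system when that process exits.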

        BUT it works very slowly, so it may be a good idea to run the child in the background with '&', as in
        `perl mysub.pl arg1 arg2... &`