tigervamp has asked for the wisdom of the Perl Monks concerning the following question:

I am using the XBase module in a program that runs for long periods of time, during which it opens a lot of dBase files on a regular basis, usually to gather simple information such as record counts. A significant amount of memory appears to be leaked on each open/close operation. Here is an example that highlights the issue:
use strict;
use XBase;

my $dbname = "dbname.dbf";
while (1) {
    for my $i (1 .. 100) {
        (my $table = new XBase "${dbname}") || die;
        $table->close || die;
    }
    sleep(2);
}
This quickly racks up the amount of RAM used, even though no operations are being performed on the table, which leads me to think that internal initialization data is not being freed when the table is closed. I have tried different approaches with different module versions, operating systems, etc. My question is this: has anyone else experienced this with the XBase module or anything similar? Is this a memory leak, or am I overlooking something obvious? As always, I appreciate the input. Thanks,
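
As a rough way to quantify the growth, here is a minimal variation of the test. It assumes a Linux system where /proc/self/status is readable; the rss_kb helper is just for illustration and is not part of XBase.

use strict;
use warnings;
use XBase;

# Report the resident set size (VmRSS, in kB) of this process.
# Linux-specific: reads /proc/self/status.
sub rss_kb {
    open my $fh, '<', '/proc/self/status' or return undef;
    while (my $line = <$fh>) {
        return $1 if $line =~ /^VmRSS:\s+(\d+)\s+kB/;
    }
    return undef;
}

my $dbname = "dbname.dbf";
for my $batch (1 .. 20) {
    for my $i (1 .. 100) {
        my $table = XBase->new($dbname) or die XBase->errstr;
        $table->close or die $table->errstr;
    }
    my $rss = rss_kb();
    printf "after batch %2d: RSS %s kB\n", $batch,
           defined $rss ? $rss : 'unknown';
}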

tigervamp

Replies are listed 'Best First'.
Re: Memory Leak with XBase?
by tachyon (Chancellor) on Mar 18, 2004 at 01:44 UTC

    In our experience, OO Perl running persistently 'leaks', i.e. you observe a consistent growth in memory usage over time even though the widget is not storing any more internal data. Interesting observations: when you 'de-OOify' things the leaks tend to become much smaller or disappear entirely, and calling it a leak is not entirely correct, because if you push the system into swap these 'leaky' processes suddenly shrink to a small fraction of their size. Unload the system so there is no swap pressure and they stay small, but slowly grow over time.

    Sometimes undef'ing stuff helps. Rewriting it in a more functional style will probably work as well, if it really matters. It will probably make no difference, but in the close code you could add an undef (XBase::Base::close()):

    sub close {
        my $self = shift;
        $self->NullError();
        if (not defined $self->{'fh'}) {
            $self->Error("Can't close file that is not opened\n");
            return 0;
        }
        $self->{'fh'}->close();
        # remove what should be the last ref to the IO::File
        # object dropped into self on the open
        undef $self->{'fh'};
        # original code called delete like this, may as well stay
        delete $self->{'fh'};
        1;
    }

    cheers

    tachyon

      In our experience, OO Perl running persistently 'leaks', i.e. you observe a consistent growth in memory usage over time even though the widget is not storing any more internal data.
      All that says is that whatever OO Perl you're using creates circular references, which is a programmer error.
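
      For illustration only (this is generic Perl, not code from XBase): a circular reference keeps both structures alive after their lexicals go out of scope, and Scalar::Util::weaken is the usual fix.

      use strict;
      use warnings;
      use Scalar::Util qw(weaken);

      # Parent and child point at each other, so neither refcount can
      # drop to zero when the block ends: this pair is never freed.
      {
          my $parent = {};
          my $child  = { parent => $parent };
          $parent->{child} = $child;
      }   # both hashes still alive here -- the "leak"

      # Same structure, but the back-reference is weakened; it no longer
      # counts toward the refcount, so both hashes are freed on exit.
      {
          my $parent = {};
          my $child  = { parent => $parent };
          $parent->{child} = $child;
          weaken($child->{parent});
      }   # freed normally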

        The example code presented calls all of four functions: new() and open() in XBase::Base, plus IO::File's open and close methods. Please show me the circular reference in this code.

        cheers

        tachyon

      In our experience, OO Perl running persistently 'leaks', i.e. you observe a consistent growth in memory usage over time even though the widget is not storing any more internal data.

      I have come to realize that of late.

      and calling it a leak is not entirely correct, because if you push the system into swap these 'leaky' processes suddenly shrink to a small fraction of their size. Unload the system so there is no swap pressure and they stay small, but slowly grow over time.

      Well, on my system (Linux) the above code quickly eats up 99% of the real memory and all of the swap (if the sleep call is removed), but it continues to run.

      Sometimes undef'ing stuff helps. Rewriting it in a more functional style will probably work as well, if it really matters. It will probably make no difference, but in the close code you could add an undef

      You're right, it doesn't make a difference. But thanks for the advice, I don't have time to rewrite XBase unless I absolutely have to, which it looks like I will...

      tigervamp

Re: Memory Leak with XBase?
by zentara (Cardinal) on Mar 18, 2004 at 16:39 UTC
    I'm not familiar with XBase, and I'm sure that tachyon is right on. But I would like to make a general comment on your code, because I've been seeing this a lot in my own trials.

    As a general rule, don't call "new" in a loop unless you want to keep the objects it creates. For instance, in your code I would do something along these lines:

    my $dbname = "dbname.dbf"; my $table = new XBase "${dbname}") || die; while(1) { for my $i (1 .. 100) { #pseudo code $table->empty; $table->read($_); $table->display; } sleep(2); } $table->empty;
    That way you reuse your $table object instead of wastefully creating a new one on each pass.

    I'm not really a human, but I play one on earth. flash japh
      Absolutely, but I can't do that here. The main issue is that I need to check for newly appended records very often, and normally that can only be done by closing the table and reopening it with new so that the dBase header is re-read and the new records become accessible. I was able to slightly modify the XBase.pm module to allow safely calling the read_header function on an open table whenever I like. Now I can open the needed tables once at the beginning of the program, store the table objects in a hash, and just call read_header to check for new records instead of closing/opening. This has solved my memory issues.
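
      A minimal sketch of that pattern, assuming the locally modified XBase.pm makes read_header safe to call on an open table as described above; the table names and the %tables hash are hypothetical, not from the original post.

      use strict;
      use warnings;
      use XBase;

      # Hypothetical list of tables to watch; open each one exactly once.
      my @dbfs = ('orders.dbf', 'customers.dbf');
      my %tables;
      for my $dbf (@dbfs) {
          $tables{$dbf} = XBase->new($dbf) or die XBase->errstr;
      }

      my %last_count;
      while (1) {
          for my $dbf (@dbfs) {
              my $table = $tables{$dbf};
              # Re-read the dBase header in place (relies on the local
              # modification described above) instead of close/new.
              $table->read_header;
              my $count = $table->last_record + 1;   # records number from 0
              if ($count != ($last_count{$dbf} || 0)) {
                  print "$dbf now has $count records\n";
                  $last_count{$dbf} = $count;
              }
          }
          sleep(2);
      }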