Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I'm using DBI/DBD::Sybase (Perl 5.8, Solaris 2.9, Sybase OCS 1250_8) to insert multiple (100s of) records into a Sybase database in a while loop, and it appears that the memory consumption of the Perl process keeps growing (as watched with the prstat or top utilities). Variations on how the inserts are done (e.g., stored procedures vs. direct SQL vs. selects within inserts) affect the rate of growth, but the growth is there nevertheless. This is true for older versions of DBD::Sybase (0.94 and 0.95) but also for the latest one (1.0). This seems to point at the possibility of a memory leak in DBD::Sybase, since all Perl variables are local to the while loop and the database handle is destroyed at the end of the loop. A search on the Internet indicates that people had some problems with this in the past (e.g., 2 years ago), and some partial modifications to the C code of the library have been posted, but it would seem that the problem should be fixed by now. Any thoughts?
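
For reference, a minimal sketch of the kind of loop being described -- server, login, table, and the get_next_record() data source are all hypothetical stand-ins:

    use strict;
    use warnings;
    use DBI;

    # Roughly the shape of the loop above: a fresh connection for every record.
    while ( my $rec = get_next_record() ) {
        my $dbh = DBI->connect( 'dbi:Sybase:server=MYSERVER', 'user', 'password',
                                { RaiseError => 1 } );
        $dbh->do( 'INSERT INTO my_table (id, name) VALUES (?, ?)',
                  undef, $rec->{id}, $rec->{name} );
        $dbh->disconnect;    # the handle goes away at the bottom of each pass
    }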

Re: Memory leak with DBD::Sybase ?
by mpeppler (Vicar) on Jun 05, 2003 at 19:02 UTC
    There is a known leak if you close/open connections a lot. So if you have coded your loop to open a connection, insert a row, and then close the connection, then you are definitely going to see that leak.

    You should try to avoid opening and closing connections any more than necessary.

    Michael

      You should try to avoid opening and closing connections any more than necessary.
      This is good advice, regardless of whether there's a memory leak or not. Opening and closing connections can be very time-consuming, particularly if the database is remote.
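
      A minimal sketch of that approach -- one connection opened before the loop and one prepared statement reused for every row (server, login, table, and the get_next_record() data source are hypothetical):

          use strict;
          use warnings;
          use DBI;

          # Connect once, prepare once, execute per row.
          my $dbh = DBI->connect( 'dbi:Sybase:server=MYSERVER', 'user', 'password',
                                  { RaiseError => 1, AutoCommit => 1 } );
          my $sth = $dbh->prepare( 'INSERT INTO my_table (id, name) VALUES (?, ?)' );

          while ( my $rec = get_next_record() ) {
              $sth->execute( $rec->{id}, $rec->{name} );
          }

          $sth->finish;
          $dbh->disconnect;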

Re: Memory leak with DBD::Sybase ?
by waswas-fng (Curate) on Jun 05, 2003 at 17:22 UTC
    loop and the database handle is destroyed at the end of the loop

    Not really -- they are still saved in Perl's memory for future use. Perl does not really return memory to the operating system: you will see that any variable that is assigned and then set to undef will actually stay resident even after it is undef'ed, and Perl will then reuse that same memory for future variables when it can. (A small demonstration is sketched below.)

    -Waswas
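
    A small demonstration of that behaviour (illustrative only -- it shells out to ps to read the process size, and the exact numbers depend on the platform and on how perl was built):

        use strict;
        use warnings;

        # Resident set size of this process, in KB (Solaris/Linux-style ps).
        sub rss_kb { chomp( my $kb = `ps -o rss= -p $$` ); return $kb }

        print "before: ", rss_kb(), " KB\n";
        {
            my @big = (1) x 1_000_000;    # grab a sizeable chunk of memory
        }                                 # @big is freed here...
        print "after:  ", rss_kb(), " KB\n";   # ...but the process does not shrink:
                                               # perl keeps the memory in its own
                                               # pool and reuses it later
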
Re: Memory leak with DBD::Sybase ?
by hardburn (Abbot) on Jun 05, 2003 at 15:55 UTC

    Are you using transactions? If so, it might be saving all the data in memory until you tell it to commit. That depends largely on how transactions are implemented in the underlying database and DBD module. (A batched-commit sketch is below.)

    ----
    I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
    -- Schemer

    Note: All code is untested, unless otherwise stated
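
    If transactions do turn out to be the issue, committing in batches keeps any single transaction from growing without bound. A rough sketch, assuming AutoCommit is switched off and with hypothetical names throughout:

        use strict;
        use warnings;
        use DBI;

        my $dbh = DBI->connect( 'dbi:Sybase:server=MYSERVER', 'user', 'password',
                                { RaiseError => 1, AutoCommit => 0 } );
        my $sth = $dbh->prepare( 'INSERT INTO my_table (id, name) VALUES (?, ?)' );

        my $n = 0;
        while ( my $rec = get_next_record() ) {    # hypothetical data source
            $sth->execute( $rec->{id}, $rec->{name} );
            $dbh->commit unless ++$n % 1000;       # commit every 1000 rows
        }
        $dbh->commit;      # commit whatever is left over
        $sth->finish;
        $dbh->disconnect;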

      If s?he's issuing a $dbh->disconnect; at the end of, say, the subroutine that does the inserts, it should commit the transactions. Issuing a $dbh->finish; (or was it $sth->finish;?) before the disconnect may commit the records.

      I currently use DBD::Sybase within mod_perl apps, but most of the time they are selecting rows from rather large tables vs inserting rows.

      Have you tried some sort of BCP solution?

      Just some thoughts..

      One4k4 - perlmonks@poorheart.com (www.poorheart.com)
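
      For the BCP route mentioned above, one rough sketch: dump the rows to a delimited flat file and load it with Sybase's bcp utility. The file name, table, server, login, and bcp flags are all assumptions to be checked against your own installation:

          use strict;
          use warnings;

          open my $fh, '>', '/tmp/my_table.dat' or die "open: $!";
          while ( my $rec = get_next_record() ) {    # hypothetical data source
              print {$fh} join( '|', $rec->{id}, $rec->{name} ), "\n";
          }
          close $fh or die "close: $!";

          # -c character mode, -t field terminator, -U login, -S server;
          # bcp prompts for the password.
          system( 'bcp', 'mydb..my_table', 'in', '/tmp/my_table.dat',
                  '-c', '-t', '|', '-U', 'user', '-S', 'MYSERVER' ) == 0
              or die "bcp failed: $?";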

        If s?he's issuing a $dbh->disconnect; at the end of, say, the subroutine that does the inserts, it should commit the transactions. Issuing a $dbh->finish; (or was it $sth->finish;?) before the disconnect may commit the records.

        It's $sth->finish;.

        Committing the transaction wouldn't free the memory back to the OS; it would go back into a pool that Perl can use later. Fortunately, on any OS with a sane VM implementation, the pages for that storage space would be swapped out to disk until Perl needs to use them again.

        ----
        I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
        -- Schemer

        Note: All code is untested, unless otherwise stated