in reply to Memory leak with DBD::Sybase ?

Are you using transactions? If so, it might be holding all the data in memory until you tell it to commit. That depends largely on how transactions are implemented in the underlying database and DBD module.
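To illustrate the commit point I mean, here's a batch-commit sketch with DBI. (Untested as usual; the connection string, table, and get_next_record() are placeholders you'd replace with your own.)

```perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details -- adjust for your server.
my $dbh = DBI->connect( 'dbi:Sybase:server=MYSERVER', $user, $pass,
                        { AutoCommit => 0, RaiseError => 1 } );

my $sth = $dbh->prepare('INSERT INTO images (id, data) VALUES (?, ?)');

my $count = 0;
while ( my ($id, $data) = get_next_record() ) {   # get_next_record() is a stand-in
    $sth->execute($id, $data);
    # Commit in batches so uncommitted work doesn't pile up.
    $dbh->commit unless ++$count % 100;
}
$dbh->commit;       # flush the final partial batch
$sth->finish;
$dbh->disconnect;
```

With AutoCommit off, everything between commits is one open transaction; committing in batches keeps that window small.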

----
I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
-- Schemer

Note: All code is untested, unless otherwise stated

Replies are listed 'Best First'.
Re: Re: Memory leak with DBD::Sybase ?
by one4k4 (Hermit) on Jun 05, 2003 at 18:00 UTC
    If s?he's issuing a $dbh->disconnect; at the end of, say, the subroutine that does the inserts, it should commit the transactions. Issuing a $dbh->finish; (or was it $sth->finish; ?) before the disconnect may commit the records.

    I currently use DBD::Sybase within mod_perl apps, but most of the time they are selecting rows from rather large tables vs inserting rows.

    Have you tried some sort of BCP solution?
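    One low-tech BCP sketch: write the rows to a flat file and shell out to the bcp command-line utility, so the bulk load happens outside the Perl process entirely. (Untested; the database, table, and file names are made up, and character mode (-c) won't suit binary data.)

    ```perl
    use strict;
    use warnings;

    # Dump the pending rows (array of arrayrefs) to a tab-separated file.
    open my $fh, '>', '/tmp/rows.dat' or die "open: $!";
    print {$fh} join("\t", @$_), "\n" for @rows;
    close $fh;

    # Let bcp do the insert; -c is character mode.
    system( 'bcp', 'mydb..images', 'in', '/tmp/rows.dat',
            '-U', $user, '-P', $pass, '-S', $server, '-c' ) == 0
        or die "bcp failed: $?";
    ```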

    Just some thoughts..

    One4k4 - perlmonks@poorheart.com (www.poorheart.com)

      If s?he's issuing a $dbh->disconnect; at the end of, say, the subroutine that does the inserts, it should commit the transactions. Issuing a $dbh->finish; (or was it $sth->finish; ?) before the disconnect may commit the records.

      It's $sth->finish;.

      Committing the transaction wouldn't free the memory to the OS. It would go back into a pool that perl can use later. Fortunately, on any OS with a sane VM implementation, the pages for that storage space would be swapped to the hard disk until perl needs to use them again.

      ----
      I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
      -- Schemer

      Note: All code is untested, unless otherwise stated

        Indeed. Committing transactions and freeing up memory used by Perl are apples and oranges. Sybase has its own responsibility for dealing with its transaction-based memory usage and such. I think the problem lies in how to "unpage" that memory Perl just used instead of letting it sit ready to be used again... because it may not be.

        Then again, if he's inserting 200 rows from a hash/array, undef'ing the hash/array, then creating *another* hash/array with $x rows, you'd think it'd reuse as much of the memory as it could from the previous data structure. The answer to this, I don't know.

        One4k4 - perlmonks@poorheart.com (www.poorheart.com)

        The records being processed are all approximately the same size (Sybase IMAGE type). If the memory goes back to the pool and gets reused, it doesn't explain the constant growth of memory.