Re: Re: Memory leak with DBD::Sybase ?

by one4k4 (Hermit)
on Jun 05, 2003 at 18:00 UTC ( [id://263427] )


in reply to Re: Memory leak with DBD::Sybase ?
in thread Memory leak with DBD::Sybase ?

If s?he's issuing a $dbh->disconnect; at the end of, say, the subroutine that does the inserts, it should commit the transactions. Issuing a $dbh->finish; (or was it $sth->finish;?) before the disconnect may commit the records.
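A minimal DBI sketch of that ordering, assuming AutoCommit is off (every name below is a placeholder, not from the original post):

    use strict;
    use warnings;
    use DBI;

    # placeholder credentials and data, just for illustration
    my ($user, $pass) = ('me', 'secret');
    my @rows = ([1, 'foo'], [2, 'bar']);

    my $dbh = DBI->connect('dbi:Sybase:server=MYSERVER', $user, $pass,
                           { AutoCommit => 0, RaiseError => 1 });

    my $sth = $dbh->prepare('insert into mytable (a, b) values (?, ?)');
    $sth->execute(@$_) for @rows;

    $sth->finish;      # releases the statement handle's pending results
    $dbh->commit;      # the commit itself happens here, not in finish or disconnect
    $dbh->disconnect;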

I currently use DBD::Sybase within mod_perl apps, but most of the time they are selecting rows from rather large tables, not inserting them.

Have you tried some sort of BCP solution?
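If the rows can be staged to a flat file first, a sketch using Sybase::BCP from sybperl (paraphrased from its docs and untested; the file, table, and separator are made-up examples) might look like:

    use Sybase::BCP;

    # bulk-copy a pipe-delimited file into the table, committing every 100 rows
    my $bcp = new Sybase::BCP $user, $passwd, $server;
    $bcp->config(INPUT      => '/tmp/rows.bcp',
                 OUTPUT     => 'mydb.dbo.mytable',
                 BATCH_SIZE => 100,
                 SEPARATOR  => '|');
    $bcp->run;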

Just some thoughts..

One4k4 - perlmonks@poorheart.com (www.poorheart.com)

Re: Re: Re: Memory leak with DBD::Sybase ?
by hardburn (Abbot) on Jun 05, 2003 at 18:07 UTC

    If s?he's issuing a $dbh->disconnect; at the end of, say, the subroutine that does the inserts, it should commit the transactions. Issuing a $dbh->finish; (or was it $sth->finish;?) before the disconnect may commit the records.

    It's $sth->finish;.

    Committing the transaction wouldn't free the memory to the OS. It would go back into a pool that perl can use later. Fortunately, on any OS with a sane VM implementation, the pages for that storage space would be swapped to the hard disk until perl needs to use them again.

    ----
    I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
    -- Schemer

    Note: All code is untested, unless otherwise stated

      Indeed. Committing transactions and freeing up memory used by Perl are apples and oranges. Sybase has its own responsibility for dealing with its transaction-based memory usage and such. I think the problem lies in how to "unpage" the memory Perl just used instead of letting it sit ready to be used again, because it may not be.

      Then again, if he's inserting 200 rows from a hash/array, undef'ing the hash/array, then creating *another* hash/array with $x rows, you'd think it'd reuse as much of the memory as it could from the previous data structure. The answer to this, I don't know.
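      A toy way to check that hunch (untested, and it shells out to ps(1), so Unix-ish systems only):

          # report this process's resident set size in KB
          sub rss { my ($kb) = `ps -o rss= -p $$` =~ /(\d+)/; $kb }

          my %h = map { $_ => 'x' x 100 } 1 .. 200_000;
          printf "after building:   %d KB\n", rss();

          undef %h;                       # freed to perl's pool, not back to the OS
          printf "after undef:      %d KB\n", rss();

          my %h2 = map { $_ => 'x' x 100 } 1 .. 200_000;
          printf "after rebuilding: %d KB\n", rss();   # roughly flat if perl reuses the pool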

      One4k4 - perlmonks@poorheart.com (www.poorheart.com)
      The records being processed are all approximately the same size (Sybase IMAGE type). If the memory goes back to the pool and gets reused, it doesn't explain the constant growth of memory.
        Does "constant growth" mean: The first time you process 200 records, the second 204, the third 100, the fourth 300?

        That'd tend to tell me that Perl is indeed reusing the memory and only growing when it needs to grow..

        Then again, you said "constant".

        On a related note, how can I watch the usage of memory during the execution of a script? "top" ordered by size isn't really what I think Anonymous Monk is doing..
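        One low-tech option is to have the script report on itself as it runs; a sketch reading /proc (so Linux-only, and untested) might look like:

            # print the current process's VmSize/VmRSS lines from /proc (Linux only)
            sub report_mem {
                my ($label) = @_;
                open my $fh, '<', "/proc/$$/status" or return;
                while (<$fh>) {
                    print "$label: $_" if /^Vm(?:Size|RSS)/;
                }
            }

            report_mem('before inserts');
            # ... run the insert loop ...
            report_mem('after inserts');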

        One4k4 - perlmonks@poorheart.com (www.poorheart.com)
        Are you processing the IMAGE columns with the ct_send_data() API, or with a normal insert?

        Michael
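        For reference, the ct_send_data path through DBD::Sybase looks roughly like this (paraphrased from memory of the DBD::Sybase docs; untested, and the table/column names are made up):

            # assumes $dbh is a connected DBD::Sybase handle and $image holds the binary value
            # allow large values through when fetching the text pointer
            $dbh->do('set textsize 500000');

            # fetch the IMAGE column once so the driver captures its I/O descriptor
            my $sth = $dbh->prepare('select img from imgtable where id = 1');
            $sth->execute;
            while ($sth->fetch) { }   # discard the data; we only need the descriptor
            $sth->func('CS_GET', 1, 'ct_data_info');

            # stream the new value instead of embedding it in an insert/update statement
            $sth->func('ct_prepare_send');
            $sth->func('CS_SET', 1, { total_txtlen => length($image) }, 'ct_data_info');
            $sth->func($image, length($image), 'ct_send_data');
            $sth->func('ct_finish_send');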
