Sandy has asked for the wisdom of the Perl Monks concerning the following question:

I am still struggling with memory management on my Solaris 2.7 machine (no weird stuff on the Solaris 2.8 machine).

When I run my program, which queries a database a huge number of times (yes, I use placeholders), I get different behaviour depending on whether or not I am using DProf.

What happens is this: When I use 'perl -d:DProf myprog', memory is used, but within manageable levels (24Meg), and a test program (not so many queries) takes less than 1 minute to complete. When I use 'myprog' by itself, the memory consumption increases to > 175Meg, and if it completes without inducing heavy paging, it takes about twice as long to execute.
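For concreteness, the two invocations being compared look like this; `myprog` is a stand-in for the real script, and `dprofpp` is the standard reader for the `tmon.out` file that Devel::DProf writes:

```shell
# Plain run (the one that balloons to >175 MB on Solaris 2.7).
perl myprog

# Profiled run (memory stays around 24 MB here).
# Devel::DProf writes its raw profile to ./tmon.out.
perl -d:DProf myprog

# Afterwards, summarise the profile: show the 15 costliest subs.
dprofpp -O 15 tmon.out
```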

I read the documentation that came with DBI, and I tried using $dbh->finish to force some garbage collection, but there was no change.
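For what it's worth, in DBI `finish` is a statement-handle method rather than a database-handle one, and it only signals that no more rows will be fetched; it does not free the handle itself. A minimal cleanup sketch (the DSN, credentials and table names are placeholders, not the poster's real ones):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder DSN and credentials -- substitute your own.
my $dbh = DBI->connect( 'DBI:ODBC:my_db', 'user', 'pass',
                        { RaiseError => 1 } );

my $sth = $dbh->prepare('SELECT col_one FROM mytable WHERE col_one = ?');
$sth->execute(42);

# finish() tells DBI no more rows will be fetched from this handle;
# it does NOT release the handle's memory.
$sth->finish;

# Dropping the last reference is what actually frees the handle.
undef $sth;

$dbh->disconnect;
```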

As mentioned before, this behaviour is not noticeable under Solaris 2.8 (exactly the same versions of DBI, DBD::ODBC and perl).

Has anybody else noticed a performance enhancement when using 'DProf'?

UPDATE:

I get the same behaviour if I just use the perl debugger and hit 'c'.

Sandy

UPDATE (copy of reply to BrowserUK)

Well here it is in all its glory.

Yes, it highlighted the source, but not the reason (sigh).

Obviously this is not a problem that many people will encounter. Is it worth raising a bug report? How does one raise a bug report?

The weird behaviour (DProf uses fewer resources) is described in the code's comments.

#!/usr/bin/perl
use warnings;
use strict;
use DBI;
use DBD::ODBC;

# ============================================================
# sample to illustrate strange behaviour on solaris 2.7
# (does not occur on solaris 2.8)
# (db names have been changed to protect the innocent??)
#
# Using perl 5.8.2
# Using DBI 1.38
# Using DBD::ODBC 1.06
# Using Openlink ODBC driver (version 5.x) for Oracle 8 on Solaris 2.7
# ============================================================
# DEFINITION OF STRANGE BEHAVIOUR:
#
# when running this program via statement
#
#   'perl mem_sample.pl my_db my_name my_password'
#
#   * total memory usage: 125 Meg
#     approx 1.75Meg per SECOND!
#   * total elapsed time: 73 sec
#
# when running this program via statement
#
#   'perl -d:DProf mem_sample.pl my_db my_name my_password'
#
#   * total memory usage: 5 Meg
#   * total elapsed time: 36 sec
#
# ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# this is the part that causes problems
#   - comment out 'use POSIX'
#     or 'use Cwd'
#     or '...glob'
#     and the effect will go away.
#
# NOTE: Notice that 'Problem_Routine' is never actually
#       called.
# ------------------------------------------------------------
use POSIX ":termios_h";
use Cwd 'abs_path';

sub Problem_Routine($\@;\@) {
    my $b_err_file;
    my @b_err_file = glob("./$b_err_file");
}
# ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

# ------------------------------------------------------------
# globals
# ------------------------------------------------------------
my %firsttable;

# inputs
my $oracle_db = shift;
my $db_name   = shift;
my $username  = shift || "";
my $password  = shift || "";
my $start     = `date`;

# ------------------------------------------------------------
# Connect to Oracle.
# ------------------------------------------------------------
my $dbh = DBI->connect( "DBI:ODBC:$oracle_db", "$username", "$password" )
    or die( "\n "
          . "===========================\n"
          . "Could not make connection to database:\n"
          . "   $DBI::errstr\n"
          . "===========================\n" );
print "\nConnected to Oracle.\n";

# Die on any error in an embedded SQL statement (SLB)
$dbh->{RaiseError} = 1;

# ------------------------------------------------------------
# Define queries
# ------------------------------------------------------------
# Define query for top level
# (comma added between the two columns; the original was missing it,
#  so only one column came back)
my $sql_statement =
      "SELECT col_one, col_two\n"
    . "FROM $db_name.MYTABLE1 A\n";
my $top_level = $dbh->prepare($sql_statement);

# Define query for second level
$sql_statement =
      "SELECT b.col_three, b.col_four\n"
    . "FROM $db_name.MYTABLE1 A, $db_name.MYTABLE2 B\n"
    . "WHERE b.col_one = ? and a.col_one = b.col_one\n";
my $second_level = $dbh->prepare($sql_statement);

# ------------------------------------------------------------
# execute top level queries (~15,000 rows)
# ------------------------------------------------------------
my $rows_processed = 0;
my $rows           = [];
my $max_size       = 5000;
my $ret            = $max_size;

$top_level->execute();
while (    ( $ret == $max_size )
        && ( $rows = $top_level->fetchall_arrayref( undef, $ret ) ) )
{
    $ret = scalar(@$rows);
    while ( my $row = shift @$rows ) {
        @firsttable{qw(label word)} = @$row;
        $rows_processed++;
        get_second_table_info();
    }
}
print "\nQuery returned $rows_processed rows.\n\n";
print "Start: $start\nEnd:   ", `date`, "\n";
exit 0;

# ----------------------------------------------------
# get second table info
# (on average < 2 rows per query: max 10)
# ----------------------------------------------------
sub get_second_table_info {
    $second_level->execute( $firsttable{label} );
    my @col_three = ();
    my @col_four  = ();
    my $i         = 0;
    my $rows      = $second_level->fetchall_arrayref();
    while ( my $row = shift(@$rows) ) {
        ( $col_three[$i], $col_four[$i] ) = ( $row->[0], $row->[1] );
        # ---> do some processing here
        $i++;    # moved inside the loop so each row gets its own slot
    }
    # ---> do some more processing here
    return;
}
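As an aside, the termination logic of the batched fetch loop above is easy to check in isolation. This is a pure-Perl mock with the database replaced by an in-memory array (all data and names are hypothetical); a batch shorter than $max_size signals that the result set is exhausted:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Fake result set: 12 rows of [label, word] pairs.
my @fake_rows = map { [ "label$_", "word$_" ] } 1 .. 12;

# fetch_batch stands in for $sth->fetchall_arrayref(undef, $max_size):
# it hands back up to $n rows and shrinks the remaining set.
sub fetch_batch {
    my ($n) = @_;
    return [ splice @fake_rows, 0, $n ];
}

my $max_size = 5;
my $ret      = $max_size;
my $fetched  = 0;

# Same loop shape as the sample: keep going while the previous
# batch came back full; a short batch means we are done.
while ( $ret == $max_size ) {
    my $rows = fetch_batch($max_size);
    $ret = scalar @$rows;
    $fetched += $ret;
}
print "$fetched\n";    # 12
```

The same pattern caps peak memory at $max_size rows per batch no matter how large the full result set is.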

Replies are listed 'Best First'.
Re: Change in memory consumption when using debugger
by perrin (Chancellor) on Jun 30, 2004 at 18:39 UTC
    The most likely explanations are that your code fails when run under the debugger or that you aren't running all of it when you enable the debugger.
      Highly unlikely given the preliminary tests that I have run, but not impossible. I have an exhaustive test suite which I will be running today, using perl -d:DProf myprog. If there are problems, they will show up.
Re: Change in memory consumption when using debugger
by BrowserUk (Patriarch) on Jun 30, 2004 at 18:50 UTC

    I don't think it will help much, but I have to say that what you describe is completely counter to my experience of using DProf. It usually increases memory consumption slightly and slows down processing considerably.


    Examine what is said, not who speaks.
    "Efficiency is intelligent laziness." -David Dunham
    "Think for yourself!" - Abigail
    "Memory, processor, disk in that order on the hardware side. Algorithm, algorithm, algorithm on the code side." - tachyon
      If my code wasn't proprietary and very sensitive, I would give it to the DBI crowd and see what they could make of it. Unfortunately, this is not to be.

      PS: The behaviour goes away if I don't call the execute function from DBI.

        The best I can suggest is that you try and reproduce the problem in a non-sensitive and cut-down test prog.

        If you succeed, you have something that you can show people and log a bug with. If you're lucky, trying to reproduce the problem will highlight its source.


        Examine what is said, not who speaks.
        "Efficiency is intelligent laziness." -David Dunham
        "Think for yourself!" - Abigail
        "Memory, processor, disk in that order on the hardware side. Algorithm, algorithm, algorithm on the code side." - tachyon