siddheshsawant has asked for the wisdom of the Perl Monks concerning the following question:

Hello friends again... I am doing a database project with the help of Perl and MySQL. I am using DBIx and its helper modules to perform the database operations, such as inserting into the database. Until now everything was working correctly, but since yesterday, at the end of the script's execution I get an out-of-memory error. The test log is as follows:

2010-03-30 10:18:35 Script started
2010-03-30 10:18:35 Date being processed: 2010_03_18
2010-03-30 10:18:35 Starting to gather data
2010-03-30 10:18:35 Gathering nightly data
2010-03-30 10:18:35 Reading data from http://some_web_site.com
2010-03-30 10:18:35 Creating an internal hash from the data
2010-03-30 12:32:10 Starting to load data into the database
2010-03-30 12:32:19 Inserting data to database

and in the output I am getting something like this:

[root@lcld0037 NightlyDB]# perl load_data_to_db.pl 2010_03_18
2010-03-30_10-18-35: Script started
2010-03-30_10-18-35: Date being processed: 2010_03_18
2010-03-30_10-18-35: Gathering nightly data
.
.
.
(287709)(562546)(562546)(562546)(562546)(562546)(275841)(476059)
(476059)(476059)(476059)(476059)(476059)(476059)(476059)(476059)
(476059)(476059)(476059)(476059)(476059)(476059)(476059)(476059)
(476059)(476059)(476059)(476059)(476059)(476059)(476059)(476059)
(476059)(476059)(476059)(476059)(233299)(339473)(339473)(649823)
(649823)(649823)(649823)(649823)(649823)(649823)(2203347)(2203347)
(2203347)(2203347)(2203347)(2203347)(2203347)(2203347)(2203347)(2203347)
(2203347)(2203347)(2203347)(2203347)(2203347)(2203347)(2203347)(2203347)
(2203347)(2203347)(2203347)(2203347)(2203347)(2203347)(2203347)(2203347)
(2203347)(2203347)(2203347)(2203347)(2095735)(2095735)(2095735)(2095735)
(2095735)(2095735)(2095735)(2095735)(2095735)(2095735)(2095735)(2095735)
(2095735)(2095735)(2095735)(2095735)(2095735)(2095735)(2095735)(2095735)
(2095735)(2095735)(2095735)(2095735)(2095735)(2095735)(2095735)(2095735)
(2095735)(2095735)(2095735)(2095735)
.... # random numbers like these keep on being generated ...
(15867513)
2010-03-30_12-32-19: Inserting data to database
Out of memory!

I have not changed anything in the database insertion script, and the same script ran properly 2 days earlier. What is the cause of such an exception? Also, what am I supposed to do in order to get rid of it? Thanks in advance!

Updated part of the question:

I use DBIx::Class and perform the insertion as follows (the procedure is the same for any of the tables):

sub _insert_test_run {
    my ($self, $schema, $test_run_data) = @_;

    my $nightly_web_page_test_run_id = $test_run_data->{test_run_id};
    my $test_controller_id =
        $self->_get_test_controller($schema, $test_run_data->{controller});
    my $test_run_configuration = undef;    # TO BE ADDED LATER
    my $start_date      = $test_run_data->{start_date};
    my $end_date        = $test_run_data->{end_date};
    my $elapsed_minutes = $test_run_data->{elapsed_minutes};
    my $log_link        = $test_run_data->{log_link};

    my $test_run_rs = undef;
    eval {
        $test_run_rs = $schema->resultset("TestRun")->create(
            {
                nightly_web_page_test_run_id => $nightly_web_page_test_run_id,
                test_controller_id           => $test_controller_id,
                test_run_configuration_id    => $test_run_configuration,
                start_date                   => $start_date,
                end_date                     => $end_date,
                elapsed_time                 => $elapsed_minutes,
                log_link                     => $log_link,
            }
        ) || die "Unable to insert data to table TestRun: $!";
    };
    if ($@) {
        #warn();
        $self->log_writer()->write_error_row(
            "Unable to insert data to table TestRun: $@");
        return undef;
    }
    else {
        return $test_run_rs;
    }
}

In the above example I am inserting into the test_run table.

Replies are listed 'Best First'.
Re: out of memory exception
by BrowserUk (Patriarch) on Mar 30, 2010 at 21:07 UTC

    It is very interesting that the date format in your log suddenly changed:

    2010-03-30 10:18:35 Script started
    2010-03-30 10:18:35 Date being processed: 2010_03_18>
    2010-03-30 10:18:35 Starting to gather data
    2010-03-30 10:18:35 Gathering nightly data
    2010-03-30 12:32:19 Inserting data to database

    2010-03-30_10-18-35: Script started
    2010-03-30_10-18-35: Date being processed: 2010_03_18
    2010-03-30_10-18-35: Gathering nightly data
    .
    .
    .
    2010-03-30_12-32-19: Inserting data to database

    Underscore instead of space; '-'s instead of ':'s; the disappearance of the '>' from the 'Date being processed' line. And all of this happens right at the start of your script, before you read your data. Somebody changed something in your logging. Look there first.


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
      It was a typo!

        You should really copy & paste such information rather than retype it. It would be interesting to see the real timestamps before and after the download; that might give an indication of how much more data was downloaded when it runs out of memory, relative to when it doesn't.

        The obvious--though not necessarily feasible--solution would be to populate the database as you download the data, rather than accumulating it all in a hash and then populating. LWP has a callback interface (see :content_cb) that lets you receive the data in chunks and store it on the fly, thereby avoiding the accumulation of data in memory.
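
        A minimal sketch of that approach, assuming a line-oriented feed (the URL and the parse_and_insert() helper are hypothetical stand-ins for your actual source and your DBIC insert code):

            use strict;
            use warnings;
            use LWP::UserAgent;

            my $ua  = LWP::UserAgent->new;
            my $buf = '';

            # :content_cb hands us the body in chunks as it arrives, so we
            # never hold more than the current chunk plus a partial line.
            my $response = $ua->get(
                'http://some_web_site.com/nightly_data',    # hypothetical URL
                ':content_cb' => sub {
                    my ($chunk) = @_;
                    $buf .= $chunk;
                    # Insert each complete line immediately; keep the trailing
                    # partial line in the buffer for the next chunk.
                    while ($buf =~ s/\A([^\n]*)\n//) {
                        parse_and_insert($1);    # hypothetical: parse line, insert row
                    }
                },
            );
            parse_and_insert($buf) if length $buf;    # flush the final partial line
            die $response->status_line unless $response->is_success;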


        Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
        "Science is about questioning the status quo. Questioning authority".
        In the absence of evidence, opinion is indistinguishable from prejudice.
Re: out of memory exception
by Your Mother (Archbishop) on Mar 30, 2010 at 18:39 UTC

    Don't have an answer for you. I just want to point out that you keep saying DBIx when you mean DBIx::Class. They aren't the same--DBIx:: is a namespace with all kinds of unrelated things in it--and you'll get better answers if you give the right name. Many users, myself included, abbreviate it DBIC.

Re: out of memory exception
by roboticus (Chancellor) on Mar 30, 2010 at 20:27 UTC

    siddheshsawant:

    If you made no changes and it's crashing now, then I'd suspect that the amount of data you're working on this run is substantially larger than when it worked OK. If that's the case, you may want to split the workload into smaller chunks.
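
    For example, a sketch of chunked loading with DBIx::Class (here $fh, parse_line(), and the TestRun resultset are stand-ins for whatever your gathering step actually produces):

        use strict;
        use warnings;

        my $batch_size = 1_000;    # tune to whatever fits comfortably in memory
        my @rows;

        while (my $line = <$fh>) {
            push @rows, parse_line($line);    # hypothetical: line -> hashref of column values
            if (@rows >= $batch_size) {
                # populate() in void context does a fast bulk insert
                $schema->resultset('TestRun')->populate(\@rows);
                @rows = ();    # release the batch before reading more
            }
        }
        $schema->resultset('TestRun')->populate(\@rows) if @rows;    # final partial batch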

    ...roboticus

Re: out of memory exception
by shmem (Chancellor) on Mar 30, 2010 at 19:08 UTC

    Since the snippet you posted prints no numbers between parentheses, that output must come from somewhere else.

    I bet it's from line 17.

      I did not get you... The same code was working properly 2 days ago... Is it an issue with the RAM, or is it due to something else? Please clarify...

        please clarify...

        I can't, since I know nothing about the environment of your script, nor its purpose; I have seen no sample data, whether input or output; and there is no code except a snippet which surely isn't the culprit. With the information you provided, all I can do is guess. How would you make an educated guess from the piece of code you posted, knowing nothing else?

        See I know what I mean. Why don't you?

Re: out of memory exception
by nikosv (Deacon) on Mar 30, 2010 at 19:20 UTC

    Maybe it's not related to the script but to the DBMS itself.

    I have no MySQL experience, but the concepts should be the same...

    Check the transaction log file used for undo/rollback blocks; maybe it has no room to accommodate all those inserts. If that is the case, just increase its size or disable logging.

    Check the DBMS error log for clues.

    Check the temporary working directory used by MySQL; do a simple df -k, maybe it has filled up.

      Has your schema changed in such a way as to get DBIx::Class to generate an infinite loop when you do an insert?
Re: out of memory exception
by Your Mother (Archbishop) on Mar 30, 2010 at 21:48 UTC

    Following up on what BrowserUk notes, I'm curious whether you have any DateTime inflation in your schema.

    perl -le 'print 2010_03_18'
    20100318

    DateTime with very large years creates crazy processing demands. I actually doubt that's what's going on, but...
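
    If you want to check, inflation is normally switched on in the Result class. A generic example of what to look for (not your code; the package and column names are made up):

        package MyApp::Schema::Result::TestRun;    # hypothetical Result class
        use strict;
        use warnings;
        use base 'DBIx::Class::Core';

        # InflateColumn::DateTime inflates datetime columns to DateTime
        # objects when accessed, and deflates them again on insert.
        __PACKAGE__->load_components('InflateColumn::DateTime');
        __PACKAGE__->table('test_run');
        __PACKAGE__->add_columns(
            start_date => { data_type => 'datetime' },
            end_date   => { data_type => 'datetime' },
        );

        1;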