chrism01 has asked for the wisdom of the Perl Monks concerning the following question:
I'm selecting the contents of a table from a remote DB, tidying up the values, and inserting the results into a local table.
According to a mysqldump (data only), it should be about 350MB of data.
However, when I try to load the remote data into an AoA, it hits the 4GB limit and crashes with 'Out of memory!'.
Why??
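(The dump size is a poor guide to the in-memory size: every Perl scalar carries a few dozen bytes of bookkeeping on top of its value, and each row gets its own anonymous array as well, so roughly 7 million 9-column rows can easily run to several gigabytes once loaded into an AoA. A rough way to gauge the per-row cost is sketched below; it assumes Devel::Size is installed, and the row values are made-up stand-ins for the real columns.)

    use strict;
    use warnings;
    use Devel::Size qw(total_size);

    # Made-up values shaped like the posted columns (int, bigint, char(2),
    # two ints, char(4), tinyint, int, tinyint) -- purely illustrative.
    my $data_rec = [ 1234, 9_876_543_210, 'AB', 100, 200, 'WXYZ', 1, 300, 2 ];

    printf "one row: %d bytes\n", total_size($data_rec);
    printf "~%.1f GB for 7022790 such rows (ignoring \@my_arr's own overhead)\n",
           total_size($data_rec) * 7_022_790 / 2**30;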
Core code:

    # Collect values
    while ( @db_row = $sth->fetchrow_array() ) {
        $data_rec = [ $db_row[0], $db_row[1], $db_row[2],
                      $db_row[3], $db_row[4], $db_row[5],
                      $db_row[6], $db_row[7], $db_row[8] ];
        # Add to list
        push(@my_arr, $data_rec);
    }

Table def (sanitised):

    A int(4) unsigned
    B bigint(20) unsigned
    C char(2)
    D int(10) unsigned
    E int(10) unsigned
    F char(4)
    G tinyint(2) unsigned
    H int(10) unsigned
    I tinyint(2)

Num rows: 7022790

Any ideas?
Cheers
Chris
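(One way to sidestep the blow-up is not to accumulate the rows at all, but to tidy and insert each row as it is fetched. The sketch below is illustrative only: the connection details, table and column names, and the use of DBD::mysql's mysql_use_result attribute, which stops the client library from buffering the whole result set, are assumptions rather than details taken from the post.)

    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection details and table/column names -- substitute
    # the real ones; nothing here is taken from the original post.
    my $remote_dbh = DBI->connect(
        'dbi:mysql:database=remote_db;host=remote.example.com',
        'user', 'password', { RaiseError => 1 },
    );
    my $local_dbh = DBI->connect(
        'dbi:mysql:database=local_db',
        'user', 'password', { RaiseError => 1 },
    );

    # mysql_use_result asks DBD::mysql to stream rows instead of buffering
    # the entire result set on the client side.
    my $sth = $remote_dbh->prepare(
        'SELECT A, B, C, D, E, F, G, H, I FROM remote_table',
        { mysql_use_result => 1 },
    );
    $sth->execute();

    my $ins = $local_dbh->prepare(
        'INSERT INTO local_table (A, B, C, D, E, F, G, H, I)
         VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)',
    );

    while ( my @db_row = $sth->fetchrow_array() ) {
        # tidy up the values here, then insert straight away --
        # only one row is ever held in memory
        $ins->execute(@db_row);
    }

(If the rows really do need to be held in memory for cross-row processing, a flatter structure, such as one packed or joined string per row, is far cheaper than an AoA of nine scalars.)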
Replies are listed 'Best First'.
Re: Out of Memory selecting from MySQL
by BrowserUk (Patriarch) on Oct 12, 2007 at 05:12 UTC
Re: Out of Memory selecting from MySQL
by Somni (Friar) on Oct 12, 2007 at 05:43 UTC
by BrowserUk (Patriarch) on Oct 12, 2007 at 06:11 UTC
by Somni (Friar) on Oct 12, 2007 at 09:10 UTC
Re: Out of Memory selecting from MySQL
by chrism01 (Friar) on Oct 12, 2007 at 06:09 UTC
by BrowserUk (Patriarch) on Oct 12, 2007 at 06:56 UTC