targetsmart has asked for the wisdom of the Perl Monks concerning the following question:
My system has 4 GB RAM and a 250 GB HDD, running Debian Etch, PostgreSQL 8.1, Perl 5.8.8, and Perl DBI 1.52.
My query may return more than 3,000,000 records; it is built on three tables using multiple joins and conditions.
When I execute this query from a Perl program (which uses Perl DBI), it takes a long time to return the records, and when it does return, memory is completely eaten up by the Perl process.
So I decided to get the records batch by batch; a batch is basically 1000 records.
I did an extensive search on the net and found that the following methods may be useful:
- using bind_columns (though more method calls are needed; see the first sketch after this list);
- using OFFSET and LIMIT in the query itself (though internally the server may still generate and skip all the rows before the offset on every pass; second sketch below);
- using PostgreSQL cursors (I don't know much about their efficiency; third sketch below).
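To make the options concrete, here is a minimal sketch of the bind_columns approach (the connection details, tables, and columns are made-up placeholders, not my real schema). As I understand it, DBD::Pg still pulls the entire result set into client memory when execute() runs, so this speeds up per-row fetching but may not cure the memory problem on its own:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder connection details.
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 1 });

# Stand-in for the real three-table join.
my $sth = $dbh->prepare(q{
    SELECT a.id, a.name, c.value
      FROM table_a a
      JOIN table_b b ON b.a_id = a.id
      JOIN table_c c ON c.b_id = b.id
});
$sth->execute;

# Bind Perl variables to the output columns; fetch() then refreshes
# them in place, which is the cheapest per-row call DBI offers.
my ($id, $name, $value);
$sth->bind_columns(\$id, \$name, \$value);

while ($sth->fetch) {
    # process one row here via $id, $name, $value
}
$dbh->disconnect;
```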
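The OFFSET/LIMIT version would look something like this (same made-up tables, and assuming the $dbh handle from the first sketch). A stable ORDER BY is needed so the batches don't overlap or skip rows, and the server still has to produce and discard every row before the offset on each pass, so the cost climbs as the offset grows:

```perl
my $batch_size = 1000;
my $offset     = 0;

my $sth = $dbh->prepare(q{
    SELECT a.id, a.name, c.value
      FROM table_a a
      JOIN table_b b ON b.a_id = a.id
      JOIN table_c c ON c.b_id = b.id
     ORDER BY a.id      -- stable ordering keeps the pages consistent
     LIMIT ? OFFSET ?
});

while (1) {
    $sth->execute($batch_size, $offset);
    my $rows = $sth->fetchall_arrayref;
    last unless @$rows;               # empty batch: we are done

    for my $row (@$rows) {
        # process $row (an array ref of column values)
    }
    $offset += $batch_size;
}
```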
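And the cursor version, following the pattern shown in the DBD::Pg documentation (again with the made-up tables and the same $dbh). DECLARE and FETCH are plain SQL statements, so the client only ever holds one batch of rows in memory; the trade-off is that the cursor keeps a transaction open for the whole scan:

```perl
$dbh->begin_work;    # a cursor only lives inside a transaction

$dbh->do(q{
    DECLARE big_cur CURSOR FOR
    SELECT a.id, a.name, c.value
      FROM table_a a
      JOIN table_b b ON b.a_id = a.id
      JOIN table_c c ON c.b_id = b.id
});

while (1) {
    my $sth = $dbh->prepare('FETCH 1000 FROM big_cur');
    $sth->execute;
    last if $sth->rows == 0;          # cursor exhausted

    while (my $row = $sth->fetchrow_arrayref) {
        # process $row; at most 1000 rows are client-side at once
    }
}

$dbh->do('CLOSE big_cur');
$dbh->commit;
```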
I am confused now: which of the above choices should I use? Do any better ways exist? If so, please help me get out of this problem.
Replies are listed 'Best First'.
Re: What are effective methods for retrieving a large number of rows from PostgreSQL tables using Perl DBI?
by moritz (Cardinal) on Oct 13, 2007 at 10:06 UTC
Re: What are effective methods for retrieving a large number of rows from PostgreSQL tables using Perl DBI?
by perrin (Chancellor) on Oct 13, 2007 at 15:15 UTC
Re: What are effective methods for retrieving a large number of rows from PostgreSQL tables using Perl DBI?
by duff (Parson) on Oct 13, 2007 at 13:46 UTC