Usually, large result sets scream "you're not using the server enough!". You may want to look over the operators SQL offers you before treating PostgreSQL as a mere data store. Since the server is iterating over the records anyway, it's usually a good tradeoff to have it do your aggregation work as well, so you can (hopefully) return smaller datasets.
Remember, with today's CPUs being so blindingly fast, the server can do an awful lot of calculation in the time it takes to transmit a packet over the network. Obviously, I don't know just what sorts of operations you're performing on your data, so this may not apply to your situation.
For example, if you're interested in the low, high, and average quotes for various stocks over a range of days, you could use something like this (the ? placeholders take the start and end dates):
    select TickerName,
           to_char(quoteDate, 'YYYYMMDD') as quoteDay,
           min(quote), max(quote), avg(quote)
      from quotes
     where quoteDate between ? and ?
     group by TickerName, to_char(quoteDate, 'YYYYMMDD')
     order by TickerName, quoteDay
So if you have a dozen quotes a day for each of the stocks of interest, this query returns far less data than retrieving all the detail rows and computing the min/max/average in Perl.
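To make that concrete, here's a minimal DBI/DBD::Pg sketch of running the aggregate query from Perl. The DSN, credentials, date range, and the quotes table layout are assumptions taken from the example above, not anything in the original thread:

    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection details; adjust DSN/user/password for your setup.
    my $dbh = DBI->connect('dbi:Pg:dbname=quotes', 'user', 'password',
                           { RaiseError => 1, AutoCommit => 1 });

    # Let the server aggregate: one row per ticker per day comes back,
    # instead of every individual quote.
    my $sth = $dbh->prepare(q{
        select TickerName,
               to_char(quoteDate, 'YYYYMMDD') as quoteDay,
               min(quote) as low, max(quote) as high, avg(quote) as avg_quote
          from quotes
         where quoteDate between ? and ?
         group by TickerName, to_char(quoteDate, 'YYYYMMDD')
         order by TickerName, quoteDay
    });
    $sth->execute('2009-01-01', '2009-01-31');   # sample date range

    while (my ($ticker, $day, $low, $high, $avg) = $sth->fetchrow_array) {
        printf "%s %s: low=%.2f high=%.2f avg=%.2f\n",
               $ticker, $day, $low, $high, $avg;
    }
    $dbh->disconnect;

The loop body is where you'd previously have been accumulating min/max/sum yourself; here it only formats the already-aggregated rows.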
...roboticus