Like I said before, we have tried this before, and Perl has some serious efficiency issues with respect to PostgreSQL: when you start getting large data sets back, it will simply crash. We also noticed that the current PostgreSQL CPAN modules were buggy, so rather than rewriting them in Perl we decided to just do it in C.
Of course, it all boils down to design preferences and what the developers are comfortable with.
Perl exists as a built-in language in PostgreSQL; it's called PL/Perl. See the PL/Perl documentation on the PostgreSQL website.
It's simply that going through DBI is NOT the best way to interface Perl with PostgreSQL; Perl itself integrates extremely tightly and efficiently with PostgreSQL.
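For anyone who hasn't used it, here is a minimal sketch of a PL/Perl function. The function name is made up for the example, and depending on your PostgreSQL version you first enable the language with createlang plperl yourdb or CREATE EXTENSION plperl (older servers also want the body single-quoted instead of $$-quoted):

  -- the body between the $$ quotes is ordinary Perl, executed inside the server
  CREATE FUNCTION perl_max (integer, integer) RETURNS integer AS $$
      my ($x, $y) = @_;     # arguments arrive in @_ as usual
      return $x if $x > $y;
      return $y;
  $$ LANGUAGE plperl;

  -- call it like any other SQL function
  SELECT perl_max(3, 7);

Since the Perl runs inside the server process, no result set ever crosses the wire to a client, which is where the tight integration comes from.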
I've been thinking of using Perl with PostgreSQL recently for a project. Can you tell me why DBD::Pg is any less efficient than DBD::mysql?
I wouldn't say that. Actually, PostgreSQL is much more powerful than MySQL, so you'll hit the limits of DBI/DBD more often with PostgreSQL than with MySQL...
This issue recently came up on the mod_perl list. I brought it to the PostgreSQL IRC channel, and apparently libpq doesn't return rows one at a time as you might expect; instead it loads the entire result set into client memory and then hands rows back one by one as you fetch them, IIRC. See this thread for a summary of the issue and a suggested solution that uses cursors together with DBI to handle retrieval of large numbers of rows from a PostgreSQL database.
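To make that concrete, here is a rough sketch of the cursor workaround with DBI and DBD::Pg; the database name, credentials, table, cursor name, and batch size below are all just placeholders:

  use strict;
  use warnings;
  use DBI;

  # cursors only live inside a transaction, hence AutoCommit => 0
  my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                         { RaiseError => 1, AutoCommit => 0 });

  # a server-side cursor keeps libpq from pulling the whole result set
  # into client memory at once
  $dbh->do('DECLARE big_rows CURSOR FOR SELECT id, payload FROM huge_table');

  my $sth = $dbh->prepare('FETCH 1000 FROM big_rows');
  while (1) {
      $sth->execute;
      my $got = 0;
      while (my $row = $sth->fetchrow_arrayref) {
          $got++;
          # process @$row here
      }
      last unless $got;    # no rows left in the cursor
  }

  $dbh->do('CLOSE big_rows');
  $dbh->commit;
  $dbh->disconnect;

Each FETCH only pulls one batch across the wire, so client memory stays bounded no matter how large the table is.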