dwinchell has asked for the wisdom of the Perl Monks concerning the following question:
Oh Perl Monks, I seek enlightenment.
I've been tasked to write a Perl script to download BLOB data from an Oracle database. Unfortunately, my client is storing compressed files in the database with sizes up to 2 GB! I can't change this - it's based on a proprietary knowledge management tool that they've pushed to its limits.
The immediate problem I see is that I just don't have the memory for a 2 GB buffer. As far as I know, DBI reads the whole field into memory behind the scenes. Is there a way to stream the field as bytes and fetch it piecewise?
Is this, in general, doable? If not, we have a working Java solution, but we're trying to convert everything to Perl.
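For reference, a piecewise fetch along these lines is possible with DBD::Oracle's LOB-locator interface: passing `ora_auto_lob => 0` to `prepare` makes the fetch return a LOB locator instead of the data itself, and `ora_lob_read` then reads the BLOB in chunks. A minimal sketch (the connection string, table, column, and key value are assumptions, not from the original post):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details - adjust for your environment.
my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'password',
    { RaiseError => 1, AutoCommit => 0 });

# ora_auto_lob => 0 returns a LOB locator rather than
# slurping the whole BLOB into memory on fetch.
my $sth = $dbh->prepare(
    'SELECT file_data FROM documents WHERE id = ?',
    { ora_auto_lob => 0 },
);
$sth->execute(42);
my ($lob) = $sth->fetchrow_array;

open my $out, '>:raw', 'download.bin' or die "open: $!";

my $chunk_size = 1024 * 1024;   # read 1 MB at a time
my $offset     = 1;             # Oracle LOB offsets are 1-based
while (1) {
    my $data = $dbh->ora_lob_read($lob, $offset, $chunk_size);
    last unless defined $data && length $data;
    print {$out} $data;
    $offset += length $data;
}

close $out;
$sth->finish;
$dbh->disconnect;
```

Memory use stays bounded by `$chunk_size` regardless of the BLOB's total size, which is the point of the locator interface.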
Re: Fetching 2gb Oracle Blobs
by perrin (Chancellor) on Sep 12, 2008 at 19:21 UTC