As kyle pointed out above, the DBI man page describes how to make sure you can read a large CLOB field (see the LongReadLen and LongTruncOk database-handle attributes). The only remaining question is: how much RAM does the machine running your script have, and how many other processes will be competing for it at the same time?
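To make that concrete, here is a minimal sketch of fetching a large CLOB with DBI. The DSN, credentials, table name (big_docs), column name (body), and the 16 MB limit are all hypothetical placeholders; only LongReadLen and LongTruncOk are real DBI attributes. It is not runnable without a live database, so treat it as an illustration of the setting, not a finished program.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details -- adjust to your own database.
my $dbh = DBI->connect( 'dbi:Oracle:mydb', 'user', 'pass',
    { RaiseError => 1, AutoCommit => 0 } );

# LongReadLen caps how many bytes of a LONG/CLOB value DBI will fetch.
# With LongTruncOk => 0, a value larger than the cap raises an error
# instead of being silently truncated.
$dbh->{LongReadLen} = 16 * 1024 * 1024;   # allow up to 16 MB per field
$dbh->{LongTruncOk} = 0;

# Hypothetical table and column names.
my $sth = $dbh->prepare('SELECT body FROM big_docs WHERE id = ?');
$sth->execute(42);
my ($clob) = $sth->fetchrow_array;   # the whole CLOB now sits in RAM

$sth->finish;
$dbh->disconnect;
```

Note that the entire field lands in process memory at once, which is exactly why the RAM question above matters.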
In other words, what happens when you try is determined by the machine and its workload, not by Perl or DBI. The worst that can happen is that once the data gets loaded into process memory from the database table, everything on the machine slows to a crawl because virtual memory is being swapped back and forth between RAM and disk.
If your machine lacks enough RAM to carry the load (which seems doubtful these days -- lots of folks have 1 GB of RAM or better), and you have some other method that is known to work for extracting these huge fields, you might want to have your Perl script run that other method via a system() call, in such a way that the data field is written directly to a local disk file.
Then the Perl script can work from the disk file (assuming you can do what needs to be done without holding the entire CLOB in memory at once -- e.g. if the CLOB data can be processed as lines or paragraphs of text). But that's just a last resort.
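The disk-file approach could look something like the sketch below. The file name and the commented-out external command are hypothetical stand-ins for whatever tool actually dumps the field; the sketch fakes a small dump so it runs standalone. The point is that the while loop holds only one line in memory at a time, so the full CLOB never has to fit in RAM.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical file name -- in practice this is whatever your external
# extraction tool wrote, e.g. via:
#   system("extract_clob --id 42 --out clob_dump.txt") == 0
#       or die "extraction failed: $?";
my $file = 'clob_dump.txt';

# For demonstration only: fake a small dump file.
open my $out, '>', $file or die "can't write $file: $!";
print $out "line $_ of the clob\n" for 1 .. 3;
close $out;

# Stream the dump one line at a time -- only one line is ever in memory.
my $count = 0;
open my $fh, '<', $file or die "can't open $file: $!";
while ( my $line = <$fh> ) {
    chomp $line;
    $count++;    # ... real per-line processing would go here ...
}
close $fh;
unlink $file;

print "processed $count lines\n";
```

If the data has no newlines, the same streaming idea works by setting $/ to a fixed-size chunk or a paragraph separator instead of reading by line.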