in reply to Inserting Multiple Records in DB.
From skimming through your code, I gather you're reading the data in from an XML file, stashing it in arrays, and then building up a single INSERT SQL command to put it all in the database in one step. And you're worried about the need to read all of the data into memory? In that case it would be easy enough to do the INSERTs one row at a time.
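For instance, a minimal sketch using Python and psycopg2; the `items` table, its columns, and the connection string are all hypothetical stand-ins for your actual setup:

```python
import psycopg2

rows = [("alpha", 1), ("beta", 2)]  # stand-in for the data parsed from XML

conn = psycopg2.connect("dbname=test")  # connection details are assumptions
with conn, conn.cursor() as cur:
    # one parameterized INSERT, executed once per row; rows can also be a
    # generator, so nothing forces the whole dataset into memory at once
    cur.executemany("INSERT INTO items (name, value) VALUES (%s, %s)", rows)
conn.close()
```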
If you're worried about the efficiency of that, here are a few thoughts off the top of my head. There are usually database-specific tools available for bulk-importing data into a database. PostgreSQL has a COPY command that imports data from files in a few standard formats, e.g. CSV or tab-delimited. XML is a problem in the general case, unfortunately (nested trees of data don't map neatly onto tabular formats), but converting the data from XML to tab-delimited might be a good way to avoid overflowing memory or getting bogged down in single-row inserts.
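Here's a rough sketch of that route, again with made-up element and column names. It spools the converted rows through an in-memory buffer for brevity; a temporary file works the same way and keeps memory flat:

```python
import io
import xml.etree.ElementTree as ET

import psycopg2

buf = io.StringIO()
# iterparse streams the file, so the whole XML tree never sits in memory
for _, elem in ET.iterparse("data.xml"):
    if elem.tag == "row":
        buf.write(f"{elem.findtext('name')}\t{elem.findtext('value')}\n")
        elem.clear()  # release the element once its fields are written out

buf.seek(0)
conn = psycopg2.connect("dbname=test")  # connection details are assumptions
with conn, conn.cursor() as cur:
    # copy_from drives PostgreSQL's COPY; tab is its default delimiter
    cur.copy_from(buf, "items", columns=("name", "value"))
conn.close()
```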
The PostgreSQL documentation also has a page of hints on efficiently populating a database. For other databases, it's not difficult to craft a web search that turns up the same sort of tricks: e.g. "bulk import mysql" and "bulk import mysql xml" both look like they turn up useful information.
In any case, that page of PostgreSQL hints is worth reading through even if you're on another database, because a number of the tricks it recommends are portable: wrapping the whole load in a single transaction rather than committing each row, using a prepared INSERT statement so it's only parsed once, and dropping indexes (and foreign key constraints) before the load and recreating them afterwards.
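A hedged sketch of two of those tricks combined, assuming the same hypothetical `items` table as above plus a made-up index name:

```python
import psycopg2

rows = [("alpha", 1), ("beta", 2)]  # stand-in for the data parsed from XML

conn = psycopg2.connect("dbname=test")  # connection details are assumptions
with conn, conn.cursor() as cur:        # the with-block is one transaction
    cur.execute("DROP INDEX IF EXISTS items_name_idx")
    cur.executemany("INSERT INTO items (name, value) VALUES (%s, %s)", rows)
    # rebuilding the index once over the finished table is cheaper than
    # updating it incrementally on every single insert
    cur.execute("CREATE INDEX items_name_idx ON items (name)")
conn.close()
```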