in reply to Reading from a fast logfile and storing on Oracle
I'd definitely go with the approach of putting the data into a text file and using sqlldr (SQL*Loader) to bulk-load it into the database. In fact, I'd split the task into two simple programs.
One program would just create the file working.txt and write the formatted records to it. Every N lines or M seconds, it would close the file and rename it to ready.<YYYYMMDDhhmmss> (a date/time stamp as the extension).
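A minimal sketch of that rotation step in shell. The names working.txt and ready.<YYYYMMDDhhmmss> come from the description above; the record contents and the trigger (here the file is rotated immediately after writing, rather than after N lines or M seconds) are illustrative assumptions:

```shell
#!/bin/sh
# Sketch of the writer's rotation step (assumed details noted above).
set -eu

rotate() {
    # mv within one filesystem is an atomic rename, so the loader
    # program can never pick up a half-written file.
    mv working.txt "ready.$(date +%Y%m%d%H%M%S)"
}

# Demo in a scratch directory: write a couple of formatted records,
# then rotate the file.
cd "$(mktemp -d)"
printf '%s\n' "record 1" "record 2" > working.txt
rotate
```

The rename is the important part of the design: because a same-filesystem mv is atomic, the second program either sees a complete ready.* file or nothing at all, never a partial one.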
Your other program would sit in a loop looking for a file matching ready.*. When it finds one, it would bulk-load it with sqlldr and then delete it. If it finds none, it would sleep for Y seconds before starting the loop again.
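One pass of that loop might look like the sketch below. The ready.* pattern and the Y-second sleep come from the description above; the control file name (load.ctl), the connect string, and the demo filename are placeholders, not details from the original post. When sqlldr is not installed the sketch just reports what it would have loaded:

```shell
#!/bin/sh
# Hedged sketch of the loader program's polling pass.
set -eu

POLL_SECONDS=5   # the "Y seconds" sleep; the value is a placeholder

load_one() {
    f=$1
    if command -v sqlldr >/dev/null 2>&1; then
        # Credentials and control file are placeholders.
        sqlldr userid=scott/tiger control=load.ctl data="$f"
    else
        echo "sqlldr not installed; would load $f" >&2
    fi
    rm -f "$f"   # reached only if the load step succeeded (set -e)
}

# One pass of the loop: load every ready.* file, else sleep.
poll_once() {
    found=0
    for f in ready.*; do
        [ -e "$f" ] || continue   # glob expanded to nothing
        found=1
        load_one "$f"
    done
    [ "$found" -eq 1 ] || sleep "$POLL_SECONDS"
}

# Demo in a scratch directory: one ready file, one polling pass.
cd "$(mktemp -d)"
touch ready.20090108111000
poll_once
```

Deleting the file only after the load step has returned successfully means a crashed or failed load leaves the ready.* file in place to be retried on the next pass.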
...roboticus
Re^2: Reading from a fast logfile and storing on Oracle
by longjohnsilver (Acolyte) on Jan 08, 2009 at 11:10 UTC