I'd definitely go with the approach of writing the data to a text file and using sqlldr (SQL*Loader) to bulk-load it into the database. In fact, I'd split the task into two simple programs.
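For reference, a bare-bones SQL*Loader control file might look something like this; the table log_events and its columns are made up for illustration, and the INFILE value is just a placeholder since the loader program below passes the real file name with data= on the command line:

    LOAD DATA
    INFILE 'ready.dat'   -- placeholder; data= on the command line overrides it
    APPEND
    INTO TABLE log_events
    FIELDS TERMINATED BY ','
    (event_time, severity, message)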
One program could just create the file working.txt and write the formatted records to it. Every N lines or M seconds, it would close the file and rename it to ready.<YYYYMMDDhhmmss> (a date/time stamp as the extension).
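A minimal sketch of that writer, assuming the formatted records arrive on STDIN and with N and M as placeholder values you'd tune yourself:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use POSIX qw(strftime);

    my $max_lines = 1000;  # N: rotate after this many records (placeholder)
    my $max_secs  = 60;    # M: ... or after this many seconds (placeholder)

    open my $fh, '>', 'working.txt' or die "can't open working.txt: $!";
    my ($count, $started) = (0, time);

    while (my $record = <STDIN>) {     # assume records arrive one per line
        print {$fh} $record;
        if (++$count >= $max_lines || time - $started >= $max_secs) {
            close $fh or die "close: $!";
            my $stamp = strftime('%Y%m%d%H%M%S', localtime);
            rename 'working.txt', "ready.$stamp"
                or die "can't rename working.txt: $!";
            open $fh, '>', 'working.txt'
                or die "can't reopen working.txt: $!";
            ($count, $started) = (0, time);
        }
    }
    close $fh;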
Your other program would sit in a loop looking for files matching ready.*. If it finds one, it would bulk-load the file and then delete it; if not, it would sleep for Y seconds before starting the loop again.
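Something along these lines, where the scott/tiger login, the control-file name load.ctl, and the poll interval are all placeholders for your own values:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $poll_secs = 5;   # Y: how long to sleep when nothing is waiting

    while (1) {
        my @ready = sort glob 'ready.*';   # oldest timestamp first
        if (!@ready) {
            sleep $poll_secs;
            next;
        }
        for my $file (@ready) {
            # placeholder credentials and control file; adjust for your schema
            my $rc = system 'sqlldr', 'userid=scott/tiger',
                            'control=load.ctl', "data=$file";
            if ($rc == 0) {
                unlink $file or warn "can't delete $file: $!";
            }
            else {
                warn "sqlldr failed on $file; leaving it for inspection\n";
            }
        }
    }

Sorting the ready.* names also keeps the loads in chronological order, since the timestamp extensions sort lexically.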
...roboticus