in reply to Batch loading data into Oracle - Perl to the rescue!

The more interesting part of the problem is correctly extracting the relevant metadata. Are you sure that the column lengths in the data dictionary match the lengths of the corresponding fields in the text file? Is the column order the same? How are numbers represented in the file? You may need a hand-coded translation/mapping configuration file or table to handle the mapping accurately.
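For example, a minimal DBI sketch along these lines (assuming DBD::Oracle; the connect string and table name are placeholders) would pull the declared column order, types, and lengths out of the data dictionary for comparison against the file layout:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Connection details are placeholders
    my $dbh = DBI->connect( 'dbi:Oracle:mydb', 'user', 'pass',
                            { RaiseError => 1 } );

    # Declared column order, types, and lengths from the data dictionary
    my $sth = $dbh->prepare(q{
        SELECT column_name, data_type, data_length, column_id
          FROM user_tab_columns
         WHERE table_name = ?
         ORDER BY column_id
    });
    $sth->execute('MY_TABLE');    # placeholder table name

    while ( my ($name, $type, $len, $pos) = $sth->fetchrow_array ) {
        printf "%2d  %-30s %-10s %4d\n", $pos, $name, $type, $len;
    }
    $dbh->disconnect;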

Are you using 9i? It has some nice ETL features, such as external tables, which let you query a flat file as if it were a table.
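For reference, the external table DDL looks roughly like this (the table definition, directory object, and file name below are invented for illustration):

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect( 'dbi:Oracle:mydb', 'user', 'pass',
                            { RaiseError => 1 } );

    # Assumes a directory object already exists, e.g.
    #   CREATE DIRECTORY data_dir AS '/path/to/files'
    $dbh->do(q{
        CREATE TABLE my_ext_table (
            id    NUMBER,
            name  VARCHAR2(50)
        )
        ORGANIZATION EXTERNAL (
            TYPE ORACLE_LOADER
            DEFAULT DIRECTORY data_dir
            ACCESS PARAMETERS (
                RECORDS DELIMITED BY NEWLINE
                FIELDS TERMINATED BY ','
            )
            LOCATION ('data.txt')
        )
    });
    $dbh->disconnect;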

rdfield

Re: Re: Batch loading data into Oracle - Perl to the rescue!
by hotyopa (Scribe) on Jan 28, 2003 at 21:15 UTC

    Thanks for the reply. I am using 9i, but the files will be used only for querying, and are rather large, so there would be a bit too much of a performance hit to just use them as external tables.

    To answer your first question, the column definitions in the data dictionary are exactly the same for each file. So what I am looking to do is loop through every file in the directory, substitute the table name into the control file, and then invoke sqlldr - roughly as sketched below.
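    Something like this is what I have in mind - the directory, the file-to-table naming convention, the control file fields, and the userid are all placeholders here:

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $dir = '/path/to/data';    # placeholder data directory

        # Control file template; the markers get swapped out per file
        my $template = "LOAD DATA\n"
                     . "INFILE 'DATA_FILE_HERE'\n"
                     . "INTO TABLE TABLE_NAME_HERE\n"
                     . "FIELDS TERMINATED BY ','\n"
                     . "(col1, col2, col3)\n";

        opendir my $dh, $dir or die "Can't open $dir: $!";
        for my $file ( grep { /\.dat$/ } readdir $dh ) {

            # Assumes the file name (minus extension) is the table name
            ( my $table = $file ) =~ s/\.dat$//;

            ( my $ctl = $template ) =~ s/TABLE_NAME_HERE/$table/;
            $ctl =~ s{DATA_FILE_HERE}{$dir/$file};

            open my $out, '>', "$table.ctl"
                or die "Can't write $table.ctl: $!";
            print $out $ctl;
            close $out;

            # userid is a placeholder; check each load's exit status
            system( 'sqlldr', 'userid=user/pass', "control=$table.ctl" ) == 0
                or warn "sqlldr failed for $file: $?\n";
        }
        closedir $dh;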

    *~-}hotyopa{-~*