in reply to looking for some guidance before I dive in

Converting the reports into a data format that can be loaded into MS SQL Server sounds like just the job for Perl.

You can either write one system driven by configuration files or simply write a number of similar scripts. The choice depends on whether you are doing a one-time migration of legacy data or something that needs to be repeated and maintained.

In terms of the 'getting data into SQL Server' question, I would look at writing a Perl script that uses the Text::CSV module to write correctly formatted CSV files. These files can then be loaded into SQL Server using the Bulk Import facility, which is much better than issuing individual INSERT or UPDATE statements.
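As a rough sketch of that approach (the file name, columns and rows below are made up, so adjust to suit), something like this will write a correctly quoted CSV file:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Text::CSV;

    # Made-up report rows; in practice these would come from parsing
    # your report files.
    my @rows = (
        [ 1, 'Widget', 42 ],
        [ 2, 'Gadget', 17 ],
    );

    my $csv = Text::CSV->new( { binary => 1, eol => "\r\n" } )
        or die "Cannot use Text::CSV: " . Text::CSV->error_diag;

    open my $out, '>', 'report.csv' or die "report.csv: $!";
    $csv->print( $out, $_ ) for @rows;    # one quoted CSV line per row
    close $out or die "report.csv: $!";

On the server side the file can then be pulled in with something along the lines of BULK INSERT dbo.Report FROM 'C:\data\report.csv' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n'); check the exact options against your version of SQL Server.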

If you are looking to configure your script using XML files, then take a look at XML::Simple, which should let you read them with a minimum of fuss.
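For instance, assuming a small config file called convert.xml with <input> and <output> elements (the file name and element names are invented for the example), reading it takes only a couple of lines:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use XML::Simple;

    # Assumed config file layout:
    #   <config>
    #     <input>monthly_report.txt</input>
    #     <output>report.csv</output>
    #   </config>
    my $config = XMLin('convert.xml');

    print "Reading from $config->{input}, writing to $config->{output}\n";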

Enjoy!
Inman

Re: Re: looking for some guidance before I dive in
by dragonchild (Archbishop) on Oct 21, 2003 at 16:03 UTC
    With the caveat that you lose a lot of control by going with a bulk import facility, I would agree with you. If you need any sort of special handling, especially for errors, I would take the time to build the SQL yourself. It doesn't take that long, frankly.
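
    For example, a minimal DBI sketch along these lines (the DSN, table and columns are invented, and I'm assuming DBD::ODBC to talk to SQL Server) gives you the per-row error handling that a bulk import won't:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use DBI;

        # Invented rows and DSN; adjust driver, table and columns to your setup.
        my @rows = ( [ 1, 'Widget', 42 ], [ 2, 'Gadget', 17 ] );

        my $dbh = DBI->connect( 'dbi:ODBC:MyReports', 'user', 'password',
                                { RaiseError => 1, AutoCommit => 0 } );

        my $sth = $dbh->prepare(
            'INSERT INTO report (id, name, qty) VALUES (?, ?, ?)' );

        for my $row (@rows) {
            eval { $sth->execute(@$row) };
            warn "Skipping bad row [@$row]: $@" if $@;   # special handling goes here
        }

        $dbh->commit;
        $dbh->disconnect;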

    ------
    We are the carpenters and bricklayers of the Information Age.

    The idea is a little like C++ templates, except not quite so brain-meltingly complicated. -- TheDamian, Exegesis 6

    ... strings and arrays will suffice. As they are easily available as native data types in any sane language, ... - blokhead, speaking on evolutionary algorithms

    Please remember that I'm crufty and crochety. All opinions are purely mine and all code is untested, unless otherwise specified.