in reply to Re: CREATE TABLE DDL from DBI Statement
in thread CREATE TABLE DDL from DBI Statement

Thanks for the follow-up!

The output as CSV would look like this:

CONT_ID,DW_EFF_STRT_DT_KEY,DW_EFF_END_DT_KEY,ADMIN_CLIENT_ID,H_CREATE_DT,ADMIN_SYS_TP_CD,DW_ACTIVE_IND,H_ACTION_CODE
045234232344234874,42561,2234618,33ASDF948AKDH49234TI,"2005-10-03 11:03:51.144836",2,1,I
002346234135653687,42556,2123123,TUIK84K9D90D0AF2L43I,"2005-11-09 06:44:37.045530",2,1,I

Essentially, I have some routines that query DB tables and output CSV files (similar to DBD::CSV). Now I want to be able to output to an embedded SQL server, like SQLite, but first I need to create a similarly structured table.

I fully realise that I could use a simple template and fill in the blanks, but I figured something like this might already exist elsewhere(?)

Thanks for any help you can provide!

Kurt

Replies are listed 'Best First'.
Re^3: CREATE TABLE DDL from DBI Statement
by CountZero (Bishop) on Jul 22, 2008 at 05:24 UTC
    I don't think that will work. The first line gives you just the field names, but since a CSV file has no idea of the type, the length, whether or not a field is a key, ... you lose too much information to be able to rebuild a similar table.

    You definitely need something like SQL::Translator. It will read the table structure of your database and save it in an internal representation: that is what SQL::Translator::Parser::name_your_database_here does. Next you call SQL::Translator::Producer::name_your_other_database_here on this internal object and it will produce the SQL needed to build the same structure in the other database. Changing databases is as easy as replacing the type of the database in 'to'.

    Good thing is that there are parsers and producers for DB2!

    SQL::Translator is a big module but it pays to have a look at it.

    The full program could be as easy as:

    use strict;
    use SQL::Translator;

    my $translator = SQL::Translator->new(
        add_drop_table    => 1,
        no_comments       => 0,
        show_warnings     => 1,
        quote_table_names => 1,
        quote_field_names => 1,
        from              => 'DBI',
        parser_args       => {
            dsn         => 'dbi:DB2:Your_Database;host=Your_Host',
            db_user     => 'user',
            db_password => 'password',
        },
        to => 'SQLite',
    );

    my $output = $translator->translate or die $translator->error;
    print $output;

    CountZero

    "A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." - The Tao of Programming, 4.1 - Geoffrey James

      CountZero!

      Thanks for that awesome snippet. I'm going to give it a try for sure. It will be useful in many cases for me and I'll let you know about my success with it. Thanks for the level of detail you provided! :-)

      I just wanted to clarify a bit from my post(s). I currently have a process that connects to a DB, does a SELECT *, and then outputs a CSV. I'm using Text::CSV_XS to join my array_ref together by hand. I may have given the impression I'm using DBD::CSV, but I'm not. Before outputting the CSV, because I have a DBI statement handle, I have access to:

      my $type_ref      = $stmt->{TYPE};
      my $precision_ref = $stmt->{PRECISION};
      my $scale_ref     = $stmt->{SCALE};
      my $colnames      = $stmt->{NAME};
      my $nullable      = $stmt->{NULLABLE};

      I should have mentioned that there are no foreign keys on these tables and they are essentially flat.

      Instead of dumping the CSV, like I am now, I'd like to be able to generate the CREATE TABLE DDL from the statement handle I already have.

      I'll continue to play around.

      Thanks for your insights.

      Kurt

      CountZero,

      I had a chance to try SQL::Translator, but I wasn't able to get any results. :| The DB2 system I'm hitting (on a mainframe) has 10 main subsystems, each with 50-100 schemas, each with 100-200 tables. In the past I've used tools like Hibernate to reverse-engineer schemas, but that also takes forever to run. I have very low priority on the system, which doesn't help either.

      With that in mind, I did end up rolling my own little snippet.

      The heart of it is a hash lookup that maps the DBI sql_type to the DDL keyword and determines whether scale and precision are applicable. I've only done the first few types (CHAR, VARCHAR, INT, ...) but I know there are more to add. Once I get it more complete, maybe I should post it as a snippet(?)
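      A hedged sketch of that hash-lookup approach (not Kurt's actual code; the type map, column data, and helper name are invented for illustration, and the type-code constants are inlined rather than imported from DBI so the snippet runs stand-alone):

```perl
use strict;
use warnings;

# In a real script these come from: use DBI qw(:sql_types);
# They are inlined here (standard ODBC/ANSI values) so the sketch
# runs without a database connection.
use constant {
    SQL_CHAR      => 1,
    SQL_DECIMAL   => 3,
    SQL_INTEGER   => 4,
    SQL_TIMESTAMP => 11,
    SQL_VARCHAR   => 12,
};

# Map DBI type codes to DDL keywords, noting whether precision
# and/or scale belong in the column definition. Only a handful of
# types are covered, as in the approach described above.
my %type_map = (
    SQL_CHAR()      => { ddl => 'CHAR',      size => 1, scale => 0 },
    SQL_VARCHAR()   => { ddl => 'VARCHAR',   size => 1, scale => 0 },
    SQL_INTEGER()   => { ddl => 'INTEGER',   size => 0, scale => 0 },
    SQL_DECIMAL()   => { ddl => 'DECIMAL',   size => 1, scale => 1 },
    SQL_TIMESTAMP() => { ddl => 'TIMESTAMP', size => 0, scale => 0 },
);

# Build CREATE TABLE DDL from the parallel attribute arrays of an
# executed statement handle: TYPE, PRECISION, SCALE, NAME, NULLABLE.
sub ddl_from_sth {
    my ($table, $stmt) = @_;
    my @cols;
    for my $i (0 .. $#{ $stmt->{NAME} }) {
        my $t = $type_map{ $stmt->{TYPE}[$i] }
            or die "no DDL mapping for type code $stmt->{TYPE}[$i]\n";
        my $col = qq{"$stmt->{NAME}[$i]" $t->{ddl}};
        if    ($t->{scale}) { $col .= "($stmt->{PRECISION}[$i],$stmt->{SCALE}[$i])" }
        elsif ($t->{size})  { $col .= "($stmt->{PRECISION}[$i])" }
        $col .= ' NOT NULL' unless $stmt->{NULLABLE}[$i];
        push @cols, $col;
    }
    return qq{CREATE TABLE "$table" (\n    }
         . join(",\n    ", @cols)
         . "\n);\n";
}

# Demo with a faked statement handle (a plain hash with the same keys):
my $stmt = {
    NAME      => [qw(CONT_ID DW_EFF_STRT_DT_KEY H_CREATE_DT)],
    TYPE      => [SQL_CHAR, SQL_INTEGER, SQL_TIMESTAMP],
    PRECISION => [18, 10, 26],
    SCALE     => [0, 0, 6],
    NULLABLE  => [0, 1, 1],
};
print ddl_from_sth('MY_TABLE', $stmt);
```

      Extending it is just a matter of adding rows to %type_map; unmapped type codes die loudly rather than silently producing bad DDL.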

      Thanks for your time and your advice!

      Regards,
      Kurt