in reply to Combine data in table

Is the dataset so large that you need a relational database's SQL to manage it, or could you simply use Text::CSV to translate the CSV file into an easily manipulated data structure?
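
For scale, here is a minimal sketch of the parsing route using Text::CSV_XS (the file name and column handling are invented for illustration):

    use strict;
    use warnings;
    use Text::CSV_XS;

    my $file = 'data.csv';                   # hypothetical input file
    my $csv  = Text::CSV_XS->new({ binary => 1 });

    open my $fh, '<', $file or die "Can't open $file: $!";

    my $header = $csv->getline($fh);         # first row holds column names
    my @rows;
    while (my $row = $csv->getline($fh)) {
        my %record;
        @record{@$header} = @$row;           # hash slice: column name => value
        push @rows, \%record;
    }
    close $fh;

    # @rows is now an array of hashrefs you can sort, filter, or merge in Perl.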


Dave

Replies are listed 'Best First'.
Re^2: Combine data in table
by jZed (Prior) on Aug 07, 2005 at 18:34 UTC
    I'd suggest that the size of the dataset is not the deciding factor between using something like Text::CSV_XS and DBD::CSV, although it could be a factor in deciding between DBD::CSV and more robust DBMSs like SQLite, PostgreSQL, or MySQL. IMNSHO, the deciding factors in choosing between a Perl parsing approach and a database approach are the programmer's comfort level with SQL and the kind and complexity of the operations. Sure, parsing modules (and hand-rolled parsing) can delete, insert, update, sum, average, modify column structure, and so on. But when one finds oneself doing many of those operations, a database approach may be called for.
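
    For concreteness, a rough sketch of the DBD::CSV route (table and column names are invented here, and the exact SQL accepted depends on the SQL::Statement version underneath) might look like this:

        use strict;
        use warnings;
        use DBI;

        # Assumes CSV files with header rows in the current directory,
        # so the table 'cpu' maps to ./cpu.csv and 'mem' to ./mem.csv.
        my $dbh = DBI->connect('dbi:CSV:', undef, undef, {
            f_dir      => '.',
            f_ext      => '.csv',
            RaiseError => 1,
        });

        my $sth = $dbh->prepare(
            'SELECT cpu.host, cpu.pct_used, mem.mb_free
               FROM cpu JOIN mem ON cpu.host = mem.host'
        );
        $sth->execute;
        while (my @row = $sth->fetchrow_array) {
            print join(',', @row), "\n";
        }
        $dbh->disconnect;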
Re^2: Combine data in table
by DrAxeman (Scribe) on Aug 07, 2005 at 17:28 UTC
    I've been using DBI and Text::CSV for my work. The files are not that large.