richardwfrancis has asked for the wisdom of the Perl Monks concerning the following question:

Hi all,

In a nutshell, I want to get large amounts of data into my MySQL database from a text file. I'm using DBI to work with the database.

"Load data local infile" is obviously the way to go but my problem is that between reading in the text file and loading the data, I have to do a number of processing steps on the data.

I'd rather not write the processed data back to the filesystem and then use "Load data local infile", as my files can be quite large (~300MB). I would, however, like to use the speed of "Load data local infile" as opposed to insert statements.

Can anyone think of a way to do this?
Any pointers would be very much appreciated.

Many thanks in advance,

Rich

Re: DBI: pipe data to "load data local infile"?
by djp (Hermit) on Jan 18, 2010 at 11:16 UTC
    On Linux and certain Unixes, /dev/stdin is a pathname for standard input. You could use two processes connected by a pipe; the first process reads and massages the data and writes it to standard output, and the second process uses "Load data local infile" with /dev/stdin as the filename. 'man stdin' for details. Hope this helps.
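
    A minimal sketch of that two-process approach in Perl, assuming a tab-delimited target table; the database name (mydb), table name (my_table), credentials, input filename, and the massage() routine are all placeholders rather than anything from this thread:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use DBI;

        # Fork: the child's STDIN becomes the read end of a pipe fed by
        # whatever the parent prints to $to_loader.
        my $pid = open(my $to_loader, '|-');
        defined $pid or die "fork failed: $!";

        if ($pid == 0) {
            # Child: run LOAD DATA LOCAL INFILE against /dev/stdin, which
            # now names the pipe carrying the massaged records.
            my $dbh = DBI->connect(
                'DBI:mysql:database=mydb;mysql_local_infile=1',
                'user', 'password', { RaiseError => 1 },
            );
            $dbh->do(q{
                LOAD DATA LOCAL INFILE '/dev/stdin'
                INTO TABLE my_table
                FIELDS TERMINATED BY '\t'
                LINES TERMINATED BY '\n'
            });
            $dbh->disconnect;
            exit 0;
        }

        # Parent: read the raw file, apply the processing steps, and send
        # tab-delimited rows down the pipe.
        open(my $in, '<', 'big_input.txt') or die "open failed: $!";
        while (my $line = <$in>) {
            chomp $line;
            my @fields = massage($line);    # your processing steps go here
            print {$to_loader} join("\t", @fields), "\n";
        }
        close $in;
        close $to_loader or die "load process exited abnormally";

        sub massage {
            my ($line) = @_;
            return split /,/, $line;        # placeholder transformation
        }

    Note that the connection needs mysql_local_infile=1 (as in the DSN above), otherwise the client or server may reject the LOCAL INFILE statement.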

      Thanks djp, I'll look into that. That's useful info to know even if I don't use it for this project.

      Cheers.
      great!! thanks a ton
Re: DBI: pipe data to "load data local infile"?
by stefbv (Priest) on Jan 18, 2010 at 16:44 UTC
    Just for the record, there is a module for this: MySQL::Slurp
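
    A rough sketch of how that might look, based on the module's CPAN synopsis; the database and table names, input file, and massage() routine are placeholders, and the exact constructor options and method names should be checked against the MySQL::Slurp documentation for your version:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use MySQL::Slurp;

        # MySQL::Slurp sets up a FIFO and points LOAD DATA INFILE at it,
        # so rows printed here stream straight into the table.
        my $slurper = MySQL::Slurp->new(
            database => 'mydb',
            table    => 'my_table',
        )->open;

        open(my $in, '<', 'big_input.txt') or die "open failed: $!";
        while (my $line = <$in>) {
            chomp $line;
            my @fields = massage($line);    # your processing steps go here
            $slurper->print(join("\t", @fields) . "\n");
        }
        close $in;
        $slurper->close;

        sub massage {
            my ($line) = @_;
            return split /,/, $line;        # placeholder transformation
        }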

      You beauty!!

      I've not implemented this yet, but that looks like just the thing I'm looking for.

      Many, many thanks, stefbv, for pointing me to this.

      Cheers