Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi there,
I hope someone can help me out here, I have an XML feed which I want to take from a website. Then filter straight into my MySQL database. This will be a cronjob which runs every night and updates the data. Any ideas how to go about this?

My initial thought is to use LWP::UserAgent to get the content and save it locally on the server, then use XML::Simple to load the file into a data structure before processing it and loading it into the database.
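A minimal sketch of that pipeline, assuming a feed URL, database credentials, and a table layout that are placeholders rather than anything from the actual feed:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::UserAgent;
    use XML::Simple;
    use DBI;

    # Placeholder URL -- substitute the real feed address
    my $ua  = LWP::UserAgent->new( timeout => 30 );
    my $res = $ua->get('http://example.com/feed.xml');
    die "Fetch failed: ", $res->status_line, "\n" unless $res->is_success;

    # ForceArray keeps single-item feeds from collapsing into a hash
    my $data = XMLin( $res->decoded_content, ForceArray => ['item'] );

    # Placeholder DSN, table, and columns
    my $dbh = DBI->connect( 'dbi:mysql:database=mydb', 'user', 'pass',
                            { RaiseError => 1 } );
    my $sth = $dbh->prepare('INSERT INTO items (title, link) VALUES (?, ?)');
    for my $item ( @{ $data->{item} } ) {
        $sth->execute( $item->{title}, $item->{link} );
    }

Run from cron, this does fetch, parse, and insert in one pass; writing the XML to disk first is only needed if you want a local copy for debugging or re-runs.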

Is this the best way to go about it? Are there any other, more specific Perl modules anyone knows about?

Thanks for your help on this one.
Cheers, Tom

Replies are listed 'Best First'.
Re: XML Feed to MySQL
by Joost (Canon) on Oct 20, 2004 at 11:30 UTC
      Yeah thanks, that bit of code seems cool; LWP::Simple probably is enough. I have actually found a handy getstore method as well, so I will use that to store to disk.
Re: XML Feed to MySQL
by gellyfish (Monsignor) on Oct 20, 2004 at 11:31 UTC

    You might find DBIx::XML::DataLoader useful for this although it might prove to be overkill for your particular application.

    /J\

Re: XML Feed to MySQL
by jZed (Prior) on Oct 20, 2004 at 14:40 UTC
    DBD::AnyData can create in-memory databases from remote XML data sources so you could scrape and insert with no parsing or file saves (although you could easily keep a file copy by adding a single line). It uses LWP and XML::Twig in the background.
    #!perl -w
    use strict;
    use DBI;

    # $mysql_dsn, $remote_url, $colnames, $mysqlTable and $placeholder_str
    # are assumed to be defined elsewhere
    my $dbh_a = DBI->connect('dbi:AnyData(RaiseError=1):');
    my $dbh_m = DBI->connect( $mysql_dsn );

    # pull the remote XML straight into an in-memory table
    $dbh_a->ad_import( 'temp', 'XML', $remote_url );

    my $select = $dbh_a->prepare("SELECT $colnames FROM temp");
    my $insert = $dbh_m->prepare("
        INSERT INTO $mysqlTable ($colnames)
        VALUES ($placeholder_str)
    ");
    $select->execute;
    while ( my $row = $select->fetch ) {
        $insert->execute( @$row );
    }
    update - added a line I forgot to show: the mysql connect