Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Is there a Perl module that will help us move a MySQL database from one data center to another? We have a huge amount of data on our servers and we can't afford to lose anything. Is there a fast way to copy a MySQL database remotely, perhaps from within Perl or with a Perl module?

We are moving from BurstNet to RackSpace because of all the huge amounts of downtime and latency, and are changing from cPanel/WHM on CentOS 5 to Plesk on Red Hat Enterprise Linux. cPanel certainly has phpMyAdmin, where we can do a dump... and Plesk seems to have it as well, where we can import it. However, with this much data, I fear the browser will time out.

Thank you for any help in my search, preferably for a Perl module to help me with this.

Richard

Re: Perl and MySql
by JavaFan (Canon) on Oct 17, 2008 at 20:51 UTC
    Well, I guess one could always use DBD::mysql and write a dumper/importer, but why not just use mysqldump to dump the database, and import it on the other side?

    If you set up an ssh tunnel, you can even do it in a pipe, so you don't need the extra disk space to store the dump.
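A minimal sketch of that dump-through-a-pipe idea, assuming you have ssh access to the new host and the target database already exists there (all host, user, and database names below are placeholders, not values from the thread):

```shell
# Dump on the old host and load directly into the new one over ssh,
# so no intermediate dump file ever touches the disk.
# All names here are placeholders.
mysqldump --user=myuser --password=mypass mydatabase \
  | ssh user@new.db.host 'mysql --user=myuser --password=mypass mydatabase'
```

One design note: this trades disk space for a long-lived network connection, so for a very large database a dropped connection means starting over, whereas a dump file can be resumed with rsync.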

      Seconded.

      There is absolutely no point in re-inventing the wheel - especially for a one-off job - when you can invoke mysqldump with a few seconds of typing.

      mysqldump --user=myuser --password=mypass mydatabase > mydb.sql

      I would prefer to actually see a file there rather than pipe through a tunnel - provided that I had the disc space.

      If the resultant file is large, consider putting it through bzip2 before you move it. As the output of mysqldump is plain text (SQL), I have found that bzip2 can crunch it down enormously.

      bzip2 mydb.sql

      You might also want to take an MD5 sum of the resultant dump file to compare it with the file copied across the network. Better safe than sorry.

      md5sum mydb.sql.bz2
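As a sketch of that verification step (a tiny stand-in file is used here since the real dump lives on your server; on the real hosts you would checksum mydb.sql.bz2 on each side and compare):

```shell
# Create a stand-in "dump", compress it, and record its checksum.
printf 'CREATE TABLE t (id INT);\n' > mydb.sql
bzip2 -f mydb.sql                       # produces mydb.sql.bz2
md5sum mydb.sql.bz2 > mydb.sql.bz2.md5  # record the checksum
# After copying the file and its .md5 across, on the receiving side:
md5sum -c mydb.sql.bz2.md5              # exits non-zero if the copy is corrupt
```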
        mysqldump --user=myuser --password=mypass mydatabase > mydb.sql

        You should be able to do the bzip at the same time as the dump: mysqldump --user=myuser --password=mypass mydatabase | bzip2 > mydb.sql.bz2 That way, you don't use as much disk space.

Re: Perl and MySql
by Illuminatus (Curate) on Oct 17, 2008 at 21:55 UTC
    You don't need mysqldump if you aren't using InnoDB. As long as you are moving to the same architecture, you can just tar up the database files and copy them to the new system: http://dev.mysql.com/doc/refman/6.0/en/upgrading-to-arch.html. It may even work for InnoDB, but that would require more checking.
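A rough sketch of that file-copy approach, assuming MyISAM tables and the default datadir (paths, host names, and the database name are placeholders); the server should be stopped, or tables flushed and locked, before copying so the files on disk are consistent:

```shell
# Assumes MyISAM tables and the default datadir (/var/lib/mysql);
# verify yours with: mysql -e "SHOW VARIABLES LIKE 'datadir'"
# Stop mysqld first so nothing writes to the files mid-copy.
cd /var/lib/mysql
tar -cf /tmp/mydatabase.tar mydatabase/
scp /tmp/mydatabase.tar user@new.db.host:/tmp/
# On the new host: extract into its datadir, fix ownership, restart:
#   cd /var/lib/mysql && tar -xf /tmp/mydatabase.tar
#   chown -R mysql:mysql mydatabase/
```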

    This assumes, of course, that you are not really asking, "how can I move my database from one data center to the other without taking it down?" That, as they say, is a horse of a completely different species...

Re: Perl and MySql
by bichonfrise74 (Vicar) on Oct 17, 2008 at 21:41 UTC
    Questions:
    1. How big is the data?
    2. Are you using MyISAM, InnoDB, or a combination of both?

    - bichonfrise74
Re: Perl and MySql
by graff (Chancellor) on Oct 18, 2008 at 05:06 UTC
    You fear the browser will time out? Don't use a browser for this. If you don't have access to a login shell on the host with the existing database, use a machine where you do have login shell access and see if running a mysql client there will let you connect to the database host. A command line like the following will put the database into one file on the machine you're logged into:
    mysqldump -h existing.db.hostname -p -u dbusername dbname > full_db.dump
    The "-p" tells mysqldump to prompt you for the db account password. If the existing db is in use while you do this (inserts and updates are being done), you'll want to check the mysqldump manual for additional options that lock things while the dump is in progress, so you get a coherent set of data.
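For reference, those locking-related options might look like this (host, user, and database names are placeholders; check the mysqldump manual page for your server version):

```shell
# --single-transaction takes a consistent snapshot of InnoDB tables
# without blocking writers; for MyISAM tables use --lock-all-tables
# instead, which does block writes for the duration of the dump.
mysqldump -h existing.db.hostname -u dbusername -p \
  --single-transaction dbname > full_db.dump
```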
Re: Perl and MySql
by Anonymous Monk on Oct 19, 2008 at 06:58 UTC
    I got it... here is how I did it:

    from shell:
    cd /var/lib/mysql
    ls -las
    # That listed all our databases...
    tar -cf databases_to_move.tar database_name database_name2
    # once that finished I made sure it was there:
    ls -las
    mv databases_to_move.tar /home/path/to/public_html
    # make sure it moved:
    ls -las
    cd /home/path/to/public_html
    ls -las
    # Ok it was there; now go log in to the new database server at the new datacenter:
    cd /var/lib/mysql
    wget http://www.ourdomain.com/databases_to_move.tar
    # Make sure the tar file was transferred
    # (it was 7.82 Gigs so I had to do it in a bunch of files
    # that were about 800MB each)
    ls -las
    # once I made sure it was there I extracted the databases:
    tar -xf databases_to_move.tar
    # Remove the tar file
    rm -f databases_to_move.tar
    # make sure it was removed
    ls -las
    # all done.
    That did it...

    Thank you for all the information. I did a lot of reading on the links and information you provided.

    Thanks again.
    Richard