korjavin has asked for the wisdom of the Perl Monks concerning the following question:

Hello,

I have a very small project using DBIx::Class, where the database schema changes often.

It's bad, I know, but there's nothing I can do about that.

I want to find a solution for updating the DB schema and reimporting data without a lot of pain.

I looked at http://search.cpan.org/~jjnapiork/DBIx-Class-Migration-0.039/lib/DBIx/Class/Migration/Tutorial.pod but couldn't even install it - both cpan and cpanm failed.

Before I spend a thousand hours installing and learning this, please tell me: is it worth it?

Maybe there is a simpler solution for my task, perhaps one not related to DBIx::Class?


Re: Need advice about database-migration for my dbix::class
by Rhandom (Curate) on Jun 26, 2013 at 03:30 UTC
    This isn't a DBIx::Class problem as such; it applies to most databases in general. There are two options: a cutover, which involves downtime, or a slow migration without downtime.

    In the cutover, your code and your database move at exactly the same time.

    1. You shut down connections to the database.
    2. Migrate the schema.
    3. Deploy your new code.
    4. Start your database connections back up.
    5. And pray.
    Actually, you can skip the last step if you have a proper staging environment in which to test the migration. Downtime depends on the size of your tables: if you have GB-size tables, be prepared for a lot of downtime.
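
    A minimal Perl sketch of step 2 (migrating the schema) using plain DBI; the connection parameters, the orders table, and the status column are hypothetical, and shutting the app down and redeploying it are assumed to happen outside this script:

        #!/usr/bin/env perl
        use strict;
        use warnings;
        use DBI;

        # Hypothetical connection parameters; adjust for your setup.
        my $dbh = DBI->connect(
            'dbi:mysql:database=myapp', 'user', 'pass',
            { RaiseError => 1, AutoCommit => 0 },
        );

        # Apply this release's DDL and data backfill as one unit.
        # Note: MySQL DDL commits implicitly, so the rollback below
        # only protects the backfill, not the ALTER itself.
        eval {
            $dbh->do('ALTER TABLE orders ADD COLUMN status VARCHAR(20)');
            $dbh->do(q{UPDATE orders SET status = 'legacy' WHERE status IS NULL});
            $dbh->commit;
            1;
        } or do {
            my $err = $@ || 'unknown error';
            eval { $dbh->rollback };
            die "Migration failed: $err";
        };

        $dbh->disconnect;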

    A variation of the cutover can be used if you have decent DB error handling and, ideally, automated schema migration. In that case you deploy your code first and let a few errors occur while you update the database. This method only works if database errors are acceptable and your schema change is small.
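
    One way to make that error window survivable is to wrap DB calls in a small retry helper, sketched below; the with_retry name and the retry/backoff policy are illustrative, not something prescribed here:

        use strict;
        use warnings;
        use Time::HiRes qw(sleep);

        # Retry a DB operation a few times before giving up, so short
        # windows of schema-change errors don't take the app down.
        sub with_retry {
            my ( $tries, $code ) = @_;
            for my $attempt ( 1 .. $tries ) {
                my $result = eval { $code->() };
                return $result unless $@;
                warn "DB call failed (attempt $attempt/$tries): $@";
                sleep 0.5 * $attempt;    # simple linear backoff
            }
            die "DB call failed after $tries attempts\n";
        }

        # Usage, with a hypothetical $dbh:
        # my $rows = with_retry( 3, sub {
        #     $dbh->selectall_arrayref('SELECT id, status FROM orders');
        # } );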

    The slow migration goes like this:

    1. Roll out the new schema. If your tables are large and you're on MySQL, use something like Percona's pt-online-schema-change ( http://www.percona.com/doc/percona-toolkit/2.1/pt-online-schema-change.html ) for a zero-downtime schema change. The new schema must not have any NOT NULL constraints or unsatisfied foreign keys.
    2. Deploy code that writes changes to both the old and new locations (see the dual-write sketch after this list).
    3. Sanity-check your data.
    4. Change your code to read from only the new location.
    5. Apply schema changes that remove the old locations, add any foreign key relations, and add the NOT NULL constraints.
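
    A minimal sketch of the dual-write in step 2, using DBIx::Class; the OldOrder and NewOrder result classes and their columns are hypothetical, and the txn_do wrapper assumes both locations live in the same database:

        use strict;
        use warnings;

        # $schema is an already-connected DBIx::Class::Schema.
        sub create_order {
            my ( $schema, %args ) = @_;
            $schema->txn_do( sub {
                # The old location stays authoritative until step 4...
                my $old = $schema->resultset('OldOrder')->create( {
                    customer => $args{customer},
                    total    => $args{total},
                } );
                # ...while the new location is kept in sync for the
                # sanity check in step 3 and the read switch in step 4.
                $schema->resultset('NewOrder')->create( {
                    legacy_id   => $old->id,
                    customer    => $args{customer},
                    total_cents => int( $args{total} * 100 ),
                } );
                return $old;
            } );
        }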

    A variation of this last scheme is to change your code to detect which schema version it is running against and make the appropriate DB calls.
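
    A sketch of that version-detection variation; the one-row schema_version table and the orders_v2 table name are assumptions for illustration:

        use strict;
        use warnings;

        # Assumed convention: a single-row table recording the current
        # schema version, bumped by every migration.
        sub current_schema_version {
            my ($dbh) = @_;
            my ($v) = $dbh->selectrow_array(
                'SELECT version FROM schema_version LIMIT 1');
            return $v // 0;
        }

        sub fetch_order_status {
            my ( $dbh, $order_id ) = @_;
            if ( current_schema_version($dbh) >= 2 ) {
                # New location, available from schema version 2 onward.
                return $dbh->selectrow_array(
                    'SELECT status FROM orders_v2 WHERE id = ?',
                    undef, $order_id );
            }
            # Old location, still correct mid-migration.
            return $dbh->selectrow_array(
                'SELECT status FROM orders WHERE id = ?',
                undef, $order_id );
        }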

    The last scheme is much more difficult to execute, but it can be done in stages with zero downtime. At places I've worked we have used this mechanism for 15 years (well, Percona is a recent improvement). Figuring out which approach is best is your call; there isn't a one-size-fits-all solution. Yet.

    my @a=qw(random brilliant braindead); print $a[rand(@a)];