legLess has asked for the wisdom of the Perl Monks concerning the following question:

Monks ~

I'm packaging a mid-sized Perl application and have a question about tests. I have a comprehensive test suite (including WWW::Mechanize interface tests) but the application itself requires a MySQL database. Most of the tests require access to a live data source.

This is fine for local development, but I'm not sure how to handle it when I distribute the app. I'd have to skip most of the tests, somehow, since it wouldn't be possible to run them on install. Should I just run a skeleton suite of really basic tests? If I test before I package, I can be pretty sure that any data source which meets the requirements will work.

There's advice here (Writing tests for modules using database handles) to support other data sources, like DBD::CSV and DBD::RAM. This is a good idea, and supporting SQLite and Postgres is on my list. I don't want to hold up packaging and distributing this (early and often!) until I add support for more data sources, though.

Aside from the more theoretical question, I'm not sure how to implement it. All I can think of now is to pass a command-line argument into the tests which, if received, runs the data source tests. The default 'test' target in the Makefile wouldn't pass such an argument, so these tests would be skipped unless specifically requested. This seems like a nasty hack, and I'm eager for a better solution.
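
For what it's worth, here's roughly what that would look like in one of the .t files (just a sketch; the flag name, DSN, and test layout are placeholders, not code from my actual suite):

    # t/20-live-db.t -- run the live-database tests only when asked to
    use strict;
    use warnings;
    use Test::More;
    use DBI;

    # The default 'make test' target never passes this flag, so these
    # tests would normally be skipped.
    unless ( grep { $_ eq '--with-db' } @ARGV ) {
        plan skip_all => 'live database tests not requested';
    }

    plan tests => 1;

    my $dbh = DBI->connect( 'dbi:mysql:database=myapp_test', 'user', 'pass',
                            { PrintError => 0 } );
    ok( $dbh, 'connected to the test database' );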

Thanks in advance for any advice.

legless



Re: Packaging and testing a module which requires a database
by chromatic (Archbishop) on Feb 06, 2004 at 17:04 UTC

    I've used DBD::AnyData with a very small database (ten rows per table) to test against data I could update, delete, and reload as necessary. It worked well, despite a few SQL differences at the time; those should be fixed by now.
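
    Something like this, for what it's worth (a minimal sketch, not my actual test code; the table and data are made up, and it relies on DBD::AnyData letting you create in-memory tables with plain SQL):

        use strict;
        use warnings;
        use Test::More tests => 2;
        use DBI;

        # Purely in-memory database; nothing touches the filesystem
        my $dbh = DBI->connect('dbi:AnyData(RaiseError=>1):');

        # Tables created through plain SQL live only in memory, so
        # reloading the test data is just a matter of re-running this
        $dbh->do('CREATE TABLE users (id INTEGER, name VARCHAR(30))');
        $dbh->do('INSERT INTO users VALUES (?, ?)', undef, 1, 'legless');

        my ($name) = $dbh->selectrow_array(
            'SELECT name FROM users WHERE id = ?', undef, 1 );
        is( $name, 'legless', 'round trip through the in-memory table' );

        ok( $dbh->do('DROP TABLE users'), 'scratch table dropped again' );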

    I've also used an environment variable to skip slow or tricky tests. TEST_DATABASE or the like would work. You'd run it with a command such as:

    TEST_DATABASE=1 make test
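
    On the test side that can be a single skip_all at the top of the database tests, something like this (a sketch; the DSN and credentials are placeholders):

        use strict;
        use warnings;
        use Test::More;
        use DBI;

        # Only exercise the live database when explicitly asked to
        plan skip_all => 'set TEST_DATABASE=1 to run the live database tests'
            unless $ENV{TEST_DATABASE};

        plan tests => 1;

        my $dbh = DBI->connect( 'dbi:mysql:test', '', '', { PrintError => 0 } );
        ok( $dbh, 'connected to the live test database' );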
      I hadn't thought of an environment variable; that sounds like the best way. Thanks!

      My only problem with AnyData (or SQLite, or anything else) is that I depend on a MySQL-specific function for autoincrementing primary keys. I just don't want to spend the mental energy hacking around that until I've done more work on features that people are asking for.

      But that's probably false laziness. I haven't done a realistic time assessment and I need to.

      In the "small world" department, Jeff Zucker was just yesterday bemoaning the fact that he hasn't had time to implement autoincrement keys in AnyData.
Re: Packaging and testing a module which requires a database
by blokhead (Monsignor) on Feb 06, 2004 at 17:36 UTC
    I've struggled with this exact problem for my recent releases of Class::Tables. Its tests require a working MySQL or SQLite data source, which is filled with some sample data to exercise the module's interface. You can check out the source of my Makefile.PL and test.pl to see how I get the module's build process to configure the test data source. When developing my test suite, I searched CPAN for DBIx::* modules and borrowed (stole) a lot of ideas from those that had test suites (I can't remember exactly which ones). DBD::mysql also has quite an extensive test suite, so you have a few places to start looking.

    Here's the very rough outline of what I do for my tests: (I'd also be interested to know if this isn't the Right Way to do it)

    1. Makefile.PL prompts for database connection info, and then saves it in a ./testconfig/Config.pm file using Data::Dumper. You should provide some mechanism to change the configuration (I use the -s option to Makefile.PL). You may also want to provide an option to skip the tests here.
    2. test.pl loads the Config.pm file to get the connection information. If the user has decided to skip the tests, or we couldn't connect to the database, you may want to skip_all your tests. If we could connect, then test.pl empties out all the tables in this database and inserts the test data manually with SQL statements (both steps are sketched just below).
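
    As a very stripped-down illustration of both steps (not the actual Class::Tables code, just the shape of it; module names and paths are placeholders):

        ## Makefile.PL -- prompt once, remember the answers
        use ExtUtils::MakeMaker;
        use Data::Dumper;

        my %config = (
            dsn  => prompt( 'DBI DSN for the test database?', 'dbi:mysql:test' ),
            user => prompt( 'Database user?',     '' ),
            pass => prompt( 'Database password?', '' ),
        );

        mkdir 'testconfig' unless -d 'testconfig';
        open my $fh, '>', 'testconfig/Config.pm' or die "can't write config: $!";
        $Data::Dumper::Terse = 1;              # drop the $VAR1 = wrapper
        print $fh "package testconfig::Config;\n",
                  "our %config = %{ ", Dumper( \%config ), " };\n1;\n";
        close $fh;

        WriteMakefile( NAME => 'Your::Module', VERSION_FROM => 'lib/Your/Module.pm' );

        ## test.pl -- load the saved config, connect or bail out
        use Test::More;
        use DBI;
        use lib '.';
        use testconfig::Config;

        my %config = %testconfig::Config::config;
        my $dbh = DBI->connect( @config{qw(dsn user pass)}, { PrintError => 0 } );

        plan skip_all => 'no test database available' unless $dbh;
        plan tests => 1;
        ok( $dbh->ping, 'connected to the configured test database' );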

    That's the basic outline, but I think rather than getting into much more detail, you should look at/steal from other test suites that fit your goals. Hope this helps!

    Update: Just looked through my browser history: DBIx::FullTextSearch is one of the DBIx:: modules I used as a test template.

    blokhead

      Thanks for your reply. I've got a Config module that loads from a config file (in YAML, as it should be :-). That's not a bad way to do it.

      But the more I think about it, the more it seems like The Right Way is to use some lightweight RAM or filesystem data source, which I need to implement anyway. The Intermediate Way will likely be chromatic's environment-variable suggestion.
Re: Packaging and testing a module which requires a database
by perrin (Chancellor) on Feb 06, 2004 at 18:43 UTC
    The purpose of your tests is to tell if the system is working, so it seems reasonable to me to require a database setup step. You can add a "make db" step to your install, which sets up the database for your app. People have to run that before running make test.
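
    With ExtUtils::MakeMaker, a postamble is one way to get that extra target (a sketch; bin/create_test_db.pl stands in for whatever script actually creates and populates the database):

        # In Makefile.PL
        sub MY::postamble {
            # the recipe line below must begin with a hard tab
            return "db :\n\t\$(PERL) bin/create_test_db.pl\n";
        }

    The install sequence then becomes perl Makefile.PL, make db, make test, make install.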
Re: Packaging and testing a module which requires a database
by jZed (Prior) on Feb 06, 2004 at 23:19 UTC
    Just a historical note - DBD::RAM has been superseded by DBD::AnyData (both my modules). Paradoxically, I've been afraid to kill it because it refuses to die :-).