http://qs1969.pair.com?node_id=55985

BoredByPolitics has asked for the wisdom of the Perl Monks concerning the following question:

I'm putting together a proposal for an application which will probably need to run on Linux as well as DR-DOS.

For the Linux version, I'm looking at Perl/Tk, with MySQL providing the database backend (each system running the application will have its own database, for those times when the central database is out of action). All doable, no problems there.

However, when I start to look into the viability of running Perl on DR-DOS, I find there is a port (DJGPP) with a few DR-DOS-imposed restrictions, but what can I use for the database?

MySQL isn't an option (I think), since it doesn't appear to run purely under DOS (it requires winsock2). So the only other solution I could find was the GNU port of the gdbm libraries.
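
The gdbm route, as I understand it, would just be a tied hash along these lines (a rough sketch with a made-up file name and record layout, assuming GDBM_File actually builds under the DJGPP port):

    use strict;
    use GDBM_File;

    # Tie a hash to a gdbm file; GDBM_WRCREAT creates the file if needed.
    my %orders;
    tie %orders, 'GDBM_File', 'orders.gdbm', &GDBM_WRCREAT, 0640
        or die "Can't open orders.gdbm: $!";

    # Keys and values are plain strings, so structured records have to be
    # packed/joined by hand (or serialized some other way).
    $orders{'1001'} = join "\t", 'widget', 12, '2001-02-02';

    my ($item, $qty, $date) = split /\t/, $orders{'1001'};
    print "$item x $qty on $date\n";

    untie %orders;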

I was wondering if any Monks have experience with this combination and could offer any advice, or better suggestions.

Pete

Re: Perl, gdbm, and DR-DOS ...
by mr.nick (Chaplain) on Feb 02, 2001 at 19:35 UTC
    Hm. I'm not sure what restrictions DR-DOS imposes. I think you have three options available to you:
    • Abstract the database I/O functions into your own module, which can be different for each platform (I know that DBI is fairly abstracted already, but I'm referring to something that makes both a dbm database (ala tie) and a SQL database (ala DBI) totally transparent to the application). You can then just use whichever module is appropriate for the system (including loading it dynamically with require). This lets you optimize the db access for each system (see the sketch after this list).
    • Go with a straight dbm-style database that will be the same everyplace.
    • Use a flat-file style system that you control yourself. If I need something that *must* work on a system with just basic Perl installed, this is what I do. Very inefficient and slow, but I have YET to have one not work someplace.*
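
    A minimal sketch of what I mean by the first option: one store class whose callers never see which backend is underneath. The module name, DSN, and the kv table are invented for illustration, and I've used SDBM_File (which ships with core Perl) where GDBM_File would drop in the same way:

        package MyApp::Store;    # hypothetical module name
        use strict;

        sub new {
            my ($class, %args) = @_;
            my $self = bless { backend => $args{backend} }, $class;
            if ($self->{backend} eq 'dbi') {
                require DBI;
                $self->{dbh} = DBI->connect($args{dsn}, $args{user}, $args{pass},
                                            { RaiseError => 1 });
            }
            else {                       # tied dbm backend
                require SDBM_File;
                require Fcntl;
                $self->{db} = {};
                tie %{ $self->{db} }, 'SDBM_File', $args{file},
                    Fcntl::O_RDWR() | Fcntl::O_CREAT(), 0640
                    or die "Can't tie $args{file}: $!";
            }
            return $self;
        }

        # The application only ever calls get/set; the backend stays invisible.
        sub set {
            my ($self, $key, $value) = @_;
            if ($self->{backend} eq 'dbi') {
                $self->{dbh}->do('REPLACE INTO kv (k, v) VALUES (?, ?)',
                                 undef, $key, $value);
            }
            else {
                $self->{db}{$key} = $value;
            }
        }

        sub get {
            my ($self, $key) = @_;
            return $self->{backend} eq 'dbi'
                ? $self->{dbh}->selectrow_array(
                      'SELECT v FROM kv WHERE k = ?', undef, $key)
                : $self->{db}{$key};
        }

        1;

    On Linux the application would construct it with something like MyApp::Store->new(backend => 'dbi', dsn => 'dbi:mysql:appdb', ...), on DOS with new(backend => 'dbm', file => 'app'), and the rest of the code doesn't care which it got.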

    Regardless of which you choose, it's ALWAYS a balance between compatibility and efficiency.

    Good Luck!

    * I write scripts that must be runnable anywhere, at any time, without any "setting up" required. They must all be drop-and-run. Hell, I had a requirement not long ago that one had to be runnable from this guy's iPaq :)

      Thanks for the replies; however, I've come to the conclusion that running on DR-DOS is going to be a non-starter.

      There are 3 large databases that will need to be maintained, so based on the feedback I've received I imagine that access will be very slow.

      But the real show-stopper is the lack of sockets in the DR-DOS port of Perl - that leaves me with no way of communicating with the backend database on the server (unless someone knows of a way of getting sockets working in Perl under DOS).

      Pete

Re: Perl, gdbm, and DR-DOS ...
by clemburg (Curate) on Feb 02, 2001 at 19:43 UTC

    If you're considering the gdbm stuff, take a look at DBD::CSV and ask yourself whether you can't do it with that. In my experience, using the gdbm stuff often results in nasty problems, either when compiling it or when installing the Perl extensions that need the gdbm libraries.
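
    A rough example of the DBD::CSV route (table and column names are invented, and it assumes DBD::CSV and SQL::Statement install cleanly on the target):

        use strict;
        use DBI;

        # Each file in f_dir is exposed as a table named after the file.
        my $dbh = DBI->connect('DBI:CSV:f_dir=./data', undef, undef,
                               { RaiseError => 1 });

        $dbh->do('CREATE TABLE stock (part CHAR(20), qty INTEGER)');
        $dbh->do('INSERT INTO stock VALUES (?, ?)', undef, 'widget', 12);

        my $sth = $dbh->prepare('SELECT part, qty FROM stock WHERE qty > 10');
        $sth->execute;
        while (my ($part, $qty) = $sth->fetchrow_array) {
            print "$part: $qty\n";
        }

        $dbh->disconnect;

    It's plain files underneath, so it shares the flat-file trade-offs mr.nick mentions, but you keep the DBI interface and could swap in DBD::mysql later without rewriting the queries.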

    Christian Lemburg
    Brainbench MVP for Perl
    http://www.brainbench.com