sundialsvc4 has asked for the wisdom of the Perl Monks concerning the following question:

I would like to open a discussion of ... what are the best practices (and alternatives) for deploying a MySQL database so that both a web server on a hosting service and the organization's own apps (the latter via ODBC) can reach it in real time?

The company would prefer to host the database on the Internet hosting provider's server rather than deploy its own server for this purpose.

What practices and alternatives do you counsel me to consider, and why?

The database contains transactional reservation information and is subject to Sarbanes-Oxley, but it does not contain credit card information. Obviously, at least SSL/TLS-grade protection (a VPN tunnel?) would have to be employed for the across-the-net traffic originating from the home-office users.
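
For concreteness, here is a rough sketch of how one of our Perl apps might connect to the hosted server over SSL using DBI and DBD::mysql; the host name, database name, account, and certificate paths are placeholders and would have to match whatever the hosting provider actually issues:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Placeholder host, database, account and certificate paths.
    my $dsn = 'DBI:mysql:database=reservations;host=db.example-host.com;port=3306'
            . ';mysql_ssl=1'
            . ';mysql_ssl_ca_file=/etc/mysql/certs/ca-cert.pem'
            . ';mysql_ssl_client_cert=/etc/mysql/certs/client-cert.pem'
            . ';mysql_ssl_client_key=/etc/mysql/certs/client-key.pem';

    my $dbh = DBI->connect($dsn, 'office_user', 'secret',
        { RaiseError => 1, AutoCommit => 1 })
        or die "Cannot connect: $DBI::errstr";

    # Sanity check: confirm the session is actually encrypted.
    my ($var, $cipher) = $dbh->selectrow_array(q{SHOW STATUS LIKE 'Ssl_cipher'});
    print $cipher ? "Encrypted with $cipher\n"
                  : "WARNING: connection is NOT encrypted\n";

    $dbh->disconnect;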

Replies are listed 'Best First'.
Re: Best practices for a web+internal accessible database?
by tirwhan (Abbot) on Dec 19, 2008 at 22:25 UTC

    If either one of the two (web or intranet) requires only read access, you could set up a replicating slave server. MySQL natively supports replication over an SSL-encrypted connection, so the security of the data transfer should be fine and removed from the users' hands (which is often a good idea, I find ;-). If both of them require write access you could also set up multi-master replication, but that has its own drawbacks and may not be worthwhile.
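
    A rough sketch of pointing an office-side replica at the hosted master over SSL, using DBI to issue the statements. Host names, the replication account, and certificate paths are placeholders; the master also needs log-bin and a unique server-id, and the replication user needs the REPLICATION SLAVE privilege:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use DBI;

        # Run this against the office-side replica (localhost here).
        my $slave = DBI->connect('DBI:mysql:host=localhost', 'root', 'slave_root_pw',
            { RaiseError => 1 });

        # Point the replica at the hosted master, requiring SSL for the
        # replication stream. All names and paths below are placeholders.
        $slave->do(q{
            CHANGE MASTER TO
                MASTER_HOST     = 'db.example-host.com',
                MASTER_USER     = 'repl',
                MASTER_PASSWORD = 'repl_pw',
                MASTER_SSL      = 1,
                MASTER_SSL_CA   = '/etc/mysql/certs/ca-cert.pem',
                MASTER_SSL_CERT = '/etc/mysql/certs/repl-cert.pem',
                MASTER_SSL_KEY  = '/etc/mysql/certs/repl-key.pem'
        });

        $slave->do('START SLAVE');

        # Quick check that both replication threads came up.
        my $status = $slave->selectrow_hashref('SHOW SLAVE STATUS');
        print "IO: $status->{Slave_IO_Running}  SQL: $status->{Slave_SQL_Running}\n";

        $slave->disconnect;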

    As an alternative to that, I'd set up a VPN. OpenVPN works on all major platforms and is relatively easy to deploy.
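
    With a VPN in place the application side stays simple. A sketch, assuming a placeholder tunnel address of 10.8.0.1 with MySQL's bind-address set to that interface, so the tunnel (rather than mysql_ssl options) provides the encryption:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use DBI;

        # 10.8.0.1 is a placeholder for the database server's address on the
        # VPN tunnel; only VPN clients can reach it, and the tunnel itself
        # encrypts the traffic, so no mysql_ssl options are needed here.
        my $dbh = DBI->connect(
            'DBI:mysql:database=reservations;host=10.8.0.1',
            'office_user', 'secret',
            { RaiseError => 1, AutoCommit => 1 },
        );

        my ($now) = $dbh->selectrow_array('SELECT NOW()');
        print "Connected through the tunnel at $now\n";

        $dbh->disconnect;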


    All dogma is stupid.