I apologize for the broadness of this question. I'm hoping for some opinions from the monks on the most efficient data storage method for some scripts I'm writing. Descriptions are as follows:
1. The data is a listing of network switchports and their characteristics. It's currently stored in a text file with one line per port. The script that currently processes it loads each line into a multilevel hash, and I'd like to keep that format (something like $ports{switch}{interface}{x|y|z}). I was thinking of using MLDBM, of course, but what underlying DB? A frequent cron job will read in the data, update it, and write it back out, and some CGI apps run by users will mostly read it but occasionally write changes. I was thinking maybe GDBM, since it apparently offers file locking, but I could also just use a semaphore file with flock (the first sketch after this list shows roughly what I have in mind). Would any type of DB have an advantage over the others?
2. This one is a bit trickier. It's going to store a large number of entries across multiple databases. Most of the data consists of large text fields (one field in particular is usually a sizeable paragraph). A number of scripts will read from this and may have to parse different parts of the data. I was wondering whether an SQL DB with multiple tables would be best for this, or whether some of the DB modules would handle it okay; the second sketch below shows the SQL route I'm picturing. (As a small side note, I don't know SQL, but learning it is not an issue if it would be the best option.)
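For the first script, here's roughly what I have in mind, just to make the question concrete: MLDBM tied over GDBM_File, with a plain flock on a separate semaphore file held around the whole read-modify-write cycle. It's untested, and the file names, switch names, and port fields are only placeholders:

    use strict;
    use warnings;
    use Fcntl qw(:flock);
    use GDBM_File;                      # for the GDBM_WRCREAT constant
    use MLDBM qw(GDBM_File Storable);   # GDBM back end, Storable serializer

    # Placeholder file names.
    my $db_file   = 'ports.db';
    my $lock_file = 'ports.lock';

    # Hold an exclusive lock on a semaphore file for the whole
    # read-modify-write, so the cron job and the CGI scripts
    # never interleave their updates.
    open my $lock, '>', $lock_file or die "open $lock_file: $!";
    flock $lock, LOCK_EX            or die "flock $lock_file: $!";

    tie my %ports, 'MLDBM', $db_file, &GDBM_WRCREAT, 0640
        or die "tie $db_file: $!";

    # MLDBM only notices assignments to the top-level key, so pull the
    # per-switch hash out, change it, and store the whole thing back.
    my $switch = $ports{'switch1'} || {};
    $switch->{'Gi1/0/1'}{vlan} = 42;
    $ports{'switch1'} = $switch;

    untie %ports;
    close $lock;    # releases the lock

The fetch-modify-store dance at the end is there because the MLDBM docs warn that changes made directly to nested levels aren't written back unless you reassign the top-level key.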
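And for the second script, a rough sketch of what the SQL route might look like, using DBI with DBD::SQLite purely for illustration; the table and column names are made up:

    use strict;
    use warnings;
    use DBI;

    # SQLite chosen only as an example back end; any DBD driver would do.
    my $dbh = DBI->connect('dbi:SQLite:dbname=entries.db', '', '',
                           { RaiseError => 1, AutoCommit => 1 });

    $dbh->do(q{
        CREATE TABLE IF NOT EXISTS entries (
            id          INTEGER PRIMARY KEY,
            title       TEXT NOT NULL,
            description TEXT            -- the big paragraph field
        )
    });

    # Placeholders (?) mean the large text never needs quoting by hand.
    my $long_paragraph = 'A sizeable paragraph of text goes here...';
    $dbh->do('INSERT INTO entries (title, description) VALUES (?, ?)',
             undef, 'example entry', $long_paragraph);

    # Each script can then pull back just the rows and columns it needs.
    my $rows = $dbh->selectall_arrayref(
        'SELECT id, description FROM entries WHERE title LIKE ?',
        { Slice => {} }, '%example%'
    );
    print "$_->{id}: $_->{description}\n" for @{$rows};

Keeping the paragraph field out of the SQL string via placeholders seems cleaner than escaping it by hand, which is part of why SQL appeals to me here.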
Thanks in advance for your advice!