Long term, it sounds like a "real" DB is the way to go. I've been experimenting with MySQL, and the performance (when you are careful and get the tables optimized for your app) is fantastic. I don't understand the type of queries that you are making to this huge hash structure - there must be a lot of them if the app takes another 8 seconds on top of the 12 seconds needed to load the hash. A "flat" and appropriately indexed SQL DB can be rocket fast - the idea is to push the logic that collects the data for query X into the DB (i.e. get back a complete result set, rather than data that you assemble into a result from multiple queries).
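To make that concrete, here is a rough sketch of what a single indexed query looks like with Perl DBI against MySQL. This is not your schema - the table and column names here are made up for illustration:

    use strict;
    use warnings;
    use DBI;

    # Hypothetical flat table 'annotations', indexed on (chromosome, position).
    my $dbh = DBI->connect('DBI:mysql:database=mydb;host=localhost',
                           'user', 'password', { RaiseError => 1 });

    # One indexed query returns the complete result set; no assembling
    # results from multiple lookups on the client side.
    my $sth = $dbh->prepare(
        'SELECT id, name, score FROM annotations
          WHERE chromosome = ? AND position BETWEEN ? AND ?'
    );
    $sth->execute('chr1', 10_000, 20_000);

    while (my $row = $sth->fetchrow_hashref) {
        print "$row->{id}\t$row->{name}\t$row->{score}\n";
    }
    $dbh->disconnect;

The point is that the WHERE clause does the collecting inside the DB, so the work is done against an index instead of in your Perl code.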
Books that I would recommend are:
Learning SQL by Alan Beaulieu
MySQL in a Nutshell by Russell Dyer (also covers the Perl DBI and PHP interfaces)
Your hash tables are huge. As a possible intermediate step, you could make a Perl server that is initialized with these huge hash tables. Have clients connect to it and ask questions that translate very directly into hash table queries. That would save the 12 seconds of loading the hash tables on every run. You don't mention how many clients would connect to such an app, but it could be that a single process, handling a queue of requests one at a time, would be just fine - doing better than 12 seconds ought to be easy. Other solutions like FastCGI or mod_perl are good, but I worry about them running your machine out of memory and causing disk thrashing.
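Here is a minimal sketch of that server idea, assuming the hashes were serialized with Storable and that a query amounts to a single key lookup. The port number, file name, and line-based protocol are all invented for illustration:

    use strict;
    use warnings;
    use IO::Socket::INET;
    use Storable qw(retrieve);

    # Pay the load cost once, at server startup, instead of per run.
    my $lookup = retrieve('huge_hash.storable');   # hypothetical file

    my $server = IO::Socket::INET->new(
        LocalPort => 9000,
        Listen    => 5,
        Reuse     => 1,
    ) or die "Cannot listen: $!";

    # Single process, one client and one request at a time - a simple queue.
    while (my $client = $server->accept) {
        while (my $key = <$client>) {
            chomp $key;
            # Each request line is a hash key; reply with the stored value.
            my $answer = exists $lookup->{$key} ? $lookup->{$key} : 'NOT FOUND';
            print $client "$answer\n";
        }
        close $client;
    }

A client would connect, send a key per line, and read the answer back - each request costs one hash lookup instead of a 12-second load.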
Could you give an example of the type of query that you are running against this hash table structure?
In reply to Re^5: Fast(er) serialization in Perl by Marshall, in thread Fast(er) serialization in Perl by mrguy123