
Re: Loading large amount of data in a hash

by maverick (Curate)
on May 01, 2002 at 22:22 UTC ( [id://163413] )

in reply to Loading large amount of data in a hash

Right idea. You just have a few little details. You can't natively store references in a DBM; you'll have to serialize the arrayrefs down to scalars before you can store them, using something like Storable. Which leads to the second issue: if memory serves, GDBM has a fixed size limit for those scalars...Berkeley BTrees do not. Try something like:
    use DB_File;
    use Storable qw(freeze thaw);
    use Fcntl;    # for the O_RDWR and O_CREAT flags

    # $DB_BTREE selects the Berkeley BTree format mentioned above
    tie %data_parsed, "DB_File", $hashfil, O_RDWR|O_CREAT, 0666, $DB_BTREE;

    # inside read data loop
    $data_parsed{$key} = freeze($array_ref);

    # inside use data loop
    $array_ref = thaw($data_parsed{$key});
DB_File is slow to create, but fast to read. That may or may not be an issue.



Ya know...after taking another glance at this, 800 is a lot of data. Plus, you have a key/array combination. Perhaps it's time to move up to a full-blown database like MySQL or Postgres?
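If you go that route, the same freeze/thaw trick carries straight over to a real database through DBI. A minimal sketch of the idea (the table and column names are made up for illustration, and I'm using an in-memory SQLite handle here only so it runs without a server; swap the DSN for DBD::mysql or DBD::Pg in practice):

    use strict;
    use warnings;
    use DBI;
    use Storable qw(nfreeze thaw);    # nfreeze: portable byte order

    # Hypothetical connection; replace with your MySQL/Postgres DSN
    my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
                           { RaiseError => 1, AutoCommit => 1 });

    $dbh->do("CREATE TABLE parsed (k TEXT PRIMARY KEY, v BLOB)");

    # store: serialize the arrayref into the BLOB column
    my $ins = $dbh->prepare("INSERT INTO parsed (k, v) VALUES (?, ?)");
    $ins->execute("some_key", nfreeze([1, 2, 3]));

    # fetch: deserialize back to an arrayref
    my ($blob) = $dbh->selectrow_array(
        "SELECT v FROM parsed WHERE k = ?", undef, "some_key");
    my $array_ref = thaw($blob);

Placeholders (the ?'s) keep the frozen binary data from being mangled by quoting, which matters once Storable output is in the picture.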

OmG! They killed tilly! You *bleep*!!