in reply to Re^2: storing a huge text file into a hash
in thread storing a huge text file into a hash
You didn't say how long it is taking on your system. On mine, this one-liner loads 11e6 lines into a hash in under 100 seconds:
>perl -e"BEGIN{keys %h=2**23}" -nE"$h{$_->[0]}=$_->[1] for [split];print qq[\r$.]" junk.dat
11000000
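For readability, here is the same idea expanded into a small script (a sketch: the filename, the whitespace-separated key/value layout, and the sample data are assumptions, not from the original post). The `keys %h = 2**23` line presizes the hash's bucket array so perl never has to rehash while loading:

```perl
use strict;
use warnings;

my %h;
keys %h = 2**23;    # presize the hash up front to avoid repeated rehashing

# Stand-in for reading the real data file line by line
my $data = "alpha 1\nbeta 2\ngamma 3\n";
open my $fh, '<', \$data or die $!;
while ( my $line = <$fh> ) {
    my( $key, $value ) = split ' ', $line;
    $h{ $key } = $value;
}
close $fh;
```

Presizing only saves the incremental doubling/rehash work during the load; the final memory footprint is the same either way.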
But it does use over 2GB of RAM.
If your system is taking substantially longer than that, it could be that you are starting to swap, which would slow things down a lot.
If you are loading this hash frequently, then you'd probably be better off putting your data into an on-disk database like SQLite, so you pay the load cost once and just query it afterwards.
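A minimal sketch of that approach via DBI, assuming DBD::SQLite is installed; the table name `kv`, the in-memory database, and the sample data are illustrative stand-ins for a real on-disk file:

```perl
use strict;
use warnings;
use DBI;

# An in-memory DB for demonstration; use dbname=junk.db for a persistent file
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '',
                        { RaiseError => 1, AutoCommit => 0 } );

$dbh->do('CREATE TABLE kv ( k TEXT PRIMARY KEY, v TEXT )');

my $ins = $dbh->prepare('INSERT INTO kv ( k, v ) VALUES ( ?, ? )');

my $data = "alpha 1\nbeta 2\ngamma 3\n";    # stand-in for the real data file
open my $fh, '<', \$data or die $!;
while ( my $line = <$fh> ) {
    $ins->execute( split ' ', $line );
}
close $fh;
$dbh->commit;    # one big transaction is far faster than per-row autocommit

# Subsequent runs query the disk index instead of rebuilding a 2GB hash
my( $v ) = $dbh->selectrow_array(
    'SELECT v FROM kv WHERE k = ?', undef, 'beta' );
```

Wrapping the whole load in a single transaction (AutoCommit off, one commit at the end) matters a lot here: with autocommit, SQLite syncs to disk on every insert.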