
Ultimately if you want to allow more than one website per user, you're going to want an array layer in there: a hash of arrays of hashes. (You could replace the array layer with something functionally equivalent, like another hash layer, or Set::Object, but I see little point in doing that.)
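For instance, a hash of arrays of hashes can be built in a single pass over the file. This is only a minimal sketch: the whitespace-separated columns (username, site, bytes) and the field names are assumptions for illustration, not the actual file format.

#!/usr/bin/perl
use strict;
use warnings;

my %users;    # hash of arrays of hashes

while (my $line = <DATA>) {
    chomp $line;
    my ($user, $site, $bytes) = split ' ', $line;

    # Autovivify an array for each user and push one hashref
    # per record, so a user can have any number of websites.
    push @{ $users{$user} }, { site => $site, bytes => $bytes };
}

for my $user (sort keys %users) {
    for my $record (@{ $users{$user} }) {
        print "$user  $record->{site}  $record->{bytes}\n";
    }
}

__DATA__
alice  example.com    1234
alice  perlmonks.org  567
bob    example.com    89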

If you can't hold it all in memory, you're going to have to rethink your technique. Might it be possible to sort (or split) the file per-user, and then process the data one user at a time?
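As a rough sketch of that approach, assuming the log has first been sorted by username (for example with the external sort(1) utility) and using the same hypothetical columns as above, only one user's records ever need to live in memory:

#!/usr/bin/perl
use strict;
use warnings;

# 'sorted.log' is a placeholder name; the point is that all of a
# user's lines are adjacent, so the in-memory structure can be
# flushed and reset each time the username changes.
my ($current_user, @records);

open my $fh, '<', 'sorted.log' or die "Can't open sorted.log: $!";
while (my $line = <$fh>) {
    chomp $line;
    my ($user, $site, $bytes) = split ' ', $line;

    if (defined $current_user && $user ne $current_user) {
        process_user($current_user, \@records);    # done with the previous user
        @records = ();
    }
    $current_user = $user;
    push @records, { site => $site, bytes => $bytes };
}
process_user($current_user, \@records) if defined $current_user;
close $fh;

sub process_user {
    my ($user, $records) = @_;
    print "$user: ", scalar @$records, " record(s)\n";
}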

perl -E'sub Monkey::do{say$_,for@_,do{($monkey=[caller(0)]->[3])=~s{::}{ }and$monkey}}"Monkey say"->Monkey::do'

Re^4: Hash of Hashes from file
by cipher (Acolyte) on Apr 03, 2012 at 14:40 UTC
    Before trying hashes, I was using arrays in my script: I mapped out the unique usernames, then grepped for each username and pushed the results into a new array, processed the data for user1, reinitialized the array, then grepped and processed the data for user2, and so on. The script ended up having too many foreach loops.

    I was trying to use a hash as a database, and it looks like it does not work that way.
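
    For reference, that multi-pass array/grep pattern might have looked roughly like the following. This is a hypothetical reconstruction (the column layout is again an assumption); note that it re-scans the whole data set once per user, which is where the extra loops come from:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical sample data: username, site, bytes per line.
    my @lines = (
        'alice  example.com    1234',
        'alice  perlmonks.org  567',
        'bob    example.com    89',
    );

    # Map out the unique usernames (first column of each line).
    my %seen;
    my @users = grep { !$seen{$_}++ } map { (split ' ')[0] } @lines;

    # One full grep pass over @lines for every user; this is why
    # the per-user approach accumulates so many loops.
    for my $user (@users) {
        my @user_lines = grep { (split ' ')[0] eq $user } @lines;
        # ... process @user_lines, then start over for the next user ...
        print "$user: ", scalar @user_lines, " line(s)\n";
    }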
