in reply to Limitation of extra large hash
10,000 objects is not a particularly large hash. I've created hashes with millions of keys. That said, your hash seems to be a multi-level HoHoH, which will consume prodigious amounts of space. But, like argel, I doubt that space is the problem here.
Are you sure that all 10,572 of your objects are unique? Remember that hashes won't retain duplicates, at any level: if the keys are identical, later values will overwrite earlier ones. For example, if you have two objects in your file like:
    ( :netobj (netobj : (object12345 :DAG (false) :add_adtr_rule (false)
        :cp_products_installed (false) :data_source (not-installed)
        :enforce_gtp_rate_limit (false) :firewall (not-installed)
        :floodgate (not-installed) :gtp_rate_limit (2048)
        :ipaddr (10.1.1.1) :type (host) :IPSec_main_if_nat (false) ) ) )

    ( :netobj (netobj : (object12345 :DAG (true) :add_adtr_rule (false)
        :cp_products_installed (false) :data_source (not-installed)
        :enforce_gtp_rate_limit (false) :firewall (not-installed)
        :floodgate (not-installed) :gtp_rate_limit (4096)
        :ipaddr (100.10.10.10) :type (client) :IPSec_main_if_nat (false) ) ) )
Even though the details of the two objects differ, only the second will be retained in the hash: the names are identical, so the second assignment silently overwrites the first.
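Here's a minimal sketch of that behaviour; the `%netobj` name and the abbreviated fields are mine, not your parser's:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %netobj;

    # First object named "object12345" (fields abbreviated from the post)
    $netobj{object12345} = { DAG => 'false', ipaddr => '10.1.1.1', type => 'host' };

    # A second object with the SAME name silently replaces the first.
    # An exists() check before the assignment is the easiest way to spot it:
    warn "duplicate object name: object12345\n" if exists $netobj{object12345};
    $netobj{object12345} = { DAG => 'true', ipaddr => '100.10.10.10', type => 'client' };

    print scalar(keys %netobj), "\n";          # prints 1, not 2
    print $netobj{object12345}{ipaddr}, "\n";  # prints 100.10.10.10

If you add a check like that while loading, the final key count against your expected 10,572 will tell you immediately whether collisions are eating your objects.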
Re^2: Limitation of extra large hash
by morpheous1129 (Initiate) on Apr 02, 2009 at 16:30 UTC