
Re:^5 iteration through tied hash eat my memory

by diotalevi (Canon)
on Dec 09, 2002 at 18:11 UTC ( #218610 )

in reply to Re:^4 iteration through tied hash eat my memory
in thread iteration through tied hash eat my memory

I did a bit of checking for you, and it looks like Linux large-file support was added in BerkeleyDB version 3.2.9 (see the Changelog). Your signal was from XFS, so some further checking brought up a link indicating that your large-file support may be conditional on your glibc library. My recommendation is to get the current version of BerkeleyDB and install it into /usr/local. Be very careful not to disturb your existing library, since various parts of your OS probably depend on 3.1.17 staying 3.1.17.

Google is your friend: search for suse xfs 2gb. Obviously, just read the changelog on the vendor's web site for the scoop on BerkeleyDB.
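A parallel install into /usr/local, leaving the system's 3.1.17 alone, would look roughly like the following. This is a sketch of the usual Sleepycat-style build (`db-VERSION` is a placeholder for whatever tarball you download, and the `build_unix`/`dist/configure` layout is assumed from memory, so check the release's own README):

```shell
# Unpack the BerkeleyDB source; "db-VERSION" is a placeholder.
tar xzf db-VERSION.tar.gz
cd db-VERSION/build_unix

# Configure with a /usr/local prefix so nothing lands where the
# system's 3.1.17 lives.
../dist/configure --prefix=/usr/local
make
make install      # likely needs root for /usr/local

# Then point your own builds at the new copy explicitly, e.g. for a
# Perl module:
#   perl Makefile.PL LIBS='-L/usr/local/lib -ldb' INC='-I/usr/local/include'
```

The key point is the explicit `--prefix`: as long as nothing overwrites the library the OS links against, both copies can coexist.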

__SIG__ use B; printf "You are here %08x\n", unpack "L!", unpack "P4", pack "L!", B::svref_2object(sub{})->OUTSIDE;
