We're trying to do graph operations on a large collection of Web pages and links. We've experimented on a small scale with the Graph module, but the data set we need to work with looks like it would take up about 25GB, which is (somewhat unsurprisingly) more memory than we have.
Does anybody have any suggestions for how we could make this work? For example, is there a disk-based implementation of the Graph class? Or could we create a 25GB file, mmap it, and then tell Perl to use that as the backing store?
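To make the question concrete, here is roughly the kind of disk-backed workaround we've been considering: keep each node's adjacency list in a Berkeley DB hash on disk via a DB_File tied hash, so the whole graph never has to fit in memory. The filename and the space-separated edge encoding are just placeholders for illustration; this obviously gives up the Graph module's algorithms.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DB_File;
use Fcntl qw(O_RDWR O_CREAT);

# Sketch only: store each node's successor list as a
# space-separated string, keyed by URL, in an on-disk DB.
my $file = "graph.db";    # placeholder filename
tie my %succ, 'DB_File', $file, O_RDWR|O_CREAT, 0644, $DB_HASH
    or die "Cannot open $file: $!";

sub add_edge {
    my ($from, $to) = @_;
    # Append $to to $from's successor list (no duplicate check).
    $succ{$from} = defined $succ{$from} ? "$succ{$from} $to" : $to;
}

sub successors {
    my ($node) = @_;
    return defined $succ{$node} ? split(' ', $succ{$node}) : ();
}

add_edge("http://a.example/", "http://b.example/");
add_edge("http://a.example/", "http://c.example/");
my @s = successors("http://a.example/");

untie %succ;
unlink $file;    # remove the demo database file
```

The drawback is that every graph algorithm would have to be rewritten against this ad hoc store, which is why a ready-made disk-based Graph implementation would be preferable.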
Any suggestions are appreciated!