...graphs with ~3000 nodes and ~1.5 million edges.
Building a graph with Graph::Directed doesn't put all the edges into one big hash. It creates an adjacency list for each vertex, so each individual hash should be no larger than the number of vertices. A 3000-element hash should be no problem, unless you're looking for trouble.
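To make the adjacency-list point concrete, here's a minimal sketch in plain Perl (hypothetical code, not Graph::Directed's actual internals): a hash of small per-vertex hashes, so no single hash ever holds all 1.5 million edges.

```perl
use strict;
use warnings;

# Hypothetical adjacency-list representation: one small hash per
# vertex. $succ{$from}{$to} = 1 records a directed edge.
my %succ;

sub add_edge {
    my ($from, $to) = @_;
    $succ{$from}{$to} = 1;   # amortized O(1) hash insert
}

add_edge('a', 'b');
add_edge('a', 'c');
add_edge('b', 'c');

# Each per-vertex hash is bounded by the vertex count (~3000),
# not the edge count (~1.5 million).
print scalar(keys %{ $succ{'a'} }), "\n";   # 2 successors of 'a'
```

The point is that each inner hash can have at most one entry per vertex, so even a dense 3000-vertex graph keeps every hash at or below 3000 elements.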
As I insert this large graph into the Graph::Directed module I note what appears to be an exponential slowdown as I insert edges.
I don't see a slowdown until the graph gets big enough to fill available RAM and start swapping, so it sounds like tye is right. Graph::Directed's graph representation is designed to be flexible rather than compact. To handle big graphs, you might need to use a different representation.
Does anyone know the big-O complexity of inserting edges and nodes into the data structures used by Graph::Directed?
It looks like it should be close enough to O(n) as to make no difference.
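A quick way to check whether insertion is really superlinear is to time fixed-size batches of inserts and watch whether the per-batch cost grows. This sketch times the plain-hash structure; to test Graph::Directed itself, substitute its `add_edge` call for the hash assignment.

```perl
use strict;
use warnings;
use Time::HiRes qw(time);

# Time successive batches of 100_000 random edge inserts.
# Roughly constant per-batch times mean ~O(1) per insert;
# steadily growing times suggest superlinear behavior
# (or, as noted above, the onset of swapping).
my %succ;
my $batch = 100_000;

for my $round (1 .. 5) {
    my $t0 = time;
    for (1 .. $batch) {
        my ($from, $to) = (int rand 3000, int rand 3000);
        $succ{$from}{$to} = 1;
    }
    printf "batch %d: %.3fs\n", $round, time - $t0;
}
```

With 3000 vertices the structure tops out around 9 million possible edges, so 500,000 random inserts stay comfortably in memory on a modern machine; the swapping effect described above only appears once the data outgrows RAM.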
Aside to Abigail: Ha ha, but exp($n/1_000_000) is still exponential.
In reply to Re: The Upper Limit of Perl's Native Data Structures by no_slogan
in thread The Upper Limit of Perl's Native Data Structures by arunhorne