use strict;
use warnings;

# Stress test: build a deeply nested hash with 500 * 4000 * 10 * 7
# (140 million) leaf entries to watch memory usage grow.
# Loop variables renamed away from $a/$b, which Perl reserves for sort().
my %hash = ();
foreach my $w (1 .. 500) {
    foreach my $x (1 .. 4000) {
        foreach my $y (1 .. 10) {
            foreach my $z (1 .. 7) {
                $hash{$w}{$x}{$y}{$z} = $z;
            }
        }
    }
}
print "here\n";
I've got a fairly large OO Perl script that's behaving strangely. The part in question compares two graphs for similarity. For small datasets (~20-30 nodes per graph) it behaves correctly, but for large ones (200-300 nodes per graph) it simply stops at some (incorrect) point, with no error messages or warnings.
Some clues:
1) For the large dataset, perl's memory usage climbs a bit above 500 MB and then, at some point, drops back down to around 30 MB.
2) Devel::Profiler works fine for the small sets, but for the large sets it reports some unstacked calls (which makes sense, since the script seems to be halting mid-run).
Any ideas?
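One thing worth checking (an editorial debugging sketch, not part of the original thread): a process that vanishes mid-run with no Perl-level error, right after a large memory spike, may have been killed by the kernel's OOM killer, which sends SIGKILL and leaves no warning in the script's own output. The shell exit status encodes this: a status above 128 means death by signal number (status − 128). A minimal demonstration, using `sh -c 'kill ...'` as a stand-in for the real script:

```shell
# Simulate a process killed externally (as the OOM killer would do with
# SIGKILL) and show how the shell exit status reveals the signal.
sh -c 'kill -KILL $$'        # child kills itself with SIGKILL (signal 9)
status=$?

if [ "$status" -gt 128 ]; then
    # 128 + signal number; 137 means SIGKILL (9), the OOM killer's signature
    echo "killed by signal $((status - 128))"
else
    echo "exited normally with status $status"
fi
```

Running the real script the same way and seeing "killed by signal 9" would point at an external kill (check the kernel log, e.g. dmesg, for OOM-killer messages) rather than a bug in the Perl code itself.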
In reply to Script halting without error? by Jman1