in reply to Get 10,000 web pages fast

All the monks are giving good help. In terms of debugging the current code, I'd be asking myself why getstore() is using much memory at all. Hopefully it isn't doing something really silly like slurping a whole page into a scalar before writing it to file. Does the program write any files at all before it dies? Is getstore() actually being handed URLs, or is it being handed the hash keys by mistake? Maybe you already have these answers and this isn't much help, but I find that even the most experienced monks sometimes get stumped by simple things. Have fun eating the web (with your script).
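As a quick sanity check, something like this minimal sketch would answer both questions at once (the %urls hash and filenames here are hypothetical stand-ins for your own data structure): it prints what is actually being passed to getstore() and complains per URL, so you can see whether real URLs go in and whether any file hits disk before the process dies.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::Simple qw(getstore is_success);

    # Hypothetical layout: keys are output filenames, values are the URLs.
    # If the original loop passed the *keys* to getstore(), it would be
    # trying to fetch filenames instead of URLs -- the print below shows
    # exactly what each call receives.
    my %urls = (
        'page1.html' => 'http://example.com/page1',
        'page2.html' => 'http://example.com/page2',
    );

    while ( my ( $file, $url ) = each %urls ) {
        print "Fetching $url -> $file\n";      # confirm what is really being passed
        my $status = getstore( $url, $file );  # writes the response to disk
        warn "Failed on $url: status $status\n" unless is_success($status);
    }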
the hardest line to type correctly is: stty erase ^H