It doesn't terminate; after 15 minutes I just get fed up and close the command prompt window. I'm not too sure about WWW::Mechanize, since I want to write a web crawler: every link it finds would have to be stored, then it would check whether the first stored link has already been visited, and if it hasn't, visit it and get its HTML content too. Thanks.
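What you're describing is basically a breadth-first crawl: a queue of links still to visit plus a record of links already seen. A minimal sketch with WWW::Mechanize might look like this (the start URL and the 50-page cap are just placeholders; a real crawler would also want to stay on one host and respect robots.txt):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::Mechanize;

    my $mech  = WWW::Mechanize->new( autocheck => 0 );
    my @queue = ('http://example.com/');   # start URL - placeholder
    my %seen;                              # URLs already visited
    my $max   = 50;                        # stop after this many pages

    while ( @queue and keys %seen < $max ) {
        my $url = shift @queue;            # take the first stored link
        next if $seen{$url}++;             # skip it if already visited

        $mech->get($url);
        next unless $mech->success and $mech->is_html;

        my $html = $mech->content;         # the page's HTML - process as needed
        print "Fetched $url (", length($html), " bytes)\n";

        # store every link found on the page for later
        for my $link ( $mech->links ) {
            my $abs = $link->url_abs->as_string;
            push @queue, $abs unless $seen{$abs};
        }
    }

Because %seen is checked before every get(), the loop can't revisit a page, and the $max cap guarantees it terminates even on a site with endless links.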
It could be recursively getting pages further and further down the hierarchy. I don't know WWW::Robot too well, but you probably want to write something for its follow-url-test hook to decide which URLs it should actually follow.
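If you go the WWW::Robot route, that check would live in the follow-url-test hook. A rough sketch, assuming the hook callback receives the robot object, the hook name, and the candidate URL as a URI object, and that returning true means "follow it" (double-check the exact arguments against the module's documentation):

    use strict;
    use warnings;
    use WWW::Robot;

    my $robot = WWW::Robot->new(
        NAME    => 'MyCrawler',            # placeholder robot identification
        VERSION => '0.1',
        EMAIL   => 'me@example.com',
    );

    my %seen;
    $robot->addHook( 'follow-url-test', sub {
        my ( $robot, $hook, $url ) = @_;   # $url assumed to be a URI object
        my $str = $url->as_string;
        return 0 if $seen{$str}++;                    # don't revisit pages
        return $url->host eq 'example.com' ? 1 : 0;   # placeholder: stay on one site
    });

    $robot->run('http://example.com/');

Restricting the hook to one host (or to a maximum depth) is what keeps the robot from wandering off down the hierarchy forever.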