The list of web pages to download is generated on the fly by examining the current web page's data, so only one web site address becomes available at a time.
In this situation I am thinking that a parallel attempt would fail.
Thoughts?
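To make the setup concrete, here is roughly the shape of the loop I am describing. This is only a sketch: LWP::Simple and the extract_links() routine are stand-ins for my actual fetching and parsing code, and the start URL is made up.

    use strict;
    use warnings;
    use LWP::Simple qw(get);

    # Stand-in for my real link-parsing code; the real thing is more
    # careful about relative URLs, duplicates, and which links to follow.
    sub extract_links {
        my ($html) = @_;
        return $html =~ /href="([^"]+)"/g;
    }

    my %seen;
    my @to_visit = ('http://example.com/');   # hypothetical start URL

    while (@to_visit) {
        my $url = shift @to_visit;
        next if $seen{$url}++;

        my $html = get($url);
        next unless defined $html;

        # Only after this page has been downloaded do the next addresses
        # become known, so the queue may hold just one URL at any moment.
        push @to_visit, extract_links($html);
    }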
Another question: is there any way I can find out why it is seg faulting? Perhaps it is the threads, perhaps it's something to do with the code... I've actually had a few successfully completed runs of the code without a fault, but when run again on the same initial URL it fails.