in reply to Threaded recursive program seg faults

Perl threads are currently "experimental". See README.threads for more information. Most likely, you don't want to use Perl threads for anything but research. This will get better, but it isn't there yet.

See LWP::Parallel for a non-threaded solution for pulling web pages in parallel.
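A minimal sketch of the LWP::Parallel approach, assuming LWP::Parallel::UserAgent is installed (the URL list and the 30-second timeout are placeholders):

    use LWP::Parallel::UserAgent;
    use HTTP::Request;

    my @urls = ('http://www.example.com/', 'http://www.example.org/');  # placeholder URLs

    my $pua = LWP::Parallel::UserAgent->new;
    $pua->redirect(1);    # follow redirects

    for my $url (@urls) {
        # register() queues the request; it returns an error response if it can't
        if (my $err = $pua->register(HTTP::Request->new(GET => $url))) {
            print STDERR $err->error_as_HTML;
        }
    }

    # wait() blocks until every registered request has finished (or the timeout expires)
    my $entries = $pua->wait(30);
    for my $entry (values %$entries) {
        my $res = $entry->response;
        print $res->request->url, " => ", $res->code, "\n";
    }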


Re: Threaded recursive program seg faults
by fx (Pilgrim) on Jul 29, 2001 at 22:26 UTC

    The list of web pages to download is generated on the fly by examining the current web page's data, so only one web site address becomes available at a time.

    In this situation I am thinking that a parallel attempt would fail.

    Thoughts?
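
    For reference, the per-page discovery step would look roughly like this (only a sketch; the links_on_page helper is made up, and it assumes LWP::UserAgent, HTML::LinkExtor and URI are available):

        use LWP::UserAgent;
        use HTML::LinkExtor;
        use URI;

        # Hypothetical helper: fetch one page, return the absolute <a href> URLs found on it.
        sub links_on_page {
            my ($url) = @_;
            my $res = LWP::UserAgent->new->get($url);
            return () unless $res->is_success;

            my @hrefs;
            my $parser = HTML::LinkExtor->new(sub {
                my ($tag, %attr) = @_;
                push @hrefs, $attr{href} if $tag eq 'a' and defined $attr{href};
            });
            $parser->parse($res->decoded_content);

            # Resolve relative links against the page's base URL.
            return map { URI->new_abs($_, $res->base)->as_string } @hrefs;
        }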

    Another question: is there any way I can find out why it is seg faulting? Perhaps it is the threads, perhaps it's something to do with the code... I've actually had a few successful runs of the code without a fault, but when run again on the same initial URL it fails.

      Read the README.threads link from my earlier node. Specifically:
      Debugging

      Use the -DS command-line option to turn on debugging of the
      multi-threading code. Under Linux, that also turns on a quick hack
      I did to grab a bit of extra information from segfaults. If you
      have a fancier gdb/threads setup than I do then you'll have to
      delete the lines in perl.c which say

          #if defined(DEBUGGING) && defined(USE_THREADS) && defined(__linux__)
              DEBUG_S(signal(SIGSEGV, (void(*)(int))catch_sigsegv););
          #endif