in reply to WWW::Curl and making it thread safe

Try forking/threading early, before curl is loaded:

    use threads;
    # spawn the thread first, while WWW::Curl::Easy is not loaded yet
    async sub { print join q/ /, threads->tid, "\n"; sleep 3; return; };
    # only now pull in the XS module, in the main thread
    require WWW::Curl::Easy;
    ...;
    exit 0;

Re^2: WWW::Curl and making it thread safe
by lsuchocki (Novice) on Feb 18, 2016 at 03:49 UTC

    Correct me if I'm wrong, but in that case I would lose the ability to create a thread (or threads) "on-demand", with the actual data required for the thread to chew on.

    It would have to be a single long-running thread, where I would manage queues into it and either handle the data sequentially or spawn further threads from there if I wanted multiple threads, no?

    At what point should I just put my code into a separate Perl file, serialize my data into another, and call it all with Win32::Process::Create?

      I'd suggest the reverse of the Anonymous Monk's approach: put WWW::Curl into a thread by itself, don't load it in the main program, and serialise everything through there. You might be able to get that to multi-thread if each thread that uses W::C loaded it individually, but I doubt it.
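
      Roughly what I have in mind, as a minimal sketch only (the queue names, the example URL and the bookkeeping are illustrative; because WWW::Curl::Easy is require'd inside the worker rather than use'd, its constants are called fully qualified):

          use strict;
          use warnings;
          use threads;
          use Thread::Queue;

          my $work_q = Thread::Queue->new;   # URLs in
          my $done_q = Thread::Queue->new;   # results out

          # The worker owns WWW::Curl; the main program never loads it.
          my $worker = async {
              require WWW::Curl::Easy;
              while ( defined( my $url = $work_q->dequeue ) ) {
                  my $curl = WWW::Curl::Easy->new;
                  my $body = '';
                  $curl->setopt( WWW::Curl::Easy::CURLOPT_URL(),       $url );
                  $curl->setopt( WWW::Curl::Easy::CURLOPT_WRITEDATA(), \$body );
                  my $rc = $curl->perform;
                  $done_q->enqueue(
                      "$url => " . ( $rc == 0 ? length($body) . " bytes" : "error $rc" )
                  );
              }
          };

          # Only plain scalars cross the thread boundary.
          $work_q->enqueue('http://www.example.com/');
          $work_q->enqueue(undef);           # tell the worker to finish
          print $done_q->dequeue, "\n";
          $worker->join;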

      I suspect there are better (read: simpler / less prone to headdesk interactions) ways to run multiple HTTP requests simultaneously than trying to get WWW::Curl to play nice with perl threads. An event-based callback approach (POE, AnyEvent/Coro, etc.) where you can queue up multiple requests at once springs to mind, or pushing the actual W::C calls into forked worker children (which you then collect using either an event-based or thread-based parent). Personally, I would usually use AnyEvent with either LWP::UserAgent (plus the AnyEvent "hack" to LWP to get it to play nice with AE) or AnyEvent::HTTP. YMMV.
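
      For instance, something along these lines with AnyEvent::HTTP (a minimal sketch; the URL list is only illustrative): several requests are queued up at once and run concurrently in a single thread, with a callback per response.

          use strict;
          use warnings;
          use AnyEvent;
          use AnyEvent::HTTP;

          my @urls = (
              'http://www.example.com/',
              'http://www.example.org/',
              'http://www.example.net/',
          );

          my $cv = AnyEvent->condvar;
          $cv->begin;                       # hold the condvar open while queueing

          for my $url (@urls) {
              $cv->begin;
              http_get $url, sub {
                  my ( $body, $headers ) = @_;
                  printf "%s -> %s (%d bytes)\n",
                      $url, $headers->{Status}, length( $body // '' );
                  $cv->end;                 # one request done
              };
          }

          $cv->end;
          $cv->recv;                        # wait until every callback has fired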