in reply to Downloading URL's in Parallel with Perl
If speed is the key but process-spawning is too slow, threads are what you need. As you said, Perl threads are experimental; to use native C threading, I believe you would end up finding it easier to write the program in C/C++. You might want to look into using fork() to spawn 50 processes and have each fetch two files, or some similar configuration.
I would think of fork() as the most 'perl' way of doing it in the end. I have never used LWP::Parallel, but I would assume it is based on I/O multiplexing (there is a good section on it in Network Programming with Perl), which is complex, and eventually disk I/O will cause slowness with large amounts of data. I would benchmark to find out whether it is any quicker than the fork() version.
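A minimal sketch of the fork() approach described above, assuming a hypothetical @urls list (the URLs and filenames here are placeholders, not from the original post). Each child takes a slice of the list, fetches it with LWP::Simple, and exits; the parent reaps all children before finishing:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple qw(getstore);
use POSIX qw(ceil);

# Hypothetical URL list -- replace with your own.
my @urls = map { "http://example.com/file$_.txt" } 1 .. 100;

my $workers   = 50;                        # number of child processes
my $per_child = ceil( @urls / $workers );  # ~2 URLs each for 100 URLs

while ( my @batch = splice @urls, 0, $per_child ) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ( $pid == 0 ) {
        # Child: fetch its batch, save each URL under its basename, exit.
        for my $url (@batch) {
            ( my $file = $url ) =~ s{.*/}{};
            getstore( $url, $file );
        }
        exit 0;
    }
}

# Parent: wait for every child to finish.
1 while wait() != -1;
```

This is a fire-and-forget sketch; a real script would check getstore()'s return status and cap concurrency if 50 simultaneous connections upset the server.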
speling champ of tha claz uf 1997
-- MZSanford