in reply to File::Find in a thread safe fashion

I tend to just run multiple copies of the program, controlled from the parent, along the lines of:

    unless (@ARGV) {    # no runtime param: we are the parent
        my @list = ('list', 'of', 'parameters');
        foreach my $param (@list) {
            # redirect stdout first, then stderr, so both land in the log
            system("this_program $param > own.log 2>&1 &");
        }
        exit;
    }
    process($ARGV[0]);

    sub process { ... }
Since I work with databases, each copy can just stuff its data in and the database takes care of the rest.

Maybe you could arrange for the parent to pause until all the kids have finished, then collate the data they collected?
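A minimal sketch of that wait-then-collate idea, using fork/waitpid rather than backgrounded system() calls so the parent can actually wait on its children (the parameter list and the process/collate subs are placeholders, not anything from the original script; note tye's caveat below that fork is not portable everywhere):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical parameter list; in practice, whatever each child copy needs.
    my @params = qw(alpha beta gamma);

    my @pids;
    for my $param (@params) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {
            process($param);    # child does its share of the work
            exit 0;
        }
        push @pids, $pid;
    }

    # Parent pauses until all the kids have finished ...
    waitpid($_, 0) for @pids;

    # ... then collates whatever they produced.
    collate();

    sub process { my ($param) = @_; }    # placeholder
    sub collate { }                      # placeholder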

I do this 'cos it works, and scripting is a minor part of my job, so I don't get the time to do it better/differently.

Replies are listed 'Best First'.
Re^2: File::Find in a thread safe fashion
by Preceptor (Deacon) on Jul 28, 2006 at 11:41 UTC
    Well that's my fallback plan - dump the data to files, and collate later, but I was mostly just trying to be clever.

      Note that telling File::Find to not chdir will cause it to run slower (every opendir and every stat will have to parse longer and longer path strings and retraverse those directories in order to get to the items involved), though I'm not sure how much slower. You might be better off using multiple processes instead of threads (as demonstrated and without using the non-portable fork to boot; tweetiepooh++).
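For reference, the two modes tye is contrasting look like this (the directory path is hypothetical):

    use strict;
    use warnings;
    use File::Find;

    # Default mode: File::Find chdir()s into each directory, so the wanted
    # callback sees a short name in $_ and stats are cheap.
    find(sub { print "$File::Find::name\n" if -f $_ }, '/some/dir');

    # no_chdir mode: no chdir() happens and $_ holds the full path, so every
    # stat/opendir must re-parse the whole path string. Friendlier when several
    # threads share one current directory, but slower on deep trees.
    find({ wanted  => sub { print "$File::Find::name\n" if -f $_ },
           no_chdir => 1 }, '/some/dir');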

      - tye