in reply to Threads slurping a directory and processing before conclusion
while ( my $bl = $$q->dequeue_nb()){
There is almost never a reason to use ->dequeue_nb() in a worker thread. Enqueue undef to tell a worker thread to quit, and enqueue a value (or an arrayref of values, for bulk processing) when the worker should do something.
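A minimal sketch of that pattern, using Thread::Queue with a blocking dequeue and undef as the quit signal (the file names here are made up for illustration):

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;

my $q = Thread::Queue->new;

my $worker = threads->create(sub {
    my $count = 0;
    # Blocking dequeue: the thread sleeps until work arrives,
    # instead of spinning on dequeue_nb().
    while ( defined( my $file = $q->dequeue ) ) {
        # ... process $file here ...
        $count++;
    }
    return $count;    # undef was the "quit" signal
});

$q->enqueue($_) for qw( a.txt b.txt c.txt );
$q->enqueue(undef);               # tell the worker to finish
my $processed = $worker->join;
print "processed $processed files\n";
```

Because dequeue blocks, the worker burns no CPU while the queue is empty, and the loop ends cleanly the moment the undef sentinel arrives.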
sub slurp_directory_as_bin{ ...
Personally, if you have that many files, I would consider either storing all the file names in a database, or spawning an external process (like ls -lR or dir /b /s) to read in all the file names. With the database you get more flexibility to slice and dice your dataset, and if it doesn't change (that often), having convenient access makes all the other things more convenient too. With the external process, you can start processing the first files in the "directory reader thread" while it is still reading in more file names.
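The external-process approach can be sketched like this. The scratch directory and its files are made up for the demo, and find stands in for ls -lR or dir /b /s; the point is that names are consumed from the pipe as the lister produces them, so processing can begin before the listing completes:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);
use File::Spec;

# Create a scratch directory with a few files to list.
my $dir = tempdir( CLEANUP => 1 );
for my $f (qw( a.dat b.dat c.dat )) {
    open my $fh, '>', File::Spec->catfile( $dir, $f ) or die $!;
    close $fh;
}

# Piped open: read the lister's output line by line as it arrives.
open my $lister, '-|', 'find', $dir, '-type', 'f'
    or die "Can't spawn find: $!";

my @seen;
while ( my $name = <$lister> ) {
    chomp $name;
    push @seen, $name;    # in real code, hand $name to a worker queue
}
close $lister;
print scalar(@seen), " files listed\n";
```

In a threaded program, the loop body would enqueue each name onto the Thread::Queue feeding the workers, so the "directory reader" and the processors overlap.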
Replies are listed 'Best First'.

Re^2: Threads slurping a directory and processing before conclusion
  by TRoderic (Novice) on Aug 26, 2011 at 22:06 UTC
  by Corion (Patriarch) on Aug 27, 2011 at 07:20 UTC