Reading is definitely my bottleneck. The disk I read from is running close to its limit, yet the server running the script still has idle CPU time because of the remote disk's low throughput. Therefore I need to read from multiple servers (disks) at once.
I believe I do need to iterate, since I don't know which folders might be missing (deleted for being empty). Remote processes aren't an option, as I have a couple hundred (Windows) servers to search.
The flow is as follows (a minimal sketch follows the list):
1) Audio files are created on remote servers by various applications.
2) A boss thread searches for filenames on the remote servers using File::Find and enqueues them.
3) Worker threads dequeue items and call a conversion application, which converts each file and sends it to storage (it is never written to the local server's HDD).
4) Upon success, the audio file is deleted from the remote server.
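Here is a minimal sketch of that boss/worker flow, assuming the remote shares are mounted locally. The paths, the .wav pattern, the worker count, and convert_and_store() are all placeholders for illustration:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use threads;
    use Thread::Queue;
    use File::Find;

    # Assumed mount points for the remote shares -- substitute your own.
    my @shares  = ('/mnt/server01/audio', '/mnt/server02/audio');
    my $queue   = Thread::Queue->new;
    my $WORKERS = 4;    # tune to however many converters the box can run

    # Workers: dequeue a path, convert it, delete the original on success.
    my @workers = map {
        threads->create(sub {
            while (defined(my $file = $queue->dequeue)) {
                if (convert_and_store($file)) {
                    unlink $file or warn "unlink $file: $!";
                }
            }
        });
    } 1 .. $WORKERS;

    # Boss: walk each share and enqueue matching files as they turn up.
    $File::Find::dont_use_nlink = 1;    # needed on CIFS, per the thread title
    find({
        no_chdir => 1,    # with no_chdir, $_ is the full path
        wanted   => sub { $queue->enqueue($_) if -f && /\.wav$/i },
    }, @shares);

    $queue->enqueue(undef) for 1 .. $WORKERS;    # one stop signal per worker
    $_->join for @workers;

    sub convert_and_store {
        my ($file) = @_;
        # placeholder for the real conversion call, e.g.
        # return system('converter', $file) == 0;
        return 1;
    }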
Right now, I need a faster way to search for those filenames. I'm trying to grab a few filenames from each remote server, cycling through the servers over and over, but I'm not able to feed the queue as fast as it is consumed. So I need to either make the search (and cycling) much faster, or truly make the search multi-threaded.
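One way to truly parallelize the search, sketched below under the same assumptions as the code above: give each remote server its own finder thread, all feeding the shared queue, so one slow disk only stalls its own finder rather than the whole cycle. Note that File::Find normally chdir()s as it walks, and the working directory is process-wide, so concurrent finder threads must pass no_chdir:

    # One finder (boss) thread per server, replacing the single find() above.
    $File::Find::dont_use_nlink = 1;    # set before spawning, so each thread inherits it
    my @finders = map {
        my $root = $_;
        threads->create(sub {
            find({
                no_chdir => 1,    # chdir is process-global; unsafe across threads
                wanted   => sub { $queue->enqueue($_) if -f && /\.wav$/i },
            }, $root);
        });
    } @shares;

    $_->join for @finders;                       # wait for every finder to finish
    $queue->enqueue(undef) for 1 .. $WORKERS;    # then tell the workers to stop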