Hi Monks,
I'm still fairly new to Perl, but I've been assigned a (to me) pretty challenging task: building a script that copies several source directories to several destinations on several servers, all in a Windows environment.
My script reads a configuration file containing delivery instructions, so these can vary between executions. Since the data is huge and needs to be deployed to several servers (I call them targets), the process has to be multithreaded so that copy tasks are dispatched to each server in parallel.
My problem is that I can't send every copy task to the same server at once (that would hammer I/O, increase copy time, and could badly affect overall server performance), so I'm trying to implement per-thread queuing.
I've tried the Thread::Queue module, but the provided example and the other resources I've been reading all assume a fixed number of queued tasks, whereas in my case the number may vary.
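For what it's worth, Thread::Queue doesn't actually require knowing the job count up front: workers just loop on dequeue() until the queue is ended. A minimal sketch (the job names, worker count, and the shared counter are made up for illustration):

```perl
use strict;
use warnings;
use threads;
use threads::shared;
use Thread::Queue;

# Shared counter, only so we can observe how much work got done.
my $processed :shared = 0;

my $queue = Thread::Queue->new();

# A fixed pool of workers; the number of *jobs* stays open-ended.
my @workers = map {
    threads->create(sub {
        # dequeue() blocks for the next item and returns undef once
        # end() has been called and the queue is drained.
        while (defined(my $job = $queue->dequeue())) {
            { lock($processed); $processed++; }
        }
    });
} 1 .. 3;

# Enqueue however many jobs exist at run time -- here ten made-up names.
$queue->enqueue("job_$_") for 1 .. 10;
$queue->end();            # "no more work" -- needs Thread::Queue 3.01+
$_->join() for @workers;

print "processed $processed jobs\n";   # processed 10 jobs
```

On an older Thread::Queue without end(), the usual workaround is to enqueue one undef per worker as a termination marker.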
Below is my code for thread loop creation:
foreach my $targetServer (@targets) {
    # Ping host to check it is available
    my $p = Net::Ping->new();
    if ($p->ping($targetServer)) {
        $logger->info("Target [$targetServer] is alive, starting deployment");
        foreach my $deploySection (@deployments) {
            if ($flagDeploy) {
                $logger->info("Starting deployment for [$deploySection] on target [$targetServer]");
                my $deploySource = "$config{$deploySection}{'Source'}";
                my $deployDest   = "$config{$deploySection}{'Destination'}";
                if ($flagNoMove) {
                    my $thread = threads->new(sub { deployRoutine($targetServer, $deploySource, $deployDest, "true") });
                    push(@Threads, $thread);
                }
                else {
                    my $thread = threads->new(sub { deployRoutine($targetServer, $deploySource, $deployDest) });
                    push(@Threads, $thread);
                }
            }
        }
    }
    else {
        $logger->warn("Couldn't ping target [$targetServer], host will be ignored");
    }
    $p->close();
}
The heart of the problem is my deployRoutine, which calls a batch file that runs Robocopy with the provided parameters:
# Main subroutine: here's where we deploy content.
sub deployRoutine {
    my ($remoteHost, $sourceDir, $destDir, $flagNoMove) = @_;
    $flagNoMove = "" unless $flagNoMove;
    my $cmd = "psexec \\\\$remoteHost /accepteula -u $user -p $pass -h -e -n 3 -f -c $copyScript $sourceDir \"$destDir\" $flagNoMove";
    # Note: backticks return the command's STDOUT, not its exit code;
    # the exit status is available in $? after the call.
    my $output = `$cmd`;
    return $output;
}
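One thing to watch in that routine: backticks give you the command's STDOUT, not a return code, and psexec itself exits with the remote process's exit code, which you read from $? afterwards. A small sketch of the pattern, using a trivial perl one-liner in place of the psexec command:

```perl
use strict;
use warnings;

# Backticks capture STDOUT; the child's exit status lands in $?.
# After a psexec call, ($? >> 8) would be the remote copy's exit code,
# which is how you'd detect a failed Robocopy run.
my $output = `perl -e "print 'copied'; exit 0"`;
my $status = $? >> 8;      # high byte of $? is the child's exit code

print "output=$output status=$status\n";
```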
Proceeding like this, I run into the error below:
Process can't access file because it is used by another process
Connecting to MYSERVER...Starting PsExec service on MYSERVER...Connecting with PsExec service on MYSERVER...Copying C:\temp\deploy_files.cmd to MYSERVER...Error copying C:\temp\deploy_files.cmd to remote system:
Free to wrong pool 348d3a0 not 298de8 during global destruction.
So I'd like to enqueue the deployRoutine tasks per server, but I don't see how to achieve this without breaking my current multithreading implementation.
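One way to get exactly that behaviour (a sketch, not your code): give each target its own Thread::Queue, with one worker thread per target draining it. Copies to a given server then run strictly one at a time, while different servers proceed in parallel, and the number of deployments per server can vary freely. The server/section names below are placeholders for your @targets and @deployments, and deployRoutine appears only in a comment:

```perl
use strict;
use warnings;
use threads;
use threads::shared;
use Thread::Queue;

my @targets     = qw(server1 server2);      # stand-ins for your @targets
my @deployments = qw(sectionA sectionB);    # stand-ins for your @deployments

my $done :shared = 0;                       # only to observe progress here

my (%queue_for, @workers);
for my $server (@targets) {
    my $q = Thread::Queue->new();
    $queue_for{$server} = $q;
    # One worker per server: its jobs are serialized, servers run in parallel.
    push @workers, threads->create(sub {
        while (defined(my $job = $q->dequeue())) {
            # In the real script this would call something like:
            # deployRoutine($server, $job->{source}, $job->{dest});
            { lock($done); $done++; }
        }
    });
}

# Enqueue every deployment for every server -- the count can vary freely.
for my $server (@targets) {
    $queue_for{$server}->enqueue({ section => $_ }) for @deployments;
}

$_->end()  for values %queue_for;   # no more jobs (Thread::Queue 3.01+)
$_->join() for @workers;            # join before global destruction

print "ran $done copy jobs\n";      # ran 4 copy jobs
```

Joining all workers before exit may also help with the "Free to wrong pool ... during global destruction" message, which often shows up when threads are still alive (or objects are shared across threads) at interpreter shutdown.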
Any words of wisdom will be greatly appreciated! Thanks
In reply to Queuing in multithread context by Hardin