in reply to SFTP more than 50+ files
We used Net::SSH2 and threads to connect to the remote systems in parallel. When we tried this, the application took around 15 minutes with CPU utilization reaching maximum levels (90%+), and, as expected, the browser timed out, which brought the product to a halt.
This is exactly the kind of work you shouldn't do in the web server process while a browser is waiting for a response. Instead, queue the job for a background daemon to process. You can show your client a pretty progress screen with updates from the background job. If you don't already have some sort of job queue, you might consider Gearman (which I've used) or TheSchwartz (which I have not).
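As a rough sketch of the Gearman approach (assuming a gearmand reachable on 127.0.0.1:4730; the function name `sftp_push` and the use of Storable to serialize the file list are just illustrative, not anything from the original post), the web side dispatches a background job and polls its status, while a separate worker daemon does the actual transfers:

```perl
#!/usr/bin/perl
# --- Web side: enqueue the transfer and return to the browser immediately ---
use strict;
use warnings;
use Gearman::Client;
use Storable qw(freeze);

my @files  = @ARGV;                       # the list of files to push
my $client = Gearman::Client->new;
$client->job_servers('127.0.0.1:4730');   # adjust to your gearmand host:port

# dispatch_background returns a handle; stash it (session, DB) so a later
# request can poll the job's progress for the progress screen.
my $handle = $client->dispatch_background('sftp_push', freeze(\@files));

# Later, e.g. from an AJAX endpoint:
my $status = $client->get_status($handle);
printf "%.0f%% done\n", 100 * $status->percent if $status->known;
```

```perl
#!/usr/bin/perl
# --- Worker daemon: the slow SFTP work happens here, outside the web process ---
use strict;
use warnings;
use Gearman::Worker;
use Storable qw(thaw);

my $worker = Gearman::Worker->new;
$worker->job_servers('127.0.0.1:4730');

$worker->register_function(sftp_push => sub {
    my $job   = shift;
    my $files = thaw($job->arg);
    my $done  = 0;
    for my $file (@$files) {
        # ... your existing Net::SSH2/SFTP transfer for $file goes here ...
        $job->set_status(++$done, scalar @$files);  # feeds get_status() above
    }
    return 1;
});

$worker->work while 1;  # block forever, handling jobs as they arrive
```

The worker's set_status() calls are what let the web side drive a live progress bar through get_status(), so the browser never has to sit on one long request.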
Now, 15 minutes might be too long even if it's done in the background. If that's the case, your next step should be to use a profiler (like Devel::DProf or Devel::Profiler) to figure out where all the time is going. You might find it's something unexpected, like a slow NFS share or DNS lookup timeouts.
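With Devel::DProf, for example, the workflow is roughly this (the script name is a placeholder):

```sh
perl -d:DProf sftp_push.pl    # writes profile data to tmon.out
dprofpp -r tmon.out           # report elapsed (real) times per subroutine
```

The -r switch reports wall-clock rather than CPU time, which is what will expose time spent waiting on things like NFS or DNS instead of actually computing.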
-sam