In reply to: SFTP more than 50+ files

Work I've done has involved supporting Perl-based transmission code, including SFTP. We handled two orders of magnitude more files per day, to two orders of magnitude more locations, but we launched a single job per file. We ran on an old, obsolete Sparc server and had no load issues of any kind unless:

1. Someone tried to PGP-encrypt too large a file.

2. After a transmission broke, someone tried to send all the queued files at once. Feeding them into the account a few at a time worked far better.

You really want to stagger the connections rather than opening them all at once.
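The drip-feed approach above can be sketched roughly as follows. This is a minimal illustration, not our actual Perl code: `stagger_uploads`, `batch_size`, and `delay` are made-up names and values, and `upload` stands in for whatever actually pushes one file (e.g. an SFTP put).

```python
import time
from itertools import islice

def batched(files, n):
    """Yield successive lists of at most n files."""
    it = iter(files)
    while batch := list(islice(it, n)):
        yield batch

def stagger_uploads(files, upload, batch_size=5, delay=2.0):
    """Send files a few at a time, pausing between batches so the
    remote account is never flooded after a backlog builds up."""
    sent = []
    for batch in batched(files, batch_size):
        for path in batch:
            upload(path)      # one transfer per file, as in the single-job setup
            sent.append(path)
        time.sleep(delay)     # breathing room between batches
    return sent
```

Tuning `batch_size` and `delay` to the remote server's tolerance is the whole trick; the point is simply that the queue drains in small slices instead of one burst.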