Hey,
I'm running a photo gallery solution (ImageFolio) written in Perl and running as plain CGI. Every time a request comes in, the software reads all of its data (category descriptions, etc.) from text files (some over 600KB), as well as the photos' IPTC information. So having many concurrent users brings the system to a SERIOUS crawl. I'm aware that ImageFolio is completely non-scalable in the way it has been developed, and I'm working on replacing the whole photo gallery solution with something new and efficient.
However, in the meantime, I'm considering a hardware upgrade for the server. Is there any hardware combination that could realistically speed up the current system (e.g. SCSI drives, more RAM), so that Perl runs faster and processes these files more quickly under many concurrent users?
Also, would changing the way I read files in Perl make much of a difference? Right now the code reads line by line in a while loop to keep RAM usage at a minimum, rather than loading the whole file into an array. I've also considered adding some kind of caching system to reduce the processing load of each request, along the lines of the sketch below.
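For what it's worth, here is a minimal sketch of the caching idea I have in mind, using Storable (which ships with Perl) to keep a pre-parsed copy of a data file and only re-parse when the source file's mtime changes. The paths and the pipe-delimited "id|description" format are just placeholders; ImageFolio's real files would need their own parser:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Storable qw(store retrieve);

    # Hypothetical paths -- adjust to wherever the gallery keeps its data.
    my $data_file  = '/path/to/categories.txt';
    my $cache_file = '/path/to/categories.cache';

    sub load_categories {
        my ($src, $cache) = @_;

        # Reuse the cached, pre-parsed structure if it is newer than the source.
        if (-e $cache && (stat $cache)[9] >= (stat $src)[9]) {
            return retrieve($cache);
        }

        # Otherwise parse the text file line by line (keeps memory low)...
        my %categories;
        open my $fh, '<', $src or die "Cannot open $src: $!";
        while (my $line = <$fh>) {
            chomp $line;
            # Assumed format: "id|description" -- replace with the real parser.
            my ($id, $desc) = split /\|/, $line, 2;
            $categories{$id} = $desc if defined $desc;
        }
        close $fh;

        # ...and store the parsed hash so later requests skip the parse entirely.
        store(\%categories, $cache);
        return \%categories;
    }

    my $cats = load_categories($data_file, $cache_file);

Each CGI request would then pay only the cost of deserializing the cache instead of re-parsing 600KB of text, though it obviously doesn't fix the per-request Perl startup cost of plain CGI.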
Thanks,
Ralph