kingkongrevenge has asked for the wisdom of the Perl Monks concerning the following question:
Is language really irrelevant to CGI performance on a typical webserver? I was playing around with microbenchmarks with lighttpd on my localhost and got a huge difference between Perl and C++ CGI programs, but deployed on a remote Apache 1.3 system there was little difference. C++ FastCGI on my localhost was also a few times faster than Perl FastCGI. However, my web host doesn't support FastCGI, so I couldn't extend the comparison.
I wrote a few little test programs: 1) Hello World. 2) Print a list of 500 random numbers. 3) Read in and reverse a file. I made Perl CGI versions (CGI::Simple), C++ CGI versions (cgicc), Perl FastCGI versions (FCGI.pm), and C++ FastCGI versions.
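To give a concrete idea of scale, the Perl CGI version of the random-number program was only a few lines, something like this sketch (the exact output format here is made up, but the shape is right):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI::Simple;

# One process per request: the interpreter starts, compiles this
# script, runs it once, and exits.
my $q = CGI::Simple->new;
print $q->header( -type => 'text/plain' );
print int(rand 1_000_000), "\n" for 1 .. 500;
```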
I benchmarked with "http_load -parallel 1 -seconds 20", and also tried -parallel 2. I ran both lighttpd and http_load on my localhost. The results, measured in fetches per second, were fairly consistent across the three programs and the different http_load options:

- C++ CGI was 13 times as fast as Perl CGI.
- C++ FastCGI was 4 times as fast as Perl FCGI.
- C++ FastCGI was 5 times as fast as C++ CGI.
- Perl FCGI was 15 times as fast as Perl CGI.
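For what it's worth, the Perl FastCGI versions used the standard FCGI.pm accept loop, so the interpreter startup and compile cost is paid once per process rather than once per request. Roughly:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use FCGI;

# Persistent process: perl starts once, then serves requests in a loop.
my $request = FCGI::Request();
while ( $request->Accept() >= 0 ) {
    print "Content-Type: text/plain\r\n\r\n";
    print int(rand 1_000_000), "\n" for 1 .. 500;
}
```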
I then deployed the CGI programs to a remote Apache 1.3 server (this host does not support FastCGI, so I couldn't test those versions there). Performance of the Perl and C++ CGI programs was nearly identical: the C++ versions were at most about 1.13 times as fast.
Is there an explanation for the near-total lack of a performance difference on the remote host, given the extreme difference on my desktop? Do you think there would actually be a scalability payoff to using C/C++ for performance-critical bits in a CGI environment?
This is really just idle musing. Alas, I do not actually have any scalability problems.