I'm exploring Perl/CGI and learning as much as I can about it. I came across a paragraph in a book that explained that tracking users through a site with Perl/CGI is not a good way to go, since servers can get bogged down.

Obviously, however, this site runs at a pretty good clip and is a Perl/CGI solution. I'm guessing that's because this site runs on a very fast server. That leads me to wonder: as servers become faster and faster, will Perl/CGI solutions become more practical?

Any general thoughts on this?

Re: This site
by dws (Chancellor) on Apr 02, 2001 at 01:26 UTC
    CGI relies on forking off a new process to handle each request. That limits CGI's scalability regardless of the language used to implement the program being invoked, because forking (and exec'ing a fresh interpreter) is expensive.
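
    To make that concrete, here's a minimal sketch of a plain CGI script (the script name is made up). Every hit to it costs the server a fork and exec of a brand-new perl process:

        #!/usr/bin/perl -w
        # hello.cgi -- under plain CGI, the web server starts a
        # fresh perl process for every request to this script,
        # compiles it, runs it once, and throws it away.
        use strict;
        use CGI;

        my $q = CGI->new;
        print $q->header('text/html');
        print $q->start_html('Hello'),
              $q->h1('Hello from a freshly forked process'),
              $q->end_html;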

    Technologies such as mod_perl, PHP, or ASP that execute applications "in process" with the web server will, in general, scale better than CGIs.
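
    For contrast, a rough mod_perl (1.x) equivalent; just a sketch, and the package name is invented. The handler is compiled once and then runs inside the Apache child on every matching request, with no per-hit fork:

        package My::Hello;
        # A mod_perl 1.x content handler: loaded and compiled
        # once, then called in-process for each request.
        use strict;
        use Apache::Constants qw(OK);

        sub handler {
            my $r = shift;
            $r->send_http_header('text/html');
            $r->print("<h1>Hello from inside the server</h1>");
            return OK;
        }
        1;

    It gets wired up in httpd.conf with something like "SetHandler perl-script" and "PerlHandler My::Hello" inside a <Location> block.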

    Scalability isn't necessarily the only game in town, though. It's often much easier to develop using CGI scripts, since you can isolate them from the web server and debug them stand-alone. Often the mere fact that you can get something running sooner is the big win.
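
    As an example of that stand-alone debugging (using the hypothetical hello.cgi above): CGI.pm will happily take its parameters from the command line instead of a web request, so you can poke at a script from the shell with no server running at all.

        # exercise the script directly; CGI.pm reads name=value
        # pairs from the command line when there's no request:
        $ perl hello.cgi name=fred color=red

        # run with no arguments and CGI.pm drops into its
        # "offline mode", reading name=value pairs from STDIN.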

Re (tilly) 1: This site
by tilly (Archbishop) on Apr 02, 2001 at 00:26 UTC
    Actually, this site is built on mod_perl.

    And even so, it does not carry a particularly large load, as busy websites go.

      As a stand-in for tilly, I'm providing some links for the next question, which will invariably come up:

      This site is based on the Everything Engine, a link to which is found at the bottom of every page. More mod_perl information is easily obtained from perl.apache.org: documentation, FAQs, the whole enchilada. The database the Everything Engine runs on is (if I remember right) MySQL, which can be obtained from http://www.mysql.com.