silent11 has asked for the wisdom of the Perl Monks concerning the following question:

monks,
I'm in the middle of creating a new personal web site for myself (that sounds a bit redundant). One thing I am focusing on this time is speed, particularly the amount of time it takes for the page to load. The most server-taxing process I am running is a frame that loads a guestbook-type application. I have decided to have a random header image on the page, and my questions are: 1) Would the page load faster if I did some of the scripting on the client side in JavaScript, as opposed to Perl on the server side? And if so, 2) are there indicators that I can look for to know when I should begin doing more of the work in the browser and less on the server?

here are some things I'm taking into account:
  1. the script is small and it probably makes no diff where it's run
  2. some people may not have JavaScript
  3. even if the browser can do it faster, that probably wouldn't be the case for everyone
I'd like to hear your opinions and experience.
-Silent11

Replies are listed 'Best First'.
Re: speed : client or server side
by Ryszard (Priest) on Feb 22, 2002 at 23:19 UTC
    The only thing you can be certain of is how you construct your page. Everything else is variable.

    If your focus is speed, then it really depends on what you are optimising for - broadband or narrowband? If your visitors will reach the site both via a modem and from an office where they have a fat pipe, I'd reckon you'd want to go the path of the lowest common denominator - narrowband.

    From experience, if you're catering mainly to narrowband, then you're looking at around 30Kb page sizes. The method in which you construct your pages is where you'd be looking to optimise.

    Making the assumption your content is dynamic, I'd be looking at indexing strategies in your database, and indeed, what database platform you will be using, and how it is built and tuned.

    As a general thing, keep your db handles open for the duration of your script, and perform multiple fetches on the same handle. Creating the handle is quite expensive, and I've seen many people open a discrete handle each time they want to access their database.

    Use placeholders in your queries; the db will cache the prepared statement and reuse it, rather than re-parsing a new query with hardcoded values each time.
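    A minimal sketch of both points - one long-lived handle, one prepared statement with a placeholder - using DBI with an in-memory SQLite database (the guestbook table and data are invented for illustration; assumes DBI and DBD::SQLite are installed):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Connect once and keep the handle for the life of the script.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1, AutoCommit => 1 });

$dbh->do('CREATE TABLE guestbook (id INTEGER PRIMARY KEY, name TEXT)');

# Prepare once with a placeholder; execute many times.
# The db caches the statement instead of re-parsing each insert.
my $sth = $dbh->prepare('INSERT INTO guestbook (name) VALUES (?)');
$sth->execute($_) for qw(alice bob carol);

# Reuse the same $dbh for every query instead of reconnecting.
my ($count) = $dbh->selectrow_array('SELECT COUNT(*) FROM guestbook');
print "entries: $count\n";   # prints "entries: 3"
```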

    Use mod_perl. It's fast, but you have to have a decent amount of RAM. There are lots of tuning options there for you to make the most of.

    The most obvious thing I can think of is to keep your IO to a minimum - it's expensive. As with everything, though, there is a trade-off: if you embed all your HTML, it makes maintaining your site harder, but if you template everything in existence, you will perform too many IOs. As a general rule I template my sites (HTML::Template) but generate some HTML to fill the gaps.
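    That hybrid approach might look roughly like this (the template text and variable names are invented for illustration; assumes HTML::Template is installed - here the template is held in memory, but in practice it would be one file, i.e. one IO per page rather than one per fragment):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use HTML::Template;

# The page skeleton lives in a template...
my $tmpl_text = <<'HTML';
<html><body>
<h1><TMPL_VAR NAME=TITLE></h1>
<TMPL_VAR NAME=ENTRIES>
</body></html>
HTML

my $template = HTML::Template->new(scalarref => \$tmpl_text);

# ...while the repetitive HTML is generated in code to fill the gaps.
my $entries = join '', map { "<p>$_</p>\n" } ('first post', 'hello');

$template->param(TITLE => 'Guestbook', ENTRIES => $entries);
print $template->output;
```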

    I read a study a while back which stated that a human will perceive sub-2-second lag as one contiguous operation, whereas when lag exceeds 2 seconds the operation becomes fragmented to the human mind, and it begins to break the concentration cycle. When I'm designing sites, I try to get my pages to load in the sub-2s range as a general rule. Unfortunately I don't have the reference handy, otherwise I'd quote it.. ;-(

    HTH

Re: speed : client or server side
by beebware (Pilgrim) on Feb 23, 2002 at 01:33 UTC
    Use Javascript and it'll push up the average page sizes (and page rendering times); use Perl and there will probably be more requests to the server (as bad data is submitted and the page has to be resent)...
    What I would do is use a combination (depending on what exactly is needed). Have Javascript form validation (on the <form onSubmit=> setting - please don't set the 'input type=submit' as 'input type=button', as this breaks older browsers), and then, if it passes the validation, send it on to the server. This way, if they enter garbage in a field (such as letters in a number field etc), you can say 'Bad entry' before the data has to go back to the server and the entire page is resent.
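    A rough sketch of that onSubmit validation (the field name and function names are invented for illustration) - the check itself is a plain function, wired to the form so that returning false stops the submission:

```javascript
// Returns true only if the value is a non-empty run of digits.
function isNumeric(value) {
  return /^[0-9]+$/.test(value);
}

// Attached as <form onSubmit="return validateGuestbook(this);">.
// Returning false cancels the submit, so the server never sees
// the bad data and the page doesn't have to be resent.
function validateGuestbook(form) {
  if (!isNumeric(form.age.value)) {
    alert('Bad entry: age must be a number');
    return false;
  }
  return true; // let the browser submit to the server-side Perl
}
```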
    If you can provide a few more details on what exactly you are trying to achieve, I'll probably be able to provide a few more specific answers.
Re: speed : client or server side
by dmmiller2k (Chaplain) on Feb 23, 2002 at 04:16 UTC

    In my opinion, Javascript is best used as glue to hold together a web application, and for minor field validation (if at all). The load time of a page written in Javascript, compared with one generated by Perl/CGI, may actually increase, with the added disadvantage that the spinning globe (or whatever the browser uses to indicate it is waiting on a request) will stop while the Javascript is rendering the page.

    You will, of course, eliminate the perl compile-time delay :), which is less of an issue if your server has mod_perl or its ilk.

    In terms of its use as glue, Javascript may be used to tie form fields together in a coherent way, such as disabling a set of controls depending upon the state of a checkbox or radio button, etc.

    It also potentially has uses in rendering very large tables, since the download time can be shorter when the data is shipped as Javascript arrays rather than as generated HTML. This last use is controversial, since it obviously won't work if the browser has Javascript disabled.
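    A sketch of that array-driven approach (the data and function name are invented for illustration): the page ships a compact array, and the client expands it into table markup instead of downloading the full <tr>/<td> boilerplate for every row:

```javascript
// Data shipped as a compact array instead of full table markup.
var rows = [
  ['alice', 'Nice site!'],
  ['bob',   'Hello from PerlMonks'],
];

// Expand the array into an HTML table string on the client.
function renderTable(data) {
  var html = '<table>';
  for (var i = 0; i < data.length; i++) {
    html += '<tr><td>' + data[i][0] + '</td><td>' + data[i][1] + '</td></tr>';
  }
  return html + '</table>';
}

// document.write(renderTable(rows));  // or insert via the DOM
```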