I've been going over my project notes for the type of outsourcing I do. After everything's said and done, I notice the same general pattern regardless of the exact type of work to be done; in fact, that pattern is general enough to describe most web transactions.
However, I note that most (all?) of the implementations I've seen do this in one giant script. I'm curious about people's opinions on modularizing these steps, so that you have, say, a login.pl script that, when a user logs on successfully, calls get_request.pl, which calls request_one_options.pl, and so on, with appropriate parameters passed between each script. I realize that security would be an immediate concern. It seems like a session server would be the best (most secure) way to prevent people from faking requests: login.pl would instantiate a session, and then each successive script would check the params it's passed against the session, responding appropriately if the session information doesn't match the params.
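A minimal sketch of that per-script session check, assuming the session created by login.pl has already been loaded into a hash (the field names `user` and `request_id` are hypothetical, not from the post):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Compare the params the client sent back against what login.pl
# recorded in the session; any mismatch is treated as a faked request.
sub session_matches {
    my ($session, $params) = @_;
    for my $field (qw(user request_id)) {
        return 0 unless defined $params->{$field}
                     && $params->{$field} eq $session->{$field};
    }
    return 1;
}
```

Each downstream script (get_request.pl and the rest) would run this check first and bail out to a "please log in again" page on failure.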

Is this too insecure? Is this Just A Bad Idea?

Re: Further contemplations on CGI -- putting a pipe paradigm into scripts.
by Masem (Monsignor) on Feb 13, 2001 at 03:52 UTC
    I'm pretty much doing a similar thing on a site I'm working on: login.pl generates a long string that is a combination of the time, user id, random numbers, and a checksum. This string is passed to successive scripts via a hidden field. This is probably as secure as the least secure method of authentication you use (in my case, not very, since I'm using non-SSL password form fields, but that's not an issue for my site).
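    A sketch of a token along those lines: time, user id, and a random number, plus a checksum over the three. (MD5 is an assumption here; the post doesn't say which checksum was used.)

```perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

# Build the hidden-field token: time:user:random:checksum.
sub make_token {
    my ($user_id) = @_;
    my $base = join ':', time(), $user_id, int(rand(2**31));
    return $base . ':' . md5_hex($base);
}

# Verify that the checksum still matches the rest of the token,
# i.e. that nobody has tampered with the hidden field.
sub check_token {
    my ($token) = @_;
    my ($base, $sum) = $token =~ /\A(.*):([0-9a-f]{32})\z/s
        or return 0;
    return md5_hex($base) eq $sum;
}
```

    Note that a bare checksum like this only detects casual tampering; anyone who knows the token format can forge one, so for real protection the checksum would need a server-side secret mixed in.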

    Once you start playing with the various scripts, however, I would make sure that you save the data for each page in a temporary area somewhere, passing only the authorization code between scripts rather than all those variables; in the latter case you might forget to pass all the variables along to the next script, and upgrading would be a pain in the butt. A database solution, if you already have one working, is fine, but if you don't, it's probably easier to simply write the data out to temporary files (maybe create a directory for each login attempt and write each page's data out by Data::Dump-ing the $cgi object). You'd then need just one script at the end that knows how to incorporate all the data and do the final modifications, whether that means transferring data around in a DB or rereading the temporary files and putting their contents into place. I would also make sure that, whatever solution you use, you have a reaper cron job that uses timestamps to remove old entries from the temporary storage, so that incomplete form sessions don't fill it up over time.
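    A sketch of that scratch-directory idea, using core Data::Dumper in place of Data::Dump (same purpose) and taking a plain param hashref so the example stays self-contained; the base path and file layout are assumptions, not from the post:

```perl
use strict;
use warnings;
use Data::Dumper;
use File::Path qw(mkpath);

# One directory per login attempt; one dump file per panel page.
my $BASE = "/tmp/app-sessions";

sub save_page_data {
    my ($sid, $page, $params) = @_;
    mkpath("$BASE/$sid") unless -d "$BASE/$sid";
    open my $fh, '>', "$BASE/$sid/$page.dump" or die "save: $!";
    print $fh Dumper($params);          # written as Perl code
    close $fh;
}

sub load_page_data {
    my ($sid, $page) = @_;
    return do "$BASE/$sid/$page.dump";  # re-evaluates the dump,
}                                       # returning the hashref

# The reaper could be a cron job along these lines:
#   find /tmp/app-sessions -maxdepth 1 -mmin +60 -exec rm -r {} \;
```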

    Also, given that you're going to have many similar form elements for navigating between pages, I'd suggest grabbing the Template Toolkit (see a recent question I put up on this) to standardize how each 'panel' script prints the HTML that leads to the other scripts.
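    For instance, a shared navigation template that every panel script fills in might look something like this (the file name and the `next_script`/`token` variables are hypothetical, just to show the shape):

```
[%# nav.tt -- shared navigation block included by every panel %]
<form action="[% next_script %]" method="post">
  <input type="hidden" name="auth" value="[% token %]">
  <input type="submit" value="Continue">
</form>
```

    Each script then only supplies the two variables, and the form markup lives in one place.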

    You could also use cookies: once users have passed the login.pl stage, require that a cookie be set and present on further pages; then, when you hit the confirm script that collects all the data and puts it into place, set the cookie's expiry so it disappears when the browser next flushes its cookies. Of course, this requires cookies, which may or may not be how you want to approach the site. (The hidden-field approach above does not require cookies and should work in all browsers.)
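    The two headers involved, sketched by hand (CGI.pm's cookie support produces the same thing; the cookie name `session` is an assumption):

```perl
use strict;
use warnings;

# Issued by login.pl: no expiry, so it lives until the browser
# session ends or we explicitly kill it.
sub login_cookie_header {
    my ($sid) = @_;
    return "Set-Cookie: session=$sid; path=/";
}

# Issued by the confirm script: an expiry date in the past tells
# the browser to discard the cookie next time it flushes them.
sub expire_cookie_header {
    return "Set-Cookie: session=; path=/; "
         . "expires=Thu, 01-Jan-1970 00:00:00 GMT";
}
```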

(fongsaiyuk)Re: Further contemplations on CGI -- putting a pipe paradigm into scripts.
by fongsaiyuk (Pilgrim) on Feb 13, 2001 at 21:26 UTC
    Ok, so this isn't really an answer, but more of a comment regarding those generic web-application steps.

    You might want to check out CGI::Application
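    CGI::Application packages exactly this login/request/confirm flow as "run modes" of one module instead of a chain of separate scripts. A minimal sketch of its shape (the mode names and methods here are hypothetical; requires the CGI::Application module from CPAN):

```perl
package MyApp;
use strict;
use warnings;
use base 'CGI::Application';

sub setup {
    my $self = shift;
    $self->start_mode('login');
    $self->run_modes(
        login   => 'show_login',    # each run mode replaces one
        request => 'show_request',  # stand-alone .pl script
        confirm => 'do_confirm',
    );
}

sub show_login   { return "login form html here" }
sub show_request { return "request form html here" }
sub do_confirm   { return "confirmation html here" }

1;

# The single CGI instance script is then just:
#   use MyApp;
#   MyApp->new->run;
```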

    fong