Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hello,

Being a newcomer to Perl, I am trying something I'm sure is very simple. I'm trying to touch off an external process from within a Perl CGI script. The script is designed to act as an "admin page" for the process, so the process has to run in parallel with the script. Any suggestions?

Many thanks in advance (esp if an RTFM answer),
Mike

Re: touching off external processes within perl CGI scripts
by {NULE} (Hermit) on Nov 26, 2001 at 04:28 UTC
    Hi,

    First off, this is dangerous - or at least it can be. Whatever you do, make sure that no value handed to the CGI program ever reaches the command line without some serious checking first.

    Next, by default anything executed by a CGI program is run as user nobody (or some similarly anonymous, low-privilege user). There is good documentation on Apache's web site that can help you set up scripts to run setuid - basically as whatever user you want. Again, this is dangerous: the whole point of the nobody user is that it cannot do much damage.
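    If you want to see for yourself which user your CGI programs actually run as, a tiny script along these lines (just a sketch, nothing more) will tell you:

    #!/usr/bin/perl -w
    use strict;

    print "Content-type: text/plain\n\n";
    # $> is the effective UID; getpwuid() turns it into a login name.
    print "This CGI runs as: ", scalar getpwuid($>), " (uid $>)\n";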

    Now, it's hard to tell your level of experience from your question, but there are a few things you should know in general about CGI coding (and even more generally about Perl coding). You want to specify use strict; at the top of your code, and use CGI; since this is a CGI program. Using strict forces some good coding habits upon you. Using CGI protects you in other ways - it handles the parsing of your incoming request safely (add Perl's -T switch if you want real taint checking).

    From here, executing system commands is easy if you have the rest of your script working properly. The following (untested) snippet sets up a CGI object and then, upon receiving a certain value from the user, executes a system command:

    #! /usr/bin/perl -w
    use strict;
    use CGI;

    my $q = new CGI;
    my @result;

    # other stuff happens

    if ($q->param('command') eq "command1") {
        # Notice how I pass NOTHING to the
        # system command that was handed to
        # the CGI script.
        @result = `/usr/local/bin/script.sh`;
    } elsif ($q->param('command') eq "command2") {
        # command 2...
    }

    # more stuff happens.
    # Include code to render your html around here,
    # then display the result of your system command here:
    print join "<BR>", @result;
    # Close out your html document.
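    If you end up with more than a couple of commands, a little dispatch table keeps the user's input even further from the shell. This is only a sketch building on the snippet above, and the extra script path is made up:

    # Map allowed command names to fixed, hard-coded programs
    # (these paths are examples - substitute your own).
    my %allowed = (
        command1 => '/usr/local/bin/script.sh',
        command2 => '/usr/local/bin/other.sh',
    );

    my $name = $q->param('command');
    if (defined $name and exists $allowed{$name}) {
        # Only a value from our own table ever reaches the shell.
        @result = `$allowed{$name}`;
    } else {
        @result = ('Unrecognized command.');
    }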
    I hope this gets you off on the right foot. Just please approach this carefully - have your external program run by the least privileged user possible, and pass nothing to the command line that you have not sanitized as thoroughly as humanly possible. You have much reading to do, but you have a good start if you wander around PerlMonks.org Super Search looking for examples of how to get started.

    Good luck,

    {NULE}
    --
    http://www.nule.org

Re: touching off external processes within perl CGI scripts
by Zaxo (Archbishop) on Nov 26, 2001 at 04:34 UTC

    We need more information. Is it a daemon process which must live on after the CGI process is gone? Does it exit with a return code you must check? Do you need to capture the process's output? Each of these is done differently.

    The perlfunc manpage has a useful introduction which lists functions by category. "Processes and Process Groups" is a good start.
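    For example (untested sketches, and /usr/local/bin/job is only a stand-in for your own program):

    # Exit status only: system() waits for the command and returns its status.
    my $status = system('/usr/local/bin/job');
    warn "job exited with status $?" if $status != 0;

    # Captured output: backticks also wait, collecting STDOUT.
    my @output = `/usr/local/bin/job`;

    # Fire-and-forget: fork, then exec in the child so the parent carries on.
    $SIG{CHLD} = 'IGNORE';              # let the system reap the child
    defined(my $pid = fork) or die "fork failed: $!";
    if ($pid == 0) {
        exec '/usr/local/bin/job' or die "exec failed: $!";
    }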

    Update: tid, I wrote a node on starting a daemon process at Re: Daemons???. Your daemon is probably dying because it hasn't made itself a process group leader (via setsid) before the parent exits, so it is still part of the web server's process group and gets cleaned up along with it. Second: you may want to arrange for your daemon to restart or re-read its config on, say, SIGHUP. Then kill 'HUP', $pid; would suffice.
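    A rough sketch of the SIGHUP idea - it assumes the daemon writes its pid to a file, and the pid file path here is made up:

    # Inside the daemon: note the signal, re-read config in the main loop.
    my $got_hup = 0;
    $SIG{HUP} = sub { $got_hup = 1 };

    while (1) {
        if ($got_hup) {
            $got_hup = 0;
            # re-read the configuration file here
        }
        # ... the daemon's real work ...
        sleep 1;
    }

    # From the admin CGI (a separate program): signal the running daemon.
    # The pid file path is only an example.
    open PID, '< /var/run/mydaemon.pid' or die "no pid file: $!";
    chomp(my $pid = <PID>);
    close PID;
    kill 'HUP', $pid or die "couldn't signal process $pid: $!";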

    After Compline,
    Zaxo

Re: touching off external processes within perl CGI scripts
by tid (Beadle) on Nov 26, 2001 at 05:28 UTC
    Hi Guys,

    Thanks for the info. To give a bit more information: I need to be able to control another daemon that operates very much like a webserver - it needs to be largely independent and must persist beyond the life of the Perl script. As it's persistent, I'm fairly obviously not looking for a return value. exec "never returns"; system waits for a return value, hanging my Apache, as do backticks and qx//. Occasionally I even read the manuals :) Admittedly, I have not tried leaving some form of "hanging pipe", but that didn't appear to guarantee persistence beyond the life of the script.
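    The direction I'm currently thinking of trying - completely untested; POSIX::setsid looks like the relevant call, and the daemon path is just a placeholder - is something like this:

    use POSIX qw(setsid);

    sub launch_daemon {
        defined(my $pid = fork) or die "fork failed: $!";
        return if $pid;    # parent (the CGI) carries on straight away

        # Child: detach completely so Apache isn't left waiting on us.
        setsid() or die "setsid failed: $!";
        chdir '/';
        open STDIN,  '</dev/null';
        open STDOUT, '>/dev/null';
        open STDERR, '>/dev/null';

        exec '/usr/local/bin/mydaemon' or die "exec failed: $!";
    }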

    As for the security of the CGI script, I'm aware of taint checking and of the need to keep any user input away from the shell. The current problem with *that* is that the deployment system is not yet fully defined (joy!). But system security is not a *huge* issue, as the user never sees the commands being executed at the shell and cannot enter anything anyway (not at the moment, at least - when I add file uploads, the story will change).

    Again, many thanks for the replies!
    Mike.

    PS, Got an account now :)
Re: touching off external processes within perl CGI scripts
by Purdy (Hermit) on Nov 26, 2001 at 04:51 UTC
    Not sure if this is exactly what you're shooting for (we need more details), but I wrote an article on this site about forking a process, based on the research I did on it. Randal Schwartz also wrote an article on the same approach for Web Techniques.

    Jason