Alfaromeo has asked for the wisdom of the Perl Monks concerning the following question:

Hi All,

Please check the CGI script below, which executes a simple list command on a remote box and prints the output on an HTML page. Currently the page is displayed only after the command execution is done. What I need to achieve is to display the page immediately after starting the command, and then show the output of the command as and when it comes.

Please let me know if you have any ideas on how to proceed. I posted here with a lot of expectations :)
use strict;
use CGI qw(:all);                                     # import shortcuts
use Fcntl qw(:flock);                                 # imports LOCK_EX, LOCK_SH, LOCK_NB
use CGI::Carp qw(warningsToBrowser fatalsToBrowser);  # for debugging
use Net::SSH::Perl;

print header;

my (
    $TITLE,        # page title & header
    $DeploySvr,
    $KIT,
    $ssh,
    $UNM,
    $Pass,
    $cmd,
);

$TITLE     = "UNIX Deployment Tool";
$DeploySvr = param('DrpServer');
$KIT       = param('TxtKit');
$UNM       = param('username');
$Pass      = param('password');

$ssh = Net::SSH::Perl->new('113.128.160.214');
$ssh->login($UNM, $Pass);

$cmd = "ls -l";
my ($stdout, $stderr, $exit) = $ssh->cmd($cmd);

print start_html($TITLE);
print h1($TITLE);
print hr, start_form;
print $stdout;
print $stderr;
print endform, hr;
print h2("Prior Messages");
print end_html;
Thanks

Replies are listed 'Best First'.
Re: Asynchronous Processing a command execution
by Corion (Patriarch) on Jun 18, 2008 at 11:01 UTC

    Your approach does not work because you have structured your program so that it collects all the information before it starts printing any output.

    The standard approach for running long processes is Watching Long Processes Through CGI, which you can adapt to your needs by launching an external program which writes to a file.
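    A minimal sketch of that pattern (the fixed log path and the `watch` parameter are placeholders for illustration; merlyn's article uses a per-session cache instead of a fixed file):

    ```perl
    #!/usr/bin/perl
    # Sketch of the "write to a file, watch it via refresh" pattern.
    # Assumes a fixed log path for brevity; a real script should use a
    # per-session file or cache, as in merlyn's article.
    use strict;
    use warnings;
    use CGI qw(:standard escapeHTML);

    my $log = "/tmp/deploy.log";    # hypothetical log location

    if (param('watch')) {
        # Watcher: reread the log file on every client-pull refresh.
        print header,
              start_html(-title => 'Watching...',
                         -head  => ['<meta http-equiv="refresh" content="5">']),
              h1('Command output so far');
        if (open my $fh, '<', $log) {
            local $/;    # slurp the whole file
            print pre(escapeHTML(<$fh>));
            close $fh;
        }
        print end_html;
    }
    else {
        # Launcher: fork, let the child run the command with its output
        # redirected to the log, and send the browser to the watcher.
        defined(my $pid = fork) or die "Cannot fork: $!";
        if ($pid == 0) {                       # child
            close STDOUT;                      # lets Apache finish the request
            system("ls -l > $log 2>&1");       # long-running command goes here
            exit 0;
        }
        param('watch', 1);
        print redirect(self_url());
    }
    ```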

      #!/usr/bin/perl
      use strict;
      $|++;
      use CGI qw(:all);                                     # import shortcuts
      use Fcntl qw(:flock);                                 # imports LOCK_EX, LOCK_SH, LOCK_NB
      use CGI::Carp qw(warningsToBrowser fatalsToBrowser);  # for debugging
      use Net::SSH::Perl;
      use CGI qw(:all delete_all escapeHTML);

      print header;

      my ($TITLE, $DeploySvr, $KIT, $ssh, $UNM, $Pass, $stdout, $stderr,
          $exit, $cmd, $session, $cache, $data, $pid);

      if (my $session = param('session')) {   # returning to pick up session data
          $cache = get_cache_handle();
          $data  = $cache->get($session);
          unless ($data and ref $data eq "ARRAY") {   # something is wrong
              exit 0;
          }
          print start_html(-title => "Logging...",
              ($data->[0] ? () : (-head => ["<meta http-equiv=refresh content=5>"])));
          print h1("Logging...");
          print pre(escapeHTML($data->[1]));
          print p(i("... continuing ...")) unless $data->[0];
          print end_html;
      }
      else {
          ExecuteProcess();
      }

      sub ExecuteProcess {
          $session = get_session_id();
          $cache   = get_cache_handle();
          $cache->set($session, [0, ""]);   # no data yet

          $DeploySvr = param('DrpServer');
          $KIT       = param('TxtKit');
          $UNM       = param('username');
          $Pass      = param('password');

          if ($pid = fork) {           # parent does
              delete_all();            # clear parameters
              param('session', $session);
              print redirect(self_url());
          }
          elsif (defined $pid) {       # child does
              close STDOUT;            # so parent can go on
              $ssh = Net::SSH::Perl->new('113.128.122.27');
              $ssh->login($UNM, $Pass);
              $cmd = "ls -l";
              my ($stdout, $stderr, $exit) = $ssh->cmd($cmd);
              my $buf = "";
              while ($stdout) {
                  $buf .= $_;
                  $cache->set($session, [0, $buf]);
              }
              $cache->set($session, [1, $buf]);
              exit 0;
          }
          else {
              die "Cannot fork: $!";
          }
      }

      sub get_cache_handle {
          require Cache::FileCache;
          Cache::FileCache->new({
              namespace           => 'LogOutput',
              username            => 'nobody',
              default_expires_in  => '30 minutes',
              auto_purge_interval => '4 hours',
          });
      }

      sub get_session_id {
          require Digest::MD5;
          Digest::MD5::md5_hex(Digest::MD5::md5_hex(time() . {} . rand() . $$));
      }
      In the above code, I tried to fork the process. This CGI gets its information from another HTML page which POSTs into it: it connects to a remote box with the SSH module, using the username and password provided by the HTML form, and lists the output in the web page. On execution it forks the process, but I am unable to display the messages from the forked command. Also, the web page seems to keep waiting, so I think Apache has not received the end signal. Please review and let me know.

        Please show some effort on your part by reducing the problems to the absolute minimum of code. In your case, the problem is that your child process running the ssh command does not produce output. This is because you wrongly transcribed merlyn's code:

        # Original code:
        my $buf = "";
        while (<F>) {
            $buf .= $_;
            $cache->set($session, [0, $buf]);
        }
        $cache->set($session, [1, $buf]);
        exit 0;

        Your code does not read from the STDOUT of the SSH process. In fact, I don't think you understood how Net::SSH::Perl works at all. Its cmd method returns the output of the command only after the command has finished running, so using Net::SSH::Perl won't help you, at least not in the way you've used it here:

        $cmd = "ls -l";
        my ($stdout, $stderr, $exit) = $ssh->cmd($cmd);
        my $buf = "";
        while ($stdout) {
            $buf .= $_;
            $cache->set($session, [0, $buf]);
        }
        $cache->set($session, [1, $buf]);

        Please review your code and the documentation for Net::SSH::Perl, and consider how the two can be made to work together in your program.
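        One way to get incremental output from Net::SSH::Perl is its register_handler method: under the SSH-2 protocol you can register "stdout" and "stderr" handlers that receive each chunk of output as it arrives, before cmd returns. A hedged sketch, assuming the $cache/$session setup from the forked child above ($host, $user, and $pass are placeholders):

        ```perl
        # Sketch: stream command output into the session cache chunk by
        # chunk, using Net::SSH::Perl's output handlers (SSH-2 only).
        # Assumes $cache and $session exist as in the code above;
        # $host/$user/$pass are placeholders.
        use strict;
        use Net::SSH::Perl;

        my ($host, $user, $pass) = ('113.128.122.27', $UNM, $Pass);

        my $ssh = Net::SSH::Perl->new($host, protocol => 2);
        $ssh->login($user, $pass);

        my $buf = "";
        $ssh->register_handler(stdout => sub {
            my ($channel, $buffer) = @_;       # Channel and Buffer objects
            $buf .= $buffer->bytes;            # grab this chunk of output
            $cache->set($session, [0, $buf]);  # partial output, still running
        });

        $ssh->cmd("ls -l");                    # blocks, but the handler fires
        $cache->set($session, [1, $buf]);      # mark the run as finished
        ```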

Re: Asynchronous Processing a command execution
by casiano (Pilgrim) on Jun 18, 2008 at 12:26 UTC
    You can use "Client Pull", also known as "meta refresh". Include a directive like this in the generated HTML:
    <HEAD>
        <META HTTP-EQUIV="Refresh" CONTENT="2">
        <TITLE>Page</TITLE>
    </HEAD>
    In this example the page to reload is the current page since no URL attribute has been specified.
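    The same effect can be produced from the server side: CGI.pm passes unrecognized named arguments to header() through as HTTP headers, so a client-pull Refresh header can be emitted without touching the HTML. A sketch (the 2-second interval is a placeholder):

    ```perl
    use strict;
    use CGI qw(:standard);

    # Emit an HTTP "Refresh: 2" header, equivalent to the META tag above.
    # Append "; URL=..." to the value to reload a different page instead
    # of the current one.
    print header(-type => 'text/html', -Refresh => '2'),
          start_html('Page'),
          p('Output so far...'),
          end_html;
    ```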

    Hope it helps

    Casiano

Re: Asynchronous Processing a command execution
by apl (Monsignor) on Jun 18, 2008 at 11:59 UTC
    Slightly OT... If you surround your code with <code> and </code> tags, it will be easier for us to read your script.