heezy has asked for the wisdom of the Perl Monks concerning the following question:

Environment (don't laugh, if I could change I would)...

Overview...

A require 'fileX.pl' statement only seems to work when run from the command line, not when the script is run as a CGI through the SunOneWebServer. (When run as a CGI, the require seems to cause the termination of my CGI script?)

Details...

I have a simple procedure that just returns an array of files in a directory and displays them ($cgi is a CGI object that is visible to this proc):

sub getVcolFileNames {
    my @vcolFileArray;
    opendir DH, $vcol_dir or die "Cannot open $vcol_dir: ($!)\n";
    foreach $file (readdir DH) {
        next unless $file =~ /\.vcol$/;
        push @vcolFileArray, $file;
        print $cgi->p("$file");
    }
    closedir DH;
    return @vcolFileArray;
}

If I put this code in my main CGI file it is executed perfectly and the file names are printed to the web browser.

However, if I put the code in a separate file and then use...

require 'vcol-utils.pl'

... to pull in the subprocedure and then call it from my CGI, the CGI is not processed any further than the require statement. But if I execute the CGI from the command line it works fine and the subprocedure (from the required file) is called and executed normally.

I have the following at the top of my CGI...

use CGI;
$cgi = new CGI;
use lib('.');
print $cgi->h1("Test1");
require 'vcol-utils.pl';
print $cgi->h1("Test2");

... the word "Test2" is never displayed when run as a CGI but the word "Test1" is??

Any pointers/views/help on this?

M

Re: require fails in CGI
by sri (Vicar) on Oct 10, 2003 at 18:03 UTC
    Just use absolute paths in your script!
    I don't think that your webserver runs from your cgi-bin. ;]

      You're right, it doesn't use a cgi-bin.

      Wow! And you were right about the other thing as well. If I put the entire path to the script in the require, it works a treat. Thanks!! :)

      require 'c:\s1ws\WebServer6.1\docs\vcol-manager\vcol-utils.pl';

      Why?

      -M

        Because Perl searches the directories in @INC when you require or use a file, and the current working directory is always included in that list. When you tested your CGI script from the command line you cd'd into its directory and then ran it, so the relative require worked. Your web server does not do this - it runs the script without changing its working directory to the CGI directory.

        You could also have solved the problem with:

        use lib 'c:\s1ws\WebServer6.1\docs\vcol-manager';
        require 'vcol-utils.pl';
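
        A minimal sketch of another option, assuming the core FindBin module is available: it resolves the directory the running script lives in, so the require works regardless of the server's working directory.

        use FindBin;
        use lib $FindBin::Bin;    # add the script's own directory to @INC
        require 'vcol-utils.pl';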
Re: require fails in CGI
by liz (Monsignor) on Oct 10, 2003 at 17:59 UTC
    Something bad is happening when you do:
    require 'vcol-utils.pl';

    If you can't check the server's error log, why not add some code to find out what went wrong?:

    eval { require 'vcol-utils.pl' };
    print "Loading of utils: $@\n";

    I bet the error message will tell you what is wrong... ;-)
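
    A minimal sketch of the same idea, assuming the core CGI::Carp module: its fatalsToBrowser option sends fatal errors (including a failing require) to the browser while you debug.

    use CGI::Carp qw(fatalsToBrowser);    # debugging aid - remove for production
    require 'vcol-utils.pl';              # a failure now shows its message in the browser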

    Liz

Re: require fails in CGI
by Aristotle (Chancellor) on Oct 10, 2003 at 18:01 UTC
    Which directory is vcol-utils.pl in and what's the value of your @INC path?

    Makeshifts last the longest.

      same directory. I was hoping that my...

      use lib('.')

      ... would sort that out

        Are you sure that your script's current working directory when called from the webserver is the directory it's in?
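
        A minimal sketch to check, assuming the core Cwd module: have the CGI print its working directory and @INC, then compare them with the directory vcol-utils.pl lives in.

        use Cwd qw(getcwd);
        print $cgi->pre( "cwd: " . getcwd() . "\nINC: @INC" );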

        Makeshifts last the longest.

Re: require fails in CGI
by hardburn (Abbot) on Oct 10, 2003 at 20:05 UTC

    You can do the same thing getVcolFileNames() does in two lines:

    my @files = <"$vcol_dir/*.vcol">;
    print map $cgi->p($_), @files;
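
    One caveat: the glob returns paths prefixed with $vcol_dir, whereas getVcolFileNames() returned bare file names. A minimal sketch that strips the directory part, assuming the core File::Basename module:

    use File::Basename qw(basename);
    my @files = map basename($_), <"$vcol_dir/*.vcol">;
    print map $cgi->p($_), @files;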

    ----
    I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
    -- Schemer

    Note: All code is untested, unless otherwise stated