seaver has asked for the wisdom of the Perl Monks concerning the following question:

Dear all,

I don't get this problem if I run the software locally, but it crashes with a segmentation fault if I try to run it via CGI.

Basically I have an XML-RPC CGI file set up, and when the method inside it is called, it runs the main external Perl script.

This Perl script, amongst other things, uses two external C programs to do some of the work...

Here are the two lines that run the two programs:
qx "$msms $coords1 $coords2 $$Vertex_Density >$mslog 2>&1"; qx "$hbplus $pdbfile >$hblog 2>&1";

The first program runs without a hitch; the second is the one that crashes, even though the main Perl script runs fine locally.

I've pinpointed two possible areas of fault:

  1. The second program is designed so that its output goes into the directory it is run from, so there is a chdir $path call before it. This chdir works, but could the change of directory be crashing the program?
  2. The first program is run from a fork deliberately created by Perl, because it is run twice, in parallel. There is no deliberate fork for the second, as it runs once on its own. Should this be the case? (There is a rough sketch of the layout below.)
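Roughly, the layout is like this (a simplified sketch, not my exact code; $coords1a, $mslog1 and friends stand in for the real variable names):

    # run msms twice in parallel (simplified)
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        qx "$msms $coords1a $coords2a $$Vertex_Density >$mslog1 2>&1";
        exit 0;
    }
    qx "$msms $coords1b $coords2b $$Vertex_Density >$mslog2 2>&1";
    waitpid($pid, 0);

    # hbplus writes its output into the current directory, so change there first
    chdir $path or die "chdir $path failed: $!";
    qx "$hbplus $pdbfile >$hblog 2>&1";
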
thanks
Sam Seaver

PS: there is no output, just "Segmentation Fault". PPS: all UIDs are go. I hope.


Re: External C ran from External Perl ran from CGI
by halley (Prior) on May 16, 2003 at 18:25 UTC

    If your script isn't collecting the output, don't use backticks (or in this case, the qx// operator).

    Use the system() command instead, and investigate your problems with the returned value.
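
    For example (an untested sketch, using the variables from the original post), something like this reports whether the child failed to start at all, exited with an error, or was killed by a signal (a segfault is signal 11):

        my $rc = system("$hbplus $pdbfile >$hblog 2>&1");
        if ($rc == -1) {
            print STDERR "failed to run $hbplus: $!\n";   # the command could not be started at all
        }
        elsif ($? & 127) {
            printf STDERR "%s died on signal %d%s\n",
                $hbplus, ($? & 127), (($? & 128) ? " (core dumped)" : "");
        }
        else {
            printf STDERR "%s exited with status %d\n", $hbplus, $? >> 8;
        }

    Note that with the shell redirection in there, the command goes through /bin/sh, so a segfault in $hbplus will usually surface as an exit status of 139 (128 + 11) rather than as a signal in $? directly.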

    --
    [ e d @ h a l l e y . c c ]

Re: External C ran from External Perl ran from CGI
by graff (Chancellor) on May 16, 2003 at 19:31 UTC
    I don't get this problem if I run the software locally, but it crashes with a segmentation fault if I try to run it via CGI.

    This seems to come up a lot. Usually, it happens because when you run it "locally", your personal user account has permission to read, write and/or execute particular files in cited directories, whereas when it runs via a web server, the web server's user account (usually "nobody") does not have the same permission. Check permissions on the various directories and files involved.
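
    One quick way to check is a throwaway CGI (or a few extra lines in the existing one) that reports what the web server's user can actually see. A rough sketch, with made-up paths to substitute:

        #!/usr/bin/perl
        print "Content-type: text/plain\n\n";
        print "effective uid: $>   effective gid: $)\n";
        for my $f ('/path/to/hbplus', '/path/to/workdir', '/path/to/input.pdb') {
            printf "%s  read=%d write=%d exec=%d\n",
                $f, (-r $f ? 1 : 0), (-w $f ? 1 : 0), (-x $f ? 1 : 0);
        }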

    update: This chdir works, but does the change in dir crash the program?

    If this were part of the problem, it most likely would have failed when you ran it "locally".

    The first program is run from a fork deliberately created by Perl, because it is run twice, in parallel. There is no deliberate fork for the second as it runs once on its own. Should this be the case?

    I suppose it might depend on what these programs are supposed to be doing. Could there be a problem involving resources that should be locked (to avoid concurrent access), but aren't?
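
    If that does turn out to be the issue, one simple guard is an exclusive lock around the critical section. A minimal sketch (the lock-file path here is made up):

        use Fcntl qw(:flock);
        open(my $lock, '>', '/tmp/msms.lock') or die "can't open lock file: $!";
        flock($lock, LOCK_EX)                 or die "can't take lock: $!";
        # ... run the external program / touch the shared files here ...
        flock($lock, LOCK_UN);
        close($lock);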

      It's the external C program, $hbplus, that keeps crashing. I wrote a separate CGI to run it alone, and it still crashes.

      I've used both qx and system, and the only output I get from either is a line in my httpd log that says "Bad header: Segmentation Fault".

      I've tried adding a little extra output to the C program (the source code is available) right at the beginning, so I could begin to trace the crash, but it won't even print those lines; it crashes straight away.

      Does this give anyone an idea of what is going on? I even set up $hbplus in '/home/tomcat' so that when httpd runs $hbplus, it is writing the output into its own directory, but that still doesn't work.

      Does anyone know of a good CGI/'C' mailing list?

      Thanks Sam

      I don't get this problem if I run the software locally, but it crashes with a segmentation fault if I try to run it via CGI.

      This seems to come up a lot. Usually, it happens because when you run it "locally", your personal user account has permission to read, write and/or execute particular files in cited directories, whereas when it runs via a web server, the web server's user account (usually "nobody") does not have the same permission. Check permissions on the various directories and files involved.

      httpd.conf has the user set to "tomcat". The program ($hbplus) itself is owned by root, but has these permissions: rwxr-xr-x.

      The directory into which the resulting file is created (cwd) is owned by "tomcat", and is writable. To prove that, the log file created by '>$hblog' IS created in the same directory.

      However, the log file is empty, so it seems like the program crashed before it even started...