You'd probably need to post some code, to be honest. My initial guess, though, is that the CGI script doesn't have the right permissions to execute.
As I say, though, it's just a guess at the moment.
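If you want to rule that out quickly, something like this shows the mode bits and whether the file is executable (the path is only a placeholder for your script; note that -r/-x are tested for whoever runs the check, so compare against the user the web server runs as):
my $script = '/path/to/cgi-bin/script.cgi';   # placeholder path
my @st = stat $script or die "stat $script: $!";
printf "%s: mode %04o, readable %s, executable %s\n",
    $script, $st[2] & 07777, (-r _ ? 'yes' : 'no'), (-x _ ? 'yes' : 'no');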
Hi,
first of all, thanks for your quick answer.
Talking about the issue, the permissions are correct, because the CGI works until a dbmopen, dbmclose, tie or untie call is reached.
I have many CGI scripts with embedded HTML, and when I request one that uses any of the functions mentioned above, the result is only partially shown in the browser:
use CgiLib;
use ScmConf;
use ScmLib;
use LWP::Simple;
use AnyDBM_File;
&ScmConf::Init();
$starting_time = time;
&PrintHead(*STDOUT,"Document Index", "");
&ReadParse() ||
&PrintErrorMessage("Error in parameter list");
$key = $input{"key"};
$key =~ s/\s*$//;
($search_string = $key) =~ s/-/ /g;
$selected_category = $input{"category"};
@selected_status = split(",", $input{"status"});
if ($key =~ /^\w\w\w ?\d\d\d\d\d[ \w]*$/) {
print "<br><strong>Note</strong>: Search time can be greatly improve
+d if a document number is specified using dashes '-' between prefix,
+regnumber, variant and doctype<br><br>\n";
}
print "<table style=\"border: 1px solid #666666; margin-left: 20px; ma
+rgin-right: 20px;\" cellpadding=\"2\" cellspacing=\"4\">\n";
print " <tr>\n";
print " <th class=\"colheader\">Number<br><nobr>[Document Info]</no
+br></th>\n";
print " <th class=\"colheader\">Title<br>[Document]</th>\n";
print " <th class=\"colheader\">Author</th>\n";
print " <th class=\"colheader\">Status</th>\n";
print " <th class=\"colheader\">Last Updated</th>\n";
print " </tr>\n";
&flush();
my $count = 0;
my $count2 = 0;
my @result;
tie(%DOC, 'AnyDBM_File', &documentdb, O_RDONLY, 0664);
And at this point it fails and shows the following message in the "error_log" file:
child pid 29271 exit signal Segmentation fault (11), possible coredump in /export/home
But on the other hand, when I request a CGI script that doesn't use any of these functions, it is shown correctly.
I have also tried to execute a simple Perl script outside of the server, on the command line:
my $db = shift(@ARGV);
my $file = shift (@ARGV);
my (%DOC);
open (SAL, ">$file");
dbmopen (%DOC, $db, 0666);
foreach $keys (%DOC)
{@a = split ('\0', $DOC{$keys});
$q = join ('-',@a);
print SAL $keys ."=".$q."\n";}
dbmclose (%DOC);
close (SAL);
exit 0;
and it generates the error: Segmentation fault (core dumped)
Because of all this, I'm a little confused about what I could do to solve it.
Thanks again for your help
foreach $keys (%DOC)
{@a = split ('\0', $DOC{$keys});
$q = join ('-',@a);
print SAL $keys ."=".$q."\n";}
You probably meant the following (I'm also reformatting the code); note that foreach $keys (%DOC) loops over the hash's interleaved keys and values, not just its keys:
foreach $key (keys %DOC) {
@a = split ('\0', $DOC{$key});
$q = join ('-', @a);
print SAL $key ."=".$q."\n";
}
Aside from posting the code, the excerpt from Apache's error_log that shows the actual error message would help a lot too.
On the other hand, please note that BerkeleyDB files are platform dependent, so you can't use a BDB from a GNU/Linux box on a Solaris SPARC.
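One more thing worth comparing on both machines is which DBM implementation AnyDBM_File actually resolves to (it tries NDBM_File, DB_File, GDBM_File, SDBM_File and ODBM_File, in that order), since files written by one implementation are generally unreadable by the others:
use AnyDBM_File;
# After loading, @AnyDBM_File::ISA holds just the implementation that was picked.
print "AnyDBM_File is using: @AnyDBM_File::ISA\n";
If the two machines report different modules (or different library versions), the database files they produce generally can't be shared between them.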
Hi,
the error message is: child pid xxxx exit signal Segmentation fault (11), possible coredump in /export/home, and it generates a core file.
As I wrote in the previous message, this error also appears when I execute a simple Perl script on the command line.
About BDB, I downloaded the package from sunfreeware, selecting the one valid for x86/Solaris 10 (I forgot to mention that the platform on which I'm trying to run the CGI scripts is x86).
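If it helps, I can also compare the Berkeley DB version that Perl itself is linked against on each machine, assuming DB_File is the implementation that AnyDBM_File ends up picking:
use DB_File;
# $DB_File::db_version reports the Berkeley DB library DB_File was built with.
print "DB_File $DB_File::VERSION, Berkeley DB library $DB_File::db_version\n";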
Thanks
Well, I think I have discovered "why". These are the steps I have followed:
- The files I'm trying to open with "dbmopen/tie" already contain data, but that data wasn't generated on the server where I'm getting the error.
- These ".dir" and ".pag" files were copied from a workstation to the server via FTP, because they contain useful info to keep (a quick checksum comparison, sketched right after this list, would confirm the copy is byte-identical).
- On the workstation (where these files were created and updated), the same code as on the server opens and closes them without problems.
- But after the FTP transfer (the files have the same length on both machines), the files on the server are unreadable when I try to use "dbmopen" or "tie" on them.
- However, if I write a Perl script that creates and then reads its own ".dir" and ".pag" files, it runs OK.
- So, without all these pre-existing ".dir"/".pag" files that already contain info, all the CGIs are shown OK.
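(To be sure the FTP copy itself is byte-identical, I can compare checksums on both sides. A minimal sketch, with 'documents' as a placeholder base name for the real files:)
use strict;
use warnings;
use Digest::MD5;
# Run the same check on the workstation and on the server and compare digests.
for my $file ('documents.dir', 'documents.pag') {
    open my $fh, '<', $file or die "cannot open $file: $!";
    binmode $fh;
    print Digest::MD5->new->addfile($fh)->hexdigest, "  $file\n";
    close $fh;
}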
I have analyzed this situation deeply on both machines, and here are the results:
* Server (SunOS 5.10 - Spanish, Perl 5.8.4) --> the "dir/pag" files created on the workstation don't work here
* Workstation 1 (SunOS 5.8 - English, Perl 5.8.7) --> the "dir/pag" files created on the server don't work here
I have also checked the "dir/pag" files on another workstation:
* Workstation 2 (SunOS 5.8 - English, Perl 5.8.6) --> the "dir/pag" files created on the server don't work here; however, the files from Workstation 1 work fine.
I'm afraid this could be a problem with the OS. What do you think?
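In the meantime, the workaround I'm considering is to dump each database to plain text on the machine that can still read it, transfer the text file, and rebuild the DBM natively on the other side. A rough sketch (it assumes keys contain no '=', that keys and values contain no newlines, and that values contain no '|' characters; 'portable_dbm.pl' is just a name I made up):
#!/usr/bin/perl
# Move DBM data between machines as plain text, so each side rebuilds
# the .dir/.pag files with its own native DBM implementation.
# Usage:  portable_dbm.pl dump documents documents.txt   (on the old machine)
#         portable_dbm.pl load documents.txt documents   (on the new machine)
use strict;
use warnings;
use AnyDBM_File;
use Fcntl;

my ($mode, $from, $to) = @ARGV;
die "usage: $0 dump|load FROM TO\n" unless $mode && $from && $to;

if ($mode eq 'dump') {
    tie my %DOC, 'AnyDBM_File', $from, O_RDONLY, 0664 or die "tie $from: $!";
    open my $fh, '>', $to or die "open $to: $!";
    while (my ($k, $v) = each %DOC) {
        $v =~ s/\0/|/g;                 # make the NUL-separated fields printable
        print $fh "$k=$v\n";
    }
    close $fh;
    untie %DOC;
}
else {
    tie my %DOC, 'AnyDBM_File', $to, O_RDWR | O_CREAT, 0664 or die "tie $to: $!";
    open my $fh, '<', $from or die "open $from: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my ($k, $v) = split /=/, $line, 2;
        $v =~ s/\|/\0/g;                # restore the original NUL separators
        $DOC{$k} = $v;
    }
    close $fh;
    untie %DOC;
}
That way each machine ends up with ".dir"/".pag" files written by its own DBM library, which should sidestep whatever the incompatibility is.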