in reply to problem with do and file existence.

I don't really understand your question. Maybe it would help if you showed how those 2 chunks of code exist in relation to each other. I think a little more explanation would help us help you.

-sauoq
"My two cents aren't worth a dime.";

Re^2: problem with do and file existence.
by emilford (Friar) on Oct 04, 2005 at 15:43 UTC
    Sorry for the lack of clarity. We have our main file, say homepage.cgi, that will execute 1 of 2 Perl files: x.cgi and y.cgi. Both x.cgi and y.cgi worked correctly, being called by homepage.cgi's do. Someone else went in and modified x.cgi to include the if (-e "/some/file") code that I mentioned above. Executing x.cgi directly still works, but I now get an error message from the do in homepage.cgi, complaining that the file checked in x.cgi does not exist. So the issue is that something about the file check in x.cgi causes the do to dork up in homepage.cgi. Hope that makes better sense.

      Yes, that makes much more sense. Your problem is that the -e is setting $! when the file doesn't exist, and $! stays set when you get back to your homepage.cgi. And apparently your x.cgi script is returning undef. I'd check the logic in x.cgi to see if that new if statement causes a return.
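A quick way to see the side effect described above (a hypothetical sketch; the path is made up): when the file doesn't exist, -e both returns undef and leaves the failed stat's error in $!, which stays set until something else overwrites it.

```perl
#!/usr/bin/perl
use strict;
use warnings;

$! = 0;                                   # clear errno first
my $exists = -e "/no/such/file/surely";   # the stat underneath fails

# -e returned undef (not merely false), and the failed stat set $!
print defined $exists ? "defined\n" : "undef\n";   # prints "undef"
print "errno is now: $!\n";   # e.g. "No such file or directory"
```

So even though nothing in your own code "failed", $! already holds an error message by the time control returns to homepage.cgi.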

      Update: To elaborate, if you want to do manual error checking like that, you need to return something useful from the file you do. Otherwise, the last expression evaluated in the file becomes do's return value, and if that value is undef you can't tell it apart from the undef that do returns for an error or a non-existent file.
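A minimal illustration of that advice (hypothetical stand-ins written to temp files, not the actual x.cgi): end the file you do with an explicit true value, so the caller can tell success from failure.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Stand-in for the broken x.cgi: the -e check on a missing file is the
# last expression evaluated, so do() returns undef and $! gets set.
my ($fh1, $broken) = tempfile(SUFFIX => '.cgi');
print $fh1 'my $found = -e "/no/such/file";' . "\n";
close $fh1;

my $r1 = do $broken;    # undef -- indistinguishable from a real failure

# Fixed version: same check, but finish with an explicit true value.
my ($fh2, $fixed) = tempfile(SUFFIX => '.cgi');
print $fh2 'my $found = -e "/no/such/file";' . "\n" . "1;\n";
close $fh2;

my $r2 = do $fixed;     # 1 -- success is now distinguishable

# How homepage.cgi ought to check the result:
defined(do $fixed)
    or die "couldn't run file: compile error [$@] or read error [$!]";
```

With the trailing 1;, do returns a defined true value on success, and an undef return really does mean a compile error (reported in $@) or a file that couldn't be read (reported in $!).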

      -sauoq
      "My two cents aren't worth a dime.";
      
        It doesn't cause a return, but it is the last bit of code executed in x.cgi. But why would the file not existing cause x.cgi to return undef? How can this be resolved?