Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi one and all

I'm still trying to find out whether there's a resource limit on my server, but perhaps someone could shed some light on the situation in the meantime...

It's a series of Perl scripts - all variables are passed via either the GET or POST method through a single 'index.cgi' file.

The system is multi-lingual (depending upon user setting) and I have the various language libraries outside of webspace in their respective directories (ie: .../langs/eng .../langs/esp .../langs/fre etc...).

Depending upon the 'action' of the user, various files are 'required' and the appropriate subroutine(s) is/are duly called. (Top-level actions being the 'Main Menu' for example, and 2nd-level actions being 'Main Menu --> User Config' - etc. You get the idea.)
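
(Schematically, the dispatch is something like the sketch below - made-up paths and names, not my actual code:)

#!/usr/bin/perl
# hypothetical sketch: map the 'action' parameter to a library + handler sub
use CGI qw(param);

my $lib_dir = '/home/me/system/libs';                    # made-up path
my $action  = param('action') || 'main_menu';
($action)   = $action =~ /^(\w+)$/ or die "bad action";  # sanity-check input
require "$lib_dir/$action.lib";                          # load the matching library
no strict 'refs';
my $handler = "do_$action";                              # e.g. do_main_menu()
$handler->();                                            # call the appropriate sub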

The vast majority of 'on-screen' text is contained within arrays - most are multi-dimensional, with only a very few two-dimensional arrays or simple variables. The vast majority of images/icons used are assigned to individual variables, and some are held in arrays of their own.
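
To give a rough idea, a language file looks something like this (a simplified, hypothetical example - not the real file):

# langs/eng/menus.lib -- hypothetical, simplified example
@menu_titles = ('Main Menu', 'User Config', 'Reports');
@help_text   = (
    ['Welcome to the admin system.', 'Choose an option below.'],
    ['Change your user settings here.', 'Click Save when done.'],
);
$img_logo = '<img src="/icons/logo.gif" alt="Logo">';   # as in images.lib
1;   # a required file must return a true value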

With the exception of 'images.lib' (which is required within the index.cgi file itself), each language file is 'required' as and when needed, though this 'break-down' system isn't used 100% of the time (more like 85%). Also, the whole thing is template-based, opening various *.tmpl files according to the 'action' and then doing a

$line =~ s/<THIS>/<THAT>/g;

substitution for each line within the tmpl file.
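
In full, that pass boils down to something like this (a simplified sketch - the placeholder names are made up):

# hypothetical sketch of the template-filling loop
open(TMPL, "< $tmpl_dir/$action.tmpl")
    or die "can't open $action.tmpl: $!";
my $html = '';
while (my $line = <TMPL>) {
    $line =~ s/<USER_NAME>/$user_name/g;    # made-up placeholder
    $line =~ s/<PAGE_TITLE>/$page_title/g;  # made-up placeholder
    $html .= $line;
}
close TMPL;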

The odd thing that is happening (the problem appears to be affecting 2 separate Admin systems hosted on different servers) is that certain 2nd-level 'actions' are being executed correctly - as they should be - but others are not. I can't assess whether those that aren't working are more processor-intensive than the ones that are working correctly (can't = don't know how to, or even if I can find out!).

No 500 errors, no 404's, none of the usual errors occur - except when I've caused a fatal error for self re-assurance! - the system just times-out. Even down to the modem icons; no data is being exchanged.

In my system, it works if I have fewer variables set in the 'images.lib' file (which normally contains +/- 100 variables, each set to an HTML IMG tag), yet my client's system was working as of last week. Now, without my having made any changes to that system at all, 2 of the 2nd-level menu actions have the same problem.

My host says they haven't made any changes to their servers, and from my initial tests (using my system) it appears to point towards a server processor and/or memory issue (I think).

Is there any way of finding out how 'hungry' my scripts are? Is there a limit (in Perl internals) to the number of variables and/or arrays in memory at any one time? Is there a limit on the number of processes at any one time? What constitutes a 'big' program - 100 vars, 1,000 vars, 5MB/50MB in memory ?!?

I've been looking through the Perl Camel book and nothing jumps out at me. I'd like to solve this one, 'cos my system is worthless at the moment, and I have a 3rd system based on the 1st & 2nd and am waiting for that to go south too!

Can anyone shed some light as to the possible causes? Pointers in the right direction won't go unwelcomed either! ;-) I also hope I've given a reasonable explanation for others to understand.

Looking forward to any help whatsoever.

Kind regards
Richard.

Replies are listed 'Best First'.
Re: Possible Server Resources problem
by tachyon (Chancellor) on Sep 10, 2004 at 00:37 UTC

    Perl has essentially no issues with data structure size. A 'typical' perl process does use quite a lot of memory. 10-40MB would not be unusual and you can go a lot larger.

    If you are not logging any errors you need to get some debugging happening, otherwise you are working in the dark. The simplest method is to add $DEBUG && warn "Got to here!\n" type statements all through your code and set $DEBUG=1 at the top of the script to make it active. At that point you can see exactly where things grind to a halt.

    Your mention of 5/50MB of memory looks like a quote from a shared server brochure. Shared servers limit use of resources in many ways and can certainly throttle your processes. If you have shell access, fire up top and watch it as you make a request.

    cheers

    tachyon

      tachyon - Thanks for your prompt reply - I'm currently still working on the problem (4.10am here)!!!

      I doubt that I'm even approaching 10MB of memory - if one can go by filesize * no. of files 'required' etc... at least as a starting point?!? But thanks for the figures - at least now I know what ballpark we're talking about re: memory usage! :-)

      Re: debugging - I have included a &my_err() sub to sort-of 'trap' errors, but only for opening files and requiring files, nothing else. However, it certainly does the job.
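
      (Roughly - a simplified sketch, not the exact code:)

      sub my_err {
          my ($msg) = @_;
          print "Content-type: text/html\n\n";
          print "<html><body><h1>Error</h1><p>$msg</p></body></html>\n";
          exit;
      }

      open(TMPL, "< $file") or &my_err("Can't open $file: $!");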

      But the prob I'm having doesn't even get that far! It's nothing to do with headers not being sent either. The whole thing just freezes! Almost as if the program were in some sort of loop - but I can't see where a loop might be! And all I'm trying to do is append html output to a scalar variable for s///g-ing later in the script.

      The "Gets to here" bit I do in my sleep! (I've learnt the hard way many a time b4! ;-)

      Shared server would be correct, although it was not a quote! I have no idea how many other sites are using the same server, and the host (Powweb) say that they have not changed their server config, nor do they allow their clients shell access. And, understandably, since this is a script problem, I'm not expecting them to offer support (or classes in Perl programming, in this case!).

      So, apart from using a debug sub and following the 'flow' of data by putting "Ooh look! I managed to get this far" in the script, is there anything else I might be able to try or am I destined to go thru 1,000's of lines of code, doing CTRL-V and getting my script to go "Boo!" on-screen?!?

      My 'just as important' problem is to find out what's happening so I can temporarily plug the hole, in order to start selling this app, which'll give me time to rewrite it completely in a far more 'professional' way. At the moment, I just can't afford another rewrite (which'll probably take > 2-3 months!!!).

      Any further suggestions are more than welcome! Please!!!

      kind regards and big thanks in anticipation...

      Richard.

        I doubt that I'm even approaching 10MB of memory - if one can go by filesize * no. of files 'required' etc... at least as a starting point?!?

        You can't make that assumption at all. Hello World in Perl takes over 1MB although it is obviously only one line of Perl. I can use *all* your system's resources with just one 12 char line of perl. Perl is a memory pig in many ways. This reflects the optimisation strategy of spending memory to gain speed.
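
        For example, a one-liner along these lines (an illustration, not necessarily the exact line meant above) will grow a string forever and chew through every byte of RAM it can get - don't run it anywhere you care about:

        # WARNING: deliberately consumes unbounded memory - illustration only
        perl -e '$x.=1while 1'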

        Re: debugging - I have included a &my_err() sub to sort-of 'trap' errors, but only for opening files and requiring files, nothing else. However, it certainly does the job.

        You are totally missing the point. You need to see where your code gets to, so you add the debugging as suggested to do that. BTW, you don't need the & if you call function().

        $DEBUG = 1;
        $DEBUG && warn "Doing foo!\n";
        foo();
        $DEBUG && warn "Done foo!\n";
        # etc

        The whole thing just freezes!

        It most certainly does not. You just don't know where it stops, which is somewhere between starting and completing :-) That is why you need the debugging flow messages. The messages appear in the error log. See the CGI Help Guide.

        So, apart from using a debug sub and following the 'flow' of data by putting "Ooh look! I managed to get this far" in the script, is there anything else I might be able to try or am I destined to go thru 1,000's of lines of code, doing CTRL-V and getting my script to go "Boo!" on-screen?!?

        You don't seem to get it. You *need* to add the debug code. If $DEBUG is set, it writes messages to the error log for you to review. If $DEBUG=0 it does nothing and the code runs as usual. Thus you can put the code in and leave it. You don't even know where your code gets to, so you have no idea what the problem is. I have not seen your code, nor do I want to. All I can offer is the logic used by millions of programmers across the world. Don't speculate. Isolate the problem first, then fix it.

        But there is a shortcut of sorts. If you don't mind going through reams of output, then install Devel::Trace. In your cgi, activate it as shown below. Every *line* that is executed will then generate a logging message in the error logs.

        #!/usr/bin/perl -d:Trace

        Devel::Trace::trace('on');   # Enable

        print "Hello World\n";

        foo();

        Devel::Trace::trace('off');  # Disable

        foo();

        sub foo{ print "Foo\n" }

        Make sure this sample runs first, then just add -d:Trace to the shebang +/- Disable/Enable it in sections. Here is what happens when I run that and redirect stderr to a file.

        C:\>test.pl 2>out
        Hello World
        Foo
        Foo

        So there is all the expected output, no more no less. Situation normal....but if we look in file 'out' (this will automatically be error_log with a CGI) we have a complete stack trace:

        C:\>more out
        >> C:\test.pl:3: Devel::Trace::trace('on'); # Enable
        >> D:/Perl/site/lib/Devel/Trace.pm:31: my $arg = shift;
        >> D:/Perl/site/lib/Devel/Trace.pm:32: $arg = $tracearg{$arg} while exists $tracearg{$arg};
        >> D:/Perl/site/lib/Devel/Trace.pm:33: $TRACE = $arg;
        >> C:\test.pl:5: print "Hello World\n";
        >> C:\test.pl:7: foo();
        >> C:\test.pl:13: sub foo{ print "Foo\n" }
        >> C:\test.pl:9: Devel::Trace::trace('off'); # Disable
        >> D:/Perl/site/lib/Devel/Trace.pm:31: my $arg = shift;
        >> D:/Perl/site/lib/Devel/Trace.pm:32: $arg = $tracearg{$arg} while exists $tracearg{$arg};
        >> D:/Perl/site/lib/Devel/Trace.pm:33: $TRACE = $arg;

        C:\>

        cheers

        tachyon

Re: Possible Server Resources problem
by graff (Chancellor) on Sep 10, 2004 at 02:43 UTC
    You said:
    In my system, it works if I have fewer variables set in the 'images.lib' file (which normally contains +/- 100 variables, each set to an HTML IMG tag), yet my client's system was working as of last week. Now, without my having made any changes to that system at all, 2 of the 2nd-level menu actions have the same problem.
    So, does this mean you have two systems running, and both are showing the same (or basically similar) behavior? And both are running on the same server?

    When you say "not having made any changes..." do you mean just the code is unchanged, or that both code and (quantity of) data are unchanged?

    Some possible strategies for diagnosis might depend on what type of server it is (windows or unix, with some minor details that might depend on what type of unix) -- though tachyon's advice would apply equally on all systems.

    If the server is unix, you could try supplementing the simple "Got to here" messages with some "ps" output -- e.g. with a BSD-style "ps", the following would give you information on your process size, what memory limits are in effect, and what percentage of memory you're using:

    my $ps = `ps -p $$ -o vsz,rsz,lim,%mem`;
    Write $ps to a log file together with some indicator of where you are in the script, and do this at strategic points. (You'll want to consult the appropriate "ps" man page for correct options and meanings of reported values -- these vary with the unix flavor.) I don't know what sort of method would be available on a windows server to do the same thing, unless a GNU/cygwin/other port of "ps" is available...
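
    A minimal sketch of that idea, assuming a unix host and a writable log location (the path and checkpoint labels are made up):

    # hypothetical sketch: log process size at labelled checkpoints
    sub log_mem {
        my ($label) = @_;
        my $ps = `ps -p $$ -o vsz,rsz,lim,%mem`;       # options vary by unix flavor
        open(LOG, ">> /tmp/myapp_mem.log") or return;  # made-up path
        print LOG "[$label]\n$ps\n";
        close LOG;
    }

    log_mem('after loading images.lib');
    log_mem('before template substitution');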