in reply to opcode caching / speed-up app

The perlperf manual page has some advice.

In my opinion, the most important part is to profile first, so that you know where the slow parts are. Devel::NYTProf is the best profiler I've ever used, and I can highly recommend it.
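
For reference, here's a minimal sketch of a typical NYTProf run (the script name is a placeholder; adjust to your setup):

    # profile one run of the script; writes nytprof.out
    perl -d:NYTProf yourscript.pl

    # turn nytprof.out into an HTML report and open it
    nytprofhtml --open

For a CGI, one way is to set PERL5OPT=-d:NYTProf in the web server environment and NYTPROF=addpid=1 so each request writes its own output file.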

If your application is backed by a database, it might also be worth investigating whether any indexes are missing. Most database servers have the capability to log slow queries, which you should use and act upon.
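
As an illustration (assuming MySQL; the table and column names below are made up), the slow query log is enabled in the server config, and the logged queries then tell you where an index is needed:

    -- in my.cnf:
    --   slow_query_log      = 1
    --   long_query_time     = 1
    --   slow_query_log_file = /var/log/mysql/slow.log

    -- add an index for a column the logged queries filter or join on:
    CREATE INDEX idx_orders_customer_id ON orders (customer_id);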

Re^2: opcode caching / speed-up app
by rpike (Scribe) on Sep 14, 2013 at 11:48 UTC
    Thanks moritz. I tried NYTProf for the first time two days ago and it seemed pretty good. The 'database' is actually text files: there are 3-4 XML documents used within the application, which XML::Simple opens and parses. I appreciate the response.
      So it sounds like you're running a CGI that gets compiled every time and also loads and parses its entire data set every time. That may well be the problem, and something like SQLite could bring a big boost there. And/or, as others mentioned, FastCGI.
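
      As a rough sketch of that idea (file, table and element names here are invented, and it assumes the XML maps to flat records): parse the XML once in an import step, then let the CGI query SQLite instead of reparsing everything on each request.

          #!/usr/bin/perl
          use strict;
          use warnings;
          use XML::Simple qw(XMLin);
          use DBI;

          # one-off import: parse the XML and load it into SQLite
          my $dbh = DBI->connect('dbi:SQLite:dbname=app.db', '', '',
                                 { RaiseError => 1, AutoCommit => 0 });
          $dbh->do('CREATE TABLE IF NOT EXISTS items (id TEXT PRIMARY KEY, name TEXT)');

          my $data = XMLin('items.xml', ForceArray => ['item'], KeyAttr => []);
          my $ins  = $dbh->prepare('INSERT OR REPLACE INTO items (id, name) VALUES (?, ?)');
          $ins->execute($_->{id}, $_->{name}) for @{ $data->{item} };
          $dbh->commit;

          # in the CGI, a request then runs a query instead of reparsing the XML:
          my $some_id = 'A42';    # whatever the request asked for
          my ($name)  = $dbh->selectrow_array(
              'SELECT name FROM items WHERE id = ?', undef, $some_id);

      Under FastCGI the DBI handle (and any prepared statements) can additionally be kept alive between requests, which is where much of the remaining win comes from.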