in reply to snakes and ladders

...or I have to find an efficient solution to the compiler problem.

If parsing the document is slow, then save the results. After you have parsed the document, store the "compiled" version on disk. Next time a request for the same document comes in, look and see if it has changed since the last time you "compiled" it. If it's newer, then recompile (of course) but if nothing has changed then use the pre-compiled version. You should get a good performance boost.
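The caching scheme above can be sketched in a few lines of Perl. This is a minimal illustration, not the poster's actual code: `parse_document` is a hypothetical stand-in for whatever slow parsing the application does, and the `.cache` file naming is an assumption. `Storable` from the core distribution handles serializing the compiled structure to disk.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Storable qw(store retrieve);

# Return the compiled form of $source, recompiling only when the
# source file is newer than the cached copy on disk.
sub get_compiled {
    my ($source) = @_;
    my $cache = "$source.cache";    # assumed naming scheme

    # (stat ...)[9] is the file's mtime; reuse the cache only if it
    # was written at or after the source's last modification.
    if ( -e $cache && ( stat $cache )[9] >= ( stat $source )[9] ) {
        return retrieve($cache);
    }

    my $compiled = parse_document($source);    # the slow step
    store( $compiled, $cache );                # save for next time
    return $compiled;
}

# Hypothetical placeholder for the expensive parse/compile phase.
sub parse_document {
    my ($source) = @_;
    open my $fh, '<', $source or die "Can't read $source: $!";
    local $/;
    return { text => scalar <$fh> };
}
```

On the first request the document is parsed and the result stored; on later requests the `stat` comparison short-circuits straight to `retrieve`, which is where the performance boost comes from.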

...but only if compilation is actually your performance bottleneck. You could take a look at Devel::NYTProf - a code profiler for Perl - which will help you see which parts of your code are taking longer than others. It takes some tinkering to master, but unless you use it (or another good profiler like the venerable Devel::DProf) you might only be guessing at the source of your performance issues.
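For reference, the usual two-step invocation looks like this (`yourscript.pl` is a placeholder for the script being profiled):

```shell
# Run the script under the profiler; this writes ./nytprof.out
perl -d:NYTProf yourscript.pl

# Convert the raw profile into browsable HTML under ./nytprof/
nytprofhtml
```

Open the generated `nytprof/index.html` in a browser to see per-subroutine and per-line timings.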

Replies are listed 'Best First'.
Re^2: snakes and ladders
by Logicus (Initiate) on Aug 26, 2011 at 05:49 UTC

    Yah man, the difficulty I'm finding is in writing the compiler. As Larry says, programs that write programs are happy programs, and I like that concept a lot. It really doesn't matter how long the compiler takes to run, as it will only be run once per source-code update.

    I think I've also found a much faster way to parse the documents which I'm also working on.

    I would like to end up with an aXML > PSGI compiler such that even the noobiest of nooby noobs who ever did any noobying around a Perl script can use it to write simple declarative apps and enjoy immense performance out of the box. It might be a while before that happens, and something else might come along in the meantime, but yeah, that's where this seems to be going atm.