Another thought... sounds like you are in the middle of a maintenance problem. Not sure what kind of application could generate that many macros! But anyway, you might consider converting a bunch of these things into "inline" subroutines. That yields the performance of a macro but with type checking etc. on the args. Lots of compilers (including gcc) support this. I presume you are driven to write this code because it's hard to figure out what these macros are really doing! Maybe a bigger project that "fixes" the source code is in order? I have worked on projects before where Perl actually writes code and .h files as part of a pre-compile step - it's a weird idea, but in the right app it can work. 13,000 macros is a mind-boggling number - even on a big ASM project!
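Just to make the code-generation idea concrete, here is a toy version of that pre-compile step: a little Perl script that rewrites trivial one-argument #define macros as static inline functions in a generated header. The file names and the int-only type assumption are made up for the sketch; a real converter for 13,000 macros would need a lot more smarts.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Toy pre-compile step: rewrite trivial one-argument #define macros
    # as static inline functions in a generated header. 'macros.h',
    # 'macros_inline.h' and the int-only assumption are all made up.

    my $src = 'macros.h';
    my $out = 'macros_inline.h';

    open my $in,  '<', $src or die "Can't read $src: $!";
    open my $gen, '>', $out or die "Can't write $out: $!";

    print {$gen} "/* generated - do not edit by hand */\n";

    while ( my $line = <$in> ) {
        # e.g.  #define SQR(x) ((x)*(x))
        if ( $line =~ /^\s*#\s*define\s+(\w+)\(\s*(\w+)\s*\)\s+(.*\S)/ ) {
            my ( $name, $arg, $body ) = ( $1, $2, $3 );
            print {$gen} "static inline int $name(int $arg) { return $body; }\n";
        }
        else {
            print {$gen} $line;    # pass everything else through untouched
        }
    }

    close $in;
    close $gen;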
I have found that using some of the regex special variables like $+ can slow things down a lot - some of these things can introduce extra overhead. Sorry, I can't find a code example from the last case where I ran into this.
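When I suspect something like that, I put the candidate patterns head to head with the core Benchmark module on a sample of the real input. The pattern and sample line below are made up; drop in one of your actual macro-matching regexes and some real lines from your data.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Benchmark qw(cmpthese);

    # Made-up sample line; feed it real lines from your input instead.
    my $line = '#define MAX_LEN(a, b) ((a) > (b) ? (a) : (b))';

    cmpthese( -2, {                  # run each variant for ~2 CPU seconds
        capturing => sub {
            my @hit = $line =~ /#\s*define\s+(\w+)\s*\(([^)]*)\)/;
        },
        non_capturing => sub {
            $line =~ /#\s*define\s+\w+\s*\([^)]*\)/;
        },
    } );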
The most likely suspect is the regex-related code. I would use a subset of the test data to try to find out: a) whether execution time scales linearly or exponentially, and b) whether, by hacking around, you can find some types of macros that take WAY longer than others. Sorry that I can't be of more help right now.
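For the scaling question, a quick and dirty timing harness like this is usually enough; analyse_defines() and all_macros.txt are just placeholders for whatever your real entry point and input file are.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Time::HiRes qw(gettimeofday tv_interval);

    # Placeholder stub so the sketch runs; swap in your real analyser.
    sub analyse_defines { my ($lines) = @_; return scalar @$lines }

    # 'all_macros.txt' is a placeholder for your real input file.
    open my $fh, '<', 'all_macros.txt' or die "Can't open input: $!";
    my @lines = <$fh>;
    close $fh;

    # Time the analyser on 1/8, 1/4, 1/2 and all of the data and see
    # whether runtime grows roughly in proportion or blows up.
    for my $fraction ( 8, 4, 2, 1 ) {
        my $n      = int( @lines / $fraction );
        my @subset = @lines[ 0 .. $n - 1 ];

        my $t0 = [gettimeofday];
        analyse_defines( \@subset );
        printf "%6d lines: %.2f seconds\n", $n, tv_interval($t0);
    }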
In reply to Re^2: define analyser - performance problem by Marshall
in thread define analyser - performance problem by grizzley