in reply to Re: Tokenizer / Lexical Analyzer for parsing Wiki
in thread Tokenizer / Lexical Analyzer for parsing Wiki

That is a rather good point. But I know that Text::WikiFormat is not very speed-oriented, so I think I can do quite a bit better speed-wise in Perl. My entire Wiki is in Perl too.

Currently looking at Parse::Lex.
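
For what it's worth, here is a rough sketch of what a Parse::Lex pass over some wiki-ish markup might look like, patterned on the module's synopsis. The token names and regexes are invented for illustration, not my Wiki's actual markup rules:

    use Parse::Lex;

    # Illustrative token set only -- the names and regexes are made up.
    my @tokens = (
        BOLD    => q{'''},
        ITALIC  => q{''},
        HEADING => q{=+},
        LINK    => q{\[\[[^\]]+\]\]},
        NEWLINE => q{\n},
        TEXT    => q{[^'=\[\n]+},
        ERROR   => q{.},              # catch-all so the lexer never stalls
    );

    my $lexer = Parse::Lex->new(@tokens);
    $lexer->from("== A heading ==\nSome '''bold''' text and a [[WikiLink]].\n");

    while (1) {
        my $token = $lexer->next;
        last if $lexer->eoi;
        printf "%-8s %s\n", $token->name, $token->text;
    }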

Re^3: Tokenizer / Lexical Analyzer for parsing Wiki
by stvn (Monsignor) on Sep 02, 2004 at 16:32 UTC
    That is a rather good point. But I know that Text::WikiFormat is not very speed-oriented, so I think I can do quite a bit better speed-wise in Perl

    Is Text::WikiFormat a lost cause? Or do you think it could be sped up? If so, why not see whether the author is interested in having you attempt it?

    However, if you really feel Text::WikiFormat is a lost cause, then I would recommend writing your own lexer and tokenizer; that way you can be sure to squeeze out all the speed you need.
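
    To illustrate what I mean, a hand-rolled lexer doesn't have to be complicated. This is only a rough sketch with an invented token set and wiki syntax, but something along these lines is the general shape:

        use strict;
        use warnings;

        # Rough sketch of a hand-rolled tokenizer: walk the string with \G,
        # trying the most specific patterns first. Token names are invented.
        my @rules = (
            [ BOLD    => qr/'''/            ],
            [ ITALIC  => qr/''/             ],
            [ LINK    => qr/\[\[[^\]]+\]\]/ ],
            [ HEADING => qr/=+/             ],
            [ NEWLINE => qr/\n/             ],
            [ TEXT    => qr/[^'=\[\n]+/     ],
            [ CHAR    => qr/./              ],  # catch-all so we always advance
        );

        sub tokenize {
            my ($text) = @_;
            my @tokens;
            pos($text) = 0;
            RULE: while (pos($text) < length $text) {
                for my $rule (@rules) {
                    my ($name, $re) = @$rule;
                    if ($text =~ /\G$re/gc) {
                        push @tokens, [ $name, $& ];
                        next RULE;
                    }
                }
            }
            return @tokens;
        }

        printf "%-8s %s\n", @$_
            for tokenize("== Title ==\nSome '''bold''' text and a [[Link]].\n");

    Since you control the rule table, you can profile it against your real pages and reorder or tighten the regexes wherever the time actually goes.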

    -stvn