apprentice has asked for the wisdom of the Perl Monks concerning the following question:

I am the lone Perl coder in a QA group. In order to make the others more productive without having to force them to learn Perl (I can't figure out why they don't want to???), I want to write a test engine that reads from files they write using keywords and test data. An example would be:
    open browser <optional machine info>
    goto <some url>
    verify page <some info to verify>
    set form <field1=value1,field2=value2,...>
    submit form <optional button or image to click>
    verify page <some info to verify>
    etc...
Each keyword would map to a function or to a file containing other keywords, so that 'add user' might be used as part of multiple test files and the steps to 'add user' would be defined in its own test file.
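Roughly, the core I have in mind is a dispatch table mapping each keyword either to a sub or to another keyword file (untested sketch; all names here are invented for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Invented sketch: map each keyword to a code ref, or to another
# keyword file that itself contains more keywords.
my %dispatch = (
    'open browser' => sub { print "opening browser: @_\n" },
    'goto'         => sub { print "fetching: @_\n" },
    'add user'     => { file => 'add_user.test' },  # nested keyword file
);

sub run_line {
    my ($line) = @_;
    # Try longest keywords first, so 'open browser' wins over 'open'.
    for my $kw ( sort { length $b <=> length $a } keys %dispatch ) {
        next unless $line =~ /^\Q$kw\E\s*(.*)$/;
        my $handler = $dispatch{$kw};
        ref $handler eq 'CODE' ? $handler->($1) : run_file( $handler->{file} );
        return 1;
    }
    warn "unknown keyword in line: $line\n";
    return 0;
}

sub run_file {
    my ($file) = @_;
    open my $fh, '<', $file or die "can't open $file: $!";
    while ( my $line = <$fh> ) {
        chomp $line;
        next if $line =~ /^\s*(#|$)/;   # skip blanks and comments
        run_line($line);
    }
    close $fh;
}
```

The longest-match-first sort is what lets multi-word keywords like 'add user' coexist with shorter ones.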

Does anyone know of a module that handles this type of functionality, the parsing of the keyword-based file and mapping to other files and/or functions?

Alternatively, does anyone have any suggestions on how to go about finding the file that contains the sub functionality like 'add user'? One thing that would be nice is if the keyword module could also handle things like program flow control such as for and while loops, next, last, and if conditionals. I'm probably hoping for something that isn't out there, but I had to ask before tilting at it myself.

Thanks in advance for any info and/or ideas on this one!


"Peace, love, and Perl...well, okay, mostly just Perl!" --me

Apprentice

Replies are listed 'Best First'.
Re: Keyword parser function mapping module
by Corion (Patriarch) on Jan 06, 2004 at 15:14 UTC

    That sounds to me like testing/verifying a web application. If that is the case, WWW::Mechanize together with Test::HTML::Content might be a solution, as might the HTTP::WebTest kit.

    The scripts you'd use for either of the two are different from what you envision, though HTTP::WebTest comes close to your usage.

    As I am the author of Test::HTML::Content and WWW::Mechanize::Shell, I would extend WWW::Mechanize::Shell to accept some more commands for testing web pages, then generate Perl scripts from those command scripts, which Test::Harness would in turn run for me...
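    For the shape of it, a WWW::Mechanize version of the original keyword sequence might look like this (a sketch only: the URL, form fields, and expected page text are hypothetical placeholders, and the sub is not invoked here since it needs a live site):

```perl
use strict;
use warnings;
use WWW::Mechanize;
use Test::More;

# All URLs, field names, and expected strings below are hypothetical.
sub login_smoke_test {
    my ($base) = @_;    # e.g. 'http://test.example.com'

    my $mech = WWW::Mechanize->new( autocheck => 1 );

    $mech->get("$base/login");                                  # goto <some url>
    like( $mech->content, qr/Login/, 'login page loaded' );     # verify page

    $mech->submit_form(                                         # set form + submit
        fields => { field1 => 'value1', field2 => 'value2' },
    );
    like( $mech->content, qr/Welcome/, 'landed after submit' ); # verify page
}
```

Generated scripts of this shape are exactly what Test::Harness can then run in bulk.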

    The www-mechanize-dev mailing list on http://www.sourceforge.net has Mark Stosberg on it, who does such stuff.

Re: Keyword parser function mapping module
by dragonchild (Archbishop) on Jan 06, 2004 at 16:19 UTC
    I have been in your exact shoes, being the developer in a QA group. What we did was this:
    1. Create an event handler
    2. Each event is associated with a function (or some object/method pair ... your choice)
    3. The event handler chews through a list of pending events
    4. A given event can modify the list of pending events (so, LoadCommands would "Do The Right Thing")
    5. Create a starting event (like LoadCommands <filename>, or somesuch)

    You might want to look at POE. We rolled our own, but that was almost 4 years ago, when POE was still in its infancy. The other nice thing about event handlers is that simple scripting commands (SET, JZ, JNZ, etc.) are trivial to implement. Once you have JZ and JNZ, you can implement (almost) all of your basic logic-flow (if, while, for, etc.) in those terms. (They're actually implemented in those terms in most ASM flavors.)
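    As a concrete illustration of steps 1-5, here is a minimal event-queue sketch (invented names, not dragonchild's actual code): each event is a [NAME, @args] pair, and a handler may push new events onto the queue, which is exactly how a LoadCommands-style event would splice in a file's commands.

```perl
use strict;
use warnings;

# Pending-event queue; handlers are free to modify it.
my @queue;

my %handler = (
    SET  => sub { my ( $env, $k, $v ) = @_; $env->{$k} = $v },
    ECHO => sub { my ( $env, @w ) = @_; push @{ $env->{out} }, "@w" },
    LOAD => sub {    # stand-in for LoadCommands <filename>:
        my ( $env, @more ) = @_;
        # put the "loaded" commands at the front of the queue
        unshift @queue, map { [ ECHO => $_ ] } @more;
    },
);

sub run {
    my (@events) = @_;
    my %env = ( out => [] );
    @queue = @events;
    while ( my $ev = shift @queue ) {
        my ( $name, @args ) = @$ev;
        my $h = $handler{$name} or die "no handler for event '$name'";
        $h->( \%env, @args );
    }
    return \%env;
}
```

Because LOAD unshifts rather than pushes, loaded commands run before whatever was already pending, mimicking an include.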

    ------
    We are the carpenters and bricklayers of the Information Age.

    Please remember that I'm crufty and crochety. All opinions are purely mine and all code is untested, unless otherwise specified.

Re: Keyword parser function mapping module
by jonadab (Parson) on Jan 06, 2004 at 16:19 UTC
    require "testsuite.pl";
    open_browser(optional machine info);
    goto_url("http://some.url");
    verify_page(some info to verify);
    set_form(field1 => "value1", field2 => "value2");
    submit_form(optional button or image to click);
    verify_page(some info to verify);

    Seriously, they wouldn't need to learn Perl, per se, in order to write Perl scripts that do nothing but call functions that you've predefined. Learning this syntax wouldn't be significantly harder than learning the syntax you laid out, and it saves you the trouble of writing a parser; all you have to do is write the procedures.

    I've used a system comparable to this successfully. My sister, an end user, writes quizzing questions using my quizques mode in Emacs. She doesn't do anything else in Emacs and does not know *any* of the standard Emacs stuff (most notably, she does not know about Ctrl-C anything, Ctrl-X anything, or M-x anything). She just uses the mode that I set up. You can do the same thing with Perl. Write your code so that they don't have to do anything but call your prefab procedures.
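    A minimal sketch of that approach (hypothetical names throughout; note that `goto` itself is a Perl builtin, so the prefab procedure needs a name like goto_url):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical "prefab procedures" package; testers write nothing
# but straight-line calls to these.
package TestSuite;

my %state = ( page => '' );

sub open_browser { $state{browser} = shift || 'default'; return 1 }
sub goto_url     { $state{page} = "fake content for $_[0]"; return 1 }   # stub fetch
sub set_form     { my %f = @_; $state{form}{$_} = $f{$_} for keys %f; return 1 }
sub verify_page  { return $state{page} =~ /\Q$_[0]\E/ ? 1 : 0 }

package main;

# What a tester's "script" would look like:
TestSuite::open_browser('machine1');
TestSuite::goto_url('http://some.url/');
print TestSuite::verify_page('some.url') ? "ok\n" : "not ok\n";   # prints "ok"
```

In real use the procedures would live in their own .pm file and use Exporter, so the tester's script is nothing but bareword calls.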


    $;=sub{$/};@;=map{my($a,$b)=($_,$;);$;=sub{$a.$b->()}} split//,".rekcah lreP rehtona tsuJ";$\=$ ;->();print$/
      Writing the parser is easier in the long run, because it allows the testers to define their own "functions", so to speak. And, if the event handlers (which are your functions) are well defined (and OO, if possible), then enterprising QA people can overload their own events. :-)


        Writing the parser is easier in the long run, because it allows the testers to define their own "functions"

        Could you clarify what you mean by this? The only meaning I can make out for it requires them to learn Perl. I assume that's not what you meant, since it was expressly stated they don't want to do that, and also because your use of quotation marks around "functions" seems to indicate a different meaning, but I'm not sure what. How could non-programmers define their own "functions"?


Re: Keyword parser function mapping module
by adrianh (Chancellor) on Jan 06, 2004 at 15:29 UTC

    You might want to have a look at using a FIT testing framework.

    It's a different approach from the one you're proposing, but I've found it useful for dealing with non-coding testers.

    I rolled my own Perl variant some time ago that's too ugly for public consumption. However, since then Test::FIT has arrived on CPAN along with a Perl version of the original, although I've not yet played with either myself in any serious way.

Re: Keyword parser function mapping module
by muenalan@cpan.org (Initiate) on Jan 07, 2004 at 10:38 UTC
    Parse::RecDescent is the universal response to everyone needing a fast implementation of *yet another*(tm) mini-language.

    BEGINNER
    I would translate it into code employing some CPAN module mentioned above.

    ELITE
    I would translate it into xml and then translate that again into code (with XSLT) employing some CPAN module mentioned above. *Yeah, throw your eggs on me*

    Cheers, Murat
      Why on earth would you want the overhead and complication of Parse::RecDescent for a simple macro language?? Gods, NO!

      What's wrong with something like this:

      sub parse_commands {
          my ($filename) = @_;

          my $fh = IO::File->new($filename) || die ...;

          my @commands;
          while (<$fh>) {
              # All commands begin with a letter, which must be the first character.
              next if /^[^A-Za-z]/;

              # A command is space-delimited and must be on one line.
              # We want to deal with the entire command in uppercase.
              my ($command, @line) = split /\s+/, uc;

              unless (exists $DispatchTable{$command}) {
                  warn "'$command' isn't a valid command.\n";
                  next;
              }

              push @commands, [$command, \@line];
          }

          return \@commands;
      }

      All of a sudden, you have a parsed list of commands. Obviously, you'd do something a little more in-depth when dealing with WHILE/ENDWHILE, IF/ELSE/ENDIF, SUB/ENDSUB, and FOR/ENDFOR, but it's not that much more complicated. It can even be a one-pass compiler if you

      • do all includes during the parsing phase (this requires reading the initial file first into an array for splicing)
      • require all subroutines be defined before usage (if SUB is even allowed)
      • require all labels to be on a line by themselves. (Something along the lines of /^([A-Z]\w*):$/)
      • don't care if a label is defined until runtime.

      If interested, I'll write one. Remember - the handler for each command is the one that cares about parameters and the like. This isn't a fully-featured language (though it is Turing-complete).
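      To show how JZ/JNZ-style flow falls out of this design, here is an invented runtime for the parsed [$command, \@args] list: a program counter walks the list, and jumps just reassign it (command names and the label syntax are my own placeholders):

```perl
use strict;
use warnings;

# Invented runtime sketch for a parsed [$command, \@args] command list.
sub run_commands {
    my ($commands) = @_;
    my ( %var, %label );

    # First pass: record label positions (here, LABEL <name> commands).
    for my $i ( 0 .. $#$commands ) {
        $label{ $commands->[$i][1][0] } = $i
            if $commands->[$i][0] eq 'LABEL';
    }

    # Second pass: execute, letting jumps move the program counter.
    my $pc = 0;
    while ( $pc <= $#$commands ) {
        my ( $cmd, $args ) = @{ $commands->[$pc] };
        if    ( $cmd eq 'SET' )   { $var{ $args->[0] } = $args->[1] }
        elsif ( $cmd eq 'DEC' )   { $var{ $args->[0] }-- }
        elsif ( $cmd eq 'JNZ' )   {     # jump if variable is non-zero
            if ( $var{ $args->[0] } ) { $pc = $label{ $args->[1] }; next }
        }
        elsif ( $cmd eq 'LABEL' ) { }   # no-op at runtime
        else                      { die "unknown command '$cmd'" }
        $pc++;
    }
    return \%var;
}
```

With only SET, DEC, and JNZ you already get countdown loops; IF/WHILE are then compile-time sugar that emits jumps.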
