http://qs1969.pair.com?node_id=1161311


in reply to Re^2: Bullish on Moose, how about you?
in thread Bullish on Moose, how about you?

The main reasons I like and use Moo, etc., are to share behavior in a larger application (like a Catalyst application where the model is needed by the controllers, the tests, and the command line) and to wrap up, normalize, and auto-document complex behaviors built from conjoined simple parts that would otherwise quickly become spaghetti; in other words, to make things like push and @{$object->{bunch_of_data}} semantic and less dense. This web spider stub is off the top of my head, so take it as a terse draft, but it shows the kinds of things I like to do:

package MooSpider;

use Moo;
use MooX::HandlesVia;
use Carp;

our $VERSION = "42";

# Queue of URIs discovered while crawling; the delegations give us
# semantic methods instead of raw array manipulation.
has "_pages" => (
    is          => "ro",
    handles_via => "Array",
    default     => sub { [] },
    handles     => {
        pages      => "elements",
        queue_page => "push",
        next_page  => "shift",
    },
);

# Lazily built user agent; anything that isa WWW::Mechanize will do.
has "agent" => (
    is  => "lazy",
    isa => sub {
        confess "Need a flavor of WWW::Mechanize"
            unless eval { $_[0]->isa("WWW::Mechanize") };
    },
    handles => [qw/ get post request links uri success response /],
);

sub _build_agent {
    require WWW::Mechanize;
    WWW::Mechanize->new(
        autocheck => 0,
        agent     => join("/", __PACKAGE__, $VERSION),
    );
}

# Fetch the given URI (or the next queued one) and queue every link found.
sub crawl {
    my $self = shift;
    return unless my $uri = shift || $self->next_page;

    if ( $self->get($uri) and $self->success ) {
        $self->queue_page( $_->url_abs ) for $self->links;
    }
    else {
        carp "Problem fetching ", $self->uri, $/, "Moving on to next link";
        $self->crawl;
    }
    $self->response;
}

1;

# Demo usage.
my $spider = MooSpider->new;
$spider->crawl("http://example.org");
print join($/, $spider->pages), $/;

$spider->crawl;
print join($/, $spider->pages), $/;
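The HandlesVia delegations are what keep the data handling semantic. A rough before/after sketch of the same page queue, where the plain-hash version and its $raw variable are hypothetical, shown only for contrast with the stub above:

use strict;
use warnings;

# Hypothetical raw-hash version of the page queue (no Moo, no delegation):
# every caller pokes at the internals directly.
my $raw = { _pages => [] };
push @{ $raw->{_pages} }, "http://example.org/a";
my $next_raw = shift @{ $raw->{_pages} };

# The MooSpider version, where the intent is in the method name:
my $spider = MooSpider->new;
$spider->queue_page("http://example.org/a");
my $next = $spider->next_page;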
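And because the behavior lives in a class rather than a script, the same code is reachable from a test file or a command-line wrapper without duplication. A minimal sketch of such a test, assuming the package above is saved as MooSpider.pm and WWW::Mechanize is installed; the URI is just a placeholder:

# t/moospider.t -- hypothetical test reusing the class directly
use strict;
use warnings;
use Test::More;

use_ok("MooSpider");

my $spider = MooSpider->new;
isa_ok( $spider->agent, "WWW::Mechanize", "lazily built agent" );

$spider->queue_page("http://example.org/");
is( ( $spider->pages )[0], "http://example.org/", "queue_page/pages round trip" );
is( $spider->next_page,    "http://example.org/", "next_page shifts the queue" );

done_testing();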