I recently had a similar issue with hash references and subroutine recursion, and explored two solutions:
- The first solution was to use a single blessed hash that is passed through each recursive call of the subroutine, so all state lives in one place. An example of this approach can be seen in Local::SiteRobot, a simple web-crawling module; in particular, have a look at the _crawl subroutine, where this technique is employed.
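To make the first approach concrete, here is a minimal sketch of threading one blessed hash through a recursive method. This is not the actual Local::SiteRobot code; the package name Tiny::Crawler, the _links_of stub, and the seen/pages keys are all illustrative.

```perl
package Tiny::Crawler;
use strict;
use warnings;

sub new {
    my ($class, %args) = @_;
    # One blessed hash carries all state (visited set, depth limit, results)
    my $self = { seen => {}, max_depth => $args{max_depth} // 2, pages => [] };
    return bless $self, $class;
}

sub _crawl {
    my ($self, $url, $depth) = @_;
    return if $depth > $self->{max_depth};
    return if $self->{seen}{$url}++;        # state survives across recursion
    push @{ $self->{pages} }, $url;
    for my $link ( $self->_links_of($url) ) {
        $self->_crawl($link, $depth + 1);   # same blessed hash in every call
    }
}

# Stub for illustration: a real crawler would fetch the page and extract hrefs.
sub _links_of {
    my ($self, $url) = @_;
    my %fake = ( a => ['b', 'c'], b => ['a', 'd'] );
    return @{ $fake{$url} || [] };
}

package main;
my $bot = Tiny::Crawler->new;
$bot->_crawl('a', 0);
print "@{ $bot->{pages} }\n";   # prints "a b d c"
```

Because the hash is blessed once and only a reference is passed around, no copies of the visited set are made at any recursion depth.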
- The second solution involved shifting from recursion-based iteration through the elements to stack-based iteration. This meant that the recursion, deemed most evil depending on whom you talk to, could be replaced in its entirety. An example of such code can be seen in the File::Find module shipped with Perl 5.6 and later (which can be compared with the recursion-based File::Find module shipped with earlier distributions).
An example of stack-based code may look like this:
while (@list) {
    my $element = shift @list;
    # ... process $element, collect @subelements ...
    splice(
        @list,
        $depth_first_processing ? 0 : scalar @list,
        0,
        @subelements,
    );
}
A quick note on this code: the variable $depth_first_processing controls whether new sub-elements are appended to the end of the list (breadth-first iteration) or inserted at the front of the list, so that they are processed on the very next pass through the loop (depth-first iteration). Note also that the work list is consumed with a while/shift loop rather than foreach, since splicing an array that foreach is iterating over produces unpredictable results.
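The two orderings can be seen side by side with a small runnable example. The %children tree and the walk helper below are illustrative, not from WWW::SimpleRobot; the only technique demonstrated is the splice-based work list described above.

```perl
use strict;
use warnings;

# Hypothetical tree: each node maps to its list of children.
my %children = ( root => ['a', 'b'], a => ['a1', 'a2'], b => ['b1'] );

# Walk the tree with an explicit work list instead of recursion.
sub walk {
    my ($depth_first_processing) = @_;
    my @list = ('root');
    my @visited;
    while (@list) {
        my $element = shift @list;
        push @visited, $element;
        my @subelements = @{ $children{$element} || [] };
        # Offset 0 = front of list (depth-first); scalar @list = end (breadth-first)
        splice @list, ($depth_first_processing ? 0 : scalar @list), 0, @subelements;
    }
    return @visited;
}

print join(' ', walk(1)), "\n";   # depth-first:   root a a1 a2 b b1
print join(' ', walk(0)), "\n";   # breadth-first: root a b a1 a2 b1
```

The same loop body handles both traversals; only the splice offset changes, which is what makes this structure attractive as a drop-in replacement for recursion.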
This structure of stack-based iteration has been incorporated into the current version of WWW::SimpleRobot.
perl -e 's&&rob@cowsnet.com.au&&&split/[@.]/&&s&.com.&_&&&print'