I just get to sit here and make executive decisions about the code like, "Well, since I am controlling all the input and output of this program, and I am controlling all the changes to the XML file, I'll just write a simple handler to accommodate what I know is going to be in the XML file," and be done with it.
To which I say "Amen, Brother!" Based on the very tidy and fairly simple XML sample in your original post, I don't see a problem with writing a "tightly-bound" (i.e. ad-hoc) "parser" in a dozen or so lines of perl -- the point being to get the job done with minimal fuss (including, mainly, minimal fuss with the folks who are paying for this job). What this really means is that you just need to be very careful about testing the script that creates this XML stream, to make sure its output always meets the constraints assumed by the downstream "parser" script.
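For what it's worth, that "be very careful about testing" step doesn't need to be anything fancy either. Here's a rough sketch of the sort of sanity check I have in mind -- it reuses the same file name and tag list as the reader below, so treat it as an illustration rather than a drop-in tool:

    open( CHK, "source_of_xml.data" ) or die "can't open: $!";
    {
        local $/ = "</item>\n";   # same record separator as the reader below
        my $n = 0;
        while (<CHK>) {           # one <item>...</item> record per pass
            $n++;
            for my $tag (qw/name working uptime downtime/) {
                warn "record $n is missing <$tag>\n"
                    unless m{<$tag>.*?</$tag>}s;
            }
        }
    }
    close CHK;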
So what's wrong with that? If you really are creating the XML stream as well as processing it -- and if the data structure is really as flat as your example makes it out to be -- then you really don't need an XML parsing module.

Assuming that you can manage the quality of the XML stream as it's being created, then something like the following would probably fill the bill for reading that stream:

    open( XML, "source_of_xml.data" ) or die "I died 'cuz $!";
    {
        local $/ = "</item>\n";
        my %item;
        while (<XML>) {   # read one whole <item>...</item> into $_
            for my $tag (qw/name working uptime downtime/) {
                ($item{$tag}) = m{<$tag>(.*?)</$tag>}s;
                # (leave off "s" if tags are always fully contained on one line)
            }
            # now, do what you want with %item...
        }
    }
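(Just to be explicit about what that snippet expects: each record would look something like the following -- my guess at the shape of the data based on the sample described in this thread, with made-up values. Note that setting $/ to "</item>\n" relies on each closing </item> tag ending its own line.)

    <item>
      <name>box01</name>
      <working>1</working>
      <uptime>12345</uptime>
      <downtime>67</downtime>
    </item>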
In essence, you seem to be using XML simply as a means of "embellishing" (reformatting) a flat table, and there's no need for a hefty, C-compiled module to handle that.
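To make that concrete (purely as an illustration -- the tab-separated output is my own invention, not something from the thread), the "do what you want with %item" step in the loop above could flatten each record right back out with a plain hash slice:

    # inside the while loop, once %item has been filled in:
    print join( "\t", @item{qw/name working uptime downtime/} ), "\n";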