I wrote some code today to solve a co-worker's problem. Apparently a client was concatenating a series of XML files into one big file to send via email (or some other numbskull reason), and then needed them split up again afterwards for processing.
Here's what I came up with. Since TIMTOWTDI, what would be the Better/Stronger/Faster/SHORTEST way to accomplish the same thing?
    use strict;

    my @input_file      = <>;
    my $filecount       = 0;
    my $filename_prefix = "outfile_";
    my $file_is_open    = 0;
    my $out_name;

    foreach my $in (@input_file) {
        # An XML declaration marks the start of the next document
        if ($in =~ m/^<\?xml version/) {
            if ($file_is_open) {
                close(OF) or die "Couldn't close $out_name!!\n";
                $file_is_open = 0;
            }
            ++$filecount;
            $out_name = $filename_prefix . $filecount . ".xml";
            open(OF, "> $out_name") or die "Couldn't open $out_name for writing!!\n";
            print(STDOUT "Writing to $out_name.\n");
            $file_is_open = 1;
        }
        print(OF $in) if $file_is_open;
    }
    close(OF) if $file_is_open;
My first thought was to use perl's -n switch to put it into looping mode, but I haven't gone there yet... Any other brilliant ideas? I find I learn more quickly when other people try to tackle the same problem as I do and we can share insights.
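Something like this one-liner is roughly what I was picturing (completely untested; "concatenated.xml" is just a stand-in for the real input file, and it keeps the same outfile_ prefix as above):

    perl -ne 'if (/^<\?xml version/) { open OF, ">", "outfile_" . ++$n . ".xml" or die $! } print OF $_ if $n' concatenated.xml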
Replies are listed 'Best First'.
Re: XML File Slicing Golf(ish)
  by mirod (Canon) on Jun 20, 2001 at 01:58 UTC
    by ZZamboni (Curate) on Jun 20, 2001 at 03:46 UTC
(tye)Re: XML File Slicing Golf(ish)
  by tye (Sage) on Jun 20, 2001 at 18:13 UTC
Re: XML File Slicing Golf(ish)
  by ZZamboni (Curate) on Jun 20, 2001 at 03:39 UTC
Re: XML File Slicing Golf(ish)
  by Abigail (Deacon) on Jun 20, 2001 at 01:55 UTC
    by gregor42 (Parson) on Jun 20, 2001 at 18:10 UTC