Dr Manhattan has asked for the wisdom of the Perl Monks concerning the following question:
Hi all
I am having some problems reading through a bunch of XML files in a directory and processing them. I am using File::Find and File::Copy. The problem seems to be that only one file is processed, and then the script stops before getting to the others.
```perl
#!/usr/bin/perl
use strict;
use warnings;
use utf8;
use XML::Twig;
use Data::Dumper;
use File::Find;
use File::Copy;
use Cwd;

my $dir = getcwd;
my %size;
my %hash;

find(sub { $size{$File::Find::name} = -s if -f; }, $dir);

foreach my $files (keys %size) {
    if ($files =~ /(.*)(\.xml)$/) {
        $hash{$1}++;
    }
}

#foreach my $x (keys %hash)
#{
#    print "$x\n";
#}

foreach my $file (keys %hash) {
    my $twig = new XML::Twig(TwigRoots => {title => 1, text => 1});
    $twig->parsefile("$file.xml");
    open(Output, ">:utf8", "$file 2.xml") or die "can't open file $!\n";
    $twig->root->print(\*Output);
    close(Output);
}
```
Any ideas? Like I said, the script works perfectly for the first file it reads, and then it just stops. When I run the foreach loop that I commented out, all the file names that are supposed to be there are there, so I have no idea what's wrong.
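One thing I am planning to try, in case the parse itself is dying on one of the files, is wrapping parsefile() in eval so the loop keeps going and reports which file failed. This is only a guess at the cause; the sketch below replaces the final foreach above and assumes the same %hash and TwigRoots settings:

```perl
# Minimal sketch: drop-in replacement for the last loop above.
# Assumes %hash has been built by File::Find as in the original script.
foreach my $file (sort keys %hash) {
    my $twig = XML::Twig->new(TwigRoots => {title => 1, text => 1});

    # parsefile() dies on a parse error; trap it so one bad file
    # does not stop the whole run, and report which file it was.
    unless (eval { $twig->parsefile("$file.xml"); 1 }) {
        warn "skipping $file.xml: $@";
        next;
    }

    open my $out, '>:utf8', "$file 2.xml"
        or die "can't open '$file 2.xml': $!\n";
    $twig->root->print($out);
    close $out;
}
```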
Thanks in advance for any help. Much appreciated.
Replies are listed 'Best First'.

Re: Reading through a large bunch of files
by choroba (Cardinal) on Aug 23, 2013 at 11:03 UTC
by Dr Manhattan (Beadle) on Aug 23, 2013 at 14:50 UTC
by Dr Manhattan (Beadle) on Aug 23, 2013 at 15:18 UTC
by choroba (Cardinal) on Aug 23, 2013 at 15:21 UTC

Re: Reading through a large bunch of files
by Anonymous Monk on Aug 23, 2013 at 11:24 UTC