Hi all
I am having some problems reading through a bunch of XML files in a directory and processing them. I am using File::Find and File::Copy. The problem seems to be that only one file is processed, and then the script stops before getting to the others.
#!/usr/bin/perl
use strict;
use warnings;
use utf8;
use XML::Twig;
use Data::Dumper;
use File::Find;
use File::Copy;
use Cwd;

my $dir = getcwd;
my %size;
my %hash;

find(sub { $size{$File::Find::name} = -s if -f; }, $dir);

foreach my $files (keys %size) {
    if ($files =~ /(.*)(\.xml)$/) {
        $hash{$1}++;
    }
}

#foreach my $x (keys %hash) {
#    print "$x\n";
#}

foreach my $file (keys %hash) {
    my $twig = XML::Twig->new(TwigRoots => { title => 1, text => 1 });
    $twig->parsefile("$file.xml");
    open(Output, ">:utf8", "$file 2.xml") or die "can't open file: $!\n";
    $twig->root->print(\*Output);
    close(Output);
}
Any ideas? Like I said, the script works perfectly for the first file it reads, and then it just stops. When I run the foreach loop that I commented out, all the file names that are supposed to be there are there, so I have no idea what's wrong.
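One guess worth checking: XML::Twig's parsefile dies outright on a parse error instead of returning, so a single malformed input file would kill the whole loop after the first good file. A minimal sketch of wrapping the parse in eval so the loop keeps going (the file names and output naming here are made up for illustration, not taken from the script above):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use XML::Twig;

# Hypothetical file list; in the real script this would come from %hash.
my @files = ('good.xml', 'broken.xml', 'also-good.xml');

foreach my $file (@files) {
    my $twig = XML::Twig->new(TwigRoots => { title => 1, text => 1 });

    # parsefile() dies on a parse error (or a missing file); eval turns
    # that fatal error into something we can log and then skip past.
    my $ok = eval { $twig->parsefile($file); 1 };
    unless ($ok) {
        warn "skipping $file: $@";
        next;
    }

    open(my $out, '>:utf8', "$file.out") or die "can't open output: $!";
    $twig->root->print($out);
    close($out);
}
```

If every file then gets skipped with the same warning, the inputs themselves are the problem rather than the loop. XML::Twig also provides safe_parsefile, which returns instead of dying, as an alternative to the eval wrapper.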
Thanx in advance for any help. Much appreciated.
In reply to Reading through a large bunch of files by Dr Manhattan