Hello-
I have written the following script to parse XML, and it works for small files. Larger files, however, run it out of memory. I have tried the Tie::File module, to no avail. Is there anything I could do to improve the I/O? Thanks in advance.
#!/usr/bin/perl
use strict;
use warnings;
use Tie::File;

print "Enter modifier (e.g. ISIN), input file, output file, each separated by a space\n";
chomp(my $enter = <STDIN>);
my ($modifier, $infile, $outfile) = split /\s+/, $enter;  # save the modifier and file names

tie my @input, 'Tie::File', $infile                       # use Tie::File to get data from the xml doc
    or die "Cannot tie $infile: $!";

my $all = join '', @input;                                # join all of the lines of input into one string
my @stories = split /<\?xml/, $all;                       # split up the string by story, on the <?xml tag

my @matches = grep { /\Q$modifier\E/ } @stories;          # keep only the stories that match the modifier

open my $out, '>', $outfile or die "Cannot open $outfile: $!";
print $out "@matches";                                    # print the matching stories to the new file
close $out;
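For readers hitting the same memory wall: since the script splits on the `<?xml` declaration anyway, one memory-friendlier sketch is to make `<?xml` the input record separator (`$/`) and read one story at a time, so the whole file is never held in memory at once. The `filter_stories` sub and the sample file names below are hypothetical, not part of the original script.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Read stories one at a time by using "<?xml" as the input record
# separator, so only a single story is held in memory at any moment.
sub filter_stories {
    my ($modifier, $in, $out) = @_;
    local $/ = '<?xml';                 # records end at the next <?xml
    while (my $story = <$in>) {
        chomp $story;                   # strip the trailing "<?xml"
        next unless length $story;      # skip the empty leading record
        print $out $story if $story =~ /\Q$modifier\E/;
    }
}

# Hypothetical usage, with literal names standing in for the prompted
# input and output files:
# open my $in,  '<', 'input.xml'   or die "Cannot open input: $!";
# open my $out, '>', 'matches.txt' or die "Cannot open output: $!";
# filter_stories('ISIN', $in, $out);
```

Whether this is safe depends on the data: it assumes each story really does begin with its own `<?xml` declaration, as the `split` in the script above assumes too. For anything more structured, a streaming XML parser such as XML::Twig is the usual next step.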
In reply to Parsing Large XML by shravnk