I am writing a script to read many files of structured data; the ultimate goal is to create a CSV file where each line has the server name and the values of its fields.
I can't seem to build an array of hashes to hold the data: the application only inserts the first record.
Typical file data:

    define host {
        host_name       whplnsweb-dr
        use             xiwizard_windowsserver_host
        alias           whplnsweb-dr
        display_name    whplnsweb-dr
        address         172.28.17.115
        hostgroups      AntiVirus End Point Protection +,windows-servers
        contacts        helpdesk,admin
        register        1
    }

My program:
    use strict;
    use Data::Dumper;

    my @host = ();
    my $record = {};
    my $key;
    my $value;
    my $line;
    my $rec_num = 0;
    my $filename = "";
    my $dirname = "S:/Projects/Nagios/usr-local-nagios/etc/hosts";

    opendir(DIR, $dirname) or die "cant opendir $dirname: $!";
    while (defined ($filename = readdir(DIR))) {
        next if $filename =~ /host_data/;
        next if $filename =~ /^[\.]+/;
        print "filename: $filename \n";
        open(FH, "<", $filename);
        while ($line = <FH>) {
            chomp $line;
            next if $line =~ /^$/;
            next if $line =~ /[#{}]/;
            $line =~ /^\s*(\S+)\s*(\S+)/;
            $record->{$1} = $2;
        }
        close(FH);
        push @host, $record;
        $rec_num =+ 1;
    }
    closedir(DIR);

    print Dumper(@host);
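The likely cause is that `my $record = {};` is created once, outside the directory loop, so every `push @host, $record;` stores the same reference and each file's fields overwrite the previous file's. A minimal sketch of the array-of-hashes pattern (using made-up host names in place of the real files) to illustrate the fix:

```perl
use strict;
use warnings;

# The hashref must be created *inside* the loop so each push
# stores a distinct anonymous hash.
my @host;
for my $name (qw(web-01 web-02 web-03)) {   # stand-ins for the files
    my $record = {};                 # fresh hashref per iteration
    $record->{host_name} = $name;
    push @host, $record;             # each element is a distinct hash
}

# With a single outer $record, every element would point at one hash
# and Dumper would show the last file's data repeated N times.
print scalar(@host), " records; first host_name: $host[0]{host_name}\n";
```

Two smaller issues in the posted code: `$rec_num =+ 1;` assigns positive 1 rather than incrementing (it should be `+=`), and `readdir` returns bare filenames, so the `open` needs the directory prepended (e.g. `open(FH, "<", "$dirname/$filename")`) and should be checked with `or die`.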