Angharad has asked for the wisdom of the Perl Monks concerning the following question:
Hi there
I want to store the contents of a particularly large text file in a hash so I can use it as a lookup table later in the Perl script. I've written the code to create the hash, but storing the data is taking an impossibly long time. Here's the code:
    use strict;
    use Data::Dumper;

    my $file = shift;
    my %hash;
    open(IN, "$file") || die "ERROR: can't open $file: $!\n";
    while (<IN>) {
        chomp;
        my @info = split(/,/, $_);
        my $md5  = $info[0];    # I only want to store the data
        my $uni  = $info[1];    # in columns 1 and 2
        $hash{$uni} = $md5;
    }
    close IN;

Can anyone tell me a better way of doing this so that the contents of the file are stored in the hash more quickly? Any thoughts much appreciated!
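For what it's worth, here is a minimal sketch of a slightly leaner version of that loop: a limit on `split` stops scanning each line after the second comma, and direct list assignment avoids the temporary `@info` array. The two sample rows and the in-memory filehandle stand in for the real file and are made up for illustration only:

```perl
use strict;
use warnings;

# Made-up sample data standing in for the large CSV file.
my $sample = <<'CSV';
d41d8cd98f00b204e9800998ecf8427e,P12345,extra
900150983cd24fb0d6963f7d28e17f72,Q67890,extra
CSV

# Three-argument open on a scalar reference gives an in-memory filehandle.
open(my $in, '<', \$sample) or die "can't open sample: $!";

my %hash;
while (my $line = <$in>) {
    chomp $line;
    # The limit of 3 makes split stop after the second comma;
    # list assignment takes the first two fields and discards the rest.
    my ($md5, $uni) = split /,/, $line, 3;
    $hash{$uni} = $md5;
}
close $in;
```

Whether this is noticeably faster depends on the line length and column count; for very wide rows the split limit saves the most work.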
Replies are listed 'Best First'.
Re: storing a huge text file into a hash
by roboticus (Chancellor) on Dec 07, 2010 at 18:44 UTC
  by Angharad (Pilgrim) on Dec 07, 2010 at 19:17 UTC
Re: storing a huge text file into a hash
by raybies (Chaplain) on Dec 07, 2010 at 20:08 UTC
Re: storing a huge text file into a hash
by sundialsvc4 (Abbot) on Dec 07, 2010 at 21:55 UTC
Re: storing a huge text file into a hash
by BrowserUk (Patriarch) on Dec 07, 2010 at 18:23 UTC
  by Angharad (Pilgrim) on Dec 07, 2010 at 18:34 UTC
  by BrowserUk (Patriarch) on Dec 07, 2010 at 18:58 UTC