I need to save a growing list of words (>100,000) and occasionally update it. What would be the best way to do it? I need the list in an array to build a regex. At the moment I save it in a plain text file, one word per line, and read it like this:
my @words;
my $filename = "terms.txt";
if (open my $FH, "<:encoding(UTF-8)", $filename) {
    while (my $line = <$FH>) {
        chomp $line;
        push @words, $line;
    }
    close $FH;
}
my $wordsRX = join "|", map quotemeta, @words;
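One detail on the regex: in an alternation, Perl takes the leftmost alternative that matches, so a shorter word can shadow a longer one that shares a prefix (e.g. "cat" matching before "cats"). If that matters for your use, sorting by descending length before joining is a common fix; a minimal sketch, assuming @words as above:

my $wordsRX = join "|",
    map  { quotemeta }
    sort { length($b) <=> length($a) } @words;  # longest alternatives first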
I write it like this:
# Read @terms with the script above, add the new words to it,
# eliminate duplicates, if any, and write the list back.
my @wordsfiltered = uniq_array(@terms);
my $fh = openFileAndWrite($filename);
foreach (@wordsfiltered) {
    print $fh $_ . "\n";
}
close $fh;
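In case it helps to keep the write step self-contained: List::Util (core; its uniq needs version 1.45 or later) ships a uniq that could replace a hand-rolled uniq_array. A minimal sketch, assuming @terms and $filename as above:

use List::Util qw(uniq);

# Deduplicate the merged list; uniq keeps the first occurrence of each word.
my @wordsfiltered = uniq @terms;

open my $fh, ">:encoding(UTF-8)", $filename
    or die "Cannot write $filename: $!";
print $fh "$_\n" for @wordsfiltered;
close $fh;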
This runs fine, but I am wondering if there are better ways to do it. I am thinking, for example, of serialisation, which is incredibly compact. Any drawback?
use Storable qw(store retrieve);

store(\@wordsfiltered, "terms.array");
my @words = @{ retrieve("terms.array") };
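One portability note on Storable: plain store writes in the machine's native byte order, so the file may not be readable on a machine with a different architecture. nstore writes in network byte order instead; a sketch of the portable variant:

use Storable qw(nstore retrieve);

# nstore writes in network byte order, so terms.array stays readable
# across machines with different endianness (plain store does not).
nstore(\@wordsfiltered, "terms.array");
my @words = @{ retrieve("terms.array") };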
PS: Is there an easy way in Perl to compare the performance (speed) of the two approaches without working with timestamps directly myself?
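For reference, the core Benchmark module can do this kind of comparison. A minimal sketch, where read_text is a hypothetical wrapper around the plain-text read loop above:

use Benchmark qw(cmpthese);
use Storable qw(retrieve);

# Hypothetical wrapper around the plain-text read loop shown above.
sub read_text {
    my ($file) = @_;
    open my $FH, "<:encoding(UTF-8)", $file or die "Cannot open $file: $!";
    chomp(my @words = <$FH>);
    close $FH;
    return @words;
}

# Run each variant for at least 5 CPU seconds and print a comparison table.
cmpthese(-5, {
    text     => sub { my @w = read_text("terms.txt") },
    storable => sub { my @w = @{ retrieve("terms.array") } },
});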
In reply to best way to read and save long list of words by Anonymous Monk