in reply to Load a single column file into a hash

I liked the post by AnomalousMonk++.
I used Inline::Files so that I could produce a single post with runnable Perl code.

Instead of the Inline::Files sections, you will need something like this:

open (FILE1, '<', "pathtofile1") or die "unable to open pathtofile1 $!";
open (FILE2, '<', "pathtofile2") or die "unable to open pathtofile2 $!";
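With the two real files in place, the "load one column file into a hash, check the other against it" approach looks roughly like this. This is a minimal, self-contained sketch, not the OP's exact code: it writes two small sample files first so it can run as-is, and the file names and lookup logic are illustrative assumptions.

```perl
use strict;
use warnings;

# Create two small sample files so the sketch is self-contained;
# in real use pathtofile1 and pathtofile2 would already exist.
for my $pair ( [ 'pathtofile1', "COA213345\nDOB213345\n" ],
               [ 'pathtofile2', "COA213345\nXYZ999999\n" ] ) {
    open my $out, '>', $pair->[0] or die "unable to write $pair->[0]: $!";
    print $out $pair->[1];
    close $out or die "unable to close $pair->[0]: $!";
}

# Load the single-column first file into a hash for O(1) lookups.
open my $fh1, '<', 'pathtofile1' or die "unable to open pathtofile1: $!";
my %lookup = map { chomp; $_ => 1 } <$fh1>;
close $fh1 or die "unable to close pathtofile1: $!";

# Report which lines of the second file appear in the first.
open my $fh2, '<', 'pathtofile2' or die "unable to open pathtofile2: $!";
while ( my $line = <$fh2> ) {
    chomp $line;
    print "$line found\n" if exists $lookup{$line};
}
close $fh2 or die "unable to close pathtofile2: $!";
```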

Re^2: Load a single column file into a hash
by johngg (Canon) on Feb 05, 2018 at 15:27 UTC
    I used Inline::Files so that I could produce a single post with runnable Perl code.

    You can do that without a module by open'ing a reference to a HEREDOC. That would also work for multiple input "files", as in the solution you mention.

use strict;
use warnings;

use Data::Dumper;

open my $inFH, q{<}, \ <<EOD or die qq{open: < HEREDOC: $!\n};
COA213345
COA213345
COA213445
DOB213345
EOA213345
EOD

my %lookup = map { chomp; $_ => 1 } <$inFH>;

close $inFH or die qq{close: < HEREDOC: $!\n};

print Data::Dumper->Dumpxs( [ \ %lookup ], [ qw{ *lookup } ] );

    The output.

    %lookup = (
                'EOA213345' => 1,
                'COA213445' => 1,
                'COA213345' => 1,
                'DOB213345' => 1
              );

    For the purposes of the OP's request, note that using a hash removes the duplicate value in the input.
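    The deduplication falls out of hash-key semantics: assigning to the same key twice leaves a single entry. A minimal sketch using the same five values as the HEREDOC, one of which is a duplicate:

```perl
use strict;
use warnings;

# Five input lines; COA213345 appears twice.
my @lines = qw{ COA213345 COA213345 COA213445 DOB213345 EOA213345 };

# The duplicate key simply re-assigns the same entry, so only
# four distinct keys remain in the hash.
my %lookup = map { $_ => 1 } @lines;

print scalar( keys %lookup ), "\n";    # prints 4
```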

    Cheers,

    JohnGG