in reply to filter a file using an exclusion list

You've got the loops backwards. This is what you have:
for ( 4 .. 6 ){
    print "$_\n";
    for ( 0 .. 10 ){
        print "\t$_\n";
    }
}
Your goal should be to read the file only once:
my %stopwords = map { $_ => 1 } <$excludes>;
while( <$large> ){
    print unless $stopwords{$_};
}

Re^2: filter a file using an exclusion list
by coldy (Scribe) on Feb 13, 2011 at 20:45 UTC
    Thanks for the help - but it still isn't working... I have included print statements for debugging:
#!/usr/bin/perl -w
use strict;

open(my $excludes,"<",$ARGV[0]) or die("could not open $ARGV[0]!");
open(my $large,"<",$ARGV[1]) or die("could not open the $ARGV[1]!");
my @excludes = <$excludes>;
chomp(@excludes);
my %stopwords = map { $_ => } <@excludes>;

while( <$large> ){
    my ($test,@temp) = split(/\s+/,$_);
    print "testing: $test\n";
    print $stopwords{$test} , "\n";
    #print $test, "\n";
}
close($excludes);
close($large);
    I get this message for each line of $large:
    testing: 9999853
    Use of uninitialized value in print at ./filter_exclusions.pl line 13, <$large> line 1.
    Is it something I'm missing with my input files?
      There are some errors. The one that blows up is:

      my %stopwords = map { $_ => } <@excludes>;

      If you print %stopwords with Data::Dumper, you would see:

      $VAR1 = { '9999853' => '999986' };

      And this is not what you want! In the map, you are failing to assign a value to each key. You could write it as:

      my %stopwords = map { $_ => 1 } @excludes;

      Here I am assigning a value of '1'. It then conveniently tests true when you are looking up stopwords. I also removed the angle brackets around @excludes in your code (that would be a glob, which is not what you want).
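
      To make the difference concrete, here is a small self-contained sketch (the two IDs in @ids are just examples borrowed from the sample data below) contrasting the broken map with the fixed one:

use strict;
use warnings;
use Data::Dumper;

# Hypothetical two-entry exclusion list.
my @ids = ('9999853', '999986');

# Broken form: the block returns each ID followed by a trailing fat comma,
# so consecutive IDs are paired up as key => value.
my %broken = map { $_ => } @ids;

# Fixed form: every ID becomes a key, each with the true value 1.
my %fixed = map { $_ => 1 } @ids;

print Dumper(\%broken);   # $VAR1 = { '9999853' => '999986' };
print Dumper(\%fixed);    # e.g. { '9999853' => 1, '999986' => 1 } (key order varies)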

      Altogether, it could be solved like this:

#!/usr/bin/perl
use strict;
use warnings;
use 5.012;
use Data::Dumper;

my $large = <<EOF;
9999853 5615 4 148656321
999986 5615 14 94873609
9999883 5615 4 860669
9999929 5615 4 73689618
9999931 5615 4 31286083
9999944 5615 4 148596445
999995 5615 10 78405504
9999963 5615 4 84291761
9999966 5615 4 5978256
9999979 5615 4 135953341
EOF

my $excludes = <<EOF;
9999853
999986
EOF

open my $fh1, "<", \$excludes or die $!;
my %stopwords = map {chomp; $_ => 1} <$fh1>;
close $fh1 or die $!;

open my $fh2, "<", \$large or die $!;
while( <$fh2> ){
    my ($test) = /^(\d+)/;
    # if $test is in the hash
    # then $stopwords{ $test } == 1 or true
    print unless $stopwords{ $test };
}
close $fh2 or die $!;

#print Dumper \%stopwords;
        The above code works well on the sample data, but for a 70,000-item exclusion list and 40 GB of data to filter, the script does not work: nothing is filtered out and every line of the large file is printed, even though I have grepped for some of the exclusion items and they are present in the large file. (I would use grep -v -f excludes.txt large.txt, but that does not work on the full data either.) Is there a maximum limit on the size of a Perl hash? Any other reason it would not work on the full data?
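
        There is no fixed ceiling on the number of keys a Perl hash can hold; it is limited only by available memory, and 70,000 short keys is tiny. As a rough sanity check, a standalone sketch (assuming the optional CPAN module Devel::Size is installed) could measure the footprint directly:

use strict;
use warnings;
use Devel::Size qw(total_size);   # assumption: Devel::Size is available from CPAN

# Build a hash about the size of the 70,000-entry exclusion list.
my %stopwords = map { $_ => 1 } (1 .. 70_000);

# Reports the hash's total memory footprint in bytes --
# on the order of a few megabytes, nowhere near any limit.
print total_size(\%stopwords), " bytes\n";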
        Ahh - dos2unix!! I think it will work now. Many thanks!
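
        That fits the symptom: if, for example, the exclusion list has Windows (CRLF) line endings, every key built by chomp still ends in a carriage return and never matches the Unix-format data, so nothing gets filtered. As an alternative to running dos2unix first, a line-ending-tolerant sketch of the hash-building step (reusing the $fh1 handle name from the code above) would be:

# Strip an optional carriage return along with the newline, so a DOS-format
# exclusion file works without a separate dos2unix pass.
my %stopwords = map { s/\r?\n\z//; $_ => 1 } <$fh1>;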