Update 3: Thanks a lot for everyone's very useful comments, which are valid and constructive. I am going to work on the code for a bit, and once I get it working I will come back for more detailed help if appropriate. Thanks again for everyone's generous time and help.

Update 2: I think I got something working; can someone check if it's OK? It works, but I'm not sure it's the most efficient. Thanks again for everyone's help, and apologies for all the questions. Dave

    #!/usr/bin/perl
    use strict;
    use warnings;

    # args.test holds one filename per line
    open(my $list, '<', 'args.test') or die "Cannot open args.test: $!";
    my @files = <$list>;
    close $list;

    my @allfiles;
    for my $filename (@files) {
        chomp $filename;    # strip the newline left by readline
        open(my $fh, '<', $filename)
            or die "Cannot open $filename for reading: $!";
        push @allfiles, <$fh>;    # append every line of this file
        close $fh;
    }

    my @test = grep /chr1:8325525/, @allfiles;
    print "@test\n";

Update: Thanks for everyone's comments so far. Can someone give me some pointers on how to build an array of arrays or a hash of arrays from 100 files, or point me to some code that does this? Thanks again.

Hi everyone,

If I had, say, 100 files and I wanted to put each file into its own array in memory, as below, how would I automatically create the name for each array so that I end up with 100 unique names? I could then get each array to spit out its data, e.g.:

    print "$array1[1],$array2[1],$array3[1],$array4[1]...etc\n";

Thank you for your help!
    while (<FILES>) {
        chomp;
        process_into_array($_);    # return value kept in the sketch below
    }

    sub process_into_array {
        my $file = shift;
        # Lexical handle plus an error check; the original bareword DAT and
        # global @file meant each call silently clobbered the previous file.
        open(my $fh, '<', $file) or die "Cannot open $file: $!";
        my @file = <$fh>;
        close $fh;
        return \@file;    # return a reference so the caller can keep the lines
    }
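If the files' order matters more than their names, the same sub also gives an array of arrays: push each returned reference onto one list. A minimal sketch, assuming FILES is a handle open on the list of filenames, as in the loop above:

    my @all;    # array of arrays: $all[0] holds the first file's lines
    while (my $file = <FILES>) {
        chomp $file;
        push @all, process_into_array($file);
    }
    print $all[0][1];    # second line of the first file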