"... 100+ GB ...combine rows...consolidated output..."
Life is hard - so perhaps you'd better go with SQLite?
See also Re: Reading HUGE file multiple times and Limits In SQLite.
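For the sake of illustration, a minimal sketch of the SQLite approach via DBI and DBD::SQLite - the file name, the schema and the SUM aggregate are assumptions, adapt them to the real data:

#!/usr/bin/env perl
use strict;
use warnings;
use feature qw(say);
use DBI;    # needs DBD::SQLite installed

# Assumed input: "key<TAB>value" lines; names are hypothetical.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=consolidate.db', '', '',
    { RaiseError => 1, AutoCommit => 0 } );

$dbh->do('CREATE TABLE IF NOT EXISTS rows ( k TEXT, v INTEGER )');

my $sth = $dbh->prepare('INSERT INTO rows ( k, v ) VALUES ( ?, ? )');

open my $fh, '<', 'huge_input.txt' or die $!;    # the 100+ GB file
while (<$fh>) {
    chomp;
    my ( $k, $v ) = split /\t/;
    $sth->execute( $k, $v );
    $dbh->commit unless $. % 100_000;    # batched commits keep inserts fast
}
close $fh;
$dbh->commit;

# Let SQLite combine the rows on disk instead of a giant hash in RAM:
my $rows = $dbh->selectall_arrayref(
    'SELECT k, SUM(v) FROM rows GROUP BY k ORDER BY k');
say join "\t", @$_ for @$rows;

$dbh->disconnect;

With data that size you'd also want an index on the key column before the GROUP BY - see Limits In SQLite for what the engine can take.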
Best regards, Karl
P.S.: And remember:
#!/usr/bin/env perl

use strict;
use warnings;
use feature qw(say);
use Try::Tiny;

# say $0;

try { ...; } catch { say $_ };    # "..." (yada yada) dies with "Unimplemented"

__END__

karls-mac-mini:playground karl$ ./bfdi533.pl
Unimplemented at ./bfdi533.pl line 10.
«The Crux of the Biscuit is the Apostrophe»
perl -MCrypt::CBC -E 'say Crypt::CBC->new(-key=>'kgb',-cipher=>"Blowfish")->decrypt_hex($ENV{KARL});'