I hate to deflate your ego, but you're dealing with a tiny amount of data, assuming a modern machine (less than 5 years old) equipped with modern main memory (256 MB? 512 MB? 1 GB?).
25,000 records, at (pessimistically) 100 bytes per value, take up 2.5 MB, or about 1% of a small memory space.
Some people read multi-megabyte log files into a hash for post-processing. You should have no problems at all.
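For instance, here is a minimal sketch of the hash approach, assuming one record per line and that the job is to report records in one file that don't appear in the other; the file names are illustrative only:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Load the first file's records into a hash; 25,000 keys is trivial memory-wise.
my %seen;
open my $fh_a, '<', 'old_records.txt' or die "old_records.txt: $!";
while ( my $line = <$fh_a> ) {
    chomp $line;
    $seen{$line} = 1;
}
close $fh_a;

# Scan the second file and print any record not present in the first.
open my $fh_b, '<', 'new_records.txt' or die "new_records.txt: $!";
while ( my $line = <$fh_b> ) {
    chomp $line;
    print "$line\n" unless exists $seen{$line};
}
close $fh_b;
```

Even with Perl's per-key overhead on top of the ~100 bytes per record, the whole hash still comes to only a few megabytes.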
--
TTTATCGGTCGTTATATAGATGTTTGCA