in reply to slurping many Arrays into memory...Please Help
Indeed, it is usually not good practice to slurp a large amount of data into memory, because in effect you are just trading one kind of disk file for another. Memory is, after all, virtual: once the data you pull into your address space exceeds physical RAM, paging kicks in, and now you are (very expensively) shuffling data from one part of the disk drive to another.

I suggest designing the logic to locate all of the files first, and then to process them in a roughly sequential fashion, reading each one as a stream so that you are not constantly closing and reopening them. Programs built this way not only start faster, they also have more predictable (and more favorable) performance characteristics, more or less regardless of the actual data volume. This is a rule of thumb, to be sure, and every real-world case is different, but it has served me well.
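A rough sketch of what I mean (in Perl, since that is what the original question is about; the glob pattern and the per-line processing are purely placeholders for whatever your real data looks like):

    use strict;
    use warnings;

    # First pass: just locate the files (cheap -- paths only, no contents).
    # The 'data/*.dat' pattern is purely illustrative.
    my @files = sort glob('data/*.dat');

    # Second pass: visit each file once, reading it a line at a time,
    # so only a single record is ever held in memory.
    for my $file (@files) {
        open my $fh, '<', $file or die "Cannot open $file: $!";
        while (my $line = <$fh>) {
            chomp $line;
            # ... process $line here ...
        }
        close $fh or warn "Error closing $file: $!";
    }

The point is that memory use stays essentially flat no matter how many files there are, or how large they grow.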