in reply to Quickest way of reading in large files?
I'd have to say the while version is quicker, simply because it doesn't hog tons of memory storing the entire file in an array the way the foreach version does. In fact, just the other day, somebody posted some code that hit exactly this problem - after changing to a while loop, execution time was cut by over 40%.
From the "I forgot to mention this" dept.: If your file is over 100 MB, not only will foreach be a lot slower, it may not even finish running, because of the memory issue I describe above. I highly recommend a while loop for a file that large.
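To make the difference concrete, here's a minimal sketch of the two styles (the filename and the per-line processing are placeholders, not from the original post). The foreach version evaluates the readline in list context, so every line is pulled into memory before the loop even starts; the while version reads one line at a time, so memory stays flat no matter how big the file is.

    use strict;
    use warnings;

    my $file = 'huge.log';    # hypothetical filename

    # foreach: <$fh> in list context slurps the whole file into a list first,
    # so a 100 MB file costs roughly 100 MB (plus per-element overhead) of RAM.
    open my $fh, '<', $file or die "Can't open $file: $!";
    foreach my $line (<$fh>) {
        # process $line ...
    }
    close $fh;

    # while: <$fh> in scalar context returns one line per iteration,
    # so only the current line is ever held in memory.
    open $fh, '<', $file or die "Can't open $file: $!";
    while (my $line = <$fh>) {
        # process $line ...
    }
    close $fh;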
[Update: Changed title to match root node]
His Royal Cheeziness