in reply to How to work on a large Data file line by line

The files that you access can be arbitrarily large: you pay no meaningful performance penalty for processing a file line by line, no matter how large it is... provided that you remain aware of just what you are asking the computer to do. You need to stay mindful of how your Perl code translates into the requests that are actually issued to the operating system.

For example, “reading the whole file into memory” may simply not be possible: if the file is larger than your available memory, slurping it in one gulp will fail or push the machine into swapping, whereas reading it one line at a time works no matter how big the file is.
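To make that concrete, here is a minimal sketch of the usual line-by-line idiom. The filename data.txt is just a placeholder, and the per-line work shown (counting lines and bytes) stands in for whatever you actually need to do:

    use strict;
    use warnings;

    my $file = 'data.txt';          # hypothetical input file
    open my $fh, '<', $file
        or die "Cannot open $file: $!";

    my ( $lines, $bytes ) = ( 0, 0 );
    while ( my $line = <$fh> ) {    # only one line is held in memory at a time
        $lines++;
        $bytes += length $line;
        # ... your real per-line work goes here ...
    }
    close $fh;

    print "$lines lines, $bytes bytes\n";

This loop never needs more memory than the length of the longest line, so it behaves the same on a 10 KB file as on a 10 GB one.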

The operating system will handle many details for you. For example, it will probably figure out on its own that you are reading the file sequentially, and it will buffer large amounts of data ahead of your anticipated need for it. Perl doles the data out to you “one line at a time,” but it is actually carving those lines out of a rather large buffer.
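If you want to see that block-at-a-time behavior explicitly, you can do your own buffering with read(). This is only an illustration of roughly what the line-oriented loop above is doing underneath; the 64 KB block size is an arbitrary choice:

    use strict;
    use warnings;

    my $file = 'data.txt';          # hypothetical input file
    open my $fh, '<', $file
        or die "Cannot open $file: $!";

    my $buf;
    my $total = 0;
    while ( my $got = read( $fh, $buf, 64 * 1024 ) ) {   # 64 KB per request
        $total += $got;             # process the block in $buf here
    }
    close $fh;

    print "$total bytes read in 64 KB blocks\n";

Either way, the operating system's read-ahead and the buffering in Perl's I/O layer mean that each line (or block) you ask for rarely costs a separate trip to the disk.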

As noted in the referenced post (from around 2001), files larger than 2 GB could be problematic on some older systems, because a “file position” was at one time represented as a 32-bit signed integer, but that constraint is almost certainly long gone for you by now.
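If you want to confirm that your perl was built with large-file support, the Config module will tell you. On a reasonably modern perl, uselargefiles is 'define' and an lseeksize of 8 means file offsets are 64-bit:

    use strict;
    use warnings;
    use Config;

    # 'define' means perl was compiled with large-file support;
    # lseeksize is the size, in bytes, of the offsets used for seeking.
    print "uselargefiles: ", ( $Config{uselargefiles} // 'undef' ), "\n";
    print "lseeksize:     ", ( $Config{lseeksize}     // 'undef' ), "\n";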