in reply to splitting files into multiple sections to be processed as a file
I believe that what you are referring to is memory-mapped files, where the contents of a file (or a window into it) are mapped into a process's virtual-address space so that page faults in that region are resolved from the contents of the file (window). A quick search for "mapped" at http://search.cpan.org immediately shows Win32::MMF as the top hit.
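Win32::MMF is Windows-only; for a cross-platform taste of the same idea, here is a minimal sketch using File::Map (the filename is hypothetical). Once mapped, the variable behaves like an ordinary Perl string, but its pages are faulted in from the file on demand:

    use strict;
    use warnings;
    use File::Map qw(map_file);

    # Map the whole file read-only; nothing is slurped into RAM up front.
    map_file my $map, 'big_input.log', '<';

    # String ops work directly on the mapping.
    my $first_nl = index $map, "\n";
    print "first line is $first_nl bytes long\n";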
Another thing that pops into my head is that you could split a file pretty well just by seek()ing to an approximate location, then reading forward or backward until you hit a newline \n character or what-have-you. You know where you positioned the handle, and you know how many bytes you had to read before finding a newline, so you therefore know the actual split-point offset. If you know the file's contents reasonably well, this simple strategy should work just fine.
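A minimal sketch of that seek-and-scan idea, assuming a hypothetical filename and a split point roughly at mid-file:

    use strict;
    use warnings;

    my $file   = 'big_input.log';
    my $target = int((-s $file) / 2);    # approximate split point

    open my $fh, '<', $file or die "open $file: $!";
    seek $fh, $target, 0 or die "seek $file: $!";

    # Reading a "line" from here returns the remainder of whatever line
    # the seek landed in; the clean split point is just past its newline.
    my $rest = <$fh>;
    die "no newline after byte $target\n" unless defined $rest;
    my $split = $target + length $rest;
    print "clean split point at byte $split\n";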
Logic to process arbitrary files "line by line" really doesn't have to be complicated: just read the file a chunk at a time. You don't need to fool with memory-mapping. When your search for "the next newline" comes up empty-handed, keep the unprocessed tail at the front of the buffer and then read enough bytes to fill the buffer back up again (see the sketch below). And it's "CPAN to the rescue" again, e.g. with Text::Buffer.
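A minimal sketch of that chunked approach, assuming a hypothetical filename and chunk size. Each pass hands out every complete line sitting in the buffer, then appends another chunk when no newline remains:

    use strict;
    use warnings;

    open my $fh, '<:raw', 'big_input.log' or die "open: $!";
    my $buf   = '';
    my $chunk = 64 * 1024;

    while (1) {
        # Hand out every complete line currently in the buffer.
        while ((my $nl = index $buf, "\n") >= 0) {
            my $line = substr $buf, 0, $nl + 1, '';   # extract and remove
            # ... process $line here ...
        }
        # No newline left: the unprocessed tail stays in $buf; append more.
        my $n = read $fh, $buf, $chunk, length $buf;
        die "read: $!" unless defined $n;
        last if $n == 0;                              # EOF
    }
    # Anything still in $buf is a final line with no trailing newline.

The four-argument substr removes each line from the buffer as it is returned, so the leftover partial line is automatically "at the top of the buffer" when the next read appends to it.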