My SearchFU needs help. :( I remember dealing with exactly this sort of thing many moons ago, but I can't remember how I eventually resolved this. My gut tells me that I changed how \n was defined and how Perl eats files. I'm chewing through files line by line using

    while (<FILE>) { #Random code }

Works fine on my sample set, until I realize the working data actually consists of text files from a variety of flavors of Windows and Unix. Oops! Now I can't eat the files line by line. I could slurp the files whole, do a bit of substitution for \r & \n, and deal with it that way. But that's an ugly solution, and I can't gamble that the files are consistently small enough to gobble up like that. Can someone point me in the right direction? I've been mucking with the Camel book for a while and my eyes are starting to water.
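One route I've been considering is to keep reading line by line and just strip whichever ending shows up as each line comes in, instead of slurping. A rough sketch of what I mean (the filename data.txt and the loop body are only placeholders):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Read line by line, then strip either a Unix (\n) or a Windows (\r\n)
    # ending so the rest of the loop sees consistent input.
    open my $fh, '<', 'data.txt' or die "Can't open data.txt: $!";
    while ( my $line = <$fh> ) {
        $line =~ s/\r?\n\z//;    # handles both LF and CRLF endings
        # ... process $line here ...
    }
    close $fh;

Is something along these lines reasonable, or is there a cleaner way to tell Perl about the line endings up front?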
Edit: Fixed incorrect code example
----
Thanks for your patience.