I haven't understood everything that has been presented, but enough to use some of the information posted and complete a working script for my purpose soon.
There was some talk of slurping sections or even whole files.
On that topic, let me explain very briefly what the intended usage is. The code will be used to search and extract through some fairly massive piles of files at times.
Once File::Find is added to the script, it will likely be expected to recurse through usenet-style hierarchies (hierarchies of my own creation, so smaller than real ones) that might consist of as many as 45,000-55,000 messages in total (not per group).
So, with that scale of usage in mind, would slurping whole files still be a wise way to go? Or would it be worthwhile to do it a different way?
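For what it's worth, one way to avoid slurping entirely is to read each file line by line while keeping a small ring buffer of preceding lines, so memory use stays constant no matter how large the file is. The sketch below (the `grep_with_context` helper name and the demo data are mine, not from the thread) shows the "3 before, 2 after" idea from this thread's title done that way; the same sub could be called from a File::Find `wanted` routine:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper: scan a filehandle line by line, collecting up to
# 3 lines of context before and 2 after each match, without slurping.
sub grep_with_context {
    my ($fh, $pattern) = @_;
    my @before;        # ring buffer of up to 3 preceding lines
    my $after = 0;     # lines still owed after the most recent hit
    my @out;
    while ( my $line = <$fh> ) {
        if ( $line =~ $pattern ) {
            push @out, @before, $line;
            @before = ();
            $after  = 2;
        }
        elsif ( $after > 0 ) {
            push @out, $line;
            $after--;
        }
        else {
            push @before, $line;
            shift @before if @before > 3;
        }
    }
    return @out;
}

# Demo on an in-memory filehandle (perl 5.8+), standing in for a message file
my $text = join '', map { "line $_\n" } 1 .. 10;
open my $fh, '<', \$text or die $!;
print grep_with_context( $fh, qr/line 6/ );
```

Run against the demo data this prints lines 3 through 8. Since each file is at most a few kilobytes of state at any moment, 45,000-55,000 messages should pose no memory problem; the cost is essentially just the I/O of reading each file once.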
In reply to Re^2: Grab 3 lines before and 2 after each regex hit
by HarryPutnam
in thread Grab 3 lines before and 2 after each regex hit
by HarryPutnam