I usually use a while (<FILEHANDLE>) loop to run through the file line by line and do my work. But for a new script I have to do a global search on the input file, because it has an unusual fixed 80-characters-per-line format and I have no way of knowing if and where my search string will be split by a newline "\n" or other control characters. So I try to load the whole file into a string and then do the pattern match with
my $data = do { local $/; <FILE> };   # slurp the whole file, then run the pattern match against $data
This gives an out-of-memory error, as expected, even for files only a few hundred MB in size.
My question is: can we modify the memory allocated to Perl, somewhat like we can configure the JVM's memory allocation? (I'm primarily a J2EE programmer.) Especially since we run these scripts on enterprise servers with plenty of memory.
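In case it helps frame the question: the fallback I have been sketching, if there is no way to give perl more memory, is to read the file in fixed-size chunks and keep a small overlap buffer, so a match that straddles a chunk boundary or a "\n" is not missed. The pattern, file name, and sizes below are only placeholders, not my real data:

#!/usr/bin/perl
use strict;
use warnings;

# Placeholder pattern -- /s lets '.' match "\n" and other control characters,
# so a hit that is split across lines can still be found in a larger buffer.
my $pattern    = qr/FOO.BAR/s;
my $chunk_size = 4 * 1024 * 1024;   # read 4 MB at a time
my $overlap    = 1024;              # must be >= the longest possible match

open my $fh, '<', 'input.dat' or die "Cannot open input.dat: $!";
binmode $fh;

my $tail = '';
while (read($fh, my $chunk, $chunk_size)) {
    my $buffer = $tail . $chunk;    # prepend leftover bytes from the last round
    if ($buffer =~ $pattern) {
        print "match found\n";
        last;
    }
    # keep the last $overlap bytes so a match spanning two chunks is not lost
    $tail = length($buffer) > $overlap ? substr($buffer, -$overlap) : $buffer;
}
close $fh;

That keeps memory use bounded at roughly the chunk size plus the overlap, but it is clumsier than a single match over the whole file, which is why I would prefer to simply give the process more memory if Perl allows it.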