sandy105 has asked for the wisdom of the Perl Monks concerning the following question:
I usually use a while(&lt;FILEHANDLE&gt;) loop to run through a file line by line and do my work. But for a new script I have to do a global search on the input file: it has fixed 80-character lines, and I have no way of knowing whether or where my search string will be split across a newline "\n" or other control characters. So I tried to load the whole file into one string and then do the pattern match:
my $data = do {local $/; <FILE> }; #do pattern match
This gives an out-of-memory error, as expected, even for files only a few hundred MB in size.
My question is: can we increase the memory available to Perl, similar to the way JVM memory allocation can be configured? (I am primarily a J2EE programmer.) I ask because we run these scripts on enterprise servers with lots of memory.
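For reference, one way to search across line boundaries without slurping the whole file is to read it in fixed-size chunks and carry over an overlap at least as long as the search string, so a match that straddles a chunk (or line) boundary is not missed. A minimal sketch, assuming a literal search string rather than a general regex; the file name and pattern are placeholders, not taken from the original post:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Placeholders -- not from the original post.
    my $file       = 'input.dat';
    my $needle     = 'SEARCHSTRING';
    my $overlap    = length($needle) - 1;   # enough tail to catch a match split across chunks
    my $chunk_size = 1024 * 1024;           # read 1 MB at a time instead of slurping

    open my $fh, '<', $file or die "Cannot open $file: $!";
    binmode $fh;

    my $tail  = '';
    my $found = 0;
    while (read $fh, my $chunk, $chunk_size) {
        $chunk =~ tr/\r\n//d;               # drop line breaks so a string split by "\n" joins up
                                            # (extend the tr/// list for other control characters)
        my $window = $tail . $chunk;
        if (index($window, $needle) >= 0) {
            $found = 1;
            last;
        }
        # keep only the last ($overlap) characters for the next round
        $tail = length($window) > $overlap ? substr($window, -$overlap) : $window;
    }
    close $fh;

    print $found ? "pattern found\n" : "pattern not found\n";

Memory use stays bounded by the chunk size plus the overlap, regardless of how large the input file is.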
Replies are listed 'Best First'.

Re: Out of memory
by Ratazong (Monsignor) on Jun 09, 2015 at 12:30 UTC
by Athanasius (Archbishop) on Jun 09, 2015 at 15:49 UTC
by sandy105 (Scribe) on Jun 15, 2015 at 11:54 UTC
by sandy105 (Scribe) on Jun 15, 2015 at 11:52 UTC

Re: Out of memory
by Corion (Patriarch) on Jun 09, 2015 at 12:28 UTC

Re: Out of memory
by BrowserUk (Patriarch) on Jun 09, 2015 at 12:37 UTC
by Anonymous Monk on Jun 09, 2015 at 17:05 UTC

Re: Out of memory
by Discipulus (Canon) on Jun 09, 2015 at 19:59 UTC

Re: Out of memory
by karlgoethebier (Abbot) on Jun 10, 2015 at 11:43 UTC

Re: Out of memory
by karlgoethebier (Abbot) on Jun 11, 2015 at 13:48 UTC
by sandy105 (Scribe) on Jun 15, 2015 at 11:56 UTC

Re: Out of memory
by locked_user sundialsvc4 (Abbot) on Jun 09, 2015 at 14:05 UTC