Re: Out Of Memory
by choroba (Cardinal) on Sep 18, 2013 at 09:10 UTC
Records of half a million of what? Without seeing the code and a sample of the input, we can only guess, or recommend buying more memory.
Re: Out Of Memory
by zork42 (Monk) on Sep 18, 2013 at 09:12 UTC
Re: Out Of Memory
by Utilitarian (Vicar) on Sep 18, 2013 at 10:28 UTC
Re: Out Of Memory
by roboticus (Chancellor) on Sep 18, 2013 at 11:03 UTC
Naveen_learn:
Perhaps it's time to add more memory to your machine. If you've already maxed out your memory, then maybe it's time to get a machine that can hold more, running a 64-bit OS so you can actually use all the RAM you add.
...or you could change your program to use less RAM. But since I can't see your program, you'll have to figure that out yourself.
...roboticus
When your only tool is a hammer, all problems look like your thumb.
Re: Out Of Memory
by jesuashok (Curate) on Sep 18, 2013 at 09:57 UTC
Hi Naveen -
Can you provide more information, such as:
(a) What operating system are you running this code on?
(b) How are the input records given to the program to read/parse?
(c) Do you get the out-of-memory message immediately upon executing the program, or only after a certain amount of processing has happened?
More detail would really help us give valuable feedback.
Re: Out Of Memory
by davido (Cardinal) on Sep 18, 2013 at 16:14 UTC
I can't believe how many responses say to just add more memory. I realize this might be tongue-in-cheek, but the responsible thing for a developer to do is to figure out how to consume less memory.
We haven't seen any code yet, nor any sample input. Thus, the question is mostly unanswerable. But attempts at providing suggestions for an unanswerable question ought to at least lean in the direction of what a developer might do to solve the problem, rather than what a hardware tech might do.
If at all possible, read your input one record at a time, process that record, write out anything that needs to be written, accumulate whatever needs to be accumulated, and then discard from memory the current record and move on to the next. Some variation on this theme is how most of these scalability issues are resolved. Maybe you need to hold onto two or three records. Or maybe you can immediately drop any records that are uninteresting and only hold onto ones that need further work. But in the end your approach will probably look something like this.
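In Perl, that record-at-a-time pattern is just a while loop over a filehandle. A minimal sketch, assuming a hypothetical comma-separated "id,value" record layout (the in-memory sample data stands in for the real input file, which we haven't seen):

```perl
use strict;
use warnings;

# Hypothetical sample data standing in for the real input file;
# the "id,value" layout is an assumption, not from the original post.
my $sample = "a,10\nb,20\nc,12\n";
open my $in, '<', \$sample or die "open: $!";

# Process one record at a time: only the current line and the
# running total stay in memory, no matter how large the input is.
my $total = 0;
while (my $line = <$in>) {
    chomp $line;
    my (undef, $value) = split /,/, $line;
    $total += $value;
}
close $in;

print "Total: $total\n";    # prints "Total: 42"
```

For a real file you would open by name instead (`open my $in, '<', 'records.txt' or die ...`); the loop itself is unchanged, which is why this pattern scales to millions of records.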
Another technique might be to break up the list of records into smaller chunks, and work on them individually. Accumulate results for each chunk, and in the end, compile results for all the accumulated data. This is a "map-reduce" strategy.
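A rough sketch of that chunked map-reduce idea in Perl, using made-up numeric records and an arbitrary chunk size (in practice the chunks might be separate files or ranges of a database query):

```perl
use strict;
use warnings;
use List::Util qw(sum);

# Hypothetical records and chunk size, for illustration only.
my @records    = (1 .. 10);
my $chunk_size = 3;

# "Map" step: reduce each chunk to a small partial result,
# so the full record set never needs to be processed at once.
my @partials;
while (@records) {
    my @chunk = splice @records, 0, $chunk_size;
    push @partials, sum(@chunk);    # per-chunk accumulation
}

# "Reduce" step: combine the partial results.
my $result = sum(@partials);
print "$result\n";    # prints "55"
```

The partial results are tiny compared to the chunks they summarize, so memory use is bounded by the chunk size rather than by the total input size.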
So these are some very generalized suggestions, which are about as specific as possible given the ambiguities of the original question.
To the original poster: Please consider this: when you present a question that is devoid of detail, it usually means you have become so exasperated with the problem that you've given up trying to figure it out yourself. But we have no chance of figuring it out either unless we're presented with the very details that have so frustrated you.
Re: Out Of Memory
by NetWallah (Canon) on Sep 18, 2013 at 14:00 UTC
Re: Out Of Memory
by Anonymous Monk on Sep 18, 2013 at 16:09 UTC
This program was probably designed to do everything in memory, such that it fails when the files grow even slightly large. Unfortunately, a redesign is in order. Throwing silicon at the problem might be a stopgap ... and, if so, might be the right thing to do. Chips are cheap.
Re: Out Of Memory
by Anonymous Monk on Sep 18, 2013 at 20:39 UTC
Depends on the system and the business situation overall. This might be a mission-critical hairy beast ... feed it silicon ... or it might be something much smaller ... fix it. "A system which uses Perl" sounds potentially ugly.