Hi,
I've got a huge program. It's used to process even larger amounts of ASCII data. The program exists in two versions: as a script for UNIX, and as an executable (ActiveState PDK ;-) ) for Windows. Everything runs well, except memory usage, which gets bigger and bigger. I've got 2 GB RAM on my PC, so there it somehow works. On the UNIX workstation the run ends with an "out of memory" error.
The program consists of modules which open a file, process it and save it back. This happens 10 to 20 times (on average) per single run. Each file is also written twice (a requirement of the data processing).
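Simplified, a single pass over a file looks roughly like this (file names and the transform step are just placeholders, the real modules are of course much bigger):

#!/usr/bin/perl
use strict;
use warnings;

# Placeholder file names; the real data sets are much larger.
my @files = ('data_a.txt', 'data_b.txt');

# Each file goes through 10 to 20 passes like this per run.
for my $pass (1 .. 15) {
    for my $file (@files) {
        process_file($file, "$file.tmp");
        rename "$file.tmp", $file or die "rename failed: $!";
        # the result is additionally written out a second time
        # (requirement of the data processing)
    }
}

sub process_file {
    my ($in_name, $out_name) = @_;
    open my $in,  '<', $in_name  or die "Can't read $in_name: $!";
    open my $out, '>', $out_name or die "Can't write $out_name: $!";

    # read line by line instead of slurping the whole file
    while (my $line = <$in>) {
        # ... transform $line here ...
        print {$out} $line;
    }

    close $out or die "Can't close $out_name: $!";
    close $in;
}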
Example: two files, 10 MB each. My program runs, and at the end of the run Windows Task Manager shows 1.7 GB (sic!) of memory usage.
I use "my" statement where it's possible - however some data structures must be global.
The question is: what can I do? How can I track down what is causing the problem?
Greetings
TheMarty