earl has asked for the wisdom of the Perl Monks concerning the following question:

Hello!

I am currently experiencing problems with Perl 5.6.1 on AIX 4.3 compiled with large file support.

We have scripts that perform a sort on some data files to remove duplicates before these files get loaded into the database. These are legacy scripts that I merely maintain and would prefer not to change.
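In essence the scripts do something like the following (this is only a simplified sketch of the general approach, not the actual code, and the file names are made up):

    #!/usr/bin/perl -w
    use strict;

    # read the whole data file into memory, sort it, and drop
    # duplicate lines before writing the cleaned file back out
    open(IN, "< datafile.txt") or die "cannot open datafile.txt: $!";
    my @lines = <IN>;                 # entire file is held in memory here
    close(IN);

    my %seen;
    open(OUT, "> datafile.clean") or die "cannot open datafile.clean: $!";
    foreach my $line (sort @lines) {
        print OUT $line unless $seen{$line}++;
    }
    close(OUT);

With everything held in memory at once, memory use grows with the size of the data files.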

The problem I now see is that scripts which still run fine with the old Perl version (5.5.3) run out of memory when running under 5.6.1. This is especially bad since we expect the data files to get bigger, which was the reason for moving to 5.6.1 with large file support in the first place.

I have tried to link Perl with -bmaxdata:0x80000000 but the problem did not go away (maybe I've been doing something wrong).
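As far as I understand it, the flag is supposed to go in at build time roughly like this (only a sketch, so this may well be where I went wrong):

    # append the AIX big-data linker flag while building perl 5.6.1
    sh Configure -des -Duselargefiles -Aldflags='-bmaxdata:0x80000000'
    make
    make test

    # afterwards, check whether the flag really made it into the build
    perl -V:ldflags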

Does anybody have any idea about how to get around this problem?

Any help is greatly appreciated.

Re: memory problems with 5.6.1 on AIX
by perrin (Chancellor) on Mar 11, 2002 at 20:30 UTC
    If it's really running out of memory, and the files are too large for 5.5.3 to handle, you will probably need to look at splitting the job up somehow. You could also use standard Unix utilities like sort, which can remove duplicates too (sort -u). I believe sort can handle just about anything that your OS can.
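    For example, if the data files are run through the system sort first, the duplicate removal itself only needs to remember the previous line, so memory use stays flat no matter how big the files get. A minimal sketch (file names made up):

        #!/usr/bin/perl -w
        use strict;

        # the input is assumed to be sorted already (e.g. by /bin/sort),
        # so duplicates are adjacent and only the previous line is kept
        open(IN,  "< data.sorted") or die "cannot open data.sorted: $!";
        open(OUT, "> data.unique") or die "cannot open data.unique: $!";

        my $prev;
        while (my $line = <IN>) {
            next if defined $prev and $line eq $prev;
            print OUT $line;
            $prev = $line;
        }
        close(IN);
        close(OUT);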
      Thanks perrin, but using the Unix sort is not really an option for me (for reasons I can't go into here).

      Since 5.5.3 does not show this behaviour, I assume the problem lies in finding the right linker options when building Perl.

      Anyone with ideas?