You didn't give us a lot of information, so here are some thoughts on possible causes I would check (command sketches for each follow after the list):
- Are there any security/safety systems active that might view the process as a threat or a resource hog? What do your system logs (/var/log/syslog and dmesg) say? (See the log check below.)
- On (modern'ish) Linux systems, ulimit isn't the only way to limit RAM allocation. There are also mechanisms like cgroups. (See the cgroup check below.)
- AppArmor can limit access to system resources with per-binary profiles. So your shell may have no limits, but your perl binary might. (See the AppArmor check below.)
- Do you use the system perl or a custom one installed in a home directory? A custom binary might be subject to different memory limits than your shell (/usr/bin/* vs. /home/*).
- How does the script get started? Manually on the command line? Or by another process (cron, etc.) that may have lower memory limits, which the perl process then inherits? (See the limits check below.)
- Without knowing the code you run, it's hard to know exactly what TYPE of memory you are running out of. Does your Perl script fork in any way and run into shared memory issues, for example? (See the memory breakdown below.)
- There could also be a problem mmap()ing a huge file into memory, perhaps? Or creating a tie'd file in a small tmpfs directory? (See the tmpfs check below.)
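
The log check: if the kernel's OOM killer (or a similar watchdog) terminated the process, it usually leaves a trace in the logs. A minimal sketch:

```
# Look for OOM-killer activity or other forced kills
dmesg | grep -iE 'oom|killed process'
grep -iE 'oom|killed process' /var/log/syslog
```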
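
The cgroup check, assuming a cgroup v2 (unified hierarchy) system; on cgroup v1 the paths look different:

```
# Which cgroup is the current shell in?
cat /proc/self/cgroup

# Effective memory limit for that cgroup ("max" means unlimited)
cat "/sys/fs/cgroup$(cut -d: -f3 /proc/self/cgroup)/memory.max"
```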
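
The AppArmor check, assuming the aa-status tool from apparmor-utils is installed. Note that AppArmor profiles can also set rlimits directly (e.g. `set rlimit as <= 512M,`), so a confined perl can have a tighter address-space limit than your shell:

```
# Is AppArmor active, and does any loaded profile cover perl?
sudo aa-status | grep -i perl

# Profiles usually live here
ls /etc/apparmor.d/ | grep -i perl
```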
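
The limits check: whatever starts the script, /proc shows the limits a running process actually ended up with, which answers the inheritance question directly. The script name is a placeholder:

```
# Which perl would run from this shell, and from where?
command -v perl

# Limits of the already-running process (my_script.pl is a placeholder)
PID=$(pgrep -f my_script.pl | head -n1)
cat "/proc/$PID/limits"
```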
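
The memory breakdown: /proc and pmap give a rough idea of what kind of memory is growing (again, the script name is a placeholder):

```
PID=$(pgrep -f my_script.pl | head -n1)

# Virtual size vs. resident set vs. data segment vs. swap
grep -E '^Vm(Size|RSS|Data|Swap)' "/proc/$PID/status"

# Per-mapping detail; the last line is the total
pmap -x "$PID" | tail -n 5
```

A huge VmSize with a modest VmRSS points at address-space mappings (mmap, many forks) rather than actual heap growth.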
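
And the tmpfs check: if the script writes a tied or mmap'd file into a RAM-backed filesystem, filling that filesystem eats memory too:

```
# Size and usage of /tmp (adjust to wherever the file actually lives)
df -h /tmp

# All tmpfs mounts currently active
mount -t tmpfs
```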