Segmentation faults can have a few different causes. Most often they come from accessing a memory address not allocated to the process (i.e., a bad pointer). Pure Perl code generally can't make this mistake, since the interpreter manages memory for you.
A segfault can also be caused by a process running out of stack space, which tends to show up in programs built on recursive algorithms. I don't know for sure, but I believe I once heard that Perl handles deep recursion just fine. That said, perl itself is written in C, so the problem could be in the interpreter or in a C library function it calls.
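As a quick illustration of the failure mode in general (not Perl-specific), unbounded recursion in a bash function will blow the stack almost immediately on a typical Linux setup:

$ bash -c 'f() { f; }; f'    # bash recurses until its own C stack overflows
Segmentation fault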
Since the program runs normally on small runs but not on large ones, stack overflow is my first guess. The default stack size limit on Linux is typically 8 megabytes; once the stack grows beyond that, a segfault occurs.
You can try increasing the stack space with the "ulimit" builtin in bash (or see your shell's documentation). For example:
$ ulimit -s        # print the current soft stack limit, in kilobytes
8192
$ ulimit -s 16384  # raise it to 16 megabytes
$ ulimit -s
16384
For more information on ulimit, read the man pages for "ulimit" and "bash", or run "ulimit -a".
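Note that ulimit only affects the current shell and the processes it starts afterward, so raise the limit and launch your script from the same shell. (The script name below is just a placeholder; also, raising the soft limit can fail if the hard limit is lower.)

$ ulimit -s 16384        # applies to this shell and its children
$ perl your_script.pl    # placeholder name for the failing script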
If you haven't tried it yet, you could also run each job in a separate process, i.e., call perl once per job in a loop rather than having a single perl process run all of the jobs (see the sketch below). I don't know how to predict how many jobs a single process can handle before it becomes too many.
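Something like this, where jobs/*.in and process_job.pl are placeholder names for however your inputs and script are actually organized:

$ for job in jobs/*.in; do       # one input file per job (placeholder glob)
>   perl process_job.pl "$job"   # fresh perl process per job
> done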
Segfaults can also be caused by flaky hardware. It is also possible to cause one intentionally by sending signal 11 with the "kill" system call (or the kill command).
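A quick way to see what that looks like (the pid shown is just an example):

$ sleep 60 &               # throwaway background job
[1] 12345
$ kill -11 %1              # deliver SIGSEGV to job 1
[1]+  Segmentation fault      sleep 60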
Update: eval will not catch a segfault. A segfault causes Linux to kill the process and write a core file (if ulimit is set to allow core files). See the man page for "signal" in section 7 (not section 2) by running "man 7 signal" for a description of what the system does for each signal. A segfault is signal number 11, also known as SIGSEGV, a segmentation violation.
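If you want the core file for debugging, enable core dumps and rerun the failing case. Again, your_script.pl is a placeholder, and note that some distributions route cores to a handler (e.g. systemd-coredump) instead of writing a ./core file:

$ ulimit -c unlimited             # a core size limit of 0 disables core files
$ perl your_script.pl
Segmentation fault (core dumped)
$ gdb "$(command -v perl)" core   # load the core; "bt" prints the backtrace
(gdb) bt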