People often say that Perl differs from C in that it has far fewer built-in limits. I have got used to the power this gives and go about my programming with scant regard for any limits.

On a UNIX (AIX) platform, I have been bitten in the past by the size of the C-shell buffer when globbing, but what other limits should I bear in mind when programming with Perl on a UNIX platform?

(For example: I assume that ulimit restricts the amount of memory Perl can use, but are there any ways around this...?)

Re: Limits in Perl
by trantor (Chaplain) on Aug 09, 2001 at 17:48 UTC

    It's a shared opinion that Perl was born to be useful rather than restrictive. That's why there are (almost) no limits, computationally speaking, on what a Perl program can do.

    Beware, this does not mean you're free to do whatever you want. You have enough rope to hang yourself, if you want, but you can also use strict, or use features scarcely conceivable in other languages, such as taint checks.
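    For instance, a minimal taint-mode sketch (the filename pattern in the comment is just an illustration, not a recommendation):

        #!/usr/bin/perl -T
        use strict;

        $ENV{PATH} = '/bin:/usr/bin';   # -T also insists on an untainted PATH

        my $file = shift;               # anything read from @ARGV is tainted
        system("ls -l $file");          # dies: Insecure dependency in system

        # To proceed you must launder the value through a regex capture:
        # ($file) = $file =~ m{^([\w./-]+)\z} or die "suspicious filename";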

    So basically it is as you said: practical limits are almost certainly dictated by the operating system (e.g. maximum memory usage -- though you can always compile Perl so that it reserves some space for emergencies, the $^M pool available when perl is built with -DPERL_EMERGENCY_SBRK), and beyond that an experienced programmer knows how to scope and limit their code so that it can be reliable, tested, documented and useful. For example, limiting, if not eliminating, global variables is considered very good practice, even though the language itself allows them. You are supposed to know when something is a good thing and when it's more trouble than it's worth.
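    A tiny sketch of the scoping point (nothing here beyond stock use strict behaviour):

        use strict;

        # $total = 0;        # compile error under strict: Global symbol
        #                    # "$total" requires explicit package name
        my $total = 0;       # a lexical, visible only in the enclosing scope
        {
            my $total = 10;  # shadows the outer one inside this block only
        }
        print "$total\n";    # prints 0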

    So, for operating system limits in a UNIX context you may want to check out getrlimit(2), setrlimit(2), and sysconf(3). There's not much you can do about these limits if you're not root. In particular, a low maximum number of open file descriptors can cause trouble in Perl, which is so useful for dealing with files and network connections (both count as open file descriptors). Or you can be creative and work around a maximum memory limit by forking your process and distributing the computation, but then you may run into a maximum-number-of-processes limit :-)
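    As a quick illustration, here is a rough sketch that probes the descriptor limit empirically, simply by opening handles until open() fails (the count comes out a few short of the real limit, since stdin, stdout and stderr are already open):

        use strict;

        my @keep;                       # hold the handles so the fds stay open
        while (open my $fh, '<', '/dev/null') {
            push @keep, $fh;
        }
        print scalar(@keep), " extra descriptors before open() failed: $!\n";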

    Another annoying problem may be a quota limit, which forces you to use no more than X MB on a filesystem. Since Perl is so good at slurping and processing files, this can bite you when you handle, e.g., Web server logs.

    The POSIX module defines constants that can help you find out, for example, the biggest integer the architecture can store in a C int, and so on. It also allows you (at least in 5.6.1) to import the sysconf function, so you can discover which limits you are subject to; any workaround can then be tailored to your specific program, maybe using your creativity :-)
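    For example, a short sketch along these lines (note that sysconf can return undef for limits the system leaves undefined):

        use POSIX qw(INT_MAX DBL_MAX sysconf _SC_OPEN_MAX _SC_CHILD_MAX);

        print "Largest C int:            ", INT_MAX, "\n";
        print "Largest C double:         ", DBL_MAX, "\n";
        print "Max open descriptors:     ", sysconf(_SC_OPEN_MAX), "\n";
        print "Max children per process: ", sysconf(_SC_CHILD_MAX), "\n";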

    I don't know much about other Perl internal limits. Surely someone more experienced (Him in person?) knows much more about them. I can think of a maximum number of identifiers in the symbol table, a maximum length of identifiers and so on, but I would be very surprised to bump into those limits, if they exist at all.

    -- TMTOWTDI

Re: Limits in Perl
by nakor (Novice) on Aug 09, 2001 at 23:15 UTC
    The one that pops to the top of my head is the C stack limit. The Perl regular expression engine is recursive, so if you write an RE that requires ridiculous backtracking depth (on the order of 10**3, depending on your system), you may blow your stack segment and segfault. Other than that, there's the (OS-imposed) open file limit, the C-runtime-imposed stdio limits (e.g., Solaris only allows 256 FILE* streams at once -- at least until 5.8.0 comes out with PerlIO), and general memory limits. Oh, and line numbers are 16-bit integers, so they wrap at 2**16 ;-).
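    A hedged demonstration of the stack problem (whether and where this dies depends on your perl version and stack size; later perls make the regex engine far less recursive):

        # WARNING: may segfault the interpreter on older perls.
        my $n   = shift || 50_000;      # raise this to find your own limit
        my $str = 'a' x $n;
        print "matched\n" if $str =~ /^(?:a|b)+$/;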
Empirical Limit on a line
by John M. Dlugosz (Monsignor) on Aug 10, 2001 at 02:19 UTC
    perl -e "$|=1;$_=x; {print '.'; $_ .= $_; redo }"
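    Each dot marks one doubling of the string, so the number of dots printed before the process dies is roughly log2 of the largest scalar your system will let Perl build: starting from a single byte, about 30 dots means around a gigabyte.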