f77coder has asked for the wisdom of the Perl Monks concerning the following question:

All,

I did search but maybe didn't use the correct terms. How does Perl, without using an external lib, handle arrays that have huge indices of type UInt?

Most compilers seem to optimize around Int, which seems like a huge waste of memory: having to use a size_t larger than what is actually needed, IMHO.

example

I have an array of chars or UInt8s (pick your data type), but the indices are UInt64: UInt64 - 1 > index > Int64

On a fully 64-bit OS, how would or does Perl 5.2x handle this?

Replies are listed 'Best First'.
Re: internal size of array indices limit?
by Corion (Patriarch) on Mar 16, 2017 at 20:41 UTC

    Ideally, all Perl array indices are of size_t. See perlguts:

    SV** av_fetch(AV*, SSize_t key, I32 lval);

    I'm not sure what you mean by "without using an external lib". Every array element in Perl takes at least 16 bytes, on a 64-bit Perl, 24 bytes.
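    One way to see what your own build uses (my addition, not from Corion's post): the Config module records the sizes perl was compiled with, where sizesize is sizeof(size_t) and ivsize is sizeof(IV).

    ```shell
    # Inspect the integer/index widths of the running perl.
    # On a typical 64-bit perl both print 8, i.e. array indices are 64-bit.
    perl -MConfig -e 'print "ivsize=$Config{ivsize} sizesize=$Config{sizesize}\n"'
    ```

    So on a 64-bit build the index type itself is not the limit; the per-element SV overhead Corion mentions is what bites first.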

      Thanks, I'll do the reading.

      External libs would be like hooking in GMP, MPFR or BigInt, BigRat, etc.

Re: internal size of array indices limit?
by LanX (Saint) on Mar 16, 2017 at 20:53 UTC
    Perl has a clever internal strategy to make arrays as flexible and fast as possible, allowing immediate push and unshift of data while rarely needing to reallocate or move anything.

    This comes with the cost of extra allocated space.

    If one doesn't need this flexibility and the data elements are of fixed size, it's not uncommon to use a byte° string as a workaround and to address and replace elements with substr, in order to save space.

    Cheers Rolf
    (addicted to the Perl Programming Language and ☆☆☆☆ :)
    Je suis Charlie!

    °) I.e. not utf8
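    The byte-string workaround above can be sketched as follows (my example, not from the post): UInt8 values live in one packed string at 1 byte per element, instead of a full SV per array slot, and substr does the indexing.

    ```shell
    perl -e '
        my $n   = 1_000_000;
        my $buf = "\0" x $n;               # preallocate: one byte per element
        substr($buf, 42, 1) = chr(200);    # store the value 200 at index 42
        my $val = ord substr($buf, 42, 1); # fetch it back
        print "$val\n";                    # prints 200
    '
    ```

    For multi-byte fixed-size elements the same idea works with an offset of $i * $width; vec is another built-in option for bit- and byte-addressed storage.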