in reply to 64-bit digest algorithms

Just out of curiosity, why?

64-bit message digests are considered unsafe; hardware has become too powerful. 128 bits is only "moderately" safe; if you want your digests to be really secure you have to opt for 512 bits. Or is security not an issue for you?

Update

If you insist on a 64-bit message digest, I think DES is the way forward, as brother kvale already suggested.

Re^2: 64-bit digest algorithms
by BrowserUk (Patriarch) on Nov 13, 2008 at 08:39 UTC

    The application has nothing to do with either cryptography or security. Essentially, I want a hashing function with (far) less risk/frequency of collisions than a 32-bit hashing function, and far less space utilisation than a 128-bit digest.

    Assuming uniform distribution (and unlimited space), a 64-bit hash has only a 0.000000023% chance of collisions relative to a 32-bit hash, but uses half as much space as MD5.
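    That figure is the ratio of the two per-pair collision odds, 2**32/2**64 = 2**-32. A quick sketch (Python used purely for illustration, with the standard birthday approximation):

```python
import math

def collision_probability(n_items: int, bits: int) -> float:
    """Standard birthday approximation: probability of at least one
    collision when hashing n_items uniformly into 2**bits buckets."""
    buckets = 2.0 ** bits
    return 1.0 - math.exp(-n_items * (n_items - 1) / (2.0 * buckets))

# Ratio of per-pair collision odds, 64-bit vs 32-bit: 2**-32
print(f"{(2.0 ** 32 / 2.0 ** 64) * 100:.9f}%")  # 0.000000023%

# With a million keys, 32 bits is hopeless but 64 bits is comfortable
print(collision_probability(1_000_000, 32))
print(collision_probability(1_000_000, 64))
```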

    I also considered 48-bit hashes, but they're harder to calculate; and 53-bit (because of the possibility of using doubles for the calculations), but most hashing algorithms use shifting, and that doesn't work with FP. And both are as rare as rocking horse doo-doo.


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.

      A list of Hash functions. There are a few 64-bit alternatives. Maybe elf64?

      HTH

        Many thanks for this link.

        After browsing around and reading a lot of very interesting information, it led me to FNV-1 & FNV-1a which seem to have been explicitly designed for my purposes with particular attention paid to the problems of clustering and collisions.

        This seems to be my best choice.
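        For reference, FNV-1a is simple enough to sketch in a few lines (Python used for illustration; the offset basis and prime are the published 64-bit FNV parameters):

```python
# Published 64-bit FNV parameters (offset basis and prime)
FNV64_OFFSET = 0xcbf29ce484222325
FNV64_PRIME = 0x100000001b3
MASK64 = (1 << 64) - 1

def fnv1a_64(data: bytes) -> int:
    """64-bit FNV-1a: xor in each byte, then multiply by the prime."""
    h = FNV64_OFFSET
    for byte in data:
        h ^= byte
        h = (h * FNV64_PRIME) & MASK64
    return h

print(hex(fnv1a_64(b"some key")))
```

        (FNV-1 is the same loop with the xor and multiply swapped.)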



      When I was playing around recently I used CRC32, taken once forwards across the input and a second time backwards across it. This certainly increases the effective space of the hash -- my sums are not up to any assertion about whether that gives a full 64 bits of "entropy".
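      A sketch of that forwards-plus-backwards idea (Python's zlib.crc32 used for illustration; as said, no claim that the two halves are independent):

```python
import zlib

def crc32_pair(data: bytes) -> int:
    """Pack a forward CRC32 and a reversed-input CRC32 into 64 bits.
    The halves are not provably independent; this just widens the
    effective space compared to a single 32-bit CRC."""
    forward = zlib.crc32(data) & 0xffffffff
    backward = zlib.crc32(data[::-1]) & 0xffffffff
    return (forward << 32) | backward

print(hex(crc32_pair(b"The quick brown fox")))
```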

      There are also two other respectable 32-bit CRCs -- avoiding the need to reverse the input.

      There is also CRC64 itself, but a quick poke at CPAN doesn't reveal an implementation (except buried in some "Bio..." stuff). It wouldn't be too difficult to bang out a table-driven CRC64.
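      A table-driven CRC64 really is only a few lines; here's a sketch (Python used for illustration; the ECMA-182 polynomial is an arbitrary choice -- the post above doesn't name one, and any 64-bit CRC polynomial slots in the same way):

```python
POLY64 = 0x42F0E1EBA9EA3693  # CRC-64/ECMA-182, normal (MSB-first) form
MASK64 = (1 << 64) - 1

def _make_table(poly: int) -> list:
    """Precompute the 256-entry byte-at-a-time lookup table."""
    table = []
    for byte in range(256):
        crc = byte << 56
        for _ in range(8):
            if crc & (1 << 63):
                crc = ((crc << 1) ^ poly) & MASK64
            else:
                crc = (crc << 1) & MASK64
        table.append(crc)
    return table

_TABLE = _make_table(POLY64)

def crc64(data: bytes, crc: int = 0) -> int:
    """MSB-first CRC64 with init=0, no reflection, no final xor."""
    for b in data:
        crc = ((crc << 8) & MASK64) ^ _TABLE[((crc >> 56) ^ b) & 0xff]
    return crc

print(hex(crc64(b"123456789")))
```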

      You don't want anything crypto-secure, so a CRC could well be sufficient, and a good implementation will run like stink compared to any of the crypto-digest stuff.

        There is an arbitrary-output-size hash function family called RadioGatún...
        []s, HTH, Massa (κς,πμ,πλ)
      Doubles have 52 bits. The 53rd bit is the implied leading "1".

      Update: As pointed out by oshalla, the implied bit does count. As pointed out by BrowserUk, I forgot about the sign bit. So there are 54 bits of precision for signed integers.

        No! Doubles have 64 bits.

        Of which:

        • 1 bit is used for the sign;
        • 11 bits are used for the exponent;
        • 52 bits are used for the mantissa;
        • 1 extra bit is implied in the mantissa by the specification;

        With the result being that a 64-bit IEEE-754 double precision floating point value can act as a 53-bit signed integer for some purposes.
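        Those fields, and the 2**53 integer limit, are easy to demonstrate (Python's struct module used for illustration):

```python
import struct

# 2**53 is where a double first loses integer precision
x = float(2 ** 53)
print(x + 1 == x)                    # True: the +1 is rounded away
print(float(2 ** 53 - 1) + 1 == x)   # True: exact up to 2**53

# Unpack the 64 bits of -1.5: 1 sign, 11 exponent, 52 stored mantissa bits
bits = struct.unpack(">Q", struct.pack(">d", -1.5))[0]
sign = bits >> 63
exponent = (bits >> 52) & 0x7FF      # biased by 1023
mantissa = bits & ((1 << 52) - 1)    # implied leading 1 is not stored
print(sign, exponent, mantissa)
```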

        If you're going to try to be a pedant, rather than contribute something useful--at least be right.