You won't get any collisions, because compression is by definition a reversible transformation; otherwise gzip wouldn't know which original it should unpack to.
But if your strings don't contain much repetition, packing won't make them very small. Gzip works best on long data with plenty of redundancy.
A digest, on the other hand, can have collisions, but it will get your string down to a fixed length. Collisions are VERY unlikely with a good hash, but the time spent computing the hash may eat into the speed you hope to gain from a memory-based solution.
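To illustrate the tradeoff (a minimal sketch in Python rather than Perl, using zlib as a stand-in for gzip compression and SHA-256 as an example of a good hash): compressed output round-trips exactly but its length depends on the input's redundancy, while a digest is always the same fixed size.

```python
import zlib
import hashlib

# Compression is lossless: the original always comes back, so no collisions.
data = b"the quick brown fox " * 50
packed = zlib.compress(data)
assert zlib.decompress(packed) == data

# But the compressed size tracks the input's redundancy, not a fixed length:
short = zlib.compress(b"abc")            # short, non-repetitive: barely shrinks
long_rep = zlib.compress(b"ab" * 1000)   # long, repetitive: shrinks a lot
print(len(short), len(long_rep))

# A digest is always the same fixed length, whatever the input:
print(len(hashlib.sha256(b"abc").digest()))        # 32 bytes
print(len(hashlib.sha256(b"ab" * 1000).digest()))  # 32 bytes
```

In Perl the same comparison could be made with Compress::Zlib and Digest::SHA; the point is only that compression length varies with input redundancy while a digest's length never does.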
In reply to Re: How good is gzip data as digest?
by jethro
in thread How good is gzip data as digest?
by isync