Actually, both CRC and MD5 can be calculated incrementally without having to stringify the entire dataset at once.
You misunderstand what I meant by stringification; or rather, I didn't make myself clear enough.
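True enough as far as it goes. A minimal sketch with Python's hashlib (my choice of implementation; no particular library was named above) shows the incremental API, and incidentally shows why it doesn't help here: the digest is insensitive to where the chunk boundaries fall.

```python
import hashlib

# Feed MD5 a chunk at a time rather than one big string.
md5 = hashlib.md5()
for chunk in (b"CTCGG", b"TGCGACGGTC", b"TGCCAAGATCGCGTT"):
    md5.update(chunk)

# The result is identical to hashing the whole concatenation at once,
# so the chunk (i.e. item) boundaries leave no trace in the digest.
assert md5.hexdigest() == hashlib.md5(b"CTCGGTGCGACGGTCTGCCAAGATCGCGTT").hexdigest()
```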
Each item in a collection could be a single character, or it could be an entire data structure representing anything. For instance, each item might be:
- An entire 19x19 array representing a position in a game of Go; with a subset of items being a sequence of moves within a game.
- A known marker sequence, i.e. a variable-length sequence of ACGT; with a subset of items being a DNA fragment.
- A frequency spectrum derived by Fourier analysis, representing one instant in a piece of music or a voice recording; with the subset representing a few bars of the tune or a phoneme of the speech.
- A snapshot of stock prices or movements at a given moment of the trading day; with the subset representing elapsed time periods.
- Many other things, simple or complex.
The signature has to capture both the individual items and their ordering. It is easy to demonstrate that if you run MD5 over the subset's items simply concatenated into a single combined sequence, two or more quite different subsets can easily produce the same signature. E.g.
Take the simply concatenated subset of DNA markers CTCGGTGCGACGGTCTGCCAAGATCGCGTT. That could have been formed from any of these subsets:
- CTC GGTGCGAC GGTCTG CCAAGAT CGCGTT
- CTCGG TGC GACGGTC TGC CAAGA TCG CGTT
- CTCGGTGCGACGG TCTGCCAAGATCGCGTT
And many others; but the resultant MD5 would be the same for all of them. Of course you can prevent that by adding (say) nulls between items in the concatenation; but then think about how you would deal with the same problem for audio samples; or video; or ...
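A few lines of Python (hashlib standing in for whatever MD5 implementation is to hand) make the collision concrete:

```python
import hashlib

# Three different segmentations of the same marker stream; once naively
# concatenated, the item boundaries are lost and all three digests agree.
subsets = [
    ["CTC", "GGTGCGAC", "GGTCTG", "CCAAGAT", "CGCGTT"],
    ["CTCGG", "TGC", "GACGGTC", "TGC", "CAAGA", "TCG", "CGTT"],
    ["CTCGGTGCGACGG", "TCTGCCAAGATCGCGTT"],
]
for items in subsets:
    print(hashlib.md5("".join(items).encode()).hexdigest())
# Prints the same digest three times.
```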
The classic way of dealing with this problem is that, instead of concatenating the raw data, you assign each item in the collection a unique letter or number and run MD5 over the concatenation of those: a hash maps individual items to unique placeholders; the placeholders are concatenated; and that concatenation is MD5'd. This is what I meant by two passes.
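A sketch of that two-pass scheme in Python (the names and the fixed-width encoding are my illustration, not anything prescribed above):

```python
import hashlib

# Pass 1: assign every distinct item in the full collection a unique number.
collection = ["CTC", "GGTGCGAC", "GGTCTG", "CCAAGAT", "CGCGTT",
              "CTCGG", "TGC", "GACGGTC", "CAAGA", "TCG", "CGTT",
              "CTCGGTGCGACGG", "TCTGCCAAGATCGCGTT"]
placeholder = {item: n for n, item in enumerate(dict.fromkeys(collection))}

# Pass 2: MD5 the concatenated placeholders of a subset. The fixed-width
# encoding means item boundaries can never be confused.
def signature(subset):
    encoded = b"".join(placeholder[item].to_bytes(4, "big") for item in subset)
    return hashlib.md5(encoded).hexdigest()

print(signature(["CTC", "GGTGCGAC", "GGTCTG", "CCAAGAT", "CGCGTT"]))
print(signature(["CTCGGTGCGACGG", "TCTGCCAAGATCGCGTT"]))  # now different
```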
By using a mathematical algorithm to combine the placeholder numbers as they are looked up, the second pass is avoided; and because the item-to-unique-number mapping produces a consistent output regardless of the type of raw input, all downstream processing can be applied to any type of input data.
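The combining algorithm isn't specified above, so purely as an illustration, here is one order-sensitive way to fold the placeholder numbers in as they are looked up; any combiner for which combine(a, b) differs from combine(b, a) would serve:

```python
placeholder = {}

def running_signature(items, mod=(1 << 61) - 1, base=1_000_003):
    """Single pass: look up (or assign) each item's placeholder and fold
    it into a polynomial rolling hash, so ordering affects the result."""
    sig = 0
    for item in items:
        n = placeholder.setdefault(item, len(placeholder))
        sig = (sig * base + n + 1) % mod  # +1 so placeholder 0 still contributes
    return sig

print(running_signature(["CTC", "GGTGCGAC"]))
print(running_signature(["GGTGCGAC", "CTC"]))  # different: order matters
```

Because the signature is computed from the placeholder numbers rather than the raw bytes, the same routine works unchanged whether the items are Go positions, DNA fragments, spectra, or price snapshots.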
With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
In the absence of evidence, opinion is indistinguishable from prejudice.