'corruption' - apparently just because the data is hexadecimal rather than decimal
I think the OP was quite specific about the definition of the input format: "a normal word will be 7 0's followed by a number between 0-9 (8-digits total)". To put this in perspective from an ECE point of view, this kind of corruption is completely "normal"; it is exactly what you see in, for example, an RS-232 or wireless serial data stream corrupted by noise. Simply skipping the obviously corrupted values until a good value appears is a valid approach to regaining synchronization with the stream. Of course, there are ways to add error-detection and/or error-correction coding on the transmitting end so that corruption is less likely to reach the receiver in the first place, but a large number of "modern" devices I've worked with still don't do this.
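To illustrate, here is a minimal sketch of that skip-until-valid idea in Python, using the OP's word format (seven '0' characters followed by one decimal digit). The stream contents and function names are hypothetical, and a real receiver would work on bytes from a serial port rather than a string:

```python
def is_valid_word(word: str) -> bool:
    """True if the word is seven zeros followed by one decimal digit."""
    return len(word) == 8 and word[:7] == "0000000" and word[7].isdigit()

def resync(stream: str) -> list[int]:
    """Extract values from a possibly corrupted stream of 8-character words.

    On a corrupted (misaligned or garbled) word, slide forward one
    character at a time until a valid word is found again -- this is the
    "skip the bad data until you see a good value" resynchronization.
    """
    values = []
    i = 0
    while i + 8 <= len(stream):
        word = stream[i:i + 8]
        if is_valid_word(word):
            values.append(int(word[7]))
            i += 8        # aligned: advance a full word
        else:
            i += 1        # corrupted: slide until we realign
    return values

# Hypothetical stream: two good words with hex garbage in between.
stream = "00000003DEADBEEF0000000700000001"
print(resync(stream))  # -> [3, 7, 1]
```

This recovers alignment without any error-correction coding on the wire; adding a checksum or parity at the transmitter would of course make the skipping less necessary.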