I fully agree, that's how my own software does it. Let me re-formulate my answer: quite a few prominent early computer systems (like the IBM/360 and the VAX) had support for BCD math. Since those were designed by very smart engineers, I assumed it must have had some advantages.
BTW, the 6502 processor also has a decimal mode. I've watched a video by Ben Eater where he explains how to convert a binary number into decimal digits for display, and it is quite an involved process. For things like displaying a game score, keeping the score in decimal/BCD form could therefore save some CPU cycles. But I'm in no way an expert on this.
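To illustrate the trade-off, here is a minimal Perl sketch (the routine names are my own and purely hypothetical, not anything from the video or this thread). binary_to_digits() does what a display routine for a binary score has to do: peel off decimal digits with repeated division by 10. bcd_add() keeps the score as one decimal digit per slot and propagates the carry by hand, which is roughly what the 6502's decimal mode does for you in hardware, so the digits are always display-ready.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Binary score: stored as one integer; every display needs a full
    # conversion, i.e. repeated division by 10 (costly on a CPU without
    # a hardware divide instruction, such as the 6502).
    sub binary_to_digits {
        my ($n) = @_;
        my @digits;
        do {
            unshift @digits, $n % 10;   # peel off the lowest decimal digit
            $n = int($n / 10);
        } while ($n > 0);
        return @digits;
    }

    # BCD-style score: one decimal digit per slot (least significant
    # first), updated with an explicit carry, so it is always ready
    # for display without any conversion step.
    sub bcd_add {
        my ($digits, $amount) = @_;
        my $carry = $amount;
        for my $d (@$digits) {          # $d aliases the array element
            last unless $carry;
            $carry += $d;
            $d      = $carry % 10;
            $carry  = int($carry / 10);
        }
        # any leftover carry past the fixed width is silently dropped,
        # like a fixed-width score counter overflowing
    }

    my @score = (0, 0, 0, 0);           # four-digit score, least significant first
    bcd_add(\@score, 250);
    print join('', reverse @score), "\n";          # prints 0250
    print join('', binary_to_digits(250)), "\n";   # prints 250

The point is only that the per-update cost of the BCD version is a few additions with carries, while the binary version pays for the divisions every time the score is drawn; whether that wins on a 6502 depends on how often the score is updated versus displayed.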