Ok, if I add 'use integer;' or 'use bigint;' (the behavior is the same in both cases) to the test script and then run it on my Intel PC with Linux, I get the following output:
12.00
16.00
192
While the Mac M1 user gets:
12.99
16.25
192
Basically, on the Mac M1 the decimals get ignored only in the last 'eval' (the one that multiplies the two previous values); they don't get ignored everywhere, and that's what makes this problem so strange. If he had some env setting that forces integer arithmetic, then all the values would come out as integers, like in the test I just did with 'use integer;'.
I don't think it's a fundamental Mac M1 problem either; rather, I suspect that Perl and/or the Perl-Tk module from MacPorts may have been compiled for the M1 CPU with incorrect flags that cause this behaviour.
That's why I was hoping somebody else here could try to reproduce the issue with the test script on an M1 or M2 Mac.