in reply to Re^2: Meaning Of error in perl
in thread Meaning Of error in perl

That's not quite true, actually. Division by zero is impossible -- that is to say, zero does not have a multiplicative inverse -- in any field or indeed any ring, but general rings can have zero divisors (i.e., there may be a non-zero element a for which there exists a non-zero element b such that ab = 0).

If n is a composite number, then the quotient ring Z/nZ (basically the same as the integers, but with addition and multiplication carried out mod n) is an easy and natural example. Suppose that n = p * q for some integers p and q with 1 < p, q < n; then [p] and [q] are non-zero elements of this ring, but since we're multiplying mod n, we've got [p] * [q] = [n] = [0].
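
Here is a quick numeric illustration in Perl (a minimal sketch; the choice of n = 6 with p = 2 and q = 3 is just one convenient composite):

    use strict;
    use warnings;

    # Zero divisors in Z/nZ: for composite n = p * q with 1 < p, q < n,
    # the residues [p] and [q] are non-zero, yet their product is [0].
    my $n = 6;
    my ( $p, $q ) = ( 2, 3 );

    my $product = ( $p * $q ) % $n;
    print "[$p] * [$q] = [$product] in Z/${n}Z\n";    # prints: [2] * [3] = [0] in Z/6Z

Neither operand is zero mod 6, yet their product is: exactly the zero-divisor situation described above.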

Re^4: Meaning Of error in perl
by Laurent_R (Canon) on Jul 08, 2014 at 21:31 UTC
    Agreed, if we redefine multiplication and division to mean something other than their common meaning, and also have them operate on objects that are not the usual numbers, then many strange things can happen. Just as redefining white as a special shade of black might lead to embarrassing paradoxes. Don't get me wrong, I know that mathematicians commonly "overload" the basic operators to mean something other than the common-sense operations, and they have good natural reasons to do so.

    But within the context we are really talking about, i.e. common arithmetic multiplication and division between natural, integer, rational, algebraic, transcendental, real or complex numbers, division by zero is mathematically impossible and even inconceivable. And the same goes for integers and floating-point numbers in CS.
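
    For instance (a minimal sketch; the variable names are just illustrative), this is exactly the run-time error the thread started from -- Perl refuses the operation outright:

        use strict;
        use warnings;

        my ( $numerator, $denominator ) = ( 1, 0 );
        my $quotient = eval { $numerator / $denominator };
        print "Caught: $@" if $@;    # Caught: Illegal division by zero at ... line N.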

    I made the point that a division by 0 is mathematically impossible (OK, granted, within the framework of the previous paragraph) because I feel this is a much more general and profound statement than just saying that it is not possible in any known programming language, which could be construed to mean that existing languages all happen to have this limitation. This is not a language limitation; it is something that has been proven to be mathematically impossible. In other words, a very bold statement that it is not only so now, but will forever be so.

      Now wouldn't you concur that this eternal proof hinges upon the existence of a mathematical dogma?

        Well, the use of "forever" might be an overstatement on my part. But no one has so far disproven mathematical theorems demonstrated more than 20 centuries ago by Thales, Pythagoras, Archimedes or Euclid. Twenty centuries is obviously not eternity, but it is quite impressive. No other field of human knowledge has withstood the passage of time so well. Just about everything that was held to be true 2000 years ago in the fields of physics, medicine, astronomy, natural sciences, etc., is now known to be false, or at least grossly incomplete and inaccurate; just about everything that was held to be true in the field of mathematics is still held to be true. A rather impressive accomplishment, don't you think? Call it a dogma if you wish, but I really do not see anything religious about it.

        Math may be incomplete, and the current mathematical corpus will certainly be superseded by a higher-order theory, but I believe that things that are known to be true will remain true.