It doesn't matter. Sure, it doesn't "make sense" to divide by 0. But there are several cases where an expression that doesn't really "make sense" is nonetheless defined to have a specific value, because doing so avoids having to treat a bunch of edge cases specially.
For example, 0**0 is generally defined to be 1 even though 0**X is 0 for all X > 0. Likewise, 0! (factorial) is defined to be 1. These conventions "allow us to extend definitions in different areas of mathematics that would otherwise require treating 0 as a special case."
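A quick illustration (in Python, just because it's handy to run): both conventions show up directly in the language, and the payoff is visible when you evaluate a polynomial at 0 — the `poly` helper below is my own hypothetical example, not anything from the thread.

```python
import math

# Convention 1: 0**0 is taken to be 1 (Python follows the usual convention).
print(0 ** 0)             # 1

# Convention 2: 0! is defined to be 1.
print(math.factorial(0))  # 1

# Payoff: evaluating sum(c[k] * x**k) at x == 0 yields the constant term
# only because 0**0 == 1 -- no special case for k == 0 is needed.
def poly(coeffs, x):
    return sum(c * x ** k for k, c in enumerate(coeffs))

print(poly([7, 3, 5], 0))  # 7
```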
I wish I could remember cases where continuous functions were the complete justification for such conventions (I know there is at least one such, but I can't think of it).
- tye (but my friends call me... um... something)

In reply to (tye)Re2: What is zero divided by zero anyway? by tye
in thread What is zero divided by zero anyway? by BrowserUk