# Thread: Division by 0

1. >>Do you just enjoy trolling Bob? Or do you actually have a point?

I was considering asking you nearly the same thing. I provided something neat others may not have seen.

And the 0.999~ = 1 thing is so old and has been proven in more ways than I care to count.
How is this comment not trolling, considering everything here is some 'old math concept that has been proven in more ways than you care to count'?

???

2. Is there an equation, division, etc. that gives you 0.99999999999999999...(goes on infinitely)?

3. Originally Posted by BobMcGee123
This has all sorts of intriguing implications. People don't seem to realize the extent to which math is horribly limited. Math is a description of the reality we perceive. The idea that any mathematical construct is somehow 'right' in the absolute sense is, almost in a definitional manner, a flawed idea. Alter the brains of humans and you will see the rules and 'definition' of math change accordingly.

1 + 1 = 17 (base 10) in my perception of reality. Anybody who argues with me is a god damn moron.
Your understanding of mathematics is silly. Mathematics is useful (and profitable) when it describes reality, but it has nothing to do with reality. Generally speaking, mathematics is a game played with symbols. Make a bunch of rules for rewriting expressions, start with a few non-contradictory axioms, and you're off.

It is fine to say that 1 + 1 = 17, as long as this can be proven by or does not contradict your other axioms. If 1 is representing a real number or integer and + representing the standard real-valued addition operator, though, then you better pray that 17 * 17 = 17 + 17.

In many cases, people pick axioms that have some use in reality. For example, the real numbers are very useful for physics, and rational numbers are handy for cooking. People have even tried, and failed, to come up with axioms that fully represent the universe. For example, Euclidean geometry worked pretty well until Einstein's relativity came along.

If you make a system of mathematical notation for which 17 represents 1 + 1, and "bat" represents 17 + 1, that is fine, but the standard way of representing 1 + 1 is with the symbol 2.

4. Originally Posted by ajaxthegreater
Is there an equation, division, etc. that gives you 0.99999999999999999...(goes on infinitely)?
yes:
Sum from i=1 to ∞ of ( 9 / 10^i )
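To make that series concrete, here is a small Python sketch (the name partial_sum is mine) that computes the partial sums exactly with rationals and shows the gap to 1 shrinking:

```python
# Partial sums of the series sum_{i=1..n} 9/10^i, computed exactly.
from fractions import Fraction

def partial_sum(n):
    """0.999...9 with n nines, as an exact rational."""
    return sum(Fraction(9, 10**i) for i in range(1, n + 1))

for n in (1, 2, 5, 10):
    s = partial_sum(n)
    # The gap to 1 is exactly 1/10^n, so it vanishes as n grows.
    print(n, s, 1 - s)
```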

5. Originally Posted by ajaxthegreater
Is there an equation, division, etc. that gives you 0.99999999999999999...(goes on infinitely)?
1 / 3 = 0.333333333333333333...

(1 / 3) * 3 = 0.999999999999999999...

The real "reason" that 0.99999... = 1, though, depends on how you construct the real numbers. One way of representing real numbers is through infinite sequences of rational numbers that "converge" -- and then decimal representations of numbers really represent sequences of rational numbers. E.g. 3.14159... represents the sequence 3, 3.1, 3.14, 3.141, 3.1415, 3.14159, ...., which consists of rational numbers every step of the way. And 1 could be represented by the sequence 1, 1, 1, 1, ....

Under this system, another sequence of rational numbers that represents 1 is the sequence 1.1, 1.01, 1.001, 1.0001, ..., 1 + 0.1^n, ....

In this system, two different sequences of rational numbers are equivalent if and only if, given any value 1/n, you can find a position in the sequences such that the distance between all the pairs of elements beyond that point is less than 1/n. 0.99999... and 1 meet this definition of equivalence.

The main source of confusion that people have with regard to this matter is that they think that the decimal representation "is" the number. But the sequence of digits is merely a representation.
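That equivalence test can be sketched in Python (the helper names a, b, and equivalent_up_to are mine). The distance between the n-th terms of 0.9, 0.99, 0.999, ... and 1, 1, 1, ... is exactly 1/10^n, which eventually drops below any 1/n:

```python
from fractions import Fraction

def a(n):
    """n-th term of the sequence 0.9, 0.99, 0.999, ..."""
    return 1 - Fraction(1, 10**n)

def b(n):
    """n-th term of the constant sequence 1, 1, 1, ..."""
    return Fraction(1)

def equivalent_up_to(n):
    """Check that beyond position n, all pairs are closer than 1/n."""
    # |a(j) - b(j)| = 1/10^j, and 1/10^j < 1/n for every j >= n.
    return all(abs(a(j) - b(j)) < Fraction(1, n) for j in range(n, n + 100))

assert all(equivalent_up_to(n) for n in (1, 5, 50))
```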

6. Your understanding of mathematics is silly.
...
Mathematics is useful (and profitable) when it describes reality, but it has nothing to do with reality

The point of 1 + 1 = 17 isn't a particularly intuitive one, but that is the point. This idea of picking axioms which describe 'reality' really means we pick axioms which describe the reality that we perceive. How do you know there isn't a reality in which it is correct to say 1 + 1 = 17 but also 17 * 17 = 17 + 17? We aren't wired to accept that. You can't really prove or disprove this because, according to our definition of reality, it really doesn't make sense.

There are many aspects of 'reality' which we weren't aware of until the advent of certain technology, and whenever we as humans update our description of reality we typically need to find hacky workarounds in mathematics to make it work. You can't divide by zero, because as we perceive reality it doesn't make sense, but you can factor out the number in the denominator and call it a limit, because in our perception of reality it makes sense that at any instant there is pertinent information about some object (velocity, for example).

I took a high-level math course that explores the idea that even continuity might not exist: the perceived force of gravity is not applied continuously (it is delayed because it propagates at the speed of light), changes in energy levels are not continuous (the smallest change in energy is a quantum), and the spread of molecules when performing mass-moment-of-inertia calculations is not continuous (molecules have a finite size).

The point I'm trying to make is that these rules of math that 'work' are based on assumptions, and these assumptions rely entirely on how we are wired as humans.

1 + 1 = 17
17 * 17 = 17 + 17

I challenge you to 'prove' me wrong. Certainly, you cannot without first making certain assumptions, and these assumptions are entirely based on your humanness and, as I said earlier, how you perceive reality.

Is this useful? Not in the god damn slightest. Do I still support my view? You bet.

but the standard way of representing 1 + 1 is with the symbol 2.
It works great with your perception of reality. I think you're being absurd!

7. Originally Posted by BobMcGee123
How do you know there isn't a reality in which it is correct to say 1 + 1 = 17 but also 17 * 17 = 17 + 17?
When you say "a reality," what do you mean?

Originally Posted by BobMcGee123
We aren't wired to accept that. You can't really prove or disprove this because according to our definition of reality it really doesn't make sense.
What is a "definition of reality," and what does "making sense" have to do with proving something? Something does not have to make sense to be provable. Provability of a statement depends entirely upon the base set of axioms. If your set of axioms lets 17 = 1 + 1, then 17 * 17 = 17 + 17 makes sense. 17 is just a symbol. Some systems of axioms make it possible to prove that you can decompose a sphere into five pieces and then put those pieces back together, without changing their shape or size, into a sphere with half the volume.

Originally Posted by BobMcGee123
...
You keep going on about "reality." What does this have to do with mathematics?

8. Let me try to characterize what you are saying in my words.

You seem to be saying that mathematics should not really need to have anything to do with reality, and that it would be different if you were in a universe with different physical laws. Is this close?

My view is that you are correct with respect to your definition of mathematics. But your definition, or at least your view of other people's definitions of mathematics, is incorrect, because you seem to be arguing a point where I think most mathematicians would wonder "what is there to argue?" It would help if you wrote more precisely.

There are indeed situations where the meanings of = and + are defined such that 1 + 1 = 17. For example, this is true in arithmetic modulo 3.

Modulo three, it is in fact true that 1 + 1 = 17, and 17 * 17 = 17 + 17.

This also works modulo 5 or modulo 15.

(Working modulo 3, it is still true that 1 + 1 = 2, because -1 = 2 = 5 = 8 = 11 = 14 = 17 = 20 = 23 = ...)
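Those congruence claims are quick to verify in Python:

```python
# Verify: modulo 3, 5, and 15, the symbols work out as claimed.
for m in (3, 5, 15):
    assert (1 + 1) % m == 17 % m           # 1 + 1 = 17
    assert (17 * 17) % m == (17 + 17) % m  # 17 * 17 = 17 + 17
    assert (1 + 1) % m == 2 % m            # and 1 + 1 = 2 still holds
print("all congruences hold mod 3, 5, and 15")
```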

9. I think we have been looking at it all wrong.

Instead of n/1 = n, I think it should be n/0 = n. That is to say, in human language, n divided by nothing (or rather, not divided at all) is still n. Further, to divide n one time creates 2 groups.

With this method, all your divisions are zero based. So
n/0 = n
n/1 = half of n
n/2 = third of n
n/etc = etc+1 of n

One of the problems is that when trying to apply this method to the number scheme you lose the definition of zero, or nothing, yet you use it as the basis of division, so it creates its own problems.
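The scheme can be sketched in Python (the name zero_div is mine) to show exactly what breaks: division stops being the inverse of multiplication.

```python
from fractions import Fraction

def zero_div(n, k):
    """Hypothetical 'zero-based' division: k cuts make k + 1 groups."""
    return Fraction(n, k + 1)

assert zero_div(6, 0) == 6   # n/0 = n: not divided at all
assert zero_div(6, 1) == 3   # one cut -> two groups: half of n
assert zero_div(6, 2) == 2   # two cuts -> three groups: a third of n

# The cost: ordinary division satisfies (n / k) * k == n,
# but this scheme does not.
assert zero_div(6, 2) * 2 != 6
```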

10. Dividing a number by X can be interpreted as dividing it into groups of size X. If I divide memory by kilobytes I might be grouping it into one-kilobyte sized chunks.

Also, you're assuming the thing being divided is linear. What happens when you divide a pie with 5 cuts from the center? You get five pieces.

You just move the problem to -1, instead of 0. Also, n/1.5 is not very intuitive. And n/-2 is worse. I just typed n/-0.5 and my head exploded.

11. The point of 1 + 1 = 17 isn't a particularly intuitive one, but that is the point. This idea of picking axioms which describe 'reality' really means we pick axioms which describe the reality that we perceive. How do you know there isn't a reality in which it is correct to say 1 + 1 = 17 but also 17 * 17 = 17 + 17?
We do live in that reality, we just use the symbol "2" for what 1+1 is equal to, and we use the symbol "17" for another number.

Our definition of 17, as in the number after 16 and before 18, does not and _can not_ equal 1+1 assuming the usual definitions of "1", "+" and "equal".

You can't divide by zero, because as we perceive reality it doesn't make sense, but you can factor out the number in the denominator and call it a limit, because in our perception of reality it makes sense that at any instant there is pertinent information about some object (velocity, for example). I took a high-level math course that explores the idea that even continuity might not exist: the perceived force of gravity is not applied continuously (it is delayed because it propagates at the speed of light), changes in energy levels are not continuous (the smallest change in energy is a quantum), and the spread of molecules when performing mass-moment-of-inertia calculations is not continuous (molecules have a finite size).
Mathematics as a concept is not limited by reality, it is simply a language, it can be used to describe reality or it can be used to describe non-reality.

The reason you cannot divide by zero is not that it doesn't make sense to us; what one can and cannot do in mathematics is not tied to the nature of reality. The reason you cannot divide by zero is that, based on the axioms mathematics sits upon, division by zero is undefined: as a description it lacks meaning.

Your statements regarding energy levels and gravity have no bearing on the _nature_ of mathematics. Reality is not in principle constrained; mathematics, however, is simply a language that can be used to describe. A description is only meaningful if it means something to us, the ones doing the describing.

Consider it this way: you ask whether there could be a reality in which 1 + 1 = 17. So I ask: what do "1", "+", "=" and "17" stand for? Because before we think about what reality might be, we have to work out what it is we are asking. If you use those symbols as we most commonly define them, then "1 + 1 = 17" is meaningless, because it violates the definitions it is attempting to use.

The point I'm trying to make is that these rules of math that 'work' are based on assumptions, and these assumptions rely entirely on how we are wired as humans.
... Maths is based on axioms but those axioms say nothing whatsoever about the nature of reality.

1 + 1 = 17
17 * 17 = 17 + 17

I challenge you to 'prove' me wrong
If you are attempting to use the usual definitions of those symbols (base 10, that is), proving you wrong looks reasonably easy:

17 = 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1

17 = 1 + 1

1 + 1 = 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1

Which violates "=", since we don't have the same amount on both sides.

(Solving for one of the 1s on the lhs gives 1 = 16; solving for one on the rhs gives 1 = -14.)

12. I think it was Lagrange (or one of those French L-mathematicians) who once said:

"A mathematical model does not have to be true, so long as it is practical"

Enough said.

13. This has mostly been said:
• x/0 is undefined, because formulas involving division by zero don't make sense anyway. Defining it would destroy the usual algebraic rules and make consistent calculation impossible.
• x/0 is not equal to infinity. Its limit as the denominator goes to 0 is not infinity; the two-sided limit does not exist.
• 0! is defined as 1 because it makes a lot of formulas involving factorials more generic. This is a definition that works very well.
• lim x->0+ (1/x) = +∞
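The last two bullets can be illustrated numerically (a float sketch only, not a proof):

```python
# 1/x blows up as x -> 0 from the right, but has the opposite sign
# from the left, so the two-sided limit at 0 does not exist.
values_right = [1 / 10**-k for k in range(1, 6)]    # x = 0.1, 0.01, ...
values_left = [1 / -(10**-k) for k in range(1, 6)]  # x = -0.1, -0.01, ...

assert values_right == sorted(values_right)  # growing without bound
assert all(v < 0 for v in values_left)       # negative on the left side
```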

JeremyG: [new definition of division] That way we would have no inverse to multiplication, which would be bad.

Originally Posted by Thantos
You can't divide by zero because it's undefined. Meaning we haven't found an answer that meets all of the conditions associated with division and multiplication.
Heh, I love this. Remember back when you were first learning subtraction, and subtracting the larger number from the smaller number was "undefined"? Or you were learning about radicals, and having a negative expression under the radical was "undefined"? Sounds like we just haven't learned enough yet.
Originally Posted by Thantos
Let's say you take 3 / 0 = x. That would mean that x * 0 = 3.

Give me a number that you can multiply by 0 and get 3.
Let's assume such a number exists...call it a. How far can we get? What rules of division and multiplication do we need to meet?
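The question can be explored with a brute-force sketch (is_solution is my name for the condition): no candidate a ever satisfies a * 0 == 3, because multiplying anything by zero yields zero.

```python
from fractions import Fraction

def is_solution(a):
    """Thantos's condition for 3/0 = a: we would need a * 0 == 3."""
    return a * 0 == 3

# A finite search, of course -- but a * 0 == 0 for every number,
# so nothing can ever satisfy the condition.
candidates = list(range(-1000, 1000)) + [
    Fraction(p, q) for p in range(-20, 20) for q in range(1, 20)
]
assert not any(is_solution(a) for a in candidates)
```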

15. Originally Posted by pianorain
Heh, I love this. Remember back when you were first learning subtraction, and subtracting the larger number from the smaller number was "undefined"? Or you were learning about radicals, and having a negative expression under the radical was "undefined"? Sounds like we just haven't learned enough yet?
Wrong. The complex numbers are actually the end of the road; no further extension is needed. This can be proven, but it is quite advanced and I haven't taken that course yet.
