Um, ok. It still doesn't mean the limit for 5/0 exists. Look at lim x->0 of 5 / abs(x).
edit: And the 0.999~ = 1 thing is so old and has been proven more ways than I care to count.
Do you just enjoy trolling, Bob? Or do you actually have a point?
He means zero factorial. [Edit: with the time I took to write my reply, the previous sentence is redundant, yes.] There are a few reasons. One is that it allows us to represent "n choose k" as (n! / (k! (n - k)!)). But another is that it is consistent with the relationship n! = n * (n - 1)!:
Code:
n! = n * (n - 1)!
1! = 1 * (1 - 1)!
1 = 1 * 0!
1 = 0!
(-1)! is not defined, because this relationship can be extended no further. Also, 0! = 1 is the value that works with the gamma function.
We could look at n! as the number of ways to order n distinct elements, and then 0! = 1 is consistent with that definition.
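Both views can be sanity-checked in a few lines. This is just a quick sketch; the function names are made up for illustration:

```python
# Factorial via the recurrence n! = n * (n - 1)!, with 0! = 1 as the
# base case -- the value that makes the recurrence bottom out.
def factorial(n):
    if n == 0:
        return 1  # 0! = 1: there is exactly one way to order zero elements
    return n * factorial(n - 1)

# "n choose k" = n! / (k! (n - k)!). The k = 0 and k = n cases come out
# to 1 only because 0! = 1 in the denominator.
def choose(n, k):
    return factorial(n) // (factorial(k) * factorial(n - k))

print(factorial(0))  # 1
print(factorial(5))  # 120
print(choose(5, 0))  # 1
print(choose(5, 2))  # 10
```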
One way of explaining why n / 0 is undefined: if you define the real numbers as "an ordered field with the least upper bound property," the restriction comes straight from the definition of a field. Division means multiplying by a multiplicative inverse, and 0 has no multiplicative inverse in any field. Infinity is not a real number.
>> Do you just enjoy trolling, Bob? Or do you actually have a point?
I was considering asking you nearly the same thing. I provided something neat others may not have seen.
How is this comment not trolling, considering everything here is some "old math concept that has been proven in more ways than you care to count"?

>> And the 0.999~ = 1 thing is so old and has been proven more ways than I care to count.
???
Last edited by BobMcGee123; 10-02-2005 at 08:02 PM.
I'm not immature, I'm refined in the opposite direction.
Originally Posted by ajaxthegreater
Is there an equation, division, etc. that gives you 0.99999999999999999... (goes on infinitely)?

Yes:
Sum from i=1 to ∞ of ( 9 / 10^i )
1 / 3 = 0.333333333333333333...
(1 / 3) * 3 = 0.999999999999999999...
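Both arguments can be checked with exact rational arithmetic, so no floating-point fuzz is involved. A sketch (partial_sum is a made-up name):

```python
from fractions import Fraction

# Partial sums of sum_{i=1}^{N} 9 / 10^i, i.e. 0.9, 0.99, 0.999, ...
def partial_sum(n_terms):
    return sum(Fraction(9, 10**i) for i in range(1, n_terms + 1))

# The N-th partial sum is exactly 1 - 10^-N, so the gap to 1 shrinks
# toward 0: that is what "the infinite sum equals 1" means.
assert 1 - partial_sum(6) == Fraction(1, 10**6)

# The 1/3 argument, done exactly: (1/3) * 3 == 1, even though the
# decimal expansion of 1/3 never terminates.
assert Fraction(1, 3) * 3 == 1
```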
The real "reason" that 0.99999... = 1, though, depends on how you construct the real numbers. One way of representing real numbers is through infinite sequences of rational numbers that "converge" -- and then decimal representations of numbers really represent sequences of rational numbers. E.g. 3.14159... represents the sequence 3, 3.1, 3.14, 3.141, 3.1415, 3.14159, ...., which consists of rational numbers every step of the way. And 1 could be represented by the sequence 1, 1, 1, 1, ....
Under this system, another sequence of rational numbers that represents 1 is the sequence 1.1, 1.01, 1.001, 1.0001, ..., 1 + 0.1^n, ....
In this system, two different sequences of rational numbers are equivalent if and only if, given any value 1/n, you can find a position in the sequences such that the distance between all the pairs of elements beyond that point is less than 1/n. The sequences for 0.99999... and 1 meet this definition of equivalence.
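That equivalence test can be spelled out concretely for these two particular sequences. A sketch, with illustrative names:

```python
from fractions import Fraction

# a(k) = 0.9, 0.99, 0.999, ... (k nines); b(k) = 1, 1, 1, ...
def a(k):
    return 1 - Fraction(1, 10**k)

def b(k):
    return Fraction(1)

# Given a tolerance 1/n, find a position past which every pair differs
# by less than 1/n. Here the gap is exactly 10^-k, which only shrinks,
# so the first position that works suffices for all later ones.
def position_for(n):
    k = 1
    while abs(a(k) - b(k)) >= Fraction(1, n):
        k += 1
    return k

# For every tolerance 1/n such a position exists, so the sequences are
# equivalent: 0.999... and 1 name the same real number.
assert all(position_for(n) >= 1 for n in range(1, 100))
assert position_for(1000) == 4  # 1 - 0.9999 = 10^-4 < 1/1000
```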
The main source of confusion that people have with regard to this matter is that they think that the decimal representation "is" the number. But the sequence of digits is merely a representation.
Last edited by Rashakil Fol; 10-02-2005 at 08:25 PM.
Your understanding of mathematics is silly.
...
Mathematics is useful (and profitable) when it describes reality, but it has nothing to do with reality.
The point of the 1 + 1 = 17 example isn't a particularly intuitive one, but that is the point. This idea of picking axioms which describe 'reality' really means we pick axioms which describe the reality that we perceive. How do you know there isn't a reality in which it is correct to say 1 + 1 = 17 but also 17 * 17 = 17 + 17? We aren't wired to accept that. You can't really prove or disprove this because according to our definition of reality it really doesn't make sense.

There are many aspects of 'reality' which we weren't aware of until the advent of certain technology, and whenever we as humans update our description of reality we typically need to find hack workarounds in mathematics in order to make it work. You can't divide by zero, because as we perceive reality it doesn't make sense, but you can factor out the number in the denominator and call it a limit, because in our perception of reality it makes sense that at any instant there is pertinent information about some object (velocity, for example).

I took a high-level math course that explores the idea that continuity itself might not exist: the perceived force of gravity is not applied continuously (delayed because it travels at the speed of light), changes in energy levels are not continuous (the smallest change in energy is a quantum), and the spread of molecules when performing mass moment of inertia calculations is not continuous (molecules have a finite size).
The point I'm trying to make is that these rules of math that 'work' are based on assumptions, and these assumptions rely entirely on how we are wired as humans.
1 + 1 = 17
17 * 17 = 17 + 17
I challenge you to 'prove' me wrong. Certainly, you cannot without first making certain assumptions, and these assumptions are entirely based on your humanness and, as I said earlier, how you perceive reality.
Is this useful? Not in the god damn slightest. Do I still support my view? You bet.
>> I think you're being absurd! But the standard way of representing 1 + 1 is with the symbol 2.

It works great with your perception of reality.
Originally Posted by BobMcGee123
How do you know there isn't a reality in which it is correct to say 1 + 1 = 17 but also 17 * 17 = 17 + 17?

When you say "a reality," what do you mean?

Originally Posted by BobMcGee123
You can't really prove or disprove this because according to our definition of reality it really doesn't make sense.

What is a "definition of reality," and what does "making sense" have to do with proving something? Something does not have to make sense to be provable. Provability of a statement depends entirely upon the base set of axioms. If your set of axioms lets 17 = 1 + 1, then 17 * 17 = 17 + 17 makes sense. 17 is just a symbol. Some systems of axioms make it possible to prove that you can decompose a sphere into five pieces and then put those pieces back together, without changing their shape or size, into a sphere with half the volume.

You keep going on about "reality." What does this have to do with mathematics?
Let me try to characterize what you are saying in my words.
You seem to be saying that mathematics should not really need to have anything to do with reality, and that it would be different if you were in a universe with different physical laws. Is this close?
My view is that you are correct with respect to your own definition of mathematics. But your view of other people's definitions of mathematics is incorrect, because you seem to be arguing a point where I think most mathematicians would wonder, "what is there to argue?" It would help if you wrote more precisely.
There are indeed situations where the meanings of = and + are defined such that 1 + 1 = 17. For example, this is true in arithmetic modulo 3.
Modulo three, it is in fact true that 1 + 1 = 17, and 17 * 17 = 17 + 17.
This also works modulo 5 or modulo 15.
(Working modulo 3, it is still true that 1 + 1 = 2, because -1 = 2 = 5 = 8 = 11 = 14 = 17 = 20 = 23 = ...)
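Those claims are easy to verify directly. A quick check (eq_mod is just an illustrative helper):

```python
# Two integers are equal modulo m when their difference is a multiple of m.
def eq_mod(x, y, m):
    return (x - y) % m == 0

# 17 = 15 + 2, so 17 and 2 sit in the same residue class mod 3, 5, and 15.
for m in (3, 5, 15):
    assert eq_mod(1 + 1, 17, m)          # 1 + 1 = 17  (mod m)
    assert eq_mod(17 * 17, 17 + 17, m)   # 17 * 17 = 17 + 17  (mod m)

# And 1 + 1 = 2 still holds, since 2 and 17 name the same class:
assert eq_mod(1 + 1, 2, 3) and eq_mod(2, 17, 3)
```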
Last edited by Rashakil Fol; 10-02-2005 at 09:47 PM.
I think we have been looking at it all wrong.
Instead of n/1 = n, I think it should be n/0 = n. That is to say, in human language, n divided by nothing, or rather not divided at all, is still n. Further, to divide n one time creates 2 groups.
With this method, all your divisions are zero based. So
n/0 = n
n/1 = half of n
n/2 = third of n
n/etc = etc+1 of n
One of the problems is that when trying to apply this method to the number scheme, you lose the definition of zero, or nothing, yet you use it as the basis of division, so it creates its own problems.
c++->visualc++->directx->opengl->c++;
(It should be realized my posts are all in a light-hearted manner, and should not be taken offense to.)
Dividing a number by X can be interpreted as dividing it into groups of size X. If I divide memory by kilobytes I might be grouping it into one-kilobyte sized chunks.
Also, you're assuming the thing being divided is linear. What happens when you divide a pie with 5 cuts from the center? You get five pieces.
You just move the problem to -1, instead of 0. Also, n/1.5 is not very intuitive. And n/-2 is worse. I just typed n/-0.5 and my head exploded.
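For concreteness, the "zero-based" scheme amounts to ordinary division with the divisor shifted by one. A small sketch (shifted_div is a made-up name) also shows why multiplication no longer undoes it:

```python
from fractions import Fraction

# JeremyG's scheme: dividing n "k times" yields k + 1 groups, so it is
# ordinary division with the divisor shifted by one.
def shifted_div(n, k):
    return Fraction(n) / (k + 1)

assert shifted_div(12, 0) == 12  # n/0 = n: not divided at all
assert shifted_div(12, 1) == 6   # n/1 = half of n
assert shifted_div(12, 2) == 4   # n/2 = a third of n

# The cost: division is no longer the inverse of multiplication.
assert shifted_div(12, 3) * 3 != 12   # 3 * (12 "/" 3) = 9, not 12
assert (Fraction(12) / 3) * 3 == 12   # ordinary division round-trips
```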
Last edited by Rashakil Fol; 10-02-2005 at 10:33 PM.
Originally Posted by BobMcGee123
The point of the 1 + 1 = 17 example isn't a particularly intuitive one, but that is the point. This idea of picking axioms which describe 'reality' really means we pick axioms which describe the reality that we perceive. How do you know there isn't a reality in which it is correct to say 1 + 1 = 17 but also 17 * 17 = 17 + 17?

We do live in that reality; we just use the symbol "2" for what 1 + 1 is equal to, and we use the symbol "17" for another number.
Our definition of 17, as in the number after 16 and before 18, does not and _can not_ equal 1+1 assuming the usual definitions of "1", "+" and "equal".
Mathematics as a concept is not limited by reality; it is simply a language. It can be used to describe reality, or it can be used to describe non-reality.

Originally Posted by BobMcGee123
You can't divide by zero, because as we perceive reality it doesn't make sense, but you can factor out the number in the denominator and call it a limit, because in our perception of reality it makes sense that at any instant there is pertinent information about some object (velocity, for example). I took a high-level math course that explores the idea that continuity itself might not exist: the perceived force of gravity is not applied continuously (delayed because it travels at the speed of light), changes in energy levels are not continuous (the smallest change in energy is a quantum), and the spread of molecules when performing mass moment of inertia calculations is not continuous (molecules have a finite size).
The reason you cannot divide by zero is not that it doesn't make sense to us; what one can and cannot do in mathematics is not tied to the nature of reality. The reason you cannot divide by zero is that, based on the axioms mathematics sits upon, division by zero is undefined: as a description, it lacks meaning.
Your statements regarding energy levels and gravity have no bearing on the _nature_ of mathematics. Reality is not in principle constrained; mathematics, however, is simply a language that can be used to describe. A description is only meaningful if it means something to us - the ones doing the describing.
Consider it this way: you ask the question, could there be a reality in which 1 + 1 = 17? So I ask: what do "1", "+", "=" and "17" stand for? Because before we think about what reality might be, we have to work out what it is we are asking. If you use those symbols as we most commonly define them, then "1 + 1 = 17" is meaningless, because it violates the definitions it is attempting to use.
Originally Posted by BobMcGee123
The point I'm trying to make is that these rules of math that 'work' are based on assumptions, and these assumptions rely entirely on how we are wired as humans.

... Maths is based on axioms, but those axioms say nothing whatsoever about the nature of reality.
Originally Posted by BobMcGee123
1 + 1 = 17
17 * 17 = 17 + 17
I challenge you to 'prove' me wrong

If you are attempting to use the usual definitions of those symbols (base 10, that is), proving you wrong looks reasonably easy:
17 = 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1
17 = 1 + 1
1 + 1 = 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1
which violates "=", since we don't have the same amount on both sides (cancelling one 1 from each side gives 1 = 16; cancelling sixteen 1s from each side gives -14 = 1).
Last edited by Clyde; 10-03-2005 at 04:16 AM.
Entia non sunt multiplicanda praeter necessitatem
I think it was Lagrange (or one of those French "L" mathematicians) who once said:
"A mathematical model does not have to be true, so long as it is practical."
Enough said.
Teacher: "You connect with Internet Explorer, but what is your browser? You know, Yahoo, Webcrawler...?" It's great to see the educational system moving in the right direction
This has mostly been said:
- x/0 is undefined, because formulas involving division by zero don't make sense anyway. Allowing it would destroy the usual algebraic rules and make arithmetic inconsistent.
- x/0 is not equal to infinity. The two-sided limit of 1/x as x approaches 0 is not infinity; it does not exist, because the left and right limits disagree.
- 0! is defined as 1 because it makes a lot of formulas involving factorials more generic. This is a definition that works very well.
- lim x->0+ (1/x) = +∞
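The earlier 5/|x| example fits the same picture: both one-sided limits blow up, so no real number can be the limit. A quick numeric sketch, using exact rationals to avoid floating-point noise:

```python
from fractions import Fraction

# f(x) = 5 / |x| from the earlier post; sample it at x = 10^-k.
def f(x):
    return 5 / abs(x)

values = [f(Fraction(1, 10**k)) for k in range(1, 6)]
assert values == [50, 500, 5000, 50000, 500000]

# Whatever real bound L you propose, f eventually exceeds it near 0,
# so the limit is not any real number (and infinity is not one).
L = 10**9
assert f(Fraction(1, 10**10)) > L
```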
JeremyG: [new definition of division] That way we would have no inverse to multiplication, which would be bad.
Last edited by Sang-drax; 10-03-2005 at 11:02 AM.