(01-14-2014, 05:56 PM)BAndrew Wrote: It is really Math heavy going. If you don't accept the definition of course then I can't prove anything
Look, I'm not trying to be difficult. I'm not even taking a shot at mathematical limits, I just have a hard time accepting the definition of infinity used in these calculations because it isn't consistent with the real world.
Here's a straightforward question: How long can you keep halving 1 before it reaches 0? And once you "reach" 1/inf, I take it that if you keep halving it you just get 0 from that point on, correct? Doesn't that mean that at some point in the series - halving 1 an infinite number of times - the value "became" 0? So this supposedly infinite sequence of division can be described like this:
1/2 = 0.5; 0.5/2 = 0.25; 0.25/2 = 0.125 … 0/2 = 0; 0/2 = 0
I'm sorry but that just doesn't make sense to me. It's an infinite sequence of taking something and halving it. How can it disappear?
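The tension here is between the terms of the sequence and its limit: no single halving ever produces 0; 0 is only the value the terms approach. A small Python sketch (my own illustration, not from the thread) makes the distinction concrete - exact rational arithmetic stays positive forever, while 64-bit floats eventually underflow to 0.0 as a hardware artifact:

```python
from fractions import Fraction

# Exact arithmetic: halving a positive rational NEVER yields 0.
# Every term of 1, 1/2, 1/4, ... is strictly positive; 0 is the
# *limit* of the sequence, not a member of it.
x = Fraction(1)
for _ in range(1000):
    x /= 2
    assert x > 0               # still positive after every halving
print(x == Fraction(1, 2**1000))   # True

# Floating point tells a different, misleading story: the format
# runs out of exponent range and underflows to exactly 0.0.
f, halvings = 1.0, 0
while f > 0.0:
    f /= 2
    halvings += 1
print(halvings)   # 1075 -- an artifact of 64-bit floats, not of the math
```

The float result is exactly the "it became 0" intuition - but it happens because the representation gives up, not because the mathematical sequence ever contains a zero term.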