# Zeno, Limits, and Arguing About Numbers

One of my favorite things about mathematics is that it's its own insular world in many ways (note the correct it's-its usage there; as an aside I think passing an it's/its, your/you're, and their/they're/there test should be a high school graduation requirement, but I digress). As I mentioned to my colleague and fabulous co-instructor Eric, we make choices in mathematics all the time. They are not arbitrary, but we do make them, and we try to do so in a way that's as intuitive and clear as possible. The first example is Euclid's axioms for plane geometry, which we've already seen and which we know cause some trouble once you try to use the parallel postulate. It just gets more exotic from there, but at all times it is important to remember that mathematics is based on axioms and definitions. Once we define a concept, we then try to prove things about it. Then we might worry about whether it has a practical application or not (I said *might*; G.H. Hardy famously abhorred applications of mathematics).

Zeno's Paradox, the one where you can never get where you're going because you first have to go halfway and then half the remaining distance and then half that remaining distance and so on forever, is evident in all sorts of literary works. Woolf's *To the Lighthouse* has it embedded in there a bit--will they ever get to the lighthouse? Will Lily finish her painting? (Yes, and yes, as it turns out.) But it's more blatant in Kafka's *Before the Law*, which we read last week in class. The man comes from the country to see the "law," whatever that is. There is a gatekeeper who will not let him pass at the moment, but he informs the man that beyond the gate there is another, with its own gatekeeper, and that beyond that gate is another whose gatekeeper is so fearsome that even he (the gatekeeper) cannot bear to look at him. So, we are led to conclude that there are an infinite number of gates and gatekeepers, each more powerful than his predecessor. What would such a setup look like? An infinite string of gates like this?

Or maybe it's more like an infinite collection of concentric circles:

Question: can we ever reach the law? Which law are we even talking about? Does it even exist? Of course, the man never even gets past the first gate (this *is* Kafka, after all) and dies waiting, so we never discover the structure of the building which houses the law.

So where's the math here? Well, it's all in the question of how to resolve Zeno's Paradox. This leads to the idea of limit, developed by Bolzano, Cauchy, and Weierstrass over the course of the nineteenth century. Finding the limit of a sequence \(a_1,a_2,a_3,\dots\) amounts to playing the following adversarial game: I claim the sequence converges to some number \(L\). You then tell me how close to \(L\) you need the terms of the sequence to get. Then I find a positive integer \(N\) so that if I go beyond the \(N\)th term of the sequence I'm within your tolerance. In math: \[ \lim_{n\to\infty} a_n = L\] if for every \(\varepsilon >0\) there exists an \(N\) so that if \(n\ge N\), we have \[ |a_n-L| < \varepsilon.\] If you imagine plotting the values of the sequence (after all, a sequence is just a real-valued function with domain the set of natural numbers), then this definition says that if I go far enough out, all the plotted points live inside the horizontal strip \(L-\varepsilon < y < L+\varepsilon\).
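You can actually play the adversarial game on a computer. Here's a small Python sketch (the sequence \(a_n = n/(n+1)\) and the helper name `first_within` are my own choices for illustration, not anything from the course); for a *monotone* sequence, once one term is within \(\varepsilon\) of \(L\), all later terms are too, so finding the first such index wins the game:

```python
def first_within(a, L, eps):
    """For a monotone sequence a(n) -> L, return the smallest n with |a(n) - L| < eps.

    Monotonicity guarantees every later term also stays within eps,
    so this n serves as the N in the limit definition.
    """
    n = 1
    while abs(a(n) - L) >= eps:
        n += 1
    return n

# a_n = n/(n+1) converges to 1, and |a_n - 1| = 1/(n+1),
# so the challenge eps = 0.001 is first met at n = 1000.
print(first_within(lambda n: n / (n + 1), 1, 0.001))  # 1000
```

Whatever \(\varepsilon\) you hand me, the function hands back an \(N\); that back-and-forth *is* the definition.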

But we still haven't gotten to Zeno (will we get there?). What we are trying to do there is add up an infinite string of numbers \[\frac{1}{2} + \frac{1}{4} + \cdots + \frac{1}{2^n}+\cdots\] and the problem is that we don't know how to do that. Can we? This is where the mathematician gets to make a choice. Here's how we deal with infinite sums: we can definitely add up a finite collection of numbers, so given an infinite sum \(a_1+a_2+\cdots +a_n+\cdots\) we define the \(k\)th partial sum to be \[s_k = a_1+a_2+\cdots + a_k\] and then say \[ \sum_{n=1}^\infty a_n = S\quad \text{if} \quad \lim_{k\to\infty} s_k = S.\] So, in the case of Zeno's sum, we have \[s_k = \frac{1}{2} + \frac{1}{4}+\cdots + \frac{1}{2^k} = 1-\frac{1}{2^k}\] (the last equality should be pretty obvious to you--think about how far you are from the end if you've gone \(k\) steps). This sequence clearly has limit \(1\), *et voilà*: we've resolved the paradox.
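If you don't want to take the formula \(s_k = 1 - 1/2^k\) on faith, exact rational arithmetic will confirm it. A quick Python sketch (the function name is my own invention) using the standard `fractions` module, so there's no floating-point fudging:

```python
from fractions import Fraction

def zeno_partial_sum(k):
    """s_k = 1/2 + 1/4 + ... + 1/2^k, computed exactly as a rational number."""
    return sum(Fraction(1, 2**n) for n in range(1, k + 1))

# The closed form 1 - 1/2^k matches the term-by-term sum for every k we try:
for k in (1, 5, 10, 20):
    assert zeno_partial_sum(k) == 1 - Fraction(1, 2**k)

print(float(zeno_partial_sum(20)))  # 0.9999990463256836 -- creeping toward 1
```

Each step halves the remaining gap to \(1\), which is exactly Zeno's setup.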

Or have we? Our students weren't so sure. What we've really done is define the paradox away. That is, by *defining* what we mean by an infinite sum, we are able to demonstrate that it makes sense to add these powers of \( 2\) and that the answer is \(1\). But we haven't really resolved it *philosophically*, have we? Alas.

But that's not what mathematicians do. The definition above is extremely useful and allows us to make sense of all sorts of interesting things like the natural exponential function, trig functions, Fourier series, etc. We'll trade philosophical quandaries for useful mathematics any day.

But here's one more fun thing to talk about, one which invariably spawns arguments. A *geometric series* is an infinite series of the form \[ a+ ar +ar^2 + \cdots + ar^n +\cdots = \sum_{n=1}^\infty ar^{n-1}.\] The number \(r\) is called the *ratio* of the series. We can actually find a formula for the sum of such a series. The trick is to consider the \(k\)th partial sum \(s_k = a+ar+\cdots +ar^{k-1}\), then multiply it by \(r\) to get \(rs_k = ar+ar^2+\cdots +ar^{k-1} + ar^k\). Subtracting the latter from the former and then dividing by \(1-r\) we get \[s_k = \frac{a(1-r^k)}{1-r}.\] Now, if \(|r|>1\), this sequence has no limit since \(|r^k|\) grows without bound. If \(r=1\) this series clearly diverges since I'm just adding \(a\) to itself infinitely many times (assume \(a\ne 0\)), and if \(r=-1\) the partial sums just bounce back and forth between \(a\) and \(0\), so there's no limit there either. But, if \(|r|<1\), the term \(r^k\to 0\) and so we get the formula \[\sum_{n=1}^\infty ar^{n-1} = \frac{a}{1-r}.\] We'll come back to this later in the course when we talk more about infinity and Cantor's work, but for now, let's have an argument.
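The algebra above is easy to sanity-check by machine. A short Python sketch (names are mine, chosen for this post) comparing the term-by-term partial sums against the closed form \(s_k = a(1-r^k)/(1-r)\), again with exact rationals:

```python
from fractions import Fraction

def geometric_partial_sum(a, r, k):
    """s_k = a + ar + ar^2 + ... + ar^(k-1), summed term by term."""
    return sum(a * r**n for n in range(k))

a, r = Fraction(1), Fraction(1, 3)  # an example series: 1 + 1/3 + 1/9 + ...

# The closed form agrees with the brute-force sum for each k:
for k in range(1, 10):
    assert geometric_partial_sum(a, r, k) == a * (1 - r**k) / (1 - r)

# And for |r| < 1, the partial sums head toward a/(1-r):
print(a / (1 - r))  # 3/2
```

With \(a=1\) and \(r=1/3\) the formula predicts a sum of \(3/2\), and the partial sums visibly close in on it.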

What number is this: \[0.99999999\dots\] Note that any repeating decimal represents a geometric series. In this case, we have \[0.99999999\dots = \frac{9}{10} + \frac{9}{10^2} +\cdots +\frac{9}{10^n} +\cdots\] and this is a geometric series with first term \(9/10\) and \(r=1/10\). The sum is then \[\frac{9/10}{1-1/10} = \frac{9/10}{9/10} = 1.\] Thus, we see that \[0.99999999\dots = 1.\]
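For the skeptics, this computation can also be done in exact arithmetic. A little Python sketch (the helper name `nines` is my own) verifying both claims: that \(k\) nines equal \(1 - 10^{-k}\), and that the geometric-series formula with \(a=9/10\), \(r=1/10\) gives exactly \(1\):

```python
from fractions import Fraction

def nines(k):
    """The partial sum 9/10 + 9/100 + ... + 9/10^k, i.e. 0.99...9 with k nines."""
    return sum(Fraction(9, 10**n) for n in range(1, k + 1))

# k nines fall short of 1 by exactly 10^(-k):
for k in (1, 3, 6):
    assert nines(k) == 1 - Fraction(1, 10**k)

# The geometric-series formula a/(1-r) with a = 9/10 and r = 1/10:
a, r = Fraction(9, 10), Fraction(1, 10)
print(a / (1 - r))  # 1
```

Every partial sum is strictly less than \(1\), but the *sum of the series*, by our definition, is exactly \(1\).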

Wait. How can that be? This is where the fight begins, and if you think about it, this is just a rephrasing of Zeno's paradox, where instead of going half the distance at each step, we go \(9/10\) the distance (same difference, just different sized steps). Well, I just proved to you that the infinite sum is \(1\). But wait, you say, that's just in the limit; it never actually *equals* \(1\). But, I say, that's the definition of an infinite sum and the calculation is correct. But, you say, that number has to be less than \(1\). And round and round we go. OK, I say, here's another proof. Let \(x=0.9999999\dots\). Then \(10x = 9.99999999\dots\) and then we see that \[9x= 10x - x = 9.999999999\dots - 0.999999999\dots = 9\] from which it follows that \(x=1\). You can't really argue with this logic. I didn't use limits or the definition of an infinite sum. I just did some algebra. I don't know, you say, something still seems fishy...

Well, ok, how about this one, which I learned from my high school math teacher, Mrs. Ruth Helton. Note the following pattern \[\frac{1}{9} = 0.111111111\dots \] \[\frac{2}{9} = 0.222222222\dots \] \[\vdots\] \[\frac{8}{9} = 0.8888888888\dots\] So we must have \[\frac{9}{9} = 0.9999999999\dots,\] right? I'm being facetious, but you have to admit that it's a good heuristic.

These two numbers really are the same, but it comes down to what we mean by "number." We all understand what a natural number is because we use them to count. It's then not too hard to get to rational numbers because we understand how to divide wholes up into parts. We understand negative numbers because we have all owed someone money at some point. But then we reach the question of what an arbitrary real number is, say \(\sqrt{2}\). It is not a rational number (the fact of which allegedly got its discoverer killed by the Pythagoreans), yet we know it exists since we can construct an isosceles right triangle. More generally, how do we *define* the real numbers? That's a rather complicated question, one which we won't discuss here, but which more or less comes down to approximating any number by rationals via a sequence (truncate the decimal expansion of the number at each place; these are all rational).
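That parenthetical truncation idea can be made concrete. Here's a Python sketch (the helper `truncated_root2` is my own construction for this post): cut the decimal expansion of \(\sqrt{2}\) off after \(k\) places, and each truncation is an honest rational number that undershoots, with the next digit up overshooting:

```python
from fractions import Fraction
from math import isqrt

def truncated_root2(places):
    """The decimal expansion of sqrt(2) cut off after `places` digits: a rational."""
    # isqrt gives floor(sqrt(2 * 10^(2*places))) = floor(sqrt(2) * 10^places), exactly.
    num = isqrt(2 * 10 ** (2 * places))
    return Fraction(num, 10**places)

for p in (1, 3, 6, 10):
    q = truncated_root2(p)
    # Each truncation undershoots sqrt(2); bumping the last digit overshoots it.
    assert q * q < 2 < (q + Fraction(1, 10**p)) ** 2

print(truncated_root2(3))  # 707/500, i.e. 1.414
```

The sequence \(1, 1.4, 1.41, 1.414, \dots\) is a sequence of rationals converging to \(\sqrt{2}\), even though its limit is not rational, and that's precisely the wrinkle the construction of the reals has to iron out.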

So, that's that for this week. Up next, more Kafka and more infinity.