more multiplication

So I wrote about that Chinese multiplication video and how it's not really that great a method if you have large digits.  I gave the example of \(78\times 89\) to illustrate why.  Here's a nice way to do it that is perhaps a little more visually pleasing than the standard algorithm we all learned in school.  I'll explain it in a second, but here's an animation of it.

pretty...

Here's what you do.  Draw a box and divide it into rows and columns, one for each digit of your factors.  In this case, it's a \(2\times 2\) grid. Then draw diagonals running southwest to northeast through each cell.  For each pair of digits, multiply them and write the two-digit answer in the corresponding cell, one digit on each side of the diagonal line.  For example, the \(8\) in \(78\) times the \(9\) in \(89\) gives a \(72\) in the lower right corner.  Then add along the diagonals.  If a sum is more than \(9\), carry the appropriate amount to the next diagonal and add it along with the numbers you find there. 

In this example, we get the product \(78\times 89 = 6,942\).  Neat, huh?
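The diagonal bookkeeping translates directly into code.  Here's a short Python sketch (my own illustration, not part of the original post): accumulating each two-digit cell onto its two diagonals and then carrying is exactly the lattice method.

```python
def lattice_multiply(a, b):
    """Multiply non-negative integers the lattice way: one two-digit cell per
    digit pair, then sum along the diagonals, carrying as needed."""
    xs = [int(d) for d in str(a)]   # digits of a, left to right
    ys = [int(d) for d in str(b)]
    # Each diagonal collects the cell entries sharing a place value, so we can
    # accumulate directly into a positional array (index = diagonal).
    diag = [0] * (len(xs) + len(ys))
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            p = x * y
            diag[i + j] += p // 10      # tens digit of the cell
            diag[i + j + 1] += p % 10   # ones digit, one diagonal to the right
    for k in range(len(diag) - 1, 0, -1):   # carry, right to left
        diag[k - 1] += diag[k] // 10
        diag[k] %= 10
    return int("".join(map(str, diag)))

print(lattice_multiply(78, 89))  # 6942
```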

Really, this is the same algorithm we use, but it has the advantage of not requiring you to keep up with shifting things over in the successive rows when you stack up the various intermediate products.  I try to use this method when I multiply, but to be honest I am so used to the old-timey way I learned in elementary school that I usually default to that.  But I encourage you to give this a try the next time you find yourself calculatorless.

Chinese (?) multiplication

OK.  So my wife posted this video to my Facebook page with the question "true or false?"

Chinese method of Multiplication

It's cute and it has the advantage of appearing to make the "complicated" act of multiplication more visual by reducing it to drawing some lines and counting their intersection points.  But did you notice that the examples all had small digits?  As in, \( 123\times 321 \).  This one trick makes the whole thing appear simpler.

But let's try this one:  \(78 \times 89\).  Here's the picture of the calculation I did using this method:

ugh.

First note that I made a mistake, getting the wrong answer on the first try.  That was because I confused which number of lines was crossing another in each of the four blocks.  Only after I checked my answer with the "western" method did I realize I had done something wrong.  To see just how inefficient this method is, consider the block on the right side of the diamond.  It has \(8\times 9 = 72\) intersection points.  Let's assume I can count that many things in a small area correctly.  Then I have to do it three more times.  Then I have to add up the numbers in the top and bottom columns and get that correct.  Then I have to carry the \( 7\) from the right block and add it to the \(127\) in the middle and then carry the resulting \(13\) to the left to get the final answer \(6,942\).  Whew.
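To see the scale of the counting involved, here's a small Python sketch (mine, for illustration): the intersection points in each block are just the product of the two digit counts, so the method asks you to hand-count a product's worth of points per block.

```python
def crossing_counts(a, b):
    """Tally the intersection points the line-drawing method produces:
    one block per digit pair, with (digit x digit) crossings in each."""
    xs = [int(d) for d in str(a)]
    ys = [int(d) for d in str(b)]
    blocks = [[x * y for y in ys] for x in xs]
    total_points = sum(sum(row) for row in blocks)
    return blocks, total_points

blocks, total = crossing_counts(78, 89)
print(blocks)  # [[56, 63], [64, 72]]
print(total)   # 255 intersection points to count, just for 78 x 89
```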

And, this is really the same as the algorithm you learned in elementary school.  It's just in disguise.  I suppose this is a useful instructional tool to help motivate the general algorithm, but as a practical computational tool I'd say it falls short.  We now return you to your everyday lives in which you're probably using a calculator to do this anyway, if you are multiplying numbers at all.

there's a 14% chance of rain...

My first blog post of this calendar year.  I've made some changes, such as stepping down from my administrative position at the university.  That means I'm on sabbatical this year, or as the provost calls it, "time off for good behavior."  I could get used to this, but just as soon as I do, it'll be over.

Anyway, I've noticed a subtle change to the Weather Channel app on my phone.  Here's a screenshot:

Notice anything unusual?  What does it mean to say there's a 14% chance of rain?  Or a 30% chance? Or a 90% chance? 

I've been noticing a lot of statistical misinterpretation lately.  Malaysia Airlines had two accidents in the span of a few months (one of them not their fault, to be sure) and people stop booking flights.  If only everyone knew about Poisson processes...
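For the curious, here's a hedged Python sketch of the Poisson point.  The accident rate below is entirely made up for illustration, not real airline data; even with a small rate, clusters of rare events are less surprising than intuition suggests.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(N = k) when events arrive as a Poisson process with mean lam per period."""
    return lam ** k * exp(-lam) / factorial(k)

# Assumed rate, purely illustrative: one major accident per five years on average.
lam = 0.2
p_cluster = 1 - poisson_pmf(0, lam) - poisson_pmf(1, lam)
print(round(p_cluster, 4))  # chance of two or more accidents in a single year
```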

But what's up with the Weather Channel?  I'm sure they think that putting more exact percentages, like 47% at 5:00 pm, will inspire confidence.  "Wow, they must be getting really good at forecasting!"  I ask again:  what does it mean to say there's a 30% chance of rain?  It really means that over time, looking at instances where similar atmospheric conditions were in place, it rained about 30% of the time.  That's all.  Reporting such a percentage to one more significant digit does not make me think it's a more reliable forecast, because weather is much too complicated for an accurate forecast more than a few minutes into the future (maybe a couple of hours).  After that, the possible trajectories from the given initial conditions may diverge too wildly to make an accurate prediction. 

So, nice try, Weather Channel, but I'm not convinced.  I've never taken a 20% chance of rain as a serious threat, so I'm not about to consider a 22% chance any other way.  

Christmas calendar

You've probably seen one of these.  This one was on display at my mother-in-law's house this year.  I've never really looked closely at one to figure out how they work, and I decided not to look at this one either.  Rather, I determined the distribution of the numbers on the two cubes while driving around my hometown during the holidays.  My parents live at one end; my in-laws live at the other.  So, lots of opportunity.

The first observation to make is that, because of the 11th and 22nd, both cubes must have a 1 and a 2.  If you don't think beyond that, you might leap to an incorrect conclusion: since there are 8 faces remaining and 8 numbers left to distribute (0 and 3-9), you can just put half of them on one cube and half on the other.  But that's a problem.  If 0 is on only one cube, then you can get only 6 of the nine single-digit dates (those formed with the 6 numbers on the cube without a 0).  It follows that you must put a 0 on both cubes along with the 1 and the 2. 

But wait, that leaves 7 digits to place on the remaining 6 faces.  If we weren't using Arabic numerals we'd be in trouble, but luckily the numbers we use have the (in this case) useful typographical property that the 9 is an upside-down 6 (or vice versa, if you prefer).  And, since we don't need the 6 and the 9 at the same time for this application, we can safely put only the digits 3 through 8 on the remaining 6 faces.  Voila! Calendar cubes.
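If you want to double-check that the 6-as-9 trick really covers every date, here's a quick brute-force sketch (my own; the particular face assignment below is one consistent with the argument above, not necessarily the one on my mother-in-law's shelf):

```python
# 0, 1, 2 on both cubes, then 3, 4, 5 on one and 6, 7, 8 on the other.
cube1 = {0, 1, 2, 3, 4, 5}
cube2 = {0, 1, 2, 6, 7, 8}

def faces(cube):
    # the printed 6 doubles as a 9 when the cube is turned upside down
    return cube | ({9} if 6 in cube else set())

def can_show(day):
    tens, ones = divmod(day, 10)
    f1, f2 = faces(cube1), faces(cube2)
    return (tens in f1 and ones in f2) or (tens in f2 and ones in f1)

print(all(can_show(day) for day in range(1, 32)))  # True
```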

I realize this is utterly trivial, but it's about as much brain power as I can muster while on vacation.  A more interesting question is: how many such cubes can there be?  That leads to all kinds of interesting combinatorics.  Note, though, that determining which three digits you place on the first cube determines those on the second cube.  Choose three digits from the set of six possibilities; there are \(\binom{6}{3} = 20 \) ways to do this.  We then have a set of six numbers: \( \{0,1,2,a,b,c\} \).  How many different cubes can we make labeled with these digits?  This is a classical combinatorics question, usually phrased in terms of paint colors.  There are \( 6! \) ways to place the digits on the cube, but we need to consider rotational symmetry: there are 24 rotations of a cube and so there are 720/24 = 30 such cubes.  So, for each selection of three digits we get 30 distinct first cubes and 30 distinct second cubes, making 900 pairs of cubes for each of the 20, giving us a grand total of 18,000 pairs of calendar cubes.  But wait, I counted each pair twice, so there are only 9,000.  Still quite a few, but I'm sure they only come in one variety.
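The arithmetic in that count is short enough to check in a few lines of Python (a sketch of the computation above, nothing new):

```python
from math import comb, factorial

rotations = 24                                  # rotational symmetries of a cube
labelings_per_cube = factorial(6) // rotations  # 720 / 24 = 30
digit_choices = comb(6, 3)                      # which of 3..8 join 0, 1, 2 on the first cube
ordered_pairs = digit_choices * labelings_per_cube ** 2  # 20 * 30 * 30 = 18,000
print(ordered_pairs // 2)  # 9000 unordered pairs of calendar cubes
```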

Maybe next time I'll just look at the cubes.  

solitaire's the only game in town

Everywhere I turn I see people playing Candy Crush Saga on their phones.  My wife did for a while (she's gotten tired of it, I think); my in-laws play a lot.  I've never once tried it.  I like video games as much as anyone, and indeed I spent my early teens shoving quarters into the Galaga machine at the convenience store (high score over 2,000,000--now you think I'm cool).  These days I'm methodically working my way through Super Mario 3D World on the Wii U.  But games on the phone?  Not so much.

A quick inventory reveals the following games on my iPhone (still a 3GS--I know, how do I survive?):

  1. Set
  2. BClassicLite
  3. Eliss
  4. Doodle Fit
  5. Jelly Car
  6. Paper Toss
  7. Mr. AahH!!
  8. BiiPlane
  9. Paper Glider
  10. Galaga Remix
  11. Flight Control
  12. Sol Free Solitaire

Most of these were installed as restaurant distractions for my son when he was younger.  I've never played BClassicLite, Eliss, Paper Toss, Mr. AahH!!, BiiPlane, Paper Glider, or even Galaga Remix (screen is too small).  The others I've played only a few times.

Except for Sol Free Solitaire.

I'm about to reveal an embarrassing screenshot--the stats the game keeps:

I'll gloss over the fact that this is the second iPhone I've had...

The only really embarrassing thing here is the Total Time Played field.  I realize that 55 hours seems like a lot, but I don't really sit around playing this game all the time. I mostly play when I'm waiting for something--marching band practice to end (lots of time there), doctor's office waiting room--so this could be viewed as a commentary on the drudgery of everyday life.  And I've had this phone for nearly two years, so that works out to only about 5 or 6 minutes a day on average.

I think I'm a pretty good solitaire player.  I play quickly, so it's possible I could improve my win percentage if I were a bit more careful.  I doubt it could be much better though.  I win roughly 1 game in 6 on average, and if you've ever played much yourself you know that's not bad.  But I did get to thinking--what kind of win percentage would a really good solitaire player have? 

To answer that, it would be helpful to know the probability of winning any particular game.  Some games are unwinnable from the start--you can't move any of the initially face-up cards, and passing through the stock (as the pile remaining after the deal is called) yields no moves.  Supposedly these are rare, but they seem to happen with alarming frequency to me.  The number of possible hands is over 7,000 trillion.  That's way too many to analyze all the possibilities, so we can only estimate the probability of winning.

So I looked around the interwebz a bit and found some interesting stuff.  Using Monte Carlo methods, some have estimated the following probabilities:

  • fraction of winnable games: 82%
  • probability of winning with "good" play: 43%

So, it seems that I'm not a very "good" player, even though I think I am.  More discouraging, however, is that in theory I should be able to win about 8 times in 10.  So, why the discrepancy?  The problem is that knowing a game is theoretically winnable requires complete knowledge of the positions of all 52 cards.  So, when I lose a game of solitaire, the chances are good that I could have won had I not made a particular move.  The problem, of course, is that I don't know which move was wrong.  Alas.

On the plus side, though, the app on my phone makes this all a much more efficient waste of time than it was when I was a kid and had to physically deal an entire deck of cards.  Losing in that scenario was a real drag because it meant gathering, shuffling, and redealing the deck.  I kind of miss that, though.

(N.B. if you don't get the reference in the title, listen to this.)

spherical trigonometry

So I just finished reading Heavenly Mathematics, by Glen Van Brummelen because, you know, that's the sort of thing I do for fun.  I didn't go full nerd on it, though: I didn't do the exercises.  I'll save that for another time.

Man, did I learn a lot from this book.  Spherical trigonometry was a fairly standard part of the mathematics curriculum until the middle of the 20th century, but it is all but forgotten these days.  Its primary use was navigation at sea, and modern technology has rendered it obsolete.  Well, not obsolete, exactly, but the technicalities have been incorporated into the modern devices that do our navigating for us.  No doubt logarithms will be the next thing to be dropped (and indeed, no student today learns what logarithms are really for).

My favorite things: in the first chapter, we learn how to estimate the Earth's circumference, construct a sine table with our bare hands, and determine the distance from the Earth to the Moon.  Here's the basic idea of the circumference calculation from the medieval Persian mathematician Biruni.  First, you need the height of a mountain you can climb.  To measure it, you make a big square out of wood or something and measure the distance \( E'G\) and the distance from \( F_1' \) to the point where \( BF_1' \) crosses the left side of the square.  Then you use similar triangles to get the height \( AB\).  In Biruni's example, he found the peak's height to be 305.1 meters.

I've been to the mountaintop...

The next thing to do is to climb the mountain.  If you've stood on top of a big enough hill, you notice that the horizon curves down from where you are.  You can then measure the angle of declination; that is, the angle \( \theta \) in the diagram below, made by the tangent line from the peak to the Earth's surface at the horizon and the horizontal ray from the peak.  Biruni measured this as \( \theta = 34' = 0.56667^\circ \).  Then use the fact that \( \Delta LJB \) is a right triangle to get

the earth, not to scale.

\[\cos\theta = \frac{r}{r+305.1}\]

where \( r\) is the radius of the Earth.  Provided we have a sine table (which we can build with our bare hands, as the author describes, and which Biruni had available), we know \( \cos\theta \).  Solving for \( r\), we get \( r=\) 6,238 km.  The actual value, accepted by scientists today, is 6,371 km.  Not bad for having only primitive tools.
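Carrying out that computation in Python (my own sketch of the algebra: solving \( \cos\theta = r/(r+h) \) for \( r \) gives \( r = h\cos\theta/(1-\cos\theta) \), with Biruni's numbers as quoted above):

```python
from math import cos, radians

def earth_radius(theta_deg, height_m):
    """Solve cos(theta) = r / (r + h) for r, given the dip angle and peak height."""
    c = cos(radians(theta_deg))
    return height_m * c / (1 - c)

r = earth_radius(34 / 60, 305.1)  # Biruni's 34 arcminutes and 305.1 m peak
print(round(r / 1000))  # about 6238 km
```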

Once you know the radius of the Earth, it's actually not too difficult to figure out the distance to the moon, provided you can make some measurements in two locations during a lunar eclipse, which Ptolemy was able to do nearly 2,000 years ago.  He missed on the distance to the sun by a factor of 19 or so (too small), but I suppose we can let that slide.

The other thing I really enjoyed was the proof of Euler's Formula \( V - E + F = 2 \) for convex polyhedra.  There are lots of proofs of this, the simplest using graph theory and more high-powered ones via topology.  It turns out that Legendre came up with a wonderful proof using spherical trigonometry.  The idea is this:  take a convex polyhedron, like, say, a cube, and scale things so that the unit sphere fits around it comfortably.  Then project the polyhedron out onto the sphere (imagine a light source in the center of the polyhedron).  You then have a spherical polyhedron.  Now, we know the surface area of the sphere is \(4\pi \).  We need only then add up the areas of the spherical polygons covering the surface.  This is where the spherical trig comes in:  a spherical triangle \(\Delta ABC \) has area \[\frac{\pi}{180}(A + B + C - 180^\circ)\]  where \(A, B, C\) are the triangle's angles measured in degrees.  By dividing a polygon with \( n\) sides into \( n \) triangles we find that the area of a polygon is \[ \frac{\pi}{180} ((\text{sum of polygon's angles}) - (n-2)\cdot 180)\]  Putting all this together we get
\[ \sum \frac{\pi}{180} ((\text{sum of polygon's angles}) - (n-2)\cdot 180) = 4\pi \] and cleaning up a bit we see \[ (\text{sum of all angles}) - \sum n\cdot 180 + 2F\cdot 180 = 720.\]  But the angles go around the vertices, so their sum is \( V\cdot 360\) and since each edge gets counted twice, \( 2E = \sum n\).  So the last equation reduces to \[ V\cdot 360 - 2E\cdot 180 +2F\cdot 180 = 720,\] and dividing by 360 gives us the result.  Cool.
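You can sanity-check Legendre's bookkeeping numerically.  Here's a small Python sketch (mine, not from the book) that runs the identity \( V\cdot 360 - 2E\cdot 180 + 2F\cdot 180 = 720 \) over a few familiar solids:

```python
# V, E, and the list of face degrees (sides per face) for a few convex polyhedra
solids = {
    "tetrahedron": (4, 6, [3] * 4),
    "cube":        (8, 12, [4] * 6),
    "octahedron":  (6, 12, [3] * 8),
}
for name, (V, E, face_degrees) in solids.items():
    assert sum(face_degrees) == 2 * E          # each edge borders exactly two faces
    F = len(face_degrees)
    total = V * 360 - 2 * E * 180 + 2 * F * 180
    print(name, total)  # 720 every time, i.e. V - E + F = 2
```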

I also learned why Napier was really interested in logarithms and why you used to see tables of logarithms of sines.  It was all about efficient navigation and the resulting cost savings.  Money does indeed make the world go around.

I recommend this book heartily.  You can pick it up and read a chapter in 30 minutes and learn something interesting.   

the real fundamental theorem of calculus

Every time I teach calculus there's one lecture that I love and the students meet with disdain.  Well, maybe not disdain exactly, but indifference.  Yesterday was that lecture and this year was no exception--the low murmur of inattention was all the feedback I needed. 

Why all the hate?  The Mean Value Theorem can't help it.  Oh, and by the way, it's probably the most fundamental result in differential calculus.  It makes most of the rest of calculus work, a point that was driven home to me the first time I taught advanced calculus and had to pull it out in every other proof.  One of these days I'm going to write my calculus for kids book, Where's the Mean Value Theorem?, a Where's Waldo?-style exploration of the subject in which the reader is invited to spot the MVT wherever it's lurking.

The MVT is the Tesla of calculus--so thoroughly embedded you don't even realize it.  In case you've forgotten, here's a picture. The statement of the MVT is that for a function \(f \) continuous on a closed interval \( [a,b] \) and differentiable on its interior, there is a \( c \) in the interior where \(f'(c)(b-a) = f(b)-f(a) \).  In other words, somewhere in the interior, a tangent line has the same slope as the line joining the endpoints of the curve.  In other other words, somewhere in the interior the instantaneous rate of change of the function equals the average rate of change over the whole interval.

the graph of a function with some lines on it. 

That might not seem like a big deal, really, but it is.  Here's the first baby application.  Suppose you take a drive and you travel 120 miles in 2 hours.  How many times did your speedometer read 60 mph?  The MVT says it happens at least once:  since the average speed over the whole 2 hours is 60 mph there is at least one spot where the instantaneous velocity (i.e., what your speedometer reads) equals the total average.  Now really it must happen at least twice since you probably go above 60 at some point and then drop back down to 0.  Or if you have your cruise control set to 60 then it happens infinitely often, but you get the point. 
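If you'd like to see the theorem in action numerically, here's a Python sketch (my own toy example; the trip profile below is made up) that bisects to find the moment the speedometer reads exactly 60:

```python
# Assumed trip profile: steady acceleration covering 120 miles in 2 hours,
# so the average speed over the trip is 60 mph.
def f(t):
    return 30 * t ** 2      # position in miles after t hours

def fprime(t):
    return 60 * t           # speedometer reading at time t

a, b = 0.0, 2.0
avg = (f(b) - f(a)) / (b - a)   # 60 mph
lo, hi = a, b                   # bisect on fprime(c) - avg; fprime is increasing here
for _ in range(60):
    mid = (lo + hi) / 2
    if fprime(mid) < avg:
        lo = mid
    else:
        hi = mid
print(round((lo + hi) / 2, 9))  # the c promised by the MVT: t = 1.0 hours
```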

Here's another math-y application:  suppose a function \( f\) satisfies \(f'(x) = 0 \) on some interval.  Then \( f \) is constant on that interval.  We know that constant functions have zero derivative; this is the converse.  It's easy to prove, too: take \( x_1 < x_2 \) in the interval.  By MVT, there is a \(c\) between them with \(f'(c)(x_2-x_1) = f(x_2)-f(x_1)\).  But the left hand side of that equation is \( 0 \) by assumption and so \(f(x_1)=f(x_2)\); that is, \( f \) is constant.  Corollary:  if \(f'(x) = g'(x)\) on an interval, then \( f(x) = g(x) + C\) for some constant \( C\).  This is proved by letting \(F(x) = f(x) - g(x) \) and applying the previous result.  The big consequence of this is that any two antiderivatives of a function on an interval differ by a constant, thereby unleashing a tidal wave of \( 1\)-point deductions on calculus tests everywhere when students omit the \( + C\).

Want to prove a function only has one real root?  Mean Value Theorem (well, really Rolle's Theorem, the special case where the function has the same value at the endpoints).  L'Hospital's Rule in its full generality?  Mean Value Theorem.  The Fundamental Theorem of Calculus?  Yep, you use the Mean Value Theorem.  Well, there are a couple of proofs.  The first uses the MVT for integrals (which is a consequence of the Intermediate Value Theorem, and asserts that a continuous function takes on its average value on an interval).  The second is more direct and uses the MVT directly.  You can read it here.

There's also a several-variables version, which is used to show that the mixed partials of a function \(f:{\mathbb R}^n \to {\mathbb R}\) with continuous second partials are equal: \[\frac{\partial^2 f}{\partial x_i\partial x_j} = \frac{\partial^2 f}{\partial x_j\partial x_i}. \]  So, while I know it will never catch on (and it probably shouldn't), I will continue to call the MVT the Real Fundamental Theorem of Calculus. 

Calculus as performance art

that little dot in front of the middle screen is me... 

I volunteered to do this, you know.  The department found itself in a bind when the lecturer who had been teaching this class (and coordinating all of Calc I and its 1,800 students) quit abruptly over the summer.  Now I know why, and all I do is lecture. 

The odd thing about "teaching" such a large class is that the workload is relatively light.  I have a phalanx of TAs to do the exam proctoring and grading.  There's an online homework system (it's got lots of problems, but it exists).  So really, all I have to do is show up three days a week and be awesome at talking about calculus.  I've been doing that for years, so no problem. 

Except there's all these bizarre technical issues.  Like, "WebAssign says I'm not signed up, but I paid for access weeks ago," or "My clicker doesn't show up on the screen."  This has made me a manager more than a professor, and frankly I don't have much patience for faulty technology and troubleshooting.  I've got derivatives to talk about, dammit.   

What's been most interesting for me, though, is the opportunity to reflect on the idea of the "lecture" and whether or not it works.  When you have an audience, er, I mean class, of 643 I'm not sure "lecturing" is the proper term.  I wear one of those boy band hands-free microphones (it's even flesh-colored) and harness a pile of technology to pull it off.  Here's a picture of everything I need to make the class go: 

tools of the trade

Well, that's not even everything.  I also need the classroom computer and the document camera (a modern-day overhead projector) to really do everything.  The only thing missing is pyrotechnics.

We use these clicker devices to do in-class quizzes, but really they're just a way to take attendance.  The receiver is about the size of a small wireless router, and in a room that size it's difficult for everyone to have their responses counted.  And don't get me started on the software for them--I'll just say this about it:  it is not possible to search for a name in the class roster. 

Does it work?  Well, I don't know.  There is a constant low hum in the room, and I don't even say anything about it.  I think maybe a third of them are really paying attention.  It's hard to say since I'm looking at a sea of faces and it's effectively impossible to make any sort of human connection.  Add to that the fact that more than 90% of them have taken calculus before and you have a perfect opportunity for inattention.  There hasn't been any crowdsurfing yet, though, so maybe they're more engaged than I think.   

Mostly I think we're missing a real opportunity.  Most of them have seen the material before and yet we still just run through the course like it's the first time for them.  And, frankly, we have to do it that way because of the 60 or so for whom it is the first time.  That's the mistake.  We could have a section for those students.  Why not take the 600 and use the fact that they already know how to calculate derivatives to really get a better conceptual understanding of the material?  I'm thinking here of the approach Rob Ghrist takes in his Calculus MOOC (which I passed, yay!) in which it really gets emphasized that the derivative is the first-order variation of a function, and that the functions we use most of the time have Taylor expansions and that's how we really calculate things.  You know, the really interesting stuff.  And I really think this would help them in their engineering studies since it would inculcate the idea that approximation is extremely important in practice.

But that's for another time.  For now I'll just keep practicing my dance moves and snake oil pitches.  As Billy Flynn would say, "Razzle dazzle 'em." 

Nicholson Baker is not even wrong

Like every other mathematician, I had a knee-jerk reaction to novelist Nicholson Baker's piece (warning: paywall) in the September Harper's bashing Algebra II.   I mean, really, what does some fiction writer know about it, amirite? 

Then, after tracking down a physical copy of the magazine at an actual bookstore, I read it.  And I was frustrated.  And not for the reasons you might think.   And not for reasons others have pointed out (and those are good reasons, by the way--read them).

No, I actually agree with much of what Baker has to say.  My whole issue is that the article is disingenuous, perhaps even intellectually lazy.  The real issue under discussion is school reform and the whole accountability movement; Baker simply uses a very unpopular subject (Algebra II) as his straw man.  In particular, he grabs rational functions as an example and then goes on to pillory the topic as evidence of how useless Algebra II is for the average person and how this causes students to feel stupid and worthless.

And he's not even wrong (apologies to Wolfgang Pauli).  Baker is smarter than this, and indeed, he goes on to argue that the real problem with high school mathematics is that it's been decontextualized and broken into a sequence of discrete steps, each devoid of meaning.  He's largely correct, but his counterproposal has lots of its own flaws.   

We've done this to all of K-12 education, though.  Our slavish devotion to test-taking has forced students and teachers to develop coping mechanisms.  Here's an example from my own life, which happened just this weekend.  My son began high school this year and he's taking something called "Honors Pre-AP English I" (this is real, not some Orwellian nightmare).  On Saturday, I learned the "CD/CM method" of essay writing.  It goes like this:  fill out a sheet with your "funnel" (4-7 sentence introduction), your thesis statement, and then for each of three paragraphs you have 11 (!) sentences--your topic sentence (fine) and then CD#1, CM#1, CD#2, CM#2, ..., CD#5, CM#5.  What is a CD, you ask?  Concrete Detail.  A CM?  Comment, of course.  Now, this is really just a super-extended outline for an essay, but my son was extremely frustrated by this, eventually exclaiming, "I just want to write the damn paper!"  But the worksheet is part of the grade.  What to do? 

Well, we worked it all out, but the bigger point is this:  All of education these days is a sequence of discrete steps, decontextualized and devoid of meaning, which are to be spit out on a standardized test.  My beef with Baker and Harper's is that they picked on math to sell magazines and all this article will accomplish is to make the Common Core crowd (of which I am not a member) dig in their heels to defend Algebra II.  Meanwhile, American students continue to languish in underfunded schools that crush creativity.  Business as usual.    

MOOC update #3

I confess:  I am too far behind at this point to watch all the lectures.  A week off at the beach followed two weeks later by a week at a conference in Germany has left me unable to catch up.  I did watch the BONUS material in Chapter 3, which was very cool (more on that below), but I skipped most of the rest.  Now, to be honest, I don't feel too bad about it since those are the lectures about techniques of integration and that's pretty standard stuff.   Still, this isn't quite what I promised to do, is it?

And, that, it seems to me, is the issue with MOOCs.  You make a promise, but really only to yourself.  It's kind of like a new year's resolution--no harm if you don't keep it.  If you are conscientious, then you may really gain something, but otherwise you just maintain the status quo.  In 2012 I resolved to read Moby Dick and I did it (although I found this version last month, which in many ways is just as good).  This year I promised myself I would read War and Peace; I quit after 30 pages.  Meh.

I've also stopped doing the homework.  To really take this seriously requires about an hour per day of work: watch the lecture and do the homework.  And, frankly, I've done enough calculus problems in my life.  I still enjoy them, but I don't really need the practice.  I take the quizzes, though, and I plan to take the final, but that's mostly out of intellectual curiosity. 

And really, it's a shame.  This course is spectacular.  I cannot praise it enough.  The lectures are superb.  The production value on the videos is top-notch.  Rob gives excellent examples, explains things well, etc.  In short, it's almost perfect and I would recommend it highly to anyone.  Well, anyone with the self-discipline to see it through. 

One of the Chapter 3 videos I did watch was the bonus material about the Fundamental Theorem of Calculus.  I assumed (correctly) that this is where we would see a proof of the theorem.  I once had an idea for a book called Where's the Mean Value Theorem?  It would be a Where's Waldo?-style math book in which the reader would be invited to find where the Mean Value Theorem is lurking in the proofs of the theorems.  I often assert to my students that the real Fundamental Theorem of Calculus is the Mean Value Theorem--it drives almost everything that comes after it.  So, in the case of this course, I was wondering how Rob was going to pull off the proof of the FTC without using the Mean Value Theorem, which had not appeared in any of the lectures.   

In case you've forgotten, the Fundamental Theorem of Calculus states the following.  Suppose \( f \) is a continuous function defined on some interval \( [a,b] \) and that \( F \) is an antiderivative of \( f \) on the interval.  Then \[ \int_a^b f(x)\, dx = F(b) - F(a). \]  Calculus students love this theorem.  Finally, I can stop dealing with those ridiculous Riemann sums and compute integrals easily (well, assuming I can find an antiderivative).   
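To see the theorem at work numerically, here's a Python sketch (mine, not from the course) comparing a midpoint Riemann sum against \( F(b)-F(a) \) for a function whose antiderivative we know:

```python
from math import cos, sin, pi

def riemann_sum(f, a, b, n):
    """Midpoint Riemann sum of f over [a, b] with n subintervals."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# f = cos, F = sin, so the FTC says the integral over [0, pi/2] is sin(pi/2) - sin(0) = 1
approx = riemann_sum(cos, 0.0, pi / 2, 1000)
exact = sin(pi / 2) - sin(0)
print(abs(approx - exact) < 1e-6)  # True
```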

The standard proof of this involves first proving that the function \[ F(x) = \int_a^x f(t)\, dt \] is an antiderivative of \( f \) on the interval.  This is really the tricky part and it is here that we typically invoke the Mean Value Theorem.  I won't write the proof or the MVT out explicitly; you can look them up in any calculus text.  What Rob does here, though, is use the idea that the derivative measures the first order variation of a function and then use the Taylor expansion of \( f \) to show that this function \( F(x) \) is indeed an antiderivative of \( f \).  This requires an assumption, of course, that \( f \) has a Taylor expansion.  Now, most reasonable functions do have such representations, but there are plenty of continuous functions that don't.  Still, this is a reasonable approach in keeping with the overall theme of the course.  And, in practice, most of the functions we're interested in do have such expansions.

So, on to Chapter 4, which covers applications of integration.  I will try to watch most of these lectures as I am always interested in seeing good examples.  More updates to come. 


MOOC update #2

Uh oh.  I fell behind.  Well, really, I made a choice.  I took a vacation and even though I had occasional internet access I decided not to think about the MOOC.  This is a positive and negative thing, of course.  It's nice that there are no real deadlines associated with this MOOC (and with many online courses in general) so I can choose to take some time away.  I did take the Chapter 2 quiz before I left town, though, so I wouldn't destroy my grade (I'm joking--each quiz counts only 4%).  However, I know the material, so it was easy for me to do this.  If a student trying to learn this stuff for the first time fell this far behind, it would be difficult to catch up.  I'm back at work this week and I watched the last couple of Chapter 2 lectures yesterday.  I still have all of Chapter 3 to get to; these were released in two chunks on June 21 and June 28.  I suppose I better get cracking.

This experiment is mostly about the MOOC format for me, although I do love the material.  Calculus is still one of my favorite things to teach, and it's interesting to see how a colleague does it.  Chapter 2 was about derivatives and their applications.  Rob chose a more conceptual approach, opting not to focus on the derivative as a slope.  This is good.  Derivatives are much more than that, and I particularly like his emphasis on the fact that the derivative measures the first-order variation of a function.  Via this definition, the Leibniz rule for the derivative of a product falls out for free.  I do have one quibble, though--the discussion of the chain rule was a little hard to follow.  That is, while thinking of the derivative as first-order variation allows one to see that the derivative of a composition is a product of (some) derivatives, it's not really clear what those individual factors are.  With 25 years of calculus under my belt I was able to watch and understand, but I wonder if the typical first-year college student, even at Penn, is able to get this idea clearly.  Then, all of this goes away in the exercises when it is assumed that students will calculate derivatives using what they already learned in high school: \[ (f\circ g)'(x) = f'(g(x))g'(x). \]  

But this is nitpicking.  Overall, the presentation is still top-notch, with good examples and interesting applications.  The discussion of L'Hopital's Rule showed clearly why this result works (via Taylor series).  I especially loved the bonus material about the infinite power tower: \[ f(x) = x^{x^{x^{x^{x^\cdots}}}}. \]  This is a tricky function, and it's not possible at this level to really treat it carefully, but using implicit differentiation we can calculate its derivative quickly.  Neat.

I'm not so sure how the UF students who are taking this course with me are doing, though.  I created a poll in our internal e-Learning site to gauge how it's going, asking questions to find out if they are still watching the lectures, doing the homework, etc.  There are more than 30 students signed up for the MOOC; I got 5 responses to my poll.  I think this says a lot, and goes to the heart of the matter for MOOCs generally at this stage--there is little incentive to keep up and complete the course.  Online courses for credit at the university are different since there is a tuition charge involved and the result goes on the student's permanent record.  If universities begin to offer credit for MOOCs, then I suspect this will change.  Rob's course has been recommended by the American Council on Education as one of five worthy of college credit.  Surely more will follow.  I don't know of any universities that have decided to offer such credit, but I suspect some will eventually.  What sort of impact will this have on higher education generally?  I'm not sure.  It could go either way, really.

For now, I'm going to get back to work and find out what Rob's take is on integration.  Personally I find much of this topic to be a bit dry and I de-emphasize the techniques portion.  Really, that is a bag of algebra tricks and I prefer to focus on applications.  More than a week of talking about techniques causes my students' eyes to glaze over, mostly because they think they mastered that in high school already.  Maybe so, maybe not.  That's the real challenge of teaching calculus at a university these days--overcoming students' conviction that they already know it all.  This is where this MOOC shines. 

Another update soon.  But first, Independence Day! 

calculus at the beach

logarithmic spirals everywhere... 


I'm on vacation at Edisto Beach, SC, this week, so I'll have a MOOC update when I return (vacations are good and bad for MOOCs, obviously).  Still, being the good math geek that I am, I can't help noticing mathematics all over the place, even when I'm supposed to be taking a break.  I found the shells in the picture above in a fairly short time just by looking carefully.  I have jars full of these at home; it's a bit of a problem, really.  I love the different color combinations that occur, and of course I also love logarithms, so win-win. 

Tides are interesting to think about.  Newton spent a lot of his spare time, you know, between alchemy and universal gravitation, trying to figure out a formula that would predict the tides.  It's extremely complicated, of course, because it depends on the Earth, Sun, and Moon, but sitting on the beach all day allows you to notice certain things.  For example, when is the water level rising or falling the fastest?  There is an analogy here with the length of the day, something which can be predicted nicely using a sine function.  Day length increases at the fastest rate at the spring equinox because that is when the derivative of sine (i.e., cosine) is greatest.  If you've ever paid attention, you've probably noticed that.  Well, when should the sea level be rising fastest?  Halfway between low and high tide, right?  That's what I noticed anyway, but of course I couldn't make careful measurements while hiding from the sun under an umbrella. 
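To put the day-length analogy in symbols, here is the simple sinusoidal model (a sketch; the amplitude \( A \) depends on your latitude, and \( t \) is measured in days from the moment day length crosses its mean):

\[ L(t) = \bar{L} + A\sin\left(\frac{2\pi t}{365}\right), \qquad L'(t) = \frac{2\pi A}{365}\cos\left(\frac{2\pi t}{365}\right). \]

The rate \( L'(t) \) is largest when the cosine equals \( 1 \), i.e., exactly when \( L(t) \) is crossing its mean value.  The same reasoning applied to a sinusoidal tide says the water should rise fastest halfway between low and high.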

And then there is the 12-variable optimization problem arising from trying to maximize the happiness of a whole house full of people.  Luckily, my mother-in-law is pretty good at solving that one, although it's really only possible in practice to find saddle points.  Local maxima are hard to come by with small children involved.

Back to work on Monday for a short week.  And back to the MOOC, too. 

MOOC update #1

I'm near the end of week 2 of Dr. Rob Ghrist's Calculus MOOC, so I thought it would be a good time for an update on how this experiment is going.  There are 38 UF students enrolled in this course with me.  I really don't know how many of them are keeping up with the class, doing the homework, etc.  That is one thing that I'm most interested in--how many people will actually complete the course? 

I created a poll to find out how much math each person has taken.  All 20 students who responded have already completed at least Calculus III at the university.  So this course is definitely review for all of us.  That's a general occurrence with MOOCs at this stage--many of the participants already know the material.  I suppose it would be a more interesting experiment for me to take a course on a subject in which I am not an expert, but I thought it might be better to start with something that would not require a lot of extra work on my part.   

Anyway, here are my initial impressions of the MOOC experience.  Fifteen-minute lectures are nice.  My attention doesn't really wander because there isn't time for that.  Rob's videos are particularly good because he draws well; the screen is always colorful and attention-grabbing.  I'm not surprised by this since Rob has a reputation for his careful attention to detail and seemingly boundless energy (fueled by Monster, I think).  His approach to the material is one I like--begin with Taylor series as definitions for functions.  Of course, this assumes that students have seen calculus before and know how to differentiate and integrate basic functions, but this idea leads to a more solid conceptual understanding of what's going on.  It also allows us to calculate things effectively and efficiently.  Here's one of the (challenge) homework questions: 

\[ \lim_{x\to 0^+} \frac{\sin (\arctan (\sin x))}{\sqrt{x} \sin 3x + x^2 + \arctan 5x} \]

A first-semester calculus student would cringe at this before plowing ahead with L'Hopital's Rule.  The derivative of the numerator is a bit of a nightmare and the potential for algebraic errors is enormous.  But, if you know about the series expansions of trig functions, this becomes much easier.  For \( x \) close to \( 0 \), both \( \sin x \) and \( \arctan x \) are approximately equal to \( x \); therefore, the numerator above is about \( x \).  The denominator is approximately \( 3x\sqrt{x} + x^2 + 5x \), and near \( 0 \) the \( 5x \) term dominates.  The limit is therefore \( x/(5x) = 1/5 \) and we arrive at that answer pretty easily (indeed, I did it in my head).
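If you want to double-check the shortcut symbolically, a few lines of SymPy will do it (my own aside, not part of the course materials):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
numer = sp.sin(sp.atan(sp.sin(x)))
denom = sp.sqrt(x)*sp.sin(3*x) + x**2 + sp.atan(5*x)

# The one-sided limit as x -> 0+ agrees with the series argument above.
print(sp.limit(numer/denom, x, 0, '+'))   # 1/5
```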

So, the pedagogy is sound, but what about the interface?  As I said, the videos are great, but there is a fundamental problem which has no solution--there is no ability to ask questions of the instructor in real time.  Yes, there are discussion groups at the coursera site and they can be very useful.  That sort of peer instruction is valuable, and in this case one of the participants found a flaw in one of the homework exercises that could only be solved using complex analysis.  Rob himself was impressed by this, as was I, because no one had noticed a problem with the exercise in two runs of the course.  I certainly didn't catch it.  So I've definitely learned a couple of new things.  Still, if I were learning calculus for the first (or second) time via this platform, I'm not sure I would necessarily like it.  Students definitely have to be self-motivated.  That's true for traditional lecture courses at any university, of course, but in the coursera environment it is easier to let things slide.  That may be a function of the cost--free at coursera versus pricey at the university.

A word about the homework:  there aren't that many problems, but I think that's appropriate.  We often assign too much homework to our students; this course walks the line correctly in my view.  Each lecture has two homework sets attached, the core and the challenge.  The former reinforces the material in the lecture while the latter goes a bit deeper.  The problem I gave above is a challenge problem, of course, because it goes to the more conceptual aspects of Taylor series.  I will confess to making occasional errors on the problems, mostly small arithmetic mistakes because I tend to do them quickly (duh, I know this stuff already so surely I don't need to spend any time on the problems, right?).   

This is proving to be an interesting experience.  I'll be back with more updates in the coming weeks.  I still don't know what I think of the whole MOOC idea at this stage, but I'm glad I will be able to make a more informed decision about them in the future. 

the temptation of MOOCs

When I was in college, Martin Scorsese's controversial film, The Last Temptation of Christ​, premiered.  No theater in Blacksburg screened it, but I did manage to catch it one weekend when I went home.  One very polite lady approached us as we walked up to the box office, offered us some Christian literature, and suggested that perhaps we should skip the film.  Other protests across the country were not so tepid.  A guy who lived across the hall from me told me he was boycotting the film, even though he had not seen it.  All he knew was that it had some scenes in which Jesus has a family (which implies he had sexual intercourse at some point) and that the film was therefore blasphemous.  However, these scenes are hallucinations when Jesus is on the cross, and the final scene shows him rejecting this last temptation to simply be a man.  He jubilantly accepts who he is and embraces his identity.  Frankly, I think the film affirms the Christian faith while addressing the dichotomy of Jesus as human and divine.    

​I bring this up because there has been a lot of talk lately about Massive Open Online Courses (MOOCs).  Critics worry that MOOCs will be used to justify further cuts to higher education funding and will transform most faculty into glorified teaching assistants.  In the last couple of weeks, the philosophy faculty at San Jose State University published an open letter to Michael Sandel, a Harvard professor whose Justice MOOC was being pushed upon them by the SJSU administration.  The letter is interesting and makes some valid points, many of which I agree with, but as I read it I couldn't help thinking about my experience with The Last Temptation of Christ​.  These professors rejected something without even trying it.  I suppose you could say that they should just stand on principle, but I think they are missing a chance to engage in an educational experiment.  They may very well be right that Prof. Sandel's MOOC isn't right for their students, but how do they know​?  I propose the following:  Mad MOOC: Beyond Thunderdome.  Two courses enter; one course leaves.  Have half the SJSU students take the MOOC and put the other half into the local course.  Perform the same assessments.  Get some data.  Then​ decide what the right course of action is.

I haven't made up my mind about MOOCs yet.  I see the potential to use them in interesting, blended ways that could enhance the traditional classroom experience.  I also share my colleagues' concerns about how MOOCs could be misused by politicians looking for cost-saving measures.  I thought I'd investigate myself and try to get some current UF students to join me.  Here's the email I sent them:

In the spirit of intellectual inquiry, I have enrolled in a MOOC this summer and I am writing to invite you to join me.  The course is taught by Professor Robert Ghrist, a mathematician at the University of Pennsylvania.  The course is called Calculus: Single Variable and you may sign up at coursera:  https://www.coursera.org/course/calcsing.  Now, to be fair I’m cheating a bit—after all, I am a professor of mathematics myself so the workload won’t be too demanding for me.  However, I know Prof. Ghrist, and I can assure you that he gives the best lectures I have ever seen.  He is very engaging and funny.  Check out the preview video for the course and you’ll get a feel for his style.
In conjunction with this course, I have created an e-Learning site where we can interact and discuss this on two levels:  the actual mathematics involved, and the meta-analysis of the MOOC experience.  If you would like to join me in this adventure, you should do the following:
1. Sign up for the course at https://www.coursera.org/course/calcsing.  It is free.
2. Send me an email with your gatorlink and I will add you to the e-Learning site.
The course begins May 24, 2013 and runs all summer.  Dr. Ghrist assumes that you have seen calculus before (e.g. Calculus AB in high school) and can do basic things.  You will gain a much deeper conceptual understanding of calculus if you work at this course.
​If any current or incoming UF students reading this post would like to participate but did not receive this email, feel free to drop me a line at kknudson@honors.ufl.edu.  I've already had a couple dozen or so students sign up.  I'll report back from time to time to let you know how it's going.

birth of Venus

Have you seen the ad for the new Facebook phone?  I guess it's a phone.  Maybe it's an operating system for a phone.  Actually I don't know what it is, but the commercial still makes my blood boil.  I don't want to embed it, but here's the link.​

The implication is this:  art museums are a stoopid waste of time and thank god you have your Facebook phone to keep you up-to-date on important stuff like photos of your friends eating cheese puffs and what time you'll all meet up at the dance club later.  Don't get me wrong--I like eating cheese puffs and when I was in my 20s I didn't mind going to dance clubs, although these days even standing around at a reception for an hour makes my lower back ache.  But I really have a hard time with an ad campaign that feeds the modern attitude that one shouldn't be here now, that there are more important or interesting things to be doing than whatever it is you're doing at that moment.​

I've been to the Uffizi.  Ellen and I stood in front of The Birth of Venus​ awestruck at its beauty.  The reproductions you see in books cannot even come close to the depth and subtlety of the colors that Botticelli employed.​  This was back in the late 90s, so cell phones were pretty rare and they certainly didn't have internet capabilities or even cameras (que horror!).  Imagine the torture everyone suffered by actually having to look at exquisite works of art.   

Boy, are we lucky today not to have to live our actual lives.  Thanks to Facebook and AT&T we can do it vicariously and virtually.  I, for one, welcome our new technological overlords.​

I guess I should be happy that at least they weren't picking on math this time...​

what is worth doing

A friend of mine shared the following excerpt from A.R. Ammons's poem The Ridge Farm​ on Facebook recently:​

doing what is worth doing is worth
what doing it is worth
but doing what is not worth doing
that can really be worth doing

Because of the repetitions of the words "worth" and "doing" I had to read it a couple of times, slowly, to parse it correctly.  Was that worth doing?  Is poetry worth doing?  Is reading poetry worth doing?  ​

I've had a bit of a love-hate relationship with poetry my whole life.  Back in high school I thought the analysis we put poems through was tedious.  Yeah, yeah, there's subtle meaning in there, but maybe the poet wasn't trying to build all those layers (said the 16-year-old me, dozing in the back row).  And even now, if a poet uses obscure language and overly complicated syntax I turn off.  This probably explains why I prefer poets such as Mary Oliver, who focus on nature and the quotidian.  ​

I've written poems since high school.  I've only ever had one published, but I don't think that's really the point for me.  I like to use forms that have rules attached--haiku, tanka, sestina, villanelle.  I'm a mathematician, which means I like patterns and structure.  I also think limitations (like the 5-7-5 and 5-7-5-7-7 syllable rules for haiku and tanka) force you to be creative with word choices, and that can make the end result more evocative.​

Which brings me back to the bigger question--is any of this worth doing?  Our country is by nature a bit anti-intellectual, so poetry, literature, and art often take a beating in public discourse.  The focus in education (both K-12 and higher ed) these days is on STEM, and various governors, including those in my current and home states, spend time deriding the humanities and social sciences as topics unworthy of public support.  The National Science Foundation is no longer allowed to fund projects in political science.  At various times in my life, members of Congress have sought to eliminate funding for public television, public radio, the National Endowment for the Humanities, etc.  I vividly remember Jesse Helms's anti-Robert Mapplethorpe crusade.  This is not new and it will never stop, but it's depressing nevertheless.    ​

If you stop and really think about it, there is not much in life that is "worth doing."  Ours is the only species on the planet that attaches value judgments to our activities.  All the other animals are just trying to live; that means food, shelter, procreation.  So the question is, beyond those basic activities is anything worth doing?  Things that help others survive or live better lives?  Surely.  That means things like curing diseases, aiding the sick, feeding the hungry, etc.​

But what about things that nourish the soul?  Strictly speaking they're not necessary and we often put these in the category of "not worth doing."  Music, art, literature, craft.  If you really put it to politicians this way, they won't deny that of course these things make the world a better place but they'll come up with mealymouthed excuses about priorities and how people who want to engage in such activities should do it for the love of it.  They will argue that such activities should certainly not receive support from the government because they won't lead to cancer cures or better technology, because that's all that "matters."  ​

Do I have a point?  I don't know.  I think it comes down to this:  just as it is true that everyone is special and no one is, so it is true that everything is worth doing and nothing is.  Maybe it's time to stop labeling everything and just get on with life.  ​

humanities, math, medicine, death

So I was on this panel last night about the Humanities and STEM, sponsored by the UF Center for the Humanities and the Public Sphere.​​​  From the description:

This panel and audience discussion will explore the relationship of research inquiry and teaching in the humanities disciplines and Science, Technology, Engineering, and Mathematics (STEM). Participants in the round-table will describe various ways in which advances in history, literature, and philosophy inform and are informed by work in computer engineering, biomedicine, neuroscience, and mathematics.​

What did I talk about?  Well, I tend to be shocked by the general level of innumeracy among our citizens and I gave a couple of examples where a well-rounded education in humanities, social sciences, and mathematics might help people understand things better.  Example 1:  the election of Jesse "The Body" Ventura to the governorship of Minnesota in 1998.  Jesse won with just under 37% of the vote and the punditry proclaimed that Minnesotans voted for an "outsider."  Except they didn't.  Not really.  In fact, 62% of the citizens voted for a seasoned politician (Republican Norm Coleman and Democrat Hubert Humphrey III), but that's not what the talking heads would have us believe.​

Example 2: I was watching Real Time last Friday and there was a segment where a filmmaker went to a Tea Party anti-spending rally and asked people what should be cut from the federal budget.  Defense?  No.  Social Security?  No.  Medicare?  No.  Education?  No.  Veterans benefits?  No.  Then what?  A popular answer was Congressional salaries, which at $700,000,000 (including staffers) comes to 0.02% of the federal budget.  So cutting that is like asking someone making $100,000 per year to take a $20 pay cut.  Another popular answer was foreign aid, which is bigger at $56.1B, but still a drop in the bucket.  But that guy on TV says we should cut them to save money, so we should, right?  This gets to the heart of the matter--many people don't really understand orders of magnitude, especially when they get to be as big as the federal budget's roughly \( \$3.5 \times 10^{12} \).
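The back-of-the-envelope scaling in that paragraph is easy to reproduce (figures as quoted above):

```python
congress = 700e6   # Congressional salaries plus staff, in dollars (as quoted)
budget = 3.5e12    # total federal budget, in dollars

share = congress / budget
print(f"{share:.2%}")           # 0.02%
print(round(100_000 * share))   # 20 -- the $20 "pay cut" on a $100,000 salary
```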

But that's not what really intrigued me about this panel.  One of my fellow panelists is a physician who studies imaging data to determine if doctors order too many diagnostic tests.  The short answer:  yes, way too many.  ​And he talked about the advances that have increased life expectancy:  if the average human lived to be 40 in the Middle Ages and lives to 80 now, how did we get there?  About 10-15 years comes from clean water (separating drinking water from sewage), another 10 or so from vaccinations and similar public health measures, and another 10 or so from the use of antibiotics to treat infectious diseases.  So that gets us to around 75 years of age.  The rest comes from ever more expensive treatments for diseases we really don't know much about (cancer and various degenerative diseases, mostly). 

The question for society is how we deal with these, and frankly, Americans simply don't.  That is, we do not, as a culture, know how to deal with death very well.  We'll take every possible measure, spend any amount of money, to stave it off.  That's mostly what drives our rapidly expanding health care budget--end of life care that only extends life for a year or two.  Is this worth the cost to society?  These are the sorts of questions that are difficult to answer, and lately our strategy as a nation has been to not answer, hoping that they will either go away or that some new way of extending life expectancy that doesn't cost so much will come along.​

All of which got me to thinking about my father, who died of lung cancer about a year ago.  I went to see him near the end when he had entered hospice and was on morphine for the pain.  I'm not sure he knew I was there.  I don't want to write some maudlin tale about fathers and sons, but I will say this:​  he had been sick for over a year and he was suffering.  I couldn't help thinking that surely we can do better as a society to help terminal patients end their suffering.  There can be many ways to go about it, but we at least have to begin the conversation.  And that's one of the many places the humanities can help.  These sorts of questions can only really be answered by philosophy, religion, maybe literature.  Data helps, but it can't answer ethical questions.

So, I hope politicians ​will stop demonizing the humanities and social sciences as "useless."  Not holding my breath, but hoping nevertheless.

sinkholes, etc.

This made me laugh:​

bugs.gif

The best part may be the tiny Bugs Bunny jumping for joy on Cumberland Island, GA, after he cuts Florida off.  ​

Except it's not that implausible (apologies to Orwell for that construction).  This week a Tampa man died when a sinkhole opened up under his bedroom.  Sinkholes have long been a problem in Florida, making it difficult to find a company willing to insure your home (well, that and the hurricanes), and they're getting worse.  A combination of extended periods of drought and too much pumping from the aquifers is wreaking havoc.  And the governor and legislature seem uninterested in doing much about it (the pumping, that is).  ​

One of Gainesville's best known landmarks is a giant sinkhole, Devil's Millhopper.​  It's actually quite beautiful, and in the summer it's lush and at least a little cooler inside it.  Local residents walk up and down the hundreds of wooden steps for exercise.  It's all great fun, especially since no one died or had their home destroyed when it was formed a few hundred years ago.

photo of Devil's Millhopper from the Florida State Parks website


Before I moved to Florida I didn't know anything about sinkholes.  Well, that's not exactly true--a sinkhole plays an integral role in The Simpsons Movie, but I had no firsthand knowledge of them.  They just don't happen where I grew up.  Now I know people who have had them buckle the floors in their homes.  In this, and many other ways, living in Florida has been very​ educational.

binary fractions

I always enjoy teaching my origami class.  Last week things got a little mathy.  I learned the following trick from Tom Hull's excellent book, Project Origami, which I highly recommend.​

Here's the problem:  say you have a piece of paper and you want to divide it into \( n \) equal subdivisions.  If \( n \) is a power of \( 2 \), then no problem; you've been doing this since you were a kid (just fold in half repeatedly).  Since every positive integer \( n \) may be written in the form \( n = 2^r m \), where \( m \) is odd, we need only figure out how to do this for odd numbers of subdivisions.​

There's an interesting iterative method to find an approximate solution to this (more on approximate vs. exact later), and it goes like this.  Make an initial guess of what you think \( \frac{1}{n} \) is and make a small pinch in your paper at \( \frac{1}{n} \) from the left edge.  Then the right side of the paper has an estimate of \( \frac{n-1}{n} \).  Since \( n \) is odd, \( n-1 \) is even, so you can fold that in half (just make a pinch at that point).  Now, one of two things happens:  either \( \frac{n-1}{2} \) is even or it is odd.  In the former case, you have an even number of \( \frac{1}{n}\textrm{'s} \) to the right of your new mark; in the latter case you have an even number of these to the left.  Fold the appropriate side in half (again, just pinch your marks).  Continue this process until you get back near your initial guess of \( \frac{1}{n} \).  You probably won't be exactly at your initial guess, but if you're at all good at estimation you'll be close.  Run through this process again.  When you get back to what you think \( \frac{1}{n} \) is the second time you may stop.  That point is very​ close to where \( \frac{1}{n} \) actually is, and if \( n \) is large enough it will be essentially the same as where you wound up after running the algorithm only once. 

Here's the sequence of folds for \( n=5\).  Note that mark number \( 5 \) is my updated estimate for \( \frac{1}{5} \), and that mark number \( 9 \) is the result of two passes through the algorithm.  The last mark is virtually indistinguishable from mark number \( 5 \).​

The situation for \( n=7 \) is a little different.  Here is the picture of one pass through the algorithm.​

As you can see, this is a little different than the \( n=5 \) case.  Notice that I only made marks at my guess for \( \frac{1}{7} \), at \( \frac{4}{7} \), and \( \frac{2}{7} \) before getting back to an estimate of \( \frac{1}{7} \) (which turned out to be very close to my initial guess).  There's some interesting number theory going on with that, but I'll skip that for now.​

Finally, here's what happens after one pass through for \( n=11 \).​

In this case, I made a mark at all \( 10 \) of the subdivisions, coming back to a place near my initial guess.  Running through the algorithm again would get me back so close to mark number \( 11 \) that I didn't bother to do it.  ​

Now, here's something we can do.  At each stage we do one of two things: fold the right edge to a point in the interior or fold the left edge to a point in the interior.  If we keep track of the sequence of folds we get the following.​

\( \frac{1}{3} \quad RLRLRLRL\dots \)

\( \frac{1}{5} \quad RRLLRRLLRRLL\dots \)

\( \frac{1}{7} \quad  RLLRLLRLL\dots  \)

\( \frac{1}{9} \quad  RRRLLLRRRLLL\dots \)

\( \frac{1}{11} \quad  RLRRRLRLLL\dots  ​\)
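These fold sequences can be generated without any paper at all.  Here's a quick sketch in Python (the names are mine, not from Hull's book) that tracks each mark exactly as an integer \( k \), meaning the mark sits at \( \frac{k}{n} \):

```python
def fold_sequence(n):
    """Pinch-mark bookkeeping for 1/n with n odd: 'R' means fold the
    right edge to the current mark, 'L' the left edge.  Stop when the
    sequence of marks returns to 1/n."""
    k, seq = 1, ""
    while True:
        if (n - k) % 2 == 0:   # an even number of 1/n's lies to the right
            k = (n + k) // 2   # fold the right edge in: midpoint of [k/n, 1]
            seq += "R"
        else:                  # an even number of 1/n's lies to the left
            k //= 2            # fold the left edge in: midpoint of [0, k/n]
            seq += "L"
        if k == 1:
            return seq

for n in (3, 5, 7, 9, 11):
    print(n, fold_sequence(n))   # RL, RRLL, RLL, RRRLLL, RLRRRLRLLL
```

Because the positions are exact rationals here, the loop terminates precisely at \( \frac{1}{n} \); on paper, of course, each fold introduces a little error, which is why you run the physical algorithm twice.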

There's a connection with this and writing the fraction \( \frac{1}{n} \) in binary.  We all know from grade school that \( \frac{1}{3} = 0.3333\dots \) and \( \frac{1}{7} = 0.142857\dots \).  Of course, what this really means is that

​\( \frac{1}{3} = \frac{3}{10} + \frac{3}{100} + \frac{3}{1000} +\cdots + \frac{3}{10^r} +\cdots \)

\( \frac{1}{7} = \frac{1}{10} + \frac{4}{100} + \frac{2}{1000} + \frac{8}{10^4} + \frac{5}{10^5} + \frac{7}{10^6} + \cdots \)​

But there's nothing special about \( 10 \), except that we have \( 10 \) fingers.  We could just as well write these fractions with powers of \( 2 \) in the denominators:

\( \frac{1}{3} = \frac{0}{2} + \frac{1}{2^2} + \frac{0}{2^3} + \frac{1}{2^4} +\cdots \)​

\( \frac{1}{5} = \frac{0}{2} + \frac{0}{2^2} + \frac{1}{2^3} + \frac{1}{2^4} + \frac{0}{2^5} + \frac{0}{2^6} + \cdots \)​

\( \frac{1}{7} = \frac{0}{2} + \frac{0}{2^2} + \frac{1}{2^3} + \frac{0}{2^4} + \frac{0}{2^5} + \frac{1}{2^6} +\cdots \)

\( \frac{1}{9} = \frac{0}{2} + \frac{0}{2^2} + \frac{0}{2^3} + \frac{1}{2^4} + \frac{1}{2^5} + \frac{1}{2^6} + \cdots \)​

and so on.  Note that for any odd \( n \), we'll always get a repeating "decimal" expansion this way.​

Now look closely.  If we use the following dictionary: \( R = 1, L = 0 \) and read from right to left beginning at the last digit in the repeating "chunk" we see that the fold sequence tells us the binary expansion of \( \frac{1}{n} \).  For example, \( \frac{1}{5} = 0.\overline{0011} \) and the fold sequence is \( RRLL \).  Since the fold sequence for \( n=11 \) is \( ​RLRRRLRLLL \), we find that

\( \frac{1}{11} = \frac{0}{2} + \frac{0}{2^2} + \frac{0}{2^3} + \frac{1}{2^4} + \frac{0}{2^5} + \frac{1}{2^6} + \frac{1}{2^7} + \frac{1}{2^8} + \frac{0}{2^9} + \frac{1}{2^{10}} + \cdots \) ​

This is a fun fact.  So, for any odd \( n \), if you want to know the binary expansion of \( \frac{1}{n} \), you only need to figure out the fold sequence, and really, you don't even need a strip of paper to do it.  You only need to think about the fold sequence; maybe use a pen and paper to keep track.  For example, if \( n = 2^r + 1 \) for some \( r \), then the fold sequence is obviously

\( \underbrace{RRR\cdots R}_{r}\,\underbrace{LLL\cdots L}_{r} \)

and so

\( \frac{1}{2^r + 1} = \frac{0}{2} + \cdots + \frac{0}{2^r} + \frac{1}{2^{r+1}} + \cdots + \frac{1}{2^{2r}} + \cdots, \) with the block of \( r \) zeros followed by \( r \) ones repeating forever.

Finally, a word about the iterative process.  There are exact methods to fold a strip of paper into exact \( n \)ths, but they are (a) very complicated, and (b) not more accurate in practice.  Indeed, it is difficult to fold exactly, and errors inevitably creep in.  This algorithm works just as well and is much easier to implement.​
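And the binary connection is just as easy to check in code: long division by \( n \) in base \( 2 \) produces the repeating chunk, which matches the fold sequence read right-to-left with \( R = 1, L = 0 \).  A sketch (the fold sequences are the ones listed above):

```python
def binary_chunk(n):
    """Repeating block of the base-2 expansion of 1/n, for odd n > 1,
    computed by long division: double the remainder, emit a digit."""
    digits, r = "", 1
    while True:
        r *= 2
        digits += str(r // n)   # next binary digit of 1/n
        r %= n
        if r == 1:              # the remainder 1 recurs, so the block repeats
            return digits

# Decode R -> 1, L -> 0, then reverse: that's the chunk read right-to-left.
for n, folds in [(3, "RL"), (5, "RRLL"), (7, "RLL"), (11, "RLRRRLRLLL")]:
    decoded = "".join("1" if f == "R" else "0" for f in folds)[::-1]
    assert decoded == binary_chunk(n)

print(binary_chunk(11))   # 0001011101
```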

Happy folding!

team Leonard

Full disclosure:  as a former contestant, it's pretty difficult for me to watch Jeopardy!  (Grammar question:  should I put a period at the end of that since the exclamation point is part of the show's title?)  Having had the misfortune of facing Roger Craig in his epic run during Premiere Week in 2010, it pains me to tune in to the typical mid-season episode filled with more average players and ponder what might have been.  I mean, we really could stand to remodel the kitchen and the money would have been nice.

Anyway, last night's finale of the 2013 Teen Tournament did several things for me.  First, it reminded me that when a game of Jeopardy! is good, it can be really good.  To set the stage, recall that the finals of such tournaments consist of two-day games in which the totals earned in the two games are added together to determine the winner (that is, the day one totals do not carry over into the players' pots on the second day).  The three contestants were Nilai Sarda, an Indian immigrant living in Marietta, GA; Barrett Block, from Lexington, KY; and Leonard Cooper, from Little Rock, AR.  I'll have more to say about them later.  Their scores after the first day were Nilai: $19,000, Barrett: $17,600, Leonard: $3,000.  So, clearly Leonard was in the hole and probably had no chance to win.

Except he did.  Late in Double Jeopardy!, with $18,200 in his pot, Leonard found the second Daily Double hiding under an $800 question (which meant it was likely to be fairly easy).  Category: American Lit.  Wager: $18,000.  He had to do that if he wanted any chance to win, given the day one totals.  He answered correctly, essentially doubling his money.  He did not get the Final Jeopardy! answer, but it didn't matter (only Barrett correctly answered Eisenhower, but he couldn't wager enough to beat Leonard).

When Leonard got the Daily Double we all let out a huge cheer.  You see, we loved this kid.  An African-American young man with big hair, Leonard was wearing a windbreaker, a t-shirt, some baggy pants, and a pair of Chuck Taylors.  During the interview stage, where we learned about Barrett's community service and how Nilai had been runner-up in the previous year's National Geographic Bee (an event, much like the Scripps-Howard National Spelling Bee, which has become dominated by children of Indian immigrants for some reason), Leonard told us that he was learning to play the electric guitar, just because he was interested in it, and that he looks to Jimi Hendrix as an inspiration.   

Which got me thinking about No Child Left Behind, of course.  It seems to me that the most deleterious effect of NCLB is that it has turned students in K-12 into relentless box-checkers.  That is, the system is constructed in such a way that it is more important to have done things than to do them, and to document instead of experience.  This is especially vivid right now as we receive applications for the program, in which students will sometimes list, in excruciating detail, their high school activities, sometimes going so far as to tell us about the one hour (!) they spent on a particular service project.  The message these students seem to be receiving from somewhere is that they can succeed by a preponderance of evidence of their excellence.  They may or may not actually be excellent, but their individual mounds of data sure give them a feel of excellenciness (with apologies to Stephen Colbert).

The Barretts and Nilais of the world will no doubt be successful since they are doing all the things they are told will lead to success.  But it feels like a cold sort of success, lacking in passion and heart.  Please note that I am not picking on these two young men--I'm sure they're great and I'd love to have them in my classes.  My larger point is this:  are we not doing our children a disservice when we encourage (force, really) them to engage in activities that will "get them somewhere" rather than learning to play the Star-Spangled Banner a la Hendrix?  And, more importantly, is that why pop music is so awful these days?