Today’s mathematical journey will go from India to Europe and back, starting with Madhava of Sangamagrama’s invention of infinite series and culminating in Srinivasa Ramanujan’s discovery of the most potent piece of mathematical clickbait of all time: the outrageous assertion that 1 + 2 + 3 + 4 + … is equal to −1/12.
Like many people, I learned two trigonometries, a few years apart. The first was about triangles and it had pictures like this:
The angle θ was seldom more than 90 degrees and never more than 180 degrees (because what kind of triangle has an angle bigger than 180 degrees?).
Happy January 48th, everyone! (More about that strange date later.)
Mathematician Henri Poincaré once wrote “Mathematics is the art of giving the same name to different things,” and he wasn’t wrong, exactly. He was thinking about the way mathematics advances by generating new concepts that unify old ones. For instance, mathematicians noticed that adding 0 to a number, like multiplying a number by 1, doesn’t change the number you apply it to. Eventually they celebrated this resemblance between 0 and 1 by coming up with new vocabulary: nowadays we say that 0 and 1 are “identity elements” (the former for addition, the latter for multiplication).1 Two different things, same name.
But giving different things the same name is only half the story. Mathematics also invites us – and frequently requires us – to give different names to the same thing.2 Seventeen isn’t just 17. It’s also 10001₂. It’s the fraction 34/2 (or the mixed number 16 2/2, if we’re feeling goofy). It’s the real number 17.000… and the real number 16.999…. It’s the complex number 17 + 0i.
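All of these aliases for seventeen can be checked mechanically; here is a quick sketch in Python (my own illustration, using the standard library’s `fractions` module – the repeating decimal 16.999… is the one name a computer’s finite arithmetic can’t spell out directly):

```python
from fractions import Fraction

# 17 in base two: the digits 10001 mean 16 + 0 + 0 + 0 + 1
assert int("10001", 2) == 17

# 17 as the fraction 34/2, and as the goofy mixed number 16 2/2
assert Fraction(34, 2) == 17
assert 16 + Fraction(2, 2) == 17

# 17 as a complex number with zero imaginary part
assert complex(17, 0) == 17
```

Different names, same number – and the computer agrees without caring which costume it wears.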
It hapneth also some times, that the Quotient cannot be expressed by whole numbers, as 4 divided by 3 in this sort, whereby appeareth, that there will infinitly come from the 3 the rest of 1/3 and in such an accident you may come so neere as the thing requireth, omitting the remaynder…
— Simon Stevin, The Tenth (1585)1
Many people find fractions and decimals confusing, counter-intuitive, and even scary. Consider the story of the A&W restaurant chain’s ill-fated third-of-a-pound burger, introduced as a beefier rival of the McDonald’s quarter-pounder. Many customers were unhappy that A&W was charging more for a third of a pound of beef than McDonald’s charged for a quarter of a pound. And why shouldn’t they be unhappy? Three is less than four, so one-third is less than one-fourth, right?
Well, that’s what many of those aggrieved customers told the consultants who had been hired to find out why A&W’s “Third is the Word!” innovation had gone so disastrously awry. But I wonder if those customers were rationalizing (sorry…) after the fact. Maybe some of these people had had such bad experiences when learning about fractions in school (the awkward fraction 1/3 in particular) that they preferred to avoid eating at establishments that triggered their math anxiety.
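For the record, the customers’ arithmetic really was backwards – here is the comparison, sketched in Python with exact rational arithmetic (my own aside, not part of the consultants’ report):

```python
from fractions import Fraction

# Three is less than four, but one-third is MORE than one-fourth:
assert Fraction(1, 3) > Fraction(1, 4)

# In ounces of beef (16 ounces to the pound):
third_pounder = Fraction(1, 3) * 16    # 16/3, about 5.33 ounces
quarter_pounder = Fraction(1, 4) * 16  # exactly 4 ounces
assert third_pounder > quarter_pounder
```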
Perhaps part of the problem is that for many people, the standard middle school curriculum on fractions and decimals doesn’t hang together well, with its mélange of different representations of things that they’re told are really the same thing under different names, such as 1 1/5 and 6/5 and 12/10 and 1.2 (and let’s not even mention 120%). And as if that weren’t bad enough, there are decimals that never end?!? It’s easy to come away from this experience confused and disheartened.
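That mélange really is one number under many names, which a few lines of exact arithmetic can confirm – a sketch of my own in Python, not something the middle school curriculum offers:

```python
from fractions import Fraction

one_and_a_fifth = 1 + Fraction(1, 5)          # the mixed number 1 1/5

assert one_and_a_fifth == Fraction(6, 5)      # the improper fraction 6/5
assert one_and_a_fifth == Fraction(12, 10)    # the unreduced 12/10
assert one_and_a_fifth == Fraction(120, 100)  # i.e. 120%
assert float(one_and_a_fifth) == 1.2          # the decimal 1.2
```

The `Fraction` type quietly reduces 12/10 to 6/5, which is exactly the point: the representations differ, the number doesn’t.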
Tens of thousands of years ago, long before humankind hit on the nifty trick of preserving language with marks on clay or papyrus, our ancestors notched tally marks on animal bones to count … things. We don’t know what the things were. Take for instance the 40,000-year-old Lebombo bone found fifty years ago in southern Africa. We can make guesses about what the person who carved it was counting1, but we’ll never know. That’s the thing about tally marks (and the numerals that replaced them much later): their power to describe huge swaths of reality stems from an essential vagueness.
The notion of quantity shorn of context – that is, the advent of the concept of Number – was the greatest mathematical revolution of all time, the one that made all subsequent developments possible. I don’t have much to say about it because we know so little about it, but since most great advances involve trade-offs I want to mention two of the hazards made possible by the abstract number concept.
I’m sure you’ve counted (“One, two, three, . . . ”) on too many occasions to count. The process can be boring (counting sheep), exciting (counting your winnings at a casino), or menacing (“If you kids aren’t at the dinner table by the time I reach ten, I’ll …”). But one thing counting is not is liberating. What could be less free than the inexorable succession of the counting numbers? And yet the very regularity of counting numbers gives us the freedom to think about them in multiple ways, arriving at conclusions along delightfully varied paths.
I know that the sentence “The year is 2022” is just a bland statement of fact, but it hits my ear like a voice-over in a trailer for a bad science fiction movie made in the 1900s. Blame Walter Cronkite; I grew up watching his TV series The Twenty-First Century (1967-1969) and came to indelibly associate the 2000s with The Future. Now that I actually live in The Future, surrounded by many of its predicted marvels, my degree of enthrallment varies from marvel to marvel, but I never tire of the wonders of magic paper. You know the stuff I mean: you write something in one place and the paper makes copies of itself elsewhere so that people in those other places can read the words you just wrote. I’m sure you’ve all used it. I’m using it now.
Magic paper helps me with some problems that have long bedeviled classroom teachers like myself: How do you find out what’s going on inside your students’ heads in the midst of a lesson without derailing it? How do you get all your students to actively participate without having the class descend into chaos? How do you communicate with a large group of students without the conversation devolving into what math educator Henri Picciotto calls a “pseudo-interactive lecture” dominated by the teacher and the two or three most vocal students?
I want to tell you about difference tables for polynomials, not only because they’re fun but also because they’ll give us a chance to see how polynomials played a role in the dawn of the computer age through the work of computer pioneers Charles Babbage and Ada Lovelace.
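Before the history, here is the gadget itself, sketched in Python (my own illustration, not code from Babbage’s era): list a polynomial’s values at consecutive integers, then repeatedly take differences of neighboring entries. For a polynomial of degree n, the nth row of differences is constant – the fact the Difference Engine was built to exploit.

```python
def difference_table(values):
    """Return successive rows of differences of a list of numbers."""
    rows = [list(values)]
    while len(rows[-1]) > 1:
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return rows

# Values of the degree-2 polynomial x^2 + x + 1 at x = 0, 1, 2, 3, 4
vals = [x * x + x + 1 for x in range(5)]   # [1, 3, 7, 13, 21]
rows = difference_table(vals)
# rows[1] is [2, 4, 6, 8]; rows[2] is [2, 2, 2] -- constant,
# because the polynomial has degree 2.
```

Once the constant row is reached, new values of the polynomial can be generated by additions alone, with no multiplications – which is why a machine made of gears could tabulate polynomials.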
But first, where did polynomials come from?
THE ART OF THE THING
“Thing” is a marvelously flexible word, as are similar words like “res” and “cosa” that other languages have used to signify unspecified objects. Often the word denotes a group of people who have come together for some purpose: think of the Roman Republic (the “public thing”) or the Cosa Nostra (“Our Thing”). Curiously, the English word “thing” itself seems to have traveled in the opposite direction, starting out as meaning an assembly of people and ending up as meaning, well, any-thing. Math has made its own uses of nonmathematical words for indefinite objects: in Indian and Arabic algebra, the quantity being sought was often called “the thing”. It was natural for European algebraists to borrow this usage, and indeed Renaissance algebra was sometimes referred to as “the art of the thing”. (See Endnote #1.)
Mathematicians celebrate the French thinker René Descartes for inventing Cartesian coordinates.1 But we should also remember him as the person who tilted the terrain of Europe’s mathematical alphabet, using early letters of the alphabet to signify known quantities and imbuing later letters (especially x) with the pungent whiff of the Unknown. If you learned to write quadratic expressions as ax² + bx + c instead of xa² + ya + z (and I’m guessing you did), it’s down to Descartes.2
My topic this month is polynomials like ax² + bx + c. In school math, you first learned about x as an unknown, a number hiding behind a mask. (“What is x? Let’s find out.”) Later you learned to view x as a variable, so that a formula like y = ax² + bx + c is a function or rule: if you give me an x, I’ll give you a y. (“What is x? No number in particular; x ranges over all real numbers.”) I’ll touch on both points of view today, but I’ll be stressing a viewpoint that’s probably less familiar, where x is neither an unknown nor a variable, but just, well, itself. From this perspective, polynomials appear as number-like objects in and of themselves, with their own habits and mating behavior.
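One minimal way to treat x as “just itself”: represent a polynomial by its list of coefficients and define addition and multiplication directly on those lists, so that x never stands for any number at all. The sketch below is my own illustration of this viewpoint, in Python:

```python
def poly_add(p, q):
    """Add two polynomials given as coefficient lists, lowest degree first."""
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def poly_mul(p, q):
    """Multiply two coefficient-list polynomials."""
    result = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            result[i + j] += a * b
    return result

# Here x is just the list [0, 1]; no number is hiding behind the mask.
x = [0, 1]
# (x + 1) * (x + 2) = x^2 + 3x + 2, i.e. coefficients [2, 3, 1]
assert poly_mul(poly_add(x, [1]), poly_add(x, [2])) == [2, 3, 1]
```

The “mating behavior” of polynomials – how they add and multiply – is entirely captured by these list manipulations, with x serving only as a placeholder for position.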
If new kinds of numbers were like new consumer products, mathematicians would have every right to fire the marketing company that came up with the names “complex numbers” and “imaginary numbers”. I mean, what kind of sales pitch goes with that branding? “Psst: wanna buy a number? It’s really hard to understand, and best of all, it doesn’t even exist!”?
We mathematicians have nobody but ourselves to blame, since it was one of our own (René Descartes) who saddled numbers like √−1 with the term “imaginary” and another mathematician (Carl Friedrich Gauss) who dubbed numbers like 2 + √−1 “complex”. Now it’s several centuries too late for us to ask everybody to use different words. But since those centuries have given us a clearer understanding of what these new sorts of numbers are good for, I can’t help wishing that, instead of calling them “complex numbers”, we’d called them — well, I’ll come to that in a bit.
I’ve long been a fan of comedies of remarriage (“It Happened One Night”, “The Philadelphia Story”, “His Girl Friday”, etc.), and one of the greatest comedies of remarriage is the story of Math and Physics (or “Phyz”, as Math likes to call her).