Hamilton’s Quaternions, or, The Trouble with Triples

In 1853, the mathematician and physicist William Rowan Hamilton paid one last call on Catherine Barlow, whom he had once loved passionately and for whom he still had some affection. She had won his heart three decades earlier when he was a first-year student at Trinity College, Dublin, and she was Miss Catherine Disney – before her parents had decided that Reverend William Barlow, a man of means fifteen years her senior, would be a good match for her. Events proved them wrong.1

If, thirty years later, Reverend Barlow resented Hamilton’s presence in his home, he probably forgave the trespass, since his wife was dying and had asked for one last visit from Hamilton. Hamilton, a published poet as well as scientist, had written of her fondly in some of his early verse, but what he bestowed upon her now was not poetry but a mathematical treatise that he had written upon a topic of his own invention, quaternionic analysis, which had won so much acclaim that it was a mandatory examination topic in Dublin. Indeed, a year earlier Catherine’s son had needed some instruction in quaternions and Hamilton had tutored the young man, perhaps enjoying the opportunity to play a paternal role to the son of his old flame.

Catherine spoke of how her marriage to Barlow had been thrust upon her by her parents, and Hamilton was outraged on her behalf. She told him of her unfulfilling marriage, and of her unwavering love for Hamilton throughout the years, and he was devastated. And then, near the end of the interview, he tried what he should have tried thirty years earlier. “Rising, I received, or took, as my reward, all that she could lawfully give – a kiss, nay many kisses: – for the known and near approach of death made such communion holy. It could not be, indeed, without agitation on both sides, that for the first time in our lives, our lips then met. . . . Yet dare I to affirm that our affectionate transport, in those few permitted moments, was pure as that of those who in the resurrection neither marry nor are given in marriage, but are as the Angels of God in Heaven.”

Seekers of the One-Stone

There are some who would begin the story this way:

Long before Earth was formed, before any planet or star existed, there was the One-Stone. Not an actual stone, of course – just an idealized shape that certain two-legged, one-headed inhabitants of Earth would later call “the hat”. It existed in the realm of Pure Form, awaiting instantiation and recognition. The waiting would take billions of years, but the One-Stone did not know impatience. None of its Kind could. It simply Was, and it waited.

The people who would start the story this way are called mathematical Platonists. Some of them talk about mathematical objects in reverent, mystical tones, imputing a timeless reality to them, though I’ve noticed that they don’t talk about art in the same way; they don’t say that Shakespeare’s Tempest predated Shakespeare just because the potential for arranging strokes to form letters and for letters to be arranged in that one particular Tempest-uous way has existed since the dawn of time.1 It would be interesting to argue that the difference between artistic creation and mathematical discovery is a difference not of kind but of degree, but I’m not going to try to make that argument today (partly because I’m not convinced it’s true). Instead, I’ll dive into one very recent example of mathematical creativity intertwined with esthetic choices.

Unlimited Powers

Today’s mathematical journey will go from India to Europe and back, starting with Madhava of Sangamagrama’s invention of infinite series and culminating in Srinivasa Ramanujan’s discovery of the most potent piece of mathematical clickbait of all time: the outrageous assertion that 1 + 2 + 3 + 4 + … is equal to −1/12.
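Madhava’s series are easy to taste numerically. One famous example credited to him is the arctangent series, whose x = 1 case gives π/4 = 1 − 1/3 + 1/5 − 1/7 + … . Here is a minimal Python sketch of my own (not from the post) that watches the partial sums creep toward π:

```python
# Partial sums of the Madhava-Leibniz series: pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
# Convergence is famously slow, which makes the partial sums fun to watch.
def madhava_pi(terms):
    """Approximate pi by summing the first `terms` terms of the series."""
    total = 0.0
    for k in range(terms):
        total += (-1) ** k / (2 * k + 1)
    return 4 * total

for n in (10, 1000, 100000):
    print(n, madhava_pi(n))
```

Even 100,000 terms pin down only four or five decimal places, which is part of why Madhava also devised correction terms to speed convergence.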


Like many people, I learned two trigonometries, a few years apart. The first was about triangles and it had pictures like this:

The angle θ was seldom more than 90 degrees and never more than 180 degrees (because what kind of triangle has an angle bigger than 180 degrees?).

Things, Names, and Numbers

Happy January 48th, everyone! (More about that strange date later.)

Mathematician Henri Poincaré once wrote “Mathematics is the art of giving the same name to different things,” and he wasn’t wrong, exactly. He was thinking about the way mathematics advances by generating new concepts that unify old ones. For instance, mathematicians noticed that adding 0 to a number, like multiplying a number by 1, doesn’t change the number you apply it to. Eventually they celebrated this resemblance between 0 and 1 by coming up with new vocabulary: nowadays we say that 0 and 1 are “identity elements” (the former for addition, the latter for multiplication).1 Two different things, same name.

But giving different things the same name is only half the story. Mathematics also invites us – and frequently requires us – to give different names to the same thing.2 Seventeen isn’t just 17. It’s also 10001₂. It’s the fraction 34/2 (or the mixed number 16 2/2, if we’re feeling goofy). It’s the real number 17.000… and the real number 16.999…. It’s the complex number 17 + 0i.
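Those aliases can all be checked mechanically. A small Python sketch of my own (not from the post), using exact arithmetic so nothing is lost to rounding:

```python
from fractions import Fraction

# Many names, one number: seventeen.
seventeen = 17
assert int("10001", 2) == seventeen        # the binary numeral 10001
assert Fraction(34, 2) == seventeen        # the fraction 34/2
assert 16 + Fraction(2, 2) == seventeen    # the goofy mixed number 16 2/2
assert complex(17, 0) == seventeen         # the complex number 17 + 0i
print("all names agree")
```

(The name 16.999… has no finite keyboard spelling, so that one stays on the page rather than in the code.)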

Denominators and Doppelgängers

It hapneth also some times, that the Quotient cannot be expressed by whole numbers, as 4 divided by 3 in this sort, whereby appeareth, that there will infinitly come from the 3 the rest of 1/3 and in such an accident you may come so neere as the thing requireth, omitting the remaynder…

— Simon Stevin, The Tenth (1585)1

Many people find fractions and decimals confusing, counter-intuitive, and even scary. Consider the story of the A&W restaurant chain’s ill-fated third-of-a-pound burger, introduced as a beefier rival of the McDonald’s quarter-pounder. Many customers were unhappy that A&W was charging more for a third of a pound of beef than McDonald’s charged for a quarter of a pound. And why shouldn’t they be unhappy? Three is less than four, so one-third is less than one-fourth, right?

Well, that’s what many of those aggrieved customers told the consultants who had been hired to find out why A&W’s “Third is the Word!” innovation had gone so disastrously awry. But I wonder if those customers were rationalizing (sorry…) after the fact. Maybe some of these people had had such bad experiences when learning about fractions in school (the awkward fraction 1/3 in particular) that they preferred to avoid eating at establishments that triggered their math anxiety.

Perhaps part of the problem is that for many people, the standard middle school curriculum on fractions and decimals doesn’t hang together well, with its mélange of different representations of things that they’re told are really the same thing under different names, such as 1 1/5 and 6/5 and 12/10 and 1.2 (and let’s not even mention 120%). And as if that weren’t bad enough, there are decimals that never end?!? It’s easy to come away from this experience confused and disheartened.
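The mélange does hang together, though, and exact rational arithmetic makes the point compactly. A quick Python sketch of my own (not from the post):

```python
from fractions import Fraction

# Five names for one number: 1 1/5, 6/5, 12/10, 1.2, and 120%.
names = [
    1 + Fraction(1, 5),     # the mixed number 1 1/5
    Fraction(6, 5),
    Fraction(12, 10),       # Fraction reduces this to 6/5 automatically
    Fraction("1.2"),        # the decimal name, parsed exactly
    Fraction(120, 100),     # 120% -- mentioned despite the post's advice
]
assert all(name == Fraction(6, 5) for name in names)

# And the burger problem: 1/3 has no terminating decimal name,
# but it is still bigger than 1/4, despite 3 being less than 4.
assert Fraction(1, 3) > Fraction(1, 4)
print("all five names denote 6/5, and a third beats a quarter")
```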

Beneath (and Beyond)

Tens of thousands of years ago, long before humankind hit on the nifty trick of preserving language with marks on clay or papyrus, our ancestors notched tally marks on animal bones to count … things. We don’t know what the things were. Take for instance the 40,000-year-old Lebombo bone found fifty years ago in southern Africa. We can make guesses about what the person who carved it was counting1, but we’ll never know. That’s the thing about tally marks (and the numerals that replaced them much later): their power to describe huge swaths of reality stems from an essential vagueness.

The Lebombo bone. Image permission pending.

The notion of quantity shorn of context – that is, the advent of the concept of Number – was the greatest mathematical revolution of all time, the one that made all subsequent developments possible. I don’t have much to say about it because we know so little about it, but since most great advances involve trade-offs I want to mention two of the hazards made possible by the abstract number concept.

The Infinite Stairway

I’m sure you’ve counted (“One, two, three, . . . ”) on too many occasions to count. The process can be boring (counting sheep), exciting (counting your winnings at a casino), or menacing (“If you kids aren’t at the dinner table by the time I reach ten, I’ll …”). But one thing counting is not is liberating. What could be less free than the inexorable succession of the counting numbers? And yet the very regularity of counting numbers gives us the freedom to think about them in multiple ways, arriving at conclusions along delightfully varied paths.

Teaching with Magic Paper

I know that the sentence “The year is 2022” is just a bland statement of fact, but it hits my ear like a voice-over in a trailer for a bad science fiction movie made in the 1900s. Blame Walter Cronkite; I grew up watching his TV series The Twenty-First Century (1967-1969) and came to indelibly associate the 2000s with The Future. Now that I actually live in The Future, surrounded by many of its predicted marvels, my degree of enthrallment varies from marvel to marvel, but I never tire of the wonders of magic paper. You know the stuff I mean: you write something in one place and the paper makes copies of itself elsewhere so that people in those other places can read the words you just wrote. I’m sure you’ve all used it. I’m using it now.

Magic paper helps me with some problems that have long bedeviled classroom teachers like myself: How do you find out what’s going on inside your students’ heads in the midst of a lesson without derailing it? How do you get all your students to actively participate without having the class descend into chaos? How do you communicate with a large group of students without the conversation devolving into what math educator Henri Picciotto calls a “pseudo-interactive lecture” dominated by the teacher and the two or three most vocal students?

What Lovelace Did: From Bombelli to Bernoulli to Babbage

I want to tell you about difference tables for polynomials, not only because they’re fun but also because they’ll give us a chance to see how polynomials played a role in the dawn of the computer age through the work of computer pioneers Charles Babbage and Ada Lovelace.
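Before the historical detour, here is the tool itself in miniature: list a polynomial’s values at 0, 1, 2, …, then list the gaps between consecutive values, and repeat. For a degree-d polynomial the d-th row of gaps is constant – the fact that let Babbage’s Difference Engine tabulate polynomials using nothing but addition. A short Python sketch of my own (the post’s examples may differ):

```python
# Difference table for a polynomial: repeatedly take the gaps between
# consecutive values until the rows run out.
def difference_table(values):
    """Return the rows of successive finite differences of a list of values."""
    rows = [list(values)]
    while len(rows[-1]) > 1:
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return rows

# Values of p(x) = x^2 + x + 1 at x = 0, 1, ..., 5.
vals = [x * x + x + 1 for x in range(6)]
for row in difference_table(vals):
    print(row)
```

For this degree-2 polynomial the second row of differences is all 2s, and every row below it is all 0s.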

But first, where did polynomials come from?


“Thing” is a marvelously flexible word, as are similar words like “res” and “cosa” that other languages have used to signify unspecified objects. Often the word denotes a group of people who have come together for some purpose: think of the Roman Republic (the “public thing”) or the Cosa Nostra (“Our Thing”). Curiously, the English word “thing” itself seems to have traveled in the opposite direction, starting out as meaning an assembly of people and ending up as meaning, well, any-thing. Math has made its own uses of nonmathematical words for indefinite objects: in Indian and Arabic algebra, the quantity being sought was often called “the thing”. It was natural for European algebraists to borrow this usage, and indeed Renaissance algebra was sometimes referred to as “the art of the thing”. (See Endnote #1.)

Let x Equal x

Dedicated to the memory of Herb Wilf

Mathematicians celebrate the French thinker René Descartes for inventing Cartesian coordinates.1 But we should also remember him as the person who tilted the terrain of Europe’s mathematical alphabet, using early letters of the alphabet to signify known quantities and imbuing later letters (especially x) with the pungent whiff of the Unknown. If you learned to write quadratic expressions as ax² + bx + c instead of xa² + ya + z (and I’m guessing you did), it’s down to Descartes.2

My topic this month is polynomials like ax² + bx + c. In school math, you first learned about x as an unknown, a number hiding behind a mask. (“What is x? Let’s find out.”) Later you learned to view x as a variable, so that a formula like y = ax² + bx + c is a function or rule: if you give me an x, I’ll give you a y. (“What is x? No number in particular; x ranges over all real numbers.”) I’ll touch on both points of view today, but I’ll be stressing a viewpoint that’s probably less familiar, where x is neither an unknown nor a variable, but just, well, itself. From this perspective, polynomials appear as number-like objects in and of themselves, with their own habits and mating behavior.
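One concrete way to take that third viewpoint seriously: a polynomial is nothing but its list of coefficients, and the “habits and mating behavior” are rules for combining lists – no number is ever substituted for x. A minimal Python sketch of my own (not the post’s code):

```python
# A polynomial as a coefficient list [c0, c1, c2, ...], meaning
# c0 + c1*x + c2*x^2 + ...; x is never evaluated, only manipulated.

def poly_add(p, q):
    """Add two polynomials given as coefficient lists."""
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

# (1 + x) * (1 + x) = 1 + 2x + x^2: pure symbol-pushing, no x in sight.
print(poly_mul([1, 1], [1, 1]))
```

Addition and multiplication of these lists obey the same algebraic laws as addition and multiplication of numbers, which is exactly what “number-like objects” means here.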
