The Mathematics of Irony

The more you study, the more you know.
The more you know, the more you forget.
The more you forget, the less you know.
So why study?
The less you study, the less you know.
The less you know, the less you forget.
The less you forget, the more you know.
So why study?

— “Sophomoric Philosophy”

Poor Oedipus! The mythical Theban started out life with every advantage a royal lineage could offer but ended up as the poster child for IFS: Inexorable Fate Syndrome. His parents packed him off in infancy to evade a prophecy that he’d kill his father and marry his mother. He was found on a mountain and raised by a shepherd, so Oedipus didn’t know who his birth parents were. Once he learned about the prophecy he did everything he could to avoid fulfilling it (aside from not killing or marrying anyone, which in those times would have been an undue hardship), but he still ended up doing exactly what he was trying not to do.

If the story of Oedipus seems a bit removed from real life, listen to episode 3 of Tim Harford’s podcast “Cautionary Tales”, titled “LaLa Land: Galileo’s Warning”, to hear about systems that were designed by intelligent, well-meaning people to avert disasters but which ended up causing disasters instead.

In Harford’s diagnosis, the problem is that in adding safeguards to a system we increase its complexity, which makes it harder for our feeble human minds to imagine all the ways in which the system might fail. Yet sometimes it’s not complexity that bites back at us but some simple variable that we’ve failed to take into account. For example, say you live on an island with too many wolves. The obvious solution is to encourage wolf-hunting. Unfortunately, suppressing the wolf population means there will be less predation of the deer population (Did I mention there are deer on this island?), which means the deer population will surge next year, which means there’ll be more young deer for mama wolves to feed to their cubs, and come the year after, there’ll be more wolves than ever.

And that’s if you’re lucky. If you’re unlucky, you succeed in killing all the wolves, which leads to an explosion of the deer population, which leads to irreversible overgrazing (Did I mention the grass?), and then your once-complex ecosystem becomes a wolf-free, deer-free, grass-free desert island.
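If you want to watch the parable backfire in miniature, here’s a sketch in Python of the classic Lotka-Volterra predator-prey equations (my illustration only; every rate constant and head-count below is invented):

```python
# A minimal Lotka-Volterra sketch of the wolf/deer parable
# (not a calibrated ecological model; all numbers are invented).
def simulate(deer0, wolves0, cull_rate=0.0, years=30, dt=0.01):
    """Euler-integrate predator-prey dynamics with optional wolf culling."""
    deer, wolves = deer0, wolves0
    history = [(deer, wolves)]
    for _ in range(int(years / dt)):
        d_deer = 0.8 * deer - 0.04 * deer * wolves           # grass in, wolves out
        d_wolves = 0.01 * deer * wolves - (0.5 + cull_rate) * wolves
        deer += d_deer * dt
        wolves += d_wolves * dt
        history.append((deer, wolves))
    return history

baseline = simulate(deer0=60, wolves0=15)
hunted = simulate(deer0=60, wolves0=15, cull_rate=0.3)       # encourage hunting
print("peak wolves, no hunting:  ", round(max(w for _, w in baseline), 1))
print("peak wolves, with hunting:", round(max(w for _, w in hunted), 1))
```

With these made-up numbers, hunting the wolves produces a higher wolf peak a few years later: fewer wolves means more deer, and more deer means more wolf food, just as in the story.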

If this had been a real-world example instead of a parable, there would have been lots more species, and the island probably wouldn’t be an island but a region with porous borders that allow wolves, deer, and other animals to cross into the region and out of it. Maybe the overlooked variable wouldn’t have been the deer population but something subtler — something that in retrospect is hard to miss, but in advance is not so easy to pick out of a crowd of jostling variables. Here’s an example mentioned to me by pre-reader Shecky Riemann (can anyone provide a source for his story?). I quote Shecky’s version:

“In some New England town wildlife officials realized the Wood Duck population was inexplicably falling, while they noticed the raccoon population was growing; they believed raccoons were raiding Wood Duck nests for the eggs, causing the subsequent decline. So they hunted, or trapped and relocated, much of the raccoon population, only to find in time that the Wood Duck population declined even more precipitously… At that point they discovered that Wood Duck babies were indeed hatching out but upon leaving the nest and plopping into the water (as they do while very young, well before they can fly) they were immediately being taken by snapping turtles, whose population had ballooned because the only thing previously keeping them in check were the raccoons who ate snapping turtle eggs.”

The predator-prey systems I’ve described embody the idea of a negative feedback loop. This is a causal loop between two or more variables (let’s stick to just two and call them X and Y to keep things simple) where making X bigger makes Y bigger (and making X smaller makes Y smaller) but making Y bigger makes X smaller (and likewise making Y smaller makes X bigger).1 For instance, say X is the severity of a pandemic and Y is the amount of care people take to prevent disease transmission. When the disease is ripping through a population, people get scared and take care, which after a while causes the number of new infections to drop. But then people get lax, causing the rate of new infections to rise again. To the extent that such a simple picture accurately captures key features of the SARS-CoV-2 pandemic in 2020, the right question (despite the magazine caption) is not “Will infections rise as states reopen their economies?”, but “How much?”
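To see why such a loop oscillates, consider the simplest continuous caricature of it, a sketch with illustrative constants a and b standing in for the strength of each influence (nothing pandemic-specific is built in):

$$\frac{dX}{dt} = -a\,Y, \qquad \frac{dY}{dt} = b\,X, \qquad a, b > 0.$$

A rise in X drives Y up, and a rise in Y drives X down, exactly as in the definition above. Differentiating once more gives $\ddot{X} = -ab\,X$, whose solutions are sinusoids with period $2\pi/\sqrt{ab}$: the loop doesn’t settle once and for all, it cycles, which is why “How much?” keeps needing to be asked again.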

Why do I call this essay “The Mathematics of Irony”? I won’t be so foolish as to say that all, or even most, irony comes from negative feedback loops. But a lot of irony comes from reversal of expectations, and the presence of negative feedback loops involving variables you’ve ignored can be one reason why reasonable interventions backfire.

This isn’t news to anyone who studies complex systems. The problem is that scientists and science educators (and I include myself among their number) haven’t done a good enough job of explaining the complexity of reality, and of overcoming people’s desire for simple answers by exploiting their love of a good story. Too many members of the voting public see a sentence like “Population dynamics can be counterintuitive” as a cowardly equivocation or an outright lie, and are all too ready to throw in their lot with someone who runs on a simple, thrilling campaign slogan like “I will kill the wolves.”2

Many physical systems can be understood through the lens of feedback, as seen in something as simple as a pendulum. If you pull the bob to the right, the rightward deflection gives rise to leftward acceleration, which gives rise to leftward velocity, which gives rise to leftward deflection, which gives rise to rightward acceleration, which gives rise to rightward velocity, which gives rise to rightward deflection, and so on.

Here’s a simplified overview of what happens. The four sketches in the middle show, in each of four illustrative cases, where the pendulum bob is (the dot) and where it’s going (the small arrow); the labels on the outskirts describe each case in words, and the big arrows along the outside show how the system evolves over time.

The mathematics governing this system, expressed in the form of a pair of mutually referential differential equations, has many similarities to the mathematics of an oscillating spring, or an inductor in an electrical circuit. In fact, if you ignore nonlinearities3 in the equations, you’ll find that the equations for a pendulum become formally identical to the equations for an electrical circuit; the quantities in one system (deflection, velocity) are different from the quantities in the other (current, voltage), but the way a given quantity in one system evolves over time is identical to the way the corresponding quantity in the other system evolves. The underlying schema is “the” simple harmonic oscillator, one of the stars of an undergraduate physics education.
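Concretely, here is the standard textbook identification (not spelled out above: θ is the bob’s deflection, v its angular velocity, ℓ the pendulum’s length, and the circuit is an idealized inductor-capacitor loop with current I, voltage V, inductance L, and capacitance C). The linearized pendulum obeys

$$\frac{d\theta}{dt} = v, \qquad \frac{dv}{dt} = -\frac{g}{\ell}\,\theta,$$

while the circuit obeys

$$\frac{dV}{dt} = \frac{1}{C}\,I, \qquad \frac{dI}{dt} = -\frac{1}{L}\,V.$$

Relabel θ as V and v as I (absorbing the constants) and the two systems evolve identically, with angular frequency $\sqrt{g/\ell}$ in one case and $1/\sqrt{LC}$ in the other.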

I hasten to say that no scientist would call a pendulum a negative feedback system as I have, but that isn’t for mathematical reasons; it’s because scientists normally use the term “feedback” to describe a relationship between one part of a system and another (wolf-population and deer-population, say), whereas the deflection and velocity of a pendulum bob aren’t different “parts” of a system — they’re different ways of describing one part. But the mathematics of feedback loops doesn’t know about parts and wholes; it only knows quantities and how they evolve in time under the sway of differential equations. And from that point of view, a simple harmonic oscillator could be considered the prototype of a system with oscillatory behavior due to negative feedback. Systems with negative feedback loops give rise to oscillating behavior, with oscillations that can decay over time, grow without bound4, settle down into a stable cycle, or approach a single stable state.

Fiction is rife with cycles arising from negative feedback loops. A classic kind is the time-travel paradox: if you travel back in time and kill your grandfather before he has kids, then your father never exists, so you can’t exist, but then you never travel back in time, so your father does end up existing, and so do you, with the result that you do end up traveling back in time after all, etc. (Maybe calling this a feedback loop is a bit of a stretch, since it involves discrete variables — you’re alive or you’re not, your grandfather is alive or he isn’t — rather than continuous variables of the kind we’ve discussed so far.) You can also see the negative feedback loop governing a fictional couple for whom “A approaches B” leads to “B avoids A” leading to “A avoids B” leading to “B approaches A” leading back to “A approaches B”. If you have favorite examples of negative feedback loops in life or in literature, please let me know in the Comments.

My favorite example of a negative feedback loop in my own life comes from a time — and I hasten to say that this happened many, many years ago in case anyone who works for my auto insurance company is reading this — when I was about to leave my home to teach a calculus class and foolishly tried to save a bit of time by adjusting the seat while starting to drive. To slide the driver’s seat forward, one had to reach underneath the seat and pull on a release lever that would allow the seat to slide freely on its tracks. That’s what I did, and if I’d been smarter I would have moved the seat to its new position and let go of the release lever before starting to drive. But instead I started the car and put my foot on the gas pedal while my seat was still free to move forward and backward. Can you guess, before reading on, what happened?

The car started moving forward, but remember, objects at rest tend to remain at rest, so my seat (being free to slide) stayed put with respect to the street, which is to say that my seat moved backward with respect to the car. This caused my foot to leave the gas pedal. That caused the car to slow down, which caused my seat to move forward with respect to the car. That movement pressed my foot against the pedal again, which caused the car to lurch forward again, and so on. The negative feedback loop was unstable, so each successive motion of the car was more violent than the one before, until finally the car (which had manual transmission) stalled out.
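In control-theory language, the loop had gain greater than one: each lurch provoked a bigger lurch in the opposite direction. Here’s a toy sketch of that instability (the gain is invented, not measured from my actual car):

```python
# Toy model of the runaway seat: negative feedback (the sign flip)
# combined with gain k > 1 (the overshoot) gives growing oscillations.
k = 1.3        # loop gain: how strongly each lurch provokes the next (invented)
lurch = 1.0    # size of the first lurch, in arbitrary units
for cycle in range(8):
    print(f"cycle {cycle}: lurch = {lurch:+.2f}")
    lurch = -k * lurch   # the feedback reverses direction and overshoots
```

With k less than 1 the same loop would have died away quietly instead of escalating to a stall.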

In the twentieth century, back before the prefix “cyber-” got repurposed to mean “computer-y”, there was a worldview called cybernetics that saw feedback loops everywhere.5 Cybernetics bloomed in conjunction with an engineering discipline called control theory. A key concept of both fields is homeostasis, an equilibrium state achieved through use of a feedback loop. An example of an engineered feedback loop is the thermostat, which allows cooling to happen when something becomes too warm, and warming to happen when it becomes too cool. In the natural world, an example of equilibrium is seen in the predator-prey model I mentioned before. And you can thank feedback mechanisms built into your warm-blooded human body for keeping your core temperature in a zone that permits you to be alive right now. (Not to mention dozens of other metabolic variables that your body is always silently, cybernetically adjusting.)
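Here’s a minimal thermostat sketch, a bang-bang controller with made-up constants for the room and the furnace; the feedback holds the temperature in a narrow band around the setpoint rather than pinning it exactly:

```python
# Minimal bang-bang thermostat; all constants are invented for illustration.
def step(temp, heating, setpoint=20.0, band=0.5, dt=1.0):
    """Advance the room temperature by one minute of simulated time."""
    if temp < setpoint - band:
        heating = True                 # too cold: furnace switches on
    elif temp > setpoint + band:
        heating = False                # too warm: furnace switches off
    drift = 0.1 * (10.0 - temp)        # room leaks heat toward a 10-degree outdoors
    supply = 1.5 if heating else 0.0   # furnace adds heat while on
    return temp + (drift + supply) * dt, heating

temp, heating = 15.0, False
for minute in range(60):
    temp, heating = step(temp, heating)
print(f"after an hour: {temp:.1f} degrees")   # hovers near the 20-degree setpoint
```

The small dead band keeps the furnace from chattering on and off every instant; the price is a perpetual gentle oscillation, a homely cousin of the wolf/deer cycles above.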

The cybernetic worldview rightly stressed the importance of feedback loops (back in college I was especially enthralled with Gregory Bateson’s “Steps to an Ecology of Mind”), but many proponents of cybernetics thought that the study of feedback loops would explain everything. It didn’t, and now the word “cybernetics” itself has become an oddity, known to many only through the name of the fictional Sirius Cybernetics Corporation invented by Douglas Adams as part of his “Hitchhiker’s Guide” universe. Cybernetics had a great first act, but it ran out of new ideas with predictive power. And in a way that’s a shame, because the key insight of cybernetics is one that, regardless of whether it leads to new scientific advances, could lead humankind to a more sophisticated understanding of cause and effect.

Cybernetics was followed by catastrophe theory, chaos theory, complex systems theory, and so on. Each new wave yielded new explanatory power, and beyond that, new metaphors. With the new metaphors came hype. And within the scientific establishment there was some backlash to the hype (yes, feedback loops pop up everywhere once you start looking for them). I feel torn: hype has no place in scientific research, but in popular writing about science, tasteful enthusiasm for deep ideas has the power to awaken wonder and inspire an appreciation of just how beautifully complex the world is.

But I never told you what happened to Oedipus after he fulfilled the prophecy. When he was king of Thebes there was a great plague, and in trying to figure out what had caused it, he came to realize that he himself was the cause. And he acknowledged his culpability publicly, with an act of self-mutilation that symbolized his former blindness to the machinations of fate.

So, pity Oedipus. And while you’re at it, get ready to pity yourself and me and everybody else in the year 2020, because in the current year and the years after, many intelligent and well-meaning people will be doing their best to steer two complex systems (the biosphere and the world economy) that we don’t understand. We’re flying blind, and there’s a good chance that some of our interventions will backfire and have ironic consequences for reasons that will only be obvious in retrospect.6

POSTSCRIPT:

Having written the preceding paragraph in early June and having re-read it in mid-June, I think our biggest problem isn’t that we humans will take action based on simplistic models of how the world works and that those actions will have perverse consequences. I think a bigger problem is that we’ll know the right thing to do and we’ll nevertheless fail to take the simple steps that actually work, just because we get tired of doing the right thing, day after day. It’s so hard. It was hard from the start, but at least at the start it was something different to do. Now it’s hard and boring.

And now that I’ve re-read those words, I see an even gloomier possibility. It seems clear that, at a societal level, abandoning social distancing now is self-destructive. But what if all the people going around in public without face masks this week are, in a certain sense, making completely rational individual decisions? After all, the main purpose of wearing a standard-issue mask is to protect others, not yourself. If you’re wearing a mask when no one else is, you’re putting up with discomfort and inconvenience for the sake of other people and their acquaintances — people you don’t even know7, and if your action turns out to save someone’s life you’ll never find out about it. You’d be so much more comfortable if you weren’t wearing the mask; so taking off that mask would be the rational thing to do, wouldn’t it?

This pandemic won’t kill off humankind, and neither will the next. But if a species that evolved intelligence for its survival-value ends up going extinct because of the selfish rationality of its individual members, that might be the biggest irony of all.

Thanks to Sandi Gubin, Dave Jacobi, Andy Latto, Fred Lunnon, Gareth McCaughan, Shecky Riemann, Evan Romer, and Steve Strogatz.

Next month: Math, games, and Ronald Graham.

ENDNOTES

#1: This should not be confused with a situation in which making X bigger makes Y smaller and likewise making Y bigger makes X smaller. That might naively seem “even more negative”, but it’s actually an example of positive feedback: an increase in X decreases Y, and the decrease in Y pushes X up still further, so a change in X tends to reinforce itself rather than reverse itself over time.

#2: The cowardly lie “Nobody knew wildlife-management was so complicated” can be saved until after the candidate is safely elected.

#3: Here I’m assuming that the deflection angle θ is so close to 0 that we can replace sin θ in the differential equation by θ, resulting in a linear differential equation that gives a good approximation to the pendulum’s actual behavior. Nonlinear differential equations are extremely important, but nonlinearity isn’t one of the themes of this essay.
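Explicitly, with ℓ denoting the pendulum’s length (a symbol introduced here for concreteness):

$$\ddot{\theta} = -\frac{g}{\ell}\,\sin\theta \;\approx\; -\frac{g}{\ell}\,\theta \qquad \text{for } |\theta| \ll 1,$$

and the right-hand approximation is the simple harmonic oscillator equation, with the familiar sinusoidal solutions.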

#4: When oscillations grow without bound in a linear model of a phenomenon, the upshot in the real world is that the oscillations grow until the system leaves the regime within which linearity is a good approximation to reality.

#5: The word “cybernetics” is derived from the Greek κυβερνήτης (“steersman”), from the root meaning to steer, navigate, or govern; indeed, “governor” was the name James Watt gave to the feedback mechanism of his steam engine.

#6: For instance: having everybody stay home during a pandemic seems like a good way to prevent viral transmission, and it probably is, on a societal level. But if the severity of an infection depends on initial viral load, this strategy has the unintended effect that uninfected people who shelter with infected people stand to get worse cases of the illness. I learned about this from Siddhartha Mukherjee’s article listed in the References. It’s not clear to me how many epidemiologists or public health officials took this into account in the earliest days of the coronavirus pandemic. Who knew? Did they try to tell us? Were we listening hard enough?

#7: I’m thinking of words from the Twilight Zone episode “Button, Button” that we hear more than once (with chilling implications the last time we hear it): “… someone whom you don’t know.” If there’s a more horrifying dramatization of the tragedy of the commons, I haven’t seen it.

REFERENCES

Tim Harford, “LaLa Land: Galileo’s Warning”, http://timharford.com/2019/11/cautionary-tales-ep-3-lala-land-galileos-warning/

Siddhartha Mukherjee, “How Does the Coronavirus Behave Inside a Patient?”, The New Yorker, March 26, 2020. Yes, I know this article is reportage, not science. If any of you can point me toward relevant medical literature, please do so in the Comments.
