You and your computer have a fundamental disagreement about how to represent numbers. Your computer was designed to calculate in base two (binary), while you use base ten (decimal). But there is something that your decimal self and your binary computer can agree on: representing numbers in base three-halves is a damn fool thing to do. I mean, I haven’t even told you yet what “base three-halves” is, but you probably already guessed it’s one of those things mathematicians came up with not because anyone asked them to but simply because they can and because they think it’s fun.

Here’s what counting from one to ten looks like in base three-halves (or “sesquinary”, from the Latin prefix “sesqui-” meaning one-and-a-half):

**1**, **2**, **20**, **21**, **22**, **210**, **211**, **212**, **2100**, **2101**

(I should confess that my use of the term “base three-halves” is a bit nonstandard; see Endnote #1.)

To explain what’s really going on with the sesquinary system — where it comes from, and why a mathematician might develop a fondness for it — we’ll need to dive deeper and talk about the kinds of computers that could use such a system.

Actually, my use of the word “computers” (plural) is misleading, because there’s only one computer in the world that uses the sesquinary system, and pretty soon there won’t be any at all — its components are no longer being manufactured, and its creator, Glen Whitney, is about to take it apart so that he can use those components for other purposes. But I’m a pure mathematician, so that’s okay; for me Whitney’s real-world sesquinary computer is only interesting to the extent that it approximates an ideal sesquinary computer, operating in the timeless, placeless realm of pure mathematics.

The ideal sesquinary computer belongs to a species of machine I wrote about last month: Engel machines. If you didn’t read my essay on Engel machines, that’s okay, but I’ll need to tell you (or remind you) how to use them.

**THE BINARY ENGEL MACHINE**

Here’s the hardware of a fairly simple Engel machine that turns out to be closely related to base two:

Unlike last month’s machines, this one is theoretically endless, though if you’re going to play with physical approximations (and you should!) you can get by with the one I’ve drawn or one slightly larger. Like your laptop, this machine is animated by patterns of current, but here the current consists not of electrons but of objects called chips (which could be buttons or beads or anything you like — even potato chips or chocolate chips).

You feed the chips into the machine by adding them to the rightmost square, one at a time, subject to the rule that before you get to add a new chip, you have to move the existing chips around to prevent overcrowding. A square that has at least as many chips as it has outgoing arrows is overcrowded, and you must release the tension by sending a chip along each outgoing arrow from that square. These moves are called *firing* moves, and when you can’t perform any more of them, the configuration of chips is called *stable*. (Note that the octagons have no outgoing arrows; you can let chips pile up there without limit.)

Here are the stable configurations you get for the Engel machine when 1, 2, 3, 4, 5, 6, 7, or 8 chips have been added (the video shows the dynamic process that creates these configurations). If we represent each of these configurations by a sequence of numbers that shows how many chips are in each square (ignoring the octagons), we get sequences of **0**‘s and **1**‘s that might look familiar:

**00001**

**00010**

**00011**

**00100**

**00101**

**00110**

**00111**

**01000**

They’re just the binary representations of the numbers 1 through 8. (In principle each sequence has infinitely many **0**s to the left of the first **1**, but you can leave them out if you like.) For instance, when five chips have been put in the machine and the firings are finished, we get the configuration represented by the sequence …**00101** (or **101** if you like), corresponding to the number five.

What’s going on is that our Engel machine has the binary place-value concept built into its structure. Every chip that’s at a square has a value that depends on which square it’s in. A chip in the rightmost square has value 1; a chip in the next square over has value 2; a chip in the next square over from there has value 4; and so on, with the value doubling with each leftward step. So the configuration …**00101**, which has a chip of value 4 and a chip of value 1, has total value 5.

One way to understand what the binary Engel machine is doing is to track the total value of the configuration as we go. We’ve already seen that …**00101** has total value 5. What happens when we add a chip to …**00101**? When we add a chip in the rightmost square, we increase the value of the configuration by 1; that is, the unstable configuration …**00102** has total value 6. When we fire the two chips at the right, turning …**00102** into …**00110**, we don’t change the total value. This keeps happening: every time we add a chip at the right, we increase the total value by 1, and the chip-firing moves that restore stability don’t affect the total value, because we always replace a pair of chips with value 2^{k} by a single chip with value 2^{k+1} for some value of *k*, and 2^{k} + 2^{k} = 2^{k+1} for all *k*.
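If you’d like to experiment without buttons or beads, the binary Engel machine is easy to simulate. Here’s a minimal Python sketch (the function name and the list representation are my own inventions, not anything standard): `squares[i]` holds the chips *i* steps from the right, and the check at the end confirms that chip-firing really does preserve total value.

```python
def count_binary(n):
    """Feed n chips, one at a time, into the binary Engel machine.

    squares[i] = chips in the square i steps from the right.  A square
    holding two chips fires: one chip moves one square to the left,
    and the other lands on an octagon (so it leaves the squares)."""
    squares = [0]
    for _ in range(n):
        squares[0] += 1                  # add a chip at the right
        i = 0
        while i < len(squares):          # fire until stable
            if squares[i] >= 2:
                squares[i] -= 2
                if i + 1 == len(squares):
                    squares.append(0)    # the machine is endless
                squares[i + 1] += 1
            else:
                i += 1
    return squares

# five chips stabilize to ...00101 (squares listed right-to-left):
assert count_binary(5) == [1, 0, 1]

# firing preserves total value, so the stable value is always n:
for n in range(1, 33):
    assert sum(c * 2**i for i, c in enumerate(count_binary(n))) == n
```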

If you’ve seen Exploding Dots, this should all look very familiar. There’s just one difference: in our Engel machine, when a square spot fires two chips, one chip goes to another square spot (one position to the left) and the other goes to an octagonal spot; in Exploding Dots, when a square spot fires two chips, one chip goes to another square spot (one position to the left) and the other disappears.

Here’s a hybrid sort of picture we could draw:

We keep the arrows, but we leave out the octagons. Let’s do that from now on.

**HIGHER BASES**

If this were a standard mathematical tour, your docent would lead you off to one side to look at this Engel machine:

In this Engel machine, each square has three outgoing arrows, so we don’t get to fire a square until it has three or more chips.

In a standard mathematical tour, your guide would show you how this chip-machine is doing base-three arithmetic. And then she’d show you another Engel machine in which each square has one arrow going to the left and *nine* arrows going nowhere; she’d demonstrate that its internal logic is just good old base ten. She’d go on to show you how it (like the base-two and base-three Engel machines you were looking at before) can be used not just for counting, but for adding and even multiplying, and she’d convince you that it’s a lot like the sort of decimal abacuses people have been using for many centuries. Lastly she’d reveal that “firing” is just a dramatic term for what your grade-school teachers called “carrying”! And this fun and entertaining tour would convey ideas that are actually useful.

Imagine that you’re on that kind of tour, but that lurking among the last of the stragglers you see a disreputable old man who seems less interested in looking at the exhibits and more interested in looking at the people taking the tour — someone who (unbeknownst to you) used to work for the museum but got fired for not sticking to the script, and who now spends his waking hours taking the tour and looking for museum-goers who seem likely to share his taste for the useless and obscure. The respectable tour guide leads the group down one hallway, but your eyes meet the old man’s and before you can look away he beckons, pointing in the opposite direction. “Want to see something that’s not part of the tour?” he asks. “Something a bit … weird?”

**A WEIRD MACHINE**

Here, at last, is the Engel machine for base three-halves:

When there are three chips on a square, you get to take one of them off the board and move the other two to the left. So, with this kind of machine, here’s how you count (check for yourself that I’m telling the truth):

**001**

**002**

**003** → **020**

**021**

**022**

**023** → **040** → **210**

**211**

**212**

…

Just as **101** in base two means **1**×2^{2}+**0**×2^{1}+**1**×2^{0} (which is five), **212** in base three-halves means **2**×(3/2)^{2}+**1**×(3/2)^{1}+**2**×(3/2)^{0} (which is eight).
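If you want to double-check place values like these without doing the fractions by hand, Horner’s rule with exact rational arithmetic does the job. A Python sketch (the function name is mine):

```python
from fractions import Fraction

def value(digits, base=Fraction(3, 2)):
    """Value of a digit sequence (most significant digit first) in `base`,
    evaluated by Horner's rule with exact rational arithmetic."""
    total = Fraction(0)
    for d in digits:
        total = total * base + d
    return total

assert value([2, 1, 2]) == 8               # 212 in base three-halves is eight
assert value([1, 0, 1], Fraction(2)) == 5  # 101 in base two is five
```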

This is something I played with about twenty years ago (back when I was learning about Engel machines), and when I shared it with James Tanton he got very excited, because it tied in beautifully with ideas he was ruminating on — ideas that have borne fruit in his Exploding Dots approach to grade school math. If you’ve seen base three-halves before, you have Tanton to thank. He calls this Engel machine a “2 ← 3” machine (watch his video!) though I prefer to call it a “3 → 2*x*” machine, for reasons I explain in Endnote #3.

I gave a talk about this Engel machine at the MOVES 2017 conference hosted by the National Museum of Mathematics and devoted to the Mathematics Of Various Entertaining Subjects. (The slides are online for those who want them.) Assisting me in my presentation was Glen Whitney, one of the founders of the Museum, who brought with him a physical embodiment of the base three-halves Engel machine, dubbed the “SESQUIAC”. Here is a video of the SESQUIAC in action, counting to twenty-four.

You may note that, partway through the counting process, Glen sticks some balls into a “helper track” (the track closest to you); these balls are a kludge to make the mechanism work more smoothly, but other than that, they don’t count (in the literal sense of the phrase). Also note that at the very end, two balls cause an overflow and get dumped onto the floor; the SESQUIAC can’t count past twenty-three.

I’ve posted on my Barefoot Math channel a partial recreation of the conclusion of my talk, walking you through a musical representation of how the SESQUIAC counts to twenty-three.

Can you figure out how to write one hundred in base three-halves? Can you think of more than one way to arrive at the answer? See Endnote #2.

If you want to see Mike Lawler and his kids playing with base three-halves, check out their 2013 video and their 2015 video.

**OTHER MACHINES**

If you’re the kind of reader who likes to play with things a bit on your own, and you’re looking for a system that’s simpler than base three-halves but still on the weird side, you should look at this Engel machine:

Each square has three outgoing arrows: one that goes nowhere, one that goes to the left, and one that goes to the square itself. That means that when a square has three or more chips on it, you can fire three chips by taking one of them off the board, sending one of them to the left, and leaving one of them where it is. So (leaving aside the infinitely many 0’s at the front) the “counting” process for this Engel machine goes

**001**

**002**

**011**

**012**

**021**

**022**

**111**

**112**

…

(Here I’m leaving out all the non-stable configurations that occur in the course of stabilization.) I explain in Endnote #4 what’s going on here. Can you figure out what the number one hundred would look like in this system?

Or how about this machine?

Each square has four outgoing arrows: two that go nowhere and two that go to the square to the left. Counting in this system looks like

**001**

**002**

**003**

**020**

**021**

**022**

**023**

**200**

…

What do you get when you feed a hundred chips into this system? I answer this in Endnote #5.
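All of these machines (the binary one included) differ only in their firing rule, so one simulation covers them all. Here’s a Python sketch with parameters of my own devising: a square holding `threshold` chips fires, sending `carry` chips one square to the left and putting `keep` chips back where they were (the rest land on octagons).

```python
def count(n, threshold, carry, keep=0):
    """Feed n chips, one at a time, into an Engel machine whose squares
    fire when they hold `threshold` chips, sending `carry` chips left
    and keeping `keep` chips in place."""
    squares = [0]                        # squares[0] is the rightmost square
    for _ in range(n):
        squares[0] += 1
        i = 0
        while i < len(squares):          # fire until stable
            if squares[i] >= threshold:
                squares[i] += keep - threshold
                if i + 1 == len(squares):
                    squares.append(0)    # the machine is endless
                squares[i + 1] += carry
            else:
                i += 1
    digits = squares[::-1]               # most significant digit first
    while len(digits) > 1 and digits[0] == 0:
        digits.pop(0)                    # drop the leading zeros
    return digits

assert count(5, 2, 1) == [1, 0, 1]           # 2 -> x: five is 101
assert count(8, 3, 2) == [2, 1, 2]           # 3 -> 2x: eight is 212
assert count(8, 3, 1, keep=1) == [1, 1, 2]   # 3 -> x + 1: eight is 112
assert count(8, 4, 2) == [2, 0, 0]           # 4 -> 2x: eight is 200
```

(Confluence, discussed below, is what guarantees that my particular firing order gives the same stable configuration as any other.)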

**THE POINT**

The sesquinary system that I’ve shown you is a curiosity, with no importance in and of itself. The most I can say for it is that it illustrates one kind of mathematical enterprise: breaking down two or more things (in this case the binary and decimal systems) into a shared set of components (the spots and arrows of Engel machines) and then recombining those components in new ways to see what results. In this instance, nothing very profound emerges, but I wouldn’t go so far as to call it a dead end (also, see Endnote #6). It remains to be seen what could emerge from studying the interplay between Engel machines and systems for representing numbers.

Here’s a problem that I think is hard but might be doable: how many counting numbers are there that have a sesquinary representation that (if we ignore the **0**‘s at the front) reads the same forward and backward? (It’s easy to write down palindromic sequences of **0**‘s, **1**‘s and **2**‘s, but the overwhelming majority of them, like **11**, don’t correspond to counting numbers.) The biggest sesquinary palindrome I know is four hundred ninety-four, with sesquinary representation **2120010100212**. Are there others? Are there perhaps infinitely many others? In Endnote #7 I’ll explain (a) why there probably aren’t infinitely many others, (b) why we may never know for sure, and (c) why, despite (b), there’s room for hope.

Postscript: Stephen Lucas has pointed out that in the article “Powers of rationals modulo 1 and rational base number systems” (Israel Journal of Mathematics, November 2008, 168:53), Shigeki Akiyama, Christiane Frougny, and Jacques Sakarovitch rediscover and substantially extend my unpublished work on base 3/2. So anyone who wants to explore these ideas seriously should check out what they’ve done.

Next month: ~~The ChipChip Reversal Problem~~ The Global Roots of Exploding Dots

*Thanks to Sandi Gubin, Brian Hayes, Evelyn Lamb, Andy Latto, Mike Lawler, Simon Norton, Henri Picciotto, David Radcliffe, Don Reble, Shecky Riemann, James Tanton, Glen Whitney, and the folks who make OmniGraffle.*

**ENDNOTES**

#1. My term “base three-halves” is not a standard mathematical term. If you ask most mathematicians what “base three-halves” means, or what it ought to mean, they’d say that it refers to a number system using just the digits **0** and **1**, in which you repeatedly subtract the largest power of 3/2 that you can from the number you’re trying to represent. For instance, the biggest power of 3/2 you can subtract from 13/4 is 9/4 = (3/2)^{2}, leaving a remainder of 1 = (3/2)^{0}, so the base three-halves expansion of 13/4 would be **101**. This is a special case of β-expansions, with β equal to 3/2. One feature of this way of representing numbers is that most counting numbers have a non-terminating base three-halves expansion; for instance, 2 would be written as **10**.**010000010**… (corresponding to the expansion of 2 as (3/2)^{1} + (3/2)^{−2} + (3/2)^{−8} + …). In this system, the only digits that appear are **0**‘s and **1**‘s. My kind of base three-halves is different: for the cost of allowing the digit **2**, we get a system (the chip-firing version of base three-halves) in which every counting number has a terminating expansion. But the chip-firing version of base three-halves has lots of defects; for instance, it’s unclear to me how to write 1/3. The problem isn’t that there’s no way to do it — rather, the problem is that there are too many. There are infinitely many ways to write 1/3 as *a*_{1} (3/2)^{−1} + *a*_{2} (3/2)^{−2} + *a*_{3} (3/2)^{−3} + …, where each of the digits *a*_{1}, *a*_{2}, *a*_{3}, … is **0**, **1**, or **2**, but it’s not clear to me which of them should be considered the “right” one, since none of these expansions terminate (that’s not too hard to show) and none of them are periodic (as Don Reble recently proved).

Beta-expansions and chip-firing are mainstream topics, but this way of combining them is my own (though I wouldn’t be surprised if other people who’ve read the things I was reading in the 1990s were led to the same idea).

It can be shown that every counting number has exactly one representation as a sum of powers of 3/2 in which each power occurs at most twice.

#2. The slow way to figure out how to write one hundred in sesquinary is to count. A faster way is to use repeated addition, and an even faster way is to use multiplication.

I won’t demonstrate the slow method, though you can see how it starts in the video and in the slides, as far as twenty-three = **21222**. Twenty-four is **210110** (as in Glen Whitney’s video); twenty-five is **210111**; and so on. If you got the correct sesquinary representation of one hundred that way, congratulations! I doubt I could perform a hundred sesquinary increments without making a mistake somewhere along the way.

A faster way is to use the sesquinary machine to do addition. Let’s add the number twenty-five to itself to get fifty: when we add **210111** to itself we get **420222**, which (after a single firing) becomes **2120222**. Now double that fifty to get one hundred: **2120222** doubled is **4240444**, which after a lot of firings becomes **212001201**.

(This calculation is predicated on assumptions about sesquinary arithmetic that I won’t prove; all I’ll say here, for the pedants reading this, is that the correctness of the procedure hinges on the confluence property I wrote about last month; see Endnote #3 from that essay. For instance, were you wondering which of the **4**‘s to deal with first, when dealing with **4240444**? And did you find yourself wondering whether it mattered? The answer to the second question is “No, it doesn’t matter.” And if you’re wondering why, the answer is: “Because confluence.”)

An even faster way to compute one hundred is to use the sesquinary machine to do multiplication. Recall that ten is **2101**. If we were to square this number the way we square numbers in decimal (digit by digit, with no carrying yet), the partial products would be **2101**, **0000** shifted one place, **2101** shifted two places, and **4202** shifted three places, and their digit-by-digit sum would be **4414201**.

And we’d be done, because **4** is an allowed digit in base ten. But in base three-halves, we have some firing/carrying to do, and when we do it, **4414201** reduces to **212001201**, as before.
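You can replay this squaring computation mechanically: multiply digit lists the way you’d multiply polynomials (no carrying), then let the 3 → 2*x* rule do the firing. A Python sketch, with function names of my own choosing:

```python
def multiply(a, b):
    """Schoolbook product of two digit lists (most significant digit
    first), with no carrying: just the digit-by-digit convolution."""
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def normalize(digits):
    """Fire the 3 -> 2x rule until every digit is 0, 1, or 2."""
    d = digits[::-1]                 # work from the rightmost digit
    i = 0
    while i < len(d):
        if d[i] >= 3:
            d[i] -= 3                # three chips leave this square...
            if i + 1 == len(d):
                d.append(0)
            d[i + 1] += 2            # ...and two land one square to the left
        else:
            i += 1
    return d[::-1]

ten = [2, 1, 0, 1]                   # ten is 2101 in sesquinary
assert multiply(ten, ten) == [4, 4, 1, 4, 2, 0, 1]                    # 4414201
assert normalize(multiply(ten, ten)) == [2, 1, 2, 0, 0, 1, 2, 0, 1]   # one hundred
```

(My `normalize` fires low positions first; confluence is what licenses that arbitrary choice.)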

There’s an even faster method that Simon Norton pointed out to me. It’s based on the fact that when *n* is written as 3*k* + *r*, where the remainder *r* is 0, 1, or 2, the sesquinary representation of *n* is equal to the sesquinary representation of 2*k*, with the digit *r* tacked on at the end. This gives us a quick way to write the sesquinary representation of *n* from right to left via a process of repeatedly dividing by three, rounding down, and doubling.

For instance, since 100 divided by 3 leaves a quotient of 33 and a remainder of 1, the sesquinary expansion of 100 is the sesquinary expansion of twice-33 (namely 66) with a **1** tacked on at the end. To find the sesquinary expansion of 66, we continue the process.

*n* = 100, *k* = 33, *r* = 1

*n* = 66, *k* = 22, *r* = 0

*n* = 44, *k* = 14, *r* = 2

*n* = 28, *k* = 9, *r* = 1

*n* = 18, *k* = 6, *r* = 0

*n* = 12, *k* = 4, *r* = 0

*n* = 8, *k* = 2, *r* = 2

*n* = 4, *k* = 1, *r* = 1

*n* = 2, *k* = 0, *r* = 2

If you write the *r*‘s from right to left, you get **212001201**.

Norton’s approach can be described directly in chip-firing terms. Load all one hundred chips into the rightmost square at the start, and then let the chips fall (or rather fire) where they may! The magic of confluence assures us that this process will yield the same outcome as the original process of counting. If you fire as many of the hundred chips from the rightmost square as you can, and then fire as many chips as you can from the next square, and so on, you’ll essentially be doing Norton’s numerical process, but with chips instead of numbers. You can implement decimal-to-sesquinary conversion with a single line of Mathematica code:

`S[n_] := If[n < 3, {n}, Append[S[2 Floor[n/3]], Mod[n, 3]]]`
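For readers without Mathematica, here is what I believe is a faithful Python transcription of the same one-liner:

```python
def S(n):
    """Sesquinary digits of n (most significant first), by Norton's
    method: divide by three, record the remainder, double the quotient,
    and recurse."""
    return [n] if n < 3 else S(2 * (n // 3)) + [n % 3]

assert S(100) == [2, 1, 2, 0, 0, 1, 2, 0, 1]   # 212001201, as computed above
```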

#3. In the ordinary binary system, firing means trading in two chips in the 2^{n} position for one chip in the 2^{n+1} position. Instead of calling these positions the 2^{n} position and the 2^{n+1} position, let’s call them the *x*^{n} position and the *x*^{n+1} position, to keep our notation neutral. Then we could represent the firing operation algebraically as *x*^{n} + *x*^{n} → *x*^{n+1}, or more compactly as 2 *x*^{n} → *x*^{n+1}.

With this kind of notation, base three-halves is associated with the firing rule 3*x*^{n} → 2*x*^{n+1}, and the two rules discussed in the section “Other Machines” are associated with the firing rules 3*x*^{n} → 1*x*^{n+1}+1*x*^{n} and 4*x*^{n} → 2*x*^{n+1}.

Since each of these four rules involves coefficients that don’t depend on *n* (as opposed to rules like the often-reinvented “base factorial” rule *n* *x*^{n} → *x*^{n+1}), I prefer to compress the notation even further, by replacing *n* by 0:

2 *x*^{n} → *x*^{n+1} becomes 2 → *x* ,

3 *x*^{n} → 2 *x*^{n+1} becomes 3 → 2*x* ,

3 *x*^{n} → *x*^{n+1} + *x*^{n} becomes 3 → *x* + 1 , and

4 *x*^{n} → 2 *x*^{n+1} becomes 4 → 2*x* .

One could go further in this direction by looking at other rules of the form *m* → *p*(*x*), where *m* is some positive integer and *p*(*x*) is a polynomial with nonnegative coefficients, and I hope someone will eventually do this. The values of *x* that make *p*(*x*) equal to *m* have special importance in the study of the associated machine.

One can go even further by looking at systems in which carrying goes not just to the right but also to the left; for instance, the kind of chip-firing we did in Swine in a Line could be described by the rule 2 → *x*^{1} + *x*^{−1}. One of my favorite systems is given by the rule 3 → *x*^{1} + 2*x*^{−1}; it was inspired by a biased one-dimensional random walk (“hop left with probability 1/3, hop right with probability 2/3”), but it leads to an interesting “unary-binary” way of representing counting numbers by a string of 0’s, 1’s, and 2’s that gives the correct value whether you interpret it in base 1 or base 2! Some MIT students and I looked into some of these rules about six years ago, but unfortunately I dropped the ball and this work was never published.

#4. The representation of *n* for the abacus that implements the rule 3 → *x* + 1 is just the base-two representation of *n*+1 with its first digit lopped off and every other digit incremented by 1 (so that **0**s become **1**s and **1**s become **2**s). For instance, one hundred and one in base two is **1100101**, so one hundred in the 3 → *x*+1 system is **211212**.
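That recipe is nearly a one-liner in Python (a sketch; the function name is my own):

```python
def rep_3_to_x_plus_1(n):
    """Representation of n under the 3 -> x + 1 rule: take the binary
    representation of n + 1, lop off its first digit, and add 1 to
    each remaining digit."""
    # bin() gives '0b1...'; slicing at [3:] drops '0b' and the leading 1
    return ''.join(str(int(b) + 1) for b in bin(n + 1)[3:])

assert rep_3_to_x_plus_1(100) == '211212'  # via one hundred one = 1100101 in binary
assert rep_3_to_x_plus_1(7) == '111'       # matches the counting list above
```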

#5. To write the counting number *n* in the fashion of the abacus that implements the rule 4 → 2*x*, first divide *n* by 4 and write *n* = 4*k* + *r* (where *k* is the quotient and the remainder *r* is 0, 1, 2, or 3); then write the ordinary binary representation of *k*, double each digit (so that **0**s stay **0**s and **1**s become **2**s), and stick *r* at the end. For instance, twenty-five in base two is **11001**, so one hundred in the 4 → 2*x* system is **220020**.
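This recipe, too, is easy to code up (a sketch; the function name is mine):

```python
def rep_4_to_2x(n):
    """Representation of n under the 4 -> 2x rule: the binary digits
    of n // 4, each doubled, with the remainder n % 4 appended."""
    k, r = divmod(n, 4)
    doubled = ''.join(str(2 * int(b)) for b in bin(k)[2:]) if k > 0 else ''
    return doubled + str(r)

assert rep_4_to_2x(100) == '220020'  # via twenty-five = 11001 in binary
assert rep_4_to_2x(8) == '200'       # matches the counting list in "Other Machines"
```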

#6. Near the end of the essay I wrote “In this instance, nothing very profound emerges”, but that’s only true if one thinks of base three-halves in isolation. If one takes a broader view, chip-firing leads naturally to the study of the abelian sandpile model, which is a very active area of research that beautifully ties together seemingly disparate ideas from separate branches of mathematics. If you want to see what a two-dimensional number system looks like, check out the Numberphile video on sandpiles. For more background, see Jordan Ellenberg’s Nautilus article “The Amazing Autotuning Sandpile”.

Going beyond the precincts of pure mathematics, there are scientists who think that sandpiles have something to tell us about complex systems in the real world. Consider: When you add a chip to a chip configuration, sometimes nothing much happens and sometimes a whole lot happens; the system is poised on the brink of instability. Physicists say that such a system exhibits *criticality*. Many complex physical systems exhibit the same sort of criticality, and indeed appear to be set up so as to drive themselves into a critical state: this is called *self-organized criticality*. It’s conceivable that the study of sandpiles could at some point shed light on real-world phenomena.

#7. At the very end of the essay I talked about the “sesquinary palindrome problem”. The largest known sesquinary palindrome is the counting number whose decimal representation is **494** and whose sesquinary representation is **2120010100212**. David Radcliffe has continued the search up to one quadrillion (10^{15}) and failed to find any new palindromes. I doubt that there are others at all, but even if there are others, I’m pretty confident that there are only finitely many. That’s because as far as I can tell, the leftmost digits (leaving aside the first two or three) behave pseudo-randomly, so that the *k*th digit from the left agrees with the *k*th digit from the right only about 1/3 of the time. That’s not such bad odds, but that means when the 3 → 2*x* counter spits out an *m*-digit sesquinary numeral, the chance of it being palindromic is about 1/3 to the power of *m*/2 (where *m*/2 is the number of pairs of digits that have to agree in order for the numeral to be a palindrome). Pretending for a moment that the sesquinary representation of the counting number *n* is a random sequence of *m* 0’s, 1’s, and 2’s, where *n* is about (3/2)^{m}, each *n* has a positive probability of being a “winner” (that is, could yield a sesquinary palindrome), but the probability gets smaller as *n* gets bigger, and it can be shown that the infinite series that represents the expected number of wins converges to a finite value. So, if the digits were random, we’d expect there to be only finitely many palindromes.

Now, this is a very weak argument, because it’s based on the assumption that the digits are random, and we know that they’re not! But it does carry some evidentiary weight, because arguments that are just as flimsy (and are flimsy for similar reasons) have a decent track record in number theory. So, until someone shows me a pattern that governs the digits of sesquinary representations, I’m going to think of them as being, for most purposes, as good as random (or, equivalently, as bad as random), and I’m going to expect there to be only finitely many palindromes. In fact, I’d be surprised if there were any past 494. (It’s curious, but probably coincidental, that 494 is a decimal palindrome as well, and that if you add 1 to it, you get a binary palindrome.)
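If you’d like to hunt for sesquinary palindromes yourself, a brute-force Python sketch (using the same divide-by-three recursion as the Mathematica code in Endnote #2) confirms in a modest range what Radcliffe’s much deeper search found: nothing past 494.

```python
def S(n):
    """Sesquinary digits of n (most significant first)."""
    return [n] if n < 3 else S(2 * (n // 3)) + [n % 3]

# search a modest range for palindromic representations
palindromes = [n for n in range(1, 10000) if S(n) == S(n)[::-1]]

assert palindromes[-1] == 494                               # the last one below 10000
assert S(494) == [2, 1, 2, 0, 0, 1, 0, 1, 0, 0, 2, 1, 2]    # 2120010100212
```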

I wrote “In the Endnotes I’ll explain (a) why there probably aren’t infinitely many others, and (b) why we may never know for sure!” There are two famous unsolved problems in math about repeatedly multiplying by 3/2 and rounding: one of them is Mahler’s 3/2 problem and the other is the Collatz problem. (There’s probably nothing especially fiendish about 3/2; it just happens to be the simplest rational number bigger than 1 that isn’t an integer.) Despite the fact that both problems have attracted a fair bit of attention (the latter from recreational mathematicians as well as professionals), not much progress has been made. It’s possible that both problems are forever beyond the scope of human mathematics; it’s also possible (and I like to believe) that someday we’ll crack them. Given the history of the problems, I don’t think mere cleverness will be enough; a certain amount of hard work will be involved, some of it devoted to “nearby” problems.

But I think that the sesquinary palindrome problem neither deserves nor is likely to receive as much attention as those two problems, and it’s not clear to me that there are any “nearby” problems that will lead researchers to develop techniques that could then be repurposed to attack the sesquinary palindromes puzzle. So I think there’s a decent chance that nobody will ever have the incentive, or the luck, or the cleverness, to actually prove that 494 is the last sesquinary palindrome.

On the other hand, in preparing this article (and trying to shore up my assertion about how hard the problem is), I came across a pattern related to the initial digits of sesquinary representations, which now makes me think I overestimated the difficulty of the problem. Here’s some Mathematica code I ran (using the earlier code for converting decimal to sesquinary):

`Table[Length[Tally[Table[Table[S[n][[i]], {i, 1, m}], {n, 1000000, 2000000}]]], {m, 1, 20}]`

This code tells, for each *m* from 1 to 20, how many different possibilities there are for the first *m* sesquinary digits of *n*, as *n* goes from one million to two million. Here’s what Mathematica reported:

1, 1, 2, 3, 4, 6, 9, 14, 21, 31, 47, 70, 105, 158, 237, 355, 533, 799, 1199, 1798

It turns out that this sequence of numbers has been studied before: see the Online Encyclopedia of Integer Sequences entry A005428. Each term is obtained from the term before by adding all the preceding terms, adding 1, dividing by 2, and rounding down to a whole number. This could be a coincidence, but I think it’s more likely that there’s a “story” governing the initial digits of sesquinary representations, and that human minds can find it. Maybe one of you will figure it out.

**Ross**: “…There are infinitely many ways to write 1/3 … none of these expansions terminate … and none of them are periodic …”

Fascinating.

So “1/3” is irrational in base 3/2?


**jamespropp** (post author): If by “irrational” you mean “non-terminating and non-repeating”, then yes! (Most mathematicians would prefer to reserve the word “irrational” to mean “not expressible as a ratio of integers”, and for the rational/irrational distinction to be independent of the system used to represent numbers.)


**jamespropp** (post author): Following up on Endnote #7: After I wrote the last paragraph of the essay, I realized that the different possibilities for the first *m* sesquinary digits of large numbers are precisely the sesquinary representations of the even integers in a certain range, and Glen Whitney pointed out that there’s a nice recursive tree-structure for these representations. So there are definitely things that can be proved (though I don’t know if they’ll be enough to help us find more palindromes or disprove their existence).


**Steve Lucas**: I first saw your version of base 3/2 in an article on the Art of Problem Solving website, produced by a 16-year-old in 2007. The link no longer works, but I kept a copy. Unfortunately, there is no reference to where the idea came from, but if you were doing this 20 years ago, you clearly get precedence!

An amusing variation I’ve looked at would be to build (in your notation) a 6 → 4*x* machine. This would be equivalent to base 6/4, and you get a completely different sequence. And yet their beta-sequences are identical.

Thinking aloud, I wonder if there is an equivalent set of new (interesting?) representations of natural numbers in base 3/2 with digits larger than 2 allowed?

Finally, after reading this article, I was motivated to think about representing numbers as sums of Fibonacci numbers — known as Zeckendorf notation. I actually talked about this at the first MOVES conference (I didn’t make it this year). An Engel machine version could be built, but it would require a nested version. One rule is 2 → *x* + *x*^{−2}. The second would be 1 + *x* → *x*^{2}. Cool!


**jamespropp** (post author): Neat! One subtlety of firing rules like 1 + *x* → *x*^{2} is that the confluence property gets lost (or at least requires a different proof).



**Chris**: What would be the code or formula to do sesquinary-to-decimal? I’ve been fiddling around with it and it’s beyond my abilities.


**jamespropp** (post author): I’d do it recursively. If the last digit of some sesquinary numeral *n* is *r*, and we let *m* be the sesquinary numeral you get by crossing off the *r*, then the decimal value of *n* equals 3/2 times the decimal value of *m*, plus *r*. So (letting d(*n*) mean the decimal value of *n*) we get d(2101) = (3/2) d(210) + 1, d(210) = (3/2) d(21) + 0, d(21) = (3/2) d(2) + 1, and d(2) = 2 (the recursion bottoms out), so d(21) = (3/2)·2 + 1 = 4, d(210) = (3/2)·4 + 0 = 6, and d(2101) = (3/2)·6 + 1 = 10.
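(Editorially, that recursion amounts to Horner’s rule; a Python sketch with exact fractions, function name mine:)

```python
from fractions import Fraction

def d(digits):
    """Decimal (exact rational) value of a sesquinary digit list, most
    significant digit first: d(m with r appended) = (3/2) d(m) + r."""
    v = Fraction(0)
    for r in digits:
        v = v * Fraction(3, 2) + r
    return v

assert d([2, 1, 0, 1]) == 10   # d(2101) = 10, as in the worked example
assert d([2, 1, 2]) == 8
```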


**Baron**: How could you write one-half in base 3/2?


**jamespropp** (post author): I think the best way is to write 1/2 as …111122, where the expansion goes infinitely far to the left instead of the right. This is related to *p*-adic number systems. I plan to write an essay about this one of these days. In the 3-adic numbers, the infinite series 2·(3/2)^0 + 2·(3/2)^1 + 1·(3/2)^2 + 1·(3/2)^3 + 1·(3/2)^4 + 1·(3/2)^5 + … converges to 1/2.
