John Horton Conway, the English-born Princeton mathematician who by his own account never worked a day in his life — thereby earning many prizes, and his reputation as a creative, iconoclastic, magical genius — died on Saturday. He was 82.
His wife, Diana Conway, confirmed that he died of Covid-19.
Dr. Conway’s oeuvre ran broad and deep, from the rigorously highbrow to the frivolously fun.
Following his promiscuous curiosity, Dr. Conway produced profound contributions in number theory, game theory, coding theory, group theory, knot theory, topology, probability theory, algebra, analysis, combinatorics and more. Foremost, he considered himself a geometer.
“His swath was probably broader than anyone who ever lived,” said the mathematician Neil Sloane, the founder of the On-Line Encyclopedia of Integer Sequences. “I’ve worked with a lot of people, and he was the fastest at solving a problem and would pursue a topic as far as it would go.” The two were co-authors of 50 papers and published the 706-page book “Sphere Packings, Lattices and Groups,” aka SPLAG.
During what Dr. Conway called his “annus mirabilis” (which he stretched from 1968 to 1969 to 1970, encompassing various bits here and there), he discovered the Conway group, an entity in the realm of mathematical symmetry that inhabits 24-dimensional space, as well as a new type of number, “surreal numbers.” And he invented the cellular automaton Game of Life, among the most beautiful mathematical models of computation. Dr. Conway described it as a “no-player never-ending” game.
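The rules of Life fit in a few lines of code. As a minimal sketch (the set-of-live-cells representation and the function name here are illustrative, not Dr. Conway’s): a live cell survives with two or three live neighbors, a dead cell is born with exactly three live neighbors, and everything else dies or stays dead.

```python
# A minimal sketch of Conway's Game of Life; the set-based
# representation and names are illustrative choices, not canonical.
from collections import Counter

def step(live):
    """Advance one generation of Life.

    `live` is a set of (x, y) coordinates of live cells.
    A dead cell with exactly 3 live neighbors is born;
    a live cell with 2 or 3 live neighbors survives.
    """
    # Count, for every cell adjacent to a live cell, how many
    # live neighbors it has.
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell
        for cell, n in neighbor_counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# The "blinker", three cells in a row, oscillates with period 2:
blinker = {(0, 0), (1, 0), (2, 0)}
print(step(blinker))        # the vertical form: {(1, -1), (1, 0), (1, 1)}
print(step(step(blinker)))  # back to the horizontal form
```

From rules this simple come gliders, oscillators and self-sustaining patterns, the “glorious complexity emerging from staid simplicity” Mr. Eno describes below.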
His friend Martin Gardner, the longtime Scientific American mathematical games columnist, called it Dr. Conway’s “most famous brainchild.” He reckoned that when Life went viral — with addicts programming it at home and at work — one quarter of the world’s computers were playing.
“Conway’s LIFE changed mine,” the musician Brian Eno said in an email. “I think Conway himself thought it rather trivial, but for a nonmathematician like me, it was a shock to the intuition, a shattering revelation — to watch glorious complexity emerging from staid simplicity. It crystallised many thoughts I’d been having about art, music and evolutionary processes. I still go back to it regularly and with the same fascination.”
But by about 2000 Dr. Conway had disowned this brainchild, proclaiming “I HATE LIFE!” every chance he got, preferably from a podium. (Only recently had he come to love Life again.)
He would rather have been famous for surreal numbers, the creation of which he was proudest. Described by Mr. Gardner as “an astonishing feat of legerdemain,” the surreals are a super-continuum of numbers including all the old-fashioned real numbers (integers, fractions and irrationals like pi) as well as those that go above, beyond, below and within, embracing all the infinites and infinitesimals.
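The construction behind the surreals can be sketched in a line. In Conway’s scheme, every number is a pair of sets of previously created numbers, written {L | R}, with no member of the left set greater than or equal to any member of the right:

```latex
% Every surreal number is a pair of sets of earlier surreal numbers:
x = \{\, L \mid R \,\}, \quad \text{with no } \ell \in L \text{ satisfying } \ell \ge r \text{ for any } r \in R.
% Starting from the empty pair, familiar and unfamiliar numbers appear in turn:
0 = \{\,\mid\,\}, \quad 1 = \{\,0 \mid\,\}, \quad -1 = \{\,\mid 0\,\}, \quad \tfrac{1}{2} = \{\,0 \mid 1\,\},
% ...and, after infinitely many days, the infinites and infinitesimals:
\omega = \{\,0, 1, 2, \ldots \mid\,\}, \quad \tfrac{1}{\omega} = \{\,0 \mid 1, \tfrac{1}{2}, \tfrac{1}{4}, \ldots\,\}.
```

The last two examples are the “above, beyond, below and within”: an infinite number larger than every integer, and a positive number smaller than every fraction.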
Dr. Conway always hoped that the surreals might find application, perhaps in helping to illuminate the universe on the cosmic and quantum scales. He viewed this discovery as so fundamental that he named it “No” in boldface type, meaning “Numbers.” The Stanford computer scientist Donald Knuth, author of “The Art of Computer Programming,” came up with “surreal” while writing his novelette “Surreal Numbers: How Two Ex-Students Turned On to Pure Mathematics and Found Total Happiness.”
“Although John was a pure mathematician, he covered so many bases that he has been mentioned more than 25 times (so far) for different contributions to The Art of Computer Programming,” said Dr. Knuth. “For me, he is my second-favorite mathematician — outshone only by Leonhard Euler.”
One of Dr. Conway’s favorite accomplishments was the Free Will Theorem, conceptualized casually over the course of a decade with his friend and fellow Princeton mathematician Simon Kochen, first published in 2006 and later revised. (The simplest general statement of the Free Will Theorem is: If physicists have free will while performing experiments, then elementary particles possess free will as well. And this, Dr. Conway and Dr. Kochen reckoned, probably explains why and how humans have free will in the first place.)
“In mathematics and physics there are two kinds of geniuses,” Dr. Kochen said, by phone from his home in Princeton, echoing what the mathematician Mark Kac once said about the physicist Richard Feynman. “There are the ordinary geniuses — they are just like you and me but they are better at it; if we’d worked hard enough, maybe we could get some of the same results.
“But then there are the magical geniuses,” he added. “Richard Feynman was a magical genius. And the same always struck me about John — he was a magical mathematician. He was a magical genius rather than an ordinary genius.”
John Horton Conway was born on Dec. 26, 1937, in Liverpool, England, to Cyril and Agnes (Boyce) Conway. His father, an autodidact, had left school at age 14 and, with his photographic memory, made a living playing cards. Later he was a technician in the chemistry lab at the Liverpool Institute High School for Boys, setting up experiments for students, among them George Harrison and Paul McCartney.
Dr. Conway’s mother, a great reader, especially of Dickens, had worked from age 11. Family lore has it that she boasted about finding her son at the age of 4 reciting the powers of two. He had two older sisters, Joan and Sylvia, and Joan recalled that her brother loved to count, persistently asking: “What’s more? What’s the more?! When does it end?”
At 18, in 1956, John left home for Cambridge University, where he eventually finished a Ph.D. dissertation on ordering infinite sets, with the number theorist Harold Davenport as his thesis adviser. Dr. Davenport once said that he had had two very good students: Alan Baker (later a Fields Medalist), to whom he would give a problem and who would return with a very good solution, and Conway, “to whom I would give a problem and he would return with a very good solution to another problem.”
As a student he cultivated his penchant for being lazy, playing games and doing no work, and he was easily distracted from his academic work by what he called “nerdish delights.” He went on a hexaflexagon binge (courtesy of Mr. Gardner, who described flexagons in his column as “polygons, folded from straight or crooked strips of paper, which have the fascinating property of changing their faces when they are flexed”). He built a water-powered computer, which he called Winnie (Water Initiated Nonchalantly Numerical Integrating Engine). He read and annotated H.S.M. Coxeter’s edition of W.W. Rouse Ball’s classic work, “Mathematical Recreations and Essays,” and wrote Coxeter a lengthy letter that started a lifelong friendship between the two classical geometers.
Hired at Cambridge as an assistant lecturer, Dr. Conway gained a reputation among students for his high jinks (not to mention his scruffy dishevelment). Lecturing on symmetry and the Platonic solids, he might bring in a turnip as a prop, carving it one slice at a time into, say, an icosahedron, with 20 triangular faces, eating the scraps as he went. “He was by far the most charismatic lecturer in the faculty,” his Cambridge colleague Peter Swinnerton-Dyer once said.
At the same time, Dr. Conway invented a profusion of games — like Phutball (short for Philosopher’s Football; a little like playing checkers on a Go board) — and collected them in the book “Winning Ways for Your Mathematical Plays,” written in collaboration with Elwyn Berlekamp and Richard Guy.
All the gaming — always in the common room, his office being a hazardous tip — was supported by a loyal following of graduate students, among them Simon Norton, with whom Dr. Conway published the “Monstrous Moonshine” conjecture, investigating an elusive symmetry group that lives in 196,883 dimensions. “One feels the Monster can’t exist without a very real reason,” Dr. Conway said. “But I don’t have any idea what that reason is. Before I die, I really want to understand WHY the Monster exists. But I’m almost certain I won’t.”
Dr. Conway’s Ph.D. student Richard Borcherds received the Fields Medal in 1998 for his proof of the Monstrous Moonshine conjecture. “He was too bright,” Dr. Conway said. “He never needed me.”
At Cambridge Dr. Conway rose to become a “professor of mathematics” (while others opted for titles like “professor of number theory,” he chose the whole shebang) as well as a supernumerary fellow at Gonville and Caius College, his alma mater. He was named a fellow of the Royal Society in 1981. (With the “FRS” honorific, he bragged, he was now officially a “Filthy Rotten Swine.”)
Four years later he published “The ATLAS of Finite Groups,” a book 15 years in the making and written with Robert Curtis, Simon Norton, Richard Parker and Robert Wilson. It is one of the most important books in group theory, a branch of mathematics at the heart of physicists’ theories of the universe and the fundamental forces of nature.
That same year he was invited to give a talk at Princeton, and a job offer followed. In 1987 he took up the position of the John von Neumann professor of applied and computational mathematics. In announcing the hire, Princeton’s president called Dr. Conway “one of the most eminent mathematicians of the century.”
In America, Dr. Conway, a mischievous seducer, began drawing media attention. Asked by a reporter for The New York Times about his life of the mind, he replied: “What happens most of the time is nothing. You just can’t have ideas often.”
He became a fellow of the American Academy of Arts and Sciences in 1992. A fellow inductee, the mathematician Robert MacPherson, recalled that at the ceremony Dr. Conway accepted his honor in what appeared to be green running shorts. (Joyce Carol Oates, a fellow Princetonian, was honored the same year.) The citation noted that Dr. Conway was a mathematician and an educator — and increasingly, he considered himself more the latter. As he liked to say: “If it sits down, I’ll teach it. If it stands up, I will continue to teach it. But if it runs away, I maybe won’t be able to catch up.”
His first two marriages, to Eileen Howe and Larissa Queen, ended in divorce. He is survived by four daughters from his first marriage, Annie, Ellie, Rosie and Susie; two sons from his second marriage, Oliver and Alex; his wife, Diana Conway, and their son, Gareth; three grandchildren; and six great-grandchildren.
At Princeton he was almost invariably recruited to give the first-year course intended to persuade students to become math majors. And he offered extracurricular content, like a campus tour titled “How to Stare at a Brick Wall.”
“I worked (and played) a great deal with him during my first year at Princeton,” said Manjul Bhargava, one of Dr. Conway’s students, now a mathematician at Princeton. “I was not ready to let him go so soon and so suddenly. There were still many games left to play.”
He gave over his summers — prime research time — to teaching at math camps, embedding as a camper for weeks at a time. He was a star attraction, despite the fact that his talks were advertised vaguely as “John Conway Hour, NTBA” (Not to Be Announced). He would take requests from students and deliver an extemporaneous lecture, his signature style.
Math, Dr. Conway believed, should be fun. “He often thought that the math we were teaching was too serious,” said Mira Bernstein, a mathematician and former executive director of Canada/USA Mathcamp, an international summer program for high-school students passionate about mathematics. “And he didn’t mean that we should be teaching them silly math — to him, fun was deep. But he wanted to make sure that the playfulness was always, always there.”
Dr. Conway persevered in finding the fun through triple bypass surgery, a suicide attempt and a number of strokes. Sometimes he would regale anyone willing to listen with the science of rainbows, or with his Doomsday rule for calculating the day of the week for any given date. He gleefully triumphed at a pie-eating race following a pi recitation contest on Pi Day.
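The Doomsday rule he so enjoyed explaining is compact enough to sketch. The idea: every year has a “doomsday,” the weekday shared by easy-to-remember dates such as 4/4, 6/6, 8/8, 10/10 and 12/12, and any date’s weekday is a short count away from the nearest such anchor. A hedged Python rendering, with helper names that are mine rather than Conway’s:

```python
# A sketch of Conway's Doomsday rule for the Gregorian calendar.
# The function and variable names are illustrative, not Conway's own.

DAYS = ["Sunday", "Monday", "Tuesday", "Wednesday",
        "Thursday", "Friday", "Saturday"]

def is_leap(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def day_of_week(year, month, day):
    # Anchor day for the century (Gregorian calendar cycles every 400 years).
    anchor = (5 * ((year // 100) % 4) + 2) % 7
    # The year's "doomsday": the weekday shared by 4/4, 6/6, 8/8, 10/10, 12/12.
    y = year % 100
    doomsday = (anchor + y + y // 4) % 7
    # One easy-to-remember doomsday date per month (January through December);
    # January and February shift in leap years.
    anchors = [3, 28, 14, 4, 9, 6, 11, 8, 5, 10, 7, 12]
    if is_leap(year) and month <= 2:
        anchors = [4, 29] + anchors[2:]
    return DAYS[(doomsday + day - anchors[month - 1]) % 7]

print(day_of_week(2020, 4, 11))   # the Saturday of his death -> Saturday
print(day_of_week(1937, 12, 26))  # his birthday -> Sunday
```

Conway could run this calculation in his head in seconds, and for years he drilled himself to answer random dates ever faster.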
And there were evermore games of Phutball, which Dr. Conway was not very good at, even though he was the inventor. Sometimes when all seemed lost — when he was almost certainly beaten at his own game, though he might yet magically prevail — he’d delight in borrowing from Mark Twain and admonishing his opponents, bellowing: “Reports of my death have been greatly exaggerated!”