The Seldon Fallacy

Like some of my role models, I am inspired by Isaac Asimov’s vision. However, for years, the central ability at the heart of the Foundation series–‘psychohistory,’ which enables Hari Seldon, the protagonist, to predict broad social trends across an entire galaxy over thousands of years–has bothered me. Not so much because of its impact in the fictional universe of Foundation, but because of how closely it matches the real-life ideas of predictive modeling. I truly fear that the Seldon Fallacy is spreading, building up society’s exposure to negative, unpredictable shocks.

The Seldon Fallacy: 1) It is possible to model complex, chaotic systems with simplified, non-chaotic models; 2) Combining chaotic elements makes the whole more predictable.

The first part of the Seldon Fallacy is the mistake of assuming reducibility, or more poetically, of Nassim Nicholas Taleb’s Procrustean Bed. As F.A. Hayek asserted, no predictive model can be less complex than the system it predicts, because of second-order effects and the accumulation of errors of approximation. Isaac Asimov’s central character, Hari Seldon, fictionally ‘proves’ the ludicrous fallacy that chaotic systems can be reduced to ‘psychohistorical’ mathematics. I hope you, reader, don’t believe that…so you don’t blow up the economy by betting a fortune on an economic prediction. Two famous examples from physics illustrate the problem: the three-body problem and the damped, driven oscillator. If we can’t even model a system with three ‘movers’ because of second-order effects, how can we model interactions between millions of people? With no way to know which reductions in complexity are meaningful, Seldon cannot know whether, in laying his living system on a Procrustean bed, he has accidentally decapitated it.
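To make the irreducibility point concrete, here is a minimal sketch (in Python, with every parameter an illustrative assumption) of the damped, driven oscillator just mentioned: two pendulums that start within a billionth of a radian of each other and are integrated under identical rules.

```python
import numpy as np

# Damped, driven pendulum: theta'' = -b*theta' - sin(theta) + A*cos(omega_d*t).
# b=0.5, A=1.2, omega_d=2/3 are illustrative values commonly used to put the
# system in its chaotic regime.
def deriv(state, t, b=0.5, A=1.2, omega_d=2.0 / 3.0):
    theta, omega = state
    return np.array([omega, -b * omega - np.sin(theta) + A * np.cos(omega_d * t)])

def rk4_step(state, t, dt):
    # Classic fourth-order Runge-Kutta step.
    k1 = deriv(state, t)
    k2 = deriv(state + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = deriv(state + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = deriv(state + dt * k3, t + dt)
    return state + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

dt, steps = 0.01, 50_000
p1 = np.array([0.2, 0.0])         # angle 0.2 rad, at rest
p2 = np.array([0.2 + 1e-9, 0.0])  # identical, save a billionth of a radian
for i in range(steps):
    p1, p2 = rk4_step(p1, i * dt, dt), rk4_step(p2, i * dt, dt)

print(f"angle gap after {steps * dt:.0f} time units: {abs(p1[0] - p2[0]):.3f} rad")
```

In the chaotic regime the two trajectories end up macroscopically different, so any rounding of a model’s inputs–any reduction of complexity–eventually swamps its forecast.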

Yet in the novels, while unable to predict individuals’ actions precisely, Seldon maps out social forces with such clarity that he correctly predicts the fall of a 10,000-year empire. That is the second part of the Seldon Fallacy–the claim that big things are predictable even if their constituent elements are not–and it is the mistake of ‘the marble jar.’ Not all randomness is equal: drawing white and black marbles from a jar (with replacement) is fundamentally predictable, and the more marbles drawn, the more predictable the mix of marbles in the jar. Many models depend on this assumption or ones like it–that random events distribute normally (in the Gaussian sense), so that the certainty of the model increases as the number of samples increases. But what if the events are not independent? What if they are not Gaussian? What if someone tricked you and tied some marbles together, so you can’t take out only one? What if one of them is attached to the jar, and by picking it up you inadvertently break the jar, spilling the marbles? Effectively, what if you are working not with a finite, reducible, Gaussian random system, but with an infinite, Mandelbrotian, real-world random system? What if the jar contains not marbles, but living things?
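Here is a small sketch of the difference between the two kinds of randomness, with the jar’s composition and the tail index invented for illustration: independent marble draws obey the law of large numbers, while a fat-tailed (Mandelbrotian) sample need not settle down.

```python
import numpy as np

rng = np.random.default_rng(42)

# Marble-jar randomness: independent draws, with replacement, from a jar
# assumed to be 60% white. Mandelbrotian randomness: a Pareto distribution
# with tail index 1.1 (finite mean, infinite variance).
for n in (100, 10_000, 1_000_000):
    marbles = rng.random(n) < 0.6           # True = drew a white marble
    pareto = rng.pareto(1.1, size=n) + 1.0  # classical Pareto, minimum 1
    print(f"n={n:>9}  white share: {marbles.mean():.4f}  "
          f"Pareto sample mean: {pareto.mean():10.2f}")
```

In a typical run the white share locks onto 0.6 almost immediately, while the Pareto mean keeps lurching as rare, enormous draws land in the sample. More data buys certainty in the first world and false comfort in the second.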

I apologize if I lean too heavily on fiction to make my points, but another amazing author answers this question much more poetically than I could. Just in the ‘quotes’ from wise leaders in the introductions to his historical-fantasy series, Jim Butcher tells stories of the rise and fall of civilizations. First, on cumulative meaning:

“If the beginning of wisdom is in realizing that one knows nothing, then the beginning of understanding is in realizing that all things exist in accord with a single truth: Large things are made of smaller things.

Drops of ink are shaped into letters, letters form words, words form sentences, and sentences combine to express thought. So it is with the growth of plants that spring from seeds, as well as with walls built from many stones. So it is with mankind, as the customs and traditions of our progenitors blend together to form the foundation for our own cities, history, and way of life.

Be they dead stone, living flesh, or rolling sea; be they idle times or events of world-shattering proportion, market days or desperate battles, to this law, all things hold: Large things are made from small things. Significance is cumulative–but not always obvious.”

–Gaius Secundus, Academ’s Fury

Second, on the importance of individuals as causes:

“The course of history is determined not by battles, by sieges, or usurpations, but by the actions of the individual. The strongest city, the largest army is, at its most basic level, a collection of individuals. Their decisions, their passions, their foolishness, and their dreams shape the years to come. If there is any lesson to be learned from history, it is that all too often the fate of armies, of cities, of entire realms rests upon the actions of one person. In that dire moment of uncertainty, that person’s decision, good or bad, right or wrong, big or small, can unwittingly change the world.

But history can be quite the slattern. One never knows who that person is, where he might be, or what decision he might make.

It is almost enough to make me believe in Destiny.”

–Gaius Primus, Furies of Calderon

If you are not convinced by the wisdom of fiction, put down your marble jar and run a real-world experiment. Take 100 people from your community and measure their heights; predict the mean and distribution of height. While you are at it, ask each of the 100 for their net worth, and predict a mean and distribution from that as well. Then take a gun, and shoot the tallest person and the richest person. Run your models again. Before you look at the results, tell me: which mean do you expect shifted more?

I seriously hope you bet on the wealth model. Height, like marble-jar samples, is normally distributed. Wealth follows a power law, meaning that individual datapoints at the extremes have outsized impact. If you happen to live in Seattle and shot a tech CEO, you may have lowered the group’s mean net worth by more than the average net worth of the other 99 people!
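If you would rather not fire the gun, a simulation makes the same point; every distribution and number below is invented for illustration, with heights drawn from a Gaussian and net worths from a Pareto (power-law) tail.

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical community of 100 people.
heights = rng.normal(loc=170.0, scale=10.0, size=100)  # cm, thin-tailed
wealth = (rng.pareto(1.2, size=100) + 1.0) * 50_000.0  # dollars, fat-tailed

for name, sample in (("height", heights), ("wealth", wealth)):
    survivors = np.delete(sample, sample.argmax())  # remove the largest datapoint
    shift = abs(sample.mean() - survivors.mean()) / sample.mean()
    print(f"{name}: dropping the maximum moves the mean by {shift:.1%}")
```

In a typical run the height mean barely moves, well under one percent, while the wealth mean can fall by double digits: a single Pareto-tail outlier holds a large share of the total, just as a single tech CEO can dominate a neighborhood’s mean.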

So, unlike the Procrustean Bed (part 1 of the Seldon Fallacy), the Marble Jar (part 2 of the Seldon Fallacy) is not always a fallacy: some systems really are Gaussian, and for them the marble-jar logic holds. However, many consequential systems–including earnings, wars, governmental spending, economic crashes, bacterial resistance, inventions’ impacts, species survival, and climate shocks–are non-Gaussian, and thus the impact of a single individual action can blow up the model.

The crazy thing is, Asimov himself contradicts his own protagonist in what I consider his magnum opus. While the Foundation series keeps alive the myth of the predictive simulation, my favorite of his books–The End of Eternity (spoilers ahead)–is a magnificent destruction of the concept of a ‘controlled’ world. For large systems, the book is a death knell even for predictability itself. The Seldon Fallacy–that a simplified, non-chaotic model can predict a complex, chaotic reality, and that scale enhances predictability–is shown, through the adventures of Andrew Harlan, to be riddled with hubris and catastrophic risk. I will not reduce Asimov’s complex ideas to a simple summary, for I may decapitate his central model; please read the book yourself. I will only say that I hope you take to heart Asimov’s larger lesson on predictability: it is not only impossible, but undesirable. And please, let’s avoid staking any of our futures on today’s false prophets of predictable randomness.