Reading Foundation for the very first time in 2025
Isaac Asimov’s seminal sci-fi novel is remarkably prescient when it comes to the dominant power of technology, especially who wields it and who can turn it off.
After decades of intending to, I've just finished reading Isaac Asimov’s novel, Foundation, for the very first time. Initially, I worried that having watched Season 1 of Foundation on Apple TV+ would inevitably influence how I imagined the characters and locations in this hugely influential work of science fiction. I needn't have worried. Foundation, the book, is so vastly different from the show as to be almost entirely unrecognizable.
Foundation is a very cerebral book; I often felt I was studying it, rather than reading it for pleasure. It’s more like a play than a movie, very thoughtful, very dialogue-driven, and full of big, complex ideas. It’s easier to imagine key scenes taking place on stage rather than an IMAX screen: more Henry V than Dune. The strategic machinations of its protagonists—like Salvor Hardin’s shrewd and pragmatic maneuvering against plots from within and without—feel less like a modern sci-fi epic and more akin to a minimalist production of Shakespeare, complete with sparse, austere set design, Mid-Century costume, and minimal lighting, all in stark contrast with the rich, imaginative, sumptuous, and effects-laden production of the version on Apple TV+.
None of this makes Foundation in any way a bad book—far from it; Asimov’s ideas are genuinely fascinating. I simply found it very different from what I was expecting. Once I began reading, however, I immediately realized why so many people in Silicon Valley cite this book as a major influence today: Foundation is fundamentally about using technology to intentionally mold worlds and alter the natural course of human history to make the galaxy a better place, according to those who wield the most technological power.
Published in 1951 by Gnome Press, Foundation is the first volume in arguably one of the most famous and epic science fiction series ever written. It's also the first book by Isaac Asimov I’ve ever read. Originally a trilogy of books published in quick succession, the official Foundation series spans 10 books and nearly 50 years of publication, with the first seven penned by Asimov between 1951 and 1993, while the remaining three were written by various authors after his passing. But Foundation also incorporates elements from Asimov’s other works, notably the Robot and Galactic Empire books, which add about eight more books and dozens more short stories to the full tally of tales. No matter how you spin it, that’s a lot to read. For now, I’m just glad to have finished the series' foundational novel (pun intended, forgive me, I couldn’t resist. Carry on).
The first thing to know about Foundation is that the book is, as I have since learned, like much of Asimov’s early work, a collection of previously written short stories, originally published in Astounding Science Fiction between 1942 and 1944. Only the opening section, “The Psychohistorians,” was written specifically for the novel. Asimov’s tales are bound by a common thread: the causes and effects of the stagnation and fall of an unimaginably vast Galactic Empire over 155 years, and the mathematical and philosophical vision of one extraordinary man, Hari Seldon.
The key to Seldon's vision is Psychohistory, a fictional science that can predict the future of humanity across millennia through a mathematical and psychological analysis of billions of people across millions of worlds. Asimov explained in lectures and essays that Psychohistory was like the science of gas molecules: You can’t say where any single molecule will go, but you can predict the pressure and temperature of the gas as a whole. It’s a rather elegant explanation for a profoundly complex concept.
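Asimov’s gas analogy maps neatly onto the law of large numbers, and it’s easy to sketch in a few lines of code (a toy statistical illustration of my own, not anything from the book): simulate thousands of agents each taking random steps. Any one agent’s final position is anyone’s guess, but the aggregate behaves with near-clockwork regularity.

```python
import random
import statistics

random.seed(42)  # reproducible "history"

def simulate_agents(n_agents: int, n_steps: int) -> list:
    """Each agent takes n_steps random steps of +1 or -1;
    return the list of final positions."""
    finals = []
    for _ in range(n_agents):
        pos = 0
        for _ in range(n_steps):
            pos += random.choice((-1, 1))
        finals.append(pos)
    return finals

# A single agent (one "molecule") is unpredictable...
one = simulate_agents(1, 100)[0]
print("one agent:", one)  # could be anywhere from -100 to +100

# ...but the population as a whole is tightly predictable:
# mean near 0, standard deviation near sqrt(100) = 10.
many = simulate_agents(10_000, 100)
print("mean:", statistics.mean(many))
print("stdev:", statistics.stdev(many))
```

The individual trajectories stay noisy no matter how long you run them; only the aggregate statistics converge, which is precisely the bargain Psychohistory strikes.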
Seldon’s plan aims to shorten the impending dark ages from 30,000 to (only) 1,000 years by establishing The Foundation on a remote planet, Terminus, ostensibly to compile the Encyclopedia Galactica but in truth so that humanity has what it needs to overcome several predicted crises without being fully aware of this supportive hand. Even The Foundation’s own leaders are not entirely cognizant that their choices are being subtly guided by Seldon’s equations. It’s the observer effect as a parable: If they knew the plan, it would fall apart. What does that mean in terms of free will vs. determinism? Good question.
If this all sounds rather dense and esoteric, it is. What strikes me as a reader coming to Foundation for the first time is how intellectual it feels—making it very apt that Paul Krugman, the Nobel laureate economist and former New York Times columnist, wrote the introduction for the Folio Society edition I read.
The book is written in a style that feels almost scholarly, driven by dialogue and exposition rather than sprawling action sequences. Aside from a proclivity for noting characters' hairstyles, hand gestures, or speaking styles—reading Lord Dorwin's dropped 'r's felt reminiscent of Michael Palin’s Pontius Pilate in Monty Python's Life of Brian—Asimov seems reluctant to paint more than a passing description of his worlds and their inhabitants.
To be clear, this stylistic sparseness is no bad thing. Indeed, I found myself imagining the dialogue-dense scenes in an almost off-Broadway black box setting, as I ignored the visuals and Asimov’s ideas caught my imagination like a battery toy plugged into mains power, hitting me with the realization that The Foundation in his world is comparable to Silicon Valley in ours.
The parallels between the declining Galactic Empire and real-world historical collapses are strikingly obvious. The early chapters, particularly "The Psychohistorians," paint a picture of political and societal stagnation and a pervasive malaise that immediately brought to my mind descriptions of the decline and fall of the Roman Empire in history lessons at school. More remarkable still, moments in the book echo many of the challenges our world is facing today with startling precision. Hari Seldon’s speech at his trial on Trantor resonates as remarkably apt in 2025:
"The fall of Empire, gentlemen, is a massive thing, however, and not easily fought. It is dictated by a rising bureaucracy, a receding initiative, a freezing of caste, a damming of curiosity—a hundred other factors. It has been going on, as I have said, for centuries, and it is too majestic and massive a movement to stop."
Perhaps his most poignant line, which feels entirely copy/pasted from a modern headline, is: "Already, they recall the lives of their grandfathers with envy." Seldon’s speech isn’t just about the past; it’s about a creeping sense of decline, a societal yearning for a mythical "Make the Galaxy Great Again" era that never quite was. This backward gaze is not merely benign nostalgia, but a crucial symptom of an empire losing its will to innovate and adapt.
Yet, it’s Asimov’s ideas around technology where Foundation truly hits home, and where its biggest Silicon Valley fans face a stark, uncomfortable irony.
I’m not talking about his imagined fictional tech, where everything from interstellar spaceships, mega-cities, force fields, and atom blasters to self-sharpening kitchen knives and washing machines runs on atomic energy (which makes perfect sense considering when the stories were first written). Instead, it's how The Foundation’s advanced technology is used as either a bribe or a cudgel (sometimes both) to coerce governments, populations, and entire planetary societies into ceding control. To prevent anyone from developing and adapting the technology for themselves, The Foundation wraps it in an impenetrable aura of the divine, propagating the cult of the “Galactic Spirit”—a techno-religious hegemony controlled, maintained, and operated by a priesthood held in awe by the people dependent upon them.
Today, we may not have a singular, small but all-powerful group using a mystical and wholly concocted religion to deify their technology for political control and economic domination, but that’s only because Silicon Valley doesn’t need to go to the bother of all that mummery: why dress it up in ritual when we already think it’s magic?
However, for all their admiration of Seldon's grand design, many of Foundation's most vocal proponents in Silicon Valley seem to have overlooked its most crucial, and perhaps most disconcerting, implication: the inherent danger when such immense power, however well-intentioned, consolidates outside democratic control.
In 2025, we live in a world where technology, powered by opaque algorithms and locked operating systems understood by a select few companies, dominates the lives of literally billions. This isn't merely about economic dominance, though the numbers are absolutely staggering: The "Magnificent Seven" (Apple, Microsoft, Alphabet, Amazon, Nvidia, Tesla, and Meta) currently account for approximately 34% of the S&P 500's total market capitalization, a value over $19 trillion. That’s over one hundred times more valuable (adjusted for inflation) than the entire S&P 500 in 1951 when Foundation was first published. More profoundly, this insane wealth translates into direct control over the digital infrastructure of commerce, communication, and culture that underpins our modern existence.
Beyond market cap, this technological and economic control extends to tangible power over critical ecosystems—just like The Foundation’s command over galactic energy. One man, Elon Musk, commands a near-monopoly on low-orbit satellite communication through Starlink and SpaceX, and has already demonstrated a willingness to suspend service and potentially alter the course of the war in Ukraine. Concurrently, Huawei holds a growing dominance in global telecommunications systems, potentially offering the Chinese company significant leverage over information flows. Meanwhile, Nvidia chips are the fundamental power source for the vast majority of today's AI (even in China), conceivably powering (at least for now) the “brains” of our automated future, and Palantir is embedding its technology deep within government and military systems, granting it unparalleled data insight with real-world implications. Apple, Amazon, Microsoft, Alphabet/Google, and Meta all vie for lucrative government contracts, including military and defense, a sign of their growing influence. Remember the time Mark Zuckerberg reportedly walked in on a White House meeting about F-47 fighter jets?
In this way, Planet Earth is already like Anacreon: our stock markets are hooked on the wealth these companies create, and the global population is increasingly dependent on the services they provide. This isn't accidental; it’s a classic dependency play, where essential services are rendered through opaque technology that few truly understand or can live without. And as Frank Herbert wrote in Dune, "The power to destroy a thing is the absolute control over it." Or in the case of The Foundation, to simply turn it off.
In “The Mayors” section of the book, the political power struggle between Anacreon and The Foundation culminates when the threat posed by a powerful, ancient, two-mile-long Imperial battleship is thwarted by The Foundation literally hitting the off switch, rendering the Anacreons powerless.
This is our world today. Though admittedly highly unlikely, any one of these technology companies could push a button, either by intention or by accident, and the world could go dark. Instantly. We’ve seen this already when Cloudflare, AWS, or Microsoft ships a bad update and planes are grounded or the internet goes down. Never before in human history has there existed the potential for so much catastrophic destruction to be inflicted by so few on so many with such haste and without a single shot being fired.
And so, in that sense alone, Foundation isn’t so much science fiction as an ominous and distinct possibility. A planetary blackout isn't a prediction of what will happen, but it is no longer science fiction to admit it could. We've built a system literally too complex to let fail.
This stark reality of our technological vulnerability transforms reading Foundation for the first time into far more than just an exercise in literary science fiction history. It's a lens through which to examine our own volatile present, especially in the realm of technology. Whereas Neuromancer is an effervescent, visceral collage of cyberpunk sights, sounds, music, and atmosphere, Foundation, by comparison, feels quiet, sparse, cerebral, and slightly menacing.
Asimov, writing at the dawn of the atomic age, foresaw a future where technology's power wasn't just in its destructive force, but in its ability to be wielded as a tool of control, dressed up in a palatable, even spiritual, guise, welcomed by the populace with open arms. The parallels with our current tech landscape—where technology behemoths dominate through product indispensability, data control, and an almost cult-like fanboy following—ring uncomfortably true today.
Hari Seldon warned of the decaying Galactic Empire, "It's a worship of the past. It's a deterioration – a stagnation!" The unsettling genius of Foundation is its timeless warning about who truly holds the off switch of civilization, and whether our collective yearning for convenience powered by technology, or the nostalgic hankering for a mythological past, leaves us vulnerable to becoming unwitting participants in a plan we don't fully comprehend or control.
Editor’s Note:
A few weeks back, I visited Seattle for a wedding. In between the ceremony and reception, a few of us went to a dive bar to wait out traffic. It’s here that I overheard someone talk about downloading their genome into a zip file and uploading it to ChatGPT. A few people at the bar, including the bartender whose attention he held (at a polite distance, to be sure), gave a few quiet-but-impulsive chuckles until he made it clear he was serious. Over the course of the next 15 minutes or so, I watched him try to convince several people to do the same, as he proselytized all the ways in which it would give medical advice his doctors had never even considered.
Some bystanders kept up the polite banter with some prodding questions, to the point where I genuinely thought he might convert a few. That is, until he asked for everyone’s LinkedIn contact information so he could send tutorials on how to upload their own 23andMe data to an LLM, which is when everyone turned him down. The social contract of politeness extends only as far as the beers can physically travel.
I watched all this, with a buffer human in between, alongside a fellow wedding-goer and a devout Catholic, and we couldn’t help but whisper to ourselves about the parallels between tech’s most devout evangelicals and those of various other denominations. There is a real power in something confidently (even if often incorrectly) having all the answers, and as we’ve seen time and time again lately, AI and LLMs are becoming worshipped and trusted in very concerning ways. We may have gotten a chuckle out of this encounter, but who’s to say this isn’t a harbinger of what’s to come?
I have not read the Foundation series, but I was aware of Psychohistory as a concept. When I was in college studying probability and game theory for a bachelor’s in math, inevitably someone would cite concepts like Markov chains as a precursor to Seldon’s more supercharged prediction engine—the probability of each future event is built off a chain of events leading from the past. (For more on Markov, there’s a great Veritasium video that I cannot recommend enough: The Strange Math That Predicts (Almost) Anything, which ties into search engines, predictive text, and ultimately, the foundational ties to LLMs and AI chatbots.)
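That Markov property—the next state depends only on the current one, yet the long run is highly predictable—is easy to demonstrate in a few lines. Here’s a toy sketch (the state names and transition probabilities are entirely made up for illustration; they're not from Asimov or anywhere else):

```python
import random

random.seed(0)  # reproducible runs

# A two-state Markov chain: the probability of the next state
# depends only on the current state, not the full history.
TRANSITIONS = {
    "calm":   {"calm": 0.9, "crisis": 0.1},
    "crisis": {"calm": 0.5, "crisis": 0.5},
}

def step(state: str) -> str:
    """Sample the next state from the current state's row."""
    probs = TRANSITIONS[state]
    return random.choices(list(probs), weights=list(probs.values()))[0]

def long_run_fraction(start: str, n_steps: int) -> float:
    """Fraction of time the chain spends in 'crisis'."""
    state, crisis = start, 0
    for _ in range(n_steps):
        state = step(state)
        crisis += state == "crisis"
    return crisis / n_steps

# Any single step is uncertain, but the long-run behaviour converges
# to the chain's stationary distribution: here P(crisis) = 1/6.
print(long_run_fraction("calm", 100_000))  # close to 1/6 ≈ 0.167
```

Seldon’s engine is, in effect, this idea supercharged: individual transitions stay random, but the long-run distribution of outcomes can be computed in advance.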
Generative AI is pattern recognition made manifest; its knowledge base is born of everything we’ve told it. Does having access to the whole of human output give AI some elevated sense of where we’re going? Does that make it a human-born deity, something that should be worshipped? I’d argue no, but at the same time, I really don’t want to argue with anyone who says otherwise. It’s rather exhausting at this point, and besides, I’d rather save up my energy for if/when these fringe ideals become something much larger. —Ross