Nerd Rant

Westworld Season 3 is just a lazier version of Isaac Asimov's Foundation

Why can't Serac predict the future? Maybe he should have taken some lessons from Hari Seldon.

In the penultimate episode of Westworld Season 3, "Passed Pawn," Aaron Paul's Caleb reduces the show's entire plot to one familiar science fiction trope. He calls the super-computer Solomon an "insane A.I." If only it were that simple!

Westworld Season 3's biggest plot contrivance is, without a doubt, the twin influences of the society-altering supercomputers Solomon and Rehoboam. Throughout the season, we've learned that the life of every human on Earth is being predicted and controlled by a self-aware version of "big data."

But is anyone buying this? The high-concept conceit of Westworld Season 3 makes very little sense, and that's partly because this whole Rehoboam and Solomon supercomputer business is just a lazier version of Isaac Asimov's mega-famous book series, Foundation.

Call it Foundation without the foresight. Here's why that's led to a muddled mess.

Warning! Westworld Season 3 spoilers ahead.


By the end of Episode 7, the narrative goal of Westworld Season 3 is clear. In Seasons 1 and 2, Westworld meditated on the nature of free will by having enslaved robot Hosts go off their pre-programmed loops and revolt. But that's old news.

Season 3 wants us to believe that ordinary humans are just as enslaved as the Hosts they created. The ripped-from-the-headlines approach here is to convince us that a company called INCITE can "accurately" predict people's behaviors thanks to a ton of data-mining that surely violates all sorts of privacy laws. Westworld wants us to think that various algorithms that predict human behavior can, and will, be used as modes of control on a large scale.

So far, so good. Or maybe I should say, so far, no actual science fiction. After all, Facebook might not be putting criminals in cold storage (at least, I hope not), but tech companies already use big data to influence our behavior on a daily basis.

In fairness, all tech-based sci-fi basically does the same thing: take something plausible (interstellar space travel, advanced A.I., widespread genetic engineering) and crank it up to 11. The issue with Westworld is that because it wants to be "relevant," the writers try to keep their tech fairly modest. The Hosts are made with 3D printers (real), Serac talks a lot about data and algorithms (obviously real), and Dolores uses drones to carry out a raid, making it seem like she just borrowed some gear from Jeff Bezos.

The problem is that with the big stuff (like the total prediction and control of the human race through big data and self-perpetuating A.I.) Westworld feels hilariously short-sighted. But to explain why, we'll have to dive into another piece of popular science fiction, Isaac Asimov's Foundation.


Psychohistory versus INCITE — In Isaac Asimov's mega-famous Foundation series — a "trilogy" of novels with several spin-offs and a prequel — the entire future of a galactic empire is mapped out by a guy named Hari Seldon based on a concept called "psychohistory." Think of psychohistory as sociology and mathematics, but with unlimited funding that creates public policy for every government in a giant interplanetary empire. Based on their sociological and mathematical calculations, Seldon and his cohorts are worried a period of dark ages is coming for the galactic empire, but with adjustments to society, the dark ages period can be cut down from 30,000 years to about 1,000.

The basic concept here is that the Foundation can't predict or control everything, and certainly can't chart the course of each and every single individual. Instead, Asimov's Foundation floats the plausible — but far-out — idea that given enough humans on enough planets over the course of several thousand years, you'd have enough data to predict the rise and fall of governments. In a sense, you could forecast society the same way we forecast the weather now.

Sound familiar? The only problem is, Asimov's story uses way more data and tries to do less with it. By comparison, Foundation makes Westworld Season 3 look totally ludicrous.

EVERYBODY knows about the Supercomputer that controls their life.


Westworld doesn't have the data or the vision of Asimov — Now, it's not like Hari Seldon and the Foundation are 100 percent successful; otherwise, what would be the point of the books? But with psychohistory, at least Asimov thought really long and hard about how you might be able to predict (and control) the behavior of large societies. And part of that equation was that you needed a lot of data.

Because Foundation takes place so far in our own future with so many different planets with people living on them, there's a ton of (fictional) data Hari Seldon can mine to make his society-fixing plans. It's also worth noting that Seldon isn't trying to control every single person, just influence society at large.

With that in mind, here are the "rules" Asimov sets up in order to make this improbable concept more plausible. Behold, the rules of psychohistory!

1. The population under scrutiny is oblivious to the existence of the science of Psychohistory.
2. The time periods dealt with are in the region of 3 generations.
3. The population must be in the billions (±75 billions) for a statistical probability to have a psychohistorical validity.

Now, I know it's not fair to measure Serac's "failures" in Westworld Season 3 by Asimov's criteria, but it's hard not to when he says so much dumb stuff. Speaking to one of the super-computers, probably Solomon, in Episode 5, Serac says, "There were problems, things you couldn't predict."

Yeah, no shit. Serac and his brother started their project when they were kids in 2025, orphaned by a nuke that hit Paris. In theory, the "present" of Westworld Season 3 is 2052-ish. This means they've been working on their data for less than one generation. Basically, we're dealing with less than 30 years for all these huge society-predicting models to be created. Plus, Serac is notably obsessed with controlling "outliers," which, of course, messes with all of his data because it creates a bunch of people who are "aware" of the system at work.

The guy Caleb and Francis have to execute in the flashback in Episode 7 is fully aware of the forces at work. He even explains all of it to Caleb. This means that Serac's system is truly moronic, at least by Asimov's standards. By refusing to accept that they can't control every person on Earth, Serac and his super-computers actually create more opportunities for outliers to exist and mess up the plan.

If one random guy is such a chatterbox to Caleb and Francis in that one flashback, you can quickly imagine hundreds of scenarios where that happened. To borrow from another mega-famous sci-fi franchise, the more Serac tightens his grip, the more outliers slip through his fingers. We're meant to think this guy is smart?

Serac acts super-sneaky and confident and all-controlling with his stolen (or purchased) data, but the reality is, we see him flaunt this information constantly. This clearly violates the first rule of running the world with super-secret data and A.I.: stop constantly announcing you're doing it.

Rehoboam is in a giant public building for crying out loud!

Can robot William fix all of this?


The timeline twist that could save it all — So far, the exact when of Westworld Season 3 isn't totally clear, even though an early trailer really made us think it was roughly 2052. That said, because of Caleb's dissociative memories and Westworld's historical love of timeline twists, the series could save itself by somehow revealing that both Solomon and Rehoboam are much older than we're being told.

At the end of Season 2, we meet a Host version of the Man in Black who showrunner Lisa Joy revealed exists in "the far future." How far? Far enough that someone like Serac could have actually collected enough data to start making useful predictions? And have we already caught up to that future? If so, this slightly silly system of control could feel a tiny bit more plausible.

In other words: The only way for Westworld Season 3 to make a lick of sense is if the finale reveals that we've actually been much further in the future than we thought this entire time. That might feel like a huge copout, but it makes a lot more sense than an "insane A.I." micromanaging everyone's lives just a few decades from today.

If Serac really wanted to change the world, he would have taken some notes from Hari Seldon. And if Westworld really wanted to comment on the "real world," it should have jumped much further into the future. Because this version of the real world doesn't feel remotely real. I think I liked it better back in the park.

Westworld airs its final episode of Season 3 this Sunday on HBO.
