Westworld is a show with big ideas...about good and evil, fate and free will and the extent to which technology organizes our lives. But a single supercomputer that controls every aspect of humanity is a really big idea, one that might be too much to fathom for even Westworld's most conspiratorial fans.

Season 3 of Westworld sometimes seems like a different series compared to Seasons 1 and 2, which were mostly set in the park that gave the show its name. It's not just the change in scenery. There's the reduced and mostly new cast, and the heightened real-world (we think) stakes, which involve Dolores trying to destroy (or save?) the human race. Plus, this season has been surprisingly upfront with its audience about time and place: California circa 2058, give or take a few obvious flashbacks and Maeve's escape from the simulation. What's remained the same about the show, though, is the sense of mystery surrounding who the characters really are.


Some of those mysteries have already revealed themselves. We learned that Stubbs is a host, Bernard is really Bern-Arnold, and Lee Sizemore was just a sloppy recreation. In the biggest reveal, we found out that Dolores has been copying and pasting herself, and not smuggled host consciousnesses, into replicas of her victims. Then there's Caleb, who is basically this season's rancher's daughter, confused and angry about the true nature of his reality but starting to put the pieces together. Westworld first led us to believe that he's a traumatized veteran and dissatisfied blue-collar worker who's destined to commit suicide in the near future. However, recent hints suggest there's more to his story, and that what he thinks are memories might not be reliable.

Caleb's presence on the show is thematically important, not because he might be a host (it's much better if he's not), but because he's meant to represent a human stuck in a loop not of his own choosing. In fact, it appears that the central message of Season 3 is that people who are subject to the authority of multinational technology empires are no better, and no more free, than robots. Westworld is essentially telling us that we have allowed ourselves to become artificially intelligent and therefore easier to manipulate. While that's certainly a relevant topic to explore, Westworld's take is, for once, too broad and oversimplified to withstand scrutiny. That's exemplified in Serac's all-knowing program and Dolores's efforts to dismantle it.


Good science fiction needs even its most outlandish ideas to seem within the realm of the possible in order for the viewer to buy in. Thus far, we've believed in the show's main conceits: that the uber-rich visit Westworld, that the hosts become self-aware at different speeds and to different degrees, and that somebody would eventually misuse the tech behind it all. Even much of the rest of Season 3 seems within the scope of plausibility, everything except for Rehoboam.

It's true that in our world, companies exist with budgets that exceed the economies of major countries, and that operate with less regulation and oversight than is wise. It's also true that many of those companies buy and sell our data, and then tailor their business to the public's appetite, an appetite which they can manufacture through marketing. The vast, vast majority of us willingly provide our personal information to such entities, not just for conveniences like free apps and coupon codes, but because doing so is the price of partaking in a largely digital, global society. We might allow some degree of invasion of our privacy, but we also retain the right to complain about and act in opposition to that invasion. So, when Westworld gathered and sold guest data in Season 2, it checked out.



Okay, so maybe that isn't too far-fetched. What Westworld Season 3 (and Incite) gets wrong is that, unlike lines of code, the organic human brain is extremely irrational and unpredictable. Serac was able to achieve an unhappy peace through algorithms, despite "outliers" throwing a wrench into the system. Dolores is able to expose that system to every citizen on the planet, which produces her intended result: chaos. In our world, even with creeping surveillance and monopolizing corporations, it would still be impossible for both Serac and Dolores to pull off their schemes.

Typical and atypical aren't distinct categories of being, no matter how hard we try to make them so. Take the autism spectrum or shifting definitions of mental illness; we are all a mixed bag of strengths, needs and idiosyncrasies. At least viewers are meant to find Incite's treatment of outliers abhorrent, and the institutionalization and experimentation do, regrettably, mirror history. But the show skips over way too much pertinent information in going from a program that games the stock market to one that influences all the world's decision making. The result is an instant and uniform breakdown of society, which makes for a cinematic flourish at the end of the episode, but not a sensible resolution.

First, Dolores's plan assumed everyone would readily comprehend and accept what they saw on their devices. It also assumed everyone would be upset with their "loop," which couldn't possibly be the case for the system to function. In reality, we can't easily predict how people will behave, and we can't easily corral them into collective action. We are like hosts, but not in the way the show thinks. Some of us are trusting, some are suspicious. Some are cooperative, some self-interested and some more self-aware than others. Westworld went big this season in trying to say something important about the state of modern society, but it's more eloquent when it keeps things personal.
