Last week, I wrote about how those of us on the right can be Star Trek fans despite its supposedly “Progressive” politics. Partly, this is because good art is about a lot more than a didactic political message. But it also struck me how much of the message of Star Trek is consistent with the values of many of us on the right. The original series was not “Progressive” but “liberal” in an old-fashioned sense, celebrating freedom and individualism and opposing censorship and conformity. This means that Trek also turned out some cautionary tales that are relevant today—and surprisingly prescient about the conformist agenda of big tech companies like Google.
In contrast to those who claim that Star Trek’s heroes were early Social Justice Warriors, the original series actually produced a pointed parody of the SJWs—and of the technological enforcement of their creed—in the 1967 episode “The Return of the Archons.” It’s not specifically aimed at SJWs, but one of the things that makes Star Trek interesting is its use of science fiction to explore big philosophical issues that have many different applications.
Here’s a recap. In this episode, the Enterprise investigates a planet where everyone behaves with a kind of creepy placidity and orderliness. After he is captured by the locals and then rescued, a glassy-eyed Lieutenant Sulu gushes that it is a “paradise.” (So he’s pretty much like George Takei today.) When Kirk goes to investigate, he finds that strangers are asked, “Are you of the body?”—the “body” referring to the body of society, an ideal of collectivist conformity, all of it run from above by an all-knowing visionary leader named Landru.
But it turns out (spoiler alert) that Landru has been dead for six thousand years. What runs the whole society is a giant computer, an artificial intelligence programmed by the original Landru to keep his utopian society going in perpetuity—and wirelessly connected to everybody’s brains to do their thinking for them.
So what Kirk finds is a powerful, omnipresent computer network with a funny name and utopian pretensions. You see where I’m going with this, right? Just substitute “Are you woke?” for “Are you of the body?” and you’ll get the idea.
What I am suggesting is that Landru is the basic ideal toward which Silicon Valley “Progressives” are working. Everything will be connected to their benevolent computer network, which will set itself the goal of pushing our behavior into forms that it finds acceptable and not disruptive to the established order.
If you think I’m just paranoid, consider a leaked video from Google that basically confirms our worst fears.
The proposal is for a system that would track users’ behavior, not for the purpose of helping them pursue their own preferences, but for the purpose of altering their behavior to meet some larger social goal. They can do it because, as the video explains, “When we use contemporary technology, a trail of information is created in the form of data. When analyzed, it describes our actions, decisions, preferences, movement, and relationships.” The term Google uses for this record is a “ledger,” which the video describes as a “codified version of who we are.”
I think most of us are aware by now that this is what Google wants and what it makes its money from, by using our information to sell us things. But what if it wants to do something more with that data? That’s what the video confirms. The term it uses for a ledger that serves your own interests is “user-centered design.” But that’s so old-fashioned and reactionary. “User-centered design principles have dominated the world of computing for many decades, but what if we looked at things a little differently? What if the ledger could be given a volition or purpose rather than simply acting as a historical reference?”
Yeah, giving Google’s computer network a volition and purpose of its own—what could go wrong?
The immediate implementation seems a little less threatening—it always does—consisting of a kind of New Year’s resolutions program, where you pick a goal, and the volitional ledger uses every interaction to nudge you in the direction of that goal. But the goals are “suitable targets” chosen by Google that emphasize trendy lefty obsessions like environmentalism and locally grown food. “Whilst the notion of a global good is problematic, topics would likely focus on health or environmental impact, to reflect Google’s values as an organization.” But over time, “the user’s behavior may be modified, and the ledger moves closer to its target.”
Then the internal Google video goes on to speculate about where this is all leading.
As cycles of [data] collection and comparison extend, it may be possible to develop a species-level understanding of complex issues such as depression, health, and poverty. Our ability to interpret user data, combined with the exponential growth in sensor-enabled objects, will result in an increasingly detailed account of who we are as a people. As these streams of information are brought together, the effect is multiplied. New patterns become apparent and new predictions are possible….
Just as the examination of protein structure paved the way to genetic sequencing, the mass, multigenerational examination of actions and results could introduce a model of behavioral sequencing. As gene sequencing yields a comprehensive map of human biology, researchers are increasingly able to target parts of the sequence and modify them in order to achieve a desired result. As patterns begin to emerge in the behavioral sequences, they too may be targeted.
The ledger could be given a focus, shifting it to a system which not only tracks our behavior but offers direction toward a desired result.
So let’s lay out the steps here. We will have more and more “sensor-enabled objects” that track every aspect of our behavior, which Google will use to analyze our behavior and try to figure out how to control it. The whole thing culminates in Google creating a system of behavioral engineering, patterned on genetic engineering, designed to push us in the direction of—what? Well, it’s going to push us toward “Google’s values as an organization.” And we know from recent experience that Google’s values as an organization include conformity to the woke cultural orthodoxy of the contemporary left. But don’t worry. This system “could offer benefits to this generation, to future generations, and the species as a whole.”
That’s what Landru said, too. As one of the locals explains, “There was war, convulsions, the world was destroying itself. Landru was our leader. He saw the truth. He changed the world.” What he promises is “a world without hate.” Give this sort of thinking another century or two and some much more advanced technology, like a sophisticated brain-computer interface, and sure—Google is on track to become Landru. But the point of the science fiction version of this story is not just to warn against the specific outcome of having all of our brains directly controlled by a central computer. The point is to use that as a metaphor for the more prosaic dangers in the present day.
Landru’s solution to his society’s problems was to kill human creativity and individuality in the name of harmony. You may recognize this as prefiguring the same themes explored by Star Trek twenty years later with the Borg, particularly in this line from Landru: “You will be absorbed. Your individuality will merge into the unity of good, and in your submergence into the common being of the body, you will find contentment and fulfillment.” One of the Google video’s headings echoes this: “Unus Pro Omnibus,” “One for All.” I’m kind of surprised they didn’t use, “Are you of the body?” Or maybe, “Resistance is futile. You will be assimilated.”
Gene Roddenberry’s critique—this episode was based on his own story, which he initially pitched as a pilot for the series—was based on contrasting the mechanical predictability of a life scripted by computer to the creativity and spontaneity of natural human life. Even Spock says that the society has no “spark.” As Kirk explains to Landru, “Without freedom of choice, there is no creativity. Without creativity, there is no life.” (Fans will also note that this is one of four times in the series that Kirk talks a computer to death, a favorite Roddenberry trope. The episode also features the first reference to the Prime Directive of non-interference—right before Kirk flagrantly violates it, another longstanding trend in the franchise.)
There’s another important theme lurking in there, which is suggested by the “Festival,” a 12-hour orgy of unrestrained violence and licentiousness allowed by Landru every year as a way for his subjects to vent the aggressive urges they are normally forced to repress. They have only two modes, total control or total anarchy, because they have never learned to control their behavior on their own or to get along with one another without someone telling them what to do.
Similarly, under the tutelage of Political Correctness, we are prevented from learning how to talk to and get along with each other; in place of that skill, we get a bunch of restrictions imposed from the top down. Under the pretense of making us “woke,” it is making us fall asleep. It is telling that the main instruction our culture gives right now for how to be woke is to sit down and shut up while in the presence of someone woker. That’s true if you’re white, or if you’re a black person with an independent streak. The one thing we’re not learning is how to express our views and to jostle in the back and forth of a vibrant society without silencing each other. We’re going to have to learn that on our own, without the paternalistic guidance of a computer network with a funny name.
As for Google’s proposal to gather all of our information and use it to control our behavior, I agree with Jim Kirk.
The plug must be pulled.