Science fiction often exploits the fear that we will invent computers that become smarter than us and then attempt to extinguish our flawed and feeble, morally compromised race. The excellent Battlestar Galactica, whose third season starts soon (expect a hype barrage like the one recently rolled out for The Wire), does some of the most interesting stuff with this trope, mainly by making the robots indistinguishable from humans and giving them an eschatological worldview. The cylons have a commitment to quasi-spiritual ideals, lending the conflict religious-war overtones that have obvious significance for the alleged clash of civilizations currently taking place in reality. The robots’ unflagging commitment to their beliefs underscores the way humans waver and are repeatedly vulnerable to betraying one another. We don’t ever root against humans—as we do in Paul Verhoeven’s Starship Troopers, in which it slowly dawns on us that the humans are fascists, the real villains of the movie, and the mechanical-looking insects deserve our sympathy—but we can’t fail to see the implication that humankind tends to fracture into warring camps in the face of an implacable enemy. And there are the usual overtones of human hubris and tampering in God’s domain and that sort of thing.
But eventually sci-fi will need to evolve a response to a phenomenon that’s potentially far more frightening: rather than robots seeking to eradicate humans, humans become so impressed with the efficiency of machines that they voluntarily seek to emulate them. It’s already happening all around us. For example, the book Mind Performance Hacks, recently promoted by BoingBoing, promises “tips and tools for overclocking your brain” and comes fully loaded with a host of other brain-as-processor metaphors: the brain is the hardware, and consciousness the output of resident programs. The attraction of computer metaphors is that they seem to solve human problems by allowing us to conceptualize them in a ready-made way that makes them seem easily solvable by the march of technological progress. Thus we talk of ideas as computer viruses, taking a biological metaphor that’s been technologized and repatriating it for humans. We see our own minds as programmable, controllable, able to be applied to discrete, focused tasks. We talk about plugging ourselves into networks and so on. We imagine social life as a massive operating system in which everything has a deliberate function, so that it can seem comprehensible and manageable. By imagining ourselves as more like computers, we come to take the value system technology generates—one almost hegemonic in business culture—and apply it to our own behavior.
Well, come to think of it, this humans-wanting-to-be-hyperefficient-computers idea crops up even in the sci-fi I’ve seen (which is not much). There are the hyperintelligent mentats of Dune who drink a special potion to allow them to become human supercomputers. The Matrix depicted Keanu Reeves downloading information directly into his brain that became immediately functional—a kind of patch or software update, as though the brain ran on third-party programs. The human brain was regarded as passive, alien to the person whose head it was in. It was simply a matter of overwriting it with whatever the person was supposed to experience. One becomes configured as an end-user of one’s own brain, a mere consumer of the experiences it can be programmed to spit out. Consciousness is a step removed from the brain, which provides the data that consciousness enjoys, as though it were a film.
The Mind Hacks book takes mind-machinery a step further, promising to make the brain work more like a machine under the user’s conscious direction, which implies that the user’s consciousness aspires to be more machinelike, more relentlessly productive. Rather than receiving data the brain spits out, consciousness merges with “subroutines” it can perform to think more mechanically, more efficiently. No doubt these things work—such ideas for human perfectibility and increased mental acuity have kicked around before as mnemonics or Chisanbop or est or hypnotherapy, bioengineering, Methedrine, etc.—but what seems new is the insistence on the computer metaphor, as if to be a computer would be to live the dream.
My vague hypothesis about this is the following: our economy’s emphasis on technology as a means to produce perpetual growth and wealth is having the effect of making us think that by becoming more machinelike, we become more human—that we move closer to our human potential by mirroring the methods that have enhanced economic potential and productivity. This seems to fetishize information for its own sake. Information, now an unconquerable ocean, tempts us to master it through heroic feats of navigation, exploratory expeditions made purely for glory. Human potential, human experience may come to seem entirely a matter of information processing—and the faster one’s brain processes information, the more life one crams into one’s allotted time on earth. Efforts to absorb all this information can become a kind of flow experience, a way of entering the “zone” associated with athletic accomplishment, and at that point one may seem to merge with the information itself, to become inseparable from its continual transmission. That might be the aspiration anyway: to become the best data you can be, so that you still figure in the techno-future world. Social networking sites, which already seek to reduce us (enhance us?) to a flow of routinely updated data, may be the first florescence of this. And the burgeoning popularity of virtual spaces would be the next, integrating that data into a reconstituted virtual self, bringing people a step closer to having the field for one’s identity laid out as a flexible, benevolent operating system, one that lets us be ensconced in the safety of programming logic, having shifted existence to a space where inhibiting personal anomalies can simply be debugged.