It’s the Twitter Olympics. Like? Dislike? The live feeds aren’t there or aren’t good, or they’re spoiling the excitement of the evening TV replay. I’m panning wider: the Twitter Age. We talk about the Twitter Age the way we talked about the Nuclear Age. Like? Dislike? It doesn’t really make a difference, because once here, technology, nuclear or cyber, doesn’t leave the room. A PDA is a most personally cherished possession, as judged within a cultural mass psyche that cherishes all things personal. You have the power of personal choice, touch and response. Did I say instantaneous touch and response? And unlimited choices?
Is there a dark side? It begins not so dark, but if you think about it beyond a 140-character thought process, it all seems to be a burlesque ending to Enlightenment dreams so well lit in the mind that even postmodernity couldn’t dim them. But there’s a dimmer on now.
Emergency rooms around the U.S. report a growing number of people coming in with injuries sustained because they didn’t look up from their hand-held devices to see where they were going. Nomophobia is the fear of losing your cell phone, and it seems nomophobics now abound and are seeking psychological help. Young people who have never been without such devices can’t imagine how anyone could go on living “back in the day” and, sad to say, cannot muster any interest in knowing about such technologically benighted times.
Whether or not we agree with the Buddhist notion that demotic forms of language are better at preserving democratic social values than elitist, hieratic forms, the technology for instantaneous, populist communication is a “done deal”. Our growing attachment to online life pushes our cultural imaginary toward imagining our own computer-like existence, pushing technology toward recombining the biological with the Boolean, the computer chip with the cranial synapse. All this will leave behind the primary stuff of human nature that we no longer have an interest in exploring. And then perhaps it won’t matter, because we will no longer have the means to do so.
Who can deny that an amalgam of man and computer seems a pretty cool future? The latest display is the android David, played by Michael Fassbender in Ridley Scott’s film Prometheus. “Limitless powers of analysis and extrapolation,” writes Geoffrey O’Brien of David in the New York Review of Books, a mind alert to the details. What I can see, however, is that we’re not heading toward being that guy. In our desire to be closer to an online reality that is improving, accelerating and democratizing communication, we are losing complexity and profundity, not only in communication but in the minds doing the communicating. We now literally feel that the devil is in the details. We may do so because we’ve already developed a cultural ADD, an impatience with anything that doesn’t move at mouse-click speed. The offline world doesn’t respond to our clicks.
Our response has been to make the domain of the “public” something like a private compound of friends; the idea of the “public good” alters accordingly. If we’re not really becoming more social but more like designers of a personalized world, then assumed authority and superiority, whether from newspapers “of record”, “public intellectuals”, refereed and copyrighted publishing, or “canonical” authors, become no more than Luddite intrusions. A PDA in hand authorizes you and your words. Conversation and dialogue are transformed. We seem to be moving further away from social connectedness and closer to solipsism. When, however, the social is subsumed within the personal, we may have the feeling that our social connections have expanded when in fact what’s expanded is the isolated self, twittering and texting his or her way to a unique, self-empowered identity in a network of “sociability”.
Some argue that communicating in 140 characters doesn’t mean we now have only the attention span and mental acuity to read Platonic tweets and not the dialogues. We can still think in great detailed complexity. Wikipedia is an encyclopedic entrance to further and deeper reading. The hope is that somehow a mind impatient with detail, whose deepest criticism is that something is “too wordy”, is going to remain capable of understanding detail. The fear is that what is abbreviated and minimized leaves us unfit to meet the growing complexities of problems that seem to have already exceeded our new level of discourse. The reality, in regard to both hopes and fears, may be this: TL;DR.