Editor's Choice

Slowing down

Complaining about the technologically mediated acceleration of life and the loss of time for contemplation has become a lot like crying wolf. From what I gather, people seem to be sick of hearing it -- as a meme it had its moment several months ago. Even though I've beaten that drum many times, I find myself thinking: Okay. Concentrating is hard, but then when hasn't it been? There is a surfeit of distractions; I get it. But it's not like I am going to go on an information fast and spend my free time meditating. I'm not going to dismantle my RSS feed and devote an hour a night instead to reading a single poem. Those seem like idealistic, nostalgic fantasies about the "life of the mind," which in practice would most likely amount to a refusal to engage with life as it is actually being lived. For example, I very much wish I were in a world without Twitter and maybe even without telephones, but that doesn't mean it's imperative that I live as if it were so. Down that road lies the technological equivalent of veganism, wherein everyone in my life would need to adapt to my fussy, righteous rules about which ubiquitous behaviors were permissible in my little world.

Still, though David Bollier's account of an April 2009 lecture (probably based on this paper, pdf) by media studies professor David Levy has its share of neo-Thoreauvianism, it nevertheless raises some points worth considering. The main gist is this: "The digital communications apparatus has transformed our consciousness in some unwholesome ways. It privileges thinking that is rapid, productive and short-term, and crowds out deeper, more deliberative modes of thinking and relationships." I have said the same sort of thing lots of times, but, as Levy asks, what actually constitutes the difference between "productive" thought and "deliberative" thought? I tend to think of the former as data processing -- tagging mp3 files, for instance -- and the latter as analytical inquiry, but it may not be so easy to distinguish the two. The mental modes tend to flow into one another. Working through menial mental tasks sometimes allows inspiration to break through -- and after all, what is one supposed to be doing with one's mind while it takes its time to deliberate? The "information overload" critique sometimes centers on the idea of slowing down the mind. But the mind is always moving, thinking one thought after another; the problem with the internet is that it gives the mind too many places to go all at once and too many idle curiosities to gratify. Bollier suggests that "We are sabotaging those inner capacities of consciousness that we need to be present to others and ourselves." But the dream that Levy attributes to Vannevar Bush seems a more apt description of what we've tried to do: "Bush's intention was clear: by automating the routine aspects of thinking, such as search and selection, he hoped to free up researchers' time to think more deeply and creatively." It's just that the two functions can't be separated; the way in which we think about things doesn't come in degrees. It's holistic: we require routine tasks to fire our creativity, and creativity can often become routinized.

It's important to distinguish between having information at our disposal and lacking the discipline to make contemplative use of it. Often the two are implicitly elided, as if too much information automatically leads to frivolous surfing through it. Bollier writes, "Fast-time activities absolutely crowd out slow-time alternatives. The now eclipses the timeless. And we are becoming diminished creatures in the process." I don't quite understand this. We have to live in the now, because we are not "timeless." We die. And the problem with information overload doesn't lie with the activities and the media so much as with the approach we take to them, the ideology about information consumption we have internalized in the course of mastering these new technologies. We think they are supposed to make our lives convenient, and we measure that in terms of time efficiency. If we do many different things in the same span of time in which we once were forced to do only a few -- if on the train we can read 17 books simultaneously on a Kindle rather than one -- then we are "winning." The pressure to consume more is not inherent to the technology or to some new perception of time; it is inherent to consumer capitalism, which fetishizes quantity. As Levy points out, the roots of this are in the "production problem" -- how to keep making more stuff if people are already sated and don't have the time to consume more. The solution: manufacture new wants and speed up consumption. So the consumerist imperative probably led us to develop many of these technologies. But still, we should be careful not to blame the tools for the kind of people we have become. (If Twitter went out of business tomorrow, many people's discourse would still remain superficial and inane.) If we have ceased to be able to love, it is not because we lack the leisure or are too distracted. It is because we have learned to privilege different sorts of experience and are rewarded for different sorts of accomplishments.

So the call for "an 'information environmentalism' to help educate people about the myriad and aggressive forms of mental pollution afflicting our lives" seems misguided. The "mental pollution" is an effect, not a cause, of our loss of contemplative peace. That is, our mental lives are degraded not by information but by a pervasive cultural attitude about it, one that treats ideas as things to be collected and consumed.

ADDENDUM: Ben Casnocha's review of Tyler Cowen's new book presents a far more cogent critique of the "attention crisis" hullabaloo than what I've provided above.

We have always had distractions. We have never had long attention spans. We have never had a golden age where our minds could freely concentrate on one thing and spawn a million complex and nuanced thoughts. Cowen reminds us that charges to the contrary have been made at the birth of every new cultural medium throughout history. Moreover, the technologies that are supposedly turning our brain into mush are very much within our control. The difference between the new distractions (a flickering TV in the kitchen) and age-old ones (crying infant) is that the TV can be turned off, whereas the crying infant cannot.
He also notes the way in which chaos and "un-focus" can lead us to breakthrough insights. Though I don't remember agreeing with much of Sam Anderson's New York magazine essay in praise of distraction, this point that Casnocha highlights seems apropos: "We ought to consider the possibility that attention may not be only reflective or reactive, that thinking may not only be deep or shallow, or focus only deployed either on task or off. There might be a synthesis that amounts to what Anderson calls 'mindful distraction.'" That's what I was struggling to express above: thinking is thinking; subjecting it to binary categorizations does an injustice to how it actually works and leads to unnecessary and useless prescriptions for how to provoke thinking of a certain type.

So far J. J. Abrams and Rian Johnson resemble children at play, remaking the films they fell in love with. As an audience, however, we desire a fuller experience.

As recently as the lackluster episodes I-III of the Star Wars saga, the embossed gold logo followed by the scrolling prologue text was cause for excitement. In the run-up to the release of each new prequel installment, the Twentieth Century Fox fanfare, followed by the Lucasfilm logo, teased a glimpse into the next chapter's narrative. Then, seated in the movie theatre on the anticipated day of release, the sight and sound of that same fanfare signalled the end of fevered anticipation. Whatever happened to those times? For some of us, was that sensation a product of youth, one that age now denies us by closing off the ability to lose ourselves in such adolescent pleasure? There's no answer to this question -- only the realisation that this sensation has been missing since the summer of 2005. Star Wars is now a movie to tick off your to-watch list, no longer a spark in the dreary reality of the everyday. The magic has disappeared… Star Wars is spiritually dead.



It hardly needs to be said that the last 12 months haven't been everyone's favorite, but it does deserve to be noted that 2017 has been a remarkable year for shoegaze. Had it brought only the re-raising of two central pillars of the initial scene, that would have been enough, but that wasn't even the half of it. Other longtime dreamers either reappeared or kept up their recent hot streaks, and a number of relative newcomers established their place in what has become one of the more robust rock subgenre subcultures out there.

Theatre

'The Ferryman': Ephemeral Ideas, Eternal Tragedies

The current cast of The Ferryman in London's West End. Photo by Johan Persson. (Courtesy of The Corner Shop)

Staggeringly multi-layered, dangerously fast-paced and rich in characterizations, dialogue and context, Jez Butterworth's new hit about a family during the Troubles leaves the audience breathless, sweaty and tearful, in a nightmarish, dry-heaving haze.

"Vanishing. It's a powerful word, that"

Northern Ireland, rural Derry, 1981, nighttime. The local ringleader of a band of gun-toting Irish Republican Army comrades ambushes a priest and tells him that the body of one Seamus Carney has been recovered. It is said that the man had spent a full ten years rotting in a bog. The IRA gunslinger, Muldoon, orders the priest to arrange for the Carney family not to utter a word about what happened to the wretched man.


Aaron Sorkin's real-life twister about Molly Bloom, an Olympic skier turned high-stakes poker wrangler, is scorchingly fun but never takes its heroine as seriously as the men.

Chances are, we will never see a heartwarming Aaron Sorkin movie about somebody overcoming a learning disability or severe handicap. This is for the best. The most caffeinated major American screenwriter, Sorkin only seems to find his voice when inhabiting a frantically energetic persona whose thoughts outrun their ability to verbalize them. The start of his latest movie, Molly's Game, is so resolutely Sorkin-esque that it's almost a self-parody. Only this time, like most of his better work, it's based on a true story.


There's something characteristically English about the Royal Society, whereby strangers gather under the aegis of some shared interest to read, study, and form friendships, implicitly agreeing to exist insulated and apart from political differences.

There is an amusing detail in The Curious World of Samuel Pepys and John Evelyn that is emblematic of the kind of intellectual passions that animated the educated elite of late 17th-century England. We learn that Henry Oldenburg, the first secretary of the Royal Society, had for many years carried on a bitter dispute with Robert Hooke, one of the great polymaths of the era, whose name is still familiar to students of physics and biology. Was the root of their quarrel a personality clash? Was it over money or property, over love, ego, values? Something simple and recognizable? The precise source of their conflict was none of the above exactly, but it is nevertheless revealing of a specifically early modern English context: They were in dispute, Margaret Willes writes, "over the development of the balance-spring regulator watch mechanism."

