Saturday, May 19, 2007

Having previously offered a theory of boredom, I read this post by blogger A White Bear with great relish. AWB claims to love boring art and cites slow and spare Antonioni films as examples of what she is talking about. I usually think of boredom as a character flaw, as a quality in the subject rather than the object, a kind of impatience with reticent, lulling things. But AWB suggests an aesthetics of boredom that hinges on a lack of “narrative necessity.” So-called boring things don’t provide that ready path for one’s mind to rush down; instead they perplex and retard the imagination, force the mind back on itself, and make it do a lot more work to remain engaged. Boredom opens up a space for languid speculation, for productive frustration, for conceiving several concurrent hypotheses about what the hell the point is. Works that supply “excitement” often close off the speculative space, hurrying consumers along to largely predictable outcomes that nevertheless are experienced as suspenseful. No one doubts that the killer will be killed in a film like, say, Disturbia, yet it passes as suspense nonetheless. Something truly unpredictable—Antonioni’s L’Avventura, for example—can often seem boring in its aimless meandering. Boredom, then, stems from the absence of familiar genre conventions or recognizable rhythms.


AWB argues that the suspended nature of boring works gives them an erotic quality—the jouissance of the endlessly deferred climax transferred to a text that denies closure, that refuses to put forward an organizing principle that allows you to put everything in place. The unstructured material forces you to contemplate it for its own sake, in the moment, and that inspires a sensuality that comes to the surface when we starve the rational, puzzle-solving side of our brains. AWB sees this as a minor pleasure, dubbing it masturbatory—“satisfying without satiating.”


I don’t think I would describe masturbation that way. But is it true that details become more erotic as they become more arbitrary? In another post, AWB defines pornography in a similar way, as “a rhetorical mode” that lulls readers into vicariousness with its repetition and details for details’ sake. The danger of vicarious experience—what makes pornography immoral in AWB’s view—is that it supplants a reader’s direct experience of the world with texts, simulations. Here I become skeptical, as I’m not sure how you distinguish authentic experience from simulated experience in our recursive world of late-capitalist consumerism. Most of our experience is already mediated, already in a sense secondhand, vicarious. Thus vicarious experience is virtually indistinguishable from lived experience, which, if we encountered it, we’d probably reject as “boring” since it comes free of an interpretation, free from having been slotted into some pseudo-glamorous niche after having been represented elsewhere as entertainment, as worthy of rapt attention.


I would argue that the vicarious experience from texts prepares us to conceive and process our lived experience in certain prepackaged ways—it makes our own experience legible to ourselves, gives it forms that are amenable to being stored as memories. Vicarious experience through entertainment also reinforces patterns of emotional satisfaction serviced by consumerism. I would define the pornographic not in terms of its arbitrariness but its instrumentality, how it facilitates the most convenient route to a precisely delimited goal. It makes emotional responses rote in a way that pleases us, the same way genre conventions keep us from being bored. Porn is sort of the ur-genre, I think, not a suspension of narrativity. It aspires to a transparency that effaces textuality—the formal qualities and the careful word choices and the other things that make us pay attention to the medium itself; boring things force us to contemplate the nature of the medium, halting mediation.


I completely agree that consumers prefer what AWB calls the rhetorically pornographic because it protects us from the messiness of actual experience, which involves other people. (I make the porn=convenience argument here.)


I don’t want to moralize here about these aesthetic aspects of pornography, although a great deal of my dissertation is about the social and political issues that arise when readers begin to confuse texts for experience, and to privilege fictional characters’ experiences over their own, which partly happens because of the near-pornographic rhetorical nature of scenes, for example, in Pamela. But what is obvious, over the past three hundred years of mass print culture, and then mass art, music, and film cultures, is that the public always craves the rhetorically pornographic, especially when it does not explicitly offend a prudish sexual sensibility.
There is a fine line, in fact, that “pop” tends to walk now, between providing the pretense of an unmediated experience of something sensory and representing actual sex. My very prim, religious students are reduced to puddles by passages like the one from Dracula above, which gives them something almost like an experience without any of the messy complications of consequences or interpersonal relations, which is why, I think, they place such an outlandish value on their ability to experience along with every text…. What bothers me most is that they seem not to know the difference between reading and doing anymore.



I don’t think rhetorical porn—material that invites vicarious absorption—promises unmediated experience; I think it promises the pleasures of the precisely mediated. The transcendence (being above the action, it having been designed for you) compensates for the apparent passivity of vicariousness—whether or not the interpretative effort is passive, whether it is not a kind of activity to suspend disbelief and let yourself be absorbed by a text, is a different question. (This question reminds me that I should probably re-read Michael Fried’s Absorption and Theatricality.) This doesn’t efface the personality of the reader so much as remove it from the field of play, where it can take on the illusion of completeness, of being untouchable, of not being altered by experience.


Our relations with texts (be they films, books, crap bought at the mall, or whatever) have the potential not so much to replace our experience of reality, of other people, but to format it so that it becomes instrumentalized. Reading and doing become the same thing, and not merely because so much of social existence is about interpreting signals and signs. Rather, the reader/doer presumes himself to be at the center of his own drama, with everything occurring around him seeming to have been designed solely for his amusement. This has the effect of granting identity to the reader: seductive texts call out for one’s attention, interpellating the reader as an individuated subject, as Althusser argues in his essay on ideological state apparatuses. The protagonists in pornographic literature may become interchangeable, but in so becoming they protect the reader from the fate of anonymity; the reader becomes the stable center around which the dance of anonymous entities and the highly detailed sensations recorded through them are choreographed.
(Hence compulsive self-revelation on the internet as a means to extend self-dramatization; a new tool to further a goal as old as print. Incidentally, my big research topic when I was a graduate student was how vicarious experience was developed and systematically commercialized in 18th-century novels. AWB’s dissertation sounds like one I would want to read, especially if it is as well written as her blog.) The way our entertainment patronizes us prepares us for life in this hermetic bubble. When entertainment bores us, that’s because it is forcing us outside of that bubble. (Porn is never boring in this sense; when clicking through channels randomly, while marooned in a hotel maybe, I never fail to pause when I hit porn; it’s compelling as the most concentrated form of mediation, perhaps—the most instrumental of entertainments, which all aspire to manipulate us with ease.) All that to say: I think what AWB is noticing in her students is the expectation that life itself shouldn’t ever transcend the level of vicarious experience, that life should never be anything more than entertaining.


Saturday, May 19, 2007

What is the point of Twitter?  This is not a rhetorical question; I’ve just been wondering ever since I read about it, in a surprising number of places—the Wall Street Journal, Wired, Lifehacker, Boing Boing, Marginal Revolution (I think). Is the goal to elevate nondescriptness to an art form (one of my personal mottoes)? Is it an extension of the idea behind happiness research, where you wear a monitor that prompts you to record your hedonic well-being every so often? Is it intended to dignify everyday life as the locus of struggle and praxis? Is it a hymn to concision? Or is it just yet another way to turn the most powerful communication tool humans have yet devised into a mirror?


The point seems to be to encourage self-disclosure by taking away the intimidating open-endedness of the blank page, but do we really need more self-disclosure? It seems like there’s far too much of it already. This recent Psychology Today story, “The Decline and Fall of the Private Self,” investigates what makes people want to discuss themselves in public, and it airs the usual point that electronic media is disinhibiting (more than a few drinks! the article teases). But why does disinhibition end up meaning exhibitionism? Couldn’t it be that full disclosure is itself a compulsion, one we are inhibited from refraining from? That we feel compelled to tell all, and that disinhibition could grant us the freedom to shut up?


Another common explanation the story touches on for all the Web blah-blahing is the notion that we need media attention to validate our lives—the idea that, for instance, if you don’t have a Facebook page, you aren’t really in college.


A blog makes your mundane life into an electronic saga that turns you into something more than an anonymous drone in a technological and impersonal world. “You now have a story and perhaps you’ve even become the focus of other watchers and listeners,” says Singer. “You become a character, a speaking part, in the larger theater of society.” Even if you’re playing the role of the loser—blogging about being unhappy and unattractive—at least you’re part of the show.


Sign me up for loser, and make sure it is as public a humiliation as possible—that sounds great, very validating.


Just because there’s an audience, that doesn’t mean anyone’s actually paying attention. And being just an extra is probably worse than being in the audience. How does desperate self-publicity make one less anonymous? Haven’t these people seen MySpace? Nothing can be more anonymous than being one voice among many self-involved voices out there (I know; re: this blog); at least in imagination your self-aggrandizing story isn’t suffocated by the vast competition. In fact, personal blogs and social networking sites often seem to reinforce the mundaneness of selfhood, the limited and unimaginative options we fall back on—the size of our networks seems merely to emphasize our ultimate insignificance, as they loom disproportionately over what we can meaningfully process.


Are future generations in fact doomed to grow up with the idea that their internal monologues must be heard in order to be worth listening to themselves? Do they not recognize what they themselves are thinking until assured that someone else can see and comment on it as well? Externalizing our fantasies about our self-importance would seem to actually negate them, refute them—it’s what you would prescribe to someone who was too Walter Mittyish, even. The Psychology Today story suggests as much when it explains how confessing secrets online can be therapeutic and relieve stress.


He’s right: Telling secrets has been shown to have a positive effect on the person who’s doing the confessing, because keeping them requires a lot of mental work. Wegner has found that actively trying to suppress a thought (like trying not to think about a white bear) actually seems to repeatedly refresh your mental browser and bring it to mind. “It’s almost as though there’s a little corner of the mind that’s looking for the very thing you’re trying not to think about,” he says. Sharing the secret, though, “unprimes” the information, freeing the mind to focus on other things and breaking the cycle of worry. By recounting their sins and lapses, the AA member and the blogger can unload pesky thoughts and mull more productive ones for the rest of the day.


Maybe so. I’m convinced, though, that the less I think about myself, the better I am doing. So narrating the minutiae of my worries in unsublimated form would be like a sentence to purgatory for me.


Saturday, May 19, 2007

When reading Dennis Cass’s Head Case: How I Almost Lost My Mind Trying to Understand My Brain (HarperCollins, 2007), it’s important to manage your expectations about genre.  The title might lead you to believe it’s going to be a work of science reportage, like Steven Johnson’s Mind Wide Open.  That’s not quite right.  The book jacket compares it to Super Size Me, which implies a sort of culture-jamming, quasi-political approach.  That’s not right, either.  And the blurb says it’s “touching,” which is always a little worrisome.  If you bracket those expectations, however, Head Case turns out to be quite interesting.


Head Case is actually several different books in one: Cass does subject himself to a battery of neurological tests (even self-medicating with Adderall), attends several neuroscientific conferences, and has read a lot in the journals, so to that extent it is a work of science reporting.  But thinking about minds and brains leads him, inevitably, into thoughts of his stepfather’s brain, tormented by addiction and manic depression, and of his first child’s rapidly forming brain.  Triangulating with wry humor among these three stories, Cass unpacks the discomfort many feel about thinking too closely about the brain.


Thinking about the brain is so uncomfortable that at one point, while looking at a functional MRI image of his brain, Cass “didn’t believe this brain was mine.  I found this disturbing.  Even though not feeling your brain is a perfectly healthy and normal thing, I thought that there was something sinister in how my brain denied its own existence.”  More darkly, he “went back over the Brain Logs, my diary of all the bad TV and fast food, and cringed.  I thought I was watching Terminator 3: Rise of the Machines with ironic detachment, but in reality the crap I was feeding my head meant something.  The brain was always on.  There was no work time and leisure time.”


Although Cass was raised by two drug addicts, one of whom ultimately suffered a psychotic break (and his natural father probably suffered from post-traumatic stress disorder), his stories of his childhood are not sentimental tales of victimization.  Instead, given the discussions of children’s theory of mind, the influence of childhood experiences, and so forth, Cass’s childhood emerges as a blackly comic source of potential psychopathology. 


Head Case is thoughtful, funny, and very accessible—I read it while proctoring an exam, and would vouch for it as summer reading.  An index and list of sources would be helpful; their absence serves as a cue that this book is less about understanding the brain and more about living, for better and for worse, with one’s mind.



Cass graciously agreed to answer a few questions about Head Case this week:


Head Case‘s distinguishing feature is its mix of science writing with two slightly different (though related) personal stories—your unforgettable relationship with your stepfather, and your slightly more standard-issue anxieties about fatherhood.  How/when did you realize that the book needed to open with your stepfather sprinting down Amsterdam Avenue?


When I started this project I had no intention of writing about my family. Yes, my childhood was awful, but was it memoir awful? But then about halfway through my research I realized that in order for this story to make sense I would need to deal with my stepfather’s grandiose idea of conquering ’80s New York. Then it became a technical matter. I wrote dozens of different opens, but having a prologue-type thing about my stepfather’s psychotic break seemed like the best way to start.


What interested me about Head Case‘s mixing of narratives is that it seems implicitly to contrast the two most culturally pervasive theories of mind in the past century: psychoanalysis and neuroscience.  (Implicit because you nowhere mention Freud or psychoanalysis, but that cultural mythology is so much about fathers and sons that it’s hard not to read it in.)  Was that deliberate?  If so, to what end? 


I wish I could take more credit for exploring thoughtful dichotomies, but in truth writing this book was an exercise in survival. Every day I got up and tried to make it good and every day the subject matter (neuroscience), the story (weaving together personal narrative, participatory journalism and memoir) and the tone (it’s supposed to be funny) kicked my living ass. But when it works (and it doesn’t always work) I think there is a lot of room for the reader to make these kinds of larger connections. This is a book that invites you to talk about it behind its back. 


A follow-on about psychoanalysis: In the 1920s and ’30s, many artists turned their backs on psychoanalysis, not so much on scientific or medical grounds but on epistemological/aesthetic/ontological ones: they didn’t want to unravel the source of their art.  You voice similar doubts throughout Head Case.  Is this just part and parcel of thinking about the mind or brain?


I don’t think this resistance is limited to artists or writers. Imagine that you and your friends are at a bar getting deliciously drunk. Nothing ruins the moment more than someone exclaiming, “We’re so wasted!”


After your experience with Adderall, which I recognize is colored to some extent by your experience of your parents’ drug abuse, how do you view the increasing use of, or acceptance of, such cognitive enhancers by college students and others?  Will our children see Adderall much like coffee? 


My problem with any kind of drug is that there is always a price. And I mean that physiologically, not morally. Because it’s time-released, Adderall has milder side effects than traditional amphetamines, but still: after up time it’s down time. Which is too bad, because you can really read on that stuff.


Near the book’s end, you’re not just skeptical about neuroscience’s ability to decode the brain, but instead see neuroscience as treating you abjectly.  (I’m thinking of the moment where you describe yourself as “covered in science cum,” even though you were “not having a good time.”)  Do you have ethical reservations about neuroscientific research, or is your recoil more idiosyncratic?


I think it’s a little of both. I think we’re probably going to discover something about the brain that we’ll regret discovering. That is no fault of science; it’s more a matter of the law of unintended consequences. Other than perhaps outright curing a disease like polio, it’s hard to find any human endeavor without unintended negative consequences. But mostly, in that moment, I just felt like a jerk.


Head Case is preoccupied with its own writing—you take on several different roles during the narrative; you spend some time talking about the nature of insight, and so forth.  Has your writing process changed much since working on this book?


I often write myself into my stories and I do it for a lot of reasons. First, it’s fun. I can tell jokes and make observations that are more personal and idiosyncratic than if I were writing from a distance. Plus, I can serve as the through line, which lets me juggle a lot of different elements. I also feel that it’s more honest. If I’m there talking to someone or witnessing something, why pretend like I’m not? 


What bit of brain knowledge is either your favorite or something that haunts your dreams?  For myself, I could have done without learning that an “unpreserved brain would spread like pudding.”


Yes: the physical brain is pretty gross. But I think the most haunting thing is the idea that the 10% myth is just that: a myth. There is no secret door behind which lie wonders, no hidden switch that activates cognitive afterburners. We are using all of our brains all the time, and this is what we get. This is your life. This is the world we have made for ourselves. Bonne chance.


Friday, May 18, 2007


Bubby is 35 years old. He has lived in a grimy, bunker-like apartment all his life. His only companions are a feral cat and his fanatical mother. Dictatorial and overbearing, this supposed parent treats her son horribly, making outrageous demands and ridiculous rules. Of course, with her boy now a man, she also benefits from his “matured” sexuality. Bubby cannot escape his claustrophobic world. Mother has told him that the air outside is contaminated and that if the poison doesn’t get him, Jesus will. So Bubby stays inside, waiting for the next round of reprobate behavior. One day, a stranger comes knocking at the door. It turns out to be Bubby’s long-lost father. Confused and scared, Bubby grows even more twisted in his behavior, and it’s not long before he has dealt with his family issues and is off on his own. And the world turns out to be a strange and savage place for our stifled simpleton.


You have never seen a movie quite like Bad Boy Bubby. No David Lynchian surrealscape or David Cronenberg psychosexual splatter job can compare to the stellar, sinister magic director Rolf De Heer creates in this amazing masterpiece. Borrowing from his demented brothers in arms, De Heer uses many recognizable reference points to define a unique style and vision all his own. By fashioning experimental elements into a strong focus on character and narrative, the filmmaker takes us on a literal journey from Hell to Heaven. As much a coming of age as it is a meditation on the pitfalls of maturity, this is a Thomas Pynchon novel typed onto celluloid, a complex narrative where every scene has several meanings, and differing layers diverge and reform to create something wholly original and inspired with each configuration. It may be difficult to watch at first, and does deal with subjects and people that we’d never imagine tolerating, let alone taking an interest in. But somehow, with all the vileness and the vitality on hand, De Heer and his stellar cast manage to concoct a modern classic.


Part of the reason why Bad Boy Bubby works so well is its bravery. The film is obviously a product of its time – 1993 – and its place – Australia (Hollywood wouldn’t have touched this script with a script doctor’s glove soaked in antibiotics) – and De Heer pushes the limits of acceptable cinematic behavior from the very first series of shots. Using nudity as a symbol for both defenselessness and perversion, and playing simultaneously with the notions of neglect and incest, it’s hard to get a handle on what the film is offering. It’s almost like a sideshow, where freaks are paraded out for our amusement and morbid curiosity. Then slowly, as the unreal situations and circumstances become more and more agonizing, De Heer sets up his first stroke of storytelling genius.


We know Bubby is a prisoner in his hovel of a home, brainwashed into believing the world beyond the front door is filled with poisoned air and that his mother is the only solace, physical or otherwise, he will ever require. Her overbearing browbeating has led Bubby to become a kind of human Rosetta Stone, recording and reinterpreting everything around him as it passes through his orphaned, underdeveloped mind. So by the time the long-lost – but equally bullying – father reappears, we are just as desperate as Bubby. We want to see what lies beyond that massive, ironclad apartment door. And when he finally does, Bad Boy Bubby becomes another experience altogether.


Bad Boy Bubby‘s second “movement” is magically aimless, a series of vignettes and experiences as seen through the eyes – and, most importantly, heard through the ears – of our lead character. The symphonic analogy is quite fitting here, as De Heer relies on music so frequently that it becomes a character in the film. Gorgeous organ solos, brash yet atmospheric bagpipes, and the standard sonic boom of rock and roll all chime in like harmonic Greek choruses to remind us of our protagonist’s naiveté and innocence. Sound literally colors the world around Bubby. He is also filled with a lot of foul ideas, facets that have to be purged and tamed like the ferocity of an undomesticated animal. Music, in the film, does have the proverbial charms to soothe this savage, and little by little, note by note, the melodiousness sinks down inside and starts the process of reviving Bubby’s soul.


In what has to be one of the most amazing third acts ever created, Bubby’s distress and disposition finally come full circle, able to be employed for both beneficial and baneful purposes. That he becomes a rock star and a kind of spiritual medium for the physically handicapped may seem a bit pat (both situations seem fanciful and outside Bubby’s realm of existence), but De Heer makes them work because of the fantastic foundation he’s laid before. Throughout the course of the film, we’ve wondered how Bubby will fend for himself, as well as why fate allowed him to suffer so. The answer comes in his opposing abilities. He can use his incredible rage to vent a kind of industrial, cathartic punk rock. And he can use his naive sweetness and his non-jaded nature to speak with those whose voices are lost to “normal” people. All of this adds up to a profound and deeply moving cinematic experience.


But there is more to it than simple storytelling. The reasons for Bad Boy Bubby‘s majesty are indeed many. First and foremost, the performance by Nicholas Hope is flat-out extraordinary. Looking like a more mannered Hugo Weaving (or a more insane Douglas Bradley), and mimicking many of the people he meets in the movie, Bubby is a wholly original creation, an intricate and infected innocent who may be smarter – or a lot dumber – than he appears. There are moments of high comedy in Hope’s interpretation, as well as deep, deep sadness. That we can get behind and support someone like Bubby, who seems simultaneously antisocial and empathetic, is as much a commendation of De Heer’s script as it is praise for Hope’s performance. This is the very definition of a tour de force.


So is De Heer’s direction. From the ideas floating around inside to the way in which he chooses to illustrate them, Bad Boy Bubby brims with untold imagination. This is not just a narrative centering on mental/physical/sexual abuse and bad parenting – it is also a discussion of God, a look at celebrity, a critique of aging, and a swipe at social standards. This is a dense dissertation of a film, a multifaceted text that offers something surprising with each and every viewing. This is the kind of movie one gets lost in, mesmerized by what one sees and enraptured by what one hears. From its ominous beginnings to its optimistic end, Bad Boy Bubby retains its integrity and its power. This is one of the lost gems of world cinema.


Friday, May 18, 2007

Tyler Cowen’s NYT column yesterday pertains to some of the same issues about education raised in my previous post. He cites recent work by economists Claudia Goldin and Lawrence Katz on the return to education and its role in generating income inequality, and argues that variations in the supply of highly skilled (educated) workers explain trends in income inequality.


Starting about 1950, the relative returns for schooling rose, and they skyrocketed after 1980. The reason is supply and demand. For the first time in American history, the current generation is not significantly more educated than its parents. Those in need of skilled labor are bidding for a relatively stagnant supply and so must pay more.
The return for a college education, in percentage terms, is now about what it was in America’s Gilded Age in the late 19th century; this drives the current scramble to get into top colleges and universities. In contrast, from 1915 to 1950, the relative return for education fell, mostly because more new college graduates competed for a relatively few top jobs, and that kept top wages from rising too high.
Professors Goldin and Katz portray a kind of race. Improvements in technology have raised the gains for those with enough skills to handle complex jobs. The resulting inequalities are bid back down only as more people receive more education and move up the wage ladder.


This paints a somewhat different picture than the one Posner was painting, because the ultimate intention behind the argument is different. Though it’s not explicitly mentioned, Cowen’s argument suggests that higher education supplies the skills in a straightforward way—you learn things you need to know in class via texts and teachers and so on. Posner suggests that the content of education is arbitrary and that college merely certifies skills that more or less pre-exist a student’s being admitted. That places him in the camp with the “pessimists” Cowen acknowledges, who believe “that only so many individuals are educable at a high level. If that were the case, current levels of inequality might be here to stay.” Whether the pessimists’ belief in the existence of the intractably stupid precedes their concern that the state not subsidize education is an open question. The beliefs probably reinforce each other.


Cowen seems more interested in a different point: “Nonetheless it will, sooner or later, become increasingly difficult to deliver the gains from college — not to mention postgraduate study — to the entire population. Technology is advancing faster than our ability to educate. So even if inequality declines today, it may well intensify in the future.” He seems less interested in halting education subsidies than in stifling the argument about income inequality, which thereby appears inevitable.


I find myself still wondering whether the gains from higher education stem from its amplifying preexisting advantages in social capital (which the “pessimists” suggest is a product not of an unfair society but of different natural abilities—sorry, blame God) rather than from the quality of what one learns in the classroom. I’d like to see the skills that are so important to the new economy delineated somewhere—maybe I should do some actual research on this point, but usually economists are content to point to skill-biased technological change and keep the skills themselves in the black box. (Part of my inquisitiveness is personal mystification: I’m not sure I can spell out exactly what I learned from college, and I would not at all feel comfortable claiming it justified any of its effects on my income.) I also wonder if there is anything useful the state can do to prevent education from being the means of perpetuating class privilege.

