PM Pick

Asymmetric paternalism

I'm generally sympathetic to the arguments of behavioral economists, who want to broaden economics in efforts to account for humankind's irrationality, defined as its failure to always maximize utility and make choices that will lead to the most bountiful outcomes. But while reading John Cassidy's New Yorker story about neuroeconomics I found myself resisting the whole rational/irrational paradigm, which suddenly seemed impoverished. Suddenly the general humility of economics seemed much more appealing than the hubris of the neuroscientists who seem on the verge of suggesting we neutralize certain lobes of our cerebral cortex to make ourselves more "rationally" profit-seeking or to snort oxytocin (a hormone which induces loving feelings toward others) to make ourselves more trusting and therefore more economically efficient in commercial contexts. Rather than respect the different kinds of decision-making processes humans have adapted, it sounds as though some of these econo-scientists would like to modify people so they fit the traditional homo economicus models more comfortably. It seems better to amend the models or limit their applicability than to force humankind to exhibit the remorseless efficiency they presume. By the end of the article Cassidy is citing economist David Laibson pitching a dualistic model that mimics the Cartesian mind-body split: "The modified theories to which Laibson referred assume that people have two warring sides: the first deliberative and forward-looking, the second impulsive and myopic. Under certain circumstances, the impulsive side prevails, and people succumb to things like drug addiction, overeating, and taking wild gambles in the stock market." If this is so, I wonder whether these sides would have a consistent, predictable influence on the other, or whether they might not work simultaneously and independently. One can overeat while plotting an extremely rational stock portfolio. 
And a certain amount of pleasure derives from avoiding decisions altogether, from surrendering, from refusing to calculate, from inertia or expediency -- Cassidy himself notes he made decisions that were expedient when the machine measuring his brainwaves began to make him claustrophobic. At some point it becomes rational to be irrational; irrationality is not merely a consequence of emotions inappropriately obtruding.

My somewhat paranoid concerns about forced rationalism grew strongest when Cassidy discussed "asymmetric paternalism":

Reforming 401(k) plans is an example of “asymmetric paternalism,” a new political philosophy based on the idea of saving people from the vagaries of their limbic regions. Warning labels on tobacco and potentially harmful foods are similarly intended to keep subcortical structures in check. Neuroeconomists have suggested additional policies, including warning buyers of lottery tickets that their chances of winning are practically nonexistent and imposing mandatory “cooling off” periods before people make big-ticket purchases, such as cars and boats. “Asymmetric paternalism helps those whose rationality is bounded from making a costly mistake and harms more rational folks very little,” Camerer, Loewenstein, and three colleagues wrote in a 2003 issue of the University of Pennsylvania Law Review. “Such policies should appeal to everyone across the political spectrum.”

You don't have to work for the Cato Institute to find this dubious. None of these specific policy prescriptions seems problematic, but the logic behind them is worrisome. Some people can't be trusted to act in their own interest -- but who defines what that is, and by what criteria? Who gets to say what a rational person "should" do? Who gets to decide which reasons for acting are "bad" or "wrong"? Here, neuroeconomists fall back on profit/utility maximization as the definition of rationality. You can see how this rationality, enforced by paternalistic measures, could easily become a prison, the bureaucratic nightmare Adorno evokes in his critique of Enlightenment positivism. It's like using an ad to advertise the idea that paying attention to ads is harmful -- these kinds of measures are designed to bring people "back" to their senses while reinforcing the idea that there's no need to return, since common sense and rationality are already being retrofitted into the options society presents them. The underlying assumption of asymmetric paternalism is that people are sheep, with no strong reasons for doing what they do, so they may as well be encouraged/forced to do what can be deemed most beneficial socially.

Cassidy quotes Laibson on the nature of this paternalism: “The practical implications of the experiment come from obtaining a better understanding of the human taste for instant gratification,” Laibson said. “If we can understand that, we will be in a much better position to design policies that mitigate what can be self-defeating behavior.” I'm not going to make the argument that "self-defeating" is a contradiction in terms, as some economists sometimes seem to imply (if every choice by definition reveals a preference, then how can you choose what doesn't suit your own wishes without being coerced?) -- but who's to say what is "self-defeating" and in what circumstances? What sort of policy could cover all the exceptions? Time has a different value to different people in different circumstances -- influencing that value is what exploiting convenience is all about. The taste for instant gratification may be impulsive or may be a matter of what an individual considers timely -- it seems foolish to, say, buy a Blu-ray player right now, but if you derive all sorts of satisfaction from being the first on the block to have one, you can't afford to have your gratification delayed. And it seems dumb to buy lottery tickets, but for some they are licenses for invaluable fantasy. So what may seem like poor decision-making to us could just be part of the plan. There's no sure way of accounting for other people's notions of utility.

If we are going to institute some of these measures, I'd rather they be sold not as something for my own good but as something for the social good -- you will be defaulted to save in a 401(k) because it will help prevent society from having to support you when you are old and destitute, or spare society the sight of your suffering. You will be discouraged from smoking, because your smoke poisons others and because society doesn't want to bear the burden of your medical costs. And so on. Leave people with the illusion that they know what is best for themselves and encourage the notion that everyone will be making sacrifices for the common good -- this seems better, ideologically speaking, than having the state work to maximize outcomes for individuals.

A side note: I'm a bit puzzled by the ultimatum game:

A good way to illustrate Cohen’s point is to imagine that you and a stranger are sitting on a park bench, when an economist approaches and offers both of you ten dollars. He asks the stranger to suggest how the ten dollars should be divided, and he gives you the right to approve or reject the division. If you accept the stranger’s proposal, the money will be divided between you accordingly; if you refuse it, neither of you gets anything.

How would you react to this situation, which economists refer to as an “ultimatum game,” because one player effectively gives the other an ultimatum? Game theorists say that you should accept any positive offer you receive, even one as low as a dollar, or you will end up with nothing. But most people reject offers of less than three dollars, and some turn down anything less than five dollars.

It seems to me that this game sets up a reference group of you and the other person that makes invidious comparison inevitable. Thus, as the other person gets richer, you get poorer by comparison. From that point of view it seems perfectly rational to demand an even split, or to let neither of you gain anything. Only by imagining a fictitious reference group -- i.e., not the person you are in the game with but people you are theoretically comparable with -- can you make the game theorists' rational choice. Rationality ends up depending on a rich, healthy imagination.
