I’m a little old to still be shocked by the discovery that human beings can be profoundly and disturbingly irrational. Nonetheless, it isn’t easy to shed the chilling unease I experienced last year on a trip to New York City when I stopped to chat with some young people who were vociferously demonstrating on behalf of the 9/11 “Truth” movement (which believes the destruction of the World Trade Center buildings was an “inside job”).
Some of the demonstrators told me and other passersby that the pilots flying the hijacked airplanes (which weren’t hijacked at all, in their view) were really robots; that the calls the passengers made to their loved ones were faked by an army of impersonators hired by the CIA (or whomever); and, alternately, that there weren’t any “airplanes” to begin with, the objects that “crashed” into the World Trade Center buildings being, in actuality, clever holographic projections.
One of these protestors, when I suggested that these explanations were farcically beyond the pale, actually admonished me to study Occam’s Razor, the venerable principle that states that one should not multiply entities beyond what is necessary to explain a phenomenon—or put another way, that the simplest explanation is usually the likeliest one.
It’s no exaggeration to state that his invocation of this principle, of all things, to buttress his belief in robo-pilots and infinitely complex conspiracies, left me feeling hopeless about the ability of the human race to reason its way out of a soggy, partially torn paper bag.
Gary Marcus, a professor of psychology at New York University, has doubtless encountered some of these same protestors on his daily rounds, but in his book, Kluge: The Haphazard Construction of the Human Mind, he doesn’t talk much about politics (except for a random swipe or two at our conduct of the Iraq war).
Still, Kluge is a useful overview of the defects of the human mind that cause all of us—not just those manifestly misguided protestors—to engage in self-deluding, self-destructive, inconsistent, contradictory, bizarre, and just plain imbecilic thinking and behavior.
The book’s central thesis is contained in its initially cryptic title: To engineers, a “kluge” is a cobbled-together construction that more or less gets the job done when better solutions are not immediately available. One example noted in the book—a spectacularly successful one, though others fail spectacularly—is the improvised CO2 filter that the Apollo 13 astronauts jury-rigged from socks, plastic bags, cardboard boxes, and, of course, duct tape, when their real filter failed, threatening them with asphyxiation.
This assemblage, immortalized in the well-known Ron Howard movie, saved the lives of the astronauts, but Marcus’ premise is that the human brain is itself an evolutionary kluge that more often than not makes our lives miserable. Why? Because evolution, with all of its infinite and magnificent patience, is nonetheless an uneven, imperfect process that has resulted in a human body featuring, in the words of one scientist Marcus quotes, “useless protuberances above the nostrils, rotting teeth with trouble-prone third molars, aching feet … easily strained backs, and naked tender skin, subject to cuts, bites, and, for many, sunburn. We are poor runners and are only a third as strong as chimpanzees smaller than ourselves.”
And that’s not even considering more substantial genetic flaws, ranging from horrors like Huntington’s chorea or susceptibility to certain kinds of cancer, to a host of mundane imperfections such as warts or gout. The human body is, the scientist states, “a bundle of imperfections.”
And the biggest botch of all is the brain.
Kluge’s chapters take the reader through a number of interesting examples of how our brains work, and don’t work, such as our frequent failures to resist temptation, our predilection for making poor economic choices, our susceptibility to false memories, and our consistent inability to separate correlation from causation.
In this vein, Marcus notes amusingly that people with bigger shoe sizes tend to know more about history and geography than people with small shoe sizes, which seems suggestive of something rather important until he points out that “people with the littlest feet (and tiniest shoes) are our planet’s newest visitors: infants and toddlers, human beings too young to have yet taken their first history class. We learn as we grow, but that doesn’t mean that growing (per se) makes us learn.” Wars and genocides have started from correlations just as confused.
And speaking of footwear, consider another example Marcus cites, from a study by the economist Richard Thaler. Imagine that you’ve bought a pair of very expensive shoes and discover after a few wearings that they don’t really fit. Thaler’s data suggests that “the more you paid for the shoes, the more times you’ll try to wear them. Eventually you’ll stop wearing the shoes, but you do not throw them away. The more you paid for the shoes, the longer they will sit in the back of your closet before you dispose of them. At some point, you throw the shoes away, regardless of what they cost, the payment having been fully ‘depreciated.’” This makes very little sense, but most of us have behaved similarly.
Kluge is a short and not terribly substantial book, and the writing is pedestrian. So, too, are some of the citations. No reader alive could possibly need to be reminded, for the thousandth time, that F. Scott Fitzgerald once said, “The rich are different from you and me,” nor that Ernest Hemingway responded by saying, well, you know. And, for a passage on self-delusion and rationalization, Marcus, like a thousand authors before him, quotes Garrison Keillor’s description of Lake Wobegon, “where all the women are strong, all the men are good-looking, and all the children are above average.” Yawn.
But over its brief length, the book offers a fair helping of useful advice for understanding, and bypassing, our thinking problem—ways, in other words, of creating our own kluge to defeat the botched-up bricolage that the exigencies of evolution installed between our shoulders. And Marcus concludes the book, in a chapter titled “True Wisdom,” with some commonsensical takeaways to guide us through our days, ranging from “beware the vivid, the personal, and the anecdotal” to “whenever possible, consider alternate hypotheses.”
Though not, I would add, if these hypotheses involve robots and holograms.