Latest Blog Posts

by Rob Horning

13 Nov 2008

This is the last paragraph from David Leonhardt’s article yesterday about consumer confidence (I would have made it the lead):

It would be silly to insist that a few terrible months meant the end of American consumer culture. But it would be equally silly to assume that culture could never change. It might be changing right now.

Data and anecdotes support the notion that consumers are currently spending less and mean to cut back even more—Best Buy’s CEO declared that “rapid, seismic changes in consumer behavior have created the most difficult climate we’ve ever seen.” The FT’s Lex column today wondered whether “conspicuous asceticism” might become the “new ostentation,” producing “structurally lower levels of demand across all areas of discretionary spending.” I’m still pessimistic, though, that this amounts to a rupture with the culture that is all any of us born after World War II have known.

Nevertheless, I don’t think that means Americans are incurably optimistic. One of the strangest things about the business press, and I’m still not used to it, is that optimistic is usually a complimentary term there, connoting a boon and a benefit. Where I come from intellectually, it tends to mean you are a useful idiot or a rube. That seems especially true when applied to consumers.

Andrew Kohut, president of the Pew Research Center, noted that his recent polls showed a sharp rise in the number of people planning to cut back on spending — but also a clear increase in the number who expected the economy to be in better shape next year. “What the American economy has going for it is the innate optimism of the public,” he said. “Americans get optimistic at the drop of a hat.”

We don’t need a reason to expect the best; we’re just dog-like in that way. Our masters are going to put something good in the bowl; we just know it.

Also, is shopping rather than saving really an expression of optimism? “I am feeling very positive. I’m going to go buy a TV set.” It seems more like a preference for living for today than faith in or concern for tomorrow. I guess the idea is that confidence in our future earning capabilities makes us more likely to spend now, but I always (wrongly) interpret consumer confidence as meaning “confidence in the consumer way of life.” When it is high, it suggests to me a vote of no confidence in the possibility of meaningful work, of finding purpose, confidence, hope, etc. in making and doing rather than spending and getting. It’s as though consumers are surrendering by being confident in the pleasures of consuming, and as though, when consumer confidence falls, people are indicating that they suddenly enjoy consumption less. Falling consumer confidence, then, seems like it should mean rising personal confidence. But that of course isn’t the case. People just aren’t confident enough about having a healthy flow of cash to support all the spending they wish to do.

Still, the term consumer confidence seems to relegate people to their passive roles, whereas these same people are also part of the production process. But we are accustomed to thinking that the only role we take pride and pleasure in is our role as consumer; it’s through that process that we make ourselves with as much autonomy as we like—not through the working world. What’s hard to take is how often disappointment in American consumers is expressed, for letting the economy down, for thinking of other ways to make it through their days without ceaseless spending on consumer goods. How dare they? Have they lost their minds? Why can’t they be more optimistic and compliant?

by Rob Horning

13 Nov 2008

When business journalists mention the flight to quality, they usually mean investors shedding risky investments and buying government Treasurys, blue-chip stocks, gold ingots. But in a recession (consumer spending was down 3.1 percent last quarter, which is an astonishingly high number), consumers may make their own humble flight to quality as well. I was struck by a line from this Economist article about American retailers’ coming struggles: “Among deep discounters, too, such as Dollar General and Dollar Tree, which have benefited from shoppers looking for the best possible value, the leaders are gaining at the expense of laggards. Even dollar stores are finding life harder, as customers are somehow finding their way to goods that yield their sellers the very lowest profit margins.” The word “somehow” intrigues me in that last sentence—the lowest profit margins probably come from the goods that give consumers the most value, and when their minds are focused by hard times, they can perhaps ferret out that value more ably. That’s a tall assumption—that use value correlates negatively with profit margin—but I’m going to go with it to indulge in some speculation.

It seems to me that in the 99-cent store, where there are no coherent efforts at marketing, branding, or promotion, consumers are at less of a disadvantage; some of the information asymmetries that marketing systematically tries to create are absent. And distortions of use value created by price signals are muted, since everything is priced the same. So without all that static, consumers can perceive the actual usefulness of goods more clearly. But in order for that to happen, consumers must overcome the initial disorientation that comes with shopping in such an arid retail environment. Marketing and branding, etc., all ultimately save us time and make our shopping at once more efficient and more pleasurable—we can fly into fantasy thanks to the narratives advertising has enchanted the goods with. At the 99-cent store, goods are disenchanted and bewildering in their profusion. We are forced into a different mind-set when we go there—a skeptical, distrustful attitude that has us interrogating the goods rather than leaving us open to being seduced by them. This is the opposite of what profit-seeking retailers generally try to accomplish; this McKinsey report summarizes the typical goal:

Retailers with good financial health in mature industries can also go on the offensive, taking actions to quickly grow revenue by driving traffic into stores through more compelling offers and ensuring that staff is ready on the floor for the assisted sale. For example, a North American soft goods retailer has reversed declining sales, improved customer satisfaction, and increased the frequency and average size of transactions by focusing on eliminating out-of-stocks, raising the effectiveness of front-line salespeople, and making small store-layout changes that help customers find the goods they want.


It’s worth remembering that these efforts improve the retailers’ bottom line, not the consumers’. (This has been Best Buy’s strategy in crushing Circuit City, which recently announced it was closing stores.) Consumers spend extra for the accessibility, not for quality; if they are trained by hard times to eschew that, then they can save what they ordinarily pay to spruce up the shopping experience while still satisfying their “needs”—that is, if you accept that there is such a thing as the difference between wants and needs.

by David Pullar

13 Nov 2008

Thanks to my fiancée working in a bookshop, I have been fortunate to discover a bizarre sub-genre of book that I would never have heard of otherwise: the cosy murder mystery.

The proliferation of hard-nosed TV cop shows of the CSI ilk has given me the impression that murder is a pretty grisly business.  Yet apparently there is a section of the populace that likes its murders with a side of handicrafts and a dressing of soothing familiarity.

My first experience with this style of book was with the inimitable Laura Childs and her scrapbooking murder mystery Photo Finished.  Not only has Childs produced six murder mysteries set in a New Orleans scrapbooking shop (all loaded with helpful exposition on decorative edging), she is also responsible for a series of teashop mysteries.  Photo Finished is appallingly written and the plot is nothing short of absurd, but that all seems irrelevant when you discover that Childs is far from the only writer working in the genre.

The concept behind these books is not so strange.  After all, Agatha Christie’s Miss Marple books featured a cosy English village that was unusually prone to homicide.  Last century, butlers were notorious for bumping off houseguests in novels—possibly lashing out at their declining employment prospects.  Why should scrapbooking shops be any less popular locales for murders?

Nevertheless, it’s an intriguing combination.  Could there be anything less comforting than the prospect of being sent packing from this life while enjoying the simple pleasures of flower shopping (Shoots To Kill), drinking coffee (On What Grounds) or…gasp…teddy-bear collecting (The Clockwork Teddy)?  In fact, the thriller genre has been most effective when it has shown crime intruding into the safest places.

The trick with the cosy murder mystery seems to be to keep the murder part to a minimum.  Killings are brief, absent grisly detail and usually of incidental characters we have not had time to get used to.  I suppose this is one way to maintain the “cosy” vibe, but it does seem to defeat the purpose of a murder mystery.

I’m trying to work out what the existence of this genre says about humanity.  If we sidestep the question of why some people are so keen on scrapbooking that they want their murder mysteries to involve it, we’re still left with this: why do people simultaneously crave the excitement of bloodshed and the comforting knowledge that it won’t happen to them, and that they can go back to their quilting afterwards?

Maybe decades of crime fiction have reduced murder to a simple plot trick.  We’re no longer interested in the procedure of detection or the psychology of crime.  We’re really just looking for an excuse for our characters to momentarily escape their lives and have an “adventure”.

That would at least explain why so many of these books are about really boring hobbies.  If you’re writing about skydiving or spear-fishing in the Marianas Trench, then you hardly need to bump off one of your characters with a pair of craft scissors—the thing is exciting enough as it is.  On the other hand, writing 200-odd pages about a group of cat-sitters would drive anyone to murder.

by Sean Murphy

12 Nov 2008

I. Personal

Remember when Born in the U.S.A. was ubiquitous? The album and the song. Bruce was already big, but he wasn’t over the top. Born in the U.S.A. put him over the top and, to a certain extent, he’s stayed there ever since. Of course, people in the know understood he was already a legend before the ‘70s ended; in the early ‘80s The River and Nebraska cemented that status, but Born in the U.S.A. ensured that no one could ever ignore The Boss.

I already owned scratchy LP copies of Born to Run and Darkness on the Edge of Town, as well as original (and shitty-sounding) cassette copies of the oft-overlooked but brilliant first two albums (Greetings from Asbury Park, N.J. and The Wild, the Innocent & the E Street Shuffle), so by the time Born in the U.S.A. hit the market, I was admittedly wary of the frenzied and new-fangled faithful joining the party. But other, more disconcerting forces were at play: the album, as good as it was, wasn’t that good. “Dancing in the Dark”, “I’m on Fire”, “No Surrender”, “My Hometown”? Eh. “Glory Days” was pretty much an instant classic, but (as is always the case with FM-friendly tunes, and never the fault of the artist) overplay hasn’t helped its staying power. But the big hit, the title track, the song that seemed to shoot through the dial 24/7, that one was a love-or-hate affair. I hated it. If ever there was an arena-ready anthem, this was it. And the muscle-bound Bruce from the video? Give me the spindly Serpico clone from ’78 any day.

(Interesting coincidence: Springsteen had a difficult time getting the track to sound the way he wanted it. Indeed, it was an outtake from his stark solo effort Nebraska. This is not unlike the origins of another overplayed song from the ‘80s, the Rolling Stones’ insufferable “Start Me Up”. That one was originally cut as a reggae-ish romp, before it devolved into the over-produced, if innocuous hit it was destined to be. “Start Me Up”, to be certain, is a lark, and it was—for better or worse—fated to be recycled for eternity at sporting events. “Born in the U.S.A.”, on the other hand, is actually a serious song and, as it happens, is much better than it sounds.)

Perhaps it’s my own fault, but it took several years before I even figured out the words Bruce was singing; perhaps it’s due to his overwrought delivery—equal parts marble-mouthed and shouting. Regardless, this is quite possibly Springsteen’s most somber song—and considering the era (Nebraska) in which it was written, that is saying a great deal. (And for the curious, it’s well worth checking out the far superior demo version that didn’t make the cut for the Nebraska album.) It made all the sense in the world, then, that when Springsteen hit the road for his subdued Tom Joad tour in the mid-‘90s, he made the searing, stripped-down version of this song a centerpiece of the show. His hand pounding the acoustic guitar to simulate a heartbeat at the song’s coda remains one of the most quietly powerful and emotional moments I’ve ever witnessed at a concert.

II. Polemical

Check it out:

Born down in a dead man’s town
The first kick I took was when I hit the ground
You end up like a dog that’s been beat too much
Till you spend half your life just covering up

Born in the U.S.A.
I was born in the U.S.A.
I was born in the U.S.A.
Born in the U.S.A.

Got in a little hometown jam
So they put a rifle in my hand
Sent me off to a foreign land
To go and kill the yellow man

(chorus)

Come back home to the refinery
Hiring man says “Son if it was up to me”
Went down to see my V.A. man
He said “Son, don’t you understand”

I had a brother at Khe Sahn fighting off the Viet Cong
They’re still there, he’s all gone

He had a woman he loved in Saigon
I got a picture of him in her arms

Down in the shadow of the penitentiary
Out by the gas fires of the refinery
I’m ten years burning down the road
Nowhere to run ain’t got nowhere to go

This song is, upon closer inspection, a staggering achievement. With few words and admirable restraint, Springsteen captures the cause and effects of the Vietnam war from the perspective of an ordinary American, the afflicted civilian. More, he moves the narrator into the here-and-now, making the uncomfortable point that the war never died for the people who managed to live. Movies like The Deer Hunter and Coming Home dealt with Vietnam’s immediate aftermath—the dead or wounded—but not many artists (certainly not enough artists) articulated the dilemma of the working poor who returned from the front line to become the unemployed, or unemployable poor. The vets who ended up in jail, or hospitals, or sleeping under bridges. Or the ones always on the edge (this was, remarkably, a time when shell shock was still a more commonly used term than Post Traumatic Stress Disorder and, as George Carlin astutely pointed out, perhaps if we still called it “shell shock” it might be less easy to ignore), the ones who, by all outside appearances, could—and should—be finding work, and contributing to society, and staying out of trouble. As politicians of a certain party confirm time and again, you cease to be especially useful once you’re no longer in the womb, or no longer wearing the uniform.

On albums like Nebraska and Darkness on the Edge of Town, Springsteen presented stories of the dirty and the desperate, the men and women straddling the line between paychecks and prison, the ones wrestling with the hope and glory inherent in the mostly mythical American Dream. All of them had a story, and many of them were archetypes from small towns and big cities all across the country. But “Born in the U.S.A.” might be the first instance where Springsteen takes a topical dilemma and wrestles with an entire demographic: the veterans with “nowhere to run (and) nowhere to go”.

Of course, in an irony that could only occur in America, none other than our PPP (proudly patriotic president), Ronald Reagan (or, more likely, his handlers), utterly misread the song and tried to appropriate it as a feel-good anthem for his 1984 reelection campaign. Predictably, Springsteen protested. But what Reagan and his opportunistic underlings heard was, in fairness, the same interpretation so many other Americans shared. And who cares, anyway? It’s just a song after all. And yet, it is a shame that such an effective, and affecting, observation was celebrated as representing the very facile values (unthinking nationalism, unblinking pride) it calls into question. Again, Springsteen and his band bear no small amount of artistic culpability for marrying such stark lyrics to such a buoyant, fist-pumping, car-commercial-sounding song. People hear those martial drums and think of John Wayne instead of Travis Bickle.

Travis Bickle, from Taxi Driver

III. Political

Why bring politics into it at all, one might ask? Music can be, and certainly is, enjoyed regardless of what it was intended to inspire. If a song moves you, or manages to make sense in ways that directly contradict the artist’s design, beauty is forever in the eye of the beholder. On the other hand, as George Orwell noted, “the opinion that art should have nothing to do with politics is itself a political attitude”. Put another way, “Born in the U.S.A.” is still relevant because the issues it confronts are still relevant. We not only have (entirely too many) struggling veterans from last century’s wars, we will have no shortage of men and women who have fought (or are currently fighting) in this generation’s imbroglio. History only makes one promise, and it’s that it will ceaselessly repeat itself.

And so, even as our ill-advised adventure in Iraq reaches its inevitable endgame, we will only be in the initial stages of dealing with the veterans who need care and attention. We won’t count the ultimate cost of “mission accomplished” until we consider the lives lost and the walking wounded, tallied up alongside the untold billions of dollars. This is reason enough to be grateful for an Obama administration (the irony that a genuine war hero, had he managed to win, would have necessarily been obliged to overlook those in need of help to pacify the string-pullers in his party, was, thankfully, too outrageous even for America to make possible). The Democrats can’t create miracles, but they can continue to ensure that the people owed the most won’t get the least.

Remember this, when the ankle-biters and small-government-soundbite hyenas crawl out of their taxpayer-fortified foxholes to decry liberal “big spending” programs. Remember it’s these programs that, in addition to paving roads, building schools and providing health care, attempt to secure some support and solace for our broken soldiers. And remember, in two, or four, or forty years, these same craven war pigs will once again wrap themselves in the American flag; these same armchair generals prepared to fight to the last drop of other folks’ blood will be the ones seeking to slash the programs designed to save the ones burning down the road.

by Bill Gibron

12 Nov 2008

He’s that old friend we hardly recognize anymore, that middle-aged idol who is, apparently, going through a bit of a creative and cultural crisis. Granted, the secret agent is substantially less sexy in 2008, especially when you consider the War on Terror implications of such stealth. And let’s not forget the endless recycling and regurgitation. Over the course of 22 films, he’s gone from suave and dangerously debonair to a pit bull with ADD. He’s been resourceful, lackadaisical, and constantly reconfigured to fit contemporary parameters. But the question remains - is James Bond still James Bond? - and better yet, has the latest incarnation put the final stake in the character’s heroic heart once and for all?

When Daniel Craig was announced as the latest incarnation of Her Majesty’s licensed-to-kill-bot, there was the typical unbridled backlash. Most of the complaints centered on the unknown UK actor’s age (Sean Connery was 32 when he starred in Dr. No - Craig was 38 at the time of Casino Royale), his blond hair, his lack of experience, and the general kvetching that comes with any change in the 007 mantle. While he may have faced more scrutiny than Pierce Brosnan or Timothy Dalton, no new Bond gets off easy. Then again, the Connery vs. Roger Moore/George Lazenby/you name it argument is so old it beats the original spy thriller to the retirement home.

So what’s there left to talk about if we don’t dish on whether actor X can carry legend Y’s Walther PPK? How about the equally erratic aspect of the men behind the lens? In the franchise’s 46-year history, there have only been 10 directors involved in the James Bond films - Terence Young (Dr. No, From Russia with Love, Thunderball), Guy Hamilton (Goldfinger, Diamonds Are Forever, Live and Let Die, The Man With the Golden Gun), Lewis Gilbert (You Only Live Twice, The Spy Who Loved Me, Moonraker), Peter R. Hunt (On Her Majesty’s Secret Service), John Glen (For Your Eyes Only, Octopussy, A View to a Kill, The Living Daylights, Licence to Kill), Martin Campbell (GoldenEye, Casino Royale), Roger Spottiswoode (Tomorrow Never Dies), Michael Apted (The World Is Not Enough), Lee Tamahori (Die Another Day) and now, Marc Forster (Quantum of Solace).

For many, the same old sentiment applies - the older films were far better and truer to the character than the newer, more modern action efforts. Others point to Young and Hamilton as forming the Bond mythos, and to the later, lackluster work of Glen as almost destroying it. The decision over the last decade to offer an Alien-like approach to the series (a new filmmaking face guided the material each time out) has met with some hesitation, and a lot of head scratching. Was Tamahori really the right person to put in charge of Brosnan’s final fling with the character? Indeed, the same could be said for Apted, a man mostly known for the triumphant documentary anthology The Up Series.

With Quantum of Solace, one assumes that Forster will face the same cinematic struggles. In an era where stuntwork has to be spectacular, massive in scope, driven directly by the narrative, and captured with a frantic ‘you are there’ urgency, the reigning kings are Paul Greengrass and his amnesiac black-ops icon, Jason Bourne. There is no denying that the two films helmed by this gifted director (Supremacy and Ultimatum) are contemporary action done with a determined artistic merit. Sure, you sometimes get queasy as the camera careens endlessly around the actors, but Greengrass understands the volatility of such sequences, and the violence that typically results.

Forster obviously feels a kinship to this kind of chaos. From the very opening of Solace, he strives to keep the viewer directly in the line of car-chase/fisticuffs fire. Of course, it seems odd that the man responsible for Monster’s Ball, Finding Neverland, Stranger than Fiction, and The Kite Runner is adopting the shaky-cam POV. He’s the last wannabe auteur you’d envision taking over the Bond beatitudes. When the characters interact in the latest installment, Forster is right at home. These moments remind us of why the spy thriller remains a potent genre. But as a creator of convincing spectacle, Forster fails. He’s no John Woo, or for that matter, Michael Davis.

Indeed, by taking this strategy in bringing the character into the 21st century, Quantum stumbles. What Davis did with his rollicking Shoot ‘Em Up, or what Tarantino does with his typically homage-heavy approach, is bring the mannerism to the material, not vice versa. In essence, when QT takes on a bit of vehicular mayhem, he draws from the endless canon of same, picking and choosing the best bits to drive his camera/crash choreography. Similarly, someone like Woo works out placement and particulars so that his sequences become dramatic statements on the storyline’s themes and subtext. But in Forster’s case, it’s just copying for the sake of commerciality. There’s even a bit of balcony jumping à la Bourne.

Going back to the old Bond films, one is instantly aware of how clearly defined they were/are. Our hero faces an evil enemy hellbent on taking over the world. He gets help from a hot lady, an entire Aston Martin full of gadgets, and enough mental ingenuity and physical acumen to guarantee at least a chance at success. In the post-millennial 007 universe, the superspy is now a superhero, almost impervious to pain, injury, or unlucky rolls of the plot-point dice. Taking away the debonair dandy’s vulnerability may be in line with today’s power-hungry demographic, but it robs Bond of one of his most important aspects - his humanness. Spies are not gods. They are people playing policy against each other to root out terror and keep the bad guys at bay.

Quantum of Solace forgets all that, and it’s not all Forster’s fault. Indeed, he’s just guilty of giving the camera a bit of an unnecessary nudge every now and again. There will be those who sing the praises of this 22nd excursion into the life of a masterful MI6 mole, and, the way the narrative is set up, Quantum plays like the middle act of a much larger cinematic statement (it picks up directly after, and incorporates a lot of storyline from, Casino Royale). Making Bond aggressively badass last time around was a necessity for a floundering franchise. Making him into the Terminator in a tux just doesn’t seem right. No wonder it’s getting harder and harder to recognize him.
