Latest Blog Posts

by Rob Horning

25 Aug 2008

In discussions of the current economic woes in the U.S., the dismal savings rate is a topic that frequently comes up, with virtually every commentator agreeing that this must be raised in order to begin to correct the problem. David Leonhardt, in his must-read NYT Magazine article about Obama’s economic ideology this weekend, explains the problem succinctly.

For the first time on record, an economic expansion seems to have ended without family income having risen substantially. Most families are still making less, after accounting for inflation, than they were in 2000. For these workers, roughly the bottom 60 percent of the income ladder, economic growth has become a theoretical concept rather than the wellspring of better medical care, a new car, a nicer house — a better life than their parents had.
Americans have still been buying such things, but they have been doing so with debt. A big chunk of that debt will never be repaid, which is the most basic explanation for the financial crisis. Even after the crisis has passed, the larger problem of income stagnation will remain.

In order to minimize that unpaid chunk of debt—and minimize the damage—increased savings over time must be used to pay that debt down. (At a macro level, too, increased savings will mitigate trade imbalances, stabilize the dollar, and reduce our reliance on foreign banks to fund our borrowing.)

But with the stagnating wages comes a greater reluctance to save and a loss of faith in the traditional “work a lot, save a lot” method to wealth. Instead, people rightly conclude that they will continue to fall behind without supplementing their wages with another form of income. Some try lottery tickets, others apparently try the upscale equivalent—short-term investing. Felix Salmon takes note of a survey about American attitudes to saving.

Here’s a depressing statistic. In a recent Harris survey, 3,866 Americans were asked which things were “extremely important to achieving financial security in your retirement”. 39% said that “investing wisely” was extremely important, while just 34% said that saving money during one’s working years was.
The problem is that while the financial-services industry is very good at marketing and selling investment products, it’s very bad at marketing and selling thrift, and living within one’s means. After all, the only thing which is marketed more aggressively than investments is credit products.

It’s not merely the fault of the marketing departments in the financial services industry, though. It’s the fault of marketing, period. Advertising works hard to undermine the ethic of saving, making the acquisition of more goods seem all-important and making our faltering wages seem even more inadequate. This works hand in hand, then, with ads touting more credit products. There is virtually no commercial incentive to encourage thrift; that’s not where the fat margins are. And prioritizing not having stuff is unlikely to become anything but an alternative lifestyle—a marginal mode of bourgeois rebellion—any time soon. Our sense of self is too bound up with what we possess and display; expecting people to consume and acquire less is almost tantamount to asking them to become less of a person.

That’s why the government must do what it can to encourage thrift—mandating an opt-out standard for participation in 401(k)s, for example. But what mainly needs to change is the sense of unfairness that permeates the economy, something that shows up in the class divide between those who earn income through wages and those who earn it through investments. The consensus seems to be this: Working and saving are noble goals, but consuming is what gets people’s attention and pins down the sorts of things we are committed to (putting our money where our mouth is). And working is sort of for suckers; having your money work for you is where it is at. Wage earners thought buying houses would make their money work for them—the house magically doubled their money as property values irrationally increased (creating knock-on wealth effects and encouraging increased consumption). But now that it has become clear that buying houses is simply more consumption, and not investment, the shortfall in national savings is easily recognized as an abyss.

by Lara Killian

25 Aug 2008


Heading out the door recently for a workout session, I paused to grab some reading material—the July/August 2008 issue of The Atlantic magazine with its intriguing cover headline: “Is Google Making Us Stupid?” Good thing I was stuck in one place while perusing the text, because my ability to focus on a single topic for long has been rather challenged of late.

Nicholas Carr’s article has a subtitle: “What the Internet is Doing to Our Brains.”

A quick Google search turns up all the information a researcher, whether casual or professional, could hope for, without the traditional trawling through archives or leafing through indices. Carr discusses the fact that with the ability to access nonstop information 24/7, habitual Internet users are developing a unique method of dealing with the overflow. Classic journalism theory holds (so I’m told) that a newspaper reporter tends to put the most compelling information in the first paragraph, since few readers will finish a longer article. On the Internet, with constant links leading deeper into the rabbit hole, readers seldom return to a web site to finish an article or blog post once a link takes them away. Even a particularly enticing first paragraph is not enough to focus the jaded surfer’s attention.

With constant headlines flashing past our eyes and distracting advertisements extolling the latest IQ test or makeover strategy, we’re losing the ability to concentrate on reading for more than a few moments before our brains demand a subject change.

Carr writes, “Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory.” He goes on to say that reading literature or even a full-length magazine article is becoming more difficult even for academics who previously devoured works like War and Peace.

Is this really the future of reading? Losing the ability to sit down and read a full chapter of a biography or finally reach the end of that novel? I felt good about managing to reach the end of Carr’s cover story. If you made it to the end of this blog post, there may be hope for us yet.

by Bill Gibron

24 Aug 2008

He seems like a nice enough guy. Last time anyone checked, he wasn’t making massive tabloid headlines with his debauched behavior, nor had he been discovered killing kittens in some crack-soaked back alley. Heck, he even has a hot girlfriend (his drag-and-drop diva, Milla Jovovich) and a baby girl. And yet ask film fans who their least favorite director is - nay, ask them to list the men who’ve made an abomination out of the motion picture medium - and his name instantly comes up. As frequently as Dr. Uwe Boll. With a directness reserved for Ed Wood or Coleman Francis. To listen to the disgruntled talk, he has systematically destroyed potentially effective projects, reducing long-held genre hopes to squirming, squiggling junk.

So what is it about Paul W. S. Anderson that drives the critic to complain - and even worse, why does this friendly-faced UK filmmaker receive so much fanboy wrath? The answer, sadly, remains rather elusive. It can’t be his actual moviemaking acumen. He’s certainly got a handle on the art form’s basics, unlike other hacks who can’t put two scenes together without struggling to make sense of the narrative structure. And as this week’s Death Race proves, he can manufacture fake action with the best of them. Sure, he edits like an insane person and piles on the flash when some focus would truly help. But Paul W. S. Anderson is not a bad director. He’s just had the unfortunate luck of taking on titles that get geek panties in a big fat workmanlike wedge.

He wasn’t always a motion picture pariah. He first came to prominence in his native Britain, where in 1994 his violent thriller Shopping caused quite a stir. Its portrait of disaffected youth, stodgy class conformity, and the purposeful destruction of property gave a smug England some harsh food for thought, and catapulted Anderson into the minor fringes of the mainstream. It also made him fodder for that notorious “next big thing” tag, something many foreign filmmakers get saddled with once Hollywood finally hears about them. As a creative cause célèbre, Anderson was given immediate access to the hottest script in the studio system - the big screen adaptation of the video game smash Mortal Kombat. It would wind up being the first of his many career coffin nails.

Granted, it’s hard to screw up a martial arts movie in which characters compete in a ‘brawl for it all’ tournament to the death, but Kombat apparently gave audiences their first reasons to be concerned about Anderson. It wasn’t a lack of skill - again, he is far more fluid in his filmmaking than any of the moviemaking misfits he’s frequently grouped with. No, where Anderson seems to stumble (both then and now) is in the all-important area of ‘reimagination’. Unlike Christopher Nolan, who tweaks the Batman saga into a psychologically deep crime story, or Sam Raimi, who tries to keep to Spider-Man’s general spirit, you never know what to expect when Anderson is in charge. Sometimes, you get a reverent reinvention of the mythos. At other times, the end results are unrecognizable to even the most ardent aficionado.

In Kombat‘s case, the reinvention process seems to totally forget the reason the movie is being made in the first place. It has to be hard for screenwriters to turn fisticuffs into fleshed out stories, but Anderson’s scribes treat it like brain surgery. Gamers loved Kombat because of its bone crushing battles bathed in buckets of blood. They loved the finishing moves and the easily identifiable characters. Trying to turn this all into some manner of Shaw Brothers knock-off was not the way to go, and yet Anderson and company strove to bring a kind of backstory viability to the concept. While many felt the reformatting failed, the title was still so commercial that even this subpar semblance of the game made money.

As usual, cash creates opportunities, and Anderson was allowed to pick his next effort. He chose the David Webb Peoples script Soldier. Kurt Russell was pegged to star, and pre-production began on the potential sci-fi epic. The pedigree at least seemed secure - Peoples had co-written Blade Runner, received an Oscar nomination for his work on Clint Eastwood’s Unforgiven, and scripted Terry Gilliam’s great 12 Monkeys. Soldier had all the elements of a potential hit - a certified cult star, an intriguing story, and a hot shot helmer behind the lens. Then Russell decided to take some time off, and the entire project was pushed back.

Anderson needed something to help him cope with Soldier‘s work stoppage. He barreled head first into the outer space horror film Event Horizon. The original screenplay by novice Philip Eisner offered an abandoned alien laboratory investigated by a party of Earth astronauts. Anderson preferred a more straightforward scary movie, and discarded the idea. Instead, the new Horizon storyline centered on a missing spacecraft that may or may not have traveled to the bowels of Hell when it disappeared for seven years. Loading the narrative up with sadomasochistic sex and gore-drenched violence, Anderson hoped to redefine both terror and the extraterrestrial. Instead, he was forced to cut nearly 20 minutes of the movie to get an “R” rating from the MPAA.

At this point, Anderson was two for two. Sure, Event Horizon was not a major financial hit, but enough people in the business saw its polish and professionalism to give the director another shot at Soldier. Russell was ready now, and the film premiered to universal yawns in 1998. Many consider it to be the worst film of Anderson’s career, a brain-dead bit of bombast that trades on little of the premise’s promise and ideals. At the time, the filmmaker had hoped to update Roger Corman’s Death Race 2000 for an actual release in the year 2000. Instead, he had to suffer the blowback from creating a big time blockbuster bomb. It would be two more years before Anderson got a chance at another noted title.

The zombie video game Resident Evil had long been considered a cinematic slam dunk. There were even suggestions that the father of the undead film, George Romero, was eager to film an adaptation. But the job went to Anderson instead, and while the devotees dished over the stupidity of the choice, the director delivered. Even though it changed some of the console title basics, Evil was still a moderate hit. It led the way to Anderson’s adaptation of AvP: Alien vs. Predator, another solid success. Again, the faithful fumed over the liberties taken with the material, including elements not found in the comics or companion sources. Yet Anderson argued for his approach, highlighting his reliance on the original films as guidance and inspiration. 

All of which brings us to this week’s box office dud Death Race. With the film coming in third behind Tropic Thunder and The House Bunny, Anderson has clearly lost a lot of his big screen buzz. Of course, no one was really clamoring for a revisit to Corman’s 1975 road kill epic to begin with, but the update is not as bad as the reviews suggest. Instead, it’s just big dumb action with lots of explosions and cars (and body parts) going v-rrrooooom. Indeed, there is nothing here to suggest Anderson is the Antichrist or incapable of delivering decided popcorn perfection. But as with many of his movies, the way he reimagines Death Race - an internet competition inside a maximum security prison run by a ruthless female warden with one eye on the ratings and another on her corporate concerns - fails to fulfill the concept’s kitsch calling.

And there’s another argument that may or may not sway potential detractors. Anderson is one of the few filmmakers who is open and brutally honest about the editorial decisions he is forced to tolerate by mindless studio heads. Ever since Kombat, he has complained about interference, stating that if he could release a “Director’s Cut” of his frequently panned projects, the opinion of his work would change radically. Event Horizon is one of his particular sore spots, the aforementioned missing footage destroyed or lost by parent Paramount. Especially in this era of the digital domain, where DVD can indeed redeem a failed film, Anderson is angry that he hasn’t had a chance to do just that. There are supposedly longer edits out there for every one of his marginalized movies, but due to their lack of success, the rights holders see no reason to rerelease his versions - if they’re even available.

And so Paul W. S. Anderson sits, marginalized by a business he’s frequently benefited. Personally, he says he’s sick of trying to explain the symbolism in Magnolia (clearly being mistaken for Paul THOMAS Anderson), and after changing his name to W.S. he hates explaining anew that he is not responsible for The Life Aquatic or The Darjeeling Limited. His next film is another video game adaptation - the more or less unnecessary Spy Hunter - and one assumes that even now, the arcade crowd is gearing up to undermine his efforts.

Until then, Anderson will continue on as producer, writer (Castlevania), and behind-the-scenes Resident Evil guide (the franchise appears headed for its fourth film). It’s also clear he will remain a ridiculed member of an easily outclassed collective. He’s definitely not the worst director in the history of film. But defending him gets harder and harder - especially in light of his less-than-spectacular past and present preoccupation with B-movie mediocrity. One day he might find a way to prove his detractors wrong. Until then, Paul W. S. Anderson will remain an easy if enigmatic target. Just like his films, figuring out what’s wrong with his reputation is not as simple or straightforward as it sounds.

by Mike Schiller

24 Aug 2008

Know how to tell when the holiday gaming season, that oh-so-wondrous three-ish months that closes out the year, is around the corner?  When the list of games being released gets a lot bigger, but the number of games that you actually want to play stays pretty much the same as it’s been all summer.

This, of course, is the first week in which that particular phenomenon appears to be taking hold.  As such, we are offered such licensed audience-pleasers as Digimon World Championship and Garfield’s Fun Fest, both out for the DS this week.  Specialty racing games are also prime suspects for the pre-holiday rush, and this week we see Ferraris and demolition racers get their own games for multiple systems (the sadly toothless Need for Speed franchise gets a release as well).  And…wow.  Look at the Wii.  The poor system’s got a reputation for shovelware already, and this week is not going to help.  Another Kidz Sports game?  Something called Freddi Fish in the Kelp Seed Mystery?  And then there’s my personal favorite, Spy Fox in Dry Cereal, which sounds like one of my average Saturday mornings in the mid ‘80s.  All that list is missing is Ninjabread Man 2.

Tales of Vesperia, for the Xbox 360

Counteracting this onslaught of things I’m entirely not interested in are two releases that promise to be some of the most engrossing play experiences yet released this year: Tales of Vesperia, for the Xbox 360, and Disgaea 3 for the PS3.  The first is a more traditional RPG experience (though if you’ve played the demo, you’ve already found that the combat is a little bit more hectic than that would imply), while the second is a tactical RPG.  Both are new entries in well-established franchises, both have excellent advance press, and both have the potential to utterly destroy your social life for long periods of time.  That means they’re winners in my book!

Disgaea 3, for the PS3

Also on the docket this week is the release of the new Tiger Woods game, which almost gets the game of the week nod on the strength of its brilliant little trailer alone.  Whatever advertising agency decided to capitalize on last year’s glitch and turn it into this year’s gold deserves a raise.  A big raise.  The ever-reliable Xbox Live Arcade gets Castle Crashers, which looks like another utterly chaotic (not to mention potentially brilliant) effort turned in by the geniuses over at The Behemoth, who have made an art form of gracefully mixing cuteness and violence.  Mario Super Sluggers has a good chance of being exactly the arcade baseball game that Wii owners have been waiting for as well.

And…aw, heck, who am I kidding.  I think I’m going to buy Spy Fox in Dry Cereal just so I can look at that name on my shelf.  Doesn’t it sound like a classic waiting to happen?

Trailers for Vesperia and Disgaea, along with the full release list, are after…the jump.

by Jason Gross

24 Aug 2008

I’ve been brewing on this for a few months, so please excuse the fact that the article references below are a little old. This was inspired by yet another wave of ‘death of journalism’ articles.

It’s not only this study claiming that critics are losing out to social networks and music services but also this survey of UK critics bemoaning their own profession.

Let’s admit it - the reason that you see a lot of these columns is self-interest. The writers left standing in publications want to defend not just their peers but also their profession and their job. The debate, then, is whether this defensiveness is really warranted. One argument against the scribes is that the egalitarian nature of the Net levels the playing field and lets the masses storm the gates of opinion, making it more public again. Then again, just because someone has an opinion doesn’t mean that they can express it well, or as the old saying goes, “Opinions are like assholes - everybody’s got one.”

There IS good reason to worry though: recently (well, relatively recently), the L.A. Times cut more writers loose, including Chuck Philips (who admittedly had some big problems with sources for a recent story).

In my mind, a good music critic can serve two important purposes: 1) helping you find good music and/or 2) helping you think about music and the issues around it. Admittedly, there’s much more call for the former than the latter, and even then, there’s a lot of competition from other sources, mostly online.

And that’s where the big stink happens when professional writers complain about the Net, as for instance in this Guardian article. What they’re worried about is whether blogging will or can (or should) replace print criticism, but maybe this is a false setup. Posting a link to a story or an MP3 file or an embedded music video isn’t the same thing as writing a think piece or a carefully researched article - that doesn’t usually happen in blogs, and maybe it’s expecting too much of them to think that they (always) should. Posting info can be a valuable service which you can learn something from - a good music blog can just as well help you find good music. To say that it’s not ‘journalism’ per se is right, but that doesn’t take away its value as a public service.

In the next installment (hopefully soon), we’ll hash through some fallacies about the ‘anyone can write’ argument…
