Latest Blog Posts

by Jillian Burt

17 Sep 2007

By Aaron McKain.

“Here’s where they stand in Iowa”: Obama: 27, Clinton: 26, Edwards: 26, Richardson: 11, Biden: 2, Kucinich: 2, Dodd: 1, Gravel: “no support registered.”

A poll is a fairly shady way to kick off a presidential debate. But it’s a shadiness that’s become more or less standard operating procedure: we trot the candidates out on stage, they stand with aw-shucks grins, and they continue to stand and grin while the moderator reads 5/8ths of them their political death rites. On this particular morning in Des Moines (August 19, 2007), it’s ABC News’s George Stephanopoulos doing the honors, alerting the audience to the percentage of surveyed Iowans—those first-line gatekeepers of the American presidency—who “if the Democratic Caucus were being held today” would throw their support behind a given candidate X. Twenty-four weeks before the Iowa Caucus, the poll is a strange bit of statistical speculation. Broadcast six minutes before a televised debate, the poll may cross the line between strange and outright bumfight cruel, particularly when it forces a genuinely decent old crank like Mike Gravel to smile-squirm through the announcement of his zero percent approval rating.

But pointing out this strangeness—the polls, the debates, the news-media’s hand in either—is decidedly Stale Old News, not worth a yawn or the time it takes you to scroll down. This is the quixotic dilemma of any attempt to quote-unquote “critically analyze” presidential politics: everyone already knows that the campaign is a glorified horserace, and everyone already knows that this horserace, like anything else worth anything in this world, is a wee bit totally jury-rigged. Moreover, for any bona fide campaign junkie, anyone truly addicted to the jockeying and braying, hinting at the bunkness of our nation’s electoral ritual goes beyond banal and bumps-ass into outright betrayal. It’s like telling your six-year-old nephew that pro-wrestling is fake: it doesn’t make the kid feel any better, it doesn’t make it any easier to body slam 550 lbs of pituitary anathema, and it doesn’t necessarily explain a better way of picking an Intercontinental Heavyweight Champion. This is what my friends who can’t believe I still watch the stuff—politics, not wrestling—don’t get. And it’s why all the recent brouhaha about hypotheticals (you were wondering when I’d get to them, no?) is such a slap in the face to all of us still foolish enough to watch the contest.

For those at least smart enough to tune out the pageant during high summer, here’s the catch-up. Sometime around August, the politicos decided that going after Sen. Obama for his willingness to answer hypothetical “what if?” questions would be a capital-g Good Idea, a strategy smart enough to win over voters, including those crucial Iowa Caucus-goers. Calling bullshit on an opponent’s willingness to engage in a mode of questioning, rather than their actual answers to specific questions, is a stupendously odd (albeit ancient) rhetorical strategy. That Obama’s rivals are even trying it is a testament to the bind they’re in. The “hypothetical” stances the Senator from Illinois has carved out—he would meet with the Axis of Evil, he wouldn’t nuke Iran, he would pursue al-Qaeda in Pakistan with or without Gen. Musharraf’s support—are all popular positions for the mainstream left, and if your opponent is able to articulate popular positions whenever they open their mouth, then that’s a mouth you need to find a way to shut. Attacking hypotheticals could, hypothetically, accomplish this.

Unfortunately for Obama’s challengers, convincing voters that hypotheticals are dangerous (and thus off-limits like chokeholds and piledrivers) is a hard argument to sell in a soundbite. On the stage in Des Moines, the candidates’ delivery of the anti-hypothetical argument is not only miserably club-footed (Clinton: “we shouldn’t use hypotheticals, words do matter”; Edwards: “as a president, I wouldn’t talk about hypotheticals”; Richardson: “this talk about hypotheticals is what’s gotten us into trouble”) but it also foregrounds why the critique is counter-intuitive in the first place. As John Dickerson has pointed out in Slate, presidential campaigns are composed of a series of hypotheticals (in the Iowa debate, what-ifs about troop pullouts, sovereign partitions, oil revenue, health care, and whether Clinton has a snowball’s chance of winning) all in the service of an over-arching hypothetical (“if I am elected…”) that provides voters precisely the information they most desire (i.e., what is this ego-freak going to do once we make them the Most Powerful Person on Earth?). Hypotheticals are also on-the-spot gut checks, glimpses into a would-be leader’s instincts and reasoning and gauges of whether they share our common sense of the world. To voters, hypotheticals will always feel more People’s Elbow than low blow, and it would take a lot—and a lot of specific lots—to persuade them that this populist crowd favorite needs to be kept out of the ring.

Which is not to say that hypotheticals aren’t potentially dangerous. Obama’s comments re: Iran and Pakistan referenced ongoing, nuanced (well, ongoing at least), undeniably non-hypothetical diplomatic efforts. Richardson and Clinton are half-right when they warn that in these circumstances “[w]ords do matter” because words can matter, particularly when they are that explosively magical combination of the right words and the right person saying them, an equation which becomes highly probable when the person saying those words has a fair shot at being The Next President of the United States of America. Candidate Reagan learned this lesson on the campaign trail in ’80 when he slipped up and announced his support for Taiwan while his vice-presidential candidate George H. W. Bush was in China pimping for the opposite policy. (In political science circles, this is what is known as being taken for a ride on “Space Mountain.”) The Chinese, like Musharraf, were beyond displeased. But this is the cost of doing campaign business in the democratic open-air. To refuse to ever pay this price, to deny hypotheticals across the board—or even hint at such a thing in a desperate fit of political stratagem—is to turn all campaign discourse into vanilla stump speech mush.

Of course, the other entity with words and the juicy gravitas to make them matter is ABC News. And the sick spit in your eye irony—conveniently lost on everyone beating this hypothetical story into the ground—is that if candidate hypotheticals are dangerous, news/pundit hypotheticals are doubly so. The ever-present horserace question, “If the election were today, would you vote for this candidate?” is the ultimate campaign hypothetical, and this speculative staple of our political diet finds its numerical, quasi-scientific legitimation in The Poll. The poll has pull, the sort of news-muscle that determines who gets the chance to compete to be elected and thus what actually happens to Musharraf or Tehran. The poll is what gives the voters, to quote Rep. Kucinich’s sound-with-bite, a “conditioned choice,” letting ABC tell those all-important Iowans that an eight person scramble with six months to go is really just a three-way race. (And it’s what justifies ABC’s use of debate questions that ensure the race stays that way; e.g., asking Obama to respond to Richardson responding to Biden responding to Dodd responding to Clinton responding to Biden’s statement that Obama isn’t experienced enough to be president.) The survey hypothetical is what has preserved Hillary Clinton’s frontrunner status for nearly two years, which is four times longer than JFK’s entire primary campaign. And it’s what reduces Dodd, Kucinich, Richardson, Biden, and Gravel to jobbers, punching bags thrown in the ring only to make the company favorites look good.

All of which is still Stale Old News, a Yawnfest ostensibly beneath comment or contempt. But candidates working the refs by crying hypothetical to the ABC oddsmakers makes this Old News a little more than this political junkie can bear; it dances too close to the flame and pretends to not feel the heat. It’s Ric Flair, all 243 shambling, greased-up pounds of him, stopping the match and turning to the cameras to say “hey, I think that punch was fake.” We already know we’re suckers for watching; don’t rub our faces in it.

by Jason Gross

17 Sep 2007

We’re so used to bands doing ads, having their songs used as product placement, or having corporations sponsor tours that any news of this sort doesn’t raise eyebrows now. You just hope that your favorite band doesn’t get associated with anything too embarrassing. And then there’s the case of Band of Horses, the backlash, and their response. So is BOH right to say that they’re just trying to earn a living and keep playing music, or are there some lines that shouldn’t be crossed in taking ad money?

by Bill Gibron

16 Sep 2007

They were supposed to be the saving grace of cinema, the cyberspace tastemakers that provided insight into what would be a hit come theatrical release date. Via their focused devotion and frothing fanbase obsessions, they would function as a broad-based barometer, a way to decipher how like-minded movie maniacs would respond. Yet ever since Snakes on a Plane significantly underperformed, and Grindhouse ground to a halt, the geek has been getting its commercial clairvoyance kicked. Over the last few months alone, the potential prognostication of these messageboard/MySpace mavericks, luminaries supposedly in tune with the times, has proved to be downright deadly. And in its wake, a selection of stellar and slightly less significant films have been left to flounder.

Of course, a caveat has to be provided before plowing forward. Just because the knowledgeable nerd loves a possible project with all his mint condition action figure might doesn’t mean the movie will actually be good. With large exceptions – 300, for example – the quality of the film actually figures into the failure. In addition, any kind of cult, by its very nature, is limited in scope and design. Unless you can manage a Unification Church level of brainscrubbing, the choir will always be preaching to a smaller and smaller subsect of the converted. And yet Hollywood still rests a lot of its hope on feeding the so-called insider sites with as much pre-production pimping as possible. Rarely does it come back to bite them in the butt (the recent dork nation rejection of Rob Zombie’s Halloween a clear anomaly).

Take Shoot ‘Em Up!, for example. Released at the start of Fall’s frequently confusing motion picture season, it had the earnest earmarks of a surprise post-Summer sleeper. There was non-stop action, loads of gratuitous violence, a scantily clad Monica Bellucci, and several deadly carrots. The characters were cardboard cut-outs of carbon copies accentuated with just enough quirk and smirk to make them viable, and director Michael Davis didn’t just bury his tongue in his cheek – he cut the damn thing off and crammed it into your craw. Yet after one week in theaters, and a less than impressive $6 million take at the turnstiles, the movie is headed for a quick turnaround onto the DVD format. Receipts are down almost 60% in the second week, and the lack of “legs” indicates an audience that’s already climaxed on this kooky crime caper.

So what went wrong? Why is Shoot ‘Em Up! failing to make a major marketplace dent? There are two answers, really. One is a throwback to the days of the VCR. There is still a significant number of mainstream viewers who will see a title or trailer like this, run the entertainment possibilities through their own aesthetic processor, and determine that a trip to Blockbuster (or a pre-release placement on a Netflix queue) would be preferable to battling crowds and disruptive theaters in exchange for their discretionary income. This “I’ll wait for the (digital/analog) release” attitude has plagued the industry, and the occasional unusual movie, ever since Beta battled VHS for format supremacy.

The other factor is far more fascinating. Call it the “basement” syndrome, or the “Me, Myself, and I” ideal. In general, a geek is a geek because of their solo fixation on something. They love it because of how it speaks to them, not how it resonates with the masses. Indeed, it could be argued that popularity completely undermines the feeb. Once it’s a part of pop culture, it’s hard to feel it belongs only to you. So as long as the material is still unreleased, scrutinized and scanned only as part of a personal dynamic, there’s a façade of potential success. All the advance buzz and preview hype does help. But once the movie makes it into the marketplace of ideas, it begins to lose its exclusivity. And with rare exceptions, this means the fanatical will have their moment – and then move on.

Of course, there are those times when Tinsel Town tries the opposite approach. Take the case of Neil Gaiman. Somehow, overnight, he went from well-loved literary figure with a few notable adaptations under his belt (MirrorMask, Neverwhere) and an equally devoted following to the latest player in the post-LOTR fantasy adventure face-off. Without the prerequisite preparation for a ‘next big thing’ crowning, a version of his Princess Bride-like fairytale farce, Stardust, attempted to become a major popcorn movie moment. For months prior to its August release, it was touted on numerous websites as the second coming of sophisticated adult fairy tale-ing. But after a month in theaters, the film has barely grossed $36 million, a far cry from its $65 million budget.

It’s clear that the studio suits underestimated this British writer’s popularity. But it didn’t help matters much that Matthew Vaughn’s take on the material was all mannerism and no magic. People don’t usually go to a sword and sorcery epic to see aging actors swishing around (Robert De Niro played a closeted gay sky pirate) or noted beauties rendered butt ugly (though Michelle Pfeiffer was actually very good as a crabby, craggy witch). No, they want the visual fireworks, the ephemeral eye candy that comes with the genre – and if not that, some very solid satire. Stardust had neither. Instead, Gaiman was garroted, his own unique vision undermined by a movie that skimped on both spectacle and wit.

Even independents found themselves struggling under the lack of clear geek support. Prior to its coming to our shores, the New Zealand comedy Eagle vs. Shark was being pushed as a Napoleon Dynamite for the Kiwi cult. It even starred the up-and-coming actor from the acclaimed HBO series Flight of the Conchords (Jemaine Clement). Unfortunately, the movie itself was a bafflingly disorganized dramedy that took a decidedly hard-line look at what were, in essence, massively marginalized human beings. Where Nappy co-writer/director Jared Hess felt a kinship with the crackpots he put on screen, Eagle creator Taika Waititi just wanted to mock his morons. Even with the evocative setting, the storyline seemed harsh and the characters more confrontational than charming.

About the only films in the last nine months that followed through on their omnipresent online anticipation came from one enlightened individual. While his name was already known to many in the motion picture bazaar thanks to certified 2006 hits Talladega Nights: The Ballad of Ricky Bobby and The 40 Year Old Virgin, Judd Apatow stormed the cinematic stocks in 2007 and took over the reins as comedy’s creative king. His Knocked Up was one of the Summer’s certified gems, and his production credit on the equally engaging Superbad gave the smallish coming-of-age farce a much needed shot of significance. And it worked. Both films remain fan favorites from the otherwise unimpressive sunshine season, and stand as examples of how nerd acknowledgment can lead to legitimate commercial claims.

But these are the rarities, the situations where artistic integrity (read: good filmmaking) meshed with Internet attention to create a cult of profitability. They’re not really indicative of the dolt demographic’s perceived power, though. Indeed, both Superbad and Knocked Up got as much conventional support as they earned from the online community. No, in most cases, the fanatical come up rather short in their power to both guide and deride the similarly minded. In fact, they are as powerless at stopping a film as they are at guaranteeing its success.

As mentioned before, Rob Zombie’s recent Halloween remake stands as a great example of their overall ineffectual stance. For months, Ain’t It Cool News was gunning for this “unnecessary” horror update. It published pundit piece after pundit piece criticizing the script (even before the film went into production), arguing over Zombie’s approach, and picking apart the casting. As time passed and the mandatory screening reviews started to appear, it was clear that Harry Knowles and his artificial (and actual) industry insiders were of one mind. Because of their longstanding professional relationship with John Carpenter, they were desperate to undermine anything that challenged his legacy.

Now, this is not just conspiracy theorizing. While no one from the site has actually come out and stated such an intent, it’s pretty easy to infer, given the obtainable facts. Drew McWeeny, otherwise known to AICN readers as “Moriarty”, has worked very closely with Carpenter in the past. He scripted the macabre icon’s Masters of Horror segments “Cigarette Burns” and “Pro-Life” and is noted for his connection to the famed filmmaker. It’s no surprise then that McWeeny took Zombie to task in a 31 August review of Halloween that, in brief, referred to the film as “creatively bankrupt from the start”, and incessantly trashed it for nearly 3000 words. Now, there is no denying the man’s entitlement to his opinion. It’s the cornerstone of criticism. But the lack of openness (Carpenter’s name is mentioned, but never the duo’s business relationship) taints any take.

The funny thing is – it really didn’t work. While far from a blockbuster and more or less destroyed by the rest of the fractured Fourth Estate, Halloween did go on to score almost $52 million at the box office, guaranteeing Zombie another stint behind the camera. In fact, regular moviegoing audiences have been much more receptive to the film than the so-called clued in, and with its microscopic production costs (approximately $15 to $20 million, by some estimates), it will surely be labeled a decent sized hit. So what does this say about the geek contingent? Are they really a powerful predictor of success? Or are they nothing more than untried tea leaves for a desperate studio system?

The answer is clearly neither. While there is nothing new about gauging fan interest to divine a product’s potential success, Hollywood has forgotten something significant about the online community. Like talk radio and any other forum for public interaction, the squeaky wheels who choose to participate are not representative of the entire population. For every lover/hater of a movie/director/actor, there’s a Nixon-esque silent majority sitting back, making up its own mind. They will ignore the love of a specific author or genre type and simply pay for what interests them. In fact, the louder the screams from the self-appointed about the importance of a project, the more likely the hype will fall on indifferent or just plain deaf ears.

Certainly, the geek will have its failures. All gamblers do. And it is sad when such a flop is foisted upon an undeserving entity (Grindhouse was great, as was Shoot ‘Em Up!). But perhaps it’s time to stop using the overtly zealous as a benchmark for bankability. It’s clear that any position they take – pro or con – still renders a title a veritable unknown quantity. Like the buzz building around a student union, or a high school cafeteria, the new ‘Net water cooler is just one factor in a film’s overall potential success. The rest of the elements tend to render the nerd a minor mirror at best. Hopefully Hollywood will remember that come creativity/concept time. It’s one thing to play to the prone. Relying on them is just a fool’s paradise.

by Rob Horning

16 Sep 2007

Harper’s has an excerpt from Naomi Klein’s new book, which I mentioned in a post a few days ago. I was a little skeptical about it having read only summaries, but this excerpt helps bring the argument into better focus: Klein uses the Green Zone in Iraq as an example of what the U.S. could become if the mania for privatizing all government services were allowed to continue unabated. In walled, heavily guarded communities, the privileged would continue their lives of comfort while outside the walls would be a reversion to a primitive struggle for basic resources, for the chance at survival itself. The idea of a functioning state that unifies the different classes and factions in a society with equal rights and a basic level of public services is rejected; in Iraq, it appears absolutely impossible. This appears to have been by design; in the immediate aftermath of the invasion, as has often been noted, the Bush administration sent a bunch of flunkies, picked for their ideological commitment, to administer an experiment in free-market anti-governance, privatizing services and dismantling existing state institutions, and helping bring forth the conditions that have produced the current quagmire. Klein’s point in the article is that the same free marketeers would like to introduce the program in America, using privatization as a rallying cry in times of crisis to make the well-off think to themselves, Why should I share in the general suffering? Shouldn’t this society be set up so that my money will protect me from all misery and discomfort? Otherwise what good is my money? And if those poor people feel “left behind,” then, well, they should have thought of that when they decided to be born poor and then not work extra hard to overcome that disadvantage to get enough money to safeguard themselves.

This is a fundamental divide between American conservatives and liberals: conservatives want to conserve privilege and prevent the state from alleviating inequalities (otherwise we’ll have “moral hazard” and “perverse incentives” and the poor won’t bother to work very hard to change their condition); liberals believe that civil unrest is mitigated and the better side of human nature expressed by using the state to help provide equal opportunity and a basic level of security to all citizens. That means making a certain amount of sacrifice for an intangible benefit, and it means irrational decision making at the margins, but that’s because the return on the investment is not readily measurable by the economic tools the champions of market forces tend to prefer. One can imagine the same ideological divide applying to the health care crisis America faces regarding supplying a basic level of care for all: Prices are going up and insurance companies won’t cover those likely to be sick (or unemployed, a rough proxy for health), and this amounts to a de facto form of rationing. Conservatives play to the upper-middle-class’s fears and selfishness: You know there will be rationing, they intimate, so why not rig the system so that your wealth guarantees you preferential treatment? And if those without suffer, well, at least it’s not your family. Why have a safety net if that means you, in your privilege, are a hair less safe? Of course the liberal answer to that regards the lack of huge disparities among society as beneficial in itself; it means less envy, less misery, less unrest, and ultimately more liberty for more people—satisfied people will generally leave each other alone and live and let live.

by Matt Mazur

16 Sep 2007


Amidst the sea of flickering Blackberries being lovingly fondled by the throng of jaded industry professionals, one thing stood out for me at this year’s Toronto International Film Festival: the films seemed to be dominated by strong women, particularly actresses of all shapes, sizes, and ages. After being subjected to a long, hot summer filled with the smell of testosterone in the theaters, the ladies are back with a vengeance. And they are ready not only for their close-ups, but also for their accolades.

There are always cries about how women are getting the shaft in film. There’s not a year that passes without some wag insisting that it is “a weak year for actresses”. While this might have an unfortunate grain of truth in most typical years, 2007 is shaping up to be unusually warm to the idea of women as equal partners in terms of cinematic importance. The playing field this year may mercifully be leveled, thanks in part to the tremendous achievements of a handful of women who brought their offerings to festival crowds this year.

Most of the buzz this year will revolve around the dozen or so expert performances that had their North American premiere at the TIFF. Most major films had at least one outstanding role for an actress somewhere (or, as was the case with Joe Wright’s Atonement, at least four), and many of these performances will be competing for spots in the female acting races at the Oscars early next year.

The Brave One

Getting a jump on the competition, Neil Jordan’s polarizing The Brave One, starring the excellent Jodie Foster, showed on day one, proving to be more than just another standard big-budget Foster extravaganza. A tale of revenge and love that owes a debt of gratitude to modern Asian-language cinema as much as it does to the classic Western, The Brave One has been criticized by many as being “over-the-top” and “unbelievable”.

Even though critics have almost unanimously cited Foster’s performance (more natural than anything the actress has done in recent memory) as one of her best, and many, like me, are calling for a deserved Oscar nomination, the film itself has received a more lukewarm reception than it did from the festival crowds I saw it with; in Toronto, there was nothing but surprised enthusiasm over this one.

Lust, Caution

Ang Lee’s beautifully made sex thriller Lust, Caution, adapted from an Eileen Chang novella, didn’t quite live up to expectations, despite being technically very solid. Almost every person I spoke with regarding this film found it disappointing as a whole, but there was universal praise for the debut leading performance of Tang Wei. The actress had a vivid character to play: a naïve young actress who becomes a political radical and ends up using her sexuality to exert control over a government official. The demanding role required Tang to simulate various intimate sex acts (that come across as looking quite real), as well as hit dramatic highs and lows. Thanks to Lee’s masterful knack for casting, the newcomer pulled it off beautifully, dignity intact.

Noah Baumbach, of The Squid and the Whale fame, offered up one of the strongest displays of female acting at the festival with his newest, Margot at the Wedding, giving his wife Jennifer Jason Leigh and Nicole Kidman their best roles in years as sniping sisters who are inexplicably connected despite years of emotional terrorism towards each other.

Margot at the Wedding

Unremittingly dark and unapologetically unafraid to show its main characters as unsympathetically damaged and flawed, Margot (which has more than a few Ingmar Bergman overtones) is a two-woman showcase for Kidman and Leigh to flex their acting muscles as two very different, yet fundamentally linked sisters who share a turbulent history with one another. Leigh, who is always a pleasure to watch, should (in a just world) be up for the Oscar that has eluded her for more than fifteen years. Her Pauline is one of the actress’s finest creations: earthy, natural, and soft, a welcome change for the risky performer known for her portrayals of intense, damaged women. The range and maturity that Leigh conveys is astounding.

Kidman, who can be hit or miss, is on fire as Margot. Not since her role in 2001’s The Others has the actress found such a perfect character with which to harness her natural iciness and neuroses. Margot is a tangle of nerve endings about to explode. She is brainy and lonely, and what this boils down to is a veritable field day for any actress. Kidman realizes the opportunity and plays the part beautifully. This is a character who would have been right at home in a 1970s film by John Cassavetes or Woody Allen, and Margot is the perfect marriage of actresses, director, and script.

In Bloom

In Bloom, director Vadim Perelman’s follow-up to 2003’s House of Sand and Fog, can be seen as a success in that it highlights three strong, unique female performances: Uma Thurman, Evan Rachel Wood, and Susan Sarandon’s daughter Eva Amurri play three women coping with the effects of a high school shooting. Each brings something unusual and strong to the bleak, sometimes off-kilter film. Perelman, as he did with his first feature, shows a clear affinity for working with capable actresses.

While Anton Corbijn’s Control may have been about the boys club of Joy Division, it was co-star Samantha Morton who quietly stole the show as Ian Curtis’ young wife Debbie. In a film where the boys all got to go out and play rock and roll, sleep with all of the groupies, and get all of the glory, it was Debbie’s story that kept the biopic rooted firmly in reality. Morton, in yet another fully realized portrayal, never lets Debbie slink into the trap of being just another “wife role” – something that Terry George could have taken pointers on when making Reservation Road, a film that sadly relegates Oscar winners Jennifer Connelly and Mira Sorvino to the supportive sidelines in routine “spouse” roles.

The same, unfortunately, is true for Reese Witherspoon (who won an Oscar for playing “the wife” role in Walk the Line) in Gavin Hood’s Rendition. The actress has very little to do as the put-upon wife of an Egyptian national who is mistakenly labeled a terrorist, other than play a second-rate, shrieking Nancy Drew alongside Peter Sarsgaard. Not even the presence of Meryl Streep (venturing awfully close to self-parody in her essentially stock role) can save this sentimental, clichéd disappointment. What wants to be an edgy, timely examination of Middle East policies and modern warfare instead devolves into an overly liberal stinker.

For real political edge, the film to turn to at the TIFF this year was an animated one: artist Marjane Satrapi (along with co-director Vincent Paronnaud) adapted her own autobiographical graphic novel Persepolis to resounding success. Spanning decades, beginning in Iran as the Shah is overthrown and the Islamic fundamentalists seize control of the government, Satrapi examines what war means for a young, outspoken woman in a country where men dominate almost everything and women are second-class citizens.


The second half of Persepolis finds Marjane sent away to Europe by her politically-active parents and taking a pointed look at racism towards people of Middle Eastern descent. There are a lot of bold ideas happening in the film, which is peppered with a droll sense of humor and an assured artist’s touch. Every element that was essential to the success of the books has been gloriously transferred to the big-screen version intact, and while this isn’t a frame-for-frame recreation of the novels, Persepolis never suffers from refusing to be slavishly devoted to its source material.

While the clear presence of women could definitely be seen in the acting achievements, there was also a major feminine impact in the directing stakes: Satrapi, Julie Taymor, Tamara Jenkins, Robin Swicord, Alison Eastwood, and Helen Hunt all debuted films at the TIFF this year, to varying degrees of success. But the major thing to remember here is that when you stroll into a local multiplex and choose a film, it is highly unlikely that a major studio film is going to be directed by a woman. So to see six ladies, all confidently in control of their visions, get a chance to show six very different films at a major festival like this offers a glimmer of hope for the directorial future of women, even if some of the films ended up as grand misfires.

Across the Universe

Taymor’s film, Across the Universe, provoked another love-it-or-hate-it reaction from most festival-goers. The visionary director (whose Titus and Frida were both visually stunning) earned near-unanimous praise for the film’s visual uniqueness. The music (culled from the back catalogue of The Beatles) was the real star of the show, as most fans would point out; but the film’s script received a lot of criticism for its mediocre quality and laughable dialogue.

Across the Universe garnered some attention earlier this year when the film was taken out of Taymor’s hands (by studio executives) and handed over to another editor to whittle down the three-plus-hour running time. While the director and the studio eventually found a happy medium as far as length goes, the fact that the film was taken away from the artist points to a glaring double standard in how a male director’s film might have been received: with Taymor, her film was taken away because of a perceived incompetence. Had this been a male director’s film, he would have been called an auteur.

The director will have another battle on her hands when the film is widely released: will the public pay to see what is essentially a two-and-a-half-hour, grand-scale music video for The Beatles? Is there still a viable audience for this music that will come out to support it?

The Savages

Jenkins fared much better with her biting, effective The Savages, her first feature since 1998’s The Slums of Beverly Hills. Tackling sibling rivalry, the state of elder care in the US, and familial bonds during times of crisis, Jenkins scales back all of the obvious emotions tied to these often taboo subjects and strips everything down to its bare bones, creating an indelible, funny, and often touching film about the titular family.

Philip Seymour Hoffman and Laura Linney, as the brother and sister who must come together and stop being self-involved when their ailing father (Philip Bosco) becomes their dependent, give career-best performances in The Savages, thanks mainly to Jenkins’ impeccable script, which gives the actors a chance to cover all of the bases.

Swicord is known mostly for being a screenwriter (she famously adapted Memoirs of a Geisha and Little Women), which is why the mild The Jane Austen Book Club, her feature directorial debut, comes off as a bit disappointing.

Despite having a solid cast of women (including Amy Brenneman, Maria Bello, Kathy Baker, and the great Emily Blunt), the film is so conventional and poorly edited that even the biggest supporters of the “chick flick” will likely be unsatisfied with this lumbering adaptation.

Then She Found Me

Hunt fares much better in the directorial-debut and novel-adaptation stakes, mainly because of her familiarity with the genre: the romantic comedy. Then She Found Me is a light, confident directorial debut that shows Hunt at the top of her genre game: the actress not only directs herself with a strong touch, but also gives beloved veteran Bette Midler a chance to prove herself as a character actress after being sadly put out to pasture as a performer for the last few years.

Hunt’s graciousness in turning each scene Midler is in over to the respected, gifted star is a very smart (and bold) move for both women. The idea of a female director (who is also the star of the picture) supporting another woman of another generation so generously is one that needs to be explored more in feature filmmaking, and Hunt makes it look effortless and fun.

Clint’s daughter, Alison Eastwood, gave it a game try with her directorial debut Rails & Ties, but the formulaic, unbelievable plot and plodding television-movie editing kill the film’s emotional pull, despite a very nice performance by Marcia Gay Harden and a less successful one by Kevin Bacon, as a husband and wife who illegally take in an orphan after a train accident.

Films made by male artists, Julian Schnabel’s sumptuous The Diving Bell and the Butterfly and Canadian director David Cronenberg’s expert Eastern Promises, focused more on male lead characters, but still offered up strong female characters with balance and poise: Promises boasted yet another canny, capable performance by Naomi Watts (who has been on a hot streak for a few years now), while Diving Bell featured four strong supporting roles in a film about a male author: Emmanuelle Seigner, Marina Hands, Anne Consigny, and the amazingly talented Marie-Josée Croze all took advantage of their relatively small parts and made each woman stand out.

I’m Not There

Oddly enough, the festival’s most talked-about female contribution came from a woman playing a man: Cate Blanchett as “Jude”, a distaff version of Bob Dylan in his electric, drug-addled era, had everyone frothing at the mouth. Blanchett, who showed amazing range this year playing two legends (Dylan and Queen Elizabeth I in Elizabeth: The Golden Age, which everyone expected to be her runaway success), soared to new artistic, surreal heights as Dylan, out-performing an entire cast that included Christian Bale, Heath Ledger, Julianne Moore, Michelle Williams, Charlotte Gainsbourg, and Richard Gere.

“The image of Dylan is so well-known and so woven into our cultural fabric now that I felt the sheer shock of it that people must have experienced at that time is gone,” said Haynes. “I wanted to find a way to re-infuse it with true strangeness – the eeriness and sexual uncertainty and diffusion. And that’s why I wanted to have a woman play the part. And it took Cate Blanchett to transform that tall order into something more than a cinematic stunt.”

While the casting of the triumphantly weird I’m Not There could be misconstrued as “stunt-y”, director Todd Haynes has guided one of our generation’s most capable actresses to perhaps her most daring, experimental performance to date. In a career that already includes playing Katharine Hepburn (in Martin Scorsese’s Oscar winner The Aviator), Queen Elizabeth (twice!), Nora and Hedda (onstage in Henrik Ibsen’s A Doll’s House and Hedda Gabler), and a key part in the Lord of the Rings trilogy, Blanchett’s work in Haynes’ visionary re-telling of Dylan’s story just might be her riskiest maneuver yet, albeit one that pays off handsomely.

It’s refreshing and satisfying to see, for once, a woman land one of the year’s most interesting and talked-about parts, a role that theoretically (on the page) should have been played by a man. It is the kind of female contribution to the movies that makes the possibilities for actresses seem limitless.
