Monday, Oct 23, 2006

Economic determinism at work: My disinclination to start a family has always seemed to me a deeply personal decision based on my gut reactions to things like baby talk and steaming piles of human feces. But it may be that my reactions are colored by economic realities: in posts on the Washington Monthly’s blog, Jacob Hacker points to the costs of raising children and to the fact that households of single people now outnumber those of married couples.


The family used to be a refuge from risk. Today, it is the epicenter of risk. And, increasingly, families are a source of risk as well. Because it takes more work and more income to maintain a middle-class standard of living, financial shocks are more threatening for families. What happens when a woman leaves the workforce to have children? What happens when a child is chronically ill? What happens when a spouse loses his or her job? And what happens when families fall apart?
We are not used to thinking of children as an economic liability, but the facts are clear. According to 2005 calculations by the U.S. Department of Agriculture, raising a child to age 18 will cost almost $237,000 for a middle-income family. And that leaves out the upward-spiraling college tuition that is now a required ticket for admission into the middle class. Fully one-quarter of “poverty spells” — periods in which family income drops below the federal poverty line — begin with the birth of a child.
Spouses, of course, do not equally share the investment of time and money that raising children entails. Women still mostly care for the child — and bear the greatest cost. Their careers are most likely to be disrupted by family events. If they work, their jobs are most likely to be low-paying and with poor benefits. And they are most disadvantaged when families fall apart.


In short, raising a kid is expensive and extremely risky, and could leave you spending many sleepless nights wondering how to pay the bills and avoid the dire poverty you’ve set a course for. Perhaps if we had a society that truly valued family, we’d have governmental protections like those in France, which also enjoys a rising birth rate and freedom from hyperbolic worries about demographic disaster.


Of course we are encouraged to embrace the ideology that families are made by love, not money (which even a cursory knowledge of the history of the family disproves), and that it’s callous to think of the expense of a child, to put a price tag on the miracle of life. People who prefer to treat themselves—with free time, hobbies, solid nights of sleep, financial security, the luxuries of dining out or taking in cultural events without the stress of penny-pinching or baby-sitter hunting—instead of making sacrifices for children are regarded suspiciously as selfish monsters (probably liberals and hence hedonistic sybarites) whose lives will ultimately lack meaning. Women are suspected of ignoring their “biological clock” and suppressing their natural maternal instinct, though, as Brad Plumer explores here, the maternal instinct is something of a recent invention: “In her new book, Laura Kipnis points to a related concept—the idea that mothers and infants ‘bond’ at birth. According to Diane Eyer, the whole notion is actually something of a scientific fiction, first pushed in the 1970s, when a large number of women were entering the labor market for the first time and messing up cultural conceptions of a woman’s place in society. Many people still believe that mother-infant ‘bonding’ exists—articles are still cropping up in the news about how painkillers and the like affect the process—but to a large extent it appears to be invented.” Plumer cites this passage from Kipnis’s book, The Female Thing:


With the industrial revolution, children’s economic value declined: they weren’t necessary additions to the household labor force, and once children started costing more to raise than they contributed economically to the household, there had to be some justification for having them. Ironically, it was only when children lost economic worth that they became the priceless little treasures we know them as today. On the emotional side, it also took a decline in infant-mortality rates for parents to start treating their offspring with much affection—when infant deaths were high (in England prior to 1800 they ran between 15 and 30 percent for a child’s first year), maternal attachment ran low. With smaller family size… the emotional value of each child increased; so did sentimentality about children and the deeply felt emotional need to acquire them.


The more expensive children become, the more acute becomes our need to feel that they are worth it emotionally. The amount we love children, then, may be correlated to how much hardship they give us, how gratuitous they are to our material well-being. To use a totally trivializing comparison, this is like when I blew my allowance money on a copy of the truly awful Frankie Goes to Hollywood double album Welcome to the Pleasure Dome because I swallowed the hype about it. (I learned some hard lessons about the British music press and the costs of juvenile Anglophilia, but that’s another story.) I had to pretend that I liked it, harder and longer, because I spent so much ($14 in mid-1980s teenage dollars) on it.


But if Hacker’s right, the ideology of treasuring children and yearning for them only goes so far; for bourgeois folk like me, it may stop when children are no longer merely a personal inconvenience but a threat to our accustomed station, when having a kid means surrendering our class prerogatives. In response to the threat children pose, we may begin to exaggerate the other nuisances they present to solidify our resolve.


Sunday, Oct 22, 2006


As part of a month-long celebration of all things scary, SE&L will use its regular Monday/Thursday commentary pieces as a platform to discuss a few of horror’s most influential and important filmmakers. This time around: how Tobe Hooper, one of post-modern horror’s most promising filmmakers, became a monster movie pariah.


How did it happen? Where did he go wrong? In a perfect world, Tobe Hooper wouldn’t be a fright film pariah. He’d be considering his next creative decision, mulling over dozens of derivative Hollywood scripts in a coy cat-and-mouse game that he, naturally, would end up winning. He would have taken the success of his amazing 1974 classic The Texas Chain Saw Massacre and parlayed it into a non-stop stream of genre-defining and redefining efforts. There’d be no question about who directed Poltergeist (screw a certain Steven S.), and past films like The Funhouse and Eaten Alive would be seen as minor missteps instead of the last likeable efforts from one of the medium’s most misbegotten masters. Sadly, this is not a perfect world, and as anyone who’s tried to sit through many of Hooper’s more recent efforts can attest, he is definitely not a perfect filmmaker.


So how did it happen, actually? Where indeed did Tobe Hooper go wrong? There are some rather ardent supporters who still believe in his ability to scare people, holding out hope that he’ll eventually right his derailed directorial canon. They will overlook outright junk like Spontaneous Combustion, Night Terrors, Crocodile, The Mangler, and his most recent reject, Mortuary, and claim that, prior to becoming a Hollywood hack for hire, Hooper was a vital filmmaker. They may have a point. The films he made in the 12 years between the two signature Saw films argue for an artist still trying to stay viable in a filmic category that was slowly swallowing its own soul. As the Devil gave way to the slasher, Hooper helmed unique and uncompromising movies that said more about who he was as an idealistic individual than about the current state of the macabre.


No one could have predicted that a little slapdash exploitation film, made to grind some bucks out of the still viable drive-in demographic and based loosely on the mythos of Wisconsin’s notorious Ed Gein, would end up being one of terror’s tentpole experiences. Through a combination of inspiration, invention, and outright karmic happenstance, what could have been a minor monster movie became an unsettling work of art. Take away all of The Texas Chain Saw Massacre’s violence and brutality and the final shot – Leatherface dancing in the rising sun of a new day – remains one of the most compelling images ever captured on celluloid. It made Hooper an instant icon and secured his place as one of the pioneers of terror. It also opened doors for the former college professor and documentary cameraman that perhaps he shouldn’t have passed through.


There were also signs early on that all was not well in Hooperville. Right after his killer alligator epic Eaten Alive, the filmmaker was hired to helm The Dark, an oddball extraterrestrial invasion film that looked and felt like an attempt to jump on the about-to-be-hot Alien bandwagon. At some point in the production, Hooper went head to head with the producers and was fired. John Cardos was brought in to finish the project. It wouldn’t be the last time that Hooper was removed from a movie. Aside from the rumors surrounding Poltergeist, he quit the British snake thriller Venom, citing “creative differences” with the main moneymen. Among the many reasons a filmmaker can fall in the tripwire town of Tinsel, failing at the box office is creative crime number one. But standing right beside said fiscal flopping is the “difficult reputation”. Whether or not his reasons for quitting were valid, Hooper had been labeled. And after his next three films, he’d more or less cemented his professional unacceptability.


After that notorious suburban spook show hit, Hooper was handed a number of possible projects. Unfortunately, he fell in with the infamous meddlers Yoram Globus and Menahem Golan of Cannon Films. While they promised financial support, they delivered no guarantees when it came to final cut or eventual distribution. Three years came and went before Hooper’s adaptation of Colin Wilson’s The Space Vampires arrived in theaters, minus 15 minutes of its original 116-minute running time, and with the lamentable title change to Lifeforce. More sci-fi than scary, and missing much of its internal logic thanks to the editing, the film was viewed as a failure by even the most ardent Chainsaw supporter. Even those who came to appreciate the movie in later years were mainly responding to the recovered “director’s cut”. It was a stunning blow for a man who, up until this UK jive, was considered a fabulous fright master.


His next step didn’t endear him to anyone. Hooper had always loved 1953’s Invaders from Mars and wanted to modernize the cheesy matinee classic. Unfortunately, while the situation looked new, the effects were as retro as a trip back to the Eisenhower era. The decision to maintain the look and limits of the old B-movie style of monster made this intended update more funny than fresh, and fans just didn’t get the rationale behind revisiting what appeared to be a standard shoddy creature feature from the past. Lost for a novel next step, Hooper appeared to grow desperate. His next move would baffle even his heretofore strongest followers.


Depending on who you listen to, The Texas Chain Saw Massacre 2 is either a wonderful cinematic satire, on par with the scathing social commentary found in George Romero’s work, or the last bullet in the creative gun that helped Hooper commit career suicide. There’s no meaningful middle ground on the project – fright film mavens either love it or LOATHE it. Purposefully the polar opposite of everything he did in the 1974 original (tense atmosphere, documentary stylizing, maintenance of an air of authenticity), this full-blown farce had our antihero Leatherface as a hyped-up horndog. It presented the previously sinister Cook as a non-stop, one-liner-dropping Greek chorus. It even introduced a new clan member into the mix, the metal-plate-sporting Chop Top, whose sole purpose seemed to be egging on his power-tool-wielding brother while dropping deranged pop culture references.


Time has definitely treated this instantly dismissed title rather well. Even disparate elements like Dennis Hopper’s Method acting madness or the entire Vietnam-based abandoned amusement park now seem like parts of one artistic madman’s personal cinematic purgative. A great deal of the time, Chain Saw 2 plays like Hooper’s final statement on the entire Massacre phenomenon. He kids himself, and his fans, even adding a scene where Drive-In critic Joe Bob Briggs comments on the manner in which Leatherface slaughters some random babes. Golan and Globus had wanted another dark, disgusting exercise in dread. What they got was an aggressive, Airplane!-style lampoon where the only thing taken seriously was Tom Savini’s autopsy-quality F/X.


It was apparently the straw that finally broke the fear fans’ benevolent back. The original movie is considered by most to be one of the best ever made. The revamp came and went without anyone much mentioning it afterward. Cannon closed shop, leaving Hooper to wander through a few tame television efforts before trying his hand again at the big screen. Spontaneous Combustion was certifiable proof that the outright genre rejection he showed in Chain Saw 2 was not just some one-time Hooper experiment. A stupid story involving nuclear weapons, genetic defects, and one man’s ability to immolate people made absolutely no sense when it finally found its direct-to-video home, and the disdain and contempt for the audience were obvious. Hooper no longer wanted to connect with viewers. He was merely going to give them what he saw fit. Fuck ‘em if they can’t take his fright.


It has been all downhill from there. When the best thing you can say about a recent Hooper effort is that it had some pretty good gore effects (the only interesting element in his otherwise pointless Toolbox Murders remake), you know you’re dredging the bottom of the boo barrel. Having long since given up on this journeyman-turned-joke, most fans find his current canon to be as laughable as it is lamentable. His production credit on the two new Chainsaw updates also causes the faithful to cringe, again considering the status the first film has in the annals of the genre. And yet none of this really explains why he’s now such a non-entity. Scholars could compile as much research as possible and still not be able to figure out how or why Hooper finally fell.


It’s possible that, like Chain Saw 2 or Eaten Alive, the movies that many consider to be horrid examples of Hooper’s oeuvre will find solid support upon future reevaluation. After all, his masterpiece was considered quite the abomination at the time of its release. It is conceivable that something like Night Terrors will be hailed as a classic, or Invaders from Mars seen as something of a sci-fi highlight, decades from now. His career could also be a clear case of the almost unavoidable horror one-hit-wonder paradigm. Maybe Hooper only had one good movie in him, and the original Black and Decker epic was it. It could also be that Hooper was stereotyped by The Texas Chain Saw Massacre. Perhaps he saw himself as a far more varied filmmaker, capable of dabbling in any and all cinematic categories. Unlike Sam Raimi, who found a way out, Hooper got stuck being a terror titan – and it affected everything he did thereafter.


Of course, one can’t discount the Poltergeist factor. The 1982 film was such a huge hit that individuals on both sides of the situation obviously understood the power of being linked to such a box office behemoth. The power play against Hooper – the persistent if still unproven rumors that, once again, he had been replaced and that the end result was more a Spielberg-style scare film – hounds him to this very day. It leaves people with questions, allowing them to think that there is more truth than professional sour grapes behind the undying creative control gossip. And maybe it became too much. Maybe playing the Hollywood game and getting your otherwise appreciated name dragged through the meaningless motion picture mud has scarred Hooper forever.


It sure does appear that, after the Poltergeist poisoning and his inability thereafter to reproduce its success, Hooper simply gave up. Nothing post-Chain Saw 2 has had the pure horror chutzpah of the movies he made in the ‘70s. Even his TV miniseries version of Salem’s Lot and the carnival-as-killing-floor fiendishness of The Funhouse can’t find a comparative contemporary equivalent. It’s as if this director just stopped trying once 1986 ended, and the last 20 years have been an endless ramble toward complete cinematic insignificance. The slide is already working: many younger film fans think the original Texas Chain Saw Massacre is a meek, mild effort when compared to Marcus Nispel’s balls-to-the-wall reimagining. That a true horror milestone can be made unimportant reflects very poorly on the man who made it. If he’s not careful, Tobe Hooper may discover that it’s too late to save his already addled legacy. And that’s more terrifying than anything he’s done in decades.


Sunday, Oct 22, 2006

The linkage of coffee and culture is not new. The coffeehouse was a byword for intellectual ferment in 18th-century England, and beatniks were widely regarded as skulking in coffee shops in their heyday. So it’s not surprising that Starbucks wants to be in the culture-distribution business, streamlining and sanitizing the coffee-culture linkage, debeatnikifying it the way it has de-Europeanized the cappuccino and the espresso machine. Today’s NYT has a long article about “the Starbucks aesthetic” in the Arts and Leisure section:


the chain is increasingly positioning itself as a purveyor of premium-blend culture. “We’re very excited, because despite how much we’ve grown, these are the early stages for development,” said Howard Schultz, the chairman of Starbucks. “At our core, we’re a coffee company, but the opportunity we have to extend the brand is beyond coffee; it’s entertainment.”


Much as Coca-Cola has proclaimed itself a media company selling brand impressions (as opposed to a beverage company selling sugar syrup, as many of us have naively believed), Starbucks is positioning itself the same way, selling its brand as a cultural filter, selecting highbrow coffee (latte hasn’t replaced limousine in the epithet for self-centered liberals for nothing) and entertainments to send the proper signal of gentility to the people who pay attention to such things. The coffee has established itself as the upper-class alternative to plebeian coffee, and this presumably has a halo effect that hovers over everything sold in one of its branches. The de facto supervisor of the stores’ in-house music, a former manager named Timothy Jones, says he looks for music that has “a believable sound that isn’t too harsh.” In practical terms, this means the sort of adult contemporary that you’d hear on a station like Philadelphia’s (truly nauseating) WXPN: Sting, Natalie Merchant, Amos Lee—dull and earnest, unlikely to disrupt a conversation or a nap. This is music that connotes authenticity while having all its edges smoothed by precisely the sort of compromises “authenticity” suggests one would reject. It’s a lot like NPR (which connotes liberalism without espousing anything actually leftist), which is mentioned frequently in the article as the cultural touchstone Starbucks shoots for.


People who buy records at Starbucks (and who, in the future, will buy books there) are likely to be fairly conformist in their outlook on culture, seeking to gain no distinction from discovering anything original. Yet they probably don’t see themselves as part of the unwashed masses. They want to be familiar with the right things and surround themselves with cultural product that will reaffirm their idea of themselves as open-minded yet tasteful consumers of entertainment—thus everything at Starbucks must connote sophistication and adventuresomeness without actually being so. You can feel hip without any of the unpleasantness that actually comes from associating with hipsters: arrogance, greasiness, contempt, envy, fierce competitiveness over personality nuances, etc. Thus the predominance in the music stock of lite world-beat and elevator folk. Even if the consumer never flaunts his choices from the Starbucks cultural cornucopia, he may rest comfortably in his private enjoyment that he has placed himself squarely within the genteel matrix of the acceptable—it’s an efficient way to consume one’s own class status as a pleasing and satisfying product. Says one satisfied customer quoted in the article: “It’s who I am—baby boomer, upper middle class, a little hippyish, rockish. ...” Wouldn’t you be proud of that pedigree? Wouldn’t you want to be able to enjoy yourself, consume yourself, if that were you? Starbucks culture permits you to express your self-satisfaction through a shopping gesture—the only gestures that matter in consumer culture—and have a souvenir of the triumphant moment. Look, this Akeelah and the Bee soundtrack: it’s who I am, and I’m wholly comfortable with it. The fastest-growing middlebrow chain endorses me and I endorse it.


What Starbucks gains from all this is a more effective way of shopping for the right sort of customers: “The more cultural products with which Starbucks affiliates itself, the more clearly a Starbucks aesthetic comes into view: the image the chain is trying to cultivate and the way it thinks it’s reflecting its consumer.” These affluent customers reinforce the brand image and police it; for those who don’t fit the demographic, the sonic barrage of Madeleine Peyroux serves as a repellent, if the disapproving glances of the sort of people who hang out in Starbucks don’t do the trick.


(See copyranter‘s screed on Gawker for a succinct retelling of the article.)


Saturday, Oct 21, 2006


While cruising the sun-stroked byways of Retirement Territory, U.S.A. (a.k.a. Florida) on his mega-machined chopper, wounded Vietnam veteran Herschel runs into Jesus’ personal P.R. representative, Angel. She lives with her dope fiend sister Ann in a house frequented by several prime examples of why American ingenuity and productivity were so poor in the ‘70s. While Angel preaches the psalms to Herschel, Ann tries to get to “know” him in the true Biblical sense. Realizing that the only begetting old Hersch is interested in is of the platonic variety, Ann seeks her revenge by making the beefy buffoon smoke some oregano doobies laced with pure smack. One puff, and Herschel is hooked, painfully craving more spiked smoke to calm his horrible overacting.


But instead he gets a job on a local turkey farm, where the inbred cousins of Bartles and Jaymes feed him free bird pumped full of Adolph’s meat tenderizer, overly salty chicken broth, and the magic ingredient Polyplotpoint 80. Instead of copping a buzz off the L-tryptophan, however, Herschel turns into a half-man/half-bird beast, complete with papier-mâché turkey head and overdubbed gobble. Hungry like the hen, he goes out looking for drug addicts to kill for their rich, chemically enhanced blood. And while Ann feels guilt for getting Herschel hooked, and Angel memorizes the last few Beatitudes, the foul feathered fiend roams the streets of Sun City Center, looking for supermodels, rock stars, and grade schoolers to supply him with the opium-rich artery juice he so desperately needs.


What do you get when you cross some retread reefer madness, accidental drug addiction, religious fundamentalism, bodybuilding, and processed turkey loaf? Well, if you’re oddball director Brad Grinter, you end up with Blood Freak, the only film in the entire exploitation canon to be endorsed by the Southern Baptist Convention, the Betty Ford Clinic, and the Butterball Thanksgiving Hotline. There is probably no other movie in the long lineage of monster/maniac/heroin-related filmography that centers on a brawny European muscleman getting addicted to Chinese Rock-enhanced wacky weed while working as the subject of some warped experiments at the local subsidiary of the Perdue poultry empire. Only Godmonster of Indian Flats can boast a more bizarre cinematic universe, and yet its Old West weirdness just cannot compare to Freak’s Vietnam-vet-in-a-fowl-mood madness.


It’s hard to fathom what Grinter was hoping to achieve with this movie. Was he mad at drugs? Irritated by religion? Longing for the invention of Stovetop Stuffing? The motivation is unclear. But the method used to achieve it is downright demented. Grinter is of the old cinematic school that feels a movie doesn’t have to make a great deal of linear sense as long as it contains frequent shots of the director smoking. That’s right: about every eight minutes or so, our swarthy South Florida celluloid sod appears on camera, eyes blurry from too many Tom Collinses, fingers and breath stained yellow from endless Marlboros, hair swirled with a combination of Alberto VO5 and dried vomit, and proceeds to narrate the film by blatantly reading from the script. His Grecian Formula 16 chorus adds an inebriated pseudo-philosophy to the entire pissed-off psycho pullet shenanigans.


But these drunken monotonous-logues by Mr. Grinter, with their nonsensical segues and his pre-throat-cancer croak, are not the only unhinged things about Blood Freak. The whole religious, Jesus-saves subplot is hilariously out of place here. It’s as if some cast member ran across a copy of The Watchtower on the craft services table and wouldn’t let the production finish until there was a little holy hollering added to the sex, drugs, and turkey murders. The cast gives off the aura of being perplexed by their own performances, with the forced, child-confession emoting of the actress playing Ann as plastic as the elaborate layers of eye paint she wears—Tammy Faye must be spinning in her vanity chair.


But it’s the whole murderous doped-up turkey-man idea that shoots this movie into the surreal stratosphere. The scenes of our strung-out strongman, big bullem bird head in place, attacking victims and letting blood have an unworldly, downright disturbing quality. You will be laughing, mind you, but some of the gore is fairly nasty. Especially effective is an elongated torture scene near the end of the film. Let’s just say it involves our insane roaster, a table saw, and a drug dealer’s leg (Lucio Fulci would be proud). The kinetic, freestyle editing, the endless shots of Grinter babbling like an improvising, smut-peddling Criswell, and actors who play dead by wincing and wiggling while gore F/X are worked across their faces all make Blood Freak a first-rate crazed capon caper.


Friday, Oct 20, 2006


The Frighteners is Peter Jackson’s lost masterpiece, an important cinematic cog linking his genre work of the past with the monumental achievements in fantasy filmmaking he would attain with The Lord of the Rings. Coming off the personal, praised Heavenly Creatures, Jackson had wanted to make a more mainstream film, and Robert Zemeckis stepped in and offered the director a chance to make a full-blown Hollywood hit. With longtime partner Fran Walsh, Jackson had been kicking around the idea of a Ghostbusters-style psychic who conned people out of money by pretending to purge spirits from their homes. The only catch was that Frank Bannister could actually see specters, and was using the otherworldly agents as his grifting partners. Once Zemeckis agreed to let the director film in his native New Zealand (which more or less passes for the Pacific Northwest) and to have all the post-production work done by Kiwi craftsmen, The Frighteners suddenly had full U.S. studio support.


Though it failed to become the blockbuster everyone had hoped for, The Frighteners still became a real stepping-stone in its creator’s canon. Beyond its import to his career, Jackson’s film is also important in the ongoing evolution of CGI. Before WETA’s work on The Frighteners (the house also handled a few scenes in Creatures), computer-generated imagery was seen as the exclusive domain of the Americans—and ILM in particular. While Jurassic Park will always be seen as a monumental step forward, The Frighteners was a formidable attempt at the seamless incorporation of motherboard-rendered visuals into a narrative. The main monster here, a wonderfully fluid and fierce figure known as The Reaper, may seem a tad dated in light of our post-millennial management of CGI elements, but for its time, the callous cloak with a deadly sickle was quite a quantum leap.


Jackson also pushed the basic boundaries of the new effects format in his film. For him, it wasn’t just eye candy or a visual set piece. The CGI characters in The Frighteners had to live and breathe, acting with emotional resonance and believable authenticity. Though he would have much more success in this department with Rings (and now King Kong), the ghosts created for the film really do live up to their spectral specifics. Thanks to the added footage included in the new director’s cut, we get to see Jackson having more fun with his phantoms, putting them through their physics-defying paces to increase the crazy cartoon-like anarchy of the film. Jackson enjoys giving the Judge character a less-than-complete corpse, and has fun fooling with some attempted splatter effects as well. The entire movie feels like a resume reel for a man who would one day create the most consistently artistic and accomplished trilogy in the history of motion pictures.


But it’s the amazing acting that really sells The Frighteners. Michael J. Fox—near the end of his reign as a box-office champ and ready to challenge himself with different, difficult roles—finds a lot of heart and horror in the backstory of his bogus psychic detective. Frank Bannister is supposed to be a scarred man, more figuratively than literally, and Fox wears such wounding across his still cherubic face. But when asked to dig deep and play the depths of despair, he really delivers the goods. Trini Alvarado, Dee Wallace Stone, Jake Busey, and the ghostly trio of John Astin, Jim Fyfe, and Chi McBride are all excellent. But if the movie truly belongs to one individual, it would have to be everyone’s favorite Re-Animator, Jeffrey Combs. As messed-up FBI flatfoot Milton Dammers, Combs creates a character so unique, so unbelievably idiosyncratic and iconic, that he truly deserved Oscar recognition for this work. Every line reading is like an adventure, every reaction a study in sensational strangeness. By the time he’s reduced to a near-routine villain, spitting out his threats with varying vileness, we want as much Milton as we can get.


One of the best things about The Frighteners, though, is that Jackson never overstays his cinematic welcome. We receive just enough Dammers to satisfy our sentiments, not so much that we grow weary of his weirdness. The same goes for the spooks. Had Jackson turned them into the poltergeist version of the Three Stooges, all slapstick and joking jive, we’d want less of their ethereal lunacy. Indeed, everything about The Frighteners is measured and meted out in sly, successful segments. The film has the real feeling of a completed, complementary work, where narrative ends are tied up and tossed together with other cinematic specialness to create a solid, satisfying whole. There are those who believe that the film is still missing a key entertainment element (and they will probably feel the same after viewing the long-dormant director’s cut), but the truth is that, for its time, The Frighteners was one masterful movie. It deserved more credit than it got during its initial release.

