
 

Latest Posts

Wednesday, Oct 1, 2008

Maybe not, but it’s disheartening to see them file for bankruptcy. It was a great publication with quality arts coverage. I’d even say the same about the New York Sun, despite its pathetically fawning coverage of Dubya.


Wednesday, Oct 1, 2008

Good to see that former antichrist John Lydon is appearing in an ad for Country Life butter. Where would we be if punk hadn’t upended the establishment and ushered in a whole new set of values based on integrity, authenticity, and a refusal to support the status quo?


Tuesday, Sep 30, 2008
Words and Pictures by Kirstie Shanley.

My Bloody Valentine, a legendary band that epitomized, if not outright defined, the shoegaze genre, brought their own sound system to Chicago’s Aragon Ballroom, filling the space with raging cascades of sound. Channeling an ethereal sort of madness, the Irish four-piece played with a transcendent force that tore through the enraptured crowd.


As one might expect, a band like My Bloody Valentine does not conform to a typical stage presence. There was very little in the way of words or even spaces between songs. The stage was filled with drastic periods of alternating darkness and blinding light. Shielded from the front section of the stage by plexiglass sound barriers, bassist Debbie Googe stayed in the back next to drummer Colm Ó Cíosóig. Despite this obvious distancing, they worked effectively together to provide a building tension and edgy rhythm. Kevin Shields and Bilinda Butcher stayed in front but apart, visible at times only between shadows and strobes. The intensity of their impact lay not within any facial expressions but within the massive sound they created.


Through all the grinding and swirling guitars provided by Shields and Butcher, the sweetness of Butcher’s vocals drifted across the audience throughout most of the night, helping to lessen the harshness with a strong feminine presence. The band, back after a prolonged hiatus (their last album was released in 1991), played many key favorites from Isn’t Anything and Loveless, with a focus on the latter of the two. The audience stood transfixed as the band powered through “Feed Me with Your Kiss”, “I Only Said”, “Only Shallow”, “When You Sleep”, and “To Here Knows When”. And while there were times when the noise climbed to such a height that it became impossible to distinguish a song’s separate elements, these decibel-destroying occasions allowed the tunes to take on a new shape.


The pivotal moment of the set came during “You Made Me Realize”, which broke off into over twenty minutes of evolving noise that locked into a strange infinity. It was impossible not to feel like you were surviving something outside this world that was as heavenly as it was traumatic. In many ways, it felt like punishment served at the same time as salvation. Halfway through the sonic assault, audience members reached up in the air, grabbing the dissolving molecules of sound as if they were pieces of chocolate. This was the sort of pounding that might lead to blistering, but it was a welcome assault, and left the impression that, oddly enough, it was the silence that was too loud.



Tuesday, Sep 30, 2008

In continuing to think about whether social networking is engineered to make us more narcissistic, I picked up Christopher Lasch’s study The Culture of Narcissism, the dour condemnation of 1970s America that helped prompt Jimmy Carter’s infamous malaise speech. Early in the first chapter, after arguing that Americans are helplessly dependent on bureaucratic institutions, he unloads with this:


Narcissism represents the psychological dimension of this dependence. Notwithstanding his occasional illusions of omnipotence, the narcissist depends on others to validate his self-esteem. He cannot live without an admiring audience. His apparent freedom from family ties and institutional constraints does not free him to stand alone or to glory in his individuality. On the contrary, it contributes to his insecurity, which he can overcome only by seeing his “grandiose self” reflected in the attentions of others, or by attaching himself to those who radiate celebrity, power, and charisma. For the narcissist, the world is a mirror, whereas the rugged individualist saw it as an empty wilderness to be shaped by his own design.


The applications of this to social networking seem pretty self-evident. The empty profile page provides the illusion of offering that “empty wilderness” to conquer, but that is just the alibi for the real function of social networks, which is to gratify our bottomless need to be validated in as close to real time as possible. Clearly Facebook and Twitter serve to meet that need, and what’s more, they tap into the latent narcissism of all their users, rendering self-involvement even more socially acceptable. It’s now a perfectly plausible and respectable basis for a business model.


In general, The Culture of Narcissism is a bit cranky and dated, and a bit too much of a jeremiad to be persuasive; its chief virtue is to reference Richard Sennett’s far more comprehensive and convincing The Fall of Public Man, which charts the disappearance of the public sphere and speculates about what exactly caused it to vanish. The gist of Sennett’s argument is that (perhaps for reasons that Habermas articulates in The Structural Transformation of the Public Sphere) society was once such that we maintained public and private selves: We donned a public persona, guided by rules of public conduct, when we sought to contribute to society, and in private we had an intimate self appropriate for family life. The two were only tangentially related, and neither was considered the absolutely authentic, real self. With the rules of civilization clearly in place, public discourse was civil and impersonal, and therefore far more objective and constructive, a place for “rational-critical debate”, the sort of thing Habermas celebrates.


But the individualism fomented by the rise of capitalism prompted a growing fascination with authenticity, “realism” in the novelistic sense, depth psychology, and the all-consuming importance of an integral identity that we establish in our own minds through our deeds and public behavior. (The roots of this can be glimpsed in the 18th century cult of sensibility and then romanticism. In those movements was the advent of studied spontaneity. The 18th century had vestiges of a theatricalized public sphere that was annihilated by a new emphasis on authentic personality: one must represent rather than present emotion, so all public behavior is at a remove from the new standard of authenticity. Anxiety, and vulnerability to marketing campaigns, ensues.) Gradually we came to conceive of the public sphere as a place to establish our identity; it became a mirror rather than a realm for discourse and the shared social construction of reality. This makes social interaction difficult, since our whole personality is at stake, at all times, with all people we encounter. Consequently, convenience becomes synonymous with avoiding interpersonal contact (self-service begins in earnest). And we fall prey to “passive participation”, or the impulse to vicariousness, which allows us to partake in society, now reconceived as a kind of pageant of self, but without the vulnerability. Hence deliberation and conversation are out; marketing and celebritization are in. And the next thing you know, there are tattoo parlors on Main Street.


Facebook and Twitter would seem to complete the erosion of the wall between public and private selves, offering us the technology to broadcast every moment of our private lives as if the world were nothing but an audience waiting for updates, or a canvas onto which to paint our ever-evolving self-concept. They move us from vicariousness to a more direct kind of self-display, because the filter of the internet shields us from the rejection inherent in social participation (aka “social anxiety”). I imagine, though, that apologists for the technologies view the matter in precisely the opposite way, regarding the space of social networking as a rebirth of the public sphere, where no identity represented should be regarded as authentic but as evidence of free play and an experimental testing of possibilities for the purposes of our collective edification. But I don’t think that holds up: Most people would regard having multiple profiles on the same social networking site as sneaky, and a fictitious profile not as an expression of creativity but as a pack of lies. Social networks seem to function as a more manageable substitute for actual presence in relationships: you get the upside of validation of your “true self” without the hassle of actual reciprocity. What distinguishes social networks from the blogosphere generally is that they are defined specifically by their not being forums for the exploration and debate of ideas. The prevailing purpose is to display yourself to your best advantage and “stay in touch” with people with whom it would otherwise take effort to remain in touch.


Monday, Sep 29, 2008

He was classic Hollywood for the counterculture generation, a throwback to the days of good looks and gifted talent transformed into idealism, allure, and myth. He legitimized the word ‘legend’, proving that a mere mortal could carry the tag with dignity and distinction. He had the face of an angel, the ethic of a saint, and the passions of a sinner. Together with his deliberate career choices and professional admonitions, he forged a cinematic canon unmatched by his fellow fame seekers. Even outside the industry for many years, the rumors of Paul Newman’s life-threatening cancer gave everybody in his business and his formidable fanbase pause. The 83-year-old seemed so ageless, so timeless, that to think that something as simple as disease could destroy him appeared impossible. Sadly, he succumbed to mere mortality on 26 September. It was more than just the end of an era. It was the end of an entire motion picture principle.


Of course, such greatness had to start from humble beginnings. As a youth, Paul Leonard Newman showed a keen interest in acting. His father ran a small sporting goods store. His mother, a Christian Scientist, fostered his love of theater. By the time he graduated from Shaker Heights High School in his hometown, he was set to pursue a degree at Ohio University at Athens. He was later kicked out for bad behavior. With few options available, Newman entered the military and spent three years as a naval radioman during the Pacific campaigns in World War II. After the service, he completed his studies at Kenyon College, went on to Yale to work on his dramatic skills, and was accepted to Lee Strasberg’s prestigious Actors Studio.


Getting his start onstage, where he cut his teeth on such Great White Way smashes as Picnic, The Desperate Hours, and later Sweet Bird of Youth, Newman would also find roles in the fledgling format of live TV drama. It was a wonderful proving ground for the still green thespian. After seeing his theatrical turns, Warners offered him a contract, and a part in the Roman costume epic The Silver Chalice introduced the actor to movie audiences. Sadly, the film was so awful that it nearly ended Newman’s fledgling career. But with Somebody Up There Likes Me, he found a perfect fit. As real-life boxer Rocky Graziano, Newman established an onscreen persona that would carry him through the next several decades - the well-intentioned outsider who battles the system to salvage his own humanity.


After starring in a pair of steamy Southern potboilers - The Long, Hot Summer and the Tennessee Williams adaptation Cat on a Hot Tin Roof - Newman ushered in the ‘60s with a film that would end up looming large in his legend. As “Fast” Eddie Felson, he costarred alongside Jackie Gleason, Piper Laurie, and George C. Scott in the definitive pool hall parable The Hustler. The film showed that, even with his natural good looks, Newman could portray a morally complex (and occasionally, bankrupt) character. It was something he would carry on through signature turns in such now-classics as Hud, Harper, and the messianic message picture Cool Hand Luke. By the end of the era, Newman was the biggest box office draw in Tinsel Town. In 1969, Butch Cassidy and the Sundance Kid continued his counterculture significance and mainstream value.


It also established one of the great friendships and partnerships of his life. At the time he was hired to play the brooding gunslinger with a luminous name, Robert Redford was an up-and-coming star. Newman championed the younger man, and together they formed a creative combination that would carry over for the next few years. By celebrating the anti-hero and deflating the influence and power of the “Establishment”, Butch Cassidy clicked with late ‘60s audiences, and it wasn’t long before the duo were the most bankable actors in Hollywood. Their fantastic follow-up together, The Sting, would become an instant classic and winner of Best Picture at the 1973 Academy Awards. Newman took his new clout to the bank, making disaster films for Irwin Allen (The Towering Inferno, When Time Ran Out) and branching out into all manner of movies, from sports comedies (Slap Shot) to experimental fare with famed director Robert Altman (Buffalo Bill and the Indians, Quintet).


By the ‘80s, a middle-aged Newman was ready to play elder superstar statesman. The parts he chose continued to challenge his abilities (a down-and-out, drunken lawyer in The Verdict) and expand his range (the cartoonish Louisiana Governor Earl Long in Blaze). But one thing continued to elude the actor. Even after being nominated seven previous times, Newman had yet to win the Oscar. It would take Martin Scorsese, Tom Cruise, and some character karma in the form of a return to “Fast” Eddie to gain his little gold man. The Color of Money showed that, while his façade may have aged, there was nothing ‘old’ about this longtime leading man. Today, his intense and insular performance makes the work of his younger costar seem overly simplistic by comparison.


With said persistent professional obstacle removed, Newman entered a phase of semi-retirement. He made only five movies in the ‘90s, and of those, only Mr. and Mrs. Bridge (from Merchant Ivory) and the Coen Brothers’ corporate screwball classic The Hudsucker Proxy stood out. He became even more reclusive in the new millennium, working with Tom Hanks in Road to Perdition and voicing the amiable Doc Hudson for Pixar’s animated effort Cars. During his now abundant downtime, he continued several important passions from his far more famous days. Newman loved racing, and he indulged in the sport from the moment he completed work on 1969’s Winning. Charity was also important to the man. Having lost his only son to drug addiction in 1978, he was a supporter of rehabilitation efforts. He also sponsored camps for children with cancer, and used his love of food to begin Newman’s Own, a culinary label that, to date, has contributed hundreds of millions to various non-profit causes.


For such a handsome, hunky lead, Newman was married only twice. His first marriage, to Jackie Newman, lasted a little over eight years. He met fellow performer Joanne Woodward while they were understudies on Picnic. After begging his first wife for a divorce, the icon and his new leading lady were married a week after the court’s decree was final. It was a relationship that lasted for the next 50 years. Newman often worked with his lady love, directing her in such films as Rachel, Rachel, The Effect of Gamma Rays on Man-in-the-Moon Marigolds, and The Glass Menagerie. Theirs was a partnership that bucked the Tinsel Town trend. Normally, two incredibly successful and important stars would have a hard time sharing the spotlight professionally, let alone personally. But Newman argued that Woodward kept him grounded, and she said the same of him.


Looking back at his illustrious career, it’s clear that this was a man who understood his influence within popular culture and the social dynamic. His choices often reflected his politics, and during the ‘60s, he stayed close to his idealistic roots. By the ‘70s, it was time to expand the oeuvre, to experiment as part of the post-modern movement. The ‘80s was all about product, about sealing the legacy and retaining a bit of dignity. And up until his death a few days ago, the rest of his creative life was a balance between doing what he wanted and what he needed to do in order to maintain his majesty. In between, he took on challenges that would undermine a mere mortal, his stature only growing as the years trailed by.


Sure, there was talk of a Newman/Redford reteam. There was even a 2004 interview where the two twinkled mischievously at the thought of making another movie together. There was also the change of heart, the actor announcing that Hudsucker would be his last film ever - before turning around and performing again. He was a foil to late-night TV guru David Letterman, and was known - within limits - to poke fun at his own persona (as in Mel Brooks’ demented Silent Movie). But what’s certain about Paul Newman, and his lasting reputation, is the notion of true superstardom. He looked the part, played it perfectly, and never allowed fame to influence his abilities or beliefs. Newman never phoned it in, or traded his talent for a paycheck. Somehow, he knew his importance - beyond the good looks and classic features - to those in the audience. He never let them down, not in life, and not in death. Paul Newman was everyman’s idol. He was truly an icon for every generation, and deservedly so.



© 1999-2015 PopMatters.com. All rights reserved.
PopMatters.com™ and PopMatters™ are trademarks
of PopMatters Media, Inc.

PopMatters is wholly independently owned and operated.