Latest Posts
Tuesday, Apr 29, 2008

How is it that a song made of all the worst things possible is endlessly awesome? Contemplate this while enjoying Komar & Melamid’s “Most Unwanted Song” (link via Scott McLemee). The song was produced based on the results of a 1990s poll asking Americans what features they like least in music. Michael Bierut at Design Observer suggests the result is a triumph of design: “If working within limitations is one of the ways designers distinguish themselves from artists, America’s Most Unwanted Song is a design achievement of a high order.”


And naturally, the most wanted song is unlistenable. If American Idol is the future of pop music, this poll-produced contrivance suggests the future will be bleak indeed.


Tuesday, Apr 29, 2008

Clay Shirky, the technophilic author of a new book about spontaneous organizational behavior online, recently delivered this widely linked speech about how TV once managed to suck up the “social surplus” that is now being directed into building social networks and open-source applications and whatnot on the internet. He argues that the 20th century brought us more disposable leisure time, and it brought us TV to help us dissipate it.


Starting with the Second World War a whole series of things happened—rising GDP per capita, rising educational attainment, rising life expectancy and, critically, a rising number of people who were working five-day work weeks. For the first time, society forced onto an enormous number of its citizens the requirement to manage something they had never had to manage before—free time.
And what did we do with that free time? Well, mostly we spent it watching TV.



Shirky then explains that while Wikipedia took an estimated 100 million hours of human participation to create, American TV viewers spend that much time every weekend watching advertisements.
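Shirky’s weekend-advertising figure is easy to sanity-check with a back-of-envelope calculation. The inputs below (adult population, weekly viewing hours, share of airtime given to ads) are rough illustrative assumptions, not Shirky’s own numbers:

```python
# Rough check of the claim that US viewers spend on the order of
# 100 million hours per weekend watching TV advertisements.
# All inputs are illustrative assumptions circa 2008.

US_ADULTS = 230e6          # approximate US adult population
TV_HOURS_PER_WEEK = 20     # rough average viewing per person per week
AD_SHARE = 0.25            # rough fraction of airtime devoted to ads

# Two weekend days out of seven, times the ad share of that viewing
weekend_viewing = US_ADULTS * TV_HOURS_PER_WEEK * (2 / 7)
weekend_ad_hours = weekend_viewing * AD_SHARE

print(f"{weekend_ad_hours / 1e6:.0f} million ad-hours per weekend")
```

Under these assumptions the total lands in the hundreds of millions of hours, so a figure of 100 million ad-hours per weekend — Wikipedia’s entire estimated construction cost — is, if anything, conservative.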


Currently, with Web 2.0, etc., society is erecting an “architecture of participation” (a term he borrows from tech-industry consultant Tim O’Reilly) which will allow people to switch gears from passive consumption of TV to active participation in collective social projects—annotating maps and debugging software and posting lolcats and correcting misinformed bloggers in comments and that sort of thing.


It’s better to do something than to do nothing. Even lolcats, even cute pictures of kittens made even cuter with the addition of cute captions, hold out an invitation to participation. When you see a lolcat, one of the things it says to the viewer is, “If you have some sans-serif fonts on your computer, you can play this game, too.” And that message—I can do that, too—is a big change.
This is something that people in the media world don’t understand. Media in the 20th century was run as a single race—consumption. How much can we produce? How much can you consume? Can we produce more and you’ll consume more? And the answer to that question has generally been yes. But media is actually a triathlon, it’s three different events. People like to consume, but they also like to produce, and they like to share.
And what’s astonished people who were committed to the structure of the previous society, prior to trying to take this surplus and do something interesting, is that they’re discovering that when you offer people the opportunity to produce and to share, they’ll take you up on that offer. It doesn’t mean that we’ll never sit around mindlessly watching Scrubs on the couch. It just means we’ll do it less.


Shirky’s vision of the future sounds great. We all benefit when we contribute our spare time to projects that in theory benefit society at large. And when people are more attuned to their productive rather than their consuming capabilities, they are likely to be much happier, as displaying one’s ability to make things is a quintessentially human quality and social recognition makes life worth living.

But some skepticism is in order. The notion of a social surplus sounds a lot like Bataille’s accursed share. Bataille argues that in a post-scarcity society, individuals need to come up with ways to destroy excess production through various modes of luxurious waste in order to sustain economic growth. From this point of view, wastefulness is intentional, a demonstration of wealth. Squandering the surfeit of free time on sitcoms seems a variation on this. Perhaps watching TV is not like drinking gin, as Shirky suggests, but like conspicuous consumption. It’s a triumph of our culture that we can waste entire evenings on Battlestar Galactica rather than, say, foraging for food. The point of free time is wasting it, not employing it productively in some other arena. An hour spent watching VH1 = luxury. An hour spent annotating a Google map = digital sharecropping.

I don’t want to believe this, but empirical observation suggests that many people have this attitude. Most people aren’t looking for ways to be more productive; instead they tend to seek means to consume more. And if they are given reasons to believe their manic consumption is really a kind of production in itself, so much the better. The culture industry’s main thrust in the internet era has been to do just that: make consumption feel like participation, so we don’t feel bad when we fail to better avail ourselves of the “architecture of participation”. Chances are, this architecture will become more akin to a Playland at McDonald’s than the infrastructure for a social revolution.

In the quote from Shirky, a lot hinges on the inadequately defined word interesting. Likewise for his assertion that people watching TV are doing “nothing.” Some people will not be easy to convince that watching TV is equal to doing nothing, that it is not in fact “doing something interesting.” What the culture industry has traditionally done is not only mask the social surplus, as Shirky notes, but sell the passive squandering of it as a dignified social activity—watching “Must-see TV” becomes a way to participate in water-cooler conversations that occur mainly in our imaginations. The internet is a real ongoing conversation, one that opens us to risks (of embarrassment or irritation) as well as rewards. But the pretend conversation we have while passively consuming is 100 percent safe. We are also persuaded that participation and collaboration are more inconvenient than individuation and private consumption in isolation. In my life, this plays out as me playing chess against a computer when I could readily play against human opponents online. In my mind, playing a human is more rewarding, but in practice playing the computer fulfills my need for momentary, commitment-free distraction. Sharing and cooperation lead inevitably to compromise, and the main thrust of most advertising is to convince us never to compromise when pursuing what we want in our hard-earned leisure time.


In short, the marketing world and the culture industry at large invest a lot in making doing nothing feel like something, and for many of us, it does; we collaborate on the fiction with the marketers, and this curbs our inclination for the sort of collaborating Shirky talks about. Shirky does qualify his statements by noting that it takes only a few people changing their habits to produce a huge shift once the behavior is leveraged across the internet. And perhaps that is enough to prompt optimism. But the same forces that enable online sharing also enable deeper individuation, filtering, personalization and so on, discouraging cooperation on anyone else’s terms and limiting the usefulness of what is shared. They also extend marketing’s reach, enhancing the power of its messages about the joys of passive consumption and of fantasizing about identity rather than “doing something.”


The most difficult word to define, then, is participation—to a certain degree we can’t help but participate in our culture, and Web-style interactivity has become much more prominent in that culture. But with that prominence the phenomenon becomes domesticated, becomes a new way to absorb the social surplus harmlessly, becomes the new way to watch TV. Its fruits are trivial before the fact, because most people don’t want to see themselves as revolutionaries, and many want to luxuriate in flamboyant triviality. They are already absorbed into the status quo, which perhaps is subtly shifting along the lines Shirky is suggesting but is always in the process of disguising the change. The radical breaks that futurists and techno-evangelists are always predicting are always about to happen; they can never actually come.


Tuesday, Apr 29, 2008
The Zarathustruan Analytics series continues with L.B. Jeffries' thoughts on player input.


Part of the reason this analytical method is named after Nietzsche’s Thus Spoke Zarathustra is to do justice to the individualized nature of player input, to put aside judging a game purely by the game play or plot and go beyond that to analyzing the actual experience of a game itself. The problem is…although critics are quite capable of analyzing their own experience from playing a game, it is not quite so easy to apply that analysis to others. Indeed, this critical method is more an approach to assessing the experience-creating methods in a game than the individual experience itself. The player input, then, is your connection to the game because it keeps you interested and playing. To that end, when critically judging player input, you are looking at how the game and story react to your input and the impact this has on the overall experience. Rather than go into the huge variety of ways games do this, we’ll analyze one of the more controversial player input methods prevalent in games today and use it to highlight the requirements of player input itself.

There has been a great deal of criticism of the silent protagonist in video games recently, and for good reason: they’re suddenly everywhere. Of the top-ranking games of 2007, almost all involve player characters who don’t speak. Gordon Freeman from Half-Life never utters a word. Master Chief hardly speaks, and Link does little more than grunt. It’s tempting to dismiss the feature as simply a cop-out on the part of the creators, and yet there are certainly games that have used the device effectively. Why does not letting a player’s character speak work in some games and supposedly break down in others?


Monday, Apr 28, 2008

Recently I started making my way through Irish author Eoin “It’s Pronounced ‘Owen’!” Colfer’s popular Artemis Fowl series. I’ll admit, I’m rather behind on the times—the original Artemis Fowl was published in 2001, and the following four books (plus one due out this July) about the boy genius have emerged at roughly the rate of one per year.


I believe it was in early 2004 that a fellow student of fine literature mentioned the Fowl series to me and heartily recommended them—knowing that I had just finished the latest Harry Potter installment, Harry Potter and the Order of the Phoenix, and would have to wait another year for the next segment of the Hogwarts adventure. The magical elements and witty writing style of Colfer’s work were sure to appeal. I have mentioned before that young adult fiction is not just meant for teenagers—anyone with a short attention span or simply a love of a well-spun tale is sure to enjoy.


My friend failed to mention the enormous difference between J.K. Rowling’s work and Colfer’s. Artemis Fowl is a criminal mastermind. That is, he enjoys cheating other people out of money for profit. And he only seems to do it in order to increase his family’s fortune, which is already extensive. He gets away with it (and keeps the reader’s interest) because he has a high IQ, and some excellent (and entertaining) backup in the form of his martial arts aficionado and gun-wielding ‘man-mountain’ servant known as Butler.


One reads on, first of all, because Artemis is so darned clever, and secondly because there are moments when his humanity shines through (though he tries so hard to be evil) and the reader begins to like him despite his shabby, selfish actions.



Like the Harry Potter series, Artemis Fowl is supported by supplementary short stories and even graphic novels; the first Artemis Fowl movie is rumored to be in the works. The books are quick adventures and easy reading; I made it through The Arctic Incident before the break and neglected to check out the third book in the series, The Eternity Code, but it is on my library shortlist.


Last week I wrote optimistically about my spring break reading—thinking I’d use a little LEPrecon fairy magic to stop time and get through a stack of magazines. Unsurprisingly, not much progress was made. Did you get through your vacation reading?


Monday, Apr 28, 2008


Tradition holds that, for Hollywood, the Spring represents the end of ballyhoo - and the business year. During the four-month flatline between January and April, every unmarketable mess, every experimental excuse, every contractually obligated star vehicle, and otherwise underdone effort would get a mandatory release - a few days of bewildering box office glory before fading into VHS obscurity. It was always an aesthetic stopgap, a means of making talent happy, critics cranky, and audiences wary. Summer would come soon enough, and with it, the far more palatable popcorn fare. Yet for over 16 weeks, we had to tolerate some pretty pathetic offerings. All of that changed a few years ago when Hollywood realized it could up the ante, just a little, by providing a couple of less-than-mediocre movies. The accompanying turnstile twists proved the approach correct.


Now, Spring is a battle between horrendous and highlights. There are still more stumbles than sonnets, but when you consider the crap that used to pour forth, literally nonstop, a few fine films are all one can ask for. Yet oddly enough, 2008 saw a trend toward documentaries that indicates a real failing among fiction films. While the studios seem convinced that everything old is repackage-able again, the men and women exploring the reality around us are doing it with style, wit, and a clean, clinical eye. They say that everyone has a story to tell, a narrative that if captured properly, would give the old “truth is stranger than…” mantra a clear run for its money. Two of the five films listed below do indeed bring that maxim to startling life.


But there were other excellent offerings that deserve a runner-up mention: the beat-happy British heist flick The Bank Job; Leatherheads, the half-successful screwball comedy from George Clooney; the uneven documentary Sputnik Mania, centering on a certain Soviet satellite and the effect it had on a worried West; and the gonzo zombie stomp of Shine a Light, featuring the undead Rolling Stones in all their going through the maverick motions glory. In addition, the underserved demographic of Florida finally got to see two outstanding foreign films from 2007 - The Counterfeiters and Persepolis - movies that would have made this list had they not already had their moment of glory last year. So here is what SE&L thought were the best Spring flings of 2008, beginning with:



# 5 - Forgetting Sarah Marshall dir. Nicholas Stoller


While some may believe - falsely - that the Apatow era of feature length funny business has peaked and begun to ebb (thanks to Dewey Cox or Drillbit Taylor, take your pick), the truth is that there’s lots of satiric fire left in the old furnace. Case in point, this wonderful brom-com from Freaks and Geeks costar Jason Segel. While the story of a rather caustic breakup may seem like the last place heart or hilarity could be found, there’s a heaping helping of both in this tale of a struggling composer dumped by his TV star girlfriend. Our hero hopes a trip to Hawaii will cure what ails him. Turns out, his ex is there with her sleazoid British boy toy as well.

There’s so much more to this movie than raunch and the risqué. Sure, penis abounds, but so do some emotional insights into how love can linger long after it really should. Besides, there’s puppets - putting on a production of Dracula - with music! How much more do you want? While Segel is a strange leading man, he is surrounded by a capable cast including Kristen Bell (riffing on her current career arc with self-deprecating brilliance), Mila Kunis, and UK yutz Russell Brand, playing every Amy Winehouse-inspired pub spud imaginable. Together they take a subject that should sink like a stone and make it laugh-out-loud loveable. And rumor has it that Segel will be scripting the new Muppets movie. How weird is that?



# 4 - The Dhamma Brothers dir. Andrew Kukura, Jenny Phillips, Anne Marie Stein


We really don’t know what to do with our exploding prison population, do we? We love the notion of warehousing the dangerous and deadly, keeping ourselves and our wee ones away from the true (yet undeniable) horrors of the world. Yet mention the concept of rehabilitation or rights and the cold, conservative nature inherent in all of us leaps to the fore. We don’t want inmates given a chance. Instead, we demand that they be kept locked away forever - no matter what the judges, juries, or sentencing guidelines suggest. It’s from this narrow-minded premise that this look at the use of Buddhism in an Alabama penitentiary gets its undeniable power.

Certainly, there is every reason to be skeptical. As one of the guards convincingly argues, prisoners will “fake it ‘til they make it”, meaning they will do anything to gain some early release favor. But Vipassana (a tiring ten-day ritual) seems like an insane way to achieve that end, especially with all the deep-seated personal problems and unhealed wounds it tends to open up. We learn a lot about these men - stories that seem antithetical to the crimes they committed and yet completely in line with the standard police profiling. Their tales of abandonment and abuse are horrific, just like the ways they choose to compensate for them. This is as eye opening and uneasy as fact filmmaking gets.



# 3 - Cloverfield dir. Matt Reeves


Sure, the viral marketing campaign that swept the Internet last summer seemed overly calculated, guaranteed to make whatever turned up in theaters four months later appear simultaneously exciting and exasperating. Who knew that producer JJ Abrams and a couple of his TV pals (Felicity’s Matt Reeves and Lost’s Drew Goddard) would turn the whole thing into one of the finest genre efforts of the new millennium? Sure, some consider this monster movie nothing more than Godzilla with a Blair Witch POV, but that’s just part of the film’s appeal. There are also riffs on 9/11, our current sense of social fear, and the notion that nothing is real unless it’s viewed through a camera or featured on TV.

Now that it’s out on DVD, the movie can be studied more closely (and without some of the accompanying handheld shaky-cam nausea), and some interesting elements definitely come to the fore. The relationship between the friends (and former lovers) becomes even clearer, the emotional needs that each carries adding to the seriousness of the situation. The monster’s movements are also clarified, thanks to the lack of an anticipation/shock factor. We get to see the amazing CG destruction in all its wow-factor glory. It all makes for one of the most creative kaiju-like efforts ever.



# 2 - Be Kind, Rewind dir. Michel Gondry


No, this was not that wacky, weirdo comedy that the presence of Mos Def or Jack Black would indicate. Nor was it just another piece of Michel Gondry wistfulness mistaking pure imagination for screenwriting. Instead, this is the finest love letter to the VCR and the videocassette ever constructed, a story that requires audiences to drop their pretexts and perceptions and recognize exactly what the scenes are saying. What we are witnessing here is not just the recreation of classic ‘80s films by a bunch of video store employees turned amateur auteurs. Instead, the so-called “Sweding” that occurs is a reflection of just how pervasive cinema has become as part of our everyday lives.


As with most broad canvases, it’s the details that get lost. When Black and company make their new versions of these well-remembered films, they do so without any real reference - no script, definitely no VHS copy to consider. Instead, this is moviemaking from memory, the rote revisiting of favored titles by people who have them memorized. All geek love should be this pure and pristine. Thanks to Gondry’s vision, which places all the action in a gee-whiz setting of communal consideration, we witness the first movie ever to acknowledge the seismic change that occurred when theaters headed home. Destined to be considered a modern masterpiece.



# 1 - Young@Heart dir. Stephen Walker


Aging in America is its own prison, a metaphysical place where family members forget their loved ones because the stench of mortality is too great to bear. Even worse, because of horrific diseases like Alzheimer’s and dementia, the elderly are additionally viewed as ticking time bombs, burdens placed on relatives for reasons that are uncomfortable and unavoidable. So how refreshing is it to see a group of septua- and octogenarians expressing themselves in song as part of a community chorus? Even better, these good-timing geezers use The Ramones, Talking Heads, Sonic Youth, and The Clash as points of sonic reference.


This fantastic feel-good documentary, chronicling the preparations by the Massachusetts-based chorale for their latest world tour (that’s right - WORLD tour), is so uplifting that we need the occasional (and, because of the subject matter, unavoidable) tragedy to keep us grounded. Balancing the joy inherent in making music with the inevitability of a life slowly fading away, we meet individuals so inspiring they practically preach to us. Certainly, British filmmaker Stephen Walker pushes a few buttons here and there, and middle-aged choir director Bob Cilman can ham it up with the worst of them, but these are minor quibbles in what is destined to be another overlooked fact-film come Oscar time.



© 1999-2014 PopMatters.com. All rights reserved.
PopMatters.com™ and PopMatters™ are trademarks
of PopMatters Media, Inc.

PopMatters is wholly independently owned and operated.