CHICAGO — The video game is 40.
Its exact birthday is arguable. Cultural historians likely would date its origins further back, roughly a decade or two; prototypes of arcade games flourished in university computer labs in the 1950s. But “Pong,” the first viable mainstream video game, blipped and blooped into bowling alleys and bars in 1972, and the Magnavox Odyssey, the first home gaming console, arrived 40 autumns ago. So, let’s agree, 40 years old.
It may be the last time we agree.
Because chances are you find the video game a marginal presence in your life, at best a time-killing commuter distraction, somewhere on the cultural scale between Honey Boo Boo and “Ice Age” sequels. But you would be wrong. The video game at 40 is having a midlife identity crisis, albeit one that is redefining what the medium itself means. If you haven’t touched a joystick in decades, you would be disoriented: Metaphorically speaking, Mario’s Kart is a Passat now, Donkey Kong more of a tweedy Dr. Zaius. The video game at 40 is often thoughtful, aesthetically sophisticated, more diversified than ever and rigorously thought about. Lately, it’s been wondering whether it’s art or should even aspire to the label; often, the answer is yes.
But it’s complicated.
The video game at 40 is also self-conscious and stuck in its ways, its direction uncertain. It loses its car keys a lot now. Financially, it’s unstable. Creatively, it’s not sure what anyone expects of it anymore. It can be flatulent, frustrating, juvenile, insular and needlessly abrasive — as technologically innovative as it is intellectually inert. But here’s the thing: Those contradictions are OK, even welcome. Because, in the life span of any art, 40 is like 15, and while rock ‘n’ roll considers retirement, the movie wonders about senility and the book ponders the grave, the video game, four decades old and more ubiquitous than ever, is maturing.
“You can’t deny video games increasingly do what art does,” said Julian Dibbell, a University of Chicago law student who once literally made his living buying and selling virtual items in online gaming worlds. “Video games shape identities now, they have ideas that carry weight in the world. There’s a culture there, and games with larger meanings. In many ways they are a tighter fit with contemporary culture than more traditional arts. In fact, why even hold up novels or films as what games should aspire to when, if we think of art as central to the way we live, the interactive nature of games is more relevant? Why aren’t other arts more like games?”
Nobody said puberty was pretty.
The problem is, try getting anyone to settle on what maturity should look like here. Does a mature video game culture mean a culture that is grim, serious and ambitious? A culture that accepts itself as broad entertainment? Or a culture that, as with film, literature and theater, reflects multitudes?
“You can’t expect the video game or video game culture to ever be entirely mature,” Nolan Bushnell said when I asked what he thought of the medium he more or less created 40 years ago. In 1972, Bushnell and programmer Ted Dabney started Atari, then developed “Pong,” which popularized video games in general a few years later when Sears sold a home version. “Considering all the threads that exist now in video game culture, I think it would be impossible for the video game to be entirely mature. Would you even want it to be?”
Of course not, though surely he had loftier aims for the video game than gobbling quarters?
“There was theoretical stuff involved,” he replied. “We learned it was all about balancing difficulty and ease. See, you wanted it to be easy enough for people to play but not so easy that they got very far every time.”
In other words, no loftier aims.
But even cinema had to work its way toward “Citizen Kane.” The fact that video game culture has arrived at the point where we can have a serious discussion about its maturity is itself a sign of its maturity. If becoming a viable art means having a worldview, affecting the way people think about the real world, drawing on its classics to create new works while recognizing that the older works are as rich as newer works … well, then, that culture is rich, ingrained and here. Next year, there will be a video game titled “Watch Dogs” that explores the modern surveillance state — and, whaddayaknow, uses an intricately mapped digital Chicago as its sandbox. “Wreck-It Ralph,” the new Disney movie, celebrates decades of video game lore, much the way “Who Framed Roger Rabbit?” did with a century of animation.
“Did we do demographic research to arrive at this subject?” asked Clark Spencer, the film’s producer. “No, it was less strategic. It was because we wanted to make a movie that appeals to everyone, and video games have become such a part of the daily fabric, it was very odd that it hadn’t been done before.” He said when the film was in early preproduction, a sheet was posted in the Disney animation department that asked for suggestions of which classic characters from the genre to include. The list quickly became many hundreds of names long.
When Pitchfork, the online music magazine, decided last summer to expand into other arts, it partnered with the smart gaming literary journal Kill Screen (launched by a former Wall Street Journal reporter) and started Soundplay, a companion site dedicated to the thoughtful discussion of video games. Asked why games, why now, Pitchfork President Chris Kaskie said, “We are interested in emerging culture, and the indie game scene, with its roots in digital, feels closest to where indie music is now.” Indeed, the independent game development scene, bolstered by the ability to distribute its work cheaply through mobile phone apps and the Xbox 360 marketplace, is not unlike the indie rock scene: awash in invention, often as committed to serious ideas and reimagining tropes as it is to reinforcing entertainment value and escapism.
Tracy Fullerton, the influential director of the Game Innovation Lab at the University of Southern California, recently began work on a game based on the philosophies of Henry David Thoreau. She told me she had been playing “Animal Crossing” with her niece and noticed that underpinning the game was the idea that, to progress in life, one had to acquire a bigger house and the approval of the community. She found it depressing. “So on a trip to Walden Pond, I wondered if you could make a game that expressed the ideas of Thoreau. If by using sounds of the forest, his ideas about basic needs, you could capture his spirit through a game.” The result, “Walden, a Game,” partly funded with a $40,000 grant from the National Endowment for the Arts, is among a growing subgenre, the meditative art game.
When I asked Ian Bogost, a game developer, professor of digital media at Georgia Institute of Technology and among the best of a new wave of video game critics, what this all meant, he sighed for a long moment.
“It’s hard to say, but it’s a conversation that has to be had,” he said, “because people, surrounded every day by games on their phones and Facebook and Xbox or whatever, don’t necessarily see themselves thinking about video games or understanding the video game. But they do and they are, and the broadening of the medium is key.
“As with photography, television, what makes an art powerful is not that it is placed in a gallery, but that it becomes so commonplace we don’t notice it anymore. A domestication happens. We have to face the music of what that means, that one of the challenges of maturity is recognizing this is OK.”
To put it another way: The end of the video game as a niche is the new beginning of the video game.
I don’t know how I feel about this.
Video games were my first love.
I still play video games. I am a lifelong gamer. My current game, my go-to for the past couple of years, is a visceral little war number called “Battlefield 1943,” in which I am thrust onto a side (Japanese or American), then fight my way through an elaborate match of Capture the Flag. I am very good and often dominate every match I am placed in, mercilessly dive-bombing Iwo Jima, then circling the digital island and striking again. It’s how I unwind. It’s not high-minded, there is no narrative. And as for having emotional resonance in the real world: Well, I play against real people who are logged in around the world; they swear at me over headsets and send notes via the Xbox that curse me in exotic languages. Does that count?
It’s not grown-up and I don’t care. I sort of hate that this is true, but it is: Before movies, books, music, “The Brady Bunch,” I was hooked on games as fun. When I was 8, video games were 7. We grew up together. I developed a Stockholm syndrome attraction to the medium, buying every new system the day it came out, accumulating a lifetime of questionable, expensive decisions.
Which is why I cringe every time I see a grown man on a sitcom transfixed on his couch, playing a video game. It just feels too close. As with any love, video games have always been a source of guilt. Indeed, they were the reason for my first ethical dilemma: Unbeknownst to my mother, I won an Atari 2600 in a Little League raffle. Though I already owned an Atari 2600, I kept it anyway — just because. There is something selfish and myopic about the lifestyle.
Consider Dave Lang, who started Iron Galaxy Studios, a well-regarded Chicago video game development firm. He grew up in Richton Park. He lost a childhood to video games. In sixth grade he received a Commodore 64 and spent his summers in his bedroom, developing games on it. Because he made something of this, he doesn’t sound particularly conflicted.
But when I raise the question of games as art — whether games should aspire to be more than fun — a familiar unease creeps into his voice. “Developers pushing games in new directions has no doubt expanded the base of players,” he said, “but to be honest, when developers aspire to art, subvert expectations and go for serious subject matter, it’s interesting. But if it’s not fun, I don’t care.”
That’s how I feel: torn.
He pointed me toward a new PlayStation 3 game, “Papo & Yo,” which begins with the hero growing up in an abusive home and navigating his demons, however metaphorical. The game, a work of autobiography, even starts with a dedication: “To my mother, brothers and sister, with whom I survived the monster in my father.”
Without developers willing to push boundaries, the medium founders. But the pretense can be unpalatable.
To an extent, Chicago played a small part in stoking that ambition: In 2005, Roger Ebert wrote a column dismissing the idea that video games could ever attain the resonance of a traditional art form. Since games require choices, he reasoned, while serious film and literature require passivity and “authorial control,” games would always be at a fundamental disadvantage.
However questionable his logic, he had a point: Being unable to affect the outcome of a work of art, being unable to stop Moby Dick from drowning Ahab or Michael Corleone from selling out his ideals to become the Godfather, being forced to look on helplessly is intrinsic to what makes many works of art affecting.
Jonathan Blow, a San Francisco-based video game developer, arguably the most controversial in the medium, famously answered this argument. Like many game developers, he sees the future of games not in generating narrative but in generating emotion and thought. A few years ago he released an ingenious game on Xbox, “Braid.” It was popular, fun and relatively familiar, about a hero rescuing a princess from a monster. Except, scene by scene, you begin to grasp that the princess is actually fleeing the hero. The game is about a breakup and regret. It aspires to pain.
His next game, due next year, is existential, he told me, “about what it means to walk around in a (digital) world, see and do things — a broad but difficult question.”
He sniffs that he doesn’t even consider himself part of the video game industry. He compares the development of the video game with the comic book: “There is nothing about the comic book itself, sequential pictures on a page, that demands it be constrained to superheroes. Yet here we are. Video games have a similar legacy of inertia, and I don’t know if anyone can break it, frankly. The Wii, iPhone games — it may broaden the reach of the medium, but it doesn’t mean it will be for mature, reasonable adults, either.”
Which is fair, somewhat naive, lacking generosity, though maybe necessary: “The kind of tension that Jonathan represents,” Bogost said, “is natural to the evolution of the medium. A David Lynch is only possible if you already have a Michael Bay.” But where the medium could get hung up, Fullerton said, is in the worry about being an art: “The question should be more fundamental: ‘Is this important to people?’”
About a week ago I was standing outside the Chicago Theatre. Going on inside was the rehearsal for a concert featuring the music of Nintendo’s 25-year-old “Legend of Zelda” video game series. The Chicagoland Pops Orchestra was running through a pounding, intense score, though on a screen behind it were comically mundane, old 8-bit images of Link, the series’ hero, gliding around a digital kingdom. The whole thing felt middlebrow, silly.
But on the sidewalk out front I met three guys in their 20s from Palatine. When I asked why they were going to this (sold-out) concert, they said they were fans of the “Zelda” series, they had never been to a concert of video game music, but “when you hear the music, it jolts you back to how you felt when you first experienced it.” They could have been talking about the Beach Boys. But there was nothing middlebrow about it. Another characteristic of genuine art is that the memory of a work is as rich as the work itself.
Dan Greenawalt, creative director of Microsoft’s decade-old “Forza Motorsport” series of ultrarealistic racing games, calls this everyday integration of the medium “the gamification of the world.” The “Forza” series, for instance, is so detailed the game is used at automakers’ board of directors meetings to virtually kick the tires of new car models. Greenawalt routinely hears from people who use it as an alternative to test-driving cars in the real world.
Jaap Hoogstraten, director of exhibitions at the Field Museum, said, “We’re not looking to make a ‘World of Warcraft,’ and we don’t know much about what (video games in museums) means yet, but some kind of video game is becoming a standard part of shows.” Indeed, Sean Dove, a Chicago graphic designer who often finds himself using the design and iconography of games as his inspiration, has an Xbox on the floor of his North Center studio.
But that integration is slow. When new students come into the game development program at DePaul University, they also tend not to have thought that much about the relevance of the medium, Jose Zagal, an assistant professor in the program, told me. In fact, he said, they’re often deeply resistant to the concept.
“They haven’t started to think critically about games, the messages given off, why a game makes them feel or not feel a certain way,” Zagal said. “I’ll ask them if it’s an art, and they say, flat out, ‘Not really.’ They haven’t considered the idea of free will in the ‘Call of Duty’ series. They tend not to have thought about the medium at all. Though more and more, we do see students who have. We see students who dig into old games the way a filmmaker might look at the movies of the 1970s for inspiration.
“People don’t question the idea that movies can be philosophical anymore; nobody asks if it’s art. But I bet 80 years ago or so, that debate mattered.”