Gun Control: The Video Game Obsession with the Sword and the Gun

Devon Campbell

The key observation to be made about the evolution of games into the violent extravaganzas of today is that, if you change the controller, you change everything.

The sword and the gun: the two items your character is most likely to wield in a top-selling videogame. Why is this the case? Much of it can be answered by taking a long, hard look at a staple of gaming since the beginning: the controller. Although it has evolved, very little has changed between the NES gamepad and the 360 and PS3 controllers we destroy our thumbs with today.

Many critics have claimed the chief factor holding gaming back from gaining legitimacy is the prevalence of violence as the primary mode of interaction. They will point to movies as an example of the direction games need to go. What if every movie were an action movie? I don’t know about you, but I would have trouble taking them seriously. Movies are capable of expressing so many things that seem impossible to express from behind the barrel of a gun.

Those critics will find no argument here. I, for one, would love to see innovation in this regard, but the movie comparison is seriously flawed. Games carry a burden that movies avoid entirely because of the expectations of their players. Where movie fans expect to absorb a crafted story over a couple of hours, gamers expect to interact with a story that unfolds over many. Because developer resources are limited, that time will be filled with just a few tasks players perform over and over; developing a game that allows the player to do many different things is far more expensive. Complicating the issue further, these tasks will be performed using a standardized controller -- a controller very similar to the original PlayStation's DualShock, which has gone almost unchanged for more than ten years. Doesn't really leave much room for innovation, does it?

These factors among others -- namely, the cost of producing games, the risk associated with straying from established norms, and the fact that shooting and fighting are great compromises given the circumstances -- result in a litany of shooters and action games. Mainstream polygonal games (that is, flat 3D as opposed to our modern glasses-requiring variety) tend to feature a humanoid character we immediately understand we can move with the left analog stick, and a virtual camera (and often an aiming reticle) we can move with the right. Developers starting from this framework speak a language gamers understand and one new players can quickly learn, shedding a lot of overhead that might otherwise be tied up in tedious tutorials.

The conventions are so well established that they are difficult (albeit not impossible) to mess up. If developers wanted to mix things up a bit -- have the character use some other means of interacting with the world, or even ditch the concept of a character altogether -- they would not only need to teach the player how that works, but they would have to come up with some mechanic which stays fun and fresh for a 10-15 hour experience. That's a tall order.

I can easily justify how we got here and why we seem to be stuck in this cycle, and I wouldn't want to be deprived of my Fallout 3, my Mass Effect, or my Red Dead Redemption. At the same time, I want to see the industry succeed in terms of cultural relevance. Sure, gaming has relevance in pop culture, but I want gaming to be an art form for the history books, not just something kids did to waste their time. Developers should push against these conventions and break them whenever possible. Luckily, many have done so with varying degrees of success. The most recent prominent example is Heavy Rain.

Heavy Rain

David Cage is a trailblazer for games as a medium of artistic expression. He set the wheels turning for many of the ideas that would become ensconced in Heavy Rain years earlier, in 2005's Indigo Prophecy. The analog sticks do double duty, moving the character and acting out gestures for the often menial tasks the character performs. It's hard to call this implementation of non-violent interaction "fun", but it certainly creates a different kind of bond between the player and the character on-screen. If we want to push forward with the common movie analogy, many movies are not so much "fun" as they are emotionally affecting -- look at Schindler's List. People still count movies like these among their favorites, despite the discomfort and sadness that sometimes come from watching. Perhaps this is the direction videogames will move in pursuit of cultural relevance.

Heavy Rain provides one of the most timely examples of the departure from violence as the chief interaction, but another comes from an even deeper pedigree: the point-and-click adventure. As an evolution of the early text-only adventures, point-and-click adventure games sexed up Zork and his kin with fancy graphics. Popular in the '80s and '90s, the adventure genre has seen a resurgence led by Telltale and LucasArts along with a boatload of indie developers. Games typically consist of talking with other characters, finding items, and solving puzzles. These certainly fit the bill in terms of non-violence, but it's difficult to envision these games in their current state pushing gaming to any greater levels of prominence as they are, in many ways, more abstract than the first-person shooters from which they distance themselves. The player will most often find himself moving a pointer on the screen to instruct the character to interact with an object in the environment. In addition to the added abstraction, here we have another game mechanic that is, in and of itself, not fun. There is nothing inherently fun about pointing and clicking objects. Point-and-click adventures are left to stand on their stories, dialog, and puzzles.

Silent Hill: Shattered Memories

One genre that could use less empowerment of the player character is survival horror, but developers seem intent on playing it safe by giving the character a firearm. Silent Hill: Shattered Memories is a notable recent exception. It makes perfect sense that, in a game where dread and hopelessness are the currency, the character would be left defenseless, but, in fact, the game is a re-imagining of the first title in the Silent Hill series, which did include a number of melee weapons and guns. The game's lead designer, Sam Barlow, explained the decision in an interview ("Silent Hill: Shattered Memories Developer Interview", 05 Feb 2010):

"[It] honestly never occurred to us that Harry ought to have powerful offensive options. This was always a more psychological story where the biggest questions were about Harry and what is going on in his mind. It never made sense to have Harry shooting at zombies with a shotgun."

None of the games mentioned above have received the success or universal praise of the high-profile, traditional, action-focused games of the day: the Uncharteds, the Mass Effects, the Halos. This is almost certainly the number one reason a greater variety of experiences goes unexplored. With millions of dollars and the livelihoods of possibly hundreds on the line at major game studios, there is little room for risk. As a result, indie games have become the primordial soup of experimental gameplay. This is partly by necessity -- they can't compete with the big boys' big budgets by releasing the same types of games -- and partly by privilege -- with relatively little at stake, breaking with convention is the best way to stand out. Flower, Braid, The Path, Uplink, and countless others have taken up the torch of moving games forward. The most compelling ideas in these games ultimately catch the attention of someone in the big leagues and slowly seep into major releases.

Until such time as AAA game development is less risky, we will have to depend mostly on the indies to advance interaction on standard control interfaces. Perhaps the key observation to be made about the evolution of games into the violent extravaganzas of today is that, if you change the controller, you change everything. There were relatively few games trying to get the player to aim accurately with the NES gamepad because it was tedious; the controller was ill-suited to the task. By Wii launch day, Activision had already shoehorned motion controls into its latest shooter, Call of Duty 3. As with most motion-controlled first-person shooters, the game didn't quite live up to its gamepad-controlled counterparts. A play mechanic that is perfect on a gamepad is, at the least, not as fun with motion controls... or dance pads... or guitar controllers. Once players learned this didn't work well, shooters stopped selling well on the Wii; once developers saw that, fewer shooters were released for it. The Wii library may not be much for the other consoles to strive toward, but much of it provides evidence that violence is not the only way. So we find ourselves carrying on two struggles simultaneously: one to develop new ways to keep games fun while telling different kinds of stories with a one-size-fits-all controller, and one to develop new controllers that will open doors to new and interesting ways to interact.

Which is the correct approach? While shooters are an easy and proven paradigm on the console, there must certainly be other ways to interact with our games using a gamepad. On the other hand, there are modes of interaction we will never be able to explore with the gamepad; for those we will need more varied inputs. Developers should approach the problem from both angles. The easiest way to fast-track the medium in a different direction -- and perhaps away from its obsession with guns and myriad other weaponry -- is to shift focus to a new input device. If the promises Microsoft and Sony have made to the hardcore about the new generation of motion controls hold true -- that they are supplemental to existing control interfaces rather than all-out replacements -- they could provide creative developers with the extended toolset they need to break the mold of the shooters and brawlers. At the same time, they will push more conservative developers to keep pace with those actually using the new hardware to drive the new experiences players want.

It’s still a bit too early to tell for sure, but motion could actually be the saving grace we are looking for.
