PM Pick

When Worlds Collide

Colin Harvey

Columnist Colin Harvey writes that video games suggest we may be altogether more Renaissance -- and less of a divided mind -- than we give ourselves credit for. On the one hand we have video game players: art lovers who aren't allowed to say it and science buffs who don't realize it. On the other hand we have the creators of games: mathematicians and scientists who are really artists, and artists who are really scientists.

Are video games art or science? It's unlikely to be a question that preys very much on the minds of either players or non-players of the manifold digital diversions currently available on consoles, via the Internet or in the arcades. Neither the diehard nor the casual player of Prince of Persia, The Sims or the latest iteration of Time Crisis need concern themselves with such an apparently abstract question, since it doesn't impact on his or her immediate engagement in the virtual milieu. At least not if the game is any good. Similarly, for the non-player, whether beguiled, intrigued, bemused, or appalled by what the medium offers, or in fact excluded from the medium or simply non-committal about it, the question of which side of the science-art binarism games belong on simply isn't seen as relevant to a discussion of their pros and cons. For players and non-players alike, games are purely and simply one thing: entertainment.

In 1956 the British writer CP Snow first used the expression "The Two Cultures" as a means of characterizing the split between the arts and the sciences. Snow was encapsulating a state of affairs, the roots of which stretch back through Romantic thinking to the Enlightenment. Since few of us would classify ourselves as polymaths, equally adept in the worlds of physics and literature, biology and cinema, we could easily conclude that The Two Cultures continues unabated, at least at the level of the distinctly non-Renaissance individual. Except that we could also argue that in the 20th century, art and science became, through means of production and distribution, inextricably linked. In fact, we might even see the video game as the ultimate expression of what happens when these two worlds collide.

Art in the age of mechanical reproduction, to invoke the famous line of another 20th century thinker, Walter Benjamin, surely requires the artist to be both artist and scientist. That as players we see the product of the collision between art and science, rather than the constituent elements that contribute to it, is a tribute both to the adaptability of the medium and to our own adaptability as users. As we've become more technically adept, games have become easier to use and more redolent of the kind of leisure activity we're familiar with from other kinds of entertainment.

Any survey of the existing marketplace reveals a surfeit of games that look for all the world like interactive movies, big budget thrillers, horrors, or adventures, which utilize many of the techniques and tropes familiar from the flickering and dancing images available on the cave walls of your local multiplex. Outside of those games that simply look like films, such as the recent Splinter Cell 2: Pandora Tomorrow and Manhunter, there are those games actually based on cinema releases. The merchandising onslaught for big budget Hollywood movies invariably includes, amidst the themed bathroom soap, specially repackaged baked beans, and iconic mouse mats, a licensed video game of some sort. My shelves are packed with polygon iterations of Hollywood celluloid, some of which exhibit strong and fruitful genetic connections with their movie parents, like Spiderman, Rocky and Lord of the Rings: The Two Towers. Alternatively, many, many movie licenses-turned-games feel like the predictable outcome of kissin' cousins taking their tactility one step too far: that lamentable six-toed runt of a game Enter the Matrix is a recent and particularly distressing warning to those tempted by incestuous cross-media breeding. Watch the gene pool, folks.

Whether a game is a successful reinterpretation of an existing cinematic work or instead employs techniques we recognize from film, contemporary games are guilty by association with movies, the pre-eminent mass entertainment form (along with TV) of this and the last century. But whereas sometimes movies are considered art, seldom is the same claim made for games. Neither do we think of video game players in the same way we might think of cinema buffs, jazz fans, or regular connoisseurs of art galleries. But if the wider culture's image of a game player largely omits to acknowledge that art appreciation might play some part in their play, other stereotypes hold fast.

Historically, game players have often been characterized by non-gamers (and gamers themselves, indeed) as geeks, often bereft of the social skills necessary in maintaining real relationships with real people in the real world. According to the stereotype, gamers are male adolescents, are often spectacle-wearing and overweight, and invariably possess a penchant for heavy metal music. This caricature may sound extreme, but anecdotally speaking, I'm not convinced that this particular view of gamers has necessarily subsided, despite what the facts of game consumption might otherwise tell us.

Set aside the small detail that the increasingly surreal nature of the violent reality happening around us arguably makes greater involvement in the "real" world a fairly unedifying proposition. Consider instead that the stereotypical construction of the nerdy male adolescent game player runs counter to the facts: as the British academic James Newman highlights in his comprehensive new book, succinctly entitled Videogames and published by Routledge (March 2004), the IDSA (Interactive Digital Software Association) puts the average age of a game player at 28 years old.

And game players aren't always male. In the past, media reports would periodically indicate the popularity of diverse games like Pacman and Doom amongst female players in duly patronizing, astonished tones. But female usage of games is more varied and widespread than the mainstream media generally leads us to believe. Dr Kathryn Wright, in her article Video Gaming: Myths and Facts, available at the website WomenGamers.com, indicates that some "...43% of PC gamers and 35% of console gamers are women." If we keep going at this rate, the male science geek caricature of the proto-Mr Magoo with the Darkness-fixation should crumble in, say, the next 50 years or so. Why, then, does the stereotype still obtain?

My suspicion is that it's a hangover from the past. Older game players, those of us that earned our stripes trying to load often reluctant, sometimes downright petulant games from cassette tape into the teeny weeny RAM of our struggling Commodore 64, ZX Spectrum or Atari 800, saw more clearly the science that underpinned our game experiences than most contemporary game players ever get to do. We could even type in programs from game magazines that, on rare occasions, actually transformed into working, albeit extremely simple, games. Game players of yore really did fit the Dungeons and Dragons player uber-nerd M.O., because we were the ones with access to the machines, and we had the time to learn the idiosyncrasies of the software and hardware. Crucially, in Britain certainly, we were also the ones who benefited from an educational system in which science and computing education disciplines were still heavily targeted at males.

Contemporary console games are presented in ultra user-friendly terms involving only the insertion of a glittering silver disk into the machine's waiting maw, thereby rendering game play an easy, consistently smooth activity. A contemporary game user generally waits a limited time before the machine yields up some more or less well-realised virtual environment, which then might or might not dazzle in terms of gameplay and graphical and audio sophistication. Forget having to be subjected to the flickering loading screens and noises that made the run-up to gameplay on first generation home computers reminiscent of KGB interrogation procedures. As a British Prime Minister once opined, "You've never had it so good".

Ask makers of games if they think games are science or art, and they might well answer both, perhaps also insisting on the primacy of "commerce" in the definition. Gone are the days when games were created by one, two, or three individuals in their laboratory, or bedroom or garage. Contemporary games are team efforts, and they command budgets that would be the envy of independent film-makers, if not quite Hollywood. Take a look at the job descriptions on offer at the leading industry watering holes, and you soon see the level of specialization now required by game developers. A quick survey of vacancies at Gamasutra.com, GameJobs.com, or in the pages of magazines like the British publication Edge reveals manifold different kinds of game jobs. But despite the contention that arts and sciences disciplines are of equal importance in the creation of video games, the dominance of science is very much the reality of this industry, and is evident as soon as you read the prerequisites in the job descriptions.

Aside from the positions requiring high-level knowledge of various programming languages, many design positions also require knowledge of design software. This is not an industry that takes well to the idea of artistic muse separate from technological savvy: if you've got mathematics or science on your side, you'll do well. Without these skills, it's a trickier proposition altogether; not impossible, but trickier.

Do The Two Cultures really exist? On the contrary, video games suggest we may be altogether more Renaissance than we give ourselves credit for. On the one hand we have video game players: art lovers who aren't allowed to say it and science buffs who don't realize it. On the other hand we have the creators of games: mathematicians and scientists who are really artists, and artists who are really scientists.
