Death is rarely scary in games, mainly because it’s so common. As with anything else that we experience multiple times, death loses its impact. This is an obvious dilemma for horror games. Death is only scary when we don’t die. But when a horror game embraces this contradiction and helps the player stay alive for as long as possible, it becomes truly terrifying in a way that few games can manage.
A couple of weeks ago, Jorge and I embarked on a journey. With full wallets, empty bellies, and half-tucked shirts, we made our way to Subway. Purchasing some food allowed us to stave off hunger and gain early access to Uncharted 3's multiplayer. I was particularly fond of the beta, so this was a chance to check out the full mode, as well as to test a relatively new means of promoting and marketing a game. Now that I've had time to play the game a bit more and to reflect on the promotion itself, I feel like my opinion of fast food sums up the Uncharted 3 multiplayer early access experience: it was immediately satisfying, but I fear it's ultimately unhealthy.
I only half watched Sony’s new “Michael” ad late one night (see below if you haven’t seen it yet), as I was fixing myself something to eat during a commercial break. I stopped, somewhat mesmerized by the array of video game characters that suddenly appeared as (more or less) live action characters on my television screen.
The sight of a "real" Solid Snake discussing war in a throaty whisper was what gave me pause. Then I was charmed by a portal opening behind the flaming head of Sweet Tooth, through which I caught a fleeting glimpse of Chell flitting by. But it was the Little Sister, peering at me through the crowd in that eerily distant way of hers, who left me a little stunned.
I’m not sure exactly why. It was seeing that strange creature transported out of her home medium into the “real world” of the televisual that made me realize that “my characters” had somehow arrived in what I think of as the “real” mainstream media. You know, television, that thing that my mother and father watch, not video games—that space left for me (a late-thirtysomething in obvious arrested development) and the kids.
These past two weeks, as part of one of my game studies classes, I've been taking a largely uninitiated party of undergraduates through the paces of a tabletop roleplaying campaign. We had just come off a screening of Darkon and a series of readings on the Atari 2600 (including Adventure and its origins), so we were all of a mindset to begin exploring actual game creation and interacting with real systems. Our professor, taking a philosophical approach to the subject that I wish more academics of new media would, divided the class into three groups: gamist, emphasizing combat systems; simulationist, emphasizing ambient world effects and modeling; and narrativist, emphasizing storytelling. I DMed for the last of these.
"But wait, Kris," I hear you saying, "Aren't you a ludologist?" I'm glad you asked, dear reader. I actually think of myself as a post-Aarsethian ergodic narrativist/aestheticist, but that is neither here nor there. The Great War of ludology versus narratology was an important conversation but is a decidedly dead one, and it hardly matters whether anyone won (arguably, the only winners were the ones who didn't play). What does matter is that my professor suggested that narrativist tabletop roleplay was beset by cliché and was the structurally weakest of the play types. That sounded like a thrown gauntlet to me.
High scores, achievements, leveling up. The system runs on points, measures us in points, validates us in points.
This week the Moving Pixels podcast considers the value of points. Which points matter to us? Why do we want them? Why do they matter?