As a new contributing blogger to GeekOn, I’m not entirely sure if I’m old enough to be the grizzled veteran of the bunch, but I’m going to go ahead, unilaterally seize dramatic license, and tell you that I am.
Entering my 30s, I became aware of a growing disconnect between who I thought I was and who I was actually becoming. Like everyone who has come before me and everyone who will come after, there’s this one shining moment where you wake up in Adult Land and wonder how the heck you got there.
What happened to the days of going to concerts and staying out way too late? Who swiped that and replaced it with 9:30 bedtimes and an embarrassing interest in business seminars? Since when do I have lengthy conversations about watering the foundation of my house and forget to yawn? How do I know more about 401ks than the drink specials in whatever needlessly hip spot kids are frequenting these days? Somehow, my plan to remain young and cool forever had gone terribly awry.
Though, truth be told, “cool” is probably not meant the way most would understand it.
And you, sweet gamer, likely know exactly what I mean.
I was introduced to gaming in the halcyon days of the Nintendo Entertainment and Sega Master Systems. Atari, the granddaddy of them all, was still around but losing ground to the 8 full bits of majesty promised by the third generation of console gaming systems. I graduated to the Super Nintendo in the mid-90s and, though PlayStation has been my console of choice ever since, it is rarely used for any game that does not involve a ball, puck, hidden blades, or gratuitous carjacking.
While I had dabbled in PC gaming throughout the 90s (King’s Quest, Warcraft II, Ultima Online – woot!), it wasn’t until I showed up to Clark Hall on the campus of UNT in the fall of 2000 with my own computer that I really entered my gaming heyday. The days were spent tweaking my rig to squeeze out maximum frames per second; the nights were spent on endless sessions of StarCraft on Battle.net or Counter-Strike on one of the dozens of hosted servers on campus.
I started following game development on message boards and immersing myself, as geeks are wont to do, in the nooks and crannies of gaming culture. This was, in many ways, a double-edged sword. By dissecting what made games great, I developed a better appreciation for technical innovations, presentation, and storytelling. The downside was that dissection, as anatomy and biology classes teach us, tends to kill your subject.
Patterns that were invisible before became distractingly obvious (like the Broken Glass Theory, for fans of How I Met Your Mother). Bad user interface design, insufficient code optimization, poor voice acting, overuse of non-engine (CGI) cutscenes, lack of customer and product support, overmarketing, intellectual property money-grabs; the more I played, the more patterns I began to recognize. Consequently, each new problem caused my enjoyment of video games to flag just a little bit more.
Beyond any of these technical issues, however, it was the storylines I found most wanting. The more games I played, the fewer original plot arcs I experienced. It became shocking how nearly every plot device was plucked from the Overcoming the Monster/Rags to Riches/Quest/Voyage and Return literary plot tree. And it wasn’t just the overall plots that frustrated. The deus ex machina (the literary device, not the groundbreaking first-person shooter), long a source of excitement, became a groan-worthy distraction and a barrier to immersion. And how many amnesiac main characters do there need to be before we all just admit the trope is tired and needs to be jetcanned into the farthest reaches of space? IGN took this on recently in a Top 10 Worst Plot Devices article, and it is spot-on.
So what’s a gamer to do here? Sure, every so often you get a jewel like BioShock Infinite that just rocks your socks off with book-worthy writing, cinema-grade setpiece moments, memorable characters, and evocative plot developments. But most of the time, the game you’re playing has been done before. For a particularly egregious example, go play Dawn of War II and tell me it isn’t StarCraft with some minor scrambling, right down to the races, characters, and story moments. When you play long enough, you start to notice these patterns, and it can be hard to drum up the same excitement when all you have before you is recycled, rehashed, and repackaged content.
That is why I believe the saving grace of video games, the future and the hope of this particular form of entertainment, lies in getting developers to start telling stories rather than dictating them. To put it another way, instead of railroading a player through recycled and unoriginal storylines, game developers need to start spending their time telling the stories of their players.
Create the sandbox, let us play in it, and use our own emergent stories to shape the narrative. Create a compelling environment, nudge and tweak for balance, and let us define our own experience.
And now, dear reader, you probably have some idea where I got the name for this column.
Over the next few months, I’m excited to roll out some of my long-harbored opinions on gaming (electronic AND tabletop), and how those of us 20+ years deep in gaming culture can find ways to continue enjoying our favorite pastime. There will surely be polarizing opinions and shockingly declarative statements, but you’ll also see little nuggets of wisdom, philosophical musings, and even some vaguely inspirational stuff on how gaming can actually make you a better person, at home and at work.
Everyone is welcome to post their comments and grievances. I would only ask that you make at least a passing attempt at civility, and oppose, by any means necessary, the intrusion of Godwin’s Law.
With love, pew-pew, and no QQ,