Game Studies 2.0 or Nach dem Spiel ist vor dem Spiel
Only five years after the ‘year one’ of game studies, it looks like the colonization of gamespace is over. First came the narratologists, seeking refuge from a world that had grown increasingly hostile to their humble profession year by year. They were usually well-meaning, bespectacled and slightly disorganised, wearing corduroy jackets and a mildly puzzled expression on their faces. To anyone but themselves it was clear that they would not last. Then came the ludologists, looking for a playground that they could claim for themselves, getting all huffy and puffy about anyone who dared cross the lines they drew in the sand. And draw lines they did. Efficient, practical-minded, and with an eye for interior design, they transformed the field into something that looked suspiciously Ikea-like: there was a choice between different theoretical models, sure, but they were all based on the same basic principle: rules, also known as the Allen key of game studies.
This system worked remarkably well for a couple of years. Even the narratologists had to admit that the Juul armchairs were surprisingly comfortable, and the Aarseth rug really did hold the room together. Everybody got along so well with everybody else that being a game researcher sometimes felt like being part of a family from a 1950s sitcom. It was like an experiment in behavioural psychology gone wrong: here were all these people playing upward of 16 hours of Halo per day, and none of them ever got the least bit angry when somebody stepped on their toes. Even the ludologists mellowed considerably, eventually disclosing that they loved stories too. All along everybody had assumed that they were these badass research types who sneered at anything that looked like it had narrative potential, but as it turned out they were just as human as you and I. Some of them even cried over the death of Pey’j in Beyond Good and Evil.
Even if it felt weird at the time, in retrospect it seems like there was no reason why everybody should not have been as close as peas in a pod. After all, we were all outcasts from the academic community. People made these very polite faces whenever they asked us what we did for a living. Some of them might even go as far as saying, ‘Oh, how interesting’. We were shunned by our colleagues, belittled by our friends, counselled to take alternative career paths by our spouses. But we persisted. The corduroy-jacket type of game researchers would play and replay Ico innumerable times, marvelling at the scenery, the depth of Ico’s relationship with Yorda, the profoundness of a story told without words. Meanwhile, the shaved head and Allen key kind of game researchers stripped layer upon layer of representation off their games until the transcendental beauty of the code shone through and bathed everything in the incandescent amber light of the Hercules graphics card.
In our hearts we all knew it couldn’t last, but it was still a shock to see the first signs of deterioration spreading through that idyll, although in retrospect it seems hard to pin down the decisive moment when our Ikea-furnished paradise started to go pear-shaped. Maybe it was when someone coined the term ‘serious games’. Maybe it was when finally our wishes were granted, and we got our own academic departments. Maybe it was when MTV launched a game called Darfur is Dying. By that time it was clear that the sell-out of game studies had begun. At first, some of us were in denial: ‘Serious games – you can’t be serious!’ Then the other four stages of grief quickly followed: anger, bargaining, depression, and finally a grudging acceptance that the world of game studies would never be the same again.
Of course, not all of us fell into a depression too deep to see the bright future of game studies. There were some who embraced this brave new world, and recognised its limitless potential. How clueless we had been, never venturing outside, where games were suddenly selling like hot cakes. And I’m not talking about the otaku queuing up outside Tokyo department stores, I’m talking about real people! Politicians, educators, media moguls, healthcare professionals, people with places to go and money to spend. All of a sudden, they all wanted a part of the action, even if it meant joining forces with academics. Not that they necessarily needed the academics, but they looked cute in their corduroy jackets, and they could fix anything with these Allen keys.
That was the first step towards the phenomenon which I will call, for lack of a better term, Game Studies 2.0. Think of it as a new, social-software-enhanced, Ajax-enabled, RSS-fed version of the endearing but hopelessly outdated game studies of yesteryear. Doing game studies on your own, with just your consoles for company, is like still using Internet Explorer, and keeping your bookmarks in a folder called ‘Favorites’. You might think that trends come and go, and you might be just a little bit concerned that Game Studies 2.0 is still, and will probably never cease to be, just a beta version. But, trust me, it’s here to stay, so you had better get used to it. If you are a very cautious and mistrustful person, you might even wonder what Game Studies 2.0 is all about. Well, let me tell you.
First of all, game studies is not just for gamers anymore. While the ludologists have always suspected that their corduroy-wearing brothers might not really be up to par when it comes to a no-holds-barred match of Super Monkey Ball, narratologists are indeed quite erudite in Myst, Riven, and Uru, and they like to pepper their small talk with obscure allusions to Maniac Mansion. They may be baffled by WarioWare, and they will surely never set foot on a dancemat, but they do play. The new entrants to the field, however, might be dabbling in games but they do it in the way someone might take up smoking a pipe when he turns forty, or start taking wine tasting classes. One particularly striking example of this new kind of games research is McKenzie Wark’s post-book GAM3R 7HE0RY, which uses leet in its title to signal its hipness to the ways of Game Studies 2.0.
But let’s not fall back into the habit of territorial pissings again – after all, it’s not 2004 anymore. Anyone is welcome in the world of Game Studies 2.0, no matter what their gamer credentials are. It used to be that in order to bear the title of a game researcher you would have to have a level 60+ character in EverQuest, be able to speedrun through Super Mario Land in less than five minutes, and play Tetris blindfolded and with your hands tied behind your back. Fortunately, these times are over. Since people are always talking about the same five games anyway, it’s quite enough to have played those. At the risk of the field of game studies being swamped by yet another wave of research refugees from the sinking ship of new media studies – but, hey, I am all in favour of a liberal immigration policy – here is the full list: The Sims, Deus Ex, Rez, Ico, and Grand Theft Auto 3. It helps to have a World of Warcraft account as well.
The other thing about Game Studies 2.0: it’s not about how you approach games. It’s about you. Displaying personal preferences is a great way to make new friends, and, just as importantly, new enemies. I am deeply indebted to Espen Aarseth for coining the term ‘World of Warcraft studies’, because this is exactly what Game Studies 2.0 is about. We all know, of course, that interdisciplinarity has never been more than a buzzword, but somehow we still kind of believed in the concept, because it sounded, well, friendly. World of Warcraft studies resists the cushiness of the ideology of interdisciplinarity, and takes this resistance to its logical extreme. I am sure I am not the only one who would rather read the Martian Times than one of those articles on Terra Nova that positively bristle with jargon.

In the final analysis, then, the real difference between Game Studies 1.x and Game Studies 2.0 is diversity and numbers. We are a force to be reckoned with. There are entire academic departments that depend on us, not to speak of funding proposals and a minor publishing industry that churns out book after book on video games. And churn out they do. There does not even seem to be a perfunctory pretence of copy-editing any more. Is this something that should make us wary? I think it is. It seems to me the clatter of the printing press might be singing the swan song of game studies – just before it makes way for the next big trend. Remember Internet Studies? Neither do I. And yet, in a sense, Internet Studies never went away. It just somehow dispersed into the academic landscape, becoming more and more irrelevant as it spread.
I don’t want to end on such a despondent note, so some more celebratory words on the emergence of Game Studies 2.0 seem in order. It strikes me that the field of game studies has matured considerably in the last two years or so. We have seen a number of potentially groundbreaking publications that have considerably increased the theoretical and methodological tools at our disposal. It’s not just the Allen key for us anymore. I am thinking of books like Digital Play by Stephen Kline et al., which struck a decisive blow against the prevailing ignorance of the political economy of games, of TL Taylor’s Play between Worlds with its innovative methodology and dedication to serious scholarship, of Ian Bogost’s Unit Operations, which opens up new theoretical horizons for game criticism.
There are others, but I will let these three stand in lieu of less visible but no less influential publications. What all of these publications have in common is an appreciation for the fact that games do not take place in a vacuum; they are embedded in cultural, social and political contexts. I am not a believer in serious games, but I believe that games can be the sites of serious discourse. And even if I am making fun of game studies for its proneness to be swept off its feet by every new trend that comes along, I believe that it now has the potential to find a more secure footing. As I said at the beginning, the colonization of gamespace is over, and we are entering a phase of post-colonialism.
So what I have called Game Studies 2.0 is really a phase of transition between colonialist game studies and postcolonial game studies. The outside interest in our little corner of the world, both academic and non-, draws attention to just how comfortable this corner has become. However, complacency is hardly the appropriate attitude in the face of ongoing change all around us. We like to think that we have developed theories and models that allow us to speak with some authority about a topic that is dear to our hearts. We still like to play, after all, even if the games are getting more derivative and lacklustre year by year. This might easily turn out to be a fallacy. In a world where prominent theorists are becoming increasingly aware of gaming culture, playing games might just not be enough.
This is the reason why I look hopefully toward signs of change such as the development of MMOG studies, steeped in jargon as it may be, spearheading a renewed interest in the diversity of gaming cultures, and I rejoice at the sight of books by ‘non-gamers’ such as Alexander Galloway and McKenzie Wark. And despite my misgivings about serious games, I think the media interest they generate will contribute to a growing awareness that we, like the games we study, do not operate in a vacuum. It’s quite a shock to wake up from a snooze in your Juul armchair, after playing Ico for the umpteenth time, and find that someone has pulled the Aarseth rug from beneath your beslippered feet. But then you realise that the clichéd message of all the home makeover shows ever to flicker across your television screen is true: you’ll never find the strength to change until you buy some new furniture.
Julian Kücklich email@example.com