
John Kirriemuir writes about games in education.

Groundhog Day for Games in Learning

The last seven years have seen an increasing number of reports about the potential and actual use of computer games in education. These are often commissioned and produced by education or research support bodies.

2006, in particular, saw an outpouring of such reports. In the last few months, we’ve seen a report by the Federation of American Scientists [FAS], while two reports, an EA/Microsoft/FutureLab collaboration [EA] and an ELSPA/DfES work [ELSPA], jostled for headlines at the same time. More recently, the JISC has added to this increasingly crowded field [JISC]. Those are some of the better reports; for example, the EA one is partially the product of in-school trials and therefore contains unique data.

On one level, this is all well and good. These reports provide good introductions to computer games and how they can be used in unconventional (serious? useful? practical? endearing?) ways. They often contain examples of teachers using games in learning situations. They show that there is a sustained interest in the application of games. And they provide a counter to the often relentless “bad news day” groove of the tabloid media, namely the claim that there is a direct link between the latest violent incident and exposure to a particular video game.

But, like the repetitive release of Police Academy sequels, this trend is generating ever-diminishing returns. There are seven fundamental problems:

1. These reports are being publicised to, and read by, the same audience. They are announced on the same mailing lists, the same news websites and the same sections of the technical media. Is the message getting to anyone new, or is it just “preaching to the converted”?
2. Providing an “introduction to games” results in a load of flannel that can be read in any of 10,000 other media. Text such as “Many people now own a PlayStation 2, which is made by Sony yadda yadda yadda” dates horribly.
3. The non-technical nature of these reports limits their use, with researchers gasping for more data, and with teachers gasping for instructions on how to use the games effectively.
4. The relatively small amount of genuine and novel research in the field – to date – results in substantial duplication of content, references and discussion. It is almost possible to “checklist” the reference sections of many reports against the same core group of citations. Malone being intrinsically motivating in 1980? Check. Csikszentmihalyi being in the flow in 1996? Check. We’ve been there before; it’s Groundhog Day and Punxsutawney Phil says we’ve another six weeks of identical citations on the way.
5. Literature reviews get confused with original research, even by researchers. They aren’t the same. A literature review is, well, a review of existing literature. It does what it says on the tin. Unless it analyses that literature (for example, finding that 93% of reported peer-reviewed research shows that playing Crazy Taxi makes you a better driver), it doesn’t add anything new to the knowledge base.
6. Dangerously, the gushing pipeline of reports gives many people the impression that progress is being made in the associated research sector.
7. Even collectively, this shelf of reports still does not make a compelling case for the widespread use of digital games in learning. Here’s a recent post from the [Gamesandeducation] mailing list by Marshal Anderson:

“…I can’t help feeling that the community needs to look at what games-based learning does better than anything else (and I don’t have the answer to that question) and exploit that, rather than relying of the deeply dubious idea that somehow students who have trouble understanding division will suddenly see the light when it’s presented by Laura Croft. Let’s be straight about this (and please, someone, correct me here) the evidence that ICT based learning about anything but ICT is actually transferable is still sketchy to say the least.”

These suspicions are not rare. And they indicate a fundamental problem: that after so many similar reports, over several years, still the case has not been soundly made for using digital games in learning.

Thankfully, the world isn’t solely relying on these reports; there is substantive, deep research out there. In the last few years we’ve seen PhD theses from people such as [Squire] and [Egenfeldt-Nielsen]. Both went into schools with commercial digital games, giving them a thorough work-out in real classroom learning situations. And that’s the kind of work that needs to be funded to move the agenda forward. These works genuinely progress towards satisfactorily answering the question “How can games be used to effectively enhance learning?” – the shelf of glossy reports doesn’t.

So, if they are of increasingly limited use, why are such reports being produced so frequently? And how come the same education funding body will spend 35K on such a report, but turn down a 4K funding bid on applied research in the same area? Cynically, perhaps, it could be a combination of:

• Glossy reports looking more impressive, and more media-dazzling, than a pedestrian research project.
• Brightly-coloured reports being quicker to produce and release to the media than a research paper.
• Shiny reports on games in learning making an organisation look contemporary, showing it is aware of current technological trends and learning practice, hopefully helping it receive more funding itself.
• Organisations seeing that other bodies are producing reports in this field, and becoming worried that they need to be seen doing something equally “cutting edge”.

Some report commissioners have notions of moving the agenda forward, pushing the envelope, or whatever the latest buzzword is. Others don’t. Here are four examples of backwards attitudes that have slunk into my inbox in the last year:

1. The commissioner wanted a publication aimed at people “new to games”, insisting that a large chunk of the report be squandered on detail about the latest generation of consoles.
2. Unrealistic and simplistic expectations: “Show how academic underachievers can become top-scoring academics through playing World of Warcraft. 100 words.” Yes, that would be nice to “show” in 100 words, and perhaps in a parallel universe where credibility is not an issue, it would be possible.
3. “Try to avoid lots of tables and statistics, as this may put off the audience”. No comment required.
4. And by the same commissioner (the last I heard, they still hadn’t produced a report): “Stick to exam result improvement; don’t mention anything that isn’t learning.”

So the end result is usually a glossy report, containing lots of pictures of school kids studiously gathered around a PC. It looks good, though perhaps not as realistic as pictures of 11 year olds sneakily enjoying Grand Theft Auto, which their game-illiterate parents purchased for them, or indulging in “happy slapping” which their mates film on their mobiles, then upload onto YouTube. Though those are innovative uses of technology, they wouldn’t make good copy.

These reports do not help move the research agenda on. Only research, and research funding, can do that. In a lively recent discussion on the [Gamesnetwork] mailing list, Ben Sawyer wrote:

“…Research. In terms of research the bottom line is there just aren’t a lot of empirical studies out there, be it for games and learning or whatever else. There needs to be more and we need more people to do the entirely hard work of going out and finding Ns>30 and building studies. There are 300 reasons for this but one of them has to be a lack of researchers trying to overcome the obstacles to even an n=100 study. But we’re going to need some groups to get serious about figuring out how to do this or it’s going to be a long road.”

Ben’s post spoke of needing some studies of scale, with the requisite infrastructure to produce them. He is especially concerned that such studies get good data about how different sub-segments of learners within the same sample respond to a game-based learning element. And so a debate ensued about the value of small n studies, the merits of non-empirical studies, and so forth. But the underlying sentiment of Ben’s post holds true: there isn’t enough raw, ongoing research, where new data, information, observations, trials and so forth are continually being served on the game researchers’ smorgasbord.

Teachers, too, are looking for something more applicable to their practice. To quote one at a recent Scottish teachers’ conference:

“I don’t have time to absorb the mainly theoretical considerations of a PhD thesis. And my time is wasted reading literature reviews. And in these reports, case studies of games in lessons are usually described ineffectively; they outline what is done but not how it is done.”

So that’s where we’re at. Reports are being churned out. Researchers gain little “fresh meat” in them to sustain further work; teachers get frustrated at the lack of detailed applicability within.

Conclusion

Two recommendations for organisations thinking about “reinventing the wheel”:

1. If you really have to produce such reports, then:
• Ditch any “Introduction to games” material. The three people left in the modern world who haven’t yet encountered a video game can find this information elsewhere.
• Ditch literature reviews. Instead, refer the audience to the relevant chapters in [Squire] and [Egenfeldt-Nielsen], or the most up to date thesis in the field.
• When focusing on case studies, go for depth, not breadth. Instead of 20 brief case studies, focus on 2 and dig into the nitty-gritty mechanics of how the teacher uses the game within the classroom situation.
• Highlight aspects of the field that are deficient in research, and recommend that these have funding priority or urgency. Follow the lead of Torill: ask some questions [Mortensen]…
• …and stop boring everyone to death with violence and addiction stuff. All this does is feed the luddite wing of the media, who will focus on these points to the exclusion of everything else.
• Check that the report includes substance not present in any previous report.

2. Alternatively and preferably, just forget about the report. Unless there is a major change in gaming technology and software – and that doesn’t mean just a new generation of consoles – someone else has already written it. Instead, fund some research. You’ll rack up more kudos from the wider games research sector.

References

EA [2006] Teaching with games. Using commercial off-the-shelf computer games in formal education. http://www.futurelab.org.uk/download/pdfs/research/TWG_report.pdf

Egenfeldt-Nielsen, S. [2005] Beyond edutainment: exploring the educational potential of computer games. http://www.it-c.dk/people/sen/egenfeldt.pdf

ELSPA [2006] Unlimited Learning. Computer and video games in the learning landscape. http://www.elspa.com/assets/files/u/unlimitedlearningtheroleofcomputerandvideogamesint_344.pdf

FAS [2006] Harnessing the power of video games for learning. http://fas.org/gamesummit/

Gamesandeducation mailing list: http://lists.becta.org.uk/mailman/listinfo/gamesandeducation

Gamesnetwork mailing list: https://listserv.uta.fi/cgi-bin/wa?A0=GAMESNETWORK

JISC [2007] It’s just a game? http://www.jisc.ac.uk/media/documents/programmes/elearning_innovation/gaming%20report_v3.3.pdf

Mortensen, T. [2007] Unasked Questions. http://torillsin.blogspot.com/2007/01/unasked-questions.html

Squire, K. [2005] Replaying history: learning world history through playing Civilization III. http://website.education.wisc.edu/kdsquire/dissertation.html

Contact

John Kirriemuir
An-Caladh, Berneray, Outer Hebrides, HS6 5BD, UK
Email: john (at) silversprite.com
Web and blog: www.silversprite.com
