Videogame preservation and massively multiplayer online role-playing games: A review of the literature



Videogames are important cultural and economic artifacts. They also present challenges that anticipate the problems inherent in any complex digital interactive system. Not only are they digital, and hence very difficult to preserve, but they are also software systems with significant hardware, peripheral, and network dependencies, which are difficult to collect and formally represent. This article reviews the literature related to videogame preservation. In addition to covering the traditional technology-related issues inherent in all digital preservation endeavors, this review also attempts to describe the complexities of, and relationships between, the traditional acts of technology preservation, representation, and collection development. Future work should include the identification of important user groups, an examination of games' contexts of use, and the development of representational models to describe players' interactions with the game and with one another.


Although videogames have their history in youth culture, they are gaining respectability within the academic community. The last decade has seen an upswing in funding for scholarly projects focused on the preservation of videogames and other new media material: The Library of Congress funded a large preservation project through the National Digital Information Infrastructure and Preservation Program (NDIIPP) (2007–2010) (McDonough et al., 2010b). The Institute of Museum and Library Services (IMLS) also has funded preliminary research (2007–2010) into videogame production and the creative process (Winget, 2008).

Videogames are a vibrant and important part of our culture and economy. Since the launch of the current generation of console games (PlayStation 3, Xbox 360, and the Wii), global software sales have increased by more than 50%, from $30.3 billion in 2006 to $46.5 billion in 2009 (Wu, 2010). To put this number in perspective, the best-selling game of 2007, Halo 3 (Bungie Studios, 2007), generated more revenue in its first-day sales ($170 million) than did Spider-Man 3, which had the largest ever opening weekend for a movie; it also surpassed the first-day sales of Harry Potter and the Deathly Hallows (Grabstats, 2008). Grand Theft Auto IV's (Rockstar North, 2008) first-day sales of $310 million in 2008 eventually broke that record, followed by Call of Duty: Modern Warfare 2's (Infinity Ward, 2009) $410 million launch in 2009 (Reisinger, 2010). The game industry also plays an important part in the U.S. economy. In 2006, the entertainment software industry added $3.8 billion to the U.S. Gross Domestic Product and employed more than 80,000 people in 31 states (Grabstats, 2008).

Games are not only important from an economic standpoint but are also becoming increasingly pervasive in our society. Sixty-five percent of American households own console or computer games, and the market is no longer aimed primarily at teenage boys. The average game player is 35 years old, and 40% of players are women. In fact, women over the age of 18 represent a larger portion of the game-playing population (33%) than do boys aged 17 or younger (18%) (Grabstats, 2008).

In addition to their economic and social importance, games are attracting the attention of the scholarly and academic communities. Game studies is an interdisciplinary field that includes researchers and academics from many fields, including computer science and engineering, communication and media studies, arts and humanities, and the social sciences. These scholars are primarily concerned with the examination of games, game players, and the role games play in society. Major conferences devoted to game studies include the Digital Games Research Association (DiGRA) conference, held biennially, and SIGGRAPH, which has a research track as well as a Sandbox track. Digital humanities, information science, communication, and literary journals often include game-related research. There also are games-specific journals such as the Journal of Virtual Worlds Research, Game Studies, and Games and Culture.

Games are important from academic, social, and economic points of view, and there are a number of institutional collections focused on game research in the United States, including the Strong National Museum of Play in Rochester, New York; the Cabrinety Videogame Collection at Stanford University; the Learning Games Initiative Research Archive in Arizona; the Videogame Archive at the University of Texas at Austin; and the Computer and Videogame Archive at the University of Michigan. Additionally, many public and academic libraries have circulating game libraries; some of these institutions also check out consoles and controllers with the games (Bridges, Hussong-Christian, & Mellinger, 2010). These collections, whether circulating or research-oriented, allow libraries to extend their user base, make connections within their communities, and provide support for game-related research.

While these are all beneficial outcomes, games also offer an opportunity for theory building. Games are intellectually challenging: Simply describing digital games is difficult; videogames are often created and distributed outside of traditional publication models; and as primarily conceptual rather than physical objects, they are very difficult to preserve for future access. Finally, digital games, while created primarily for enjoyment and play, are at their core complex digital interactive systems and, as such, share many characteristics with systems such as digital libraries and cyberinfrastructure applications. These shared characteristics include huge datasets, whether of player data or scientific or textual data; multiple user groups, each with distinct needs; similar software creation models, which include the presence of “nonfunctional” requirements such as “fun” in the case of videogames or “trustworthiness” in the case of cyberinfrastructure; and dependence on specific and often bespoke input/output devices to generate and authenticate data. In addition to known user groups, each type of artifact also attracts “rogue” users—people who use the system for unofficial, unauthorized, and unpredictable purposes. In the case of games, these are people who create mods, machinima, or emulated server environments; in the case of cyberinfrastructure, these users might include citizen scientists and skeptics. Developing a model for game collection, representation, and preservation will likely lead to advances in the management of other complex digital interactive systems and will further theory development in representation, collection building, and access.

Defining Terms: Videogames

Clearly defining the breadth and scope of videogames is a challenge. They are commercial products as well as artistic entities (Ebert, 2010a, 2010b). They provide spaces for competition and for socialization (Nardi, 2010). Finally, they are complex sociotechnological objects that depend on user interaction to achieve meaning (Lowood, in press). The term videogame is a general one, and it covers a broad range of formats and player experiences. Videogames can be disseminated over the Internet in the form of software or can be sold in a physical format such as cartridges or discs. Interaction with the game can be via computer networks on a home computer or mobile phone, on coin-operated machines, or on consoles built specifically for a particular game system. Videogames can range from relatively simple, text-based games to graphically rich, painstakingly crafted virtual worlds. Some games are played alone, others in small groups, and still others with multitudes of geographically distant people. In terms of actual game play, videogames are limited only by the imagination of creators and players; they can follow a narrative exposition, they can be strategy based, they can test reflexes and reaction times, or they can be a little bit of everything combined.

Many of the discussions within this review will be relevant for different kinds of videogames and other complex interactive digital systems. However, here we attempt to address specifically the challenges inherent in preserving and collecting Massively Multiplayer Online Role-Playing Games (MMORPGs), best exemplified by World of Warcraft (Blizzard Entertainment, 2004), and defined as:

… a genre of computer role-playing games in which a very large number of players interact with one another within a virtual game world. … As in all RPGs, players assume the role of a fictional character (often in a fantasy world), and take control over many of that character's actions. MMORPGs are distinguished from single-player or small multi-player RPGs by the number of players, and by the game's persistent world, usually hosted by the game's publisher, which continues to exist and evolve while the player is away from the game. (Wikipedia, 2010)

MMORPGs present representational and preservation challenges because they are complicated pieces of software, often with hardware, network, and input/output peripheral dependencies; they also are sites of complex and variable social interactions. Indeed, participants at a recent summit meeting for the National Videogames Archive agreed that in addition to control devices, the question of how to preserve MMORPGs was “the thing nobody wants to talk about” (Anderson, Delve, & Pinchbeck, 2010, p. 126). MMORPGs are the 800-lb gorilla of the game preservation community, and this review will be an attempt to provide a framework and tentative research agenda for formally discussing preservation challenges inherent in this medium.

A Brief Aside—Artifacts of Participatory Culture: Mods, Machinima, and “Free Shards”

In addition to the formally produced game, a side effect of open source, “maker,” or “participatory” culture is the existence of player-created game modifications called mods. When a game is released, very often its game engine—the code that contains all of the game's instructions, rules, artificial intelligence, and physics—is also opened to individuals so that they can personalize game play. Mod developers can use the game engine to augment the game by making new clothes, avatars, or even new game levels. Mod developers also can create entirely new games, known as total conversion mods. An example of this is Counter-Strike (Valve Software, 2005), which began its life as a very popular Internet-based multiplayer modification of the single-player Half-Life (Valve Software, 1998). To clarify: The Counter-Strike developers, who had no formal relationship with Valve, used the Half-Life game engine as the basis for an entirely new game. They independently developed this new game and released it to the Internet, where it became one of the most popular Internet-based games of all time. Eventually, those developers were hired by Valve, and the game became a formal Valve commodity.

There also is a community of people who use game engines as the basis for theatrical narrative. Red Vs. Blue (Rooster Teeth, 2007), arguably the most famous machinima release, uses the Halo (Bungie Software, 1999) game engine as a dramatic space to put game characters into situations and contexts outside of the standard game narrative.

In terms of MMORPGs, which have a different kind of architecture than does Half-Life or Halo, some people develop tools to augment game play. Within World of Warcraft, for example, “add-on” developers have created interface tools that players use to complete quests, measure success, and evaluate game play (Kow & Nardi, 2010). Many players believe that it would be impossible to play World of Warcraft without some of these add-ons (Nardi, 2010).

Finally, some MMORPG players create server emulations that can run personal versions of the game. The term shard is specific to Ultima Online (Origin Systems, 1997), although all MMORPGs have communities of people who (legally or illegally) run personal emulated servers. Because of their distinct but persistent player bases, shards can develop characteristics and play styles different from those of the “authoritative,” or officially hosted, version of the game, but “all shards use identical lands and systems” with only a couple of specialized exceptions, thus making all shards essentially identical copies of the game world (Mythic Entertainment, 2009).

Some shards emulate the game world as it was before the release of expansion packs, and some support different kinds of fighting or nonplayer character interaction (Blancato, 2005). To further illustrate this phenomenon, the text of a “Yahoo Answer” from 2007 regarding whether people still play Ultima Online is presented:

Many people still play it. In fact the official servers are still too crowded for me and full of idiots. Luckily, as usually happens with a mmorpg, there are now hundreds of servers being run by players (over 300 last I looked). … They have fragmented the desires of the players. There are servers which allow no player-kill-player, and ones that are totally about pkill and player hunts. There are some that are easy and you get lots of free gifts all the time, and ones where gains are hard and it takes months to get a decent leveled character. There are ones that are locked at some previous version of UO cause they think later versions ruined it, and ones that are modified to the point that you cant even recognize the maps or game altho it plays the same. ∼Gandalf Parker, “Best Answer” (O'Fama, 2007)

On a free shard, no authoritative creator controls the game, and players are able to customize almost all aspects of the game to meet their own needs while still allowing large numbers of concurrent players to play on a personal server.

While server emulation allows users to have more control over their game play experience and extends the life and scope of the game, it is not legal. In the case of Ultima Online, neither Electronic Arts, the publisher, nor Richard Garriott, the creator, is paid for shard emulation; operating and playing on a free shard violate the U.S. Ultima Online Terms of Service (Electronic Arts, n.d., Parts 5.A–D); and the reverse engineering required to bypass the packet encryption also violates the Digital Millennium Copyright Act (1998). While Electronic Arts seems to have taken a somewhat laissez-faire attitude toward shard emulation, other MMORPGs such as World of Warcraft and Everquest (Sony Online Entertainment, 1999) have been more proactive in finding and closing illegally emulated servers (Goldman, 2010).

These modifications, here termed artifacts of participatory culture, are created not by the formal development team but by independent individuals. They extend the boundaries of the formal game, and when one thinks about games in terms of preservation, it becomes difficult to differentiate between the primary artifact—the game—and the mods related to it. The contemporary game scene is enriched by these projects and, to some extent, supported by the industry, but the modding culture makes the end product less a stable and identifiable object and more a succession of artifacts related through differing kinds of interactions. This is just as true for console and computer games as it is for MMORPGs.

As stated earlier, games, specifically MMORPGs, are artifacts that are not objects; they are not even singular artifacts but interdependent amalgamations of user intent, interactions, rules, code, hardware, and software. Although MMORPGs represent the most extreme problem set, these kinds of artifacts are proliferating: websites, blogs, e-books, and web databases all share characteristics with games, particularly in their mutability, the ability of users to interact freely with the system, and hardware and software dependencies. The only difference lies in the extremity of the situation. In terms of their variability, whereas blogs or wikis change with every new comment or post update, for example, an MMORPG changes every time a player interacts with the system. Multiply an individual player's hundreds of interactions per hour by hundreds of thousands of players, and the extreme variability of the MMORPG data becomes evident. Furthermore, not only do these hundreds of thousands of players introduce a level of variability that is difficult to conceptualize but they also are fundamentally important to understanding the experience of playing the game. Without the existence of large numbers of other players, World of Warcraft, for example, would be impossible to play after the earliest levels, the world would be depopulated, and the experience of playing the game would be deceptively simplistic. To borrow from Sartre (1958), World of Warcraft “ … is other people.” The question then is: How does one go about formally describing and preserving an entity that does not really exist until 100,000 people interact with it?

The remainder of this article will attempt to provide the beginnings of an answer to that question. First, I present the technical approaches to the problem of digital preservation in general and preservation of games specifically. I discuss the representational challenges in formally describing games, then review the current state of thinking on collection development and the different ways that institutions are building research collections and providing access to them. The final section will outline a research agenda for the preservation of MMORPGs.

Preservation Challenge: Technical Dependencies

In the past, conservators have had the “luxury” of being able to focus on specific technical challenges. … Their interventions might have been complicated, but there were options—technical problems with technical solutions. … Today cultural heritage managers must contemplate the very nature of an object. (Cloonan, 2001, p. 237)

Preserving and providing meaningful access to any traditional medium is based on a history of knowledge and practice. Initially through the process of trial and error and then through scientific principles, people have been conserving traditional objects for hundreds of years. This time span allows for models to be created: models for definitive collection building, models for creation and authority, and models for use. Videogames, however, do not have the luxury of history: their modes of production and use are either in flux or difficult to formally represent, and there is very little time to “hope for the best” in terms of preservation. While it is still possible to look at drawings on cave walls from 35,000 years ago, for example, or Roman frescoes, and in some cases even preliminary or ancillary materials related to those artifacts, there is a significant risk of losing basic access to a digital file as soon as 10 years after its initial creation (Conway, 1999).

Because videogames have a significant digital component, they tend to become rapidly inaccessible or irretrievably lost. With funding from the NEH and IMLS, scholars in the related field of new media art have produced numerous theoretical and practical tracts with which to work, including the development of a notation framework for new media art (Rinehart, 2004), a systematic review of emulation as a strategy for preservation of a multimedia work (Rothenberg, 2006), and the formulation of agreed-upon theories and methods for the preservation of variable media art (Depocas, Ippolito, & Jones, 2003). Furthermore, there have been multiple studies within the digital curation community focusing on emulation strategies for console games in particular (Guttenbrunner, Becker, & Rauber, 2010; Hedstrom, Lee, Olson, & Lampe, 2006). While these projects have achieved great success in terms of tool and theory development, they focus to one degree or another on an entity produced by an individual or a small team of authoritative creators, with an end product that is relatively static and has easily defined boundaries. This is not the case for modern videogames, particularly MMORPGs, which are often created by huge, geographically diverse development teams and have multiple technological dependencies, including hardware, software, and network requirements. Furthermore, MMORPGs construct and depend upon social interactions that are difficult to formally model. Major research projects that have focused specifically on MMORPG preservation include federal grants from the Library of Congress' NDIIPP to fund the “Preserving Virtual Worlds” project (McDonough et al., 2010b), which systematically examined preservation and representation challenges inherent in virtual worlds, and an IMLS-funded project focused on examining the creation behaviors of videogame producers, developers, and designers (Winget & Murray, 2008).

The primary concern with the longevity of digital documents is the “viewing problem” (Besser, 2000). Unlike analog or physical information, which tends to exist independent of human involvement, digital information needs constant intervention to survive. History has shown that digital documents are problematic by default. Whereas we can still look at the Sistine Chapel ceiling, painted 500 years ago, or play games such as go, invented well over 1,000 years ago, it is difficult if not impossible to simply view documents on 8-in. floppy disks created in the last 20 years, even when there has been an immediate, proactive effort to preserve them. Without concerted effort on the part of archivists and preservationists, digital objects quickly become obsolete or inaccessible due to advances in information technology that are anticipated in general but unforeseen in their particulars.

The variable media art community currently utilizes four digital preservation strategies, all focused on the end product. The first three methods have technical origins and are based on general digital preservation practices. Related to “the viewing problem,” they are: refreshing, the upgrade of storage mechanisms; migration, the premeditated upgrade of file formats; and emulation, which focuses on development of Ur operating systems able to run obsolete media. The fourth option, developed by and for the new media art community, is reinterpretation (Depocas et al., 2003), where the curators attempt to recreate a work given comprehensive documentation of the original artifact.


Migration and emulation are the two primary methods of managing the problem of obsolete file formats (Waters & Garrett, 1996). Migration focuses on the files themselves, periodically converting them into new software formats. While successful migration allows preservationists to provide access to the content of a work, its deeper meaning is often lost, particularly when dealing with artistic objects, where an archivist or curator has to make major artistic choices specifically related to format. Migration is particularly unsuited for game preservation. A 2003 attempt to migrate the game Quake (id Software, 1996) described the process as a “gruesome operation” and found that the complexity involved paralleled that of emulating the platform itself (Dondorp & van der Meer, 2003, p. 40). Note, however, that Quake was created 15 years ago and is comparatively simple in terms of the modern gaming environment (Anderson et al., 2010). The traditional proponents of migration as a viable preservation strategy have been archivists (Bearman, 1999; Duranti, 2002), although the archival literature suggests that this is beginning to change, particularly for complex interactive digital artifacts (Hunter & Choudhury, 2004; Lyman, 2002; von Suchodoletz & van der Hoeven, 2009).
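The content-versus-form tradeoff that makes migration risky can be illustrated with a deliberately simple sketch (the record layout and field names here are hypothetical, not drawn from any actual preservation system): a fixed-width catalog record is migrated into a self-describing JSON structure. The informational content survives, but presentational features of the original format, such as byte layout and field padding, are silently discarded.

```python
import json

# Hypothetical legacy record: fixed-width columns (title padded to 30
# characters, then a 4-character year, then the developer name). The
# column layout itself is a presentational feature that migration drops.
LEGACY_RECORD = "Quake".ljust(30) + "1996" + "id Software"

def migrate_record(line: str) -> dict:
    """Migrate one fixed-width record into a self-describing structure.

    The informational content (title, year, developer) survives; the
    original padding, field widths, and byte layout do not.
    """
    return {
        "title": line[0:30].rstrip(),
        "year": int(line[30:34]),
        "developer": line[34:].rstrip(),
    }

print(json.dumps(migrate_record(LEGACY_RECORD)))
```

For a plain catalog record this loss is trivial; for an artistic or interactive work, deciding which such features may be dropped is exactly the curatorial judgment the literature describes.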


The second method of preservation is emulation, which can operate at either the system or the software level. System emulation focuses on developing systems that mimic the hardware used to create or run the original artifact. By writing just a few hardware emulators, preservationists could provide access to dozens of operating systems, thousands of applications, and millions of documents (Rothenberg, 1999). Emulation is currently the most widely accepted digital preservation method for many kinds of digital artifacts, including variable media art (Rinehart, 2000, 2002) as well as videogames.

In 2006, Jeff Rothenberg reported on his process for renewing Grahame Weinbren and Roberta Friedman's The Erl King (1982–1985), hailed as one of the first works of interactive video art. In terms of preservation, The Erl King was a combination of obsolete hardware, artist-written software, and custom-made components. In his article, Rothenberg (2006) described the technical processes involved in exhibiting the work for the “Seeing Double: Emulation in Theory and Practice” exhibit held at the Guggenheim Museum in 2004. In this exhibit, curators at the Guggenheim displayed seven original interactive and performative artworks created between 1960 and 2004 alongside emulated versions of the same works. Although The Erl King emulation was deemed a “success” in that the emulated version looked and acted like the original, Rothenberg's article leaves no doubt about the theoretical, technical, and philosophical complexities involved in emulating such a complex work. In addition to extensive knowledge of artistic intent, considerable time, and deep practical skill in low-level programming, the emulation team had to make important theoretical decisions regarding what was “important” about the work in terms of creation, reception, and exhibition.

Within the public sphere, a number of emulation projects are focused on games. The MAME architecture, which supports the emulation of many arcade games, is a prominent example (Salmoria, 2010). Additionally, a number of console emulators are available on the web, but most of these depend on hacked BIOS images and thus are illegal. Legitimate platform developers often use emulation to extend the shelf life of their earlier games and to extend their user base. Sony's PlayStation 3 has a PSX emulator, as does the PlayStation Portable (PSP). Likewise, the Nintendo Wii provides access through its online store to a large selection of emulated versions of earlier console titles (Pinchbeck et al., 2009).

Even though emulation is the most widely recognized preservation solution, traditional emulation is problematic for a number of reasons, including usability and legal concerns. The most serious, however, is that an emulator is itself a piece of software and hence is susceptible to the same preservation challenges that can befall any other kind of digital content.

Modular Emulation and the Universal Virtual Computer

The concept of the Universal Virtual Computer (UVC), developed by Raymond Lorie (2002a, 2002b), provides a viable solution for ensuring that emulators will be consistently available in the future. A UVC is a virtual machine that was specifically designed to preserve digital objects held by libraries and archives. The method is based on emulation, although it does not require specialized hardware or full emulation. Instead, the UVC combines elements of migration and emulation in innovative ways: It calls for emulation in that the UVC is a platform-independent logical layer that can sit on top of current and future hardware and software, and it uses migration in that it specifies format conversion to “universal technology-independent formats based on XML-like specifications” (van der Hoeven, Van Diessen, & Van Der Meer, 2005, p. 197). Because both the UVC and the file formats are platform independent, they will be able to run on current platforms as well as those that have not yet been developed.
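As a toy illustration of what a “technology-independent format” might look like (this example is invented for this review and is not the UVC's actual specification), consider raw image bytes rewritten as XML-like text in which every value is explicitly named, so that no knowledge of the original byte layout is required to interpret the data later:

```python
# Invented example: migrate opaque binary data into a self-describing,
# XML-like representation. Here a 2x2 grayscale image stored as raw bytes
# is rewritten so that each pixel's coordinates and value are named
# explicitly; a future machine needs no knowledge of the original layout.

def to_universal_format(raw: bytes, width: int, height: int) -> str:
    lines = [f'<image width="{width}" height="{height}">']
    for y in range(height):
        for x in range(width):
            value = raw[y * width + x]  # row-major byte layout
            lines.append(f'  <pixel x="{x}" y="{y}" gray="{value}"/>')
    lines.append("</image>")
    return "\n".join(lines)

print(to_universal_format(bytes([0, 85, 170, 255]), 2, 2))
```

The representation is far bulkier than the original bytes, which is the usual price of technology independence: redundancy is traded for interpretability on platforms that do not yet exist.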

The National Library of the Netherlands (Koninklijke Bibliotheek, or the “KB”) has been a leader in the development of the UVC-based preservation method. Following the successful implementation of the UVC on a test set of images (van der Hoeven et al., 2005), the KB has continued to develop its emulation strategy for long-term digital preservation by focusing on modular hardware emulation, and in cooperation with the National Archive of the Netherlands, the KB recently delivered Dioscuri, the first modular emulator designed for digital preservation.

Dioscuri is a modular x86 computer hardware emulator written in Java. It is component-based: A software surrogate, called a module, emulates each hardware component. Users can combine several modules, allowing personalized configuration of the resulting system. It also is possible to add new or updated modules to the software library, resulting in a more comprehensive base from which to work.
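The component-based design can be sketched as follows (the class names and instruction set here are hypothetical and vastly simplified, not Dioscuri's actual API): each hardware part is a separate software module, and modules are passed into a machine at assembly time, so any one of them can be replaced or upgraded independently.

```python
# Hypothetical sketch of modular emulation: each hardware component is a
# software module, and modules are composed into a configurable machine.

class Memory:
    """Module emulating a flat memory bank."""
    def __init__(self, size: int):
        self.cells = [0] * size
    def read(self, addr: int) -> int:
        return self.cells[addr]
    def write(self, addr: int, value: int) -> None:
        self.cells[addr] = value

class CPU:
    """Module interpreting a trivial instruction set: ("ADD", addr, value)."""
    def __init__(self, memory: Memory):
        self.memory = memory
    def step(self, instruction) -> None:
        op, addr, value = instruction
        if op == "ADD":
            self.memory.write(addr, self.memory.read(addr) + value)

class Machine:
    """Modules are injected, so each can be swapped out independently."""
    def __init__(self, cpu: CPU, memory: Memory):
        self.cpu, self.memory = cpu, memory
    def run(self, program) -> None:
        for instruction in program:
            self.cpu.step(instruction)

memory = Memory(16)
machine = Machine(CPU(memory), memory)
machine.run([("ADD", 0, 2), ("ADD", 0, 3)])
print(memory.read(0))  # 5
```

The preservation payoff of this structure is that an improved CPU module, say, can be dropped in without touching the memory module, which is what allows an emulator's library of components to grow incrementally.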

Dioscuri is still in active development, so it is difficult to evaluate; the most recent release, from September 2010, is Version 0.6.0. In a 2009 report to the National Library of Australia (Long, 2009), Dioscuri Version 0.4.0 was functional only with the MS-DOS 6.2 and MS Windows 3.1 operating systems. It was very slow, and “none of the tested media files could be rendered sufficiently well to give a useful performance” (Long, 2009, p. 39). On the other hand, van der Hoeven, Lohman, and Verdegem (2008) stated that “old games like Chess, Ironman and PC versions of Tetris and Prince of Persia all work well” on Dioscuri Version 0.2.0. What constitutes “working well” is unclear, but because of its modularity and platform independence, the UVC-based preservation methodology represents a promising avenue of investigation for the digital preservation community.

One exciting project using the concept of modular emulation is Keeping Emulation Environments Portable (KEEP), launched in January 2009. This international project will attempt to build a prototype emulation access platform specifically for games (Pinchbeck et al., 2009). Success for this project would go a long way toward addressing the difficulties of archiving the large bodies of related material associated with most games. Opening up an obsolete gaming environment provides coverage for many more kinds of artifacts than the game itself. Users could run mods within this emulated environment, or they could choose different modules to access different aspects of the game: Some modules could privilege game play over operating system functionality, whereas others could privilege code examination over graphical fidelity (Pinchbeck et al., 2009).

Role of Significant Properties in Digital Preservation

Making copies of materials, as in the case of migration, or mimicking an artifact's original computing environment, as in the case of emulation, inevitably introduces some degree of loss (Yeo, 2010). Migration provides access to content rather than to layout or structure. Emulation focuses on surface reproduction but changes the artifact's underlying computing environment. Neither of these approaches deals with input/output peripherals or network dependencies. Each of these methods is acceptable under certain circumstances, given an understanding of which qualities of an artifact or class of artifacts are important, which characteristics must be retained, and which might acceptably be lost. These important qualities are variously called “significant properties” (Hedstrom & Lee, 2002), “salient features” (Allison, Currall, Moss, & Stuart, 2005), or “essential characteristics” (Hoffman, 2002).
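A hypothetical sketch of how significant properties might operate in practice (the property names and the manifest format are invented for illustration, not taken from any actual standard): a curatorial manifest records which characteristics must survive preservation, and comparing an original against a preserved copy makes any unacceptable loss explicit.

```python
# Invented example: a "significant properties" manifest and a check that
# reports which required characteristics a preserved copy failed to keep.

ORIGINAL = {"content": "game assets", "layout": "CRT 4:3", "input": "light gun"}

# Curatorial decision (hypothetical): content and input must survive;
# the original display layout may acceptably be lost.
SIGNIFICANT = {"content", "input"}

def lost_properties(original: dict, preserved: dict, significant: set) -> list:
    """Return the significant properties the preserved copy failed to retain."""
    return sorted(p for p in significant if preserved.get(p) != original.get(p))

# An emulated copy that changes the display but keeps content and input
# passes this particular manifest, even though it is not a perfect copy.
emulated = {"content": "game assets", "layout": "LCD 16:9", "input": "light gun"}
print(lost_properties(ORIGINAL, emulated, SIGNIFICANT))  # []
```

The hard part, as the literature makes clear, is not the comparison but deciding what belongs in the manifest in the first place.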

The idea of preserving an artifact's “significant properties” is a central tenet of the archival tradition. It refers to the idea that archivists and other cultural heritage managers should be able to identify a document or collection's most important characteristics to make appropriate appraisal, collection, preservation, and access decisions. However, there are a number of different theories regarding which properties of different artifacts are, in fact, significant. In making a distinction between an artifact's form and its content, some writers have suggested that the most significant properties are those related to content rather than those related to appearance (DeRose, Durand, Mylonas, & Renear, 1990). While this work was focused on literary analysis and the development of the Text Encoding Initiative, this attitude also is present in library imaging projects, which attempt to capture “… the informational content of the original … no more, no less” (Kenney & Chapman, 1996, p. 8; as cited in Yeo, 2010). Virtually all of the large digitization projects have emphasized structured content over form (cf. Project Gutenberg), where the primary artifact is often stripped of all design and formal characteristics not relating directly to content. This focus on structured content also is present in the archival literature: The Effective Records Management Project at the University of Glasgow (Currall, Johnson, Johnston, Moss, & Richmond, 2002) recommended that digital preservation strategies focus on maintaining the structured content of records as opposed to their presentational form. According to Currall et al. (2002), this is in direct opposition to the preservation practices and expectations associated with paper documents (as cited in Yeo, 2010).

This attitude is not shared by all archival scholars: In the 1990s, Bearman and Sochats (1996) explored the notion of digital records' “essential properties,” specifically as pertaining to a record's ability to stand up as evidence. Forgoing the language of “significance” or “essential,” they determined that a “complete” record is one that retains its content, structure, and context. Furthermore, they felt that the most important characteristics of a record were not solely related to its content but also lay in its connection to appropriate metadata, specifically metadata that can verify the record's comprehensiveness, identity, and authority (Bearman & Sochats, 1996).

The archival literature often refers to a “record,” typically defined as

materials created or received by a person, family, or organization, public or private, in the conduct of their affairs that are preserved because of the enduring value contained in the information they contain or as evidence of the functions and responsibilities of their creator. (Pearce-Moses, 2005; cf. term: “archival records”)

The term implicitly refers to documents rather than to artifacts or published materials, but archival records can be in any format, including text, photographs, and motion pictures. Archival records are often collected and preserved for their evidential, or legal, value. While videogames are not archival records in the sense that they serve an evidential purpose, by collecting and preserving them, archival institutions are implicitly responsible for preserving them in such a way that they retain their comprehensiveness, identity, and authenticity as artifacts. There has been little formal research in the videogame literature to suggest that anyone knows precisely what it means to preserve a videogame's authenticity, although McDonough and colleagues recently won an IMLS National Leadership Award to examine that question (GSLIS, 2010). One would hope that the game's content, structure, and context would be preserved.

Finally, because archives traditionally were narrowly defined as being repositories for evidential records, the idea of significance and authority was limited to those users (e.g., historians or lawyers) who had a narrow set of needs, specifically that a record be “what it purports to be, and is free from tampering or corruption” (Duranti, 2002, p. 21). These are admirable goals for many kinds of artifacts, but not necessarily useful from a videogame perspective. For example, videogames are often “tampered” with, as in the case of game modifications, and while “corruption” is a strong term, many games are released “buggy,” with patches and new releases being the norm. Furthermore, like many digital artifacts, videogames are boundary objects (Star & Griesemer, 1989) that anticipate multiple user groups with diverse needs and varied goals. Some users accessing a game might simply want to play it and have an “enjoyable” experience, whereas others might want to experience the game in its original form; still others might want to access the game code to see the game's inner workings. The idea of authenticity, therefore, must become more expansive, based on user needs rather than archival expectations and traditional practice (Dappert & Farquhar, 2009; Yeo, 2010).

Hedstrom et al. (2006) presented an experiment reviewing users' reactions to preservation artifacts. In this study, which dealt with ordinary text documents as well as an early British videogame (Chuckie Egg), users interacted with original, migrated, and emulated versions of the materials and were then asked their opinions of each. This research showed that the digital preservation of something like a videogame is more complex than originally thought, and that some of the assumptions underlying digital preservation research are unfounded. Emulation, for example, did not preserve “look and feel” properties any better than did migration. Furthermore, while users recognized that they were losing some “important” aspects of the original game, such as the original, obsolete keyboard and processor, they preferred to play migrated and emulated versions of the game over the original. The keyboard was unusual and difficult to navigate, the “slow and cumbersome” processor made screen interaction difficult, and the aesthetic experience was unpleasant. These negative characteristics were mitigated somewhat by the processes of migration and emulation.

Although there are no other user-centered experimental research studies that have focused on videogames' significant properties, informal experience suggests that there is a preference for characteristics that might be considered “inauthentic” given traditional archival theory. Because processor speeds continually improve and there are no intuitive markers for “age” or “rarity” in digital artifacts, it is difficult for modern users to recognize the value of digital originals. Compared to contemporary videogames, games produced only 20 years ago are slow, their graphics are pixelated, and the interactions are both tortuous and simplistic. Furthermore, older games are often much more difficult to play and have different game mechanics than modern players are used to. Part of playing Ultima I (California Pacific Computer Company, 1980), for example, is to get killed over and over again in the first encounter until the player realizes that this particular monster cannot be killed, and the only option is simply to run away. This is anathema to modern gamers, who are typically not given challenges they are unable to master (Arnold, 2008). Providing reliable representations of these originals is therefore problematic because interaction with old games primarily evokes either a “so what” or an “argh” response, which overrides users' ability to recognize the innovative aspects of the game. In the words of one academic blogger, “I can no longer assume the game [Ultima IV] will make its case for greatness all by itself” (Abbott, 2010). What does this mean in terms of videogame collections within research institutions?

Future research must address this reality of varied user needs and expectations, particularly as it pertains to complex digital interactive systems such as videogames. Instead of relying on traditional notions of authenticity and significance, institutions need to develop multiple models, based on extensive user study, which will support utility over static, evidential representations. Archivists understandably want to provide the most authentic, reliable representation of the artifacts under their care. Unfortunately, what constitutes authentic and reliable is no longer fixed, and necessitates the development of new models.

In their experiment exploring emulation strategies for videogame preservation, Guttenbrunner et al. (2010) provided some commentary on this question, albeit tangentially. They created a taxonomy of significant properties for console videogames and tested various games against commercially available emulators to see how well the different methods maintained those properties. The properties were defined as characteristics related to (a) the object, such as speed, graphics, audio, and network support; (b) infrastructure, including legality, stability, and the need for peripherals; (c) process, such as usability and configurability; (d) costs related to preserving the gaming environment; and (e) context and data, including metadata referencing the game system and particular applications. In terms of this literature review, the most interesting finding was that the highest scoring preservation method was simply to film game play. Although the complete loss of interactivity automatically disqualified video as a viable preservation method under the setup of their evaluation system, the video option was nonetheless the highest scoring and most reliable representation of the game as a whole in all other respects. This finding is not particularly surprising, although it is gratifying to see a formal study supporting intuition. For nearly a decade, Henry Lowood (2002) has called for the development of what he termed “game performance archives,” arguing that these materials represent game culture and interactivity in a way that the system alone cannot.
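An evaluation of this kind can be caricatured as a weighted scoring exercise. The sketch below is illustrative only: the category weights, the 0–5 scores, and the per-method numbers are invented for this review and are not taken from the published study.

```python
# A minimal sketch of a Guttenbrunner-style evaluation: candidate
# preservation methods are scored per category of significant
# properties and ranked by weighted total. All weights and scores
# below are invented for illustration.

CATEGORY_WEIGHTS = {
    "object": 0.30,          # speed, graphics, audio, network support
    "infrastructure": 0.20,  # legality, stability, peripherals
    "process": 0.20,         # usability, configurability
    "costs": 0.10,           # cost of preserving the environment
    "context_data": 0.20,    # metadata on system and applications
}

# Hypothetical per-category scores (0 = worst, 5 = best).
METHOD_SCORES = {
    "software emulation": {"object": 3, "infrastructure": 2,
                           "process": 4, "costs": 4, "context_data": 3},
    "migration":          {"object": 2, "infrastructure": 3,
                           "process": 3, "costs": 3, "context_data": 3},
    "video of game play": {"object": 5, "infrastructure": 5,
                           "process": 5, "costs": 4, "context_data": 4},
}

def rank_methods(scores, weights):
    """Return (method, weighted_total) pairs, best first."""
    totals = {
        method: sum(weights[cat] * val for cat, val in per_cat.items())
        for method, per_cat in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for method, total in rank_methods(METHOD_SCORES, CATEGORY_WEIGHTS):
    print(f"{method}: {total:.2f}")
```

Under these invented numbers, filming game play ranks first even though its interactivity score is simply absent from the tally, loosely mirroring the study's finding that video dominated every category except interactivity.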

As mentioned earlier, the studies referenced here focus on the preservation of relatively static artifacts that have a traditional creation model. Console and arcade videogames and interactive art objects such as The Erl King have clear-cut boundaries, and it is still possible for institutions to identify technical approaches to technical problems. While there are some aspects of digital preservation that are problematic even for these “simple” artifacts, emulation and migration still appear to be reasonable—if somewhat problematic—preservation methods because of their static and stable nature. This is not necessarily the case with the more complex, modern videogames such as World of Warcraft, which have multiple technical and network dependencies and have highly variable boundaries. It is exceedingly difficult to formally define where an MMORPG begins and ends, how it differs from its augmentations and surrogates, and the role of the players in game creation and representation. Emulating the hardware and software will only go so far in truly preserving this kind of work and would never be able to reproduce the entire Internet environment necessary for comprehensive representation. Research on MMORPG preservation is still in its formative stages, where scholars are still attempting to build models of what an MMORPG is, how to formally describe it, and how to represent the interaction of all its parts (McDonough et al., 2010a).

Preservation Challenge: Representation

Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information? (T.S. Eliot, “Choruses from the Rock,” 1934; as cited in Depocas, 2002)

It will be impossible to preserve videogames without the existence of structured documentation that describes the game's technical components as well as the context in which it was played. Games present a representation challenge for a number of reasons: (a) Games in general are difficult to formally describe; (b) videogames, even the most simplistic ones, have complex and sometimes convoluted technical dependencies that change with different releases and instances; and (c) games are social artifacts and, as such, require explicit descriptions of their use and interaction states. This section of the review provides a background on these issues, although note that the field of representing and formally describing videogames is in its infancy; thus, the readings are sparse.

Games, in general, are difficult to formally describe. In their influential book The Study of Games, anthropologists Elliott Avedon and Brian Sutton-Smith asked, “What are games? Are they things in the sense of artifacts? Are they behavioral models, or simulations of social situations? Are they vestiges of ancient rituals or ancient rites?” (as cited in Lowood, 2002, p. 7). Although Avedon and Sutton-Smith asked these questions in an attempt to get their readers to consider the similarities among games, they remain relevant for the current discussion of videogame preservation and representation: What are games? What differentiates a game from everyday life? Are games things or are they activities? Are they authoritative texts or are they individualized experiences built around social interaction, competition, and play? (Lowood, in press). Until very recently, the information science community has treated videogames as artifacts presenting technical problems with specifically technical solutions. However, as mentioned earlier, when describing videogames, in addition to any technological issues, the challenge hinges on determining the precise nature of the primary artifact, how it differs from surrogates or secondary artifacts, and how we can represent those distinctions for organization, access, and preservation. These are all difficult problems, but they leave out an important aspect of interactive software: namely, its interactivity, and the resulting flexibility of both development and use. Without a good model for describing artifacts of interaction, any representation of a digital media artifact will necessarily be incomplete. Crawford described interactivity as follows:

Interactivity is not about objects, it's about actions. Yet our thought processes push us towards objects, not actions. This explains why everybody is wasting so much time talking about “content.” Content is a noun! We don't need content; we need process, relationship, action, verb. (as cited in Lowood, 2002, p. 7).

Interactivity can refer to those processes and relationships that are necessary for the game to run: These are the object-centric technical interactions that include descriptions of hardware and software dependencies, code libraries, or network infrastructure. Interactivity also can refer to more informational, or social, aspects of the game: how players interact with the game, what relationship the game has to the environment, and how players interact with each other while playing the game. Typically, the tools that the information science community has at its disposal provide relatively robust models to describe technical interactions, but prove less useful for describing the social ones.

Game-Specific Descriptive Framework

A number of commercial and community sites, such as the Pan European Game Information Network (PEGI; Interactive Software Federation of Europe, 2007) and MobyGames (MobyGames, 2011), have maintained metadata about specific games; however, these metadata schemas tend toward descriptive cataloging information, recording title, date, platform, genre, and sometimes system requirements. A recurring theme in the literature (Björk & Holopainen, 2005; Dahlskog, Kamstrup, & Aarseth, 2009) is the need to revisit genre classifications for games. While these elements are definitely important in terms of representation and preservation, research institutions need more detailed information regarding hardware, software, and network dependencies, as well as interaction and contextual information, to adequately represent and preserve the materials in their collections. The only systematically designed, game-specific descriptive framework comes from a German master's thesis by Karsten Huth (2004), which focused on describing games for the relatively ancient Commodore 64 (Commodore International, 1982) and Atari 2600 (Atari Inc., 1977) systems (reviewed extensively in Anderson et al., 2010).

Huth split the descriptive framework into five groups:

  • Representation Information contains basic information regarding installation and control instructions, software and storage requirements, necessary input/output devices, and patch and plug-in associations;

  • Reference Information represents standard descriptive cataloging data such as title, platform, version, and so on;

  • Provenance Information provides extensive technical descriptors, legal information, and an extensive list of fields related to identifying possible emulation environments. This field includes descriptions of compilers, programming languages, a full list of modules, an overview of program architecture, and data flow. Provenance Information also has a field for source code, which may or may not be realistic given the proprietary nature of commercial game development;

  • Fixity Information records descriptive functionality as well as copy-protection information; and

  • Context Information, which is drawn mainly from the Dublin Core Metadata Element Set (2003).
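To make the shape of the framework concrete, the sketch below models a description record organized into Huth's five groups as a small Python structure. The field names and the sample entry (an Atari 2600 cartridge) are paraphrases invented for this review, not the thesis's actual element set.

```python
# Illustrative sketch of a game description record organized into
# Huth's five groups. Field names and sample values are invented;
# the thesis defines its own schema.

from dataclasses import dataclass, field

@dataclass
class GameDescription:
    representation: dict = field(default_factory=dict)  # installation, controls, I/O devices, patches
    reference: dict = field(default_factory=dict)       # title, platform, version
    provenance: dict = field(default_factory=dict)      # compilers, modules, architecture, source code
    fixity: dict = field(default_factory=dict)          # checksums, copy protection
    context: dict = field(default_factory=dict)         # Dublin Core descriptive elements

record = GameDescription(
    reference={"title": "Pitfall!", "platform": "Atari 2600", "version": "NTSC cartridge"},
    representation={"controller": "joystick", "storage": "4K ROM cartridge"},
    fixity={"md5": "<checksum of ROM image>"},
)
print(record.reference["title"])
```

Even this trivial structure makes the framework's gaps visible: there is no slot for relationships between patches and releases, or between a game and its engine, which is precisely the criticism discussed below.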

The development of this framework is an important step forward in terms of videogame collection and representation. It is particularly important for modular emulation projects, such as KEEP, which need extensive technical metadata to build relevant modules. However, it was developed for older systems, and the gaming environment has changed extensively since the inception of the Atari 2600 in 1977 and the Commodore 64 in 1982. Specifically, the presence of artifacts of participatory culture, such as mods, machinima, and emulated servers, is not covered in this framework; furthermore, it lacks a precise and formalized description of interaction dependencies between different software components. There also is no easy way to track relationships between patches, versions, and new releases.

Additionally, given the presence of huge, geographically diverse development teams, the Creator and Credits fields need to be expanded and subdivided into job titles and roles. There are important people within the game industry that operate both above and below the radar of “fame,” and it would be beneficial to be able to track their progress among and between studios, publishers, and even genres.

Finally, there is no field to differentiate between game application and game engine. This is a direct result of the limitations of the Commodore 64 and the Atari 2600, which were self-contained systems in which development was isolated to very specific application environments. In that era, there was no overlap between applications and engines; contemporary game developers, however, often take generic game engines and modify them for their individual needs. For example, Gears of War (Epic Games, 2006), Batman: Arkham Asylum (Rocksteady Studios, 2009), Zumba Fitness (Pipeworks Software, 2010), and DC Universe Online (Sony Online Austin, 2011) all use Epic's Unreal Engine 3 (Wikipedia, 2011). Not only is this information noteworthy from a development and relational standpoint, but it also might have an impact on preservation methods and on recreating a valid run-time environment during emulation.

Describing Technical Relationships: Functional Requirements for Bibliographic Records

Huth's (2004) framework provides a stable and robust descriptive model for videogames, but one of its major inadequacies is its inability to formally track relationships between software and hardware components, versions, patches, and new releases. In a recent publication, McDonough et al. (2010b) described the effectiveness of using the Functional Requirements for Bibliographic Records (FRBR) entity-relationship model to describe videogames for the purpose of preservation. Their primary case is ADVENTURE, alternately described as “one of the most influential interactive fiction works” (McDonough, Kirschenbaum, Reside, Fraistat, & Jerz, 2010a, para. 5), or “the classic text computer game” (Jerz, 2007, para. 1), illustrating the problematic nature of simply classifying these artifacts. McDonough et al.'s (2010a) report showed, in detail, how a relatively straightforward text-based game can quickly become unmanageable within the FRBR framework. They specifically found that the ambiguous distinction between Work and Expression is particularly problematic when dealing with videogames and interactive fiction, and also argued that because of the existence of mods, there is a “reasonably compelling justification [for] the notion of a Superwork” (McDonough et al., 2010a, para. 31), which would assist in retrieval and collocation of materials. While FRBR does allow for adequate description of relationships at both the Work and the Expression levels for ADVENTURE, the reality of using it to describe videogames could easily become “a tremendous burden” (McDonough et al., 2010a, para. 29) for those who would actually have to describe the games.

A complete description of a computer game within the FRBR framework would need to identify all of the various subsidiary Works constituting the games' technological components, whether created by the game author or not, delineate the relationships between all of the different components, and provide some level of intellectual description of each. (McDonough et al., 2010a, para. 29)

Finally, as with the earlier discussion of emulation of The Erl King, the authors stressed the unusual and somewhat extreme skill set necessary to conduct an FRBR-based description:

Our work applying FRBR to ADVENTURE required an advanced knowledge of antiquarian computers, systems, and programming languages, as well as an appreciation for how the game has been ported and reworked by diverse constituencies over the course of several decades. (McDonough et al., 2010a, para 38)
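The combinatorial burden the authors describe can be suggested with a toy entity tree: every port becomes an Expression, every release a Manifestation, and every compiler or engine a subsidiary Work with a tree of its own. The entities named below are illustrative shorthand invented for this review, not a catalog record.

```python
# Toy FRBR-style entity tree: counting the entities in even a tiny,
# drastically pruned description of ADVENTURE hints at why a complete
# description becomes "a tremendous burden."

class Entity:
    def __init__(self, kind, name, related=None):
        self.kind = kind        # Work, Expression, Manifestation, Item
        self.name = name
        self.related = related or []

    def count(self):
        """Total entities in the description, including subsidiaries."""
        return 1 + sum(child.count() for child in self.related)

adventure = Entity("Work", "ADVENTURE", [
    Entity("Expression", "Crowther original (FORTRAN)", [
        Entity("Manifestation", "PDP-10 source distribution"),
        Entity("Work", "FORTRAN compiler"),  # subsidiary Work with its own tree in practice
    ]),
    Entity("Expression", "Crowther/Woods 350-point version", [
        Entity("Manifestation", "C port source distribution"),
    ]),
])
print(adventure.count())  # 6 entities, before any Items, mods, or engines are added
```

A real description would multiply this count by every port, patch, and dependency, each demanding its own intellectual description.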

Future work in this area would include the continued development of descriptive frameworks that could handle complex interactive entities, and perhaps a simplified method of describing relationships of components within and related to a game artifact. Additionally, providing a precise definition of the representation system's purpose or intended audience would improve the system's utility by allowing developers to precisely determine the scope of specificity in description.

Describing Technical Relationships: Simulation

A project that bridges the gap between technical dependencies and representation is the “Visual6502 Project” (James, Silverman, & Silverman, 2010a). This is an attempt to build a logical simulation of the fabled 6502 microprocessor, which was “the brain” in the early Apple, Commodore, Acorn, and Atari systems. It also ran the Atari 2600 console, and its core was incorporated into the central processor of the Nintendo Entertainment System (Nintendo, 1984) (James, Silverman, & Silverman, 2010b). This innovative project had three phases: In the first phase, developers took high-resolution photomicrographs of the surface of a 6502 chip and assembled the photographs into a single vector image using a custom Python application. In the second phase, the chip was stripped down to its substrate features using sulfuric acid and rephotographed, and the images from this step were assembled, again using the Python application, into a second full layer aligned with the image from the first phase. In the third and final phase, they used these two images to draw the intricate connections and wires on the chip. James, Silverman, and Silverman (2010a, p. a26) stated that “from these two images, we derived polygonal representations of each physical layer of the chip, including the conductive substrate regions, polysilicon gate wires, buried contact areas, vias, and the topmost metal layer.” Essentially, this process allowed them to make an exact logical replica of the 6502 chip, which they could then use to run simulations of programs that would have run on these early computers.

This project differs from emulation in subtle but specific ways (James et al., 2010b). First, it is primarily a descriptive project rather than a systemic one. Emulation is a process by which developers attempt to approximate the target system by gathering what documentation is available and using it as the programming base to reverse-engineer the original. This documentation is often neither complete nor entirely accurate, resulting in imperfect representations (cf. Phillips, 2010). While a “disciplined” emulator will often capture traces of actual chip behavior, these are not formalized representations, and it is impossible, using this method, to capture the billions of sequences of bits that a real chip produces. The Visual6502 project, on the other hand, uses the actual microscopic parts of the original physical chip to both logically represent and physically simulate the original chip behavior. By performing this process on different parts of a computer motherboard, the Visual6502 team has built compelling simulations of historic operating systems that outperform the most popular emulation systems (James et al., 2010a).
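The difference between instruction-level emulation and this switch-level approach can be suggested with a toy model: the output of a gate is derived from whether transistor paths conduct, not from an abstract boolean lookup. The two-transistor NMOS NAND below is a stand-in invented for this review; the real project simulates thousands of transistors recovered from the photographed chip.

```python
# Toy switch-level simulation in the spirit of the Visual6502 project:
# the output node's value is computed from transistor conduction
# paths rather than from a logic table.

def nmos_nand(a: int, b: int) -> int:
    """Two NMOS transistors in series pull the output node to ground
    only when both gate inputs are high; otherwise the pull-up
    resistor holds the node at logic 1."""
    series_path_conducts = (a == 1) and (b == 1)
    return 0 if series_path_conducts else 1

# The NAND truth table emerges from the physical model.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", nmos_nand(a, b))
```

Scaled up to a full netlist, this style of model reproduces timing quirks and undocumented behavior that a documentation-driven emulator can only approximate.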

Describing Social Interaction and Context

Archivists need to shift from a paradigm centered around saving a completed work to a new paradigm of saving a wide body of material that contextualizes a work. (Besser, 2001, para 14)

To make sense of historical artifacts and activities, particularly those that are important for their technological advances, information about the context of their creation, reception, and display facilitates comprehension and deeper understanding. Physical artifacts often carry some hint of their social or creation context with them: Paintings have ornate or simple frames, they are hung in a particular space, they are made of particular materials, and they are and have been available to particular kinds of people throughout their history. Through the use of technology, we can see underpainting, or rough sketches under paint, and we can even see artists' fingerprints imprinted in the paint itself. Books are kept in libraries, they have particular bindings, and they are printed on different kinds of paper using different kinds of printing techniques. For archival collections, context is preserved through the application of “original order” and “respect des fonds,” the related ideas that papers from different offices should be kept separate and in their original order. All of these clues, whether or not they are explicitly defined, carry meaning, and people can perceive the different contexts of creation, reception, or use at a glance. Digital objects are not like this. Their physicality is subtler than that of a book or a painting. Examining the production methods of modern computing environments requires a specialized skill set, and maybe even a particular mindset (i.e., hacking or tinkering), and the physical traces of a digital object's having been made are hidden under multiple layers of mediation, requiring esoteric skills in digital forensics (Kirschenbaum, Ovenden, & Redwine, 2010).
Furthermore, access to digital games is more democratic than access to other kinds of physical materials: They can be accessed by anyone, anywhere, as long as a computer is available; and often, particularly if they are complex digital interactive systems, their variability ensures that there is no distinct, correct, physical “order” that can be maintained. The traditional idea of context, particularly as defined by the archival community, should be expanded to be more inclusive of these artifacts that do not carry their context implicitly in their physical existence.

There have been some initial forays into building formalized systems to describe context (Winget, 2009). Although it primarily deals with fine art as its sample case, Winget's (2009) system provides the means to describe explicit contextual information for all kinds of artifacts. Specifically, her system augments traditional descriptive standards by providing a framework to describe spatial conditions of use, exemplified attributes, and conditions of presentation. Winget's (2009) system is derived from the work of David Summers (2003), and she essentially argued that because artifacts gain meaning within a cultural or social construct, the job of any subsequent representation of that artifact is to replace it back into the context of its original creation and use. Winget (2009) argued that we can do this by reconceptualizing artifacts as primarily spatially situated; their content thus becomes less important than does their placement within real space, the precise description of their creation, and the particulars of use.

Placement within real space.

Videogames, like many digital artifacts, are problematic in that their access model is variable. Simplistically listing where people play videogames would be redundant because any person, anywhere, can play videogames on various kinds of monitors, using a variety of controllers, hardware, and software. Games on mobile devices can literally happen anytime, anywhere. Bars hold Dance Dance Revolution (Konami, 1998) nights, and local libraries hold Super Smash Brothers Brawl (Nintendo, 2009) tournaments for local teenagers. There has been some attempt to describe and archive “game atmosphere” by modeling a living room in 1982, a large arcade in 1983, a small arcade in 1989, and a bedroom in 1995 as three-dimensional virtual spaces (Esposito, 2005), but the environments from this study are limited in that they are generalized approximations of game atmospheres rather than specific exemplars.

Precise description of creation.

Huth's (2004) descriptive framework, the FRBR model, and the Visual6502 project (explained earlier) meticulously described the hardware and software necessary to run games. Each piece of hardware and software has a history of creation of its own, as does the design of the game as a whole. Winget (in press-b) conducted interviews with stakeholders within the game industry to learn more about their creation behaviors: how the development team communicates ideas, how they transform their vision to an actual working game, and how they coordinate the work and vision of sometimes hundreds of geographically distant people. Future work in this area might continue along these lines, perhaps collecting oral histories of independent and corporate game developers, discussing where they get inspiration, who they admire within the industry, how they generate ideas, and how they envision game play. Oral history projects for early software developers and Silicon Valley executives exist, and while there are nascent projects to collect oral histories from game pioneers (cf. Motherboard, 2009) these projects are ad hoc and decentralized.

Particulars of use.

Related to the description of spaces of use is the need to record the particulars of use. Lowood (2002) called for the development of game performance archives (described in more detail later), which would include machinima as well as mods, but also would document other kinds of use and interaction scenarios. Examples include (a) “speed runs,” a recording of a person playing through a particular level or quest in a game as quickly as possible, usually under certain constraints; (b) “walk-thrus” or “cheat codes,” which are textual and/or video instructions for how a player can efficiently complete certain game requirements; and (c) recordings of game play itself. These recordings could refer to the image on the game screen or they could be recordings of people playing the game, thereby giving future users some idea of how the game was approached, the physical relationship between the player and the game console or computer, and what different kinds of hardware and software were used.

Preservation Challenge: Collection Development

Lowood (2002) was the first to discuss the importance of providing access to different game characteristics. He proposed a framework for collecting complex interactive digital systems (i.e., games), and identified four areas for development: (a) emulation testbeds, (b) game development archives, (c) artifact collections, and (d) game performance archives. Lowood (2002) also called for collecting institutions such as archives, museums, and libraries to work together so these materials could be collected in one spot, leading to more meaningful access and retrieval. This framework reflects the complexity of collecting and preserving complex digital artifacts, and provides a meaningful attempt to preserve information in an object-centered culture (Swade, 1993, as cited in Lowood, 2002).

Emulation Testbeds

Because technology advances so quickly, Lowood (2002) called for an emulation testbed to perform experiments related to authenticity, reliability, and access given the variability of emulation algorithms, applications, and differing game foci. For example, game emulation must be able to account for differing operating systems and software environments, but also the advances in audio and video codecs, different programming interfaces, networking protocols, processor speeds, recommended display requirements, and specialized controllers. Existing emulation testbeds include those run by individuals and primarily commercial institutions, such as the recently suspended Home of the Underdogs (2010), the Emulator Zone (Visei Internet, 2010), and RetroGamer (Mindspark Interactive Network, Inc., 2011).

Game Development Archives

The videogame creation process is fraught with variability, not only within the industry, where each game company has its own development model, but also within the player community, with its expectations of interaction and modifiability. Traditional concepts of authorship do not necessarily apply to the game-development process, and the collecting community must devise new models of creation to authentically and reliably document videogame history. In addition to projects focused on interviewing game designers, producers, and developers to build a documentation model for game development (Winget & Murray, 2008), there are nascent projects focused on formally defining characteristic game design models and genres for the purpose of scholarship.

Artifact Collections

Stanford's Stephen M. Cabrinety Collection in the History of Microcomputing, the Computer History Museum, the Strong Museum of Play, and the Learning Games Initiative Research Archive all have extensive hardware collections. Additionally, there are numerous academic and public libraries that are collecting game cartridges and CDs. These collections ensure that there will be some record of the variability and complexity in the physicality of these materials. While it seems straightforward to collect these kinds of materials, a recent article by Gooding and Terras (2008) contends that original game cartridges and hardware are becoming difficult to obtain and that the game collecting community is about to reach a crisis point in terms of materials.

Game Performance Archives

Because MMORPGs are primarily about interaction—with the game and with other players—game performance archives provide a means to recreate or access some record of how people interacted with the games themselves. As conceptualized by Lowood (2002), these archives focus on documenting game play, and would include artifacts of participatory culture such as machinima, game mods, and speed runs as well as mash-ups not related to game play, fan-generated fiction, wikis documenting group behavior, or other records that document the effect a given game has on individuals, groups of individuals, or society (Winget, in press-a).

With videogames, there is a tension between the relative technological fixity of a finished game and the experiences the game enables, which change with every individual interaction. On the development side, player experiences are generated through rules, code, and narrative arc; on the player side, they are engendered through interactivity, social networking, and game modifications that meet individual or group needs. Until traditional institutions create collection development models that include artifacts of participatory culture, the institutional representation of game collections runs the risk of being incomplete or even unreliable.


This research suggests that emulation is a viable preservation option, particularly for console games, and work in that area is ongoing. MMORPGs and other complex digital interactive systems impose additional requirements, however, presenting further representational and preservation challenges. While migrating and/or emulating the software will provide a means to interact with a game at a basic level, without some representation of players' interactions with and around the game, the technical solutions risk meaning very little. When collecting and providing sustained access to MMORPGs, beyond the many technological and legal concerns, the challenges ultimately hinge on representational issues: determining the precise nature of the primary artifact, defining its boundaries and how it differs from surrogates or secondary artifacts, and modeling the nature of the artifact's interactivity.

In the absence of representational models describing use and social interactions, collecting information related to the “context of use” will become very important. In addition to creating a framework to describe context, there needs to be a shift from thinking about digital objects as discrete entities to thinking about them as parts of an integrated whole. Ethnographic studies of players, makers, and other stakeholders would provide valuable information on the role of videogames within our culture. Future research in this area might examine the actual playing space of a particular title: Where do different kinds of people play Grand Theft Auto: San Andreas, for example? Where is the console set up? Do teenagers play that particular game in the privacy of their rooms? Do adults play it in the living room? Where do people who live alone set up their consoles, compared with people who live with roommates or family? Do people maintain different play spaces for console and computer games?

Another project might examine the layout of arcades during a particular decade. Where were arcades located? Were they in malls, strip malls, or storefronts on city streets? Were some sections of town more likely to have arcades than others? Did the arcades in different sections of town have different clientele? Did they contain different games? Was there any similarity in arcade layout: Was Pac-Man always placed next to Galaga, for example? Was there one prime spot in the arcade where owners would put the most popular game? Were there avenues of sight lines, like the grand boulevards of Paris, or was arcade layout more informal? While these questions represent complicated anthropological research, outside the purview of traditional institutional responsibilities, scholarship in this area would provide valuable insight into the realities of gaming culture.

The development of a national collecting strategy also would benefit all kinds of complex digital interactive systems, but particularly videogames. An infrastructure linking institutions would allow for specialization in hardware and software collection development and for convergent representation and access models, and it would provide a means to share the effort of representation, preservation, and access. Individual institutions could focus their energies on collecting particular hardware or software, knowing that other, related institutions were covering related materials. Alternatively, a national infrastructure could allow institutions to support different types of users' needs or utility models. Developing such utility models will depend on extensive user studies: establishing the relevant stakeholders, their needs, and the methods by which they tend to express those needs.

Finally, the number of “rogue” users, who use systems for unpredictable and unauthorized purposes, will only increase as digital systems become more widely available and the model of “openness” becomes more widespread. Extensive, formalized research into the behaviors and expectations of these users, and their ramifications for traditional collecting institutions, will provide valuable insight into the ultimate value and effectiveness of these systems.

Videogames are important cultural and economic artifacts. They also present challenges that anticipate the problems inherent in any complex digital interactive system. Not only are they digital and hence very difficult to preserve but they also are software systems that have significant hardware, peripheral, and network dependencies, which are difficult to collect and formally represent. This article reviewed the literature related to videogame preservation. In addition to covering the traditional technology-related issues inherent in all digital preservation endeavors, this review also attempted to describe the complexities and relationships between the traditional acts of technology preservation, representation, and collection development. Future work should include the identification of important user groups, an examination of games' context of use, and the development of representational models to describe interaction of players with the game and the interactions between players playing the game.


1. The discussion here relies exclusively on and is a paraphrasing of the review by Anderson et al. (2010).