Keywords:

  • privacy;
  • education;
  • game design;
  • internet;
  • personal information

Abstract

  1. Top of page
  2. Abstract
  3. The Online Privacy and Consent Problem
  4. Privacy Politics Education through Games Design
  5. The Game: Privacy
  6. From Research to Design Brief
  7. Evaluation
  8. Conclusions
  9. References
  10. Biographies

Using the politics of personal information and online privacy as a case study, this article sets out the justification for the use of games in the education and communication of online privacy issues. It draws upon existing research into privacy knowledge and behaviour, game design for education and the experience of the Visualisation and Other Methods of Expression (VOME) project in designing a privacy education game.

The Visualisation and Other Methods of Expression (VOME) project (http://www.vome.org.uk) set out to explore how communities engaged with concepts of information privacy and consent in online interactions. The aim was to develop alternative conceptual models of online privacy that would enable people to make more empowered online disclosure choices. It was also hoped that such approaches would support critical surveillance awareness, going beyond critique to provide educational resources and an innovative form of academic knowledge transfer. Following these motives, one element of the project was the creation of a bespoke card game as a political intervention – as an alternative means of expressing issues of online privacy and consent. It was hoped that a privacy and consent game would work towards these aims by communicating alternative models of online interactions. This article positions political games as an intervention targeted at a range of audiences and addressed to a highly complex information environment. The purpose of the article is to provide a justification for the use of games to explore the politics of privacy, set out a design brief and the game that resulted from it, and provide an evaluation of this game.

The article first sets out the problems of online privacy and consent engagement and understanding, to which an educational game is then positioned as a response. The nature of the online environment makes privacy both an extremely important and relevant issue, but also increases the complexity of understanding its dynamics and logics. The article then examines the literature on games and specifically games in education. It makes an argument for the power of games to assist in visualising and understanding complex information environments. This argument is based upon Ian Bogost's (2007) concept of procedural rhetoric. The article then introduces the Privacy card game, and outlines the process from research to games design, before concluding with the empirical evaluation of the game.

Online privacy is a politicised issue, with impacts upon freedom, life chances, distribution of resources, political communication, deliberation and knowledge. It is an issue with contested norms and currently active legislative debates. Online privacy is also implicated in internet freedom and access, and surveillance debates. It can be difficult to engage with, but potentially impacts upon large parts of the population. It is therefore a particularly suitable starting point for developing a model of political discussion, engagement and education through game design.

This article aims to address three key research questions: first, is it feasible to produce a playable, usable card game within the context of an academic research project; second, can a card game be used as an engagement and awareness tool for promoting critical digital literacy in regard to privacy; and third, what is the value of a card game as a qualitative research tool.

The Online Privacy and Consent Problem

Privacy online is a complex socio-technical politics that incorporates multiple technologies, social institutions, countries, social practices and individuals. It operates across a range of institutional, economic and national boundaries. While there are ongoing discussions about the usefulness of privacy terminology (Bennett, 2011; Stalder, 2002), it has public, legal and policy meaning and purchase in both politics and information technology disciplines.

Yong Jin Park (2011) examined the impact of digital literacy on privacy-related behaviour online. This research was a critical response to the model of the ‘omni-competent’ user and the concept of the individual as the sole guardian of their privacy as enshrined in much government policy. Finding a strong predictive relationship between digital literacy and privacy-related behaviour, Park argues that critical knowledge should now encompass understanding flows of data and the implicit rules of the digital environment. Information literacy (including familiarity with the technical architecture of the internet, awareness of common institutional practices and understanding of current privacy policy) allows individuals to take empowered and informed control of their digital identities. Not understanding data flows is therefore a hindrance to complex decision making. Similarly, Ana Viseu et al. (2004) argue that privacy mediates individuals’ online practice, and is constantly practically redefined. They maintain that the ‘freedom of choice’ perspective is problematic in assuming the individual is free to choose; that acting on preferences is easy; that one is aware of dangers; and that the context of choice remains stable over time. They associate lack of knowledge with passivity. Danah Boyd and Alice Marwick (2011) find that teenagers’ activities online are shaped by their understanding of the social situation, attitudes towards publicity and their ability to navigate the technological environment. Sherry Turkle's (2011) psychological perspective on online sociability suggests not an absence of desire for privacy, but an absence of understanding of the rules and practices of online privacy.

James Grimmelmann (2009) explores this complexity in the specific case of Facebook, although his findings apply more broadly. He identifies social motivations for information-sharing and privacy-discarding activities, including identity, relationships and community. Identity includes identity formation, convincing others to accept your claims about self, impression management and strategic revelation of information. He finds social media profiles to be wholly social artefacts – partially controlled performative expression directed towards a specific audience. Sharing information is seen as a basic component of relationships and intimacy, but he identifies that the semi-public performance of relationships has a broader social value. Sharing information facilitates recognition as a valid member of a community, the social network specifically allowing both the visualisation of social networks and status within communities. These social needs cannot be satisfied in conditions of complete secrecy. However, for Grimmelmann, the dynamics of social networks cause a misunderstanding of risk. Not only is there no plausible way to assign probabilities to complicated, situated, emotional-social dynamics, but also many people use unreliable heuristics to guide decision making. These include concepts of safety in numbers, signals of privacy, assumptions that audiences for information are similar to the subjects, social obligation and the absence of behavioural cues for privacy.

Bernhard Debatin et al. (2009) draw upon a series of empirical studies to suggest why – despite the privacy risks posed – social networking is popular: it provides a high level of gratification and provides users with real benefits, primarily in terms of social capital. This view of personal information as social capital comes from Yasmin Ibrahim (2008). Debatin et al. also argue that social media use (and by extension the revelation of personal information online) can become routine and ritualised. Finally, they associate absence of direct experience of privacy violation with the idea that it happens to other people. Sally Jo Cunningham et al. (2010) used ethnographic interviews and observation to find that users are often not consciously thinking about privacy, with strategies that are often piecemeal or contradictory and dependent upon trust, relationships, etiquette or common sense established elsewhere. Concern for privacy rarely translates into using technology to protect privacy (Coles-Kemp et al., 2010, p. 4; Gürses, 2010).

Individual privacy activity online must be contextualised against a range of much broader political, social and economic environments. These include the personal information economy (Lace, 2005) in which individuals are made visible to corporate and governmental entities through a huge and growing market for personal information. Research into consumer surveillance has also identified discrimination based upon data profiles (Turow, 2006). Similarly, sociologists have discussed the concept of the ‘surveillance society’ (Lyon, 1994; Murakami Wood et al., 2006) in which social sorting – discrimination on the basis of personal information – frequently has substantial impacts upon the lives of individuals and communities (Gandy, 2009).

Privacy is hard to achieve online. This is due to a number of interlinked factors. First, reduced costs of storage and more powerful computer processors have facilitated a situation where large volumes of data become both easily storable and capable of being mined for additional information. This generates an interest in the acquisition and processing of personal data, at the same time as data can be assumed to persist.1 Second, many of the business models built around the provision of free content online are dependent upon advertising for revenue. This advertising is more valuable if it can be targeted. This results in additional information being sought about online activity to aid in the targeting and delivery of personalised advertisements. This information can be acquired through persuasive design techniques that encourage users to contribute ever more information. Combined with the nature of the medium, and attempts by various governments to monitor internet traffic for security purposes, an environment is created where several prominent actors have an interest in reducing individual privacy. Additionally, privacy online is often opaque and hard to assess. Individual internet users are often unsure about the purposes to which websites or other actors are putting their personal information. Furthermore, this opacity is often a necessary condition of certain business and political practices which require that the individual remain unaware in order to facilitate an institutional purpose. This opacity is compounded by the theoretical multiplicity and contested nature of the concept of privacy, across academic disciplines and broader social life. Privacy is described as multidimensional, multifaceted (Solove, 2008), contextual and situated (Nissenbaum, 2010).

Current communication and engagement efforts surrounding online privacy addressed to the average member of the public are primarily couched in two particular discursive forms. The first of these is a discourse of ‘personal identity management’ in which the individual is informed that they are at risk due to the value and exposure of their personal information in a personal information economy, and then informed of a particular set of responsible strategies that they should follow to minimise their individual risk (Barnard-Wills and Ashenden, 2010). The second discourse is ‘e-safety’ as delivered in most UK schools as part of the National Curriculum. This is primarily focused upon teaching young people to stay safe online and during their use of digital technology. The primary threats are conceived as sexual predation, online bullying and harassment. While rarely delivered in a language of privacy education, and dominated by a police and criminal justice approach, there is some overlap between this discourse and education on privacy, predominately through the teaching of skills and practices that can be thought of as privacy protecting. E-safety includes some discussion and education around the nature of the online environment. However, the discourse systematically ignores a range of non-criminal threats to privacy in areas such as data mining, consumer profiling and civil liberties (Barnard-Wills, 2012).

Socio-technical design approaches suggest that it is often unclear what a socio-technical system or a socio-technical information network looks like. This creates difficulties both in communicating the nature of this network or ecology to others and making systematic and rational judgements about actions within that system (Scacchi, 2004, p. 6). ‘Looks like’ is itself an imperfect metaphor for coming to understand a complex network of interactions across multiple levels. The online privacy problem is a challenging combination of information asymmetry, political and social forces, structural non-visibility, technological elements, complex contextual privacy and insufficient modes of communication. Therefore a combination of low awareness of how to make privacy decisions, an environment that tends to work against such decisions, and existing discourses not facilitating critical awareness, suggests the need for critical interventions as part of a broader politics of privacy and surveillance.

Privacy Politics Education through Games Design

This section examines what we mean by a privacy education and engagement game, drawing upon theories of learning and procedural rhetoric. After evaluating more complex options, Jesse Schell (2008, p. 37) settles on the elegant definition of a game as ‘a problem-solving activity, approached with a playful attitude’, with the consequence that a game designer must ask: ‘What problems does the game ask the player to solve?’ A slightly more expansive list of criteria suggests that a game must be entered wilfully; have goals, conflict and rules; can be won or lost; is interactive; has challenge; can create its own internal value; can engage players; and is a closed formal system (Schell, 2008, p. 34). This article adopts a medium-neutral approach to games for privacy education.2

There is a strongly established relationship between play and educational development, language development and socialisation. This is supported by thinkers as diverse as Plato, Kant, Piaget and Dewey (Huang and Plass, 2009). Designing a game for learning purposes presupposes building a game framework upon a theory of learning, which then informs the end product. We are not making the argument here that games are generally educational (some games simply teach the player how to play that game), but rather that specifically designed games can serve this purpose. ‘Teaching’ in this perspective is a form of mediated experience in which information is ordered and reorganised, ideally into a sequence that is more comprehensible for the learner (Moon, 2004). The model here is that learning is an active process on the part of the learner, in which they construct their own understanding by engaging with it, testing hypotheses and experimentation. We learn by ‘fitting new understanding and knowledge into and with, extending and supplanting old understanding and knowledge’ (Fry et al., 2009, p. 10).3 Ralph Koster's (2005) perspective is that play is so close to the way in which we learn that we all play all the time – it is just that the name is not given to the activity. Games can support learning through encouraging players to approach and explore complex problems, and therefore learn how to tackle these problems in the future in other contexts. They also offer the ability to try out approaches and experience the consequences of those alternative strategies – how manipulating a system brings about different effects (Futurelab, 2005, p. 3). Concrete experiences and active experimentation are essential to allow later reflective observation and abstract conceptualisation (Fry et al., 2009, p. 15). Learning through playing games is associated with learning the underlying rules of those games through interaction. Narrative elements of games can also immerse players in the discursive environments of their settings (Futurelab, 2005, p. 4). A privacy game would therefore be a reordering of the complex, scattered information about online privacy, with which players could engage.

Susan Brooks-Young (2010, p. 90) makes the argument that good education games have many commonalities with effective learning experiences, comparing a good game to a well-developed lesson plan with structure and feedback, built upon challenges and an engaging sequence. Nicola Whitton (2010, p. 31) develops a similar argument, identifying the following elements of an educational game:

  • the goal to achieve an outcome that is superior to others (competition);
  • tasks that require effort and are non-trivial (challenge);
  • a context-sensitive environment that can be investigated (exploration);
  • the existence of a make-believe environment, characters or narrative (fantasy);
  • measurable results from game play (scoring);
  • explicit aims and objectives (goals);
  • action in game that changes the state of play and generates feedback (interaction).

Useful and supportive critical feedback is important in learning. Phil Race (2005) considers feedback vital in nearly all learning contexts. Good feedback is timely (the sooner after a learning-orientated action the better), intimate and individual (fitting the learner's achievement, individual nature and personality), empowering (not purely negative, allowing the learner to move forward) and should open the learner up to further engagement (terminology is important, and should not be final or closed) (Race, 2005). Games therefore provide a capacity for a useful though not perfect form of feedback. In-game feedback on actions, provided by the game mechanics and through the actions of other players, can be very timely and tied to the player's actions. It is not tied to the individual personality and will be relatively inflexible. This can, however, be supported by further human feedback and discussion in relatively formal learning environments. Feedback is particularly absent in online privacy decisions, the impacts of which are often distant in space and time. Similarly, privacy decisions are frequently taken in isolation and collective engagement with these issues might be particularly rewarding.

Whitton also argues for the critical importance of aligning game play with learning objectives, or (to transfer this model outside the formal classroom) the message the designers wish to communicate:

[A] game [must be] designed in such a way that progress necessitates engagement with the intended learning objectives [as] then it is much more likely to be a successful learning tool. A computer game can be extremely motivating and engaging for students, but if it doesn't teach what they are meant to be learning in a particular syllabus, then it will not be educationally effective in that context (Whitton, 2010, p. 90).

Additionally, games offer the potential to engage with a system or set of logics from a perspective that is not that of the player. They can take on the roles of other actors within a system and can uncover a model of their desires and strategies. Game designer Will Wright (creator of Sim City) suggests that the ability to place the player in another's shoes is one of the most powerful tools a game designer has. An example would be playing a serious game about malaria infection from the perspective of malaria itself.4 Adopting alternative roles in online privacy contexts allows their exploration, shifts the perspective from individual choice and responsibility to that of multiple actors, and moves beyond the broken heuristic that everybody in an online environment thinks and acts like us (Grimmelmann, 2009).

Bogost (2007) advocates thinking of games as having the potential to contain a procedural rhetoric, which is ‘the art of persuasion through rule based interpretations and interactions rather than the spoken word, writing, images or moving pictures’ (p. ix). ‘[Games] represent how real and imagined systems work. They invite players to interact with those systems and form judgements about them’ (Bogost, 2007, p. vii). As representational and procedural systems, Bogost claims games are particularly adept at ‘representing real or imagined systems that themselves function in some particular way – that is operate according to a set of processes’ (Bogost, 2007, p. 5). This can be extended to the way that social systems work, making a game a potential way to represent the privacy and consent ecology, the flow of personal information and its actors and logics. It does this by representing social processes through game processes, which can be non-textual, and as such offers a new way to make claims about how things work (Bogost, 2007, p. 29).

Koster (2005) argues that games are abstracted from reality because they are iconic depictions of patterns in the world. This parallels how our brains experience reality by abstracting it and recognising patterns within the infinitely complex perceptual environment. Patterns depicted in a game may or may not exist in reality but the rules and patterns are perceived in the same way. One of the advantages of a game compared with ‘real-world’ experience of privacy and consent (which we can assume that players will either already have, or will likely be exposed to in some way without our intervention) is that we can strip away some of the ‘noise’ of the online information environment and focus upon the critical elements derived from research and normative commitments. This is similar to the metaphorical approach to politics education advocated by Judith Best (1984). A game can provide stronger and clearer feedback about the results of decisions and choices made by the player. A game mechanism can stand in for privacy and consent decisions made online, and show the effects of these decisions more clearly than is the case in the real world.

This representation can be explicitly constructed by the game designer – ‘a game's procedural rhetoric influences the player's relationship with it by constraining the strategies that yield failure or success’ (Bogost, 2007, p. 242). However, Bogost (2007, p. 243) also suggests that the ‘bare mechanics do not determine its semantic freight’, and that the narrative and representation applied to those mechanics are rhetorically important. Because mechanics are particular representations of a way of understanding a system, they are inherently biased – and as such must be based upon a particular ontology and normative position, which may not reflect any existing reality. The designer's choice is either to adopt an explicit, theorised and justified basis, or accept an unacknowledged, unreflective position. The ontology of an online privacy and consent educational game must be based upon an understanding of the online privacy ecology, and in this case upon an emancipatory, empowering humanist normative framework.

There is some disagreement over the extent to which the visual imagery and narrative built on top of the game mechanics contributes to the representation. The distinction appears to be linked to how abstracted the game mechanism is. A game teaching reaction time can have almost any visual representation. One representing a complex social system must cleave more closely to that reality. The concept of ‘set dressing’ and procedural narratives reflects the ‘figure and the ground’ learning model, in which the design of a learning activity includes some choices about where to direct the learner's attention. Koster suggests: ‘Mismatch between the core of the game (“ludemes”) and the dressing can result in serious problems for the user experience. It also means that the right choice of dressing and fictional theme can strongly reinforce the overall experience and make the learning experience more direct for players’ (Koster, 2005, p. 166).

Bogost (2007, p. x) also identifies the disruptive and critical capacity of games, linked to their capacity to give consumers and workers a means to critique business, social and moral principles. Many ‘serious games’ simply try to leverage the properties of games to support existing social relations (for example, training McDonald's employees through a virtual environment or the US army's use of America's Army Online). However, there is a critical potential in games if they support the interrogation of the rules of the system being represented. Players can be invited to ask clearly political questions such as:

What are the rules of the system?

What is the significance of these rules (over other rules)?

What claims about the world do these rules make?

How do I respond to these claims? (Bogost, 2007, p. 258)

A critical game prompts reflection upon the processes being procedurally represented. In our context, a player might learn by playing that there are unexpected consequences of sharing their personal information, or of not protecting their privacy. They may then be prompted to reflect on how this relates to their experience outside the game, and if they support or reject this ordering of the social and informational world. The learner-player is understood as an active agent in this process:

Players might oppose, question or otherwise internalise [the game's] claims: Which processes does it include and which does it exclude? What rules does the game enforce and how do these rules correlate, correspond or conflict with an existing morality outside the game? (Bogost, 2007, p. 284)

There is some overlap with political games that attempt to put across a political message through the medium of a game. These vary in complexity with some (The War of Terror) being full, complex games, while others (Bullshit Plug) are shorter and make a simpler point.5 An example of this at work can be found in the game Oiligarchy by Molleindustria.6 In Oiligarchy the player takes on the role of the oil industry, with the goal of maximising their profit from extracting and selling oil. The player chooses where to place oil rigs, but must also choose how to deal with indigenous uprisings or political opposition to drilling in the Arctic. The game is somewhat loaded – if the player follows their directed goal of maximising profit, then they will almost inevitably drive the world towards ecological disaster and global war. Its procedures, by being played, represent to the player a particular political view about the way that structures of capitalism, including the unrestrained pursuit of profit, result in catastrophe. A large number of other persuasive games can be found at: http://www.persuasivegames.com/games/

The Game: Privacy

This section provides a brief introduction to Privacy, the game designed by the VOME project.7 This will provide context for the following discussion of the design process that produced this game. Privacy is a card game for two to five players. The core aim is to balance public and private information (represented by cards) by choosing what information to play, keep in your hand or trade with other players. The game plays in about 30 minutes, and consists of 112 cards with a two-sided A4 rule sheet.

The game draws from poker and similar card games, in that players have a hidden hand, and play cards in a shared space visible to all players. Players have a hand of five Personal Information cards (Figure 1). On their turn they play one of these cards on to the table, trying to match it up with columns of the same category of personal information (Figure 2). They then have the option to propose a trade of any sort between themselves and other players, after which they draw cards from the deck to bring their total hand to five. Each player has a character, kept hidden until the end of the game, which affects the type of information for which they score extra points (Figure 3, left). Event cards (Figure 3, right), based upon real-world events such as super-injunctions, WikiLeaks releases or government laptops left on trains, are drawn every time play comes back to the first player, and affect all players in the same way, adding an element of unpredictability. Play concludes when the Personal Information deck is exhausted. Players reveal their character and private hand, and calculate scores. Winning therefore requires awareness of how points are scored, building columns that score points, obtaining and keeping the best cards through trading, tracking the potential points of other players and, where possible, deducing the characters they are hiding.

Figure 1. Information Cards

Figure 2. Played Cards, with Two Scoring Columns (Biographical and Health)

Figure 3. Character Cards (Left) and Event Cards (Right)

Players are therefore continuously presented with a series of disclosure/privacy decisions in an uncertain and competitive information environment. Fundamentally the game poses a choice between doing three things with a card: keeping it in hand (private), playing it on the table (public) or trading it with another player. Randomly drawn cards, Event cards and privacy ‘trick’ cards break up these choices, introducing uncertainty and disrupting strategies. The types of personal information and event all come from VOME fieldwork interviews and focus groups, and the cards feature quotes from these interviews.
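The turn structure and scoring described above can be sketched in code. The following is an illustrative simulation only, not the VOME implementation (Privacy is a physical card game): the category names, deck size, scoring weights and the naive playing strategy are all assumptions introduced for this sketch, and trading, Event cards and ‘trick’ cards are omitted for brevity.

```python
import random

# Hypothetical category names and deck size -- assumptions for illustration only
CATEGORIES = ["biographical", "health", "financial", "contact", "preference"]

def make_deck(cards_per_category=16):
    deck = [cat for cat in CATEGORIES for _ in range(cards_per_category)]
    random.shuffle(deck)
    return deck

def play_game(num_players=3, hand_size=5):
    """Simulate the core keep/publish choice: each turn a player moves one
    card from their hidden hand to a shared public column, then draws back up."""
    deck = make_deck()
    # Each player secretly scores double for one category (their 'character')
    characters = [random.choice(CATEGORIES) for _ in range(num_players)]
    hands = [[deck.pop() for _ in range(hand_size)] for _ in range(num_players)]
    table = {cat: [] for cat in CATEGORIES}  # public columns, tagged by owner

    player = 0
    while deck:  # play concludes when the deck is exhausted
        hand = hands[player]
        # Naive strategy: extend whichever shared column is already longest
        card = max(hand, key=lambda c: len(table[c]))
        hand.remove(card)
        table[card].append(player)  # 'publish' the card
        hand.append(deck.pop())     # draw back towards a full hand
        player = (player + 1) % num_players

    # Scoring sketch: one point per published card, doubled in your character's category
    return [
        sum(col.count(p) * (2 if cat == characters[p] else 1)
            for cat, col in table.items())
        for p in range(num_players)
    ]
```

Swapping the naive ‘longest column’ rule for a strategy that weighs what other players might be collecting changes outcomes noticeably, which mirrors the hidden-information reasoning the physical game asks of its players.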

From Research to Design Brief

Having introduced Privacy, this section of the article reflects on the process of moving from initial research to that finished game. The following section will set out how the game has been evaluated and the findings from that process.

The design of Privacy started with an initial exploration of the literature on game design, serious games and games in education, as summarised earlier in this article. This established the possibility and feasibility of exploring games design as a component part of the VOME project. The process then drew upon existing privacy literature and the ongoing fieldwork from VOME to produce a set of arguments and assumptions about the nature of the online environment. These assertions were generated during a project meeting of the various teams in the VOME consortium, held at the University of Salford. It therefore drew upon a broad range of disciplines and literature as well as the fieldwork experience of the project. This meeting also generated a list of initial design requirements for the game.

This was followed by initial prototyping, in which various types of game were considered and some initial game mechanics were selected. The project considered videogames, alternate reality games and some form of role-playing game, before deciding upon a card game that might later be developed into a digital format.8 The key considerations in this choice were cost and speed of production and that the familiar ‘grammar’ of card games – such as decks, hands, drawing and playing cards – would be accessible to most potential players.

To the extent that a privacy education game is meaningfully based upon a theoretical model of the online privacy ecosystem, it should not simply be a graphical skin overlaid on an existing game; rather, the game's mechanics (processes) should make a procedural argument for the theoretical model, so as to encourage both understanding of and critical reflection upon that model. Early prototyping involved finding ways of implementing the desired arguments and design requirements in a playable game. The initial design and prototyping work relied most heavily on intuition, experience with other games and theoretical sources to produce an initial framework. The design was then iterative, going through four full revisions, with numerous smaller tweaks based upon feedback as we trialled the game. Revisions included changes for accessibility, changes to shift the focus from the numbers and colours on the cards to the categories of personal data they stand in for, and changes to the position of the trading round in the turn order: in some variants we used open trading, where trades could be proposed by any player at any time, but this proved too complex for most players and trading rapidly tailed off in play. The process exhibited a certain degree of path dependency once in motion: it would have been unfeasible to reconsider the medium of the game entirely.

Cards were produced through http://www.moo.com, a website that specialises in photo-printing. Moo has an online interface and a relatively rapid turn-around time, allowing for rapid prototyping of cards. Design work was done in Adobe Photoshop. While effective, the medium imposed constraints upon possible game designs, such as the number of cards in a pack or the number of different card designs.

We selected a theoretical model of online privacy – the ‘argument’ in our procedural rhetoric. It would have been possible to build a game based upon either the e-safety or personal identity management paradigms of privacy and consent engagement, but as previously discussed both of these approaches suffer from limitations. We tried to mirror the ‘rules’ of the online privacy and consent ecology for our purposes. This is not a positivist argument about causal laws. These rules do not have ontological status, and are an analytical abstraction of a situation of real complexity to highlight particular elements. We were actually positing a set of normative statements here – these were the elements of the online privacy and consent ecology that we believed should be represented in a game. This might arise because they are so pivotal to understanding the environment that to leave them out would create a distorted image, because they are under-represented in existing communication about online privacy, or because we wish to highlight them to make an explicit political point. The rules are also subject to political change, but the very nature of turning a perception of the world into game ‘rules’ requires some level of abstraction. Table 1 sets out the ways in which intended learning outcomes (the arguments we wanted to make about online privacy) led to design requirements, and in turn how these requirements were implemented in the final game design.

Table 1. Intended Learning Outcomes, Design Requirements and Implementation in Privacy
Intended learning outcome: Personal information is valuable both to the subject and to others. Organisations use information to make decisions about people, to categorise and to advertise things to them. There is a growing market for personal information and companies exist with business models built around personal information. Information about people is not always (legally) ‘Personal Information’.
Design requirement: The game should feature as a core mechanism the disclosure, retention or exchange of information. Information should be valuable in the game and the management of this resource should be the route to winning or losing (personal information is a rival good).
Implementation in Privacy: Playing or keeping Personal Information cards, trading information; information has points value, and manipulating these cards will win the game.

Intended learning outcome: Sometimes people benefit from making information about themselves public. Some people produce music, writing or want people to know about their lives. People often have to disclose information to live a ‘normal’ life.
Design requirement: Players should have to disclose or reveal some information in order to progress their own in-game goals. They may have to balance or offset different and potentially competing values.
Implementation in Privacy: Mandatory playing of one card in public each turn. Some cards will contain both public and private information so the player must choose where to play.

Intended learning outcome: There are a lot of different types of personal information.
Design requirement: There should be different types of personal information in play.
Implementation in Privacy: Six broad categories of personal information.

Intended learning outcome: Different actors have different tolerances for false information. The aims/goals and methods of different actors are not always aligned, and may sometimes be opposed to each other.
Design requirement: Multiple parties should be able to initiate disclosure.
Implementation in Privacy: Players suggest trades on their turn, and can negotiate trades on other turns.

Intended learning outcome: Personal Information often has some legal protections (such as the Data Protection Act in the UK).
Design requirement: There should be a choice of which other players to interact with, and how this interaction occurs. It should not be forced, except in rare circumstances.
Implementation in Privacy: Free choice of who to trade with. Trading is optional.

Intended learning outcome: There are a lot of different actors (people, companies, organisations, governments) involved in online privacy and personal information. The aims/goals and methods of different actors are not always aligned, and may sometimes be opposed.
Design requirement: There should be multiple roles available in play, representing different actors in the online privacy and consent ecology. Play should be competitive.
Implementation in Privacy: Eleven different character cards, with varying desired categories of information, ranging from intelligence agencies and hackers to advertising companies and social networks.

Intended learning outcome: There are complex relationships between actors, including trading and sharing information. These relationships may change over time. People engage in a very wide range of things online. It can be very hard (or impossible) to tell what information is held about you, where it is held, and where it is going to.
Design requirement: Interactions (revelation of personal data) can happen directly or through a third party. Information revealed or traded with one source can be passed on to a third or fourth.
Implementation in Privacy: Cards can be traded multiple times, out of control of the original player.

Intended learning outcome: Actors are not always transparent in their intentions and activities, potentially extending to deceit. Some actors involved in this environment are criminal. People often don't have all the information they might want or need in order to make decisions about privacy they would be fully happy with.
Design requirement: Decisions about privacy and personal information are made without full information about the state of play. Players need some way to lie, provide false information or hide the identity of the type of actor they are playing.
Implementation in Privacy: A hidden hand of cards and a hidden character card only revealed at the end. Privacy trick cards allow deceitful trades. Players can bluff potential traders.

Intended learning outcome: Different types of personal information are valued differently by different actors. Some are seen as more useful or more predictive of people's behaviour.
Design requirement: Actors not only seek different information, but value information differently. There are therefore asymmetric win conditions.
Implementation in Privacy: Each character has preferred public and private categories of information for which they score double points.

Intended learning outcome: Different rules about privacy and personal information may apply in different areas of political, social or economic life.
Design requirement: Different rules may apply in different contexts of political, social or economic life.
Implementation in Privacy: Event cards can modify the rules of the game during play.

Intended learning outcome: No objectively correct privacy decisions.
Design requirement: There should be a random element to introduce uncertainty and unpredictability.
Implementation in Privacy: Event cards add uncertain events which change available options and strategies each round.
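The asymmetric win condition in Table 1, where each character scores double points for cards in their preferred public and private categories, can be illustrated with a short sketch. The data layout here is an assumption for illustration only; the actual game is a physical card game with no canonical data model.

```python
def score(character_prefs: dict, public_cards: list, private_cards: list) -> int:
    """Total a player's points at game end.

    character_prefs: {'public': set_of_categories, 'private': set_of_categories}
    public_cards / private_cards: lists of (category, base_points) pairs.
    Cards matching a character's preferred category score double.
    """
    total = 0
    for category, points in public_cards:
        total += points * (2 if category in character_prefs["public"] else 1)
    for category, points in private_cards:
        total += points * (2 if category in character_prefs["private"] else 1)
    return total
```

Because different characters hold different preference sets, the same final table of cards yields different scores for, say, an advertising company and an intelligence agency, which is exactly the asymmetry the design requirement calls for.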

Reflecting upon this process, the development of a card game within the context of an academic research process is feasible, but requires resources, planning, some familiarity with games and the support of other researchers and project partners.

Evaluation


We adopted a threefold evaluation of Privacy, as a game, an educational and social intervention and as a research tool. We evaluated the game with a wide range of groups. While not a random sample, the range of groups provided a range of demographics, across age, gender, education, knowledge of privacy and familiarity with games of various sorts. The groups included: experienced games players from the videogame industry; game design students at a workshop at Game City in Nottingham; youth workers and young people at an event hosted with the Sunderland Voluntary Sector Youth Forum (SVSYF); the Microsoft Serious Games team; young people at the Sunderland YMCA; doctoral students from the Surveillance Studies Network; IT security professionals at the Chartered Institute for IT (BCS); Consult Hyperion and Bank of New York Mellon (BNYM); the Warwickshire County Council E-Safety team; staff in the Department of Informatics and Systems Engineering, Cranfield University; and undergraduate IT students at the Perth Institute of Business and Technology. In total, the game was play tested with around 130 players.

Games designers reflecting on play tests have developed a set of practices very similar to those of qualitative researchers in the social sciences, and tools such as ethnography, field observation, focus groups, interviews and surveys are all applicable to game evaluation (Schell, 2008). We observed play sessions, facilitated play and actively played the game with participants. We received written reports on play sessions we could not attend, conducted post-game discussions and asked players to complete short surveys.

Privacy as a Game

This is the least politically and educationally important element of our evaluation, but relates to the desire to evaluate games per se as an intervention or research tool. Measuring the success of a game without the measure of market return is highly subjective. Some players did not take to the card game, put off by its complexity or being simply ‘not a game player’. However, the game was generally well received by experienced gamers, and considered to be playable and interesting (‘genuinely fun’). The game was compared favourably to Trivial Pursuit and Monopoly, and thought most akin to Pit and Chase the Ace. These players were all happy to play, and felt that the project had succeeded in making a playable game. Players did feel that the game perhaps took one play through to understand fully, but it was not too complex for them and could potentially carry more complexity. Some felt that there was perhaps not enough consequence to revealing your character, a concern related to the long columns of cards that appear in the game, indicating which categories of data players are aiming for. There were concerns about the difficulty and anti-climax of calculating the scores at the end. Some experienced players thought the game lacked a competitive edge, with too much focus on building one's own pattern and little reason to look at those of other players. One suggested adding more privacy trick cards to manipulate, block or misinform opponents.

Graphical design emerged as an important element of the game. Observing initial reactions to the game, players engaged with it in a similar way to a professional product. People seemed interested in the cards, often flicking through the deck for cards of interest. When we first presented the printed prototype to players, it received positive reactions because it resembled a ‘real’ game that you might buy in a shop. This moved us past some potential objections to an amateur-produced game, and away from preconceptions of a classroom exercise. Accessibility was raised in a number of contexts, particularly in relation to visibility of text on cards, and discerning different categories by colour. These suggestions were incorporated into later versions of the game. In general, players were often willing to suggest alternative game designs, changes or mechanics that they felt would improve the game, and several of these were included in the final version. Game design is therefore a collaborative exercise, which works best when it incorporates participant perspectives.

Privacy as a Political Intervention

One of the aims of producing the game was to support critical surveillance awareness on the part of the players. The intent was to build a model of the personal information and privacy situation into the game and let players explore this through play. Four main themes emerged during the evaluation of this aspect: stimulation, complexity, theme and scaffolding. Assessing complex behaviour change is problematic. We do not have a baseline for online privacy behaviour for any given player and generalisations based upon age, gender or educational background are problematic. Furthermore, behaviour change may take place over a long period of time, with a large number of other uncontrollable variables that might affect perspectives on privacy and consent online. Important in this is the way that the messages and statements about the online privacy and consent world that are built into the game are understood by the players.

With regard to the extent that the game helped with understanding and thinking about online privacy, we had positive responses to it, both observed and expressed by participants, and both during and after playing the game. Players discussed online privacy issues with each other, and shared experiences and strategies. The game therefore appears to work as a useful stimulus for collective discussions about online privacy. Leveraging the social environment of a face-to-face game may be one of the most effective elements. Additionally, the events at the BCS, BNYM and with the Technology Strategy Board revealed interest in using the game as an internal data protection or IT security training tool from various organisations. This was not initially anticipated but – given that the intention was never to limit the game to use by young people – is supported by the procedural rhetoric model. This is an area where the relative complexity of the game is a positive feature.

The complexity of a game is an emergent property that arises from several sources, and can be mitigated by several strategies. We made an explicit choice to attempt to represent several facets of the online personal information environment rather than focus on a single particular issue. This raised the complexity of the game, which was a particular problem for some players:

Too complex to engage with – felt a bit lost, silly.

Relatively confusing set up – I'd like to play for longer to understand the dynamics of it more comprehensively. Too many elements to consider. Confused at the value of private and public information and how it related to gameplay.

The E-Safety team felt that the game was simply too complex for playing with young school students: ‘There seemed to be a number of things to do all at once. We kept forgetting!’

Complicated game design might be a distraction from the message the game attempts to put across; that is, the pattern cannot be recognised among the detail. For example, a player does not realise that they are being presented with choices about disclosure or protection of information, because they are trying to spot patterns in the colours on cards. Reducing complexity seemingly has to be balanced against excessive abstraction from the context of the game.

In general, feedback suggests that the level of complexity of Privacy was too high for some of its intended objectives. It did not seem too complex for many adult players, and was not thought complex by experienced gamers. Many players believed that they would understand the game on a second play. Some thought that the game would be too complex for others, even though they themselves understood it.

A continuing issue throughout all play tests, and one of the most common pieces of feedback we received, was players focusing upon the colours and numbers on the cards rather than on the meaning of those values within the game's ‘fiction’. This was the most significant barrier we encountered to understanding the message of the game. Game design considerations suggest that experienced game players are the most likely to strip away narrative fiction and play with only the mechanically required elements. It might be the case that our mechanics did not tie closely enough to the processes they were intended to represent. Mechanics were not always dominant, however, and sometimes produced a positive friction with the narrative of the game. One player remarked that although they knew they should, strategically, play a particular card, they were reluctant to do so because they would not make that type of information public in their real lives.

Feedback from the evaluation suggested that, for education, deeper communication and reflection, the game (and similar activities) must be scaffolded by other efforts, including discussion, supporting resources and information. Scaffolding and support are critical to achieving educational goals. Such support can be built into games to varying degrees. The educational impact of casual, unstructured play is likely to be significantly lower. Facilitation is important, and having a facilitator who is knowledgeable about the purposes of the game, the messages built into it (and the way they are built in), as well as about issues of information security, increases its impact. The game does not do all the teaching or education for a facilitator, and is therefore probably a tool for the facilitator rather than a replacement for one. As a standalone product, the game can draw attention to issues of privacy and sensitise players, but it achieves less education without a facilitator. The following facilitation practices and techniques used with the game may serve to support learning:

  • drawing in narrative and role-playing elements, encouraging players to imagine themselves in the perspective of their character, and why they might want particular types of information;
  • asking players to describe the person depicted in the database they have produced by the end of the game. Do they know anybody like this?
  • asking players to read out the cards as they play them, to reinforce the text on the cards and the fact that they are trading personal information;
  • stacking the event deck before a game to highlight particular issues they wish to discuss afterwards or that might be most relevant to the players of the game;
  • providing additional assistance with learning the rules, perhaps in the form of video tutorials, as requested by some players (SVSYF, Perth undergraduates and the Warwickshire E-Safety team).

Discussion of the issues and topics raised by the game is a very important feature. This might suggest that the game is of limited use in educational work, and that discussion of the issues would be sufficient on its own. However, from the other fieldwork experience of the VOME project, such discussions of privacy can often be fairly shallow, and might not capture the range of potential issues very well. The game therefore acts as a stimulus and a source of inspiration, as well as a means to interact with the issues in a direct way. Play and discussion work in synergy, as well as potentially appealing to different learning styles.

Privacy as a Research Tool

Finally, to evaluate the game as a research tool, we observed a number of these sessions from the perspective of a qualitative social science researcher trained in focus groups, participant observation methods and ethnographic approaches. It was therefore possible to compare the use of the game with these approaches. All research methods have their specific affordances and impact upon the research process in particular ways. Teaching and playing the game takes time, which can be a valuable commodity for some research participants. The trade-off then becomes time against potential quality and depth of discussion. In use, the game was, we believe, a useful stimulus for focus group discussion. Aside from the game, the cards themselves can function as a scaffold for qualitative dialogue. The quotes on the cards served as starting points for conversations, and it was possible to have discussions about the relative scores and importance of certain cards. Participants could be invited to reflect upon or challenge the categories and scores, or the importance of a particular type of personal data for themselves. However, playing a game is an artificial construct in most contexts, which can cause some resistance.

Designing the game in a way that incorporated qualitative research required some buy-in from the researchers who had conducted it, as well as access to the field notes and reports from that research. Collaborative cross-disciplinary workshops seemed to work well for developing a shared understanding of the conceptual space in which the game would be set, what particular problems players should address and what sort of decisions they would have to make. One issue that emerged in these sessions was how much the game abstracts specific contexts. Some events and types of information will have more relevance and impact for specific audiences than others. This abstraction may be problematic for some traditions of research.

Conclusions


We believe that there is sufficient reason, given the complex environment of online privacy and consent and the limitations of existing interventions, to seek better ways to support decision-making activity and privacy and consent practices online. This activity should take into account social and political research, technological perspectives and communication methods. This article has argued for the value of using a game as a mode of communication and engagement and to facilitate online privacy literacy. It has drawn upon research to produce such a game and has provided an account of one set of design decisions. Procedural rhetoric has potential, but is not a magic bullet for engagement on a political issue. We believe Privacy to be a relatively innovative game intervention, and a novel one from a political research perspective. There is also broader potential for political engagement and education: many alternative arguments are expressible in similar forms, and this methodology could be applied to other contentious political issues.

Notes

The game and this research would not have been possible without the participation of all of our players and play testers. Particular thanks to Lizzie Coles-Kemp, Alison Adam, Conn Crawford at Sunderland City Council, Dave Birch and Margaret Ford at Consult Hyperion, and the rest of the VOME team. The VOME project was funded by EPSRC, ESRC and the Technology Strategy Board; grant reference EP/G00255X/1.

  1. This has been described as the death of forgetting (Mayer-Schönberger, 2009).

  2. Meaning that we believe that many of the arguments made here would apply equally to a computer game as to more ‘traditional’ card or board games. Particular game forms might be considered to have particular affordances and capacities.

  3. We talk of ‘learning’ here rather than communication or engagement, because this perspective does not assume that the game designer or privacy researcher has all the answers. Instead we propose a way of supporting investigation and questioning of online privacy and consent ecologies.

  5. Similarly, ‘serious games’ are seen as those games that focus on specific and intentional learning outcomes to achieve serious, measurable, sustained changes in performance and behaviour (Bogost, 2007, p. 4). The category of serious games also includes a range of games that attempt to harness the activity of players to produce not just (or even) educational development in the players, but to achieve positive social change in the world. Drawing upon multiplayer games, these games attempt to harness the willingness of large numbers of people to pool their effort, knowledge and skills in attempts to solve complex problems (McGonigal, 2011). Examples include A World without Oil, which asked players to imagine they were living in a world where petrol supplies had run out and how they would adjust their lives, and Evoke, a game designed by the World Bank Institute to help empower young people to come up with creative solutions to development problems. Serious games have found use across a wide range of organisations and social sectors, including higher education and defence (Caspian Learning, 2008).

  8. VOME did not pursue this, but Privacy has since been taken forward by the Open University and developed into an online platform (http://www8.open.ac.uk/platform/news-and-features/new-privacy-game-explores-the-cost-telling-all-online).

References

  • Barnard-Wills, D. (2012) ‘E-Safety: Young People, Surveillance and Education’, Criminology and Criminal Justice, 12 (3), 239–255.
  • Barnard-Wills, D. and Ashenden, D. (2010) ‘Public Sector Engagement with Online Identity Management’, Identity in the Information Society, 3 (3), 657–674.
  • Bennett, C. (2011) ‘In Defence of Privacy: The Concept and the Regime’, Surveillance and Society, 8 (4), 485–496.
  • Best, J. (1984) ‘Teaching Political Theory: Meaning through Metaphor’, Improving College and University Teaching, 32 (4), 165–168.
  • Bogost, I. (2007) Persuasive Games: The Expressive Power of Videogames. Cambridge MA: MIT Press.
  • Boyd, D. and Marwick, A. (2011) ‘Social Privacy in Networked Publics: Teens’ Attitudes, Practices and Strategies’, A Decade in Internet Time: Symposium on the Internet and Society, 22 September.
  • Brooks-Young, S. (2010) Teaching with the Tools Kids Really Use: Learning with Web and Mobile Technologies. Thousand Oaks CA: Corwin.
  • Caspian Learning (2008) Serious Games in Defence Education. Shrivenham: Defence College of Management and Technology.
  • Coles-Kemp, L., Lai, Y. and Ford, M. (2010) ‘Privacy on the Internet: Attitudes and Behaviours’, VOME. Available from: http://www.vome.org.uk/wp-content/uploads/2010/03/VOME-exploratorium-survey-summary-results.pdf [Accessed 6 January 2011].
  • Cunningham, S. J., Masoodian, M. and Adams, A. (2010) ‘Privacy Issues for Online Personal Photograph Collections’, Journal of Theoretical and Applied Electronic Commerce Research, 5 (2), 26–40.
  • Debatin, B., Lovejoy, J., Horn, A. and Hughes, B. N. (2009) ‘Facebook and Online Privacy: Attitudes, Behaviours, and Unintended Consequences’, Journal of Computer-Mediated Communication, 15 (1), 83–108.
  • Fry, H., Ketteridge, S. and Marshall, S. (2009) A Handbook for Teaching and Learning in Higher Education: Enhancing Academic Practice. New York: Routledge.
  • FutureLab (2005) Games and Learning: A Handbook from Futurelab. Bristol: FutureLab.
  • Gandy, O. H. (2009) Coming to Terms with Chance: Engaging Rational Discrimination and Cumulative Disadvantage. Farnham: Ashgate.
  • Grimmelmann, J. (2009) ‘Saving Facebook’, Iowa Law Review, 94 (4), 1137–1187.
  • Gürses, S. (2010) ‘Pets and Their Users: A Critical Review of the Potentials and Limitations of the Privacy as Confidentiality Paradigm’, Identity in the Information Society, 3 (3), 539–563.
  • Huang, T. and Plass, J. L. (2009) History of Play in Education. White Paper. New York: Institute for Games for Learning.
  • Ibrahim, Y. (2008) ‘The New Risk Communities: Social Networking Sites and Risk’, International Journal of Media & Cultural Politics, 4 (2), 245–253.
  • Koster, R. (2005) A Theory of Fun for Game Design. Scottsdale AZ: Paraglyph Press.
  • Lace, S. (2005) ‘Introduction’, in S. Lace (ed.), The Glass Consumer: Life in a Surveillance Society. Bristol: Policy Press/National Consumer Council, pp. 1–16.
  • Lyon, D. (1994) The Electronic Eye: The Rise of Surveillance Society. Minneapolis MN: University of Minnesota Press.
  • McGonigal, J. (2011) Reality is Broken: Why Games Make Us Better and How They Can Change the World. London: Jonathan Cape.
  • Mayer-Schönberger, V. (2009) Delete: The Virtue of Forgetting in the Digital Age. Princeton NJ: Princeton University Press.
  • Moon, J. (2004) A Handbook of Reflective and Experiential Learning: Theory and Practice. London: Kogan Page.
  • Murakami Wood, D., Ball, K. and Raab, C. (2006) A Report on the Surveillance Society. Wilmslow: Information Commissioner's Office.
  • Nissenbaum, H. (2010) Privacy in Context: Technology, Policy and the Integrity of Social Life. Stanford CA: Stanford Law Books.
  • Park, Y. J. (2011) ‘Digital Literacy and Privacy Behavior Online’, Communication Research, Online first. doi: 10.1177/0093650211418338.
  • Race, P. (2005) ‘Using Feedback to Help Students Learn’, Higher Education Academy. Discussion Paper. Available from: http://phil-race.co.uk/wp-content/uploads/using_feedback.pdf [Accessed 1 May 2013].
  • Scacchi, W. (2004) ‘Socio-technical Design’, in W. S. Bainbridge (ed.), The Encyclopedia of Human–Computer Interaction. Great Barrington MA: Berkshire Publishing Group, pp. 656–659.
  • Schell, J. (2008) The Art of Game Design: A Book of Lenses. Burlington MA: Morgan Kaufmann.
  • Solove, D. (2008) Understanding Privacy. Cambridge MA: Harvard University Press.
  • Stalder, F. (2002) ‘Privacy is Not the Antidote to Surveillance’, Surveillance and Society, 1 (1), 120–124.
  • Turkle, S. (2011) Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic Books.
  • Turow, J. (2006) Niche Envy: Marketing Discrimination in the Digital Age. Cambridge MA: MIT Press.
  • Viseu, A., Clement, A. and Aspinall, J. (2004) ‘Situating Privacy Online: Complex Perceptions and Everyday Practices’, Information, Communication and Society, 7 (1), 92–114.
  • Whitton, N. (2010) Learning with Digital Games: A Practical Guide to Engaging Students in Higher Education. London: Routledge.

Biographies

  • David Barnard-Wills is a Senior Research Analyst at Trilateral Research and Consulting LLP. He holds a PhD in Politics from the University of Nottingham, and is an Associate Member of the Higher Education Academy. He has previously been a Research Fellow at the University of Birmingham, Cranfield University and the Parliamentary Office of Science and Technology. Research interests include the politics of privacy, surveillance and security technologies. His research blog is http://www.surveillantidentity.com. David Barnard-Wills, Trilateral Research and Consulting LLP, Crown House, 72 Hammersmith Road, London W14 8TH, UK; email: david.barnard-wills@trilateralresearch.com

  • Debi Ashenden is a Senior Lecturer in the Department of Informatics and Systems Engineering at Cranfield University. She specialises in information assurance in general, and risk assessment in particular. Other areas of interest include human factors in information assurance, information sharing, threat assessment and information security awareness. Debi Ashenden, Department of Informatics and Systems Engineering, Cranfield University, Shrivenham, Swindon SN6 8LA, UK; email: d.m.ashenden@cranfield.ac.uk