Angie McKeown


Privacy and Games Development

Using the FAIR Taxonomy to align Games Development with GDPR

Disclaimer – I am not a lawyer, and you should always get advice before relying on anything which might impact your business legally.

Abstract—This paper briefly outlines what GDPR is, what the FAIR framework is, and why GDPR impacts Security Risk Assessment for Games Development. It then discusses the increasing use of personal data in games and the difficulty of identifying ‘personal data’ under GDPR within a games environment, and identifies some cyber attack vectors for games. It then uses the FAIR framework to give examples from the games domain, and finally concludes that while only time will tell how fully GDPR will impact games, FAIR is a responsible step in the right direction.

Index Terms—Games Industry, General Data Protection Regulation, GDPR, ePrivacy Regulation, FAIR framework

I. Introduction

The games industry produces a wide range of products which may handle vast amounts of user data, are technically complex both locally and over IP, and are subject to cyber attacks in equal or greater measure than other industries, because their own users will frequently engage in low-level attacks in order to ‘cheat the system’ and gain in-game benefits. In addition, many customers are minors, exacerbating the magnitude of consequence of any data breach. Another complicating factor is that, despite first appearances, games studios outside the big league of Tencent, Microsoft, Nintendo, Activision, and Sony are often much smaller affairs than they first appear: on a tight budget with only a handful of underpaid staff, struggling to survive until next release day – a scenario which does not bode well for devoting adequate time to security concerns.


With the imminent introduction of the EU General Data Protection Regulation (GDPR) [1], which applies to every European service user regardless of where the business itself is based, the upgrade of the EU ePrivacy Directive (colloquially known as the Cookie Law) to a Regulation [2], and the exponential increase in potential financial fines should a breach occur, some serious conversations need to take place about Security Risk Assessment within games.


The new GDPR will require all games companies to handle EU personal data in a much more careful manner, and allow the user many opt-outs and controls not previously granted by law. In addition, the fines are considerable. While previous Data Protection laws were more concerned with apportioning punishment in line with the severity of any breach, GDPR fines concern themselves with non-compliance with the regulation, and are substantially greater: up to €20 million or 4% of annual turnover, whichever is higher. Few current games companies could survive this.


Part of the requirement of showing your GDPR data processing is legitimate, consensual, and ‘as described’ is having a Risk Assessment, and this is where the FAIR [3] risk taxonomy comes into play.


FAIR is a step-by-step framework which assists the business in preparing a detailed risk assessment that looks at the physical impact and, more unusually, also the business impact of any breach. The FAIR framework is particularly noted for its flexibility in working with other existing systems, and for its ability to assess Risk in depth, with a level of granularity and clarity of coverage that other frameworks may not achieve.


This paper considers whether the FAIR framework can therefore be used successfully during game development to bake in Privacy-By-Design concepts and help to correctly identify and assess the increased risks that the new EU regulations bring specifically to the Games Industry.


II. Trend Discussions/Background


Despite GDPR being finalised and announced almost two years ago, in May 2016, there has been widespread apathy about it from most sectors, and games has been one of the worst, with many companies assuming it just didn’t apply to them as it was more about customer names and addresses. Even ICANN, which coordinates the global domain name system, brushed off warnings from several EU bodies for years before finally admitting it needed to do something, and then realised it was too late to put a system in place in time [4].


So, what’s the trend, and what have game companies done to deal with this already? Unfortunately, not an awful lot. A representative from The Pokémon Company, which brought in $3.3 billion in retail revenue in 2016 [5] and manages the handheld console and online Pokémon games for the UK and US markets, confided to me they have been ‘up to their eyes in [GDPR] for months’. So, they’re clearly taking it seriously, but have only been preparing for months, not the years that were available, despite their resources. Other companies have already hinted that it’s just too much time and money to refactor old code to bring old games into compliance, and that older online titles and servers may just be retired. WarpPortal went a step further and announced that “all games and WarpPortal services to the European regions … will be terminated on May 25th, 2018” [6], including the hugely popular Ragnarok Online – a move which did not endear them to users.


In the literature there are few papers tackling the issue of privacy specifically within Games, but those that do certainly agree it is a concerning issue, and tend to propose assessment frameworks of their own for a very niche subject area like power grids [7] or specific technical workarounds for gathering data less problematically [8] such as location authentication by image, rather than GPS.

They universally neglect to mention GDPR. The single paper I could find which takes a thorough approach to the subject, called “Incorporating Privacy into Digital Game Platform Design” [9] still fails to specifically mention GDPR or take account of its vastly increased penalties and increased focus on user opt-outs and requests for erasure, despite being published well after the release of the legislation.


Those papers that mention FAIR do not connect it with Games Development. There are no scholarly articles connecting GDPR and FAIR that I could find, just a blog post on how Enterprise Architects can deal with GDPR [10]. This also mentions the ePrivacy Directive and the NIS Directive, and advocates using common standards such as ISO 27001 to identify useful controls, but while it is a reasonable overview of compliance steps it is primarily promoting the organisation’s own product ‘Enterprise Studio’, which is built on ArchiMate [11] and FAIR standards, rather than being a useful in-depth analysis.


III. Personal Data in Games: Defining the Problem


Gamification is increasingly recognised as a useful tool for learning and motivation, and is used ever more widely in education and business as a means to teach and to retain information, and also as a means to record what has been learned. Users unlock learning achievements and ‘level up’ their skills, and this is reported back, unlocking access to certificates and other rewards. As users provide payment information, confirm identity data, interact with contacts, record skill levels, or store location data to unlock new challenges, this has further increased the amount of sensitive user information available, opening Games companies and those who utilize their products up to a myriad of privacy and data protection issues.

There are strikingly obvious concerns about the personal data we are willingly handing over and that is being stored and/or processed about us. We expect any company to carefully control and store personal data such as payment and identity information, for example.

Personal data: Location information

When it comes to games, however, we can be in a whole new territory. For example, take the many motivational running games now available. Run an Empire [12] will map the location data of your run to an area and then claim it as your territory. It seems innocent enough, but even without a data breach this kind of tracked location data can still be problematic personal information, as can be seen from the recent scandal caused by fitness app Strava’s collated map visualisation data, which tracked its users onto secret military installations and then displayed those on the web for all to see [13]. With an actual data breach there is a chance individual information could be used to pinpoint where a user works or lives, or areas they frequent – even more problematic for women, who are often subjected to street harassment or stalking.
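As a minimal illustration of the kind of mitigation available at the design stage, the sketch below drops tracked points that fall near a player-declared ‘privacy zone’ before they are ever stored – similar in spirit to the privacy-zone features fitness apps added after the Strava incident. The function names, radius, and data shapes are illustrative assumptions, not any particular game’s API.

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def filter_track(points, privacy_zones, radius_m=500):
    """Drop any track point within radius_m of a declared privacy zone,
    so the stored run can never reveal the protected locations."""
    return [(lat, lon) for lat, lon in points
            if all(haversine_m(lat, lon, zlat, zlon) > radius_m
                   for zlat, zlon in privacy_zones)]
```

Filtering at capture time, rather than masking in the map view, means the sensitive points never reach the server at all – a Privacy-By-Design choice rather than a display setting.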

Personal data: In-game chat

Many games now have voice and message chat between players, which may be stored on the game servers. There is no way to tell whether this contains personal information for any one user, but it would seem prudent to protect it and treat it as though it does. The PR disaster alone would be enormous if it leaked.

Personal data: Character accounts

Theft of character accounts isn’t just a security issue. The fact that character accounts are often connected to real-world currency means that monetary loss could occur to the player, and the loss of in-game items, stats, and reputation may also cause players a lot of anguish. All these things reflect on the player’s person, and any breach would be considered a serious betrayal of trust, damaging the reputation of the game and the games company.


Personal data: Cyber Stalking

And it isn’t just physical location data we need to be worried about with games. Now that we’re playing more and more multiplayer online games, and online harassment is a recognised and growing problem, online locations may come into play, too. If my real-life home address is personal information, is my in-game home base private too? One look at the enormous number of statistics you can access about your Warcraft character, from the number of random rabbits you have unnecessarily killed to the number of times you’ve visited a particular game location, should make it obvious that if someone wanted to follow you around, harass you, or prevent your progression in an online world, they absolutely could. And in many systems there is not necessarily a robust or effective way of dealing with that [14].

If your game carelessly gives my online information away to my cyber-stalker, do I have a case against you? We don’t yet know. And so it behoves companies to treat this with some care.


Personal Data: Biometrics

With the increasing popularity of VR games and advances in the marketplace, a whole new realm of personal data – biometric data – is out there waiting to escape into the wild. Immersive games now regularly track eye movement and galvanic skin response, and a new slew of headsets and accessories can now track breathing rates, heart rates and even your thoughts, as a way of controlling in-game avatars and situations.

John Burkhardt, a behavioural neuroscientist, warns [15] that access to biometric data streams like eye tracking, facial tracking and emotional detection, galvanic skin response, EEG, EMG, and ECG means “big data companies who may be capturing [biometric data] could be able [to] predict our behavior, but potentially even be able to directly manipulate and control it”.

Personal data: Addiction/Behavioural Psychology

On yet another, deeper level of hidden personal information, companies have shown there is a wealth of money to be made from finessing in-game content according to player behaviour. Zynga and King were by no means the first, but could certainly be considered some of the most successful to use this model, where game difficulty levels, colours, phrases and reward tiers are tuned through continual data collection and A/B testing to increase the psychological commitment of the user and make it difficult to stop playing. Games such as Farmville and Candy Crush Saga spring to mind, but the technique is now used across the board in everything from Facebook games to gambling sites. Extensive analytics tracking is built into both major games engines, Unity3D [16] and Unreal Engine [17], and questions have already been raised about how the use of these conflicts with GDPR requirements [18].
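If analytics collection is to rest on a lawful basis such as Consent, one design-stage approach is to gate every telemetry event behind an explicit, purpose-specific consent record. The sketch below is a hypothetical illustration of that pattern; the class names, purposes, and event names are invented, and real engines’ analytics APIs differ.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentStore:
    """Records which processing purposes the player has opted in to."""
    granted: set = field(default_factory=set)

    def grant(self, purpose): self.granted.add(purpose)
    def revoke(self, purpose): self.granted.discard(purpose)
    def allows(self, purpose): return purpose in self.granted

class Analytics:
    def __init__(self, consent):
        self.consent = consent
        self.sent = []           # stand-in for a network send queue

    def track(self, event, purpose):
        """Record the event only if its declared purpose is consented to;
        otherwise drop it silently -- the game must work without it."""
        if self.consent.allows(purpose):
            self.sent.append((event, purpose))

consent = ConsentStore()
analytics = Analytics(consent)
analytics.track("level_complete", purpose="difficulty_tuning")  # dropped
consent.grant("difficulty_tuning")
analytics.track("level_complete", purpose="difficulty_tuning")  # recorded
```

Because revocation takes effect on the very next event, this also models the GDPR requirement that consent be as easy to withdraw as to give.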


When Blizzard roll out a new dungeon in Warcraft they closely analyse the playthroughs of the first few elite teams of players in order to tweak difficulty settings, generating a vast amount of player data as they do so. Keeping players addicted at just the right level of difficulty – giving quests that keep them interested and feeling like they’ve accomplished something tough, but not so hard that they turn away – is Big (and repeat) Business. It’s a feeling of satisfaction with ourselves that we can rarely accomplish so reliably in real life. And in this hyper-analysis of our ‘Likes’ and ‘Quests’ and ‘Kills’, and all of the other in-game choices we make, companies have a vast amount of data about us that they are using to piece together what motivates us as a person.


It’s pretty certain, however, that players do not understand the full extent of how their behaviour is being monitored, analysed, and then used to tweak the game difficulty. This is a fundamental part of difficulty setting and responsiveness in gaming, and of making the user experience feel dynamic, and yet when developers talk about it openly it is regularly met with anger from the public, who can feel like they are being taken advantage of. If this misunderstanding about the necessary use of behavioural data were to translate to a GDPR case, the company in question could be in serious trouble.


So the question becomes: is this personal data, even though it was generated by an individual acting as a game character at the time? If it can be analysed to profile us and manipulate us, should we consider it personally identifiable information? Should it then be protected like the rest of our personal data, even though technically it belongs to our game character? That will be a question for the courts, in the event of a breach. It may be that “it’s only character data” will prove a shaky argument in the face of the current, albeit untested, GDPR.

IV. GDPR

As you can see, the thorny issues games throw up around privacy are non-trivial. But why does it really matter? Well, under the Data Protection Act it may not have. Much more emphasis was placed on the intent of the company, and the actual damage done by the breach. GDPR, however, is much more concerned with effort (and indeed success) in complying with the legislation than with whether any harm was done in the real world. As previously mentioned, the financial penalties are considerable.


GDPR considers various lawful bases for processing personal data within Article 6.


Legitimate Interests is one of the most used and certainly the most applicable to games; the Information Commissioner’s Office (ICO) describes it as a basis under which you must “use people’s data in ways they would reasonably expect and which have a minimal privacy impact” [19].


In addition, many games have minors as customers. GDPR sets a minimum age below which children cannot provide their own consent (16 by default, though member states may lower it to 13, as the UK has), and requires age-appropriate privacy notices (i.e. of a suitable reading level and clarity). Consultation with children during design is good practice.


GDPR also insists on a right to erasure for personal data, especially for minors. This can be tricky or impossible for some games to retrofit into their code, and will affect both game structure and security, which is why using a framework and making these considerations during the design phase is so important.
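One common way to make erasure retrofittable is to keep direct identifiers in a separate table, keyed by a random pseudonym, so that honouring an erasure request means deleting the identity link rather than rewriting gameplay history. The sketch below illustrates that idea; it is a hypothetical in-memory model, not a production data store, and whether retained gameplay records are truly non-attributable afterwards is ultimately a legal judgement.

```python
import uuid

class PlayerStore:
    """Separates identity data from gameplay data via a pseudonym key."""

    def __init__(self):
        self.identities = {}   # pseudonym -> {"email": ..., "name": ...}
        self.gameplay = {}     # pseudonym -> list of gameplay events

    def register(self, email, name):
        pseudonym = str(uuid.uuid4())
        self.identities[pseudonym] = {"email": email, "name": name}
        self.gameplay[pseudonym] = []
        return pseudonym

    def log_event(self, pseudonym, event):
        self.gameplay[pseudonym].append(event)

    def erase(self, pseudonym):
        """Right-to-erasure request: delete the identity link. Gameplay
        records survive only because they can no longer be tied back."""
        self.identities.pop(pseudonym, None)

store = PlayerStore()
pid = store.register("player@example.com", "Ash")
store.log_event(pid, "caught_pikachu")
store.erase(pid)
```

Designing the schema this way from day one is far cheaper than the retrofit the text describes, which is exactly the Privacy-By-Design argument.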


We have already established that games companies can have a multitude of data points at their fingertips, but what they are doing with them can also be problematic.

Article 4(4) of the GDPR defines profiling as including “…automated processing of personal data to…analyse or predict…interests,…behaviour, location or movements.”

You must inform people they can object to profiling, but this becomes difficult if profiling is part of the functionality of your game.


Alternatively, if they decide via behavioural analysis/profiling that you love Pikachu, because of data collected in-game, and they then use that in a targeted marketing campaign aimed at you, could it be argued that this is using the data for a purpose not originally declared to you? Can an extensive End User License Agreement (EULA) really cover this usage? The lawful basis of Consent says the individual must give ‘clear consent’ for their information to be used for a ‘specific purpose’.

In fact, if they are carrying out profiling activities with data such as heartrate or other biometrics, companies are also required to have carried out a Privacy Impact Assessment [20].


For example, Oculus (which is owned by Facebook) has a privacy policy [21] which some VR professionals remain unhappy with, due to the general concerns around data aggregation and sharing that have arisen with the Facebook parent company, and its tendency to combine data across multiple services.

When Oculus says that they collect “information about your environment, physical movements, and dimensions when you use an XR device” it doesn’t make it clear how much of this data can be tied to your identity or used to reconstruct movements.


The GDPR defines breaches to include (amongst other things) unauthorised alteration of data and loss of availability – two very common problems for games – and mandates notification to the supervisory authority within 72 hours, with affected individuals informed without undue delay where the risk to them is high. When you consider how this could affect brand perception for some companies, it should become clear how important a proper Risk Assessment really is.


Are smaller games companies really equipped to protect themselves against the new GDPR provisions? Most likely not. And this makes their security assessments absolutely critical.


V. Cyber Attack Vectors on Games/The FAIR Model


So why are games such a target? Well, mostly because gamers like to make life easy for themselves. Have you ever spent hours stuck on one level? Did you ever type in a ‘cheat code’ from a magazine as a child? Did you ever Google a walkthrough? Gamers often expect these glitches or ‘easter eggs’ to be built into games and go looking for them. This can result in unexpected behaviour, especially when the developers have often run out of time or budget and the game itself has been rushed out the door with less testing time than would be ideal.

In addition, a huge shift has happened towards micro-transactions and downloadable content in the past decade. This means that many games are connected to payment accounts, which has turned games into a viable financial target for hackers.


It’s not too hard to break a system if you have a dedicated group of people who have an expectation that there will be a reward for their time investment (whether in-game bonuses or actual money) and a lot of time on their hands (many hardcore gamers and hackers are teenagers, so not yet working or with families to look after).


In addition, a whole industry around in-game mods has sprung up within Massively Multiplayer Online Roleplaying Games (MMORPGs). Most of these are not for the purpose of directly cheating, as this would result in an immediate ban and is often detectable by the game engine itself; instead they are used to improve the display of weapons or inventory, or to speed up keystrokes or reassign key bindings or macros for common tasks. Enthusiastic coders designing these interface mods sometimes accidentally find ways to cheat the system. Popular interfaces, even if they do not directly cheat, can send player stats or login information back to third-party servers, compromising user privacy and security without their knowledge. This is a murky grey area of liability. If your own user data from Big Game Co is leaked out through an unauthorised mod that you installed yourself, who is responsible for the data? How much should you be protected from your own stupidity? How secure is secure enough? Once again, until we see some of these cases in court, there are no clear answers.

Threat Modelling

Understanding what threats and vulnerabilities may occur is a vital part of completing any Risk Assessment thoroughly and accurately. Thorough Threat Modelling is not always easy, and Game Designers, who are usually focussed on user interaction models and game aesthetics, may find the mindset particularly difficult to step into.

A helpful preparatory step before the Risk Assessment may therefore be to use some industry standard Threat Modelling tools [Figure 1] [22] such as STRIDE, Security Cards, or Persona Non Grata in order to help identify threats to incorporate into the FAIR framework.

Figure 1: Threat Modelling Tools [22]

For example, STRIDE provides a model where multiple threat questions are posed across six categories – such as “How can an attacker change the authentication data?” within the ‘Spoofing identity’ category – to try and ensure all areas are considered.
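A STRIDE pass can be as lightweight as walking one prompt per category for each game feature. The sketch below generates such a checklist; the prompts are paraphrased examples written for a games context, not the official Microsoft question set.

```python
# One illustrative prompt per STRIDE category, phrased for games.
STRIDE = {
    "Spoofing":               "How could an attacker impersonate a player or server?",
    "Tampering":              "How could an attacker alter saved games, packets, or stats?",
    "Repudiation":            "Could a cheater deny an action because we keep no audit log?",
    "Information disclosure": "What personal or in-game data could leak from this feature?",
    "Denial of service":      "How could this feature be used to knock players offline?",
    "Elevation of privilege": "Could a player gain admin or GM abilities through it?",
}

def threat_checklist(feature):
    """Yield one prompt per STRIDE category for the given game feature."""
    return [f"[{cat}] {feature}: {q}" for cat, q in STRIDE.items()]
```

Run once per feature (chat, trading, matchmaking, mod loading), the answers become the Threat Event inputs the FAIR framework expects.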


Security Cards encourage you to think broadly and creatively about computer security threats, exploring 42 cards along four dimensions (suits): Human Impact, Adversary’s Motivations, Adversary’s Resources, and Adversary’s Methods. These may be particularly suited to use in games companies because they most closely resemble an actual game.


The idea behind Persona Non Grata is simply to create User Stories for attackers, in addition to those of normal users, and then derive threat models from them; the technique is now widely used under various names in software assurance at all levels of industry.


As any Risk Assessment is only as good as the information that goes into it, any of these would be good preparation. For larger companies, those dealing with personal information, or those simply keen to do due diligence, ISO 27005 (Information Security Risk Management), ISO 27035 (Incident Management) and even ISO 27018 (PII in Public Clouds) would also be advisable reading.

The FAIR Taxonomy

Figure 2: The FAIR Taxonomy

The FAIR framework (Figure 2) is divided into two halves: one deals mainly with the event itself, and the second mainly with the effects on the business. It is this direct translation of the loss to the business, making the event tangible, which helps to make FAIR so valuable. Due to a need for brevity we will use only a few examples of possible games applications of FAIR, but the real advantage of the framework is its flexibility in being applied to games (an unusual industry in many respects), and that it easily covers the business and privacy factors which GDPR brings into play, without breaking down or requiring adjustment.

Loss Event Frequency

Games suffer from a relatively high Threat Event Frequency; Massively Multiplayer Online (MMO) games in particular.

This is due to the fact that the majority of attacks come from within their own customer base. The first source is players who wish to game the system in order to gain advantage for their in-game player characters.

Usually this takes the form of some kind of hacking, and can range from very simple interface hacking to speed up the execution of normal commands, right through to awarding abilities or funds that the player character would not normally have access to. The skill required ranges from downloading a ‘patch’ from a website and running it on the local game client, to coding a customised system after hacking into the game server itself. Its effects can seem relatively innocuous on the surface, although more serious breaches could have unpredictable effects on game server behaviour.

The second is usually a social engineering attack by individuals or organised crime to steal user accounts and sell them for profit. In the past, the player’s remedy for this was to report the theft and somehow prove that the account was theirs all along (usually by proving the payment method was registered to them), after which the game admins would restore it. The idea that their personal information was, for a time, in the hands of another was barely addressed. The fact that it was frequently happening to minors (for some value of ‘minor’, anyway) was also ignored. The Threat Capability of this group is relatively low, and usually amounts to social engineering or keyloggers attached to ‘patch’ files which users who try to customise their experience then fall foul of. However, they are driven by the value of the asset: for example, a Warcraft ‘alt’ (a premade high-level character sold on the grey market, then transferred to your account, against the T&Cs of the game) can cost hundreds of Euros.

This in turn means that the Vulnerability of the system holistically may be high, even when Control Strength is well maintained, just because it is under continual and sustained attack from script kiddies who are always searching for new ways to get the best loot. The introduction of two factor authentication in some systems has gone some way to mitigating the effects of this by increasing control strength, but in others, especially games aimed at a younger audience such as Pokémon Go, two factor authentication is a much harder sell, as it may be difficult to make it simple enough for their players to understand and use without bespoke hardware.

This in turn means that Loss Event Frequency may be high.
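For reference, the one-time codes behind such authenticators typically follow RFC 6238 (TOTP), which can be sketched with nothing but the standard library. This is an illustrative implementation only – production systems should use a maintained library and handle clock-drift windows, rate limiting, and secret storage.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, timestep=30, digits=6, now=None):
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if now is None else now) // timestep)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def verify(secret_b32, submitted, now=None):
    """Constant-time comparison of a submitted code against the current one."""
    return hmac.compare_digest(totp(secret_b32, now=now), submitted)
```

The ‘harder sell’ for younger audiences is the enrolment and recovery flow around this code, not the algorithm itself, which is why bespoke hardware tokens have been one answer.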

Probable Loss Magnitude


As mentioned, the breakdown of Probable Loss Magnitude is where FAIR excels.

Despite account stealing being widely prevalent for years, it has been generally accepted by gamers as a known risk, and so Reputational Loss from this would likely be surprisingly low. In part this is because MMO games companies, and in particular Blizzard, a pioneer in this field, have worked hard on exceptional customer service to try and restore accounts with all speed, thus minimising inconvenience to customers and making them somewhat blasé about the phenomenon.

Accounts are generally stolen via social engineering or rogue plugins on the user’s machine, so the cost of Replacement is only the staff hours to talk the user through securing their own machine and then reinstating the account on the server, and the actual breach is not on the game server itself. Many gamers also have limited realisation of how personal the details in their account are.

At the height of the problem an outcry had begun about why companies were not doing more to prevent users installing unauthorised plugins for the game which compromised their machines, but a widespread push for two-factor authentication has helped alleviate the problem. With the introduction of 2FA companies now also have the ability to reflect blame on the customer by asking why they did not have 2FA turned on.

In a true breach of multiple MMO player accounts, there is a reasonable likelihood that the vast majority of them would already be protected by 2FA, and that customers would trust the company to restore them without too much fuss, meaning Reputation Loss is likely to be low. They are much more likely to be worried about getting back their progress from hours of play time than about their personal data going missing, and so will be easy to reassure.


The Legal & Regulatory Asset Loss factor is a big consideration for all Risk, now that GDPR is imminent. This risk will change depending on whether the breach involves biometric data, profiling data, or personally identifiable data such as address and payment information. In addition, in-game information may not have been considered to be privileged before, and it is yet unclear how that will play out in the courts, so this risk may also change over time.


Cost is another key loss factor easily arrived at. One user account is worth a certain amount to the company, so this can be multiplied by how many accounts are currently active.
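As a toy illustration of how easily this loss factor can be quantified, the sketch below multiplies an assumed per-account value by the number of accounts affected, then folds the result into the basic FAIR relationship of risk as Loss Event Frequency times Probable Loss Magnitude. Every number here is invented for the example.

```python
def breach_replacement_cost(active_accounts, value_per_account, fraction_affected):
    """Replacement component of Probable Loss Magnitude: per-account value
    multiplied by the number of accounts caught in the breach."""
    return active_accounts * value_per_account * fraction_affected

def annualised_loss_exposure(loss_events_per_year, loss_per_event):
    """FAIR's top-level product: Loss Event Frequency x Probable Loss Magnitude."""
    return loss_events_per_year * loss_per_event

# Invented figures: 500k active accounts at EUR 2.50 each, 1% affected,
# one qualifying loss event expected every two years.
plm = breach_replacement_cost(active_accounts=500_000,
                              value_per_account=2.50,
                              fraction_affected=0.01)
ale = annualised_loss_exposure(loss_events_per_year=0.5, loss_per_event=plm)
```

In a full FAIR analysis these point values would be ranges or distributions, and the other loss forms (reputation, legal and regulatory, productivity) would be added alongside Replacement.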


Threat loss factors:

As previously discussed, Actions in the case of gaming are most likely to be unauthorised modification, in order to gain extra credits, modify a game interface for easier use, faster interactions, automation, or some other competitive advantage.

Access Denial via DoS attack is also common on game servers; in addition to the usual motives of distraction, acting as a prelude to a larger hack, or costing the company money, it is often used to penalise other players and cost them in-game advantage or assets, as in the example of Eve Online, where players may lose expensive undocked ships in some outages [23].


It may seem like critical asset loss is less likely to result in Productivity Loss for games companies as their few staff are not operating on the same network as the game servers, but one must remember that productivity in this scenario is that of the company, and therefore if the game is not making money because servers are down, then productivity is actually impaired and Productivity Loss is high.

However, at least for larger businesses, with today’s trends towards cloud hosting, robust failover to a new server and spinning up new game instances is likely to be automated for larger games, ensuring outages are much reduced.


Organisational Loss factors like Timing almost always apply in a games scenario because, by their nature, games companies are never out of the hype cycle. The build-up to release is vital, and then release and DLC content keep interest high throughout the lifetime of the product. Rabid fanbases on Reddit and Twitter ensure there is never a slow time in which a breach would fly under the radar.

In addition, Due Diligence is a factor that gaming continually struggles with as it is rarely met. The pressures of the industry to release faster and keep up with breaking technologies mean that corners are often cut, even by large studios. In every area that isn’t immediately obvious in gameplay terms, it is likely that best practice has been sacrificed to budget and time constraints.

External Loss factors were previously primarily reaction-driven losses; i.e. if a breach wasn’t detected there was typically little negative response. GDPR’s mandatory reporting has changed this landscape completely.

Competition as a Stakeholder loss factor is a particularly interesting one in the games arena. Obviously, there are some areas in which there are dozens of similar games of one type, Bingo Games perhaps, and players may desert readily for another if there has been a breach. But there are other games (take Pokémon Go for an example) where players might decide to forgive and accept any risk from a breach as they do not wish to abandon the huge amount of enjoyable work and time they have invested in the game, and so they will look for ways to accept any reassurance the company wishes to offer.

Criticism of FAIR

FAIR is a victim of its own thoroughness in some ways, as new users of the framework in particular can struggle with the differing meanings of similar terms in different places within the framework.

The difference between Threat Capability and Threat Competence is subtle but important, for example; likewise the difference between Reputation damage to the company and Reputation loss as a Sensitivity/Asset Loss factor.

Even so, it still compares favourably with many frameworks that require far more training to use effectively.

Other Risk Assessment Methodologies

FAIR’s strengths lie in its ability to be granular, or not, depending on the circumstance. This is helpful in Games, where a company can be dealing with a Risk Assessment for a massively multiplayer game, yet also an Assessment for a smaller game of a different type, with a completely different threat landscape. Training the team on one model which can easily handle both scenarios is efficient.
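At whatever granularity it is applied, a FAIR analysis ultimately reduces to estimating Loss Event Frequency (Threat Event Frequency × Vulnerability) and Loss Magnitude, then combining them, typically via Monte Carlo simulation over calibrated ranges. A minimal sketch follows; all the three-point estimates are invented illustrative figures, not real data for any game:

```python
import random

def sample(low, mode, high):
    """Draw from a triangular distribution defined by a 3-point estimate."""
    return random.triangular(low, high, mode)

def simulate_annual_loss(runs=10_000):
    losses = []
    for _ in range(runs):
        # Loss Event Frequency = Threat Event Frequency x Vulnerability
        tef = sample(10, 50, 200)         # attack attempts/year, e.g. cheat-driven probing
        vuln = sample(0.01, 0.05, 0.2)    # fraction of attempts that succeed
        lef = tef * vuln
        # Loss Magnitude: primary response costs plus secondary losses (fines, churn)
        lm = sample(5_000, 20_000, 150_000)
        losses.append(lef * lm)
    losses.sort()
    return {"mean": sum(losses) / runs, "p90": losses[int(runs * 0.9)]}

print(simulate_annual_loss())
```

The same simulation serves a massively multiplayer title and a small casual game alike; only the input ranges change, which is exactly the reusability argued for above.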


I did look at, and discount, some other Risk Assessment models, however. RiskIT [24] was too focussed on a business IT infrastructure to deal with the variety that games present.

Structured Risk Analysis [25] was overly practical and not in-depth enough, and I felt it did not pay enough heed to the business intangibles. The Microsoft Threat Model has already been mentioned in passing; it uses topic categories under the acronym STRIDE to model threats and then DREAD [26] to compute risk. It is overly simplistic, however, relying on experts who already understand the intended direction of the terminology to strategise from it.
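DREAD’s simplicity is easy to see in code: each threat is rated 0–10 on five categories and the ratings are simply averaged. A sketch with invented example scores for a hypothetical attack on a game’s login API:

```python
def dread_score(damage, reproducibility, exploitability, affected_users, discoverability):
    """DREAD rates each category 0-10 and averages them into a single risk score."""
    return (damage + reproducibility + exploitability
            + affected_users + discoverability) / 5

# Invented example: credential-stuffing attack against a game's login API
score = dread_score(damage=8, reproducibility=9, exploitability=7,
                    affected_users=8, discoverability=9)
print(score)  # 8.2
```

The averaging hides everything that matters: a 10 on Damage and a 0 on Exploitability scores the same as two 5s, which is why the numbers are only as good as the experts assigning them.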

OWASP [27] is the only model I felt came close in usefulness, and it was appealing in its mapping of security terminology onto the various levels of the process, its customisation opportunities, and its weighting ability, all of which helped with flexibility. It could be a suitable replacement candidate if the team felt more comfortable with the terminology.
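The OWASP Risk Rating Methodology [27] averages likelihood factors and impact factors (each scored 0–9), maps each average onto LOW/MEDIUM/HIGH bands, and combines the two bands in a severity matrix. A sketch, with invented factor scores; a real assessment would use OWASP’s full factor lists:

```python
def level(score):
    """Map an OWASP 0-9 factor average onto LOW / MEDIUM / HIGH bands."""
    return "LOW" if score < 3 else "MEDIUM" if score < 6 else "HIGH"

def overall_severity(likelihood_factors, impact_factors):
    likelihood = sum(likelihood_factors) / len(likelihood_factors)
    impact = sum(impact_factors) / len(impact_factors)
    matrix = {  # (likelihood, impact) -> overall severity
        ("LOW", "LOW"): "Note", ("LOW", "MEDIUM"): "Low", ("LOW", "HIGH"): "Medium",
        ("MEDIUM", "LOW"): "Low", ("MEDIUM", "MEDIUM"): "Medium", ("MEDIUM", "HIGH"): "High",
        ("HIGH", "LOW"): "Medium", ("HIGH", "MEDIUM"): "High", ("HIGH", "HIGH"): "Critical",
    }
    return matrix[(level(likelihood), level(impact))]

# Invented example scores for a hypothetical leaked matchmaking database
print(overall_severity([7, 6, 8, 7], [8, 7, 6, 9]))  # Critical
```

The customisation mentioned above comes from re-weighting or swapping the factors that feed the two averages, which is what makes the model adaptable to a games context.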

                                                                                                                                                                                                       VI. In Conclusion


Part of the flexibility of the FAIR model is that it allows for varying depth and complexity as required, compared to some other, overly rigid or less appropriately structured risk models.

Games clearly involve an unusually granular level of customer data and some ambiguity over use cases and consent, which can make some of the modelling difficult in GDPR terms, but this will become clearer over time.

However, one of the criticisms [3] of the FAIR model is the unavailability of sufficient data, which is not something the games industry suffers from, as it is under almost constant low-level attack, and so FAIR suits it very well. The main issue may actually be that good Risk Analyses need to be iterative, and so automating the process with a software tool would be ideal. It is a shame that the FAIRiq tool now seems to be defunct.

This would also counteract another criticism [3] of FAIR, which is that it spends time going over Risks which are already mitigated, when unmitigated Risks are the only ones worth flagging. However, being able to show due diligence is important from a GDPR perspective, and desperately needed in Games. Once again, an automated tool would go some way towards producing the correct reporting and accountability in the background, while still flagging outstanding liabilities to management and the security team in a more obvious manner.
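The split argued for above, recording everything for accountability while surfacing only open risks, is straightforward to automate. A minimal sketch with invented risk names:

```python
def triage(risks):
    """Keep a full record for due diligence, but surface only unmitigated risks."""
    audit_log = [(r["name"], r["mitigated"]) for r in risks]    # everything, for the GDPR trail
    flagged = [r["name"] for r in risks if not r["mitigated"]]  # what management actually sees
    return audit_log, flagged

# Invented example register entries
risks = [
    {"name": "unencrypted telemetry", "mitigated": False},
    {"name": "stale session tokens", "mitigated": True},
]
log, flagged = triage(risks)
print(flagged)  # ['unencrypted telemetry']
```

The mitigated entries never reach the flagged list, but they remain in the audit log, which is precisely the due-diligence evidence GDPR accountability requires.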


As the Games Industry continues to embrace emerging technologies, incorporate IoT devices and cross media platforms, the problem of dealing with player information without running afoul of GDPR and other privacy laws will become more and more difficult, and the need for a robust but easily understandable framework for security and privacy will increase. I think FAIR is one of the few good contenders for this at present.


                                                                                                                                                                                                          VII. References

[1] European Parliament and the Council of the European Union, “Regulation (EU) 2016/679 of the European Parliament and of the Council (General Data Protection Regulation),” Official Journal of the European Union, pp. L119/1-L119/88, 2016.
[2] “Proposal for a Regulation on Privacy and Electronic Communications,” European Commission, 10 Jan 2017. [Online]. Available: [Accessed Mar 2018].
[3] The Open Group, “FAIR Technical Standard Risk Taxonomy,” Jan 2009. [Online]. Available: [Accessed Mar 2018].
[4] K. McCarthy, “As GDPR draws close, ICANN suggests 12 conflicting ways to cure domain privacy pains,” The Register, 09 Feb 2018. [Online]. Available: [Accessed Mar 2018].
[5] R. Sephazon, “The Pokémon Company Reports a 2016 Retail Revenue of $3.3 Billion,” Nintendo Life, Mar 2018. [Online]. Available: [Accessed Mar 2018].
[6] “Important Notice Regarding European Region Access,” WarpPortal, [Online]. Available: [Accessed Mar 2018].
[7] C. Rottondi and G. Verticale, “A Privacy-Friendly Gaming Framework in Smart Electricity and Water Grids,” IEEE Access, vol. 5, pp. 14221 – 14233, 2017.
[8] R. Gharsallaoui, M. Hamdi and T.-H. Kim, “A novel privacy technique for Augmented Reality cloud gaming based on image authentication,” in Wireless Communications and Mobile Computing Conference (IWCMC), 2017 13th International, Valencia, Spain, 2017.
[9] J. Laakkonen et al., “Incorporating Privacy into Digital Game Platform Design: The What, Why, and How,” IEEE Security & Privacy, vol. 14, no. 4, pp. 22-32, 2016.
[10] M. Lankhorst, “8 Steps Enterprise Architects Can Take to Deal with GDPR,” Bizzdesign, 31 Jan 2017. [Online]. Available: [Accessed Mar 2018].
[11] “Welcome to ArchiMate® 2.1, an Open Group Standard,” The Open Group, 2013. [Online]. Available: [Accessed Mar 2018].
[12] “Run An Empire,” Location Games Ltd, 2016. [Online]. Available: [Accessed Mar 2018].
[13] H. Lamb, “Engineering & Technology,” IET, 29 January 2018. [Online]. Available: [Accessed Mar 2018].
[14] R. Koster, “Still Logged In: What AR and VR Can Learn from MMOs,” Game Developers Conference 2017, 14 Mar 2017. [Online]. Available:
[15] K. Bye, “Voices of VR Podcast,” Mar 2017. [Online]. Available:
[16] “Unity, GDPR and Data Privacy – FAQ,” Unity, Mar 2018. [Online]. Available: [Accessed Mar 2018].
[17] “In-Game Analytics,” Unreal Engine, 21 Dec 2017. [Online]. Available:
[18] Various, “Unity Analytics and GDPR,” Unity, 17 Jan 2018. [Online]. Available:
[19] “Legitimate Interests,” Information Commissioner’s Office, [Online]. Available:
[20] “What does the GDPR say about automated decision-making and profiling?,” Information Commissioner’s Office, [Online]. Available: [Accessed Mar 2018].
[21] “Oculus Privacy Policy,” Oculus VR Ireland Ltd, 12 Feb 2018. [Online]. Available:
[22] F. Shull and N. Mead, “Cyber Threat Modeling: An Evaluation of Three Methods,” Software Engineering Institute | Carnegie Mellon University, 11 Nov 2016. [Online]. Available:
[23] “Connectivity Issues – DDoS Attack,” EVE Online, 24 Apr 2018. [Online]. Available: [Accessed Jul 2018].
[24] ISACA, “The Risk IT Framework Excerpt,” 2009. [Online]. Available: [Accessed Mar 2018].
[25] N. McEvoy and A. Whitcombe, “Structured Risk Analysis,” in International Conference on Infrastructure Security, Springer, Berlin, Heidelberg, 2002.
[26] Hardware Dev Center, “The DREAD approach to threat assessment,” Microsoft, 27 Jun 2018. [Online]. Available:
[27] OWASP, “OWASP Risk Rating Methodology,” Feb 2018. [Online]. Available:
[28] D. Ionita, “Current Established Risk Assessment Methodologies & Tools,” Faculty of Electrical Engineering, Mathematics and Computer Science (EEMCS), Enschede, Netherlands, 2013.




Figure 1: Threat Modelling Tools [22]

Figure 2: The FAIR Taxonomy  [3]
