Adam's book notes


Lost in a Good Game: Why We Play Video Games and What They Can Do for Us


Author: Pete Etchells



Prologue

Screens are part of everyday life for most of us now. They connect us to friends, family and strangers. We use them to work, to play, to inspire and to distract. Even so, many people are wary or scared of them.

Using screens does change how our brains work, but this isn't anything special; everything we do changes our brain. As to how they change it, the science on this is new and very incomplete - but what exists suggests an answer that is complex and nuanced.

Spending a lot of time in front of screens seems to feel instinctively bad. The gap between that feeling and the latest science might explain the existence of the current culture war against screens, and video games in particular.

It isn't a zero-sum game. Less time spent outdoors doesn't straightforwardly correlate with more time playing video games.

In any case, the assumption there is that playing outdoors is healthy and wholesome whereas video games are a waste of time and a health risk. But video games, if not abused, are a creative medium that offers us unique opportunities to explore what it means to be human - our thoughts, feelings and aspirations. The outdoors isn't intrinsically healthy either - think of air pollution, traffic, "stranger danger". Assuming "outdoors" means the countryside also comes from a place of privilege.

The common stereotype of a young male gamer is not accurate. The Interactive Software Federation of Europe found that across all age groups a roughly equal number of men and women play games. Adults aged 45+ were more likely to play games than children aged 6-14.

Different people play different games for different reasons.

Chapter 1 - Dragons and demons

World of Warcraft is one of the most successful massively multiplayer online (MMO) games.

There are many different ways to play it, not all of which involve violence.

The sense of freedom it engenders may explain its success. Unlike other media, video games like this are an inherently personal experience.

Naomi Alderman, novelist and games designer, said:

While all art forms can elicit powerful emotions, only games can make their audience feel the emotion of agency. A novel can make you feel sad, but only a game can make you feel guilty for your actions. A play can make you feel joyful, but only a game can make you feel proud of yourself. A movie can make you feel angry with a traitor, but only a game can make you feel personally betrayed.

The series of choices a game presents forces us to define ourselves: we have to make decisions in order to achieve anything.

Video games research is a new area of science, considered part of psychology. It's particularly complicated because:

We have not yet discovered any conclusive and universal results about the effects of video games on us or the reasons why people play them.

Reasons might include wanting to interact with people or escapism - but they all let us learn something new, explore new places and perhaps find out something about ourselves.

The five stages of grief are considered to be denial, anger, bargaining, depression, acceptance. However there's never been a study showing that those rigid stages actually exist - our emotions are not that granular or consistent.

A video series called Low Batteries looks at how video games and mental health interact - e.g. how PTSD is portrayed in games or how games are used by some to cope with mental illness.

Games can help people through grief, stress, anything - but sometimes they can become an all-consuming obsession.

We should not discuss games in black and white terms. It's incorrect to take either the position that they're always wholly good and benign, or the position they're always dangerous and harmful.

We might have a finite amount of time on this earth, but video games allow us to live multiple lives in countless ways.

Death in video games is usually a learning tool. If we die it's because we did something wrong - but unlike real life we get another chance to fix it.

Chapter 2 - A brief history of video games

The Ferranti Nimrod Digital Computer was one of the first prototypes of a computer that could play a game against a human player - in this case the game of Nim. It was shown at the 1951 Festival of Britain.

The subject of game theory uses math to describe and predict how decision-makers interact with each other. It's useful in economics, political science, biology and beyond.

The National Videogame Arcade was ‘the first permanent cultural centre for video games’. It has interactive exhibits, video game displays, coding workshops, gamer competitions and talks.

Ralph Baer, one of the potential "fathers of video games", invented the Magnavox Odyssey, released in 1972. It came with dice, pencil and paper for keeping score, and screen transparencies. It served as an intermediate step between traditional board games and digital games consoles, and was the first time people could control the action taking place on their TV screen.

Most early games of the 1950s were created to demonstrate the capabilities of a machine rather than to make the game widely available.

In the 1960s, "Spacewar!" moved the industry beyond the lab and into arcades. In parallel, the late 1960s saw the introduction of electro-mechanical, coin-operated arcade machines, descendants of earlier pinball machines. Sega produced one, "Periscope", in 1966.

In the 1960s computer hardware was generally too expensive to allow for profit-making arcade machines or home gaming computers.

Pinball machines were illegal in many US cities until the 1970s, being considered a game of chance and hence gambling.

In 1972, the first commercially successful digital arcade game was released - "Pong", a table tennis simulator.

The development of video games is as incremental as the progress of science: every advance builds on those that came before.

The 1970s also saw the release of the tabletop role-playing game Dungeons & Dragons, which helped shape the future of video gaming. It also led to one of the first game-based moral panics of the 20th century, wherein some US Christian groups believed D&D promoted devil worship, witchcraft and murder.

D&D inspired the introduction of text-based adventure video games such as "Colossal Cave Adventure" and "Zork" in the 1970s.

In 1978 the game "Multi-User Dungeon" (MUD) was released. Players could interact with each other at the same time in the same text-based world.

One reason it's hard to create a single timeline of video games from birth to today is that the definition of "video games" has never been stable. Even now it means whatever you want it to mean. But definitions are important: one's choice of definition influences the direction of research on games and the tone of public discourse about them.

Chapter 3 - Why do we play video games?

Developers may have deep insight into what motivates players to play games. They're typically players themselves.

Not much research has been done on why people play video games. There's an almost infinite range of genres, themes and play styles, but the majority of research has focused only on whether violent video games have an impact on our behaviour in terms of aggression, depression, etc.

Some research exists that tries to categorise styles of game play and player attitudes.

Most studies originate from a classification that Richard Bartle - one of the creators of MUD, one of the first text-based virtual gaming worlds - came up with in 1996.

He classified MUD players into achievers (driven by in-game goals and rewards), explorers (driven by discovering how the world works), socialisers (driven by interacting with other players) and killers (driven by imposing themselves on others).

Bartle freely admitted this wasn't very scientific; it was just what he came up with after years of experience and discussions with players.

Nick Yee conducted probably the largest study of gamer archetypes in 2006, surveying over 3,000 players of MMO games and looking at how they identified themselves. He found 3 main categories of player motivation: achievement, social interaction and immersion.

Players could fit one or more of the categories; the reasons people play are varied and complex.

A limitation of Yee's work is that it used survey questions based on Bartle's classification, which could lead players to respond within that framework. It could only confirm the existence of known archetypes rather than uncover new, unknown categories. The same is true of almost every study since.

One review was surprised to find no mention of aesthetic enjoyment or utilitarian motivations (such as playing to earn a living - although this is a more recent phenomenon).

There's a risk that these archetypes become a self-fulfilling categorisation. If games developers target these categories when designing games then the games they produce will reinforce the existence of these categories. In reality, video game players tend to be curious and often come up with ways to play a game that its designers never anticipated.

Developers often use psychological techniques to motivate players, even when they have no training in behavioural science.

As games become larger, more complex and more immersive developers should consider that with great power comes great responsibility.

How we play games is different from why we play them. Both vary between people, between games, and can change even within a single person over the course of a single game.

It's instructive to look at the deep psychological drives that motivate us to do anything in life.

One such account is "Self-Determination Theory", developed in the 1980s. It holds that behaviours are driven by either or both of extrinsic motivation (doing something for an external reward or to avoid punishment) and intrinsic motivation (doing something because it is inherently interesting or enjoyable).

Underpinning intrinsic motivation are 3 human needs: competence (feeling effective), autonomy (feeling in control of one's own choices) and relatedness (feeling connected to others).

Intrinsic motivation is thought to be why we do most forms of sports and play. The best games address all 3 points. If a game is easy to pick up, gives you freedom to do what you want and lets you interact with other players then it has a good chance of becoming popular.

Players who experienced greater competency when playing a game reported increased self-esteem, a positive emotional experience and feeling energised. Those who felt like they could be more autonomous also reported higher self-esteem, a positive mood and that the game felt more valuable to them.

Relatedly, a newer model - the Player Experience of Need Satisfaction (PENS) model - holds that autonomy, competence and relatedness are the causal elements of a fun experience.

Chapter 4 - Control and Imagination

Minecraft, released in 2009, was one of the most important titles of the past decade. It's a sandbox game where you start off in a woodland with no particular goal.

You can play it in creative mode, where you can construct anything, or survival mode where you have to explore, forage and fight enemies.

Some people have created amazing works within it: replicas of cities, scenes from TV shows, even working computers, and MolCraft - a world containing replicas of various molecules that can be used for educational purposes.

These are feats of creativity rather than skill - the only limit is your imagination.

Minecraft takes on varied roles for different people, e.g. helping to process grief, acting as a social platform, providing a way to communicate with hard-to-reach people, or a way to tell stories. No other entertainment medium offers this sort of possibility. You can get a sense of pride and achievement from having fully participated.

Minecraft was banned in Turkey in 2015 for being too violent and creating social isolation. These claims are ridiculous to anyone who has played it.

Games are still vilified, potentially because they're a comparatively new technology. Television, radio, the telephone, letter writing and the printing press were all vilified when they were first introduced. Each time, moral guardians worried that the proletariat had got their hands on something "bad" for them - often deemed bad largely on the basis that people liked it.

Socrates is often taken to be expressing the dangers of even the written word:

… they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality

But it's a misreading. Plato wasn't against the technology of books. He thought they were fine if used as intended: as a way to acquire knowledge, in conjunction with discussion and other forms of communication.

The Adornian critique goes further and claims that games are both a waste of time and harmful to society.

The philosopher Theodor Adorno argued that mass media has effects beyond transmitting the obvious messages they contain. TV shows etc. reflect the beliefs of those who create and control their content. If we engage in it passively then it reduces our:

But games like Minecraft can be played in many ways - perhaps Adorno would approve of games that give you the ability to explore and construct virtual worlds.

The term "video games" doesn't make much sense. Nobody who plays them calls them that. Everyone has different ideas of what it means - someone might regard Call of Duty as a video game but ignore Candy Crush. This limits how we can categorise products as video games and people as gamers.

"You don't look like a gamer" could be rewritten as "You don't look like someone who enjoys fun". The phrase often comes from a stereotype of gamers as being young males.

It was only in the 1990s that gaming skewed towards men. That was because men had more disposable income, and marketers therefore targeted them.

Games that aren't on consoles or PCs, don't need much skill and aren't aimed at the stereotypical male demographic are often not considered to be video games by many people. But the features of the game matter less than perceptions of the game and its players.

FarmVille is derided and Farming Simulator is lauded, not because of the way in which they are played – they are largely the same game. What differs is the perceptions that different subgroups of gamers hold, not just of the games themselves, but of the people who play them.

The lack of a clear definition means we all impose our own thoughts and beliefs on what we think video games are. This affects whether we think a particular video game is good, and whether we think video games in general are good or bad.

When someone with different beliefs engages with us, many psychological mechanisms come to the fore to protect what we believe to be true.

With confirmation bias, people selectively filter information to fit in with what they already believe. Evidence is cherry-picked or interpreted in a specific way. Even evidence that obviously contradicts our beliefs can make our beliefs stronger - the backfire effect.

This can cause an "us vs them" mentality.

If you don't have much experience playing video games then it might worry you when you see your children playing them. Perhaps the difference in general attitudes between e.g. sports and video games is mostly a matter of how much time and experience someone has engaging with either.

The current generation of parents have often played video games themselves, which leads to healthier conversations with their children.

The "mere exposure effect" means that people who are repeatedly exposed to a particular stimulus end up reacting to it more positively.

Understanding both what their children play and how they play it can help parents:

Game developers are often ignored in video game debates, even though they're the ones with insight into the processes and decisions that go into creating the content and play styles.

Until recently, much of the industry has consisted of white, straight, young males. Developers who don't fit that stereotype are often subjected to abuse.

Video games have a high barrier to entry compared to other entertainment media. You have to buy, install and configure the hardware, install the game, and have the dexterity needed to control it adequately. You'll then need to learn how the game world behaves before you can become proficient at it.

If their first experience of playing games isn't good then people can be put off gaming as a whole. That's unlike other forms of media - seeing a bad film doesn't generally deter people from watching more films.

There's often little reason to come back to a game after you've finished it.

As games can be a very personal experience we tend to develop our own ideas about what games are or should be. They can seem like alien and isolating experiences to outside observers.

We need to develop a vocabulary to talk about games that helps explain why they're interesting and enjoyable in a way accessible to all. But imposing too strict a vocabulary would risk limiting what future developers decide to create.

It's important to acknowledge the concerns of people on all sides of the debate.

Chapter 5 - A brief interlude

Most research on gaming is done within the field of psychology, a discipline currently grappling with serious methodological problems. The issues associated with the practice of psychology research affect what we do and don't know about the effects of video games.

A paper from Daryl Bem in the Journal of Personality and Social Psychology appeared to show that humans can predict the future - effects seemed to occur before their causes. This meant that either psychic abilities are real or there is a problem with how researchers conduct studies.

No other group could replicate Bem's findings. When one such group tried to publish their failed replication in the JPSP, the journal declined it solely on the basis that it doesn't publish replications.

This type of decision leads to publication bias or the file-drawer effect. Studies that don't produce statistically significant effects or aren't new and exciting are less likely to get published. Thus the scientific record does not accurately reflect what we actually know.

A big collaborative study saw 270 scientists try to replicate 100 published psychology studies. Only 36% replicated successfully, and even in those cases the effect size was around half the size of the originally reported one.

Academics operate within a publish-or-perish culture. What's good for their career is different from what's good for the academic record.

Fanelli found that 90% of published psychology studies report positive effects.

There are many questionable research practices (QRPs). Some of these are: p-hacking (trying different analyses until a significant result appears), HARKing (hypothesising after the results are known), optional stopping (collecting data only until significance is reached) and the selective reporting of results.

Many researchers might not realise they're following QRPs due to how they were taught to work. Others may do so deliberately, on the path to explicit fraud.

One study showed 94% of psychologists admitted to at least one QRP. 40% are estimated to have falsified data at some point.

Other fields of research have similar problems.

Psychologists are starting to develop reforms to minimise future QRPs. These include: pre-registration of study plans, registered reports (where journals accept papers on the basis of their methods, before the results are known), and the open sharing of data and materials.

Chapter 6 - Are violent video games bad for us?

Ever since video games became popular, people have worried about whether they're "bad" for us - whether playing violent games makes us more violent or aggressive in general.

In 2001 researchers tried to quantify the level of violence in games rated as suitable for young children. They recorded the amount of time taken up by violent acts within the first 90 minutes of a game.

These aren't the games people worry about. Modern debates are usually around first-person shooter games like Call of Duty.

Concerns rise particularly after a mass shooting has taken place. All of the below massacres have at some point been linked to allegedly obsessive video game use by the shooter:

We feel the need to explain and rationalise these horrific acts in order to make sense of them and stop them from happening again. Because video games are one of the most popular contemporary leisure activities, it's likely that a killer will have played them, and so the spotlight focuses on that.

We need science to investigate whether there is really any link between playing games and violent behaviour. It might be that:

The research so far has many issues that mean it's hard to come to a real conclusion.

A key issue is how studies can measure aggression whilst remaining ethical. A questionnaire might not be sufficient, as participants can behave strangely if they figure out what the study is about.

One measure is the Competitive Reaction Time Task (CRTT). Participants play a video game first and later have the chance to "punish" other competitors by blasting them with a loud noise if they want to. You're considered to be more aggressive if you blast your opponents with louder noises for longer periods.

There are many ways in which CRTT data can be analysed. Researchers have shown how the same experiment can legitimately be analysed in different ways, each of which leads to a different conclusion.

...if you want to find out whether violent games cause aggression, you won’t find the answer in the data you collect. It will be entirely in what you do with that data.
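To make that point concrete, here's a minimal sketch (with made-up data) of how one CRTT dataset can legitimately yield several different "aggression" scores. The scoring rules below are illustrative assumptions, not taken from the book:

```python
# Hypothetical CRTT data: per-trial noise-blast settings chosen by one
# participant. Each "scoring rule" below is plausible in isolation, yet
# they need not agree on who looks more aggressive.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 25
intensity = rng.integers(1, 11, n_trials)   # noise intensity, 1-10 scale
duration = rng.uniform(0.5, 5.0, n_trials)  # noise duration, seconds

scores = {
    "mean intensity": intensity.mean(),
    "mean duration": duration.mean(),
    "mean intensity x duration": (intensity * duration).mean(),
    "first-trial intensity": intensity[0],
    "count of high blasts (>= 8)": (intensity >= 8).sum(),
}
for name, value in scores.items():
    print(f"{name}: {float(value):.2f}")
# Pre-registering which rule will be used, before seeing the data, is the
# obvious guard against picking whichever one gives the desired result.
```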

So far most CRTT studies haven't used pre-registration to reduce the risk of "flexible analysis". The differing analytic choices also mean that it's hard to compare results across different studies.

Another option is the "hot sauce paradigm" where after playing video games participants are given the chance to pick from different hot sauces that someone else, who they're told doesn't like spicy foods, must consume. Those who select hotter sauces are considered more aggressive.

But hot sauces and loud noises are not the sort of "aggression" that people are really concerned about. It's not clear that even if the experiments were well designed that they'd be able to tell us anything about real-world aggression, let alone mass murder.

Another research obstacle is determining which games the participants should play. Usually a violent game is compared with a non-violent one.

But if we were comparing e.g. Call of Duty to Candy Crush Saga then there are many differences between the games that aren't to do with violence. Any differences seen in players might relate to one of those - for example any frustration due to the difficulty of the game. Most studies do not even attempt to use two games that are the same as each other except for the amount of violent content featured.

Some researchers have shown that it may be the level of competitiveness in a game, rather than the level of violence, that drives aggressive behaviour.

As well as running experiments, researchers can do longitudinal studies where they follow large numbers of people over time and collect information on what they're doing, how they do at school, what's going on in their home life etc.

Some people will become aggressive. Some people will play video games. There will also be potentially confounding factors such as socioeconomic status or parental monitoring.

Longitudinal studies provide useful information at a population level, but they must be interpreted with caution when it comes to causality. Often the results one gets depend on which confounding variables are controlled for. No such study is perfect - you can't control for everything - so when you see "X is associated with Y", usually this means only that a correlation has been found.
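A toy simulation shows why that caution matters. In this hypothetical sketch, a single unmeasured confounder - labelled "stress" purely for illustration - creates a gaming/aggression correlation even though, by construction, gaming has no effect at all:

```python
# Simulated longitudinal-style data in which a confounder drives both
# gaming time and aggression; gaming itself has zero causal effect.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
stress = rng.normal(size=n)                      # unmeasured confounder
gaming = 0.7 * stress + rng.normal(size=n)       # stress -> more gaming
aggression = 0.7 * stress + rng.normal(size=n)   # stress -> more aggression

# Naive analysis: gaming "is associated with" aggression (r ~ 0.33).
print(np.corrcoef(gaming, aggression)[0, 1])

# Controlling for the confounder: the association all but vanishes.
X = np.column_stack([np.ones(n), gaming, stress])
beta, *_ = np.linalg.lstsq(X, aggression, rcond=None)
print(beta[1])  # regression coefficient on gaming, close to zero
```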

We should consider how to improve video game research for the future.

Firstly, we should re-evaluate how video games are categorised. "Violent" vs "non-violent" categories are too vague to be useful. Instead we might modify specific games so there are versions that are more or less violent, more or less competitive and so on, while otherwise providing the same gaming experience. Some games already have an extensive modding community that could be leveraged.

We should also consider the context in which the playing takes place. Some people play alone, some with family or friends, some with people across the world, and all for a variety of different reasons. Most research ignores these contexts.

We must also develop better measures of "aggression".

Perhaps gaming is a risk marker rather than a cause of aggression. If someone becomes obsessed with gaming at the expense of everything else, it might be an indicator of other problems in their life that we should help them overcome.

Chapter 7 - Moral panics

There's a tension between researchers who think there's a conflict-of-interest-style problem with people who enjoy gaming doing research on the topic, and those who don't. The latter group may believe gamers can do better research, having already been exposed to the nuances of the different types of content, games and contexts involved.

This cultural divide affects how research on this difficult topic is approached and communicated.

So far there's little consensus about any link between playing violent games and aggression or violent crime.

One of the papers often cited in favour of violent media causing aggression is Bushman et al., published in the Psychology of Popular Media Culture journal. However its interpretation of the data is rather tenuous and its history rather strange: the published paper is different from the one that was peer reviewed.

The danger of falsely claiming a consensus answer is that it shuts down any future debate. Few people will want to work on a research question when they think there's already a clear and proven consensus answer.

Moral Panic Theory (MPT) suggests that societies go through periods of creating "folk devils" - people, groups or objects that are described by media as being deviant in some way - that are the cause of a wide variety of social ills. We then feel that by banning or limiting these devils we'll get control of the social problems we are worried about - a false hope of a fix.

With moral panics:

In the 1950s there was concern that comic books caused juvenile delinquency. In the 1980s the concern shifted to Dungeons & Dragons. In the 1990s some people worried about the influence of The Simpsons. More recently the concern has centred on video games, or screen time in general.

Moral panics start with the reporting of personal anecdotes. Researchers can later perpetuate the panic themselves due to the pressure exerted on them to produce results that meaningfully impact society. Evidence that supports the panic is accepted uncritically. Evidence against its existence is distrusted.

It feels good if your study is reported on widely and discussed by a lot of people. It gives you status, potentially funding and the feeling of having a big impact. Given the analytic degrees of freedom in this field, it's easy for researchers to produce results in favour of a chosen narrative, even unconsciously.

The focus on this aspect of gaming research means some important questions are being overlooked. These include:

Researchers looking at the association between violent crime and video game sales found a negative correlation. Many factors other than gaming might explain this, but if there were a strong and clear relationship between playing violent video games and real-world violence you might expect to see some positive correlation in the data.

Video game developers are aware of these issues and should be included in the research. Their opinions vary, but many of the big game design companies actively seek ways to reduce toxic in-game behaviour. The Fair Play Alliance was set up by over 90 individuals and companies to research how to reduce disruptive behaviour and promote fair play in games. Beyond any specific concerns around real-world violent behaviour, these efforts align directly with developers' wish to create the best and most positive gaming experience for players.

Chapter 8 - Are video games addictive?

The question of whether video game obsession can become a problematic addiction is another oft-raised concern. In 1982 the Journal of the American Medical Association reported on 3 men who claimed they had "Space Invaders obsession".

It's another issue where what the media reports is often different to what the scientific research says.

There are some genuine worries to consider, for example how games are monetised and marketed towards children. But it's wrong to conclude that if some aspects of gaming are problematic then the whole of gaming is bad.

Observational research in the 1980s began to create the idea that video game addiction was a real condition that manifested like other behavioural disorders, such as gambling addiction or substance abuse disorders.

From the 1990s onwards, experimental research began. Mostly this involved questionnaires originally designed for gambling or substance addictions, modified to focus on video games. This approach concluded that some gamers might meet a certain set of criteria one could call "gaming addiction", on the basis that their symptoms resemble those seen in other types of addiction.

Much disagreement exists within the literature. A key question is whether it's appropriate to use the criteria for other types of addictions as the criteria for gaming addiction.

Just because someone is immersed in a game doesn't mean other aspects of their life are suffering. The assumption that socialising through online gaming channels is less worthwhile or fulfilling than offline socialising isn't supported: studies show that deep, meaningful and stable relationships can be formed over the internet.

Online socialising isn’t necessarily better or worse than the offline equivalent, it’s just different.

In 2018 gaming addiction was classified as a mental health disorder by the World Health Organisation’s International Classification of Diseases (ICD).

This created much academic debate between those who thought formal classification was a useful step towards recognising and treating a genuine problem, and those who thought it premature given how unsettled the evidence base is.

The author is more in the latter group. Firstly, he believes we don't yet have a clear picture of what a gaming disorder would look like. Secondly, if we don't understand what it is then we can't get a sense of how prevalent it is. We might over-diagnose the condition, pathologising anyone who games a lot and minimising the importance of any genuine clinical disorder. People who do need help might not get the recognition, understanding and treatment they need.

Estimates of gaming disorder prevalence range widely, from around 0.2% to 46% of the gaming population.

If high engagement in video games is misinterpreted as addiction it might undermine children's rights in particular, restricting access to a pastime that may be neutral or even beneficial to them. Any restrictions risk being ineffective or backfiring.

There are certainly cases where playing video games is having an obviously harmful effect on someone's life. But there's no clear answer in the research as to how many people are affected, who they are, or whether playing video games is the causal reason behind the negative issues they have. This makes developing a treatment extremely difficult. Poorly thought-out national policies do more harm than good and are a disservice to those who do need help.

One study found that none of the people who met the diagnostic criteria for "Internet Gaming Disorder" at the start of the study still did so 6 months later. Neither were any direct and observable effects of this supposed condition on their health visible.

Some aspects of video games should still give us cause for concern, though. In recent years there's been a shift in how games are marketed and monetised, coinciding with the advent of smartphones powerful enough to run games. Now we see "free-to-play" games that cost nothing to start playing but encourage players to repeatedly and frequently spend small amounts of money on new levels, characters, items etc. Some of these exploit the same kinds of psychological weakness that gambling machines do.

A typical console game might cost around £50. That's not viable for a smartphone game when people are used to paying more like £1 for an app. Hence the "freemium" model: players can start for free, but once they have some kind of investment in the game, additional content requires the frequent expenditure of small amounts of money.

People who spend a lot of money in these games are known as "whales", a term borrowed from casinos for gamblers who spend heavily. One 2016 analysis found that the top 10% of in-game spenders account for 60% of a game's revenue, spending around £260 every month. These users are at the highest risk of developing a problematic relationship with the game.

One key mechanism these games exploit is the "variable-ratio schedule of reinforcement", in which rewards arrive after an unpredictable number of actions.

Slot machines in casinos are the archetypical example.
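As a rough illustration, here's a minimal sketch of a variable-ratio schedule; the one-in-ten average payout rate is an arbitrary assumption for the example, not a figure from the book:

```python
# Simulate a variable-ratio reinforcement schedule: each action pays out
# with a fixed probability, so rewards arrive on average every
# `mean_ratio` actions but at unpredictable intervals - the pattern slot
# machines (and loot boxes) rely on.
import random

def variable_ratio_schedule(n_actions: int, mean_ratio: int = 10) -> list[bool]:
    return [random.random() < 1 / mean_ratio for _ in range(n_actions)]

random.seed(42)
outcomes = variable_ratio_schedule(100)
reward_indices = [i for i, rewarded in enumerate(outcomes) if rewarded]
gaps = [b - a for a, b in zip(reward_indices, reward_indices[1:])]
print("actions between rewards:", gaps)
# The gaps vary widely, so the *next* action always feels like it might
# be the winning one - which is what makes the schedule so compelling.
```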

Another commonly exploited mechanism: artificially limiting how long we can play a game increases how much we want to play it. This relates to "hedonic adaptation". If we have unrestricted access to something we enjoy, the pleasure we get from it diminishes over time. But if access is taken away, the pleasure remains stable - it leaves us wanting more. This encourages us to spend money to e.g. unlock more lives or time in a game.

Increasingly, video game monetisation uses mechanics classically associated with gambling. Developers seem so caught up in the profitability of this that they don't consider the long-term implications for their players or their industry.

The Belgian Gaming Commission found that the loot box systems in several mainstream games violated its gambling legislation and were hence illegal.

There's a correlation between the extent to which players pay for in-game loot boxes and the levels of problematic gambling that they report. But the data doesn't yet show the direction of cause: do games monetised this way cause problematic gambling, or are people with gambling problems attracted to these games?

Despite the WHO's classification, we don't yet really know what a general gaming addiction disorder looks like or who is at risk of it. The author's best estimate is that prevalence today is low, but that the rise of freemium-style games presents a specific risk that video games could become truly addictive in the future.

The chance of having a sensible discussion about game monetisation and marketing is derailed by dramatic media stories claiming that games are like hard drugs.

Chapter 9 - Screen time

As smartphones have become more popular there has been a rise in concerns about the effect that the amount of time we spend looking at screens has on us. "Screen time" concerns encompass a fear that society is regressing, becoming more narcissistic, uncaring and unwholesome. We fear that social media isn't social but rather brings out our worst aspects, that children are losing interest in the world outside of their screens and that it's causing all of our mental health to suffer.

The scientific evidence on this is complicated and nuanced. It often fails to impress when compared to our personal experience.

Baroness Susan Greenfield has led the charge against screens in the UK. Her concern is that screen technology "can change our brains". But everything we do changes our brains - that's how we learn and remember things - so it has to be shown that the specific changes are bad. Her critics allege that she has never done actual research or published academic papers on this topic to show that.

Jean Twenge has similar concerns but has done some research into the matter. She has shown, based on questionnaires, that the happiness of teenagers has fallen in recent years, whilst rates of loneliness, depression and suicide have risen. The amount of time teenagers spend going out with friends, on dates or having sex has reduced. Over the same period, smartphone ownership has risen. Twenge believes this isn't simply a correlation.

Twenge's analysis shows that measures of teenage depression and suicide risk increased between 2010 and 2015, and that those who spent more time on social media were more likely to report mental health issues.

But:

...it can be remarkably difficult to conduct research into screen time that gives clear, meaningful and convincing answers

How we talk about mental health has changed in recent years. It's hard to quantify but it might be that teens are simply more comfortable these days about revealing how they feel.

...while over time it may appear as if depression rates are increasing, there’s also a possibility that these numbers simply reflect a societal willingness to be more open and honest about mental health

Screen time is an extremely vague concept. It's easy to measure but misses out the critical point of what the user is doing. It might be that what the screen is being used for and the context that it's in is important.

The best evidence for the effects of screen use on wellbeing comes from a study by Przybylski and Weinstein, published in "Psychological Science" in 2017. It tested two hypotheses: that screen time displaces healthier activities and so harms wellbeing in proportion to use, and the "Goldilocks" hypothesis - that moderate use is harmless or even beneficial, with only extreme levels doing harm.

The study showed more evidence in favour of the latter. And even the negative effects seen at the extreme ends of usage were small - higher levels of screen time accounted for around 1% of the variation in participants' wellbeing scores.
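For intuition, here's a hypothetical sketch of how the two hypotheses can be pitted against each other: fit a linear model and a quadratic one and compare the fits. The data are simulated to have a Goldilocks-style inverted-U shape; none of the numbers come from the study itself:

```python
# Compare a linear and a quadratic model of wellbeing vs daily screen
# hours on simulated data. A better quadratic fit, with a negative
# squared term, is the Goldilocks-hypothesis signature.
import numpy as np

rng = np.random.default_rng(7)
hours = rng.uniform(0, 8, 1000)
wellbeing = 50 + 2.0 * hours - 0.5 * hours**2 + rng.normal(0, 3, 1000)

def fit(X, y):
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, rss[0]  # coefficients, residual sum of squares

_, rss_lin = fit(np.column_stack([np.ones_like(hours), hours]), wellbeing)
(b0, b1, b2), rss_quad = fit(
    np.column_stack([np.ones_like(hours), hours, hours**2]), wellbeing
)

print(f"linear RSS: {rss_lin:.0f} vs quadratic RSS: {rss_quad:.0f}")
print(f"simulated wellbeing peaks at ~{-b1 / (2 * b2):.1f} hours/day")
```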

The emerging picture from the research literature, then, is that while screen time does appear to have an impact on things like childhood depression and well-being, these effects are small, and likely not the main driving factors

As screen time is a complex research field with a lot of ambiguity it's easy to interpret things in line with our existing beliefs. We can guard against this by rethinking how we critically evaluate evidence.

Think of screens as a tool. We need to learn to use them properly. They don't control us.

We probably don't need to worry about screen technology all that much. Avoid falling into the trap of thinking that limiting screen use is an easy fix for the real and complicated problem of rising rates of mental health issues. Otherwise we will miss other, probably more impactful, causes, as well as missing out on the benefits that video games and social media can bring.

Chapter 10 - Immersion and virtual reality

No one really knows how long people spend playing games. Most studies use self-report measures, which are less useful when games are so immersive - we're not good at keeping track of time when inside them.

Immersion is about making you feel closely connected to the game's world.

It's understandable to worry about the high levels of immersion that some games create. But it's exactly that immersion which lets us explore our humanity within them, and it may in fact offer new ways to treat complicated health issues.

Immersion isn't the only factor that contributes to a good game. Mobile games tend to have different goals from traditional console or computer games.

Conducting research on immersion is challenging. The literature sometimes conflates the terms "immersion" and "presence".

Immersion might best be thought of as a form of spatial presence - feeling physically located in the game. But there are other presences too, e.g. social presence (how much you interact with other characters as if they're real) or self-presence (how much your actual self merges into your digital avatar).

How to measure presence? Questionnaire scales exist, but the concept is so vague and transient that they may not be very useful. They're often administered after the immersion ends, making them subject to confounding.

Troscianko and Hinde tried to measure immersion more directly. They found that people rated themselves as more immersed in a movie when watching it on a larger screen. They then had people press a button when they heard a beep whilst watching a movie, recording the participants' eyes and measuring the diameter of their pupils - pupil dilation is linked to the amount of processing happening in the brain.

People were slower to press the button and had more dilated pupils when they watched on the big screen.

A different study used eye-tracking methods, on the basis that as we become more focused on a task we make fewer eye movements and fixate for longer on what we're interested in.

If we care about whether games might be addictive or change our behaviour in negative ways then understanding how they capture our attention is important.

Immersion will become of greater interest now that virtual reality is emerging as a viable platform.

Little work on VR has been done except for a few studies on violence and aggression, which have shown mixed results. One study found that playing a game in VR had no effect on self-reported measures of presence, hostile thoughts or aggressive behaviour; another study did find an effect. So whilst there might be reason to think more immersive games would affect our behaviour more, there's not yet research to back this up.

Nonetheless, there are issues of responsibility that should be worked out if VR becomes viable. For instance, games often seek to create a visceral reaction in the player, which might be very uncomfortable or even harmful to health if encountered in VR.

Some of the discussion about the future of VR is very positive. A thoughtful multiplayer VR game could provide a new way of thinking about how we live our future lives. Whilst the idea of people becoming disconnected from reality is very uncomfortable, especially if we're considering systems designed to exploit people rather than serve their best interests, a VR environment built on positive principles might offer a different way of looking at things - creating new opportunities and experiences for people.

If AI automates away jobs in the offline world, perhaps people could take up jobs in an MMO, meaning the time spent in-game would have value beyond simple entertainment.

In any case, it's probably inevitable that we'll create an immersive virtual world soon and that people will want to spend time in it, so we need to discuss how we can do that in a socially and environmentally responsible way. We must not allow the exploitation of vulnerable individuals.

Whilst VR worlds can offer new ways to earn a living and foster meaningful relationships, we might worry that they will be shaped by the gambling-style mechanisms that are so popular and easy to implement in today's non-VR games.

There are examples of VR being used for good already.

If we can do all of that, and get it right, then the future of virtual reality – the scientific and medical benefits that it could afford us, as well as the opportunities for unleashing our creativity in unimaginable new ways – will be a truly exciting thing to behold.

Chapter 11 - Wayfaring and wayfinding

Video games can expose us to landscapes with no limits beyond our own imagination. We get to satisfy the deep human desire to explore and to discover things about the world around us and about ourselves.

The Zelda series of games encapsulates this well. Ocarina of Time is one of the best-regarded entries. It was the first game that let players save their progress at any time, meaning players could have a more complex, less frantic, less linear and more exploratory experience.

2017's Breath of the Wild took this to much higher levels. There are no restrictions on where you go or which quests you take on over the 100+ hours of play needed to fully complete the game.

Returning to the games we played as a child is often a negative experience - our rose-tinted lenses confront the realities of old technology, cumbersome controls and low quality visual experiences.

Alzheimer's disease robs people of both their memories and their ability to navigate through space.

There are 2 frames of reference we use to navigate space: an egocentric frame, where locations are coded relative to our own body and viewpoint, and an allocentric frame, where locations are coded relative to landmarks in the world, independent of where we are.

We don't yet really understand how healthy individuals use these systems or how they decline as Alzheimer's disease progresses. Video games have started to be used to help us learn about these.

A 2015 study was written up in the media as showing that playing games like Call of Duty could increase the risk of developing Alzheimer's. As usual, the actual study was a lot more complicated and nowhere near as worrisome.

A mobile game called Sea Hero Quest transmits data about the routes players take and the actions they perform to a research team. Researchers use this to understand what 'normal' navigation abilities look like, and hence what's going wrong in Alzheimer's disease.

It's already led researchers to learn:

More typical citizen science projects use humans as "sophisticated computers" - e.g. Reverse the Odds, a game that showed players images of samples taken from bladder cancer patients between levels and asked them to count and describe the cells. This data can be used to understand which type of treatment a cancer patient will respond to best. Humans are far better at these kinds of task than computers, and Reverse the Odds incentivised participation by providing a fun game.
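The book doesn't detail how players' answers were combined, but a minimal sketch of the usual baseline for this kind of crowdsourced labelling - majority voting across many players' judgements of the same image - might look like this:

```python
# Combine several players' noisy judgements of one sample image into a
# single consensus label. All labels here are hypothetical.
from collections import Counter

def consensus_label(annotations: list[str]) -> tuple[str, float]:
    """Return the most common label and the fraction of players agreeing."""
    counts = Counter(annotations)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(annotations)

players_said = ["stained", "stained", "unstained", "stained", "stained"]
label, agreement = consensus_label(players_said)
print(f"consensus: {label} ({agreement:.0%} agreement)")  # stained (80%)
```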

Video games can offer us opportunities to:

Chapter 12 - Digital spectator sports

Humans like stories that they can relate to, as well as watching people do the things they like at an extraordinarily proficient level, be that football, poker or video games.

Some people are skeptical that watching people play video games can be fun or meaningful. The author thinks this is an experiential rather than generational issue.

From the start, video games were designed to be multi-person, social, shared experiences.

Competitive gaming, aka e-sports, started near the beginning of video game history. The first "Intergalactic Spacewar Olympics" was held at Stanford University in 1972.

In 2000, the huge popularity of e-sports in South Korea led their government to found the Korea e-Sports Association to promote and regulate e-sports tournaments.

Elsewhere, televised coverage of games had limited take-up. To become a global phenomenon, e-sports needed a way to bring the human element of video gaming to the fore. This started happening once online streaming platforms such as Twitch were created.

Twitch made it easy to broadcast events to massive audiences. The resulting video usually shows the game screen as the player sees it along with the player who provides commentary. There's a text chat window where viewers can talk to each other and the player. These interactions create an immediate community based around a shared interest.

Game developers also started to realise that people wanted to watch games being played, and began building spectator-oriented features into games - theatre modes, highlight reels etc.

The popularity of e-sports has dramatically increased. In 2018 the global market was estimated to be worth $900 million, with an audience of 380 million viewers.

The British E-Sports Association has been piloting tournaments in schools. First results suggest that getting involved in these competitions benefits children by building confidence and improving teamwork, communication and attendance.

In 2017 Blizzard built an e-sports stadium in California, using it to host the first Overwatch League season in 2018. This became a multi-million-dollar event, with broadcast rights alone thought to cost $90 million. League players earn at least $50,000 a year plus extensive benefits.

It still feels to the author like there's something missing from the league. In a sport like football the players are at its heart - we cheer them on; they're seen as heroes and villains. In e-sports, however, the game itself is often the star. There's a lack of human interest.

Some e-sports players do become celebrities of a kind - but mostly they have to generate celebrity themselves through social media.

Competitive video games constantly shift and evolve as developers react to the ways players push the rules to gain an advantage. "The meta" is an ever-changing form of gameplay in which players determine the best ways to use certain characters or strategies. Developers must react to the meta after releasing new content in order to keep the game balanced.

None of the 100+ players who started the first Overwatch League season was a woman. This isn't about ability - one of the best players in the world is a 19-year-old South Korean woman, Kim ‘Geguri’ Se-yeon. All the excuses given as to why women aren't recruited are pathetic.

Games should be a great leveller. They don't require great physical strength, but rather mental agility, rapid reflexes and good communication and coordination skills - all of which are found in all genders.

There was much misogyny throughout the first season of the Overwatch League, which is doubly disheartening as a secondary goal of that particular game is to be inclusive. Characters from all walks of life are included without fanfare.

Overwatch – the game – is trying to simply normalise the different ways that people are, through its storytelling

To become a leader in diverse, competitive e-sports, the league needs to acknowledge the wrongs of the past, fix them and proactively seek out players from a wide range of backgrounds. In turn, the players should both passively and actively contribute towards creating an inclusive culture, highlighting that everyone can enjoy this form of entertainment.

Chapter 13 - Loss

It's harder than it seems to preserve video games for future generations. Each console, controller, cartridge or disk starts decaying from day 1. Parts go missing or break.

This is challenging for preservation specialists. But preserving video games is important, in the same way as preserving art or buildings. Video games are an integral part of our culture. We need to chronicle stories of how people played games and what people believed about them, even when it's hard to conserve the technology that runs them.

The preservation process should safeguard not only the game, but the context it was played in, what it was like to play, how the surrounding culture was and how fan communities participated - gameplay preservation, rather than simply game preservation.

The industry is generally not helpful. It tends to engage in cycles of planned obsolescence with the view that people should upgrade to a new console every so often.

There has been a recent burst of interest in retro games, leading to emulators for old games being made available. Sometimes this isn't helpful to preservation efforts, as it can mask the way gameplay changed over time. Even though the Pac-Man game didn't change much, the way it was played very much did: books on Pac-Man published in the 1980s focused on the maze, but in the 1990s players' focus shifted towards the behaviour of the ghosts. It wasn't until the late 1990s that the first "perfect" game of Pac-Man was played, which led players to discover that the game breaks at level 256.

Recording the evolution of gameplay provides a rich historical account of a cultural and social phenomenon.

Individually, the way we play a given game - why, how, what it means to us - can change over time, for instance as we get better or our life circumstances change.

It's hard to produce this documentation. Perhaps sometimes it's best to let old games and consoles vanish. Trying to keep them working as long as possible loses something important but intangible about them.

The powerful notion of loss forms the narrative of some video games themselves - e.g. Last Day of June. Other games might unintentionally end up representing loss to you as you place your own narrative on them.

We play video games for many reasons including:

They can fix us, and yes, they can also break us, and because of their immersive nature they can perhaps do this more strongly than any other form of media

Games haven't yet achieved full cultural acceptance. We don't have a great way of talking about them. For the most part though it's probably safe to ignore news articles claiming that video games are the cause of all negative things in society. But that's not to say we should ignore all concerns - some are valid:

Are games good or bad for us? The honest answer is that we don’t convincingly know either way, and it’s probably a bit of both.

...

As much as they’re a form of entertainment, they’re also a tool: one that must be treated with respect and responsibility, in the full knowledge that if used improperly or without due care and attention, they may cause as much damage as good.

