
26 Intelligence and Video Games


Maria Ángeles Quiroga and Roberto Colom

Introduction: Intelligence and Video Games


A great deal has been learned using the testing techniques that are ubiquitous today.
Forgetting or denigrating this information would be silly; science progresses by
building on the past. But it is time to move on to new techniques of measurement if
we want to obtain any major breakthrough.
(Hunt, 2011, p. 864)

Defining the Playground


A video game involves interaction with a user interface that generates feedback on
a device such as a TV screen, a computer monitor, a tablet, or a smartphone. There are hundreds of
video games and some of them require reasoning, planning, solving problems, and
learning. These features are included in the definition of intelligence: “a very general
mental capability that involves the ability to reason, plan, solve problems, think
abstractly, comprehend complex ideas, learn quickly, and learn from experience”
(Gottfredson, 1997a).
Charles Spearman (1904) postulated the principle of the indifference of the
indicator based on the positive manifold (the substantive correlation among cogni-
tive tasks irrespective of their content). This manifold is one of the most replicated
findings in psychology (Kovacs & Conway, 2016) and the implication is this: The
vehicles (or superficial characteristics) of the situations science uses for assessing
intelligence and cognitive ability are relatively irrelevant. The key lies in their
cognitive requirements (Hunt, 2011; Jensen, 1998). From this perspective, it
becomes possible to use video games for obtaining measures of the construct of
interest.
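
As a concrete (and deliberately simplified) illustration of the positive manifold, the sketch below uses Python with an invented correlation matrix, not data from any cited study: every indicator, whatever its surface content (including a video game score), loads on the first, general component.

```python
import numpy as np

# Hypothetical correlation matrix for four measures with different "vehicles":
# a verbal test, a numerical test, a spatial test, and a video game score.
# All off-diagonal correlations are positive: the positive manifold.
R = np.array([
    [1.00, 0.45, 0.40, 0.35],
    [0.45, 1.00, 0.50, 0.30],
    [0.40, 0.50, 1.00, 0.38],
    [0.35, 0.30, 0.38, 1.00],
])

# First principal component of R: a rough stand-in for the general factor (g).
eigenvalues, eigenvectors = np.linalg.eigh(R)
first_component = eigenvectors[:, np.argmax(eigenvalues)]
loadings = np.abs(first_component) * np.sqrt(eigenvalues.max())

# All four loadings are substantial, regardless of each measure's content.
print(np.round(loadings, 2))
```
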
Nevertheless, defining the playground is relevant. “Intellectual ability” refers to
a consistent and stable disposition to solve families of cognitive problems (abstract,
verbal, numerical, visuospatial, mechanical, and so forth). According to the Cattell-
Horn-Carroll model (CHC; McGrew, 2009), several second-stratum or broad “cog-
nitive” abilities can be identified, but “cognition” and “intelligence” are not synon-
ymous. Intelligence (from the Latin term intelligere) refers to the ability to choose the
best solution to solve a problem. Cognition (from the Latin cognoscere) refers to the


faculty to process information from perception and acquired knowledge. Intelligence
includes psychological processes, but the reverse is not true.
Second-stratum or broad abilities capture shared variance among diverse tests
measuring a common ability to some extent. The psychometric properties of these
tests are carefully addressed, but this is usually not the case for the experimental
tasks tapping cognitive processes such as attention or working memory updating.
Finally, “skill” and “ability” must be clearly distinguished. Skill refers to the
ease of doing a given activity well because of greater experience or training.
Cooking or driving a car are skills. Higher ability levels may facilitate the acquisition of
skills, but once a skill is acquired and automated, ability differences may become less
important.

First Studies Using Video Games


Video game research has considered both “performance” and “experience.” The first
refers to the level obtained in the game, whereas the second refers to people’s playing
habits, such as hours per week or genre played. In addition, both the “correlational” and the
“comparative” approaches have been applied. The first focuses on analyzing the
covariance between standard ability tests and video game performance. The second
considers ability differences associated with levels of experience or amount of
training on the video games of interest.
The first studies using video games and administering intelligence tests were run
thirty years ago (Jones, Dunlap, & Bilodeau, 1986; Rabbitt, Banerji, and Szymanski,
1989). Significant correlations between the variables assessed by games and tests
were found. Jones and colleagues (1986) administered thirteen intelligence tests
(from the Kit of Factor-Referenced Cognitive Tests by R. B. Ekstrom, French, and
Harman, 1976) and five video games for the Atari console (Air Combat
Maneuvering, Breakout, Race Car, Slalom, and Antiaircraft). Correlation values
ranged from 0.18 (for the Slalom game) to 0.50 (for the Race Car game).

Figure 26.1 Simplified depiction of the Cattell-Horn-Carroll (CHC) model.
[The figure shows general intelligence (g) at the top, above the broad stratum II
abilities: fluid intelligence, crystallized intelligence, memory and learning, visual
perception, auditory perception, retrieval ability, cognitive speediness, and
processing speed, which in turn sit above the stratum I cognitive abilities.]

There are more than sixty stratum I cognitive abilities summarized in a much
smaller set of stratum II abilities. General intelligence (g) is at the apex (stratum
III). The location of the stratum II abilities represents their higher or lower
relationship with the higher-order factor.

Three years later, Rabbitt and colleagues (1989) assessed intelligence with the
AH4 test (Heim, 1968), whereas video game performance was evaluated using Space
Fortress. The AH4 test is a group-administered test consisting of sixty-five items
belonging to the verbal and numerical domains. Space Fortress is a video game
designed at the University of Illinois for studying complex-skill acquisition. The
game’s goal is to shoot missiles and destroy a space fortress (it can be installed and
run from http://hyunkyulee.github.io/research_sf.html), and participants played over
five successive days. Greater correlations were observed with increased practice
(from 0.28 to 0.68). This was the main conclusion: “a relatively unsophisticated
video-game, on which performance may reasonably be expected to be independent
of native language or acquired literacy, and which is greatly enjoyed by young people
who play it, rank orders individual differences in ‘intelligence’ nearly as well as
pencil and paper psychometric tests which have been specially developed for this
purpose over the last 80 years” (p. 13).
The increase in correlations from the first to the fifth session suggests that some
practice was necessary to overcome preexisting differences in familiarity with the
video game. The correlation became stable once those differences disappeared (stay
tuned).

The Video Games Jungle


Video games comprise a variety of genres. Their cognitive requirements are
different regarding planning, speed, psychomotor ability, and so on. Beyond their
superficial similarities, small differences among games may recruit different cogni-
tive processes (Sedig, Haworth, & Corridore, 2015). Thus, for instance, the term
“action video games” is highly unspecific: “it encompasses several video game
genres, without controlling for effects potentially stemming from differences in
mechanics between these video games” (Dobrowolski et al., 2015, p. 59).
Video games can be categorized into genres taking into account (1) their gameplay
mechanics, (2) the in-game tasks, (3) the rules that players must follow, (4) whether
they are multiplayer or not, and (5) the devices required to play the game (Torre-
Tresols, 2017). To help researchers, Sajjadi, Vlieghe, and De Troyer (2017) describe
the connection between gameplay mechanics and a variety of abilities.
Martinovic and colleagues (2014) elaborated a matrix detailing the cognitive
processes probably recruited by several video games with the main aim of categor-
izing them. Based on their proposal, Table 26.1 provides a list of video games
classified by genre, subgenre, and main features.
The differentiation among video games has been increasingly refined. Indeed,
distinguishing genres and genre subtypes is tough. These fine distinctions were
absent in the first studies relating cognitive abilities and video game performance.
For example, some of the titles that Green and Bavelier (2003) treated as action games
belong to the Action genre proper, whereas others belong to the fighting, shooter, or
car racing genres (see Table 26.1).

Table 26.1 Types of video games (after Torre-Tresols, 2017)

1. Action. Main features: speed; high level of psychomotor abilities. Example: Grand Theft Auto.

2. Adventure. Main features: quick-time events; exploring big surfaces, collecting objects to solve problems.
  2.1. Graphic adventure. Point and click; 2D or 3D; the player interacts with the mouse or other control devices to complete tasks. Examples: Monkey Island, Hotel Dusk: Room 215, Zero Escape.
  2.2. Visual novel. The player has no control over the character. Example: Steins;Gate.
  2.3. Mixed. These games have features of both graphic adventures and visual novels. Examples: Heavy Rain, The Wolf Among Us.

3. Action-adventure. Action games that place an importance upon narrative; they combine elements from different gameplay styles with the same focus (when a game focuses mainly on a specific play style it is grouped in that category instead of here); subgenres differ in the focus they give to certain elements (planning versus shooting, for example).
  3.1. Stealth action. Require mainly planning because players cannot defend themselves from the enemy through simple force or firepower. Examples: Metal Gear, Deus Ex, Splinter Cell.
  3.2. Survival horror. The main goal is to survive in a frightening atmosphere; in some titles the player can defend themselves from enemies. Examples: Resident Evil, Clock Tower, Survival Run.
  3.3. Platforms. The player has to overcome obstacles. Examples: Super Mario, Sonic the Hedgehog, Mega Man, Rayman.
  3.4. Metroidvania. Combines exploration plus platform features. Examples: Ori and the Blind Forest, Hollow Knight.

4. Sandbox. The player moves freely in an open world where they can choose what to do; the player can change the game world. Examples: Minecraft, Garry’s Mod.

5. Fighting. Physical fighting in small, closed areas; these games require high mechanical skill to use the control buttons.
  5.1. Fighting game classic. Combat between two or more fighters of comparable strength, who can be grouped; often divided into rounds. Examples: Street Fighter, Super Smash Bros, Tekken, Guilty Gear, BlazBlue.
  5.2. Beat ’em up. Focused on cooperative play and player versus environment; usually 2D; easy combat that includes narrative. Examples: Double Dragon, Golden Axe.
  5.3. Hack ’n’ slash. A variation of beat ’em ups; hand-to-hand combat; medium mechanical skill. Examples: Devil May Cry, Bayonetta, Metal Gear Rising: Revengeance.

6. Shooter. Mainly focused on moving and shooting; 2D or 3D; first-person (FPS) or third-person (TPS) shooting.
  6.1. Shoot ’em up. Similar to beat ’em up but shooting; the action can be vertical or horizontal. Examples: Aero Fighters, Space Invaders, Galaga, Satazius.
  6.2. Danmaku (bullet hell). These games show a curtain of fire; they require a high attentional level, adaptation to novelty, and high control of fine motor movements; it is more important to dodge bullets than to attack enemies. Examples: Ikaruga, Touhou.
  6.3. Classic shooter. Shoot ’n’ run; high speed of movement required; very quick action; unlimited arsenal of weapons; one player or multiplayer; no life regeneration. Examples: Doom (FPS), Wolfenstein (FPS), Unreal Tournament (multiplayer FPS), Quake (multiplayer FPS), Half Life (multiplayer).
  6.4. Tactical shooter. Limited amount of weapons; the chosen weapon determines the character’s speed; multiplayer. Example: Counter-Strike.
  6.5. Modern shooter. Multiplayer; reduced teams; small, closed maps; small-recoil weapons; life regeneration. Examples: Titanfall (FPS), Battlefield (FPS), Gears of War (TPS), Splatoon (TPS).

7. Role-playing game (RPG). The main feature is letting the character evolve; high load on narrative; real-time action or in turns; the player knows the quantitative value of the attributes they obtain (usually strength and speed); different roles: tanks, healers, and DPS (damage per second).
  7.1. Western RPG. More focused on expression and fantasy; the player develops their avatar in a story; combat in turns. Examples: Pillars of Eternity, Divinity.
  7.2. Japanese RPG. No avatar; highly narrative loaded, mainly focused on interpersonal relationships; combat in turns. Examples: Final Fantasy, Bravely Default.
  7.3. Action RPG. Real-time combat. Examples: The Legend of Zelda, Bayonetta, Dark Souls.
  7.4. Dungeon crawler. The player explores gigantic dungeons; 3D; first-person perspective. Examples: Etrian Odyssey, Might and Magic.
  7.5. Tactical RPG. Includes elements from the strategy genre; combat, rather than exploration, is central. Example: Final Fantasy Tactics.

8. Strategy. The player has to focus on tactics and long-term plans to complete the mission.
  8.1. Strategy in turns. The player controls units, collects, and manages resources; one player or multiplayer; the mission map is divided into cells; only one action per turn; more focused on strategy than on combat. Example: Civilization.
  8.2. Real-time strategy (RTS). Continuous action without pauses; the player controls several units they can send to combat, build, or collect; one player or multiplayer. Examples: Starcraft (one of the more famous esports), Hearts of Iron, Age of Empires.
  8.3. Tower defense. Building defenses to prevent enemies trespassing; decision-making about strategic locations for the defenses; some titles in this genre include shooter playability, allowing the player to defend against enemies. Examples: Flash Element Tower Defense, Desktop Tower Defense, Orcs Must Die!, Sanctum.
  8.4. MOBA (multiplayer online battle arena). Action real-time strategy (ARTS) or Dota-like (Dota = Defense of the Ancients); the player controls a unit that moves along a symmetric map to destroy the enemy base; multiplayer only. Examples: League of Legends, Dota 2 (both titles are famous esports games).
  8.5. Tactics in turns. Similar to the turn-based strategy genre and tactical RPGs, but combat focused. Example: X-Com.
  8.6. Real-time tactics. Combat strategy, emulating tactics from the battlefield. Examples: Total War, Full Spectrum Warrior.

9. Puzzle. Focused on solving problems unrelated to each other; barely any narrative (the Professor Layton saga is an exception); solutions to problems rely neither on speed nor on accuracy but on intellectual abilities.
  9.1. Educational games. The goal is to teach through problems or questions; usually for children; when the games include playful elements the genre is named “edutainment games.” Examples: Brain Training, Big Brain Academy.
  9.2. Action puzzles. Played in real time; the player has to perform very coordinated actions to solve the problems; the problems are usually visuospatial. Examples: Portal, Portal 2.

10. Car races. Driving vehicles; titles can be more or less realistic. Examples: Gran Turismo, Forza, Mario Kart, Crash Team Racing.

11. Music. These games are focused on dancing, singing, or following rhythms.
  11.1. Rhythm. The player must type commands following the music; two types: played by hand and played by feet. Examples: Beatmania, Dance Dance Revolution, Pump It Up.
  11.2. Dancing. The player has to imitate the movements they see on the screen while hearing a song. Example: Just Dance.
  11.3. Singing. The player has to sing a song whose lyrics are on the screen (like karaoke). Example: SingStar.

12. Sports. These games reproduce the practice of a sport. Examples: FIFA, Wii Sports.

13. Nonmechanical genres. Titles in this genre can also belong to one of the other genres, for reasons other than game mechanics.
  13.1. Arcade. Essentially this refers to the distribution format; in earlier times, each of these games was played on a specific machine in a public environment (e.g., a pub); they never rely on narrative.
  13.2. Simulators. “Simulation games” may or may not refer to the game mechanics, because they can be referred to as their own genre or be a simulation game of another genre (e.g., racing simulation). Examples: The Sims, Animal Crossing, Farmville, Forza (driving), Star Citizen (piloting a spaceship).
  13.3. Massive multiplayer online (MMO). All players play in a shared world, interacting among themselves; usually these games are also RPGs (MMORPG). Examples: World of Warcraft, Dungeon Fighter Online (playability like a beat ’em up).
  13.4. Roguelike. These games consist of dungeons that are built at random each time the player plays; only one player; no life regeneration; exploration is essential to solve the game; the goal is not to finish the game once but many times (to unlock special features). Many of these games also belong to the action-adventure genre, but there are exceptions. Examples: The Binding of Isaac, Enter the Gungeon (action), Nuclear Throne (action), Faster Than Light (tactics), Strafe (FPS), Forgotten Depths.

Genre is important when comparing video game players with non-players. In this
regard, Dobrowolski and colleagues (2015) compared people who had played
mainly first-person shooter (FPS) or real-time strategy (RTS) video games for
seven or more hours per week during the six months previous to starting the study
with people who had played five hours or less per week (including no more than two
hours per week of FPS and/or RTS). Players and non-players were compared on task-
switching performance and multiple-object tracking (MOT). RTS players outperformed
non-players on the set size they could accurately follow in the MOT task, and they were
also less affected by switches than non-players in the switching task. No differences
were found between FPS players and non-players.
The enhancement of attention skills sometimes found (Green and Bavelier, 2003)
might result from the different processes in players whose experience comes from
different game genres. These processes may remain unknown if the analyzed group
of participants includes more individuals who have played shooter and strategy
games (e.g., Team Fortress Classic) than people who have played car racing
games (e.g., Super Mario Kart), even though both are “action games.”
Experience with video games is usually assessed in terms of hours per week and
video game genre. There are some questionnaires for assessing these variables. The
Video Games Playing Habits (VGPH) questionnaire by Quiroga and colleagues
(2011) and the Video Game Playing Questionnaire by the Bavelier Lab (Bediou
et al., 2018) are two examples.
The first studies considered non-players only (Adams & Mayer, 2012; Glass,
Maddox, & Love, 2013; Quiroga et al., 2009, 2011) because it was relatively easy
to find naïve participants. When video game experience started to be explicitly
considered, the cut-off was five or more hours per week (Green & Bavelier, 2003;
Green, Pouget, & Bavelier, 2010). Later this cut-off rose to 6–7 hours per week (West
et al., 2017). The numbers are expected to increase steadily.

Intelligence Assessment Using Video Games

The Association between Intelligence and Video Game Performance:


Cautionary Tales
There are research findings showing a lack of association between playing commer-
cial video games and individual differences in cognitive ability (Gnambs & Appel,
2017; Unsworth et al., 2015). However, other studies report very different conclu-
sions (Foroughi et al., 2016; Kokkinakis et al., 2017; Quiroga et al., 2015). To under-
stand this discrepancy, some crucial points must be clarified.
First, video game performance cannot (and should not) be estimated using time
invested playing video games. The amount of time devoted to doing something is not
a guarantee of achieving greater performance (Macnamara, Hambrick, and Oswald,
2014). As noted by Green and colleagues (2017), the relationship between practice
and outcome is not linear and, therefore, it is inappropriate to use playing
time as a proxy for playing performance.

Unfortunately, this is usually overlooked. Thus, for instance, Sala, Tatlidil, and
Gobet (2018) examined the meta-analytic correlation between video game perfor-
mance and cognitive ability (or cognitive processes). Their main conclusion was
this: There is no relation between the two domains. However, studies measuring
video game performance (N = 28) and those measuring video game playing hours
(N = 38) were combined, leading to a strange mix of effects (performance, motiva-
tion, etc.). As detailed below, correlations between intelligence and video games are
indeed substantial when studies measuring just playing hours are excluded.
Second, not all tasks are proper measures of cognitive ability. Visual attention
tasks, for instance, do not measure any second-stratum or broad ability. At best, they
can be considered within the first or narrow stratum below general visualization (Gv)
(Figure 26.1). At worst, some visual-attention tasks measure very specific cognitive
processes weakly related with the cognitive ability of interest.
Cognitive “abilities” and cognitive “processes” belong to conceptual realms that
must be distinguished. Again, in the meta-analytic study by Sala and colleagues
(2018), from the twenty-eight papers using raw scores as video game performance,
twelve referred to action video games. Among those twelve papers, seven have been
published in peer-reviewed journals. The correlations between cognitive ability tests
and video game performance were: Progressive Matrices = 0.63; Symmetry Span =
0.30; Mental Rotation = 0.69; Mental Paper Folding = 0.40. However, the correlations
between cognitive tasks¹ and video game performance were: Antisaccade task = 0.15;
Change Detection Task = −0.11; Color Wheel Task = −0.31; Matching Figure Task
(RT) = 0.12; Matching Figure Task (Accuracy) = 0.01; Visual Search Task = 0.11. The
difference between cognitive ability tests (average correlation 0.69) and cognitive
tasks (average correlation 0.14) is pretty obvious.
Furthermore, research is moving fast beyond computing simple correlations
between one test or task and performance on a given video game. The interest
focuses now on the latent traits tapped by various specific measures (Baniqued
et al., 2013; McPherson & Burns, 2007, 2008; Quiroga et al., 2009, 2011). Results
derived from this much more appropriate approach – based on the estimation
of second-stratum abilities (usually Gf, Gc, Gv, Gy, Gs) and the computation of
structural equation models (SEM) correlating latent factors for cognitive ability and
for video game performance (Baniqued et al., 2013; Foroughi et al., 2016; Quiroga
et al., 2015, 2018) – are summarized in Table 26.2.
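
To make the latent-variable approach concrete, here is a minimal sketch of such a model in Python, assuming the semopy package (version 2.x API) and invented variable names; it illustrates the general strategy, not the exact models fitted in the cited studies. Each latent factor is defined by several observed scores, and the parameter of interest is the correlation between the two factors.

```python
import pandas as pd
from semopy import Model  # assumed dependency: pip install semopy

# Lavaan-style description: a general ability factor (g) measured by three tests
# and a video game factor (VG) measured by three game scores. "g ~~ VG" asks for
# their covariance (a correlation once the factors are standardized).
MODEL_DESC = """
g  =~ reasoning_test + vocabulary_test + rotation_test
VG =~ game1_score + game2_score + game3_score
g ~~ VG
"""

def fit_latent_model(scores: pd.DataFrame):
    """scores: one row per participant, one column per observed measure."""
    model = Model(MODEL_DESC)
    model.fit(scores)
    return model.inspect()  # parameter table; the g ~~ VG row is the latent association
```
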
Third, the characteristics of the sample must be explicitly considered. Studying
children, adolescents, or adults may have differential impact on the observed
findings.
It is very important to keep in mind that meta-analytic studies are not the “cure-all”
remedy for psychological science. The combination of weak studies, even using
sophisticated statistical tools, cannot replace carefully designed and developed
studies. Mega-samples of individuals combined from largely disparate designs
may appeal to the naïve reader, but must be deeply inspected by the specialist before
buying the message.

¹ We use the term “cognitive tasks” instead of “cognitive tests” for those measures that were designed as
laboratory tasks and lack precise psychometric properties.
Meta-analytic reports can be very damaging for emerging research fields. The
meta-analysis of Sala and colleagues (2018) discussed above is a paradigmatic
example. Only four of the eighteen studies specifically focused on cognitive abilities
measuring video game “performance” were considered. As underscored by
H. J. Eysenck (1993):
Including all relevant material – good, bad, and indifferent – in meta-analysis admits
the subjective judgments that meta-analysis was designed to avoid. Several
problems arise in meta-analysis: regressions are often non-linear; effects are often
multivariate rather than univariate; coverage can be restricted; bad studies may be
included; the data summarized may not be homogeneous; grouping different causal
factors may lead to meaningless estimates of effects; and the theory-directed
approach may obscure discrepancies. (p. 789)

In short: revise and think carefully about the information included in published
meta-analyses because there may be much more than meets the eye (and for the
worse).

Intelligence and Video Game Performance in Adults


Table 26.2 summarizes the results reported in research studies relating cognitive
ability and video game performance published since 2007.
There is variability among studies, but commonalities can be highlighted. All
correlations are positive, which is consistent with the principle of the indifference of
the indicator, even when different genres are considered: puzzles, sports, shooters,
real-time strategy, MOBAs (multiplayer online battle arena), or customized ones.
The correlation with intelligence tests can be underestimated when studying MOBAs.
Furthermore, video game matchmaking ranking (MMR: ratio of historical wins to
losses) is sometimes used as the performance measure, although MMR from differ-
ent leagues does not imply the same performance level. This practice may also
underestimate the correlation with intelligence.
The reliability values for the video games considered are in the medium to high
range (Table 26.2): (1) for puzzles, values are high (internal consistency from 0.75 to
0.94²; stability from 0.65 to 0.84; split-half = 0.92); (2) for third-person shooters they
are also high (0.92); and (3) for sports games they are medium to high (0.77 to 0.86).
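
As a reminder of how one of the reliability figures above is typically obtained, the following sketch (plain Python with simulated scores; the odd/even split and the data are our illustration, not taken from the cited studies) computes a split-half coefficient and applies the Spearman-Brown correction.

```python
import numpy as np

def split_half_reliability(item_scores: np.ndarray) -> float:
    """Split-half reliability with the Spearman-Brown correction.

    item_scores: participants x items matrix (e.g., one column per game level).
    The odd/even split is one common convention; others are possible.
    """
    odd = item_scores[:, 0::2].sum(axis=1)
    even = item_scores[:, 1::2].sum(axis=1)
    r_half = np.corrcoef(odd, even)[0, 1]
    return 2 * r_half / (1 + r_half)  # Spearman-Brown step-up formula

# Illustrative data: 50 simulated players, 10 game levels scored on a 0-100 scale.
rng = np.random.default_rng(0)
ability = rng.normal(size=(50, 1))
scores = 50 + 15 * ability + rng.normal(scale=10, size=(50, 10))
print(round(split_half_reliability(scores), 2))
```
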
Cognitive abilities assessed with video games differ across studies, including fluid
intelligence (Gf), broad visual perception (Gv), general memory and learning (Gy),
and processing speed (Gs), but the correlation values are similar when using com-
posite scores: from 0.69 to 0.74 for Gf, or from 0.41 to 0.67 for Gs, for instance (see
Table 26.2 for further details). The lowest correlations are for attention tasks. Results
from the two SEM models tested were 0.93 for brain games and 0.78 for non-brain
games. Therefore, the general cognitive ability factor (g) and the general video game performance factor are closely similar.

² Except for two Big Brain Academy games: Faces = 0.44 and Color Count = 0.57.
Table 26.2 Summary of studies relating intelligence and video games

Columns: Study; Video game description; Ability measured¹; Results (reliability and convergent validity, at the test and at the latent levels).
McPherson & Burns Space Code Game (customized action game). Gs (processing speed) Test-retest = 0.84
(2007) Computer mouse response method. Gs = 0.67 (4 tests)
Goal is to destroy enemies’ spaceships that appear in the Gv = 0.35 (3 tests)


window view of a cockpit. At the bottom of the cockpit, APM = 0.49 (Gf)

nine spaceships are presented, each one with a single VAL WJ-III = 0.24 (Glr)
digit placed directly above. Destroying a spaceship
requires firing the number placed above the matching
ship at the bottom of the screen.

Video game experience was not assessed.


N = 61, 37 females, mean age 20.
McPherson & Burns Space Matrix (customized action game). Gy (general memory and learning) Test-retest = 0.77
(2008) In this game, participants were asked to destroy Correlations at the test level:
spaceships in the same manner as in Space Code while Dot Matrix = 0.66
also monitoring where dots were located on a 5 x 5 grid APM = 0.51
as the one Picture Swaps = 0.54
developed for the Dot Matrix test (Miyake et al., 2000). Correlations at the construct level:
These dot locations were described as indicators of Gf/WM = 0.69 (3 tests)
which “sector” of space they were operating in. From Gs = 0.40 (3 tests)
time to time participants had to report back to head-
quarters which sectors they had been operating in. The
screen layout was the same as that for Space Code, with
the addition of the sector grid, which appeared at inter-
vals to the right of the numerical response grid on the
cockpit control panel.

Video game experience was assessed related to hours per


week played and experience playing with a mouse.
N = 70, 40 females, mean age 19.6.

Quiroga et al. (2009) Three games from Big Brain Academy for the Nintendo g (general mental ability), obtained Mallet Math = 0.12 to −0.52

Wii Console: Mallet Math, Reverse Retention, and from 5 tests (Numerical Rev. Retention = 0.43 to 0.34
Train Turn (puzzle games). Reasoning, FDSPAN, FLSPAN, Train Turn = 0.49 to 0.67
Participants played 10 blocks, each consisting of 10 Rotation of Solid Figures, and
items, during 2 nonconsecutive weeks (15 days of Corsi Block).

separation).
Video game experience was assessed. Selected partici-
pants had no previous experience with these video
games.
N = 27, 17 females, mean age 21.5.
Quiroga et al. (2011) Two games from Big Brain Academy for the Nintendo g (general mental ability), obtained Train Turn = 0.65 to 0.67
Wii Console: Train Turn and Speed Sorting (puzzle from 5 tests (PMA-R, PMA-S, D- Speed Sorting = 0.65 to 0.34
games). 48, and Rotation of Solid Figures).
Participants played 25 blocks, each consisting of 10 items,
during 5 consecutive weeks.
Video game experience was assessed, with the Video
Games Playing Habits. Selected participants had no
previous experience with these video games.
N = 27 females, mean age 21.
Adams & Mayer (2012) Tetris (puzzle game). Gv (broad visual perception) static Gv static:
Unreal Tournament (UT; classic shooter game). (Paper Folding and Mental Pap. Fold. – Tetris = 0.24
All participants were non-video game players. Rotation) and Gv dynamic (Race2 Pap. Fold. – UT = 0.27
N = 69, 44 females, mean age 19.3. and Interception Tasks by Hunt et M. Rotat (errors) – Tetris = −0.21
al., 1988). M. Rotat. (errors) – UT = −0.27
Gv Dynamic:
Race2 RT – Tetris = −0.20
Intercep. Hits – Tetris = 0.19
Race2 RT – UT = −0.27
Intercep. Hits – UT = 0.21

Baniqued et al., (2013) 20 casual games (for computer; can be considered to be Gf (fluid intelligence) At the test level (Zmean):
puzzle games), grouped in 4 types: Reasoning, Working Gy (general memory and Gf Gy Gs Glr Att.

Memory, Spatial Reasoning, Attention, Visuo-Motor learning) WM+R 0.65 0.55 0.36 0.12 0.06
Speed and Perceptual Speed. Gs (processing speed) Spat. Rel. 0.57 0.44 0.18 0.00 0.13
N = 219, 33% male, mean age 21.7. Glr (long-term retrieval) Attention 0.46 0.41 0.28 −0.03 0.19
Attention Vis.Motor 0.27 0.17 0.23 0.03 0.08

Per.Speed 0.36 0.24 0.24 0.02 0.06


At the latent level:
Ventura et al. (2013) Virtual Spatial Navigation Assessment – VSNA (adventure Gv (broad visual perception) Test-retest = 0.65
game). Spatial Orientation Test = 0.18
Participants have to find a set of gems in a 3D environment Mental Rotation Test = 0.26
using a first-person avatar.
Participants answered a general question about how often
they play video games.
N = 323, 194 females, no mean age reported.
Shute, Ventura, & Ke Portal 2 (action puzzle game) Gf (fluid intelligence) Portal 2:
(2015) Lumosity Platform (includes 52 puzzle games). Gv (broad visual perception) SPM-reduced = 0.02
77 participants, 43% male, mean age 19.7. Creativity Insight Test = −0.38
Problem-solving Remote Association Test (creativity)
= −0.18
Mental Rotation = −0.33
Spatial Orientation = 0.27
Table 26.2 (cont.)

Results (reliability and convergent


validity, at the test and at the latent
Study Video game description Ability measured1 levels)

VSNA = 0.34
Lumosity:
SPM-reduced = 0.37

Insight Test = 0.26



Remote Association Test (creativity)


= 0.10
Mental Rotation = −0.05
Spatial Orientation = −0.11

VSNA = −0.10
Quiroga et al. (2015) Ten Big Brain Academy games and Garden Gridlock (for g (general mental ability) Reliability (internal consistency):
the Nintendo Wii console) plus Tilt Maze (for compu- Gf (fluid intelligence) Analyze games: 0.76 to 0.80
ter), which were grouped in 4 types following game Gc (crystallized intelligence) Memorize: 0.44 to 0.67
developers’ descriptions: Analyze, Memorize, Gv (broad visual perception) Compute: 0.57 to 0.80
Compute, and Visualize. Gy (general memory and Visualize: 0.71 to 0.95
All are puzzle games. learning) At the test level:
Video game experience was assessed. Selected partici- Gs (processing speed) Gf Gc Gv Gy Gs
pants were naïve for the Wii console and Big Brain Analyze 0.62 0.40 0.65 0.30 0.55
Academy video game. Memorize 0.46 0.44 0.32 0.37 0.44
N = 188, 67 men, mean age 22.2. Compute 0.54 0.47 0.48 0.44 0.46
Visualize 0.64 0.34 0.66 0.29 0.41
At the latent level:
g – video game latent = 0.93
Buford & O’Leary Modified version of Portal 2 (action puzzle game). Gf (fluid intelligence) Split-half reliability of 0.92
(2015) Gc (crystallized intelligence) SPM = 0.46 (IPS)
Previous game experience as well as experience and skill Shipley Block Patterns = 0.49 (IPS)
with Portal 2 was assessed. Shipley Vocabulary = 0.30 (IPS)
Two samples of 94 (online sample – OS; mostly men, Wonderlic = 0.27 (OS)
mean age 24.8, very high experience playing. Only 27
completed the cognitive measures and 73 (in person –
IPS; 58% women, mean age 19.6, almost no experience
playing) participants.
Foroughi et al., (2016) A version of Portal 2 developed by authors (action puzzle Gf (fluid intelligence) Reliability (α) = 0.80

game). At the test level:


Two samples: APM – VGPs = 0.65

APM – both = 0.61


• N = 35 video game players (VGPs) experienced playing
BOMAT – both = 0.63
Portal 2, 9 females, mean age 21.3
APM – NVGPs = 0.60
• N = 100 video game players and non-video game BOMAT – NVGPs = 0.67

players (NVGPs), 74 females, mean age 21.6. At the latent level:


Gf = 0.78
Bonny, Castaneda, & Dota 2 (action real time strategy – ARTS; MOBA). Gy (general memory and WM task accuracy = 0.03
Swanson (2016) Participants were recruited in the International learning) WM task RT = −0.04
Tournament 5 of Dota 2. Numerical processing ability Location Mem. Task-d’ = −0.06
Dota play history was assessed. Location Mem. Task – RT = −0.10
N = 396, 34 females, mean age 23.4. Number Task Accuracy = 0.24
Number Task – RT = −0.11
Quiroga et al. (2016) • Professor Layton and the Curious Village (puzzle game). g (general mental ability) obtained Reliability (internal consistency):
• Two samples recruited in 2009 and 2016: from 3 tests: AR, SR, and VR from 0.94
the Differential Aptitudes Test). First sample:
• N = 47, 9 males, mean age 19.6 Puzzles found = 0.10 to 0.65
• N = 27, 6 males, mean age 20.6. Puzzles solved = 0.20 to 0.58
Participants completed 15 hours playing in six weeks. Second sample:
Puzzles found = 0.40 to 0.53
Puzzles solved = 0.30 to 0.58

Kranz et al. (2017) Six casual games (3 adaptive and 3 non-adaptive. Action Gf (fluid intelligence) Reason. – adaptive games = 0.60 to
puzzle games and puzzle and skill games). Gy (general memory and 0.74
Ten 20-minute playing sessions (2 to 3 per week) learning) WM – adaptive games = 0.40 to 0.65
N = 94, 30 males, mean age 21.2. Gs (processing speed) Per. Speed – adap. games = 0.00 to

0.05

Reason. Non-adap. games = 0.60 to


0.48
WM – Non-adap. games = 0.50 to
0.38
Percep. Speed – Non-adapt. games =

0.37 to 0.38
Kokkinakis et al. (2017) League of Legends (LoL), Dota 2 (action real time strat- Gf (fluid intelligence) Matrix-WASI II = 0.44
egy, multiplayer online battle arena). Gy (general memory and Rotation Span = 0.26
All subjects were experienced LoL players who had played learning) Symmetry Span = 0.12
a large number (> 100) of both “ranked” and Operation Span = 0.03
“unranked” matches.
N = 56, 51 males, mean age 20.5 years.
Kirkegaard (2018) Dota 2, League of Legends, Starcraft II (action real time g (general mental ability) National IQ and general gaming ability
strategy –ARTS, MOBA); = 0.79
Counter Strike: Global Offensive; Overwatch (first-person
shooter)
Counter Strike (tactical shooter)
Hearthstone (card game)
Super Smash Bros (classic fighting game).
Data collected at the country level (N = 195 countries).

Lim & Furnham (2018) Taboo (board game), Portal (action puzzle). Gf (fluid intelligence) Portal (time taken) – APM = −0.61
N = 112, 101 males, mean age 18.6 years. Gc (crystallized intelligence) Taboo (describing) – APM = 0.33

Taboo (guessing) – APM = 0.28


Quiroga et al. (2018) Space Invaders (shoot ’em up), Splatoon (third-person g (general mental ability)
shooter), Art of Balance, EDGE, Hook, Rail Maze, Blek Gf (fluid intelligence)
(puzzle games), Unpossible (action game), Sky Jump, Gv (broad visual perception) Maze = 0.81; Sky Jump = 0.77;
Crazy Pool (sports game). Gs (processing speed) Crazy Pool = 0.86.
N = 134, 29 males, mean age 21.04. At the latent level:
g – video game latent = 0.79
¹ Second-stratum abilities will be referred to using the CHC theory of cognitive abilities.

Figure 26.2 Correlations between the latent factors representing general video
game performance (VG) and the general factor of intelligence (g) from SEM
model with brain games (upper panels) and from SEM model with non-brain
games (bottom panels) (after Quiroga et al., 2015, 2018).

This close similarity opens the door to the design of intelligence assessment
batteries using video games (see Figure 26.2).
Regarding Gf, when video games are very novel, raw correlations with cognitive
ability are low at the beginning and increase until reaching the 0.65/0.74 range
(Kranz et al., 2017; Quiroga et al., 2016; Rabbitt et al., 1989). This increased
correlation demonstrates that video game performance is far from automated across
practice (Ackerman, 1988; Quiroga et al., 2011).
Studies include players and non-players. In this regard, the study by Foroughi and
colleagues (2016) shows that previous experience with the game hardly changes the
correlation between fluid intelligence and video game performance. New items were
designed using the mod that Portal 2 includes (Buford & O’Leary, 2015; Foroughi
et al., 2016). Portal 2 consists of chambers containing puzzles to be solved. The mod
allows researchers to build their own chambers (each chamber works much like an item
in a test) and thus remove the effect of previous experience on solving the new game.
Note that these results support measurement invariance for video games related to
Gf, or in other words, the video game is measuring the same construct irrespective of
the experience players have.
More recent studies have added the assessment of playing habits and self-
perceived playing skill in order to identify profiles of video game players across
different genres. The first studies simply selected participants without any experi-
ence, but this is unfeasible nowadays.

There is a lack of studies regarding predictive validity. The few studies consider-
ing this crucial issue focused on the association between academic success and
playing habits assessed by hours per week devoted to playing (Drummond &
Sauer, 2014; Posso, 2016). Higher scores in the 2012 Program for International
Student Assessment (PISA) were observed in students playing more hours per week.
Specifically, students who played online games almost every day scored 15 points
above the average in math and reading, and 17 points above the average in science.
This advantage was absent in those using social networks. In fact, students using
online social media on a daily basis scored 4 percent lower than the average on math,
reading, and science.
There are no studies relating video game performance and job performance, but
Chiang (2010) enumerated ten ways video games might boost occupational achieve-
ment using World of Warcraft (a role-playing [RPG] and massive multiplayer online
[MMO] game; see Table 26.1 for details). Chiang enumerated several facets (leader-
ship, dealing with and learning from failure, teamwork, developing talent, flexibility
[learning to improvise], being performance driven, living for challenge, competi-
tiveness, entrepreneurship, and managing information) but this still requires formal
research.
In conclusion, video game performance correlates with cognitive abilities.
However, more systematic research is required using a clear theoretical framework
regarding the cognitive abilities considered along with the superficial features and
mental requirements of the analyzed video games.

The Measurement of Cognitive Processes Associated with


Intelligence Using Video Games
Cognitive processes involve (1) the acquisition and understanding of
knowledge, (2) decision-making, and (3) problem-solving. There are two processes
extensively analyzed with respect to video game performance: perception and
attention (Bediou et al., 2018).
In this regard, video game research has been focused on first- and third-person
shooters, usually referred to as action video games. The key features of these games
are: fast pace, high cognitive load requiring updating, systematic switching between
local and global fields of action, and selective attention to detect relevant items
among distractors.
The fast pace of these games is ideal for youngsters. Findings usually show that
video game experience is associated with more efficient cognitive processes: visuos-
patial cognition g³ = 0.75; perception g = 0.78; top-down attention g = 0.63; multi-
tasking/switching g = 0.55; inhibition g = −0.31; and verbal cognition g = 0.30.
Obtained effect sizes for video game training are lower than those for video game
experience (more than 30 hours of training are required for achieving noticeable
improvements). However, follow-up data are required for confirming these positive
effects. Furthermore, it is quite possible that children showing higher cognitive
ability levels from the outset are more prone to play. Unfortunately, these research
studies do not measure ability baseline levels.

³ This is Hedges' g, which is equivalent to Cohen's d but especially suited for small sample sizes in meta-
analysis. It estimates effect size correcting for positive bias.
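
For readers unfamiliar with the effect size reported above, here is a minimal sketch (plain Python; the group scores are invented for illustration) of how Hedges' g is computed: Cohen's d on the pooled standard deviation, multiplied by a small-sample bias correction.

```python
import numpy as np

def hedges_g(group1: np.ndarray, group2: np.ndarray) -> float:
    """Hedges' g: Cohen's d with a correction for small-sample positive bias."""
    n1, n2 = len(group1), len(group2)
    # Pooled standard deviation (sample variances with n - 1 in the denominator).
    pooled_var = ((n1 - 1) * group1.var(ddof=1) + (n2 - 1) * group2.var(ddof=1)) / (n1 + n2 - 2)
    d = (group1.mean() - group2.mean()) / np.sqrt(pooled_var)
    correction = 1 - 3 / (4 * (n1 + n2) - 9)  # common approximation of the bias correction
    return d * correction

# Illustrative use: invented scores for players vs. non-players on some task.
rng = np.random.default_rng(1)
players = rng.normal(loc=105, scale=15, size=30)
non_players = rng.normal(loc=100, scale=15, size=30)
print(round(hedges_g(players, non_players), 2))
```
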
For real-time strategy (RTS) games, video game experience is related to the set
size that can be followed accurately in a multiple object tracking (MOT) task. Also,
RTS players are less affected by task switching than non-players (Dobrowolski et al.,
2015).
In a study of video game training by Glass and colleagues (2013), involving forty
hours training on Starcraft, a gaming condition that emphasized rapid switching
between multiple sources of information and action (the player commands and
controls two separate bases in multiple battles against two different opponent
bases) led to a large increase (Stroop d = 0.70) in cognitive flexibility compared to
playing The Sims (a life simulator game) for the same amount of time. Interestingly,
an even larger effect (d = 1.44) in cognitive flexibility has been obtained after
training for only two hours with a customized game that requires switching between
competing tasks (Parong et al., 2017).

Intelligence, Video Games, and the Brain


Playing video games is usually intensive and extensive. We have already
highlighted the relevance of different playing habits with respect to the measurement
invariance of video games. Now we discuss some neural correlates of video game
playing. These neural correlates will be related with (1) how intensive and extensive
the practice has been, (2) the video game genre, and (3) the players’ cognitive profile.
Table 26.3 summarizes the published studies.
In the same group of participants who had completed Professor Layton
and the Curious Village (see Tables 26.1 and 26.2 for a comprehensive description),
which took sixteen hours on average (four hours per week during four weeks),
structural and functional brain changes were observed when compared with
a control group.
colleagues (2012) analyzed cortical gray-matter volume, cortical surface area, cor-
tical thickness, and white matter integrity. Gray-matter changes were mainly circum-
scribed to frontal regions, but there were also some findings in the temporal and
parietal lobes. White matter integrity increased in the hippocampal cingulum and the
inferior longitudinal fasciculus.
The study by Martínez and colleagues (2013) computed group-independent com-
ponent analyses applying multi-session temporal concatenation on test-retest resting
state fMRI along with a dual-regression approach. The key finding revealed
increased correlated activity in parietal-frontal networks after playing the game
(Figure 26.3) (the video animation showing the regions involved on the identified
networks can be seen here: www.youtube.com/watch?v=jj3eaMm-Frc).
The functional changes occurred mainly in left temporal, parietal, and frontal
networks involved in varied memory and executive functions presumably relevant for the game (co-activated during video game playing).

Table 26.3 Main neural correlates of playing video games (after Palaus et al., 2017; Colom et al.,
2012; Martínez et al., 2013)

Variable Video game Neuroimaging method Results


Intelligence and working memory
Video game Rise of Nations (real- Magnetic resonance Volumetric changes in
(VG) time strategy) imaging (MRI) with dorsolateral prefrontal cortex
performance optimized voxel-based (dlPFC).
morphometry (VBM)
Warship Commander NIRS (infrared Higher activation of prefrontal
Task (action) spectroscopy) regions associated to game
difficulty (dlPFC).
Neuroracer (3D - Stimulating left dlPFC using
customized game) tDCS obtained improvement in
multitasking performance.
Tank Attack 3D (action Diffusion tensor imaging White matter FA in the right
game); Sushi Go Round (DTI) scans fornix/stria correlated with
(strategy without action) action game learning whereas
white matter FA in the left
cingulum/hippocampus
correlated with strategy game
learning.
VG Starcraft (real-time Cortical thickness Increased cortical thickness in
experience strategy) (FreeSurfer software) parietal cortex correlated with
winning rates of the league.
League of Legends; MRI and fMRI Consolidate connectivity
Dota (functional magnetic between executive regions
resonance imaging) (dlPFC and PPC) and the salience
network (anterior insula and the
ACC).
Guilty Gear (third- VBM and statistical Structural gray matter change in
person shooter) parametric mapping posterior parietal. VGPS higher
(SPM) analysis right inferior parietal lobe. ROI
analysis increased gray matter
volume in left caudate nucleus.
VG training Brain Fitness (puzzle); Diffusion-derived white Puzzle game: changes in
Space Fortress (shoot matter integrity; integrity occipitotemporal
’em up); Rise of Nations functional connectivity white matter.
(real-time strategy) Puzzle and action games:
decrease functional con-
nectivity between SPC
and ITL compared to Rise
of Nations.


Space Fortress (shoot EEG; ERSPS Frontal alpha power and alpha
’em up) (event-related spectral and delta ERSPS predicted
perturbations) subsequent learning and
performance.
Space Fortress (shoot fMRI Changes in functional activity in
’em up) SPL.
Professor Layton and Connectivity-wise Resting-state functional
the Pandora’s Box Resting state connectivity changes in frontal,
(puzzle) parietal, and temporal areas.
Professor Layton and MRI-optimized VBM; Volumetric changes in frontal,
the Pandora’s Box cortical surface; cortical parietal, and temporal lobes,
(puzzle) thickness; white matter bilateral. White matter:
integrity volumetric changes in
hippocampal cingulum and
inferior longitudinal fasciculus.
Super-Mario 64 (action MRI VBM8 toolbox Gray-matter increases in right
adventure) hippocampus. RdlPFC and
bilateral cerebellum.
Hippocampal increase related to
changes from egocentric to
allocentric navigation.
Visuospatial ability
VG training Super Mario 64 Cortical thickness Increased hippocampal
(FreeSurfer) volumes.
Space Fortress (Shoot fMRI Decreased activation in
’em up) occipitoparietal regions linked
to improved visuomotor task
performance.
VG Hours per week without Cortical thickness Structural volume enlargements
experience specifying types of (FreeSurfer) in the right hippocampus.
games
Puzzle, action and role MRI VBM Entorhinal cortex was positively
games correlated with lifetime
experience in logic/puzzle VGs
but negatively with action-based
role-playing games.
Expert gamers (more EEG Earlier N100 latencies in visual
than 8 years playing pathways.
more than 20 hours/
week last 6 months)

Attention
VG Halo; Counterstrike; MRI FMRIB Software In non-gamers, a frontoparietal
experience Gears of War; Call of Library network of areas showed greater
Duty (first-person recruitment as attentional
shooter) demands increased. Gamers
barely engaged this network as
attentional demands increased.
Action games Steady-state visual evoked P300 larger amplitude in VGPs
potentials than in NVGPs.
VG training Space Fortress (Shoot fMRI (FSL 4.1 and FEAT) After training, participants
’em up) showed a reduction of activation
of the right middle frontal gyrus,
right superior frontal gyrus and
ventral medial prefrontal cortex
while control group continued to
engage these areas.
VG Mario Power Tennis EEG (spectral analysis of Increment of the midline theta
performance (sports) theta and alpha waves) rhythm that increases with
practice and decrease of the
parietal alpha wave activity
followed by a slow increase.

Playing the game may,
therefore, feed the interaction between prefrontal and posterior memory-related
regions for cognitive control of encoding and retrieval processes when the informa-
tion stored in the short-term is monitored and manipulated within the working-
memory system.
We next discuss three examples related to video game (1) training, (2) experi-
ence, and (3) performance.
Kühn and colleagues (2013) analyzed gray-matter volume changes after two
months (thirty minutes per day) of practice with Super Mario 64 (an action-
adventure game) in young adults with little or no game experience in the past six
months and who had not previously played Super Mario 64. The results obtained
showed significant increase in gray-matter volume in the right hippocampus, right
dorsolateral prefrontal cortex, and bilateral cerebellum. Regarding number of play-
ing hours, Kühn and colleagues (2014) found a positive association between cortical
thickness and two brain areas that belong to the frontoparietal network. They did not
report the genres played and, therefore, their results can be interpreted as a mean
brain effect of playing video games in general.

Figure 26.3 Regions showing increased functional connectivity at rest, after


playing Professor Layton and the Curious Village four hours per week over four
weeks (Martínez et al., 2013).

The comparison of individuals who play shooter video games (at least five hours
per week playing video games like Call of Duty, Halo, Counterstrike, or Gears of
War in the previous twelve months) and non-players (less than one hour per week
playing the aforementioned video games in the previous twelve months, but playing
other games such as puzzle, card, or strategy games) has revealed clear differences
between those groups when completing selective attention tasks (Bavelier et al.,
2012). Functional MRI showed higher frontoparietal activation in non-players with
increased attention requirements, whereas this was not the case for experienced
players. Therefore, experienced players seem more efficient in filtering irrelevant
information.
In the third study, Nikolaidis and colleagues (2014) used Space Fortress to analyze
whether changes observed in some brain areas while playing predict changes in
nontrained working memory tasks. Participants were nonfrequent players (less than
four hours per week). Results showed that activity changes in the superior parietal
lobe, the paracingulate gyrus, and the precuneus predicted 37 percent of the

individual differences observed in a nontrained working memory task, but not in


a change-detection task.
The studies described support the association between practice with commercial
video games (puzzles and shooters) and brain changes. The conclusion is reinforced
by a recent meta-analysis (Palaus et al., 2017) but, again, it is crucial to have clear
frameworks for orienting research efforts to avoid wasting time and resources
(Colom & Román, 2018).

What’s Next?
We have seen that people can be ranked according to their video game
performance. This parallels ranking using standardized intelligence tests. The corre-
lation between cognitive ability tests and video game performance is medium to high
at the test level (Baniqued et al., 2013; McPherson & Burns, 2007, 2008; Quiroga
et al., 2016), but the values are extremely high at the latent level (Quiroga et al.,
2015, 2018) (see Figure 26.2). These results apply to quite heterogeneous genres,
from puzzles to MOBAs.
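Part of the gap between test-level and latent-level estimates simply reflects measurement error in the individual indicators. The sketch below applies Spearman's classical correction for attenuation to show how an observed correlation between a game score and a test score translates into a higher latent estimate; the reliabilities and the observed correlation used are illustrative assumptions, not figures from the studies cited above.

# Spearman's correction for attenuation: estimate the latent correlation
# from an observed correlation and the reliabilities of the two measures.
# The input values below are illustrative, not empirical.
def disattenuated_correlation(r_xy, rel_x, rel_y):
    return r_xy / (rel_x * rel_y) ** 0.5

# An observed (test-level) correlation of .55, with each measure having
# reliability .70, implies a latent correlation of about .79.
print(round(disattenuated_correlation(0.55, 0.70, 0.70), 2))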
Video games are also useful to test for intelligence in both players and non-players
(they show measurement invariance; Foroughi et al., 2016). If (and only if) video
games show medium cognitive complexity, are relatively consistent, and avoid
transfer, extensive practice does not change their correlation with standard intelli-
gence tests (Quiroga et al., 2009, 2011).
We are now ready to ask the next question: Is it time to use video games for
measuring intelligence and related cognitive abilities?
Yes, it is. We strongly endorse the message contained in the quote that opens this
chapter.
However, several issues must be addressed.
First, psychologists must be involved in every step of designing a video game: content, mechanics, complexity levels, variables to be saved, and scores to be computed. Commercial video game developers are not concerned with the information researchers and practitioners need.
Using commercial video games for research is inefficient because of the time and research assistants required, which may explain why researchers tend to assess video game experience (with a questionnaire) rather than video game performance. However, as already noted, these two measures tell different stories. Furthermore, when psychologists are involved from the very beginning (i.e., during the development of the game, as in McPherson and Burns' 2007, 2008 research), correlations with paper-and-pencil tests increase because the video game is designed to tap the cognitive ability of interest.
Second, video games can be designed as adaptive tests by broadening their scope.
Video games can easily include individualized pathways with different endings
depending on the difficulty levels achieved. If a player cannot overcome a certain
difficulty level, an exit pathway can be provided to avoid negative feelings.
Intelligence models may guide these pathways. This double adaptive approach will
allow implementing, in (say) a video game designed to measure fluid reasoning, the
rules and components for inductive reasoning considered by Primi (2014): (1) quanti-
tative pairwise progression; (2) figure addition and subtraction; (3) distribution of three
values in which the elements are instances of a conceptual attribute; (4) attribute
addition; (5) distribution of two values. This may allow the design of criterion-
referenced assessment tools avoiding arbitrary metrics based on normative scores.
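A minimal sketch of this adaptive logic, written in Python, is given below. The names used (present_item, max_failures, and so on) are hypothetical placeholders rather than part of any existing assessment engine: difficulty rises after a correct response, falls after an error, and an exit pathway is triggered after repeated consecutive failures so that the player avoids negative feelings.

# Hypothetical adaptive difficulty loop with an exit pathway.
# present_item is a callback that administers one item at the given
# difficulty level and returns True when the player solves it.
def adaptive_session(present_item, n_items=20, n_levels=5, max_failures=3):
    level, consecutive_failures, history = 1, 0, []
    for _ in range(n_items):
        correct = present_item(level)
        history.append((level, correct))
        if correct:
            level = min(level + 1, n_levels)    # raise difficulty
            consecutive_failures = 0
        else:
            consecutive_failures += 1
            if consecutive_failures >= max_failures:
                break                           # exit pathway
            level = max(level - 1, 1)           # lower difficulty and retry
    return history

# Toy usage: a simulated player who succeeds whenever difficulty is 3 or lower.
log = adaptive_session(lambda level: level <= 3)

Individualized pathways guided by Primi's rules could then be layered on top of such a loop, for instance by tying each difficulty level to specific rule combinations.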
Third, video games would be a useful way of estimating the average IQ level of populations (Kirkegaard, 2018). They might contribute to assessing otherwise inaccessible groups (video games can be implemented on cell phones) and to analyzing systematically the link between intelligence and health in real time, as suggested by Kokkinakis and colleagues (2017).
Fourth, video games may allow testing for response processes. They can record the continuous "flow" of behaviors, which would increase ecological validity: in everyday life settings, the same result can be achieved through different pathways. Furthermore, emotions can be manipulated to test their influence (or lack thereof) on cognitive performance. Forgotten Depths (downloadable for free from www.quirogas.net) is a customized game designed for this goal. The game taps working memory with or without an environment that evokes fear. The software provides accuracy and time scores for the primary (processing) and secondary (storage) tasks (finding the exit of each labyrinth and collecting all the required gems, respectively).
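As an illustration of the kind of per-labyrinth record such a game could log, the Python sketch below defines a hypothetical data structure; the field names are placeholders chosen for this example, not the actual variables stored by Forgotten Depths.

# Hypothetical per-labyrinth log entry for a working memory game.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class LabyrinthRecord:
    labyrinth_id: int
    fear_condition: bool                  # True if the labyrinth contains spiders
    exit_found: bool                      # accuracy on the primary (processing) task
    time_to_exit_s: float                 # time score on the primary task
    gems_collected: int                   # accuracy on the secondary (storage) task
    gems_required: int
    events: List[Tuple[float, str]] = field(default_factory=list)  # (timestamp, action)

record = LabyrinthRecord(
    labyrinth_id=1, fear_condition=True, exit_found=True,
    time_to_exit_s=92.4, gems_collected=3, gems_required=4,
    events=[(5.2, "pickup_gem"), (40.8, "enter_risky_area")],
)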
Available results show a correlation of 0.70 between standard working memory tasks and video game performance within the neutral labyrinths (no emotion evoked; no spiders). However, the correlation decreases within the emotional labyrinths (r = 0.50).
Forgotten Depths also provides data about (1) time invested in "safe" or "risky" places within each labyrinth, (2) number of clicks to exit, (3) number of times spiders killed the player, (4) number of times the player used their weapon, (5) number of spiders killed, and so on. Using these variables, results have shown that fearful people, even when they have the same working memory ability level on the standard tasks, perform worse than non-fearful people on the video game that contains spiders (d = 0.48) because they invest more time in finding the exit of the labyrinth (d = 0.44), although they collect the same number of gems (achievement measure, d = 0.05). They also stay longer in risky areas than non-fearful players (d = −0.54), and spiders bite them more frequently (d = −0.59). Fearful people seem to experience greater levels of fear while solving the game, obtain worse working memory scores, become disoriented more easily (more time in risky areas and more clicks to exit), and react less effectively (spiders bite them more often).
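For readers unfamiliar with the metric, these d values are standardized mean differences. The short Python sketch below shows the usual pooled-standard-deviation computation; the group statistics are made-up numbers rather than data from the Forgotten Depths studies.

# Cohen's d with a pooled standard deviation; all inputs are illustrative.
import math

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Example: fearful vs. non-fearful players on time to find the exit (seconds).
print(round(cohens_d(95.0, 82.0, 30.0, 28.0, 40, 40), 2))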
This customized video game includes a mod that allows researchers to create their own labyrinths as needed. Forgotten Depths is a good example of the type of video game required for measuring cognitive abilities properly.
In closing, systematic research is needed. The available evidence is highly promising, but substantial funding is required. Commercial video games must be replaced by games designed by scientists from the very beginning if they are to be used in both research and practice.
References
Ackerman, P. L. (1988). Individual differences and skill acquisition. In P. L. Ackerman,
R. J. Sternberg, & R. Glaser (Eds.), Learning and individual differences: Advances
in theory and practice (pp. 165–217). New York: W. H. Freeman and Company.
Adams, D., & Mayer, R. (2012). Examining the connection between dynamic and static
spatial skills and video game performance. Proceedings of the Annual Meeting of
the Cognitive Science Society, 34. https://escholarship.org/uc/item/8vc391r3
Baniqued, P. L., Lee, H., Voss, M. W., Basak, C., Cosman, J. D., DeSouza, S., et al. (2013).
Selling points: What cognitive abilities are tapped by casual video games? Acta
Psychologica, 142, 74–86. http://dx.doi.org/10.1016/j.actpsy.2012.11.009
Bavelier, D., Achtman, R. L., Mani, M., & Föcker, J. (2012). Neural bases of selective
attention in action video game players. Vision Research, 61, 132–143. https://doi
.org/10.1016/j.visres.2011.08.007
Bediou, B., Adams, D. M., Mayer, R. E., Tipton, E., Green, C. S., & Bavelier, D. (2018). Meta-
analysis of action video game impact on perceptual, attentional, and cognitive skills.
Psychological Bulletin, 144(1), 77–110. http://dx.doi.org/10.1037/bul0000130
Bonny, J. W., Castaneda, L. M., & Swanson, T. (2016). Using an international gaming
tournament to study individual differences in MOBA expertise and cognitive skills.
In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 3473–3484). San José, CA. http://dx.doi.org/10.1145/2858036.2858190
Buford, C. C., & O’Leary, B. J. (2015). Assessment of fluid intelligence utilizing a computer
simulated game. International Journal of Gaming and Computer-Mediated
Simulations, 7, 1–17. http://dx.doi.org/10.4018/IJGCMS.2015100101
Chiang, O. (2010). Ten ways games can boost your careers. www.forbes.com/2010/07/19/
career-leadership-strategy-technology-videogames_slide.html#2b2f83c05cb6
Colom, R., Quiroga, M. A., Solana, A. B., Burgaleta, M., Román, F. J., Privado, J., et al.
(2012). Structural changes after videogame practice related to a brain network
associated with intelligence. Intelligence, 40, 479–489.
Colom, R., & Román, F. J. (2018). Enhancing intelligence. From the group to the individual.
Journal of Intelligence, 6 (1), 11. https://doi.org/10.3390/jintelligence6010011
Dobrowolsky, P., Hanusz, K., Sobczyk, B., Skorko, M., & Wiatrow, A. (2015). Cognitive
enhancement in video game players: The role of video game genre. Computers in
Human Behavior, 44, 59–63. http://dx.doi.org/10.1016/j.chb.2014.11.051
Drummond, A., & Sauer, J. D. (2014). Video-games do not negatively impact adolescent
academic performance in science, mathematics or reading. PLoS One, 9(4), e87943.
Ekstrom, R. B., French, J. W., & Harman, H. H. (1976). Manual for kit of Factor-Referenced
Cognitive Tests. Princeton: Educational Testing Service.
Eysenck, H. J. (1993). Meta-analysis and its problems. British Medical Journal, 309, 789–792.
Foroughi, C. K., Serraino, C., Parasuraman, R., & Boehm-Davis, A. (2016). Can we create
a measure of fluid intelligence using Puzzle Creator within Portal 2? Intelligence,
56, 58–64. http://dx.doi.org/10.1016/j.intell.2016.02.011
Glass, B. D., Maddox, W. T., & Love, B. C. (2013). Real time strategy game training:
Emergence of a cognitive flexibility trait. PLoS One, 8(8), e70350. http://dx
.doi.org/10.1371/journal.pone.0070350
Gnambs, T., & Appel, M. (2017). Is computer gaming associated with cognitive abilities?
A population study among German adolescents. Intelligence, 61, 19–28. http://dx
.doi.org/10.1016/j.intell.2016.12.004
Gottfredson, L. (1997a). Mainstream science on intelligence: An editorial with 52 signatories, history, and bibliography. Intelligence, 24(1), 13–23.
Gottfredson, L. (1997b). Why g matters: The complexity of everyday life. Intelligence, 24,
79–132.
Green, C. S., & Bavelier, D. (2003). Action video game modifies visual selective attention.
Nature, 423, 534–537.
Green, C. S., Kattner, F., Eichenbaum, A., Bediou, B., Adams, D. M., et al. (2017). Playing
some video games but no others is related to cognitive abilities: A critique of
Unsworth et al. (2015). Psychological Science, 28(5), 679–682. http://dx.doi.org/10
.1177/0956797616644837
Green, C. S., Pouget, A., & Bavelier, D. (2010). Improved probabilistic inference as a general
learning mechanism with action video games. Current Biology, 20, 1573–1579.
http://dx.doi.org/10.1016/j.cub.2010.07.040
Heim, A. W. (1968). AH4 Test. Windsor, UK: Nfer-Nelson.
Hunt, E. B. (2011). Where are we? Where are we going?: Reflections on the current and future
state of research on intelligence. In R. J. Sternberg & S. B. Kaufman (Eds.),
Cambridge handbook of intelligence (pp. 864–885). New York: Cambridge
University Press.
Hunt, E., Pellegrino, J. W., Frick, R. W., Farr, S. A., & Alderton, D. (1988). The ability to
reason about movement in the visual field. Intelligence, 12, 77–100.
Jensen, A. (1998). The g factor: The science of mental ability. Westport, CT: Praeger.
Jones, M. B., Dunlap, W. P., & Bilodeau, I. M. (1986). Comparison of video game and
conventional test performance. Simulation and Games, 17(4), 435–446.
Kirkegaard, E. O. W. (2018). Is national mental sport ability a sign of intelligence? An
analysis of the top players of 12 mental sports. https://psyarxiv.com/9qnwy
Kokkinakis, A. V., Cowling, P. I., Drachen, A., & Wade, A. R. (2017). Exploring the
relationship between video game expertise and fluid intelligence. PLoS One 12
(11), e0186621. https://doi.org/10.1371/journal.pone.0186621
Kovacs, K., & Conway, A. R. A. (2016). Process overlap theory: A unified account of the
general factor of intelligence. Psychological Inquiry, 27, 151–177. https://doi.org
/10.1080/1047840X.2016.1153946
Kranz, M. B., Baniqued, P. L., Voss, M. W., Lee, H., & Kramer, A. F. (2017). Examining the
roles of reasoning and working memory in predicting casual game performance
across extended gameplay. Frontiers in Psychology, 8 (203), 1–13. http://dx.doi.org
/10.3389/fpsyg.2017.00203
Kühn, S., Gleich, T., Lorenz, R. C., Lindenberger, U., & Gallinat, J. (2013). Playing Super Mario induces structural brain plasticity: Gray matter changes resulting from training with a commercial video game. Molecular Psychiatry. https://doi.org/10.1038/mp
.2013.120
Kühn, S., Lorenz, R., Banaschewski, T., Barker, G. J., Büchel, C., Conrod, P. J., et al., (2014).
Positive association of video game playing with left frontal cortical thickness in
adolescents. PLoS One, 9(3), e91506.
Lim, J., & Furnham, A. (2018). Can commercial games function as intelligence tests? A pilot
study. Computer Games Journal, 7(1), 27–37. https://doi.org/10.1007/s40869-
018-0053-z
Macnamara, B., Hambrick, D., & Oswald, F., (2014). Deliberate practice and performance in
music, games, sports, education, and professions: A meta-analysis. Psychological
Science, 25 (8), 1608–1618. http://dx.doi.org/10.1177/0956797614535810
Martínez, K., Solana, A. B., Burgaleta, M., Hernández-Tamames, J. A., Alvarez-Linera, J.,
Román, F. J., et al. (2013). Changes in resting-state functionally connected parieto-
frontal networks after videogame practice. Human Brain Mapping, 34, 3143–3157.
http://dx.doi.org/10.1002/hbm.22129
Martinovic, D., Ezeife, C. I., Whent, R., Reed, J., Burgess, G. H., Pomerleau, C. M., et al.
(2014). “Critic-proofing” of the cognitive aspects of simple games. Computers and
Education, 72, 132–144. http://dx.doi.org/10.1016/j.compedu.2013.10.017
McGrew, K. S. (2009). CHC theory and the human cognitive abilities project: Standing on the
shoulders of the giants of psychometric intelligence research. Intelligence, 37, 1–10.
http://dx.doi.org/10.1016/j.intell.2008.08.004
McPherson J., & Burns N. R. (2007). Gs invaders: Assessing a computer game-like test of
processing speed. Behavior Research Methods, 39, 876–883. http://dx.doi.org/10
.3758/BF03192982
McPherson, J., & Burns, N. R. (2008). Assessing the validity of computer-game-like tests of
processing speed and working memory. Behavior Research Methods, 40, 969–981.
http://dx.doi.org/10.3758/BRM.40.4.969
Miyake, A., Friedman, N. P., Emerson, M. J., Witzki, A. H., Howerter, A., & Wager, T. (2000).
The unity and diversity of executive functions and their contributions to complex
“frontal lobe” tasks: A latent variable analysis. Cognitive Psychology, 41, 49–100.
Palaus, M., Marron, E. M., Viejo-Sobera, R., & Redolar-Ripoll, D. (2017). Neural basis of
video gaming: A systematic review. Frontiers in Human Neuroscience, 11(248),
1–40. http://dx.doi.org/10.3389/fnhum.2017.00248
Parong, J., Mayer, R. E., Fiorella, L., MacNamara, A., Homer, B. D., & Plass, J. L. (2017).
Learning executive function skills by playing focused video games. Contemporary
Educational Psychology, 51, 141–151. http://dx.doi.org/10.1016/j.cedpsych
.2017.07.002
Posso, A. (2016). Internet usage and educational outcomes among 15-year-old Australian
students. International Journal of Communication, 10, 3851–3876.
Primi, R. (2014). Developing a fluid intelligence scale through a combination of Rasch
modeling and cognitive psychology. Psychological Assessment, 26(3), 774–788.
http://dx.doi.org/10.1037/a0036712
Quiroga, M. A., Aranda, A., Román, F. J., Privado, J., & Colom, R. (2018). Intelligence can be
measured with video games other than “brain-games.” Intelligence, 75, 85–94.
Quiroga, M. A., Escorial, S., Román, F. J., Morillo, D., Jarabo, A., Privado, J. et al. (2015).
Can we reliably measure the general factor of intelligence (g) through commercial
video games? Yes, we can! Intelligence, 53, 1–7. http://dx.doi.org/10.1016/j
.intell.2015.08.004
Quiroga, M. A., Herranz, M., Gómez-Abad, M., Kebir, M., Ruiz, J., & Colom, R. (2009).
Video-games: Do they require general intelligence? Computers and Education, 53,
414–418. http://dx.doi.org/10.1016/j.compedu.2009.02.017
Quiroga, M. A., Román, F. J., Catalán, A., Rodríguez, H., Ruiz, J., Herranz, M., et al. (2011).
Videogame performance (not always) requires intelligence. International Journal of
Online Pedagogy and Course Design, 1, 18–32. http://dx.doi.org/10.4018/ijopcd
.2011070102
Quiroga, M. A., Román, F. J., De la Fuente, J., Privado, J., & Colom, R. (2016). The
measurement of intelligence in the XXI century using video games. Spanish
Journal of Psychology, 19, 1–13.
Rabbitt, P., Banerji, N., & Szymanski, A. (1989). Space Fortress as an IQ test? Predictions of
learning and of practiced performance in a complex interactive video game. Acta
Psychologica, 71, 243–257.
Sajjadi, P., Vlieghe, J., & De Troyer, O. (2017). Exploring the relation between the theory of
multiple intelligences and games for the purpose of player-centered game design.
Electronic Journal of e-Learning, 15(4), 320–334. www.ejel.org/main.home
Sala, G., Tatlidil, K. S., & Gobet, F. (2018). Video game training does not enhance cognitive
ability: A comprehensive meta-analytic investigation. Psychological Bulletin, 144
(2), 111–139. http://dx.doi.org/10.1037/bul0000139
Sedig, K., Haworth, R., & Corridore, M. (2015). Investigating variations in gameplay:
Cognitive implications. International Journal of Computer Games Technology,
Article ID 208247. http://dx.doi.org/10.1155/2015/208247
Shute, V. J., Ventura, M., & Ke, F. (2015). The power of play: The effects of Portal 2 and
Lumosity on cognitive and noncognitive skills. Computers and Education, 80,
58–67. http://dx.doi.org/10.1016/j.compedu.2014.08.013
Spearman, C. (1904). “General intelligence,” objectively determined and measured.
American Journal of Psychology, 15(2), 201–292.
Torre-Tresols, J. J. (2017). Clasificación de géneros de videojuegos [Classification of video game genres]. Laboratory of Intelligence (Faculty of Psychology), Universidad Complutense de Madrid: Laboratorio de Inteligencia y Videojuegos. www.quirogas.net
Unsworth, N., Redick, T. S., McMillan, B. D., Hambrick, D. Z., Kane, M. J., & Engle, R. W.
(2015). Is playing video games related to cognitive abilities? Psychological Science,
26, 759–774. http://dx.doi.org/10.1177/0956797615570367
Ventura, M., Shute, V. J., Wright, T., & Zhao, W. (2013). An investigation of the validity of the
virtual spatial navigation assessment. Frontiers in Psychology, 4, 852. http://dx
.doi.org/10.3389/fpsyg.2013.00852
West, G. L., Konishi, K., Diarra, M., Benady-Chorney, J., Drisdelle, B. L., Dahmani, L., et al.
(2017). Impact of video games on plasticity of the hippocampus. Molecular
Psychiatry, 23, 1566–1574. http://dx.doi.org/10.1038/mp.2017.155
