All content following this page was uploaded by Roberto Colom on 04 July 2019.
C:/ITOOLS/WMS/CUP-NEW/18363569/WORKINGFOLDER/STERNBERG-UK/9781108485104C26.3D 627 [626–656] 29.6.2019 7:37PM
[Figure 26.1: hierarchical model of cognitive abilities, with general intelligence (g) at the apex]
Three years later, Rabbitt and colleagues (1989) assessed intelligence with the
AH4 test (Heim, 1968), whereas video game performance was evaluated using Space
Fortress. The AH4 test is a group-administered test consisting of sixty-five items
belonging to the verbal and numerical domains. Space Fortress is a video game
designed at the University of Illinois for studying complex-skill acquisition. The
game’s goal is to shoot missiles and destroy a space fortress (it can be installed and
run from http://hyunkyulee.github.io/research_sf.html), and participants played over
five successive days. Greater correlations were observed with increased practice
(from 0.28 to 0.68). This was the main conclusion: “a relatively unsophisticated
video-game, on which performance may reasonably be expected to be independent
of native language or acquired literacy, and which is greatly enjoyed by young people
who play it, rank orders individual differences in ‘intelligence’ nearly as well as
pencil and paper psychometric tests which have been specially developed for this
purpose over the last 80 years” (p. 13).
The increase in correlations from the first to the fifth session suggests that some
practice was necessary to overcome preexisting differences in familiarity with the
video game. The correlation became stable once those differences disappeared (stay
tuned).
Genre is important when comparing video game players with non-players. In this
regard, Dobrowolski and colleagues (2015) compared people who had played
mainly first-person shooter (FPS) or real-time strategy (RTS) video games for
seven or more hours per week during the six months previous to starting the study
with people who had played five hours or less per week (including no more than two
hours per week of FPS and/or RTS). Players and non-players were compared on task-switching performance and multiple-object tracking (MOT). RTS players outperformed non-players on the set size they could accurately track in the MOT task, and they were also less affected by switches in the task-switching paradigm. No differences were found between FPS players and non-players.
The enhancement of attention skills sometimes found (Green & Bavelier, 2003) might result from different processes in players whose experience comes from different game genres. These processes may remain unknown if the analyzed group of participants includes more individuals who have played shooter and strategy games (e.g., Team Fortress Classic) than people who have played car racing games (e.g., Super Mario Kart), even though both are “action games.”
Experience with video games is usually assessed in terms of hours per week and
video game genre. There are some questionnaires for assessing these variables. The
Video Games Playing Habits (VGPH) questionnaire by Quiroga and colleagues
(2011) and the Video Game Playing Questionnaire by the Bavelier Lab (Bediou
et al., 2018) are two examples.
The first studies considered non-players only (Adams & Mayer, 2012; Glass, Maddox, & Love, 2013; Quiroga et al., 2009, 2011) because it was relatively easy
to find naïve participants. When video game experience started to be explicitly
considered, the cut-off was five or more hours per week (Green & Bavelier, 2003;
Green, Pouget, & Bavelier, 2010). Later this cut-off rose to 6–7 hours per week (West
et al., 2017). The numbers are expected to increase steadily.
Unfortunately, this is usually overlooked. Thus, for instance, Sala, Tatlidil, and
Gobet (2018) examined the meta-analytic correlation between video game perfor-
mance and cognitive ability (or cognitive processes). Their main conclusion was
this: There is no relation between the two domains. However, studies measuring
video game performance (N = 28) and those measuring video game playing hours
(N = 38) were combined, leading to a strange mix of effects (performance, motiva-
tion, etc.). As detailed below, correlations between intelligence and video games are
indeed substantial when studies measuring just playing hours are excluded.
Second, not all tasks are proper measures of cognitive ability. Visual attention
tasks, for instance, do not measure any second-stratum or broad ability. At best, they
can be considered within the first or narrow stratum below general visualization (Gv)
(Figure 26.1). At worst, some visual-attention tasks measure very specific cognitive
processes weakly related with the cognitive ability of interest.
Cognitive “abilities” and cognitive “processes” belong to conceptual realms that
must be distinguished. Again, in the meta-analytic study by Sala and colleagues
(2018), from the twenty-eight papers using raw scores as video game performance,
twelve referred to action video games. Among those twelve papers, seven have been
published in peer-reviewed journals. The correlations between cognitive ability tests
and video game performance were: Progressive Matrices = 0.63; Symmetry Span =
0.30; Mental Rotation = 0.69; Mental Paper Folding = 0.40. However, the correlations
between cognitive tasks1 and video game performance were: Antisaccade task = 0.15;
Change Detection Task = −0.11; Color Wheel Task = −0.31; Matching Figure Task
(RT) = 0.12; Matching Figure Task (Accuracy) = 0.01; Visual Search Task = 0.11. The
difference between cognitive ability tests (average correlation 0.69) and cognitive
tasks (average correlation 0.14) is pretty obvious.
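When pooling correlation coefficients like these, a common precaution is to average them on Fisher's z scale rather than averaging raw r values, because r is not additive. The sketch below (illustrative Python, not from the chapter) applies that transform to the coefficients listed above; note that the chapter reports its own summary averages, and signed, absolute, and z-averaged means can all differ.

```python
import math

def fisher_avg(rs):
    """Average correlations on Fisher's z scale, then back-transform to r."""
    zs = [math.atanh(r) for r in rs]      # r -> z (additive scale)
    return math.tanh(sum(zs) / len(zs))   # mean z -> r

# Coefficients reported above (pooled here for illustration only)
ability_rs = [0.63, 0.30, 0.69, 0.40]              # cognitive ability tests
task_rs = [0.15, -0.11, -0.31, 0.12, 0.01, 0.11]   # cognitive tasks

print(round(fisher_avg(ability_rs), 2))   # ≈ 0.52 (signed average)
print(round(fisher_avg(task_rs), 2))      # ≈ -0.01 (signs nearly cancel)
```

The point is methodological, not a correction of the chapter's figures: when some coefficients are negative (as with the cognitive tasks), the choice between signed and absolute averaging changes the summary substantially, which matters when contrasting tests with tasks.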
Furthermore, research is moving fast beyond computing simple correlations
between one test or task and performance on a given video game. The interest
focuses now on the latent traits tapped by various specific measures (Baniqued
et al., 2013; McPherson & Burns, 2007, 2008; Quiroga et al., 2009, 2011). Results
derived from this much more appropriate approach – based on the estimation
of second-stratum abilities (usually Gf, Gc, Gv, Gy, Gs) and the computation of
structural equation models (SEM) correlating latent factors for cognitive ability and
for video game performance (Baniqued et al., 2013; Foroughi et al., 2016; Quiroga
et al., 2015, 2018) – are summarized in Table 26.2.
Third, the characteristics of the sample must be explicitly considered. Studying
children, adolescents, or adults may have differential impact on the observed
findings.
It is very important to keep in mind that meta-analytic studies are not the “cure-all”
remedy for psychological science. The combination of weak studies, even using
sophisticated statistical tools, cannot replace carefully designed and developed
studies. Mega-samples of individuals combined from largely disparate designs
1 We use the term “cognitive tasks” instead of “cognitive tests” for those measures that were designed as laboratory tasks and lack precise psychometric properties.
may appeal to the naïve reader, but must be deeply inspected by the specialist before
buying the message.
Meta-analytic reports can be very damaging for emerging research fields. The meta-analysis of Sala and colleagues (2018) discussed above is a paradigmatic example: only four of the eighteen studies measuring video game “performance” that specifically focused on cognitive abilities were considered. As underscored by
H. J. Eysenck (1993):
Including all relevant material – good, bad, and indifferent – in meta-analysis admits
the subjective judgments that meta-analysis was designed to avoid. Several
problems arise in meta-analysis: regressions are often non-linear; effects are often
multivariate rather than univariate; coverage can be restricted; bad studies may be
included; the data summarized may not be homogeneous; grouping different causal
factors may lead to meaningless estimates of effects; and the theory-directed
approach may obscure discrepancies. (p. 789)
In short: revise and think carefully about the information included in published
meta-analyses because there may be much more than meets the eye (and for the
worse).
2 Except for two Big Brain Academy games: Faces = 0.44 and Color Count = 0.57.
Table 26.2 Summary of studies relating intelligence and video games

[Row continued from previous page] Nine spaceships are presented, each with a single digit placed directly above; destroying a spaceship requires firing the number placed above the matching ship at the bottom of the screen. Result: VAL WJ-III = 0.24 (Glr).

Quiroga et al. (2009). Games: three games from Big Brain Academy for the Nintendo Wii console — Mallet Math, Reverse Retention, and Train Turn (puzzle games). Participants played 10 blocks of 10 items each during 2 nonconsecutive weeks (15 days of separation). Video game experience was assessed; selected participants had no previous experience with these video games. N = 27, 17 females, mean age 21.5. Measures: g (general mental ability), obtained from 5 tests (Numerical Reasoning, FDSPAN, FLSPAN, Rotation of Solid Figures, and Corsi Block). Correlations: Mallet Math = 0.12 to −0.52; Reverse Retention = 0.43 to 0.34; Train Turn = 0.49 to 0.67.

Quiroga et al. (2011). Games: two games from Big Brain Academy for the Nintendo Wii console — Train Turn and Speed Sorting (puzzle games). Participants played 25 blocks of 10 items each during 5 consecutive weeks. Video game experience was assessed with the Video Games Playing Habits questionnaire; selected participants had no previous experience with these video games. N = 27 females, mean age 21. Measures: g (general mental ability), obtained from 5 tests (PMA-R, PMA-S, D-48, and Rotation of Solid Figures). Correlations: Train Turn = 0.65 to 0.67; Speed Sorting = 0.65 to 0.34.

Adams & Mayer (2012). Games: Tetris (puzzle game) and Unreal Tournament (UT; classic shooter game). All participants were non-video game players. N = 69, 44 females, mean age 19.3. Measures: Gv (broad visual perception) static (Paper Folding and Mental Rotation) and Gv dynamic (Race2 and Interception tasks by Hunt et al., 1988). Correlations — Gv static: Paper Folding – Tetris = 0.24; Paper Folding – UT = 0.27; Mental Rotation (errors) – Tetris = −0.21; Mental Rotation (errors) – UT = −0.27. Gv dynamic: Race2 RT – Tetris = −0.20; Interception Hits – Tetris = 0.19; Race2 RT – UT = −0.27; Interception Hits – UT = 0.21.

Baniqued et al. (2013). Games: 20 casual computer games (can be considered puzzle games), grouped in 4 types (Working Memory/Reasoning, Spatial Reasoning, Attention, and Visuo-Motor/Perceptual Speed). N = 219, 33% male, mean age 21.7. Measures: Gf (fluid intelligence), Gy (general memory and learning), Gs (processing speed), Glr (long-term retrieval), Attention. Correlations at the test level (Zmean):

                    Gf     Gy     Gs     Glr    Att.
  WM + Reasoning    0.65   0.55   0.36   0.12   0.06
  Spatial Reas.     0.57   0.44   0.18   0.00   0.13
  Attention         0.46   0.41   0.28  −0.03   0.19
  Visuo-Motor       0.27   0.17   0.23   0.03   0.08

[Row fragment; study not identifiable from the extraction] VSNA = 0.34; Lumosity: SPM-reduced = 0.37; VSNA = −0.10.

Quiroga et al. (2015). Games: ten Big Brain Academy games and Garden Gridlock (for the Nintendo Wii console) plus Tilt Maze (for computer), grouped in 4 types following game developers’ descriptions: Analyze, Memorize, Compute, and Visualize. All are puzzle games. Video game experience was assessed; selected participants were naïve to the Wii console and the Big Brain Academy video game. N = 188, 67 men, mean age 22.2. Measures: g (general mental ability), Gf (fluid intelligence), Gc (crystallized intelligence), Gv (broad visual perception), Gy (general memory and learning), Gs (processing speed). Reliability (internal consistency): Analyze games = 0.76 to 0.80; Memorize = 0.44 to 0.67; Compute = 0.57 to 0.80; Visualize = 0.71 to 0.95. Correlations at the test level:

               Gf     Gc     Gv     Gy     Gs
  Analyze      0.62   0.40   0.65   0.30   0.55
  Memorize     0.46   0.44   0.32   0.37   0.44
  Compute      0.54   0.47   0.48   0.44   0.46
  Visualize    0.64   0.34   0.66   0.29   0.41

At the latent level: g – video game latent factor = 0.93.

Buford & O’Leary (2015). Game: modified version of Portal 2 (action puzzle game). Previous game experience as well as experience and skill with Portal 2 were assessed. Two samples: an online sample (OS) of 94 (mostly men, mean age 24.8, very high playing experience; only 27 completed the cognitive measures) and an in-person sample (IPS) of 73 (58% women, mean age 19.6, almost no playing experience). Measures: Gf (fluid intelligence), Gc (crystallized intelligence). Results: split-half reliability of 0.92; SPM = 0.46 (IPS); Shipley Block Patterns = 0.49 (IPS); Shipley Vocabulary = 0.30 (IPS); Wonderlic = 0.27 (OS).

Foroughi et al. (2016). Game: a version of Portal 2 developed by the authors (action puzzle game). Measures: Gf (fluid intelligence). Results: reliability (α) = 0.80. [Remainder of row lost in extraction.]

Kranz et al. (2017). Games: six casual games (3 adaptive and 3 non-adaptive; action puzzle games and puzzle and skill games). Ten 20-minute playing sessions (2 to 3 per week). N = 94, 30 males, mean age 21.2. Measures: Gf (fluid intelligence), Gy (general memory and learning), Gs (processing speed). Correlations: Reasoning – adaptive games = 0.60 to 0.74; WM – adaptive games = 0.40 to 0.65; Perceptual Speed – adaptive games = 0.00 to 0.05; [unattributed values in source: 0.37 to 0.38].

Kokkinakis et al. (2017). Games: League of Legends (LoL) and Dota 2 (action real-time strategy, multiplayer online battle arena). All subjects were experienced LoL players who had played a large number (> 100) of both “ranked” and “unranked” matches. N = 56, 51 males, mean age 20.5 years. Measures: Gf (fluid intelligence), Gy (general memory and learning). Correlations: Matrix-WASI II = 0.44; Rotation Span = 0.26; Symmetry Span = 0.12; Operation Span = 0.03.

Kirkegaard (2018). Games: Dota 2, League of Legends, Starcraft II (action real-time strategy – ARTS, MOBA); Counter Strike: Global Offensive and Overwatch (first-person shooters); Counter Strike (tactical shooter); Hearthstone (card game); Super Smash Bros (classic fighting game). Data collected at the country level (N = 195 countries). Measure: g (general mental ability). Result: national IQ and general gaming ability = 0.79.

Lim & Furnham (2018). Games: Taboo (board game) and Portal (action puzzle). N = 112, 101 males, mean age 18.6 years. Measures: Gf (fluid intelligence), Gc (crystallized intelligence). Correlations: Portal (time taken) – APM = −0.61; Taboo (describing) – APM = 0.33.
Figure 26.2 Correlations between the latent factors representing general video
game performance (VG) and the general factor of intelligence (g) from SEM
model with brain games (upper panels) and from SEM model with non-brain
games (bottom panels) (after Quiroga et al., 2015, 2018).
performance factor are closely similar. This opens the door to the design of intelli-
gence assessment batteries using video games (see Figure 26.2).
Regarding Gf, when video games are very novel, raw correlations with cognitive ability are low at the beginning and increase with practice until reaching the 0.65–0.74 range (Kranz et al., 2017; Quiroga et al., 2016; Rabbitt et al., 1989). This increased
correlation demonstrates that video game performance is far from automated across
practice (Ackerman, 1988; Quiroga et al., 2011).
Studies include players and non-players. In this regard, the study by Foroughi and colleagues (2016) shows that previous experience with the game hardly changes the correlation between fluid intelligence and video game performance. New items were designed using the mod that Portal 2 includes (Buford & O’Leary, 2015; Foroughi et al., 2016). Portal 2 consists of chambers containing puzzles to be solved. The mod allows researchers to build their own chambers (each chamber is usually like an item in a test) and thus remove the effect of previous experience on solving the new game.
Note that these results support measurement invariance for video games related to
Gf, or in other words, the video game is measuring the same construct irrespective of
the experience players have.
More recent studies have added assessments of playing habits and self-perceived playing skill in order to identify profiles of video game players across different genres. The first studies simply selected participants without any experience, but this is unfeasible nowadays.
There is a lack of studies regarding predictive validity. The few studies consider-
ing this crucial issue focused on the association between academic success and
playing habits assessed by hours per week devoted to playing (Drummond &
Sauer, 2014; Posso, 2016). Higher scores in the 2012 Program for International
Student Assessment (PISA) were observed in students playing more hours per week.
Specifically, students who played online games almost every day scored 15 points
above the average in math and reading, and 17 points above the average in science.
This advantage was absent in those using social networks. In fact, students using
online social media on a daily basis scored 4 percent lower than the average on math,
reading, and science.
There are no studies relating video game performance and job performance, but Chiang (2010) enumerated ten ways video games might boost occupational achievement, using World of Warcraft (a role-playing [RPG] and massively multiplayer online [MMO] game; see Table 26.1 for details) as the example. The facets listed were leadership, dealing with and learning from failure, teamwork, developing talent, flexibility (learning to improvise), being performance driven, living for challenge, competitiveness, entrepreneurship, and managing information, but all of this still requires formal research.
In conclusion, video game performance correlates with cognitive abilities.
However, more systematic research is required using a clear theoretical framework
regarding the cognitive abilities considered along with the superficial features and
mental requirements of the analyzed video games.
3 This is Hedges’ g, which is equivalent to Cohen’s d but especially suited for small sample sizes in meta-analysis. It estimates effect size correcting for positive bias.
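For readers who want to compute the small-sample correction mentioned in the footnote, the widely used approximation multiplies Cohen's d by the factor J = 1 − 3/(4(n₁ + n₂) − 9). The following is a hedged illustration of that standard formula, not code from the chapter:

```python
def hedges_g(d, n1, n2):
    """Hedges' g via the usual small-sample correction of Cohen's d.

    J shrinks d toward zero; the correction matters most when groups are small.
    """
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # approximate correction factor
    return j * d

# With two groups of 10, d = 0.70 shrinks to roughly 0.67;
# with two groups of 100, the correction is negligible.
small_sample = hedges_g(0.70, 10, 10)
large_sample = hedges_g(0.70, 100, 100)
```

Because J is always slightly below 1, Hedges' g is marginally smaller than the corresponding Cohen's d, which is exactly the positive-bias correction the footnote refers to.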
improvements). However, follow-up data are required for confirming these positive
effects. Furthermore, it is quite possible that children showing higher cognitive
ability levels from the outset are more prone to play. Unfortunately, these research
studies do not measure ability baseline levels.
For real-time strategy (RTS) games, video game experience is related to the set
size that can be followed accurately in a multiple object tracking (MOT) task. Also,
RTS players are less affected by task switching than non-players (Dobrowolski et al.,
2015).
In a study of video game training by Glass and colleagues (2013), involving forty
hours training on Starcraft, a gaming condition that emphasized rapid switching
between multiple sources of information and action (the player commands and
controls two separate bases in multiple battles against two different opponent
bases) led to a large increase (Stroop d = 0.70) in cognitive flexibility compared to
playing The Sims (a life simulator game) for the same amount of time. Interestingly,
an even larger effect (d = 1.44) in cognitive flexibility has been obtained after
training for only two hours with a customized game that requires switching between
competing tasks (Parong et al., 2017).
Table 26.3 Main neural correlates of playing video games (after Palaus et al., 2017; Colom et al., 2012; Martínez et al., 2013)

Space Fortress (shoot ’em up). Method: EEG; ERSPs (event-related spectral perturbations). Finding: frontal alpha power and alpha and delta ERSPs predicted subsequent learning and performance.

Space Fortress (shoot ’em up). Method: fMRI. Finding: changes in functional activity in the superior parietal lobe (SPL).

Professor Layton and the Pandora’s Box (puzzle). Method: resting-state functional connectivity. Finding: resting-state functional connectivity changes in frontal, parietal, and temporal areas.

Professor Layton and the Pandora’s Box (puzzle). Method: MRI-optimized VBM; cortical surface; cortical thickness; white matter integrity. Finding: bilateral volumetric changes in frontal, parietal, and temporal lobes. White matter: volumetric changes in hippocampal cingulum and inferior longitudinal fasciculus.

Super Mario 64 (action adventure). Method: MRI (VBM8 toolbox). Finding: gray-matter increases in right hippocampus, right dorsolateral prefrontal cortex, and bilateral cerebellum; the hippocampal increase was related to changes from egocentric to allocentric navigation.

Visuospatial ability

VG training — Super Mario 64. Method: cortical thickness (FreeSurfer). Finding: increased hippocampal volumes.

VG training — Space Fortress (shoot ’em up). Method: fMRI. Finding: decreased activation in occipitoparietal regions linked to improved visuomotor task performance.

VG experience — hours per week, without specifying types of games. Method: cortical thickness (FreeSurfer). Finding: structural volume enlargements in the right hippocampus.

VG experience — puzzle, action, and role games. Method: MRI (VBM). Finding: entorhinal cortex volume was positively correlated with lifetime experience in logic/puzzle VGs but negatively with action-based role-playing games.

VG experience — expert gamers (more than 8 years playing more than 20 hours/week in the last 6 months). Method: EEG. Finding: earlier N100 latencies in visual pathways.

Attention

VG experience — Halo; Counterstrike; Gears of War; Call of Duty (first-person shooters). Method: MRI (FMRIB Software Library). Finding: in non-gamers, a frontoparietal network of areas showed greater recruitment as attentional demands increased; gamers barely engaged this network as attentional demands increased.

VG experience — action games. Method: steady-state visual evoked potentials. Finding: larger P300 amplitude in VGPs than in NVGPs.

VG training — Space Fortress (shoot ’em up). Method: fMRI (FSL 4.1 and FEAT). Finding: after training, participants showed a reduction of activation of the right middle frontal gyrus, right superior frontal gyrus, and ventral medial prefrontal cortex, while the control group continued to engage these areas.

VG performance — Mario Power Tennis (sports). Method: EEG (spectral analysis of theta and alpha waves). Finding: increment of the midline theta rhythm that increases with practice, and decrease of the parietal alpha wave activity followed by a slow increase.
for the game (co-activated during video game playing). Playing the game may, therefore, feed the interaction between prefrontal and posterior memory-related regions for cognitive control of encoding and retrieval processes when the information stored in the short term is monitored and manipulated within the working-memory system.
Next we discuss three examples related to video game (1) training, (2) experience, and (3) performance.
Kühn and colleagues (2013) analyzed gray-matter volume changes after two months (thirty minutes per day) of practice with Super Mario 64 (an action-adventure game) in young adults with little or no game experience in the past six months and who had not previously played Super Mario 64. The results showed significant increases in gray-matter volume in the right hippocampus, right dorsolateral prefrontal cortex, and bilateral cerebellum. Regarding number of playing hours, Kühn and colleagues (2014) found a positive association between playing hours and cortical thickness in two brain areas that belong to the frontoparietal network. They did not report the genres played and, therefore, their results can be interpreted as a mean brain effect of playing video games in general.
The comparison of individuals who play shooter video games (at least five hours
per week playing video games like Call of Duty, Halo, Counterstrike, or Gears of
War in the previous twelve months) and non-players (less than one hour per week
playing the aforementioned video games in the previous twelve months, but playing
other games such as puzzle, card, or strategy games) has revealed clear differences
between those groups when completing selective attention tasks (Bavelier et al.,
2012). Functional MRI showed higher frontoparietal activation in non-players with
increased attention requirements, whereas this was not the case for experienced
players. Therefore, experienced players seem more efficient in filtering irrelevant
information.
In the third study, Nikolaidis and colleagues (2014) used Space Fortress to analyze whether changes observed in some brain areas while playing predict changes in nontrained working memory tasks. Participants were nonfrequent players (less than four hours per week). Results showed that activity changes in the superior parietal lobe, the paracingulate gyrus, and the precuneus predicted 37 percent of the
What’s Next?
We have seen that people can be ranked according to their video game
performance. This parallels ranking using standardized intelligence tests. The corre-
lation between cognitive ability tests and video game performance is medium to high
at the test level (Baniqued et al., 2013; McPherson & Burns, 2007, 2008; Quiroga
et al., 2016), but the values are extremely high at the latent level (Quiroga et al.,
2015, 2018) (see Figure 26.2). These results apply to quite heterogeneous genres,
from puzzles to MOBAs.
Video games are also useful to test for intelligence in both players and non-players
(they show measurement invariance; Foroughi et al., 2016). If (and only if) video
games show medium cognitive complexity, are relatively consistent, and avoid
transfer, extensive practice does not change their correlation with standard intelli-
gence tests (Quiroga et al., 2009, 2011).
We are now ready to ask the next question: Is it time to use video games for
measuring intelligence and related cognitive abilities?
Yes, it is. We strongly endorse the message contained in the quote that opens this
chapter.
However, several issues must be addressed.
First, psychologists must be involved in the steps required for designing a video
game: content, mechanics, complexity levels, variables to be saved, and scores to be
computed. Commercial video game creators don’t care about the information
researchers and practitioners want.
Using commercial video games for research is inefficient because of time spent
and research assistants needed, and this may explain why researchers do not test
video game performance but rather video game experience (with a questionnaire).
However, as already noted, these two measures tell different stories. Furthermore,
when psychologists are there from the very beginning (i.e., at the development of the
game, as in the case of McPherson & Burns’ 2007, 2008 research), correlations with
paper-and-pencil tests increase because the video game is oriented toward tapping
the cognitive ability of interest.
Second, video games can be designed as adaptive tests by broadening their scope.
Video games can easily include individualized pathways with different endings
depending on the difficulty levels achieved. If a player cannot overcome a certain
difficulty level, an exit pathway can be provided to avoid negative feelings.
Intelligence models may guide these pathways. This double adaptive approach will
allow implementing, in (say) a video game designed to measure fluid reasoning, the
rules and components for inductive reasoning considered by Primi (2014): (1) quanti-
tative pairwise progression; (2) figure addition and subtraction; (3) distribution of three
values in which the elements are instances of a conceptual attribute; (4) attribute
addition; (5) distribution of two values. This may allow the design of criterion-
referenced assessment tools avoiding arbitrary metrics based on normative scores.
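As a sketch of how such a double-adaptive pathway might be wired, consider the following illustrative Python fragment (all names and thresholds here are hypothetical, not taken from any published game): difficulty moves up or down with performance, and repeated failure at a level routes the player to an exit pathway rather than letting frustration build.

```python
def run_adaptive_game(answer_item, n_items=20, n_levels=5, max_fails=2):
    """Minimal sketch of a double-adaptive pathway: difficulty rises after a
    correct item, falls after a miss, and an 'exit pathway' ends the session
    gracefully after repeated failure (avoiding negative feelings)."""
    level, fails, history = 1, 0, []
    for _ in range(n_items):
        correct = answer_item(level)   # player's response to an item at this level
        history.append((level, correct))
        if correct:
            level = min(level + 1, n_levels)
            fails = 0
        else:
            fails += 1
            if fails > max_fails:
                break                  # route the player to an exit pathway
            level = max(level - 1, 1)
    return level, history

# Example: a simulated player who only solves items up to difficulty 3
final_level, log = run_adaptive_game(lambda lvl: lvl <= 3)
```

In a real instrument, the item generator at each level would implement rules such as Primi's inductive-reasoning components listed above, and the stopping level (here, `final_level`) would serve as the criterion-referenced score.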
Third, video games would be a useful way of estimating the average IQ level of populations (Kirkegaard, 2018). They might contribute to assessing otherwise inaccessible groups (video games can be implemented on cell phones) and to analyzing, systematically and in real time, the link between intelligence and health, as suggested by Kokkinakis and colleagues (2017).
Fourth, video games may allow testing for response processes. They can record
the continuous “flow” of behaviors. This would increase ecological validity. In
everyday life settings, the same result can be achieved by using different pathways.
Furthermore, emotions can be manipulated to test their influence (or lack thereof) on cognitive performance. Forgotten Depths (downloadable for free from www.quirogas.net) is a customized game designed for achieving this goal. The game taps working memory with or without an environment that evokes fear. The software provides accuracy and time scores for the primary (processing) and secondary (storage) tasks (finding the exit to each labyrinth and collecting all the required gems, respectively).
Available results show a correlation of 0.70 between standard working memory tasks and video game performance within neutral labyrinths (without emotion – no spiders). However, the correlation decreases within the emotional labyrinths (r = 0.50).
Forgotten Depths also provides data about (1) time invested in “safe” or “risky” places within each labyrinth, (2) number of clicks to exit, (3) number of times spiders killed the player, (4) number of times the player used their weapon, (5) number of spiders killed, and so on. Using these variables, results have shown that fearful people, even when they have the same working memory ability level on the standard tasks, perform worse than non-fearful people on the video game that contains spiders (d = 0.48) because they invest more time in finding the exit of the labyrinth (d = 0.44), although they collect the same amount of gems (achievement measure, d = 0.05). They also stay longer in risky areas than non-fearful players (d = −0.54), and spiders bite them more frequently (d = −0.59). Fearful people seem to experience greater levels of fear while solving the game, obtain worse working memory scores, become easily disoriented (more time in risky areas and more clicks to exit), and react more poorly (more spiders bite them).
This customized video game includes a mod for researchers to elaborate their own
labyrinths as needed. Forgotten Depths is a good example of the type of video game
required for measuring cognitive abilities properly.
In closing, systematic research is needed. The available evidence is highly promising, but funding is sorely needed as well. Commercial video games must be replaced with games designed by scientists from the very beginning if video games are to be used in both research and practice.
References
Ackerman, P. L. (1988). Individual differences and skill acquisition. In P. L. Ackerman,
R. J. Sternberg, & R. Glaser (Eds.), Learning and individual differences: Advances
in theory and practice (pp. 165–217). New York: W. H. Freeman and Company.
Adams, D., & Mayer, R. (2012). Examining the connection between dynamic and static
spatial skills and video game performance. Proceedings of the Annual Meeting of
the Cognitive Science Society, 34. https://escholarship.org/uc/item/8vc391r3
Baniqued, P. L., Lee, H., Voss, M. W., Basak, C., Cosman, J. D., DeSouza, S., et al. (2013).
Selling points: What cognitive abilities are tapped by casual video games? Acta
Psychologica, 142, 74–86. http://dx.doi.org/10.1016/j.actpsy.2012.11.009
Bavelier, D., Achtman, R. L., Mani, M., & Föcker, J. (2012). Neural bases of selective
attention in action video game players. Vision Research, 61, 132–143. https://doi
.org/10.1016/j.visres.2011.08.007
Bediou, B., Adams, D. M., Mayer, R. E., Tipton, E., Green, C. S., & Bavelier, D. (2018). Meta-
analysis of action video game impact on perceptual, attentional, and cognitive skills.
Psychological Bulletin, 144(1), 77–110. http://dx.doi.org/10.1037/bul0000130
Bonny, J. W., Castaneda, L. M., & Swanson, T. (2016). Using an international gaming
tournament to study individual differences in MOBA expertise and cognitive skills.
In Proceedings of the SIGCHI conference on human factors in computing systems
(pp. 3473–3484). San José, CA. http://dx.doi.org/10.1145/2858036.2858190
Buford, C. C., & O’Leary, B. J. (2015). Assessment of fluid intelligence utilizing a computer
simulated game. International Journal of Gaming and Computer-Mediated
Simulations, 7, 1–17. http://dx.doi.org/10.4018/IJGCMS.2015100101
Chiang, O. (2010). Ten ways games can boost your careers. www.forbes.com/2010/07/19/
career-leadership-strategy-technology-videogames_slide.html#2b2f83c05cb6
Colom, R., Quiroga, M. A., Solana, A. B., Burgaleta, M., Román, F. J., Privado, J., et al.
(2012). Structural changes after videogame practice related to a brain network
associated with intelligence. Intelligence, 40, 479–489.
Colom, R., & Román, F. J. (2018). Enhancing intelligence. From the group to the individual.
Journal of Intelligence, 6 (1), 11. https://doi.org/10.3390/jintelligence6010011
Dobrowolski, P., Hanusz, K., Sobczyk, B., Skorko, M., & Wiatrow, A. (2015). Cognitive
enhancement in video game players: The role of video game genre. Computers in
Human Behavior, 44, 59–63. http://dx.doi.org/10.1016/j.chb.2014.11.051
Drummond, A., & Sauer, J. D. (2014). Video-games do not negatively impact adolescent
academic performance in science, mathematics or reading. PLoS One, 9(4), e87943.
Ekstrom, R. B., French, J. W., & Harman, H. H. (1976). Manual for Kit of Factor-Referenced
Cognitive Tests. Princeton, NJ: Educational Testing Service.
Eysenck, H. J. (1994). Meta-analysis and its problems. British Medical Journal, 309, 789–792.
Foroughi, C. K., Serraino, C., Parasuraman, R., & Boehm-Davis, A. (2016). Can we create
a measure of fluid intelligence using Puzzle Creator within Portal 2? Intelligence,
56, 58–64. http://dx.doi.org/10.1016/j.intell.2016.02.011
Glass, B. D., Maddox, W. T., & Love, B. C. (2013). Real time strategy game training:
Emergence of a cognitive flexibility trait. PLoS One, 8(8), e70350. http://dx
.doi.org/10.1371/journal.pone.0070350
Gnambs, T., & Appel, M. (2017). Is computer gaming associated with cognitive abilities?
A population study among German adolescents. Intelligence, 61, 19–28. http://dx
.doi.org/10.1016/j.intell.2016.12.004
Martínez, K., Solana, A. B., Burgaleta, M., Hernández-Tamames, J. A., Alvarez-Linera, J.,
Román, F. J., et al. (2013). Changes in resting-state functionally connected parieto-
frontal networks after videogame practice. Human Brain Mapping, 34, 3143–3157.
http://dx.doi.org/10.1002/hbm.22129
Martinovic, D., Ezeife, C. I., Whent, R., Reed, J., Burgess, G. H., Pomerleau, C. M., et al.
(2014). “Critic-proofing” of the cognitive aspects of simple games. Computers and
Education, 72, 132–144. http://dx.doi.org/10.1016/j.compedu.2013.10.017
McGrew, K. S. (2009). CHC theory and the human cognitive abilities project: Standing on the
shoulders of the giants of psychometric intelligence research. Intelligence, 37, 1–10.
http://dx.doi.org/10.1016/j.intell.2008.08.004
McPherson J., & Burns N. R. (2007). Gs invaders: Assessing a computer game-like test of
processing speed. Behavior Research Methods, 39, 876–883. http://dx.doi.org/10
.3758/BF03192982
McPherson, J., & Burns, N. R. (2008). Assessing the validity of computer-game-like tests of
processing speed and working memory. Behavior Research Methods, 40, 969–981.
http://dx.doi.org/10.3758/BRM.40.4.969
Miyake, A., Friedman, N. P., Emerson, M. J., Witzki, A. H., Howerter, A., & Wager, T. D. (2000).
The unity and diversity of executive functions and their contributions to complex
“frontal lobe” tasks: A latent variable analysis. Cognitive Psychology, 41, 49–100.
Palaus, M., Marron, E. M., Viejo-Sobera, R., & Redolar-Ripoll, D. (2017). Neural basis of
video gaming: A systematic review. Frontiers in Human Neuroscience, 11(248),
1–40. http://dx.doi.org/10.3389/fnhum.2017.00248
Parong, J., Mayer, R. E., Fiorella, L., MacNamara, A., Homer, B. D., & Plass, J. L. (2017).
Learning executive function skills by playing focused video games. Contemporary
Educational Psychology, 51, 141–151. http://dx.doi.org/10.1016/j.cedpsych
.2017.07.002
Posso, A. (2016). Internet usage and educational outcomes among 15-year-old Australian
students. International Journal of Communication, 10, 3851–3876.
Primi, R. (2014). Developing a fluid intelligence scale through a combination of Rasch
modeling and cognitive psychology. Psychological Assessment, 26(3), 774–788.
http://dx.doi.org/10.1037/a0036712
Quiroga, M. A., Aranda, A., Román, F. J., Privado, J., & Colom, R. (2018). Intelligence can be
measured with video games other than “brain-games.” Intelligence, 75, 85–94.
Quiroga, M. A., Escorial, S., Román, F. J., Morillo, D., Jarabo, A., Privado, J., et al. (2015).
Can we reliably measure the general factor of intelligence (g) through commercial
video games? Yes, we can! Intelligence, 53, 1–7. http://dx.doi.org/10.1016/j
.intell.2015.08.004
Quiroga, M. A., Herranz, M., Gómez-Abad, M., Kebir, M., Ruiz, J., & Colom, R. (2009).
Video-games: Do they require general intelligence? Computers and Education, 53,
414–418. http://dx.doi.org/10.1016/j.compedu.2009.02.017
Quiroga, M. A., Román, F. J., Catalán, A., Rodríguez, H., Ruiz, J., Herranz, M., et al. (2011).
Videogame performance (not always) requires intelligence. International Journal of
Online Pedagogy and Course Design, 1, 18–32. http://dx.doi.org/10.4018/ijopcd
.2011070102
Quiroga, M. A., Román, F. J., De la Fuente, J., Privado, J., & Colom, R. (2016). The
measurement of intelligence in the XXI century using video games. Spanish
Journal of Psychology, 19, 1–13.
Rabbitt, P., Banerji, N., & Szymanski, A. (1989). Space Fortress as an IQ test? Predictions of
learning and of practiced performance in a complex interactive video game. Acta
Psychologica, 71, 243–257.
Sajjadi, P., Vlieghe, J., & De Troyer, O. (2017). Exploring the relation between the theory of
multiple intelligences and games for the purpose of player-centered game design.
Electronic Journal of e-Learning, 15(4), 320–334. www.ejel.org/main.home
Sala, G., Tatlidil, K. S., & Gobet, F. (2018). Video game training does not enhance cognitive
ability: A comprehensive meta-analytic investigation. Psychological Bulletin, 144
(2), 111–139. http://dx.doi.org/10.1037/bul0000139
Sedig, K., Haworth, R., & Corridore, M. (2015). Investigating variations in gameplay:
Cognitive implications. International Journal of Computer Games Technology,
Article ID 208247. http://dx.doi.org/10.1155/2015/208247
Shute, V. J., Ventura, M., & Ke, F. (2015). The power of play: The effects of Portal 2 and
Lumosity on cognitive and noncognitive skills. Computers and Education, 80,
58–67. http://dx.doi.org/10.1016/j.compedu.2014.08.013
Spearman, C. (1904). “General intelligence,” objectively determined and measured.
American Journal of Psychology, 15(2), 201–292.
Torre-Tresols, J. J. (2017). Clasificación de géneros de videojuegos [Classification of video
game genres]. Madrid: Laboratory of Intelligence, Faculty of Psychology, Universidad
Complutense de Madrid. www.quirogas.net
Unsworth, N., Redick, T. S., McMillan, B. D., Hambrick, D. Z., Kane, M. J., & Engle, R. W.
(2015). Is playing video games related to cognitive abilities? Psychological Science,
26, 759–774. http://dx.doi.org/10.1177/0956797615570367
Ventura, M., Shute, V. J., Wright, T., & Zhao, W. (2013). An investigation of the validity of the
virtual spatial navigation assessment. Frontiers in Psychology, 4, 852. http://dx
.doi.org/10.3389/fpsyg.2013.00852
West, G. L., Konishi, K., Diarra, M., Benady-Chorney, J., Drisdelle, B. L., Dahmani, L., et al.
(2017). Impact of video games on plasticity of the hippocampus. Molecular
Psychiatry, 23, 1566–1574. http://dx.doi.org/10.1038/mp.2017.155