The Measurement of Intelligence in the XXI Century using Video Games

M. A. Quiroga (1), F. Román (2), J. De La Fuente (1), J. Privado (1), R. Colom (3)

  1. Universidad Complutense (Spain)
  2. University of Illinois at Urbana-Champaign (USA)
  3. Universidad Autónoma de Madrid (Spain)
Journal: The Spanish Journal of Psychology

ISSN: 1138-7416

Year of publication: 2016

Volume: 19

Type: Article

DOI: 10.1017/SJP.2016.84

Open access (publisher version)

Abstract

This paper reviews the use of video games for measuring intelligence differences and reports two studies analyzing the relationship between intelligence and performance on a leisure video game. The first study focused on designing an intelligence test from puzzles in the video game. Forty-seven young participants played “Professor Layton and the curious village”® for a maximum of 15 hours and completed a set of standardized intelligence tests. Results show that the time required to complete the game is related to intelligence differences: the higher the intelligence, the shorter the completion time (d = .91). Furthermore, a set of 41 puzzles showed excellent psychometric properties. The second study, conducted seven years later, confirmed these findings. Finally, we discuss the pros and cons of commercial video games as tools for measuring cognitive abilities, underscoring that psychologists must develop their own intelligence video games, and we delineate the key features of these next-generation measurement devices.
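To make the reported statistics concrete, the sketch below shows, using hypothetical placeholder data (the studies' raw scores are not reproduced here), how an effect size such as the reported d = .91 for completion time and an internal-consistency coefficient for a 41-puzzle score set are typically computed in Python. The function names and sample values are illustrative assumptions, not the authors' materials.

    # Minimal illustrative sketch (Python). All data below are hypothetical
    # placeholders; only the formulas for Cohen's d and Cronbach's alpha
    # follow standard practice.
    import numpy as np

    def cohens_d(group_a, group_b):
        """Cohen's d using the pooled standard deviation."""
        a = np.asarray(group_a, dtype=float)
        b = np.asarray(group_b, dtype=float)
        na, nb = len(a), len(b)
        pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
        return (a.mean() - b.mean()) / np.sqrt(pooled_var)

    def cronbach_alpha(scores):
        """Cronbach's alpha for an (n_participants x n_items) score matrix."""
        x = np.asarray(scores, dtype=float)
        k = x.shape[1]
        return (k / (k - 1)) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

    # Hypothetical completion times (hours): lower- vs. higher-scoring groups.
    time_lower = [14.5, 13.0, 15.0, 12.5, 14.0]
    time_higher = [10.0, 11.5, 9.5, 12.0, 10.5]
    print(f"Cohen's d = {cohens_d(time_lower, time_higher):.2f}")

    # Hypothetical 0/1 pass matrix for 47 players x 41 puzzles (random here,
    # so the printed alpha is not meaningful; it only demonstrates the call).
    puzzle_scores = np.random.default_rng(0).integers(0, 2, size=(47, 41))
    print(f"Cronbach's alpha = {cronbach_alpha(puzzle_scores):.2f}")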
