Estado de las prácticas científicas e investigación educativa. Posibles retos para la próxima década [The state of scientific practices and educational research: Possible challenges for the next decade]

Ángeles Blanco-Blanco
Universidad Complutense de Madrid, Madrid, Spain (ROR: 02p0gd045)

Journal: Revista de educación

ISSN: 0034-8082

Year of publication: 2018

Issue: 381

Pages: 207-232

Type: Article

DOI: 10.4438/1988-592X-RE-2017-381-386 (open access)


Abstract

This paper reviews the state of current scientific practices and their potential impact on the quality of educational research. From a postpositivist conception of scientific research in education, the issue is addressed within the general context of the current debate about science and its reliability, robustness, and reproducibility. Theoretically and conceptually, the study adopts a meta-research approach. Methodologically, it carries out a literature review that supports a reasoned reflection on the status quo of scientific practices, drawing on some of the most relevant papers on meta-science published in recent years, both in general and in the field of Education and the Behavioral Sciences. First, the so-called crisis of science is characterized, with particular attention to the presence of biases and questionable research practices in scientific research. Next, some of the key corrective measures proposed to strengthen the scientific enterprise and enable its more effective advancement are presented. These include an alternative, "new" emphasis in the statistical analysis of scientific data; a renewed impetus for replication and reproducibility; and new modes of production, dissemination, and assessment of research associated with open science. The article closes with some reflections on possible challenges for Spanish educational research in the next decade. The conclusions are organized around four axes: the development of meta-research studies; the training, information, and awareness-raising of researchers regarding questionable research practices; the updating of editorial policies; and the role of funders and evaluators of scientific production.
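
To make concrete what this "new" statistical emphasis involves in practice, the sketch below contrasts the single p-value of a classical significance test with the effect-size-and-confidence-interval report advocated in the reviewed literature (e.g., Cumming, 2014; Wasserstein & Lazar, 2016). It is a minimal illustration in Python, not material from the article: the simulated data, sample sizes, and the Hedges-Olkin approximation to the standard error of Cohen's d are assumptions made here for illustration.

    # Minimal sketch (illustrative assumptions, not from the article):
    # contrast a bare p-value with an effect size plus confidence interval.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    control = rng.normal(loc=100, scale=15, size=40)    # simulated scores
    treatment = rng.normal(loc=108, scale=15, size=40)

    # Classical NHST report: a single p-value.
    t_stat, p_value = stats.ttest_ind(treatment, control)

    # "New statistics" report: Cohen's d with an approximate 95% CI.
    n1, n2 = len(treatment), len(control)
    pooled_sd = np.sqrt(((n1 - 1) * treatment.var(ddof=1) +
                         (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2))
    d = (treatment.mean() - control.mean()) / pooled_sd

    # Hedges-Olkin large-sample standard error of d (an assumption here).
    se_d = np.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    z = stats.norm.ppf(0.975)
    print(f"p = {p_value:.4f}")
    print(f"d = {d:.2f}, 95% CI [{d - z * se_d:.2f}, {d + z * se_d:.2f}]")

The point of the second report is that it conveys both the magnitude of the effect and its uncertainty, which a dichotomous "significant/not significant" verdict omits.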


Bibliographic References

  • American Educational Research Association. (2006). Standards for reporting on empirical social science research in AERA publications. Educational Researcher, 35(6), 33–40. doi:10.3102/0013189X035006033.
  • American Psychological Association. (2001). Publication manual of the American Psychological Association (5th ed.). Washington, DC: American Psychological Association.
  • American Psychological Association. (2010). Publication manual of the American Psychological Association (6th ed.). Washington, DC: American Psychological Association.
  • American Psychological Association. (2017, August 1). APA Journals Program collaborates with Center for Open Science to advance open science practices in psychological research. Retrieved from: http://www.apa.org/news/press/releases/2017/08/open-science.aspx.
  • Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature, 533(7604), 452-454. doi:10.1038/533452a
  • Balluerka, N., Gómez, J., et al. (2005). The controversy over null hypothesis significance testing revisited. Methodology: European Journal of Research Methods for the Behavioral and Social Sciences, 1(2), 55-70.
  • Caperos, J. M., & Pardo, A. (2013). Consistency errors in p-values reported in Spanish psychology journals. Psicothema, 25(3), 408-414. doi:10.7334/psicothema2012.207.
  • Cohen, B. H. (2017). Why the resistance to statistical innovations? A comment on Sharpe (2013). Psychological Methods, 22(1), 204-210. doi:10.1037/met0000058.
  • Cumming, G. (2014). The new statistics: Why and how. Psychological Science, 25, 7-29. doi:10.1177/0956797613504966.
  • Fernández-Cano, A., & Fernández-Guerrero, I. (2009). Crítica y alternativas a la significación estadística en el contraste de hipótesis [Criticism of and alternatives to statistical significance in hypothesis testing]. Madrid: La Muralla.
  • García, J., Campos, E., & De la Fuente, L. (2011). The use of the effect size in JCR Spanish journals of Psychology: From theory to fact. The Spanish Journal of Psychology, 14(2), 1050-1055.
  • Goodman, S.N. (2016). Aligning statistical and scientific reasoning. Misunderstanding and misuse of statistical significance impede science. Science, 352 (6290), 1180-1181. doi: 10.1126/science.aaf5406.
  • Harlow, L. L., Mulaik, S. A., & Steiger, J. H. (Eds.). (2016). What if there were no significance tests? Classic Edition. New York: Routledge.
  • Henson, R. K., Hull, D. M., & Williams, C. S. (2010). Methodology in our education research culture: Toward a stronger collective quantitative proficiency. Educational Researcher, 39(3), 229-240. doi:10.3102/0013189X10365102.
  • Ioannidis, J. P., Fanelli, D., Dunne, D. D., & Goodman, S. N. (2015). Meta-research: Evaluation and improvement of research methods and practices. PLoS Biology, 13(10), 1-7. doi:10.1371/journal.pbio.1002264.
  • Ioannidis, J. P., Munafò, M. R., Fusar-Poli, P., Nosek, B. A., & David, S. P. (2014). Publication and other reporting biases in cognitive sciences: Detection, prevalence, and prevention. Trends in Cognitive Sciences, 18(5), 235-241.
  • Izquierdo, I., Olea, J., & Abad, F. J. (2014). Exploratory factor analysis in validation studies: Uses and recommendations. Psicothema, 26(3), 395-400.
  • Ledgerwood, A. (2014). Introduction to the special section on advancing our methods and practices. Perspectives on Psychological Science, 9(3), 275-277. doi:10.1177/1745691614529448.
  • Lopez, X., Valenzuela, J., Nussbaum, M., & Tsai, C. (2015). Some recommendations for the reporting of quantitative studies (Editorial). Computers & Education, 91, 106-110.
  • Makel, M. C., & Plucker, J. A. (2014). Facts are more important than novelty: Replication in the education sciences. Educational Researcher, 43, 304-316. doi:10.3102/0013189X14545513.
  • Makel, M. C., Plucker, J. A., & Hegarty, B. (2012). Replications in psychology research: How often do they really occur? Perspectives on Psychological Science, 7, 537-542. doi:10.1177/1745691612460688.
  • McNutt, M. (2014). Journals unite for reproducibility (Editorial). Science, 346(6210), 679.
  • Morey, R. D., Rouder, J. N., Verhagen, J., & Wagenmakers, E. J. (2014). Why hypothesis tests are essential for psychological science: A comment on Cumming (2014). Psychological Science, 25(6), 1289-1290.
  • Morey, R. D., et al. (2016). The Peer Reviewers' Openness Initiative: Incentivizing open research practices through peer review. Royal Society Open Science, 3, 150547. doi:10.1098/rsos.150547.
  • Mulder, J., & Wagenmakers, E. J. (2016). Editors' introduction to the special issue "Bayes factors for testing hypotheses in psychological research: Practical relevance and new developments". Journal of Mathematical Psychology, 72, 1-5.
  • Munafò, M. R., Nosek, B. A., Bishop, D. V., Button, K. S., Chambers, C. D., du Sert, N. P., ... & Ioannidis, J. P. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1, 0021. doi:10.1038/s41562-016-0021.
  • Nature (2014). Journals unite for reproducibility (Editorial). Nature, 515(7525), 7.
  • Nosek, B. A., et al. (2015). Promoting an open research culture. Science, 348, 1422-1425. doi:10.1126/science.aab2374.
  • Nosek, B. A., & Lakens, D. (2014). Registered reports: A method to increase the credibility of published results. Social Psychology, 45, 137-141. doi:10.1027/1864-9335/a000192.
  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251). doi:10.1126/science.aac4716.
  • Peng, C.-Y., Chen, L.-T., Chiang, H.-M., & Chiang, Y.-C. (2013). The impact of APA and AERA guidelines on effect size reporting. Educational Psychology Review, 25, 157-209. doi:10.1007/s10648-013-9218-2.
  • Peng, R. D. (2015). The reproducibility crisis in science: A statistical counterattack. Significance, 30-32.
  • Perspectives on Psychological Science. (2012). Special section on replicability in psychological science: A crisis of confidence? Retrieved from: http://pps.sagepub.com/content/7/6.toc
  • Perspectives on Psychological Science. (2014). Special section on advancing our methods and practices. Retrieved from: http://journals.sagepub.com/toc/ppsa/9/3.
  • Ruiz-Corbella, M., Galán, A., & Diestro, A. (2014). Las revistas científicas de Educación en España: evolución y perspectivas de futuro [Education journals in Spain: Evolution and future prospects]. RELIEVE, 20(2), art. M1. doi:10.7203/relieve.20.2.436.
  • Savalei, V., & Dunn, E. (2015). Is the call to abandon p-values the red herring of the replicability crisis? Frontiers in Psychology, 6, 245. doi:10.3389/fpsyg.2015.00245.
  • Schweinsberg, M., et al. (2016). The pipeline project: Pre-publication independent replications of a single laboratory's research pipeline. Journal of Experimental Social Psychology, 66, 55-67. doi:10.1016/j.jesp.2015.10.001.
  • Sharpe, D. (2013). Why the resistance to statistical innovations? Bridging the communication gap. Psychological Methods, 18(4), 572. doi:10.1037/a0034177.
  • Sijtsma, K. (2016). Playing with data–Or how to discourage questionable research practices and stimulate researchers to do things right. Psychometrika, 81(1), 1-15. doi:10.1007/s11336-015-9446-0.
  • Sijtsma, K., Veldkamp, C.L.S. & Wicherts, J.M. (2016). Improving the conduct and reporting of statistical analysis in Psychology. Psychometrika, 81(1), 33-38. doi:10.1007/s11336-015-9444-2.
  • Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22, 1359–1366. doi:10.1177/0956797611417632.
  • Smaldino, P. E., & McElreath, R. (2016). The natural selection of bad science. Royal Society Open Science, 3, 160384. doi:10.1098/rsos.160384.
  • Thompson, B. (2008). Computing and interpreting effect sizes, confidence intervals, and confidence intervals for effect sizes. In J. W. Osborne (Ed.), Best practices in quantitative methods (pp. 246-262). London: Sage.
  • Trafimow, D., & Marks, M. (2015). Editorial. Basic and Applied Social Psychology, 37(1), 1-2.
  • Waldman, I. D., & Lilienfeld, S. O. (2015). Thinking about data, research methods, and statistical analyses: Commentary on Sijtsma's (2014) "Playing with data". Psychometrika, 81(1), 16-26. doi:10.1007/s11336-015-9447-z.
  • Warschauer, M., Duncan, G. J., & Eccles, J. S. (2015). Inaugural editorial: What we mean by "open". AERA Open, 1(1), 1-2. doi:10.1177/2332858415574841.
  • Wasserstein, R. L., & Lazar, N. A. (2016). The ASA's statement on p-values: Context, process, and purpose. The American Statistician, 70(2), 129-133. doi:10.1080/00031305.2016.1154108.
  • Wigboldus, D. H. J., & Dotsch, R. (2015). Encourage playing with data and discourage questionable reporting practices. Psychometrika, 81(1), 27-32. doi:10.1007/s11336-015-9445-1.
  • Wilkinson, L., & The Task Force on Statistical Inference. (1999). Statistical methods in psychology journals: Guidelines and explanations. American Psychologist, 54, 594–604. doi:10.1037/0003-066X.54.8.594.