Analysing students’ academic performance in Higher Education in Spain

  1. Carlos Rivero Rodríguez 1
  2. Cristina del Campo 1
  3. Elena Urquia-Grande 1
  4. Maria-del-Mar Camacho-Miñano 1
  5. David Pascual-Ezama 1

  1 Universidad Complutense de Madrid, Madrid, Spain (ROR 02p0gd045)

Journal:
Educade: revista de educación en contabilidad, finanzas y administración de empresas

ISSN: 2173-478X

Year of publication: 2017

Issue: 8

Pages: 3-19

Type: Article

DOI: 10.12795/EDUCADE.2017.I08.02


Abstract

The development of the European Higher Education Area has driven a process of modernization in many universities. Teaching methodologies have undergone continuous change to meet the demand for high quality, which in turn calls for enhanced learning assessment methodologies. The objective of this study is to analyse students' academic performance, measured through coursework versus the final exam, and to ascertain the factors that could explain the difference between the two. Regression and variance analysis are carried out on the grades of, and the questionnaire responses from, a sample of 298 students taking different subjects at a Spanish university. The results show that there are differences between continuous assessment marks and final examination marks.
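As an illustration of the kind of comparison the abstract describes, the sketch below shows how coursework and final-exam marks could be analysed with a paired test, a regression, and an analysis of variance in Python. It is not the authors' actual analysis: the dataset is synthetic and the variable names (coursework_mark, exam_mark, subject, gender) are hypothetical stand-ins for the grades and questionnaire items mentioned above.

```python
# Minimal sketch, assuming a hypothetical dataset of per-student marks and
# questionnaire factors; not the authors' actual data or model specification.

import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 298  # sample size reported in the abstract

# Synthetic data: one row per student, marks on a 0-10 scale
df = pd.DataFrame({
    "coursework_mark": np.clip(rng.normal(6.5, 1.5, n), 0, 10),
    "subject": rng.choice(["Accounting", "Statistics", "Finance"], n),
    "gender": rng.choice(["female", "male"], n),
})
# Exam marks assumed (for illustration only) to sit somewhat below coursework marks
df["exam_mark"] = np.clip(df["coursework_mark"] - 1.0 + rng.normal(0, 1.8, n), 0, 10)

# 1) Paired comparison: is the mean difference between the two marks non-zero?
t_stat, p_value = stats.ttest_rel(df["coursework_mark"], df["exam_mark"])
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")

# 2) Regression: does the coursework mark predict the final-exam mark once
#    hypothetical questionnaire factors (subject, gender) are controlled for?
model = smf.ols("exam_mark ~ coursework_mark + C(subject) + C(gender)", data=df).fit()
print(model.summary())

# 3) Analysis of variance over the fitted model terms
print(sm.stats.anova_lm(model, typ=2))
```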
