Assessment of Autonomous Learning Skill Through Multi-criteria Analysis for Online ADE Students in Moodle

  1. Laguna Sánchez, Ma del Pilar
  2. de Castro, Mónica
  3. de la Fuente-Cabrero, Concepción
Book:
Innovation, Technology, and Knowledge Management

ISSN: 2197-5698, 2197-5701

Year of publication: 2016

Pages: 197-213

Type: Book chapter

DOI: 10.1007/978-3-319-47949-1_13

Abstract

Quality and innovation are two key aspects of today’s higher education systems. The rapid rise of online education within the European Higher Education Area poses a challenge to maintaining quality in learning processes and creates the need for new methodologies adapted to this mode of education, so that skills acquisition can be assessed in a rigorous and participatory way. Quality guidelines from the European model of higher education recommend attention to two key issues: information management for decision-making processes, and an evaluation system capable of capturing students’ participation as well as their autonomous learning processes. This research presents a methodology for evaluating autonomous online learning using two multi-criteria techniques: the Analytic Hierarchy Process (AHP) and Goal Programming. To validate this evaluation methodology, we have used seven tools provided by Moodle. The research was conducted with a group of 71 students enrolled in two quantitative subjects of an online higher education degree. “Tutorials” and “Solved Exercises” were the best-valued tools for the acquisition of this competence, followed by “Tasks”. The joint assessments have been validated according to the students’ degree of agreement, using an index of closeness. The results suggest that multi-criteria analysis can be very useful for the evaluation of competences in the European Higher Education Area.
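The AHP step mentioned in the abstract can be sketched briefly. The sketch below uses the geometric-mean (row) method to derive priority weights from a reciprocal pairwise comparison matrix and reports Saaty's consistency ratio; the matrix comparing three Moodle tools is purely illustrative and is not the study's data.

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive AHP priority weights from a reciprocal pairwise
    comparison matrix (geometric-mean method) and report
    Saaty's consistency ratio (CR)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    # Geometric mean of each row, normalised to sum to 1.
    gm = A.prod(axis=1) ** (1.0 / n)
    w = gm / gm.sum()
    # Approximate the principal eigenvalue lambda_max.
    lam_max = float((A @ w / w).mean())
    ci = (lam_max - n) / (n - 1)
    # Saaty's random index for n = 1..7 (subset of published values).
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
    cr = ci / ri if ri else 0.0
    return w, cr

# Hypothetical judgements comparing three Moodle tools
# (Tutorials, Solved Exercises, Tasks) on Saaty's 1-9 scale.
A = [[1,   2,   3],
     [1/2, 1,   2],
     [1/3, 1/2, 1]]
w, cr = ahp_weights(A)  # w sums to 1; CR below 0.1 means acceptable consistency
```

A consistency ratio below 0.1 is conventionally taken to mean the judgements are acceptably consistent; the chapter's Goal Programming stage, which aggregates individual weight vectors into a joint assessment, is not reproduced here.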

References

  • Alfalla-Luque, R., Medina-López, C., & Arenas-Márquez, F. J. (2011). A step forward in operations management training: Student visions and their response to different learning environments. Cuadernos de Economía y Dirección de la Empresa, 14, 40–52.
  • ANECA. (2015). Libro Blanco del título de grado en economía y empresa. Retrieved November 10, 2015, from http://www.aneca.es/var/media/150292/libroblanco_economia_def.pdf.
  • Belton, V., & Stewart, E. (2000). Multiple criteria decision analysis. An integrated approach. Norwell, MA: Kluwer Academic Publishers.
  • Bergsmann, E., Schultes, M. T., Winter, P., Schober, B., & Spiel, C. (2015). Evaluation of competence- based teaching in higher education: From theory to practice. Evaluation and Program Planning, 52, 1–9.
  • Beshears, J., Choi, J. J., Laibson, D., & Madrian, B. C. (2008). How are preferences revealed? Journal of Public Economics, 92, 1787–1794.
  • Broadbent, J., & Poon, W. L. (2015). Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. Internet and Higher Education, 27, 1–13.
  • Chao, R., & Chen, Y. (2009). Evaluation of the criteria and effectiveness of distance e-learning with consistent fuzzy preference relations. Expert Systems with Applications, 36, 10657–10662.
  • Collins, A., & Halverson, R. (2009). Rethinking education in the age of technology: The digital revolution and the schools. New York: Teachers College Press.
  • Cukusic, M., Garaca, Z., & Jadric, M. (2014). On line self-assessment and students’ success in higher education institutions. Computers & Education, 71, 100–109.
  • Deschacht, N., & Goeman, K. (2015). The effect of blended learning on course persistence and performance of adult learners: A difference-in-differences analysis. Computers & Education, 87, 83–89.
  • Dias, J. M., & Diniz, J. A. (2013). FuzzyQoI model: A fuzzy logic-based modelling of users’ quality of interaction with a learning management system under blended learning. Computers & Education, 69, 38–59.
  • Domagk, S., Schwartz, R. N., & Plass, J. L. (2010). Interactivity in multimedia learning: An integrated model. Computers in Human Behavior, 26, 1024–1033.
  • Douglas, J., McClelland, R., & Davies, J. (2008). The development of a conceptual model of student satisfaction with their experience in higher education. Quality Assurance in Education, 16, 19–35.
  • Escobar-Rodriguez, T., & Monge-Lozano, P. (2012). The acceptance of Moodle technology by business administration students. Computers & Education, 58, 1085–1093.
  • European Commission. (2015a). The Bologna process and the European higher education area. Retrieved November 10, 2015, from http://ec.europa.eu/education/policy/higher-education/bologna-process_en.htm.
  • European Commission. (2015b). Standards and guidelines for quality assurance in the European Higher Education Area (ESG).
  • Eyvindson, K., Hujala, T., Kangas, A., & Kurttila, M. (2012). Selecting a forest plan among alternatives: Consistency of preferences within decision support frameworks. Forest Policy and Economics, 15, 114–122.
  • Fidalgo-Blanco, A., Sein-Echaluce, M. L., García-Peñalvo, F. J., & Conde, M. A. (2015). Using learning analytics to improve teamwork assessment. Computers in Human Behavior, 47, 149–156.
  • Figueira, J., Greco, S., & Ehrgott, M. (2005). Multiple criteria decision analysis. State of the survey. New York: Springer.
  • González, J., & Wagenaar, R. (2003). Tuning educational structures in Europe. Final report, phase 1. Bilbao: University of Deusto.
  • González-Pachón, J., & Romero, C. (2004). A method for dealing with inconsistencies in pairwise comparisons. European Journal of Operational Research, 158, 351–361.
  • Gregory, R., & Keeney, R. L. (2001). Creating policy alternatives using stakeholder values. Management Science, 40, 1035–1048.
  • Gress, C. L. Z., & Hadwin, A. F. (2010). Advancing educational research on collaboration through the use of gStudy computer-supported collaborative learning (CSCL) tools: Introduction to special issue. Computers in Human Behavior, 26, 785–786.
  • Guerrero, D., Palma, M., & La Rosa, G. (2014). Developing competences in engineering students. The case of project management course. Procedia-Social and Behavioral Sciences, 112, 832–841.
  • Herrera, M. A., & Casado, J. (2015). Interaction analysis of a blog/journal in teaching practice. Internet and Higher Education, 27, 22–43.
  • Hill, F. M. (1995). Managing service quality in higher education: The role of the student as primary consumer. Quality Assurance in Education, 3, 10–21.
  • Hung, M., & Chou, C. (2015). Students’ perceptions of instructors’ roles in blended and online learning environments: A comparative study. Computers & Education, 81, 315–325.
  • Iglesias-Pradas, S., Ruiz-de-Azcárate, C., & Agudo-Peregrina, A. F. (2015). Assessing the suitability of student interactions from Moodle data logs as predictors of cross-curricular competencies. Computers in Human Behavior, 47, 81–89.
  • Khlaisang, J., & Likhitdamrongkiat, M. (2015). E-learning system in blended learning environment to enhance cognitive skills for learners in higher education. Procedia-Social and Behavioral Sciences, 174, 759–767.
  • Klein, J. (2002). The failure of a decision support system: Inconsistency in test grading by teachers. Teaching and Education, 18, 1023–1033.
  • Kwok, R. C. W., Ma, J., Vogel, D., & Zhou, D. (2001). Collaborative assessment in education: An application of a fuzzy GSS. Information & Management, 39, 243–253.
  • Li, H. L., & Ma, L. C. (2007). Detecting and adjusting ordinal and cardinal inconsistencies through a graphical and optimal approach in AHP models. Computers and Operations Research, 34, 780–798.
  • Lin, T. C., Ho, H. P., & Chang, C. T. (2014). Evaluation model for applying an e-learning system in a course: An analytic hierarchy process-Multi-Choice Goal programming approach. Journal of Educational Computing Research, 50(1), 135–157.
  • Martínez-Caro, E., Cegarra-Navarro, J. C., & Cepeda-Carrión, G. (2014). An application on the performance-evaluation model for e-learning quality in higher education. Total Quality Management and Business Excellence, 26, 1–16.
  • Marttunen, M., & Hamalainen, P. (2008). The decision analysis interview approach in the collaborative management of a large regulated water course. Environmental Management, 42, 1026–1042.
  • Nazarenko, A. (2015). Blended learning vs traditional learning: What works? (a case study research). Procedia-Social and Behavioral Sciences, 200, 77–82.
  • Nugaras, J. (2015). The strategic assessment of networking of a higher education institution. Economic Research-Ekonomska Istraživanja, 28, 31–44.
  • OECD. (2014). Skills beyond school. Synthesis Report. OECD Reviews of Vocational Education and Training.
  • Ozkan, S., & Koseler, R. (2009). Multi-dimensional students’ evaluation of e-learning systems in the higher education context: An empirical investigation. Computers & Education, 53, 1285–1296.
  • Pang, J., & Liang, J. (2012). Evaluation of the results of multi-attribute group decision-making with linguistic information. Omega, 40, 294–301.
  • Parkes, M., Stein, S., & Reading, C. (2015). Student preparedness for university e-learning environments. Internet and Higher Education, 25, 1–10.
  • Proctor, W., & Dreschler, M. (2003, February 11–15). Deliberative multi-criteria evaluation: A case study of recreation and tourism options in Victoria Australia. European Society for Ecological Economics, Frontiers 2 Conference, Tenerife.
  • Romero, C. (1991). Handbook of critical issues in Goal Programming. Oxford: Pergamon Press.
  • Roy, B. (1996). Multicriteria methodology for decision aiding. Norwell, MA: Kluwer Academic Publishers.
  • Saaty, T. (2005). The analytic hierarchy and analytic network process for the measurement of intangible criteria and for decision-making. In J. Figueira, S. Greco, & M. Ehrgott (Eds.), Multiple criteria decision analysis. State of the survey. Berlin: Springer.
  • San Martín, S., Jiménez, N., & Jerónimo, E. (2015). The evaluation of the university students in the European Higher Education Area. Aula Abierta, 44(1), 7–14.
  • Selim, H. M. (2007). Critical success factors for e-learning acceptance: Confirmatory factor models. Computers and Education, 49, 396–413.
  • Shee, D. Y., & Yang, Y. S. (2008). Multi-criteria evaluation of the web-based e-learning system: A methodology based on learner satisfaction and its applications. Computers & Education, 50, 894–905.
  • Szczypińska, A., & Piotrowski, E. W. (2009). Inconsistency of the judgement matrix in AHP method and the decision maker’s knowledge. Physica A, 388, 907–915.
  • Tamiz, M., Jones, D., & Romero, C. (1998). Goal programming for decision making: An overview of the current state-of-the-art. European Journal of Operational Research, V-111(3), 569–581.
  • Thune, C. (2005). Standards and guidelines for quality assurance in the European Higher Education Area. Report, European Association for Quality Assurance in the European Higher Education.
  • United Nations. (2016). Competencies for the future. Retrieved February 21, 2016, from https://careers.un.org/lbw/attachments/competencies_booklet_en.pdf.
  • Vaidya, O. S., & Kumar, S. (2006). Analytic hierarchy process: An overview of applications. European Journal of Operational Research, 169, 1–29.
  • Wu, J., Tennyson, R., & Hsia, T. (2010). A study of student satisfaction in a blended e-learning system environment. Computers & Education, 55, 155–164.
  • Zacharis, N. Z. (2015). A multivariate approach to predicting students outcomes in web-enabled blended learning courses. Internet and Higher Education, 27, 44–53.