Question type matters
Capturing students' learning approaches in educational assessment
Abstract
Different question types are used to assess students' knowledge, and there is evidence that learning approaches significantly influence students' performance on these questions. This study investigates how different question types capture students' learning approaches. The sample comprised 140 secondary school students. Descriptive, correlational, and path analyses were used. The results showed that the surface approach was negatively related to academic achievement on open-ended questions, but not on multiple-choice questions. In addition, the analyses revealed a mediating effect of academic self-efficacy between learning approaches and academic achievement when achievement was measured with problem-solving questions. However, no indirect effects were found when academic achievement was assessed with multiple-choice or short-answer questions. Open-ended questions captured students' learning approaches more effectively than closed questions. Given the relationship between question types and students' learning approaches, pedagogical implications for designing effective assessments are discussed.
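The abstract refers to path analyses and a mediating effect of academic self-efficacy between learning approaches and achievement; the references point to the regression-based approach of Hayes (2018) and SPSS/AMOS. As a purely illustrative sketch, not the study's actual analysis or data, the indirect effect in a simple mediation model can be estimated with ordinary least squares and a percentile bootstrap; all variable names, coefficients, and the synthetic data below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 140  # sample size matching the study's

# Synthetic illustration only (not the study's data):
# X = deep-approach score, M = academic self-efficacy, Y = open-ended exam score
X = rng.normal(0, 1, n)
M = 0.5 * X + rng.normal(0, 1, n)            # path a (hypothetical)
Y = 0.4 * M + 0.1 * X + rng.normal(0, 1, n)  # paths b and c' (hypothetical)

def ols_slope(x, y):
    """Slope of y regressed on x (with intercept), via least squares."""
    A = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[1]

a = ols_slope(X, M)                        # effect of X on the mediator M
A2 = np.column_stack([np.ones(n), X, M])   # Y on X and M jointly
coef2, *_ = np.linalg.lstsq(A2, Y, rcond=None)
b = coef2[2]                               # effect of M on Y, controlling for X
indirect = a * b                           # indirect (mediated) effect

# Percentile bootstrap CI for the indirect effect (Hayes-style)
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    xb, mb, yb = X[idx], M[idx], Y[idx]
    ab = ols_slope(xb, mb)
    A2b = np.column_stack([np.ones(n), xb, mb])
    cb, *_ = np.linalg.lstsq(A2b, yb, rcond=None)
    boot.append(ab * cb[2])
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect:.3f}, 95% CI [{ci_lo:.3f}, {ci_hi:.3f}]")
```

A mediation effect is supported when the bootstrap confidence interval for the indirect effect excludes zero; the study reports this pattern only for problem-solving questions.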
References
Acar, S., Berthiaume, K., & Johnson, R. (2023). What kind of questions do creative people ask? Journal of Creativity, 33(3), 100062. https://doi.org/10.1016/j.yjoc.2023.100062
Albert Pérez, A. (2017). Evaluación del aprendizaje autorregulado: validación del Motivated Strategies Learning Questionnaire en educación secundaria [Tesis doctoral, Universitat de València]. Repositori d’Objectes Digitals per a l’Ensenyament, la Recerca i la Cultura. http://hdl.handle.net/10550/59163
Alyahyan, E., & Düştegör, D. (2020). Predicting academic success in higher education: literature review and best practices. International Journal of Educational Technology in Higher Education, 17(1), 3. https://doi.org/10.1186/s41239-020-0177-7
Ardura, D., & Galán, A. (2019). The interplay of learning approaches and self-efficacy in secondary school students’ academic achievement in science. International Journal of Science Education, 41(13), 1723–1743. https://doi.org/10.1080/09500693.2019.1638981
Baburajan, V., e Silva, J. D. A., & Pereira, F. C. (2020). Open-ended versus closed-ended responses: A comparison study using topic modeling and factor analysis. IEEE Transactions on Intelligent Transportation Systems, 22(4), 2123-2132. https://doi.org/10.1109/TITS.2020.3040904
Baeten, M., Dochy, F., & Struyven, K. (2008). Students’ approaches to learning and assessment preferences in a portfolio-based learning environment. Instructional Science, 36, 359-374. https://doi.org/10.1007/s11251-008-9060-y
Beck, J. W., & Schmidt, A. M. (2018). Negative Relationships Between Self-Efficacy and Performance Can Be Adaptive: The Mediating Role of Resource Allocation. Journal of Management, 44(2), 555-588. https://doi.org/10.1177/0149206314567778
Becker, W. E., & Johnston, C. (1999). The Relationship between Multiple Choice and Essay Response Questions in Assessing Economics Understanding. Economic Record, 75(4), 348-357. https://doi.org/10.1111/j.1475-4932.1999.tb02571.x
Biggs, J. (1987). Student Approaches to Learning and Studying. Australian Council for Educational Research.
Biggs, J., Kember, D., & Leung, D. Y. P. (2001). The revised two-factor Study Process Questionnaire: R-SPQ-2F. British Journal of Educational Psychology, 71(1), 133-149. https://doi.org/10.1348/000709901158433
Blanco, A., Prieto, L., Torre, J.C., & García, M. (2009). Adaptación, validación y evaluación de la invarianza factorial del cuestionario revisado de procesos de estudio (R-SPQ-2F). En A. Boza (coord.), Actas del IX Congreso Nacional de Modelos de Investigación Educativa sobre ‘Educación, investigación y desarrollo social’ (pp. 1535–1543). Universidad de Huelva.
Bleske‐Rechek, A., Zeug, N., & Webb, R. M. (2007). Discrepant performance on multiple‐choice and short answer assessments and the relation of performance to general scholastic aptitude. Assessment & Evaluation in Higher Education, 32(2), 89-105. https://doi.org/10.1080/02602930600800763
Blunch, N. J. (2013). Introduction to Structural Equation Modeling Using IBM SPSS Statistics and AMOS (2nd ed.). Sage.
Breuer, S., Scherndl, T., & Ortner, T. M. (2023). Effects of response format on achievement and aptitude assessment results: multi-level random effects meta-analyses. Royal Society Open Science, 10(5), 220456. https://doi.org/10.1098/rsos.220456
Bridgeman, B. (1992). A Comparison of Quantitative Questions in Open-Ended and Multiple-Choice Formats. Journal of Educational Measurement, 29(3), 253-271. https://doi.org/10.1111/j.1745-3984.1992.tb00377.x
Bridgeman, B., & Lewis, C. (1994). The Relationship of Essay and Multiple-Choice Scores With Grades in College Courses. Journal of Educational Measurement, 31(1), 37-50. https://doi.org/10.1111/j.1745-3984.1994.tb00433.x
Bridgeman, B., & Morgan, R. (1996). Success in College for Students with Discrepancies between Performance on Multiple-Choice and Essay Tests. Journal of Educational Psychology, 88(2), 333-340. https://doi.org/10.1037/0022-0663.88.2.333
Bunce, D. M., Komperda, R., Schroeder, M. J., Dillner, D. K., Lin, S., Teichert, M. A., & Hartman, J. R. (2017). Differential Use of Study Approaches by Students of Different Achievement Levels. Journal of Chemical Education, 94(10), 1415-1424. https://doi.org/10.1021/acs.jchemed.7b00202
Bush, M. (2001). A Multiple Choice Test that Rewards Partial Knowledge. Journal of Further and Higher Education, 25(2), 157-163. https://doi.org/10.1080/03098770120050828
Cairns, D., & Areepattamannil, S. (2022). Teacher-Directed Learning Approaches and Science Achievement: Investigating the Importance of Instructional Explanations in Australian Schools. Research in Science Education, 52, 1171–1185. https://doi.org/10.1007/s11165-021-10002-0
Chin, C., & Brown, D. E. (2000). Learning in Science: A Comparison of Deep and Surface Approaches. Journal of Research in Science Teaching, 37(2), 109-138. https://doi.org/10.1002/(SICI)1098-2736(200002)37:2<109::AID-TEA3>3.0.CO;2-7
Couch, B. A., Hubbard, J. K., & Brassil, C. E. (2018). Multiple–true–false questions reveal the limits of the multiple–choice format for detecting students with incomplete understandings. BioScience, 68(6), 455-463. https://doi.org/10.1093/biosci/biy037
de la Fuente, J., Pichardo, M. C., Justicia, F., & Berbén, A. (2008). Enfoques de aprendizaje, autorregulación y rendimiento en tres universidades europeas / Learning approaches, self-regulation and achievement in three European universities. Psicothema, 20(4), 705-711. https://www.psicothema.com/pdf/3544.pdf
Delgado, A. R., & Prieto, G. (2003). The effect of item feedback on multiple-choice test responses. British Journal of Psychology, 94(1), 73-85. https://doi.org/10.1348/000712603762842110
DeVore, S., Stewart, J., & Stewart, G. (2016). Examining the effects of testwiseness in conceptual physics evaluations. Physical Review Physics Education Research, 12(2), 020138. https://doi.org/10.1103/PhysRevPhysEducRes.12.020138
Funk, S. C., & Dickson, K. L. (2011). Multiple-Choice and Short-Answer Exam Performance in a College Classroom. Teaching of Psychology, 38(4), 273-277. https://doi.org/10.1177/0098628311421329
Furnham, A., Batey, M., & Martin, N. (2011). How would you like to be evaluated? The correlates of students’ preferences for assessment methods. Personality and Individual Differences, 50(2), 259-263. https://doi.org/10.1016/j.paid.2010.09.040
García, T., Rodríguez, C., Betts, L., Areces, D., & González-Castro, P. (2016). How affective-motivational variables and approaches to learning predict mathematics achievement in upper elementary levels. Learning and Individual Differences, 49, 25-31. https://doi.org/10.1016/j.lindif.2016.05.021
Greving, S., & Richter, T. (2022). Practicing retrieval in university teaching: short-answer questions are beneficial, whereas multiple-choice questions are not. Journal of Cognitive Psychology, 34(5), 657-674. https://doi.org/10.1080/20445911.2022.2085281
Gutiérrez-de-Rozas, B., López-Martín, E., & Carpintero Molina, E. (2022). Condicionantes del rendimiento académico: revisión sistemática de 25 años de meta-análisis. Revista de Educación, 398, 39–85. https://doi.org/10.4438/1988-592X-RE-2022-398-552
Hayes, A. F. (2018). Introduction to Mediation, Moderation, and Conditional Process Analysis: A Regression-Based Approach (2nd ed.). Guilford Press.
Hubbard, J. K., Potts, M. A., & Couch, B. A. (2017). How Question Types Reveal Student Thinking: An Experimental Comparison of Multiple-True-False and Free-Response Formats. CBE-Life Sciences Education, 16(2), ar26. https://doi.org/10.1187/cbe.16-12-0339
IBM Corp. (2020). IBM SPSS Statistics for Windows (27.0) [Computer software]. IBM Corp.
Laird, T. F. N., Seifert, T. A., Pascarella, E. T., Mayhew, M. J., & Blaich, C. F. (2014). Deeply Affecting First-Year Students’ Thinking: Deep Approaches to Learning and Three Dimensions of Cognitive Development. The Journal of Higher Education, 85(3), 402-432. https://doi.org/10.1080/00221546.2014.11777333
Laitinen, S., Christopoulos, A., Laitinen, P., & Nieminen, V. (2024). Relationships between self-efficacy and learning approaches as perceived by computer science students. Frontiers in Education, 9, 1181616. https://doi.org/10.3389/feduc.2024.1181616
Ley Orgánica 8/2013, de 9 de diciembre, para la mejora de la calidad educativa. Boletín Oficial del Estado, 295, de 10 de diciembre de 2013. https://www.boe.es/eli/es/lo/2013/12/09/8/con
Liu, Q., Wald, N., Daskon, C., & Harland, T. (2024). Multiple-choice questions (MCQs) for higher-order cognition: Perspectives of university teachers. Innovations in Education and Teaching International, 61(4), 802-814. https://doi.org/10.1080/14703297.2023.2222715
Martínez-Abad, F., Hernández-Ramos, J. P., Sánchez-Prieto, J. C., Izquerdo-Álvarez, V., del Moral Marcos, M. T., Rivetta, M. S., & Ortíz-López, A. (2024). ¿Innovar en el examen tipo test? La prueba objetiva inversa para mejorar la evaluación sumativa en educación superior. REDU: Revista de Docencia Universitaria, 22(2), 233-250. https://doi.org/10.4995/redu.2024.21752
Pintrich, P. R., Smith, D. A. F., García, T., & McKeachie, W. J. (1991). A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ). National Center for Research to Improve Postsecondary Teaching and Learning.
Radad, K., Taha, M., & Rausch, W. D. (2023). Multiple Choice Questions Versus Very Short Answered Questions in the Evaluation of Students of Veterinary Pathology. Revista Española de Educación Médica, 4(1), 27-35. http://doi.org/10.6018/edumed.548861
Rosander, P., & Bäckström, M. (2014). Personality traits measured at baseline can predict academic performance in upper secondary school three years later. Scandinavian Journal of Psychology, 55(6), 611-618. https://doi.org/10.1111/sjop.12165
Schladitz, S., Ophoff, J., & Wirtz, M. (2017). Effects of different response formats in measuring Educational Research Literacy. Journal for Educational Research Online, 9(2), 137-155. https://doi.org/10.25656/01:14900
Schneider, M., & Preckel, F. (2017). Variables associated with achievement in higher education: A systematic review of meta-analyses. Psychological Bulletin, 143(6), 565-600. https://doi.org/10.1037/bul0000098
Schumacker, R., & Lomax, R. G. (2016). A beginner’s guide to structural equation modelling (4th ed.). Routledge.
Schwarz, G. (2023). Multiple-Choice Questions for Teaching Quantitative Instrumental Element Analysis: A Follow-Up. Journal of Chemical Education, 100(10), 4099-4105. https://doi.org/10.1021/acs.jchemed.3c00061
Scouller, K. (1998). The Influence of Assessment Method on Students’ Learning Approaches: Multiple Choice Question Examination versus Assignment Essay. Higher Education, 35(4), 453-472.
Simkin, M. G., & Kuechler, W. L. (2005). Multiple-Choice Tests and Student Understanding: What Is the Connection? Decision Sciences Journal of Innovative Education, 3(1), 73-98. https://doi.org/10.1111/j.1540-4609.2005.00053.x
Skorbakk, I., & Gamlem, S. M. (2025). Exploring self-assessment practices and learning approaches in science among upper secondary students. Assessment in Education: Principles, Policy & Practice, 32(3), 299-319.
Snyder, A. (2003). The New CPA Exam-Meeting Today’s Challenges: The Revised Exam Simplifies the Process Not Only for Those Taking the Test but Also for Their Employers. Journal of Accountancy, 196(6), 11.
Stanger-Hall, K. F. (2012). Multiple-choice exams: An obstacle for higher-level thinking in introductory science classes. CBE—Life Sciences Education, 11(3), 294–306. https://doi.org/10.1187/cbe.11-11-0100
Thacker, B., Chapagain, G., Pattillo, D., & West, K. (2013). The Effect of Problem Format on Students’ Responses. https://doi.org/10.48550/arXiv.1312.6004
van Wijk, E. V., Janse, R. J., Ruijter, B. N., Rohling, J. H., van der Kraan, J., Crobach, S., de Jonge, de Beaufort, A. J., Dekker, F. W., & Langers, A. M. (2023). Use of very short answer questions compared to multiple choice questions in undergraduate medical students: an external validation study. PLoS ONE, 18(7), e0288558. https://doi.org/10.1371/journal.pone.0288558
Wooten, M. M., Cool, A. M., Prather, E. E., & Tanner, K. D. (2014). Comparison of performance on multiple-choice questions and open-ended questions in an introductory astronomy laboratory. Physical Review Special Topics - Physics Education Research, 10(2). https://doi.org/10.1103/PhysRevSTPER.10.02010
Xiromeriti, M., & Newton, P. M. (2024). Solving not answering. Validation of guidance for writing higher-order multiple-choice questions in medical science education. Medical Science Educator, 34(6), 1469-1477. https://doi.org/10.1007/s40670-024-02140-7
Yonker, J. E. (2011). The relationship of deep and surface study approaches on factual and applied test‐bank multiple‐choice question performance. Assessment & Evaluation in Higher Education, 36(6), 673-686. https://doi.org/10.1080/02602938.2010.481041
York, T. T., Gibson, C., & Rankin, S. (2019). Defining and measuring academic success. Practical Assessment, Research, and Evaluation, 20(1), 5. https://doi.org/10.7275/hz5x-tx03
Zhang, L. F., & Sternberg, R. J. (2000). Are learning approaches and thinking styles related? A study in two Chinese populations. The Journal of Psychology, 134(5), 469-489. https://doi.org/10.1080/00223980009598230
Copyright 2026 Revista de Investigación Educativa

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Works published in this journal are subject to the following terms:
1. The Publications Service of the Universidad de Murcia (the publisher) retains the economic rights (copyright) to the published works, and favours and permits their reuse under the licence for use indicated in point 2.
2. Works are published in the electronic edition of the journal under a Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 Spain licence (legal text). They may be copied, used, disseminated, transmitted, and publicly displayed, provided that: (i) the authorship and the original source of publication are cited (journal, publisher, and URL of the work); (ii) they are not used for commercial purposes; (iii) the existence and terms of this licence are mentioned.
3. Self-archiving conditions. Authors are permitted to disseminate electronically the pre-print (version before peer review) and/or post-print (version reviewed and accepted for publication) versions of their works before publication, as this favours their earlier circulation and dissemination and thereby a possible increase in their citation and reach within the academic community.







