The use of new technologies in educational evaluations: Reading in a digital world

  1. Javier Suárez-Álvarez, University of Massachusetts Amherst, United States
  2. Rubén Fernández-Alonso, Universidad de Oviedo, Spain
  3. Francisco J. García-Crespo, Universidad Complutense de Madrid, Spain
  4. José Muñiz, Universidad Nebrija, Spain
Journal:
Papeles del Psicólogo

ISSN: 0214-7823, 1886-1415

Year of publication: 2022

Volume: 43

Issue: 1

Pages: 36-47

Type: Article


Abstract

Being a proficient reader in a digital world requires a strong reading foundation, but also the ability to think critically, which is a challenge for many students in Spain. Computerized adaptive tests and process data (information about students’ actions while responding to the test) are especially important when assessing skills such as reading. This work analyzes how the use of technology is changing the concept of reading and the ways of evaluating it, with implications for Spanish students and for any professional in charge of interpreting or designing educational evaluations. The researcher must ensure that the use of data and technology is appropriate for the objectives of the evaluation and that it works in a reliable, valid, and fair way for the people involved; the user, in turn, must know when, how, and for what purposes to use the data.
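The abstract refers to computerized adaptive tests and process data only at a conceptual level. As a purely illustrative aid, and not the procedure used in the article, the Python sketch below shows the core idea under a simple Rasch-model assumption: each item is chosen to be maximally informative at the current ability estimate, ability is re-estimated after every response, and a minimal piece of process data (a simulated response time) is logged alongside each answer. The item bank, the difficulty values, and the run_toy_cat helper are hypothetical.

```python
import math
import random

# Hypothetical item bank: item identifiers mapped to Rasch difficulty parameters.
ITEM_BANK = {f"item_{i:02d}": b for i, b in enumerate(
    [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0, 2.5])}


def p_correct(theta, b):
    """Rasch model: probability of a correct response for ability theta and difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))


def pick_next_item(theta, administered):
    """Choose the unused item with maximum Fisher information at the current theta.

    Under the Rasch model an item's information is p(1 - p), which is largest when
    the item difficulty is closest to the provisional ability estimate.
    """
    candidates = [(name, b) for name, b in ITEM_BANK.items() if name not in administered]
    return max(candidates, key=lambda nb: p_correct(theta, nb[1]) * (1 - p_correct(theta, nb[1])))


def estimate_theta(responses):
    """Crude grid-search maximum-likelihood estimate of theta from (difficulty, score) pairs."""
    grid = [x / 10 for x in range(-40, 41)]

    def log_likelihood(theta):
        return sum(
            math.log(p_correct(theta, b)) if score == 1 else math.log(1.0 - p_correct(theta, b))
            for b, score in responses
        )

    return max(grid, key=log_likelihood)


def run_toy_cat(true_theta=0.7, n_items=6, seed=1):
    """Administer a short adaptive test to a simulated examinee and log simple process data."""
    random.seed(seed)
    theta, administered, responses, event_log = 0.0, set(), [], []
    for _ in range(n_items):
        name, b = pick_next_item(theta, administered)
        administered.add(name)
        # Simulate the response and one piece of "process data": the response time in seconds.
        score = 1 if random.random() < p_correct(true_theta, b) else 0
        response_time = round(random.uniform(15.0, 90.0), 1)
        responses.append((b, score))
        event_log.append({"item": name, "difficulty": b, "score": score,
                          "response_time_s": response_time})
        theta = estimate_theta(responses)  # update the ability estimate after each answer
    return theta, event_log


if __name__ == "__main__":
    theta_hat, log = run_toy_cat()
    print(f"Provisional ability estimate: {theta_hat:.1f}")
    for event in log:
        print(event)
```

Running the script prints the provisional ability estimate and the per-item event log; that action-level record is the kind of raw material on which process-data analyses of test-taking behavior are built.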

References

  • Abad, F. J., Schames, R., Sorrel, M., Nájera, P., García-Garzón, E., Garrido, L. E., y Jiménez, M. (2022). Construyendo tests adaptativos de elección forzosa “on the fly” para la medición de la personalidad. Papeles del Psicólogo, 43(1), 29-35. https://doi.org/10.23923/pap.psicol.2982
  • AERA, APA, y NCME. (2014). Standards for educational and psychological testing (2nd ed.). Washington, DC: American Educational Research Association (AERA).
  • Andrés, J. C., Aguado, D., y de Miguel, J. (2022). ¿Qué hay detrás de LinkedIn? Midiendo a través de rúbricas las LinkedIn Big Four Dimensions. Papeles del Psicólogo, 43(1), 12-20. https://doi.org/10.23923/pap.psicol.2979
  • Breakstone, J., McGrew, S., Smith, M., Ortega, T., y Wineburg, S. (2018). Why we need a new approach to teaching digital literacy. Phi Delta Kappan, 99(6), 27–32. https://doi.org/10.1177/0031721718762419
  • Clinton, V. (2019). Reading from paper compared to screens: A systematic review and meta‐analysis. Journal of Research in Reading, 42(2), 288–325. https://doi.org/10.1111/1467-9817.12269
  • Delgado, P., Vargas, C., Ackerman, R., y Salmerón, L. (2018). Don’t throw away your printed books: A meta-analysis on the effects of reading media on reading comprehension. Educational Research Review, 25, 23–38. https://doi.org/10.1016/j.edurev.2018.09.003
  • Elosua, P. (2022). Impacto de la TIC en el entorno evaluativo. Innovaciones al servicio de la mejora continua. Papeles del Psicólogo, 43(1), 3-11. https://doi.org/10.23923/pap.psicol.2985
  • Ercikan, K., Guo, H., y He, Q. (2020). Use of response process data to inform group comparisons and fairness research. Educational Assessment, 25(3), 179–197. https://doi.org/10.1080/10627197.2020.1804353
  • Fonseca-Pedrero, E., Ródenas, G., Pérez-Albéniz, A., Al-Halabí, S., Pérez, M., y Muñiz, J. (2022). La hora de la evaluación ambulatoria. Papeles del Psicólogo, 43(1), 21-28. https://doi.org/10.23923/pap.psicol.2983
  • Fraillon, J., Ainley, J., Schulz, W., Duckworth, D., y Friedman, T. (2019). IEA International Computer and Information Literacy Study 2018 Assessment Framework. https://doi.org/10.1007/978-3-030-19389-8
  • Goldhammer, F., Martens, T., Christoph, G., y Lüdtke, O. (2016). Test-taking engagement in PIAAC. OECD Education Working Papers, No. 133, OECD Publishing, Paris, https://doi.org/10.1787/5jlzfl6fhxs2-en.
  • He, Q., Borgonovi, F., y Paccagnella, M. (2019). Using process data to understand adults’ problem-solving behaviour in the Programme for the International Assessment of Adult Competencies (PIAAC): Identifying generalised patterns across multiple tasks with sequence mining. OECD Publishing, Paris. https://www.oecd-ilibrary.org/education/using-process-data-to-understand-adults-problem-solving-behaviour-in-the-programme-for-the-international-assessment-of-adult-competencies-piaac_650918f2-en
  • He, Q., Borgonovi, F., y Paccagnella, M. (2021). Leveraging process data to assess adults’ problem-solving skills: Using sequence mining to identify behavioral patterns across digital tasks. Computers & Education, 166, 104170. https://doi.org/10.1016/j.compedu.2021.104170
  • Hernández, A., Elosua, P., Fernández-Hermida, J. R., y Muñiz, J. (2022). Comisión de Test: Veinticinco años velando por la calidad de los test. Papeles del Psicólogo, 43(1), 55-62. https://doi.org/10.23923/pap.psicol.2978
  • INEE. (2019). PISA 2018. Ítems liberados. Secretaría General Técnica. Subdirección General de Atención al Ciudadano, Documentación y Publicaciones. https://sede.educacion.gob.es/publiventa/descarga.action?f_codigo_agc=20232
  • INEE. (2021). Nota país: Lectores del siglo XXI. PISA 2018. Secretaría General Técnica. Subdirección General de Atención al Ciudadano, Documentación y Publicaciones.
  • Jiao, H., He, Q., y Veldkamp, B. P. (2021). Editorial: Process data in educational and psychological measurement. Frontiers in Psychology, 12. https://doi.org/10.3389/fpsyg.2021.793399
  • Macedo-Rouet, M., Salmerón, L., Ros, C., Pérez, A., Stadtler, M., y Rouet, J.-F. (2020). Are frequent users of social network sites good information evaluators? An investigation of adolescents’ sourcing abilities. Journal for the Study of Education and Development, 43(1), 101–138. https://doi.org/10.1080/02103702.2019.1690849
  • McGrew, S., Breakstone, J., Ortega, T., Smith, M., y Wineburg, S. (2018). Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory & Research in Social Education, 46(2), 165–193. https://doi.org/10.1080/00933104.2017.1416320
  • Mullis, I. V. S., y Martin, M. O. (2019). Assessment Frameworks PIRLS 2021. Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Boston College.
  • Muñiz, J., Hernández, A., y Fernández-Hermida, J. R. (2020). Utilización de los test en España: el punto de vista de los psicólogos. Papeles del Psicólogo, 41(1), 1-15. https://doi.org/10.23923/pap.psicol2020.2921
  • OECD. (2011). PISA 2009 Results: Students On Line. PISA, OECD Publishing, Paris. https://doi.org/10.1787/9789264112995-en
  • OECD. (2018). PISA 2021 Mathematics Framework. PISA, OECD Publishing, Paris.
  • OECD. (2019a). Beyond proficiency: Using log files to understand respondent behaviour in the survey of adult skills. OECD Publishing, Paris. https://doi.org/10.1787/0b1414ed-en
  • OECD. (2019b). PISA 2018 Assessment and analytical framework. PISA, OECD Publishing, Paris. https://doi.org/10.1787/b25efab8-en
  • OECD. (2020). Curriculum Overload. PISA, OECD Publishing, Paris. https://doi.org/10.1787/3081ceca-en
  • OECD. (2021a). 21st-Century readers: Developing literacy skills in a digital world. PISA, OECD Publishing, Paris. https://doi.org/10.1787/a83d84cb-en
  • Olea, J., Abad, F. J., y Barrada, J. R. (2010). Tests informatizados y otros nuevos tipos de tests. Papeles del Psicólogo, 31(1), 94–107.
  • Padilla, J. L., y Benítez, I. (2014). Evidencia de validez basada en los procesos de respuesta. Psicothema, 26(1), 136–144. https://doi.org/10.7334/PSICOTHEMA2013.259
  • Paniagua, A., y Istance, D. (2018). Teachers as designers of learning environments. OECD Publishing, Paris. https://doi.org/10.1787/9789264085374-en
  • Pedrosa, I., Suárez-Álvarez, J., y García-Cueto, E. (2014). Evidencias sobre la validez de contenido: Avances teóricos y métodos para su estimación. Acción Psicológica, 10(2), 3. https://doi.org/10.5944/ap.10.2.11820
  • Pedrosa, I., Suárez-Álvarez, J., García-Cueto, E., y Muñiz, J. (2016). A computerized adaptive test for enterprising personality assessment in youth. Psicothema, 28(4). https://doi.org/10.7334/psicothema2016.68
  • Pérez, A., Potocki, A., Stadtler, M., Macedo-Rouet, M., Paul, J., Salmerón, L., y Rouet, J.-F. (2018). Fostering teenagers’ assessment of information reliability: Effects of a classroom intervention focused on critical source dimensions. Learning and Instruction, 58, 53–64. https://doi.org/10.1016/j.learninstruc.2018.04.006
  • Pohl, S., Ulitzsch, E., y von Davier, M. (2021). Reframing rankings in educational assessments. Science, 372(6540), 338–340. https://doi.org/10.1126/science.abd3300
  • Postigo, Á., Cuesta, M., Pedrosa, I., Muñiz, J., y García-Cueto, E. (2020). Development of a computerized adaptive test to assess entrepreneurial personality. Psicologia: Reflexão e Crítica, 33(1), 6. https://doi.org/10.1186/s41155-020-00144-x
  • Santamaría, P. y Sánchez-Sánchez, F. (2022). Cuestiones abiertas en el uso de las nuevas tecnologías en la evaluación psicológica. Papeles del Psicólogo, 43(1), 48-54. https://doi.org/10.23923/pap.psicol.2984
  • Serra-Garcia, M., y Gneezy, U. (2021). Nonreplicable publications are cited more than replicable ones. Science Advances, 7(21). https://doi.org/10.1126/sciadv.abd1705
  • Suárez-Álvarez, J. (2021). Are 15-year-olds prepared to deal with fake news and misinformation? PISA in Focus, No. 113, OECD Publishing, Paris. https://doi.org/10.1787/6ad5395e-en
  • von Davier, A. A., Deonovic, B., Yudelson, M., Polyak, S. T., y Woo, A. (2019). Computational psychometrics approach to holistic learning and assessment systems. Frontiers in Education, 4. https://doi.org/10.3389/feduc.2019.00069
  • von Davier, M., Khorramdel, L., He, Q., Shin, H. J., y Chen, H. (2019). Developments in psychometric population models for technology-based large-scale assessments: An overview of challenges and opportunities. Journal of Educational and Behavioral Statistics, 44(6), 671–705. https://doi.org/10.3102/1076998619881789
  • Vörös, Z., Kehl, D., y Rouet, J.-F. (2021). Task characteristics as source of difficulty and moderators of the effect of time-on-task in digital problem-solving. Journal of Educational Computing Research, 58(8), 1494–1514. https://doi.org/10.1177/0735633120945930
  • Vosoughi, S., Roy, D., y Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. https://doi.org/10.1126/science.aap9559
  • Wise, S. L., Im, S., y Lee, J. (2021). The Impact of Disengaged Test Taking on a State’s Accountability Test Results. Educational Assessment, 26(3), 163–174. https://doi.org/10.1080/10627197.2021.1956897
  • Zenisky, A., y Sireci, S. (2002). Technological Innovations in Large-Scale Assessment. Applied Measurement in Education, 15(4), 337–362.