Psychometric evaluation of the "Ser Bachiller 2020-Régimen Costa" applying Differential Item Functioning analysis

Authors

  • Jhon Paul Ajila Sanmartin, Instituto Nacional de Evaluación Educativa
  • Juan Andrés Núñez-Wong, Instituto Nacional de Evaluación Educativa

DOI:

https://doi.org/10.51440/unsch.revistaeducacion.2022.20.224

Keywords:

Differential Item Functioning, Standardized Tests, Mantel-Haenszel, Validity, Bias.

Abstract

This study presents the results of a Differential Item Functioning (DIF) detection analysis of the items of the Ser Bachiller assessment, Régimen Costa, administered in 2020 to high school graduates and to applicants to public universities in Ecuador. DIF was detected with the Mantel-Haenszel method, a well-established procedure used by the Educational Testing Service. The presence of DIF was explored along four axes of analysis: sex of the examinee, area of settlement, ethnic self-identification group, and type of schooling. It is concluded that the items of the Ser Bachiller Costa 2020 are not affected by differential functioning. This supports the hypothesis that the test items applied by Ineval neither favor nor disadvantage specific population groups, so none of the population groups evaluated is advantaged or disadvantaged by the design of the Ser Bachiller assessment.
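For context on the procedure named in the abstract: the Mantel-Haenszel method stratifies examinees on a matching criterion (typically the total test score) and, for each item, pools 2×2 tables of group (reference/focal) by item response (correct/incorrect) across score levels. The sketch below is a minimal, illustrative Python implementation of the standard Holland-Thayer statistics; it is not Ineval's code, and the function name, its arguments, and the use of the raw total score as the matching variable are assumptions made only for this example.

    import numpy as np

    def mantel_haenszel_dif(item, total, group, focal_label):
        """Mantel-Haenszel DIF statistics for one dichotomous item.

        item        : 0/1 scores on the item under study
        total       : matching criterion (e.g., total test score)
        group       : group membership labels
        focal_label : label identifying the focal group
        Returns the MH chi-square, the common odds ratio, and the ETS delta.
        """
        item, total = np.asarray(item), np.asarray(total)
        focal = np.asarray(group) == focal_label

        num_or = den_or = 0.0            # pieces of the common odds ratio
        sum_a = sum_ea = sum_var = 0.0   # pieces of the chi-square statistic

        for k in np.unique(total):       # one 2x2 table per score level
            s = total == k
            a = np.sum(item[s & ~focal] == 1)   # reference group, correct
            b = np.sum(item[s & ~focal] == 0)   # reference group, incorrect
            c = np.sum(item[s & focal] == 1)    # focal group, correct
            d = np.sum(item[s & focal] == 0)    # focal group, incorrect
            t = a + b + c + d
            if t < 2:                    # stratum too small to contribute
                continue
            num_or += a * d / t
            den_or += b * c / t
            n_ref, n_foc = a + b, c + d   # row margins (group sizes)
            m1, m0 = a + c, b + d         # column margins (correct/incorrect)
            sum_a += a
            sum_ea += n_ref * m1 / t                               # E(A_k)
            sum_var += n_ref * n_foc * m1 * m0 / (t**2 * (t - 1))  # Var(A_k)

        chi2 = (abs(sum_a - sum_ea) - 0.5) ** 2 / sum_var  # continuity-corrected
        alpha = num_or / den_or                            # common odds ratio
        delta = -2.35 * np.log(alpha)                      # ETS delta metric
        return chi2, alpha, delta

In an analysis of this kind, each grouping variable (sex, area of settlement, ethnic self-identification, type of schooling) defines its own reference/focal split, and items are conventionally classified in the ETS scheme as A (negligible), B (moderate), or C (large DIF) according to the magnitude and statistical significance of the delta statistic.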

References

AERA-APA-NCME. (2014). Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association, American Psychological Association, & National Council on Measurement in Education.

Agencia de Calidad de la Educación. (2015). Informe Técnico Simce 2013. http://archivos.agenciaeducacion.cl/documentos-web/InformeTecnicoSimce_2013.pdf

Andriola, W. (2002). Detección del funcionamiento diferencial del ítem (DIF) en tests de rendimiento: aportaciones teóricas y metodológicas. Madrid.

Blanco, E. (2007). Eficacia escolar en México: factores escolares asociados a los aprendizajes en la educación primaria. Ciudad de México: FLACSO-Sede México. http://flacsoandes.edu.ec/dspace/handle/10469/1247

Brown, P. J. (1996). Using differential analysis to determine differential item functioning of survey questions (Unpublished doctoral dissertation). University of Illinois at Urbana-Champaign.

Dzul-Garcia, C., & Atar, B. (2020). Investigation of possible item bias on PISA 2015 science items across Chile, Costa Rica and Mexico (Estudio de los posibles sesgos entre los ítems de ciencias de la prueba PISA 2015 de Chile, Costa Rica y México). Culture and Education, 32(3), 470–505. DOI: 10.1080/11356405.2020.1785158

Chávez, C., & Saade, A. (2010). Procedimientos básicos para el análisis de reactivos. México: Centro Nacional de Evaluación para la Educación Superior.

Cho, S.-J., Suh, Y., & Lee, W. (2016). After Differential Item Functioning Is Detected. Applied Psychological Measurement, 40(8), 573–591. doi:10.1177/0146621616664304

Dorans, N. J., & Holland, P. W. (1992). DIF detection and description: Mantel-Haenszel and standardization. ETS Research Report Series, 1992(1), i–40.

Fidalgo, A. M., & Madeira, J. M. (2008). Generalized Mantel-Haenszel methods for differential item functioning detection. Educational and Psychological Measurement, 68(6), 940-958.

García-Medina, A. M., Martínez Rizo, F., & Cordero Arroyo, G. (2016). Análisis del funcionamiento diferencial de los ítems del Excale de Matemáticas para tercero de secundaria. Revista mexicana de investigación educativa, 21(71), 1191-1220.

Hidalgo, M., & Lopez, A. (2004). Differential item functioning detection and effect size: A comparison between logistic regression and Mantel-Haenszel procedures. Educational and Psychological Measurement, 64(6), 903-915.

Holland, P. W., & Thayer, D. T. (1988). Differential item performance and the Mantel-Haenszel procedure. In H. Wainer & H. I. Braun (Eds.), Test validity (pp. 129–145). Hillsdale, NJ: Lawrence Erlbaum Associates

Jiménez, F. (2018). Universidad de Granada. http://www.ugr.es/~jmolinos/files/elaboraciondediagramasdebode.pdf

Jodoin, M. G., & Gierl, M. J. (2001). Evaluating Type I error and power rates using an effect size measure with logistic regression procedure for DIF detection. Applied Measurement in Education, 14, 329–349.

Maddox, B., Zumbo, B. D., Tay-Lim, B., & Qu, D. (2015). An Anthropologist Among the Psychometricians: Assessment Events, Ethnography, and Differential Item Functioning in the Mongolian Gobi. International Journal of Testing, 15(4), 291–309. DOI: 10.1080/15305058.2015.1017103

Mora, T. E. M. (2008). Funcionamiento diferencial del ítem en pruebas de matemática para educación media. Actualidades en psicología, 22(109), 91-113.

Ineval. (2018a). Informe de resultados nacional - Ser bachiller Año lectivo 2017-2018.

Ineval. (2018b). Funcionamiento Diferencial de los Ítems de la prueba Ser Bachiller 2017, según sexo

Ineval. (2019a). Informe de resultados nacional - Ser bachiller Año lectivo 2018-2019.

Ineval. (2019b). La educación en Ecuador: logros alcanzados y nuevos desafíos.

Ineval. (2020). Informe de resultados: Evaluación Costa 2019-2020.

Ineval, Ajila, J., & Levy, E. (2021). Estudio del funcionamiento diferencial de los ítems de la evaluación Ser Bachiller 2018, según las variables sexo, área, autoidentificación étnica y financiamiento. http://evaluaciones.evaluacion.gob.ec

Penfield, R., & Camilli, G. (2007). Test fairness and differential item functioning. Handbook of statistics, 26, 125-167.

Resino, D. A. (2018). Funcionamiento diferencial del ítem por sexo en alumnos de Educación Secundaria. Experiencias educativas en el aula de infantil, primaria y secundaria, 66.

Solano-Flores, G. (2011). Assessing the cultural validity of assessment practices. In Cultural Validity in Assessment (pp. 3–21). New York: Routledge.

Taylor, C., & Lee, Y. (2011). Ethnic DIF in reading tests with mixed item formats. Educational Assessment, 16(1), 35-68.

UNESCO-OREALC. (2013). Situación educativa de América Latina y el Caribe: Hacia la educación de calidad para todos al 2015. Santiago de Chile: UNESCO.

Wedman, J. (2018). Reasons for Gender-related Item Functioning in a College Admissions Test. Scandinavian Journal of Educational Research, 62(6), 959–970. DOI: 10.1080/00313831.2017.1402365

Woitschach, P., & Ortiz, L. (2019). Funcionamiento diferencial del ítem en la evaluación educativa a nivel América Latina y el Caribe.

Zieky, M. (2003). A DIF primer. https://www.ets.org/Media/Tests/PRAXIS/pdf/DIF_primer.pdf

Zwick, R., & Ercikan, K. (1989). Analysis of differential item functioning in the NAEP history assessment. Journal of Educational Measurement, 26, 55-66.

Zumbo, B. D., & Thomas, D. R. (1997). A measure of effect size for a model-based approach for studying DIF. Working Paper of the Edgeworth Laboratory for Quantitative Behavioral Science. Prince George, BC: University of Northern British Columbia.

Published

2022-07-01

How to Cite

Ajila Sanmartin, J. P., & Núñez-Wong, J. A. (2022). Psychometric evaluation of the "Ser Bachiller 2020-Régimen Costa" applying Differential Item Functioning analysis. Education Journal, 20(20), 26–38. https://doi.org/10.51440/unsch.revistaeducacion.2022.20.224