Differential item functioning analysis of a high-stakes test in terms of statistical regions of Turkey
Yeşim Özer Özkan 1 * , Meltem Acar Güvendir 2
1 Gaziantep University, Faculty of Education, Turkey
2 Trakya University, Faculty of Education, Turkey
* Corresponding Author

Abstract

Large-scale assessments are administered at different grade levels for various purposes, such as identifying student achievement, monitoring the effects of educational reforms on achievement, and supporting assessment, selection, and placement decisions. The tests and items used in education are expected not to function differently for respondents who are at the same ability level but belong to different groups. The purpose of this study was to identify whether the test includes differential item functioning (DIF) and, if it does, whether the flagged items are biased. The Rasch model was applied with the Winsteps package software to determine whether the items contain DIF. The analysis identified DIF in eight items. Expert opinion was sought to determine whether the regional differences in these items were due to item bias or item impact. Based on the experts' feedback, no bias was observed in the items with regard to regions.

References

  • Acar, T. (2011). Sample size in differential item functioning: an application of hierarchical linear modeling. Educational Sciences: Theory & Practice, 11(1), 284-288.
  • Akalın, Ş. (2014). The examination of the KPSS general ability test in terms of item bias [Unpublished Doctoral Dissertation]. Ankara University, Ankara, Turkey.
  • Ardıç, E. Ö., & Gelbal, S. (2017). Cross‐group equivalence of interest and motivation items in PISA 2012 Turkey sample. Eurasian Journal of Educational Research, 68, 221-238. https://doi.org/10.14689/ejer.2017.68.12
  • Bakan Kalaycıoğlu, D., & Kelecioğlu, H. (2011). Item bias analysis of the university entrance examination. Education and Science, 36(161), 3-13.
  • Baykul, Y. (2000). Eğitimde ve psikolojide ölçme: klasik test teorisi ve uygulanması [Measurement and Evaluation in Education and Psychology: Classical test theory and practice]. OSYM Publications.
  • Bekçi, B. (2007). Examining differential item functions of the elementary school student selection and placement examination according to gender and school type [Unpublished Master’s Thesis]. Hacettepe University, Ankara, Turkey.
  • Berberoğlu, G., & Kalender, İ. (2005). Investigation of student achievement across years, school types and regions: The SSE and PISA analyses. Educational Sciences and Practice, 4(7), 21-35.
  • Bond, T. G., & Fox, C. M. (2007). Applying the Rasch model: Fundamental measurement in the human sciences. (2nd Ed.). Lawrence Erlbaum Associates Publishers.
  • Çelik, M., & Özer Özkan, Y. (2020). Analysis of differential item functioning of PISA 2015 mathematics subtest subject to gender and statistical regions. Journal of Measurement and Evaluation in Education and Psychology, 11(3), 283-301. https://doi.org/10.21031/epod.715020
  • Embretson, S. E., & Reise, S. P. (2013). Item response theory for psychologists. Lawrence Erlbaum Associates Publishers.
  • Fincan, B. (2017). An investigation of math subtests item bias in 2014 free boarding and scholarship examination in terms of gender variable [Unpublished Master’s Thesis]. Gaziantep University, Gaziantep, Turkey.
  • Gao, F. (1997). DIMTEST enhancements and some parametric IRT asymptotics [Unpublished Doctoral Dissertation]. University of Illinois at Urbana-Champaign, United States.
  • Gümüş Özyıldırım, F. (2018). Examining TEOG mathematic sub-test exam in terms of differential item functioning based on geographical regions [Unpublished Master’s Thesis]. Hacettepe University, Ankara, Turkey.
  • Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory. Sage Publications.
  • Kan, A., Sünbül, Ö., & Ömür, S. (2013). 6.-8. Sınıf seviye belirleme sınavları alt testlerinin çeşitli yöntemlere göre değişen madde fonksiyonlarının incelenmesi. [Examining differential item functioning of 6th-8th grade Level Determination Examinations (LDE) subtests to various methods] Mersin University Journal of the Faculty of Education, 9(2), 207-222.
  • Karakaya, İ., & Kutlu, Ö. (2012). An investigation of item bias in Turkish subtests in Level Determination Exam. Education and Science, 37(165), 348-362.
  • Karakaya, İ. (2012). An investigation of item bias in science & technology subtests and mathematic subtests in Level Determination Exam (LDE). Educational Sciences: Theory & Practice, 12(1), 215-229.
  • Karami, H. (2012). An introduction to differential item functioning. The International Journal of Educational and Psychological Assessment, 11(2), 59-76.
  • Linacre, J. M. (2002). What do infit and outfit, mean-square and standardized mean? Rasch Measurement Transactions, 16(2), 878.
  • Linacre, J. M. (2009). FACETS Rasch-model computer program (Version 3.66.0) [Computer software]. Winsteps.com.
  • Linacre, J. M. (2010). Winsteps®(Version 3.70.0) [Computer Software]. Winsteps.com.
  • Linacre, J. M. (2012). A User’s Guide to WINSTEPS Rasch-model Computer Programs. MESA Press.
  • Lord, F. M. (1980). Applications of item response theory to practical testing problems. Erlbaum.
  • Ministry of Development (n.d.). İstatistiki Bölge Birimleri Sınıflandırması [Statistical Region Units Classification]. Retrieved January 6, 2018, from http://www3.kalkinma.gov.tr/PortalDesign/PortalControls/WebIcerikGosterim.aspx?Enc=83D5A6FF03C7B4FCC26F032470459B0B
  • Ministry of National Education [MoNE]. (2014). Parasız yatılılık ve bursluluk sınavı (PYBS) başvuru ve uygulama e-Kılavuzu [Free Boarding and Scholarship Exam (FBSE) reference and application e-Guide]. Retrieved October 10, 2016, from http://www.meb.gov.tr/
  • Öğretmen, T., & Doğan, N. (2004). OKÖSYS matematik alt testine ait maddelerin yanlılık analizi [Differential item functioning analysis for the Mathematics subtest of the High Schools Entrance Examination (HSEE)]. İnönü University Journal of the Faculty of Education, 5(8), 61-76.
  • Özmen, D. T. (2014). A study on PISA 2009 reading test items in terms of bias. Educational Sciences and Practice, 13(26), 147-165.
  • Reise, S. P., Widaman, K. F., & Pugh, R. H. (1993). Confirmatory factor analysis and item response theory: Two approaches for exploring measurement invariance. Psychological Bulletin, 114(3), 552-566. https://doi.org/10.1037/0033-2909.114.3.552
  • Robitzsch, A., Kiefer, T., & Wu, M. (2020). TAM: Test analysis modules [R package]. https://CRAN.R-project.org/package=TAM
  • Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2), 1-36. https://doi.org/10.18637/jss.v048.i02
  • Satıcı, K., & Özer Özkan, Y. (2017). Investigation of item bias in 2014 November transition exam from primary to secondary education in terms of gender. Mersin University Journal of the Faculty of Education, 13(1), 254-274. https://doi.org/10.17860/mersinefd.305954
  • Schermelleh-Engel, K., & Moosbrugger, H. (2003). Evaluating the fit of structural equation models: Tests of significance and descriptive goodness-of-fit measures. Methods of Psychological Research Online, 8(2), 23-74.
  • Şenferah, S. (2015). An investigation of differential item functioning and item bias for the mathematics subtest of level determination test in 2010 [Unpublished Doctoral Dissertation]. Gazi University, Ankara, Turkey.
  • Şengül, Ü., Eslemian, S., & Eren, M. (2013). Economic activities of regions of level 2 according to statistical regional units classification (NUTS) in Turkey determining by using DEA and Tobit model application. Journal of Administrative Sciences, 11(21), 75-99.
  • Türkan, A., & Çetin, B. (2017). Study of bias in 2012-placement test through Rasch model in terms of gender variable. Journal of Education and Practice, 8(7), 196-204.
  • Uyar, Ş., & Doğan, N. (2011). An investigation of measurement invariance of learning strategies model across different groups in PISA Turkey sample. International Journal of Turkish Education Science, 3, 30-43.
  • Winsteps (n.d.). Winsteps: Rasch analysis and Rasch measurement software. Retrieved April 13, 2018, from https://www.winsteps.com/winsteps.htm
  • Wright, B. D., & Linacre, J. M. (1994). Reasonable mean-square fit values. Rasch Measurement Transactions, 8(3), 370.
  • Wu, M. L., Adams, R. J., Wilson, M. R., & Haldane, S. A. (2007). ACER ConQuest Version 2: Generalized item response modeling software [Computer software]. Australian Council for Educational Research.
  • Yen, W. M. (1984). Effects of local item dependence on the fit and equating performance of the three-parameter logistic model. Applied Psychological Measurement, 8, 125-145.
  • Yurdugül, H., & Aşkar, P. (2004). The investigation of the student selection and placement examination for secondary education with respect to student settlement region in terms of differential item functioning. Hacettepe University Journal of Education, 27, 268-275.
  • Zwick, R., Thayer, D. T., & Lewis, C. (1999). An empirical Bayes approach to Mantel-Haenszel DIF analysis. Journal of Educational Measurement, 36(1), 1-28. https://doi.org/10.1111/j.1745-3984.1999.tb00543.x

License

This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.