Journal Article

Testing for Differential Item Functioning under the 'D'-Scoring Method
Document Type
Journal Articles
Reports - Research
Author
Dimitrov, Dimiter M. (ORCID 0000-0003-1256-4842); Atanasov, Dimitar V. (ORCID 0000-0002-4079-8585)
Source
Educational and Psychological Measurement. Feb 2022 82(1):107-121.
Subject
Test Bias
Methods
Test Items
Scoring
Probability
Computation
Language
English
ISSN
0013-1644
Abstract
This study offers an approach to testing for differential item functioning (DIF) in a recently developed measurement framework referred to as the "D"-scoring method (DSM). Under the proposed approach, called the "P-Z" method of testing for DIF, the item response functions of two groups (reference and focal) are compared by transforming their probabilities of correct item response, estimated under the DSM, into Z-scale normal deviates. Using the linear relationship between such Z-deviates, testing for DIF reduces to testing two basic statistical hypotheses about equal variances and equal means of the Z-deviates for the reference and focal groups. The results from a simulation study support the efficiency (low Type I error and high power) of the proposed "P-Z" method. Furthermore, it is shown that the "P-Z" method is directly applicable in testing for differential test functioning. Recommendations for practical use and future research, including possible applications of the "P-Z" method in an IRT context, are also provided.
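The core transformation described in the abstract can be illustrated with a short sketch. This is not the authors' exact procedure (the paper's test statistics and DSM estimation details are not reproduced here); it merely shows the general idea, under assumed hypothetical data: map each group's probabilities of correct item response to Z-scale normal deviates via the inverse standard normal CDF, then form conventional statistics for the two hypotheses of equal variances and equal means of the Z-deviates.

```python
# Illustrative sketch only: transform probabilities of correct item response
# into Z-scale normal deviates and compute test statistics for the two
# hypotheses (equal variances, equal means) mentioned in the abstract.
# The statistics below (F ratio, Welch t) are generic stand-ins, not the
# specific "P-Z" statistics developed in the article.
import math
from statistics import NormalDist, mean, variance

def p_to_z(probs):
    """Map probabilities (0 < p < 1) to Z-scale normal deviates."""
    inv_cdf = NormalDist().inv_cdf
    return [inv_cdf(p) for p in probs]

def f_statistic(z_ref, z_foc):
    """F ratio of sample variances for H0: equal variances of Z-deviates."""
    return variance(z_ref) / variance(z_foc)

def welch_t(z_ref, z_foc):
    """Welch t statistic for H0: equal means of Z-deviates."""
    n1, n2 = len(z_ref), len(z_foc)
    v1, v2 = variance(z_ref), variance(z_foc)
    return (mean(z_ref) - mean(z_foc)) / math.sqrt(v1 / n1 + v2 / n2)

# Hypothetical item-response probabilities for one item, reference vs. focal
p_ref = [0.35, 0.50, 0.62, 0.74, 0.85]
p_foc = [0.30, 0.44, 0.58, 0.70, 0.82]
z_ref, z_foc = p_to_z(p_ref), p_to_z(p_foc)
print(f_statistic(z_ref, z_foc), welch_t(z_ref, z_foc))
```

A systematic shift between the two sets of Z-deviates (a nonzero mean difference, or unequal variances) would flag the item for DIF under this logic; the article formalizes this through the linear relationship between the groups' Z-deviates.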