
Article Detail

认知诊断评价中的被试拟合研究

Research on Person-fit in Cognitive Diagnostic Assessment

Submit Time: 2022-05-12
Authors: 喻晓锋 (1); 唐茜 (1); 秦春影 (2); 李喻骏 (1)
Institutes: 1. 江西师范大学; 2. 南昌师范学院

Abstracts

Cognitive diagnosis typically evaluates examinees through a cognitive diagnostic model, and the validity of the diagnostic results produced by such a model depends on whether the examinees' item responses fit the chosen model. When evaluating diagnostic results, person-fit analysis is therefore needed to test how well an individual examinee's responses fit the model, so that incorrect or ineffective remedial measures can be avoided. Based on weighted score residuals, this study proposes R, a new person-fit index for cognitive diagnostic assessment. Simulation results show that the Type I error rate of the R index is stable and that the index has high statistical power for four types of aberrant examinees: random responding, fatigue, sleeping, and creative responding. The R index is also applied to empirical fraction-subtraction data to demonstrate its use in a real test.

[English Abstract] Cognitive Diagnostic Assessment (CDA) has been widely used in educational assessment. It can provide guidance for further learning and teaching by analyzing whether test-takers have mastered the relevant knowledge points or skills.

In psychometrics, statistical methods for assessing the fit between an examinee's item responses and a postulated psychometric model are called person-fit statistics. Person-fit analysis helps verify individual diagnostic results and is mainly used to distinguish aberrant examinees from normal ones. Aberrant response patterns include sleeping behavior, fatigue, cheating, creative responding, random guessing, and cheating combined with random responding, all of which can bias the estimation of an examinee's ability. Person-fit analysis helps researchers identify these aberrant response patterns more accurately, so that aberrantly responding examinees can be removed and the validity of the test improved. Most previous person-fit research has been conducted under the Item Response Theory (IRT) framework, while only a few studies have addressed person-fit under the cognitive diagnostic model (CDM) framework. This study attempts to fill that gap by proposing a new person-fit index, R, based on weighted score residuals.
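The abstract describes R as being built from weighted score residuals, that is, differences between observed item scores and the success probabilities implied by the fitted DINA model, but the exact weighting and standardization used in the paper are not reproduced on this page. The Python sketch below only illustrates the general form of such a residual-based person-fit statistic; the function names, the default uniform weights, and the standardization are illustrative assumptions, not the paper's definition of R.

```python
import numpy as np

def dina_prob(q_row, alpha, guess, slip):
    """DINA success probability: 1 - slip if the examinee masters every
    attribute required by the item, otherwise guess."""
    mastered = np.all(alpha[q_row == 1] == 1)
    return 1 - slip if mastered else guess

def residual_person_fit(responses, Q, alpha_hat, guess, slip, weights=None):
    """Standardized (optionally weighted) score-residual statistic for one
    examinee -- an illustration of the idea, not the paper's exact R index."""
    J = len(responses)
    p = np.array([dina_prob(Q[j], alpha_hat, guess[j], slip[j]) for j in range(J)])
    w = np.ones(J) if weights is None else np.asarray(weights, dtype=float)
    numerator = np.sum(w * (responses - p))             # weighted score residuals
    denominator = np.sqrt(np.sum(w**2 * p * (1 - p)))   # their approximate standard deviation
    return numerator / denominator
```

Under this kind of standardization, values far from zero (for example, |statistic| > 1.96 at the 0.05 level under a normal approximation) would suggest that an examinee's responses do not fit the model.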

To verify the validity of the newly developed index, this study examines the Type I error rate and statistical power of the R index under different test lengths, item discrimination levels, and types of misfitting examinees, and compares it with the existing RCI and lz statistics. The Type I error rate was defined as the proportion of the 1,000 normal response patterns generated from the DINA model that a person-fit statistic flags as aberrant. The controlled conditions of the study were: 1,000 examinees, the DINA model as the cognitive diagnostic model, six attributes, and a fixed Q matrix. Finally, to illustrate the practical value of the person-fit index, the R index is applied to empirical fraction-subtraction data.
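As a rough illustration of the simulation design described above (1,000 examinees, the DINA model, six attributes, flagging at the 0.05 level), a Type I error check for a residual-type statistic might look like the sketch below. The Q matrix, the guessing and slipping parameters, the 30-item test length, and the assumption that the true attribute profiles are known are all illustrative choices, not the study's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)
N, J, K = 1000, 30, 6                    # examinees, items, attributes (test length is illustrative)

Q = rng.integers(0, 2, size=(J, K))      # stand-in Q matrix (the study uses a fixed one)
Q[Q.sum(axis=1) == 0, 0] = 1             # every item must require at least one attribute
guess = np.full(J, 0.2)                  # illustrative guessing parameters
slip = np.full(J, 0.2)                   # illustrative slipping parameters

alpha = rng.integers(0, 2, size=(N, K))  # attribute mastery profiles

# DINA success probabilities: 1 - slip if all required attributes are mastered, else guess.
eta = (alpha @ Q.T) == Q.sum(axis=1)
p = np.where(eta, 1 - slip, guess)

# Normal (model-fitting) response patterns generated from the DINA model.
X = (rng.random((N, J)) < p).astype(int)

# Unweighted standardized score-residual statistic per examinee (illustration, not the paper's R).
z = (X - p).sum(axis=1) / np.sqrt((p * (1 - p)).sum(axis=1))

# Type I error rate: proportion of normal patterns flagged at the nominal 0.05 level (two-sided).
print(f"Empirical Type I error rate: {(np.abs(z) > 1.96).mean():.3f}")
```

Power would be estimated the same way after injecting an aberrant mechanism (for example, replacing part of a pattern with random responses) and counting how often the statistic flags those contaminated examinees.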

The results show that the Type I error rate of the R index is reasonable and stable around the nominal 0.05 level. Regarding statistical power, as item discrimination increases, the power of every index improves across the different types of aberrant examinees. As the number of items increases, power shows an upward trend in most conditions. Across the types of aberrant examinees, the R index performs best for random guessing and cheating with randomness, whereas the lz index performs better for fatigue, sleeping, and creative responding. In the empirical study, the detection rate of aberrant examinees is 4.29%.

As item discrimination and the number of items increase, the power of the R index improves, and its performance is the most robust when item discrimination is low. The R index has high power for aberrant behaviors such as creative responding, random guessing, and cheating with randomness.

"

From: 喻晓锋
DOI: 10.12074/202204.00026
Journal: 心理科学
Recommended citation: 喻晓锋, 唐茜, 秦春影, 李喻骏. (2022). 认知诊断评价中的被试拟合研究. 心理科学. [ChinaXiv:202204.00026]
Version History
[V2] 2022-05-12 16:42:54 chinaXiv:202204.00026V2
[V1] 2022-04-06 08:13:50 chinaXiv:202204.00026v1