A Modified Theory-Based Method For Answer-Correlated Weighting Ordering Theory And Its Application
Abstract
Multiple-choice questions have long been used as objective tests, and their results serve as useful material for analyzing item response theories and conceptual structures. At present, however, participants' incorrect answers to multiple-choice questions are used only to analyze distractors and trick questions, and conceptual structure analyses consider only whether each answer is correct or incorrect when producing visual descriptions of concept relations, failing to account for the information hidden among the different incorrect options. In practice, question designers can construct incorrect options that reveal information about participants' knowledge. Accordingly, this study proposes a new answer-correlated weighted conceptual structure model that enhances the precision of the analysis so that it more accurately reflects participants' learning status.
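The abstract does not spell out how the incorrect options are weighted, so the following is only a minimal Python sketch of the underlying idea: the prerequisite test of classical ordering theory applied to answer-weighted scores. The tolerance value, the `option_weights` table, and the function names are hypothetical illustrations, not the paper's actual model.

```python
import numpy as np

def ordering_relation(scores, tol=0.04):
    """
    Classical ordering theory: item i is taken as prerequisite to item j
    when the proportion of respondents who get j right but i wrong
    (a "violation" of the i -> j ordering) does not exceed `tol`.

    scores : (N, K) array of item scores in [0, 1]; binary 0/1 for the
             classical model, fractional weights for an answer-weighted
             variant.
    Returns a (K, K) boolean matrix R with R[i, j] = True if i -> j.
    """
    n, k = scores.shape
    relation = np.zeros((k, k), dtype=bool)
    for i in range(k):
        for j in range(k):
            if i == j:
                continue
            # weighted "violation": credit on item j without credit on item i
            violation = np.mean(scores[:, j] * (1.0 - scores[:, i]))
            relation[i, j] = violation <= tol
    return relation

# Hypothetical distractor weights: each option of each item maps to a
# partial-credit score reflecting how much knowledge that choice reveals.
option_weights = {
    0: {"A": 1.0, "B": 0.6, "C": 0.2, "D": 0.0},
    1: {"A": 0.0, "B": 1.0, "C": 0.5, "D": 0.0},
}

def weight_responses(choices, weights):
    """Map raw option choices (N, K array of letters) to weighted scores in [0, 1]."""
    n, k = choices.shape
    scored = np.zeros((n, k))
    for item in range(k):
        for resp in range(n):
            scored[resp, item] = weights[item][choices[resp, item]]
    return scored

# Toy usage: 3 respondents, 2 items
choices = np.array([["A", "B"], ["B", "C"], ["C", "D"]])
scores = weight_responses(choices, option_weights)
print(ordering_relation(scores, tol=0.10))
```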
To verify the validity of the answer-correlated weighting ordering structure, this study took ordering theory and modified it through answer weighting to develop answer-correlated weighting ordering theory. A simulation study was conducted to assess the estimation accuracy of the model. Participants' responses were simulated using Ozaki's structured deterministic inputs, noisy "and" gate model for multiple-choice items, and four cognitive attribute structures were used to form the ideal item responses. These four attribute structures, five sample sizes, four answering-error rates, and three test lengths were crossed to create 240 simulation scenarios, and 100 sets of simulated binary responses were generated for each scenario. The results showed that answer-correlated weighting ordering theory achieved the most accurate estimation and produced favorable results under all answering-error conditions.
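As a rough illustration of the crossed design (4 × 5 × 4 × 3 = 240 scenarios, 100 replications each), the sketch below enumerates hypothetical factor levels and uses a heavily simplified DINA-style generator as a stand-in for Ozaki's MC-DINA model; the structure names, sample sizes, error rates, test lengths, and Q-matrix are all assumptions, not the paper's actual settings.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Crossed simulation design described in the abstract:
# 4 attribute structures x 5 sample sizes x 4 answering-error rates
# x 3 test lengths = 240 scenarios, 100 replications each.
# The specific levels below are placeholders, not values from the paper.
structures   = ["linear", "convergent", "divergent", "unstructured"]
sample_sizes = [100, 200, 400, 800, 1600]
error_rates  = [0.05, 0.10, 0.15, 0.20]
test_lengths = [10, 20, 30]

def simulate_responses(n, n_items, error, n_options=4):
    """
    Simplified DINA-style generator (not Ozaki's MC-DINA model):
    examinees who master an item's required attributes choose the key;
    with probability `error` the observed choice is replaced by a random
    option (slipping and guessing collapsed into one rate).
    """
    q = rng.integers(0, 2, size=(n_items, 3))           # toy Q-matrix
    alpha = rng.integers(0, 2, size=(n, 3))              # attribute patterns
    mastery = (alpha @ q.T) >= q.sum(axis=1)              # ideal responses
    key = rng.integers(0, n_options, size=n_items)
    choices = np.where(mastery, key, (key + 1) % n_options)
    flip = rng.random((n, n_items)) < error
    noise = rng.integers(0, n_options, size=(n, n_items))
    return np.where(flip, noise, choices)

scenarios = list(itertools.product(structures, sample_sizes,
                                   error_rates, test_lengths))
assert len(scenarios) == 240

for structure, n, error, n_items in scenarios[:1]:        # first scenario only
    for rep in range(100):                                # 100 replications
        data = simulate_responses(n, n_items, error)
```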