Usually, ILSAs report results of student achievement by subject (e.g., in TIMSS, for mathematics and science) and, within each subject, at a general content level (e.g., in mathematics, for number, data display, and geometric shapes and measures). This publication, prepared by NCES and AIR staff, introduces the item response theory (IRT)–based domain scoring method for analyzing student achievement at the topic level (within content areas), using data from the TIMSS 2007 and 2011 assessments.
Reporting topic-level scores in ILSAs is beneficial from both a policy and a pedagogical perspective. It allows, for example, the identification of students’ strengths and weaknesses in relation to the intended and implemented curricula within countries, or the comparison of mathematics instruction in specific topic areas (rather than at the general content level) across all participating countries.
One might assume that topic-level performance can only be calculated from cross-sectional data within a single administration. With the IRT-based domain scoring method, however, it is possible to estimate students’ performance on items that were not administered in their own test forms, or that were not administered in the assessment year in which they were tested. The publication also describes several technical conditions that must be met to employ the domain scoring method and emphasizes that researchers conducting secondary analyses can replicate it in a relatively straightforward way.
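The core idea behind estimating performance on items a student never saw can be illustrated with a small sketch: once a student’s ability has been estimated on a common IRT scale, an item response function yields the expected probability of success on any calibrated item, and averaging those probabilities over a topic’s items gives an expected topic-level score. The snippet below is a minimal illustration of this logic, not the publication’s actual procedure; the item parameters, the single ability value standing in for a plausible value, and the use of a 2PL/3PL response function are all assumptions made for demonstration.

```python
import math

def p_correct(theta, a, b, c=0.0):
    """Probability of a correct response under a 2PL (c=0) or 3PL
    item response model: c + (1 - c) / (1 + exp(-a * (theta - b)))."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

def expected_topic_score(theta, items):
    """Expected proportion correct on a topic's items for ability theta.
    Works even for items the student was never administered, as long as
    the items are calibrated on the same IRT scale."""
    return sum(p_correct(theta, **item) for item in items) / len(items)

# Hypothetical calibrated item parameters for one topic
# (a = discrimination, b = difficulty, c = guessing).
topic_items = [
    {"a": 1.2, "b": -0.5},           # constructed-response, 2PL
    {"a": 0.8, "b": 0.3, "c": 0.2},  # multiple-choice, 3PL
    {"a": 1.5, "b": 1.0},            # constructed-response, 2PL
]

# An illustrative ability estimate on the same scale as the items.
theta_hat = 0.4
score = expected_topic_score(theta_hat, topic_items)
```

In operational ILSA analyses the ability estimate would come from plausible values and the expectation would be integrated over the posterior ability distribution, but the monotone mapping from ability to expected topic score shown here is the mechanism that makes cross-form and cross-cycle topic scores comparable.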
To learn more about this publication, please follow the link below: