PIRLS 2016 Design
Methodology
- International large-scale sample-based assessment of student achievement and survey of the educational context
- Monitors trends in student achievement by reporting results from successive cycles on a common achievement scale
- Predominantly quantitative, with qualitative information presented in descriptive country chapters in the PIRLS 2016 encyclopedia
Method(s)
- Overall approach to data collection
- Proctored assessment of student achievement
- Self-administered surveys for students, parents, teachers, and school principals
- Specification
- Cross-sectional
- Some trend reporting possible with previous cycles
Target population
- All students enrolled in the grade that represented four years of schooling, counting from the first year of ISCED Level 1, provided that the mean age at the time of testing was at least 9.5 years.
- In countries where Grade 4 students were expected to find the PIRLS assessment too difficult, Grade 6 was defined as the target population.
Sample design
Stratified two-stage cluster sampling design
First stage: sampling schools
- Selection probability proportional to the size of the school (PPS)
- Optional: stratification of schools according to (demographic) variables of interest (e.g., school type or source of funding, level of urbanization, region of the country), either explicit or implicit
- Random-start fixed-interval systematic sampling
- Schools sampled at the same time for field trial and main data collection
- For each sampled school, two replacement schools were assigned where possible
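The first-stage procedure above (PPS with a random start and a fixed interval) can be sketched as follows. This is an illustrative simplification, not the actual Statistics Canada / IEA Hamburg implementation; the function name, data layout, and handling of very large schools are assumptions.

```python
import random

def pps_systematic_sample(schools, n_sample, seed=None):
    """Random-start fixed-interval systematic PPS sampling (sketch).

    `schools` is a list of (school_id, enrolment) pairs, assumed to be
    pre-sorted by any implicit stratification variables.
    """
    rng = random.Random(seed)
    total = sum(size for _, size in schools)
    interval = total / n_sample           # fixed sampling interval
    start = rng.uniform(0, interval)      # random start in [0, interval)
    points = [start + k * interval for k in range(n_sample)]

    selected, cum = [], 0
    it = iter(points)
    point = next(it, None)
    for school_id, size in schools:
        cum += size                       # cumulative enrolment
        while point is not None and point < cum:
            # A school's chance of being hit is proportional to its size.
            selected.append(school_id)
            point = next(it, None)
    return selected
```

Note that in this sketch a school whose enrolment exceeds the interval can be selected more than once; operational designs typically treat such "certainty" schools separately.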
Second stage (students): sampling classes
Within schools agreeing to participate:
- Systematic random sampling was used to select one or more classes from the sampled school in the target grade.
- Each class had an equal selection probability within the school.
- If a selected class was smaller than half the average class size, it was combined with one or more other classes to form a pseudo-class.
- All students within a selected class were asked to participate.
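The second-stage rules above can be illustrated with a short sketch: undersized classes are first merged into pseudo-classes, and an equal-probability draw is then made among the resulting sampling units. This is not the actual WinW3S logic; the function name, merge strategy, and data layout are assumptions.

```python
import random

def sample_classes(classes, n_classes=1, avg_class_size=25, seed=None):
    """Within-school class sampling sketch.

    `classes` is a list of (class_id, size) pairs. Classes smaller than
    half the average class size are pooled into pseudo-classes before an
    equal-probability draw; all students in a selected (pseudo-)class
    would then be asked to participate.
    """
    rng = random.Random(seed)
    threshold = avg_class_size / 2

    # Merge undersized classes into pseudo-classes.
    units, pending = [], []
    for cls in classes:
        if cls[1] < threshold:
            pending.append(cls)
            if sum(s for _, s in pending) >= threshold:
                units.append(pending)     # pseudo-class of small classes
                pending = []
        else:
            units.append([cls])
    if pending:                           # attach any leftover small classes
        if units:
            units[-1].extend(pending)
        else:
            units.append(pending)

    # Equal selection probability for every sampling unit in the school.
    return rng.sample(units, min(n_classes, len(units)))
```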
General notes
- Sampling of schools was conducted by Statistics Canada and the sampling team at IEA Hamburg.
- Sampling procedures within schools were carried out by the national study centers, using the Within-school Sampling Software for Windows (WinW3S) provided by IEA.
- All sampling activities were monitored and documented by Statistics Canada and IEA Hamburg staff.
Sample size
Intended per country
- Approx. 4,500 assessed students
- A minimum of 150 schools (in countries with fewer than 150 schools, all available schools were included)
- In each sampled school, selection of one class
- Required effective sample size: minimum of 400 students
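The gap between roughly 4,500 assessed students and a required effective sample size of 400 reflects the design effect of cluster sampling: students in the same class resemble each other, so each additional student adds less information than an independent draw would. A common approximation is deff = 1 + (m − 1)ρ, with m the average cluster (class) size and ρ the intraclass correlation. The values below are illustrative assumptions, not PIRLS-published figures.

```python
def effective_sample_size(n_students, cluster_size, rho):
    """Effective sample size under cluster sampling.

    Uses the standard design-effect approximation
    deff = 1 + (cluster_size - 1) * rho.
    """
    deff = 1 + (cluster_size - 1) * rho
    return n_students / deff

# With illustrative values (class size 25, intraclass correlation 0.3),
# 4,500 clustered students are worth roughly 550 independent ones,
# comfortably above the required minimum of 400.
n_eff = effective_sample_size(4500, 25, 0.3)
```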
Total achieved
Approximately 340,000 students from 61 education systems (50 countries and 11 benchmarking entities) participated.
Data collection techniques and instruments
Student Assessment
- Written format – PIRLS and PIRLS Literacy
- Computer-based format – ePIRLS
- Reading passages/online reading tasks and accompanying questions:
- Multiple-choice (accounting for at least half of the total score points)
- Constructed response
- Achievement items
- 12 item blocks (passages) for PIRLS, 12 item blocks for PIRLS Literacy, and 5 ePIRLS tasks
- 16 test booklets for PIRLS, 16 test booklets for PIRLS Literacy, and 12 task combinations for ePIRLS
- Items per assessment
- PIRLS: 175 items
- PIRLS Literacy: 183 items
- ePIRLS: 91 items
- Linking mechanisms between booklets and between cycles
- PIRLS and PIRLS Literacy have 4 common item blocks/passages
- New PIRLS passages and items in 2016: 4 blocks
- 6 PIRLS blocks are trend passages with (unreleased) items from PIRLS 2001, PIRLS 2006, and PIRLS 2011
- New PIRLS Literacy passages and items in 2016: 6 blocks
- 4 PIRLS Literacy blocks are trend passages and (unreleased) items from prePIRLS 2011
- Matrix sampling of passages/items (rotated test booklet design)
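The rotated booklet idea can be sketched in a few lines: each booklet carries only a subset of blocks, and consecutive booklets share a block, which links all booklets onto a common scale. This is a simplified rotation, not the actual PIRLS 2016 booklet map (which assembles 16 booklets from 12 blocks); the function name and two-block booklets are assumptions.

```python
def rotate_booklets(blocks, blocks_per_booklet=2):
    """Assemble booklets by cyclic rotation of item blocks (sketch).

    Booklet i contains blocks i, i+1, ... (mod n), so adjacent booklets
    overlap in one block and every block appears equally often.
    """
    n = len(blocks)
    return [
        tuple(blocks[(i + j) % n] for j in range(blocks_per_booklet))
        for i in range(n)
    ]
```

Because no student sees all blocks, the shared blocks are what allow achievement on different booklets (and, via trend blocks, different cycles) to be reported on one scale.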
Background questionnaires
- Student questionnaire, in print format; modular design
- Learning to Read survey (home questionnaire), to be completed by students’ parents or primary caregivers, in print format (online in some countries)
- Teacher questionnaire, to be completed by the reading teachers of the assessed classes, in print or online format
- School questionnaire, to be completed by the principal of each school sampled, in print or online format
- ePIRLS student questionnaire, in computer-based format
Curriculum questionnaire
- To be completed by National Research Coordinators (NRCs)
- Modular design
- Online format
Descriptive encyclopedia chapters
- One chapter for each participating entity
- Written by experts from ministries of education, research institutes, or institutions of higher education (in countries and benchmarking entities) based on an internationally agreed-upon outline.
Languages
- Assessment instruments administered in 40 languages for PIRLS, 10 languages for PIRLS Literacy, and 15 languages for ePIRLS.
- Instruments (assessments of achievement and student questionnaires) administered in two or more languages in 24 countries and four benchmarking entities.
- The most common languages were English (17 countries) and Arabic (7 countries)
Translation procedures
- International version of all assessment instruments developed in English by TIMSS & PIRLS International Study Center.
- Instruments then translated and adapted by participating countries into their languages of instruction.
- Translations verified by independent linguistic and assessment experts to ensure their equivalence with the international version.
Quality control of operations
During data collection
- National Research Coordinator (NRC) in each participating country responsible for data collection
- Standardized survey operations procedures: step-by-step documentation of all operational activities provided with manuals
- Full-scale field test of all instruments and operational procedures (in each participating country and benchmarking entity)
- Provision of software tools for supporting activities (e.g., sampling and tracking classes and students, administering school and teacher questionnaires, documenting scoring reliability, creating and checking data files)
- Training of NRCs and their staff, school coordinators, test administrators, etc.
- School visits conducted by international quality control monitors (IQCMs) during test administration (15 schools per country)
- National quality control program
- Survey activities questionnaire (SAQ) to be completed by NRCs
During data processing and cleaning
- Testing of all data cleaning programs with simulated data sets
- Material receipt database
- National adaptation database
- Standardized cleaning process
- Repetition of data cleaning and comparison of new data sets with preceding versions
- Identification of irregularities in data patterns and correction of data errors
Sources - Technical documentation
Other sources