IELS 2025 Design

Methodology
  • International large-scale sample survey of children, their parents and teachers
  • Child assessment allowing monitoring of trends across successive cycles
  • Predominantly quantitative, with qualitative information presented in descriptive country chapters
Method(s)
  • Overall approach to data collection
    • Proctored assessment of children’s skills
    • Self-administered questionnaires for parents and teachers
  • Specification
    • Cross-sectional
    • Some trend reporting possible across cycles
Target population

All children enrolled in centers/schools who are five years old at the time of the assessment

Sample design
Stratified two-stage cluster sampling design, optimized for the child population

First stage: sampling centers/schools

  • Selection probability proportional to the size of the center/school (PPS), i.e., the expected number of eligible children
  • Optional: stratification of centers/schools according to (demographic) variables of interest (e.g., center/school type or source of funding, level of urbanization, region of the country), either explicit or implicit
  • Random-start fixed-interval systematic sampling
  • Centers/schools sampled at the same time for the field trial and the main data collection
  • For each sampled center/school, two replacement centers/schools were assigned where possible
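The first-stage selection method described above (PPS with random-start fixed-interval systematic sampling) can be sketched as follows. This is an illustrative sketch only, not the actual procedure implemented by the sampling team at IEA Hamburg; the function name and data layout are assumptions.

```python
import random

def pps_systematic_sample(schools, n_sample):
    """Illustrative PPS systematic sample: each center/school's selection
    probability is proportional to its size, i.e., the expected number of
    eligible children.

    `schools` is a list of (school_id, size) tuples, assumed to be sorted
    by any implicit stratification variables beforehand.
    """
    total = sum(size for _, size in schools)
    interval = total / n_sample            # fixed sampling interval
    start = random.uniform(0, interval)    # random start in [0, interval)
    targets = [start + k * interval for k in range(n_sample)]

    selected, cumulative, t = [], 0.0, 0
    for school_id, size in schools:
        cumulative += size
        # a school is hit once for every target that falls inside its
        # cumulative size band; schools larger than the interval can be
        # hit more than once and would in practice be certainty selections
        while t < len(targets) and cumulative > targets[t]:
            selected.append(school_id)
            t += 1
    return selected
```

Sorting the frame by stratification variables before drawing the systematic sample is what makes the stratification "implicit": the fixed interval then spreads the sample proportionally across the sorted groups.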

 

Second stage: sampling children

Within centers/schools agreeing to participate:

  • Systematic random sampling was used to select children.
  • Each child had an equal selection probability within the center/school.
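The within-school step (equal-probability systematic random sampling, taking all children in small centers/schools) can be sketched as below. This is a hypothetical illustration; in the study itself this step was carried out by the national study centers with IEA's WinW3S software.

```python
import random

def sample_children(child_ids, n=15):
    """Illustrative equal-probability systematic random sample of children
    within one center/school.

    Returns all children when the center/school has fewer than n eligible
    children; otherwise picks every `interval`-th child after a random start.
    """
    if len(child_ids) <= n:
        return list(child_ids)
    interval = len(child_ids) / n
    start = random.uniform(0, interval)
    return [child_ids[int(start + k * interval)] for k in range(n)]
```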

 

General notes

  • Sampling of centers/schools was conducted by the sampling team at IEA Hamburg.
  • Sampling procedures within the center/school were carried out by the national study centers, using the Within-school Sampling Software for Windows (WinW3S) provided by IEA.
Sample size

Intended per country

  • Approx. 3,000 children
  • A minimum of 200 centers/schools
  • In each sampled center/school, selection of 15 children (or all children in centers/schools with fewer than 15 children)

Total achieved

Forthcoming

Data collection techniques and instruments

Child assessment

  • Children were directly assessed using tablets.
  • A study administrator conducted the assessment with each child individually:
    • 15 minutes per domain
    • Four domains, administered across two assessment days

 

Questionnaires

  • The parent questionnaire was completed either online or on paper and took approximately 20 minutes to complete.
  • The staff questionnaire was completed mainly online; in some countries it was administered on paper. Completion took approximately 10 minutes per child, plus an additional 5 minutes for questions about the staff member themselves.
Languages

In the main study, the assessment instrument and the parent and staff questionnaires were administered in the following languages:

  • Azerbaijani (AZE)
  • Chinese (HCN)
  • Dutch (NLD)
  • English (ENG, MLT, ARE)
  • Flemish (BFL)
  • Korean (KOR)
  • Maltese (MLT)
  • Portuguese (BRA)
  • Russian (AZE)

 

Translation procedures

National versions of all instruments used in the assessment were developed through a double-translation-and-reconciliation procedure:

  • Source materials for all instruments were developed in English.
  • Two independent translators translated the source material into the target language.
  • A third person merged these two versions into a single national version.
  • The translated/adapted versions of the direct assessments and the questionnaires were submitted to cApStAn for translation verification.
Quality control of operations

Measures during data collection

  • National project managers (NPMs) in each participating country were responsible for data collection
  • Standardized survey operation procedures: step-by-step documentation of all operational activities provided with manuals
  • Full-scale field test of all instruments and operational procedures (in each participating country)
  • Appointment of a national data manager (NDM) to oversee and implement all data tasks
  • National Quality Assurance Monitoring Program
  • International Quality Assurance Monitoring Program
  • Training of NPMs, center/school coordinators, study administrators, national quality assurance monitors (NQAMs)
  • Provision of software tools for supporting activities
  • Survey activities questionnaire (SAQ) to be completed by NPMs
  • Double data entry for questionnaires
  • Visits to centers/schools by international quality assurance monitors (IQAMs)

Measures during data processing and cleaning

  • Thorough testing of all data cleaning programs with simulated data sets
  • Material receipt database
  • National adaptation database
  • Standardized, iterative four-step cleaning process
    • Documentation and structure check
    • Identification variable (ID) and linkage cleaning
    • Background cleaning (resolving inconsistencies in questionnaire data)
    • Valid range checks
  • Repetition of data cleaning until all data shown to be consistent and comparable
  • Identification of irregularities in data patterns and correction of data errors
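As an illustration of the "valid range checks" step of the cleaning process, a check of this kind flags every value that falls outside its documented valid range for manual review. The sketch below is hypothetical; the variable names and data layout are assumptions, not the study's actual cleaning programs.

```python
def valid_range_check(records, ranges):
    """Flag values outside their documented valid ranges.

    `records` is a list of dicts (one per respondent); `ranges` maps a
    variable name to an inclusive (low, high) tuple. Returns a list of
    (record_index, variable, value) flags for manual review; missing
    values (None) are skipped, as they are handled by other checks.
    """
    flags = []
    for i, rec in enumerate(records):
        for var, (low, high) in ranges.items():
            value = rec.get(var)
            if value is not None and not (low <= value <= high):
                flags.append((i, var, value))
    return flags
```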