ICILS 2023 Design

Methodology
  • International large-scale sample survey of students, teachers, and school principals
  • Student achievement test allowing monitoring of trends across successive cycles
  • Predominantly quantitative, with qualitative information presented in descriptive country chapters
Method(s)
  • Overall approach to data collection
    • Proctored assessment of student achievement
    • Self-administered surveys for students, teachers, and school principals
  • Specification
    • Cross-sectional
    • Some trend reporting possible with previous cycles
Target population

Students

  • The student target population consisted of all students enrolled in the grade that represented eight years of schooling, counting from the first year of ISCED Level 1, provided that the mean age at the time of testing was at least 13.5 years.
  • Students older than 17 years were not included in the target population.
  • In countries where the average age of students in Grade 8 was less than 13.5 years, Grade 9 was defined as the target grade.

 

Teachers

All teachers teaching regular school subjects to students of the target grade (regardless of the subject or the number of hours taught) during the ICILS testing period who had been employed at the school since the beginning of the school year.

 

Schools

  • The population for the ICILS school survey comprised schools at which target grade students were enrolled.
  • Principals of sampled schools were asked to complete the school questionnaire.
Sample design
Stratified two-stage cluster sampling design, optimized for the student population

Schools selected at the first stage for the student population were also considered sampled for the teacher and school populations.

 

First stage: sampling schools

  • Selection probability proportional to the size of the school (PPS)
  • Optional: stratification of schools according to (demographic) variables of interest (e.g., school type or source of funding, level of urbanization, region of the country), either explicit or implicit
  • Random-start fixed-interval systematic sampling
  • Schools sampled at the same time for field trial and main data collection
  • For each sampled school, two replacement schools were assigned where possible
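
The first-stage procedure can be sketched as follows. This is an illustrative implementation of PPS sampling with a random start and fixed interval, not the actual routine used by the IEA Hamburg sampling team; function and variable names are hypothetical.

```python
import random

def pps_systematic_sample(schools, n_sample):
    """Select n_sample schools with probability proportional to size (PPS),
    using random-start fixed-interval systematic sampling.

    schools: list of (school_id, measure_of_size) pairs, assumed already
    sorted by the stratification variables (implicit stratification).
    """
    total = sum(size for _, size in schools)
    interval = total / n_sample                # fixed sampling interval
    start = random.uniform(0, interval)        # random start in the first interval
    targets = [start + k * interval for k in range(n_sample)]

    selected, cumulative, i = [], 0.0, 0
    for school_id, size in schools:
        cumulative += size
        # a school is selected once for every target point falling in its range
        while i < len(targets) and targets[i] <= cumulative:
            selected.append(school_id)
            i += 1
    return selected
```

Because selection probability grows with the measure of size, larger schools are more likely to be drawn; schools larger than the sampling interval would be selected with certainty.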

 

Second stage (students): class sampling 

Within schools agreeing to participate:

  • Systematic random sampling was used to select one or more classes from the sampled school in the target grade.
  • Each class had an equal selection probability within the school.
  • If a selected class was smaller than half of the average class size, it was combined with one or more other classes into a pseudo-class.
  • In case the school also participated in TIMSS 2023, either different classes were randomly selected for both studies, or students had a two-week break between the assessments.
  • All students within a selected class were asked to participate.
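
The class-sampling step above can be sketched as follows. The pseudo-class handling is a simplified version of the ICILS rule, and all names are hypothetical.

```python
import random

def sample_classes(classes, avg_class_size, n_classes=1):
    """Equal-probability sampling of classes within a participating school.

    classes: dict mapping class_id -> list of student_ids. Classes smaller
    than half the average class size are first merged into a pseudo-class
    (a simplified version of the ICILS rule).
    """
    units, pending = [], []
    for class_id, students in sorted(classes.items()):
        if len(students) < avg_class_size / 2:
            pending.extend(students)           # too small: goes into a pseudo-class
        else:
            units.append((class_id, list(students)))
    if pending:
        if units:                              # attach the small classes to another class
            label, students = units.pop()
            units.append((label + "+pseudo", students + pending))
        else:                                  # school consists only of small classes
            units.append(("pseudo", pending))

    # every sampling unit has the same selection probability within the school
    sampled = random.sample(units, min(n_classes, len(units)))
    # all students in a selected class (or pseudo-class) are asked to participate
    return dict(sampled)
```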

 

Second stage (teachers): teacher sampling

Within schools agreeing to participate, 15–20 teachers were selected from all eligible teachers using systematic random sampling.

 

General notes

  • Sampling of schools was conducted by the sampling team at IEA Hamburg.
  • Within-school sampling procedures were carried out by the national study centers, using the Within-school Sampling Software for Windows (WinW3S) provided by IEA.
Sample size

Intended per country

  • Approx. 3,000 assessed students
  • Approx. 2,600 participating teachers
  • Minimum of 150 schools (in countries with fewer than 150 schools, all available schools were included)
  • In each sampled school, selection of one class and 15 target grade teachers (or all if the number of target grade teachers was less than or equal to 20)
  • Required effective sample size: minimum of 400 students
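
The required effective sample size means precision equivalent to a simple random sample of 400 students; under cluster sampling, the effective size is the actual size divided by the design effect. A minimal sketch of this relationship (the intraclass correlation and cluster size below are illustrative values, not ICILS estimates):

```python
def design_effect(rho, cluster_size):
    """Approximate design effect for cluster sampling: deff = 1 + (b - 1) * rho,
    with average cluster (class) size b and intraclass correlation rho."""
    return 1 + (cluster_size - 1) * rho

def effective_sample_size(n, deff):
    """Effective sample size: the simple-random-sample size of equal precision."""
    return n / deff

# With ~3,000 assessed students per country, the 400-student requirement
# tolerates a design effect of up to 3000 / 400 = 7.5. For example:
deff = design_effect(rho=0.3, cluster_size=20)   # 1 + 19 * 0.3 = 6.7
n_eff = effective_sample_size(3000, deff)        # roughly 448, above the 400 minimum
```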

 

Total achieved

Approximately 132,600 students (CIL), 85,200 students (CT), 61,000 teachers, 5,300 schools (i.e., school principals and ICT coordinators) from 35 education systems (34 countries and one benchmarking entity) participated.

Data collection techniques and instruments

Student CIL and CT assessments  

  • The tests were embedded within modules. In total, there were seven 30-minute CIL modules and four 25-minute CT modules.
  • Each student completed 2 out of 7 CIL modules. The order and selection of the CIL modules were assigned randomly.
  • In countries participating in the CT option, each student completed 2 out of 4 modules following completion of the CIL test and student questionnaire. The order and selection of CT modules were assigned randomly.
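
The rotation above can be sketched as a per-student assignment. The module labels below are placeholders, and the independent random draws are a simplification: the operational design uses a balanced rotation so that each module and module position occurs about equally often across students.

```python
import random

CIL_MODULES = ["CIL-%d" % i for i in range(1, 8)]   # 7 CIL modules (labels hypothetical)
CT_MODULES = ["CT-%d" % i for i in range(1, 5)]     # 4 CT modules (labels hypothetical)

def assign_modules(ct_option=False):
    """Assign a student 2 of the 7 CIL modules in random order; in countries
    taking the CT option, add 2 of the 4 CT modules, also in random order."""
    booklet = random.sample(CIL_MODULES, 2)          # selection and order both random
    if ct_option:
        booklet += random.sample(CT_MODULES, 2)      # CT follows the CIL test
    return booklet
```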

 

Student questionnaire

  • Following the CIL assessment, each student completed a 30-minute student questionnaire.
  • The student questionnaire was used to investigate student engagement with ICT and included questions on students' experience and use of ICT, their attitudes towards the use of computers and ICT, and their background characteristics.

 

Questionnaires for teachers, ICT coordinators, and principals

  • ICILS 2023 also included a teacher questionnaire, which was completed by 15 randomly selected teachers teaching in the target grade at each sampled school.
  • The questions related to teachers' familiarity with ICT, their use of ICT in educational activities and approaches to teaching, their perceptions of ICT in schools, learning to use ICT in teaching, and their background characteristics.
  • ICT coordinators completed a 15-minute questionnaire, which included questions on the school's ICT resources, ICT use in school, ICT technical support, and provisions for professional development in ICT.
  • The principal of each sampled school completed a 15-minute questionnaire, answering questions related to school characteristics, policies, procedures, school leadership for ICT, and ICT priorities. ICILS 2023 also included an optional questionnaire for principals with questions about generative artificial intelligence.

 

National coordinator questionnaires

  • National research coordinators (NRCs) collected data from experts in a national context survey (NCS).
  • The survey was used for gathering information about the structure of the education system and systematic descriptions of policy and practice in the use of ICT in school education.

 

Languages
  • Administration of national study instruments in 50 languages. Of the 35 participating entities, 10 administered the instruments in more than one language.
  • The most common languages
    • English (5 countries)
    • Spanish (3 countries)
    • Russian (3 countries)
    • German (3 countries)
Translation procedures
  • Development of an international version of all assessment instruments in English by the ICILS International Study Center
  • Translation into applicable languages of instruction by national research coordinators (NRCs)
  • Translation verification by linguistic and assessment experts in order to ensure equivalence with the international version
Quality control of operations

Measures during data collection

  • Participating countries and entities were responsible for data collection within their own territories.
  • Standardized survey operation procedures: step-by-step documentation of all operational activities provided with the operation manuals
  • Full-scale field test of all instruments and operational procedures (in each participating country and entity)
  • Provision of software tools for supporting activities (e.g., sampling and tracking classes and students; administering school and teacher questionnaires; documenting scoring reliability; creating and checking data files)
  • Training, e.g., for national research coordinators (NRCs) and their staff, for school coordinators, and test administrators
  • School visits conducted by international quality observers (IQOs) during test administration (at 15 schools per grade and country)
  • Training, e.g., for IQOs on procedures for arranging school visits, interviewing the school coordinator and test administrator, and reporting on these activities
  • Documentation of data collection activities by IQOs to determine whether the ICILS assessment was administered in compliance with the standardized procedures
  • National quality control program by national study centers, with national quality officers (NQOs) visiting 10 percent of the sampled schools (minimum 15 schools) during testing
  • Survey activities questionnaire (SAQ) completed by NRCs

 

Measures during data processing and cleaning

  • Testing of all data cleaning programs with simulated data sets
  • Registering all incoming data and documents in a specific database recording the date of arrival
  • All systematic data recodings were documented.
  • Iterative data cleaning process on each national dataset until all data were consistent and comparable
  • Standardized cleaning process
  • Repetition of data cleaning and comparison of the new data sets with the preceding version
  • Finally, irregularities in data patterns were identified and corrected.
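
The iterative cleaning process described above can be sketched as a fixpoint loop: checks are re-run and documented recodings applied until every check passes. The check and fix rules below are hypothetical placeholders, not the study's actual cleaning rules.

```python
def clean_until_consistent(dataset, checks, fixes, max_rounds=10):
    """Re-run consistency checks and apply the corresponding recodings until
    every check passes, keeping a log of all recodings applied.
    checks/fixes: dicts mapping rule names to check and fix functions."""
    log = []
    for round_no in range(max_rounds):
        failed = [name for name, check in checks.items() if not check(dataset)]
        if not failed:                      # all data consistent and comparable
            return dataset, log
        for name in failed:
            dataset = fixes[name](dataset)  # systematic recoding, documented in log
            log.append((round_no, name))
    raise RuntimeError("dataset still inconsistent after %d rounds" % max_rounds)

# Example with a single hypothetical rule: recode the missing-value code 99.
checks = {"no_missing_code_99": lambda d: 99 not in d["age"]}
fixes = {"no_missing_code_99":
         lambda d: {"age": [None if a == 99 else a for a in d["age"]]}}
clean, log = clean_until_consistent({"age": [13, 99, 14]}, checks, fixes)
```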