ICILS 2013 Design
- International large-scale sample survey of students, teachers, and school principals
- Student achievement test allowing monitoring of trends across successive cycles
- Predominantly quantitative, with qualitative information presented in descriptive country chapters
- Overall approach to data collection
- Proctored assessment of student achievement
- Self-administered surveys for students, teachers, and school principals
- Specification
- Cross-sectional
Students
- The student target population consisted of all students enrolled in the grade that represented eight years of schooling, counting from the first year of ISCED Level 1, provided that the mean age at the time of testing was at least 13.5 years.
- Students older than 17 years were not included in the target population.
- In countries where the average age of students in Grade 8 was less than 13.5 years, Grade 9 was defined as the target grade.
Teachers
- All teachers teaching regular school subjects to students of the target grade (regardless of the subject or the number of hours taught) during the ICILS testing period who had been employed at the school since the beginning of the school year.
Schools
- The population for the ICILS school survey comprised schools at which target grade students were enrolled.
- Principals of sampled schools were asked to complete the school questionnaire.
Stratified two-stage cluster sampling design - optimized for the student population
Schools selected at the first stage for the student population were also considered sampled for the teacher and school populations.
First stage: sampling schools
- Selection probability proportional to the size of the school (PPS)
- Optional: stratification of schools according to (demographic) variables of interest (e.g., school type or source of funding, level of urbanization, region of the country), either explicit or implicit
- Random-start fixed-interval systematic sampling
- Schools sampled at the same time for field trial and main data collection
- For each sampled school, two replacement schools were assigned where possible
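The first-stage selection can be pictured as laying a fixed-interval grid over the cumulated school sizes. The sketch below is illustrative only: the function name is invented, and it simplifies the operational IEA procedure (e.g., it ignores certainty selections for very large schools and assumes the list is already sorted by any implicit stratification variables).

```python
import random

def pps_systematic_sample(schools, n_sample):
    """Random-start fixed-interval systematic PPS sampling (sketch).

    schools: list of (school_id, enrolment) pairs, assumed pre-sorted
    by any implicit stratification variables.
    Returns the ids of the selected schools."""
    total = sum(size for _, size in schools)
    interval = total / n_sample              # fixed sampling interval
    start = random.random() * interval       # random start in [0, interval)
    ticks = [start + k * interval for k in range(n_sample)]

    selected, cumulative, i = [], 0.0, 0
    for school_id, size in schools:
        cumulative += size
        # A school is hit by every tick that falls inside its size band,
        # so its selection probability is proportional to its enrolment.
        while i < len(ticks) and ticks[i] <= cumulative:
            selected.append(school_id)
            i += 1
    return selected
```

Because larger schools occupy wider bands on the cumulated-size axis, they are more likely to contain a tick, which is exactly the PPS property.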
Second stage: sampling students
Within schools agreeing to participate:
- Systematic random sampling was used to select 20 target grade students from each sampled school; in schools with fewer than 20 students in the target grade, all students were invited to participate.
- Each student had an equal selection probability within the school.
Second stage: sampling teachers
Within schools agreeing to participate, 15–20 teachers were selected from all eligible teachers using systematic random sampling.
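The within-school step works the same way but with equal weights: stepping through the eligible list at a fixed interval gives every listed student or teacher the same selection probability. The function name and take-all handling below are illustrative, not WinW3S internals.

```python
import random

def sample_within_school(members, n_target):
    """Equal-probability systematic random sample within one school (sketch).

    members: the school's list of eligible students or teachers.
    If the list is no longer than n_target, everyone is taken."""
    if len(members) <= n_target:
        return list(members)
    interval = len(members) / n_target        # > 1, so indices are distinct
    start = random.random() * interval        # random start in [0, interval)
    # Each member's selection probability is n_target / len(members).
    return [members[int(start + k * interval)] for k in range(n_target)]
```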
General notes
- Sampling of schools was conducted by the sampling team at IEA Hamburg.
- Sampling procedures within schools were carried out by the national study centers, using the Within-school Sampling Software for Windows (WinW3S) provided by IEA.
Intended per country
- Approx. 3,000 assessed students
- Approx. 2,600 participating teachers
- Minimum of 150 schools (in countries with fewer than 150 schools, all available schools were included)
- In each sampled school, selection of 20 students (or all if the number of target grade students was less than or equal to 25) and 15 target grade teachers (or all if the number of target grade teachers was less than or equal to 20)
- Required effective sample size: minimum of 400 students
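The gap between the minimum effective sample size (400) and the roughly 3,000 students actually targeted reflects the design effect of cluster sampling: students within a school resemble one another, so each additional student from an already-sampled school adds less information than a freshly drawn one. A rough sketch using the standard approximation deff = 1 + (b − 1)ρ, with an illustrative intraclass correlation (ρ is not given in this summary):

```python
def required_actual_sample(n_effective, rho, cluster_size):
    """Actual sample size needed so a clustered sample matches the
    precision of a simple random sample of n_effective students.
    Uses the standard approximation deff = 1 + (b - 1) * rho."""
    deff = 1 + (cluster_size - 1) * rho
    return n_effective * deff

# Illustrative values: 20 students per school and an assumed
# intraclass correlation of 0.3 (not stated in the source).
n_needed = required_actual_sample(400, 0.3, 20)  # deff = 6.7, about 2,680
```

With these assumed values the arithmetic lands near the roughly 3,000 assessed students intended per country, which is why the actual target is so much larger than 400.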
Total achieved
Approximately 60,000 students, 35,000 teachers, and 3,300 schools (represented by their school principals and ICT coordinators) from 21 education systems (18 countries and three benchmarking entities) participated.
Student CIL assessment
- Students completed a computer-based test of CIL.
- The test consisted of questions and tasks presented in four 30-minute modules.
- Each student completed two modules randomly allocated from the set of four so that the total assessment time for each student was one hour.
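One way to picture the allocation: each student receives an ordered pair of distinct modules, and cycling through all twelve ordered pairs balances both module exposure and module position. The module labels and the cycling rule below are illustrative sketches, not the actual ICILS rotation plan.

```python
import itertools

# Four 30-minute modules (labels are placeholders, not the real ICILS names).
MODULES = ["A", "B", "C", "D"]

# All ordered pairs of two distinct modules: 4 * 3 = 12 combinations.
ROTATIONS = list(itertools.permutations(MODULES, 2))

def assign_modules(student_index):
    """Cycle through the twelve ordered module pairs so that, over many
    students, each module appears equally often in each position."""
    return ROTATIONS[student_index % len(ROTATIONS)]
```

Over any run of twelve consecutive students, each module is seen six times, three times in the first half-hour slot and three times in the second.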
Student questionnaire
- After completing the two test modules, students answered (again, on computer) a 30-minute international student questionnaire.
- It included questions relating to:
- Students’ background characteristics;
- Their experience and use of computers and ICT to complete a range of different tasks in school and out of school;
- Their attitudes toward using computers and ICT.
Questionnaires for principals, teachers, and ICT coordinators
The following three instruments could be completed on computer (over the Internet) or on paper:
- A 30-minute teacher questionnaire that asked teachers:
- Several basic background questions;
- Questions relating to their:
- Use of ICT in teaching;
- Attitudes toward the use of ICT in teaching;
- Participation in professional learning activities relating to the pedagogical use of ICT.
- A 10-minute ICT coordinator questionnaire that
- Asked ICT coordinators about the resources available in the school to support the use of ICT in teaching and learning.
- Addressed both technological support (e.g., infrastructure, hardware, and software) and pedagogical support (such as through professional learning).
- A 10-minute principal questionnaire that asked school principals to provide information about:
- School characteristics;
- School approaches to providing CIL-related teaching and incorporating ICT in teaching and learning.
National contexts survey
- The national contexts survey was completed online by staff in the national centers.
- It collected general information about key antecedents and processes in relation to CIL education at the country level.
Languages used
- Administration of assessment instruments and questionnaires in 22 different languages
- The most common languages
- English (3 countries)
- Spanish (2 countries)
- German (2 countries)
- French (2 countries)
- Development of an international version of all assessment instruments in English by the ICILS International Study Center
- Translation into applicable languages of instruction by participating entities
- Translation verification by linguistic and assessment experts in order to ensure equivalence with the international version
Measures during data collection
- Participating countries and entities were responsible for data collection within their own territories.
- Standardized survey operation procedures: step-by-step documentation of all operational activities provided with the operation manuals.
- Full-scale field trial of all instruments and operational procedures (in each participating country and entity)
- Provision of software tools for supporting activities (e.g., sampling and tracking classes and students, administering school and teacher questionnaires, documenting scoring reliability, creating and checking data files)
- Training, e.g., for national research coordinators (NRCs) and their staff, for school coordinators, and test administrators
- School visits conducted by international quality control monitors (IQCMs) during test administration (at 15 schools per grade and country)
- National quality control program
- Survey activities questionnaire (SAQ) completed by NRCs
Measures during data processing and cleaning
- Testing of all data cleaning programs with simulated data sets
- Material Receipt Database
- National Adaptation Database
- Standardized cleaning process
- Repetition of data cleaning and comparison of the new data sets with the preceding version
- Finally, identification and correction of irregularities in data patterns