PASEC 2014 Design

Methodology
Quantitative Study
Method(s)
  • Survey-based study
  • Overall approach to data collection
    • Proctored assessment and self-administered surveys with the support of the PASEC Coordination Center
  • Specification
    • Trend study
Target population

All students formally enrolled in Grade 2 and in the last grade (Grade 5 or 6) of primary school (formal education), regardless of the type of school (public, private, community, etc.)

Sample design
Stratified three-stage cluster sample design

First stage: school sampling

  • Carried out by the PASEC Coordination Center two months before the main data collection
  • Stratification of schools by important variables (region, type of school, level, urbanization)
  • Probability proportional to size (PPS): systematic selection with probability proportional to the combined enrollment of the two target grades (a sketch follows this list)
  • A standard random sample of 90 schools for Grade 2 and (at least) 180 schools for Grade 6 was selected from each country’s school database.
  • For each sampled school, sampling of one or two replacement schools
  • Sampling of different schools for field trial (20 schools in each country)
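
To make the PPS step concrete, the following Python sketch illustrates systematic selection with probability proportional to size under simplified assumptions: a single stratum, and a hypothetical frame of (school_id, Grade 2 enrollment, Grade 6 enrollment) records. It is not the PASEC sampling software; in the actual design the frame is first sorted within the explicit strata listed above before the systematic draw.

    import random

    def pps_systematic_sample(schools, n_sample, seed=2014):
        # schools: list of (school_id, grade2_enrollment, grade6_enrollment) tuples
        # (hypothetical frame layout); measure of size = combined enrollment of
        # the two target grades, as described above.
        sizes = [g2 + g6 for _, g2, g6 in schools]
        interval = sum(sizes) / n_sample              # sampling interval
        rng = random.Random(seed)
        start = rng.uniform(0, interval)              # single random start
        points = [start + k * interval for k in range(n_sample)]

        selected, cum, i = [], 0.0, 0
        for (school_id, _, _), size in zip(schools, sizes):
            cum += size
            while i < len(points) and points[i] <= cum:
                selected.append(school_id)            # larger schools are hit more often
                i += 1
        return selected

    # Example: draw 180 schools for the Grade 6 sample from a national frame.
    # grade6_schools = pps_systematic_sample(school_frame, n_sample=180)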

 

Second stage: sampling classes within schools

  • One class from the target grade of each sampled school, selected by random sampling
  • Carried out during school data collection operations by test administrators

 

Third stage: sampling students within sampled classes

  • For Grade 6, selection of 20 students in each class (or fewer if the class size was under 20) by random sampling
  • For Grade 2, selection of 10 students in each class (or fewer if the class size was under 10) by random sampling (a sketch of this draw follows this list)
  • Carried out during school data collection operations by test administrators
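
A minimal Python sketch of this within-class draw, assuming the test administrator works from a simple list of pupil identifiers (the roster format is hypothetical):

    import random

    def sample_students(class_roster, grade):
        # 20 pupils for Grade 6, 10 for Grade 2, or the whole class if it is smaller
        target = 20 if grade == 6 else 10
        if len(class_roster) <= target:
            return list(class_roster)
        return random.sample(class_roster, target)

    # Example: a Grade 2 class of 34 pupils yields a sample of 10.
    grade2_sample = sample_students([f"pupil_{i:02d}" for i in range(1, 35)], grade=2)
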
Sample size

Intended

  • Per country, Grade 2
    • Class sample: approx. 90 classes (a country may have more)
    • Student sample: approx. 900 students (90 classes × 10 students)
  • Per country, Grade 6
    • Class sample: approx. 180 classes (a country may have more)
    • Student sample: approx. 3,600 students (180 classes × 20 students)

 

Achieved

In practice, these numbers were never fully reached, due to absenteeism, student withdrawal, school closures, etc.

Data collection techniques and instruments

Technique 1: achievement test and background questionnaire for Grade 2

  • One-to-one administration
  • Test specifics
    • One common booklet for all students
    • About 30 minutes per discipline (language and math)
    • About 40 items per discipline
    • Pupils answered questions orally with very short answers
    • Tests were administered over the course of two mornings, with administrators responsible for data collection
    • Test administrators were supervised and monitored by national teams.
  • Background questionnaire for student
    • One booklet
    • Test administrators supported the completion of the questionnaire, reading questions and answers aloud and translating them into the native language of students, if needed.

 

Technique 2: achievement test and background questionnaire for Grade 6

  • Pencil and paper
  • Written questionnaire
  • Collective administration
  • Tests
    • All items multiple-choice
    • Number of booklets
      • Four booklets for reading comprehension
      • Four booklets for mathematics
    • Four blocks of 23 items per discipline
    • Approx. 92 items per discipline
    • Each student was administered two blocks per discipline
    • Common examples before starting
    • One hour maximum per block
    • Break of 10 minutes minimum between blocks
    • Each student was administered one booklet containing two blocks in reading and two blocks in mathematics; the two blocks of each discipline were administered on the same day
    • Test administration with matrix sampling of items, i.e., a rotated test booklet design within the selected class (see the sketch after this list)
  • Test administrators responsible for data collection.
  • Test administrators supervised and monitored by national teams.
  • Background questionnaire for student
    • One booklet
    • Test administrators supported the completion of the questionnaire, reading questions and answers aloud and translating them into the native language of students, if needed.
  • Background questionnaire for the teacher, completed by the teacher of the class being assessed
  • Background questionnaire for the school, completed by the principal of the sampled school
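
The rotated booklet design can be illustrated with a short Python sketch. The block-to-booklet pairing below is an assumption chosen only to satisfy the constraints stated above (four blocks per discipline, two blocks of each discipline per booklet); the actual PASEC pairing may differ.

    import itertools

    # Hypothetical pairing of reading blocks (R1-R4) and mathematics blocks (M1-M4).
    BOOKLETS = {
        1: ["R1", "R2", "M1", "M2"],
        2: ["R2", "R3", "M2", "M3"],
        3: ["R3", "R4", "M3", "M4"],
        4: ["R4", "R1", "M4", "M1"],
    }

    def assign_booklets(student_ids):
        # Rotate the four booklets across the sampled pupils of a class so that
        # each item block is answered by roughly a quarter of the class.
        cycle = itertools.cycle(sorted(BOOKLETS))
        return {sid: BOOKLETS[next(cycle)] for sid in student_ids}

    # Example: 20 sampled Grade 6 pupils -> each booklet is used 5 times.
    assignment = assign_booklets([f"pupil_{i:02d}" for i in range(1, 21)])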

 

Technique 3: Document analysis

  • One chapter in the international report describing the context
    • Written by the PASEC Coordination Center, based on national documents and data
  • One chapter in each national report describing the context
    • Written by the countries, based on their national documents and data
Techniques
  • Achievement test
  • Documents
  • Questionnaire
Languages
  • Test developed in French
  • Assessment instruments administered in 3 languages:
    • French, 10 countries
    • English, 1 subsystem
    • Kirundi, 1 country for Grade 2
Translation procedures
  • Test developed in French.
  • Translation by two groups of linguistic and assessment experts
  • External conciliator to ensure equivalence with the international version
  • National verification and final validation by PASEC Coordination Center
Quality control of operations

Measures during data collection

  • Countries were responsible for data collection
  • Standardized survey operation procedures: step-by-step documentation of all operational activities provided with manuals
  • International training for national PASEC team at each step
  • International support and training for national PASEC teams and test administrators on coding, data entry, and data cleaning
  • National quality control program
  • International quality control program

 

Measures during data processing and cleaning

After data collection

  • Standardized storage of instruments, with a manual of procedures made available for the national teams; verification of procedures on a country-by-country basis by PASEC
  • Training of the team on data entry, with a manual of procedures made available
  • Data entry controls incorporated (program, forms)
  • Double data entry required for a percentage of the instruments (10% to 15%, depending on the type of instrument); data matching in order to estimate data entry errors; data entry repeated if necessary (see the sketch below)
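
A minimal sketch of the data-matching step, assuming the two independent keyings are stored as CSV files that share a record identifier (the file layout and the student_id key are hypothetical):

    import csv

    def entry_discrepancy_rate(first_pass_csv, second_pass_csv, key="student_id"):
        # Load each keying into a dictionary indexed by the record identifier.
        def load(path):
            with open(path, newline="", encoding="utf-8") as f:
                return {row[key]: row for row in csv.DictReader(f)}

        first, second = load(first_pass_csv), load(second_pass_csv)
        compared = mismatched = 0
        for sid in first.keys() & second.keys():      # records present in both passes
            for field, value in first[sid].items():
                if field == key:
                    continue
                compared += 1
                if value != second[sid].get(field):   # disagreement between the two keyings
                    mismatched += 1
        return mismatched / compared if compared else 0.0

    # If the estimated error rate exceeds an agreed threshold, data entry is repeated.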

Data cleaning

  • First phase of data cleaning at the PASEC Coordination Center by means of various verification programs checking for inconsistencies, missing values, outlier values, etc. (see the sketch after this list)
  • Second phase of cleaning by national teams, as necessary, in the respective countries
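
The first-phase verification programs can be sketched as follows, assuming the data sit in a pandas DataFrame; the column names, valid-value ranges, and the 3-standard-deviation outlier rule are assumptions used only for illustration.

    import pandas as pd

    def basic_cleaning_checks(df, score_cols, valid_ranges):
        report = {}
        report["missing"] = df.isna().sum()           # missing values per variable
        report["out_of_range"] = {                    # inconsistent or impossible codes
            col: df.index[~df[col].between(lo, hi) & df[col].notna()].tolist()
            for col, (lo, hi) in valid_ranges.items()
        }
        # Simple outlier screen: scores more than 3 standard deviations from the mean.
        z = (df[score_cols] - df[score_cols].mean()) / df[score_cols].std()
        report["outliers"] = (z.abs() > 3).sum()
        return report

    # Example call with hypothetical columns:
    # basic_cleaning_checks(df, score_cols=["read_score", "math_score"],
    #                       valid_ranges={"student_age": (4, 20), "sex": (1, 2)})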

 

Measures during data analysis and report writing

  • PASEC (with expert support on some aspects) established estimates of student scores and contextual indices, followed by statistical analysis of the data; these steps were checked and validated by the PASEC Scientific Committee.
  • The writing of the international report as well as the national reports was also checked and validated by the Scientific Committee.