TESS MILLER: Are PISA results valid?

It is better to make policy decisions based on accurate results, even if those results are disappointing

Published on January 11, 2017

The Programme for International Student Assessment (PISA) is a standardized assessment used to measure the achievement of 15-year-olds (Grade 10 students) as they near the end of their compulsory education. The test focuses on solving real-life problems rather than regurgitating knowledge from textbooks.

The outcome provides an indicator of how well we educate our children, which is an important driver of economic growth. In the past, P.E.I. did not score very well on these assessments, but in the last administration, P.E.I. moved from last place to fifth. Given the significance of this international assessment, it is important that the data accurately reflect our education system.

The process begins by randomly selecting schools, and then 15-year-olds within each school are randomly selected to write the assessment. According to the PISA guidelines, a student can be excluded from writing the assessment for one of three reasons: (1) a functional or physical disability that prevents the student from completing the assessment; (2) an intellectual disability, such as a mental or emotional disability that has left the student cognitively delayed and unable to complete the assessment; and (3) limited language skills that prevent the student from speaking or reading the language in which the assessment is administered.

The common belief that new immigrants with weak language skills were holding P.E.I. back on these assessments is false. Only 0.9 per cent of students in P.E.I. were excluded because of limited language skills, compared with 12 per cent excluded for intellectual disability and 1.8 per cent for physical disability.

According to Statistics Canada’s most recent survey of disabilities, a total of 4.4 per cent of Canadians aged 15 to 24 have a disability and only 2 per cent of these disabilities are classified as a learning (i.e., intellectual) disability.

Given these national statistics, it comes as a great surprise to learn that 12 per cent of 15-year-olds on P.E.I. have a learning disability. This statistic was reported in the recent PISA report, based on a sample of 15-year-olds. Not only was P.E.I.'s percentage of learning-disabled students the highest of all the provinces, it exceeded the percentage of P.E.I.'s learning-disabled students on the 2012 assessment by almost 5 percentage points.

Does P.E.I. have significantly more learning-disabled students, or did the sampling methods create bias?

Consider the range of mathematics scores, found by subtracting the lowest score from the highest. P.E.I. had the smallest range (198); Saskatchewan and Newfoundland were next on the list, tied at 210; and the widest range, 227, was recorded by the top-scoring province, Quebec. On the previous PISA in 2012, the spread of math scores for P.E.I. was significantly greater, at 216, revealing a wide range of mathematics abilities.

The smaller range, or spread, of scores in 2015 suggests a more homogeneous group of students with similar skills in mathematics. Missing are achievers at the extreme ends of the scale (high-end, low-end, or both). Based on P.E.I.'s history of performance on the PISA, we know that P.E.I. students have struggled with mathematics. On the 2012 assessment, P.E.I. had only 3.9 per cent of students scoring at the highest levels, 5 or 6, and a whopping 19.1 per cent of students scoring at level 1 or below.

Although these statistical measures are not yet available for the 2015 assessment, there is evidence to suggest that the smaller range, as well as the change in P.E.I.'s ranking from last place in 2012 to fifth place in 2015, may be due to the absence of weaker students and/or of students who were labeled as learning disabled on the 2015 assessment.

On the 2012 PISA, P.E.I. assessed all 15-year-olds because the number of 15-year-olds in P.E.I. was much smaller than in the larger provinces. Involving all students in the assessment provides a more representative indicator of students' achievement. On the 2015 assessment, the province opted for a sample of P.E.I. students, in which only 543 out of X students were identified as eligible to write the PISA. From this sample, 80 students (14.7 per cent) were excluded because of disabilities or limited language skills. Another 15 students were scratched off the list because of parental refusal, because they were no longer in the school, or because they did not meet the age criteria, leaving a total of 448 students eligible to write the PISA. On the day of the assessment, an additional 56 students (12.5 per cent) did not write it for unknown reasons, which left a total of 392 students who wrote the PISA.
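For readers who want to verify the arithmetic, the figures above can be checked in a few lines of Python. All numbers are those reported in this column; each percentage is computed against the base stated in the text (the 14.7 per cent against the 543 originally selected, the 12.5 per cent against the 448 still eligible):

```python
# Sampling figures for P.E.I.'s 2015 PISA, as reported in this column.
selected = 543            # students originally identified as eligible
excluded_disability = 80  # excluded for disabilities or limited language
removed_other = 15        # parental refusal, left school, wrong age
eligible = selected - excluded_disability - removed_other
absent = 56               # did not write on assessment day
wrote = eligible - absent

print(eligible)                                        # 448
print(wrote)                                           # 392
print(round(excluded_disability / selected * 100, 1))  # 14.7
print(round(absent / eligible * 100, 1))               # 12.5
print(round((selected - wrote) / selected * 100, 1))   # 27.8
```

The last line confirms the overall attrition figure cited below: 151 of the 543 selected students, or 27.8 per cent, never wrote the assessment.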

Of the 543 students originally selected to write the PISA for the province, a total of 151, or 27.8 per cent, did not write the assessment for various reasons. P.E.I. had the smallest sample of students; the next smallest was Newfoundland and Labrador, with a sample of 1,197 students. Although this sample meets PISA's strict sampling guidelines, from a statistical perspective, smaller samples are less reliable than larger ones.

The reliability of students' scores also depends on a statistic called the standard error of measurement. It is a measure of uncertainty, or the degree to which the scores from the sample are representative of what the entire population would have scored had all 15-year-olds in P.E.I. written the test. P.E.I.'s standard error of measurement in 2015 was 7.0, the largest of all provinces and up 3.3 points from the previous assessment in 2012. This provides reason to question whether the small, select sample of students was an accurate reflection of P.E.I.'s population of 15-year-olds.

We all want P.E.I. students to do well in school and excel in whatever program they pursue after completing Grade 12. We also want P.E.I. to grow economically and provide jobs for our graduates. The outcome of the PISA test is an important indicator of our education system and economic development – inaccurate representation of students’ achievement jeopardizes the validity of this international assessment as well as people’s perceptions of the entire province.

It is better to make policy decisions based on accurate results, even if those results are disappointing. 


- Tess Miller is an associate professor in UPEI's Faculty of Education