Canada's participation in TIMSS.
How did Canada do on the grade-12 (or last year of secondary school) testing, in which 24 countries participated? According to David Robitaille, coordinator for TIMSS and professor of mathematics at the University of British Columbia, "Canada did quite well - better in fact than any of the other G-8 countries that participated [Japan did not participate]. Within Canada, B.C., Alberta, and Ontario had similar results." Robitaille said that these provinces scored significantly above the international average in mathematics and science literacy and in advanced math. They scored at the international average in physics. In the case of advanced math and physics, Robitaille compared the performance of the top 5% of students in order to control for the effects of fairly large differences among countries in the proportion of students included in those two subpopulations. "If we look at Canada's performance across all three populations [grade 4, grade 8, and grade 12], Alberta looks strongest, with British Columbia next," he said.
In the grade-8 tests, Canadian students did as well as or better than students from 31 countries, posting a mean of 59% correct. Scores ranged from a high of 70% in Singapore to a low of 27% in South Africa, with an international mean of 56%. Alberta, with a score of 65% correct, ranked third in science and in the top one-third of participating countries in mathematics. One important outcome was that the results showed no significant difference in overall math and science performance between males and females in Alberta or in Canada. This was not the case in the grade-12 results, which showed males doing considerably better than females in Canada.
It is not fashionable for Canadians to get too excited about their education system, which probably accounts for the lack of media coverage about Canada's participation in international tests. It certainly is not that Canada scored low. Indeed, Canada scored consistently above the international mean for all three age groups tested. But some of the media as well as some education reformers and critics have taken the results and used them to promote their own biases and ideologies. More on this point later.
Yet we Canadians can take pride in the performance of Canadian students on these tests, despite the very obvious lack of media interest. David Flower, editor of the ATA News (a publication of the Alberta Teachers' Association), shrewdly pointed out in the 10 March 1998 issue just how indifferent the media, especially the Alberta media, were to the performance of Canadian students by observing, "Why is coming 10th in downhill skiing or 42nd in cross-country skiing considered a significant Canadian achievement in the Olympic Games, yet coming third in international tests is considered only so-so in Alberta? . . . Probably it is symptomatic of the Canadian reluctance to brag." Overcoming my own reluctance to brag, I offer a salute to the Canadian students who did so well on these international tests.
Even though Alberta scored so well, Joe Freedman of Red Deer, Alberta - an education reformer and strong advocate of international competitions and charter schools, who was no doubt surprised by Alberta's strong performance in science - was not thrilled with the results. "The good science results were mitigated by the poor math results," he said. Evidently, Freedman is not willing to concede that Alberta students rank among the top students tested in 45 countries. And John Snobelen, minister of education for Ontario, commenting on Ontario's results in the grade-8 tests, said, "Mediocre results are not what we are after. No one should be happy."
"Making Our Schools Measure Up," an editorial in the Globe and Mail (17 January 1998), attended to the poor grade-8 results in Ontario by berating the inept education system. The editorial proudly points out "the beginnings of a movement across the country to restore to the school system the power and the ability to heal itself." After many sanctimonious statements about the lack of yardsticks with which to measure success and failure and about the importance of making measurable goals the centerpiece of reform, the editorial writer reveals his true stripes when money enters the picture.
More than the progress of individual students can be measured. There are also measures of the performance of schools and whole systems. How Canadians fare in international competitions is not a meaningless factoid, but a precious insight. How many teachers and dollars we lavish on each student relative to other countries is valuable information, especially when it tells us, as it does, that there is little relationship between the money and pupil/teacher ratios in a country's schools, and the measurable learning that takes place there.
It is quite obvious that the bottom line for the Globe and Mail writer is not why students did or did not compare favorably with other countries but how the newspaper can slant the TIMSS results to support the present government's determination to cut millions from the education budget and reduce the teaching staff.
I join many other educators in rejoicing in Canada's performance on TIMSS. I do so because I believe that Canada's science and math teachers are doing a first-rate job, despite certain handicaps. However, I do not believe that the purpose of TIMSS was - or even should be - strictly competitive. The purpose of such tests should be to aid the improvement of education by the various participating countries. There are a number of important questions that we should be asking. How well did students do on the questions derived from the curriculum that they have been taught? Was the test representative of the country's curriculum? How significant is that part of the curriculum not tested? In the past few years there have been major changes in the curriculum of most provinces. How have those changes affected students' performance?
David Ireland, recently retired manager of research and evaluation with the Carleton, Ontario, board of education, points out the absurdity of including items on the test that are not covered by the curriculum. He gives what he calls a bizarre example in the report on mathematics. The report says: "Canadian students did poorly on solving the linear equation, 'Find x if 10x - 15 = 5x + 20.' Only 27% found the correct solution, compared to 46% internationally." But grade-8 students in Canada don't usually do linear equations. Why then were they tested on them? And why were the results reported? If a quarter of Canadian eighth-graders can already do linear equations, perhaps they should be included in the curriculum.
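For readers who want to check the item itself, the equation Ireland cites solves in a few standard steps of algebra - precisely the symbol manipulation that, by his account, most Canadian grade-8 curricula had not yet taught:

```latex
\begin{align*}
10x - 15 &= 5x + 20 \\
10x - 5x &= 20 + 15 && \text{collect the $x$ terms on one side, constants on the other} \\
5x &= 35 \\
x &= 7 && \text{check: } 10(7) - 15 = 55 = 5(7) + 20
\end{align*}
```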
Much more time and money, as TIMSS coordinator Robitaille suggests, should be devoted to analyzing the results of these international tests if they are to be meaningful for the improvement of math and science teaching and learning. I agree with Ireland that the focus of the reports is particularly questionable in the present political climate. As he says, unadorned standings "simply give ammunition to the right-wing politicians, the editorial writers, and the neo-con education bashers." There is a wealth of interesting and useful information in the many volumes of TIMSS reports published. It will take years to analyze all the data and arrive at meaningful judgments on how we can use what we learn from these tests to improve science and math teaching and learning.
TOM McCONAGHY is an editorial consultant and education writer in Edmonton, Alta.
Title Annotation: Third International Mathematics and Science Study
Publication: Phi Delta Kappan
Date: Jun 1, 1998