Doubts raised about SAT's reading test

College hopefuls taking the Scholastic Aptitude Test (SAT) might consider answering the multiple-choice questions following reading-comprehension passages without reading the passages themselves; they may do almost as well as if they had actually read the passages, while saving valuable time for completing other parts of the SAT's verbal section.

That, at least, is the conclusion of a study in the March PSYCHOLOGICAL SCIENCE. College students scored well above chance (random guessing) on a sample SAT reading-comprehension task with the reading passages deleted, report psychologist Stuart Katz and his colleagues at the University of Georgia in Athens. Almost two-thirds of the questions following reading passages do not tap into the test-taker's comprehension of the passages, the scientists maintain.

"Here's a task that's been carefully developed using the most arcane, advanced techniques of a leading educational organization, but it doesn't measure what it's supposed to," Katz asserts.

So what does the reading-comprehension task measure? According to Katz, it largely gauges "testing skills," which include the ability to derive correct answers by analyzing the structure and phrasing of questions and by using knowledge from one's background to help weed out improbable answers.

The new report appears in the midst of an extensive review of SAT items by the Educational Testing Service (ETS) in Princeton, N.J., which formulates the college-entrance examination.

"We think the Katz study is important, but it has some technical problems," says ETS psychometrist Cathy Wendler. "I'm sure his research will be taken seriously in our ongoing SAT review."

Katz and his co-workers administered reading-comprehension questions from 1983 versions of the SAT to 197 college students, some of them honors students. None of the students had taken the SAT in 1983. Exam-takers must answer a group of multiple-choice questions based on what is stated or implied in a short passage preceding the questions. The researchers presented students with a total of 100 questions derived from 24 passages. Each question was followed by five possible answers, only one of which was correct.

Test scores for the 75 students who read the passages averaged 57 correct out of 100 and rose to 70 among honors students. The remaining 122 students answered the questions without the passages and averaged 38 correct responses, with honors students scoring about 46, far exceeding the chance score of 20.
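
Where does the chance score of 20 come from? With 100 items and five answer choices apiece, pure guessing is expected to yield one correct answer in five, or 20 of the 100 items. The short Python sketch below is not part of the study; it simply works out that baseline, assuming 100 independent five-option items answered by uniform guessing, and shows roughly how unlikely a no-passage score of 38 would be from guessing alone.

    from math import comb

    N_QUESTIONS = 100   # total items in the experimental task, as reported above
    P_GUESS = 1 / 5     # one correct choice among five options

    # Expected number correct under pure guessing: 100 * 0.2 = 20
    chance_score = N_QUESTIONS * P_GUESS

    def prob_at_least(k, n=N_QUESTIONS, p=P_GUESS):
        # Binomial probability that a single guessing test-taker gets k or more items right
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    print(chance_score)        # 20.0
    print(prob_at_least(38))   # for one guessing student, well under 1 in 10,000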

The students' precollege scores on the entire SAT verbal test -- which includes three sections in addition to reading comprehension -- were strong statistical predictors of their scores on the experimental reading-comprehension task, regardless of whether passages were included, the researchers found. This suggests that testing skills, rather than an understanding of the passages, contribute more to sorting out better performers from poorer ones, Katz argues.

For 61 of the 100 items, at least 30 percent of the students taking the no-passage test chose the correct answer -- well above the 20 percent expected from guessing -- indicating that those items have little to do with reading comprehension, Katz adds.

Reading comprehension is the most time-consuming part of the verbal section, so the implications of the no-passage strategy are important, Katz says. He notes that at least one SAT coaching school instructs its pupils to answer reading-comprehension questions using information unrelated to the passages, although SAT handbooks published by ETS instruct students to read the passages and ignore background knowledge or intuition.

"You can't create reading-comprehension questions in isolation from a person's background knowledge," Wendler responds. But she argues that the test -- indeed, the entire SAT verbal section -- measures "general verbal reasoning skills" rather than "reading comprehension" or any other specific skill.

Wendler also says the college students tested by the Georgia researchers were brighter and more knowledgeable than the average high school student taking the SAT, and that this may have inflated scores on the no-passage tests.

"There's no one right way to take the SAT," she maintains. Katz replies: "Our evidence should be enough to alert those who take the SAT, or make use of its results, that it may tell a story different from the one now generally accepted."
COPYRIGHT 1990 Science Service, Inc.

Article Details
Title Annotation: Scholastic Aptitude Test
Author: Bower, B.
Publication: Science News
Date: Mar 31, 1990
