As state leaders and education advocates weigh evaluating U.S. students using international benchmarks, a new report argues that one prominent test, PISA, is flawed and may not be appropriate for judging American schools against global standards.
The author, Tom Loveless, a senior fellow at the Washington-based Brookings Institution, also contends that questions asked on the Program for International Student Assessment surveys of students' beliefs and attitudes about science reflect an ideological bias, which undermines the test's credibility.
He cites an example from one PISA questionnaire, which seeks to gauge "a sense of students' responsibility for sustainable development," and asks test-takers if they agree with certain statements, such as "having laws that protect the habitats of endangered species."
A response requires a "political judgment," Mr. Loveless writes. Also, the questions are vague, making it difficult for the scientifically literate to know how to answer, he argues.
"It is difficult to see how declaring support or opposition to a policy without knowing the details" is related to responsible citizenship, Mr. Loveless adds.
Andreas Schleicher, the head of education indicators for the Organization for Economic Cooperation and Development, the Paris-based group that oversees the test, called the report "disingenuous" and misleading on some points.
'A First Reading'
He noted that the student questionnaire is not in any way connected to the main, publicly reported PISA scores for science and math, which are most commonly cited in the news media and by policymakers. It is clear, he said, that the test scores and the questionnaire give policymakers two different sets of information. Results from the questionnaire are put in separate indices in PISA reports, he noted.
"These questions explore significant science-related contemporary issues," Mr. Schleicher said in an e-mail, and give policymakers "a first reading" of students' attitudes about science, even if the phrases are not perfect.
Mr. Loveless also casts doubt on whether PISA's practice of measuring skills that students pick up both in and out of school makes it useful for state policymakers who want to improve their K-12 systems. Another international test, the Trends in International Mathematics and Science Study, or TIMSS, and the U.S.-based National Assessment of 91制片厂视频al Progress, or NAEP, focus primarily on in-school skills.
In addition, he said the OECD should not take policy positions while also collecting and interpreting score data, because doing so creates a potential conflict of interest.
Mr. Schleicher said PISA emphasizes students' ability to apply knowledge in an out-of-school context, but that doesn't mean students necessarily learned those skills outside the classroom.
One central PISA goal is to assess students' "capacities to extrapolate from what they know and transfer and apply their knowledge and skills to novel settings," Mr. Schleicher said, which, he added, is a prized skill in science.
Last September, the National Governors Association, Achieve, and the Council of Chief State School Officers agreed to create an advisory group to produce a "road map" for benchmarking U.S. school performance against that of top-performing nations.
Mr. Loveless writes that the NGA would "like states to use PISA" in that process. But Dane Linn, the director of the NGA's education division, disputed that, saying the organizations are not committed to any particular approach but are considering a range of rigorous international exams.
"It behooves us to not exclude PISA in examining how other countries measure performance," Mr. Linn said. Different elements of PISA, TIMSS, and other international tests are likely to appeal to state policymakers. Debates over which kind of test material is preferable, in-school "content" versus the "application of knowledge," miss the point, Mr. Linn added. "It's both."