Scholars Finding New 'Value' In Student Test Data

By Lynn Olson | November 20, 2002 | 9 min read
Data Driven: An Occasional Series

Every year, school administrators at E.A. Cox Middle School in Maury County, Tenn., look for patterns in the state test scores of students assigned to individual teachers. Then they coax and cajole teachers who have been especially effective at raising achievement to teach subjects or grades the school is having trouble with.

"It's kind of been a joke here at Cox that you don't know what you'll be teaching until school starts," says Principal Debbie Steen. "The teachers are just very flexible, and they don't get upset about that. The teachers are very dedicated to finding the best fit for the child."

That approach seems to be paying off. In 2002, the 700-student school posted larger three-year average gains in mathematics than any other middle school statewide with similar percentages of poor and minority students. E.A. Cox's reading gains topped all but one middle school with a comparable student population.

Soon, principals and teachers in Colorado, Ohio, and Pennsylvania will be armed with similar data, as part of pilot projects that rely on "value added" methods. Rather than simply rank schools on raw test scores, such analyses focus on the progress of individual students over time.

Some contend that such information provides a fairer way to judge schools, based on how much schools "add value" to a student's knowledge and skills. The data also can help pinpoint a school's strengths and weaknesses, right down to the improvements in individual teachers' classrooms.

"Right now, a lot of states and districts rely on changes in cohorts," explained Laura S. Hamilton, a behavioral scientist with the RAND Corp., a research institute in Santa Monica, Calif. "They'll compare the performance of this year's 4th graders with last year's 4th graders. That confounds differences in the particular abilities and other characteristics of the students in each group with real changes in test performance."

Value-added methods, she said, have "the potential to provide more accurate estimates of changes in test scores than we currently get with a lot of the systems being used. It also, I think, does a better job of communicating what we really want to know, which is the extent to which individual students are gaining or losing in terms of test scores."
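
In rough terms, the difference can be sketched in a few lines of code, using invented students and scores. The sketch below is only an illustration of the two comparisons Ms. Hamilton describes, not Mr. Sanders' or any state's actual statistical model.

```python
# Illustrative only: invented students and scores, not any state's real data.

# This year's 4th graders, with each student's own score from last year.
this_year = [
    {"id": "s03", "score": 52, "prior": 47},
    {"id": "s04", "score": 58, "prior": 50},
]
# Last year's 4th graders (a different group of children).
last_year = [
    {"id": "s01", "score": 48},
    {"id": "s02", "score": 55},
]

# Cohort comparison: this year's 4th graders vs. last year's 4th graders.
# Any difference mixes real learning with differences between the two groups.
cohort_change = (sum(s["score"] for s in this_year) / len(this_year)
                 - sum(s["score"] for s in last_year) / len(last_year))

# Value-added view: how much the *same* students gained from one year to the next.
gains = [s["score"] - s["prior"] for s in this_year if "prior" in s]
average_gain = sum(gains) / len(gains)

print(f"Cohort-to-cohort change: {cohort_change:+.1f} points")
print(f"Average gain by individual students: {average_gain:+.1f} points")
```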

Pilot Projects

Probably the best-known proponent of value-added analyses is William L. Sanders, who directs the value-added assessment and research center for SAS inSchool, part of the Cary, N.C.-based SAS Institute, a software company. Since the early 1990s, Tennessee has used Mr. Sanders' value-added techniques to show the public the gains by every school and district, averaged over three years. ("Sanders 101," May 5, 1999.)

Working with the SAS Institute, the statistician is now exporting his methodology to other willing states and districts. To date, districts in 21 states have signed on, including the three large-scale pilots in Colorado, Ohio, and Pennsylvania. Specifically:

  • In Pennsylvania, about 30 districts are participating in a $500,000 project this school year with SAS inSchool. To participate, districts had to test students every year in grades 3-8 and link test results with individual students. In September, the state board of education approved a plan that calls for every district to include a value-added component in its assessment system by 2005-06. "We've always understood that the fairest measure for our schools, for our school districts, is looking at where did your kids start, and where did they end up?" said Charles Zogby, the commissioner of education in Pennsylvania.
  • In Ohio, 42 districts are participating in a project coordinated by Battelle for Kids, a nonprofit group based in Columbus, that will generate value-added data for each school, again working with SAS inSchool.
  • In Colorado, about 40 districts have worked with Mr. Sanders over the past four years to produce value-added analyses for their schools, as part of a voluntary project started by the state department of education. Last year, legislators approved a separate "academic growth pilot program" that will track the progress of individual students in reading, writing, and math in participating schools and districts, beginning this school year. Starting with the 2005-06 school year, the law calls for every district in the state to take part in the "academic growth program."

State Rep. Keith C. King, a Republican who sponsored the legislation, said he wanted to "give teachers tools whereby they could actually improve each student's academic achievement."

Under the state's 1999 accreditation rules, schools and districts must demonstrate that all student subgroups, such as Hispanic students or students with disabilities, have achieved at least one year's academic growth in a year's time.

Gaining Momentum

Other states are using value-added methods for accountability.

Florida assigns letter grades to schools based partly on the learning gains of individual students. Jim Horne, the secretary of education, argued, "By being able to measure annual learning gains, that is a powerful place to be."

One of the most powerful and controversial aspects of Mr. Sanders' system is that it can reach beyond the school level to produce a measure of an individual teacher's effectiveness, based on how the students in his or her classroom progress each year. In Tennessee, such information is shown only to school officials, who can use it as part of job evaluations, and to the teachers themselves.
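
At its simplest, the classroom-level idea amounts to averaging the yearly gains of the students assigned to each teacher and comparing that average with the district as a whole. The sketch below, with invented teachers and gains, shows only that crude version; Mr. Sanders' actual analysis is a far more elaborate statistical model.

```python
from collections import defaultdict

# Crude illustration with invented teachers and gains; the real system uses a
# much more elaborate statistical model than a simple classroom average.
student_gains = [
    ("Teacher A", 6.0), ("Teacher A", 4.5), ("Teacher A", 5.5),
    ("Teacher B", 1.0), ("Teacher B", 2.5), ("Teacher B", 0.5),
]

by_teacher = defaultdict(list)
for teacher, gain in student_gains:
    by_teacher[teacher].append(gain)

district_mean = sum(gain for _, gain in student_gains) / len(student_gains)
for teacher, gains in sorted(by_teacher.items()):
    classroom_mean = sum(gains) / len(gains)
    # A positive difference suggests the classroom gained more than the district average.
    print(f"{teacher}: mean gain {classroom_mean:.1f} "
          f"({classroom_mean - district_mean:+.1f} vs. district)")
```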

Kip Reel, the superintendent of the 11,350-student Maury County, Tenn., schools, said his system uses the information to set performance goals for principals and to help evaluate individual teachers.

"I don't deny that you can get a feel for how effective an individual educator is by visiting the classroom, or by observations or things like that," he said. "But it's a real big benefit to have some quantification that allows comparisons within the school or within the system."

Starting this fall, parents in Tennessee also can go on the department of education's Web site and find confidential projections for whether their children will pass the state tests required for high school graduation, as indicated by their performance to date. The site also projects the youngsters' chances of earning a high enough score on the ACT college-entrance exam to gain admission to one of the state's colleges or universities, earn an A or B average as a college freshman, or major in a technical field, such as computers or engineering.
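
The article does not describe how Tennessee computes those projections. As a purely hypothetical illustration, a projection of this kind could be as simple as fitting a straight line between students' earlier scores and their later exam results, as in the sketch below, which uses invented numbers and an assumed cut score.

```python
# Hypothetical illustration only: Tennessee's actual projection model is not
# described here. The score history and the cut score below are invented.
history = [(45, 50), (60, 66), (70, 74), (55, 58), (80, 85)]  # (earlier score, later exam score)

n = len(history)
mean_x = sum(x for x, _ in history) / n
mean_y = sum(y for _, y in history) / n
# Ordinary least-squares fit of a straight line to the historical pairs.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in history)
         / sum((x - mean_x) ** 2 for x, _ in history))
intercept = mean_y - slope * mean_x

passing_score = 60   # assumed cut score for the graduation exam
current_score = 52   # one student's performance to date
projected = intercept + slope * current_score
print(f"Projected exam score: {projected:.0f} "
      f"({'on track to pass' if projected >= passing_score else 'below the cut score'})")
```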

鈥楽eize This Opportunity鈥

Until now, many states have faced two big challenges in doing such value-added analysis.

The first is that they did not test every student annually and so could not chart individuals' progress from year to year. That situation will change under the federal "No Child Left Behind" Act of 2001, which requires states to test every student in grades 3-8 annually in reading and mathematics, no later than the 2005-06 school year.

"My message to the states is you need to seize this opportunity," said Mr. Sanders of SAS inSchool, "and begin to follow the academic progress of individual kids in much more robust ways."

The second impediment is that states must have some way to link the records of individual students over time, usually by assigning each student a unique identification number. To date, only 16 states have that capacity, although the number of such states is growing.
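
A unique identifier is what allows yearly test files to be joined into a single longitudinal record per student. The sketch below shows that basic join, with hypothetical IDs and scores.

```python
# Hypothetical IDs and scores; the point is only the join on a unique student ID.
scores_2001 = {"TN0001": 48, "TN0002": 55, "TN0003": 61}
scores_2002 = {"TN0001": 53, "TN0002": 54, "TN0004": 49}

# Build one longitudinal record per student across both years.
longitudinal = {
    student_id: (scores_2001.get(student_id), scores_2002.get(student_id))
    for student_id in scores_2001.keys() | scores_2002.keys()
}

# Gains can be computed only for students seen in both years; movers and new
# arrivals show up as missing values rather than silently skewing the averages.
for student_id, (first, second) in sorted(longitudinal.items()):
    if first is not None and second is not None:
        print(f"{student_id}: gain {second - first:+d}")
```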

In September, Gov. Gray Davis of California signed legislation to set up a data system that will track the individual progress of students. SB 1453, sponsored by Democratic state Sen. Dee Dee Alpert, authorizes the use of up to $6 million in federal aid to develop the longitudinal data system.

"If we hadn't been able to access federal money, I don't know if I would have gotten a signature from the governor on this," Sen. Alpert said. "This is not sexy, in the sense that people want to say they made classes smaller or gave a book to every kid, so it winds up being way down at the bottom of the list.

"Yet without it," she continued, "you don't actually know if the things that you're doing programmatically are doing any good, because you don't have any good information."

The National Center for 91制片厂视频al Accountability, based at the University of Texas at Austin, is urging states to devise such longitudinal, student-level data systems. The center currently is using data from seven states to investigate what high-performing schools do that average- and low-performing schools do not.

Challenging Work

Creating data systems that can track individual students is not easy, though.

One problem is technical. Arizona, for instance, is trying to set up the Student Accountability Information System, a new state database that will trace students' progress. As of September, nearly 20 percent of the state's 836,000 K-12 students weren't enrolled in the database, in part because many districts needed to buy or upgrade software. The state is now threatening to withhold per-pupil funding from districts and charter schools that do not report the required information.

Chuck Essigs, an adviser to the state superintendent of public instruction, said Arizona districts range in size from three students to more than 70,000, and the state has hundreds of charter schools. "There's a lot of work that has to be done on the district level to prepare their data to be properly submitted to the state, so the state can process it," he said.

On the political side, collecting information on individual students raises privacy concerns. And some educators oppose linking student test scores to the evaluation of individual teachers.

"In some ways, those political hurdles may be more insurmountable than the technical ones," said Ms. Hamilton of RAND.

Finally, researchers use various methods to conduct value-added analyses, and it's not clear which approaches are best or how to improve them. Many techniques are complex and aren't readily understood by teachers or parents.

In the summer 2002 issue of 91制片厂视频 Next, Dale Ballou, an associate professor of economics at the University of Massachusetts at Amherst, argued, "It is much harder to measure achievement gains than is commonly supposed." He cautioned that while value-added techniques provide a useful diagnostic tool, too many uncertainties surround such methods to use them for high-stakes personnel decisions.

'Tougher Than We'd Like'

Among other problems, he noted, current methods of testing don't measure gains very accurately. When statistical methods are used to minimize error or "noise," the systems quickly become incomprehensible to educators, losing the "transparency" that many argue is a hallmark of effective accountability systems.
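
One standard way to see why gains are hard to measure, using a textbook measurement model rather than Mr. Ballou's own derivation, is that each observed score carries its own error, and a gain score inherits the error of both tests:

```latex
% A textbook measurement model, used here only as an illustrative assumption:
% each observed score X_t is a true score \theta_t plus independent error e_t.
\[
  X_1 = \theta_1 + e_1, \qquad X_2 = \theta_2 + e_2,
\]
\[
  \operatorname{Var}(X_2 - X_1)
    = \operatorname{Var}(\theta_2 - \theta_1) + \sigma_{e_1}^{2} + \sigma_{e_2}^{2},
\]
% so the noise in a measured gain is the sum of the noise in the two tests,
% and gains are estimated less precisely than either score alone.
```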

But Harold Doran, the director of research and evaluation for the education-performance network at New American Schools, a nonprofit group based in Alexandria, Va., said, "Adopting a less rigorous model to gain increased transparency is inexcusable."

"I wouldn't expect a physician to dumb down their surgical procedures simply because it's easier for a patient to understand," he said, "but that model should be transparent to other physicians."

New American Schools is currently conducting a study of the different value-added models now in existence. Among other tasks, the project is analyzing data from an unnamed state to see whether the different methods produce similar or different results.

RAND also is conducting a study, underwritten by the Carnegie Corporation of New York, of the various value-added models; the challenges involved in applying them to education and, particularly, to the performance of individual teachers; and how those challenges might be overcome.

"Our stance is that this is a very promising thing to look at. We just need a lot more work to know how to do it best," said Daniel M. Koretz, a senior social scientist at RAND and a professor of education at Harvard University. "It's just a lot tougher than we'd like."

A version of this article appeared in the November 20, 2002 edition of 91制片厂视频 Week as Scholars Finding New 'Value' In Student Test Data
