Assessment Opinion

When 19 Heads Are Better Than One

By Kathryn Parker Boudett, Elizabeth A. City & Richard J. Murnane | December 06, 2005
Using data effectively does not mean getting good at crunching numbers. It means getting good at working together.

How can educators use the piles of student-assessment results that land on their desks to improve learning in their schools? Two years ago, a group of school leaders from the Boston public schools and faculty members and doctoral students from Harvard University's graduate school of education agreed to meet monthly to wrestle with this topic. ("Harvard, Boston Educators Team Up on Test-Data Book," April 27, 2005.) We began with a somewhat vague notion of producing a book that would help teachers and administrators take the high road in responding to student-assessment results: improve instruction, rather than engage in "drill and kill." We guessed that 19 heads would be better than one in producing such a book.

In our first year, we floundered. We were a collection of polite professionals who did our best to take time from our busy schedules to talk about using data in schools. We read each other's musings on the topic of the day and then spent many meetings trying to outline a book that would compile our respective contributions. Each of us learned something, but by the end of the year it was not clear "we" had made much progress toward our collective goal.

In our second year, everything changed. We identified three editors, who proposed reorienting the book. We scrapped the original idea of a collection of chapters about various relevant topics. The new focus would be addressing what our target audience (we identified it as school leaders, broadly defined) needed to know and be able to do in order to use data effectively. We found a publisher, who then convinced us to adopt an extremely aggressive deadline. Perhaps most important, we created a very deliberate process for involving everyone in getting the work done.

Data Wise: A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning is the fruit of our labors. The central message of the book is that using data effectively does not mean getting good at crunching numbers. It means getting good at working together to gain insights from student-assessment results and to use the insights to improve instruction. We put our 19 heads together to come up with a set of tools and strategies for making collaborative analysis of assessment data a catalyst for improving student learning. Reflecting on our process, we realize that many of the principles we advocate in the book are those that proved critical to making our own collaborative endeavor successful.

For many years, people have argued for increased cooperation between school practitioners and researchers. But as we learned in year one of our book project, just putting a bunch of teachers, administrators, and professors in the same room won't guarantee progress in solving knotty educational problems.


We see the lessons we learned as relevant to other collaborative efforts, including the Strategic 91制片厂视频 Research Partnership, or SERP, a new endeavor resulting from a 2003 National Academy of Sciences report. SERP aims to usher in a new paradigm for improving education by providing a structure for researchers and practitioners to collaborate in conducting rigorous studies in authentic school settings. We boil our lessons learned down to three: focus on a concrete goal, balance democracy and dictatorship, and revel in disagreement.

Focus on a concrete goal. Once we decided to create a step-by-step guide for school leaders, our collective task became clearer. We realized that we needed to distill from our collective experience a compelling process for improvement. When we looked at the areas of expertise of each group member, we began to see who might contribute to each chapter on each phase of the process.

We wanted to offer our readers a clear process for structuring the work of improvement, but we were also determined to avoid writing a preachy "how to." After all, the vast majority of our authors had spent years of their lives working as teachers, counselors, or administrators in schools, so we knew we had the real potential to write a compelling text for practitioners. To do this, we set about the task of creating two case-study schools to use throughout the book to breathe life into the improvement process. Invariably, first drafts of our chapters featured vignettes of leaders from these schools devising ingenious solutions to daunting problems. To keep true to our goal of writing a book that would really resonate with educators, we eventually revised these vignettes to illustrate the challenges, tensions, failures, and general "messiness" involved in using data in schools. We then worked to help readers see how to learn from that messiness.

Balance democracy and dictatorship. By the end of our first year, members seemed to tire of the constant appeal to the group for consensus. A few even approached those of us who had convened the meetings and implored, "Just tell us what to do next!"

So, over the summer, the three of us who were to become the editors made some major decisions about where to take the work. When we reconvened in the fall, we announced the focus for the book and a formal process for getting the work done. We assigned every member of the group to a team responsible for writing a specific chapter. We instituted a rigorous review process for each chapter: The full group would review and comment on teams' initial chapter outlines, and then offer feedback on a chapter draft later in the year. We distributed a calendar showing due dates for the different phases of each chapter. Finally, we asked that authors entrust their revised work to us editors, who would sew the story together and ensure that the book read with one voice.

Did our colleagues mutiny at being whipped into shape? Quite the contrary. Attendance at meetings soared. Deadlines were met. We became a true community of practice like the ones we were advocating in our book, with members accountable to each other.


Revel in disagreement. How can a book with so many authors provide a coherent story line? To make this happen, we had to devote time to really hashing out the sources of our many disagreements. Even though individuals to this day disagree on some issues, all of our authors endorse what's in our collective book. Initially, there were times when we wondered whether we could find common ground. For example, our assessment experts argued that focusing a school discussion on individual test items was a dangerous practice. Some of our school principals explained that discussing specific items was one of their most powerful strategies for engaging faculty members in looking at assessment results. After much discussion, we settled on recommending that school leaders use individual items as catalysts for discussion, not as a basis for decisions.

We also had many intense discussions about language. For example, we initially assumed that we would make use of the popular notion that a key step in data analysis is to identify the root cause of student learning difficulties. But many in our group took issue with the term "root cause." Is a school leader's job to uncover the deepest causes of achievement problems, or to treat the causes over which she has most control? In the end we decided to abandon the term entirely, in favor of more precise language that focused on the work school faculties could do.

We also learned that agreement on general principles did not always map to agreement on detailed practices. For example, everyone agreed that devoting a great deal of scarce instructional time to "teaching to the test" was a bad idea. However, some of our group maintained that, given the high stakes for students, familiarizing students with the format of mandatory state tests and providing opportunities to practice test-taking skills were appropriate. On this question we also discovered that "researchers" and "practitioners" aren't monolithic categories: quite often members of these somewhat arbitrary groups held differing views.

Keeping the focus on sources of disagreement kept the sessions lively, fruitful, and intense. We learned that adopting protocols to structure conversations helped ensure that the intense discussions generated light as well as heat. We discovered that points of tension were our sources of greatest learning, and we had to change our book outline many times over the second year to accommodate what we were learning together. We realized that tensions don't go away until you resolve them, and we were fortunate that our community of practice was strong enough to support frank conversations that led to resolutions we all could live with.

Most problems in education today are too complicated for individuals to solve alone. But when bringing together researchers and practitioners to work on tough issues, our experience is that the collaborative venture itself needs care and attention if it is to produce results. Only then are 19 heads better than one.
