Profession-Led Innovations Most Grounded in Evidence
It’s axiomatic that experts in a field are better equipped than outsiders to design interventions that will work. Yet in education, we face a constant barrage of external reform efforts that fail to incorporate professional knowledge and expertise—and they just don’t work.
This point is reinforced by recent research out of the National Education Policy Center. In this study, Marilyn Cochran-Smith and her colleagues at Boston College (MA) examine the evidentiary base underlying four national initiatives for teacher preparation program accountability and improvement. They find that only one of the initiatives is founded on claims supported by research: the beginning-teacher performance assessment edTPA, designed and managed by the profession. Because the measure is valid and its scoring reliable, its results can be trusted, giving programs a serious tool for improvement.
The other three accountability mechanisms included in the study don’t fare so well. The authors conclude that the state and institutional reporting requirements under the Higher Education Act, the Council for the Accreditation of Educator Preparation standards and review system, and the National Council on Teacher Quality Teacher Prep Review are supported by only “thin evidence” and hold little promise for positive program impact.
“The advocates of these [three] initiatives assume a direct relationship between the implementation of public summative evaluations and the improvement of teacher preparation program quality,” the authors write. “However, summative evaluations intended to influence policy decisions generally do not provide information useful for program improvement. The irony here is that while these policies call for teacher education programs and institutions to make decisions based on evidence, the policies themselves are not evidence-based.”
In short, successful program improvement is driven not by high-stakes accountability but by professionals who know the work firsthand. Look at the terrific results at Massachusetts' Brockton High School, where the experts on the ground came together to design an intervention built on coordinated support. Or look at David Kirp's book Improbable Scholars, about the gains made in Union City, New Jersey, where educators collaborate to use data for improvement rather than punishment and take a systemwide approach to policy making across the full span of education services.
In the case of edTPA, we have a tool that teacher educators spearheaded, at their own expense, to solve a specific problem they faced in their own work. Its success also builds on prior advances in the field, such as technologies and models developed by the National Board for Professional Teaching Standards. And while the research base for edTPA cited in this study is an important start, recent administrative reports provide further evidence, and studies of predictive validity are currently under way. None of this would have been possible, though, without the sustained, engaged efforts of the profession to develop, test, and implement the best assessment it could build. (See this recent article in Kappan magazine for examples of how teacher educators are implementing edTPA locally.)
When we act with integrity to articulate the professional consensus, and support the profession to act on that consensus, we get good results, at a better return on investment than reform-driven philanthropy can realize. For the most viable innovations, we need the wisdom of the profession.