Deeper Dive Session Explores Data Systems
The “Data Systems” Deeper Dive session, held during the AACTE 2019 Annual Meeting, examined the possibilities and challenges of using information and evidence-based research to improve teacher education programs. The discussion was led by moderator Robert Floden, dean of the College of Education at Michigan State University, and included four panelists: Kevin Bastian, senior research associate at the University of North Carolina (UNC) and director of the Teacher Quality Research Initiative, Education Policy Initiative at Carolina (EPIC); Charles Peck, professor of teacher education and special education at the University of Washington; Suzanne Wilson, Neag School of Education Endowed Professor of Teacher Education at the University of Connecticut (UConn); and Gladis Kersaint, dean of the Neag School of Education at UConn.
The robust discussion opened with Bastian sharing the details of a two-pronged study conducted by EPIC that pairs student teaching data with workforce outcomes. He stated, “The problem we’re interested in addressing is how can programs take a mountain of performance assessment data and identify what we might call actionable evidence within it.”
Bastian explained two approaches that teacher preparation programs can use to inform interventions with candidates and to prioritize and target their improvement efforts. The first approach, latent class analysis, groups candidates into profiles of instructional practice based on similar performance-assessment scores. The second approach, predictive validity analysis, estimates the relationship between candidates’ performance-assessment scores and their later performance as teachers. Bastian also stressed the need to make strategic placements more intentional, stating that “the research continually shows that high quality learning environments, highly effective clinical teachers, are related to how good that candidate is going to be as an early career teacher.”
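Neither approach was presented in technical detail during the session, but their general shape can be sketched. The Python snippet below is purely illustrative: it uses a Gaussian mixture model as a simple stand-in for latent class analysis and an ordinary least-squares regression for the predictive validity analysis. The data, rubric dimensions, outcome measure, and model choices are hypothetical assumptions, not EPIC’s actual methodology.

```python
# Illustrative sketch only -- not EPIC's actual analysis.
# Assumes a table of candidates with performance-assessment rubric scores
# and (for the second approach) an early-career effectiveness measure.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical data: 300 candidates scored on 5 rubric dimensions (1-5 scale).
scores = rng.integers(1, 6, size=(300, 5)).astype(float)

# Approach 1: group candidates into profiles of instructional practice.
# (A Gaussian mixture is used here as a rough analogue of latent class analysis.)
mixture = GaussianMixture(n_components=3, random_state=0).fit(scores)
profiles = mixture.predict(scores)   # profile label assigned to each candidate
print(np.bincount(profiles))         # how many candidates fall into each profile

# Approach 2: predictive validity -- relate assessment scores to a later
# measure of teaching effectiveness (simulated outcome here).
effectiveness = scores.mean(axis=1) * 0.5 + rng.normal(0, 0.5, size=300)
model = LinearRegression().fit(scores, effectiveness)
print(model.coef_)                   # which rubric dimensions predict the outcome
```

In practice, the profile labels could flag candidates for targeted support, while the regression coefficients suggest which assessment dimensions carry the most signal about early-career performance.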
Peck’s presentation was also based on two studies, both focused on data use. His conceptual framework was defined by three domains: the people involved in the work, the nature of the tools used to undertake program improvement efforts, and the organizational policies and practices that both support and interfere with people coming together and taking action. In discussing the first study, he emphasized the third domain—organizational policies and practices—and the need for academic leaders to invest time in analyzing the data. “You have to make time for people to collaborate, to think together, to look at data together and to make decisions based on the data,” said Peck. He shared some of the innovative strategies that the 10 programs participating in the study developed to collect, report, and act on data to support and sustain their work. Peck invited session attendees to visit the series of short briefs, Using Data for Program Improvement: A Study of Promising Practices in Teacher Education, available on AACTE’s website.
Peck described the second study, currently underway as part of the U.S. Teacher Preparation Education Improvement Network. He highlighted one finding showing a real discrepancy across programs in how people interpreted the meaning of data use work, and the need to develop a clearer, more shared understanding of what they wanted from that work. “That shared understanding, that shared commitment, must be grounded in local program values and priorities,” said Peck.
Wilson focused her talk on the ways public policy has had unintended consequences for schools of education. She argued that accountability “regimes” are structures that pressure educator preparation programs to act in whatever way makes success possible within that context. She pointed out that another unintended consequence of the accountability structure is the repositioning of students as products of programs’ work rather than as the people they care about.
In addressing the argument that the public has lost trust in educator preparation, Wilson offered her “bottom line.” She stated, “Trust is not going to be rebuilt with tests but instead with joint work, open communication, and a focus on improvement over compliance and regulation.”
In her presentation, Kersaint discussed Neag’s challenge in collecting and using data for program improvement, given the limited data available from the state regarding teacher employment, performance, and student evaluations. As a result, faculty have been self-reporting data, including student learning objectives they identify based on Connecticut’s System for Educator Evaluation and Development (SEED). “The problem is that the objectives are not standardized,” said Kersaint.
Given this context of imperfect data collection, she shared that the school relies on anecdotal evidence, such as rankings, that suggests it is doing well. Kersaint said her goal is to balance honoring faculty who, based on the data available, are performing well with still pushing them toward improvement. She also shared a number of recommendations for collecting data that Neag is currently working toward.
A video recording of this Deeper Dive session, “Data Systems,” is available to Annual Meeting attendees at aacte.org. Additional video recordings of the General Sessions and all Deeper Dives from the 71st Annual Meeting may be accessed in the AACTE Resource Library.
Tags: Annual Meeting, data, teacher quality, workforce development