Effective Features of Video-Based Professional Development for Math Teachers
Have you seen the JTE Insider blog managed by the Journal of Teacher Education (JTE) editorial team? Check out the following interview with the authors of a recent article. This blog is available to the public, and AACTE members have free access to the articles themselves in the full JTE archives online; just log in with your AACTE profile here.
The January/February 2018 issue of JTE contains an article by Mary Beisiegel of Oregon State University, Rebecca Mitchell of Pine Manor College (MA), and Heather C. Hill of Harvard University (MA) titled “The Design of Video-Based Professional Development: An Exploratory Experiment Intended to Identify Effective Features.” The article is summarized in the following abstract:
Although video cases and video clubs have become popular forms of teacher professional development, there have been few systematic investigations of designs for such programs. Programs may vary according to (a) whether teachers watch videos of their own/their peers’ instruction, or whether teachers watch stock video of unknown teachers; and (b) whether discussions are led by trained facilitators or by participants themselves. Using a factorial design, we defined four treatment conditions based on these possibilities, then assigned three groups of teachers to each condition. Teachers watched, scored, and discussed mathematics instruction according to each treatment condition’s protocol. Evidence from groups’ conversations and teachers’ video analyses and lesson reflections suggest that the teacher-led, own-video condition is slightly superior to the other conditions.
The authors reflect on their article and research in this recent interview for the JTE Insider blog:
Q: What motivated you to pursue this particular research topic?
A: As part of the National Center for Teacher Effectiveness, we collected and scored more than 900 elementary mathematics lessons with the Mathematical Quality of Instruction (MQI) instrument. To do this work, we hired and trained raters from across the country to help with the scoring. Some of the raters were mathematics teachers who scored video in their “spare time”; others were graduate students and early-career faculty. Each week, we conducted calibration checks – essentially seeing whether raters’ scores matched our scores – and then held webinars to explain our ratings and discuss discrepancies. During the webinars, the raters who were classroom teachers would often say that scoring videos was making them think about their teaching in new ways. We heard that comment so often that we thought we needed to explore how the MQI would work in professional development settings.
Also, even though she wasn’t teaching during this time, Mary felt that she was learning about her teaching because she kept “back scoring” lessons that she had taught in the past year. She would think about how she might restructure her lessons so that she would offer students multiple solution methods and talk about efficiencies and why one method would work better than another. She reflected back on how she had not emphasized mathematical language in her lessons and thought about ways she could elevate language in her future classroom. Heather had a similar experience; as she was teaching basic statistics to her students, she would think “Oh, I should have compared those two representations to make sure students saw it both ways” or “I should have offered a more thorough explanation.”
Rebecca had used the MQI with preservice mathematics teachers and found that learning how to score videos with the MQI and scoring their own lessons elevated how the preservice teachers thought about mathematics teaching.
With all that in mind, we wondered – how would teachers’ thinking about mathematics teaching change if they were trained on the MQI and asked to score video?
Q: Were there any specific external events (political, social, economic) that influenced your decision to engage in this research study?
A: We can’t think of political, social, or economic events in 2011 that would have influenced our decision to conduct the research. Instead, it was our own dilemmas in designing an MQI-based professional development program for teachers that led us to this work. In particular, in looking at the existing video-based professional development literature, a question emerged about whether teachers should watch videos of themselves and their peers or videos of unknown teachers. The literature described pros and cons of each model, but offered few comparisons of the two within the same study. We needed to know whether to use teachers’ own videos or those of unknown exemplars so that we could move forward with our final program design.
Another aspect of the literature that influenced our design was facilitation. External facilitators can be quite expensive for districts to hire and employ, and there’s a movement toward teachers participating in more collaborative learning structures within their schools. However, some of Miriam Sherin and Beth van Es’ work also suggests that teacher-led groups may lack focus. We decided to experiment, and to see whether the MQI professional development needed minimal vs. very structured facilitation. We reasoned that if teacher-facilitated conversations proved as substantive and as stimulating as those led by an external facilitator, we could argue that districts wouldn’t necessarily need to hire long-term facilitators. We started to ask the question: If we trained teachers to score mathematics lessons with the MQI, would their video scoring conversations be just as strong without an external facilitator? As with the type of video, when we explored the research literature on types of facilitation, we found only a few studies that compared this condition.
Q: What were some difficulties you encountered with the research?
A: One of the main difficulties was with random assignment. In our proposed research design, we had intended to randomly assign teachers to one of the four treatment conditions. However, when we recruited teachers in districts, we heard them say that they would like to participate – if they could be in groups with their peers (e.g., teachers they were familiar with, teachers in their own schools). It was clear that this was very important to the teachers, and we worried that teachers would be less invested in the project if we didn’t change the design of the study. After some thought, we did change the design and formed groups in different ways depending on the district. In one large urban district, timing and proximity to the meeting location helped to determine groups; in a suburban district, grade levels determined the groups (elementary school vs. middle school). Once the groups were formed on the basis of these factors, they were randomly assigned to one of the treatment conditions.
Another difficulty was getting teachers to volunteer to video-record their classrooms and share clips with their groups. The amount of clip sharing varied greatly across districts. In one large urban district, teachers who signed up for the study didn’t always know each other, and that seemed to limit the sharing of video clips. As an example, only two teachers in a group of eight shared a clip from their own lessons. In this case, we supplemented the professional development sessions with videos from unknown teachers. In another district, teachers knew each other well and had experienced a lot of professional development focused specifically on mathematics teaching. In this district, the own-video group teachers shared multiple clips with the group. They expressed that they were somewhat hesitant at first, but after the group had viewed and scored a clip, they shared that the process had been very useful.
Q: Writing, by necessity, requires leaving certain things on the cutting room floor. What didn’t make it into the article that you want to talk about?
A: We left out an analysis of teachers’ MQI scoring accuracy. We had originally thought that accuracy might be an outcome of the professional development, but as we worked with teachers, we decided that accuracy in scoring really wasn’t the point. Instead, we were interested in whether teachers’ thinking about their instruction changed. The scoring accuracy analysis also didn’t fit into our research questions as we constructed them in the paper.
Q: What current areas of research are you pursuing?
A: We are continuing the analysis of the data; specifically, more fine-grained, qualitative analysis of the comparisons between the facilitation conditions and the video conditions.
We have also used the lessons from this professional development study to design a coaching program based on the MQI. For instance, in that program we care less about teacher accuracy in scoring and more about their ability to reflect. We also use both stock and teachers’ own video, because we found that the stock video helped norm teachers around instruction, and the personal video helped teachers revise their practice.
Q: What new challenges do you see for the field of teacher education?
A: Heather has noted the challenges to causal inference in the field of teacher education. The field of teacher preparation is rapidly innovating, with new foci on core practices, novel forms of teacher education (e.g., rehearsals, community-based teacher preparation), and research that investigates novice teachers’ practice. However, there have been few corresponding innovations in field-specific research designs. Programs of research in teacher education are notoriously difficult to devise, partly because of a lack of opportunities for rigorous designs, partly because of the small number of preservice teachers served by most courses and programs, and partly because of the lack of inexpensive but widely available outcome measures. Perhaps as a result, much research in this field is descriptive in nature, focusing either on how teachers experience a particular instructional approach or on case studies depicting how teachers may develop knowledge, beliefs, habits of mind, or values as a result of those experiences. We believe that to sustain and enhance innovations in teacher education, we need to create new, more rigorous forms of research in teacher education.
Q: What advice would you give to new scholars in teacher education?
A: Look critically at how your ideas play out in practice. Know that failed efforts can be just as instructive as successful ones. Engage in design-based research as you think through new experiences for teachers. And, as a result, adopt more rigorous study designs.