
Posts Tagged ‘program evaluation’

Aspiring to the Characteristics of Technology Infused Preparation Programs

How Does Your Preparation Program Compare?

AACTE has long emphasized the need for a more robust integration of technology in teacher preparation programs. This vision involves a shift beyond mere coursework on educational technology to a comprehensive, program-wide infusion of technology, helping candidates graduate from programs as technologically proficient teachers. How does your preparation program compare to the literature on technology infusion? 

The Vision of Technology Infusion


Technology Infusion represents a holistic strategy to empower preservice teachers with high-level proficiency in the digital tools PK-12 students need to engage in modern-day learning experiences. This approach ensures that candidates benefit from continuous, developmentally appropriate exposure to technology's potential throughout their training, including methods courses and clinical experiences. By sharing responsibility for teaching the technology integration curriculum, faculty members and PK-12 mentor teachers can provide the teaching, modeling, and support candidates need to grow in teaching with technology. That shared work, grounded in theoretical foundations for technology integration, effective teaching practices, and supportive policy, is essential for candidates to graduate with self-efficacy for teaching with technology.

AACTE, Westat Piloting Surveys of EPP Graduates, Supervisors

AACTE and Westat are partnering with state chapters and education agencies this spring to pilot new surveys of beginning teachers and their supervisors. By developing common instruments to be used across states that can also be customized with state-specific questions, the partners aim to fill the need for both national benchmarks for preparation programs (as called for in accreditation standards) and state-determined priorities.

AACTE staff conducted exploratory work last year, collecting and studying state-level instruments currently used for surveying program completers in 13 states that were willing to share both their instruments and their most recent survey results. We found that all of the instruments align with the InTASC model standards for beginning teachers, although their length and emphasis areas vary. Meanwhile, we began talking with state education agencies (SEAs) and AACTE state chapters and member institutions to gauge their interest in consolidating these state and institution data collection efforts in a national-level instrument.

Study Tests Using Teacher Observation Data for Evaluation of EPPs

A new study finds that using observational ratings of beginning teachers may be a viable alternative—or a useful complement—to relying solely on controversial “value-added” modeling (VAM) in evaluation of educator preparation providers (EPPs).

An article about the study by Matthew Ronfeldt and Shanyce Campbell of the University of Michigan School of Education, published in the journal Educational Evaluation and Policy Analysis, is now available online.

The authors describe it as the first study to investigate the use of teachers' observational ratings to evaluate their preparation programs and institutions, and the results are compelling.

“The demands for teacher preparation accountability continue to grow, from the proposed federal regulations to new accreditation standards,” said Ronfeldt, who was also the 2016 recipient of AACTE’s Outstanding Journal of Teacher Education Article Award. “We sorely need better ways to assess program quality. Although VAM makes an important contribution to our understanding of program outcomes, we likely need multiple measures to capture something as complex as preparation quality. We are excited to find that teacher observational ratings could be a viable supplement.”

Profession-Led Innovations Most Grounded in Evidence

It’s axiomatic that experts in a field are better equipped than outsiders to design interventions that will work. Yet in education, we face a constant barrage of external reform efforts that fail to incorporate professional knowledge and expertise—and they just don’t work.

This point is reinforced in recent research out of the National Education Policy Center. In this study, Marilyn Cochran-Smith and her colleagues at Boston College (MA) examine the evidentiary base underlying four national initiatives for teacher preparation program accountability and improvement. They find that only one of the initiatives—the beginning-teacher performance assessment edTPA, designed and managed by the profession—is founded on claims supported by research. With a measure that is valid, scoring that is reliable, and therefore results that are accurate, we have a serious tool for program improvement.

Study: Evidence ‘Thin’ for Key Accountability Efforts—Except for edTPA

A new policy brief out of the National Education Policy Center (NEPC) reviews the evidentiary base underlying four national initiatives for teacher preparation program accountability and finds that only one of them—the beginning-teacher performance assessment edTPA—is founded on claims supported by research. The other three mechanisms included in the study are the state and institutional reporting requirements under the Higher Education Act (HEA), the Council for the Accreditation of Educator Preparation (CAEP) standards and system, and the National Council on Teacher Quality (NCTQ) Teacher Prep Review.

Holding Teacher Preparation Accountable: A Review of Claims and Evidence, conducted by Marilyn Cochran-Smith and colleagues at Boston College (MA), investigated two primary questions: What claims does each initiative make about how it contributes to the preparation of high-quality teachers? And is there evidence that supports these claims? In addition, researchers looked at the initiatives’ potential to meet their shared goal of reducing educational inequity.

Project Management Training, Tools Critical for Managing Accreditation Work

Accreditation work involves considerable project management to track logistics and the activities of stakeholders. Resource management is standard business practice in academic units, but the usual tools are not well suited to tracking projects with due dates and multiple actors. Tune in to AACTE's upcoming Online Professional Seminars (OPSs) to learn about specialized software and methods for managing assessment cycles, quality assurance systems, and accreditation submissions.

In a session starting January 25, OPS #6: Leveraging Accreditation for Quality Improvement will cover topics such as ethical considerations, tools, checklists, site visits, mock visits, and walk-throughs. Or join us starting February 8 for OPS #5: Preparing for Accreditation, where we’ll cover teamwork, readiness, calendar planning, document control, best practices, and more.

Tennessee EPPs Optimistic About Changes to State Report Card

On December 2, 2015, the members of the Tennessee Association of Colleges for Teacher Education (TACTE) held their collective breath as the Tennessee State Board of Education released the 2015 Report Card on the Effectiveness of Teacher Training Programs. After 5 years of publicity nightmares as programs’ ratings and rankings received widespread media attention, would this year’s report be any better?

Back in 2007, the Tennessee General Assembly passed legislation requiring the publication of a report on the effectiveness of educator preparation programs (EPPs) throughout the state. The report was to provide the following information on program graduates: placement and retention rates, Praxis II scores, and teacher effect data based on the Tennessee Value-Added Assessment System (TVAAS). Meghan Curran, director of Tennessee’s First to the Top programs, noted, “It is our intent that the report cards will help institutions identify both what they do well and where there is room for growth based on the outputs of their graduates.”

Take What You Need With AACTE Online Seminars

Did you know that AACTE's six Online Professional Seminars (OPSs) can be taken in any order? In fact, the seminars have no prerequisites, meaning you can skip what you already know and jump right into the professional learning you need most.

Or are you looking for a well-rounded understanding of assessment and accreditation issues for educator development, program improvement, and quality assurance systems? Then start from the beginning and run through the complete sequence of courses.

Offered through AACTE's Quality Support Initiative, the seminars are designed to be both flexible and convenient. Each course is completed asynchronously over a 3- to 4-week period, and multiple session options let you work around your schedule. We'll be starting several course sections this month, including some that run over the holidays, if that suits your needs—see the current schedule of available dates.

Build Confidence, Competence, and Capacity Through Online Seminars

The immediate value of taking part in AACTE's Online Professional Seminars is obvious: You get to enhance your peer network while gaining knowledge on crucial issues in the field, from assessment and data use to quality assurance systems and the nuts and bolts of preparing for national or regional accreditation. But there are other, long-term advantages to participating in the seminars offered through AACTE's Quality Support Initiative.

The OPSs provide a framework that allows you and your institution to focus on your faculty. The professional development offered through the seminars strengthens your performance in your current position and prepares you for future ones. By developing participants’ skills regarding assessment and accreditation, the OPS series builds individuals’ confidence and enhances their competence.

Learn to Use Data for Improvement in a Free Online Seminar

Data are ubiquitous in this day and age, and making sense of all the numbers and trends can be overwhelming. Yet using data wisely is critical to be able to learn from experience and determine strategic directions for improving what we do. So where do we start—how do we identify what information we need and appropriate sources to use? How do we recognize patterns in the data and their lessons for our work? And how do we put it all together to improve our programs and demonstrate our accountability?

Teacher Preparation Is Smart (Response to ‘Teachers Aren’t Dumb’)

Sometimes the story is as good as the headlines, and sometimes it's even better. The New York Times op-ed "Teachers Aren't Dumb" (Sept. 8) by psychologist Daniel T. Willingham is a case in point. As Willingham notes, contrary to popular belief, new teachers are solid academic performers. And as his article asserts, they can benefit from the research on effective teaching being conducted in the schools of education that prepare them. Willingham also points out—with rhetorical hyperbole—that not all preparation programs are using the latest research. While program quality varies, the excellent preparation provided by the universities whose researchers he cites shows that teacher education has strong exemplars. Unfortunately, Willingham does not acknowledge the widespread change within the educator preparation community.

The direction of today’s preparation programs is truly good news. Willingham accurately identifies two guiding principles for improving teacher preparation and program accountability: evaluate programs based on graduates’ performance on a rigorous, credible culminating assessment, and base that assessment (and programs’ content) on evidence of what works best for student learning.

AACTE to Offer Free Online Seminars, New Support Initiative

I am delighted to announce AACTE’s new Quality Support Initiative, which is designed to provide resources and support to educators interested in assessment and accreditation. Starting next month, we will offer Online Professional Seminars (OPSs) for faculty at AACTE member and nonmember institutions, undergraduate and graduate students, PK-12 teachers—or anyone involved in educator preparation.

As part of our mission to advocate and build capacity for high-quality educator preparation, AACTE has established this initiative to support the profession’s work in continuous improvement and accreditation. The OPSs provide professional development for individuals and promote organizational development for institutions in a convenient, flexible format.

Member Voices: Bring It On: Teacher Education Ready for Sensible Evaluation

This post originally appeared in Dean Feuer’s blog, “Feuer Consideration,” and is reposted with permission. The views expressed in this post do not necessarily reflect the views of AACTE.

The dean of the Curry School of Education at the University of Virginia recently wrote an op-ed for The Washington Post that was well meaning but misleading. It was surprising and disappointing to see a distinguished educator miss an opportunity to dispel conventional myths and clarify for the general public what is really going on in the world of teacher preparation and its evaluation.

For those who may have missed Robert Pianta’s short article, here is a summary and rebuttal.

Letter to Editor: Teacher Preparation Programs Are Effective and Accountable

The following letter to the editor was published in the Washington Post February 23, in response to the February 20 commentary by the University of Virginia’s Robert C. Pianta, “Teacher Prep Programs Need to Be Accountable, Too.”

Robert C. Pianta vastly oversimplified the narrative about accountability among those who prepare educators.

Educator preparation programs should indeed be accountable, and the profession has been busy creating data tools and processes for accountability. States such as Louisiana, California, and Georgia are working to determine the best ways to use data collected through existing assessments and surveys to document program impact. These systems rely on access to K-12 student achievement data as one indicator.