Respond by 6/29 to Survey on Evidence for Tech Integration
The authors are members and leaders of the AACTE topical action group called “All Things Accreditation.” The views expressed in this post do not necessarily reflect the views of AACTE.
Members of the All Things Accreditation Topical Action Group (TAG) spent the spring term this year designing and validating a survey that aims to help us better understand how the field is measuring technology integration in teacher education programs and what evidence colleges are using for technology expectations such as those of the Council for the Accreditation of Educator Preparation (CAEP). We invite you to participate in the survey at this link by June 29.
As teacher education programs seek to meet evolving expectations around candidate technology preparation, many programs have modified their curricula to meet the new ISTE standards as well as to satisfy CAEP expectations.
Past accreditation systems have included technology expectations at some levels, but they were not required in all reports or documentation. It was not until CAEP introduced its standards that technology was formally articulated as a preparation program expectation.
CAEP’s definition of technology as a program-wide, cross-cutting theme established a presumption that all providers (and all specialized professional associations) must address technology in a meaningful way. This change in expectations for programs’ focus and documentation challenges those applying for accreditation to present appropriate evidence of how candidates are being prepared to use technology.
Through conversations in our own networks, we have found that people are interpreting these requirements and documenting them in diverse ways. To supplement this anecdotal evidence, we developed the survey to give us a more comprehensive data-driven understanding of how institutions preparing for accreditation were interpreting and meeting these expectations.
Initial results from a pilot administration of the survey show that the standards that programs use vary greatly, with some using the ISTE Standards for Educators, others using InTASC, and still others relying on state technology standards (which may or may not be adaptations of national standards). Forty-six percent of respondents used both a stand-alone course and cross-curriculum integration to meet these expectations, and 15% used only a stand-alone course. All programs required students to create a technology-integrated lesson plan during their internship/student teaching. Most programs used a portfolio system to record student artifacts, and most of those (78%) said they are using a portfolio solution that is paid for by students (either a bookstore purchase or a special fee).
Even these preliminary data indicate multiple areas for development. For example, the majority of programs are not using distance technologies to communicate with their students in the field. Additionally, programs' organizational structures may be creating disconnects among the faculty and leadership making technology decisions. Many programs reported relying on leadership to make technology decisions, while instructors were responsible for reviewing course content in the majority of cases. These kinds of disconnects could undermine successful technology preparation, and programs could improve by exploring these inconsistencies.
Again, the link to the survey is http://bit.ly/caepsurvey, with a deadline of June 29. Please consider completing the survey or passing it along to the person from your program who would be most knowledgeable on the topic. Your participation will help build a clearer picture of how programs are meeting accreditation requirements related to technology and will surface ideas for different approaches.
University of Central Arkansas
University of Oklahoma