
Evaluation

State Personnel Development Grant (SPDG) grantees must describe how the SEA will assess, on a regular basis, the extent to which implemented strategies have been effective in meeting the prescribed performance goals. Below are journal articles, resources, and tools developed by SPDG evaluators to assist projects in evaluating their initiatives.

Assessments / Instruments / Checklists
Behavior Interventions and School Climate
Comparison Group Evaluation Design
Data Collection, Display, Analysis, and Use
Goal Attainment Scale
Fidelity of Implementation
Implementation
Logic Models
Measuring Collaboration
Examples of SPDG Evaluation Systems
Reports
Response to Intervention and Multi-Tiered Models of Intervention
SIG Program Evaluation (former SPDG Program)
Social Network Analysis
Working with Third Party Evaluators


Assessments / Instruments / Checklists

CIPP three-part series on Conducting High-Quality Customer Surveys:

First Webinar Slides: This webinar outlines the varied purposes of customer surveys, walks participants through the process of planning a high-quality customer survey, identifies some of the common problems related to planning customer surveys, and highlights the benefits of a well-designed customer survey. This webinar is accompanied by Resources on Planning, Designing, and Conducting Customer Surveys.

Second Webinar Slides: This webinar walks participants through key aspects of survey design, identifies some of the common problems associated with designing customer surveys, and discusses how to carry out pilot testing to improve the quality of the survey. This webinar is accompanied by Sample Item Formats for Customer Surveys.

Third Webinar Slides: This webinar outlines the different modes of survey data collection, walks participants through the process of collecting high-quality survey data and ensuring data security, identifies some of the common problems associated with conducting customer surveys, and offers suggestions for improving response rates and managing the data that have been collected. This webinar is accompanied by a Checklist for Planning, Designing, and Conducting Customer Surveys.
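
The response-rate advice in the third webinar lends itself to a quick calculation. Below is a minimal sketch (not part of the CIPP materials; all counts are hypothetical) of computing a response rate and an approximate 95% margin of error for a survey estimate:

    # Sketch only: response rate and margin of error for a survey proportion.
    import math

    def response_rate(completed: int, eligible: int) -> float:
        """Completed surveys divided by eligible sample members."""
        return completed / eligible

    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        """Approximate 95% margin of error for an estimated proportion p."""
        return z * math.sqrt(p * (1 - p) / n)

    completed, eligible = 312, 480                 # hypothetical numbers
    rate = response_rate(completed, eligible)
    moe = margin_of_error(p=0.62, n=completed)     # e.g., 62% agreed with an item
    print(f"Response rate: {rate:.0%}; item estimate 62% +/- {moe:.1%}")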


Noonan, P., Langham, A., & Gaumer Erickson, A. (2013). Observation checklist for high-quality professional development in education. Lawrence, KS: University of Kansas, Center for Research on Learning.

Pre-Post Test Guidance Checklist for Multiple-Choice Tests (Noonan, P., & Gaumer Erickson, A., 2012). This checklist can be used to ensure best practices in developing items for multiple-choice tests used to evaluate changes in teacher knowledge resulting from your professional development session.
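
To illustrate the kind of pre/post analysis such tests feed into, here is a short sketch (not from the checklist itself; all scores are hypothetical) using a paired-samples t-test to gauge knowledge change across one PD session:

    # Sketch only: paired t-test on hypothetical pre/post multiple-choice scores.
    from scipy import stats

    pre  = [12, 9, 15, 11, 14, 10, 13, 8, 12, 11]    # pre-test scores (of 20 items)
    post = [16, 13, 18, 15, 17, 12, 16, 11, 15, 14]  # post-test, same teachers

    t, p = stats.ttest_rel(post, pre)
    gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
    print(f"Mean gain: {gain:.1f} items; t = {t:.2f}, p = {p:.4f}")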

Arkansas SPDG Research-Based Professional Development Evaluation Form (PDF, February 2011). 

Arkansas SPDG Evaluation Form (PDF, Jan. 2010). Used for research purposes.

Arkansas Department of Education SPDG Consultation Evaluation Form (PDF, Feb. 2011). Used by school personnel to evaluate online consultation services.

Coaches Self-Assessment (DOC; Lewis-Palmer, Barrett, & Lewis, 4/04; revised Newcomer, 10/09)

District Capacity Assessment (DRAFT DOC, 7/2012) This assessment is intended to help district leadership teams and state professional development providers assess their readiness on elements related to tiered interventions for academics and behavior and to implementation science.

Overview of Running Start and evaluation activities (PDF, Utah, 3/2012). Utah's Running Start is a special education new teacher induction program designed to prepare licensed and unlicensed special education teachers to teach on the first day of school. This PDF provides an overview of Running Start with an emphasis on the evaluation activities and outcomes associated with Running Start and the generalization of vital teaching behaviors to the classroom. Additional emphasis is placed on the role of micro-teaching, coaching, and ongoing performance feedback as critical attributes of "sticky" professional development. Running Start Coaches Checklist (PDF, Utah, 3/2012)

Pennsylvania Paraprofessionals Credential of Competency Checklist (3/2012)

Culturally Responsive PBIS: Modifications of Tools to Include Cultural Responsiveness. These tools - the EMS Rubric, the Cultural Responsiveness Assessment (CRA), and the 5 X 5 Walkthrough - were developed by PBIS Indiana for use by state, district, and school staff. The most important tool is the EMS Rubric, which shows how the developers modified standard PBIS fidelity measures to include issues of culture. The CRA and the 5 X 5 Walkthrough are interesting in that they are different methods for getting at some of the same dimensions of cultural responsiveness in a school: the CRA is self-report, and the 5 X 5 is based on observation by staff.

PBIS Indiana EMS Rubric (DOC, 7/2012)
Cultural Responsiveness Assessment (DOC, 7/2012)
5 X 5 Walkthrough (DOC, 7/2012)

Ohio's Teacher Based Team Checklist (PDF, 3/2012). 

Ohio's Building Leadership Team (BLT) Effectiveness Survey (PDF, 3/2012)

Ohio's District Leadership Team (DLT) Effectiveness Survey (PDF, 3/2012)

inQsit Online Assessment System (Fortriede, D. and Draper, V., Ball State University) inQsit is a comprehensive online assessment instrument. You can create, distribute, and administer tests, quizzes, and surveys via an intranet or the internet, then retrieve, record, and optionally grade and assess responses, all online, without any HTML or programming knowledge.
Keywords: online assessment instrument, online surveys, online quizzes

The Effect of an "Unsure" Option on Early Childhood Professionals' Pre- and Post-Training Knowledge Assessments (American Journal of Evaluation, published online 9/3/2010, Wakabayashi, T., & Guskin, K.)
Keywords: knowledge assessment, measurement, methods, early childhood professionals, true/false test

PaTTAN Leadership Mentoring to Mastery (M2M): Self Assessment Tool (DOC, 1/2009) Developed by the Pennsylvania DOE, based on the Council for Exceptional Children/Council of Administrators of Special Education Advanced Knowledge and Skill Set for Special Education Administrators and aligned with the Pennsylvania Inspired Leadership Standards.
Keywords: knowledge assessment, leadership, administrators, mentoring

CYFERnet (Children, Youth and Families Education and Research Network) CYFERnet's evaluation section includes practical tools that can be used to evaluate community-based programs, information on how community programs can be sustained, and assessments of organizational support for work in the areas of children, youth, and families. This site is hosted by North Carolina State University.
Keywords: community-based programs, children and families, organizational assessment

Rhode Island Modified Universal Design for Learning Educator Checklist (DOCX, 9/24/2009) Adapted from CAST UDL Guidelines
Keywords: UDL, Universal Design for Learning, Educator Checklist

Advancing Parent-Professional Leadership in Education (A.P.P.L.E.) Example Post-Course Survey (PDF, Spring 2009) Includes course evaluation items, post-test items on course content and general participant feedback items.
Keywords: parent survey, parent leadership

Behavior Interventions and School Climate

Vermont PBiS Website, with evaluation resources.

Vermont PBiS Team Implementation Checklists The Team Implementation Checklists (monthly and quarterly) are tools designed for school teams to monitor progress through PBIS implementation. For the first few months following PBIS implementation, it is recommended that school teams use this checklist to guide monthly PBIS Leadership Team meetings.

Vermont PBiS School-Based Universal Coordinator Self-Assessment (5/2011) This self-assessment is designed to assist coordinators in identifying current strengths and professional development goals.
Keywords: checklist, self-assessment, behavior, PBIS, team implementation

Developing Training Capacity for Statewide Implementation of Michigan's Integrated Behavior and Learning Support Initiative (MiBLSi): A Response to Intervention Model for Reading and Behavior Support (DOC, 2010) This document describes how to develop statewide training capacity for the implementation and sustainability of schoolwide models of reading and behavior support (RtI) through the application of the implementation drivers, creating an effective Trainer Capacity Development Model. The appendix includes evaluation resources: Participant Training Evaluation, Rubric Feedback Forms, Trainer Feedback Forms, Pre-Post Survey of Readiness for Training, Trainer Work Day Evaluation, and TAP Trainer Support Evaluation.
Keywords: Rubric, Trainer, Survey, Coach

Building Effective Schools Together through Positive Behavior Supports Implementation Rubric (XLS, 7/2010, CalSTAT) This interactive template is used by school teams to record their implementation level for 10 key elements. After entering their data, teams can generate reports.
Keywords: rubric, practice profile, implementation rubric

Data Collection, Display, Analysis, and Use

DATA COLLECTION

Model of T3 Behavioral Support: Scaling the Pyramid (PPT, 10/2012, C. Davis), Behavior PLC, Cohort 2 Model Demonstration Project (C2MD). Carol Davis presented the components of their tertiary model, the outcomes of their project, implementation lessons learned, and the tools developed. She closed with a user-friendly, tablet-based application recently developed for data collection.

Wisconsin SPDG Data Collection Tools (PPT) and WEBINAR RECORDING (Oct 2011, Evaluators PLC Session). Jim Fraiser, Ph.D., Wisconsin SPDG Evaluator, demonstrated two University of Wisconsin online data collection tools that SPDG evaluators can access free of charge.

New Mexico's iPad Training and Coaching Data Collection Tool (July 2011). Topics include measuring implementation, fidelity of implementation, challenges and solutions for data collection, and data management using web-based tools, including a new iPad app. To view the app, go to the Apple App Store and search for Training and Coaching Log. Anyone can download the free app, but you must have an account on the system to sync data; otherwise, the data stays on your iPad. For more information contact Carlos Romero - romero@apexeducation.org

Keywords: iPad app, mobile data collection, online training and coaching data collection

Coaching and School Data Collection App: The Classroom Mosaic App, the 20-Minute Target Survey, and Fidelity Checklist Tools. Data Collection and Fidelity of Implementation in the SC Gateways Project (PPT, 8/2012). Target audience: SPDG project staff, administrators, and instructional coaches. Presenter: Susan Beck, South Carolina SPDG Director.

NCSIP II Fidelity Observation Forms and Professional Development Data Collection Forms and Procedures North Carolina's SPDG (NC SIP II) evaluation tools. The website includes student progress data collection instructions and forms, online surveys, professional development data collection forms and procedures, and reading and math model fidelity observation forms.
Keywords: data collection, fidelity, reading, math, student progress data

New Mexico State Personnel Development Grant State Performance Plan Improvement Project Data Management System (PDF, 2009) This resource highlights New Mexico SPDG's State Performance Plan Improvement Project (SPPIP) Online Consultant Log for collecting training and coaching data. The New Mexico SPDG project uses a web-site to disseminate information about project activities, register participants for centralized events, manage school improvement plans (Educational Plans for Student Success), administer surveys for feedback and evaluation, and collect information on training, coaching and other technical assistance provided to schools by the cadre of consultants.
Keywords: online training log, online coaching, online database, State Performance Plan, data collection

DATA DISPLAY

Creating Customized Evaluation Reports for Stakeholders (Recording, 6/2009). Cheryl Walter and Alan Wood, California SPDG Evaluators. CalSTAT's process for summarizing and displaying evaluation data for multiple stakeholders, the shared tools they developed, and the software used to create their reports.
Keywords: stakeholder reports, data display, customized reports
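
As a loose illustration of the kind of display such reports contain (this is not CalSTAT's tooling; site names and values are hypothetical), a few lines of matplotlib can chart implementation levels by site:

    # Sketch only: a simple stakeholder-report chart of implementation by site.
    import matplotlib.pyplot as plt

    sites  = ["Site A", "Site B", "Site C", "Site D"]  # hypothetical sites
    levels = [78, 64, 91, 55]                          # % of elements in place

    fig, ax = plt.subplots()
    ax.bar(sites, levels)
    ax.set_ylabel("Elements implemented (%)")
    ax.set_ylim(0, 100)
    ax.set_title("Implementation level by site")
    fig.savefig("implementation_by_site.png", dpi=150)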

DATA ANALYSIS

RRCP Data Priority Report: Resources on data management, data analysis, and identification of improvement areas (7/2013, Blythe, T.) This report includes a compiled list of resources (e.g., tools, guidance, best practices) relating to data management, data analysis, and identification of improvement areas. This information will be used to develop training for RRCP staff and provide TA to states as they work to meet the requirements of Results Driven Accountability (RDA).


DATA USE

Washington State: Using Data to Drive Statewide Improvement Efforts (PPT, 11/2011) Leslie Pyper, Washington SPDG Director and Greg Roberts, Center on Instruction, Special Education Strand

OSPI Efforts Inventory Instructions and Reference Sheet (DOC, 10/2010) The purpose of this inventory is to identify those efforts within OSPI that are supportive of RtI components. This document was developed in conjunction with the Northwest Regional Comprehensive Center.
Keywords: Data Analysis, Data Use, Statewide Improvement, Systems Improvement

Goal Attainment Scale

Goal Attainment Scales: A Way to Measure Progress (PPT, 12/3/2012, M. Ballay and A. Gaumer Erickson) Amy Gaumer Erickson, Kansas and Missouri SPDG Evaluator, University of Kansas, Center for Research on Learning, and Monica Ballay, Louisiana Evaluation and Site Liaison, Louisiana State University. Link to SESSION RECORDING.

Goal Attainment Scale Templates and Data Displays
Goal Attainment Scale (GAS) Template (DOC, 2010) Monica Ballay
Modified Goal Attainment Scale Template (DOC, 2010) Monica Ballay
Goal Attainment Scale Form (DOC, 2010) Amy Gaumer Erickson
GAS Sample Data Displays (PDF, 2010) Amy Gaumer Erickson
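
For projects that want a single numeric summary across scaled goals, the classic Kiresuk-Sherman conversion to a T-score is one common option (a general formula offered here for reference; the templates above may use simpler summaries):

    T = 50 + \frac{10 \sum_i w_i x_i}{\sqrt{(1 - \rho) \sum_i w_i^2 + \rho \left( \sum_i w_i \right)^2}}

where x_i is the attainment level for goal i (typically -2 to +2, with 0 the expected outcome), w_i is the goal's weight (1 if unweighted), and \rho is the assumed correlation among goal scores (conventionally set to 0.3). A T-score of 50 indicates that goals were attained, on average, exactly as expected.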

Fidelity of Implementation

Quality Indicators for Assessing Individualized Services for Preschool with Significant Support Needs (PDF, 7/2012) and Action Plan (EXCEL, 7/2012) Developed by Colorado Department of Education. The Quality Indicators offer guidance to educators and administrators when developing, implementing and evaluating quality programs and services for students with the most significant needs. It's a tool designed to assist those who are educating students with significant support needs or evaluating these programs. 

Colorado RtI Fidelity of Implementation Rubrics (PPT, 3/2012, Pereles, D., and Jorgensen, D.) The presentation provides an overview of tools used by Colorado's SPDG Leadership Team, Technical Assistance Coordinators, and participating school sites to generate dialogue concerning RtI fidelity of implementation at the classroom, school, and district levels.

Resources: 
CO RTII Implementation Rubrics Guidebook (PDF, 11/2010)
CO RTI Implementation Rubrics Training (PDF, 1/2011)
CO RTI Implementation Rubric School-Level (PDF)

Data Tracking and Usability (NH RESPONDS) (PPT, 7/2012, NH SPDG) What happens with all the data that you collect? Ever feel like it's not used much because there's too much or it's not easily accessible? The presentation describes the methods and data tracking systems employed by New Hampshire's SPDG Leadership Team and evaluators to capture the extent to which the intervention is being implemented with fidelity. The goal of the NH RESPONDS initiative is to develop and implement a multi-tiered literacy and behavior support program that will result in improved student outcomes in five NH demonstration school districts. As the project ended its fourth year, it became apparent that various types of data were being collected (e.g., PD satisfaction and implementation data; student progress monitoring data utilizing a number of literacy and behavior assessments; student outcome data measured by state and norm-referenced assessments). The project Evaluation Workgroup developed a data-tracking system to ensure that pertinent data were collected and summarized in usable formats. The system has been valuable in: 1) identifying gaps in PD and TA delivery; 2) pinpointing missing data at some demonstration sites; 3) giving schools/districts actionable student-level performance data to inform appropriate programming across the 3 tiers; and 4) providing the RESPONDS leadership with a bird's-eye view of the entire program to inform decision-making.

RESOURCES: 

NH RESPONDS Assessment Instruments/Tools, DOC
NH RESPONDS School Summary of RtI Implementation (blank form), DOC
NH RESPONDS School Summary of RtI Implementation (completed sample), DOC

Developing, Measuring & Improving Program Fidelity: Achieving positive outcomes through high-fidelity implementation (PPT, 2/2012, SPDG National Meeting)  Allison Metz, Associate Director, National Implementation Research Network. 

Examples of fidelity instruments
Teaching Pyramid Observation Tool for Preschool Classrooms (TPOT), Research Edition, Mary Louise Hemmeter and Lise Fox
The PBIS fidelity measure, the School-wide Evaluation Tool (SET), described at http://www.pbis.org/pbis_resource_detail_page.aspx?Type=4&PBIS_ResourceID=222
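
As a hedged sketch of how checklist-style fidelity data such as the SET's are commonly summarized (subscale names are abbreviated and item values hypothetical; consult the SET manual for the actual scoring rules), each subscale is the percent of possible points earned and the overall score averages the subscales:

    # Sketch only: percent-of-points fidelity scoring for a checklist measure.
    subscales = {  # item scores, each 0-2 as on the SET; values hypothetical
        "Expectations defined": [2, 2],
        "Expectations taught":  [2, 1, 2, 1, 2],
        "Reward system":        [2, 2, 1],
    }

    def pct(scores, max_per_item=2):
        """Percent of possible points earned on a subscale."""
        return 100 * sum(scores) / (max_per_item * len(scores))

    by_subscale = {name: pct(items) for name, items in subscales.items()}
    overall = sum(by_subscale.values()) / len(by_subscale)
    for name, score in by_subscale.items():
        print(f"{name}: {score:.0f}%")
    print(f"Overall implementation: {overall:.0f}%")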

Articles
Sanetti, L., & Kratochwill, T. (2009). Toward Developing a Science of Treatment Integrity: Introduction to the Special Series. School Psychology Review, 38(4), 445-459.
Mowbray, C.T., Holter, M.C., Teague, G.B., & Bybee, D. (2003). Fidelity Criteria: Development, Measurement and Validation. American Journal of Evaluation, 24(3), 315-340.
Hall, G.E., & Hord, S.M. (2011). Implementing Change: Patterns, principles and potholes (3rd ed.). Boston: Allyn and Bacon.

The Evaluation Forum (Blog) - "Fiddling with Fidelity?" The author provides a unique description of fidelity, using a bread pudding metaphor. The post goes on to describe the key questions evaluators should ask for process evaluation purposes.

National Center on Response to Intervention Resources on Fidelity to RTI

NCSIP II Fidelity Observation Forms and Professional Development Data Collection Forms and Procedures North Carolina's SPDG (NC SIP II) evaluation tools. The site includes student progress data collection instructions and forms, online surveys, professional development data collection forms and procedures, and reading and math model fidelity observation forms.

Advancing Parent-Professional Leadership in Education (A.P.P.L.E.) Example Post-Course Survey (PDF, Spring 2009) Includes: course evaluation items, post-test items on course content and general participant feedback items.

Creating Customized Evaluation Reports for Stakeholders (Recording, 6/2009) Cheryl Walter and Alan Wood, California SPDG Evaluators. CalSTAT's process for summarizing and displaying evaluation data for multiple stakeholders, the shared tools they developed, and the software used to create their reports.

PRESENTATIONS

SPDG Evaluator's Webinar - Problem Solving and Response to Intervention SAPSI Training (PPT and Recording, 1/2011; Jose Castillo and Clark Dorman, University of South Florida, Tampa) An in-depth description of how they are using the Self-Assessment of Problem Solving Implementation (SAPSI) to formatively evaluate their project activities. In addition, they share other evaluation instruments that would be of interest to the community. Resources shared during the session:

Florida Statewide Problem Solving/Response to Intervention Evaluation Tool Technical Assistance Manual (PDF)
Example - Self-Assessment of Problem Solving Implementation (SAPSI) Administration Summary, 2010-2011 School Year (DOC)
Florida's PS/RtI Project Program Evaluation Website
Florida's PS/RtI Project Year One Evaluation Report
Florida's PS/RtI Project Year Two Evaluation Report

Keywords: problem solving, fidelity of implementation, self-assessment

Implementation

Overview of Stage-Based Measures of Implementation Components (PDF, 11/2011). SISEP offers this table outlining stage-based assessments of implementation activities in practice, to be used (and modeled) by Regional Implementation Team members during district or project work with District Implementation Teams or Project Teams in the state DOE. The table is subsequently useful to district leadership and implementation team members as they work with groups of schools or with building leadership and implementation teams.

ImpleMapping. Exploration Stage Assessment of Implementation Capacity (PDF, 11/2010). This is part of SISEP's series of Stage-Based Measures of Implementation Components.

Installation Stage Assessment (PDF, 2/2011). This is part of SISEP's series of Stage-Based Measures of Implementation Components.

Installation Stage Action Planning Guide for Implementation (PDF, 2/2011). This is part of SISEP's series of Stage-Based Measures of Implementation Components.

Full Implementation Stage Assessment (PDF, 2/2011). This is part of SISEP's series of Stage-Based Measures of Implementation Components.

Stages of Implementation Analysis (PDF, adapted 1/2012). This tool gives the team the opportunity to plan for and/or assess the use of stage-based activities to improve the success of implementation efforts for EBPs or evidence-informed innovations. The tool can be used to assess current stage activities (e.g., "We are in the midst of Exploration") or past efforts related to a stage (e.g., "We just completed most of Installation. How did we do? What did we miss?").

Missouri Integrated Model Implementation (MIM) Implementation Matrix (PDF, 3/2012). This matrix, based on essential features of school improvement, is used to support teams in interpreting multiple sources of data and identifying areas of focus for action planning.

Measuring School-wide Implementation of Academic and Behavior Supports School Implementation Scale (PDF): This evidence-based online survey measures the implementation of key components within multi-tiered models of support school-wide (across all school staff). The measure shows strong validity and reliability and has proven to be beneficial in data-based decision-making for schools, districts, and professional development providers.  Results of the online survey have been used by evaluators and project management to identify professional development/coaching needs, as well as by school leadership teams to target areas for action planning. This cost-effective measure has the potential to provide accurate and reliable data across multiple RTI, PBS and integrated models.

Michigan's Integrated Behavior and Learning Support Initiative (MiBLSi) - Evaluation Resources Find resources on MiBLSi's model implementation evaluation, measurement manual, measures, measurement schedule, and support training sequence.
Keywords: implementation, measurement, training, multi-tiered model of intervention

Ohio Improvement Process (OIP) Implementation Criteria & Rubric (PDF, 3/2012) and Ohio Improvement Process (OIP) Graphic (PDF)

ERIA: Effective Reading Interventions Academy, A Pathway Towards RTI2 (PDF, 12/10/2010) This booklet explicitly addresses both implementation and intervention practices to guide the design of a site-based program.
Keywords: practice profile, RtI, Response to Intervention

ERIA Implementation Rubric (XLS, 7/2010) CalSTAT's 10 key elements instrument provides a framework for trainers, coaches, site team members and teachers to evaluate and discuss implementation, fidelity and next steps.
Keywords: implementation rubric, RtI, Response to Intervention

Building Effective Schools Together through Positive Behavior Supports Implementation Rubric (XLS, 7/2010) CalSTAT's interactive Excel template is used by school teams to record their implementation level for 10 key elements. After entering their data teams can generate reports.
Keywords: implementation rubric, behavior interventions

Missouri/KU Regional Coach Level of Implementation and Collaboration Interview Questions (DOC, 9/2010)

Missouri School Staff Survey (PDF, March 2011) The Missouri Integrated Model (MIM) School Staff Survey is designed to gain input from all school staff (teachers, administrators, and instructional staff) regarding the level of implementation of the essential features of the MIM. Developed with input from the MIM Evaluation Work Group, the survey asks participants to rate their own behavior in implementing the 11 essential features (Shared Vision and Commitment, Leadership, Collaborative Environment, Ongoing Professional Development, Mentoring and Coaching, Culturally Responsive Practices, Resource Mapping, Family and Community Involvement, Evidence-Based Practices, Data-Based Decision Making, and Monitoring of Student Progress). Click the hyperlink to view Amy Gaumer Erickson's presentation.
Keywords: school staff survey

PRESENTATIONS

Evaluating the Implementation Drivers (PPT, 5/2010) SPDG Evaluators' Webinar. Amy Gaumer Erickson, SPDG Evaluator, Kansas and Missouri; Julie Morrison, SPDG Evaluator, Ohio; Pat Mueller, SPDG Evaluator, New Hampshire and Mississippi.

2011 SPDG Directors' Professional Development Webinar #1: Models of and Evaluation of Professional Development (Recording)

Professional Development Series #1 - Evaluation (PPT, 1/2011) Julie Morrison, Ohio SPDG Evaluator, explored how NIRN's implementation drivers framework has prioritized staff competence for effective programs and practices, and examined how Guskey's five critical levels for evaluating professional development can be used as a framework for designing effective training.

2011 Professional Development Webinar: Models of and Evaluation of Professional Development. Tools as Implementation Drivers: An Example from California SPDG's ERIA - Effective Reading Interventions Academy (PPT, 1/2011) Cheryl Walter and Alan Wood, California SPDG Evaluators, CalSTAT

Logic Models

Logic Models (Recording, 2011) The Center for Evaluation and Education Policy (CEEP) created this Voice-Over PowerPoint for OSEP on: (1) How to create and use logic models and (2) How to create high quality objectives and performance measures.

This website provides information on logic models and lists multiple online resources.

The Education Logic Model (ELM) Application developed by IES and the Pacific REL can be found at:  http://relpacific.mcrel.org/resources/elm-app
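
As a rough illustration of the structure a logic model captures (this is one possible representation, not the CEEP or ELM App format; all entries are hypothetical), inputs, activities, and outputs can be kept explicitly linked to outcomes and their performance measures:

    # Sketch only: a simple data structure linking logic-model components.
    from dataclasses import dataclass, field

    @dataclass
    class Outcome:
        statement: str
        performance_measures: list[str] = field(default_factory=list)

    @dataclass
    class LogicModel:
        inputs: list[str]
        activities: list[str]
        outputs: list[str]
        short_term: list[Outcome]
        long_term: list[Outcome]

    model = LogicModel(
        inputs=["SPDG funds", "state trainers"],
        activities=["deliver PD on reading interventions", "coach school teams"],
        outputs=["# of teachers trained", "# of coaching sessions"],
        short_term=[Outcome("teachers apply strategies with fidelity",
                            ["% of observed lessons meeting fidelity criteria"])],
        long_term=[Outcome("student reading outcomes improve",
                           ["% of students at benchmark on state assessment"])],
    )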

Measuring Collaboration

Measuring Collaboration Among Grant Partners (American Journal of Evaluation, 2006, 27(3), p. 383)
Keywords: collaboration; grant partners; cooperation; visual displays.

Transition Collaboration Scale and Rubric Process (PDF, 1/2011) Patricia Noonan, University of Kansas, Center for Research on Learning

Transition Collaboration Survey (PDF, 1/2011) Patricia Noonan, University of Kansas, Center for Research on Learning. The Transition Collaboration Survey is designed to gain input from individuals regarding their level of collaboration in providing transition education and services.
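
The Frey et al. article above scores each partner relationship on a Levels of Collaboration Scale from 0 (no interaction) to 5 (full collaboration). Below is a minimal sketch of aggregating such ratings to flag weak ties; partner names and ratings are hypothetical:

    # Sketch only: mean collaboration level per partner, lowest first.
    ratings = {  # each respondent's rating of interaction with each partner
        "Parent Center": [3, 4, 2, 3],
        "State PTI":     [1, 2, 1, 2],
        "University":    [4, 5, 4, 4],
    }

    for partner, scores in sorted(ratings.items(),
                                  key=lambda kv: sum(kv[1]) / len(kv[1])):
        mean = sum(scores) / len(scores)
        print(f"{partner}: mean level {mean:.1f}")  # low means flag weak ties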

Examples of SPDG Evaluation Systems

MiBLSi's Three Levels of Project Evaluation (PPT, 12/2012, A. Harms) In this session, Dr. Anna Harms describes the three levels of Michigan's evaluation system. The Formal or Organizational Level involves evaluation for, and feedback loops with, stakeholders (OSEP, Michigan Department of Education) and partners (intermediate school districts, districts, school personnel) to ensure critical outcomes are being met through project activities. The Process Level focuses on evaluation of inter- and intra-unit productivity, collaboration, and communication to continuously improve how resources are transformed into supports for partners. The Individual Level emphasizes using evaluation procedures to support project staff as they contribute to the project's mission and goals. She also discusses how each unit of the project contributes to evaluation and continuous improvement. SESSION RECORDING

Reports

Creating Customized Evaluation Reports for Stakeholders (PPT and Recording, 6/2009) Cheryl Walter and Alan Wood, California SPDG Evaluators. CalSTAT's process for summarizing and displaying evaluation data for multiple stakeholders, the shared tools they developed, and the software used to create their reports.

Annotated Materials List (PDF) Includes links to SIG 2/CalSTAT Activities and Outcomes Evaluation Summary Report, Professional Development TA and Collaborative Sites Survey Summary Report, ERIA Summary Report 2007-08 and ERIA Team Implementation Checklist (TIC) Charting Program. To see additional reports developed by CalSTAT go to: http://www.calstat.org/evaluation.html.


Wisconsin SPDG's current efforts related to its Fidelity of Implementation and Sustainability Evaluation Architecture, along with examples of evaluation reports for fidelity of implementation and sustainability, can be found at http://wispdg.org/wispdg_evaluation.html.


Response to Intervention and Multi-Tiered Models of Intervention

National Center on Response to Intervention (NCRTI) Website: RtI screening, progress monitoring, and other tools. Use the advanced search function for best results.
Keywords: RtI, response to intervention, progress monitoring, screening

Response to Intervention Progress Monitoring Resources for Grades K-12 (PDF, 3/2009) Southeast Comprehensive Center
Keywords: progress monitoring

Washington's online Integrity Rubric for RTI implementation. Anonymous users can click the "Explore RTI Data" link to browse the data entered so far. Washington launched the system in August 2012, so there may not be many users yet. To see the actual rubric web form, you must create an account; after creating a username and supplying an email address, you can page through the form to see how it works. Users click stars to rate each item; a comment box then opens so they can insert evidence supporting the rating. The optimal browser is Google Chrome; they are working to resolve some display bugs in Internet Explorer. Based on feedback from their July training sessions, they are also refining the integrity rubric training modules; the revised modules will be posted once available (sometime after September 10, 2012).

The RTI Essential Components Integrity Rubric (PDF, 8/2011) and the RTI Essential Components Integrity Worksheet (PDF, 8/2011), developed by the National Center on Response to Intervention, are for use by individuals responsible for monitoring the school-level fidelity of Response to Intervention (RTI) implementation or for self-appraisal. The rubric and the worksheet are designed to be used together and are aligned with the essential components of RTI.

NOTE: The Integrity Rubric for evaluation of implementation has been used by Washington for two years (since 2010) with their pilot sites. Washington has begun entering the data summer/fall 2012.

RTI Essential Components Integrity Worksheet: Examples of Possible Evidence (DOC, 8/2012). This accompanying resource was developed by Washington's Office of Superintendent of Public Instruction (OSPI).

I-ASPIRE Evaluation Tools Find the following tools: Data Protocol Form, Self-Assessment of Problem Solving Implementation, Technical Assistance Log Forms, Fidelity Checklist, Parent Survey, and Institutes of Higher Education Checklist. The evaluation tools were developed by Loyola University's Center for School Evaluation, Intervention and Training for the Illinois SPDG, I-ASPIRE.

I-ASPIRE Project Evaluation Tools Overview (PPT, 10/5-6/2009, Kathy Cox, I-ASPIRE Director) An overview of two evaluation tools and data collected on levels of implementation for 2007-2008. The primary focus of the I-ASPIRE project is reading at the K-3 level.

I-ASPIRE Project Problem Solving Fidelity Checklist  (PDF, 2009) Allows external evaluation by I-ASPIRE regional staff of implementation in school sites. The primary focus of the I-ASPIRE project is reading at the K-3 level.

Illinois RTI Network - Self-Assessment of Problem Solving Implementation-School Level (SAPSI-S), 2012-2013 (PDF). The Self-Assessment of Problem Solving Implementation-School Level (SAPSI-S) monitors ongoing efforts to establish permanent problem solving procedures, tools, and products and thereby implement a multi-tiered system of supports (MTSS).

Illinois RTI Network - Self-Assessment of Problem Solving Implementation at the District Level (SAPSI-D), 2012-2013, PDF. Purpose and Target Participants: The Self-Assessment of Problem Solving Implementation at the District Level (SAPSI-D) monitors ongoing efforts to establish permanent problem solving procedures, structures, tools and products in the implementation of a multi-tiered system of supports (MTSS). The district leadership team should complete the SAPSI-D once each academic year in the spring. The SAPSI-D can, however, be completed more frequently (e.g. once per semester) for the purposes of further district level planning.

Methods & Instrumentation: Implementing Multi-Tiered Literacy & Behavior Models (PPT, Pat Mueller, Evergreen Evaluation Consulting) This presentation highlights the methods and instrumentation employed by New Hampshire's SPDG Leadership Team and evaluators to capture the extent to which the intervention is being implemented with a high degree of fidelity. The primary outcome of the NH RESPONDS initiative is to develop and implement a multi-tiered, braided literacy and behavior support program that results in improved student outcomes in five demonstration school districts across New Hampshire. The intent is for the demonstration schools to have a highly developed, integrated three-tier system of literacy and behavior support by the end of the grant period, aligned to school improvement efforts and supported by district leadership activities.

 

New Hampshire Literacy Universal Team Checklist (DOC, 2002) Adapted from Sugai, Horner, and Lewis-Palmer, with content from the NH Literacy Action Plan. The Universal Team completes this checklist at two checkpoints during the school year, early fall and early spring, to monitor activities for implementation of RtI for literacy instruction.
Keywords: checklist, fidelity of implementation, TIC

Montana's RtI Implementation Survey (PDF, 2/2011) Montana's RtI Implementation Survey has been used to identify where schools and districts are in the process of scaling up RtI in their local areas. Target Audience: Participants who have identified RtI or PBIS as one of their SPDG goals.
Keywords: RTI, response to intervention, survey, implementation

Montana District Data Audit Tool (PDF, 9/2010) As Montana begins braiding RtI and PBIS, this Data Audit Tool has been designed to assist local sites in reviewing and discussing data in the areas of academic achievement, discipline, and special education placement over a three-year period.
Keywords: audit, district, multi-tiered model, integrated model

SIG Program Evaluation (former SPDG program)

Evaluation of the State Program Improvement Grant (SIG) Program: Final Report (PDF, 12/2007; Fiore, T.A., Munk, T.E., & Langham, A., Westat; and Magliocca, L.A., Ohio State University) Includes a systems change model and an elegant way of designating outcomes as robust, solid, probable, and possible. Outcome evidence is provided to support the categorization of the outcomes. There is also a very interesting section on administrative leadership. Please note that this is a SIG rather than an SPDG report; as such, some of the outcomes are a bit dated (e.g., participation in assessment) given the passage of NCLB.

Social Network Analysis

Using Social Network Analysis to Understand and Improve Collaboration Among Centers, Projects, and Initiatives (PPT, 12/2011) David Merves, Evaluators PLC Session. Webinar Recording
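
For a sense of what such an analysis computes, here is a minimal sketch (not from the webinar; organizations and tie weights are hypothetical) using the networkx library to measure network density and identify brokers among collaborating organizations:

    # Sketch only: density and betweenness centrality for a collaboration network.
    import networkx as nx

    G = nx.Graph()  # hypothetical ties: (org A, org B, contact-frequency weight)
    G.add_weighted_edges_from([
        ("SPDG", "Parent Center", 4), ("SPDG", "RRC", 3),
        ("Parent Center", "RRC", 1), ("SPDG", "University", 5),
        ("University", "RRC", 2),
    ])

    print(f"Density: {nx.density(G):.2f}")  # share of possible ties present
    for org, score in nx.betweenness_centrality(G).items():
        print(f"{org}: betweenness {score:.2f}")  # high scores = brokers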

Working with Third Party Evaluators

CENTER TO IMPROVE PROJECT PERFORMANCE (CIPP): Guidelines for Working with Third-Party Evaluators

The Guidelines for Working with Third-Party Evaluators is written to assist grantees and their OSEP Project Officers in planning for, finding and hiring, and working with third-party evaluators to design, implement, and complete a project evaluation. The document presents a discussion of the benefits, drawbacks, and limitations of using a third-party evaluator and practical guidelines for creating a third-party evaluation scope of work, developing a Request for Proposals, soliciting bids for and contracting with a third-party evaluator, and monitoring and managing the work of the third-party evaluator.

Center to Improve Project Performance
Guidelines for Working with Third-Party Evaluators – Evaluation Project Management Tools