What You Can Do to Improve Your Program Measure Submission
Shauna Harps and Brad Keller describe best practices for writing and describing your measures
Exemplar Continuation Reports
Exemplar Annual Performance Report (APR) from Arkansas
Please use this report to see how Program and Project measures are included in the Status Charts.
Exemplar Final Report
Alabama Exemplar Final Report
Please note how Alabama shares its child outcomes.
Program Measures Guidance
Rubric B (PDF)
Please note that Rubric B is used by the GPRA External Reviewers.
It addresses efforts designed to sustain the use of SPDG-supported practices (for Program Measure 3).
SPDG Program Measure Methodology
This document describes how to respond to the four Program Measures.
Program Measure Example Continuation Report
This is an example of what a grantee might provide for the Program Measures in the qualitative and quantitative sections of the APR.
Related Resources - By Measure
Program Measure 1: Evidence-based Professional Development
SIGNetwork Resource Library materials on evidence-based professional development
Dr. Carol Trivette shares learning strategies to better engage and motivate adults in Reflections on How to Enhance Learning: HPL II
Drivers Best Practices Assessment (DBPA)
The purpose of the Drivers Best Practices Assessment (DBPA) is to assist organizations in assessing their current supports and resources for quality use of selected programs or practices.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). Download all or part of the monograph for free at: http://www.fpg.unc.edu/~nirn/resources/detail.cfm?resourceID=31
MiBLSi Training Scope and Sequence
Developing Training Capacity for Statewide Implementation of Michigan's Integrated Behavior & Learning Support Initiative (MiBLSi): A Response to Intervention Model for Reading & Behavior Support [DOC]
Trivette, C. M., Dunst, C. J., Hamby, D.W., & O’Herin, C. E. (2009). Characteristics and consequences of adult learning methods and strategies (Winterberry Research Synthesis, Vol. 2, No. 2). Asheville, NC: Winterberry Press.
Dunst, C. J., & Trivette, C. M. (2009). Let's be PALS: An evidence-based approach to professional development. Infants & Young Children, 22(3), 164–176. [Purchase here: http://journals.lww.com/iycjournal/toc/2009/07000]
Guskey, T. R. (2000). Evaluating professional development (pp. 79-81). Thousand Oaks, CA: Corwin Press.
Noonan, P., Langham, A., & Gaumer Erickson, A. (2013). Observation checklist for high-quality professional development in education. Center for Research on Learning, University of Kansas, Lawrence, Kansas.
Pre-Post Test Guidance Checklist for Multiple-Choice Tests (Noonan, P., & Gaumer Erickson, A., 2012). The following checklist can be used to ensure best practices in developing items for multiple-choice tests used to evaluate changes in teacher knowledge resulting from your professional development session.
Program Measure 2: Improving Implementation
Exploring Connections Between Implementation Capacity and Fidelity (SISEP) https://nirn.fpg.unc.edu/practicing-implementation/exploring-connections-between-implementation-capacity-and-fidelity
Usable Innovations streaming lesson (SISEP)
Practice Profiles (SISEP) streaming lesson
National Center on Intensive Intervention (NCII, https://intensiveintervention.org/) Intervention Fidelity Infographic https://intensiveintervention.org/sites/default/files/5_Elements_Fidelity_508.pdf
Usable Innovation Team Handout (SISEP)
The purpose of the Usable Innovation Team (UIT) is to make effective innovations teachable, learnable, doable, and observable in practice.
Dusenbury, L., Brannigan, R., Falco, M., & Hansen, W.B. (2003). A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research, 18, 237–256. Link to view article
Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18, 23–45. Link to purchase article (not available for free).
O'Donnell, C. L. (2008). Defining, conceptualizing, and measuring fidelity of implementation and its relationship to outcomes in K–12 curriculum intervention research. Review of Educational Research, 78, 33–84. Download Article Link
Mowbray, C. T., Holter, M. C., Teague, G. B., & Bybee, D. (2003). Fidelity criteria: Development, measurement, and validation. American Journal of Evaluation, 24, 315–340. Link to Journal website