
2015 National Meeting: Market Place Presentations

SESSION ONE:  1:30 – 1:55

Web-Based Design for Sharing and Coaching

In Missouri, data is shared across all levels of the system, and each exchange of data poses an opportunity for coaching. This presentation will show two examples and highlight the features that support coaching. One is the online system that supports observation of professional development delivery, data collection, and follow-up coaching. The other is the online system that supports building-level sharing of common formative assessment data with regional professional development providers and the follow-up coaching that occurs before the data is shared with the state.

Presenters:     Ronda Jenson, PhD, Director of Research, University of Missouri-Kansas City, MO SPDG

                Carla Williams, EdD, Research Associate, University of Missouri-Kansas City, MO SPDG

Questions for participants:        

  • In what ways are coaching mindsets and data-sharing mindsets complementary as well as contradictory?
  • Have you experienced hurdles to incorporating new technology, such as web-based systems, and how have you overcome them?

System Level:   State, regional, building

http://moedu-sail.org/

http://moedu-sail.org/implementation-checklists/

http://moedu-sail-cfa.org/home/cfa_index

 

Report Summaries for Various Stakeholders

Reporting is a necessity within every organization and project. Your project stakeholders will help you define requirements that are best implemented as operational functionality. This session explores critical report implementation issues: How should you architect your database to support reporting? Reports can be rendered in a variety of ways: printed, displayed on screen, or delivered as an electronic file.

Presenter:      David Merves, Senior Evaluator, Evergreen Evaluation & Consulting; External Evaluator, NH and DE

Questions for participants:        

  • What is a report?
  • Who receives reports within your project?
  • Who selects the data to report?
  • How is the data reported?

System Level: State, regional, district    

VT Early MTSS Program Inventory and Data Template

The Early MTSS Program Inventory is used by an Early Childhood program’s Leadership Team to assess the program’s ability to adopt the key components of Early MTSS and develop an implementation plan so that Early MTSS initiatives are implemented with fidelity and are sustainable over time. 

The Inventory Rating Scale is based on the stages of implementation, using a scale of 0–3. Once completed, the data can be summarized and displayed in chart form for ease of decision-making and trend tracking. The Inventory will be shared with participants, as well as a sample data template.

Presenter:      Pat Mueller, VT SPDG Co-Director

Questions for the participants:

  • What types of inventories or data tracking systems do you use?
  • In what ways do you analyze the data?
  • How are the results reported and to whom?

System Level: State, regional, district

Vermont Early Childhood MTSS (PPT)

Vermont State Professional Development Strategies to Support Inclusive Practices for Young Children with Disabilities

EC MTSS Goal Setting Tool

EC Program Inventory

VT Early MTSS Cohort 1 Fact Sheet


Transforming Implementation Data into Actionable Information: Protocols, Real-Time Data, and Customized Reporting

The Scientific Research-Based Interventions (SRBI) Self-Assessment is a tool used by CT’s SPDG to measure the level of implementation of SRBI (CT’s multi-tiered system of support) at the school level. This session will describe how data collection protocols, real-time data summaries via SurveyGizmo, and customized reports have been integrated to meet the information needs of technical assistance providers, schools, and grant leaders. Electronic and hard-copy examples will be provided.

Presenters:     Rebecca Walker, Executive Director, Glen Martin Associates

Kristina Lee, Research Associate, Glen Martin Associates Evaluators, CT SPDG

Questions for the participants:

  • What are some of the pros and cons you’ve encountered in trying to balance quick, cost-effective reports with more in-depth analysis? 
  • What are some examples of low-cost options you’ve used to strike this balance?

System Level: State, regional, district, building

SCIENTIFIC RESEARCH-BASED INTERVENTIONS (SRBI) SELF-ASSESSMENT: School instructions

SRBI SELF-ASSESSMENT TECHNICAL ASSISTANCE (TA) PROVIDER PROTOCOL

SRBI SELF-ASSESSMENT: Sample Summary Report

SRBI SELF-ASSESSMENT: Sample Cohort Summary Report

SESSION TWO:  2:00 – 2:25

LEA Report Cards as Feedback Loops

The Utah Multi-Tiered System of Supports project has created infographic report cards as a tool to share with LEA Implementation Teams their current level of performance and growth over time on key implementation indicators. These have also fostered additional conversations within the teams and with their state-assigned coaches around the stages of implementation and how to make greater progress in implementation of MTSS.

Presenters:     Leanne Hawken, PhD, University of Utah, UT SPDG Evaluator

                Ann-Michelle Neal, EdS, Program Specialist, UT State Office of Education

Questions for the participants:

  • How have other SPDG projects created feedback loops back to LEAs on performance, so that projects are not simply collecting data without LEAs being made aware of their performance?
  • How have these feedback loops facilitated conversations with LEAs on goals for improvement?
  • What other technology and tools exist that can help in the establishment of efficient and effective feedback loops?

System Level: State, regional, district

LEA Report Cards: Presentation Slides

UMTSS Report Card: Granite School District

UMTSS Report Card: Ogden School District

Master Practice Profile/Self-Assessment

In Missouri, we have developed a structure for professional development that includes the use of practice profiles. The practice profiles are rubrics designed to support implementation proficiency and have become a valuable tool in both training and coaching settings. The master practice profile pulls together the series of practice profiles across professional development topics into a self-assessment tool. This tool contains a questionnaire for each topic, auto-displays results on the corresponding practice profile, and auto-creates a dashboard compiling results across all topics.

Presenters:     Ronda Jenson, PhD; Director of Research, University of Missouri-Kansas City, MO SPDG

Carla Williams, EdD, Research Associate, University of Missouri-Kansas City, MO SPDG

Questions for the participants:

  • In what ways have you used the practice profile or rubric-style self-assessment in your state?
  • This Missouri tool is currently limited to self-reflection. Have any of you been able to incorporate observable look-fors, and if so, in what way?

System Level: Building

Self-assessment practice profile workbook

 

Aligning SPDG with State Student and Educator Data Systems

SPDGs have developed sophisticated data systems that include well-defined methodologies for collecting, analyzing, and reporting process and outcome data. In some instances, these SPDG data systems are aligned with the state’s comprehensive student and educator data systems; in others, they are entirely separate. In this session, the presenter will discuss how the Georgia SPDG has worked to align SPDG data collections to the maximum extent possible with the state’s data systems for students and educators. Participants will have the opportunity to discuss challenges and successes that they have experienced in aligning data systems.

Presenter:      Kim Hartsell, External SPDG Evaluator, Georgia

Questions for the participants:

  • What are the benefits of aligning data systems?
  • How are you leveraging your state’s data system in collecting and reporting SPDG data?
  • What challenges have you experienced when aligning data systems?  What advice would you offer your colleagues?

System Level: State

Marketing the Results of SPDG Initiatives

Infographics and other marketing tools help projects celebrate successes internally and market those successes to internal and external partners, as well as potential funders. This session will provide examples of how to use results-based accountability findings to approach and acquire future partners and sponsors.

Presenter:      Brent Garrett, Ph.D., Evaluator, DE, MS, NH, VT

Questions for the participants:

  • What tools do you use to disseminate and market the results/outcomes of your SPDG initiatives?
  • What audiences have you approached in your marketing and dissemination efforts?
  • What outcomes have you achieved as a result of your marketing efforts?

System Level: State, regional, district, building