
2013 National Meeting Market Place Sessions

Market Place Session Descriptions (PDF). This document provides descriptions of each market place presentation found below.

 

Round 1 - 10 and 10 Tools (March 5, 1:15-1:35 pm)

Student Oral Reading Fluency (ORF) Data Tool & Site Implementation Rubric

Two Excel-based tools for use by School Sites will be shared. The Student Oral Reading Fluency Data Tool enables recording of three annual assessments, automatically color-codes students by level, and charts progress over the year. The Site Team Implementation Rubric operationalizes key program components, enables both action planning and progress assessment, and automatically charts the data and summarizes Next Steps.
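
For illustration only, a minimal sketch of the kind of auto color-coding by level the Data Tool performs; the cut scores and level names below are hypothetical, not the ERIA tool's actual benchmarks, and the real tool works in Excel rather than code.

    # Hypothetical cut scores; the actual ERIA benchmarks are grade-specific.
    LEVELS = [(0, "Intensive"), (70, "Strategic"), (100, "Benchmark")]

    def level_for(wcpm):
        # Return the highest level whose cut score the words-correct-per-minute value meets.
        label = LEVELS[0][1]
        for cut, name in LEVELS:
            if wcpm >= cut:
                label = name
        return label

    # Three annual assessment windows for one student (illustrative scores).
    scores = {"fall": 62, "winter": 78, "spring": 95}
    print({window: (wcpm, level_for(wcpm)) for window, wcpm in scores.items()})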

Resources: 
ERIA: Oral Reading Fluency (ORF) Data Summary Model (Excel 2/2013) 
ERIA: Oral Reading Fluency (ORF) Data Summary Model HANDOUT (PDF, 2/2013) 
ERIA Implementation Rubric (Excel 2/2013) 
ERIA Implementation Rubric HANDOUT (PDF, 2/2013)

Presenter:  Li Walter, Ph.D., SPDG Evaluator, California

Target Audience:  Site Teams, which use the tools for recording and sharing student data on oral reading fluency progress and site progress toward implementing key program components.

System Level:  Used as an action tool at the Site level, and electronically shared with district, regional and state levels.

Literacy Efficacy Measure

Improving adolescent literacy has been identified by educators and policymakers as an area of critical need as we seek to equip and prepare our students for the 21st century.  The Adolescent Literacy and Academic Behaviors Self-Efficacy Survey (ALAB) was developed as a tool to serve teachers and schools as they provide instruction and implement strategies targeting the improvement of adolescent literacy.  The ALAB gauges student perceptions and confidence in performing literacy tasks, and it can be used to measure change in student literacy self-efficacy as a result of literacy skill interventions. The ALAB is a reliable measure of adolescent literacy and academic behavior self-efficacy.

Resources:
Adolescent Literacy and Academic Behaviors Self-Efficacy Survey (ALAB) (PDF, 2012, DeFur and Runnells)

Adolescent Literacy and Academic Behavior Self-Efficacy Survey (ALAB) (PPT, 3/2013, Sharon deFur, Virginia SPDG Evaluation)

Presenter:  Sharon deFur, EdD, Evaluator, Virginia SPDG, and Professor of Special Education, College of William and Mary

Target Audience:  School and division intervention implementers, evaluators

System Level:  Classroom, Building, Division, State

Scientific Research-Based Interventions (SRBI) Self-Assessment

A fidelity tool used to measure the implementation of SRBI at participating schools. The assessment consists of 45 items designed to measure current levels of implementation across four general areas: System of Instruction; System of Assessment; System of Decision-Making; and Leadership.  The protocol for use, how the data are used to customize on-site technical assistance, and considerations for improving reliability and validity will be shared.
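
As a rough illustration of how a 45-item self-assessment can be summarized by area, the sketch below averages item ratings within each of the four general areas; the 0-3 scale and example ratings are assumptions, not the CT SPDG instrument's published scoring rules.

    from statistics import mean

    # Hypothetical 0-3 ratings keyed by area (item counts shortened for illustration).
    responses = {
        "System of Instruction":     [2, 3, 1, 2],
        "System of Assessment":      [3, 2, 2],
        "System of Decision-Making": [1, 2, 2, 3],
        "Leadership":                [2, 2, 3],
    }

    # Average the item ratings within each area to get an area-level score.
    area_scores = {area: round(mean(items), 2) for area, items in responses.items()}
    print(area_scores)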

Resources: 
CT SPDG Scientific Research-Based Interventions (SRBI) Self-Assessment (PDF, 2012-15)
CT SPDG Cohort I SRBI Self-Assessment Report 2011-12 Cohort (PDF) 
CT SPDG Cohort 1 SRBI Self-Assessment Report 2012-13 Cohort (PDF)

Presenter:  Michelle LeBrun-Griffin, Consultant, SPDG Coordinator, Connecticut

Target Audience:  District and building administrators, school leadership teams, and professional development service providers

System Level:  State, regional, district, and building level

Family-School Partnering Tools (12-13)  
The Family-School Partnering Survey is used to collect meaningful feedback from families. The feedback directly informs data-based decision making at the school, district, and state levels.  It was derived from the Multi-Tiered Family, School, and Community Partnering Practices Checklist created for the Colorado Family, School, Community Partnership Toolkit.

Resources: 
Colorado - Family-School Partnering Survey (PDF, 2013)

Presenters:  Daphne Pereles, Executive Director, Colorado Department of Education; and Cyndi Boezio, Supervisor, SPDG Project Director, Colorado Department of Education

Target Audience: SEAs, LEAs, school administrators, evaluators, family liaisons

System Level:  State, regional, district, building

Implementation Capacity Assessment for Intermediate Units
The Intermediate Capacity Assessment is a tool to evaluate an intermediate unit’s capacity to support local districts in the implementation of an evidence-based practice such as a multi-tiered system of supports. This tool was modified from the District Capacity Assessment developed by Duda, Ingram-West, Tedesco, Putnum, Buenrostro, Chapiro & Horner (2012) from State Implementation & Scaling-up of Evidence-based Practices (SISEP).

Resources:  
MiBLSi Intermediate District Capacity Assessment (DOC, 2012)

Presenter:  Steve Goodman, Ph.D., Director, Michigan’s Integrated Behavior and Learning Support Initiative (MiBLSi)

Target Audience: State level project staff, intermediate unit project staff

System Level:  State, regional, intermediate district units

iPad App for Efficient/Cost-Effective Data Collection
Our new SPDG (ASSETS) demands technology that can evolve and adapt throughout the duration of the grant in a cost-effective manner. We have created a framework for the iPad that allows us to efficiently create customized yet adaptable apps to ensure clean data collection and fidelity of project implementation. Thus far, we have created an app that tracks project consultants’ (coaches’) activities in the field as they interact with schools and districts, and we will demonstrate the app’s capabilities, which include immediate feedback and custom reporting.
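
The app itself is iPad-native, so the Python sketch below is only an illustration of the kind of coach-activity record it might capture and the simple roll-ups that could feed immediate feedback and custom reports; the field names and activity types are hypothetical, not the ASSETS app's actual schema.

    from collections import Counter

    # Hypothetical activity records logged by project consultants (coaches).
    activities = [
        {"coach": "Coach 1", "district": "District A", "activity": "PLC observation", "minutes": 45},
        {"coach": "Coach 1", "district": "District A", "activity": "Classroom observation", "minutes": 30},
        {"coach": "Coach 2", "district": "District B", "activity": "Administrator discussion", "minutes": 20},
    ]

    # A simple roll-up of minutes by coach and activity type.
    minutes_by_coach_activity = Counter()
    for record in activities:
        minutes_by_coach_activity[(record["coach"], record["activity"])] += record["minutes"]
    print(minutes_by_coach_activity)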

Resources:
iPad Application for Efficient/Cost-Effective Data Collection: New Mexico SPDG – Achieving Student Success with Effective Tiered Supports (ASSETS). (DOC, 3/2013)

Presenters:  Michelle Bloodworth, PhD, Lead Evaluator, New Mexico; and Catherine Bornhorst, Evaluator, New Mexico

Target Audience: The tool can be used by anyone involved in the SPDG. Currently, we are using it for our project consultants/coaches as they document their work with schools. This includes observations of PLCs and classrooms and discussions with teachers/school administrators.  However, the framework can easily support the creation of additional iPad apps as our data collection needs continue to be discussed and determined.

System Level:  The iPad app is used at the district/school level but informs overall project decisions 

 

Round 2 - 10 and 10 Tools (March 5, 1:15-1:35 pm)

Prioritizing Protocol Tool
This prioritizing tool allows anyone to use actual data to rank the preferences and priorities of small or large groups. Participants systematically select their individual priorities, and the data are then aggregated to determine the overall priorities of the group.
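
The description does not specify the aggregation rule, so the sketch below uses one plausible approach, summing each option's ranks across participants (lower totals indicate higher group priority); the option names and rankings are invented for illustration.

    # Each participant ranks the same options (1 = highest priority).
    rankings = {
        "Participant 1": {"Literacy": 1, "Behavior": 2, "Family engagement": 3},
        "Participant 2": {"Behavior": 1, "Literacy": 2, "Family engagement": 3},
        "Participant 3": {"Literacy": 1, "Family engagement": 2, "Behavior": 3},
    }

    # Sum the ranks for each option across participants.
    totals = {}
    for ranks in rankings.values():
        for option, rank in ranks.items():
            totals[option] = totals.get(option, 0) + rank

    # Sort options by total rank: the group's overall priority order.
    print(sorted(totals, key=totals.get))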

Resources: 
Prioritizing Protocol Tool (PDF, 2013)

Presenter:  Cynde Snider, Ph.D., Program Specialist, Georgia

Target Audience: Applicable for everyone from classroom teachers to state school officers and business leaders

System Level:  Applicable for all levels: building, district, region, state

Social Network Analysis as a Utilization-Focused Measure for Building a Collaborative Professional Development Network
Through a structured professional development system, Kansas’ Technical Assistance System Network (TASN) provides technical assistance to support school districts’ implementation of evidence-based practices in a variety of content areas. The overall system is supported by and dependent on intensive collaboration among technical assistance providers to meet the diverse needs of districts in a cost-effective and timely manner. To identify areas of need for increased collaboration, a social network analysis was implemented to help TA providers examine their level of collaboration across the network and brainstorm strategies for increasing collaboration where appropriate. This discussion will outline the process for implementing a social network analysis and strategies for facilitating decision-making based on the results.
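
As a minimal illustration of what a social network analysis of provider collaboration can yield, the sketch below builds a small collaboration graph and reports overall density and each provider's degree; the provider names and ties are hypothetical, and the session does not specify which software TASN used.

    import networkx as nx

    # Each edge indicates that two TA providers reported collaborating during the period.
    G = nx.Graph()
    G.add_edges_from([
        ("Provider A", "Provider B"),
        ("Provider A", "Provider C"),
        ("Provider B", "Provider C"),
        ("Provider D", "Provider A"),
    ])

    # Density summarizes how connected the network is overall (0 = no ties, 1 = fully connected).
    print("Network density:", round(nx.density(G), 2))

    # Degree shows how many other providers each provider collaborates with.
    for provider, degree in G.degree():
        print(provider, "collaborates with", degree, "other providers")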

Resources: Social Network Analysis as a Utilization-Focused Measure for Building a Collaborative PD Network (PDF, 3/2013)

Presenters:  Amy Gaumer Erickson, Ph.D., Evaluator, Kansas, Missouri, Oregon; and Kerry Haag, M.S.Ed., Assistant Director of Special Education Services/SPDG Project Director, Kansas

Target Audience: Evaluators, state personnel

System Level:  State


Family Engagement Surveys & Quality Indicators
The Family Engagement Surveys & Quality Indicators are tools to measure the level of family engagement.  The surveys are used to gather feedback from three different stakeholder groups (school staff, families, and students) on the level of family engagement from their perspectives.  The Family Engagement Quality Indicators Tool is used by a school and/or district improvement team to conduct a self-assessment of the school's or district's family engagement.
Presenter:  Monica Ballay, M.Ed., DBDM Facilitator & Evaluator, Louisiana

Resources:
Indicators of Family Engagement Survey: School/Family/Student (PDF, 2013) 
Family Engagement School Level Quality Indicators (PDF, 2013)

Target Audience: SPDG staff members who work with districts and schools on measuring the level of family engagement

System Level:  District and building

Professional Practice Rubric for SPDG Positive Behavior Support System (PBSS) Facilitators
The AR SPDG PBSS team has developed an online self-evaluation tool – the Learning Outcomes Rubric – to assess PBSS Facilitators’ understanding of core PBSS components and the degree to which facilitators have implemented the PBSS process. The data collected help SPDG staff tailor the professional development, technical assistance, and consultation support they provide to facilitators, and assess the development of facilitators’ skills over time.

Resources: 
Professional Practice Rubric for SPDG Positive Behavior Support System (PBSS) Facilitators Guidelines and Learning Outcomes Rubric (PDF, 3/2011)

Presenter:  Jennifer Huisken LaPointe, MSW, Senior Consultant, Evaluator, Arkansas

Target Audience: SPDG directors (and their staff), SPDG evaluators

System Level:  All

Oklahoma Tiered Intervention Systems of Support (OTISS) Site Action Plan (SAP) and Goal Attainment Form (GAF)
The OTISS-SAP is intended to provide structure in creating the detailed actions necessary to achieve identified goals at OTISS sites.  The SAP is completed by identifying each goal to be addressed, the actions necessary to address the goal, and the responsible party. In addition, the SAP provides a structure for identifying the timeframes, barriers, and resources necessary to attain the goal.  Evidence of accomplishment is also a required component of the action plan.  Finally, as sites move toward completion, the GAF is designed to track progress on the identified goals.
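
To make the structure concrete, the sketch below represents one SAP goal as a record with the components named above and adds a GAF-style roll-up of action completion; the field names and example content are hypothetical, not the OTISS forms themselves.

    # One hypothetical SAP goal with actions, responsible party, timeframe, barriers,
    # resources, and evidence of accomplishment.
    goal = {
        "goal": "Establish a building-level data team",
        "responsible": "Principal",
        "timeframe": "Fall 2013",
        "barriers": ["Scheduling a common meeting time"],
        "resources": ["District data coach"],
        "actions": [
            {"action": "Identify team members", "evidence": "Roster", "complete": True},
            {"action": "Set monthly meeting schedule", "evidence": "Calendar", "complete": True},
            {"action": "Hold first data review", "evidence": "Meeting notes", "complete": False},
        ],
    }

    # A GAF-style summary of progress toward completing the goal's actions.
    done = sum(1 for a in goal["actions"] if a["complete"])
    print(f"{goal['goal']}: {done} of {len(goal['actions'])} actions complete")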

Resources:  
Oklahoma Tiered Intervention Systems of Support (OTISS) Site Action Plan (SAP) and Goal Attainment Form (GAF) (PDF, 2013)

Presenters:  Dr. Gary Duhon, Associate Professor, Oklahoma State University, Oklahoma; and Christa Knight, M.Ed., SPDG Project Coordinator, Oklahoma

Target Audience: Internal or external coaches attempting to guide and improve implementation fidelity

System Level:  District and building

Self-Assessment Tool
The PaTTAN Special Education Leadership Initiative Self-Assessment Tool is based on the Council for Exceptional Children, Council for Administrators of Special Education (CEC-CASE) Advanced Knowledge and Skill Set for Special Education Administrators.  It is also aligned to the Pennsylvania Inspired Leadership (PIL) Standards.  This tool is used to identify current areas of strength as well as areas of potential need for Fellows in the PA Fellowship Program (PFP).

Resources:
PaTTAN Special Education Leadership Initiative Self Assessment Tool (DOC, 2013)

Presenters:  Shatarupa Podder, M.Ed., SPDG Project Director, Pennsylvania; and Janet Sloand, Ph.D., SPDG Project Coordinator, Pennsylvania, PaTTAN Special Education Leadership Initiative

Target Audience: SPDG administrators, LEA supervisors

System Level:  Regional, district and building levels

 

Mini Presentations (March 5, 2:15-3:00 pm)

Preschool through Kindergarten NE/LRE Team Decision Making Module and The Stages of Professional Development (StagesR)
The Preschool through Kindergarten NE/LRE Team Decision Making Module was developed by the Maryland State Department of Education in partnership with the Johns Hopkins University Center for Technology in Education. The module highlights best practices for effective decision making by:

  • Extended IFSP and IEP teams in Maryland in selecting natural environments (NE) and least restrictive environment (LRE) for children with disabilities, ages three through kindergarten.
  • Early childhood implementation teams in providing early intervention, preschool special education and related services to young children with disabilities in regular early childhood settings with children without disabilities.

The Stages of Professional Development: A Resource for All Teachers Responsible for the Achievement of Students with Disabilities - Revised (StagesR) is a recent revision of a Stages tool that was developed in 2006.  The tool is used to self-assess and monitor a teacher’s growth in implementing instructional practices that are effective for students with disabilities.  While Stages has been presented in the past, during this session the coordinator from a Maryland Local School System will talk about how Stages is used by their pre-service and in-service teachers and their mentors to develop personalized professional development plans.

Resources: PPT to be posted by 3/5

Presenters:  Karla Marty, Section Chief, Specialized Instruction, Programmatic Support and Technical Assistance Branch, Division of Special Education / Early Intervention Services, Maryland State Department of Education; Nancy Vorobey, Section Chief, Early Education, Programmatic Support and Technical Assistance Branch, Division of Special Education / Early Intervention Services, Maryland State Department of Education; Colleen Wilson, Maryland Approved Alternative Preparation Program Coordinator, Anne Arundel County Public Schools, Maryland

Target Audience: Team Decision Making Module:  Early childhood service providers, teachers of children ages 3 to 5, service coordinators, administrators. The Stages of Professional Development (StagesR):  New or veteran teachers, mentors and mentees, building and district administrators, directors of special education

System Level:  State, district, building, content department

Online Courses for District Leadership Teams
Online training courses have been developed to establish common vocabulary and understanding among participants involved in the Utah Multi-Tiered System of Supports (UMTSS) project.  Courses focus on numerous aspects of MTSS, including an overview, implementation, data and assessment.  Courses will be shared with Market Place participants.

Resources:
UMTSS Online Course Site

Presenters:  Devin Healey, Ed.S., Program Specialist, SPDG Project Coordinator, Utah; Jeri Rigby, M.Ed., Program Specialist, Utah

Target Audience: State, district, and school level personnel providing or receiving training associated with Multi-Tiered System of Supports basics

System Level:  State, regional, district, building


Self-Assessment of Problem Solving and Implementation – School Version (SAPSI-S)
The SAPSI-S was designed as a self-assessment of ongoing implementation efforts of a Multi-Tiered System of Supports at the building level.  The instrument is administered/completed once per year online by school teams, and the data are analyzed and circulated back to the respective schools to inform and improve future implementation efforts.

Resources: 
Self-Assessment of Problem Solving and Implementation – School Version (SAPSI-S) (PDF, 2012)

Presentation: Self-Assessment of Problem-Solving Implementation – School Version (SAPSI-S) (PPT, 3/2013, G. Cates)

Presenters:  Gary L. Cates, Ph.D., Lead Project Evaluator, Illinois, and Kathryn Cox, MBA, Project Director, Illinois

Target Audience: School personnel at the building level

System Level:  Although the tool provides valuable data at the state, regional, district, and building level, the primary target is the building.


Using Comprehensive Evaluation to Drive Statewide PD 
The Maine SPDG project will present its design for formative feedback and evaluation.  Specifically, it will present how a real-time database system works in concert with pre- and post-measures of participant knowledge and satisfaction to shape the design of future professional development sessions and document the project's overall progress toward goals.  The presentation will include visual displays and use-case scenarios depicting this cycle of continuous improvement.
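
As a simple illustration of the pre/post piece of that cycle, the sketch below computes an average knowledge gain per session from paired ratings; the sessions, scale, and values are invented, and the real-time database Maine uses is not described here.

    # Hypothetical pre/post participant knowledge ratings for two PD sessions.
    sessions = {
        "Session 1": {"pre": [2.1, 2.4, 1.9], "post": [3.2, 3.5, 3.0]},
        "Session 2": {"pre": [2.8, 3.0], "post": [3.1, 3.4]},
    }

    # Average gain per session, the kind of formative signal used to shape future sessions.
    for name, ratings in sessions.items():
        gain = sum(ratings["post"]) / len(ratings["post"]) - sum(ratings["pre"]) / len(ratings["pre"])
        print(f"{name}: average knowledge gain = {gain:.2f}")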

Resources: No resources available (real-time demonstration)

Presenters:  Brian Doore, Ph.D., External Evaluator, Maine SPDG, University of Maine; Debrajean Scheibel, Ed.D., Maine SPDG Coordinator, Maine Department of Education; Gail Donahue, Ph.D., CCC-SLP, Part C Consultant and SPDG Goal Coordinator

Target Audience: Professional development planners, evaluators, trainers, and coach/mentors, participants

System Level:  Targeted level is state and regional

State Scale-up of an MTSS Model

Over the next five years, Michigan's Integrated Behavior and Learning Support Initiative (MiBLSi) is working to scale up a multi-tiered system of supports (MTSS) across 1,600 schools and 320 school districts. To achieve this goal, the project will need to partner with Intermediate School Districts (ISDs) across the state that support local districts and schools. This presentation will highlight the scaling-up process and the tools used at the ISD level to create leadership and implementation support team structures, with a focus on the fidelity and durability of MTSS implementation.

Resources: MiBLSi Scaling-Up Within a Statewide Multi-Tiered System of Supports (MTSS) (PPT, 3/2013)

Presenter:  Kim St. Martin, Ph.D., Assistant Director, Michigan Integrated Behavior and Learning Support Initiative (MiBLSi)

Target Audience: State project personnel

System Level:  State level

Embedding Feedback Data Into Your Professional Development
Missouri is developing a consistent statewide approach to implementing high-quality professional development (HQPD).  This presentation will describe how Missouri is approaching HQPD and developing new training content within an HQPD framework, along with tools for supporting implementation with fidelity and lessons learned.  Critical to the implementation of HQPD is not only the use of evidence-based characteristics of professional development but also a mechanism for gathering feedback on the learning experience and learner outcomes.

Resources: 
Observation Checklist for High-Quality Professional Development Training

Missouri SPDG Professional Development Learning Package Outline

Embedding Feedback Data into Your Professional Development (PPT, 3/2013)

Presenters:  Ronda Jenson, PhD, UMKC Institute for Human Development; Pattie Noonan, PhD, Evaluator, University of Kansas; Pam Williams, EdS, Project Director, Missouri Dept. of Elementary & Secondary Education; Ginger Henry, Missouri Dept. of Elementary & Secondary Education

Target Audience: Professional development providers

System Level:  Professional development is provided at all levels.  Through the SPDG we have started the pilot at the state and regional level.  The tools are intended for eventual use at the district and building level as well.