Data-Based Problem Solving
Don Kincaid, University of South Florida
1. MTSS Defined
MTSS (Multi-Tiered System of Supports) is a term used to describe an evidence-based framework for educating students that includes high-quality, effective core instruction and intervention supports matched to student needs, and that uses data-based problem solving to integrate all academic and behavior instruction and interventions.
2. Data Utilization – In General
• Educators should use key questions to guide data use (Feldman & Tung, 2001; Lachat & Smith, 2005; Protheroe, 2001)
• Structured data-use approaches: use data rather than be used by data (Wayman & Stringfield, 2006)
2. Data Utilization – In General (continued)
• Recognize and plan for common barriers to data use (Coburn & Talbert, 2006; Honig & Venkateswaran, 2012; Kerr et al., 2006; Lachat & Smith, 2005; Little, 2012; Young, 2006)
• Expand the definition of a "data system" beyond just technology – include data practices and culture (e.g., Armstrong & Anthes, 2006; Honig & Venkateswaran, 2012; Ingram et al., 2004).
3. Foundation of DBPS: Using Data for Evaluation
• Data evaluation:
  • A process is in place for making informed decisions
  • The process includes problem identification, problem analysis, intervention implementation, and evaluation
• 4-step problem-solving process (RtI):
  • Informs your Action Plan steps:
    • continue interventions/practices
    • adapt, revise, or modify interventions/practices
    • discontinue interventions/practices
Evaluating Your Data
• Evaluation process includes:
  • A system to efficiently and effectively collect, record, and graph data (see the sketch below)
  • Resources and expertise to review and analyze data
  • Monthly review and analysis of discipline and outcome data
  • SWPBS Action Plan updates based on data review and analysis
• Discussion:
  • Are these steps included in your school's data evaluation process?
  • If not, in what areas would you like additional support?
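As a minimal illustration of the "collect, record, and graph" step, the sketch below assumes a hypothetical CSV export of office discipline referrals (odr_log.csv with date, student_id, problem_behavior, location, time_of_day, and motivation columns). It is not part of the Florida PBS materials; substitute your own data system's export.

```python
# Minimal sketch (not the project's actual tooling) of collecting, recording,
# and graphing monthly office discipline referral (ODR) data.
# Assumes a hypothetical odr_log.csv with a "date" column.
import pandas as pd
import matplotlib.pyplot as plt

odrs = pd.read_csv("odr_log.csv", parse_dates=["date"])  # hypothetical file name

# Record: tally referrals per month so the team can review them at each meeting.
monthly = odrs.groupby(odrs["date"].dt.to_period("M")).size()

# Graph: a simple bar chart for the monthly PBS team meeting.
monthly.plot(kind="bar", title="Office Discipline Referrals per Month")
plt.ylabel("Referrals")
plt.tight_layout()
plt.show()
```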
Discipline Data Sources
• Five major data sources:
  • Average referrals per day per month (a calculation sketch follows this slide)
  • Referrals by: problem behavior, location, time of day, and individual student
• Additional data sources:
  • Referrals by motivation or function (get/obtain, escape/avoid)
  • Office-managed vs. classroom-managed referrals
  • ISS/OSS data
• Discussion:
  • Does your PBS team review and analyze your school's discipline data at each meeting?
  • Does your team use the data to evaluate the PBS development and implementation process and develop next steps?
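To make the first data source concrete, here is one way the "average referrals per day per month" figure could be computed. The file name, column names, and school-day counts are hypothetical placeholders, not values from the presentation; the real school-day counts come from your district calendar.

```python
# Sketch of the "average referrals per day per month" metric, using the same
# hypothetical odr_log.csv as above plus a hand-entered count of school days.
import pandas as pd

odrs = pd.read_csv("odr_log.csv", parse_dates=["date"])
school_days = {"2023-08": 14, "2023-09": 20, "2023-10": 21}  # hypothetical calendar

monthly_totals = odrs.groupby(odrs["date"].dt.strftime("%Y-%m")).size()
avg_per_day = monthly_totals / pd.Series(school_days)
print(avg_per_day.round(2))
```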
Other Data Sources • Staff, student and/or parent surveys • Staff and student attendance • Teacher requests for assistance or school-wide behavioral screening • ESE referrals • Grades and/or standardized test scores (FCAT) • Fidelity measures • Benchmarks of Quality, PBS Implementation Checklist, Walkthrough Evaluations • SWPBS Action Plan • Direct observations • Discussion: • What are other sources of outcome data? • Does your PBS team review other data sources at each meeting and use the data to evaluate progress?
Data Evaluation
• Questions to address at each monthly PBS meeting:
  • Are problem behaviors improving?
  • Are problem behaviors 'holding steady'?
  • Are problem behaviors 'getting worse'?
• The following slides provide next steps to help address each of these questions (a simple trend check is sketched below).
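The presentation leaves "improving," "holding steady," and "getting worse" to the team's judgment. The sketch below shows one arbitrary way to flag the month-over-month direction as a discussion starter; the 10% tolerance is an example, not a PBIS standard.

```python
# Illustrative (not prescribed by the presentation) month-over-month trend
# check to prompt the "improving / holding steady / getting worse" discussion.
def odr_trend(previous_month: int, current_month: int, tolerance: float = 0.10) -> str:
    """Label the change in ODR counts between two consecutive months."""
    if previous_month == 0:
        return "improving" if current_month == 0 else "getting worse"
    change = (current_month - previous_month) / previous_month
    if change <= -tolerance:
        return "improving"
    if change >= tolerance:
        return "getting worse"
    return "holding steady"

print(odr_trend(previous_month=120, current_month=95))   # improving
print(odr_trend(previous_month=120, current_month=140))  # getting worse
```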
Problem Behaviors Improving
• Discipline data shows a decrease in problem behavior:
  • At least 80% of students receive 0-1 ODRs (see the check sketched below)
  • Significant decrease in ODRs from the previous month/quarter
  • Decrease in OSS/ISS days
• Review other data sources to confirm progress:
  • At least 80% of students contact reward events
  • PBS Implementation Checklist / Benchmarks of Quality
  • Consistency exists across teachers, grade levels/hallways, etc.
  • School-climate/faculty surveys are more positive or supportive
• ODRs are decreasing equally – disaggregate the data by:
  • ESE, ethnicity/race, free/reduced lunch, male/female
  • Classroom, grade level, individual teachers
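The sketch below illustrates the 80% (0-1 ODRs) check and a simple disaggregation. The file names and roster columns are hypothetical and would need to match your own student information system export.

```python
# Sketch of the "at least 80% of students receive 0-1 ODRs" check and of
# disaggregating ODRs by subgroup. File and column names are placeholders.
import pandas as pd

odrs = pd.read_csv("odr_log.csv")          # one row per referral
roster = pd.read_csv("roster.csv")         # student_id, ethnicity, ese, frl, sex
enrollment = len(roster)

odr_counts = odrs["student_id"].value_counts()
students_with_2_plus = (odr_counts >= 2).sum()
pct_0_or_1 = 100 * (enrollment - students_with_2_plus) / enrollment
print(f"Students with 0-1 ODRs: {pct_0_or_1:.1f}% (goal: at least 80%)")

# Disaggregate: are ODRs distributed equally across groups?
merged = odrs.merge(roster, on="student_id", how="left")
print(merged.groupby("ethnicity").size().sort_values(ascending=False))
```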
Problem Behaviors 'Holding Steady'
• Look for areas of improvement:
  • Benchmarks of Quality, PIC, Action Plan implementation
  • Increasing the level of support at Tier 1 may increase intervention effectiveness
• Are your interventions targeted appropriately?
  • Review referrals by location, time of day, teacher, grade level, etc. (a cross-tab sketch follows this slide)
• Review expectations and rules:
  • Are the expectations well-defined, and have they been taught?
• Review discipline procedures and definitions:
  • Are problem behaviors well-defined? Are office-managed vs. teacher-managed behaviors well-defined?
  • Do your interventions target the appropriate function/motivation of the problem behaviors?
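One common way to review referrals by location and time of day is a simple cross-tabulation; the sketch below assumes the same hypothetical odr_log.csv columns used earlier.

```python
# Illustrative cross-tabulation of referrals by location and time of day,
# one way to see whether Tier 1 interventions are targeted where problems
# actually occur. Column names are hypothetical placeholders.
import pandas as pd

odrs = pd.read_csv("odr_log.csv")
by_location_and_time = pd.crosstab(odrs["location"], odrs["time_of_day"])
print(by_location_and_time)
```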
Problem Behaviors 'Getting Worse'
• Use the 4-step problem-solving process:
  1. Identify the Problem – Be specific; problem behavior(s) should be well-defined
  2. Analyze the Problem – Hypothesis development:
     • Teaching – Are the expectations being taught as planned?
     • Fidelity – Are the interventions being implemented as designed?
     • Admin decisions and function of behavior – Is the problem behavior being reinforced?
  3. Design Interventions:
     • Do the interventions target the problem behavior(s)?
     • Have the strategies been taught to all staff?
  4. Evaluation (RtI) – Is it working? Are the problem behaviors decreasing?
Problem Behaviors 'Getting Worse'
[Diagram: the 4-step problem-solving cycle]
• Step 1: Problem Identification – What's the problem?
• Step 2: Problem Analysis – Why is it occurring?
• Step 3: Intervention Design – What are we going to do about it?
• Step 4: Response to Intervention – Is it working?
The PBIS Triangle: Another View
[Diagram: the problem-solving cycle mapped onto the three-tier (I, II, III) triangle]
• Problem Identification – What is the problem?
• Problem Analysis – Why is it occurring?
• Intervention Design – What are we going to do about it?
• Response to Intervention – Is it working?
Florida's Guiding Questions
Step 1 – Problem Identification
• What do we expect our students to know, understand, and do as a result of instruction?
• Do our students meet or exceed these expected levels? (How sufficient is the core?)
• Are there groups for whom the core is not sufficient?
Step 2 – Problem Analysis
• If the core is NOT sufficient for either a "domain" or a group of students, what barriers have precluded or could preclude students from reaching expected levels?
Step 3 – Plan Development and Implementation
• What strategies or interventions will be used?
• What resources are needed to support implementation of the plan?
• How will sufficiency and effectiveness of the core be monitored over time?
• How will fidelity be monitored over time?
• How will "good," "questionable," and "poor" responses to intervention be defined? (One illustrative approach is sketched below.)
Step 4 – Plan Evaluation of Effectiveness
• Have planned improvements to the core been effective?
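The guiding questions leave the "good/questionable/poor" definitions to each team. The sketch below shows one illustrative convention that compares observed improvement with the improvement the plan projected; the cutoffs are arbitrary examples, not Florida PBS definitions.

```python
# One illustrative way a team might operationalize "good / questionable / poor"
# response to intervention. Thresholds (1.0 and 0.5) are arbitrary examples.
def rti_response(baseline: float, current: float, goal: float, expected: float) -> str:
    """
    baseline/current: ODR rate (or other metric) before and after intervention.
    goal: the target rate; expected: the rate the plan projected by this point.
    Lower values are better (e.g., ODRs per day).
    """
    needed = baseline - expected          # improvement the plan called for
    achieved = baseline - current         # improvement actually observed
    if needed <= 0:
        return "good" if current <= goal else "questionable"
    ratio = achieved / needed
    if ratio >= 1.0:
        return "good"
    if ratio >= 0.5:
        return "questionable"
    return "poor"

print(rti_response(baseline=6.0, current=3.5, goal=2.0, expected=4.0))  # good
```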
Step 1: Problem Identification – Tier 1
• What do we expect our students to know, understand, and do as a result of instruction?
• Do our students meet or exceed these expected levels? (How sufficient is the core?)
• Are there groups for whom the core is not sufficient?
Expectations for Behavior
• 80% of students have 1 or fewer ODRs
• Are the # of ODRs, ISS, and OSS per 100 students higher than the national or district average? (A per-100-students calculation is sketched below.)
• Are the # of ODRs, ISS, and OSS per 100 students decreasing?
• Is attendance steady?
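A per-100-students rate is straightforward to compute; the enrollment and counts below are hypothetical placeholders for illustration only, and the comparison value should come from the district or national data source your team uses.

```python
# Sketch of the "per 100 students" rate used to compare against district or
# national averages. All numbers below are hypothetical placeholders.
def rate_per_100(event_count: int, enrollment: int) -> float:
    return 100 * event_count / enrollment

enrollment = 850                       # hypothetical school enrollment
odr_rate = rate_per_100(612, enrollment)
oss_rate = rate_per_100(94, enrollment)
print(f"ODRs per 100 students: {odr_rate:.1f}")
print(f"OSS days per 100 students: {oss_rate:.1f}")
# Compare these rates against your district average or the national figure
# your team uses to answer the guiding question.
```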
Do 80% of students exhibit appropriate behavior?
During the current year, does the school have students with 2 or more ODRs by October 1?
Are the # of ODRs, ISS and OSS per 100 students higher than the national or district average?
• National average for middle schools (MS): .05 per 100 students
Are the # of ODRs, ISS and OSS per 100 students decreasing?
Is attendance steady?
Are there groups of students for whom the Tier 1 Core is not sufficient?
Problem Identification - Example