By Jason Harlacher
October 11, 2023

Have you ever sat in a team meeting discussing a student, and before you know it, the bell is ringing? Somehow, the team spent the entire meeting on one student, even though they had planned to discuss four students. Or have you ever spent hours discussing a student, going over every detail of their education, only to still feel unsure how to help them? Perhaps you’ve implemented an intervention and monitored the student’s progress, only to be disappointed by the student’s lack of growth. You and your colleagues put your heads together, but you’re at a loss on what to do next to help the student. You’re not alone; supporting students can be a long and challenging process.

But take heart! There’s good news. The four-step problem solving model (PSM) provides a structured process for identifying and supporting students (Deno, 2003, 2016), and it aligns well with data-based individualization from the National Center on Intensive Intervention. Teams can use the model to organize their discussions and identify supports for students. However, it is easy to lose track of the nuances of each step of the PSM, so we offer this flipbook as a tool for teams.

How to Use the Flipbook

The flipbook is designed to highlight key questions and information for each step of the PSM. Teams can print the flipbook and display the page for whichever step they are working through so that key reminders and questions stay visible to team members. In this manner, teams can keep track of which step they’re in and stay focused on the pertinent information to discuss. Say goodbye to wasted time during team meetings and hello to student success! A summary of the PSM is provided here, followed by directions on how to use the flipbook.

Step 1: Problem Identification

Have you ever been driving your car when the “check engine” light turns on? Or have you taken your temperature and found it’s several degrees higher than it should be? If so, you likely called a mechanic or made a doctor’s appointment to get things checked out. That’s because you saw a red flag that indicated a problem. This is the idea behind step 1 of the PSM (see page 3 of the flipbook). During this step, educators examine red flags or general indicators of performance and ask, “Is there a problem?” Those red flags can include (a) screening assessments, such as curriculum-based measurement; (b) pre-existing data, such as attendance records and office discipline referrals; (c) progress monitoring data; or (d) a nomination from a parent, teacher, or family member.

After a problem is identified, educators conduct a gap analysis by comparing the observed performance with an expected criterion. In doing so, they can decide whether the gap between observed and expected performance is large enough to be a problem worth solving. For example, if a student scores 50 on an assessment on which they are expected to score 80, the 30-point gap is large enough to warrant solving; if the student scored 70, the team might judge the 10-point gap too small to be a problem. Educators can then conduct a risk verification process to ensure the problem actually exists, by finding at least two other sources of information that confirm the initial red flag is a legitimate concern. For example, a student may score at risk on a math screener, and two other math assessments may confirm deficits in math for that student. After a problem is identified and verified, educators can proceed to step 2. If a problem isn’t identified (perhaps the gap isn’t large enough, or risk verification indicates the red flag was a false alarm), then no further problem solving is needed.
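For teams that track screening scores in a spreadsheet or script, the gap analysis and risk verification above reduce to a simple comparison. Here is a minimal Python sketch of that logic; the function names, the 15-point minimum gap, and the sample data are hypothetical illustrations, not values prescribed by the PSM.

```python
# Illustrative sketch of step 1: gap analysis plus risk verification.
# All names, scores, and thresholds here are hypothetical examples,
# not prescribed values from the problem solving model.

def gap_is_a_problem(observed: float, expected: float, min_gap: float) -> bool:
    """Return True if the observed-vs-expected gap is large enough to solve."""
    return (expected - observed) >= min_gap

def risk_verified(confirmations: list[bool], required: int = 2) -> bool:
    """A red flag is verified when at least `required` other sources confirm it."""
    return sum(confirmations) >= required

# Example from the text: observed 50 vs. expected 80, a 30-point gap,
# confirmed by two additional math assessments.
observed, expected = 50, 80
confirming_sources = [True, True]

if gap_is_a_problem(observed, expected, min_gap=15) and risk_verified(confirming_sources):
    print("Proceed to step 2: problem analysis.")
else:
    print("No further problem solving needed.")
```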

Step 2: Problem Analysis

At step 2, educators dig in and answer, “Why is the problem occurring?” (page 4 of the flipbook). Here, they use the RIOT and ICEL matrix to analyze why the problem exists (see page 5 of the flipbook). RIOT is an acronym that refers to assessment methods: review, interview, observe, and test. ICEL is an acronym that refers to assessment domains:

  • Instruction: How we teach students
  • Curriculum: What we teach students
  • Environment: Where we teach students
  • Learner: Whom we teach

Educators can explore hypotheses related to ICEL by gathering information using RIOT. Start with initial ideas as to why the student is not performing to grade-level expectations or why the problem exists. Gather information and determine whether each hypothesis is supported, refuted, or in need of more information. Problem analysis is arguably the longest step of the PSM; teams should keep exploring until they have a solid, well-supported hypothesis before moving on to step 3. Keep in mind that learning results from the interaction among the instruction, curriculum, and environment, so educators should ask where the breakdown occurred among those three domains (see page 6 of the flipbook). Explore alterable variables related to those domains, and consider how learner characteristics interact with the instruction, curriculum, and environment. Also consider where the student’s skill deficits fall within the instructional hierarchy (Daly et al., 1996; Haring et al., 1978; Parker & Burns, 2014; Szadokierski et al., 2019), which provides a context for adjusting instruction to match a student’s mastery of a skill (we provide a summary on page 7 of the flipbook). After the team reaches an understanding of why the problem exists, it can move to the next step and design a plan.
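One way to picture the RIOT and ICEL matrix is as a grid in which each cell collects the evidence gathered for one ICEL domain through one RIOT method. The Python sketch below assumes a team simply records its notes as short strings; the structure and sample entries are hypothetical, not a mandated format.

```python
# Hypothetical sketch of a RIOT-by-ICEL evidence grid for problem analysis.
# ICEL domains are what we examine; RIOT methods are how we examine them.
# The sample entries are invented for illustration only.

RIOT = ("review", "interview", "observe", "test")
ICEL = ("instruction", "curriculum", "environment", "learner")

# One cell per (domain, method) pair, each holding the team's notes.
matrix: dict[tuple[str, str], list[str]] = {
    (domain, method): [] for domain in ICEL for method in RIOT
}

# Example: evidence gathered while testing a hypothesis about instruction.
matrix[("instruction", "observe")].append("Student off task during whole-group lesson")
matrix[("learner", "test")].append("Fluency probe: accurate but slow")

# Review everything collected for one domain across all four methods.
for method in RIOT:
    print(f"instruction / {method}: {matrix[('instruction', method)]}")
```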

Step 3: Plan Identification and Implementation

During step 3, the team answers the question, “What can we do about it?” Here, they develop a solution (i.e., a plan to solve the problem) based on the hypothesis formed in step 2 (page 8 of the flipbook). Regardless of whether the student’s need is related to academics, behavior, or social-emotional learning, the solution can be framed around good instructional practices. The team answers specific questions as it designs the solution (as outlined in the solution frame below), keeping in mind that it is adjusting factors within instruction, curriculum, and environment to solve the problem.

Solution Frame (see page 9 of the flipbook)

  • Teaching Strategies
    Description: Explicit teaching methods used to teach or reteach the missing skills to the learner. (Visit NCII, the PROGRESS Center, the IRIS Center, or Anita Archer's website for more information on explicit instruction.)
    Question: What skills need to be defined and taught to the student?

  • Prevention Strategies
    Description: Methods and adjustments to the context that prompt the desired skill and offset misuse of the skill.
    Questions: How can we prompt the skill that is taught? How can we avoid misuse of the skill?

  • Response Strategies
    Description: Methods and adjustments to the context that reinforce the taught skill and avoid reinforcing errors or misuse of the skill.
    Questions: What systematic reward can we use for the taught skill? How can we efficiently correct any misuse of the skill?

  • Data Collection Procedures
    Description: The data gathered and procedures used to evaluate the fidelity and outcome(s) of the solution.
    Question: How can we collect and use data related to fidelity and outcomes for the solution?

During this step, teams can identify a goal so that it’s clear when the problem is resolved. Goals can be written using four components: who, what, how much, and by when/where (sketched in code after the list).

  • Who refers to the individual.
  • What is the skill or behavior of interest.
  • How much is the desired criterion to reach (which can also be the expected criterion from step 1).
  • By when/where gives the timeline and conditions under which the skill is performed.
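To make these components concrete, here is a minimal sketch that stores a goal as a small record; the field names and the sample goal are hypothetical illustrations, not a required format.

```python
# Minimal sketch of a four-component goal record. The field names and
# the sample goal below are hypothetical illustrations only.
from dataclasses import dataclass

@dataclass
class Goal:
    who: str            # the individual
    what: str           # the skill or behavior of interest
    how_much: float     # the desired criterion (can be the expected criterion from step 1)
    by_when_where: str  # timeline and conditions for performing the skill

goal = Goal(
    who="Student A",
    what="words read correctly per minute on grade-level passages",
    how_much=80,
    by_when_where="by the end of the semester, during weekly progress monitoring",
)
print(f"{goal.who} will reach {goal.how_much} {goal.what} {goal.by_when_where}.")
```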

After the solution is developed and a goal is set, including how fidelity and outcome data will be measured, the team implements the plan. The progress of the plan is revisited in step 4.

Step 4: Plan Evaluation

During step 4, the team examines fidelity data and progress monitoring or outcome data to determine the effectiveness of the solution. We have provided a flowchart of questions to guide the team through step 4 on page 11 of the flipbook. The team meets and first asks whether the plan was implemented as intended by examining fidelity data. Generally, fidelity data indicating the plan was implemented with 90–95% adherence are desirable (cf. Hawken et al., 2021; Vaughn et al., 2003, 2012). Examining fidelity data is critical: without knowing whether the plan was implemented as intended, it’s unclear why the plan worked or didn’t work.

If fidelity is low, the team can determine why and work to improve it. After delivering the solution with improved fidelity, the team can continue evaluating the plan’s effectiveness. If fidelity is strong, the team can determine how effective the plan was by examining the outcome data they’ve gathered. When examining progress monitoring data and trend lines, the team can determine whether the student’s progress is positive, questionable, or poor. Based on that progress, the team can maintain the supports, fade them out, or adjust them to improve student growth. We provide a summary of instructional factors that can be intensified to enhance the solution; these same factors can be faded if a student has met their goal.
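The flowchart logic of step 4 can also be expressed as a short decision routine: check fidelity first, then classify progress. In the hypothetical sketch below, the 90% cutoff mirrors the adherence range cited above, while the slope-based progress rule is a simplified stand-in for a team's actual trend analysis.

```python
# Hypothetical sketch of the step 4 decision flow: fidelity first, then outcomes.
# The cutoff mirrors the 90-95% adherence range noted above; the progress
# classification rule is a simplified stand-in for a team's trend analysis.

def evaluate_plan(fidelity: float, trend_slope: float, goal_slope: float) -> str:
    if fidelity < 0.90:
        return "Improve fidelity, then continue evaluating the plan."
    if trend_slope >= goal_slope:
        return "Progress is positive: maintain supports or begin fading them."
    if trend_slope > 0:
        return "Progress is questionable: adjust or intensify the solution."
    return "Progress is poor: revisit the hypothesis and redesign the solution."

# Example: plan delivered with 92% adherence; the student is gaining
# 1.2 words per week against a goal line of 1.5 words per week.
print(evaluate_plan(fidelity=0.92, trend_slope=1.2, goal_slope=1.5))
```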

Summary

The PSM is a four-step, structured process that teams can use to identify students who are not achieving grade-level standards. Using each step, teams can analyze the learning environment, develop a solution based on a hypothesis, and then implement and monitor that solution. The flipbook can serve as a visual guide and reminder of the questions and considerations during each step. By referring to the flipbook pages during team meetings, teams can ensure they use the PSM as intended.

References

Daly, E. J., III, Lentz, F. E., Jr., & Boyer, J. (1996). The instructional hierarchy: A conceptual model for understanding the effective components of reading interventions. School Psychology Quarterly, 11(4), 369–386.

Deno, S. L. (2003). Developments in curriculum-based measurement. The Journal of Special Education, 37(3), 184–192.

Deno, S. L. (2016). Data-based decision making. In S. Jimerson, M. Burns, & A. VanDerHeyden (Eds.), Handbook of response to intervention (pp. 9–28). Springer.

Haring, N. G., Lovitt, T. C., Eaton, M. D., & Hansen, C. L. (1978). The fourth R: Research in the classroom. Charles E. Merrill.

Hawken, L. S., Crone, D. A., Bundock, K., & Horner, R. H. (2021). Responding to problem behavior in schools (3rd ed.). Guilford Press.

Parker, D. C., & Burns, M. K. (2014). Using the instructional level as a criterion to target reading interventions. Reading and Writing Quarterly, 30(1), 79–94.

Szadokierski, I., Burns, M. K., McComas, J. J., & Eckert, T. (2019). Predicting intervention effectiveness from reading accuracy and rate measures through the instructional hierarchy: Evidence for a skill-by-treatment interaction. School Psychology Review, 2, 190–200.

Vaughn, S., Linan-Thompson, S., & Hickman, P. (2003). Response to instruction as a means of identifying students with reading/learning disabilities. Exceptional Children, 69, 391–409.

Vaughn, S., Wanzek, J., Murray, C. S., & Roberts, G. (2012). Intensive interventions for students struggling in reading and mathematics: A practice guide. RMC Research Corporation, Center on Instruction.