1 of 44

Universal SEB Screening:

Data-Based Decision Making

Developed by Niki Kendall on Behalf of the DE-PBS Project

2 of 44

Credits and Background

The DE-PBS Project serves as a technical assistance center for the Delaware DOE to actualize the vision to create safe and caring learning environments that promote the social-emotional and academic development of all children.

The statewide initiative is designed to build the knowledge and skills of Delaware educators in the concepts and evidence-based practices of Positive Behavior Support (PBS) as a Multi-tiered System of Support (MTSS).

https://www.delawarepbs.org/universal-screening/

3 of 44

Please take note:

  • These webinars were developed in response to the frequent requests we have received at the Delaware PBS Project for guidance on selecting a universal social-emotional and behavior (SEB) screener.
  • As a reminder, our State MTSS policy (14 DE Admin. Code § 508.6.1.1 - 508.6.1.1.4) requires the use of a universal screening process (not a screener) to identify students who are not meeting academic and non-academic benchmarks.
  • Neither Regulation 508 nor the DE Department of Education endorses use of a specific tool. Instead, LEAs and charters are required to implement a comprehensive universal screening process. A comprehensive screening process typically includes a combination of multiple data sources such as:
    1. Existing schoolwide data (e.g., attendance, office discipline data, course grades)
    2. Requests for assistance from teachers, families and students
    3. Norm-referenced, criterion-referenced, or curriculum-based screening tools
  • This series is designed to help leadership teams select and install a screening tool after they identify and prioritize non-academic skills, competencies or mind-sets that are not identified in their current screening processes.
  • Although these webinars focus on selecting an SEB screener, a similar process can be used to select and install academic screening tools.

4 of 44

Objectives

After viewing this webinar, you will be able to:

  • Develop strategies for sharing screener results with a variety of stakeholders
  • Understand how to respond to screener results across the continuum of support
  • Use tools to monitor your team’s response to screener results
  • Evaluate your screener and screener implementation

5 of 44

Installing a Universal SEB Screener Series

Key Webinars in this Series:

  • Webinar 1: Introduction
  • Webinar 2: Universal SEB Screener Selection
  • Webinar 3: Universal Screening Readiness, Resource Mapping, Gap Analysis
  • Webinar 4: Parental Consent, Action Planning and Gaining Stakeholder Buy-In
  • Webinar 5: Data-Based Decision Making

Bonus Content for Enhanced Learning:

  • Webinar 1 bonus content: Top Ten Questions about Universal Screening
  • Webinar 2 bonus content: a review of select SEB screening tools

6 of 44

Universal Screener Installation Action Steps

  1. Learn about comprehensive screening in the context of MTSS
  2. Establish a district implementation team  
  3. Complete tool selection process
  4. Determine readiness criteria and select schools to participate
  5. Organize school team(s) to support screening work
  6. Complete resource map
  7. Confirm that each school has adequate SEB supports
  8. Introduce screening initiative to school community
  9. Consult with legal team about consent/finalize consent forms
  10. Train/coach teachers to administer
  11. Conduct screening (2 or 3 times)
  12. Create data reports
  13. Prioritize needs and act on data
  14. Evaluate your screener implementation

7 of 44

System supports for a data-driven culture

  1. Provide a data coach or facilitator to guide MTSS teams through screener implementation and data analysis (e.g., screening coordinator)
  2. Create opportunities for professional development (based on roles)
  3. Provide standardized team meeting agendas
  4. Provide easy access to relevant data
  5. Set aside TIME for collaborative data analysis and planning of instructional strategies

Data-driven culture: “a learning environment within a school or district that includes attitudes, values, goals, norms of behavior and practices, accompanied by an explicit vision for data-use by leadership that emphasizes the importance and power that data can bring to the decision-making process” (Mandinach, 2012)

8 of 44

Set aside TIME for collaborative data analysis and planning of instructional strategies

Schedule a block of time to review the data!

9 of 44

12.  Create data reports

10 of 44

Sample team communication plan

| Topic | Responsible Team Member | Frequency | Recipient | Method |
| --- | --- | --- | --- | --- |
| Schoolwide (Tier 1) screening | Screening coordinator/administrator | Within a month | All staff | Staff meeting |
| Grade-level screening results | Screening coordinator/administrator | Within a month | Grade/classroom staff | Grade-level PLC meetings |
| Schoolwide (Tier 1) screening results | Tier 1 team leader/student advisor | Within a month | All students | Student advisory |
| Schoolwide (Tier 1) screening results | Screening coordinator/administrator | Within a month | All families | Newsletter |
| Concerns about screening results and plans for follow-up | School-based behavioral health staff | Within 1-2 weeks | Individual students | In person |
| Concerns about screening results and plans for follow-up | School-based behavioral health staff | Within 1-2 weeks | Individual families | In person or by phone |

11 of 44

Key considerations for reporting the results from your screener

  1. Ensure that the reporting of results protects the privacy of students and the copyright of the screener
  2. Use simple language to describe what the assessment covers, what scores represent, the precision/reliability of the scores, and how to interpret and use scores.
  3. Report the amount of error expected for a score (e.g., standard error or confidence intervals) to indicate that scores are estimates that can vary from one occasion to the next.
  4. As appropriate, identify supplemental information (e.g., results from other assessments, academic/behavioral data) that would support appropriate interpretation and use.
  5. If reporting subgroup results, individuals familiar with those subgroups should be involved in interpretation and use.
  6. When sharing screener results across time, note any significant changes in the SEL curriculum/instruction, the population of students screened and/or modes of administration.

Buros Center for Testing–Spencer Foundation Project Scholars (2020). Social-Emotional Learning Assessment Technical Guidebook.
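Point 3 above (reporting expected error) can be illustrated with a quick calculation. The sketch below is illustrative only: the score, the standard error of measurement (SEM), and the 95% multiplier are hypothetical values, not taken from any particular screener.

```python
# Sketch: report a screener score with a 95% confidence band, using
# the standard error of measurement (SEM) from the publisher's
# technical manual. All values here are hypothetical.

def score_band(score, sem, z=1.96):
    """Return the (low, high) band around an observed score."""
    return (score - z * sem, score + z * sem)

observed = 42   # hypothetical screener score
sem = 3.0       # hypothetical SEM from the technical manual

low, high = score_band(observed, sem)
print(f"Score {observed}, 95% band: {low:.1f} to {high:.1f}")
```

Reporting the band rather than the bare score reminds stakeholders that a student near a cut point may land on either side of it from one occasion to the next.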

12 of 44

#7 Make the data understandable

Before creating data reports for your users, consider the following five questions:

    • What data is important to show?
    • What do I want to emphasize in the data?
    • What options do I have for displaying the data?
    • Which option is most effective in communicating the data?
    • Who is my audience? How can I make the information understandable for them?

13 of 44

What do you want to show?

  • Distribution
  • Table
  • Comparison

14 of 44

How do the values compare to each other (exact values)?

| Teacher Last Name | Grade | # of Students Screened | # of Students At-Risk | Percent At-Risk |
| --- | --- | --- | --- | --- |
| Shaffer | 5 | 25 | 14 | 56% |
| Triggs | 4 | 26 | 13 | 50% |
| Ells | 2 | 26 | 7 | 27% |
| Memphis | 1 | 28 | 7 | 25% |
| Barrett | 2 | 25 | 5 | 20% |
| Cassidy | 4 | 21 | 4 | 19% |
| Ulrich | 4 | 28 | 5 | 18% |

von der Embse, et al. (2022)
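The "Percent At-Risk" column above can be computed and ranked directly from the raw counts. This sketch mirrors the sample table (teacher names and counts are the sample data, not real results):

```python
# Sketch: compute and rank "Percent At-Risk" from raw counts.
# Each tuple: (teacher name, grade, # screened, # at-risk),
# taken from the sample table.

classes = [
    ("Shaffer", 5, 25, 14),
    ("Triggs", 4, 26, 13),
    ("Ells", 2, 26, 7),
    ("Memphis", 1, 28, 7),
    ("Barrett", 2, 25, 5),
    ("Cassidy", 4, 21, 4),
    ("Ulrich", 4, 28, 5),
]

# Add a rounded percent column and sort highest-need first.
rows = sorted(
    ((name, grade, n, at_risk, round(100 * at_risk / n))
     for name, grade, n, at_risk in classes),
    key=lambda row: row[4],
    reverse=True,
)

for name, grade, n, at_risk, pct in rows:
    print(f"{name:<8} grade {grade}: {at_risk}/{n} at risk ({pct}%)")
```

Sorting by the computed percentage rather than the raw at-risk count keeps comparisons fair across classes of different sizes.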

15 of 44

How is the data distributed?

16 of 44

How do the values compare to each other among categories?

17 of 44

How do the values compare to each other over time?

18 of 44

Visual best practices

  • Emphasize the most important data
  • Orient graphs for legibility
  • Avoid overloading graphs
  • Limit the number of colors and shapes
  • Inform through important text

Unger, 2017

19 of 44

13.  Prioritize needs and act on data

20 of 44

“Effective data use requires going beyond the numbers and their statistical properties to make meaning of them. Teachers who engage in data-based decision making must translate the data into actions that inform instruction”

(Mandinach, 2012, p. 73).

21 of 44

A quick review of Regulation 508 Multi-Tiered System of Support (MTSS)

Purpose of 14 DE Admin. Code § 508.6.1.1 - 508.6.1.1.4:

    • To meet the academic and non-academic needs of the whole child
    • To determine when a student requires scientific evidence-based interventions
    • To identify needed supportive services for all students

Essential Components:

    • Team-based leadership, tiered system of support, assessment, data-based decision making, academic and non-academic resources, support and intervention

Procedures:

    • Tier 1 core instruction delivered with fidelity to all students
    • Multiple gating procedure to determine when a student needs support
      • First stage is universal screening to identify students who may need additional supports
      • Second stage (within two weeks) is data analysis to confirm there are specific areas of need for Tier 2 supports
      • Based on results, identified students are matched to supports
      • If 20% of students in a classroom are not meeting a benchmark, consider the need for additional classroom, instructional and systems-level supports and strategies

22 of 44

Tier 1: 80% Decision Rule

School/Grade:

→ If fewer than 80% of students are responding to universal schoolwide SEB practices, evaluate and look for ways to improve.

Subgroups:

→ If fewer than 80% of students in a subgroup (race, gender, disability status) are responding to universal SEB practices, evaluate and look for ways to improve.

Classrooms:

→ If fewer than 80% of students in a class or grade are responding to universal classroom SEB practices, evaluate and look for ways to improve.

Tip: Use the interpretation guidance from the test publisher. For example, the SAEBRS recommends using only the total score to determine risk (and the subscales for problem solving), so the 80% rule would apply only to the total score!

23 of 44

80% Decision Rule: Schoolwide

→ If fewer than 80% of students are responding to universal schoolwide SEB practices, evaluate and look for ways to improve.

Next steps:

    • Explore other data sources to identify root cause of the problem (e.g., ODR patterns, student focus groups)
    • Support teachers to implement existing Tier 1 SEB practices (e.g., professional learning, coaching, peer modeling)
    • Modify existing SEB practices (e.g., adjust school-wide matrix and lessons; increase rate of feedback for specific behaviors)
    • Implement new SEB practices

von der Embse, et al. (2022)

24 of 44

80% Problem Solving: School/Grade

80% rule:

Fall screening data indicates that 68% of our 6th graders self-reported difficulties with social behavior.

Identify the problem:

After meeting with a group of students and reviewing our discipline data (50% of ODRs involved peer-to-peer conflict), the primary need is relationship skills (solving conflicts with peers).

Set Measurable Goals:

  • # of lessons taught
  • Reduction in ODRs for peer-to-peer conflict
  • Decrease the percentage of students who self-report social behavior problems to 50% by winter screening

Proposed Solution and Action Plan:

Solving conflict skills will be added to the school’s SEB expectations and advisory teachers will be supported to deliver lessons on conflict resolution

Fidelity Monitoring Plan:

Advisory teachers will complete a weekly Google Form reporting which SEB lessons have been taught.

Monitor Outcome vs Goal:

What outcome data do you see as a result?

Did you achieve the goal, or do you need to revise a component of your problem-solving process?

25 of 44

80% Decision Rule: Classroom

| Teacher Last Name | Teacher First Name | Grade | Percent of Students Meeting Benchmark |
| --- | --- | --- | --- |
| Shaffer | Sarah | 5 | 44% |
| Triggs | Taylor | 4 | 50% |
| Ells | Erica | 2 | 73% |
| Memphis | Marsha | 1 | 75% |
| Barrett | Bob | 2 | 80% |
| Cassidy | Cara | 4 | 81% |
| Ulrich | Uma | 4 | 82% |

von der Embse, et al. (2022)

If fewer than 80% of students in a class or grade are responding to universal classroom SEB practices, evaluate and look for ways to improve.

Next steps:

  • Explore other data sources and meet with the teacher to explore the root cause of student needs

  • Support teachers to implement/differentiate Tier 1 practices (e.g., coaching)

  • Respond to the large number of SEB needs in the classroom with additional support (e.g., clinician delivers classroom SEB lessons with teacher)
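The classroom decision rule can be sketched as a simple check against the benchmark percentages in the sample table above (illustrative data only):

```python
# Sketch of the classroom 80% rule: flag classes where fewer than 80%
# of students met benchmark. Percentages mirror the sample table.

BENCHMARK = 80  # percent of students expected to respond to Tier 1

classrooms = {
    "Shaffer": 44, "Triggs": 50, "Ells": 73, "Memphis": 75,
    "Barrett": 80, "Cassidy": 81, "Ulrich": 82,
}

needs_review = [name for name, pct in classrooms.items() if pct < BENCHMARK]
print("Evaluate Tier 1 classroom practices for:", ", ".join(needs_review))
```

Here Barrett's class (exactly 80%) is not flagged; teams should decide in advance how to treat classrooms sitting right at the threshold.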

26 of 44

Example classroom quick start guide

Wichita Public Schools’ 80% Decision Rule for the SAEBRS Screener: “When 20% or more of the students in a class come up At Risk on the Total Behavior Score, utilize the quick reference guide as a springboard to get started with classroom-based support ideas”

Wichita Public Schools (2020)

Melanie Barbas, Colonial School District

27 of 44

80% Decision Rule: Subgroups

At or Above Benchmark:

  • Students with IEPs = 55%
  • All other students (without IEPs) = 85%

55% of students with IEPs are at/above benchmark as compared to 85% of all other students without IEPs.

Sample Data: Pennsylvania Training and Technical Assistance Network (2021)

Risk Ratio Calculator (Wisconsin PBIS Network)

28 of 44

Digging Deeper into the Data: Subgroups

At-Risk Risk Index:

  • Students with IEPs = 15%
  • All other students (without IEPs) = 7%

At-Risk Risk Ratio:

Students with IEPs = 0.15/0.07 ≈ 2.14

Students with IEPs are about 2 times more likely to be at risk when compared with students without IEPs.

Sample Data: Pennsylvania Training and Technical Assistance Network (2021)

Risk Ratio Calculator (Wisconsin PBIS Network)
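The risk index and risk ratio on this slide can be reproduced with a short calculation. The counts below are hypothetical; only the resulting percentages (15% vs. 7%) come from the slide's sample data.

```python
# Sketch: risk index and risk ratio from this slide's sample numbers.
# Group counts are hypothetical; only the resulting percentages
# (15% vs. 7%) come from the slide.

def risk_index(at_risk, total):
    """Proportion of a group identified as at risk."""
    return at_risk / total

def risk_ratio(target_index, comparison_index):
    """How many times more likely the target group is to be at risk."""
    return target_index / comparison_index

iep_index = risk_index(at_risk=15, total=100)    # students with IEPs
other_index = risk_index(at_risk=7, total=100)   # all other students

ratio = risk_ratio(iep_index, other_index)
print(f"Risk ratio: {ratio:.2f}")  # ~2.14: about 2x more likely to be at risk
```

A ratio of 1.0 means both groups are equally likely to be at risk; teams can also compute the ratios in a spreadsheet such as the Wisconsin PBIS Network calculator linked above.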

29 of 44

80% Rule Problem Solving: Subgroup

Trend in screener results:

Fall screening data indicates that students with IEPs are 2 times more likely to be at high risk for SEB challenges than students without disabilities in the school.

Identify the problem:

Explore schoolwide data with the big 5 questions (what is the need, where is it happening, when is it happening, who is involved, how often is it occurring) for your subgroup, in this case, students with IEPs.

How do individuals impacted by the data trend and/or root cause (e.g., families, students, educators) interpret the results?

Set Measurable Goal:

What do you hope to impact?

Create Proposed Solution and Action Plan:

Consider first: What Tier 1 strategies might be available to support students with IEPs to access T1 SEB supports? How might we strengthen these supports or access to them?

Monitor Outcome vs Goal:

What outcome data do you see as a result?

Did you achieve the goal, or do you need to revise a component of your problem-solving process?

30 of 44

Subgroup Decision Rules

  1. Determine if the group size is large enough (i.e., 10 or more)
  2. Use the 80% rule to determine if a majority of the subgroup is supported by universal SEB supports
  3. Calculate a risk ratio; those that exceed 2.0 should be evaluated further
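The three decision rules above can be combined into one check. This is a minimal sketch; the thresholds follow the slide, and the example values are illustrative:

```python
# Sketch combining the three subgroup checks: minimum group size,
# the 80% rule, and the 2.0 risk-ratio threshold. Example values
# are illustrative.

def evaluate_subgroup(n, pct_meeting_benchmark, risk_ratio,
                      min_n=10, benchmark=80, ratio_cutoff=2.0):
    """Return a list of follow-up flags for one subgroup."""
    if n < min_n:
        return ["group too small to interpret"]
    flags = []
    if pct_meeting_benchmark < benchmark:
        flags.append("below the 80% rule: strengthen universal supports")
    if risk_ratio > ratio_cutoff:
        flags.append("risk ratio exceeds 2.0: evaluate further")
    return flags or ["no follow-up indicated"]

print(evaluate_subgroup(n=40, pct_meeting_benchmark=55, risk_ratio=2.14))
```

Checking group size first prevents teams from over-interpreting percentages computed on a handful of students.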

31 of 44

Student Problem Solving

1. Cast a wide net to identify students who may need more

2. Identify which students to explore (e.g., highest 20% of students on a screener)

3. Confirm which students need support (e.g., debrief with student/teacher, record review)

4. (as needed) Dig deeper into student needs (e.g., observations, additional assessments or interviews)

5. Identify a goal for change

6. Determine next steps (i.e., match a student to a support/intervention)

Some students may need Tier 2 support

Some students may need more support/monitoring at Tier 1

Some students will need more intensive help

32 of 44

Student Problem Solving Conversations

Understand student needs:

In what areas is the student not meeting SEB expectations?

Are there academic needs underlying the student’s SEB challenges that need to be addressed?

Confirm student needs support:

Does the student have support for the need? Do they want support? What do their teachers or parents recommend?

Identify a goal for change:

What is the specific SEB need, and with whom, when, where, and why does it occur?

For identified needs, what is our goal for the student?

Determine next steps:

What interventions/supports do we have (or need) to help the student reach the goal?

Some students will need more intensive help

Some students may need Tier 2 support

Some students may just need more support/monitoring at Tier 1

33 of 44

Student Debrief Tips

  • Be timely and provide follow-up as quickly as you can
  • Show you value their privacy:
    1. ensure the setting for your conversation is private
    2. discuss any limits to confidentiality
  • Prior to your debrief, review individual screener results and make note of strengths and needs to discuss with the student
  • Restate the purpose of the screener they completed
  • Discuss strengths noted in their screener results and in daily life. For example, you could say: “I notice you do (insert an SEB strength) well. What do you think you do well?”
  • Discuss noted areas of concern in the screener results. For example, you could say: “I noticed you said that you were feeling (insert a need). Can you tell me more about that?”
  • Prompt to find out when, where, and with whom the challenges most frequently occur.

Help Me Grow (2019)

Fairfax County Public Schools (n.d)

34 of 44

Student Debrief Tips Continued

  • Find out how the student feels about any current supports they are receiving or strategies they use. For example, you could say: “When you are feeling this way, what are some things that make you feel better?”
  • Provide the student with information about how to access help if they need it.
  • Discuss next steps. For example, you could say:
    • (if you feel monitoring is an appropriate next step): “I am glad we have a chance to talk about how you are doing. Is it okay if I check in with you again from time to time? If you want to talk with me before then…”
    • (if you feel additional support is an appropriate next step): “I am glad we have a chance to talk about how you are doing. How would you feel about having some more routine check-ins and support? (if yes) I will look into this and check back in with you in a few weeks. If you want to talk with me before then…”

Help Me Grow (2019)

Fairfax County Public Schools (n.d)

35 of 44

SEB Screening Tracking Tool

Tips:

  • Complete screening (benchmarking) at least 3 times per year (as per Regulation 508)
  • You may or may not include your screener in winter benchmarking
  • Create a request-for-assistance process to identify student needs between benchmarking windows

Adapted from Wichita Public Schools

36 of 44

Systems Problem Solving

Use your data to inform group interventions:

  1. Set a cut point for targeted interventions (e.g., highest 20% of students on a screener in a grade level) based on the number of students you are prepared to support
  2. Review the data to identify salient SEB needs (e.g., domains on the screener)
  3. Review the data to identify whether particular groups are impacted (e.g., grade, gender, race, disability status)
  4. Review your inventory of Tier 2 interventions and set up groups OR make a plan to adopt a new intervention
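Setting the cut point can be sketched as follows. Student labels and scores are made up, and higher scores are assumed to mean higher risk; the 20% proportion should be adjusted to the number of students you can actually support.

```python
# Sketch: select the highest-need 20% of students on a screener as
# candidates for targeted support. Names and scores are made up;
# a higher score is assumed to mean higher risk.

import math

scores = {
    "Student A": 12, "Student B": 30, "Student C": 25, "Student D": 8,
    "Student E": 19, "Student F": 27, "Student G": 15, "Student H": 22,
    "Student I": 10, "Student J": 31,
}

cut = 0.20  # adjust based on how many students you are prepared to support
n_selected = math.ceil(len(scores) * cut)
ranked = sorted(scores, key=scores.get, reverse=True)
candidates = ranked[:n_selected]

print("Explore further:", candidates)
```

As the slides on student problem solving note, students above the cut point are candidates to explore further, not automatic intervention placements; needs should still be confirmed with teachers, families, and the students themselves.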

37 of 44

Considerations: screening for risk of suicide, self-harm, or harm to others

If there are individual items on your selected universal SEB screener that are indicative of risk for suicide, self-harm, or harm to others:

  • Remember that individual items on universal SEB screeners are not intended to be used as the sole determinant of risk for suicide or self-harm
  • Ensure an appropriate number of staff are prepared to follow up the same day with students who endorse critical items, using district-approved risk/threat assessment procedures
  • Administer the screener in the morning (e.g., before 10 a.m.) to ensure you have enough time to review critical items and follow up with students before they go home
  • Ensure you have resources on hand to share with students and families

Greenberg, J. (2022)

38 of 44

Considerations: screening for risk of suicide, self-harm, or harm to others

  • Remind families (e.g., the day before) that their children will be taking the screener, and include information about what to do if they have concerns about their child
  • After administration, consider debriefing with the class and providing information about what to do if they or someone they know is having thoughts about self-harm or harm to others

39 of 44

14.  Evaluate your screener implementation

40 of 44

3 questions to ask while evaluating your screener:

  1. Is it meeting your need?
      • Results lead to intervention planning and service delivery
      • Stakeholders understand the results
      • Improves identification of prioritized needs
  2. Is it usable?
      • Acceptability to stakeholders
      • Costs to implement
      • Data system is efficient
  3. Is it defensible?
      • Leads to accurate identification of needs
Adapted from Glover & Albers, 2007

41 of 44

Get student feedback!

Source: William Henry Middle School

42 of 44

Screening Coordinator Checklist (implementation steps)

Adapted from Michigan’s Integrated Behavior and Learning Support, 2020

43 of 44

Questions?

Niki Kendall, DE-PBS Project

Robertsn@udel.edu

44 of 44

References

Buros Center for Testing–Spencer Foundation Project Scholars (2020). Social-emotional learning assessment technical guidebook. https://buros.org/pdfs/SEL_Guidebook.pdf

Fairfax County Public Schools. (n.d.). SEL screener report and guide. Retrieved from https://www.fcps.edu/student-tests-and-assessments/student-assessment-details/social-emotional-screener/score-report

Glover, T. A., & Albers, C. A. (2007). Considerations for evaluating universal screening assessments. Journal of School Psychology, 45(2), 117-135. https://doi.org/10.1016/j.jsp.2006.05.005

Greenberg, J. (2022). 988 Suicide & Crisis Lifeline. Columbia Protocol app. Retrieved from https://cssrs.columbia.edu/wp-content/uploads/Columbia_Protocol.pdf

Help Me Grow. (2019). Sharing screening results with families. https://helpmegrowsmc.org/wp-content/uploads/2019/11/Screening-Conversation-Guide-10.15.19.pdf

Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to inform practice. Educational Psychologist, 47(2), 71-85. https://doi.org/10.1080/00461520.2012.667064

Pennsylvania Training and Technical Assistance Network (2021). Interpreting your universal screening: Highlights of building a multi-tiered system of supports for mathematics. https://www.pattan.net/CMSPages/GetAmazonFile.aspx?path=~%5Cpattan%5Cmedia%5Cpublications%5C2019%20accessible%20pdfs%5Cmtss_math_webinar_series.pdf&hash=7af8ca7074a81d5d5564ceeada8f19824c529ed38f791b0d45364cec366af512

Simonsen, B., Putnam, R., Yaneck, K., Evanovich, L., Shaw, S., Shuttleton, C., Morris, K., & Mitchell, B. (2020, February). Supporting students with disabilities within a PBIS framework. Center on PBIS, University of Oregon. www.pbis.org

Squires, J., Twombly, E., Bricker, D., & Potter, L. W. (2009). Sharing screening results with families. ASQ-3 User's Guide.

Unger, K. (2017). Visualizing your data effectively. https://www.nefrsef.org/forms/2018-forms/2018-Visualizing-your-data-effectively.pptx

von der Embse, N., Eklund, K., Kilgus, S., & Rutherford, K. (2022). Screen to intervene: Integrating mental health across tiers. University of South Florida, School Mental Health Collaborative, University of Wisconsin-Madison. https://drive.google.com/file/d/1RYTSflALPgUBhwMXKiAJvLKuKHejpqAV/view

Wichita Public Schools. (2020). Social, academic, and emotional behavior risk screener (SAEBRS) [PowerPoint slides]. https://www.usd259.org/cms/lib/KS01906405/Centricity/Domain/1568/District%20SAEBRS%20Training%20Sept%202020.pptx

Wisconsin RTI Center. (2019, February 13). Risk ratio calculator [Excel spreadsheet]. https://www.wisconsinrticenter.org/resources/risk-ratio-calculator/