1 of 46

Universal Screening for SEB and the Critical Role of Administrators for Success

Niki Kendall, Delaware PBS Project

Dr. Felicia Kaas, Lewes Elementary School

20th Annual Policy and Practice Institute

June 2022

2 of 46

Objectives

After participating in this session, you will be able to:

  1. Describe a comprehensive universal screening process
  2. Determine next steps to establish a comprehensive SEB screening process in your school

3 of 46

DE-PBS universal screening resources to explore…

  1. Top Ten Questions Webinar
  2. Select an SEB Screener Webinar Series

www.delawarepbs.org/univeral-screening/

4 of 46

Universal Screening Defined

“Universal SEB screening is a process that relies on sound procedures for implementing evidence-based screening approaches to ensure school teams access good data to inform decisions within a system aiming to improve mental wellness, prevent SEB problems, and ensure all students access a continuum of SEB supports” (Best Practices in Universal SEB Screening Implementation Guide Version 2.0, p. 7)

Universal screening data should make it easier to identify needs for problem-solving conversations. This leads to improved access to an effective continuum of SEB supports, which in turn results in improved outcomes for students.

5 of 46

Examples of effective SEB screening practices…

  • Supported and informed by youth and family
  • A process (not a tool) that uses multiple screening approaches to increase the accuracy of decisions
  • Aligned with Tier 1 programming
  • Informs continuous problem solving for improved SEB outcomes across the tiers
  • Identifies students who may benefit from early SEB support
  • Identifies grades or classrooms that may benefit from additional resources

Adapted from Best Practices in Universal SEB Screening Implementation Guide Version 2.0

6 of 46

Examples of ineffective SEB screening practices…

  • Screens for symptoms of specific diagnoses
  • Collects data on some students but not others
  • Purpose is not well defined or communicated to youth, families, staff, and other stakeholders
  • Limited or no follow-up data collection
  • Uses teacher, parent, or student nomination data in isolation

Adapted from Best Practices in Universal SEB Screening Implementation Guide Version 2.0

7 of 46

Regulation 508 screening procedures (14 DE Admin. Code § 508.6.1.1 - 508.6.1.1.4):

  • Tier 1 core instruction delivered with fidelity to all students
  • Multiple gating procedure to determine when a student needs support
    • First stage is universal screening to identify students who may need additional supports
    • Second stage (within two weeks) is data analysis to confirm there are specific areas of need for Tier 2 supports
    • Based on the results, identified students are matched to supports
    • If 20% or more of students in a classroom are not meeting a benchmark, consider the need for additional classroom, instructional, and systems-level supports and strategies
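The 20% classroom rule above is simple arithmetic; as an illustration only (the function name and default threshold here are ours, not part of Regulation 508), a team could compute it as:

```python
# Sketch: flag a classroom for additional classwide/systems-level support
# when at least 20% of its students are not meeting a benchmark.
def needs_classwide_support(benchmark_flags, threshold=0.20):
    """benchmark_flags: one boolean per student, True = met benchmark."""
    if not benchmark_flags:
        return False  # no data, nothing to flag
    not_meeting = sum(1 for met in benchmark_flags if not met)
    return not_meeting / len(benchmark_flags) >= threshold

# 5 of 22 students (about 23%) below benchmark -> review classwide supports
print(needs_classwide_support([True] * 17 + [False] * 5))  # True
```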

8 of 46

The multi-stage screening procedure unpacked

Universal screening (stage/gate 1) draws on:

  • Nominations
  • Expected levels of performance on schoolwide data
  • Universal screening tool data

Data analysis (stage/gate 2):

  • Data analysis to explore reasons for why a problem is happening

From there:

  • Strategies are developed/implemented to improve student outcomes
  • Strategies are evaluated for fidelity and effectiveness

This is a problem-solving process

9 of 46

Sources of data to identify students (stage 1)

Social, emotional, and behavioral:

  • Universal screening data (brief indirect ratings of universally supported behaviors, mind-sets, or competencies)
  • Schoolwide data with decision rules
  • Family and educator referrals

Academic:

  • Universal screening data (brief direct assessments of universally supported academic skills)
  • Schoolwide data with decision rules
  • Family and educator referrals

10 of 46

Review of school-wide data

  • Ongoing (e.g., monthly) review to identify student needs
  • Requires teams to set expected levels of performance

Advantages:

  • Acceptable/easy to access
  • Aligns with service delivery

Disadvantages:

  • Reactive (student must exhibit problems before they are identified)
  • ODRs not sensitive to internalizing behaviors
  • ODRs are subjective/biased

Sample Schoolwide Data Decision Rules

  Measure         | Expected    | Some Risk        | At Risk
  ODR             | 0-1/quarter | 2-3/quarter      | 4+/quarter
  Absences        | 0-3/quarter | 4/quarter        | 5+/quarter
  Nurse Visits    | No concern  | Moderate concern | High concern
  GPA             | 3.0+        | 2.0-2.9          | Less than 2.0
  Course Failures | 0           | 1                | 2+
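Teams that keep schoolwide data in a spreadsheet or a simple script can apply decision rules like these mechanically. The sketch below is our illustration (not a DE-PBS tool); it bands one student's quarterly data using the thresholds from the sample table:

```python
# Sketch: band one student's quarterly data against the sample schoolwide
# decision rules. Real teams set their own expected levels of performance.
def count_band(value, some_risk, at_risk):
    """Band a 'higher is worse' count against inclusive lower bounds."""
    if value >= at_risk:
        return "At Risk"
    if value >= some_risk:
        return "Some Risk"
    return "Expected"

def gpa_band(gpa):
    """GPA runs the other way: higher is better (3.0+ expected)."""
    if gpa >= 3.0:
        return "Expected"
    if gpa >= 2.0:
        return "Some Risk"
    return "At Risk"

def risk_bands(odr, absences, gpa, course_failures):
    return {
        "ODR": count_band(odr, some_risk=2, at_risk=4),            # per quarter
        "Absences": count_band(absences, some_risk=4, at_risk=5),  # per quarter
        "GPA": gpa_band(gpa),
        "Course Failures": count_band(course_failures, some_risk=1, at_risk=2),
    }

print(risk_bands(odr=3, absences=5, gpa=3.2, course_failures=0))
# -> {'ODR': 'Some Risk', 'Absences': 'At Risk', 'GPA': 'Expected',
#     'Course Failures': 'Expected'}
```

A student flagged on any measure would move to the stage-2 data analysis conversation rather than straight to an intervention.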

11 of 46

Individual ratings (SEB screening tools)

  • Brief
  • Typically norm-referenced with cut-off scores to identify level(s) of risk
  • May yield one score as a measure of overall risk or multiple measures of separate constructs
  • Able to assess a variety of behaviors, mind-sets or competencies

Advantages:

  • Assess all students
  • Norm-referenced
  • Versatile (multiple constructs)

Disadvantages:

  • Require systems to support response
  • Cost
  • Some students under-report symptoms

12 of 46

Request for Assistance/Nomination Form...

  • Always available
  • One form whether the concern is academic, behavioral, or both
  • You may have one already for teachers, but do you have one for families and students?

Advantages:

  • Should be brief
  • Acceptable/easy to access

Disadvantages:

  • Reactive (often students must exhibit problems before they are identified)
  • Depending on forms, may be inefficient
  • If systems and Tier 1 supports are ineffective, may overwhelm Tier 2 team
  • May over or under identify particular behaviors/needs

13 of 46

Sample request for assistance forms

14 of 46

What universal screening approaches are you using to identify students in need of support?

  • Requests for assistance?
  • Review of schoolwide data (e.g., ODRs)?
  • Universal screening tools?
  • Other?

15 of 46

Key points about universal screening data:

  • May come from screening measures/tools, grades, attendance, and behavioral referrals, among other sources.
  • Should include screening measures that are brief, technically adequate, designed for repeated use, and that have research-based cut scores for determining which students are at risk.
  • Do not require school teams to use a commercially published screening tool.
  • Should include all students:
    • within the first 4 weeks of the school year or within 4 weeks of the student’s entry into school.
    • at least three times per year.
  • Do not require all students to be screened with every tool.

16 of 46

General recommendations:

  • Use multiple strategies
  • Develop decision rules
  • Use multiple informants (e.g., teacher, family, student)
  • Review at multiple points in time
    • Screening at multiple times
    • On-going data review and access to nomination process


17 of 46

Universal Screener School-Level Action Steps:

  1. Establish or modify school’s meeting structures to support screening work
  2. Identify and schedule SEB universal screening approaches
  3. Confirm adequate SEB supports
  4. Introduce SEB universal screening approaches to school community
  5. Consult with district leadership about consent/finalize consent forms
  6. Train teachers
  7. Conduct screening (2 or 3 times)
  8. Review disaggregated data by group for tier 1 supports
  9. Review student data and make decisions about how to provide Tier 2 or 3 supports
  10. Evaluate your screener implementation

18 of 46

Implementation Teams

“A group of stakeholders that oversees, attends to, and is accountable for key functions of innovation selection, implementation, and improvement related to an evidence-based practice or program.”

3 functions:

    • Ensure implementation
    • Engage the community
    • Create effective environments (e.g., scheduling, resources, curriculum choices, professional development, resource allocation)

National Implementation Research Network (NIRN)

19 of 46

An administrator is essential to the team:

  • Assures school staff that implementation will be supported with resources (e.g., time, incentives, training)
  • Orients staff to new ways of doing business and provides clear expectations to staff
  • Promotes frequent feedback from staff regarding the progress of implementation (and needed supports)
  • Addresses competing practices that may decrease resources

McIntosh, Predy, et al. (2013)

One of the strongest predictors of MTSS-SEB sustainability

20 of 46

Selecting a new SEB screening approach

Construct(s)

Informant(s)

Procedure(s)

21 of 46

Use existing data to start the conversation…

Questions adapted from: DuBois, Antonelli & Hill, 2022

Table columns:

  • Data source reviewed
  • Common social, emotional, or behavioral challenges experienced by your students
  • Are there any groups that are disproportionately impacted?
  • What supports are available in our schools to address the need?*
  • What questions could you ask students (or ask teachers about students) to more effectively address these needs?

Example row:

  • Challenge: peer relationships are a concern
  • Disproportionately impacted: especially our high school girls
  • Available supports:
    • Elementary: Tier 1 (Second Step, schoolwide PBIS); Tier 2 (social skills groups, lunch bunches)
    • Secondary: Tier 1 (advisory, schoolwide PBIS); Tier 2 (Overcoming Obstacles)
  • Questions for students: Do you have friends at school? Can you solve conflicts when they arise?

22 of 46

Use your data trends to prioritize the behaviors, mind-sets and/or competencies you hope to impact

Externalizing behaviors (e.g., arguing, disruption) or internalizing behaviors (e.g., sadness, worry)?

Mind-sets or attitudes (e.g., motivation to learn, feelings about school)?

Academic enablers (e.g., academic engagement)?

Social and emotional skills (e.g., responsible decision making, cooperation with peers)?

Mental health concerns (e.g., substance misuse, suicidal ideation)?

23 of 46

Informants

Teacher

  • May be more sensitive to externalizing than internalizing challenges
  • Primary use in Pre-K-6th grade; secondary use with 7-12

Parent

  • Primary use with PK and K
  • Response rates may be low

Student

  • Often best informant for internalizing challenges
  • Primary use with secondary students due to their increased awareness of their own psychological experiences

24 of 46

Procedures

What systems are needed to collect, store, analyze and interpret SEB screening data?

  • Free tools are available but require you to analyze the data

Will data inform intervention within existing systems?

  • Aligns with interventions you have available

Roles and responsibilities?

  • Data analysis
  • Implementing intervention connected to data

25 of 46

Measure | Purpose | Expected | At Risk | High Risk | Schedule

Course Grades | To monitor student progress with core academic content and response to instructional practices | Passing all courses | Failing 1 course | Failing 2 courses | Reviewed each quarter

Attendance | To monitor student access to core instructional and SEB practices | Less than 5 days/quarter and less than 19 days/year | 5-8 days/quarter or 19+ days (full year) | 9+ days/quarter or 36+ days (full year) | Reviewed each quarter

Office Referrals | To monitor student response to core SEB practices and prevent externalizing behavioral difficulties and/or school dropout | Less than 3 per year | 1 per quarter or 3-5 per year | 2 per quarter or 6+ per year | Reviewed each quarter

Screening tool A (self-report) | To monitor student progress with core SEB content and response to prevention practices, preventing internalizing and externalizing behavioral difficulties and/or mental health disorders | 0-15 | 16 or higher | 20 or higher | Fall, spring

26 of 46

Confirm adequate SEB supports

Directions:

To complete this form, we recommend starting with an area (e.g., Self-Efficacy) and then naming all of the resources, strategies, and supports you have available in your building to address that area.

When you identify a support, place a check in all of the areas that it addresses. One support may address multiple areas.

DuBois, Antonelli and Hill, 2022

Panorama SEL and Wellness Survey

27 of 46

Universal Screening & Consent Forms

Options:

  • Letter of Notification (broad or specific)
  • Letter of Notification with an opt-out option
  • Informed Consent (opt-in)

Suggested practice: Letter of Notification (opt-out)

Letter to include:

  1. Purpose of screening
  2. Areas of focus in the tool
  3. Plan for follow-up if concerns are identified
  4. Date to return opt-out forms

28 of 46

MTSS Implementation Priority Areas

DE-MTSS School Quick Reference Guide, 2021

29 of 46

SEL Screening in Practice:

Lewes Elementary School

K-5 School

Enrollment: 530 students

  • 260 males / 270 females
  • 395 White / 63 Hispanic / 43 Black / 10 Asian / 2 American Indian / 17 two or more races

30 of 46

Infrastructure for MTSS

  • Core Problem-Solving Team identified
    • Admin, school counselor, social worker, school psychologist, reading specialist, math specialist, and ELL teacher
  • Meeting schedule set
    • Quarterly Data Day meetings to review trends in classroom/universal data and discuss intervention group needs
    • Weekly Problem-Solving Team meetings
  • System in place to track data

31 of 46

Example Data Sheet

32 of 46

33 of 46

Screening Approaches

  • Discipline Data
    • Tier 2: 3-5 ODRs
    • Tier 3: 6+ ODRs
  • Attendance Data
    • Concern noted when student misses about 10% of days
  • Teacher/ staff requests for assistance
  • SEL screener data

34 of 46

Selecting a Screener

What was important to us?

  • Ease of use
  • Cost
  • Preference to stay general
  • Specific cut-off scores to help us group students by need

35 of 46

Screener Procedures

  • Stakeholder buy-in
    • Introduce to staff
    • Admin support
    • Minimal time commitment
    • Understanding purpose
  • Consent procedures
    • Recommended to allow parents to opt out of SEL screeners
  • Data collection/analysis
    • Data were collected at our Data Days meetings three times a year (about 10-15 minutes to complete for an entire class)
  • Data sharing
    • Data shared with the problem solving team and the teacher

36 of 46

Student Risk Screening Scale- Internalizing and Externalizing (SRSS-IE)

Purpose: The SRSS-IE is used to identify students who may be at risk for challenging antisocial behaviors and to better inform instruction. Teachers rate students on items related to both internalizing and externalizing behaviors.

  • Grades K-12, teacher report only
  • 12 items for elementary school (7 externalizing, 5 internalizing)
  • 13 items for middle & high school (7 externalizing and 6 internalizing)
  • Categorizes students into 3 bands based on cut scores for both subscales separately
    • Low risk, medium risk, or high risk
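Scoring is straightforward to sketch: each item is rated 0-3, each subscale is summed, and each sum is banded against its own cut scores. This is an illustration only; the elementary cut scores below are the commonly published values, but verify them against current SRSS-IE guidance before use.

```python
# Sketch: score one elementary student's SRSS-IE ratings. Each of the 12
# items is rated 0 (never) to 3 (frequently); the two subscales are summed
# and banded separately. Cut scores here are the commonly published
# elementary values -- confirm against current SRSS-IE guidance.
def srss_ie_bands(externalizing_items, internalizing_items):
    assert len(externalizing_items) == 7 and len(internalizing_items) == 5
    ext, intr = sum(externalizing_items), sum(internalizing_items)

    def band(score, medium, high):
        if score >= high:
            return "high risk"
        if score >= medium:
            return "medium risk"
        return "low risk"

    return {
        "externalizing": (ext, band(ext, medium=4, high=9)),   # 0-21 scale
        "internalizing": (intr, band(intr, medium=2, high=4)), # 0-15 scale
    }

print(srss_ie_bands([1, 0, 2, 0, 0, 1, 0], [0, 1, 0, 0, 0]))
# -> {'externalizing': (4, 'medium risk'), 'internalizing': (1, 'low risk')}
```

Because the subscales are banded separately, a student can sit in different risk bands for externalizing and internalizing concerns.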

37 of 46

SRSS-IE

Cost

  • Available online for free access

Administration & Scoring

  • Guidance documents and presentations for districts and site-level coordinators available online
  • Teachers directly rate students in the Excel file or Google Sheet
  • A teacher can complete screening for a class in about 15-20 minutes

Data management system:

  • No specific data management system is provided, but guidance is available for
    • How districts may want to organize files for staff
    • How teachers use the Excel sheet to enter their scores
  • Customizable score reporting PowerPoint template is available

Repeatability:

  • Intended to be used 3 times per year (4-6 weeks after the school year starts, prior to winter break, and 6 weeks prior to the end of the school year)

38 of 46

Data-Decision Making

39 of 46

Data-Decision Making

40 of 46

Universal SEL supports

  • Universal SEL curricula in place schoolwide
    • Leader in Me and Kimochis (previously used Second Step)
  • Mentoring programs (community mentors, school staff mentors, and student mentors)
  • Leadership roles
  • Classwide SEL instruction delivered by counselor or social worker

41 of 46

Targeted and Intensive Supports

Tier 2 Supports

  • More targeted mentoring focus (Check-In/Check-Out or goal setting/PALS)
  • Counseling groups focusing on friendship skills, conflict resolution, impulse control, and emotions (set to 6 weeks in length)

Tier 3 Supports

  • Point cards with targeted replacement behaviors
  • Individualized counseling
  • Increased counseling time/ intensity
  • FBA/BIP with individualized interventions and supports

42 of 46

Tier 2 Intervention/ Data Example

43 of 46

Tier 2 Intervention/ Data Example

44 of 46

Next steps and Reflections

  • Continue use for the 2022-2023 school year
  • Administer prior to our Data Days meeting to help inform our discussion
  • Used data to shape teacher professional development in areas of student need
  • Data is used to make class lists for the following year
  • Positive feedback from teachers
    • Helped them reflect on whether concerns were severe enough to warrant extra support
    • Increased teachers’ awareness of students struggling with internalizing behaviors
  • Be alert for big changes in student scores
  • Important to remember this scale reflects teacher opinion

45 of 46

SRSS-IE Resources

Lane, K. L., Oakes, W. P., Cantwell, E. D., Common, E. A., Royer, D. J., Leko, M. M., Schatschneider, C., Menzies, H. M., Buckman, M. M., & Allen, G. E. (2019). Predictive Validity of Student Risk Screening Scale—Internalizing and Externalizing (SRSS-IE) Scores in Elementary Schools. Journal of Emotional and Behavioral Disorders, 27(4), 221–234. https://doi.org/10.1177/1063426618795443

Lane, K. L., Oakes, W. P., Cantwell, E. D., Royer, D. J., Leko, M. M., Schatschneider, C., & Menzies, H. M. (2019). Predictive Validity of Student Risk Screening Scale for Internalizing and Externalizing Scores in Secondary Schools. Journal of Emotional and Behavioral Disorders, 27(2), 86–100. https://doi.org/10.1177/1063426617744746

Lane, K. L., Oakes, W. P., Ennis, R. P., Cox, M. L., Schatschneider, C., & Lambert, W. (2013). Additional evidence for the reliability and validity of the Student Risk Screening Scale at the high school level: A replication and extension. Journal of Emotional and Behavioral Disorders, 21(2), 97-115. https://doi.org/10.1177/1063426611407339

Lane, K. L., Oakes, W. P., Menzies, H. M., Buckman, M. M., & Royer, D. J. (2020). Systematic Screening for Behavior: Considerations and Commitment to Continued Inquiry [Research Brief]. Ci3T Strategic Leadership Team. http://www.ci3t.org/screening

Michigan’s Integrated Behavior and Learning Support Initiative. (2020, February). Student Risk Screening Scale – Internalizing and Externalizing, Coordinator Training [PowerPoint slides]. Michigan Department of Education. https://www.ci3t.org/wpcontent/uploads/2020/09/00_SRSS-IE_Coordinator_Presentation.pdf

46 of 46

Thank you!