1 of 21

Examining Methods to Measure Competency in Simulation

Jamie Robinson, PhD, RN, CNL; Julie Sutherland, DNP, RN; Cindy Atkinson, DNP, RN, CNL; Tim Whelden, MSN, RN; Betsy Herron, PhD, RN, CNE

July 29, 2025

2 of 21

Disclosures and Acknowledgements

  • There are no disclosures or acknowledgements required.

3 of 21

Introduction

Competency is the ability to know, show, and do (AACN, 2021)

Competency-based education (CBE) is a well-established approach in nursing education

4 of 21

Background

Self-regulated learning involves a cycle of planning activities in advance, monitoring performance while working, and assessing the outcomes upon completion (Versteeg et al., 2021)

Having faculty assess students in a variety of settings, using multiple methods, helps mitigate bias (Duitsman et al., 2019)

5 of 21

Purpose of This Study

This study aimed to assess novice nursing students' self-efficacy and perceptions of competency attainment in a clinically based first semester nursing course

6 of 21

Study Objectives

This study aims to: 

  • Assess the efficacy of faculty use of the Standardized Clinical Performance Grading Rubric (SCPGR) to determine student competency (Kopp, 2018)
  • Assess the efficacy of student use of the Standardized Clinical Performance Grading Rubric to self-assess competency (Kopp, 2018)
  • Assess student perception of competency with the Learning Self-efficacy scale (Kang et al., 2019) 
  • Assess student report of competency in core nursing skills using the Core Competence in Fundamental Nursing Practicum Scale (Chang et al., 2022)
  • Assess student perception of the simulation effectiveness with the Educational Practices Questionnaire-C (National League for Nursing, 2021)

7 of 21

Simulation Design

Scenario Development and Competency Alignment

Developed to align with course objectives and clinical competencies

Included realistic clinical contexts and decision-making opportunities

Embedded key learning outcomes to guide competency assessment

Scaffolded with clinical experiences to support milestone progression and performance growth

8 of 21

Pilot

Purpose:

  • Tested faculty use of Standardized Clinical Performance Grading Rubric (SCPGR)
  • Evaluated competency for 113 first-semester nursing students (Fall 2024)

📚 6 criteria

  • Rated: Exemplary, Accomplished, Beginning, Developing

📊 Key Results:

  • Strong reliability: α = 0.917
    • Individual areas: 0.72–0.80
  • Significant improvement from week 1 to week 8 (Wilcoxon tests)

🎯 Impact:

  • Findings informed research design
  • Guided refinement of evaluation tools and focus areas
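The reliability and change statistics reported above (Cronbach's α, Wilcoxon signed-rank tests) can be computed with standard tools. A minimal sketch using NumPy and SciPy on simulated rubric scores; the scores below are hypothetical placeholders, not the study's actual data:

```python
import numpy as np
from scipy.stats import wilcoxon

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_students, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
# Hypothetical SCPGR scores: 113 students x 6 criteria, rated 1-4
# (Beginning=1 ... Exemplary=4). Week 8 adds a small improvement.
week1 = rng.integers(1, 5, size=(113, 6)).astype(float)
week8 = np.clip(week1 + rng.integers(0, 2, size=(113, 6)), 1, 4)

print(f"alpha = {cronbach_alpha(week1):.3f}")

# Wilcoxon signed-rank test on per-student totals, week 1 vs week 8
stat, p = wilcoxon(week1.sum(axis=1), week8.sum(axis=1))
print(f"Wilcoxon p = {p:.4f}")
```

The Wilcoxon test is the appropriate paired, nonparametric choice here because rubric ratings are ordinal rather than interval data.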

9 of 21

Methods

🎯 Purpose:

✅ Evaluate SCPGR for competency assessment

🧠 Measure self-efficacy (LSES)

📚 Assess educational practices (EPQ-C)

🩺 Check self-reported core skills (CCFNPS)

🗂️ Design:

🔬 Prospective study

🧑‍⚕️ 113 first-semester nursing students (Spring 2025)

🏥 Required clinical & simulation activities

📊 Data Collection:

🧑‍💻 2 simulations & 2 surveys (weeks 6 & 15)

🗨️ Student Surveys: LSES, EPQ-C, CCFNPS + open-ended questions

🗨️ Faculty Scoring Rubric

10 of 21

Simulation 1 and Scoring Tool

Designed for new student nurses to safely build foundational clinical skills.

Summary:

  • 69M, admitted to pulmonary rehab post-pneumonia
  • COPD, right-sided weakness post-stroke
  • Foley catheter, back discomfort, high fall risk
  • Pork-free diet for religious reasons
  • Simulation Focus:
    • Basic safety (ID, allergies, fall risk)
    • Focused assessment (VS, pain, intake/output)
    • Identify priorities (pain, thirst, culture)
    • Repositioning, catheter care
    • Inclusive, therapeutic communication

11 of 21

Simulation 2 and Scoring Tool

Designed for increased clinical judgment and skill demonstration.

Summary:

  • 67F LTC resident, 24h nausea, new weakness, fever
  • Lower abdominal pain, unable to void
  • Type II DM, toe ulcer, mild confusion, WBC 12.1
  • Simulation Focus:
    • Assess status changes, mild confusion
    • Prioritize (pain, fever, NPO, unable to void)
    • Medication safety (ondansetron, acetaminophen)
    • Infection prevention, wound care, I&O monitoring
    • Inclusive communication (gender identity)
    • SBAR handoff practice

12 of 21

Results: Educational Practices Questionnaire - Student (EPQ-S)

High satisfaction with:

    • Active learning
    • Collaboration
    • Diverse learning methods
    • Clear expectations

Reliability:

    • Simulation 1: α = 0.93
    • Simulation 2: α = 0.94

13 of 21

Results: Simulation Evaluation by Faculty

14 of 21

Results: Core Competence in Fundamental Nursing Practicum Scale (CCFNPS)

Communication (p = 0.586 overall)

Notable improvement in reading nonverbal cues

Nursing Process (p < 0.001 overall) 

Better nursing diagnoses, care planning, and physical assessments

Biomedical Knowledge (p < 0.001 overall)

Stronger understanding of medications

Nursing Skills (p = 0.23 overall)

Small improvements in organizing data and reports

Professional Attitude (p = 0.931 overall)

Consistently high, no significant change

Reliability: α = 0.92 (95% CI: 0.89–0.95)

15 of 21

Results: Learning Self-Efficacy Scale

  • 3 Domains
  • 12 items
  • Ratings of 0 (Disagree) to 5 (Agree)
  • Based on Bloom's Taxonomy, expert consensus, and content validity index

(Kang et al., 2019)

Trends: 

Increases in all domains: 

  • Cognitive
  • Affective
  • Psychomotor

(Cronbach's α = 0.84; 95% CI [0.77, 0.90])

16 of 21

Results: Learning Self-Efficacy Scale

🧠 Cognitive: Increased awareness and understanding

❤️ Affective: Improved emotional readiness and confidence

🖐️ Psychomotor: Enhanced hands-on skill performance

  • 📊 No significant change in scores between Simulation 1 and Simulation 2

  • Strong reliability across both time points
    • Cronbach’s α = 0.84; 95% CI [0.77, 0.90]
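A confidence interval like the one above can be attached to Cronbach's α with a percentile bootstrap that resamples students. A sketch on simulated 12-item, 0-5 ratings; the response matrix is invented for illustration, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_students, n_items) score matrix."""
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                          / items.sum(axis=1).var(ddof=1))

def alpha_ci(items, n_boot=2000, level=0.95, seed=0):
    """Percentile bootstrap CI for alpha, resampling students with replacement."""
    rng = np.random.default_rng(seed)
    n = items.shape[0]
    boots = [cronbach_alpha(items[rng.integers(0, n, n)]) for _ in range(n_boot)]
    tail = (1 - level) / 2 * 100
    low, high = np.percentile(boots, [tail, 100 - tail])
    return low, high

# Hypothetical LSES responses: 113 students x 12 items, rated 0-5,
# with a shared per-student tendency so items correlate.
rng = np.random.default_rng(1)
base = rng.integers(2, 6, size=(113, 1))
data = np.clip(base + rng.integers(-1, 2, size=(113, 12)), 0, 5).astype(float)

lo, hi = alpha_ci(data)
print(f"95% CI: [{lo:.2f}, {hi:.2f}]")
```

Resampling whole students (rows) rather than individual responses preserves the inter-item correlation structure that α measures.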

17 of 21

Overall Simulation Evaluation (SON QI)

Simulation 1:

  • "I really appreciated the feedback given during debrief I learned a lot within that experience."
  • "I understood about what i missed and how I can improve"
  • "It really helped me gauged my understanding of my own skills and what needs more practice."

Simulation 2:

  • "I thought that it was so cool to see how much improvement we all had from the first sim to now."
  • "I felt as though we were more organized than the previous time."
  • "I think today was better than last time. I felt more comfortable talking to my patient."

18 of 21

Limitations

  • Data collection times and methods
  • The use of simulation was critical, but it also limited the complexities we were able to add
  • The simulation was limited by having to group students in a single simulation in groups of 3–4
  • Pass/fail course affects student motivation for learning/performing

19 of 21

Implications

Objective and consistent assessment tools enhance evaluation reliability

Faculty need training and support to apply the tools reliably and consistently

Apply data gathered from assessment tools to clinical student education

Determine a way to evaluate students individually

Define competency for assignments and activities

20 of 21

Conclusions

  • Future research and evaluation are needed to support our tools and methods
  • Implement the tools and methods in educational practice
  • The use of simulation is critical to measuring results accurately and consistently

21 of 21

References