1 of 23

Evaluation & Assessment

2 of 23

Assessment vs. Evaluation

What do they mean?

  • Assessment:
    • Ongoing
    • Collection of data
  • Evaluation:
    • “To judge the value”
    • Analysis of data

3 of 23

Assessment vs. Evaluation

Individual Learners

  • Assessment of individual learning
    • Pre-class surveys
    • Knowledge checks
    • K-W-L Charts
  • Evaluation of effectiveness of training
    • Final Exams
    • Performance indicators

Instructional Design

  • Assessment of audience and objectives
    • Entry and exit surveys
    • Final Exam scores
    • Performance indicators
  • Evaluation of effectiveness of training
    • Results – Behavior – Learning – Reaction

4 of 23

WHO/WHAT ARE WE EVALUATING?

5 of 23

Group Discussion

Who and What do we evaluate?

6 of 23

Assessment

Data Gathering

7 of 23

Types of Classroom Assessment

Formative

  • Pre-Assessment
  • Knowledge Checks
  • Often ungraded

Summative

  • End of Program (typically)
  • Usually Graded
  • “Final Exams”

Diagnostic

  • Polling
  • Self-assessments
  • Discussions
  • Observations
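One way (not from the slides, just an illustrative sketch) to connect a formative pre-assessment with a summative final exam is Hake's normalized gain, g = (post − pre) / (max − pre). The scores and the 100-point maximum below are hypothetical.

```python
# Normalized learning gain: how much of the possible improvement was realized.
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    if pre >= max_score:
        return 0.0  # no room left to improve; avoid dividing by zero
    return (post - pre) / (max_score - pre)

# Example: 40/100 on the pre-assessment, 85/100 on the final exam.
print(round(normalized_gain(40, 85), 2))  # 0.75
```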

8 of 23

What are “Authentic” Assessments?

“a(n) … assessment task that involves the student deeply, both in terms of cognitive complexity and intrinsic interest, and are meant to develop or evaluate skills and abilities that have value beyond the assessment itself.”

Frey, Bruce B., Schmitt, Vicki L., & Allen, Justin P. (2012). Defining Authentic Classroom Assessment. Practical Assessment, Research & Evaluation, 17(2). Available online: http://pareonline.net/getvn.asp?v=17&n=2

9 of 23

Traditional vs. Authentic Summative Assessments

Traditional:

  • Requires “Correctness”
  • Unknown prior to assessment date
  • Disconnected
  • Knowledge is measured in an isolated way
  • “One Shot”
  • Scored in an often-numerical way

Authentic:

  • Requires a “high-quality” product
  • Should be known prior to assessment
  • Tied to “real-world” contexts
  • Knowledge must be used in coordination
  • Contains recurring or ongoing tasks
  • Scored in a diagnostic way

10 of 23

Group Discussion

What are some examples of Authentic Assessment you have utilized? What are the pros and cons of Authentic Assessments, in your experience/opinion?

11 of 23

Evaluation

Data Utilization

12 of 23

KIRKPATRICK’S MODEL

13 of 23

The Basic Model:

Level 1: Reaction

  • The degree to which participants find the training favorable, engaging and relevant to their jobs

Level 2: Learning

  • The degree to which participants acquire the intended knowledge, skills, attitude, confidence and commitment based on their participation in the training

Level 3: Behavior

  • The degree to which participants apply what they learned during training when they are back on the job

Level 4: Results

  • The degree to which targeted outcomes occur as a result of the training and the support and accountability package
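As a rough planning aid (an illustrative sketch, not part of Kirkpatrick's model itself), the four levels can be listed alongside the data sources that might inform each one. The instrument names below are assumptions for the example only.

```python
from dataclasses import dataclass, field

@dataclass
class Level:
    number: int
    name: str
    instruments: list[str] = field(default_factory=list)  # possible data sources

KIRKPATRICK_LEVELS = [
    Level(1, "Reaction", ["end-of-session survey", "polling"]),
    Level(2, "Learning", ["knowledge checks", "final exam"]),
    Level(3, "Behavior", ["360 self/peer evaluations", "post-training follow-up"]),
    Level(4, "Results", ["performance indicators", "audit records"]),
]

for level in KIRKPATRICK_LEVELS:
    print(f"Level {level.number} ({level.name}): {', '.join(level.instruments)}")
```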

14 of 23

HOW CAN WE EVALUATE THAT??

15 of 23

16 of 23

A few more ideas from the field…

Level Three:

  • Observation: 360 Self/Peer Evaluations
  • Alumni Groups
  • Post-training networking
  • Involve past learners in training evaluation

Level Four:

  • Observation: 360 Supervisor Evaluations
  • Include measurable learning objectives
  • Data gathering: Audit records
  • Big Picture post-training surveys
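As a small, purely illustrative sketch (the 1–5 rating scale and the numbers are assumptions), Level 3 data from before-and-after 360 evaluations could be summarized as the average shift in observed-behavior ratings:

```python
from statistics import mean

def behavior_shift(before: list[int], after: list[int]) -> float:
    """Average change in observed-behavior ratings (assumed 1-5 scale)."""
    return mean(a - b for b, a in zip(before, after))

# Example: peer ratings on five observed behaviors, pre vs. post training.
print(behavior_shift([2, 3, 2, 4, 3], [4, 4, 3, 4, 4]))  # average improvement of 1 point
```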

17 of 23

Group Activity

Case Study Activity

18 of 23

Instructions:

Develop an initial program/course evaluation plan given the data in your case study. Answer the following questions with your plan:

  1. What is the problem or perceived problem with the training program?
  2. What assessments have been utilized?
  3. How will you evaluate the program based on the data you’ve been given?
  4. What questions will you be attempting to answer with your evaluation?
  5. What changes would you see, ideally, if your evaluation and resulting revisions are successful?
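If it helps to keep answers organized during the activity, here is one hypothetical way to capture them; the structure and field names are assumptions, not part of the case-study packet.

```python
# Simple scaffold for recording answers to the five questions above.
evaluation_plan = {
    "perceived_problem": "",   # Q1: what seems to be wrong with the program?
    "assessments_used": [],    # Q2: data already being collected
    "evaluation_methods": [],  # Q3: how the data will be analyzed and judged
    "guiding_questions": [],   # Q4: what the evaluation should answer
    "success_indicators": [],  # Q5: what ideal post-revision change looks like
}
```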

19 of 23

Low performance on assessment; high experience level

Case Study #1

20 of 23

High performance on assessment; reporting “useless” training

Case Study #2

21 of 23

Few issues in assessment; needed change not being seen

Case Study #3

22 of 23

Training viewed as “not relevant” to target audience; low motivation

Case Study #4

23 of 23

Training producing a short-term positive effect but little lasting effect on behavior

Case Study #5
