1 of 15

SSU QLT Working Group: QLT Instrument Validation based on Student Response Data

Kyle Falbo, M.S., Educational Technology Applications Expert, CTET

Dr. Justin Lipp, Director, Center for Teaching & Educational Technology (CTET)

2 of 15

Background/Methods

  • To our knowledge, no empirical validation of the QLT instrument has been conducted using student raters.
  • Our analysis represents the first phase of a larger research agenda that aims to obtain representative data from the SSU student population on factors that impact student success in online education.
    • Phase 1: Item validation using a streamlined set of measures derived from the 50+ QLT indicators (reduced to 31 items, based on the project team’s assessment of what students could reasonably be expected to rate).
      • Student raters were asked to review 8 online courses provided by SSU faculty using the streamlined instrument. We then conducted our item analysis.
    • Phase 2 (delayed to Fall 2024): Using the refined, validated QLT-based questionnaire, expand the scope to a representative sample of campus students. We intend to include experimental manipulations (A/B testing) of “well designed” vs. “poorly designed” course elements drawn from QLT, and to explore whether the refined instrument can detect discernible patterns/differences in student response data under the controlled manipulations.

3 of 15

Questionnaire Part 1: Course Syllabus, Welcome/Getting Started Module, and Technical Support Module

  1. Are there working links to the instructor’s email, scheduling, calendaring apps, social media, and/or other communication platforms?
  2. Are course and campus policies about cheating, plagiarism, and copyright, including possible consequences and course grievance procedures, identified?
  3. Does the instructor state the technology required for the course; how to access that technology; and instructions for how to use it?
  4. Does the instructor indicate office hours for the course?
  5. Does the instructor indicate how to contact them with questions?
  6. Does the course syllabus include a statement about diversity, equity, and/or inclusion?
  7. Does the syllabus tell students what they should expect to learn from the course?
  8. Does the instructor explain the grading policy and grade breakdown for course assignments?
  9. Does the syllabus indicate which course materials are required versus recommended?

4 of 15

Questionnaire Part 1: Course Syllabus, Welcome/Getting Started Module, and Technical Support Module (Cont’d)

  • Does the instructor provide a survey to allow students to self-assess their readiness for online learning?
  • Does the course indicate what successful participation in the course looks like?
  • Does the instructor state how they will interact with students, including the frequency of the instructor’s participation?
  • Does the instructor state how soon they will provide feedback on student work?

  • Does the instructor tell students how to get technical support for the class?
  • Does the instructor tell students how to access campus support services, such as the tutoring center, library, or mental health services?
  • Does the instructor tell students how to access Disability Services for Students?

5 of 15

Questionnaire Part 2: Course Structure & Navigation

  1. Does the instructor or course website tell students where to start with course materials?
  2. Is the course consistently organized?
  3. Are the course elements and materials clearly labeled?
  4. Are the course materials organized chronologically?
  5. Does the course indicate due dates for assignments and activities?
  6. Does the instructor ask students to reflect on why this course is important or useful to them?

  1. Does the instructor provide a rubric or explanation for how student submissions will be graded?
  2. Is that grading rubric or explanation easy to understand? (If no rubric or explanation is present, indicate No)
  3. Does the course have frequent graded assignments or other opportunities for the instructor to provide feedback on student work?
  4. Does the course have a mid-course survey?
  5. Does the course have an instructor-generated end of course survey, distinct from the official university course survey?

6 of 15

Questionnaire Part 2: Course Structure & Navigation (Cont’d)

  1. Are course materials delivered in more than one media type (e.g., lecture, textbooks, online courseware, PowerPoint slide shows, or embedded videos)?
  2. Are there any activities in the first week of the course designed for students to introduce themselves to each other, such as icebreakers or self-introduction discussion posts?
  3. Does the course have discussion forums or other opportunities for students to interact with each other throughout the term?
  4. Do discussion forums or opportunities for students to interact with each other state expectations for participation, such as length, frequency, or timeliness?

How important is this to you as a learner?

(Not at all important/Moderately important/Very important/Prefer not to answer)

7 of 15

Items That Are Present But Less Important to Students

  • Are course and campus policies about cheating, plagiarism, and copyright, including possible consequences and course grievance procedures, identified?
  • Does the instructor tell students how to get technical support for the class?
  • Does the instructor tell students how to access campus support services, such as the tutoring center, library, or mental health services?
  • Does the course have frequent graded assignments or other opportunities for the instructor to provide feedback on student work?
  • Are course materials delivered in more than one media type (e.g., lecture, textbooks, online courseware, PowerPoint slide shows, or embedded videos)?
  • Are there any activities in the first week of the course designed for students to introduce themselves to each other, such as icebreakers or self-introduction discussion posts?
  • Does the course have discussion forums or other opportunities for students to interact with each other throughout the term?

8 of 15

Items That Are More Important to Students But Less Present

  • Does the instructor state how soon they will provide feedback on student work?
  • Is that grading rubric or explanation easy to understand? (If no rubric or explanation is present, indicate No)
  • Do discussion forums or opportunities for students to interact with each other state expectations for participation, such as length, frequency, or timeliness?

9 of 15

Items That Are Less Present

  • Does the instructor provide a survey to allow students to self-assess their readiness for online learning?
  • Does the instructor state how soon they will provide feedback on student work?
  • Does the instructor ask students to reflect on why this course is important or useful to them?
  • Does the course have a mid-course survey?
  • Does the course have an instructor-generated end-of-course survey, distinct from the official university course survey?
  • Are there any activities in the first week of the course designed for students to introduce themselves to each other, such as icebreakers or self-introduction discussion posts?
  • Do discussion forums or opportunities for students to interact with each other state expectations for participation, such as length, frequency, or timeliness?

10 of 15

Preliminary Results/Analysis: Inter-Rater Reliability

Goal: to assess whether students’ ratings of the online courses were consistent, reflecting a shared understanding of the items (construct validity)

Results:

  • Fleiss’ Kappa ranged from 0.447 to 0.668 across the 31 items, indicating moderate agreement among raters (desirable, as we need some discriminant variability to assess which items provide statistical explanatory power)
  • Kendall’s W (strength of agreement among multiple raters) = 0.506 (Chi-Sq = 486.21, df = 30, p < .001), again indicating moderate agreement
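For reference, a minimal Python sketch of how these two statistics can be computed. The data layout, variable names, and placeholder values are assumptions for illustration (we assume per-item Fleiss’ Kappa across the 8 rated courses, and Kendall’s W over a 31-item × rater matrix); statsmodels provides fleiss_kappa, while W is computed from rank sums:

```python
import numpy as np
from scipy.stats import rankdata
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical layout: responses[item] is an 8-course x N-rater matrix of
# ordinal scores for one questionnaire item (placeholder random data).
rng = np.random.default_rng(0)
responses = {i: rng.integers(0, 3, size=(8, 6)) for i in range(31)}

# Per-item Fleiss' kappa across the 8 rated courses (yields the reported range).
kappas = []
for mat in responses.values():
    table, _ = aggregate_raters(mat)          # courses x categories count table
    kappas.append(fleiss_kappa(table, method="fleiss"))
print(f"kappa range: {min(kappas):.3f} to {max(kappas):.3f}")

# Kendall's W across the 31 items (ties correction omitted for brevity):
# W = 12*S / (m^2 * (n^3 - n)); chi-square approx. = m*(n-1)*W with df = n-1.
item_by_rater = rng.integers(0, 3, size=(31, 6)).astype(float)  # placeholder
n, m = item_by_rater.shape
ranks = np.apply_along_axis(rankdata, 0, item_by_rater)  # rank items per rater
S = ((ranks.sum(axis=1) - ranks.sum(axis=1).mean()) ** 2).sum()
W = 12 * S / (m**2 * (n**3 - n))
print(f"W = {W:.3f}, Chi-Sq = {m * (n - 1) * W:.2f} (df = {n - 1})")
```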

11 of 15

Preliminary Results/Analysis: Inter-Rater Reliability (Cont’d)

  • Friedman’s test (non-parametric test for differences in mean ranks among ordinal data): Chi-Sq = 482.21, df = 30, p < .001 (see the sketch after the rankings below)
  • Item Mean Rankings (Top 5):
    • 21.84: Working Links to Instructor Means of Communication
    • 21.83: Explanation of Grading Policy
    • 21.77: Course Elements & Materials Clearly Labeled
    • 21.77: Due Dates for Assignments & Activities
    • 21.33: How to Contact Instructor w/ Questions
  • Item Mean Rankings (Bottom 5):
    • 3.86: Instructor-generated end-of-course survey separate from SETEs
    • 4.19: Mid-course survey
    • 7.61: Instructor asking students to reflect on why course is meaningful/useful
    • 8.34: Providing self-assessment for readiness for online learning
    • 9.94: IM presented in multiple formats
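A minimal sketch of the Friedman test and the item mean rankings with SciPy, reusing the hypothetical 31-item × rater matrix from the previous sketch (placeholder data, not the study’s):

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

# Hypothetical 31-item x N-rater matrix of ordinal scores (placeholder data).
rng = np.random.default_rng(1)
item_by_rater = rng.integers(0, 3, size=(31, 6)).astype(float)

# Friedman test: each argument is one item's ratings across the same raters.
stat, p = friedmanchisquare(*item_by_rater)
print(f"Chi-Sq = {stat:.2f}, df = {item_by_rater.shape[0] - 1}, p = {p:.4g}")

# Item mean rankings: rank the 31 items within each rater, average across raters.
ranks = np.apply_along_axis(rankdata, 0, item_by_rater)
mean_ranks = ranks.mean(axis=1)
top5 = np.argsort(mean_ranks)[::-1][:5]   # highest mean rank first
bottom5 = np.argsort(mean_ranks)[:5]      # lowest mean rank first
```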

12 of 15

Preliminary Results/Analysis: Factor Analysis

  • Principal Components Analysis & Exploratory Factor Analysis
    • Examining the latent statistical structure of the data to look for shared variability in ratings and to apply a theoretical framework
  • PCA (see the sketch below):
    • Suggested a preliminary 7-factor overall latent conceptual framework (variance explained: 86.71%)
    • Multicollinearity problems in the data required dropping variables with a high percentage of agreement, in order to retain more discriminating factors that still explain a sufficiently high degree of variance
    • Also necessary to reduce the total size of the instrument for Phase 2
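A minimal sketch of this screening-plus-PCA step with scikit-learn, assuming a hypothetical raters × items score matrix; the variance threshold is illustrative, not the project’s actual cutoff:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical raters x 31-item score matrix (placeholder data).
rng = np.random.default_rng(2)
scores = rng.integers(0, 3, size=(60, 31)).astype(float)

# Drop near-constant items (very high % agreement -> almost no variance):
# they contribute little discriminating power and aggravate multicollinearity.
keep = scores.std(axis=0) > 0.25          # illustrative threshold
X = StandardScaler().fit_transform(scores[:, keep])

pca = PCA().fit(X)
cum_var = np.cumsum(pca.explained_variance_ratio_)
# Components needed to reach ~86.7% of variance (the analysis found 7).
n_components = int(np.argmax(cum_var >= 0.867) + 1)
print(n_components, cum_var[:n_components].round(3))
```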

13 of 15

Preliminary Results/Analysis: Factor Analysis (Cont’d)

EFA: After many iterations, a 2-factor solution was determined (62.54% of variance explained), with 13 total items (see the sketch after the factor lists below):

Factor 1:
  • Course indicates what successful participation looks like
  • Syllabus tells students what to expect/learn
  • Technology required for the course
  • Instructor stating how they will interact w/ students
  • Course materials organized chronologically
  • Course organizational consistency
  • How to contact instructor w/ questions
  • Syllabus indicating required vs. optional materials
  • Grading rubrics/explanations easy to understand

Factor 2:
  • Statement on campus policies about cheating/plagiarism
  • How soon students can expect feedback from instructors
  • IM presented in multiple formats
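A minimal sketch of the EFA step using the third-party factor_analyzer package; the data matrix is hypothetical and the varimax rotation is an assumption for illustration (the slides do not specify the rotation used):

```python
import numpy as np
from factor_analyzer import FactorAnalyzer

# Hypothetical raters x items score matrix, after the PCA-stage screening.
rng = np.random.default_rng(3)
X = rng.integers(0, 3, size=(60, 13)).astype(float)

# Fit a 2-factor solution; varimax is one common orthogonal rotation.
fa = FactorAnalyzer(n_factors=2, rotation="varimax")
fa.fit(X)

loadings = fa.loadings_                    # items x 2 matrix of factor loadings
_, _, cum_var = fa.get_factor_variance()   # variance, proportion, cumulative
print(f"cumulative variance explained: {cum_var[-1]:.2%}")
```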

14 of 15

Preliminary Inferences & Next Steps

  • Preliminary thematic alignment with Kahneman’s Peripheral (Factor 1) vs. Systematic (Factor 2) Information Processing Framework
    • A theoretical explanation could allow us to develop and test hypotheses in line with well-validated information processing theory, to further explore student responses to QLT ratings
    • May be an artifact of how the survey was constructed (future surveys will need to rotate item ordering to eliminate testing effects)
  • Next Steps: Continuing item analysis, refining the short-form survey, and developing experimental manipulations (e.g., a 2x2 factorial design crossing the presence/absence of QLT indicators, such as a mock syllabus that does or does not contain learning outcomes); a sketch of one possible assignment scheme follows below
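As one way the planned 2x2 manipulation could be operationalized, here is a minimal sketch of balanced random assignment of participants to the four cells; the factor names and participant IDs are hypothetical:

```python
import itertools
import random

# Two hypothetical QLT-derived factors, each present or absent -> 4 cells.
CELLS = list(itertools.product(["outcomes_present", "outcomes_absent"],
                               ["rubric_present", "rubric_absent"]))

def assign(participants, seed=42):
    """Balanced random assignment: shuffle participants, then deal them
    round-robin across the four factorial cells."""
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)
    return {pid: CELLS[i % len(CELLS)] for i, pid in enumerate(shuffled)}

print(assign([f"S{i:03d}" for i in range(12)]))
```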

15 of 15

Thank You!

Questions?