1 of 25

Research Partnerships @ Code.org

Baker Franke

SIGCSE 2021

2 of 25

Goal: Invest in supporting high-quality computer science education research that leverages Code.org’s data and tools.

Supporting Advancement in CS Ed Research

Supporting Regional Partners

Supporting our Mission

3 of 25

Education Programs @ Code.org

CS Fundamentals (K-5)

  • Curriculum: six 20-hour courses by grade level; ~1M students/yr
  • Professional Learning: 1-day workshops (7 hrs); ~10K teachers / 500 facilitators

CS Discoveries (6-9)

  • Curriculum: full-year curriculum w/ semester option; ~100K students/yr
  • Professional Learning: 5-day summer + 4 follow-up workshops (57.5 hrs); 2.5K teachers / 250 facilitators

CS Principles (9-12)

  • Curriculum: full-year AP curriculum; ~60K students/yr
  • Professional Learning: 5-day summer + 4 follow-up workshops (57.5 hrs); 1.5K teachers / 150 facilitators

(Professional Learning serves both teachers and facilitators.)

4 of 25

5 of 25

6 of 25

7 of 25

Course Activity Data

  • Teacher Sections: section_id / course_id; teacher_id / student_id
  • User Levels: Course / Unit / Lesson; Timestamps (create/mod)
  • Student Code: Block config / Free Code
  • Assessments: MC / Free Response

User Demographic Data

  • Users: Teacher / Student; gender / race / age / etc.
  • School Data (NCES): %Wh, %Bl, %Hi, etc.; %FRL; Urban/Rural; City / State / Zip

PD Activity Data

  • Teacher Workshops: Date / Location / Course; Facilitator; Regional Partner Affiliation
  • Facilitator Trainings: Date / Location / Course

Survey Data

  • Teacher Surveys: MC / Free responses
  • Student Surveys: MC / Free responses
  • Facilitator Surveys: MC / Free responses
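The tables above link through shared IDs (student_id, teacher_id, section_id). A minimal sketch of how a researcher might join course activity back to teacher sections; the table and column names here are illustrative guesses, not Code.org's actual warehouse schema:

```python
import sqlite3

# In-memory sketch; schema is hypothetical, not Code.org's real one.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sections (section_id INTEGER, course_id TEXT,
                       teacher_id INTEGER, student_id INTEGER);
CREATE TABLE user_levels (student_id INTEGER, course TEXT, unit INTEGER,
                          lesson INTEGER, created_at TEXT);
""")
con.executemany("INSERT INTO sections VALUES (?,?,?,?)",
                [(1, "csp", 10, 100), (1, "csp", 10, 101)])
con.executemany("INSERT INTO user_levels VALUES (?,?,?,?,?)",
                [(100, "csp", 3, 2, "2021-01-05"),
                 (101, "csp", 3, 4, "2021-01-06")])

# Join per-level activity back to the teacher who owns the section:
rows = con.execute("""
    SELECT s.teacher_id, u.student_id, u.unit, u.lesson
    FROM sections s JOIN user_levels u ON s.student_id = u.student_id
""").fetchall()
```

The same join pattern extends outward: teacher_id links to workshop records, and school IDs link users to NCES demographic data.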

8 of 25

2(ish) models of partnership

1. Data sharing -- we share data we already collect with researchers.

    • “Easy”: we have data -- want some?
    • “Hard”: you want all the student code?
      • Lack of capacity
      • Privacy concerns
    • High demand for K-5 puzzle data

9 of 25

2(ish) models of partnership

2. Implementation -- we implement a researcher’s intervention on our platform and give them data back. (Collaborative)

    • “Easy”: e.g., add a few survey questions.
    • “Hard”: implement a 2x2 RCT that requires building multiple new levels and/or an entire alternate version of a unit with a new tool palette.

10 of 25

Why you want to partner with us -- Infrastructure and Scale

  • Linking out
    • E.g., passing user_id to an external research app
  • Research/cohort group “tracking”
    • E.g., data for all students under PD’d teachers with Regional Partner X
  • Randomized experimental design (A/B testing)
    • Student A sees one quiz question, Student B sees the other, with randomization at the “teacher level”
  • Arbitrary experiment data
    • E.g., the number of times a student hits the block/text toggle

11 of 25

Partners and Results

12 of 25

Data Sharing -- Chris Piech, Stanford University

We are able to provide autonomous feedback for the first students working on an introductory programming assignment with accuracy that substantially outperforms data-hungry algorithms and approaches human level fidelity. Rubric sampling requires minimal teacher effort, can associate feedback with specific parts of a student’s solution and can articulate a student’s misconceptions in the language of the instructor.

13 of 25

14 of 25

15 of 25

Swipe Right for CS: Measuring Teacher Bias about Recruitment into Computer Science

Joshua Littenberg-Tobias, Kevin Robinson, Gabrielle Ballard

Teaching Systems Lab, MIT

April 2018

16 of 25

What is Swipe Right for CS?

  • Users are presented with a student profile and a series of arguments intended to persuade that student to take computer science
  • Users “swipe right” if they think the argument would be effective in convincing the student and “swipe left” if they do not find it convincing

17 of 25

You can see these differences in how likely users were to swipe right for specific groups of students compared to white students.
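The comparison behind that finding can be sketched as computing each group's swipe-right rate relative to the white-student baseline. The records below are made-up placeholders to show the calculation, not the MIT study's data:

```python
from collections import defaultdict

# Placeholder swipe records: (student_group, swiped_right). NOT real data.
swipes = [("white", True), ("white", True), ("white", False),
          ("black", True), ("black", False), ("black", False),
          ("latina", True), ("latina", True), ("latina", False)]

totals = defaultdict(lambda: [0, 0])  # group -> [right_swipes, total]
for group, right in swipes:
    totals[group][1] += 1
    if right:
        totals[group][0] += 1

rates = {g: r / n for g, (r, n) in totals.items()}
baseline = rates["white"]
# Negative values mean users swiped right less often than for white students.
diff_vs_white = {g: rates[g] - baseline for g in rates if g != "white"}
```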

18 of 25

19 of 25

Implementation -- David Weintrop, Univ. of Maryland

Do students answer these differently?

Does the programming environment matter?

20 of 25

Implementation -- David Weintrop, Univ. of Maryland

Our analysis shows students performing better on questions presented in the block-based form compared to text-based questions. Further analysis shows that this difference is consistent across conceptual categories.

21 of 25

22 of 25

Subgoal Labels Study -- What are subgoals?

Research Partner: Briana Morrison, U. of Nebraska Omaha

Subgoals:

  • Program comments that describe steps in a problem-solving process
  • Students are asked to break the problem down and compose a solution from “subgoals” before writing code
  • Helps structure thinking about writing code; gets the novice “over the hump” by bridging the gap between expert and novice (i.e., the gap between a teacher’s explanation and a student’s understanding)
  • Research has shown that subgoal labels (a) improve student performance on programming tasks and (b) do so especially for women and URMs
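For a concrete sense of what subgoal labels look like, here is a small hypothetical exercise solution (not from the study's materials) where the comments are the subgoal labels a student would write before filling in any code:

```python
# Hypothetical exercise: compute the average of the even numbers in a list.
# The comments below are the subgoal labels, written before the code.

def average_of_evens(numbers):
    # Subgoal 1: set up a place to collect the even numbers
    evens = []
    # Subgoal 2: examine each number and keep only the evens
    for n in numbers:
        if n % 2 == 0:
            evens.append(n)
    # Subgoal 3: handle the empty case, then compute the average
    if not evens:
        return 0
    return sum(evens) / len(evens)
```

The labels decompose the task into named steps first, so the novice writes code to satisfy each step rather than facing the whole problem at once.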

23 of 25

24 of 25

Subgoal Labels Study -- How it worked

Participants opt in (teachers are randomly assigned to intervention or control)

  • Participating teachers assign a research version of Unit 3 to students through the course pull-down, and teach as normal.
  • The URL will look something like: studio.code.org/s/csp3-research-XQEQYV

We will share deidentified data with the researcher for students of teachers in the study. There will be no way for the researcher to know the identity, or any PII, of any teachers or students.
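Deidentification of this kind is commonly done by replacing real IDs with salted one-way hashes before export, so records stay linkable within the study but cannot be traced back to a person. A sketch of that general technique, not Code.org's actual pipeline:

```python
import hashlib
import secrets

# One private salt per data export, so hashed IDs cannot be reversed
# or linked across different studies. (Illustrative, not the real pipeline.)
EXPORT_SALT = secrets.token_hex(16)

def deidentify(user_id: int, salt: str = EXPORT_SALT) -> str:
    """Map a real user_id to a stable, non-reversible research ID."""
    return hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()[:12]

record = {"user_id": 12345, "unit": 3, "lesson": 7}
exported = {"research_id": deidentify(record["user_id"]),
            "unit": record["unit"], "lesson": record["lesson"]}
```

The same student always maps to the same research_id within an export, which is what lets the researcher follow a student through a unit without ever seeing PII.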

25 of 25