1 of 41

Welcome to the LAK Hackathon 2019

2 of 41

Program of the LAK Hackathon

Day 1

08:30–09:00   Registration
09:00–09:15   Arrivals
09:15–09:30   Introduction – Welcome to LAK Hackathon
09:30–10:30   Pitches for hackathon challenges (10 minutes max.)
10:30–11:00   Coffee break
11:00–12:30   Form teams; work on challenges
12:30–13:30   Lunch
13:30–15:00   Work on challenges
15:00–15:30   Coffee break
15:30–16:30   Work on challenges
16:30–17:00   Wrap up Day 1 – summary of progress and next steps
Evening       Optional meet-up for drinks

Day 2

09:00–09:30   Recap Day 1 – goals for Day 2
09:30–10:30   Work on challenges
10:30–11:00   Coffee break
11:00–12:30   Work on challenges
12:30–13:30   Lunch
13:30–15:00   Work on challenges
15:00–15:30   Break
15:30–16:00   Wrap up challenges
16:00–17:00   Present progress back from each challenge; collect evidence and any themes we wish to carry forward

3 of 41

Previous Hackathons

  1. LAK’15 – Poughkeepsie
    • Apereo Open Dashboard
  2. LAK’16 – Edinburgh
    • Jisc xAPI recipes
    • Tested learning record stores
    • LA standards landscape
  3. LAK’17 – Vancouver
    • Actionable analytics
    • Embedding LA in pedagogic practice
    • Jisc’s student app
  4. LAK’18 – Sydney
    • Multimodal sensor data
    • Custom dashboards
    • GDPR and privacy in LA
    • E-portfolios
    • Algorithmic transparency
    • Data literacy


5 of 41

Supporting infrastructure

GitHub

WordPress

Slack

6 of 41

Challenges 2019

  1. Multimodal Learning Analytics
  2. Data Interoperability
  3. Goal setting and analytics
  4. Curriculum analytics
  5. LA for assessment in games
  6. LA Open Knowledge Infrastructure
  7. LA Packages Challenge

7 of 41

  1. MMLA Feedback Runtime Engine

by Daniele Di Mitri & Jan Schneider

[Diagram: sensors and actuators connect to the Learning Hub, which delivers feedback]

8 of 41

  1. MMLA Design Exercises

Improving the LearningHub with a real-time feedback system.

Questions:

1 - Feedback rules: how do we design good feedback rules? (A toy rule is sketched below.)

2 - Pushing feedback: should we inform only the teachers, or everybody? How do we address specific types of feedback to specific learners?

GitHub https://github.com/janschneiderou/LearningHub/
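
To make question 1 concrete, here is a toy sketch of a feedback rule in R. The sensor name, threshold, and message are invented for illustration; they are not part of the LearningHub API.

# A toy feedback rule: fires when a (hypothetical) speech-rate sensor
# reading exceeds a threshold.
rule_speaking_too_fast <- list(
  applies = function(reading) reading$sensor == "speech_rate" && reading$value > 160,
  message = "Try slowing down: aim for under 160 words per minute."
)

# Apply every rule to one incoming reading; return the messages that fire.
apply_rules <- function(reading, rules) {
  hits <- Filter(function(r) r$applies(reading), rules)
  vapply(hits, function(r) r$message, character(1))
}

apply_rules(list(sensor = "speech_rate", value = 175), list(rule_speaking_too_fast))

Question 2 then becomes a routing decision over the messages such rules emit: which of them go to the learner, and which to the teacher.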

9 of 41

2. Data Interoperability (for a lifetime of learning)

by Kirsty Kitto

  • What’s the point? Hasn’t this horse bolted already?
  • We already have two data standards being touted for LA (xAPI and Caliper; a minimal statement is sketched below)
  • One will surely win? (Well no…)
  • And do either of them actually solve the core problem?
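
For reference, this is roughly what one of those standards looks like on the wire: a minimal xAPI statement (actor, verb, object) assembled in R with jsonlite. The learner and course identifiers are placeholders.

library(jsonlite)

statement <- list(
  actor = list(mbox = "mailto:learner@example.edu", name = "A Learner"),
  verb = list(
    id = "http://adlnet.gov/expapi/verbs/completed",
    display = list(`en-US` = "completed")
  ),
  object = list(
    id = "https://lms.example.edu/course/geometry-101",  # placeholder activity IRI
    objectType = "Activity"
  )
)

toJSON(statement, auto_unbox = TRUE, pretty = TRUE)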


11 of 41

The LA-API

by Kirsty Kitto

(infrastructure emerging at UTS)


12 of 41

The code base is still in development but has been released

Four main steps to the ETL pipeline (step 3 is sketched after the list)…

  1. https://github.com/uts-cic/canvas-extract & https://github.com/uts-cic/canvas-quizzes-extract
  2. https://github.com/uts-cic/lrs-mongo
  3. https://github.com/uts-cic/la-graphql-live
  4. https://github.com/uts-cic/la-api-dashboards

And some other related repos
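
As a hedged sketch of step 3, querying the GraphQL layer from R might look like the following; the endpoint URL and the field names are assumptions, not the actual la-graphql-live schema.

library(httr)
library(jsonlite)

# Hypothetical query: the fields are illustrative, not the real schema.
query <- '{ statements(courseId: "EXAMPLE-101") { actor verb timestamp } }'

resp <- POST(
  "https://la-api.example.edu/graphql",  # hypothetical endpoint
  body = list(query = query),
  encode = "json"
)

result <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))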

13 of 41

What do we need?

  • A whole stack of new xAPI profiles (well, the xAPI community needs them anyway)
  • New integration pipelines
  • A Caliper integration into the GraphQL layer
  • Security models for the graph… or elsewhere?
  • New widgets for the dashboards…
  • Analytics

  • New ideas - what do you think we should be doing?

14 of 41

3. Goal setting and analytics

by Gábor Kismihók and Stefan Mol

15 of 41

Role of Learning Analytics in Individual, Goal-Driven Person–Job Matching

Dr. Gábor Kismihók

Head of Learning and Skills Analytics

@kismihok

Gabor.Kismihok@tib.eu


16 of 41

General Objective

Establishing a personalised curriculum development method, based on goal setting and labour market information, to improve student proactivity


17 of 41

Concept


18 of 41

Scientific Objectives

Contribution to the understanding of proactive learner behavior through goal setting

  • Effects of goal setting in education
  • Show effects on learners’ self-efficacy and motivation

Contribution to the literature on self-directed (self-regulated) learning

  • Developing novel ways to design learner-driven curricula
  • Methods to map individual pathways onto traditional educational pathways (using, e.g., performance and motivation)
  • Capturing attitudinal changes in learners

Contribution to methods that introduce external (non-educational) data sources into curriculum design and learning evaluation

  • Labour market data on skills and jobs
  • Skills taxonomy improvement


19 of 41

Practical Objectives

Providing individual advice to learners about their progress

Visualisation of learning progress

Dynamic reconfiguration of learning content delivery

Establishing means to use labour market data in education


20 of 41

How?

Developing the Dynamic, Individual Curriculum Recommender Dashboard

Series of Hackathons:

LSAC2018, LAK19, LSAC2019

PhD project, starting May 2019


21 of 41

Concept

22 of 41

Concept

23 of 41

Challenges

  • Generating a minimum viable product

  • ESCO / Vacancy matching for dynamic skill (re)configuration (see the matching sketch below)

  • Learning Unit to skills matching

Available Labour Market data:

  • ESCO
  • Vacancy announcements
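
A minimal sketch of the matching idea, using plain string sets and a Jaccard overlap; real ESCO matching would work over the taxonomy's skill URIs and relations rather than literal labels.

# Overlap between a learner's skills and a vacancy's required skills.
jaccard <- function(a, b) length(intersect(a, b)) / length(union(a, b))

learner_skills <- c("data analysis", "r programming", "communication")
vacancy_skills <- c("data analysis", "machine learning", "communication")

jaccard(learner_skills, vacancy_skills)  # overall fit score
setdiff(vacancy_skills, learner_skills)  # skill gap to feed into the curriculum recommender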


24 of 41

4. Curriculum analytics

by Niall Sclater & Michael Webb

25 of 41

4. Curriculum analytics: Objects

26 of 41

4. Curriculum analytics: Objects

A curriculum object describes an aspect of the curriculum, together with the data and the analytics that can be used to enhance it

27 of 41

4. Curriculum analytics: User Stories

28 of 41

4. Curriculum analytics: Multiple uses

29 of 41

4. Curriculum analytics: Aspirations for the hackathon

  1. Brainstorm curriculum objects
  2. Map user stories to curriculum objects
  3. Design a basic markup language for curriculum objects (one possible shape is sketched after this list)
  4. Create some fake data
  5. Build a basic analytics tool
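
To seed aspiration 3, here is one possible shape for a curriculum object, sketched as an R list purely to start the brainstorm; every field name is invented, not an agreed schema.

curriculum_object <- list(
  id          = "module-week-3",                # invented identifier
  aspect      = "formative assessment",
  description = "Weekly quiz covering lecture 3",
  data        = c("quiz scores", "attempt counts"),            # data the object needs
  analytics   = c("item difficulty", "score distribution by cohort")  # analytics it enables
)

The same list could equally be serialised to JSON or YAML once the fields are agreed, which is exactly the markup-language question in aspiration 3.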

30 of 41

5. LA for assessment in games

by José A. Ruipérez Valiente

  • Digital games are widely used and are being adopted as part of education
  • Alternative ways to unobtrusively measure students’ learning
  • Without interrupting the learning and game experience (stealth assessment)
  • Especially interesting for skills that are hard to measure but related to lifelong learning

31 of 41

5. LA for assessment in games

  • We use Shadowspect:
    • Developed by the MIT Playful Journey Lab in collaboration with the MIT Education Arcade
    • Topic area of geometric measurement and dimension, including the relationship between 2D and 3D shapes
  • Playable game here
  • We will use data from 3 case studies, with around 40 players at TSL Dine and Play and with high school students
  • We are going to start to collect data in high schools
  • Data includes clickstream data and configuration files

32 of 41

5. LA for assessment in games

  • Challenges and GitHub repo!
  • Game analytics: Level difficulty, learning pathways, tutorial analytics, participation funnel…
  • Cognitive skills and behavioral assessment: Persistence, wheel spinning, gaming the system, strategies, off-task behaviors, creativity… (a persistence sketch follows this list)
  • Content assessment: Initially out of the challenge; some examples include understanding ratios, rotations and reflections, and 2D/3D relationships…
  • Challenge outcomes:
    • Possibility of a joint publication based on what we implement in the LAK Hackathon
    • Possibility to extend collaboration afterwards, especially once we gather more data
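
As one concrete starting point for the cognitive-skills strand, a sketch of a crude persistence proxy (attempts per puzzle) computed from clickstream events with dplyr. The column names and event label are assumptions, not the Shadowspect export format.

library(dplyr)

# events: one row per logged action, with player, puzzle and event columns (assumed).
persistence <- events %>%
  filter(event == "puzzle_attempt") %>%
  count(player, puzzle, name = "attempts") %>%  # attempts per player per puzzle
  group_by(player) %>%
  summarise(mean_attempts = mean(attempts))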

33 of 41

6. LA in Open Knowledge Infrastructures

by Atezaz Ahmad

Are there specific analytics/indicators that are relevant to OER?

34 of 41

Open Learning Analytics (1)

  • How can we make LA OPEN?

35 of 41

Open Learning Analytics (2)

Suppose a course with analytics embedded within the OER.

[Figure: a course with analytics]

36 of 41

Open Learning Analytics (3)

37 of 41

Open Learning Analytics (4)

38 of 41

Open Learning Analytics (5)

39 of 41

Open Learning Analytics (6)

Questions

  • What types of data can we share publicly, and under which conditions? (A pseudonymisation sketch follows these questions.)
  • In which format should the data be shared, and how?
  • Should we share only the data, only the results, or both?
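
One small technical piece of an answer to the first question, sketched in R: pseudonymise learner identifiers before any public release (necessary but by no means sufficient for safe sharing). This assumes an events data frame with a user_id column; the salt stays with the data owner.

library(digest)

# Replace raw IDs with salted SHA-256 pseudonyms before sharing.
pseudonymise <- function(ids, salt) {
  vapply(paste0(salt, ids), digest, character(1), algo = "sha256")
}

events$user_id <- pseudonymise(events$user_id, salt = "keep-this-secret")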

40 of 41

7. LA Packages Challenge

by Alan Berg

Let's make it easy by creating a workflow using R packages.

  1. What is a Learning Analytics workflow in the context of programmatic data science? (A workflow sketch appears below.)
  2. Which functions are necessary to decrease the number of lines of code to achieve the full workflow?
  3. Which freely available LA related data sources already exist that libraries can pull into the developer's environment via helper functions?
  4. What is the relationship between an LA-targeted library and commonly used libraries in the tidyverse and caret?

We want to draw data scientists into Learning Analytics.
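
A sketch of the kind of workflow the helper functions should shrink. fetch_oulad() and add_engagement_features() are hypothetical helpers such a package might provide (OULAD being one freely available LA dataset); the rest is ordinary dplyr and caret.

library(dplyr)
library(caret)

features <- fetch_oulad() %>%        # hypothetical: pull an open LA dataset
  filter(!is.na(final_result)) %>%
  add_engagement_features()          # hypothetical feature-engineering helper

# Standard caret call: predict final_result from the engineered features.
model <- train(final_result ~ ., data = features, method = "rpart")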

41 of 41

7. Exercises

https://github.com/AlanBerg/Package-Hackathon-LAK19

This creates an initial package ready for further refinement.

SRC/Start.R is example code for the start of package writing. The aim is to motivate thinking about generating a set of R packages to support data scientists interested in kicking off their Learning Analytics efforts.

1. Discuss the packages you think are necessary

2. Generate package(s) using SRC/start.R

3. Discuss the functions necessary in the package

4. Add dummy functions and update the documentation using roxygen2 and the document() command (a documented dummy example is sketched at the end)

5. Add tests for the dummy functions

6. Write your first functioning methods.
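
As a hedged example of steps 4–6, a dummy function with roxygen2 documentation; the function name and behaviour are invented. Run devtools::document() to generate the .Rd file, then add a testthat test before replacing the body with a real implementation.

#' Count events per learner
#'
#' @param events A data frame with one row per logged event.
#' @param id_col Name of the learner identifier column.
#' @return A data frame of learners and their event counts.
#' @export
count_events <- function(events, id_col = "user_id") {
  # Tabulate events by learner and return a tidy data frame.
  as.data.frame(table(events[[id_col]]), responseName = "n_events")
}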