ALASI_prg_01.xls
>>Descriptions of all sessions appear below the program<<
27 November (SUNDAY)
Rooms: Foyer, Jeffrey Smart Building; Room A: JS3-13A (63); Room B: JS5-13 (36); Room C: JS4-11 (63); Room D: JS5-12 (42)

8:00  Registration and coffee
8:30  Welcome
8:45  Keynote: Ryan Baker, 'Massive Replication of Scientific Findings in Massive Online Open Courses'
9:45  Flash talks: super-short intros to the workshops, panels and presentations
10:00  Morning tea
10:30  Parallel sessions:
  WORKSHOP 1A: Mixed Reality in Higher Education: Pedagogy Before Technology - James Birt (Bond) and Michael Cowling (CQU)
  PANEL 2B: What exactly do we mean by 'learning' in learning analytics? - Jason Lodge (UMelb), Sakinah Alhadad (Griffith), Simon Knight (UTS), Melinda Lewis (USyd), Tim Rogers (UniSA)
  WORKSHOP 1E: Driving student engagement: SRES, two versions one purpose - Marion Blumenstein (University of Auckland), Jenny McDonald (University of Auckland), Adon Moskal (Otago Polytechnic), Danny Liu (USyd) and Steve Leichtweis (University of Auckland)
  PANEL 2A: Running a Learning Analytics Centre: Stories & Conversations - Simon Buckingham Shum (UTS), Cassandra Colvin (UniSA), Linda Corrin (UMelb), David Gibson (Curtin), Marcel Lavrencic (UQ), Abelardo Pardo (USyd), Lorenzo Vigentini (UNSW)
11:30  Parallel sessions:
  PRESENTATION 1I: Putting the pieces together: multimodal learning analytics for learning design - Kate Thompson (Griffith), Sarah Howard (UoW), Abelardo Pardo (USyd)
  PRESENTATION 3A: "We're not here to be babied": University students' attitudes towards learning analytics - Lynne Roberts (Curtin), Joel Howell (Curtin), Kristen Seaman (Curtin), David Gibson (Curtin)
12:30  Lunch and posters
1:30  Parallel sessions:
  WORKSHOP 1H: e-Exams analytics: How authentic can we get? - Mathew Hillier (Monash), Andrew Fluck (UTas)
  PRESENTATION 1D: How can we turn around negative perceptions of Learning Analytics? - Hazel Jones (USQ)
  WORKSHOP 1J: Literacies in Learning Analytics: What methodologies can/should learners know for actioning LA data? – an experimental tour - Ronald Monson (ECU)
  PANEL 2C: Experimentation to enterprise: Is there a place for open source learning analytics platforms at our institutions? - Danny Liu (USyd), Cassandra Colvin (UniSA), Linda Corrin (UMelb), Jenny McDonald (University of Auckland), James Hamilton (Navitas), Abelardo Pardo (USyd), Steve Leichtweis (University of Auckland) and Russ Little (Hobsons)
2:30  Parallel sessions:
  PRESENTATION 1B: Collect, Analyze, Act, Reflect: A framework for learning analytics professional development - Cathy Gunn, Claire Donald, Jenny McDonald (University of Auckland)
  PRESENTATION 1C: Visualising Learning & Teaching data: comparing institutional practice in an open discussion about workflows, issues, and practical uses - Vigentini, L., Clayphan, A., Chitsaz, M., Zhang, L. (UNSW); Liu, D. (USyd); Martinez-Maldonado, R. (UTS)
3:30  Afternoon tea
4:00  Keynote: Abelardo Pardo
5:00  Close
5:30  ASCILITE opening night drinks
Session details
Keynote: Ryan Baker - 'Massive Replication of Scientific Findings in Massive Online Open Courses'
Massive Open Online Courses (MOOCs) have been viewed as a potentially powerful source for research on learning and student success, but research on MOOCs has largely been fragmentary: most research is conducted on single MOOCs, and limited work has gone towards formally studying which findings replicate across courses. In this keynote I present an architecture to facilitate replication of research on learning and behavior within MOOCs. This architecture can ingest data from an edX MOOC and test whether a range of findings apply, in their original form or slightly modified using an automated search process. We identify 21 findings from previously published studies on completion in MOOCs, render them into production rules within our architecture, and test them in the case of a single MOOC, using a post-hoc method to control for multiple comparisons. We find that nine of these previously published results replicate successfully in the current data set and that contradictory results are found in two cases. We are currently working to extend this replication to over 60 MOOCs. This work represents a step towards automated replication of correlational research findings at large scale.
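For readers curious what "testing whether published findings replicate, with post-hoc control for multiple comparisons" can look like in practice, here is a minimal, hypothetical sketch. It is not the keynote's architecture: the findings, data and the choice of the Benjamini-Hochberg procedure (one common post-hoc control) are assumptions made purely for illustration.

```python
# Hypothetical sketch: re-test previously published correlational findings on
# new course data, controlling for multiple comparisons with Benjamini-Hochberg.
from scipy import stats

def benjamini_hochberg(p_values, alpha=0.05):
    """Return booleans marking which hypotheses are rejected at FDR level alpha."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])  # ranks, smallest p first
    max_rank = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= (rank / m) * alpha:
            max_rank = rank                              # largest rank passing the BH bound
    rejected = [False] * m
    for rank, i in enumerate(order, start=1):
        rejected[i] = rank <= max_rank                   # reject everything up to that rank
    return rejected

# Invented per-student data: each "finding" pairs a behaviour measure with a
# completion indicator (1 = completed the course).
findings = {
    "week1_activity_vs_completion": ([3, 0, 5, 1, 4, 2, 6, 0], [1, 0, 1, 0, 1, 0, 1, 0]),
    "forum_posts_vs_completion":    ([2, 7, 0, 1, 6, 1, 5, 0], [0, 1, 0, 0, 1, 0, 1, 0]),
}

p_values = [stats.pearsonr(x, y)[1] for x, y in findings.values()]
for name, ok in zip(findings, benjamini_hochberg(p_values)):
    print(f"{name}: {'replicates' if ok else 'no evidence of replication'}")
```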
WORKSHOP 1A: Mixed Reality in Higher Education: Pedagogy Before Technology
This workshop builds upon the past five years of experience and research of the Mixed Reality Research Lab (MRRL). Demonstrating several of the applications produced by the MRRL as case studies, the workshop will provide participants with knowledge of how mixed reality is used, how learning analytics can be recorded through mixed reality simulations, and how pedagogy and technology can be woven together to implement a mixed reality and mobile device solution into a classroom for a specific discipline.
PRESENTATION 1B: Collect, Analyze, Act, Reflect: A framework for learning analytics professional development
In this presentation we will introduce a Learning Design-Learning Analytics Framework designed to encourage individuals or teams of teachers to use data to inform and evaluate learning designs. We will examine three case studies of tertiary teachers' use of learning analytics data for this purpose in three different disciplines. In these cases, teachers used learning analytics data to check a) their assumptions about students' prior knowledge and b) students' responses to the structure, resources and teaching strategies used in their online courses. Our Learning Design-Learning Analytics Framework aligns different kinds of learning analytics data with the rhythms of teaching, i.e. the preparation stage before a course or lesson starts, during teaching, and the review period after the course or lesson ends.
PRESENTATION 1C: Visualising Learning & Teaching data: comparing institutional practice in an open discussion about workflows, issues, and practical uses
Using case studies from three universities (USyd, UNSW, UTS), this presentation will cover the following key messages and talking points from the perspective of researchers, academic developers, teachers and students:
● How can various methods of accessing, analysing, and interpreting data be used to provide actionable insights?
● What can we learn from the rich data streams generated by learning platforms?
● Where is flexibility needed in a system that can process data from different systems, data types and formats?
● How can accountability and transparency be managed?
● How can these systems be used to connect with students to enhance educational experiences?
PRESENTATION 1D: How can we turn around negative perceptions of Learning Analytics?
A recent newspaper article, "University students you are being watched" (http://www.theage.com.au/national/university-students-you-are-being-watched-20160811-gqqet7.html), drew many comments from the general public, suggesting that much work is required to promote the benefits of learning analytics and turn the tide of public opinion. Through role play and small-group discussion, this interactive presentation will explore themes developed from the negative comments and opinions, develop possible responses from universities and the field, and consider action plans that universities could implement to further promote positive engagement with learning analytics from staff, students and their local communities.
WORKSHOP 1E: Driving student engagement: SRES, two versions one purpose
In this two-hour workshop we introduce the Student Relationship Engagement System (SRES), which allows teachers to efficiently target students based on their individual performance, engagement with course materials, and/or participation in learning activities (e.g. quizzes, discussions) using highly personalised email or text messages. The SRES exists as two concurrent open source developments, version 1 and version 2. The aim of the SRES is to provide teachers with user-friendly tools that enable the adoption of learning analytics without requiring them to be expert data analysts, thus closing the loop by affording actionable intelligence from these data. This will be a hands-on session where participants can explore real-world data drawn from large undergraduate courses that have used the SRES. The aim is to showcase how the system affords timely and personalised engagement with students, and to promote discussion of salient points as part of a Community of Interest.
WORKSHOP 1H: e-Exams analytics: How authentic can we get?
This workshop will take participants on a journey to explore the possibilities, limits and issues related to the evolution of traditional pen-on-paper examinations into the digital era. We will discuss and explore how the drive to harness the power of analytics may be at odds with a desire to enable highly authentic forms of assessment, such as complex constructed responses using contemporary software 'tools of the trade' in the exam room. Is it possible, with the right mix of e-tools, that we could design an infrastructure to 'have our cake and eat it too'? We will look at the options for using ICT for exams to enable the delivery of assessments that are more authentic, through an evolution from paper-equivalent to post-paper computerised exams. We will also discuss future options for capturing a range of insightful data about student performance, including problem-solving techniques, behaviour in exams, resources utilised, marks gained and question quality, as well as possible ways to authenticate student identity through the use of biometric and analytic techniques.
PRESENTATION 1I: Putting the pieces together: multimodal learning analytics for learning design
The ability of learning analytics techniques to be actionable relies on close interaction between researchers and instructors in order to develop tools that meet the needs of learners and can be integrated into practice. In this presentation we will show how integrating the results of multiple learning analytic measures can be used to make decisions about teaching and learning. We will be using real data, previously analysed and published, that describes Masters students carrying out a collaborative design assessment task. The data includes video and audio recordings, as well as artefacts created during the group work. The students were given a design task to be completed in a five-week period. The group, consisting of four students, collaborated on this task in online and face-to-face environments, meeting online four times and face-to-face three times. The physical environment contained writeable walls, onto which computers could be projected, and furniture that could be moved to suit the needs of the learners in the room.
WORKSHOP 1J: Literacies in Learning Analytics: What methodologies can/should learners know for actioning LA data? – an experimental tour
Is there such a thing as a learning analytics literacy? What implications do recent, startling advances in neural networks have for artificial intelligence and learning? What does Learning Analytics (LA) have to say about the scientific method, and vice versa? Finally, what knowledge or intuition, if any, should learners acquire about the analytical methods increasingly used to direct their learning experiences?
Plainly these are deep questions beyond thorough examination in a two-hour workshop, but the idea behind this workshop is that they are nevertheless important and can be tackled in a more informed way with some insight into the scientific methodologies routinely applied throughout LA. That is the goal of this workshop: to have participants gain an understanding of the underlying principles and latest developments in four methodologies predominantly used in Learning Analytics - Predictive Analytics, Hypothesis Testing/Bayesian Inference, Linguistic Analysis and Visualizations.
Participants will select and apply each of these techniques to supplied datasets, dynamically varying parameters such as machine learning methods, training sets, p-values/significance levels, sentiment selections and visualization options. No coding experience is required, as users instead modify expressive templates in the cloud.
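As a flavour of the first of the four methodologies named above, here is a minimal, hypothetical sketch of a predictive-analytics step in which the training set is varied, the kind of parameter the workshop's cloud templates let participants adjust. The data, features and model choice are invented for illustration and are not workshop material.

```python
# Hypothetical sketch: vary the train/test split for a simple at-risk predictor
# and observe how held-out accuracy changes.
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Invented per-student features: [logins, quiz_score]; label 1 = passed.
X = [[5, 60], [1, 30], [8, 75], [2, 40], [9, 90], [0, 20], [7, 85], [3, 50]]
y = [1, 0, 1, 0, 1, 0, 1, 0]

for test_size in (0.25, 0.5):  # vary the training-set size, as in the workshop
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=test_size, random_state=0, stratify=y)
    model = LogisticRegression().fit(X_tr, y_tr)
    print(f"test_size={test_size}: held-out accuracy={model.score(X_te, y_te):.2f}")
```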
PANEL 2A: Running a Learning Analytics Centre: Stories & Conversations
An indicator of the maturing field of learning analytics is the creation of new organizational entities dedicated to using learning analytics services to improve the student experience through institutional research. Going beyond traditional Business Intelligence (BI), these groups operate firmly at the intersection of learning and analytics: they can speak the language of pedagogy and assessment with educators and invent and deploy novel analytics tools, while engaging IT and BI colleagues around mainstreaming services. The end-users targeted by these learning analytics centres are educators and learners. Attendees should leave with a greater awareness of the diversity of ways in which universities are creating centres of expertise, and fresh ideas on the options they have to advance learning analytics in their own contexts.
PANEL 2B: What exactly do we mean by 'learning' in learning analytics?
While analytics techniques continue to develop and evolve, fundamental questions about what is being analysed remain for the learning analytics community. As a multidisciplinary endeavour, the field holds conflicting views as to what learning is and how best to collect data in order to determine how it is occurring and how best to enhance it. Clarity around valid and reliable means of collecting, analysing and interpreting data about learning is thus vital if the discipline is to reach its undoubted potential. The aim of this panel is to bring researchers and practitioners from a range of disciplines together to delve further into this pressing issue. Each practitioner will present a particular conception of learning and the implications of this perspective for the practice of learning analytics as research, for pedagogy, and in implementation.
PANEL 2C: Experimentation to enterprise: Is there a place for open source learning analytics platforms at our institutions?
There has been a tendency at Australasian higher education institutions towards a top-down, business intelligence (BI) approach to learning analytics. This has grown from a genuine interest in gleaning insights from big data within existing leadership and data paradigms, and early LA implementations focussed on efficiency and monetary benefits such as improving retention. This approach has typically been characterised by one-size-fits-all vendor solutions, which are frequently reported not to meet the felt needs of learners and educators. Yet we recognise that LA, like learning, is situated, and demands more contextualised approaches. This is manifested in a growing number of bottom-up open source LA initiatives. Supporting rapid innovation cycles within these developments may help LA to be "sensitive to [educators'] work environments, meeting and extending their pedagogical requirements, and ensuring flexibility and rewards". Indeed, early examples of such developments highlight the disconnect between the top-down and bottom-up approaches to LA. This panel will therefore explore the role of these open source LA tools in higher education institutions, and where they lie on the spectrum from research-focussed experimentation to enterprise-level adoption. Discussion will be situated in the context of recent findings on LA needs and approaches in Australasia, including compatibility with institutional culture and systems, consideration of the roles and needs of learners and educators, and the affordances of LA tools to provide actionable intelligence. This panel is supported by the ASCILITE Learning Analytics SIG's new Open Analytics Project, and will give attendees (1) insiders' perspectives on open source LA development, implementation and impact at a range of institutions, and (2) opportunities to challenge and apply these learnings in their own contexts.
PRESENTATION 3A: "We're not here to be babied": University students' attitudes towards learning analytics
The technical development of learning analytics has outpaced consideration of the ethical issues surrounding its use. Of particular concern is the absence of the student voice in decision-making about learning analytics. We explored higher education students' knowledge, attitudes and concerns about big data and learning analytics through four focus groups with 41 higher education students. Thematic analysis of the focus group transcripts identified six key themes. The first theme, 'Uninformed and Uncertain', represents students' lack of knowledge about learning analytics prior to the focus groups. Following the provision of information, the viewing of videos and discussion of learning analytics scenarios, three further themes, 'Help or Hindrance to Learning', 'More than a Number' and 'Impeding Independence', represented students' perceptions of the likely impact of learning analytics on their learning. 'Driving Inequality' and 'Where Will it Stop?' represent ethical concerns raised by the students about the potential for inequity, bias and invasion of privacy, and the need for informed consent. A key tension to emerge was how 'personal' versus 'collective' purposes or principles can intersect with 'uniform' versus 'autonomous' activity. The findings highlight the need to engage students in decision-making about learning analytics.