CI 5371: Learning Analytics: Theory and Practice

DRAFT - Aug 30, 2018

Fall 2018  -  Online  -  3 Credits

Instructor Information

Bodong Chen, Assistant Professor

Course Description

Overview

Learning analytics is a nascent field broadly defined as the “measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”[1] This course provides a general survey of learning analytics, emphasizing its application in various educational contexts rather than its underlying algorithmic details. In particular, we will discuss the foundations of learning analytics, survey pertinent education theories, discuss emerging forms of assessment, explore popular data analytic techniques, review learning analytics tools and case studies, and design analytics for our own contexts. Given the breadth of the field, personalized support is provided for deeper dives into special interest areas. Overall, this course offers a comprehensive, theory-driven overview of learning analytics to orient students to the field and prepare them for advanced research and practice in learning analytics.

Audience

The course is designed for a broad audience. All graduate students interested in learning analytics and its application in specific educational areas (e.g., STEM learning, literacies, online learning, workplace learning, learning in informal settings) are welcome.

Prerequisites: None. Prior knowledge of learning theories, assessment, and data science is helpful but not required.

Objectives

By the end of the course, students will:

  1. Understand the learning analytics data cycle
  2. Identify and describe key epistemological, pedagogical, ethical, and technical factors informing the design and implementation of learning analytics
  3. Be familiar with the basics of collecting, cleaning, transforming, managing, and analyzing educational data
  4. Be familiar with some of the popular data analytic techniques, including predictive models, text analysis, relationship mining, and social network analysis (see the illustrative sketch after this list)
  5. Develop awareness and skills necessary for applying learning analytics
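
To make objectives 3 and 4 concrete, here is a minimal, hypothetical sketch of one such workflow in Python: loading an activity log, cleaning and transforming it, and fitting a simple predictive model of course success. The file name (course_logs.csv) and the columns (logins, posts, passed) are illustrative assumptions, not actual course data.

```python
# A minimal sketch, assuming a hypothetical CSV of per-student activity.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Collect: load a (hypothetical) export of per-student activity logs
logs = pd.read_csv("course_logs.csv")

# Clean: drop rows missing the fields we need
logs = logs.dropna(subset=["logins", "posts", "passed"])

# Transform: derive a simple engagement feature
logs["posts_per_login"] = logs["posts"] / logs["logins"].clip(lower=1)

# Analyze: predict course success (passed: 0/1) from engagement features
X = logs[["logins", "posts", "posts_per_login"]]
y = logs["passed"].astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```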

Course Design

Guiding Philosophy

This is a Knowledge Building course, meaning all participants (including the instructor) collectively produce ideas and knowledge of value to the community in order to solve authentic learning analytics problems. Our top-level goal in this course will be to work as a knowledge building team, exploring the capacity of learning analytics to support learning in various domains. This overarching goal will be interwoven throughout the course, and we will advance it through social annotation of readings, design activities, and group projects.

The course design also follows facets of Open Pedagogy. Instructional content (produced by the instructor) will be made openly available on the web as Open Educational Resources (OERs). All members of the community are encouraged to create and openly share artifacts -- e.g., web annotations, blog posts, tweets, essays, computer code -- at any stage of the course.

Course Timeline

This is an online class. The course website will be the hub, and most course readings will be listed on the site.

Each week:

  • The class will meet synchronously online on Mondays, 5-6:30pm, via Zoom. These synchronous meetings will be spent on the elaboration of key concepts, collaborative activities, software demos, Q&A, student presentations, etc. The Zoom room will be open 15 minutes before class for community members to settle in and socialize.
  • The class will interact asynchronously during the rest of the week, via Slack <https://la-mn.slack.com/> and Hypothes.is. These are where social reading, discussion, and knowledge building take place.

For the semester:

  1. The first seven weeks are designed to provide an introduction to the field of learning analytics, including its foundations, research themes, data analytic techniques, and case studies. These weeks feature both theoretical discussions highlighting typical assumptions underlying learning analytics applications and hands-on practice with data analytic techniques. During this process, students will form working groups (WGs; max. 3 members) around emergent design problems in specific contexts. Students will also self-organize into several special interest groups (SIGs), each of which will lead the community in exploring a theme of learning analytics in the second part of the course.
  2. The second part of the course features five(ish) themes, each led by a corresponding SIG. With support from the instructor, each SIG is expected to take the lead in designing classroom activities, presenting key ideas, and facilitating discussion. Each SIG will meet with the instructor one week in advance to finalize their session plan. In the meantime, each WG will keep advancing its group project.
  3. The class will use the final two weeks to further advance the WG projects and develop rise-above ideas in the community. Each WG will present their work to the class. We will collectively reflect on our designs and explore possible ways to further improve them.

Supporting Tools

Digital tools and practices are important for this online course. Supporting tools include but are not limited to the following:

  • Zoom: A video-conferencing tool for our synchronous online meetings
  • Hypothes.is: A web annotation tool for discussing course readings
  • Slack: A team communication tool for announcements and asynchronous discussions within the community
  • Social media: Use #LAUMN and #LearningAnalytics when posting on social media (e.g., Twitter). If you blog, send the instructor your RSS feed.

You will need a functional web camera and a microphone to participate in Zoom meetings. Please consult the Office of Information Technology (OIT) in advance if you need additional support. If you’re new to virtual meetings, familiarize yourself with basic virtual meeting etiquette.

Workload Expectation

This is a 3-credit course, with an expected weekly workload of 9 hours.

Assessment and Grades

Attendance & Deadlines

Attendance requirements and penalties for missing class: Attendance is required. Missed classes will lead to a lower grade (see the Grading section).

All graded work must be completed by the due dates listed in the Class Schedule section below.

If a specific deadline does not work for you or you need more time for an assignment, you may establish a new deadline by contacting the instructor prior to the original deadline, provided that the new deadline would not disrupt your peers’ work. If the instructor receives no prior communication and an assignment is submitted late, it will be penalized at a rate of 10% per day.

Parameters

  • Group- and Individual-Assessment: Students will be assessed both individually and as a group
  • Teacher- and Peer-Assessment: Students will be assessed both by the instructor and peers, based on their individual learning and contributions to the community

Grading

Components                              Points*

Class participation (online and F2F)    20
SIG presentation (group & peer)         25
WG project artifact (group)             25
WG presentation (group)                 15
Reflection essay or e-portfolio         15

* No extra credit is allowed in the course.

Class participation involves active and constructive participation in synchronous and asynchronous discussions. Evaluation will be based on both numeric metrics reflecting participation effort and qualitative assessment of one’s contributions to the discussion.

SIG presentation. Each SIG will design a session to engage the whole class in exploring a theme in learning analytics. When one group presents and leads, the other groups will participate in and assess the session following a given rubric. Students in the same group will receive the same peer-assessed score. Each SIG member will also be assessed by their fellow group members.

Each WG will tackle a real-world problem of their choice, and will be expected to produce a project artifact and present it to the whole class.

  • A WG project artifact could be a design document, a research plan, or a functioning prototype depending on the problem chosen by the WG. (Each WG should come up with a tentative project proposal to discuss with the instructor by the end of Week 11.)
  • WG presentations will be peer-assessed following a given rubric. Students in the same group receive the same score. Each member of a WG is also assessed by the other group members.

Reflection essay or e-portfolio. Students may choose between writing a reflective essay (not exceeding 2,000 words, excluding references) and preparing an e-portfolio reflecting on their journey through the course.

Class Schedule with Weekly Readings and Activities

Wk 1 (9/10): Introduction
  Activities: Self-intro on Slack; annotate on Hypothes.is

Wk 2 (9/17): Learning Analytics: A Brief Overview
  Activities: Explore WG project ideas

Wk 3 (9/24): Ethics, Algorithmic Accountability, and System Integrity
  Activities: SIG topics

Wk 4 (10/1): Theory and Learning Analytics
  Activities: WG project ideas share-out

Wk 5 (10/8): Hidden Assumptions: Epistemology, Pedagogy, and Assessment
  Activities: SIG and WG signup

Wk 6 (10/15): Educational Data Mining: An Overview

Wk 7 (10/22): Cases and Examples of Learning Analytics

Wk 8 (10/29): "Fun with Data" Hands-on

Wk 9 (11/5): Social Networks (theme 1)
  Activities: To be designed by SIG 1

Wk 10 (11/12): Predictive Models (theme 2)
  Readings: Buniyamin, Arsad, & Kassim (2013); Pardos et al. (2013); Thai-Nghe et al. (2011); Elbadrawy et al. (2016)
  Activities: To be designed by SIG 2

Wk 11 (11/19): Text and Discourse Analytics (theme 3)
  Activities: To be designed by SIG 3

Wk 12 (11/26): AI and Text Mining (theme 4)
  Readings: TBD
  Activities: To be designed by SIG 4

Wk 13 (12/3): Visual Learning Analytics (theme 5)
  Readings: TBD
  Activities: To be designed by SIG 5

Wk 14 (12/10): WG Presentations and Reflection
  Readings: None
  Activities: WGs present their group projects

Wk 15 (12/17): Assignments due

Bibliography

Education Growth Advisors. (2013). Learning to adapt: Understanding the adaptive learning supplier landscape. Tyton Partners.

Anderson, C. (2008, June). The end of theory: The data deluge makes the scientific method obsolete. Wired. Retrieved from http://www.wired.com/2008/06/pb-theory/

Arnold, K. E. (2010). Signals: Applying Academic Analytics. Educause Quarterly, 33(1), n1.

Baker, R., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future visions. Journal of Educational Data Mining, 1(1), 3–17.

Bienkowski, M., Feng, M., & Means, B. (2012). Enhancing Teaching and Learning Through Educational Data Mining and Learning Analytics: An issue brief. U.S. Department of Education Office of Educational Technology.

Buckingham Shum, S. (2012). UNESCO Policy Brief: Learning Analytics. UNESCO Institute for Information Technologies in Education. Retrieved from http://www.iite.unesco.org/publications/3214711/

Buckingham Shum, S., & Ferguson, R. (2012). Social learning analytics. Educational Technology and Society, 15(3), 3–26.

Bull, S., & Kay, J. (2010). Open Learner Models. In R. Nkambou, J. Bourdeau, & R. Mizoguchi (Eds.), Advances in Intelligent Tutoring Systems (pp. 301–322). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-14363-2_15

Buniyamin, N., Arsad, P. M., & Kassim, R. (2013). An overview of using academic analytics to predict and improve students’ achievement: A proposed proactive intelligent intervention. In 2013 IEEE 5th Conference on Engineering Education (ICEED) (pp. 126–130). IEEE.

Clow, D. (2014). Data wranglers: Human interpreters to help close the feedback loop. In Proceedings of the Fourth International Conference on Learning Analytics And Knowledge - LAK ’14 (pp. 49–53). https://doi.org/10.1145/2567574.2567603

Desmarais, M. C., & Baker, R. S. J. d. (2012). A review of recent advances in learner and skill modeling in intelligent learning environments. User Modeling and User-Adapted Interaction, 22(1-2), 9–38. https://doi.org/10.1007/s11257-011-9106-8

Elbadrawy, A., Polyzou, A., Ren, Z., Sweeney, M., Karypis, G., & Rangwala, H. (2016). Predicting Student Performance Using Personalized Analytics. Computer, 49(4), 61–69. https://doi.org/10.1109/MC.2016.119

Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C., & Knight, S. (2017). Reflective writing analytics for actionable feedback. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 153–162). ACM. https://doi.org/10.1145/3027385.3027436

Grunspan, D. Z., Wiggins, B. L., & Goodreau, S. M. (2014). Understanding Classrooms through Social Network Analysis: A Primer for Social Network Analysis in Education Research. Cell Biology Education, 13(2), 167–178. https://doi.org/10.1187/cbe.13-08-0162

Haythornthwaite, C. (1996). Social network analysis: An approach and technique for the study of information exchange. Library & Information Science Research, 18(4), 323–342. https://doi.org/10.1016/S0740-8188(96)90003-1

Kandel, S., Heer, J., Plaisant, C., Kennedy, J., van Ham, F., Riche, N. H., … Buono, P. (2011). Research directions in data wrangling: Visualizations and transformations for usable and credible data. Information Visualization, 10(4), 271–288.

Knight, S., Buckingham Shum, S., & Littleton, K. (2014). Epistemology, assessment, pedagogy: where learning meets analytics in the middle space. Journal of Learning Analytics, 1(2), 23–47.

Oshima, J., Oshima, R., & Matsuzawa, Y. (2012). Knowledge Building Discourse Explorer: A social network analysis application for knowledge building discourse. Educational Technology Research and Development, 60(5), 903–921. https://doi.org/10.1007/s11423-012-9265-2

Pardos, Z. A., Baker, R. S. J. D., San Pedro, M. O. C. Z., Gowda, S. M., & Gowda, S. M. (2013). Affective states and state tests: Investigating how affect throughout the school year predicts end of year learning outcomes. In Proceedings of the Third International Conference on Learning Analytics and Knowledge - LAK ’13 (p. 117). https://doi.org/10.1145/2460296.2460320

Rohrer, R. M., Ebert, D. S., & Sibert, J. L. (1998). The shape of Shakespeare: visualizing text using implicit surfaces. In Proceedings of IEEE Symposium on Information Visualization (pp. 121–129). IEEE Comput. Soc. https://doi.org/10.1109/INFVIS.1998.729568

Rosé, C. P., Wang, Y.-C., Cui, Y., Arguello, J., Stegmann, K., Weinberger, A., & Fischer, F. (2008). Analyzing collaborative learning processes automatically: Exploiting the advances of computational linguistics in computer-supported collaborative learning. International Journal of Computer-Supported Collaborative Learning, 3(3), 237–271. https://doi.org/10.1007/s11412-007-9034-0

Scardamalia, M., & Bereiter, C. (2003). Knowledge building. In J. W. Guthrie (Ed.), Encyclopedia of education (2nd ed., Vol. 17, pp. 1370–1373). New York, NY: Macmillan Reference.

Scardamalia, M., Bransford, J. D., Kozma, B., & Quellmalz, E. (2012). New assessments and environments for knowledge building. In Assessment and teaching of 21st century skills (pp. 231–300). Springer. https://doi.org/10.1007/978-94-007-2324-5_5

Scheuer, O., & McLaren, B. M. (2012). Educational data mining. In Encyclopedia of the Sciences of Learning (pp. 1075–1079). Springer.

Schwarz, C. V., Reiser, B. J., Davis, E. A., Kenyon, L., Achér, A., Fortus, D., … Krajcik, J. (2009). Developing a learning progression for scientific modeling: Making scientific modeling accessible and meaningful for learners. Journal of Research in Science Teaching, 46(6), 632–654. https://doi.org/10.1002/tea.20311

Shermis, M. D. (2014). State-of-the-art automated essay scoring: Competition, results, and future directions from a United States demonstration. Assessing Writing, 20, 53–76. https://doi.org/10.1016/j.asw.2013.04.001

Siemens, G. (2013). Learning analytics: The emergence of a discipline. The American Behavioral Scientist, 57(10), 1380–1400.

Slade, S. (2016). Applications of Student Data in Higher Education: Issues and Ethical Considerations. Ithaka S+R. https://doi.org/10.18665/sr.283891

Stahl, G. (2013). Learning across Levels. International Journal of Computer-Supported Collaborative Learning, 8(1), 1–12.

Thai-Nghe, N., Drumond, L., Horváth, T., Krohn-Grimberghe, A., Nanopoulos, A., & Schmidt-Thieme, L. (2011). Factorization techniques for predicting student performance. Educational Recommender Systems and Technologies: Practices and Challenges.

van de Sande, B. (2013). Properties of the Bayesian Knowledge Tracing Model. JEDM - Journal of Educational Data Mining, 5(2), 1–10. Retrieved from http://www.educationaldatamining.org/JEDM/index.php/JEDM/article/view/35

Wise, A. F., & Schaffer, D. W. (2015). Why theory matters more than ever in the age of big data. Journal of Learning Analytics, 2(2), 5–13. https://doi.org/10.18608/jla.2015.22.2

Wise, A. F., & Vytasek, J. (2017). Learning analytics implementation design. In C. Lang, G. Siemens, A. F. Wise, & D. Gašević (Eds.), Handbook of learning analytics (pp. 151–160). Society for Learning Analytics Research.

Wolff, A., & Zdrahal, Z. (2012). Improving retention by identifying and supporting“ at-risk” students. EDUCAUSE Review Online.


[1] See https://tekri.athabascau.ca/analytics/.