1 of 42

MIU - My Interactive University

The goal is to make interaction between university professors and students, and their university life in general, easier. This covers administrative tasks such as grading, course and learning management, as well as the in-class situation. In addition, the system should provide useful information about events and orientation on campus.

Target groups: university professors and their students

Human – Computer Interaction Project

Universidad Politécnica de Madrid

Summer Semester 2018

CONTEXT OF USE ANALYSIS

Users, Tasks, Environment

Methods: Interviews, Observations

Deliverables: User Profile, Value Proposition Canvas, User Journey Map, Task Organization Model


LOW FIDELITY PROTOTYPES

On paper

Two different designs: mobile and desktop

USER TESTS, ITERATION I: Two Designs and Both Target Groups

Pre-Test Questionnaire, Usability Test, Post-Test-Interview

Methods: AB Testing, Think Aloud, Performance Observation (Actions, Errors), System Usability Scale (SUS)

Deliverables: Usability Problems, Decision on preferred design

HIGH FIDELITY PROTOTYPE

Methods: Navigation Map, Wire Frames

Deliverables: Web Application (HTML, JavaScript, CSS)

USER TESTS, ITERATION II: Desktop Design, Student target group

Usability Test, Post-Test-Interview

Methods: Performance Observation (Clicks, Errors, Time), Eye Tracking, System Usability Scale (SUS), User Experience Questionnaire (UEQ)

Deliverables: Usability Problems, improvement suggestions for implementation

2 of 42

Context of Use

Analysis

3 of 42

MIU – My Interactive University

GOAL

Interaction

Motivation

Knowledge Transfer

Social

CONTEXT OF USE

In Class

On Campus

Office / study rooms

On the way

FUNCTIONS

Learning Management

Campus Life

Administrative Work

PROJECT TOPIC

4 of 42

User Journey Map

for students

5 of 42

User Journey Map

for professors

6 of 42

Value Proposition Canvas for students

7 of 42

Value Proposition Canvas for professors

8 of 42

Task Organization Model

for students

Students

Academic

Interaction

Participate and ask questions in class

Communicate about class-related content

Working with other students

Technical tasks

Submit assignments

Access class content

Checking grades

Studying

Taking Notes

Daily assignments

Prepare for exams

Administrative

Enrol for classes

Evaluate Professors

Requesting documents

Campus Life

Socialize with other students

Get bus schedule

Find rooms

Find food and menu

Find out news and events on campus

9 of 42

Task Organization Model

for professors

Professors

Preparation

preparing class material

knowing the students’ abilities

Prepare exams or quizzes

Interaction

Communicating with students

Holding a lecture

Discussing with colleagues

Administrative

Evaluation

Grading students

Evaluating assignments

Evaluate students’ participation in class

controlling students’ attendance

Administrating virtual classrooms

checking the forums

uploading new content

informing students

Report students’ performance and progress

10 of 42

Paper Prototypes

11 of 42

Ubiquity of smartphones

Daily frequent usage

“Second screen”-phenomenon

VIEWPOINT

Smartphone

Input: Touch & Physical Buttons

Output: visual & haptic feedback

Style: Direct Manipulation

INTERACTION STYLES / DEVICES

TERMINOLOGY

FIRST PROTOTYPE: MOBILE

Common

Chat, Activities, Assignments, Slides

Students

Academics, Campus Life, Paper Work

Professors

Quick Evaluate

12 of 42

Paper Prototype - Mobile

13 of 42

Big screen for important tasks

Better overview

Laptop is standard

VIEWPOINT

Laptop

Input: Keyboard, Mouse / Output: Display

Styles: Direct Manipulation / Menu selection / Form Filling

INTERACTION STYLES / DEVICES

TERMINOLOGY

SECOND PROTOTYPE: DESKTOP

Common

Forum, Assignments, Slides

Students

Ask the professor, Chat with colleagues

Professors

Drafts, Collaborate, Reward, Timer

14 of 42

Paper Prototype - Desktop

15 of 42

Iteration I

Low Fidelity Prototype

Usability Test

16 of 42

TASKS & PROTOTYPE DEMO

STUDENTS

  1. CLASS SETUP: Open the class & take notes on the slides.
  2. ASK: Ask the professor through the system.
  3. VOTE: Participate in the voting through the system.
  4. FOOD: Check the food menu of the cafeteria.

PROFESSORS

  1. CLASS SETUP: Open the class, check students’ attendance and start slides.
  2. ANSWER: Answer a question received through the system.
  3. REWARD: Give a bonus to a student through the system.
  4. CREATE: Create a voting in the system.

17 of 42

METHODOLOGY

Users

Characteristic | Students | Professors
Average Age | 23,5 | 39
Nationalities | 50% Spanish; Indian, Dutch, Bulgarian | 100% Spanish
Gender | 50/50 M-F | 100% M
Academic Experience | 4,3 years | 13,75 years
Satisfaction with current university systems | 6 | 4

[Charts: daily usage in hours per device (students: 7 / 7,91 h; professors: 9,5 / 5,12 h) and preferred device for university work.]

18 of 42

TEST DATES & LENGTH

26th of March until 6th of April 2018

1h per session

ROLES

Facilitator: Andra

Computer: Andrzej

Observer: Miriam

Technical Support: all

TEST ENVIRONMENT

UPM Campus Montegancedo / Researcher’s flat

Quiet

METHODOLOGY

19 of 42

TEST CONDUCTION

Collect demographics

AB Testing (change order between participants; see the sketch below)

Think Aloud

Task Instruction embedded in scenario to create empathy

FILLING SUS QUESTIONNAIRE + SIX GENERAL IMPRESSIONS QUESTIONS

Problems // Overall Experience // User Interface // Navigation // Functions // Improvements

PERFORMANCE MEASUREMENT BY OBSERVER
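The order change noted for the A/B test can be counterbalanced programmatically. A minimal sketch in plain JavaScript (the prototype's web stack), with hypothetical participant IDs, alternating which design is tested first so that learning effects cancel out:

```javascript
// Counterbalance prototype order across participants (hypothetical IDs):
// even-numbered participants start with the mobile prototype,
// odd-numbered ones with the desktop prototype.
function prototypeOrder(participantIndex) {
  return participantIndex % 2 === 0
    ? ["mobile", "desktop"]
    : ["desktop", "mobile"];
}

const participants = ["P1", "P2", "P3", "P4", "P5", "P6"];
participants.forEach((id, i) => {
  console.log(id, "tests:", prototypeOrder(i).join(" then "));
});
```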

METHODOLOGY

USABILITY CRITERIA | METRICS
Effectiveness | Task Completion, Number of errors
Efficiency | Number of actions
Satisfaction | System Usability Scale (SUS)
Observations | Notes
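For reference, a SUS score is computed from the ten 1-5 questionnaire responses: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2,5 to give a value between 0 and 100. A minimal JavaScript sketch with made-up responses (not study data):

```javascript
// Compute a System Usability Scale (SUS) score from ten 1-5 responses.
// Odd-numbered items contribute (response - 1), even-numbered items (5 - response);
// the sum is scaled by 2.5 to the 0-100 range.
function susScore(responses) {
  if (responses.length !== 10) throw new Error("SUS needs exactly 10 responses");
  const sum = responses.reduce((acc, r, i) =>
    acc + (i % 2 === 0 ? r - 1 : 5 - r), 0);
  return sum * 2.5;
}

// Example with made-up answers (not actual study data):
console.log(susScore([5, 2, 4, 1, 5, 2, 5, 1, 4, 2])); // 87.5
```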

20 of 42

FEEDBACK INTERVIEW

  1. Which of the prototypes do you prefer and why?

  • What did you like most in each prototype?

  • If you had one free wish, which function would you like such a system to have?

DATA ANALYSIS

Quantitative

Qualitative

FINDINGS & CONCLUSION

List of usability problems

Preferred prototype

METHODOLOGY

21 of 42

Main usability problems ranked

FINDINGS

STUDENTS

Mobile

Ranking | Issue | Details
1 | Navigation | Too deep, back/home button, transition between sections
2 | Design | Small screen packed with info
3 | Efficiency | High amount of needed actions
4 | Accessibility | Hard to access certain functions
5 | Practicability | Taking notes is cumbersome
6 | Terminology | Paper work vs. academics / chat vs. message
7 | User Control | Guidance through system, slides control
8 | Previous Knowledge | Effect on usage by academic experience, familiarity, age
9 | Error recovery | No possibility to reset from errors
10 | Anonymity | Private or public chat

Desktop

Ranking | Issue | Details
1 | Terminology | Chat with colleagues vs. ask the professor, paper work vs. academics
2 | Functionalities | Multitude of unnecessary options
3 | Design | Cluttered, overwhelming
4 | User Control | Guidance through system, slides control
5 | Previous Knowledge | Effect on usage by academic experience, familiarity, age

22 of 42

Main usability problems ranked

FINDINGS

PROFESSORS

Mobile

Ranking | Issue | Details
1 | Navigation | Too deep, confusion by options (top, bottom bar, side drawer)
2 | Functionalities | Order, filter, search option missing (e.g. for students)
3 | User Control | Slides control, notifications
4 | Learnability | New, innovative functions (e.g. rewarding)
5 | Terminology | Quick evaluate vs. activities
6 | Practicability | Voting creation cumbersome: split into more screens
7 | Match with real world | Identify students: picture

Desktop

Ranking | Issue | Details
1 | Integration in context of use | Natural barrier towards the students due to screen size
2 | Terminology | Voting vs. quiz
3 | Design | Crowded, overwhelming: side bar and slides
4 | Learnability | New, innovative functions (e.g. rewarding)
5 | Navigation | Location of functions: spread widely across the screen
6 | Functionalities | Order, filter, search option missing (e.g. for students)
7 | User Control | Slides control, notifications
8 | Emotional influence | Negative order of rewarding options
9 | Match with real world | Identify students: picture

23 of 42

RESULTS

STUDENTS

Metric | Mobile | Desktop
Effectiveness | Average mistakes < 1; Task 2 (Ask): success rate 83%, all others 100%; no mistakes in Task 3 (Voting) | Very few mistakes (only in the first two tasks); Task 1: 2 mistakes by two students; success rate 100% in all tasks; no mistakes in Task 3 (Voting) and Task 4 (Food)
Efficiency | Performed actions slightly above expected; Task 2: range from 4 to 9 actions; P5 needed most actions | Performed actions slightly above expected; high STDEV in Task 3: range from 2 to 5; influence of prototype order in Task 4: P1, P3, P5 needed 1 more action
SUS | 82,08 (STDEV: 8,28), ranging from 70 to 92,5 | 88,33 (STDEV: 6,65), ranging from 80 to 95
Main Problems | Transition between different sections | Confusing terminology (chat), cluttered design
Overall Experience | Easy to use, user friendly | Clear what to do, all options available
User Interface Design | Simple, visual, confusing wording | Straightforward, visual with descriptions, one screen
Navigation | Missing home button, very deep, too many different options (top, bottom, side) | Clear, useful, single-page app vs. overwhelming functions
Improvements | Page navigation, direct ask button | Wording, minimal layout, slide navigation
Observations | Navigation (back, hub), slides control, confusion in chat | Confident with guidance, familiar with slides (from PowerPoint, YouTube), comfortable with chat (sidebar like Facebook)

SUS item responses:
Frequency: one user would not use it
Complexity: one user finds it complex
Easy to use: strong agreement
Technical support: not needed at all
Learnability: 3 agree, 3 strongly agree
Cumbersomeness: one neutral; one agrees
Confidence: more confident (4 strongly agree)
Previous knowledge needed: one agrees (low academic experience, young)

24 of 42

RESULTS

PROFESSORS

Metric | Mobile | Desktop
Effectiveness | Only one mistake occurred (in Task 3); success rate 100% in all tasks | Only two errors (in Task 3 and Task 4), all in the first tested design; success rate 100% in all tasks
Efficiency | Tasks 1 and 2 according to expectation (STDEV 0); more complex Tasks 3 and 4: one more step needed than expected, P1 needs most clicks | Performed actions slightly above expected; high STDEV in Task 1: range from 2 to 6; average in Task 3 (3,75 actions) almost double the expected value (2); P4 is above average in all tasks
SUS | 85,63 (STDEV: 3,75), ranging from 82,5 to 90 | 88,13 (STDEV: 5,54), ranging from 80 to 92,5
Main Problems | Usage during class, info overload (too many options), voting section hard to find | Confusing wording, doubts on some functions, laptop as a communication barrier
Overall Experience | Good, easy to perform, intuitive (one: trial & error) | Good, intuitive, not surprising
User Interface Design | Simple, standard, known elements, partly crowded | Traditional, standard vs. modern, clear
Navigation | Too many options (hamburger, bottom); voting not intuitive, but other tasks fast & easy | Well designed, helpful, familiar, not deep
Improvements | Reward in chat, voting split into screens with guidance, additional material, wall | Voting details, filter and order for students’ list, more detailed student profile, group all functions
Observations | Doubts on possible distraction, slides control; chat: doubts if recognized; rewarding in different ways | Attendance: order of students, ON/OFF, slides control; chat: disturbed by notifications, doubts on possible distraction; rewarding: shortcut in chat

SUS item responses:
Frequency: two strongly agree; two are neutral
Complexity: two strongly disagree, two disagree; three disagree
Easy to use: three agree, one strongly agrees
Technical support: four strongly disagree
Learnability: slight preference (three strongly agree)
Cumbersomeness: four strongly disagree
Confidence: three agree, one strongly agrees
Previous knowledge needed: one agrees (no test experience); four strongly disagree

25 of 42

PREFERRED DESIGN

[Comparison tables for students and professors: the mobile and desktop designs were rated against each other on effectiveness, efficiency, user satisfaction, usability problems, user preferences and a final overall result.]

  • Better integration and flexibility in the lecture
  • Only technical support during class (main task: teaching)
  • For more complex tasks: desktop
  • Bigger screen & better overview, all options available
  • Depends on the task: mobile for simple, fun tasks like chatting
  • Doubts about using mobile for academic purposes

Which design to go on with for the 2nd iteration?

26 of 42

Functions

  • Recording lectures
  • Automatically graded assignments
  • People: paid tutoring offers
  • Floor plan and orientation
  • Calendar function
  • Fewer functions
  • Automatic feedback on assignments
  • “History” of class happenings
  • Interactive sharing/discussion wall
  • More rewarding options (“too late”, “on phone”)
  • Easy grading during students’ presentations

Design

  • Use meaningful, clear icons
  • Improve wording
  • Minimalistic design

POSSIBLE SOLUTIONS AND IMPROVEMENTS

Navigation

  • Home Button (mobile)
  • Tree vs. Hub
  • Page Navigation
  • Slide control
  • Placement of functions

27 of 42

TASKS

Complexity and frequency

FUNCTIONALITIES

Scope, purpose

To be especially considered

CONTEXT OF USE

Classroom, campus, office

DEVICE CHARACTERISTICS

Size, weight, integration, battery, connection

CONCLUSION

28 of 42

High Fidelity Prototype

HTML Website

29 of 42

Navigation Map
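The navigation map (diagram not reproduced in this text version) defines how the screens of the HTML prototype are linked. One lightweight way to wire such a map up in plain JavaScript is a hash-based router; the sketch below is illustrative only, with hypothetical section names rather than the actual prototype code:

```javascript
// Minimal hash-based router for a single-page prototype.
// Section names are hypothetical placeholders for the screens in the navigation map.
// Assumes the page contains <div id="app"></div>.
const routes = {
  "#/dashboard": "Dashboard",
  "#/class": "Class view (slides, notes, voting)",
  "#/chat": "Chat",
  "#/campus": "Campus life (menu, rooms, events)",
};

function render() {
  const view = routes[location.hash] || routes["#/dashboard"];
  document.getElementById("app").textContent = view;
}

window.addEventListener("hashchange", render);
window.addEventListener("load", render);
```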

30 of 42

Wire Frames

31 of 42

32 of 42

Iteration II

High Fidelity Prototype

Usability Test

33 of 42

Motivate the users to interact in class

Social aspect in university environment

“More than just Moodle”

GOALS

Class Management with material, collaboration & individual performance

Campus Life

(Paper work)

FUNCTIONS

SYSTEM

HIGH-FIDELITY PROTOTYPE

CHARACTERISTICS

Web-based

Device size: laptop

Real-time notifications based on locations / schedule

Interactive features such as chat, quizzes

MIU – My Interactive University

34 of 42

USER TASKS

STUDENTS

  1. CLASS SETUP: Open the class & take notes on the slides.
  2. ASK: Ask the professor through the system.
  3. VOTE: Participate in the voting through the system.
  4. FOOD: Check the food menu of the cafeteria.

35 of 42

USERS

13 university students, between 20 and 29 years old, with an average of 4,65 years of academic experience.

[Chart: daily usage in hours per device (9 h / 6,7 h).]

36 of 42

TEST DATES & LENGTH

20th to 23rd of May 2018

30 minutes per session

ROLES

Facilitator: Andra

Observer – Impressions : Andrzej

Observer – Quantitative Measures : Miriam

TEST ENVIRONMENT

IMDEA Software Building / Researcher’s flat

TEST SESSIONS

37 of 42

TEST CONDUCTION

PROCESS

USABILITY CRITERIA | QUANTITATIVE METRICS
Effectiveness | Success of Task Completion, Number of errors
Efficiency | Number of actions, Time for Task Completion
Satisfaction | System Usability Scale (SUS), User Experience Questionnaire (UEQ)

USABILITY CRITERIA | QUALITATIVE METRICS
Observations | Notes
General Impressions | Post-Test Interview
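The quantitative metrics above (number of actions, errors and time per task) can be captured directly in the web prototype rather than only by hand. A minimal JavaScript sketch, with a hypothetical task name, showing one possible logger (not the observation tool actually used in the study):

```javascript
// Minimal per-task performance logger: clicks, errors and completion time.
// The task name is hypothetical; errors are flagged manually by the observer.
function createTaskLog(taskName) {
  const log = { task: taskName, clicks: 0, errors: 0, start: Date.now(), ms: null };
  const onClick = () => { log.clicks += 1; };
  document.addEventListener("click", onClick);
  return {
    error() { log.errors += 1; },
    finish() {
      log.ms = Date.now() - log.start;
      document.removeEventListener("click", onClick);
      return log;
    },
  };
}

// Usage during a session:
const task = createTaskLog("Task 3: participate in the voting");
// ... participant works on the task, observer calls task.error() on each mistake ...
console.log(task.finish()); // { task, clicks, errors, ms }
```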

DATA ANALYSIS

FINDINGS

List of usability problems

Suggested Improvements

Conclusion

38 of 42

MAIN QUANTITATIVE RESULTS

Metric | Results
Effectiveness | Average mistakes < 1,5; Tasks 3 & 4: success rate 100%; Task 1: 92,31%; Task 2: 23,08%
Efficiency | Performed actions slightly above expected; Task 3 (Voting) needed the most additional actions; time to complete tasks was roughly twice the expected value
SUS | 74,04 (STDEV: 15,46), ranging from 40 to 92,5
UEQ | Attractiveness and Efficiency were ranked the highest, with values of 1,705 and 1,731 respectively; all other scales scored above 1,2, i.e. above average or even good
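For context, the UEQ values above are scale means on a -3 to +3 range: the 26 bipolar items are recoded to that range using the standard UEQ key and then averaged per scale (Attractiveness, Perspicuity, Efficiency, Dependability, Stimulation, Novelty). A minimal JavaScript sketch, assuming responses have already been recoded, with a toy item-to-scale mapping used purely for illustration (the official assignment from the UEQ handbook would be plugged in instead):

```javascript
// Average recoded UEQ item scores (-3..+3) per scale.
// itemsByScale maps each UEQ scale to the indices of its items; the official
// assignment from the UEQ handbook should be supplied here (toy values below).
function ueqScaleMeans(recoded, itemsByScale) {
  const means = {};
  for (const [scale, idx] of Object.entries(itemsByScale)) {
    const sum = idx.reduce((acc, i) => acc + recoded[i], 0);
    means[scale] = sum / idx.length;
  }
  return means;
}

// Toy example (not the official item key, not actual study data):
const recoded = [2, 1, 2, 3, 1, 2, 2, 1];
console.log(ueqScaleMeans(recoded, { Attractiveness: [0, 1, 2, 3], Efficiency: [4, 5, 6, 7] }));
// { Attractiveness: 2, Efficiency: 1.5 }
```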

39 of 42

MAIN QUALITATIVE RESULTS

Metric | Results
Overall Experience | Good, easy to navigate and well organized
User Interface Design | Simple, friendly, clean, professional and with good icons
Navigation | Easy, fast and with a quite flat hierarchy; 4 of 13 participants need more time to feel confident with the system
Observations | 2 of 13 participants used the pop-up as a shortcut to access the class or the slides; 3 of 13 participants found the chat symbol, many were confused and posted on the dashboard status wall instead; using analogies to popular services (similar layouts) decreases the mental load and lets users feel more confident with the system

40 of 42

LEARNABILITY

Hard for first-time users to familiarize themselves with the system.

USABILITY PROBLEMS

USER INTERFACE

Chat function and pop-ups difficult to notice

Side menu and top bar separated

No innovative design

USER CONTROL

No usage of pop-up notifications / Self explorer

No customization options

FUNCTIONALITIES

Additional features for slides, collaboration, events.

NAVIGATION

No shortcut links on home page

Navigation hierarchy built around a central navigation hub, no direct access

41 of 42

SUGGESTED IMPROVEMENTS

LEARNABILITY
  • Offer a quick guided tutorial for first-time usage to explore the system, with the option to customize it

USER INTERFACE
  • Include the top bar functions into the side bar OR make them bigger and add text labels
  • Place the pop-up centrally (with the option to deactivate it)
  • More innovative design (still considering the mental model of the users)

USER CONTROL
  • Offer customization options (e.g. drag & drop, colors)
  • Setting whether to receive pop-ups or not

FUNCTIONALITIES
  • Additional features for slides, collaboration and events; a 3rd iteration to evaluate usefulness (vs. overload)

NAVIGATION
  • Provide direct access to all important elements from every screen (e.g. change “view”)
  • Eliminate the main navigation hub
  • Hide paper work (less important)
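Two of the suggestions above (a centrally placed pop-up and a setting to switch pop-ups off) are straightforward to prototype on top of the existing HTML/JavaScript code base. A minimal sketch, not the actual implementation, with a hypothetical localStorage key:

```javascript
// Centered pop-up notification with a persisted opt-out setting.
// "miuPopupsEnabled" is a hypothetical localStorage key.
function popupsEnabled() {
  return localStorage.getItem("miuPopupsEnabled") !== "false";
}

function setPopupsEnabled(enabled) {
  localStorage.setItem("miuPopupsEnabled", String(enabled));
}

function showPopup(message) {
  if (!popupsEnabled()) return;
  const box = document.createElement("div");
  box.textContent = message;
  // Centered overlay so the pop-up is hard to miss (one of the UI findings).
  box.style.cssText =
    "position:fixed;top:50%;left:50%;transform:translate(-50%,-50%);" +
    "padding:1em 2em;background:#fff;border:1px solid #888;z-index:1000;";
  box.addEventListener("click", () => box.remove());
  document.body.appendChild(box);
}

// Example: remind the student that a class has started (illustrative message).
showPopup("Your class has started. Click to dismiss this reminder.");
```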

42 of 42

COMPLEX CHALLENGE

To satisfy multiple user profiles.

DIGITAL POSSIBILITIES

New, innovative way of teaching.

Conclusion on our project

USEFUL SYSTEM

Participants liked the added value.

THE BIGGER PICTURE

Gained a range of skills, especially in iterative, user-centered design.