486-W19-Project

Cal Poly SLO                CSC 486-W19 Human-Computer Interaction                 Prof. Franz J. Kurfess

Project Overview

The term project is an important part of this class and provides an opportunity to practice some of the methods discussed in class and described in the literature. This quarter, the projects are aligned along the theme of interacting with mobile devices. Some of them will be done in collaboration with outside partners.

The emphasis of the project is on user interaction and user interface aspects, not so much on the underlying functionality. The project work may also focus on a usability evaluation of an existing product or system and the design of an alternative interaction paradigm or user interface for that system.

Project Topic

I’ve prepared a list of possible project topics on the Piazza page for the class (accessible only to students, instructors, and affiliated collaborators), including some involving outside partners.

Project Organization

The project is a team effort, with a team size of about five people. Ideally, your team should be the same as for the assignments. Typically, each team will use a TRAC Wiki as the repository for project materials. The Wiki contains templates for the different parts of your project, and notes on what you’re expected to include. Depending on confidentiality constraints with outside collaborators, teams may also use more secure arrangements for their repositories.

The following sections identify the main parts of the project, and typically correspond to entries in the class schedule. For the assessment of the team project, I usually use rubrics; you can view them in a Google Docs Rubrics spreadsheet. Not all rubrics are available initially, and they will be based on the short descriptions given below. I’ll do the actual scoring on PolyLearn with abbreviated versions of the rubrics that contain only categories and rating labels (such as “good” or “excellent”). In the past, I’ve included the details in the PolyLearn rubrics directly, but that makes them rather unwieldy to use.

Project Overview, Background, Difficulty and Relevance

Features, Requirements, and Evaluation Criteria

I'm using this to evaluate the part of your project description where the team identifies what it will be working on, and how that work can be evaluated.

System Design and Architecture

This section focuses on the functional components of the system, their interfaces and interaction methods with other parts (possibly also users), and design patterns.

The description is usually at an abstract level and should not refer to specific implementation aspects (such as languages, technologies, or tools used); those will be addressed in the next section. The core of this section is often a block diagram that shows the main components and how they interact. Class hierarchies and UML diagrams also frequently fit in this section. If you use specific computational methods or algorithms, you can describe them here, e.g. through flowcharts or pseudocode. Data structures, database schemata (e.g. entity-relationship diagrams), ontologies, and APIs also typically fit in here. If necessary, also address changes in the system design that became necessary during the course of the quarter.

Implementation

Here you identify specific technologies, tools, languages, development methods or processes, and hardware aspects. Obviously there will be a close correspondence to the previous section. If necessary, you can merge this section into the previous one, but then you should structure the combined section so that design and implementation aspects can still be identified separately. You should also address obstacles and implementation issues here, especially if they led to changes in the design and architecture of your system. If you’re using an agile development model, you will probably go through multiple cycles of implementation -> validation and evaluation -> user feedback, either until the system is reasonably stable or the quarter is over.

Validation and Evaluation

This section should identify mappings between the features of the system, as described in the respective section, and the corresponding requirements. Use the evaluation criteria identified there to measure or judge important aspects of the system, such as performance, response times, task completion times, responses to test inputs, or comparisons against standards or similar systems.

User Feedback

If possible and appropriate, you should demonstrate your system to members of the intended user community, and collect feedback from them. This can include reactions to the user interface, comments on functionality of the system, or their impressions of the performance.

Conclusions

This section gives an overview of the system from the "hindsight" perspective. Typically it includes a discussion of features, performance aspects, interesting technical and implementation issues, lessons learned, and future work.

Project Grading Scheme

The project contributes 40% to the overall grade, distributed as indicated in the table below. Usually every team member receives the same score. If there is a clear discrepancy between the contributions and performance of the different team members, I may give individual scores. My score will take into account feedback from external customers (for projects that have one), but their scores will not go directly into my grade calculations. Up to 10 points come from an evaluation by your teammates, calculated as the average of all your teammates' scores for your contributions. Students who do not submit this mutual team member evaluation may receive a significantly lower score for this part. However, I may adjust this part, especially if all team members give each other the highest score but the overall quality of the project is not correspondingly high.

Aspect (Percent): Explanation

Mid-quarter Project Display (25): presentation overview; user interaction; usability; explanations by team members; demonstration

Final Project Display (25): same as above

Project Documentation (25): based on the criteria above under “Project Organization”

Mutual Team Member Evaluation (25): calculated from the feedback scores by your teammates, possibly subject to adjustments
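To make the arithmetic concrete, here is a minimal sketch of how the four equally weighted components and the teammate-average evaluation could combine into the project's 40% contribution to the course grade. The function names, the 0-100 scale, and the sample scores are illustrative assumptions, not the official calculation.

```python
# Illustrative sketch of the grading arithmetic described above.
# Assumes each component is scored on a 0-100 scale; names are hypothetical.

def project_score(mid_display, final_display, documentation, peer_scores):
    """Combine the four project components, each weighted 25%.

    The mutual team member evaluation is the average of the
    teammates' scores for your contributions.
    """
    mutual = sum(peer_scores) / len(peer_scores)
    return 0.25 * (mid_display + final_display + documentation + mutual)

def course_contribution(score):
    """The project counts for 40% of the overall course grade."""
    return 0.40 * score

# Example with made-up scores:
score = project_score(90, 85, 88, [92, 88, 90, 94])
print(round(score, 1))                       # -> 88.5 (project score out of 100)
print(round(course_contribution(score), 1))  # -> 35.4 (points toward the course grade)
```

Note that a peer-evaluation average of 91 here is just the mean of the four hypothetical teammate scores; the instructor may still adjust that component as described above.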