1 of 13

Gesture Keyboard

<Names retracted for anonymity>

2 of 13

Motivation

Assistive Technology (Keyboard)

Leap Motion (Finger Tracking)

3 of 13

Background Research / Prior Work

  • Idea: Nomon Keyboard (http://www.inference.org.uk/nomon/instr.html)
    • An assistive keyboard for users who have difficulty operating a standard keyboard. The program displays 26+ clock-style dials, one per option; the user selects a letter by pressing the space bar when the hand of that letter's dial reaches 12 o'clock.

  • Prior Work: Leap Motion
    • Past UROP work done with this hardware
    • Detected hand motion to drive the web application that was developed.

4 of 13

Reading Data from the Device

  • Tracks all 10 fingers, as well as the palm vector and hand radius.
  • Consists of two cameras and three infrared (IR) LEDs.
  • Interaction space: roughly 2 ft wide by 2 ft long by 2 ft deep.
  • We take the coordinates of the tip of the index finger (see the sketch below).
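
A minimal sketch of this sampling step, assuming the classic Leap Motion SDK v2 Python bindings (Leap.Controller, Finger.TYPE_INDEX, tip_position); the polling loop and sample count are illustrative choices, not the project's actual code.

import time
import Leap  # classic Leap Motion SDK v2 Python bindings (assumed)

def sample_index_tip(controller):
    """Return (x, y, z) of the first hand's index fingertip, in millimeters."""
    frame = controller.frame()
    if frame.hands.is_empty:
        return None
    hand = frame.hands[0]
    for finger in hand.fingers:
        # Attribute/constant names follow the classic SDK's Python bindings.
        if finger.type == Leap.Finger.TYPE_INDEX:
            tip = finger.tip_position  # Leap.Vector with x, y, z in mm
            return (tip.x, tip.y, tip.z)
    return None

controller = Leap.Controller()
trace = []  # one character stroke as a list of 3D fingertip samples
while len(trace) < 200:
    point = sample_index_tip(controller)
    if point is not None:
        trace.append(point)
    time.sleep(0.01)  # poll at roughly 100 Hz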

5 of 13

Training Data

EMNIST dataset for handwritten letter recognition

  • EMNIST (an extended version of the MNIST dataset)
  • https://www.kaggle.com/crawford/emnist
    • EMNIST letters training (30 MB) / testing (5 MB) files (loading sketch below)
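
A loading sketch for these files, assuming the CSV layout of the Kaggle distribution (no header row; the label in column 0 followed by 784 pixel values; letter labels 1 to 26). The file names are taken from that dataset and may need adjusting.

import numpy as np
import pandas as pd

train = pd.read_csv("emnist-letters-train.csv", header=None)
test = pd.read_csv("emnist-letters-test.csv", header=None)

x_train = train.iloc[:, 1:].to_numpy(dtype=np.uint8)  # (num_samples, 784) pixel rows
y_train = train.iloc[:, 0].to_numpy() - 1             # labels 1-26 -> 0-25
x_test = test.iloc[:, 1:].to_numpy(dtype=np.uint8)
y_test = test.iloc[:, 0].to_numpy() - 1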

6 of 13

Data Transformation

  • Input: 28 × 28 pixel grayscale images
  • Reshape and rescale the data
    • [0, 255] → [0, 1]
  • Split the data into training and testing sets (see the sketch below)
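
A sketch of this preprocessing step, assuming arrays shaped like those in the loading sketch above and a Keras-style one-hot target encoding.

import numpy as np
from tensorflow.keras.utils import to_categorical

num_classes = 26

def preprocess(x, y):
    x = x.reshape(-1, 28, 28, 1).astype("float32")  # 784 values -> 28 x 28 x 1 image
    x = x / 255.0                                   # rescale [0, 255] -> [0, 1]
    y = to_categorical(y, num_classes)              # one-hot labels for 26 classes
    return x, y

x_train, y_train = preprocess(x_train, y_train)     # arrays from the loading sketch
x_test, y_test = preprocess(x_test, y_test)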

7 of 13

Architecture

  • Initialize a Sequential model and add layers, each with its respective number of neurons (illustrative sketch below)
  • Output: a decision among the 26 letters
    • Output layer set to 26 (num_classes) neurons.
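
The slides do not list the exact layers, so the model below is only an illustrative Keras Sequential sketch consistent with the 28 × 28 input, the 26-way output, and the CNN layer pattern cited at the end of the deck; the layer sizes are assumptions.

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout

num_classes = 26

model = Sequential([
    Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    MaxPooling2D((2, 2)),
    Conv2D(64, (3, 3), activation="relu"),
    MaxPooling2D((2, 2)),
    Flatten(),
    Dense(128, activation="relu"),
    Dropout(0.5),
    Dense(num_classes, activation="softmax"),  # one neuron per letter
])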

8 of 13

Training

  • Both the training loss and the validation loss are small, with the validation loss below the training loss
  • Good fit overall
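
An illustrative compile/fit call continuing the sketches above; the optimizer, batch size, and epoch count are assumptions rather than values from the slides.

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

history = model.fit(x_train, y_train,
                    validation_data=(x_test, y_test),
                    batch_size=128,   # assumed
                    epochs=10)        # assumed

# The curves compared on this slide correspond to history.history["loss"] and
# history.history["val_loss"]; both should end up small, with the validation
# loss staying at or below the training loss for a good fit.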

9 of 13

Project Overview: Connecting to Database

[Image: character drawing traced with the Leap Motion]

Firebase Realtime Database
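
A sketch of writing a captured stroke to the Firebase Realtime Database through its REST API; the database URL, the /strokes node, and the permissive write rules this implies are hypothetical, not the project's actual configuration.

import requests

DB_URL = "https://example-gesture-keyboard.firebaseio.com"  # hypothetical URL

def push_stroke(points):
    """POST a list of (x, y, z) fingertip samples under a /strokes node."""
    payload = {"points": [{"x": x, "y": y, "z": z} for (x, y, z) in points]}
    resp = requests.post(DB_URL + "/strokes.json", json=payload)
    resp.raise_for_status()
    return resp.json()["name"]  # Firebase returns the generated child key as "name"

# push_stroke(trace)  # e.g. the stroke collected in the Leap Motion sketch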

10 of 13

Results

Test accuracy for the model on the EMNIST dataset: 93.1%
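
For reference, a test-accuracy number like this is what Keras reports from model.evaluate on the held-out split (continuing the sketches above); the 93.1% figure itself comes from the project's trained model.

test_loss, test_acc = model.evaluate(x_test, y_test, verbose=0)
print("Test accuracy: {:.1%}".format(test_acc))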

11 of 13

Future Work

  • Automatic detection of the beginning and end of each character
  • Improved accuracy
  • Better user interface

12 of 13

Reference

  • http://cs231n.github.io/convolutional-networks/#layerpat

13 of 13