1 of 8

G(esture) Bot

Control motors using only gestures

Matthew Romage & Bach Tran

2 of 8

The Problem

  • Wheelchairs can be difficult and clunky for people with disabilities to operate
  • We introduce a prototype bot that moves in response to hand gestures, aiming to lift this burden

3 of 8

Approach: System Diagram

  • Idea: classify a gesture (up, down, left, right) and drive the motors accordingly; data is transmitted between 2 Arduinos
  • Data Gathering: collect ~40 samples per gesture, rasterize them into images, and feed them to the model (see the rasterization sketch after this list)
  • Model:
    • CNN: classification over the 4 gesture classes
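
As a concrete illustration of the rasterization step, here is a minimal C++ sketch that marks each point of a captured stroke in a grayscale image buffer. This is a sketch under assumptions: the 32x32 resolution, function name, and variable names are ours for illustration, not the project's actual code, and it assumes the stroke has already been projected to 2D and normalized to [0, 1].

  #include <math.h>
  #include <stdint.h>

  const int kImageSize = 32;  // assumed raster resolution

  // Draw each stroke point into a grayscale image that the CNN consumes.
  void RasterizeStroke(const float* xs, const float* ys, int num_points,
                       uint8_t image[kImageSize][kImageSize]) {
    for (int i = 0; i < num_points; ++i) {
      // Scale each normalized point to a pixel coordinate.
      int col = (int)lroundf(xs[i] * (kImageSize - 1));
      int row = (int)lroundf(ys[i] * (kImageSize - 1));
      // Clamp so noisy samples cannot write outside the buffer.
      if (col < 0) col = 0;
      if (col > kImageSize - 1) col = kImageSize - 1;
      if (row < 0) row = 0;
      if (row > kImageSize - 1) row = kImageSize - 1;
      image[row][col] = 255;  // mark the pixel the gesture passed through
    }
  }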

4 of 8

Data Collection

  • 4 classes: [up, down, left, right]
  • ~40 images per class
  • Recorded via Magic Wand Gesture Recorder
  • Gesture-to-command mapping (sketch after this list):
    • Up = Forward
    • Down = Stop
    • Left = Turn Left
    • Right = Turn Right
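
The mapping above is simple enough to express as a lookup. This hypothetical C++ snippet shows one way to encode it; the enum and function names are ours, not taken from the project code.

  enum Gesture { kUp = 0, kDown = 1, kLeft = 2, kRight = 3 };
  enum Command { kForward, kStop, kTurnLeft, kTurnRight };

  // Map a predicted gesture class to a motor command.
  Command CommandForGesture(Gesture g) {
    switch (g) {
      case kUp:    return kForward;
      case kDown:  return kStop;
      case kLeft:  return kTurnLeft;
      case kRight: return kTurnRight;
    }
    return kStop;  // fail safe: stop on anything unexpected
  }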

5 of 8

Model

  • 3 convolution blocks
  • Global Average Pooling
  • Dense fully connected output layer (op registration sketch below)
  • Results:
    • 99.3% accuracy (quantized) on the validation set
    • Size:
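
On-device, an architecture like this maps to a small set of TensorFlow Lite Micro ops. The sketch below shows how those ops might be registered with a MicroMutableOpResolver; the exact op list is an assumption and depends on how the model was converted (e.g., Global Average Pooling usually lowers to the MEAN op).

  #include "tensorflow/lite/micro/micro_mutable_op_resolver.h"

  // Assumed op set for this architecture; the real list depends on
  // the converted model.
  tflite::MicroMutableOpResolver<5> resolver;

  void RegisterOps() {
    resolver.AddConv2D();          // used by the 3 convolution blocks
    resolver.AddMaxPool2D();       // assumes pooling inside each block
    resolver.AddMean();            // Global Average Pooling -> MEAN
    resolver.AddFullyConnected();  // the dense output layer
    resolver.AddSoftmax();         // 4-way gesture probabilities
  }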

6 of 8

Demo

7 of 8

Demo details - Let’s dive in

  • There are 2 Arduino boards: the peripheral (reads sensor data, runs predictions, and sends results to the central) and the central (receives labels and runs the motors)
  • The 2 boards are connected via Bluetooth Low Energy; the central listens for updates via BLENotify (see the sketch after this list)
  • Peripheral: the IMU detects when a gesture is in progress; the board rasterizes it and feeds it to the model to produce a label
  • Central: reads each label and drives the bot forward, stops it, or turns it left/right
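
The sketches below show the shape of that link using the ArduinoBLE library: the peripheral exposes a one-byte label characteristic with BLENotify, and the central subscribes and reads each update. The UUIDs, names, and label encoding are placeholders, not the project's actual values.

  // Peripheral sketch (gesture board) - placeholder UUIDs and names.
  #include <ArduinoBLE.h>

  BLEService gestureService("180C");
  BLEByteCharacteristic labelChar("2A56", BLERead | BLENotify);

  void setup() {
    BLE.begin();
    BLE.setLocalName("GBot");
    BLE.setAdvertisedService(gestureService);
    gestureService.addCharacteristic(labelChar);
    BLE.addService(gestureService);
    BLE.advertise();
  }

  void loop() {
    BLE.poll();
    // ... detect a gesture, rasterize it, run inference ...
    byte label = 0;  // placeholder: e.g. 0=up, 1=down, 2=left, 3=right
    labelChar.writeValue(label);  // subscribers get a BLENotify update
  }

  // Central sketch (motor board) - runs as a separate program.
  #include <ArduinoBLE.h>

  void setup() {
    BLE.begin();
    BLE.scanForUuid("180C");  // look for the gesture peripheral
  }

  void loop() {
    BLEDevice peripheral = BLE.available();
    if (!peripheral) return;
    BLE.stopScan();
    if (peripheral.connect() && peripheral.discoverAttributes()) {
      BLECharacteristic labelChar = peripheral.characteristic("2A56");
      labelChar.subscribe();  // ask for notifications
      while (peripheral.connected()) {
        if (labelChar.valueUpdated()) {  // a new label arrived
          byte label = 0;
          labelChar.readValue(label);
          // map label -> forward / stop / turn left / turn right
        }
      }
    }
    BLE.scanForUuid("180C");  // link dropped: scan again
  }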

8 of 8

What We Learned and What We Would Improve

  • Transmitting data via Bluetooth was a hard process (latency, delays, differing data formats, etc.)
  • Tuning the robot's movements was also quite time consuming

  • Improvements:
    • More complex movements (circle, curve up, curve down, etc.)
    • Keyword spotting (KWS) support
    • Speed adjustment based on gesture intensity