1 of 11

Blind Assist System

Ben Ingraham, Nick DiCara, Milton Crispin, Jason Aguilera

2 of 11

Agenda

  • Problem Statement
  • Problem Solution
  • Personnel and Budget
  • Hardware Design
  • Overall System Design
  • Software Abilities
  • Conclusions
  • Questions

3 of 11

Problem Statement

  • According to the World Health Organization, 285 million people are visually impaired: about 246 million have low vision and 39 million are blind. In today's fast-paced, high-traffic world, it can be difficult for the blind or visually impaired to navigate their surroundings.

  • Traditional forms of aid include walking canes for the visually impaired and service dogs. However, both have their limitations, and service dogs can be very expensive.

4 of 11

Problem Solution

  • Our system is designed to assist the visually impaired by increasing their awareness of their surroundings. We aim to improve the user's mobility through audio cues and vibrational warnings. It can serve as an affordable additional layer of aid alongside traditional assistance canes or service dogs.
  • Designed to be lightweight and wearable for everyday use, the device uses a camera and distance sensors to make users aware of a person or object that may be in front of them.

5 of 11

Personnel and Budget

  • Electrical Engineers
    • Nick DiCara
    • Ben Ingraham

  • Computer Engineers
    • Milton Crispin
    • Jason Aguilera

Part                                                   Cost
----                                                   ----
Raspberry Pi 3 Model B                                 Available in Lab
VL53L1X Time-of-Flight Distance Sensor                 $11.95 x 6 = $71.70
Vibrating Mini Motor Disc (15 pack)                    $12.99
Raspberry Pi Camera Module v1                          Available in Lab
USB Battery Pack for Raspberry Pi - 4000mAh - 5V @ 1A  $24.95
ON/OFF Switch                                          $2.95
Chest Strap Harness                                    $8.99
Auxiliary Headphones                                   N/A (Already Owned)
TOTAL                                                  $121.58

6 of 11

Hardware Design

7 of 11

System Design

8 of 11

Software Abilities

  • Software implementation is done in Python
  • The Raspberry Pi uses an OpenCV Haar-cascade face detector to track faces within the video stream.
    • face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')
    • faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5), where gray is the current grayscale frame
  • The device uses triangle similarity to estimate the distance from the camera to a detected object (face): given a known face width W, a known calibration distance D, and the perceived width P in pixels, then
    • F = (P x D) / W gives the camera's focal length (from one calibration image)
    • D' = (W x F) / P then gives the distance to a newly detected face
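The calibration and distance steps above can be sketched in a few lines of Python; the face width, calibration distance, and pixel width below are illustrative assumptions, not our measured values:

```python
KNOWN_WIDTH = 6.0        # W: assumed average face width, in inches
CALIB_DISTANCE = 24.0    # D: camera-to-face distance in the calibration image
CALIB_PIXEL_WIDTH = 150  # P: face width in pixels at that distance

# F = (P x D) / W -- focal length from a single calibration image
focal_length = (CALIB_PIXEL_WIDTH * CALIB_DISTANCE) / KNOWN_WIDTH

def distance_to_face(pixel_width):
    """D' = (W x F) / P -- distance to a face whose detected box is pixel_width wide."""
    return (KNOWN_WIDTH * focal_length) / pixel_width
```

In the full system, pixel_width comes from the w field of each (x, y, w, h) box returned by detectMultiScale.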

9 of 11

Software Abilities Continued

  • The Flite speech-synthesis program converts the measured distance into audio output (via headphones)
    • Flite (Festival-Lite) is the library used for speech synthesis
    • Example: os.system('flite -t "Person detected four feet ahead"')
  • Vibrating haptic-feedback discs are activated via pulse-width modulation, triggered by detections from the time-of-flight sensors.
    • Example: Left.ChangeDutyCycle(100 - (distance / 11))
      • We adjust our left PWM GPIO pin on a scale from 0 to 100
      • Based on the distance, we increase or decrease the intensity of the vibrating disc
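Both output paths above can be sketched together; haptic_duty_cycle and announce_distance are illustrative helper names, and the 0-100 clamp is an assumption about how out-of-range sensor readings would be handled:

```python
import os

def haptic_duty_cycle(distance):
    """Map a time-of-flight reading to a 0-100 PWM duty cycle.

    Mirrors Left.ChangeDutyCycle(100 - (distance / 11)): the closer the
    obstacle, the stronger the vibration, clamped to the valid PWM range.
    """
    return max(0.0, min(100.0, 100.0 - distance / 11.0))

def announce_distance(feet):
    """Speak a warning through the headphones using the flite command line."""
    os.system('flite -t "Person detected %d feet ahead"' % int(feet))
```

On the Pi itself, the computed duty cycle would be fed to an RPi.GPIO PWM channel via ChangeDutyCycle.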

10 of 11

Conclusions

  • Overall, our design was successful and proved our concept; however, a few improvements could be made in future designs.
  • Designing the system with the most affordable sensors led to inaccurate data and reliability issues, and broken components caused setbacks and additional labor hours. A major improvement would be to invest in higher-quality components (e.g., camera and sensors).
  • We are confident that our system will enhance the safety and awareness of its users

11 of 11

Questions?