1 of 42

Team Members:

Gitaek Lee

Thomas Xu

Calen Robinson

Jinkun Liu

Yukun Xia

Preliminary Design Review Presentation (3/22)

2 of 42

Project Description

3 of 42

Predictive Avoidance for Industrial Mobile Robots

Context:

  • OMRON deploys fleets of industrial robots
  • Inefficient obstacle avoidance causes productivity losses

Main goals:

  • Develop algorithm to detect and classify obstacles
  • Avoid obstacles more efficiently by predicting their future motion
  • Increase overall productivity for a fleet of robots over time

Sponsor: OMRON


4 of 42

Main Components

Best validated in real life:

  • Object Detection
  • Object Localization

Best validated in simulation:

  • Path Planning in Large Environments
  • Trajectory Prediction
  • Obstacle Avoidance in Multi-robot/vehicle Interaction

5 of 42

Main Systems

Real-Life System (RLS)

  • TurtleBot
  • Functionalities:
    • Obstacle Classification
    • Obstacle Localization

Virtual Robot System (VRS)

  • Gazebo
  • Functionalities:
    • Path Planning/Following
    • Trajectory Prediction
    • Obstacle Avoidance

6 of 42

Use Case

7 of 42

Use Case

The robots are deployed from their storage area and begin moving autonomously toward their commanded goal points to complete tasks. On its way to the next task endpoint, one of the robots detects a moving obstacle in its path. The robot estimates both the obstacle's position and its class: a forklift. Using this information, the robot predicts the forklift's future motion and determines an evasive maneuver. It executes the maneuver, avoids the obstacle, and continues on its path.


8 of 42

Requirements

9 of 42

Performance Requirements - Virtual System

(V1-V3 come from sponsor specs and environment; V4 is the overall performance metric; V5-V6 are for real-time operation)

The System will:

V1. Autonomously move with max speed of 1.8 m/s

V2. Autonomously move with max rotation speed of 60 deg/s

V3. Autonomously move with min cruise speed of 0.5 m/s

V4. Increase productivity[1] by > 5% using avoidance strategies based on object classification, compared to the nominal case[2]

V5. Operate in real-time with planning time within 100 ms

V6. Receive user-commanded waypoints within 1 s (desirable)

[1] Productivity: The number of finished delivery tasks per unit time (e.g. day)

[2] Nominal case: Avoidance without considering classification or prediction (i.e. considering obstacles as “static” at each instant in time)
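As a worked illustration of the V4 metric (all numbers below are hypothetical, not measured results):

```python
# Illustrative check of requirement V4: productivity is the number of finished
# delivery tasks per unit time, compared against the nominal (static-obstacle) case.
def productivity_gain(tasks_predictive, tasks_nominal, hours):
    """Relative productivity increase of predictive avoidance over the nominal case."""
    p_pred = tasks_predictive / hours   # tasks per hour, predictive avoidance
    p_nom = tasks_nominal / hours       # tasks per hour, nominal case
    return (p_pred - p_nom) / p_nom

# Hypothetical example: 106 vs. 100 tasks in the same 24 h window -> 6% gain, meets V4 (> 5%)
gain = productivity_gain(106, 100, 24.0)
```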

10 of 42

Performance Requirements - Real-life System


The System will:

R1. Classify obstacles[1] of interest with mAP[2] of at least 60%

R2. Detect positions of obstacles of interest within 0.1 m accuracy

R3. Detect obstacles of interest within a range of 2 m

R4. Output results of positioning and classification within 100 ms per frame

[1] Obstacles: Forklifts, pedestrians, and other OMRON robots (simulation only)

[2] mAP: Mean average precision
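For reference, a minimal sketch of how mAP (requirement R1) could be computed from per-class detection results; the matching of detections to ground-truth boxes is assumed to be done upstream, and the names here are illustrative:

```python
def average_precision(tp_flags, num_gt):
    """AP from detections sorted by descending confidence.
    tp_flags[i] is True if detection i matches a ground-truth box."""
    tp = fp = 0
    ap = 0.0
    for flag in tp_flags:
        if flag:
            tp += 1
            # precision at this recall step, times the recall increment 1/num_gt
            ap += (tp / (tp + fp)) / num_gt
        else:
            fp += 1
    return ap

def mean_average_precision(per_class_tp_flags, per_class_num_gt):
    """mAP = mean of the per-class average precisions."""
    aps = [average_precision(flags, n)
           for flags, n in zip(per_class_tp_flags, per_class_num_gt)]
    return sum(aps) / len(aps)
```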

11 of 42

Non-functional Requirements

  1. Mobility subsystem should be non-holonomic
     (Restriction based on sponsor's robots)

  2. Modular[1] avoidance subsystem
     (Allows comparison against the nominal case)

  3. Ability to operate in a cluttered, non-empty[2] environment

[1] Modular: Able to switch on-the-fly between avoidance algorithms during operation, i.e. run with either local A* alone (nominal case) or the predictive avoidance subsystem

[2] Non-empty: The validation environment is not only free space but contains static walls/corridors/objects, so that it more closely resembles an actual industrial environment

12 of 42

Architectures

13 of 42

Functional Architecture - Virtual System


14 of 42

Cyberphysical Architecture - Virtual System


15 of 42

Functional Architecture - Real-life System


16 of 42

Cyberphysical Architecture - Real-life System

[Diagram: block diagram of the real-life system. The Robot comprises a Data Acquisition Subsystem (camera, 2D LiDAR), a Computing Platform (YOLOv4, sensor fusion, LiDAR data processing, movement commands), a Mobility Subsystem (motor controller, wheel motors, wheels), a Wi-Fi module, and a Power Supply (battery/tether). The Environment contains obstacle mock-ups. A Ground Truth Subsystem (overhead camera, marker tracker) provides obstacle/robot positions in the world frame to a Visualization Subsystem (data visualizer, position error calculator, classification mAP calculator), which compares them against the robot's obstacle positions in the robot frame. A Command Module issues commands over Wi-Fi.]

17 of 42

Subsystem Description/Progress

18 of 42

Factory Environment


Full-scale factory environment (92 m × 64 m)

Occupancy grid map for the factory

Status: done, with tuning in progress

19 of 42

Object Models


Robot (OMRON LD-60)

- Differential drive

Forklift

- Reverse Ackermann steering

20 of 42

Robot Path Planning and Pure Pursuit


Path Planning:

  • Uses given map/floorplan to create costmap: Done
  • Plans static path (A*) within costmap from start to goal point: Done
  • Robot uses pure pursuit to follow pre-planned path: Tuning for edge cases

[Figures: top view of the environment; costmaps with two planned paths (path 1: start (6, -2) to goal (5, 8); path 2: start (6, 2) to goal (5, 8))]
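The pure pursuit follower above can be sketched with its core steering geometry; this is an illustration only (the lookahead distance, path format, and function name are assumptions, not the project's actual implementation):

```python
import math

def pure_pursuit_omega(pose, path, lookahead, v):
    """Angular-velocity command steering a differential-drive robot toward the
    first path point at least `lookahead` metres ahead of `pose` = (x, y, yaw)."""
    x, y, yaw = pose
    # pick the lookahead point: first waypoint beyond the lookahead distance
    target = path[-1]
    for px, py in path:
        if math.hypot(px - x, py - y) >= lookahead:
            target = (px, py)
            break
    # transform the target into the robot frame
    dx, dy = target[0] - x, target[1] - y
    lx = math.cos(-yaw) * dx - math.sin(-yaw) * dy
    ly = math.sin(-yaw) * dx + math.cos(-yaw) * dy
    # pure pursuit curvature: kappa = 2 * y_r / L^2
    d2 = lx * lx + ly * ly
    kappa = 2.0 * ly / d2 if d2 > 0 else 0.0
    return v * kappa  # omega = v * curvature
```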

21 of 42

Forklift Trajectory Recording and Replay


  • Forklift obstacles can be manually controlled
  • Trajectories can be saved as a text file
  • Saved trajectories can be replayed
  • Status: done

(Video shown at 2x speed)
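One possible shape for the record/replay pipeline, assuming a simple whitespace-separated `(t, x, y, yaw)` text format; the actual file format used by the project is not specified here:

```python
def save_trajectory(path, samples):
    """Write (t, x, y, yaw) rows, whitespace-separated, one per line."""
    with open(path, "w") as f:
        for t, x, y, yaw in samples:
            f.write(f"{t:.3f} {x:.3f} {y:.3f} {yaw:.3f}\n")

def load_trajectory(path):
    """Read the rows back as a list of (t, x, y, yaw) tuples."""
    with open(path) as f:
        return [tuple(float(v) for v in line.split()) for line in f if line.strip()]

def replay_pose(traj, t):
    """Linearly interpolate the recorded trajectory at time t."""
    for (t0, x0, y0, a0), (t1, x1, y1, a1) in zip(traj, traj[1:]):
        if t0 <= t <= t1:
            s = (t - t0) / (t1 - t0)
            return (x0 + s * (x1 - x0), y0 + s * (y1 - y0), a0 + s * (a1 - a0))
    return traj[-1][1:]  # past the end: hold the final pose
```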

22 of 42

Obstacle Filtering


  • Take the simulation ground-truth pose and add Gaussian noise
  • Use raycasting with a range limit to determine visibility

  • Current status: to be implemented in Fall
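The two filtering steps above could be sketched as follows; the `occupancy(x, y)` callback, ray step size, and noise sigma are assumptions for illustration:

```python
import math
import random

def filter_obstacles(robot_xy, obstacles, occupancy, max_range, sigma=0.05):
    """Return noisy positions of the obstacles visible from the robot.
    occupancy(x, y) -> True if that point blocks line of sight (assumed interface)."""
    rx, ry = robot_xy
    visible = []
    for ox, oy in obstacles:
        dist = math.hypot(ox - rx, oy - ry)
        if dist > max_range:
            continue  # beyond sensing range
        # raycast: sample points along the segment robot -> obstacle
        steps = max(int(dist / 0.1), 1)
        blocked = any(
            occupancy(rx + (ox - rx) * i / steps, ry + (oy - ry) * i / steps)
            for i in range(1, steps)
        )
        if blocked:
            continue
        # add zero-mean Gaussian noise to the ground-truth position
        visible.append((ox + random.gauss(0.0, sigma), oy + random.gauss(0.0, sigma)))
    return visible
```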

23 of 42

Local A*

23

  • Acts as nominal case
    • Robot path is replanned at each time step
    • Assumes static obstacles at each step
    • Can replace predictive avoidance as the avoidance node

  • Current status:
    • Planning done
    • Requires obstacle filtering and a local costmap
    • To be implemented in Fall

Credit: https://www.jstage.jst.go.jp/article/transinf/E96.D/2/E96.D_314/_article
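A generic A* search of the kind the nominal planner runs at each time step, sketched on a simple 4-connected occupancy grid (an illustration, not the project's actual planner code):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (list of lists; 1 = blocked cell).
    Returns the path as a list of (row, col) cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]  # (f, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:
            continue  # already expanded with a better cost
        came_from[cell] = parent
        if cell == goal:
            path = []  # reconstruct by walking parents back to the start
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), cell))
    return None
```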

24 of 42

Trajectory Prediction


Kruse, E.; Gutsche, R.; and Wahl, F.: Acquisition of statistical motion patterns in dynamic environments and their application to mobile robot motion planning. In: Intelligent Robots and Systems (IROS). Volume 2, pages 712–717, 1997.

25 of 42

Stochastic Trajectory “Dictionary”


[KGW97]
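The dictionary idea of [KGW97] can be approximated as a lookup table from short grid-cell histories to histograms of the next observed cell; the sketch below is a simplified illustration, not the paper's full statistical model:

```python
from collections import Counter

def build_dictionary(trajectories, order=2):
    """Map each length-`order` history of grid cells to a histogram of observed
    next cells -- a discrete motion-pattern 'dictionary' built from recordings."""
    table = {}
    for traj in trajectories:
        for i in range(len(traj) - order):
            key = tuple(traj[i:i + order])
            table.setdefault(key, Counter())[traj[i + order]] += 1
    return table

def predict_next(table, history, order=2):
    """Probability distribution over the obstacle's next cell given its recent history."""
    counts = table.get(tuple(history[-order:]))
    if not counts:
        return {}  # unseen pattern: no prediction
    total = sum(counts.values())
    return {cell: n / total for cell, n in counts.items()}
```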

26 of 42

Predictive Avoidance


Current Status:

  • Classes/functions designed
  • Implementation & unit testing in progress
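A minimal collision-prediction check of the kind predictive avoidance needs, assuming time-aligned robot and obstacle trajectories (the function name and safety radius are illustrative):

```python
import math

def first_collision(robot_plan, obstacle_pred, safety_radius=0.5):
    """Return the earliest time index at which the planned robot position comes
    within `safety_radius` of the predicted obstacle position, else None."""
    for k, ((rx, ry), (ox, oy)) in enumerate(zip(robot_plan, obstacle_pred)):
        if math.hypot(rx - ox, ry - oy) < safety_radius:
            return k
    return None
```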

27 of 42

Physical Robot Base

  • Modified TurtleBot Burger
  • Data Acquisition Subsystem: Point Grey camera & RPLIDAR A2
  • Computing Platform: NVIDIA Jetson Xavier AGX
  • Mobility Subsystem: DC motors & wheels (x2), OpenCR driver board

28 of 42

Current System Status - Physical Robot Base

  • Functioning:
    • Robot base: all sensors, computing platform, mobility subsystem

  • Work in progress:
    • Untethered operation with Wi-Fi and battery


29 of 42

Real-Life Obstacle Perception

  • YOLOv4 trained to recognize obstacles of interest
  • YOLOv4 outputs bounding boxes around obstacles
  • Match bounding boxes to point clouds in LiDAR data (sensor fusion)
  • Output a list of obstacles with class & position estimates
  • Current status: to be implemented in Fall

Credit: Stanford
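The box-to-points matching step could be sketched as below, assuming an idealized pinhole camera aligned with the LiDAR; `fx`, `cx`, the planar-scan geometry, and all names here are simplifying assumptions, not the actual fusion code:

```python
from statistics import median

def fuse(detections, scan_points, fx=600.0, cx=320.0):
    """Associate 2D-LiDAR points with YOLO bounding boxes via a pinhole projection
    (camera assumed aligned with the LiDAR; robot frame: x forward, y left).
    detections: [(class_name, u_min, u_max)] horizontal box extents in pixels.
    Returns [(class_name, (x, y))] obstacle positions in the robot frame."""
    fused = []
    for cls, u_min, u_max in detections:
        hits = []
        for x, y in scan_points:
            if x <= 0:
                continue  # point behind the camera
            u = cx - fx * y / x  # horizontal pixel of the projected point
            if u_min <= u <= u_max:
                hits.append((x, y))
        if hits:
            # robust position estimate: per-axis median of the in-box points
            fused.append((cls, (median(p[0] for p in hits),
                                median(p[1] for p in hits))))
    return fused
```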

30 of 42

Real-Life Ground Truth Subsystem

  • Overhead camera captures robot & obstacles
  • Markers on top of objects
  • Everything scaled down to 1:10
  • Current status: to be implemented in Fall
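Converting a tracked marker's pixel position into full-scale world coordinates might look like this; the calibration values and the convention of undoing the 1:10 mock-up scaling are assumptions for illustration:

```python
def pixel_to_world(u, v, m_per_px, origin_px, scale=10.0):
    """Convert a marker's pixel centroid from the overhead camera into full-scale
    world coordinates. `m_per_px` and `origin_px` would come from a one-off
    calibration; `scale` undoes the 1:10 physical mock-up scaling."""
    u0, v0 = origin_px
    x = (u - u0) * m_per_px * scale
    y = -(v - v0) * m_per_px * scale  # image v grows downward
    return (x, y)
```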

31 of 42

Project Management

32 of 42

WBS


33 of 42

Spring 2021 Schedule

34 of 42

Fall 2021 Schedule

35 of 42

Milestones

Milestone | Capabilities | Test Plan

PR3 | VRS trajectory dictionary generation; VRS distributed simulation | Show dictionary dataset; show simulation working on separate computers

PR4 | VRS obstacle trajectory dataset; VRS predictive avoidance | Show dataset; show collision predictions

PR5-6 (SVD) | VRS predictive avoidance with one robot | Show robot predicting and avoiding forklift

September | VRS local A* (nominal); RLS ground truth subsystem; RLS forklift image dataset | Demonstrate nominal obstacle avoidance; demonstrate obstacle localization using ground truth camera; show dataset

October | VRS obstacle filtering; RLS YOLOv4 detection | Demonstrate obstacle filtering; show classification results

November | RLS camera + LiDAR sensor fusion | Show localization results

FVD | VRS productivity increase from prediction; RLS object detection, classification, localization | Compare prediction vs. local A*; compare results with ground truth

VRS = Virtual Robotic System

RLS = Real-life Robotic System

36 of 42

Spring Validation Demonstration

  • Test location: a classroom with at least one projector
  • Sequence of events:
    1. Start simulator with factory environment
    2. Forklift and robot begin moving according to scenario descriptions
    3. Robot predicts the forklift's movements and avoids collisions
    4. Repeat for each scenario
  • Success criteria:
    • Robot cruises at 0.5 m/s to 1.8 m/s
    • Robot's rotational velocity is less than 60°/s
    • Robot plans at a minimum of 10 Hz
    • With the above constraints, robot can successfully:
      1. Predict forklift trajectory
      2. Plan evasive maneuvers
      3. Avoid collisions

37 of 42

Fall Validation Demo - Simulation

  • Test location: a classroom with at least one projector
  • Sequence of events:
    1. Demonstrate predictive avoidance in a multi-obstacle scenario
    2. Demonstrate local A* (nominal) avoidance in a multi-robot/multi-forklift scenario
    3. Contrast productivity of predictive avoidance and local A*
  • Success criteria:
    • All criteria from the Spring Validation Demonstration
    • Predictive avoidance is at least 5% more productive than local A*

38 of 42

Fall Validation Demo - Real-life Perception

  • Test location: classroom with empty space
  • Sequence of events:
    1. Set up overhead ground truth camera and place robot/obstacles in testing area
    2. Drive robot while obtaining classification/position of obstacles from robot and ground truth subsystem
    3. Compare robot detection results with ground truth results
  • Success criteria:
    • Detect obstacles up to 2 m away
    • Classification accuracy (mAP) of at least 60%
    • Detect obstacle positions within 0.1 m
    • Detect obstacles at a rate of at least 10 Hz

39 of 42

Budget

  • Total budget: $5,000
  • Spent: ~50% ($2,481.04)
  • Big-ticket items:
    • NVIDIA Jetson Xavier AGX
    • TurtleBot
    • RPLIDAR A2


Part                      Quantity  Unit Cost  Total Cost
TurtleBot (Burger)        1         $549.00    $549.00
Mic/Music stand           1         $100.00    $100.00
Lights                    2         $70.00     $140.00
Doll collection           1         $35.00     $35.00
Forklift A                1         $38.00     $38.00
Forklift B                1         $49.98     $49.98
NVIDIA Jetson Xavier AGX  1         $700.00    $700.00
RPLIDAR A2                1         $319.00    $319.00
Firefly S Camera          1         $199.00    $199.00
M.2 Wi-Fi card            1         $25.00     $25.00
Additional Hardware       1         $46.06     $46.06
PDB PCB parts             1         $200.00    $200.00
Total                     -         -          $2,481.04

40 of 42

Risk Table


41 of 42

Risk Matrix

Before Mitigation

After Mitigation


42 of 42

Questions