
Due: December 13, 2019

BeeBot, The Composting Robot

Members: Dylan Arceneaux, Sophia Batchelor, Stephanie Daffara, Mu-Ti Huang

Designing for Emerging Technologies, Fall 2019

Project Description

BeeBot is a robot built to make composting easy. It is a robotic composter that arrives at a BeeBot stop or your door to collect compost and protect the environment. BeeBot takes your compost to a secret area where it is used to nurture trees and other flowering plants. The secret area is unique to each BeeBot, and each BeeBot carries seeds and watering components to tend to its plants.


Project Description

01 Introduction

02 Design Process

Initial Ideation: The Relationship Between Humans and Robots

BCI Art Bot?

Rebuilding ROS

Design Iteration

03 Final Results

BeeBot: The Composting Robot

Interactions

Fabrication

Code

Identity Design

04 Future Iterations

05 Appendix


01 Introduction

It is no joke, nor an understatement, to say that humanity is slowly destroying nature. Nature has provided so much for us, from oxygen to land, to beautiful food, sights, and living animals and plants, yet we have given very little in return. We have constructed and invented so much, but our inventions are still harmful to our planet. This is where our project steps in. BeeBot is a robotic composter, taking people’s leftover scraps and turning them into food for plants. Not only that, it also keeps the location of its planting a secret, so that no human can find and harm the plants, keeping them safe.

BeeBot is envisioned to run as a fleet through cities. The bots will have scheduled stops, much like bus stops, where compost is dropped off and then taken to their secret gardens for growing plants. Our robot can carry and transport compost, move through rough terrain, cut up the compost, deposit it, plant seeds, and water its plants, all within its shell. It is a fully functional composter created by humans that saves nature from us.

02 Design Process

Initial Ideation: The Relationship Between Humans and Robots

Traditionally, robotic systems have been implemented to assist or replace humans in various tasks. They are used in assembly lines to improve manufacturing time and accuracy, in bomb defusal to avoid human injury, and to complete tasks that are difficult or dangerous for humans to perform. Thus, the role of robots in society has traditionally been to fulfil a service. This is reflected in popular futuristic sci-fi depictions of robotic systems “taking over” humanity, following traditional student-defeats-master, subject-uprising, and coup narratives. Our initial brainstorming sessions built on this idea: what could a robot do for us? Through experimentation we learnt that the Husqvarna robot could carry the weight of a person (image below).

Classmate with the robot.

To that effect, we started brainstorming different ways we could use the specific strengths of the Husqvarna.

List of our initial ideas on a whiteboard.

Notable Ideas from our brainstorming:

  1. Physical Therapy Robot
     - A robot that can help the healing process after physical ailments.
     - Uses string and proximity sensors to move around the room and the patient.
  2. Art Bot
     - Something artsy, an intervention: a robot that does contact improv with you, helps you practice, or that you can perform and collaborate with.
  3. Emotional Support Bot
     - A playful robot built on affective computing: a robot that can sense when you’re sad or happy and execute an action based on that.

Scanning The Model

Image of 3D scanning the robot to use its model for fabricating a new shell.

To get a better grasp of the robot we were going to build from, we 3D scanned the lawn mower, with and without the shield.

Learning ROS

Now that the bot was scanned, it was time to learn about the software stack that made the robot move. The Robot Operating System (ROS) images that had been flashed onto our SD cards were running conflicting versions (Jessie with Kinetic, Buster with Indigo), which largely limited the interactions that could be used with the bot. Our group went through a crash course in ROS[1] and dove into development.

BCI Art Bot?

Upon learning more about the ROS setup we had been given to teleoperate the robot, we discovered that ROS could interface with the web and was thus capable of connecting with the 2019 brain-computer interface device “Notion”, which also used websocket protocols.

Notion is able to detect various neural patterns formed by the synchronized firing of neurons. What this means is that the system can learn the specific neural fingerprint of (almost) any person when they think of “up”, “left”, “right”, “down”, “triangle”, or “pinch”, among various other event-related actions[2]. If Notion could be connected via websocket to the robot running ROS, an individual could think of a triangle and, with a paintbrush attached, the robot would move and paint that triangle. This project was pursued for the first half of the project before pivoting, due to various constraints around the internet protocols allowed on school wifi[3] and the general lack of StackOverflow answers for “how to connect brain to a robot”.
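Concretely, ROS can expose its topics to the web through rosbridge, which speaks JSON over a websocket. Below is a minimal sketch of that bridge in Python, assuming a rosbridge server on its default port 9090 and a robot listening on a /cmd_vel topic of type geometry_msgs/Twist; wiring it to a detected Notion event is left hypothetical.

# Minimal sketch: publish a velocity command to ROS through rosbridge.
# Assumes rosbridge_server on localhost:9090 (its default port) and a
# robot subscribed to /cmd_vel (geometry_msgs/Twist); both are assumptions.
import json
from websocket import create_connection  # pip install websocket-client

ws = create_connection("ws://localhost:9090")

# Tell rosbridge we intend to publish on /cmd_vel.
ws.send(json.dumps({
    "op": "advertise",
    "topic": "/cmd_vel",
    "type": "geometry_msgs/Twist",
}))

# Drive forward briefly -- e.g. in response to a detected "up" thought.
ws.send(json.dumps({
    "op": "publish",
    "topic": "/cmd_vel",
    "msg": {
        "linear": {"x": 0.2, "y": 0.0, "z": 0.0},
        "angular": {"x": 0.0, "y": 0.0, "z": 0.0},
    },
}))
ws.close()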

Rebuilding ROS

The conflicting versions of ROS on the SD card meant that, in order to interface with ROS beyond the supplied keystroke teleoperator, we would need to rebuild ROS. A new SD card was used and a new image was constructed that ran Kinetic. It was a wonderful experience: learning about swap files to get around the 1 GB of RAM on the Raspberry Pi, our first time with a Linux operating system, and our first time playing around in Lubuntu. It was a long struggle to get the necessary files onto the Pi for it to run ROS successfully; in addition, a standard “git pull” would not correctly install the ROS system architecture[4]. Once up and running, we pulled the Husqvarna research platform into the ROS environment and started controlling the robot with the same keystroke teleoperation supplied in the original SD image.

The moment the bot successfully ran the teleop.py script with the new image
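For reference, keystroke teleoperation follows roughly the pattern below, sketched with rospy in the spirit of the supplied teleop.py; the key bindings, speeds, and /cmd_vel topic name are our assumptions rather than the actual script.

# Sketch of keystroke teleoperation with rospy; bindings are assumptions.
import sys, termios, tty
import rospy
from geometry_msgs.msg import Twist

BINDINGS = {"w": (0.2, 0.0), "s": (-0.2, 0.0),   # forward / back
            "a": (0.0, 0.5), "d": (0.0, -0.5)}   # turn left / right

def get_key():
    # Read a single raw keypress from the terminal.
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)
        return sys.stdin.read(1)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)

rospy.init_node("beebot_teleop")
pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)

while not rospy.is_shutdown():
    key = get_key()
    if key == "q":          # quit
        break
    linear, angular = BINDINGS.get(key, (0.0, 0.0))
    msg = Twist()
    msg.linear.x, msg.angular.z = linear, angular
    pub.publish(msg)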

Unfortunately, the VNC viewer software we had been using to remotely interface with the Pi would not run on the new Linux image, so a 10-meter HDMI cable was used with a monitor, along with a peripheral keyboard and mouse.

The BCI-bot (brain-bot) proved immensely challenging, as it has not been done successfully by any known team to date; this is in large part because of the tech stack that had been accessible for brain-computer interfaces, which was why Notion was used[5]. The idea of the BCI-bot was for someone to imagine the strokes of a paintbrush on a canvas and, without the human lifting a finger, have that art piece be painted. We wanted to critique the idea of robots as a service and create a collaboration between the artist (person) and the painter (robot). The performance of making art from a mental picture in your mind evoked the well-quoted interaction in I, Robot between Will Smith’s human character and the semi-conscious robot.

WS: “...can a robot turn a canvas into a beautiful masterpiece?”

Robot: “Can you?”[6] 

There were various learning curves for our group associated with working in a brand-new tech ecosystem (ROS, nodejs, ROSbridge, HTML), and there were numerous battles between ourselves and the data debt we had accidentally created 24 hours earlier.

We successfully opened up a websocket and were able to control a simulation of the robot through the webpage.

Websocket open and receiving data.

Test webpage that connected to ROS

Robot simulation (turtlesim) that was being controlled by Notion.

Design Iteration

After speaking to the instructors, we found out that a large number of robot art projects already existed. It was also difficult to get across the idea that one could control a painting robot with one’s mind by training a BCI system to recognize synchronization and desynchronization events in neural patterns. Thus, we decided to explore other options, and Art-bot was no more.

This time, we looked at the top problems in the world and brainstormed how a robot with these capabilities could contribute to solving them.

List of ideas from the second brainstorm session

Notable ideas from the second brainstorming session:

We returned to our initial discussions around what the role of a robot was and decided to play off the idea of robot uprisings destroying the planet.

Imagining of the world after robots take over as depicted by Terminator.

And instead to play with robots as a medium for protecting the earth from humans.

03 Final Results

BeeBot: The Composting Robot

In the world today, over 90 billion pounds of food ends up wasted, with 31% of the food supply ending up in landfills. In addition, food that sits in landfills can’t break down properly, causing it to release methane, a major greenhouse gas contributor. Sadly, this contributes to 8% of global greenhouse gas emissions.[7][8][9]

One way to manage this issue is composting, which provides wonderful benefits: compost is a superfood for plants, keeping them safe from pests and supercharged for growth. On top of that, composting is actually an easy and simple process. Compost material that people throw away properly goes through a process that chops and shreds the kitchen scraps, combines them with leaves and bark, and then adds water. This produces compost over a period of about three months.

Still, people do not compost! This is mostly because it is hard to maintain: the compost bin usually smells, and keeping it in a kitchen can create an unsafe cooking environment. Many buildings also do not have access to a compost bin or truck.

To manage this issue we created BeeBot.

BeeBot allows you to compost daily without having to collect too many food scraps in your kitchen. It will pick up your compost and use it for planting!

The features include:

A hatch and internal bin for dropping off compost.

A motorized blade that chops the compost.

A seed dispenser that plants a seed with each deposit.

A water pump for watering the new plantings.

Lights and sounds that communicate BeeBot’s state.

Interactions

The interaction with BeeBot starts with the user walking up to a BeeBot stop or scheduling one to show up at their door for pick-up. There is a companion BeeBot Map App that shows users where the nearest stop is, along with the schedule for that stop.

The user then opens the hatch on BeeBot and places their compost waste in the internal bin. When BeeBot is done collecting for the day (or its bin is full), it rides off to its secret garden location. Next, the compost is chopped by BeeBot’s motors. BeeBot then disperses a seed along with the compost onto the ground. Finally, water is poured onto the compost pile.
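This interaction loop maps naturally onto a simple state machine. The sketch below is only our summary of the flow above; the state names and transition conditions are hypothetical, not BeeBot’s actual planning code.

# Illustrative state machine for BeeBot's interaction loop.
# States and transitions summarize the flow described above.
from enum import Enum, auto

class State(Enum):
    DRIVING = auto()      # en route to a stop or the garden
    COLLECTING = auto()   # hatch open at a stop
    CHOPPING = auto()     # blade running over the funnel
    PLANTING = auto()     # seed dispensed with the compost
    WATERING = auto()     # pump running over the pile

def next_state(state, at_stop, hatch_closed, bin_full, day_done):
    if state is State.DRIVING and at_stop:
        return State.COLLECTING
    if state is State.COLLECTING and hatch_closed:
        # Head to the secret garden once full or done for the day.
        return State.CHOPPING if (bin_full or day_done) else State.DRIVING
    if state is State.CHOPPING:
        return State.PLANTING
    if state is State.PLANTING:
        return State.WATERING
    if state is State.WATERING:
        return State.DRIVING
    return state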

Fabrication

There were two ballast tanks, one on either side of BeeBot, in order to distribute the weight equally.

Angles of the modular “Bee” system that clipped onto the “Bot”.

3D printed hinges that were held by magnets.

Seed dispenser.

Undercarriage of the BeeBot

3D-printed stand for snapping onto the bot

The motor and the blade for the compost dispenser

Major Components:

Blade Design

Initial designs of the blade included thin blades that would spin through a full 360-degree rotation to chop the compost. However, this left no stopping mechanism between the compost collection bin and the funnel. We revisited the blade design and cut a large fan shape that could utilize a 180-degree servo and overlap with the struts in an “off” position to stop compost from falling through.

Code

There were two parts to BeeBot’s code.

The Interaction Code

BeeBot planning code.

BeeBot had three motors and one additional control system: the microswitches attached to BeeBot’s doors.

A 180-degree servo motor was used to control the blade, which would “chop” the compost by rotating rapidly between 0 and 180 degrees. We decided on a single large fan blade that would largely cover the gaps that allowed compost to drop through. Future iterations would include a double-blade system, or a hatch-and-blade system, with one blade dedicated to chopping compost and another, larger piece functioning as a “stopper” to keep compost from immediately falling through BeeBot’s funnel.

An additional 180-degree servo motor was used to open and close the doors that released seeds into the compost. The water pump was a DC motor with a simple “on” (1) and “off” (0) function.
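As a sketch of this actuator setup, the snippet below drives the two servos with RPi.GPIO’s software PWM and toggles the pump through a plain GPIO pin; the pin numbers, duty-cycle mapping, and timings are illustrative assumptions, not BeeBot’s actual wiring or code.

# Sketch of BeeBot's three actuators on a Raspberry Pi.
# Pin numbers and timings are illustrative, not the real wiring.
import time
import RPi.GPIO as GPIO

BLADE_PIN, SEED_PIN, PUMP_PIN = 18, 23, 24   # hypothetical BCM pins

GPIO.setmode(GPIO.BCM)
GPIO.setup([BLADE_PIN, SEED_PIN, PUMP_PIN], GPIO.OUT)

blade = GPIO.PWM(BLADE_PIN, 50)   # hobby servos expect a 50 Hz signal
seed = GPIO.PWM(SEED_PIN, 50)
blade.start(0)
seed.start(0)

def set_angle(pwm, angle):
    # Map 0-180 degrees onto roughly a 2.5-12.5% duty cycle.
    pwm.ChangeDutyCycle(2.5 + angle / 18.0)
    time.sleep(0.3)

def chop(cycles=5):
    # "Chop" by swinging the fan blade rapidly between 0 and 180 degrees.
    for _ in range(cycles):
        set_angle(blade, 180)
        set_angle(blade, 0)

def plant_seed():
    set_angle(seed, 90)   # open the seed doors
    set_angle(seed, 0)    # close them again

def water(seconds=3):
    # The pump is a plain DC motor: on (1) / off (0).
    GPIO.output(PUMP_PIN, 1)
    time.sleep(seconds)
    GPIO.output(PUMP_PIN, 0)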

BeeBot also had audio and light control scripts. Both were used to inform users about the different states BeeBot could be in.[10]

The sounds were chosen based on the interactions they accompanied and what would best convey BeeBot’s functionality during each interaction.

There were 5 light settings for BeeBot.

No lights for standard driving mode.

White lights whenever the hatch was open.

A temporary yellow confirmation light when the hatch was closed.

Green light for planting.

Blue light for when BeeBot was watering.

The light colours were chosen to be most intuitive for a bystander or user to understand.
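A minimal sketch of how those five settings might map onto an RGB light, assuming a hypothetical set_rgb(r, g, b) helper for whatever LED driver is used (the real light script was separate):

# Sketch: mapping BeeBot's states to the light settings listed above.
# set_rgb() stands in for a hypothetical LED driver helper.
LIGHTS = {
    "driving":      None,             # no lights in standard driving mode
    "hatch_open":   (255, 255, 255),  # white while the hatch is open
    "hatch_closed": (255, 200, 0),    # temporary yellow confirmation
    "planting":     (0, 255, 0),      # green while planting
    "watering":     (0, 0, 255),      # blue while watering
}

def show_state(state, set_rgb):
    color = LIGHTS.get(state)
    if color is None:
        set_rgb(0, 0, 0)              # lights off
    else:
        set_rgb(*color)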

Identity Design

Given the concept of the product, we decided to name the project BeeBot because bees are polylectic animals: the action of collecting compost from people and planting it around is similar to a bee’s relationship to flowers. In addition, the way the compost doors open on the top of the robot resembles the look of a bee.

Besides designing the bee for the logo, we also designed the wings as a symbol of infinity, hinting at the sustainability aspect of the product.

In terms of color, we didn't want the addition of BeeBot to create visual disruption in nature. Therefore, we decided to color the product green.

04 Future Iterations

BeeBot has a lot of potential to be put into the real world. But in order for it to be better at what it does, there would have to be planning and study behind where BeeBot stops should be placed and how often the bots should visit them. Also, to support this project, we would start a conversation with city planners and try to get subsidies from the cities where we wish to deploy the bot.

For the compost collection and storage of materials, it would be useful to add computer vision inside the bot to monitor what is being deposited; this way it can reject items that are not compostable.
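As a rough sketch of how such a check might look, here is a compostability filter built on a pretrained ImageNet classifier from torchvision. The model choice and the set of “compostable” class indices are purely our assumptions, since no vision system exists on BeeBot today.

# Rough sketch of a compostability check using a pretrained classifier.
# Model choice and class list are assumptions; BeeBot has no camera yet.
import torch
from PIL import Image
from torchvision import models, transforms

model = models.mobilenet_v2(pretrained=True).eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical ImageNet indices treated as compostable (fruits, etc.);
# a real deployment would need its own labeled classes.
COMPOSTABLE_CLASSES = {948, 949, 950, 951, 952, 953, 954}

def is_compostable(image_path):
    img = preprocess(Image.open(image_path)).unsqueeze(0)
    with torch.no_grad():
        prediction = model(img).argmax(dim=1).item()
    return prediction in COMPOSTABLE_CLASSES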

To better the user experience, we’d like to implement automatically opening and closing doors so that the user does not have to physically touch the bot, providing a safer and cleaner interaction. To improve safety, we would also put the blade and hatch in two separate compartments. This would not only make the bot safer for people sticking their hands in (even though the blades stop once the hatch is open) but would also give us more control over when to push compost out.

Another interesting addition would be a reward system that incentivizes users to compost more. Maybe the user would get a score and a message that says, “Congratulations! Your compost has fed 3 new plants this week.” We also think that this data, such as how much people are composting, what they are composting, and which neighborhoods are falling behind on composting, could be great information for city administrations and general public knowledge.

Finally, the bot is already designed in an extremely modular way: the shell simply snaps on and off. This way, one could exchange shells depending on what they want to use the robot for.


[1] https://lmgtfy.com/?q=what+is+ROS

[2] “Event-related potentials”

[3] Cal Visitor’s permissions

[4] That was found out “the hard way”.

[5] It also helped to have the neuroscientist who built some of Notion’s stack on the team, although OpenBCI’s Ganglion and g.tec’s Unicorn were also considered and prototyped with.

[6] https://www.youtube.com/watch?v=KfAHbm7G2R0

[7] https://www.choosemyplate.gov/resources/lets-talk-trash

[8] https://www.usda.gov/oce/foodwaste/faqs

[9] https://ilsr.org/compost-impacts-2

[10] “Eve” was a previous name for BeeBot and was preserved in the codebase

[11] If a seed is planted in a forest and there’s no one to hear it...