WALLI - Waste Awareness Leads to Lasting Impact
Problem: Americans incorrectly sort around a third of their recyclables due to inconsistent recycling guidelines, inadequate education, confusing material-identification systems, and a lack of motivation. The result is contamination in recycling streams, higher processing costs, and unnecessary waste in landfills, which contributes to environmental degradation and greenhouse gas emissions.
Name: Max Castrapel and Shawn Castrapel
School: Sierra Vista Middle School
Teacher: Mr. Trevor Johnson
The idea for WALLI came from a problem my brother and I noticed at home. When we threw away trash or recyclables, we sometimes put them in the wrong bin, and our mom had to sort everything again, which was really frustrating for her. We realized that this wasn't just an issue for our family: many people are confused about what can be recycled, both at home and in public. When we researched more, we found that a lot of recyclables end up in the trash because of this confusion. Even worse, when trash is accidentally disposed of in the recycling bin, trash companies often send everything to the landfill due to contamination (Blance, Spanbauer, & Stienecker, 2023).
Seeing how big this problem was, we wanted to come up with a solution that could help people sort their waste correctly, while also making it fun and educational. That’s how WALLI (Waste Awareness Leads to Lasting Impact) was born. WALLI is designed to help people know where their waste should go, motivate them to recycle properly, and educate future generations on how to protect the environment.
Abstract
Our project tackles the problem of recycling contamination with a smart waste-sorting system. Worldwide, humans generate more than 2 billion tons of waste annually, and if the trend continues, that will exceed 3 billion tons by 2050 (The World Bank, 2024). In 2018, the U.S. alone generated almost 300 million tons of waste, or 4.9 pounds per person per day, with 66 percent of paper being recycled but more than 18 percent of plastics ending up in landfills (EPA, 2024). To help solve this, we created WALLI, which uses computer vision, machine learning, and a reward system to teach users how to sort waste properly.
WALLI is powered by a Raspberry Pi with a camera and servo motors. We used custom Python code with the YOLO model to identify items as either recyclable or trash. The system includes facial recognition so it knows who is using it and accepts voice commands so users can say if they’re throwing away trash or recycling. WALLI checks if the item matches what the user said, gives feedback with an audio response, awards points for correct sorting, and opens the right bin.
Our tests showed that facial recognition worked with 95% accuracy when using the DeepFace library with the Facenet512 recognition model (Serengil & Ozpinar, 2024). The accuracy of identifying recyclables varied depending on the item and its position, with the average item identified correctly 73% of the time. Voice command recognition reached around 96% accuracy thanks to the Whisper AI model and Levenshtein distance, which helped match similar words even when they were transcribed differently.
WALLI makes recycling more interactive and motivational by recognizing waste and encouraging proper disposal through its point system and a website that offers rewards that can be customized by parents and teachers, with the initial reward being a virtual tree that grows as users earn points. The long-term goal is to reduce contamination, confusion, and the amount of waste in landfills. Although we faced challenges with processing speed and detection accuracy, WALLI shows that affordable technology can make recycling easier and more educational through positive reinforcement.
Triangular recycling symbols mislead and confuse consumers, contributing to a high contamination rate. Note: Only numbers 1, 2 and 5 are commonly recyclable in California. (Hayward, 2024).
(Garcia, 2021)
Introduction (Background Research)
Project Constraints:
1. Cannot handle items larger than 1 foot
2. Limited to recyclables (glass, metal, paper, plastics #1 and #2) and trash (no compost)
3. Unable to determine whether different types of plastic are recyclable or not
4. Needs Wi-Fi access
5. Can only process one item at a time
Source | Trash sorting | Reward system | Education | At least 3 bins | Cost |
WALLI | ✓ | ✓ | ✓ | X | <$200 |
TrashBot | ✓ | X | X | ✓ | >$1500 |
MyMatr | ✓ | X | ✓ | X | ~$3000 |
Only 32 percent of household recyclables are captured in the United States. Households with access to curbside recycling often mishandle recyclables, which raises the contamination rate through improper mixing of waste and recyclables (The Recycling Partnership, 2020) and increases the amount of recyclables in our landfills. Inconsistent regulations and recycling symbols create confusion and discourage recycling participation (Latkin, Dayton, & Yi, 2022). Some cities have attempted to improve recycling rates by modifying trash pickup systems to be more accessible and implementing clearer tags on products to reduce contamination (The Recycling Partnership, 2020). However, education from childhood through adulthood, combined with incentive systems, can be the most effective and sustainable solution, encouraging behavioral change and environmental awareness (Jam, 2024). Current solutions like manual sorting are expensive and inefficient. Smart bins already exist, but their high cost prevents them from being widely used. WALLI includes trash sorting plus a rewards and education system to motivate users of all ages to recycle, at a cost of less than $200 for the hardware.
Success Criteria:
1. Should accurately detect a user and common recyclables with at least 70% accuracy and recognize a user in less than 5 seconds
2. Should be portable, easy to install, and support multiple users
3. Should provide a web-based point system to motivate users
4. Should provide educational feedback
Engineering Goal: Create an easy-to-install, portable, inexpensive, accurate, reliable, and fast program that detects trash and recyclables, reduces confusion between them, and motivates and educates users to recycle more.
Machines Like WALLI
(EPA, 2024)
The Engineering Solution, Prototype/Model to be tested.
(Photo labels: webcam used for its microphone only, speaker, breadboard and Raspberry Pi behind, servo 1 with camera, servo 2)
WALLI is built with a Raspberry Pi that controls the system, two servo motors with chopsticks hot-glued to them, plastic tubing hot-glued to the trash can lids (the chopsticks fit inside to lift the lids), a camera for facial recognition and object detection, a speaker, a microphone (from a USB webcam) for voice commands, manual control buttons, and a desktop PC with a GPU (a separate server) for facial recognition, voice-to-text, and object detection.
The system identifies users with facial recognition, identifies waste as trash or recycling with YOLO object detection, understands voice commands with OpenAI Whisper, opens the correct bins with servo motors, gives the user points, and provides a website to view points. The Raspberry Pi communicates with the desktop server over HTTP for facial recognition and object detection because they are too intensive for the Raspberry Pi.
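The flow above can be sketched as a small Python function with the heavy steps injected as callables, so it can be exercised without a camera, microphone, or servos. All helper names here are illustrative placeholders, not WALLI's actual function names.

```python
# Sketch of one WALLI interaction cycle. The injected callables stand in
# for the real face server, Whisper transcription, YOLO detection, servo
# control, speaker output, and the points database.

def walli_cycle(detect_face, transcribe, detect_object,
                open_bin, speak, award_points):
    user = detect_face()                  # e.g. sent to the GPU server over HTTP
    if user is None:
        speak("Sorry, I don't recognize you.")
        return None

    intent = transcribe()                 # "trash" or "recycling" from the user
    category = detect_object()            # "trash" or "recycling" from the model

    if intent == category:
        speak(f"Correct! That goes in {category}.")
        award_points(user, 10)            # reward correct sorting
    else:
        speak(f"Actually, that belongs in {category}.")

    open_bin(category)                    # servo lifts the matching lid
    return category
```

Note that the bin opened is always the one matching the detected category, so the item ends up in the right place even when the user guessed wrong.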
Materials
Hardware
Item | Count |
Raspberry Pi | 1 |
Camera module | 1 |
USB webcam (Only used microphone) | 1 |
Tupperware | 1 |
Bluetooth speaker | 1 |
Plastic trash bin | 2 |
Servo Motor | 2 |
Chopstick | 2 |
Desktop Computer with GPU (Server) | 1 |
Operating Systems
Raspberry Pi OS (64 bit), macOS Sequoia (15.0)
Programming Languages
Python, HTML, JavaScript, CSS
Software and Libraries
OpenCV, TensorFlow, Fritzing, PyTorch, Whisper, DeepFace, PiCamera 2, YOLO v8 - Oriented Bounding Boxes, Levenshtein, PyDub, SciPy, PlaySound, gTTS, GPIOZero, FastAPI, NumPy, Roboflow, Ultralytics, SQLAlchemy, Facenet512, MTCNN, Mermaid
Procedure
Raspberry Pi
Code
Server Code:
Results – Data/Observations
Face Authentication v1 (face_recognition) Accuracy
Trash/Recycling Object Detection Accuracy
Speech-to-text Word Detection Accuracy (10 attempts each)
Item | Accuracy | Detected as |
Plastic Bottle (#1 and #2) | 96% for bottle and 82% for cap | Plastic bottle, plastic bottle cap |
Crumpled up paper bag | 95% | cardboard |
Aluminum can | 94% for can and 82% for Pop Tab | Pop Tab, aluminum can |
Aluminum foil | 91% | Cardboard |
Cardboard | 85% | Paper |
Paper drink carton | 81% for drink carton and 71% for cap | Plastic bottle cap and paper drink carton |
Glass Bottles | 73% | Plastic bottle |
Food container | 62% | Cardboard |
Plastic Jar | 57% | Glass Bottle |
Paper | 38% | Paper |
Books | 33% | Paper |
Chip Bag (Not recyclable) | 77% | Paper |
Plastic Bag(Not recyclable) | 73% | Cardboard |
Stats | Correctly Detected? | Detected as? | Confidence |
No person | Yes | N/A | N/A |
Not a valid user | Yes | Unknown person | 33.37% (Max), 32.9% (Shawn); correct because confidence is below 50% |
Shawn (w/glasses) | Yes | Shawn | 78.46% (Shawn), 67.29% (Max) |
Max (w/glasses) | Yes | Max | 72.68% (Shawn), 78.46% (Max) |
Shawn (no glasses) | Yes | Shawn | 74.76% (Shawn), 72.38% (Max) |
Max (no glasses) | No | Shawn | 66.49% (Shawn), 65.24% (Max) |
Face Authentication v2 (DeepFace) Accuracy - 10 attempts each
Stats | Detected? | Detected as? | Confidence |
Max (w/glasses) | Yes | Max | 100% |
Max (no glasses) | Yes | Max | 100% |
Shawn (w/glasses) | Yes | Shawn | 90% |
Shawn (no glasses) | Yes | Shawn | 90% |
Face Authentication: The face_recognition library confused Max and Shawn (without glasses), so we switched to the DeepFace library with the Facenet512 model, which had an accuracy greater than 95% under good lighting conditions, regardless of whether the user wore glasses.
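The accept/reject rule implied by the v1 results table (accept the best match only when its confidence clears 50%, otherwise report an unknown person) can be sketched in a few lines. How the per-user confidences are produced, for example by DeepFace with the Facenet512 model, is assumed here; this only shows the thresholding step.

```python
# Minimal sketch of the authentication decision, assuming some facial
# recognition backend has already produced a confidence per known user.

def authenticate(confidences, threshold=50.0):
    """confidences: dict mapping user name -> match confidence in percent."""
    if not confidences:
        return "Unknown person"
    best_user = max(confidences, key=confidences.get)
    if confidences[best_user] < threshold:
        return "Unknown person"   # below threshold: reject, as in our v1 table
    return best_user
```

With the numbers from the "Not a valid user" row, `authenticate({"Max": 33.37, "Shawn": 32.9})` correctly returns "Unknown person".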
Object Detection: WALLI detected plastic bottles, aluminum cans, and paper bags with high accuracy. Aluminum foil, cardboard, paper drink cartons, and glass bottles were correctly detected as recyclables over 70% of the time, but were often assigned the wrong material. Chip bags and plastic bags were incorrectly detected as recyclables over 70% of the time.
Voice Recognition: Using Whisper AI, the user’s intention (“trash” or “recycling”) was detected accurately 96% of the time when the user used a trigger word by itself, or in a sentence.
Manual Bin Operation: Manual buttons worked well as a backup, letting users open bins if they are in a hurry and don’t want to use the full WALLI system.
Trash | Recycle | Recycling | Garbage | Rubbish | Refuse |
100% | 100% | 100% | 100% | 100% | 75% |
Revised Solution and Prototype/Model
We tried several ideas and improved the design multiple times before deciding on the best one. For authenticating the user, we initially tried NFC tags and QR codes, but both were too inconvenient because users had to use a cell phone while throwing waste away. We ultimately chose facial recognition using a camera and machine learning with the face_recognition library, and later switched to DeepFace, which gave us an accurate way to identify users.
Our hardware improved through multiple prototypes. In the first version, we placed a servo motor with a chopstick under the trash can lid, but the jittery motor made it unreliable, and the lid couldn’t be opened by hand. In the second version, we connected the lid to the servo hand (chopstick) with magnets, allowing the lid to be pulled up for manual operation, but having the servo motors inside the lid wasn’t ideal because it would get dirty and possibly become less reliable over time as trash was thrown in the can.
Our final prototype solved these issues by placing the servo on top of the lid, keeping it safe from trash and avoiding tangled wires. We also added manual control buttons to ensure users could open the lid even if the device wasn’t working properly or if the user wasn’t registered. The final design was simple and a smoother experience for users.
Prototype 1
(Servo inside, using Cable Tie. Securely attached to bin. User cannot manually open bin.)
Prototype 2
(Servo inside, arm in closed position. Using magnets shows how bin can be manually opened.)
Prototype 2
(Servo Inside, using magnets.
Bin in open position.)
Prototype 3
(Servos outside, Securely attached to lid. Added Speaker, Camera, Microphone, and buttons to manually open bin, 2 bins)
Discussion
Facial recognition: Initially, we were authenticating users against their image taken from a cell phone, but we found it more accurate to authenticate the user against images taken from the same camera that we attached to the trash can because it uses the same resolution, lighting conditions, and angle as the images we were comparing to. When these conditions were met, the accuracy of authenticating users with their face was above 95%.
Challenges: We found that most facial detection libraries were trained mainly on images of Caucasian individuals and didn't work as well for faces with Asian features. They also have trouble with family members (Yucer et al., 2023). This was a problem for us since my brother and I look similar; identical twins would have a similar problem. The system often mixed us up, even after we switched from the face_recognition library to DeepFace. By taking images multiple times, experimenting with different models, and removing blurry images, we raised the accuracy above 95%.
Object detection: Accuracy improved with the use of Oriented Bounding Boxes in YOLOv8, which detects objects better at different angles. We improved accuracy further by increasing our Raspberry Pi camera resolution from 640x480 to 1280x960, though this slowed down processing. These efforts improved accuracy by 39% for common household items.
Challenges: Two factors influenced our accuracy. First, the open-source models we found were trained mostly on images of trash on the ground rather than trash being held, which doesn't match our use case. Second, many training images had background clutter that confused the model. In the future we would train a model the same way we suggested for facial recognition: with images taken under the same conditions as the ones being compared (lighting, angle, camera, resolution, etc.). The system is also unable to determine whether different types of plastic are recyclable.
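Once the model returns a label, deciding which bin to open is just a lookup. A sketch, with label names mirroring our results table (the set itself is illustrative), and with unknown labels defaulting to trash, which matches how WALLI treats items missing from its training data:

```python
# Illustrative label-to-bin mapping. Labels not in the recyclable set are
# treated as trash, matching WALLI's behavior for unrecognized items.

RECYCLABLE_LABELS = {
    "plastic bottle", "plastic bottle cap", "aluminum can", "pop tab",
    "cardboard", "paper", "glass bottle", "paper drink carton",
}

def label_to_bin(label):
    return "recycling" if label.lower() in RECYCLABLE_LABELS else "trash"
```

For example, `label_to_bin("Aluminum can")` returns "recycling", while a chip bag or any unknown label falls through to "trash".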
Voice command: We improved accuracy with Whisper AI and Levenshtein distance. The system detects whether you are saying trash or recycling with 96% accuracy, even if you say one of those words inside a full sentence, or if the system transcribes a similar word like "tarsh" or "bicycle".
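The fuzzy trigger-word matching can be sketched with a plain Python edit-distance function (our build used the Levenshtein package; the trigger list mirrors our accuracy table, and the distance cutoff of 2 is an illustrative value we would tune):

```python
def levenshtein(a, b):
    # Classic dynamic-programming edit distance between two strings.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

TRIGGERS = {"trash": ["trash", "garbage", "rubbish", "refuse"],
            "recycling": ["recycle", "recycling"]}

def detect_intent(transcript, max_distance=2):
    # Scan every word of the transcript; a word within edit distance 2 of a
    # trigger counts, so a mis-transcription like "tarsh" still maps to trash.
    for word in transcript.lower().split():
        for intent, triggers in TRIGGERS.items():
            if any(levenshtein(word, t) <= max_distance for t in triggers):
                return intent
    return None
```

This is also why "bicycle" works: it is within edit distance 2 of "recycle", so the mis-heard word still resolves to the recycling intent.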
Challenge: Having separate devices for the speaker and microphone made the system harder to use and less efficient. We have a device with a combined speaker and microphone, but we had trouble getting it to work on the Raspberry Pi due to driver issues.
Speed Challenges: Our object and facial recognition weren't as fast or accurate as we wanted because of the limited processing power of our Raspberry Pi. We later moved the heavier tasks to a desktop PC with a graphics card and created a web server so the Raspberry Pi could send images and audio to the server, which sped up the system. Discarding blurry images also sped up correct detection.
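The blur filter used to discard frames can be sketched with plain NumPy (our setup could equally use OpenCV's cv2.Laplacian; the threshold of 100 here is a made-up value that would be tuned per camera):

```python
import numpy as np

def laplacian_variance(gray):
    # Apply the 4-neighbor Laplacian kernel via array shifts (np.roll wraps
    # at the edges, which is fine for a quick sharpness estimate).
    g = gray.astype(float)
    lap = (np.roll(g, 1, axis=0) + np.roll(g, -1, axis=0) +
           np.roll(g, 1, axis=1) + np.roll(g, -1, axis=1) - 4 * g)
    return lap.var()

def is_blurry(gray, threshold=100.0):
    # Sharp images have strong edges and a high Laplacian variance;
    # blurry ones have a low one.
    return laplacian_variance(gray) < threshold
```

A sharp, high-contrast frame scores a very high variance, while a defocused (nearly flat) frame scores near zero, so thresholding cleanly separates the two.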
Website Challenges: We ran into difficulties with the website and reward system, especially with the code for the tree-growing video because we wanted to stop the video at a certain position based on the number of points the user had. This issue was addressed with our dad’s help.
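The arithmetic behind stopping the tree video can be sketched as a simple proportional mapping (the site itself is JavaScript; Python here just shows the idea, and the 1000-point cap and 30-second duration are made-up values):

```python
# Map a user's points to a pause position in the tree-growing video.
# max_points and duration_s are illustrative, not WALLI's real settings.

def pause_position(points, max_points=1000, duration_s=30.0):
    fraction = min(points, max_points) / max_points  # cap at a full-grown tree
    return fraction * duration_s                     # seconds into the video
```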
Manual buttons were added because when we were in a hurry, we wanted to use the trash cans without going through the entire WALLI flow. These buttons were simple to add, and it made it easy to open the cans quickly.
Conclusion
Criteria 1: WALLI was successful in detecting recyclables with over 70% accuracy and recognizing users within 5 seconds. We realized that to make WALLI more reliable, we would need to train the model on a much larger set of sharp images, ideally taken with the same camera, resolution, lighting conditions, and positioning as in real use.
Criteria 2: WALLI was successful at being portable and easy to set up. It fits on standard bins and can be used at home or in the office. It also supports multiple users with features like facial recognition and voice commands. Even though we’re happy with how this turned out, we noticed that using separate devices for the speaker and microphone made the setup more complicated than we’d like. A combined device would make things much simpler.
Criteria 3: The web-based point system was successful at motivating users and giving educational feedback. My brother and I had fun with the point system after my parents asked us to sort through waste and get a certain number of points before letting us play video games. We believe this system could improve user engagement and education with the right rewards, which would lead to fewer mistakes and a better correct sorting rate.
Criteria 4: WALLI was successful in providing educational feedback through audio messages and points. Users learned the right way to recycle as they interacted with the system. The voice-to-text feature worked well using OpenAI’s Whisper model. We also added support for synonyms and similar-sounding words with Levenshtein Distance so that even if the voice command came out as “trash,” “rubbish,” or “recyclable,” WALLI could still understand and respond properly, as long as one of the trigger words is used by itself or in a sentence.
Other smart bins on the market are pricey and don’t include educational or motivational features. WALLI shows that AI can make recycling more engaging and teach users the right way to sort waste at a lower cost. However, it still needs more refinements and investments to become fully reliable.
Success Criteria 1
Should accurately detect a user and common recyclables with at least 70% accuracy and recognize the user in less than 5 seconds
Success Criteria 2
Should be portable, easy to install, and support multiple users
Success Criteria 3
Should provide a web-based point system to motivate users
Success Criteria 4
Should provide educational feedback
Reflection/Application
Lessons:
1. During this project, we learned a lot about working with the Raspberry Pi, using servos and buttons, programming in Python, and building circuits with a breadboard. Our servos were jittery, and a more powerful power supply didn't fix the issue; using pigpio made the signals to the motor more consistent and fixed the jitter.
2. We also learned how computers compare faces with facial recognition and discovered how to find models and annotate images using Roboflow. We learned how to ignore blurry images using Laplacian variance, which reduced incorrect authentication and object identification.
3. We realized that to train a model to accurately identify different items, a large dataset of images taken in conditions similar to the images being compared is necessary. We used an open-source dataset obtained through YOLO, but would want to train on a larger set of images of items held in the user's hand for higher accuracy with WALLI (TACO, 2024).
4. We avoided scanning recycling symbols because symbol usage is often inaccurate. Some governments are working on stricter regulations on the use of the recycling symbol to avoid consumer confusion and contamination, which might make scanning practical in the future (LaMotte, Richardson, Meng, & Munger, 2021).
5. We also found that facial recognition is not perfect, especially with simple hardware like ours. This taught us that making a reliable system requires more than just coding; it also takes a lot of testing and adjustment, since the device will be used by families whose members have similar facial features.
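The pigpio fix mentioned in lesson 1 can be sketched as follows. The GPIO pin number and the 500-2500 µs pulse range are typical hobby-servo values, not measurements from our build, and the pigpio connection is passed in so the angle math can be checked without the hardware.

```python
# Sketch of driving a lid servo with pigpio's hardware-timed PWM, which is
# the kind of signal that removed the jitter we saw with software PWM.

def angle_to_pulsewidth(angle_deg, min_us=500, max_us=2500):
    # Map 0-180 degrees onto the servo's pulse-width range, clamping
    # out-of-range angles to the endpoints.
    angle_deg = max(0, min(180, angle_deg))
    return min_us + (max_us - min_us) * angle_deg / 180

def open_lid(pi, gpio_pin=18, angle=90):
    # pi is a pigpio.pi() connection; requires the pigpiod daemon running.
    pi.set_servo_pulsewidth(gpio_pin, angle_to_pulsewidth(angle))
```

Because pigpio generates the servo pulses with hardware timing rather than in Python, the pulse width stays steady and the arm holds its position without twitching.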
Future improvements
1. We would upgrade our hardware, like adding a more powerful graphics card to speed up object detection and facial recognition.
2. We also want to use a better-trained model to improve the accuracy of facial recognition.
3. To make the system easier to use, we would add a higher-quality microphone and speaker for clearer interactions, and hope to find a speaker and microphone in one device in the future.
4. Currently, WALLI can recognize 7 of the 13 most common recyclable items with over 70 percent accuracy, and 1 type of trash (plastic bag). If the user is holding a different item, WALLI will classify it as trash because the item was not in its training data. Training a larger model that detects more items would make WALLI more useful, and adding sensors (like a spectrometer) could also improve accuracy by helping the system understand what materials an item is made from.
5. We also want to make the website more engaging by adding better rewards and educational content to motivate and teach users.
6. Give the user an option to register at the trash can (and not the website) if they are a new user.
This project showed us that even simple technology can help solve big problems, but it takes careful planning and improvements to make it effective. By upgrading our system and adding new features, we hope to make WALLI even better at sorting recyclables and educating users about proper recycling practices.
References Cited
Blance, C., Spanbauer, C., & Stienecker, S. (2023). America’s Broken Recycling System. California Management Review. Retrieved from https://cmr.berkeley.edu/2023/05/america-s-broken-recycling-system/
EPA. (2024). National overview: Facts and figures on materials. Retrieved from https://www.epa.gov/facts-and-figures-about-materials-waste-and-recycling/national-overview-facts-and-figures-materials
Garcia, M. (2021). That Recycling Symbol Doesn’t Always Mean What You Think It Does. Retrieved from https://www.kqed.org/news/11883400/bill-aims-to-clarify-which-plastic-products-can-display-recycling-symbol
Hayward. (2024). Decoding Plastic Recycling in California. Retrieved from https://www.hayward-ca.gov/discover/news/jan24/decoding-plastic-recycling-california#:~:text=Plastics%20marked%20with%20numbers%201,%2C%20jars%2C%20and%20yogurt%20tubs
Jam, S. (2024, March 7). Community Engagement and Education for Recycling: Best Practices and Strategies. Medium. Retrieved from https://medium.com/@sohrabjam/community-engagement-and-education-for-recycling-best-practices-and-strategies-b2b1485f1fce
LaMotte, R., Richardson, B., Meng, D., & Munger, S. (2021). California Prohibits Use of Chasing Arrows on Non-Recyclable Items. Retrieved from https://www.bdlaw.com/publications/california-prohibits-use-of-chasing-arrows-on-non-recyclable-items/
Latkin, C., Dayton, L., & Yi, G. (2022). The (Mis)Understanding of the Symbol Associated with Recycling on Plastic Containers in the US: A Brief Report. Sustainability, 14(15), 9636. https://doi.org/10.3390/su14159636
Serengil, S., & Ozpinar, A. (2024). A benchmark of facial recognition pipelines and co-usability performances of modules. Journal of Information Technologies, 17(2), 95–107. https://doi.org/10.17671/gazibtd.1399077
TACO. (2024, October). yolo v8-trash-detections dataset [Open source dataset]. Roboflow Universe. Retrieved from https://universe.roboflow.com/fyp-bfx3h/yolov8-trash-detections
The Recycling Partnership. (2020). 2020 State of curbside recycling report. Retrieved from https://recyclingpartnership.org/wp-content/uploads/dlm_uploads/2020/02/2020-State-of-Curbside-Recycling.pd
The World Bank. (2024). Trends in Solid Waste Management. Retrieved from https://datatopics.worldbank.org/what-a-waste/trends_in_solid_waste_management.html.
Ultralytics. (2024). Oriented bounding boxes (YOLOv8-OBB) object detection using Ultralytics YOLOv8 [Video file]. YouTube. https://www.youtube.com/watch?v=Z7Z9pHF8wJc
Yucer, S., Tektas, F., Al Moubayed, N., & Breckon, T. P. (2023). Racial Bias within Face Recognition: A Survey. arXiv. Retrieved from https://doi.org/10.48550/arXiv.2305.00817
Log Book 1
Log Book 2
Log Book 3
Log Book 4
Log Book 5