AUTONOMOUS NETWORKED ROBOTS FOR ADVANCED MANUFACTURING
11th Annual COE Graduate Poster Presentation Competition
Student (PhD): M A Muktadir
Advisor: Dr. Sun Yi
Cross-Disciplinary Research Area:
Additive Manufacturing
Recent advances in advanced manufacturing, including 3D printing technologies, enable different types of manufacturing systems (additive and subtractive) to produce complex objects. However, many challenges remain in the production process. One of the main challenges is safely handling products after production, without damage or material loss, in an autonomous manner in remote environments where human access is limited, for example due to radioactive materials. Robotic manipulators combined with mobile robots are a natural choice for such complex tasks. Depending on the manufacturing process, the gripper fingers may need to be modified and the robot type should vary: it may be a fixed robotic arm or a robotic arm mounted on a mobile ground robot. Such robot systems minimize human involvement and thereby the risk of injury. Unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs) must also operate in complex or unstructured environments, such as rough surfaces and windy areas. UGVs require certain ground surface types to reach their destination, while UAVs cannot maneuver in confined spaces and are constrained by limited flight time and stability. SPOT, a quadruped robot, is an excellent alternative for the AM environment: it can repeatedly inspect the production process on schedule without operator involvement, since it can charge itself and complete missions autonomously in a timely manner.
ACKNOWLEDGEMENT
This research is funded by the NCDOT.
Abstract
Methodology
Conclusion & Future Work
Data Collection
Implementation
Fig. 1: Flow chart of the methodology.
Postprocessing of data
Training and result
Evaluation Metrics of the trained model:
Precision = True Positives / (True Positives + False Positives)
Recall = True Positives / (True Positives + False Negatives)
F Score = (2 × Precision × Recall) / (Precision + Recall)
ML Model | Precision | Recall | F Score
LEAK | 1 | 1 | 1
JOINT | 1 | 1 | 1
DEPOSITION | 1 | 1 | 1
LEAK & JOINT | 1 | 1 | 1
LEAK & DEPOSITION | 1 | 0.75 | 0.857
JOINT & DEPOSITION | 1 | 0.75 | 0.857
LEAK, JOINT, & DEPOSITION | 1 | 0.6667 | 0.8
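As a sanity check on the table above, the metrics can be computed directly from the counts of true positives, false positives, and false negatives. This is a minimal sketch; the counts used in the example are illustrative, not the actual test-set counts:

```python
def precision(tp, fp):
    # Precision = TP / (TP + FP)
    return tp / (tp + fp)

def recall(tp, fn):
    # Recall = TP / (TP + FN)
    return tp / (tp + fn)

def f_score(p, r):
    # F Score = (2 x Precision x Recall) / (Precision + Recall)
    return 2 * p * r / (p + r)

# Illustrative example: a model that detects 3 of 4 defects with no
# false alarms (precision 1, recall 0.75) reproduces the 0.857 F score
# seen in the LEAK & DEPOSITION row.
p = precision(tp=3, fp=0)   # 1.0
r = recall(tp=3, fn=1)      # 0.75
f = f_score(p, r)           # 0.857 (rounded)
```

Note that a precision of 1 with a recall below 1, as in the last three rows, indicates a model that produces no false alarms but misses some defects.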
Fig. 2: Data collection with a 3D scanner.
Fig. 3: Postprocessing and classification of the different defect types.
Fig. 4: ML training results.
Fig. 5: SPOT robot with a LIDAR camera.