
Collaborative Robotics Trends


Perception-based region selection for collaborative robotic sanding

December 30, 2020 By Jorge Nicho Leave a Comment

Robotic systems that can collaborate with humans on the factory floor are in high demand in the manufacturing community. But collaborative robots still fall short in many respects. One such shortcoming appears in quality control for subtractive manufacturing applications such as sanding, grinding, and deburring, where material is removed from a part with an abrasive tool until a desired surface condition is obtained.

In such a scenario, the quality of the finish can be assessed by an expert human operator. It would be advantageous to leverage this expertise to guide semi-automated robotic systems to work on the regions that need further work until the desired quality is achieved.

Given this challenge, this research focused on enhanced human-robot collaboration, specifically collaborative robotic sanding. It produced a capability that allows a human operator to guide the process by physically drawing a closed selection region on the part itself. This region is then sensed by a vision system and, with an algorithmic solution, used to crop out the sections of the nominal process toolpaths that fall outside its confines.

Approach

Initially, a small dataset of hand-drawn, closed-region images was produced to aid the initial development of the 2D contour detection method and its projection into 3D. These images were made with a dark marker on white paper lying on a flat surface and imaged with the Framos D435 camera. The 2D contour method that resulted from this dataset was implemented with the open-source OpenCV library and comprised the following filters/steps:

  • Grayscaling
  • Thresholding
  • Dilation
  • Canny edge detection
  • Contour finding

The output of this operation was the 2D pixel coordinates of the detected contours (Figures 1.a and 1.b).

[Figures 1.a and 1.b: detected 2D contours]

The following stage used the 2D pixel coordinates and located the corresponding 3D points from the point cloud associated with the image; this was possible because both the 2D image and point cloud were of the same size. Following that, some additional filters were applied, and adjacent lines were merged to form larger segments.
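Because the image and the organized point cloud share the same dimensions, a pixel's (column, row) coordinates index directly into the cloud. The sketch below assumes the cloud is an H x W x 3 array aligned with the image, with NaN entries where the sensor returned no depth; that data layout is an assumption for illustration, not a detail from the original implementation.

```python
import numpy as np

def pixels_to_3d(contour_pixels, cloud: np.ndarray) -> np.ndarray:
    """Map 2D contour pixels into an organized point cloud.

    `cloud` is assumed to be H x W x 3 and aligned with the image, with
    NaN marking invalid depth returns (an assumption about the layout).
    """
    points = []
    for u, v in contour_pixels:       # (column, row) pixel coordinates
        p = cloud[v, u]               # same indices address the cloud
        if not np.any(np.isnan(p)):   # drop pixels with missing depth
            points.append(p)
    return np.asarray(points)
```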

In the final steps, the segments were classified as open or closed contours, and normal vectors were then estimated. The results are shown in Figures 2.a and 2.b. Additional datasets were collected under varying conditions, such as thicker lines, thinner lines, curved surfaces, and multiple images containing parts of the same closed contour. These datasets allowed the method to be refined and addressed corner cases that emerged under more challenging conditions, such as regions spanning multiple images (Figures 3.a, 3.b, 3.c).
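One simple way to sketch these final steps, as an illustration of the idea rather than the library's actual criteria: a segment can be treated as closed when its endpoints nearly coincide, and a surface normal can be estimated from neighboring cloud points as the direction of least variance (plain PCA, a common approach). The tolerance value is an assumption.

```python
import numpy as np

def is_closed(segment: np.ndarray, tol: float = 0.005) -> bool:
    """Treat a 3D polyline as closed if its endpoints nearly coincide.

    `tol` (meters) is an illustrative threshold, not a value from the
    original library.
    """
    return np.linalg.norm(segment[0] - segment[-1]) < tol

def estimate_normal(neighbors: np.ndarray) -> np.ndarray:
    """Estimate a surface normal from nearby 3D points via PCA:
    the right singular vector with the smallest singular value is the
    direction of least variance, i.e. normal to the local surface."""
    centered = neighbors - neighbors.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    return vt[-1]
```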

[Figures 2.a–2.b and 3.a–3.c: classified contours and results under more challenging conditions]

Accomplishments

This research led to the creation of an open-source C++ library for detecting hand-drawn regions in applications with a similar need for human-robot collaboration. The repository can be found here.

Furthermore, the work was part of the “Collaborative Robotic Sanding” project at the Advanced Robotics for Manufacturing (ARM) Institute, with Spirit AeroSystems as the prime investigator. The Southwest Research Institute (SwRI) developed the perception-enabled region detection. An excerpt of the demonstration video highlighting the region detection is included atop this page.

About the Author

Jorge Nicho is a Research Engineer at the Southwest Research Institute (SwRI). He earned a master’s degree in mechanical engineering from the University of Texas at San Antonio in December 2010. Nicho joined SwRI in 2011 as an engineer for the Manufacturing and Robotics Technologies Department. One of his strengths is intelligent robot motion planning. Nicho has participated in the development of several intelligent robotic systems built with ROS.

Filed Under: Cobot Arms, Finishing, Human Robot Interaction / Haptics, News, Research
