You’ve probably already seen, or even participated in, the Bottle Cap Challenge. It’s the viral challenge of the week where people try to unscrew a bottle cap with a 360-degree kick.
MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) didn’t want to be left out, so it got in on the fun with its RoboRaise robot, which can mirror a user’s motions and follow non-verbal commands by monitoring arm muscles.
There is a slight twist, however. As you can see in the video below, RoboRaise, which is a Baxter robot from Rethink Robotics, doesn’t have feet and can’t perform a 360-degree kick. RoboRaise takes its cues from MIT’s Joseph DelPreto, who is wearing small sensors on his right arm. Following his lead, RoboRaise uses its soft gripper to unscrew the bottle cap from the stand, successfully completing the Bottle Cap Challenge.
DelPreto said he could envision RoboRaise being used in manufacturing and construction settings, or even as an assistant around the house. Here is a little bit about how RoboRaise works. For more in-depth details, check out this MIT News article.
“The project builds off [an] existing system that allows users to instantly correct robot mistakes with brainwaves and hand gestures, now enabling continuous motion in a more collaborative way. ‘We aim to develop human-robot interaction where the robot adapts to the human, rather than the other way around. This way the robot becomes an intelligent tool for physical work,’ says MIT Professor and CSAIL Director Daniela Rus.
“EMG signals can be tricky to work with: They’re often very noisy, and it can be difficult to predict exactly how a limb is moving based on muscle activity. Even if you can estimate how a person is moving, how you want the robot itself to respond may be unclear.
“RoboRaise gets around this by putting the human in control. The team’s system uses noninvasive, on-body sensors that detect the firing of neurons as you tense or relax muscles. Using wearables also gets around problems of occlusions or ambient noise, which can complicate tasks involving vision or speech.
“RoboRaise’s algorithm then processes biceps activity to estimate how the person’s arm is moving so the robot can roughly mimic it, and the person can slightly tense or relax their arm to move the robot up or down. If a user needs the robot to move farther away from their own position or hold a pose for a while, they can just gesture up or down for finer control; a neural network detects these gestures at any time based on biceps and triceps activity.
“A new user can start using the system very quickly, with minimal calibration. After putting on the sensors, they just need to tense and relax their arm a few times, then lift a light weight to a few heights. The neural network that detects gestures is trained only on data from previous users.”
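The quoted description maps naturally onto a simple signal-processing pipeline: rectify and smooth the raw EMG into an activation envelope, normalize it against the user's relaxed and tensed levels from calibration, and use the result as a proportional setpoint for the gripper height. The sketch below is only an illustration of that idea, not MIT's actual code; the sampling rate, filter settings, height range, and function names are all assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # assumed EMG sampling rate in Hz

def emg_envelope(raw_emg, fs=FS, cutoff_hz=3.0):
    """Rectify the raw EMG and low-pass filter it into a smooth activation envelope."""
    rectified = np.abs(raw_emg - np.mean(raw_emg))   # remove DC offset, then rectify
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")
    return filtfilt(b, a, rectified)

def activation_to_height(envelope, relaxed_level, tensed_level,
                         z_min=0.10, z_max=0.60):
    """Map the current biceps activation onto a target end-effector height (meters).

    relaxed_level / tensed_level come from a short calibration where the user
    relaxes and then tenses the arm; the z_min..z_max range is arbitrary here.
    """
    level = envelope[-1]  # most recent envelope sample
    norm = (level - relaxed_level) / max(tensed_level - relaxed_level, 1e-6)
    norm = float(np.clip(norm, 0.0, 1.0))
    return z_min + norm * (z_max - z_min)

# Example: a simulated noisy biceps signal standing in for 2 seconds of EMG.
rng = np.random.default_rng(0)
raw = 0.5 * rng.standard_normal(2000)
env = emg_envelope(raw)
print(activation_to_height(env, relaxed_level=0.05, tensed_level=0.8))
```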
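The up/down gestures, meanwhile, are described as being picked out by a neural network trained only on previous users' biceps and triceps activity. A toy version of such a classifier might look like the following; the two-feature representation, label scheme, network size, and synthetic training data are assumptions for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Each sample: [biceps_envelope_mean, triceps_envelope_mean] over a short window.
# Labels: 0 = no gesture, 1 = gesture up, 2 = gesture down (label scheme assumed).
rng = np.random.default_rng(1)

def fake_session(n=200):
    """Stand-in for gesture data recorded from a previous user."""
    X, y = [], []
    for _ in range(n):
        label = rng.integers(0, 3)
        if label == 1:      # "up": biceps high, triceps low
            x = [rng.uniform(0.6, 1.0), rng.uniform(0.0, 0.3)]
        elif label == 2:    # "down": triceps high, biceps low
            x = [rng.uniform(0.0, 0.3), rng.uniform(0.6, 1.0)]
        else:               # neutral: both muscles fairly quiet
            x = [rng.uniform(0.0, 0.4), rng.uniform(0.0, 0.4)]
        X.append(x)
        y.append(label)
    return np.array(X), np.array(y)

# Train only on earlier users' sessions, as the article describes.
X_train, y_train = fake_session(600)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

# A new user's window: strong biceps, quiet triceps -> should classify as "up".
print(clf.predict([[0.85, 0.1]]))
```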
Will other robots get in on the Bottle Cap Challenge? If so, they’d better hurry, as it won’t be long before the internet moves on to the next viral challenge.