ARTIFICIAL INTELLIGENCE (AI) SOFT ROBOTS
Soft robots that can sense touch, pressure, movement and temperature.
Researchers at Harvard University have built soft robots inspired by nature that can crawl, swim, grasp delicate objects and even assist a beating heart, but none of these devices has been able to sense and respond to the world around them.
That's about to change.
Inspired by our bodies' sensory capabilities, researchers at the Wyss Institute for Biologically Inspired Engineering and the Harvard John A. Paulson School of Engineering and Applied Sciences have developed a platform for creating soft robots with embedded sensors that can sense movement, pressure, touch, and even temperature.
“Soft robotics are typically limited by conventional molding techniques that constrain geometry choices, or, in the case of commercial 3D printing, material selection that hampers design choices,” said Robert Wood, Ph.D., Core Faculty Member of the Wyss Institute and the Charles River Professor of Engineering and Applied Sciences at SEAS, and co-author of the paper. “The techniques developed in the Lewis Lab have the opportunity to revolutionize how robots are created — moving away from sequential processes and creating complex and monolithic robots with embedded sensors and actuators.”
To test the sensors, the team printed a soft robotic gripper composed of three soft fingers, or actuators. The researchers tested the gripper's ability to sense inflation pressure, curvature, contact, and temperature. They embedded multiple contact sensors so the gripper could distinguish between light and deep touches.
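To illustrate how a controller might use such embedded contact sensors, here is a minimal Python sketch. The sensor resistances and thresholds are illustrative assumptions, not values from the published work; a real soft sensor would also need calibration and signal filtering.

# Hypothetical sketch: distinguishing light from deep touches by the
# fractional change in an embedded sensor's electrical resistance.
# All names and threshold values here are assumptions for illustration.

LIGHT_TOUCH_THRESHOLD = 0.02  # assumed fractional resistance change
DEEP_TOUCH_THRESHOLD = 0.10   # assumed fractional resistance change

def classify_touch(baseline_ohms: float, reading_ohms: float) -> str:
    """Classify a contact event from one sensor's resistance change."""
    delta = abs(reading_ohms - baseline_ohms) / baseline_ohms
    if delta >= DEEP_TOUCH_THRESHOLD:
        return "deep touch"
    if delta >= LIGHT_TOUCH_THRESHOLD:
        return "light touch"
    return "no contact"

print(classify_touch(1000.0, 1030.0))  # a small change: "light touch"
print(classify_touch(1000.0, 1150.0))  # a large change: "deep touch"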
Next, the researchers hope to harness the power of machine learning to train these devices to grasp objects of varying size, shape, surface texture, and temperature.
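As a rough sketch of what that machine-learning step could look like, the following Python example trains an off-the-shelf classifier to map one grasp's sensor readings to an object category. The features, labels, and data are synthetic placeholders; the article does not describe the team's actual training setup.

# Hypothetical sketch: classifying grasped objects from embedded-sensor
# readings (inflation pressure, curvature, contact, temperature).
# The dataset below is random placeholder data, not experimental results.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Each row is one grasp: [pressure, curvature, contact, temperature].
X = rng.normal(size=(200, 4))
y = rng.integers(0, 3, size=200)  # placeholder labels for 3 object types

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# At run time, a new grasp's readings would be classified the same way.
new_grasp = rng.normal(size=(1, 4))
print(model.predict(new_grasp))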
The research was co-authored by Abigail Grosskopf, Daniel Vogt, and Sebastien Uzel. It was supported in part by the Harvard MRSEC and Harvard's Wyss Institute for Biologically Inspired Engineering.