
Dorsa Sadigh and her team hope to make robot-assisted feeding possible
Stanford researchers improve the skewering, scooping, and bite-transfer steps of the feeding process.
Professor Dorsa Sadigh and her research team have developed several novel robotic algorithms for autonomously and comfortably accomplishing each step of the feeding process across a variety of food types. One algorithm combines computer vision and haptics to determine the angle and speed at which to insert a fork into a food item; another uses a second robotic arm to push food onto a spoon; and a third delivers food into a person’s mouth in a way that feels natural and comfortable.
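The linked publications describe the team's full learned policies; as a rough illustration only, the sketch below shows one way a vision-based hardness estimate and a brief haptic probe might be fused to pick a skewering angle and insertion speed. Every function name, weight, and threshold here is a hypothetical placeholder, not the team's actual method.

```python
# Hypothetical sketch (not the authors' implementation): fuse a coarse
# vision-based hardness estimate with a brief haptic probe reading to
# choose a fork pitch and insertion speed. All values are illustrative.
from dataclasses import dataclass


@dataclass
class SkewerAction:
    pitch_deg: float   # fork pitch relative to vertical
    speed_mm_s: float  # insertion speed


def choose_skewer_action(visual_hardness: float, probe_force_n: float) -> SkewerAction:
    """Blend a visual prior (0 = soft, 1 = hard) with a measured probe force."""
    # Weighted fusion: trust the haptic probe more once contact is made.
    hardness = 0.4 * visual_hardness + 0.6 * min(probe_force_n / 5.0, 1.0)

    if hardness < 0.3:
        # Soft items (e.g., a banana slice): shallow angle, slow insertion
        # so the food is not crushed or pushed off the plate.
        return SkewerAction(pitch_deg=20.0, speed_mm_s=10.0)
    if hardness < 0.7:
        # Medium items: near-vertical skewer at moderate speed.
        return SkewerAction(pitch_deg=0.0, speed_mm_s=30.0)
    # Hard or slippery items (e.g., raw carrot): vertical, fast insertion
    # so the tines penetrate before the item slides away.
    return SkewerAction(pitch_deg=0.0, speed_mm_s=80.0)


if __name__ == "__main__":
    print(choose_skewer_action(visual_hardness=0.2, probe_force_n=0.5))
    print(choose_skewer_action(visual_hardness=0.8, probe_force_n=4.0))
```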
“The hope is that by making progress in this domain, people who rely on caregiver assistance can eventually have a more independent lifestyle,” says Priya Sundaresan, a CS PhD candidate.
Read their publications on skewering, scooping, and bite transfer:
- Learning Visuo-Haptic Skewering Strategies for Robot-Assisted Feeding
- Learning Bimanual Scooping Policies for Food Acquisition
- In-Mouth Robotic Bite Transfer with Visual and Haptic Sensing
Learn more about Dorsa and her research lab, Stanford ILIAD – Intelligent and Interactive Autonomous Systems Group.
Excerpted from Stanford HAI, "Building a Precise Assistive-Feeding Robot That Can Handle Any Meal"