SuPer: A Surgical Perception Framework for Endoscopic Tissue Manipulation With Surgical Robotics

Bibliographic Details
Published in: IEEE Robotics and Automation Letters, Vol. 5, no. 2, pp. 2293-2300
Main Authors: Li, Yang, Richter, Florian, Lu, Jingpei, Funk, Emily K., Orosco, Ryan K., Zhu, Jianke, Yip, Michael C.
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01-04-2020
Description
Summary: Traditional control and task automation have been successfully demonstrated in a variety of structured, controlled environments through the use of highly specialized modeled robotic systems in conjunction with multiple sensors. However, the application of autonomy in endoscopic surgery is very challenging, particularly in soft tissue work, due to the lack of high-quality images and the unpredictable, constantly deforming environment. In this letter, we propose a novel surgical perception framework, SuPer, for surgical robotic control. This framework continuously collects 3D geometric information that allows for mapping a deformable surgical field while tracking rigid instruments within the field. To achieve this, a model-based tracker is employed to localize the surgical tool with a kinematic prior in conjunction with a model-free tracker to reconstruct the deformable environment and provide an estimated point cloud as a mapping of the environment. The proposed framework was implemented on the da Vinci Surgical System in real time with an end-effector controller where the target configurations are set and regulated through the framework. Our proposed framework successfully completed soft tissue manipulation tasks with high accuracy. The demonstration of this novel framework is promising for the future of surgical autonomy. In addition, we provide our dataset for further surgical research.
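To make the described architecture concrete, the following is a minimal Python sketch of one perception cycle as outlined in the abstract: a model-based tracker that localizes the rigid instrument from its kinematic prior, a model-free tracker that fuses depth observations into a deforming tissue map, and a step that hands both to an end-effector controller. All class and function names (KinematicToolTracker, DeformableMapper, perception_step) and the exclusion-radius parameter are hypothetical illustrations, not the authors' released implementation.

```python
import numpy as np

class KinematicToolTracker:
    """Model-based tracker: localizes the rigid instrument from a kinematic prior (hypothetical)."""
    def __init__(self, forward_kinematics):
        self.forward_kinematics = forward_kinematics  # callable: joint angles -> 4x4 tool pose

    def update(self, joint_angles, image):
        # The kinematic prior gives an initial pose; in the full framework image features
        # would refine it. The refinement step is omitted in this sketch.
        return self.forward_kinematics(joint_angles)

class DeformableMapper:
    """Model-free tracker: fuses depth points into a deforming tissue point-cloud map (hypothetical)."""
    def __init__(self, tool_exclusion_radius=0.01):
        self.points = np.empty((0, 3))
        self.tool_exclusion_radius = tool_exclusion_radius  # metres; assumed value

    def fuse(self, depth_points, tool_pose):
        # Exclude points near the tool tip so the instrument is not fused into the tissue map,
        # then accumulate the remaining surface points.
        dists = np.linalg.norm(depth_points - tool_pose[:3, 3], axis=1)
        self.points = np.vstack([self.points, depth_points[dists > self.tool_exclusion_radius]])
        return self.points

def perception_step(tracker, mapper, joint_angles, image, depth_points):
    """One perception cycle: track the tool, update the deformable map, return both."""
    tool_pose = tracker.update(joint_angles, image)
    tissue_map = mapper.fuse(depth_points, tool_pose)
    # An end-effector controller would set and regulate target configurations against tissue_map.
    return tool_pose, tissue_map
```

In this reading of the abstract, the kinematic prior keeps instrument tracking robust when image quality is poor, while the model-free map absorbs tissue deformation; the controller then operates on the fused scene rather than on raw images.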
ISSN: 2377-3766
DOI: 10.1109/LRA.2020.2970659