Vision-based estimation of manipulation forces by deep learning of laparoscopic surgical images obtained in a porcine excised kidney experiment
Published in: Scientific Reports, Vol. 14, No. 1, p. 9686
Main Authors:
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 27-04-2024 (Nature Publishing Group; Nature Portfolio)
Summary: In robot-assisted surgery, where haptic feedback is absent, surgeons nonetheless experience haptics-like sensations known as “pseudo-haptic feedback”. As surgeons who routinely perform robot-assisted laparoscopic surgery, we asked whether these “pseudo-haptics” could be made explicit to surgeons. We therefore created a simulation model that estimates manipulation forces from surgical visual images alone. This study aimed to achieve vision-based estimation of the magnitude of forces during forceps manipulation of organs. We also attempted to detect over-force, i.e., force exceeding the threshold of safe manipulation. We built sensorized forceps that precisely measure force at the tips along three axes. Using an endoscopic system employed in actual surgery, we recorded images of the manipulation of excised pig kidneys together with synchronized force data. A force-estimation model was then trained by deep learning. Over-force was detected effectively when the visual input was restricted to a region of interest around the forceps tips. In this paper, we emphasize the importance of limiting the region of interest in vision-based force-estimation tasks.
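The summary outlines a pipeline of cropping a region of interest around the forceps tips, regressing force magnitude with a deep network, and flagging over-force against a safety threshold. Below is a minimal sketch of that idea, assuming a PyTorch ResNet-18 regressor and a hypothetical threshold OVER_FORCE_N; the record does not specify the authors' architecture, ROI size, or threshold, so all of these names and values are illustrative assumptions, not the paper's implementation.

```python
# Sketch of ROI-restricted, vision-based force estimation (NOT the authors' code).
# Assumptions (hypothetical): a ResNet-18 backbone regressing a single force
# magnitude from an ROI crop around the forceps tips, and an arbitrary
# over-force threshold OVER_FORCE_N.

import torch
import torch.nn as nn
import torchvision.models as models

OVER_FORCE_N = 1.0  # hypothetical safety threshold (newtons); not from the paper


class ROIForceEstimator(nn.Module):
    """CNN that regresses a force magnitude from an ROI crop of an endoscopic frame."""

    def __init__(self):
        super().__init__()
        self.backbone = models.resnet18(weights=None)
        # Replace the 1000-class head with a single regression output.
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, roi):
        # roi: (B, 3, H, W) crop around the forceps tips -> (B,) force estimate
        return self.backbone(roi).squeeze(-1)


def crop_roi(frame, tip_xy, size=224):
    """Crop a square ROI centered on the (detected) forceps-tip pixel position."""
    _, _, h, w = frame.shape
    x, y = tip_xy
    x0 = max(0, min(w - size, x - size // 2))
    y0 = max(0, min(h - size, y - size // 2))
    return frame[:, :, y0:y0 + size, x0:x0 + size]


model = ROIForceEstimator().eval()
frame = torch.rand(1, 3, 1080, 1920)      # stand-in for an endoscopic frame
roi = crop_roi(frame, tip_xy=(960, 540))  # restrict input to the tip region
with torch.no_grad():
    force = model(roi)                    # estimated force magnitude
over_force = force.item() > OVER_FORCE_N  # flag manipulation exceeding the threshold
print(force.item(), over_force)
```

The key design point mirrored from the summary is the cropping step: feeding the network only the region around the forceps tips, rather than the full frame, is what the authors report as decisive for effective over-force detection.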
ISSN: 2045-2322
DOI: 10.1038/s41598-024-60574-w