RASNet: Segmentation for Tracking Surgical Instruments in Surgical Videos Using Refined Attention Segmentation Network
Zhen-Liang Ni1,2; Gui-Bin Bian1,2; Xiao-Liang Xie1; Zeng-Guang Hou1,2,3; Xiao-Hu Zhou1,2; Yan-Jie Zhou1,2
2019-07
Conference dates: July 23-27, 2019
Conference location: Berlin, Germany
DOI: 10.1109/EMBC.2019.8856495
Abstract

Segmentation plays an important role in tracking surgical instruments in robot-assisted surgery, as it captures the accurate spatial information needed for tracking. In this paper, a novel network, the Refined Attention Segmentation Network (RASNet), is proposed to simultaneously segment surgical instruments and identify their categories. A U-shaped architecture, widely used in segmentation, serves as the backbone. Unlike previous work, an attention module is adopted to help the network focus on key regions, which improves segmentation accuracy. To address the class imbalance problem, a weighted sum of the cross-entropy loss and the logarithm of the Jaccard index is used as the loss function. Furthermore, transfer learning is adopted: the encoder is pre-trained on ImageNet. The network is evaluated on the dataset from the MICCAI EndoVis Challenge 2017, on which it achieves state-of-the-art performance with 94.65% mean Dice and 90.33% mean IoU.
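The loss described in the abstract can be written as L = α·L_CE + β·(−log J), where L_CE is the cross-entropy term and J is a differentiable (soft) Jaccard index. Below is a minimal PyTorch sketch of such a combined loss; the weights alpha/beta, the epsilon smoothing, and the soft-Jaccard formulation are illustrative assumptions, not the paper's exact implementation.

# Minimal sketch of a cross-entropy + (-log soft-Jaccard) loss, as described in the abstract.
# alpha, beta, eps, and the soft-Jaccard form are assumptions for illustration only.
import torch
import torch.nn.functional as F

def soft_jaccard(probs, target_onehot, eps=1e-7):
    # Differentiable Jaccard index, averaged over classes.
    dims = (0, 2, 3)  # sum over batch and spatial dimensions, keep the class dimension
    intersection = (probs * target_onehot).sum(dims)
    union = probs.sum(dims) + target_onehot.sum(dims) - intersection
    return ((intersection + eps) / (union + eps)).mean()

def combined_loss(logits, target, alpha=1.0, beta=1.0):
    # logits: (N, C, H, W) raw scores; target: (N, H, W) integer class labels.
    num_classes = logits.shape[1]
    ce = F.cross_entropy(logits, target)  # per-pixel classification term
    probs = F.softmax(logits, dim=1)
    onehot = F.one_hot(target, num_classes).permute(0, 3, 1, 2).float()
    jac = soft_jaccard(probs, onehot)
    return alpha * ce - beta * torch.log(jac)  # weighted sum: alpha*CE + beta*(-log J)

# Usage example with random tensors (2 images, 8 instrument classes, 64x64 pixels).
if __name__ == "__main__":
    logits = torch.randn(2, 8, 64, 64, requires_grad=True)
    target = torch.randint(0, 8, (2, 64, 64))
    print(combined_loss(logits, target).item())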

Language: English
Content type: Conference paper
Source URL: http://ir.ia.ac.cn/handle/173211/48699
Collection: Institute of Automation, State Key Laboratory of Management and Control for Complex Systems, Advanced Robotics Control Team
Author affiliations:
1. The State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences
2. University of Chinese Academy of Sciences
3. CAS Center for Excellence in Brain Science and Intelligence Technology
Recommended citation (GB/T 7714):
Zhen-Liang Ni, Gui-Bin Bian, Xiao-Liang Xie, et al. RASNet: Segmentation for Tracking Surgical Instruments in Surgical Videos Using Refined Attention Segmentation Network[C]. Berlin, Germany, 2019.7.23-2019.7.27.