Fast Organization of Objects’ Spatial Positions in Manipulator Space from Single RGB-D Camera
Yangchang Sun2,3; Minghao Yang2,3; Jialing Li1; Baohua Qiang1; Jinlong Chen1; Qingyu Jia1
2021-12-08
Conference Date: 2021-12-08
Conference Venue: Bali, Indonesia
Keywords: 3D Reconstruction; Real-Time Reconstruction; Robot Grasping
Abstract

For grasp tasks in a physical environment, it is important for the manipulator to know the objects' spatial positions in real time with as few sensors as possible. This work proposes an effective framework to organize the objects' spatial positions in the manipulator's 3D workspace robustly and quickly with a single RGB-D camera. It mainly contains two steps: (1) a 3D reconstruction strategy for object contours obtained from the environment; (2) a distance-restricted outlier point elimination strategy to reduce reconstruction errors caused by sensor noise. The first step ensures fast object extraction and 3D reconstruction from the scene image, and the second step contributes to more accurate reconstructions by eliminating outlier points from the initial result obtained in the first step. We validated the proposed method in a physical system containing a Kinect 2.0 RGB-D camera and a Mico2 robot. Experiments show that the proposed method runs in quasi real time on a common PC and outperforms traditional 3D reconstruction methods.
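As a rough illustration of the two steps described above, the following Python sketch back-projects depth pixels inside an object contour into the manipulator frame and then drops points that lie far from the cloud centroid. The camera intrinsics, the camera-to-robot transform, and the centroid-distance criterion are illustrative assumptions, not the calibration values or the exact distance restriction used in the paper.

import numpy as np

# Hypothetical intrinsics (fx, fy, cx, cy) and camera-to-manipulator transform;
# real values come from camera calibration and hand-eye calibration.
FX, FY, CX, CY = 365.0, 365.0, 256.0, 212.0
T_CAM_TO_ROBOT = np.eye(4)

def backproject(depth_m, mask):
    """Back-project masked depth pixels (in metres) into the robot frame."""
    v, u = np.nonzero(mask)                 # pixel rows/cols inside the object contour
    z = depth_m[v, u]
    valid = z > 0                           # drop missing depth readings
    u, v, z = u[valid], v[valid], z[valid]
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=1)   # homogeneous (N, 4)
    return (T_CAM_TO_ROBOT @ pts_cam.T).T[:, :3]             # (N, 3) in robot frame

def remove_outliers(points, max_dev=3.0):
    """Keep points whose distance to the centroid stays within max_dev standard
    deviations -- one simple distance restriction; the paper's criterion may differ."""
    d = np.linalg.norm(points - points.mean(axis=0), axis=1)
    keep = d < d.mean() + max_dev * d.std()
    return points[keep]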

Language: Chinese
Content Type: Conference Paper
Source URL: http://ir.ia.ac.cn/handle/173211/52204
Collection: Research Center for Brain-inspired Intelligence, Microscopic Reconstruction and Intelligent Analysis
Corresponding Author: Minghao Yang
Author Affiliations:
1. School of Computer Science and Information Security, Guilin University of Electronic Technology
2. Research Center for Brain-inspired Intelligence (BII), Institute of Automation, Chinese Academy of Sciences
3. School of Artificial Intelligence, University of Chinese Academy of Sciences
Recommended Citation
GB/T 7714
Yangchang Sun, Minghao Yang, Jialing Li, et al. Fast Organization of Objects' Spatial Positions in Manipulator Space from Single RGB-D Camera[C]. Bali, Indonesia, 2021-12-08.