A CGRA based Neural Network Inference Engine for Deep Reinforcement Learning | |
Minglan Liang; Mingsong Chen; Zheng Wang | |
2018 | |
Conference Date | 2018
Conference Location | Chengdu, China
Abstract | The recent ultra-fast development of artificial intelligence algorithms has created demand for dedicated neural network accelerators, whose high computing performance and low power consumption enable the deployment of deep learning algorithms on edge computing nodes. State-of-the-art deep learning engines mostly support supervised learning models such as CNNs and RNNs, whereas very few AI engines support on-chip reinforcement learning, which is the foremost algorithm kernel for the decision-making subsystem of an autonomous system. In this work, a Coarse-grained Reconfigurable Array (CGRA)-like AI computing engine has been designed for the deployment of both supervised and reinforcement learning. Logic synthesis at a design frequency of 200 MHz in 65 nm CMOS technology yields a silicon area of 0.32 mm² and a power consumption of 15.45 mW for the proposed engine. The proposed on-chip AI engine facilitates the implementation of end-to-end perceptual and decision-making networks, and can find wide employment in autonomous driving, robotics, and UAVs. |
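The paper's record does not detail which reinforcement learning kernel the engine executes; as an illustrative sketch only (names and parameters are assumptions, not from the paper), the following shows a minimal tabular Q-learning update, the kind of decision-making kernel an on-chip RL engine would accelerate:

```python
import numpy as np

def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.9):
    """One Q-learning step: Q[s, a] += alpha * (TD target - Q[s, a]).

    Illustrative only; the actual on-chip kernel is not specified here.
    """
    # TD target: immediate reward plus discounted best next-state value
    td_target = r + gamma * np.max(Q[s_next])
    Q[s, a] += alpha * (td_target - Q[s, a])
    return Q

# Tiny 2-state, 2-action example
Q = np.zeros((2, 2))
Q = q_update(Q, s=0, a=1, r=1.0, s_next=1)
print(Q[0, 1])  # 0.1 * (1.0 + 0.9 * 0.0) = 0.1
```

In a deep RL variant, the table `Q` is replaced by a neural network, so the same multiply-accumulate datapaths used for supervised inference can serve the decision-making workload.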
Content Type | Conference Paper
Source URL | http://ir.siat.ac.cn:8080/handle/172644/13736
Collection | Shenzhen Institutes of Advanced Technology, Institute of Integrated Technology
Recommended Citation (GB/T 7714) | Minglan Liang, Mingsong Chen, Zheng Wang. A CGRA based Neural Network Inference Engine for Deep Reinforcement Learning[C]. Chengdu, China, 2018.
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.