Rectified Exponential Units for Convolutional Neural Networks
Ying, Yao1; Su, Jianlin2; Shan, Peng1; Miao, Ligang3; Wang, Xiaolian4; Peng, Silong1,4
Journal: IEEE ACCESS
Year: 2019
Volume: 7, Pages: 101633-101640
Keywords: Activation function; convolutional neural network; rectified exponential unit; parametric rectified exponential unit
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2928442
Corresponding author: Shan, Peng (peng.shan@neuq.edu.cn)
Abstract: The rectified linear unit (ReLU) plays an important role in today's convolutional neural networks (CNNs). In this paper, we propose a novel activation function called the Rectified Exponential Unit (REU). Inspired by two recently proposed activation functions, the Exponential Linear Unit (ELU) and Swish, the REU is designed to combine the advantages of a flexible exponent and a multiplicative function form. Moreover, we propose the Parametric REU (PREU) to increase the expressive power of the REU. Experiments with three classical CNN architectures (LeNet-5, Network in Network, and the Residual Network, ResNet) on benchmarks of various scales, including Fashion-MNIST, CIFAR-10, CIFAR-100, and Tiny ImageNet, demonstrate that the REU and PREU achieve improvements over other activation functions. Our results show that the REU yields relative error improvements over ReLU of 7.74% and 6.08% on CIFAR-10 and CIFAR-100 with ResNet, while the improvements of the PREU are 9.24% and 9.32%. Finally, we use different PREU variants in the residual unit to achieve more stable results.
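The abstract describes the REU as pairing ReLU's identity on positive inputs with an exponential, multiplicative form elsewhere, and the PREU as adding a learnable parameter for extra expressive power. A minimal NumPy sketch of such activations is below; the exact piecewise definitions and the placement of the parameter `alpha` are illustrative assumptions, not taken verbatim from the paper.

```python
import numpy as np

def reu(x):
    """REU-style activation (assumed form): identity for x > 0,
    x * exp(x) for x <= 0, so negative inputs decay smoothly toward 0."""
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, x * np.exp(np.minimum(x, 0.0)))

def preu(x, alpha=1.0):
    """Parametric variant (assumed form): a learnable alpha scales the
    negative branch and its exponent, giving the unit extra flexibility."""
    x = np.asarray(x, dtype=float)
    neg = alpha * x * np.exp(np.minimum(alpha * x, 0.0))
    return np.where(x > 0, x, neg)
```

In a deep-learning framework, `alpha` would be registered as a trainable per-channel parameter and updated by backpropagation, analogous to PReLU's slope parameter.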
Funding projects: National Natural Science Foundation of China [61601104]; Natural Science Foundation of Hebei Province [F2017501052]
WOS research areas: Computer Science; Engineering; Telecommunications
Language: English
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
WOS accession number: WOS:000481688500091
Funding organizations: National Natural Science Foundation of China; Natural Science Foundation of Hebei Province
Content type: Journal article
Source URL: [http://ir.ia.ac.cn/handle/173211/27574]
Collection: Institute of Automation, Chinese Academy of Sciences
Author affiliations:
1.Northeastern Univ, Coll Informat Sci & Engn, Shenyang 110819, Liaoning, Peoples R China
2.Sun Yat Sen Univ, Sch Math, Guangzhou 510220, Guangdong, Peoples R China
3.Northeastern Univ, Sch Comp & Commun Engn, Shenyang 110819, Liaoning, Peoples R China
4.Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
Recommended citation formats:
GB/T 7714: Ying, Yao, Su, Jianlin, Shan, Peng, et al. Rectified Exponential Units for Convolutional Neural Networks[J]. IEEE ACCESS, 2019, 7: 101633-101640.
APA: Ying, Yao, Su, Jianlin, Shan, Peng, Miao, Ligang, Wang, Xiaolian, & Peng, Silong. (2019). Rectified Exponential Units for Convolutional Neural Networks. IEEE ACCESS, 7, 101633-101640.
MLA: Ying, Yao, et al. "Rectified Exponential Units for Convolutional Neural Networks." IEEE ACCESS 7 (2019): 101633-101640.