Learning-based Tensor Decomposition with Adaptive Rank Penalty for CNNs Compression
Deli Yu1,2; Peipei Yang1,2; Cheng-Lin Liu1,2
2021-08
Conference Date | September 8-10, 2021
Conference Venue | Tokyo, Japan (online)
Keywords | low-rank decomposition; network compression; learning-based decomposition; adaptive rank penalty
English Abstract | Low-rank tensor decomposition is a widely used strategy for compressing convolutional neural networks (CNNs). Existing learning-based decomposition methods encourage low-rank filter weights during training via regularizers based on filters' pair-wise forces or the nuclear norm. However, these methods cannot obtain a satisfactory low-rank structure. We propose a new method with an adaptive rank penalty to learn more compact CNNs. Specifically, we transform the rank constraint into a differentiable one and impose an adaptive violation-aware penalty on the filters. Moreover, this paper is the first work to integrate learning-based decomposition with group decomposition to achieve a better trade-off, especially for the difficult task of compressing 1x1 convolutions. The obtained low-rank model can be easily decomposed while nearly retaining the full accuracy, without an additional fine-tuning process. The effectiveness is verified by compression experiments with VGG and ResNet on CIFAR-10 and ILSVRC-2012. Our method reduces about 65% of the parameters of ResNet-110 with a 0.04% Top-1 accuracy drop on CIFAR-10, and about 60% of the parameters of ResNet-50 with a 0.57% Top-1 accuracy drop on ILSVRC-2012.
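The abstract describes the mechanism only at a high level. As a rough illustration of the two ingredients, here is a minimal PyTorch sketch, assuming a violation-weighted penalty on the tail singular values of each reshaped filter and a truncated-SVD split of a 1x1 convolution; the function names (`rank_penalty`, `decompose_conv1x1`), the `target_rank` parameter, and the exact weighting scheme are illustrative assumptions, not the paper's formulation, and the group-decomposition component is omitted.

```python
# Illustrative sketch only -- NOT the paper's exact method. It shows (a) a
# differentiable penalty on the singular values beyond a target rank, scaled
# by how strongly the rank constraint is currently violated, and (b) the
# post-training split of a low-rank 1x1 convolution via truncated SVD.
import torch
import torch.nn as nn

def rank_penalty(weight: torch.Tensor, target_rank: int) -> torch.Tensor:
    """Penalty on the tail singular values of a filter (C_out, C_in, kH, kW)."""
    mat = weight.reshape(weight.shape[0], -1)      # flatten to a 2-D matrix
    s = torch.linalg.svdvals(mat)                  # differentiable singular values
    tail = s[target_rank:]                         # values violating the rank constraint
    # Adaptive weighting (assumption): relative violation in [0, 1], detached
    # so it scales the penalty strength without being optimized itself.
    violation = (tail.sum() / s.sum().clamp_min(1e-12)).detach()
    return violation * tail.pow(2).sum()

def decompose_conv1x1(conv: nn.Conv2d, rank: int) -> nn.Sequential:
    """Split a trained low-rank 1x1 conv into two smaller 1x1 convs."""
    w = conv.weight.data.reshape(conv.out_channels, conv.in_channels)
    u, s, vh = torch.linalg.svd(w, full_matrices=False)   # w ~ u @ diag(s) @ vh
    first = nn.Conv2d(conv.in_channels, rank, kernel_size=1, bias=False)
    second = nn.Conv2d(rank, conv.out_channels, kernel_size=1,
                       bias=conv.bias is not None)
    first.weight.data.copy_(vh[:rank].reshape(rank, conv.in_channels, 1, 1))
    second.weight.data.copy_(
        (u[:, :rank] * s[:rank]).reshape(conv.out_channels, rank, 1, 1))
    if conv.bias is not None:
        second.bias.data.copy_(conv.bias.data)
    return nn.Sequential(first, second)

# Usage: add the penalty to the task loss during training, then decompose.
# loss = task_loss + lam * sum(rank_penalty(m.weight, r) for m in conv_layers)
# compact = decompose_conv1x1(trained_conv, rank=r)
```

For a 1x1 convolution with C input and C output channels, the split stores 2*C*r weights instead of C*C, a saving whenever r < C/2, which is how a learned low-rank structure translates directly into parameter reduction.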
Language | English
Content Type | Conference Paper
Source URL | [http://ir.ia.ac.cn/handle/173211/45025]
Collection | Institute of Automation, National Laboratory of Pattern Recognition, Pattern Analysis and Learning Group
Corresponding Author | Cheng-Lin Liu
Affiliations | 1. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences; 2. School of Artificial Intelligence, University of Chinese Academy of Sciences
Recommended Citation (GB/T 7714) | Deli Yu, Peipei Yang, Cheng-Lin Liu. Learning-based Tensor Decomposition with Adaptive Rank Penalty for CNNs Compression[C]. Tokyo, Japan (online), September 8-10, 2021.