Randomly translational activation inspired by the input distributions of ReLU
Cao, Jiale [1]; Pang, Yanwei [1]; Li, Xuelong [2,3]; Liang, Jingkun [4]
Author Department: Center for OPTical IMagery Analysis and Learning (OPTIMAL)
Date Issued: 2018-01-31
Journal: NEUROCOMPUTING
ISSN: 0925-2312
Volume: 275, Pages: 859-868
Affiliation Rank: 2
Abstract

Deep convolutional neural networks (CNNs) have achieved great success on many visual tasks (e.g., image classification), and non-linear activation plays a very important role in them. It has been found that the input distribution of the non-linear activation resembles a Gaussian and that most of the inputs are concentrated near zero, which makes the learned CNN sensitive to small jitter in the activation input. Meanwhile, CNNs with deep architectures are prone to overfitting. To address these problems, we make full use of the input distributions of the non-linear activation and propose a randomly translational non-linear activation for deep CNNs. In the training stage, the non-linear activation function is randomly translated by an offset sampled from a Gaussian distribution; in the test stage, the activation with zero offset is used. With the proposed method, the input distribution of the non-linear activation becomes relatively scattered, so the learned CNN is robust to small jitter in the activation input. The method can also be seen as a regularization of the non-linear activation that reduces overfitting. Compared with the original non-linear activation, it improves classification accuracy without increasing computation cost. Experimental results on CIFAR-10/CIFAR-100, SVHN, and ImageNet demonstrate the effectiveness of the proposed method. For example, with the VGG architecture the error-rate reductions on CIFAR-10 and CIFAR-100 are 0.55% and 1.61%, respectively. Even when noise is added to the input image, the proposed method still yields much better classification accuracy on CIFAR-10/CIFAR-100. (C) 2017 Elsevier B.V. All rights reserved.
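The training/test procedure described in the abstract can be made concrete with a short sketch. The following PyTorch module is a minimal, hypothetical rendering of the idea: shift the activation by a Gaussian-sampled offset during training, and use zero offset at test time. The standard deviation sigma and the per-channel sampling granularity are assumptions for illustration, not values taken from the paper.

import torch
import torch.nn as nn

class RTReLU(nn.Module):
    # Randomly translational ReLU, a minimal sketch of the idea in the
    # abstract: y = max(0, x + a) with a ~ N(0, sigma^2) in training,
    # and the plain ReLU (a = 0) at test time.
    def __init__(self, sigma=0.05):
        super().__init__()
        # sigma is a hypothetical hyperparameter; the paper's value and
        # sampling granularity (per-channel vs. per-element) may differ.
        self.sigma = sigma

    def forward(self, x):
        if self.training:
            # One offset per channel (an assumption), broadcast over the
            # batch and spatial dimensions of a 4D (N, C, H, W) input.
            offset = self.sigma * torch.randn(1, x.size(1), 1, 1, device=x.device)
            return torch.relu(x + offset)
        # Test stage: zero offset, i.e., the ordinary ReLU.
        return torch.relu(x)

# Usage: the module behaves stochastically in train mode only.
act = RTReLU(sigma=0.05)
act.train()                          # offsets are sampled per forward pass
y_train = act(torch.randn(8, 16, 32, 32))
act.eval()                           # falls back to the ordinary ReLU
y_test = act(torch.randn(8, 16, 32, 32))

Because the test-time behavior is exactly ReLU, such a layer adds no inference cost, which is consistent with the abstract's claim that accuracy improves without increasing computation cost.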

 

Keywords: CNN; Non-linear activation; ReLU; The input distributions of ReLU; Random translation; RT-ReLU
DOI: 10.1016/j.neucom.2017.09.031
Indexed By: SCI; EI
Language: English
WOS ID: WOS:000418370200081
EI Accession Number: 20174204277681
Citation Statistics
Cited Times: 25 (WOS)
Document Type: Journal article
Identifier: http://ir.opt.ac.cn/handle/181661/30823
Collection: Spectral Imaging Technology Laboratory
Affiliations:
1.Tianjin Univ, Sch Elect & Informat Engn, Tianjin 300072, Peoples R China;
2.Chinese Acad Sci, Xian Inst Opt & Precis Mech, Ctr OPT IMagery Anal & Learning OPTIMAL, State Key Lab Transient Opt & Photon, Xian 710119, Shaanxi, Peoples R China;
3.Univ Chinese Acad Sci, Beijing 100049, Peoples R China;
4.Hainan Trop Ocean Univ, Coll Ocean Informat Engn, Sanya 572022, Peoples R China
Recommended Citation:
GB/T 7714
Cao, Jiale, Pang, Yanwei, Li, Xuelong, et al. Randomly translational activation inspired by the input distributions of ReLU[J]. NEUROCOMPUTING, 2018, 275: 859-868.
APA Cao, Jiale, Pang, Yanwei, Li, Xuelong, & Liang, Jingkun. (2018). Randomly translational activation inspired by the input distributions of ReLU. NEUROCOMPUTING, 275, 859-868.
MLA Cao, Jiale, et al. "Randomly translational activation inspired by the input distributions of ReLU". NEUROCOMPUTING 275 (2018): 859-868.
Files in This Item:
File Name/Size | Document Type | Version | Access | License
Randomly translation(853KB) | Journal article | Published version | Restricted access | CC BY-NC-SA
 

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.