Disentangled Self-Attentive Neural Networks for Click-Through Rate Prediction
Xu, Yichen [2]; Zhu, Yanqiao [3,4]; Yu, Feng [1]; Liu, Qiang [3,4]; Wu, Shu [3,4,5]
2021-12
Conference date: 2021-12
Conference location: Online
Pages: 3263-3267
Abstract

Recently, Deep Neural Networks (DNNs) have made remarkable progress in text classification, yet they still require a large amount of labeled data. To train high-performing models at minimal annotation cost, active learning selects and labels the most informative samples; however, measuring the informativeness of samples for DNNs remains challenging. In this paper, inspired by the piece-wise linear interpretability of DNNs, we propose a novel Active Learning with DivErse iNterpretations (ALDEN) approach. Using local interpretations in DNNs, ALDEN identifies linearly separable regions of samples. It then selects samples according to the diversity of their local interpretations and queries their labels. To handle text classification, we choose the word with the most diverse interpretations to represent the whole sentence. Extensive experiments demonstrate that ALDEN consistently outperforms several state-of-the-art deep active learning methods.
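The selection step described above — score each unlabeled sentence by its most "diverse" word, i.e. the word whose local interpretation differs most from interpretations already in the labeled pool, then query the top-scoring sentences — can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: it assumes local interpretations are given as per-word vectors (e.g. gradients with respect to word embeddings) and uses cosine distance as the diversity measure; the function and variable names are invented for this sketch.

```python
import numpy as np

def select_diverse_samples(unlabeled_interps, labeled_interps, k):
    """Hypothetical sketch of ALDEN-style diversity-based selection.

    unlabeled_interps: list of (num_words, dim) arrays, one per sentence,
                       holding a local interpretation vector per word.
    labeled_interps:   (num_labeled, dim) array of interpretation vectors
                       for samples already in the labeled pool.
    Returns indices of the k sentences whose most-diverse word lies
    farthest (in cosine distance) from the labeled pool.
    """
    def cos_dist(a, B):
        # cosine distance from one word's interpretation to every
        # labeled interpretation
        a = a / (np.linalg.norm(a) + 1e-12)
        B = B / (np.linalg.norm(B, axis=1, keepdims=True) + 1e-12)
        return 1.0 - B @ a

    scores = []
    for words in unlabeled_interps:
        # a word's score: distance to its *nearest* labeled interpretation;
        # the sentence is represented by its most diverse word (max score)
        word_scores = [cos_dist(w, labeled_interps).min() for w in words]
        scores.append(max(word_scores))
    # query the k sentences with the highest diversity scores
    return np.argsort(scores)[::-1][:k]

# toy data: 20 unlabeled sentences of 5 words, 10 labeled interpretations
rng = np.random.default_rng(0)
unlabeled = [rng.normal(size=(5, 8)) for _ in range(20)]
labeled = rng.normal(size=(10, 8))
picked = select_diverse_samples(unlabeled, labeled, k=3)
print(picked)
```

In practice the interpretation vectors would come from the network itself (the paper leverages the piece-wise linear structure of DNNs), and the newly labeled samples' interpretations would be appended to `labeled_interps` before the next query round.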

Proceedings publisher: ACM Press
Content type: Conference paper
Source URL: http://ir.ia.ac.cn/handle/173211/48468
Collection: Institute of Automation — Center for Research on Intelligent Perception and Computing
Corresponding author: Wu, Shu
Author affiliations:
1. Alibaba Group
2. School of Computer Science, Beijing University of Posts and Telecommunications
3. Center for Research on Intelligent Perception and Computing, Institute of Automation, Chinese Academy of Sciences
4. School of Artificial Intelligence, University of Chinese Academy of Sciences
5. Artificial Intelligence Research, Chinese Academy of Sciences
Recommended citation (GB/T 7714):
Xu, Yichen, Zhu, Yanqiao, Yu, Feng, et al. Disentangled Self-Attentive Neural Networks for Click-Through Rate Prediction[C]. Online, 2021-12.