A novel Contrast Co-learning framework for generating high quality training data
Zheng, Zeyu ; Yan, Jun ; Yan, Shuicheng ; Liu, Ning ; Chen, Zheng ; Zhang, Ming
2010
Abstract: The good performance of most classical learning algorithms rests on high-quality training data that are clean and unbiased. Such data are, however, becoming increasingly hard to obtain in many real-world problems, owing to the difficulty of collecting large-scale unbiased data and labeling them precisely for training. In this paper, we propose a general Contrast Co-learning (CCL) framework to refine biased and noisy training data when an unbiased yet unlabeled data pool is available. CCL starts with multiple sets of possibly biased and noisy training data and trains a set of classifiers individually. Then, under the assumption that confidently classified samples have a higher probability of being correctly classified, CCL iteratively and automatically filters out likely data noise and adds confidently classified samples from the unlabeled pool to correct the bias. Through this process, we can generate a cleaner, unbiased training dataset with theoretical guarantees. Extensive experiments on two public text datasets clearly show that CCL consistently improves classification performance on biased and noisy training data compared with several state-of-the-art classical algorithms. © 2010 IEEE.
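The refinement loop the abstract describes can be sketched roughly as follows. This is a toy illustration, not the paper's implementation: the nearest-centroid classifier, the synthetic two-blob data, and the confidence thresholds (0.2 for noise filtering, 0.9 for pool admission) are all assumptions made for the sketch.

```python
# Hypothetical sketch of the Contrast Co-learning (CCL) loop: train one
# classifier per training set, then let each set be cleaned by the OTHER
# set's classifier (the "contrast"), and grow it from the unlabeled pool.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, flip=0.0):
    """Two Gaussian blobs around (-2,-2) and (2,2); flip a fraction of labels as noise."""
    X = np.vstack([rng.normal(-2, 1, (n, 2)), rng.normal(2, 1, (n, 2))])
    y = np.array([0] * n + [1] * n)
    noisy = rng.random(2 * n) < flip
    return X, np.where(noisy, 1 - y, y)

class CentroidClassifier:
    """Nearest-centroid classifier with a softmax-of-negative-distance confidence."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self
    def predict_proba(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        e = np.exp(-d)
        return e / e.sum(axis=1, keepdims=True)
    def predict(self, X):
        return self.classes_[self.predict_proba(X).argmax(axis=1)]

# Two noisy training sets (20% flipped labels) and an unbiased unlabeled pool.
views = [make_data(100, flip=0.2), make_data(100, flip=0.2)]
X_pool, _ = make_data(200)          # pool labels are unknown to the algorithm

for _ in range(3):                  # a few CCL refinement rounds
    clfs = [CentroidClassifier().fit(X, y) for X, y in views]
    new_views = []
    for i, (X, y) in enumerate(views):
        other = clfs[1 - i]         # contrast classifier from the other view
        # 1) Filter likely label noise: drop samples whose given label the
        #    contrast classifier assigns very low probability.
        label_proba = other.predict_proba(X)[np.arange(len(y)), y]
        keep = label_proba > 0.2
        # 2) Add confidently classified samples from the unlabeled pool,
        #    labeled by the contrast classifier, to correct the bias.
        pool_proba = other.predict_proba(X_pool)
        confident = pool_proba.max(axis=1) > 0.9
        X_new = np.vstack([X[keep], X_pool[confident]])
        y_new = np.concatenate([y[keep], pool_proba[confident].argmax(axis=1)])
        new_views.append((X_new, y_new))
    views = new_views
```

On this toy data, a classifier refit on a refined view recovers the true decision boundary despite the injected label noise; the paper's contribution is showing such iterative contrast-based filtering comes with theoretical guarantees, which this sketch does not attempt to reproduce.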
Language: English
DOI: 10.1109/ICDM.2010.23
Content type: Other
Source URL: [http://ir.pku.edu.cn/handle/20.500.11897/329587]
Collection: 信息科学技术学院 (School of Electronics Engineering and Computer Science)
Recommended citation (GB/T 7714):
Zheng, Zeyu,Yan, Jun,Yan, Shuicheng,et al. A novel Contrast Co-learning framework for generating high quality training data. 2010-01-01.