Norm-based Noisy Corpora Filtering and Refurbishing in Neural Machine Translation
Yu, Lu 1,2; Jiajun, Zhang 1,2
2022-12
Conference date | 2022-12
Conference venue | Online
Keywords | Neural machine translation
English abstract | Recent advances in neural machine translation depend on massive parallel corpora, which are often collected from open sources with little guarantee of quality. This stresses the need for noisy corpora filtering, but existing methods are insufficient to solve this issue. They spend much time ensembling multiple scorers trained on clean bitexts, which are unavailable for low-resource languages in practice. In this paper, we propose a norm-based noisy corpora filtering and refurbishing method that requires neither external data nor costly scorers. Noisy and clean samples are separated based on how much information from the source and target sides the model requires to fit the given translation. For a non-parallel sentence pair, the target-side translation history is much more important than the source context, contrary to parallel pairs. The amount of these two information flows can be measured by the norms of the source-/target-side context vectors. Moreover, we propose to reuse the discovered noisy data by generating pseudo labels via online knowledge distillation. Extensive experiments show that our proposed filtering method performs comparably with state-of-the-art noisy corpora filtering techniques but is more efficient and easier to operate. Noisy sample refurbishing further enhances the performance by making the most of the given data.
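The abstract's core filtering signal can be illustrated with a small sketch: score each sentence pair by comparing the norms of its source-side and target-side context vectors, and flag pairs where the target history dominates. This is only a hedged illustration under assumed inputs (per-token cross-attention context vectors and decoder-side context vectors); the paper's exact vectors, scoring function, and threshold are not specified in this record.

```python
import numpy as np

def norm_ratio_score(src_context, tgt_context):
    """Score a sentence pair by how much the model draws on the source
    context versus the target-side history.

    src_context, tgt_context: arrays of shape (num_target_tokens, hidden_dim),
    e.g. cross-attention and decoder self-attention context vectors at each
    decoding step (hypothetical inputs for illustration).

    Returns the mean per-token ratio ||src|| / ||tgt||. A low ratio suggests
    the target history dominates, i.e. a likely non-parallel pair.
    """
    src_norms = np.linalg.norm(src_context, axis=-1)
    tgt_norms = np.linalg.norm(tgt_context, axis=-1)
    return float(np.mean(src_norms / (tgt_norms + 1e-9)))

def filter_corpus(pairs, threshold=1.0):
    """Split (src_ctx, tgt_ctx) pairs into clean / noisy index lists
    by the norm-ratio score (threshold is an assumed hyperparameter)."""
    clean, noisy = [], []
    for i, (src_ctx, tgt_ctx) in enumerate(pairs):
        if norm_ratio_score(src_ctx, tgt_ctx) >= threshold:
            clean.append(i)
        else:
            noisy.append(i)
    return clean, noisy
```

In this sketch, a pair whose source-context norms are consistently smaller than its target-context norms falls below the threshold and is routed to the noisy bucket, which the abstract proposes to refurbish with pseudo labels rather than discard.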
Subject areas | Computer Science and Technology; Artificial Intelligence
Language | English
Document type | Conference paper
Source URL | [http://ir.ia.ac.cn/handle/173211/51838]
Collection | National Laboratory of Pattern Recognition — Natural Language Processing
Corresponding author | Jiajun, Zhang
Author affiliations | 1. School of Artificial Intelligence, University of Chinese Academy of Sciences 2. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences
Recommended citation (GB/T 7714) | Yu, Lu, Jiajun, Zhang. Norm-based Noisy Corpora Filtering and Refurbishing in Neural Machine Translation[C]. Online. 2022-12.
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.