Towards Brain-to-Text Generation: Neural Decoding with Pre-trained Encoder-Decoder Models
Shuxian Zou (2,3); Shaonan Wang (2,3); Jiajun Zhang (2,3); Chengqing Zong (1,2,3)
2021-09
Conference date: 2021-12-13
Conference venue: Online
Abstract

Decoding language from non-invasive brain signals is crucial for building widely applicable brain-computer interfaces (BCIs). However, most existing studies have focused on discriminating which of two stimuli corresponds to a given brain image, which is far from directly generating text from neural activities. To move towards this goal, we first propose two neural decoding tasks of increasing difficulty. The first and simpler task is to predict a word given a brain image and a context, a first step towards text generation. The second, more difficult task is to directly generate text from a given brain image and a prefix. To address both tasks, we propose a general approach that leverages a powerful pre-trained encoder-decoder model to predict a word or generate a text fragment. Our model achieves 18.20% and 7.95% top-1 accuracy over a vocabulary of more than 2,000 words, averaged across all participants, on the two tasks respectively, significantly outperforming strong baselines. These results demonstrate the feasibility of directly generating text from neural activities in a non-invasive way. We hope our work moves practical non-invasive neural language decoders a step further.
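The word-prediction setup (the first, simpler task) can be illustrated with a toy sketch: brain-image features are projected into a model's hidden space, fused with a context representation, and scored against the vocabulary to pick a top-1 word. This is a minimal illustration under assumed dimensions and random stand-in parameters, not the authors' implementation; all names and sizes here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper): a non-invasive brain image
# is summarized as a feature vector, and the model scores a vocabulary of
# roughly 2,000 candidate words, as in the paper's evaluation setting.
BRAIN_DIM, HIDDEN_DIM, VOCAB_SIZE = 512, 256, 2000

# Hypothetical stand-ins for learned parameters: a projection from brain
# features into the hidden space, and an output layer over the vocabulary.
W_proj = rng.normal(scale=0.02, size=(BRAIN_DIM, HIDDEN_DIM))
W_out = rng.normal(scale=0.02, size=(HIDDEN_DIM, VOCAB_SIZE))

def predict_word(brain_features, context_state):
    """Task-1 sketch: fuse projected brain features with a context
    representation (here, simple addition) and return the top-1 word index."""
    fused = brain_features @ W_proj + context_state
    logits = fused @ W_out
    return int(np.argmax(logits))

brain = rng.normal(size=BRAIN_DIM)      # stand-in for brain-image features
context = rng.normal(size=HIDDEN_DIM)   # stand-in for an encoder's context state
word_id = predict_word(brain, context)
print(0 <= word_id < VOCAB_SIZE)  # True: the prediction is a valid vocabulary index
```

In the paper's actual approach, the context would come from a pre-trained encoder-decoder language model rather than a random vector, and the fusion and output layers would be trained on paired brain images and text.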

Content type: Conference paper
Source URL: http://ir.ia.ac.cn/handle/173211/48643
Collection: National Laboratory of Pattern Recognition, Natural Language Processing
Author affiliations:
1. CAS Center for Excellence in Brain Science and Intelligence Technology
2. National Laboratory of Pattern Recognition, Institute of Automation, CAS
3. School of Artificial Intelligence, University of Chinese Academy of Sciences
Recommended citation (GB/T 7714):
Shuxian Zou, Shaonan Wang, Jiajun Zhang, et al. Towards Brain-to-Text Generation: Neural Decoding with Pre-trained Encoder-Decoder Models[C]. In: Online conference. 2021-12-13.