Instance-aware Prompt Learning for Language Understanding and Generation
Jin Feihu 1,2; Lu Jinliang 1,2; Zhang Jiajun 1,2; Zong Chengqing 1,2
Journal: TALLIP
Date: 2023-06
Pages: 19
DOI: 10.1145/3604613
English abstract:

Prompt learning has emerged as a new paradigm for leveraging pre-trained language models (PLMs) and has shown promising results in downstream tasks with only a slight increase in parameters. However, the current usage of fixed prompts, whether discrete or continuous, assumes that all samples within a task share the same prompt. This assumption may not hold for tasks with diverse samples that require different prompt information. To address this issue, we propose an instance-aware prompt learning method that learns a different prompt for each instance. Specifically, we assume that each learnable prompt token contributes differently to different instances, and we learn each contribution by computing a relevance score between an instance and each prompt token. The contribution-weighted prompt is thus instance-aware. We apply our method to both unidirectional and bidirectional PLMs on both language understanding and generation tasks. Extensive experiments demonstrate that our method achieves comparable results while tuning as few as 1.5% of the PLMs' parameters and obtains considerable improvements over strong baselines. In particular, our method achieves state-of-the-art results with ALBERT-xxlarge-v2 on the SuperGLUE few-shot learning benchmark.
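The abstract describes weighting each learnable prompt token by an instance-dependent relevance score. Below is a minimal sketch of that idea in PyTorch; the mean-pooling summary, the scaled dot-product scoring, and all names (`InstanceAwarePrompt`, `num_prompt_tokens`, and so on) are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InstanceAwarePrompt(nn.Module):
    """Sketch of instance-aware prompt weighting: each shared prompt
    token is rescaled by its relevance to the current instance, so the
    effective prompt differs per input. Design details are assumptions."""

    def __init__(self, num_prompt_tokens: int, hidden_size: int):
        super().__init__()
        # Shared learnable prompt tokens, as in standard prompt tuning.
        self.prompt = nn.Parameter(torch.randn(num_prompt_tokens, hidden_size) * 0.02)

    def forward(self, instance_embeds: torch.Tensor) -> torch.Tensor:
        # instance_embeds: (batch, seq_len, hidden) token embeddings of the input.
        # Summarize the instance by mean pooling (an assumed choice).
        instance_repr = instance_embeds.mean(dim=1)             # (batch, hidden)
        # Relevance score between the instance and each prompt token,
        # here a scaled dot product (assumed scoring function).
        scores = instance_repr @ self.prompt.t()                # (batch, num_prompt)
        weights = F.softmax(scores / instance_repr.size(-1) ** 0.5, dim=-1)
        # Contribution-weighted prompt: rescale each token by its weight.
        weighted_prompt = weights.unsqueeze(-1) * self.prompt   # (batch, num_prompt, hidden)
        # Prepend the instance-aware prompt before feeding the PLM.
        return torch.cat([weighted_prompt, instance_embeds], dim=1)
```

In such a setup only `self.prompt` (and any scoring parameters) would be trained while the PLM stays frozen, consistent with the small tuned-parameter budget the abstract reports.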

Content type: Journal article
Source URL: [http://ir.ia.ac.cn/handle/173211/52013]
Division: National Laboratory of Pattern Recognition - Natural Language Processing
Corresponding author: Zhang Jiajun
Author affiliations:
1. University of Chinese Academy of Sciences
2. Institute of Automation, Chinese Academy of Sciences
Recommended citation:
GB/T 7714: Jin Feihu, Lu Jinliang, Zhang Jiajun, et al. Instance-aware Prompt Learning for Language Understanding and Generation[J]. TALLIP, 2023: 19.
APA: Jin Feihu, Lu Jinliang, Zhang Jiajun, & Zong Chengqing. (2023). Instance-aware Prompt Learning for Language Understanding and Generation. TALLIP, 19.
MLA: Jin Feihu, et al. "Instance-aware Prompt Learning for Language Understanding and Generation". TALLIP (2023): 19.
 
