Prompting

1. P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks (ACL 2022)

2. The Power of Scale for Parameter-Efficient Prompt Tuning (EMNLP 2021)

3. SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer (ACL 2022)

4. On Transferability of Prompt Tuning for Natural Language Processing (NAACL 2022)

5. XPrompt: Exploring the Extreme of Prompt Tuning (EMNLP 2022)
