Academic Article

Knowledge Prompts: Injecting World Knowledge into Language Models through Soft Prompts
Document Type
Working Paper
Source
Subject
Computer Science - Computation and Language
Computer Science - Artificial Intelligence
Computer Science - Machine Learning
Language
English
Abstract
Soft prompts have recently been proposed as a tool for adapting large frozen language models (LMs) to new tasks. In this work, we repurpose soft prompts for the task of injecting world knowledge into LMs. We introduce a method to train soft prompts via self-supervised learning on data from knowledge bases. The resulting soft knowledge prompts (KPs) are task-independent and serve as an external memory for the LM. We perform qualitative and quantitative experiments and demonstrate that: (1) KPs can effectively model the structure of the training data; (2) KPs can be used to improve the performance of LMs on different knowledge-intensive tasks.
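The abstract describes soft prompts as trainable embeddings attached to a frozen LM. The sketch below illustrates the general soft-prompt mechanism the paper builds on, not the authors' actual implementation: a block of learnable vectors is prepended to the token embeddings, and only those vectors receive gradients during training (here on KB-derived text). The class name, dimensions, and initialization scale are illustrative assumptions.

```python
# Minimal sketch of a soft prompt for a frozen LM (illustrative, not the
# paper's code). Only the prompt parameters are trained; the LM is frozen.
import torch
import torch.nn as nn


class SoftKnowledgePrompt(nn.Module):
    def __init__(self, prompt_length: int, d_model: int):
        super().__init__()
        # One trainable vector per prompt position (0.02 init scale is a
        # common convention, assumed here).
        self.prompt = nn.Parameter(torch.randn(prompt_length, d_model) * 0.02)

    def forward(self, token_embeds: torch.Tensor) -> torch.Tensor:
        # token_embeds: (batch, seq_len, d_model) from the frozen LM's
        # embedding layer; prepend the soft prompt along the sequence axis.
        batch = token_embeds.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        return torch.cat([prompt, token_embeds], dim=1)
```

In a training loop, the LM's own parameters would be frozen (`requires_grad_(False)`) and the optimizer given only `SoftKnowledgePrompt.parameters()`, with the self-supervised objective (e.g., predicting masked spans of verbalized knowledge-base triples, as the abstract suggests) computed on the LM's output.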