Plain Template Insertion: Korean-Prompt-Based Engineering for Few-Shot Learners (IEEE Access 2022)

IEEE Access

  • Impact Factor 2022: 3.47

Authors

  • Jaehyung Seo, Hyeonseok Moon, Chanhee Lee, Sugyeong Eo, Chanjun Park, Jihoon Kim, Changwoo Chun, Heuiseok Lim

Abstract

Prompt-based learning is a method that helps language models interpret natural language by recalling the prior knowledge acquired during pretraining together with the training objective. Recent prompt-based few-shot learners have achieved superior performance by alleviating the catastrophic forgetting that occurs in pretrained language models. Few-shot learning contributes to solving the data scarcity problem, an enormous challenge in AI systems and a significant consideration in natural language processing research. Despite the significance of few-shot learning, research on Korean few-shot learning is insufficient, and whether the prompt-based approach is appropriate for the Korean language has not been thoroughly verified. As a step toward realizing a Korean-prompt-based few-shot learner, we apply prompt engineering to a Korean language understanding benchmark dataset and introduce plain template insertion to overcome data scarcity in a more practical few-shot setting. The contributions of this study are as follows: (1) To the best of our knowledge, this is the first study to apply prompt-based few-shot learning to Korean benchmark datasets. With 32 few-shot training examples, it improves performance by +14.88, +29.04, and +1.81 points on the natural language inference, semantic textual similarity, and topic classification tasks, respectively. (2) We present a prompt-engineering method that merely inserts a plain template and increases data efficiency without training-example selection, augmentation, reformulation, or retrieval. (3) Our approach is robust to the contextual information and sentence structure of the Korean prompt and is applicable to both hard and soft prompts.
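
To make the idea of plain template insertion concrete, below is a minimal, hedged sketch of how a plain template might wrap a Korean NLI input pair so that a masked language model can fill in a label word. The template wording, mask token, and verbalizer mapping are illustrative assumptions for this sketch, not the exact prompts or labels used in the paper.

```python
# Minimal sketch of plain template insertion for a Korean NLI example.
# The template, mask token, and label words below are assumptions made
# for illustration only; they are not taken from the paper.

MASK = "[MASK]"

# Hypothetical verbalizer mapping NLI labels to Korean label words.
VERBALIZER = {
    "entailment": "맞다",       # "correct"
    "contradiction": "아니다",  # "incorrect"
    "neutral": "모르다",        # "unknown"
}

def insert_plain_template(premise: str, hypothesis: str) -> str:
    """Wrap an input pair with a plain, task-agnostic template so a masked
    language model can predict the label word at the [MASK] position."""
    return f"{premise} ? {MASK} , {hypothesis}"

if __name__ == "__main__":
    prompt = insert_plain_template(
        "그 남자는 공원에서 달리고 있다.",    # "The man is running in the park."
        "한 사람이 야외에서 운동하고 있다.",  # "A person is exercising outdoors."
    )
    print(prompt)
    # A prompt-based few-shot learner would score each VERBALIZER word at
    # the [MASK] slot and choose the highest-probability label.
```

In this style of prompting, no training examples are selected, augmented, reformulated, or retrieved; the only change to the input is the inserted template, which is what the paper's plain-template approach emphasizes.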