The researchers introduce PersonaPKT, a framework for building personalized dialogue agents that maintain a consistent personality without requiring explicit, privacy-sensitive user descriptions. Using parameter-efficient transfer learning, it represents each persona as a continuous vector, or "prefix," which adds fewer than 0.1% new trainable parameters to a frozen language model. The prefix captures implicit personality traits from only a handful of dialogue samples, making per-user storage far cheaper than traditional full fine-tuning. Training proceeds in two stages: a source prefix is first optimized across many personas, then adapted to each specific individual. This design preserves response quality and persona consistency while improving user privacy, since no sensitive personal statements need to be collected. Empirical results show that PersonaPKT outperforms various baselines, particularly in low-data scenarios where per-user information is limited.

Source: Xu Han, Bin Guo, Yoon Jung, Benjamin Yao, Yu Zhang, Xiaohu Liu, Chenlei Guo. "PersonaPKT: Building Personalized Dialogue Agents via Parameter-efficient Knowledge Transfer" (2023). University of Colorado Boulder, Amazon Alexa. https://aclanthology.org/2023.sustainlp-1.21.pdf
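To make the "less than 0.1% new trainable parameters" claim concrete, the sketch below counts parameters for a prefix-tuning setup in the style the summary describes: a frozen backbone plus a small per-persona prefix prepended to each layer's attention keys and values. All sizes here (vocabulary, model width, layer count, prefix length) are hypothetical toy values for illustration, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model sizes, roughly GPT-2-small scale (not the paper's values).
vocab, d_model, n_layers = 50_000, 768, 12
prefix_len = 5  # a few trainable prefix vectors per layer, per persona

# Rough parameter count for the frozen backbone:
# token embeddings plus ~12 * d_model^2 weights per transformer layer.
frozen_params = vocab * d_model + n_layers * 12 * d_model * d_model

# Each persona is represented only by its prefix: prefix_len vectors per
# layer, one half prepended to attention keys and one half to values.
persona_prefix = rng.normal(size=(n_layers, prefix_len, 2 * d_model))
trainable_params = persona_prefix.size

ratio = trainable_params / frozen_params
print(f"trainable share per persona: {ratio:.4%}")
```

With these toy numbers the per-persona prefix is on the order of 0.07% of the backbone, so storing a new persona costs thousands of floats rather than a full model copy, which is the storage-efficiency argument the summary makes.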