Papers Read on AI

Baize: An Open-Source Chat Model with Parameter-Efficient Tuning on Self-Chat Data



Chat models, such as ChatGPT, have shown impressive capabilities and have been rapidly adopted across numerous domains. However, these models are accessible only through restricted APIs, creating barriers to new research and progress in the field. We propose a pipeline that automatically generates a high-quality multi-turn chat corpus by leveraging ChatGPT to engage in a conversation with itself. We then employ parameter-efficient tuning to enhance LLaMA, an open-source large language model.
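The "self-chat" idea in the abstract can be sketched as follows: a single chat model plays both sides of a conversation seeded by an initial question, yielding a multi-turn dialogue for training. The paper's pipeline calls ChatGPT; here `stub_model` is a hypothetical offline stand-in so the sketch is runnable, and the function names are illustrative, not the authors' code.

```python
def stub_model(history):
    """Hypothetical stand-in for a chat model: returns a canned reply.

    In the real pipeline this would be a ChatGPT API call conditioned
    on the conversation so far.
    """
    return f"reply-{len(history)}"

def self_chat(seed_question, n_turns=4, model=stub_model):
    """Generate a multi-turn dialogue by letting one model answer itself.

    Roles alternate between 'human' and 'ai'; the transcript becomes a
    training example for parameter-efficient tuning of an open model.
    """
    history = [("human", seed_question)]
    while len(history) < n_turns:
        role = "ai" if history[-1][0] == "human" else "human"
        history.append((role, model(history)))
    return history

dialogue = self_chat("How do I sort a list in Python?")
```

Each seed question produces one dialogue; collecting many such transcripts yields the self-chat corpus described above.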
2023: Canwen Xu, Daya Guo, Nan Duan, Julian McAuley
https://arxiv.org/pdf/2304.01196v2.pdf

By Rob

Rating: 3.7 (3 ratings)