Paper Talk

396-scPEFT: Fine-Tuning for Single-Cell Foundation Models



Researchers have developed scPEFT, a framework that uses parameter-efficient fine-tuning to adapt large-scale single-cell foundation models to diverse biological contexts. Fully updating these models typically demands substantial computational power and risks catastrophic forgetting, in which the model loses knowledge learned during pretraining. By using pluggable adapters such as LoRA and prefix tuning, scPEFT sharply reduces the number of trainable parameters while improving performance on tasks such as cell-type identification and batch correction. This approach proves especially effective for cross-species analysis, allowing models trained on human data to accurately interpret cell populations in species such as macaques and worms. The framework also aids biological discovery by identifying specific gene regulators and rare cell subpopulations that standard methods might overlook. Ultimately, it offers a more accessible and robust option for researchers working with specialized or limited genomic datasets.
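To make the parameter-efficiency point concrete, here is a minimal NumPy sketch of the LoRA idea mentioned above: a frozen weight matrix is augmented with a trainable low-rank update. All dimensions, the rank, and the scaling factor are illustrative assumptions, not values from the paper or its codebase.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen pretrained weight of one layer (illustrative sizes, not from the paper).
d_in, d_out, r = 64, 64, 4          # r is the low LoRA rank
W = rng.standard_normal((d_out, d_in))

# LoRA adapter: only A and B would be trained; W stays frozen.
A = rng.standard_normal((r, d_in)) * 0.01
B = np.zeros((d_out, r))            # zero-init B makes the adapter a no-op at start
alpha = 8.0                          # LoRA scaling hyperparameter (assumed value)

def forward(x):
    # Effective weight is W + (alpha / r) * B @ A, without ever modifying W.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# With B zero-initialized, the adapted model reproduces the frozen model exactly.
assert np.allclose(forward(x), W @ x)

full_params = W.size
lora_params = A.size + B.size
print(f"trainable fraction: {lora_params / full_params:.1%}")  # → 12.5%
```

Even in this tiny example, the adapter trains only 512 of 4,096 parameters; at real transformer scales the fraction is far smaller, which is what makes adapter-based fine-tuning cheap and less prone to overwriting pretrained knowledge.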

References:

  • He F, Fei R, Krull J E, et al. Harnessing the power of single-cell large language models with parameter-efficient fine-tuning using scPEFT. Nature Machine Intelligence, 2025: 1-16.

Paper Talk, by 淼淼Elva