PowerInfer-2: Fast Large Language Model Inference on a Smartphone
A podcast discussion about PowerInfer-2, a framework for running large language models on smartphones, focusing on its neuron cluster design, adaptive computation strategies, and I/O optimizations.