
In this episode, we discuss using AWS Lambda for machine learning inference. We cover the tradeoffs between GPUs and CPUs for ML, tools like ggml and llama.cpp for running models on CPUs, and share examples where we've experimented with Lambda for ML, such as podcast transcription, medical imaging, and natural language processing. While Lambda ML is still quite experimental, it can be a viable option for certain use cases.
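The episode isn't tied to specific code, but the pattern it describes, CPU inference inside a Lambda function, can be sketched roughly as below. This is a minimal, hypothetical handler: the module-level cache reflects how you'd reuse a loaded model across warm invocations, and the placeholder stands in for a real ggml/llama.cpp model load (for example via a library such as llama-cpp-python, with the model file baked into a container image or fetched from S3 — those details are assumptions, not from the episode).

```python
import json

# Cache the model at module level so warm Lambda invocations reuse it
# instead of paying the load cost on every request.
_MODEL = None


def _load_model():
    """Lazily load the model on first use (cold start)."""
    global _MODEL
    if _MODEL is None:
        # Placeholder for a real CPU model load, e.g. a ggml/llama.cpp
        # model; here it just echoes the prompt so the sketch is runnable.
        _MODEL = lambda prompt: f"echo: {prompt}"
    return _MODEL


def handler(event, context):
    """AWS Lambda entry point: run CPU inference on the incoming prompt."""
    prompt = event.get("prompt", "")
    model = _load_model()
    return {
        "statusCode": 200,
        "body": json.dumps({"completion": model(prompt)}),
    }
```

The important design point is the lazy, module-scoped load: a cold start pays for reading the model into memory once, and subsequent warm invocations only pay inference time, which is what makes Lambda plausible for smaller CPU-friendly models.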
💰 SPONSORS 💰
Do you have any AWS questions you would like us to address?
4.6 (1111 ratings)