In this episode, we discuss using AWS Lambda for machine learning inference. We cover the tradeoffs between GPUs and CPUs for ML, tools like ggml and llama.cpp for running models on CPUs, and share examples where we've experimented with Lambda for ML, such as podcast transcription, medical imaging, and natural language processing. While ML on Lambda is still quite experimental, it can be a viable option for certain use cases.
💰 SPONSORS 💰
Do you have any AWS questions you would like us to address?
By AWS Bites · 4.6 (1,111 ratings)