
Transformer models are the powerful neural networks that have become the standard for delivering advanced performance behind today's AI innovations. But there is a challenge: training these deep learning models at scale, and running inference on them, requires a large amount of computing power. This can make the process time-consuming, complex, and costly.
In this episode, we talk about the challenges of building accessible, production-level AI solutions. We also discuss ethical questions around AI usage and why open, democratized AI solutions are important.
Learn more:
Hugging Face Hub
Fast Inference on Large Language Models: BLOOMZ on Habana Gaudi2 Accelerator
Accelerating Stable Diffusion Inference on Intel CPUs
Transformer Performance with Intel & Hugging Face Webinar
Intel Explainable AI Tools
Intel Distribution of OpenVINO Toolkit
Intel AI Analytics Toolkit (AI Kit)
Guests: