Scaling laws are as important to artificial intelligence (AI) as the law of gravity is in the world around us. AI is the empirical science of this decade, and Cerebras is a company dedicated to turning state-of-the-art research on large language models (LLMs) into open-source data that can be reproduced by developers across the world. In this episode, James Wang, an ARK alum and product marketing specialist at Cerebras, joins us for a discussion centered around the past and the future of LLM development and why the generative pre-trained transformer (GPT) innovation taking place in this field is like nothing that has ever come before it (and has seemingly limitless possibilities). He also explains the motivation behind Cerebras’ unique approach and the benefits that their architecture and models are providing to developers.