Scaling laws are as important to artificial intelligence (AI) as the law of gravity is in the world around us. AI is the empirical science of this decade, and Cerebras is a company dedicated to turning state-of-the-art research on large language models (LLMs) into open-source data that can be reproduced by developers across the world. In this episode, James Wang, an ARK alum and product marketing specialist at Cerebras, joins us for a discussion centered around the past and the future of LLM development and why the generative pre-trained transformer (GPT) innovation taking place in this field is like nothing that has ever come before it (and has seemingly limitless possibilities). He also explains the motivation behind Cerebras’ unique approach and the benefits that their architecture and models are providing to developers.
By ARK Invest · 4.7 (388 ratings)