
AI is evolving faster than ever—and open-source AI models are catching up to proprietary models at an incredible pace. In this episode of the Data Neighbor Podcast, we sit down with Maarten Grootendorst, co-author of Hands-On Large Language Models with Jay Alammar, DeepLearning.AI instructor, and creator of BERTopic and KeyBERT, to break down the real differences between open-source and closed-source AI models.

We’ll discuss how LLMs (Large Language Models) evolved from bag-of-words and Word2Vec to modern transformer-based models like BERT, GPT-4, DeepSeek, LLaMA 2, and Mixtral. More importantly, we explore when open-source AI models might actually be better than proprietary models from OpenAI, Google DeepMind, and Anthropic.
Connect with us!