
In this episode of the Ruby AI Podcast, hosts Valentino Stoll and Joe Leo
welcome AI and Ruby expert Chris Hasinski. They delve into the benefits and
challenges of self-hosting AI models, including control over model updates, cost
considerations, and the ability to fine-tune models. Chris shares his journey
from machine learning at UC Davis to his extensive work in AI and Ruby, touching
upon his contributions to open source projects and the Ruby AI community. The
discussion also covers the limitations of current LLMs (Large Language Models)
in generating Ruby code, the importance of high-quality data for effective AI,
and the potential for Ruby to become a strong contender in AI development.
Whether you're a Ruby enthusiast or interested in the intersection of AI and
software development, this episode offers valuable insights and practical
advice.
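
For listeners who want to experiment with the self-hosting ideas discussed in the episode, the sketch below shows one way to call a locally hosted model from Ruby using only the standard library. It assumes an Ollama server running on localhost:11434 with a model such as "llama3" already pulled; the endpoint, model name, and prompt are illustrative placeholders, not tooling recommendations from the episode.

require "net/http"
require "json"

# Minimal sketch: send a prompt to a self-hosted model via Ollama's HTTP API.
# Assumes Ollama is running locally with a "llama3" model pulled; adjust the
# model name and endpoint for your own setup.
uri = URI("http://localhost:11434/api/generate")

request = Net::HTTP::Post.new(uri, "Content-Type" => "application/json")
request.body = {
  model: "llama3",
  prompt: "Summarize the trade-offs of self-hosting an LLM in two sentences.",
  stream: false # return a single JSON object instead of a token stream
}.to_json

response = Net::HTTP.start(uri.hostname, uri.port) { |http| http.request(request) }
puts JSON.parse(response.body)["response"]

Swapping in a different local runtime mostly comes down to changing the URL and request body, which is part of the appeal of self-hosting discussed in the episode.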
00:00 Introduction and Guest Welcome
00:31 Why Self-Host Models?
01:28 Challenges and Benefits of Self-Hosting
03:14 Chris's Background in Machine Learning
04:13 Applications Beyond Text
06:39 Fine-Tuning Models
12:27 Ruby in Machine Learning
16:06 Distributed Training and Model Porting
18:03 Monitoring Locally Hosted AI Models
18:22 Choosing and Deploying Models
25:19 Testing and Data Engineering in Ruby
27:56 Database Naming Conventions in Different Languages
28:19 Importance of Data Quality for AI
29:37 Challenges with LLMs and Performance Tracking
31:09 Improving Developer Experience in Ruby
31:45 Ruby's Ecosystem for Machine Learning
32:43 The Need for Investment in Ruby's AI Tools
38:25 Challenges with AI Code Generation in Ruby
43:35 Future Prospects for Ruby in AI
51:26 Conclusion and Final Thoughts
By Valentino Stoll and Joe Leo