
Today we continue our AI Trends 2024 series with a conversation with Thomas Dietterich, distinguished professor emeritus at Oregon State University. As you might expect, large language models figured prominently in our conversation, and we covered a wide array of papers and use cases exploring current research into topics such as monolithic vs. modular architectures, hallucinations, uncertainty quantification (UQ), and using RAG as a sort of memory module for LLMs. Lastly, don't miss Tom's predictions for the year ahead, as well as his words of encouragement for those new to the field.
The complete show notes for this episode can be found at twimlai.com/go/666.