Today we continue our AI Trends 2024 series with a conversation with Thomas Dietterich, distinguished professor emeritus at Oregon State University. As you might expect, large language models figured prominently in our conversation, and we covered a vast array of papers and use cases exploring current research into topics such as monolithic vs. modular architectures, hallucinations, the application of uncertainty quantification (UQ), and using retrieval-augmented generation (RAG) as a sort of memory module for LLMs. Lastly, don’t miss Tom’s predictions on what he foresees happening this year as well as his words of encouragement for those new to the field.
The complete show notes for this episode can be found at twimlai.com/go/666.