In-Ear Insights from Trust Insights

In-Ear Insights: What Is A Large Language Model?



In this week’s In-Ear Insights, Katie and Chris answer the big question that people are afraid to ask for fear of looking silly: what IS a large language model? Learn what an LLM is, why LLMs like GPT-4 can do what they do, and how to use them best. Tune in to learn more!


Watch the video here:

Can’t see anything? Watch it on YouTube here.

Listen to the audio here:

https://traffic.libsyn.com/inearinsights/tipodcast-what-is-large-language-model.mp3

Download the MP3 audio here.

  • Need help with your company’s data and analytics? Let us know!
  • Join our free Slack group for marketers interested in analytics!
  • Machine-Generated Transcript

    What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.

    Christopher Penn 0:00

    In this week’s In-Ear Insights, today, we are talking all about large language models.

    Now, the ones you’ve probably heard of most are models like GPT-3.5 and GPT-4, which are from OpenAI.

    But there are many of these things.

    There is the GPT-NeoX series from EleutherAI, there’s StableLM from Stability AI, there’s PaLM from Google.

    So there’s many, many, many of these language models out there.

    And today, we figured we’d talk about what they are, why you should probably know what they are, and maybe a little bit about how they work.

    So Katie, where would you like to start?

    Katie Robbert 0:35

    I think we need to start with some basic definitions, because you just said a bunch of words that basically made my eyes glaze over.

    And so I think where we need to start first is: what is a large language model?

    And before you give me an overcomplicated definition, let me see if I can take a stab at this in a sort of non-technical way, to see if I’m even understanding it.

    So my basic understanding of a large language model is that it is basically like a box or a bucket: you put all of your content, your papers, your writing, your text, your data, whatever, into that bucket, and that’s sort of the house, the container, for all of the language that you want to train the model on.

    And so, you know, the more stuff you put in it, the bigger the bucket you need, hence the “large,” because you can’t just have this tiny little handheld bucket with two documents in it and say, okay, that’s my large language model.

    That’s not enough.

    That’s not enough examples to give the model to train on; you need to keep giving it more information.

    And the more information you put in the bucket, the more refined you can make the model.

    Am I even close?
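
    To make the bucket analogy concrete, here is a minimal, illustrative sketch in Python of the simplest possible language model: a bigram model that just counts which word tends to follow which in a tiny, made-up corpus. A real large language model learns far richer patterns with a neural network over hundreds of billions of tokens, but the core intuition is the same: the more text you put in the bucket, the better the model’s estimates of how language continues. The corpus sentences and function name below are hypothetical examples for illustration, not anything referenced in the episode.

    ```python
    from collections import Counter, defaultdict

    # A toy "bucket" of training text. A real large language model is trained
    # on hundreds of billions of tokens; these few made-up sentences just make
    # the idea concrete.
    corpus = [
        "marketing analytics helps you understand your customers",
        "analytics helps you understand your data",
        "your data helps you understand your marketing",
    ]

    # Count how often each word follows each other word: a bigram model,
    # the simplest possible "language model."
    next_word_counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for current_word, next_word in zip(words, words[1:]):
            next_word_counts[current_word][next_word] += 1

    def predict_next(word):
        """Return the most frequent continuation seen in the training bucket."""
        counts = next_word_counts.get(word)
        return counts.most_common(1)[0][0] if counts else None

    print(predict_next("helps"))       # prints: you
    print(predict_next("understand"))  # prints: your
    # The more text you add to the bucket, the better these counts approximate
    # real language, which is the intuition behind the "large" in LLM.
    ```

    Running this prints “you” and “your”: the toy model has simply memorized frequency patterns from its bucket, and those patterns get more reliable as the bucket grows, which is the point of Katie’s analogy.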
