Misreading Chat

#53 – BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding



Mukai talks about BERT, which brought transfer learning to neural-network-based natural language processing. Please send comments and feedback with the hashtag #misreading or to [email protected].

https://misreading.chat/wp-content/uploads/2019/03/ep53.mp3
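The transfer learning the episode describes amounts to taking a BERT model pre-trained on unlabeled text and fine-tuning it on a small labeled downstream task. A minimal sketch of that workflow, assuming the Hugging Face transformers library and a toy two-example sentiment dataset (neither comes from the episode; the model name and data are placeholders):

# Sketch only: fine-tune a pre-trained BERT on a tiny labeled task.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # new classification head, randomly initialized
)

# Toy labeled examples standing in for a real downstream task.
texts = ["a delightful read", "a total waste of time"]
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few gradient steps; real fine-tuning runs for epochs
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

model.eval()
with torch.no_grad():
    print(model(**batch).logits.argmax(dim=-1))  # predicted labels after fine-tuning

The point of the sketch is that only the small classification head is new; all the language knowledge comes from the pre-trained weights, which is what made BERT-style transfer learning effective for NLP.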

  • [1810.04805] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  • Improving Language Understanding by Generative Pre-Training
  • GitHub – openai/gpt-2: Code for the paper “Language Models are Unsupervised Multitask Learners”
  • [1901.11504] Multi-Task Deep Neural Networks for Natural Language Understanding
    • Microsoft’s New MT-DNN Outperforms Google BERT – SyncedReview – Medium
  • Trained BERT with SentencePiece on Japanese Wikipedia and released the model – 原理的には可能 – a data-analysis blog (in Japanese)
  • Follow up
    • Jay Alammar | LinkedIn
    • STV
