LessWrong (30+ Karma)

“Gated Attention Blocks: Preliminary Progress toward Removing Attention Head Superposition” by cmathw, Dennis Akar, Lee Sharkey


This work represents progress on removing attention head superposition. We are excited by this approach but acknowledge there are currently various limitations. In the short term, we will be working on adjacent problems and are excited to collaborate with anyone thinking about similar things!

Produced as part of the ML Alignment & Theory Scholars Program - Summer 2023 Cohort

Summary: In transformer language models, attention head superposition makes it difficult to study the function of individual attention heads in isolation. We study a particular kind of attention head superposition that involves constructive and destructive interference between the outputs of different attention heads. We propose a novel architecture - a ‘gated attention block’ - which resolves this kind of attention head superposition in toy models. In the future, we hope this architecture may be useful for studying more natural forms of attention head superposition in large language models.

Our code can [...]
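
The summary above describes the gated attention block only at a high level; the full specification is in the linked post and code. As a rough illustration only, here is a minimal PyTorch sketch of one way a per-head gate could be wired into standard multi-head attention. The gate placement (a sigmoid gate on each head's output, computed from the residual stream) and all names here are our assumptions for illustration, not the authors' exact design.

import torch
import torch.nn as nn

class GatedAttentionBlock(nn.Module):
    """Hypothetical sketch of a gated attention block: standard multi-head
    causal attention, except each head's output is scaled by an
    input-dependent sigmoid gate in [0, 1]. Gate placement and all names
    are illustrative assumptions, not the post's exact specification."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.gate = nn.Linear(d_model, n_heads)  # one gate logit per head
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x):
        B, T, _ = x.shape  # (batch, seq, d_model)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q, k, v = (t.reshape(B, T, self.n_heads, self.d_head).transpose(1, 2)
                   for t in (q, k, v))
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        causal = torch.triu(torch.ones(T, T, dtype=torch.bool, device=x.device), 1)
        attn = scores.masked_fill(causal, float("-inf")).softmax(dim=-1)
        heads = attn @ v  # (batch, heads, seq, d_head)
        # Scale each head's output by its gate before recombining, so a head
        # can be switched off per token instead of being cancelled by
        # destructive interference with another head's output.
        g = torch.sigmoid(self.gate(x)).transpose(1, 2).unsqueeze(-1)
        return self.out((heads * g).transpose(1, 2).reshape(B, T, -1))

# Hypothetical usage:
# block = GatedAttentionBlock(d_model=64, n_heads=2)
# y = block(torch.randn(1, 16, 64))  # -> shape (1, 16, 64)

The motivation for an explicit gate, per the summary, is that a head's contribution can then be zeroed out directly per token, rather than being cancelled by the outputs of other heads, which is the interference pattern of superposition the post studies.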

---

Outline:

(01:10) Background

(03:27) Attention head superposition for OV-Incoherent Skip Trigrams

(06:21) Removing Attention Head Superposition from models trained on OV-Incoherent Skip Trigrams

(06:29) Toy Dataset Generation

(07:18) Gated Attention Mechanism

(12:17) Gated Attention Block Training

(13:51) Summary of Architectural and Training Choices

(15:10) Experimental Results on Toy Model Setup

(19:12) Related Work

(20:14) Conclusion

(21:00) Limitations

(25:32) Future Work

(27:42) Acknowledgements

(28:08) Contributions Statement

(28:46) Appendix

(28:49) Appendix A: OV-Incoherent skip trigram dataset generation algorithm

(29:03) Appendix B1: 2 Heads, 3 Skip Trigrams (Trained via Stochastic Gradient Descent)

(29:47) Appendix B2: 3 Heads, 4 Skip Trigrams (Trained via Stochastic Gradient Descent)

(30:32) Appendix C: Other approaches that we investigated

(32:31) Appendix D: Weights for Manually Specified Model encoding 3 OV-Incoherent Skip Trigrams with 2 heads

---

First published:

April 8th, 2024

Source:

https://www.lesswrong.com/posts/kzc3qNMsP2xJcxhGn/gated-attention-blocks-preliminary-progress-toward-removing-1

---

Narrated by TYPE III AUDIO.
