Epistemic status: I think something like this confusion is happening often. I'm not saying these are the only differences in what people mean by "AGI alignment".
Summary:
Value alignment is better, but probably harder to achieve, than personal intent alignment — alignment to the short-term wants of some person or persons. Different groups and people tend to primarily address one of these alignment targets when they discuss alignment. Confusion abounds.
One important confusion is an assumption that the type of AI defines the alignment target: strong goal-directed AGI must be value aligned or misaligned, while personal intent alignment is only viable for relatively weak AI. I think this assumption is important but false.
While value alignment is categorically better, intent alignment seems easier, safer, and more appealing in the short term, so AGI project leaders are likely to try it.[1]
Overview
Clarifying what people mean by alignment should dispel some illusory [...]
---
Outline:
Summary
Overview
Another unstated divide
Personal intent alignment for full ASI: can I have your goals?
Conclusions
The original text contained 6 footnotes which were omitted from this narration.
---