Astral Codex Ten Podcast

Fundamental Value Differences Are Not That Fundamental



Ozy (and others) talk about fundamental value differences as a barrier to cooperation.

On their model (as I understand it) there are at least two kinds of disagreement. In the first, people share values but disagree about facts. For example, you and I may both want to help the Third World. But you believe foreign aid helps the Third World, and I believe it props up corrupt governments and discourages economic self-sufficiency. We should remain allies while investigating the true effect of foreign aid, after which our disagreement will disappear.

In the second, you and I have fundamentally different values. Perhaps you want to help the Third World, but I believe that a country should only look after its own citizens. In this case there’s nothing to be done. You consider me a heartless monster who wants foreigners to starve, and I consider you a heartless monster who wants to steal from my neighbors to support random people halfway across the world. While we can agree not to have a civil war for pragmatic reasons, we shouldn’t mince words and pretend not to be enemies. Ozy writes (liberally edited, read the original):

From a conservative perspective, I am an incomprehensible moral mutant… however, from my perspective, conservatives are perfectly willing to sacrifice things that actually matter in the world – justice, equality, happiness, an end to suffering – in order to suck up to unjust authority or help the wealthy and undeserving or keep people from having sex lives they think are gross.

There is, I feel, opportunity for compromise. An outright war would be unpleasant for everyone… And yet, fundamentally… it’s not true that conservatives as a group are working for the same goals as I am but simply have different ideas of how to pursue them… my read of the psychological evidence is that, from my value system, about half the country is evil and it is in my self-interest to shame the expression of their values, indoctrinate their children, and work for a future where their values are no longer represented on this Earth. So it goes.

And from a subreddit comment by GCUPokeItWithAStick:

I do think that at a minimum, if you believe that one person’s interests are intrinsically more important than another’s (or as the more sophisticated versions play out, that ethics is agent-relative), then something has gone fundamentally wrong, and this, I think, is the core of the distinction between left and right. Being a rightist in this sense is totally indefensible, and a sign that yes, you should give up on attempting to ascertain any sort of moral truth, because you can’t do it.

I will give this position its due: I agree with the fact/value distinction. I agree it’s conceptually very clear what we’re doing when we try to convince someone who shares our values of a factual truth, and that it’s confusing and maybe impossible to change someone’s values.


Astral Codex Ten Podcast, by Jeremiah

4.8 · 123 ratings


More shows like Astral Codex Ten Podcast

  • EconTalk, by Russ Roberts (4,234 listeners)
  • Robert Wright's Nonzero, by Nonzero (584 listeners)
  • Conversations with Tyler, by Mercatus Center at George Mason University (2,395 listeners)
  • Odd Lots, by Bloomberg (1,789 listeners)
  • Future of Life Institute Podcast, by Future of Life Institute (105 listeners)
  • ChinaTalk, by Jordan Schneider (269 listeners)
  • ManifoldOne, by Steve Hsu (89 listeners)
  • Machine Learning Street Talk (MLST), by Machine Learning Street Talk (88 listeners)
  • Dwarkesh Podcast, by Dwarkesh Patel (426 listeners)
  • Clearer Thinking with Spencer Greenberg, by Spencer Greenberg (128 listeners)
  • Joe Lonsdale: American Optimist, by Joe Lonsdale (164 listeners)
  • "Moment of Zen", by Erik Torenberg, Dan Romero, Antonio Garcia Martinez (91 listeners)
  • Latent Space: The AI Engineer Podcast, by swyx + Alessio (75 listeners)
  • "Econ 102" with Noah Smith and Erik Torenberg, by Turpentine (146 listeners)
  • Complex Systems with Patrick McKenzie (patio11), by Patrick McKenzie (123 listeners)