Why it's so hard to talk about Consciousness, published by Rafael Harth on July 2, 2023 on LessWrong.
[Thanks to Charlie Steiner, Richard Kennaway, and Said Achmiz for helpful discussion.]
[Epistemic status: my best guess after having read a lot about the topic, including all LW posts and comment sections with the consciousness tag]
There's a common pattern in online debates about consciousness. It looks something like this:
One person will try to communicate a belief or idea to someone else, but they cannot get through no matter how hard they try. Here's a made-up example:
"It's obvious that consciousness exists."
Yes, it sure looks like the brain is doing a lot of non-parallel processing that involves several spatially distributed brain areas at once, so...
"I'm not just talking about the computational process. I mean qualia obviously exists."
Define qualia.
"You can't define qualia; it's a primitive. But you know what I mean."
I don't. How could I if you can't define it?
"I mean that there clearly is some non-material experience stuff!"
Non-material, as in defying the laws of physics? In that case, I do get it, and I super don't...
"It's perfectly compatible with the laws of physics."
Then I don't know what you mean.
"I mean that there's clearly some experiential stuff accompanying the physical process."
I don't know what that means.
"Do you have experience or not?"
I have internal representations, and I can access them to some degree. It's up to you to tell me if that's experience or not.
"Okay, look. You can conceptually separate the information content from how it feels to have that content. Not physically separate them, perhaps, but conceptually. The what-it-feels-like part is qualia. So do you have that or not?"
I don't know what that means, so I don't know. As I said, I have internal representations, but I don't think there's anything in addition to those representations, and I'm not sure what that would even mean.
and so on. The conversation can also get ugly, with boldface author (the unquoted speaker above) accusing quotation author of being unscientific and/or quotation author accusing boldface author of being willfully obtuse.
On LessWrong, people are arguably pretty good at not talking past each other, but the pattern above still happens. So what's going on?
The Two Intuition Clusters
The basic model I'm proposing is that core intuitions about consciousness tend to cluster into two camps, with most miscommunication being the result of someone failing to communicate with the other camp. For this post, we'll call the camp of boldface author Camp #1 and the camp of quotation author Camp #2.
Characteristics
Camp #1 tends to think of consciousness as a non-special high-level phenomenon. Solving consciousness is then tantamount to solving the Meta-Problem of consciousness, which is to explain why we think/claim to have consciousness. In other words, once we've explained why people keep uttering the sounds kon-shush-nuhs, we've explained all the hard observable facts, and the idea that there's anything else seems dangerously speculative/unscientific. No complicated metaphysics is required for this approach.
Conversely, Camp #2 is convinced that there is an experience thing that exists in a fundamental way. There's no agreement on what this thing is – theories range anywhere from hardcore physicalist accounts to substance-dualist ones that postulate causally active non-material stuff – but they all agree that there is something that needs explaining. Also, getting your metaphysics right is probably a part of making progress.
The camps are ubiquitous; once you have the concept, you will see it everywhere consciousness is discussed. Even single comments often betray allegiance to one camp or the other. Apparent exceptions are usually from people who are well-read on the subject and may have optimized...