Science Saturday: Dreaming of an Artificial Intelligence | Eliezer Yudkowsky & Jaron Lanier

[26:00]  Lanier: "Here's a way to think about it. Imagine you have a mathematical structure that consists of this huge unbounded continuous field of scalars and then there's one extra point that's isolated, and I think that the knowledge available to us has a shape something like this. There's this whole world of empirical things we can study and then there's this one thing which is that there's also the epistemological channel of experience which is different, which is very hard to categorize, very hard to talk about, and not subject to empirical study."

This remark, from an obscure 2008 interview, is one of my favorites of Lanier's comments on the consciousness problem. It's a beautiful and insightful statement of the problem, but it's unfortunately lost on most of the crowd that showed up in the comments; perhaps they don't have consciousness themselves, as Lanier half-jokingly taunts Yudkowsky in this video.

If that is the case - if some people out there really don't have it - then we can never bridge the communication gap with them. But the metaphor could still help those of us who do have it to talk about it more clearly.

1-2-3-4-5-6 Consciousness

Dennett and those on his side, for example, often boast, regarding the "Easy Problems", that "well, at least we're out there making progress!" This is foolish if you understand the Hard Problem - progress on the Easy Problems isn't progress at all! You're not moving one step closer to cracking the Hard Problem, and the metaphor illustrates why: no matter where you move inside the empirically accessible field, you're not taking a single step toward the extra point floating isolated outside of it!

Imagine, for instance, that this field is a simple number line of the integers, and consciousness is the extra point floating off to the side, above the line. Anything a Turing machine can compute, and any program you could run on one, exists somewhere on this line: every possible Turing-machine state is just a large integer (this is essentially Gödel numbering). No matter how complex a program you write, you're just moving to the right along the line, taking no steps at all toward the point floating above it. No matter how much data you output, again, you're just moving up the line.
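To make the "states are integers" point concrete, here's a minimal sketch in Python. The particular encoding is my own arbitrary choice for illustration - nothing from the video - and any injective scheme would do just as well:

```python
# Toy illustration: any Turing-machine configuration (control state, head
# position, tape contents) can be flattened into a single integer. Assumes
# non-negative state and head; the scheme is arbitrary but one-to-one.

def encode_config(state: int, head: int, tape: str) -> int:
    # Turn the tape string into an integer by reading its bytes as one
    # big base-256 numeral.
    tape_num = int.from_bytes(tape.encode(), "big")
    # Write all three numbers in binary and join them with the digit '2'
    # as a separator, so they can be unambiguously split apart again...
    parts = f"{state:b}2{head:b}2{tape_num:b}"
    # ...then read the whole digit string as a single base-3 numeral.
    return int(parts, 3)

config_as_integer = encode_config(state=4, head=17, tape="0110#0110")
print(config_as_integer)  # one (very large) point on the number line
```

Decoding is just the inverse: write the integer out in base 3 and split on the 2s. The only point that matters for the metaphor is that the map is one-to-one - every configuration lands on exactly one integer, so running a program is just hopping from integer to integer along the line.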

Accumulating empirical knowledge is just counting really high; there exists some integer encoding of the sum total of your knowledge.
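In the same spirit - again just an illustrative sketch, with a stand-in string of my own - any finite body of recorded knowledge, being a finite string of bytes, already is an integer:

```python
# Any finite text is literally a (very large) integer: its bytes read as
# one big base-256 numeral.
knowledge = "Everything I have ever observed, measured, or written down."
n = int.from_bytes(knowledge.encode("utf-8"), "big")
print(n)               # the integer itself
print(n.bit_length())  # how "high" we've counted, in bits
# And back again, losslessly:
assert n.to_bytes((n.bit_length() + 7) // 8, "big").decode("utf-8") == knowledge
```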

When Dennett and friends say "Ha! At least we're doing something!", they're saying "Look at us! Look how high we're counting!" But no matter how high they count, they're making zero progress in the direction of that one isolated point.

Hofstadter vs Penrose: "Some Numbers Are Conscious"

Roger Penrose tells a story about meeting Douglas Hofstadter and how he thought he'd attempt to "corner him" with an argumentum ad absurdum - that some numbers are conscious - but Hofstadter simply "leapt into the corner" and agreed: some numbers are conscious.

This goes back to what I call Dennettian Consciousness vs Chalmerian Consciousness (or perhaps Lanierian Consciousness). As Lanier points out in another instance, the word "consciousness" and its related terms have been "colonized" by eliminative materialists. The word now refers to a muddled mishmash of concepts and, often, excludes actual consciousness!

Dennettian Consciousness is really just cognition - it's everything empirically observable that the brain does; in other words, not consciousness at all!

Chalmerian/Lanierian Consciousness is actual consciousness itself - that ineffable, first-person-only extra thing that would be missing from the picture even if you had a perfect description of the state of every synapse (or whatever the needed granularity turns out to be) at one instant. Even with those hundred trillion data points, you wouldn't have any certainty that consciousness is "in there". The consciousness remains the extra thing you have to insert by assumption, but that the data themselves never give you. It is, as Lanier says, an entirely different epistemological category of thing, one you can only learn about first-hand; the "shape" of the knowledge available to us is such that this one datum sits isolated in its own realm, with no bridge from the empirical side.

It's hard to say, without reading Hofstadter extensively, whether he's an eliminative materialist or an emergentist (Strong Emergence). An eliminative materialist who believes only in Dennettian Consciousness would naturally think "some numbers are conscious": if brain states are computable, then brain states are just very big integers (symbol-sequences for the Turing machine), and if there's nothing more to know beyond the brain states, then there's an integer for each possible conscious state and that's the end of the story. Some numbers are conscious.

But a believer in Chalmerian Consciousness could also be a strong emergentist. Ray Kurzweil seems to be in this category, though he's seldom explicit about it (more on this in another post). Kurzweil presumably thinks that consciousness "pops up" at some point, strong-emergently, when the right computation happens. Hofstadter could be arguing for this view as well: some big integers are conscious not because there's nothing more than the data to be known, but because those are the special numbers that make the extra consciousness-thing pop up.

This kind of computationalism has a lot to answer for and seems ludicrous. That too is for another post.