
How Strange Is The Loop?: I borrowed I Am a Strange Loop from Evan and read it after reading Consciousness Explained. As you might expect, I agree with most of what Hofstadter says in the book, but there's one big thing I don't think he ever makes an argument for.

Nonlocalized consciousness (not the real term) makes it legitimate to think of other people as existing inside you to the extent that you've internalized their mental processes. That extent is never very much, but it's better than nothing. One implication is that fictional characters can exist inside you in the same sense. That seems reasonable to me. Fictional thoughts are thoughts.

But Hofstadter says that you can (and do) actually internalize another person's consciousness on a coarse-grained level: the "strange loop" that drives that person's consciousness. And so not only do (your copies of) other people and fictional characters exist inside you, they are actually conscious inside you, because their strange loop is running on your hardware.

Stipulating the strange loop thing, I guess fictional characters could be called conscious if you simulated their thought processes in enough detail, but is that really what's happening? Do you ever know enough about someone else's thought process to internalize their consciousness, a thing so inaccessible that the person themself can't describe it? When I make up a character I have, in theory, complete control over their mental states, but I don't create a strange loop for them. I use my own, preexisting consciousness to simulate them. That's different. I'm pretending to be them, with greater or lesser success.

I'm not as old as Hofstadter, so I don't have as much practice, but I've known Sumana for about eight years and I do have an internalized Sumana that acts kind of like the real Sumana. But I wouldn't say I've internalized a copy of Sumana's consciousness, her sleep number strange loop. I'm using my own strange loop as the simulation engine.

So my intuition is that my Sumana-symbol, my symbols for dead people I used to know, and my symbols for fictional characters are the same kind of symbols I keep for other complex entities like the World Wide Web: not the kind that forms an "I". I can be surprised by something one of my fictional characters does, but it's the same kind of surprise as when I come up with an idea some other way. I don't see where Hofstadter argues that our representations of other people have this unusual property, but a lot of the book assumes it.

I'm explicitly not saying that mental simulations of consciousness would not be conscious. They would be. And I could believe that someone with, e.g., multiple personality disorder had multiple strange loops in their mind. I just don't think that's what happens when we think about a dead person we used to know.


Comments:

Posted by Zack at Mon Dec 22 2008 20:23

Counterpoint: this article by Emma Bull about the subjective experience of the writing process. (Read the comments.)

Based on that and my own subjective experience, mostly with RPG characters (and not on having read the book, just to be clear), I would argue that "using one's own strange loop as a simulation engine for someone else" and "having multiple strange loops in one's mind" differ only in degree. It seems to me that the more time one spends inhabiting a character (to borrow a bit of stage-actor terminology), the more independent the simulated consciousness gets, to the point where it can be more useful to metacognize about it as a separate individual inhabiting one's head than as a mask one wears.

Of course, short of genuine MPD, one's primary consciousness is still experiencing everything a character experiences, and so the character remains subordinate in a sense; its subjective experience is a subset of one's own. (Although that can get pretty twisty. I only imagine the world that my RPG characters live in, but to the extent that they are this independent, their world is real to them...)

Posted by Nicholas at Tue Dec 23 2008 02:57

I did the same, reading both these books at roughly the same time, and came away with the same feeling. I don't think I've internalised the consciousness of my partner of 12 years, Catherine, to anywhere near the degree that Douglas claims he internalised Carol's consciousness. Oh sure, I can guess what she's going to like and dislike, and I can do a pretty convincing parody of her, but I never get a sense of "this is Catie".

I found his argument a little contradictory, too, because he also makes the point that the strange loop is constantly evolving, so Douglas-of-now isn't the same person as Douglas-of-last-year. In the same way, then, surely, Carol's-Consciousness-In-Douglas-of-now is not the same as Carol's-Consciousness-In-Douglas-of-last-year, and both are sure to have diverged a lot from what Carol's-Consciousness-In-Carol would have looked like.

My major take from that book was a strong feeling of sadness for Douglas' loss.

Posted by Leonard at Tue Dec 23 2008 08:29

I think talking about fictional characters obscures the point because fictional characters have no reference implementation, so it's natural for us to use our own loops to simulate them, and natural for my Sherlock Holmes to differ substantially from your Sherlock Holmes. It's not okay for my Zack to differ from Zack's Zack because that means I've missed something.

I can't create another strange loop in my head, but maybe if I knew enough about your thought processes I could make my strange loop emulate yours? If I know that a fictional character is very impulsive, my simulations of that character make different choices and have different thoughts than I would. Maybe this is how it works.

Because of these two books I came up with my current working explanation of death: when you die, you become a fictional character.

Posted by Ian Bicking at Wed Dec 24 2008 00:38

Are prediction and consciousness the same thing? Is there a point where our simulation is so crude that it isn't consciousness at all? One could imagine a "simulation" of a person made up of sayings they would say, a kind of Eliza based on how they tend to act. But of course that isn't consciousness. What is the simulation really doing? I'm not sure my simulations of other people, even people I know intimately, go much further than simple pattern-matching prediction based on past experience. Maybe that's my own flaw.
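To make that concrete, here's a minimal sketch (in Python) of the kind of "simulation" I mean; the person, the patterns, and the sayings are all invented for illustration. It's just a table of remembered responses plus a pattern matcher, and nothing else:

    # A toy Eliza-style "simulation" of a person: pattern-matching
    # prediction from past behavior. The rules are hypothetical.
    import random
    import re

    # "Past experience": things we've heard this (made-up) person say,
    # keyed by patterns in what's said to them. Rules are tried in order.
    RULES = [
        (r"\bdeadline\b", ["Ugh, don't remind me.", "I'll get to it tonight."]),
        (r"\blunch\b",    ["Thai again?", "Only if you're buying."]),
        (r".*",           ["Hm.", "Tell me more.", "That's so typical of you."]),
    ]

    def simulate(utterance):
        """Predict a response: first matching rule wins."""
        for pattern, responses in RULES:
            if re.search(pattern, utterance, re.IGNORECASE):
                return random.choice(responses)

    if __name__ == "__main__":
        print(simulate("Want to grab lunch before the deadline?"))

However well its guesses match what the real person would say, there's plainly nobody home in there.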

That said, I was listening to this radio program and it had this character actor on (this guy: http://www.publicradioquest.com/node/219) and they interviewed the actor some. One thing he talked about was realizing how he had invented himself, so creating this other character wasn't so strange, just another invention, and while the facts of his life weren't authentic, the personality itself was no less authentic. It got me to imagining what if I reinvented my personality. Is that really possible? Many things seem innate, but certainly not everything. Anyway, this seems to somehow relate.


