20101014

Testing...

Consciousness is meaning without context.


15 comments:

  1. Is the statement "entropy increases" equivalent to "information increases"? Is information meaning?

  2. What do you mean by "context"? As a prof teaching a seminar that I'm in once said, "context is everything, and it's hard to teach a seminar on everything".

    Also, I don't think information is meaning. Information needs to be represented, but it's hard to ground meaning in representation. Consciousness is what grounds that meaning in information. See Searle's Chinese Room.

  3. If meaning comes from consciousness, then consciousness imbues meaning into information. Do you mean consciousness is "raw meaning" un-affixed to any information? That "meaning out of context" is potential meaning waiting to be linked to info? It feels a bit dualist or Freudian. Do we have stores of "potential meaning" waiting to affix themselves, similar to the libido? Can't be. Meaning would need to be attached in the moment of sensation causing perception. I prefer this: it requires embodiment. Consciousness would emerge in complex embodied agents interacting with their environment. Does this kill the argument for Desktop AI? Do minds really need to be embodied to be minds?

  4. Consciousness inscribes Meaning on Context. Meaning exists without Context, but is continually eroded away by Thought and Time. Eventually, all there is is disconnected Noise.

  5. What? My question actually made sense. Entropy _is_ measured in bits, and higher-entropy bit vectors tend to have more "information", in the sense that an optimally compressed string _should_ have nearly maximal entropy (see the sketch at the end of this comment).

    Maybe my inability to parse subsequent comments is due to a general lack of familiarity with the liberal arts.

    F's string is poetic, but may have a rigorous interpretation.

    Information is meaning; context is a scheme for interpreting information. Programs are strings of bits, but you need to know "this is assembly for x86" to get them to do anything.

    To say that consciousness is meaning without context is, I think, somehow to argue that the information present in the brain needs no corresponding context of interpretation. But this doesn't really make sense. I'm running up against the limits of language: our use of language is, in and of itself, meaning (information), in the context of the statistical distribution of reality, bound arbitrarily to certain sounds and shapes. The binding is the context. Attempting to express and communicate anything at all requires an arbitrary mapping from the information in the communication channel to _some_other_ information, in this case shared or similar representations of the environment (consensus reality).

    So, consciousness is not meaning without context. If the topology of the environment changes, our internal representations become nigh useless. If the environment changes radically, subjective experience changes radically. If the interpretation scheme of a bit string is altered ("this data is actually executable"), the meaning of the bit string is changed.

    Now, I suppose, we can (and now I partially understand Evan's first comment) remove the context by including the environment in the information (meaning) that we are calling "consciousness", in which case it is the internal structure of our now-joined brain-environment that determines subjective experience.

    In conclusion:

    twisted knots in black boxes.

    oh dear. I've lost it.
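
    A quick numerical sketch of the compression claim above, in Python (my illustration; the input string and the use of zlib are just convenient choices). A highly structured byte string has low per-byte entropy, while its compressed form should sit close to the 8-bit maximum:

        import zlib
        from collections import Counter
        from math import log2

        def byte_entropy(data: bytes) -> float:
            """Empirical Shannon entropy of a byte string, in bits per byte."""
            counts = Counter(data)
            n = len(data)
            return -sum((c / n) * log2(c / n) for c in counts.values())

        # Highly structured input: only digits, commas, and spaces occur.
        text = ", ".join(str(i) for i in range(100_000)).encode()
        packed = zlib.compress(text, 9)

        print(f"raw:        {byte_entropy(text):.2f} bits/byte")   # well below 8
        print(f"compressed: {byte_entropy(packed):.2f} bits/byte") # close to 8

    The compressed string is the same information with the redundancy squeezed out, which is exactly why it looks like noise unless you have the decompressor as context.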

  6. If you cut out the excursion about how our communication is limited by the binding of information and context, then I actually almost made a point in the last paragraph. I hope so, assuming, of course, that our contexts for the interpretation of language are suitably aligned.

  7. No, twisted knots in black boxes is exactly it, I stand by that metaphor.

  8. I was mostly being snarky. I like F's comment, and your explanation.

    Consider the Klein bottle as the neural topology of the typical US Senator. Now try to pour knowledge 'inside' said Senator. Then vote No on Life Panels.

  9. Lol. I really think language hinders this discussion.

  10. I've sort of mulled this over for a few days. I don't think that "information" should be conflated with meaning, because that is not the same kind of meaning that F is going for. Let me try to write exactly what I got out of it.

    A sequence of zeros and ones on a hard disk may constitute "information" in some strict information-theoretic sense, but it does not carry meaning unless those bits have an intended interpretation. Some obvious examples: if I write you a letter and email it to you, you will be able to read and understand it, given its context. If I take a jpeg, rename it as a .txt, and email it to you, you will probably be very confused. If I encrypt my letter and you intercept it and try to look at the ciphertext, then unless you can break my cipher, the ciphertext will be meaningless. (There's a sketch of this at the end of this comment.)

    This also includes the more general computer science mantra that we should not make a conceptual separation between "programs" and "data". In different contexts the same information can be interpreted either way.

    We can also think about, instead of a static file of zeros and ones, a stream of data coming from some source. The same principles apply -- if I attached a measurement device to measure the static on my broken TV set, then posted that on the internet as a blog, people would probably think I'm a jerk.

    What I interpret F's claim to be is: consciousness is a property of a dynamical system, or other sort of physical system, which permits us to attach meaning to the collection of data streams emanating from the system, without needing to have a context.

    It's appealing in some ways but hard to refine. Part of my reasoning is that, when we are thinking about how machines operate, we are often taught / tell others to "imagine as though we are the machine"... then, when we look at the output of the machine, we can imagine what would have caused us to do that thing, or try to diagnose problems or figure out what the machine is confused about. But when the machine is actually conscious, this imagining process becomes much easier -- the fact of the machine's consciousness is all the context that is necessary.

    I doubt that I am successfully reconstructing F's thoughts, but this is what I've got.
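
    A minimal sketch of the intended-interpretation point, in Python (my addition, not part of the original comment): the same four bytes read as text, as an unsigned integer, and as a float. Only the interpretation scheme changes; the bits do not.

        import struct

        blob = b"Hell"  # four bytes; "information" with no fixed meaning

        print(blob.decode("ascii"))          # as ASCII text: Hell
        print(struct.unpack("<I", blob)[0])  # as uint32 (little-endian): 1819043144
        print(struct.unpack("<f", blob)[0])  # as float32 (little-endian): ~1.1e+27

    Renaming a jpeg to .txt is the same move: the bits stay put, and the context swap destroys the meaning.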

  11. Yeah, I think Beck basically got at what I was trying to say. To be more concise, but not quite as cryptic as initially: a conscious system is one in which symbols have a canonical (or inherent?) interpretation, not one that must be assigned by an external agent.

    After thinking a little bit about how to implement this, I decided that Hofstadter's strange loops are actually a reasonable first effort. By encoding a copy of itself, a system could provide a dictionary between its symbols and the outside world. Maybe (see the sketch below).

    And yes, these things are hard to express in words.
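
    One tiny concrete instance of a system encoding a copy of itself is a quine; a Python example follows (my illustration, and only a gesture at the strange-loop idea, not an implementation of it). Running these two lines prints exactly these two lines: the string s is both a symbol inside the system and a complete description of the system that prints it.

        s = 's = %r\nprint(s %% s)'
        print(s % s)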

  12. Oh, so I guess the other part I didn't say before is,

    *the conscious system can make its streams with the full knowledge that some other conscious system may eventually read them*

    so, as in stories like Contact, it's possible (we assert) for a conscious alien entity to make a signal that could be "deciphered" by other conscious entities, using no language or a priori knowledge whatsoever, with no context on which to base their communication, only the assumption that the other party is intelligent. This is not generally going to be possible for, say, x86 desktop computers running software that currently exists. It's not clear that this is even possible for other animals and things that we also believe are conscious, so this may actually be too specific a property as currently stated... but I think it's related.
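
    A toy version of the Contact-style signal, in Python (my sketch; the pulse encoding is invented): broadcast the primes as unary bursts, so the only "context" a receiver needs is the ability to count and the hunch that primes don't occur in natural noise.

        def is_prime(n: int) -> bool:
            return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))

        def beacon(limit: int) -> str:
            """One burst of p pulses ('*') per prime p, separated by pauses ('.')."""
            return ".".join("*" * p for p in range(2, limit) if is_prime(p))

        print(beacon(20))  # **.***.*****.*******.***********. ...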

  13. But an x86 program, once set running, given continual access to inputs and outputs, will faithfully perform its function, including interpreting inputs and controlling outputs, without a shred of understanding of its own code. In a sense, the symbols in a compiled x86 binary don't have to mean anything either; they just need to happen to perform some computation on a particular cluster of atoms. I don't feel like consciousness is particularly distinct from this. Code word 0x57DE makes a light blink; neuron 5000279 makes a muscle twitch. The canonical interpretation of code words in the nervous system is fixed by hardware, as is the canonical interpretation of code words in machine code (see the sketch below).

    I like Beck's second point.
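
    A minimal sketch of that fixed-by-hardware interpretation, in Python (the dispatch table is hypothetical; 0x57DE is borrowed from the comment above): the machine executes code words via a fixed lookup, with no understanding anywhere in the loop.

        # The "hardware": a fixed mapping from code words to effects.
        DISPATCH = {
            0x57DE: lambda: print("light blinks"),
            0x0042: lambda: print("muscle twitches"),  # hypothetical code word
        }

        def run(program: list[int]) -> None:
            """Faithfully execute each code word; no interpretation beyond the table."""
            for word in program:
                DISPATCH[word]()

        run([0x57DE, 0x0042, 0x57DE])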

  14. So perhaps the claim can be salvaged if the muscle that neuron 5000279 makes twitch counts as context. I wanted to hypothesize that consciousness is that part of a system whose interpretation (as symbols) doesn't depend on stuff outside the system.

    The whole idea may be circular, because what's a symbol if you don't relate it to something external? All that remains is that it's something that's accessible to conscious interpretation. On the other hand, we do experience things that have no sensory equivalent, through dreams for example.

    In other words, I suspect that the interpretation of certain code words in the nervous system is fixed by things other than hardware. A degenerate case of this would be that conscious experience is simply a sufficiently complete description.
