Made with Natural and Artificial Flavors: Consciousness Part 1 of ...?
Willful Hedging
It’s hard to tell anymore what the trends are in pop-sci culture with the way we are fed content via the machines. I’ve had a whole lot of “free will” pushed into my online notice in recent weeks. Much of this content relates to the proposition of purposelessness that Robert Sapolsky makes in his book Determined. I did listen to the Audible version a couple of months back, and presumably the machines are well aware of that, and as such are willing this related content in my direction. One of these pieces, the one that most grabbed my interest but turned out to be disappointingly unrevealing, was a podcast debate between Robert Sapolsky and Daniel Dennett. Dennett is the adherent of compatibilism most familiar to me, compatibilism being the contention that free will is not incompatible with determinism. Dennett’s perspective (my possibly inaccurate interpretation) is that we do in fact have free will, if we constrain the boundaries of an individual’s decision making to only the psychological variables relevant to the context of the situation that demands a conscious decision. At least that’s the best argument I can make based on my understanding of his position. And I do want to find a good argument for the compatibility of free will with a deterministic framework of human behavior. A core premise, or assumption, of this point of view is that conscious decisions directly influence behavior. And this is a hard case to make, particularly since we can’t find a good scientific, much less conversational, baseline from which to flesh out what consciousness is.
Consciousness and Self Awareness
It is often claimed that consciousness is somehow special and potentially inexplicable. The nature of consciousness - how it’s defined, what forces and machinery it is built on, where those systems reside and what their boundaries are - may be so difficult to pin down because language and consciousness seem to me to be in service of the same biological function, using much of the same machinery. It is not apparent to me that consciousness is categorically different from any other aspect of the systems that manage biological behavior.
Consciousness may be a consequence of evolution providing an adaptive lever for social cooperation. What better way to provide a selective pressure for cooperation than to develop some sensory mechanism for social alignment? That raises the question: can we have consciousness without a role, or a Sense of self? Just to be clear, I don’t want to suggest that this Sense of self implies or would require self awareness. A perceptual Sense that provides useful stimulus in this regard might be achievable as a primitive Sense of self, not necessarily the complex form we now hold, but maybe something more rudimentary that provides simple social accountability. Such a Sense could have evolved toward greater complexity as a stronger, broader Sense of self provided better collective problem solving and the distribution of increasingly complex tasks. This in turn could have evolved into the ability to recount and simulate situations in order to predict and coordinate. Maybe this led to complex language, or vice versa? Either way, this simulation would be what we might consider imagination. This imagination might be all there is to the Sense of consciousness. From what we think we understand about how the brain relates to behavior, how it dictates behavior, the entirety of this imaginative process occurs after we behave. That is to say that all of our decisions and actions are made before we imagine them, or before we become “conscious” of them. It may be that these simulations are necessary to manage the temporal complexities of cooperation.
Consciousness may be nothing more than a complex, but not categorically special, sensory mechanism to provide simulated stimuli to drive behavior that supports advanced social coordination.