# Consciousness

Consciousness can be defined in information terms as a property of an entity (usually a living thing, but we can also include artificially conscious machines or computers) that reacts to the information (and particularly to changes in the information) in its environment. In the context of information philosophy, we can define this as _information consciousness_. Thus an animal in a deep sleep is not conscious, because it ignores changes in its environment. And robots may be conscious in our sense. Even the lowliest control system using negative feedback (a thermostat, for example) is in a minimal sense conscious of (aware of, exchanging information about) changes in its environment.

This definition of consciousness fits with our model of the mind as an _experience recorder and reproducer_ (ERR). The ERR model stands in contrast to the popular cognitive science or "computational" model of the mind as a digital computer. No "information processing" (no processing units, algorithms, or stored programs) is needed for the ERR model, although we do see the mind as _immaterial_ "software" in the _material_ brain "hardware."

The _physical_ metaphor for ERR is a non-linear, random-access data recorder in which data is stored using content-addressable memory (the memory address is the data content itself). Simpler than a computer with stored algorithms, a better technological metaphor might be a video and sound recorder, enhanced with the ability to simultaneously record smells, tastes, touches, and - critically - feelings, something no computer can do.

The _biological_ model is neurons that wire together during an organism's experiences, in multiple sensory and limbic systems, such that later firing of even a part of the wired neurons can stimulate firing of all or part of the original complex experience.

The _psychological_ aspect of ERR is that "subjective experience" depends on the diverse and unique past life experiences of each individual, leading to what [David Chalmers](https://www.informationphilosopher.com/solutions/philosophers/chalmers/) calls the "hard problem" of consciousness and to [Thomas Nagel](https://www.informationphilosopher.com/solutions/philosophers/nagelt/)'s "what it's like to be...".

Where neuropsychologist [Donald Hebb](https://www.informationphilosopher.com/solutions/scientists/hebb/) famously argued that "neurons that fire together wire together," our _experience recorder and reproducer_ (ERR) model assumes that "neurons that have been wired together will fire together." If just some of those wired-together neurons are fired by a new experience, many more of them may fire again, explaining many aspects of memory, feelings, and the association of ideas.

Neuroscientists are investigating how diverse signals from multiple pathways can be unified in the brain. We offer no specific insight into these "binding" problems. Nor can we shed much light on the question of the philosophical "meaning" of any given information structure, beyond the obvious relevance (survival value) for the organism of remembering, and thus learning from, past experiences.

A conscious being is constantly recording information about its perceptions of the external world, and, most importantly for ERR, it is simultaneously recording its feelings. Sensory data such as sights, sounds, smells, tastes, and tactile sensations are recorded in sequence along with pleasure and pain states, fear and comfort levels, etc.
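The content-addressable recording and playback described above can be made concrete with a toy sketch. The following Python example is only an illustration under simplifying assumptions: the `Experience` and `ExperienceRecorder` names, the feature-set encoding of an episode, and the overlap threshold are hypothetical choices, not part of the ERR model as described here.

```python
# A minimal sketch of the ERR recording-and-playback idea, assuming a toy
# representation of an experience as a set of features (sights, sounds,
# smells) stored together with the feelings felt at the time. All names and
# parameters are illustrative assumptions, not a published implementation.

from dataclasses import dataclass, field

@dataclass
class Experience:
    """One recorded episode: sensory features plus the feelings recorded with them."""
    features: frozenset                           # e.g. {"smell:smoke", "sound:alarm"}
    feelings: dict = field(default_factory=dict)  # e.g. {"fear": 0.9}

class ExperienceRecorder:
    """Stores experiences whole; playback is triggered by content, not by any stored address."""
    def __init__(self):
        self.episodes = []

    def record(self, experience: Experience):
        self.episodes.append(experience)

    def reproduce(self, cue: set, threshold: float = 0.3):
        """Play back every stored episode whose content sufficiently overlaps the cue.
        A partial cue (some of the 'wired' features) reactivates the whole episode,
        feelings included."""
        matches = []
        for ep in self.episodes:
            overlap = len(ep.features & cue) / len(ep.features)
            if overlap >= threshold:
                matches.append(ep)
        return matches

# Usage: a fragment of past content ("smell:smoke") brings back the whole episode.
err = ExperienceRecorder()
err.record(Experience(frozenset({"smell:smoke", "sound:alarm", "sight:flames"}),
                      {"fear": 0.9}))
print(err.reproduce({"smell:smoke"}))
```

Note that nothing in the sketch retrieves a record by address or key; a fragment of the content itself is enough to bring back the whole episode, feelings included, which is the sense in which the memory is "content-addressable."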
All these experiential and emotional data are recorded in association with one another. This means that when the experiences are reproduced ("played back" when some of their interconnected neurons are fired by something in current experience), the accompanying emotions are once again felt, in synchronization.

The capability of reproducing experiences is critical to _learning_ from past experiences, so as to make them guides for action in future experiences. The ERR model is the minimal mind model that provides for such learning by living organisms. ERR also explains the uncontrollable and unpleasant recall of past negative experiences that generates post-traumatic stress disorders.

The ERR model does not need a single "central processing unit" (CPU) or even several "parallel processors." It does not use computer-like "data retrieval," based on the "address" of the data, to reproduce past experiences. All that is required is that past experiences "play back" (are reproduced) whenever they are stimulated by present experiences that _resemble_ the past experiences in one or more ways.

When the organism recreates past experiences by acting them out, they become "habitual" and "subconscious" information structures. This repetition, with the random variations caused by noise in recall, subtly changes the recorded experiences over time. It is critical that the original emotions also play back, along with any variations in current emotions that are experienced on playback. ERR then becomes an explanatory basis for conditioning experiments, both classical Pavlovian and behaviorist operant conditioning, and in general a model for associative learning.

[Bernard Baars](https://www.informationphilosopher.com/solutions/scientists/baars/)'s _Global Workspace Theory_ uses the metaphor of a "Theater of Consciousness," in which there is an audience of purposeful agents calling for the attention of the executive on stage. In the ERR model, vast numbers of past experiences clamor for the attention of the central executive at all times, whenever anything in current experience bears some resemblance to them. Global Workspace Theory is a version of the "blackboard" model of [Allen Newell](https://www.informationphilosopher.com/solutions/scientists/newell/) and [Herbert Simon](https://www.informationphilosopher.com/solutions/scientists/simon/), in which concepts written on the blackboard call up similar concepts by association from deep memory structures. The ERR model supports this view and explains the mechanism by which concepts (past experiences) are retrieved and come to the blackboard.

In [Daniel Dennett](https://www.informationphilosopher.com/solutions/philosophers/dennett/)'s consciousness model, the mind is made up of innumerable functional homunculi, each with its own goals and purposes. Some of Dennett's homunculi are information structures in the genes, which transmit "learning" or "knowledge" from generation to generation by heredity alone. Others are environmentally and socially conditioned, or consciously learned through cultural transmission of information.

If we define "current experience" as all afferent perceptions _plus_ the current contents of consciousness itself, we get a dynamic self-referential system with plenty of opportunities for negative and positive feedback. [William James](https://www.informationphilosopher.com/solutions/philosophers/james/)'s description of a "stream of consciousness," together with a "blooming, buzzing confusion" of the unconscious, appears to describe the ERR model very well.
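The self-referential loop just described, in which the contents of consciousness feed back into the cue for the next round of playback, can also be sketched in toy form. Everything specific here (the episode encoding, the resemblance score, the `play_back` and `stream_of_consciousness` names) is again an illustrative assumption, not a description of any actual mechanism.

```python
# A minimal sketch of the self-referential loop described above: "current
# experience" is taken to be the outside perceptions plus whatever was just
# played back, so each recall can trigger the next. All names, scores, and
# thresholds are illustrative assumptions.

# Each episode is a set of features recorded together (sensations + feelings).
episodes = [
    {"sight:sea", "sound:gulls", "feeling:calm"},
    {"sound:gulls", "smell:chips", "feeling:hunger"},
    {"smell:chips", "sight:kitchen", "feeling:comfort"},
]

def play_back(cue, threshold=0.3):
    """Reproduce every episode that sufficiently resembles the cue: no addresses,
    no lookup by key; resemblance alone triggers playback."""
    recalled = set()
    for ep in episodes:
        if len(ep & cue) / len(ep) >= threshold:
            recalled |= ep
    return recalled

def stream_of_consciousness(perceptions, steps=3):
    """Current experience = afferent perceptions + what consciousness just held;
    feeding recalls back in gives a drifting stream of associations."""
    contents = set()
    for _ in range(steps):
        cue = perceptions | contents   # self-reference: output re-enters the input
        contents = play_back(cue)
        print(sorted(contents))

stream_of_consciousness({"sound:gulls"})
```

Because each recall enlarges the cue for the next one, the associations spread from gulls to chips to the kitchen: a crude picture of James's drifting "stream," with the episodes not yet recalled forming the "blooming, buzzing" background.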
## The Elements of Consciousness

### Four "Levels" of Consciousness

> Instinctive Consciousness - by animals with little or no learning capability. Automatic reactions to environmental conditions are transmitted genetically. Information about past experiences (by prior generations of the organism) is only present implicitly in the inherited reactions.
>
> Learned Consciousness - for animals whose past experiences guide current choices. Conscious, but mostly habitual, reactions are developed through experience, including instruction by parents and peers.
>
> Predictive Consciousness - the Sequencer in the ERR system can play back beyond the current situation, allowing the organism to use imagination and foresight to evaluate the future consequences of its choices.
>
> Reflective (Normative) Consciousness - in which conscious deliberation about [values](https://www.informationphilosopher.com/value/) influences the choice of behaviors.

All four levels are [emergent](https://www.informationphilosopher.com/knowledge/emergence.html), in the sense that they did not exist in the lower, earlier levels of biological evolution.