SSS Yet To Be Done.
https://www.britannica.com/topic/philosophy-of-mind/Homunculi#ref1052314
“What it’s like”
Ned Block has pointed out an important distinction between two concepts of consciousness that many of these proposals might be thought to run together: "access" (or "A-") consciousness and "phenomenal" (or "P-") consciousness. Although they might be defined in a variety of ways, depending upon the details of the kind of computational (or other) theory of thought being considered, A-consciousness is the concept of some material's being conscious by virtue of its being accessible to various mental processes, particularly introspection, while P-consciousness consists of the qualitative or phenomenal "feel" of things, which may or may not be so accessible. Indeed, the fact that material is accessible to processes does not entail that it actually has a feel, that there is "something it's like" to be conscious of that material. Conversely, Block goes on to argue that the fact that material has a certain feel does not entail that it is accessible.
In the second half of the 20th century, the issue of P-consciousness was made particularly vivid by two influential articles regarding the very special knowledge that one seems to acquire as a result of conscious experience. In "What Is It Like to Be a Bat?" (1974), Thomas Nagel pointed out that no matter how much someone might know about the objective facts about the brains and behaviour of bats and of their peculiar ability to echolocate (to locate distant or invisible objects by means of sound waves), that knowledge alone would not suffice to convey the subjective facts about "what it's like" to be a bat. Indeed, it is unlikely that human beings will ever be able to know what the world seems like to a bat. In "Epiphenomenal Qualia" (1982), Frank Jackson made a similar point by imagining a brilliant colour scientist, "Mary" (the name has become a standard term in discussions of the notion of phenomenal consciousness), who happens to know all the physical facts about colour vision but has never had an experience of red, either because she is colour-blind or because she happens to live in an unusual environment. Suppose that one day, through surgery or by leaving her strange environment, Mary finally does have a red experience. She would thereby seem to have learned something new, something that she did not know before, even though she previously knew all of the objective facts about colour vision.
http://protoscience.wikia.com/wiki/Phenomenal_and_Access_Conciousness
Phenomenal and Access Consciousness
Ned Block (1942- )
Ned Block is in the NYU Department of Philosophy.
Two types of consciousness
According to Block[1], "Phenomenal consciousness is experience; the phenomenally conscious aspect of a state is what it is like to be in that state. The mark of access-consciousness, by contrast, is availability for use in reasoning and rationally guiding speech and action." Block feels that it is possible to have phenomenal consciousness and access consciousness independently of each other, but in general they do interact.
There is no generally agreed upon way of categorizing different types of consciousness. Block's distinction between phenomenal consciousness and access consciousness tries to distinguish between conscious states that either do or do not directly involve the control of thought and action.
Phenomenal consciousness. According to Block, phenomenal consciousness results from sensory experiences such as hearing, smelling, tasting, and having pains. Block groups together as phenomenal consciousness the experiences of sensations, feelings, perceptions, thoughts, wants and emotions. Block excludes from phenomenal consciousness anything having to do with cognition, intentionality, or with "properties definable in a computer program".
Access consciousness. Access consciousness is available for use in reasoning and for direct conscious control of action and speech. For Block, the "reportability" of access consciousness is of great practical importance. According to Block:
"reportability.....is often the best practical guide to A-consciousness" [Note: Block often uses the terms "P-consciousness" and "A-consciousness" to refer to "Phenomenal consciousness" and "Access consciousness"]
Also, access consciousness must be "representational" because only representational content can figure in reasoning. Examples of access consciousness are thoughts, beliefs, and desires.
A potential source of confusion is that some phenomenal consciousness is also representational. The key point is that content is placed in the access consciousness category because of its representational aspect, whereas elements of phenomenal consciousness are assigned to their category because of their phenomenal content.
Reaction
An immediate point of controversy for Block's attempt to divide consciousness into the subdivisions of phenomenal consciousness and access consciousness is that many people view the mind as resulting (in its entirety) from fundamentally computational processes. This computational view of mind implies that ALL of consciousness is "definable in a computer program", so Block's attempt to describe some consciousness as phenomenal consciousness cannot succeed in identifying a distinct category of conscious states.
As mentioned above, Block feels that phenomenal consciousness and access consciousness normally interact, but it is possible to have access consciousness without phenomenal consciousness. In particular, Block believes that zombies are possible and a robot could exist that is "computationally identical to a person" while having no phenomenal consciousness. Similarly, Block feels that you can have an animal with phenomenal consciousness but no access consciousness.
Block believes that we can have conscious experiences that are not possible to produce by any type of computational algorithm and that the source of such experiences is "the hard problem" of consciousness. Block's position with respect to consciousness is analogous to that of Vitalists who defined Life as being in a category distinct from all possible physical processes. Biologists refute Vitalism by describing the physical processes that account for Life. In order to refute Block's claim about the distinction between phenomenal consciousness and access consciousness, it is up to biologists and artificial consciousness researchers to describe computational algorithms that account for consciousness.
Why are some neurobiologists and computer scientists sure that Block's dualist division of consciousness is wrong? What is the source of Block's certainty that there are non-computational forms of consciousness? One example of phenomenal consciousness discussed by Block is a loud noise that you do not consciously notice because you are paying attention to something else. Block is sure that you were aware of the noise (phenomenal consciousness) but just not "consciously aware" (access consciousness). Many scientists would say that in this case, you were not "consciously aware" of the noise, but it is almost certain that portions of your unconscious brain activity responded to the noise (you could electrically record activity in the primary auditory cortex that is clearly a response to action potentials arriving from the ears due to sound waves from the noise). This suggests that Block's controversial "non-computational" category of phenomenal consciousness includes brain activity that others would categorize as being unconscious, not conscious. Some unconscious brain activity can begin to contribute to consciousness when the focus of one's conscious awareness shifts. This suggests that some of what Block calls phenomenal consciousness is brain activity that can either take place outside of consciousness or as part of consciousness, depending on other things that might be going on in the brain at the same time. If so, we can ask why the consciously experienced version of this kind of brain activity is computational while the unconscious version is not.
Block stresses that he makes use of introspection to distinguish between phenomenal consciousness and access consciousness. Presumably this means that when the loud noise was not noticed, it was not accessed by introspection. Block has thus defined a category of consciousness that is outside of our "conscious awareness" (although he says we are "aware" of it in some other way) and not accessed by introspection. Maybe it is this inaccessibility of some cases of phenomenal consciousness that motivates Block's idea that such forms of consciousness cannot be computational. When experiences are accessible to introspection and available for inclusion in reasoning processes, we can begin to imagine computational algorithms for their generation.
Forms of phenomenal consciousness that are open to introspection
In his 1995 article, Block went on to discuss more interesting cases, such as when, upon starting to "pay attention to" the previously ignored loud noise (see above), the experiencer notices that there had already been some earlier experience of the noise, just not of the type that we "pay attention to"; a type of experience that had been just "on the edge" of access consciousness.
In Ned Block's entry for "Consciousness" in the 2004 Oxford Companion to the Mind[2], he discusses another example that he feels distinguishes between phenomenal consciousness and access consciousness.
As described by Block, Liss[3] performed an experiment in which he presented test subjects with visual stimuli: views of 4 letters. The 4 letters were shown to the test subjects in two different ways:
1) "long" stimulus, e.g. 40 msec, followed by a second visual stumulus, a “mask” known to make the first stimulus (the letters) hard to identify
- or
2) "short", e.g. 9 msec, without a second stimulus (the mask).
- "Subjects could identify 3 of the 4 letters on average in the short case but said they were weak and fuzzy. In the long case, they could identify only one letter, but said they could see them all and that the letters were sharper, brighter and higher in contrast. This experiment suggests a double dissociation: the short stimuli were phenomenally poor but perceptually and conceptually OK, whereas the long stimuli were phenomenally sharp but perceptually or conceptually poor, as reflected in the low reportability."
This experiment demonstrates a distinction between (i) reportability of the names of the letters and (ii) perceptual sharpness of the image.
Block's definitions of these two types of consciousness lead us to the conclusion that a non-computational process can present us with phenomenal consciousness of the forms of the letters, while we can imagine an additional computational algorithm for extracting the names of the letters from their form (this is why computer programs can perform character recognition). The ability of a computer to perform character recognition does not imply that it has phenomenal consciousness or that it need share our ability to be consciously aware of the forms of letters that it can algorithmically match to their names.
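To illustrate the kind of computational algorithm the previous paragraph has in mind, here is a toy character recognizer in Python. It is a hypothetical sketch, not anything Block describes: letters are tiny binary bitmaps and naming is nearest-template matching, i.e. a purely computational mapping from a letter's form to its name.

```python
# A toy illustration of "extracting the names of letters from their form":
# each letter is a small binary bitmap, and recognition is nearest-template matching.
# Nothing here is claimed to involve phenomenal consciousness; it is pure computation.

LETTER_TEMPLATES = {
    # 3x5 binary bitmaps (1 = ink, 0 = background), purely illustrative shapes
    "T": [(1, 1, 1),
          (0, 1, 0),
          (0, 1, 0),
          (0, 1, 0),
          (0, 1, 0)],
    "L": [(1, 0, 0),
          (1, 0, 0),
          (1, 0, 0),
          (1, 0, 0),
          (1, 1, 1)],
    "O": [(1, 1, 1),
          (1, 0, 1),
          (1, 0, 1),
          (1, 0, 1),
          (1, 1, 1)],
}

def recognize(bitmap):
    """Return the letter name whose template differs from `bitmap` in the fewest pixels."""
    def distance(a, b):
        return sum(pa != pb for row_a, row_b in zip(a, b) for pa, pb in zip(row_a, row_b))
    return min(LETTER_TEMPLATES, key=lambda name: distance(bitmap, LETTER_TEMPLATES[name]))

# A noisy "T" (one pixel missing) is still named correctly by the matcher.
noisy_t = [(1, 1, 1),
           (0, 1, 0),
           (0, 1, 0),
           (0, 0, 0),   # missing pixel
           (0, 1, 0)]
print(recognize(noisy_t))  # -> "T"
```

However such a matcher is implemented, it only maps forms to names; whether anything like phenomenal awareness accompanies that mapping is exactly what the paragraph above leaves open.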
Reactions
If Block's distinction between phenomenal consciousness and access consciousness is correct, then it has important implications for attempts by neuroscientists to identify the neural correlates of consciousness and for attempts by computer scientists to produce artificial consciousness in man-made devices such as robots. In particular, Block seems to suggest that non-computational mechanisms for producing the subjective experiences of phenomenal consciousness must be found in order to account for the richness of human consciousness or for there to be a way to rationally endow man-made machines with a similarly rich scope of personal experiences of "what it is like to be in conscious states". Other philosophers of consciousness such as John Searle have similarly suggested that there is something fundamental about subjective experience that cannot be captured by conventional computer programs.
Many advocates of the idea that there is a fundamentally computational basis of mind feel that the phenomenal aspects of consciousness do not lie outside of the bounds of what can be accomplished by computation[4]. Some of the conflict over the importance of the distinction between phenomenal consciousness and access consciousness centers on just what is meant by terms such as "computation", "program" and "algorithm". In practical terms, how can we know if it is within the power of "computation", "program" or "algorithm" to produce human-like consciousness? There is a problem of verification: can we ever really know whether we have a correct biological account of the mechanistic basis of conscious experience, and how can we ever know whether a robot has phenomenal consciousness?
Many neurobiologists and computer scientists feel that philosophers such as Block and Searle are overly-pessimistic about the power of "computation", "program" or "algorithm" to produce human-like consciousness. The study of "computation", "program", "algorithm" and consciousness is too primitive for us to be able to trust our intuitions about exactly what is possible for computational algorithms to accomplish. Further, it may not matter what we call physical processes that can generate consciousness as long as we can figure out what they are and how to work with them. Thus, neurobiologists and computer scientists feel justified in continuing to search for the physical basis of consciousness and for ways to endow man-made devices with human-like consciousness. Further, despite warnings from philosophers, neurobiologists and computer scientists often suspect that conventional physical accounts of brain processes and some form of computational algorithm can be found to explain consciousness and allow us to instantiate it in robots.
Some philosophers such as Thomas Nagel have claimed a fundamental distinction between the first person experience of consciousness and any third person account of the mechanisms by which consciousness is generated[5]. If philosophers can be overly-pessimistic about what neuroscientists and computer scientists can accomplish from the third person perspective, they might also be overly-enthusiastic about the reliability of first person introspection. Some philosophers have been fundamentally skeptical about our ability to be certain about anything we observe from the first person perspective[6]. Despite any sense we may have of our inability to be wrong about our subjective evaluations of our own consciousness, it may be wise to keep an open mind and remain open to the possibility that phenomenal consciousness is not a distinct category from access consciousness. For example, they may lie at the two ends of a continuous spectrum of consciousness for which some forms of consciousness are easier to imagine as being algorithmically generated than others.
The results of the experiment of Liss (discussed above) can have several interpretations. Viewing printed letters can lead to activation of many different brain regions and brain processes. Some parts of the brain that are devoted just to visual processing make heavy contributions to our ability to form a clear mental image of the shape, form and color of letters. These brain regions allow us to become aware of visual features but we are almost totally unable to have any introspective insight into how we become aware of shape, form and color. Other parts of the brain are required for our normal ability to report the names of letters that we see. The experiment shows that by controlling the exact conditions under which experimental subjects are asked to report their experience of the letters, conditions will either favor awareness of letter form or awareness of the names of the letters. Presumably, a sufficiently detailed analysis of brain activity would reveal how the variable test conditions of the experiment result in different patterns of activity in various parts of the brain and would allow for an account of the results of the experiment in terms of the details of brain function.
Most philosophers participate in introspective efforts to understand the steps involved in their own linguistic competencies. Introspection can allow us to be aware of mental processes that seem to have a linear sequence for the production of speech or lines of reasoning. Computer science also has an established history of defining explicit algorithms by which strings of words can be placed in grammatically correct orders and various theorem generating programs now exist which seem to replicate some aspects of reasoning. Thus, introspection combined with knowledge of what computer science has and has not yet accomplished provides philosophers with certain intuitions[7] about the nature of consciousness and the nature of computation. In particular, Block has been led to suspect that phenomenal consciousness is fundamentally outside of the range of things that can be done with programs.
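As a minimal illustration of the sort of explicit word-ordering algorithm mentioned above (a generic sketch, not any particular system discussed by Block; the grammar and vocabulary are invented), the following Python fragment generates grammatically ordered word strings from a tiny context-free grammar.

```python
import random

# A tiny, illustrative context-free grammar: each symbol maps to a list of possible expansions.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["philosopher"], ["robot"], ["noise"]],
    "V":   [["hears"], ["describes"]],
}

def generate(symbol="S"):
    """Expand `symbol` recursively, choosing one rule at random, until only words remain."""
    if symbol not in GRAMMAR:          # terminal word
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    words = []
    for part in expansion:
        words.extend(generate(part))
    return words

print(" ".join(generate()))  # e.g. "the philosopher hears a noise"
```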
References
- ^ Block, N. (1995). "On a confusion about a function of consciousness." Behavioral and Brain Sciences 18 (2): 227-287.
- ^ Block, N. (2004). "Consciousness." In R. Gregory (ed.), Oxford Companion to the Mind, second edition.
- ^ Liss, P. (1968). "Does backward masking by visual noise stop stimulus processing?" Perception & Psychophysics 4: 328-330.
- ^ For a short account, see the Wikipedia entry on phenomenal and access consciousness. Charles Siewert provides a more detailed analysis in his article "Consciousness and Intentionality" in The Stanford Encyclopedia of Philosophy.
- ^ Nagel, T. (1974). "What is it like to be a bat?" The Philosophical Review 83 (4): 435-450.
- ^ Wittgenstein, L. (1972). On Certainty. Harper Perennial. ISBN 0061316865.
- ^ Güven Güzeldere described such intuitions about the distinction between phenomenal consciousness and access consciousness as "segregationist intuition." See "The many faces of consciousness: a field guide" in The Nature of Consciousness: Philosophical Debates. The MIT Press (1997). ISBN 0262522101.
Alternatives to Block's two categories of consciousness
Stanislas Dehaene, Claire Sergent, and Jean-Pierre Changeux, Proc Natl Acad Sci USA (2003) 100: 8520–8525.
See also
- This essay about Block's division of consciousness into two distinct categories was originally written for the "Consciousness studies" wikibook.
https://en.wikibooks.org/wiki/Consciousness_Studies/The_Conflict2#Phenomenal_consciousness_and_access_consciousness
Phenomenal consciousness and access consciousness
Block (1995) drew attention to the way that there appear to be two types of consciousness: phenomenal consciousness and access consciousness:
Phenomenal consciousness is experience; the phenomenally conscious aspect of a state is what it is like to be in that state. The mark of access-consciousness, by contrast, is availability for use in reasoning and rationally guiding speech and action. (Block 1995).
See the section on Ned Block's ideas for a deeper coverage of his approach to access and phenomenal consciousness.
Block uses Nagel's famous (1974) paper, "What is it like to be a bat?", as an exemplary description of phenomenal consciousness. Excellent descriptions have also been proffered by the empiricist philosophers, who gave lengthy descriptions of consciousness as partly experience itself. Although Block has formalised the idea of phenomenal and access consciousness, similar ideas have also been put forward by many philosophers, including Kant and Whitehead.
Access consciousness has two interpretations: in the first, used by Block, it applies to the functions that appear to operate on phenomenal consciousness; in the second, used by the behaviourists and eliminativists, it is some property of the functions of the brain that can be called 'consciousness'.
This division between phenomenal and functional aspects of consciousness is useful because it emphasises the idea of phenomenal consciousness as observation rather than action. Some philosophers, such as Huxley in 1874, have taken the view that because phenomenal consciousness appears to have no function it is of no importance or cannot exist. James (1879) introduced the term "epiphenomenalism" to summarise the idea that consciousness has no function.
The idea that phenomenal consciousness cannot exist is a type of Eliminativism (also known as Eliminative Materialism). Eliminativism owes much to the work of Sellars (1956) and Feyerabend (1963). Dennett (1978) applied Eliminativism to phenomenal consciousness and denied that pain is real. Others such as Rey (1997) have also applied eliminativism to phenomenal consciousness.
Dennett (1988) redefines consciousness in terms of access consciousness alone; he argues that "Everything real has properties, and since I don't deny the reality of conscious experience, I grant that conscious experience has properties". Having related all consciousness to properties, he then declares that these properties are actually judgements of properties. He considers judgements of the properties of consciousness to be identical to the properties themselves. He writes:
"The infallibilist line on qualia treats them as properties of one's experience one cannot in principle misdiscover, and this is a mysterious doctrine (at least as mysterious as papal infallibility) unless we shift the emphasis a little and treat qualia as logical constructs out of subjects' qualia-judgments: a subject's experience has the quale F if and only if the subject judges his experience to have quale F. "
Having identified "properties" with "judgement of properties" he can then show that the judgements are insubstantial, hence the properties are insubstantial and hence the qualia are insubstantial or even non-existent. Dennett concludes that qualia can be rejected as non-existent:
"So when we look one last time at our original characterization of qualia, as ineffable, intrinsic, private, directly apprehensible properties of experience, we find that there is nothing to fill the bill. In their place are relatively or practically ineffable public properties we can refer to indirectly via reference to our private property-detectors-- private only in the sense of idiosyncratic. And insofar as we wish to cling to our subjective authority about the occurrence within us of states of certain types or with certain properties, we can have some authority--not infallibility or incorrigibility, but something better than sheer guessing--but only if we restrict ourselves to relational, extrinsic properties like the power of certain internal states of ours to provoke acts of apparent re- identification. So contrary to what seems obvious at first blush, there simply are no qualia at all. " (Dennett 1988)
Dennett asserts that "a subject's experience has the quale F if and only if the subject judges his experience to have quale F". This is a statement of the belief that qualia are the same as processes such as judgements. Processes such as judgements are flows of data where one state examines a previous state in a succession over time, and they embody what Whitehead called the "materialist" concept of time. Dennett does not consider how a scientific concept of time might affect his argument.
Dennett's argument has been persuasive and there are now many philosophers and neuroscientists who believe that the problem of phenomenal consciousness does not exist. This means that, to them, what we call 'consciousness' can only be a property of the functions performed by the brain and body. According to these philosophers only access consciousness exists.
Those who support the idea of phenomenal consciousness also tend to frame it in terms of nineteenth-century theory in which one state examines a previous state in a succession over time. For instance, Edelman (1993) places the past in memories at an instant, and time within experience is explained as continuing modelling processes:
"Primary consciousness is the state of being mentally aware of things in the world--of having mental images in the present. But it is not accompanied by any sense of a person with a past and a future.... In contrast, higher-order consciousness involves the recognition by a thinking subject of his or her own acts or affections. It embodies a model of the personal, and of the past and the future as well as the present. It exhibits direct awareness--the noninferential or immediate awareness of mental episodes without the involvement of sense organs or receptors. It is what we humans have in addition to primary consciousness. We are conscious of being conscious."
Block (2004) also suggests this flow from state to state in his idea of "reflexivity", where our sense of familiarity with an object is due to one state being analysed by another:
"Thus in the “conscious” case, the subject must have a state that is about the subject’s own perceptual experience (looking familiar) and thus conscious in what might be termed a “reflexive” sense. An experience is conscious in this sense just in case it is the object of another of the subject’s states; for example, one has a thought to the effect that one has that experience. The reflexive sense of 'consciousness' contrasts with phenomenality, which perhaps attaches to some states which are not the objects of other mental states. Reflexive consciousness might better be called ‘awareness’ than ‘consciousness’. Reflexivity is phenomenality plus something else (reflection) and that opens up the possibility in principle for phenomenality without reflection. For example, it is at least conceptually possible for there to be two people in pain, one of whom is introspecting the pain the other not. (Perhaps infants or animals can have pain but don’t introspect it.) The first is reflexively conscious of the pain, but both have phenomenally conscious states, since pain is by its very nature a phenomenally conscious state. "
Both Block and Edelman allow phenomenal consciousness, our experience, as an unexplained phenomenon. Block's, Edelman's and Dennett's ideas of consciousness are shown in the illustration below:
[Illustration: diagrams comparing Block's, Edelman's and Dennett's models of consciousness (image not reproduced).]
This model differs from the empirical reports of phenomenal consciousness that were considered earlier. According to the empirical reports, the present moment in our experience is extended, so the succession of outputs or stages of access consciousness could constitute the contents of phenomenal consciousness. In other words, phenomenal consciousness is composed of periods of access consciousness. This is how it seems to the empiricist, and in our own experience, but how such a state could be explained in terms of brain activity is highly problematical. Given that nineteenth-century ideas cannot explain such a state, a scientific explanation will be required.
The idea that phenomenal consciousness misrepresents or "misdiscovers" itself (Dennett 1988) deserves further discussion. According to materialism, the present instant has no duration, so it can only be known in succeeding instants as a report or memory, and this report could be wrong. Whitehead considered that this viewpoint originates in an archaic view of science, particularly the concept of time in science:
"The eighteenth and nineteenth centuries accepted as their natural philosophy a certain circle of concepts which were as rigid and definite as those of the philosophy of the middle ages, and were accepted with as little critical research. I will call this natural philosophy 'materialism.' Not only were men of science materialists, but also adherents of all schools of philosophy. The idealists only differed from the philosophic materialists on question of the alignment of nature in reference to mind. But no one had any doubt that the philosophy of nature considered in itself was of the type which I have called materialism. It is the philosophy which I have already examined in my two lectures of this course preceding the present one. It can be summarised as the belief that nature is an aggregate of material and that this material exists in some sense at each successive member of a one-dimensional series of extensionless instants of time. Furthermore the mutual relations of the material entities at each instant formed these entities into a spatial configuration in an unbounded space. It would seem that space---on this theory-would be as instantaneous as the instants, and that some explanation is required of the relations between the successive instantaneous spaces. The materialistic theory is however silent on this point; and the succession of instantaneous spaces is tacitly combined into one persistent space. This theory is a purely intellectual rendering of experience which has had the luck to get itself formulated at the dawn of scientific thought. It has dominated the language and the imagination of science since science flourished in Alexandria, with the result that it is now hardly possible to speak without appearing to assume its immediate obviousness." (Whitehead 1920).