Machine Models of Consciousness: Part 1 Ron Chrisley COGS/Informatics University of Sussex
Overview of Part 1 • (Some) philosophical background • Approaches: • Sloman and Chrisley • Haikonen • Discussion
What is Machine Consciousness? • Machine Consciousness (MC) is the attempt to make artefacts that possess some or all of the features of natural consciousness, e.g.: • Autonomy • Adaptivity/learning • Emotion/affect • Responsibility • Intelligence • Perception • Action • Imagination • Self-awareness • Attention
General approaches to MC • Synthesizing: instantiating • Modelling: testing/developing theories • Engineering: Behavioural pragmatics As in AI, an important question is: To what extent must/should/can we duplicate the structure of existing, biological consciousness?
What won't be covered today • MC as prosthesis • Ethical issues • Objections to the very idea of machine consciousness • Consciousness is a natural phenomenon • Scepticism about machine consciousness shouldn't be so strong as to result in scepticism about natural consciousness
MC and theories of consciousness • General MC is compatible with most if not all theories of consciousness - even dualistic ones • Functional: • Higher order • Narrative • Representational • Informational • Enactive • Material: • Neurophysiological/Biological/Enactive • Quantum
Sloman and Chrisley • Primarily philosophical/conceptual, but: • Dovetails well with Haikonen and other approaches presented today • More empirical/implementation work being done (e.g., CoSy) • Detailed in: Sloman, A. and Chrisley, R. (2003) "Virtual machines and consciousness", Journal of Consciousness Studies 10(4-5), pp. 133-172.
Sloman and Chrisley • Aspects of the approach • Deconstruction of concept of consciousness • Architecture-based interactive conceptual change • Qualia: Realist heterophenomenology • Relevant, but won't say much about: • Virtual machine functionalism • Emotion and affect
Deconstruction of "consciousness" • Concept of consciousness is not well-behaved • Cluster concept: worse than being a vague concept, or mongrel concept • Confusion is not just empirical, but conceptual • Context-sensitive • Parable of blind men and elephant • As it stands, the concept is unsuitable for scientific purposes
Architectural approach • Proposed solution: Modify existing concepts (or develop new ones) based on (interaction with) architectural analysis of the processes underlying different phenomena related to consciousness • Why necessary: • Concepts of consciousness like "qualia" are architecture-driven, causally indexical concepts • They therefore have no meaning outside of the context of their particular architectural home
How to explain consciousness architecturally • Construct an architecture that models an agent who has the concept of, say, "qualia" • Find the processes in that architecture that are the ones actually being referred to by that agent's use of "qualia" • The explanations of those processes will be explanations of qualia
Realist heterophenomenology Compare and contrast with Dennett's approach: • Like Dennett, grants that • Phenomenological reports are an important source of data about consciousness • But this does not mean they should be taken to be true • Unlike Dennett, doesn't take falsity to be failure of reference • Not eliminativist nor fictionalist • "Qualia" more like "gold" than "phlogiston"
Haikonen • Principal Scientist of Cognitive Technology at Nokia Research • Presentation here based on: Haikonen, P. (2003) The Cognitive Approach to Conscious Machines. Exeter: Imprint Academic, especially chapters 8, 9, 18 and 19 • General approach is to develop a general cognitive/affective architecture, and then show how it implements/models aspects of consciousness
The Mind-Body Effect • "Any worthwhile material theory of consciousness must explain why our thinking and feelings appear to be immaterial and how these seemingly immaterial processes are related to the material processes of the brain" (p 144)
The Mind-Body Effect • Solution seems to be philosophical rather than empirical/computational: "The apparent immateriality of our thoughts is caused by omission; the inability to perceive the actual signals and machinery that carry the perceived information. This is not a result of any intricate steps of evolution, this is the simplest state of affairs" (p 249)
General architecture • Architecture consists of a large number of perception/response loops that produce "percepts" of their own domain, either about external world or internal causes, and responses to these percepts • Not all percepts reach consciousness; what makes the difference? • "The level of active cross connections and binding between modalities; the cross modality reporting and learning of related associative connections and thus the establishment of episodic memories of the event" (p 254)
The basic building block – the perception/response module • The internal feedback represents introspection, which is returned into the terms of sensory percepts by the feedback neurons. • The introspection may represent a prediction or expectation for the sensory input. This is compared to the actual sensory input at the feedback neurons, and a match/mismatch/novelty signal is generated. • Kinesthetic perception/response loops are connected to mechanical effectors. In that case the internal feedback represents the kinesthetic percepts that are expected due to the effector action (similar to the corollary discharge). Here the match signal indicates successful execution, and also ownership of the cause of the action. (Haikonen 2006)
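The comparison step above can be sketched in code. This is a minimal toy illustration of the prediction-versus-input comparison at the feedback neurons, not Haikonen's implementation; the class name, tolerance value, and update rule are all illustrative assumptions.

```python
def compare(prediction, actual, tol=0.1):
    """Feedback neurons: compare the internal prediction with the sensory input.

    tol is an assumed tolerance, not a value from Haikonen (2003).
    """
    if prediction is None:
        return "novelty"                 # no expectation yet: input is novel
    error = abs(prediction - actual)
    return "match" if error <= tol else "mismatch"


class PerceptionLoop:
    """Toy sketch of one perception/response loop (illustrative names)."""

    def __init__(self):
        self.prediction = None           # internal feedback ("introspection")

    def step(self, sensory_input):
        # Compare expectation with actual input at the feedback neurons.
        signal = compare(self.prediction, sensory_input)
        percept = sensory_input          # percepts stay in sensory terms
        # Feed the percept back as the next expectation, so introspection
        # is expressed in the terms of sensory percepts.
        self.prediction = percept
        return percept, signal


loop = PerceptionLoop()
print(loop.step(0.5))   # -> (0.5, 'novelty'): no prior expectation
print(loop.step(0.5))   # -> (0.5, 'match'): input met expectation
print(loop.step(0.9))   # -> (0.9, 'mismatch'): expectation violated
```

In a kinesthetic loop the same match signal would instead confirm that an issued effector action produced the expected proprioceptive feedback.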
Dual nature of conscious states • Intentionality: • aboutness or reference • not hard to explain: representational theory • Feel: • Not things represented, but a system reaction that controls attention, etc. • Why do these feel like anything? • Overexplaining: Microwave oven fallacy
Self-consciousness • Based on perception of the bodily self • So agrees (with, e.g., Holland) that robots should be able to perceive themselves (especially while acting) • Dual nature of touching oneself
Bonus slides: Synthetic phenomenology Two (related) varieties: • Synthetic phenomenology: • Creation of phenomenal states (experience) in an artificial system • Or using an artificial system ("robot") to simulate or model such states • "constructive synthetic phenomenology" • Synthetic phenomenology: • Specification, representation, communication or visualisation of experiential states involving artificial systems ("robots") • "descriptive synthetic phenomenology"
[Figure: network architecture for generating expected sensations. Labelled units include a D-map, a Predicted State, a T-map, the Previous Predicted State (context units), and an Action input. Key: full inter-connection between layers of units; recurrent (copy) connection.] Full description in Chrisley (1990), Chrisley (1992), Chrisley & Holland (1990), et al.
Displaying experiential states: Enactive approach • Don't focus on representations, but on sensory-motor contingencies (cf. O'Regan, et al) • Experience at any time t includes: • Focal sensations at t • Focal sensations one would expect to have were one to perform action a at t • Weighted by that a's priority/possibility/likelihood? • Organised/indexed by a's position in the space of actions • Focal sensations one would expect to have were one to perform action a' at t' in the situation one is in after performing a at t • So display not only current sensations, but (temporally/priority discounted) expected sensations
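The display scheme on this slide can be sketched as a simple data structure: the current focal sensation plus, for each available action, the sensation one would expect after performing it, weighted by that action's priority and discounted for being one step in the future. Every name, the sample sensations, and the discount scheme here are illustrative assumptions, not content from the slides.

```python
def experiential_state(current_sensation, expectations, discount=0.5):
    """Render an experiential state in the enactive style (toy sketch).

    expectations maps each action to (expected_sensation, priority in [0, 1]).
    discount is an assumed temporal discount factor.
    """
    display = {"now": current_sensation}
    for action, (expected, priority) in expectations.items():
        # Weight each expected sensation by the action's priority,
        # discounted because it lies one step into the future.
        display[action] = (expected, discount * priority)
    return display


# Hypothetical example: a visual scene and two candidate saccades.
state = experiential_state(
    "red patch, centre",
    {
        "saccade-left": ("red patch, right of centre", 0.8),
        "saccade-right": ("edge of patch", 0.3),
    },
)
print(state["now"])            # -> red patch, centre
print(state["saccade-left"])   # -> ('red patch, right of centre', 0.4)
```

Iterating this one step further (expectations about a' at t' after performing a at t) would nest the same structure inside each action entry.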