Babies and Computers: Are They Related? – Abel Nyamapfene
Abstract: Current opinion suggests that language is a cognitive process in which different modalities such as perceptual entities, communicative intentions and speech are inextricably linked. In this talk I discuss my belief that the problems psychologists are grappling with in child development are the same problems that computer scientists working in artificial intelligence and robotics are facing. I show how computational modelling, in conjunction with the availability of empirical data, has contributed to our understanding of child language acquisition, and how this knowledge has advanced progress in robotics.
Psychologist: How do babies learn life skills? Computer Scientist: How can you be as adaptive as a baby?
Basic Computer Organisation: Von Neumann Architecture • Stored program: data and programs are stored together • Sequential control: instructions are executed one after the other • Algorithmic: everything to be done is defined beforehand • A program implements the algorithm in a computer-friendly language
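A minimal Python sketch of the stored-program, sequential-control idea: a toy memory holds both instructions and data, and a program counter steps through the instructions in order. The instruction names and memory layout are illustrative only.

```python
# Minimal illustration of the stored-program idea: instructions and data
# live in the same memory, and a program counter steps through them in order.
memory = [
    ("LOAD", 7),    # put the constant 7 into the accumulator
    ("ADD", 5),     # add the constant 5
    ("STORE", 5),   # write the accumulator into memory cell 5
    ("HALT", None),
    None,           # cell 4: unused
    0,              # cell 5: data cell that the program writes to
]

acc = 0             # accumulator register
pc = 0              # program counter: sequential control

while True:
    op, arg = memory[pc]
    if op == "LOAD":
        acc = arg
    elif op == "ADD":
        acc += arg
    elif op == "STORE":
        memory[arg] = acc
    elif op == "HALT":
        break
    pc += 1         # move to the next instruction in sequence

print(memory[5])    # prints 12: everything was defined before execution
```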
Von Neumann Architecture: Pros & Cons. Good for procedures that can be pre-defined before execution, e.g. numerical computation, word processing, car assembly, precision surgery. Poor for procedures that have to be adapted on a situation-by-situation basis, e.g. language processing, pattern processing, an artificial human assistant.
Emerging Computer Applications • Social interaction: caregivers, domestic helpmates • Intelligent weaponry • Games • Medicine • Education
Examples: games, humanoids, medical diagnostics, education, weapons of war
Features Common To Intelligent Computer Applications • Computer applications still fall far short of expectations • Applications only work well within well-specified environments • Application scalability is limited • Processing capability shows little or no incremental growth
In Comparison: Children come into the world with little or no cognitive skill, yet they exhibit a developmental progression of increasing processing power and complexity. An example is language, where children progress from no language, to babbling, to one-word utterances, to two-word utterances and finally to full adult speech – and almost all children follow this progression. What can computing learn from children?
Learning from Child Development 1: Carry out empirical investigations of developmental activities – behavioural investigation and neuroscientific investigation 2: Use the empirical data to develop models of the development process 3: Assess and incrementally improve the models 4: Apply the knowledge to computing tasks
Empirical Investigation: Behavioural • Observe a developmental activity – e.g. language acquisition • Track a single child from conception to the stage of full acquisition – “Keep a Diary” • Study a sizeable number of children at the same stage of development • Carry out ethically approved psychological investigations on children, etc.
Empirical Investigation: Neuroscientific Investigate: • Brain Maturation Processes • Interaction of Brain Regions • Interaction of Individual Neurons
Models of Development Based on Brain Neural Processing Actual Neurons: Complex
Models of Development Based on Brain Neural Processing Artificial Neurons: Very Very Simplified
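To make the "very simplified" point concrete, here is a minimal sketch of a single artificial neuron: a weighted sum passed through a squashing function, which is all that the models below build on. The inputs, weights and choice of sigmoid here are illustrative.

```python
import numpy as np

def artificial_neuron(inputs, weights, bias):
    """A weighted sum followed by a sigmoid squashing function:
    a drastic simplification of a biological neuron."""
    activation = np.dot(weights, inputs) + bias
    return 1.0 / (1.0 + np.exp(-activation))

# Example: three inputs with fixed, hand-picked weights
x = np.array([0.5, 1.0, 0.2])
w = np.array([0.4, -0.6, 0.9])
print(artificial_neuron(x, w, bias=0.1))
```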
Some Models of One-Word Child Language. At this stage children use single words in place of whole utterances: “Dada” instead of “Here comes Daddy”, “Uh oh” instead of “I am happy”, “More” instead of “Give me some more”.
[Figure: a network with image and label input layers, separate image and label representation layers, a joint internal representation, and image and label output layers.] 1: A multilayer perceptron network for mapping images to text (Plunkett et al., 1992). The network by Plunkett et al. simulates word–image association and exhibits the same developmental learning pattern as a child, but its learning mechanism is not biologically plausible.
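A toy sketch of the general idea, assuming small binary "image" and label vectors and a single joint hidden layer trained by backpropagation to reconstruct both modalities; this is not Plunkett et al.'s actual architecture or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (hypothetical): 4 "images" as 9-bit retina patterns and 4 labels
# as one-hot vectors. The real model used far richer inputs.
images = rng.integers(0, 2, size=(4, 9)).astype(float)
labels = np.eye(4)

X = np.hstack([images, labels])      # image and label presented together
n_in, n_hidden = X.shape[1], 5       # 5 units form the joint representation

W1 = rng.normal(0, 0.5, (n_in, n_hidden))
W2 = rng.normal(0, 0.5, (n_hidden, n_in))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):            # train to reconstruct both modalities
    H = sigmoid(X @ W1)              # joint internal representation
    Y = sigmoid(H @ W2)              # reconstructed image + label
    err = Y - X
    # backpropagation of the squared reconstruction error
    dW2 = H.T @ (err * Y * (1 - Y))
    dW1 = X.T @ ((err * Y * (1 - Y)) @ W2.T * H * (1 - H))
    W2 -= 0.5 * dW2
    W1 -= 0.5 * dW1

# Retrieval test: present an image with a blank label and read the label units
probe = np.hstack([images[0], np.zeros(4)])
recalled = sigmoid(sigmoid(probe @ W1) @ W2)
print(recalled[9:].round(2))         # ideally close to labels[0]
```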
[Figure: two self-organising maps, one receiving perceptual input and one receiving speech input, joined by unidirectional links running in each direction between the perception and speech neuron layers; activated neurons are highlighted.] 2: Hebbian-linked self-organising architecture (Li, Farkas & MacWhinney, 2004). The network was inspired by the belief that brain modules are interlinked, and it successfully simulates word–object mapping in children.
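A toy sketch of the general approach, assuming two tiny one-dimensional SOMs and a simple co-occurrence-based Hebbian link matrix; the data, map sizes and learning schedule are illustrative and not those of Li, Farkas & MacWhinney.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data (hypothetical): 4 objects with 6-dim perceptual features and
# 4 word forms with 5-dim "phonological" features, paired by index.
percepts = rng.random((4, 6))
words = rng.random((4, 5))

def train_som(data, n_units, epochs=200, lr=0.3):
    """Train a tiny 1-D self-organising map with winner-take-all updates."""
    w = rng.random((n_units, data.shape[1]))
    for t in range(epochs):
        for x in data:
            winner = np.argmin(np.linalg.norm(w - x, axis=1))
            # update the winner and its immediate neighbours
            for j in range(max(0, winner - 1), min(n_units, winner + 2)):
                w[j] += lr * (1 - t / epochs) * (x - w[j])
    return w

som_p = train_som(percepts, n_units=8)   # perceptual map
som_w = train_som(words, n_units=8)      # word-form map

# Hebbian links between the two maps: strengthen the connection between
# the units that fire together when an object and its word co-occur.
hebb = np.zeros((8, 8))
for x, y in zip(percepts, words):
    i = np.argmin(np.linalg.norm(som_p - x, axis=1))
    j = np.argmin(np.linalg.norm(som_w - y, axis=1))
    hebb[i, j] += 1.0

# Word-object mapping: given a percept, follow the strongest Hebbian link
i = np.argmin(np.linalg.norm(som_p - percepts[2], axis=1))
print("word unit recalled for object 2:", np.argmax(hebb[i]))
```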
[Figure: a full counterpropagation network with x and y input layers feeding a cluster layer of units Z1 … ZN, which in turn drives x and y output layers.] 3: An approach that can associate two input types: the full counterpropagation network (Hecht-Nielsen, 1987).
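A toy sketch of a full counterpropagation network, assuming small paired x and y patterns: a competitive (Kohonen) cluster layer learns the joint input and an outstar layer reproduces both patterns, so presenting x alone can recall its associated y. All patterns and parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical paired patterns: x (e.g. a perceptual code) and y (e.g. a word code)
X = np.eye(4)                                   # four 4-dim x patterns
Y = rng.integers(0, 2, (4, 3)).astype(float)    # paired 3-dim y patterns

n_clusters = 4
Wx = rng.random((n_clusters, 4))     # Kohonen weights for the x part
Wy = rng.random((n_clusters, 3))     # Kohonen weights for the y part
Vx = np.zeros((n_clusters, 4))       # Grossberg (outstar) weights to x output
Vy = np.zeros((n_clusters, 3))       # Grossberg weights to y output

for t in range(200):
    for x, y in zip(X, Y):
        # competitive (Kohonen) layer: winner on the joint input
        z = np.argmin(np.linalg.norm(Wx - x, axis=1) +
                      np.linalg.norm(Wy - y, axis=1))
        Wx[z] += 0.2 * (x - Wx[z])
        Wy[z] += 0.2 * (y - Wy[z])
        # outstar layer learns to emit both patterns from the winning cluster
        Vx[z] += 0.2 * (x - Vx[z])
        Vy[z] += 0.2 * (y - Vy[z])

# Recall: present x alone (match on the x part only) and read the y output
z = np.argmin(np.linalg.norm(Wx - X[1], axis=1))
print("recalled y for x pattern 1:", Vy[z].round(2), "target:", Y[1])
```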
4: Extending the counterpropagation approach to modelling child language (Nyamapfene & Ahmad, 2007). [Figure: perceptual, speech and intentional inputs connected through modal weights to a competitive neuron layer.] The model is based on empirical evidence that children have communicative intentions and that the brain has multimodal neurons.
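A toy sketch of a multimodal competitive layer with one modal weight matrix per input, assuming hypothetical perceptual, intentional and speech codes. It illustrates the idea of recalling a one-word utterance from a percept and an intention, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical one-word-stage data: each "scene" pairs a perceptual code,
# a communicative-intention code and the speech (word) code uttered.
percept = np.eye(4)                                  # 4 objects
intent = rng.integers(0, 2, (4, 3)).astype(float)    # toy intention codes
speech = np.eye(4)                                   # 4 single-word utterances

n_units = 6
W = {m: rng.random((n_units, d.shape[1]))            # one modal weight matrix per input
     for m, d in [("percept", percept), ("intent", intent), ("speech", speech)]}

def winner(inputs):
    """Competitive layer: the unit whose modal weights best match all the
    modalities that are actually present (missing modalities are skipped)."""
    dist = np.zeros(n_units)
    for m, x in inputs.items():
        dist += np.linalg.norm(W[m] - x, axis=1)
    return np.argmin(dist)

for t in range(300):                                 # joint competitive learning
    for k in range(4):
        z = winner({"percept": percept[k], "intent": intent[k],
                    "speech": speech[k]})
        for m, x in [("percept", percept[k]), ("intent", intent[k]),
                     ("speech", speech[k])]:
            W[m][z] += 0.2 * (x - W[m][z])

# Production: given a percept and an intention but no speech, the winning
# unit's speech weights act as the child's one-word utterance.
z = winner({"percept": percept[2], "intent": intent[2]})
print("utterance code:", W["speech"][z].round(2))
```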
I have described some investigations of child language acquisition through: • Physically observing infants acquiring language • Studying relevant brain structures • Building, testing and modifying brain-inspired computer models of child language acquisition
Current Conclusions on Child Language Acquisition Suggest That: • Child language has multiple inputs that need to be processed simultaneously • Language acquisition takes place through social interaction with caregivers • Children have desires and emotions, set and modify goals, monitor ongoing speech acts, and generate communicative intentions which lead to speech utterances
5: A control-theoretic neural multi-net model of child language acquisition (Nyamapfene, 2008). [Block diagram of a control-systems approach to modelling child language at the one-word stage: the child's desires, emotions, drives and goals generate communicative intentions, which produce a single-word utterance in the environment; the caregiver's response feeds back to the child.]
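A schematic sketch of the feedback loop only, with hypothetical drives, vocabulary and caregiver behaviour; this is not the published control-theoretic model, just an illustration of how a goal, an utterance and a caregiver response close the loop.

```python
import random

random.seed(0)

def caregiver_response(utterance, context):
    """Hypothetical caregiver: responds only if the word matches the context."""
    return "gives_juice" if utterance == "juice" and "juice" in context else "ignores"

def child_turn(drive, vocabulary, context, max_attempts=3):
    goal = drive                       # e.g. the desire for juice becomes the goal
    for attempt in range(max_attempts):
        intention = goal               # communicative intention derived from the goal
        utterance = intention if intention in vocabulary else random.choice(vocabulary)
        response = caregiver_response(utterance, context)
        print(f"attempt {attempt + 1}: says '{utterance}' -> caregiver {response}")
        if response == "gives_juice":  # monitoring: goal satisfied, stop
            return True
    return False                       # goal not met; the child may modify the goal

child_turn(drive="juice", vocabulary=["juice", "dada", "more"], context={"juice", "cup"})
```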
From Child Development To Computing Cynthia Breazeal has developed Kismet, a robot that employs drives and emotions to interact with a human – modelled on the social interaction between an infant and a caregiver (Breazeal and Brooks, 2004)
Current & Future Projects • Developing a multimodal neural network model that learns from child-directed speech using cross-situational techniques • Implementing the control-theoretic model of child language acquisition presented in this talk using neural multi-nets • Migrating the child-language work onto a robotic platform (circa 2009–2010)
Finally: Yes, I Think Babies and Computers are Related Thank You!!??!!
References • C. Breazeal and R. Brooks (2004). "Robot Emotion: A Functional Perspective," in J.-M. Fellous and M. Arbib (eds.), Who Needs Emotions: The Brain Meets the Robot, MIT Press (forthcoming 2004). • R. Hecht-Nielsen (1987). "Counterpropagation Networks," Applied Optics 26: 4979–4984. • P. Li, I. Farkas and B. MacWhinney (2004). "Early lexical development in a self-organizing neural network," Neural Networks 17: 1345–1362. • A. Nyamapfene (2008). "Computational Investigation of Early Child Language Acquisition Using Multimodal Neural Networks: A Review of Three Models," Artificial Intelligence Review (submitted). • A. Nyamapfene and K. Ahmad (2007). "A Multimodal Model of Child Language Acquisition at the One-Word Stage," 20th International Joint Conference on Neural Networks (IJCNN 2007), 12–17 August 2007, Orlando, Florida, USA. • K. Plunkett, C. Sinha, M. F. Muller and O. Strandsby (1992). "Symbol grounding or the emergence of symbols? Vocabulary growth in children and a connectionist net," Connection Science 4: 293–312.