Emergent Semantics: Meaning and Metaphor
Jay McClelland
Department of Psychology and Center for Mind, Brain, and Computation
Stanford University
The Parallel Distributed Processing Approach to Semantic Cognition
• Representation is a pattern of activation distributed over neurons within and across brain areas.
• Bidirectional propagation of activation underlies the ability to bring these representations to mind from given inputs.
• The knowledge underlying the propagation of activation is in the connections.
• Experience affects our knowledge representations through a gradual connection-adjustment process.
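The first two points can be sketched in a few lines. This is a minimal illustration with invented random weights, not any trained model: symmetric connections let activation propagate in both directions, and a partial input cue settles into a stable distributed pattern.

```python
import numpy as np

# Minimal sketch of bidirectional propagation (hypothetical weights).
# A symmetric weight matrix lets activation flow both ways; a partial
# input cue settles into a stable distributed pattern of activation.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n = 8
W = rng.normal(0.0, 0.5, (n, n))
W = (W + W.T) / 2            # symmetric connections: bidirectional flow
np.fill_diagonal(W, 0.0)     # no self-connections

cue = np.zeros(n)
cue[:3] = 1.0                # partial input: only some units are cued

a = np.zeros(n)
for _ in range(50):          # iterate until the pattern settles
    a = 0.5 * a + 0.5 * sigmoid(W @ a + cue)

print(np.round(a, 2))        # the settled distributed representation
```

Learning would then adjust W gradually from experience; here it is held fixed for brevity.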
Distributed Representations: Overlapping Patterns for Related Concepts
[Figure: distributed activation patterns for dog, goat, and hammer]
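The overlap idea can be made concrete with hypothetical binary feature vectors (invented for illustration, not taken from the slide's figure): dog and goat share more active units than dog and hammer, so their patterns are more similar.

```python
import numpy as np

# Hypothetical feature vectors: related concepts share more active units.
dog    = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=float)
goat   = np.array([1, 1, 1, 0, 1, 0, 0, 0], dtype=float)
hammer = np.array([0, 0, 0, 1, 0, 1, 1, 1], dtype=float)

def overlap(a, b):
    # cosine similarity between two activation patterns
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(overlap(dog, goat))    # 0.75: strong overlap
print(overlap(dog, hammer))  # 0.25: weak overlap
```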
Emergence of Meaning and Metaphor
• Learned distributed representations that capture important aspects of meaning emerge through a gradual learning process in simple connectionist networks.
• Metaphor arises naturally as a byproduct of learning information in homologous domains in models of this type.
Emergence of Meaning: Differentiation, Domain-Specificity, and Reorganization
The Training Data:
All propositions true of items at the bottom level of the tree, e.g.:
Robin can {grow, move, fly}
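The corpus can be viewed as (item, relation, attribute) triples. The fragment below is a hypothetical sketch built around the robin example from the slide, with two invented items added for illustration.

```python
# Hypothetical fragment of the training data: each entry lists the
# attributes true of an item under one relation.
facts = {
    ("robin",  "can"): {"grow", "move", "fly"},
    ("salmon", "can"): {"grow", "move", "swim"},   # invented example item
    ("oak",    "can"): {"grow"},                   # invented example item
}

# Expand into one (item, relation, attribute) triple per proposition.
triples = [(item, rel, attr)
           for (item, rel), attrs in facts.items()
           for attr in sorted(attrs)]
print(triples)
```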
Forward Propagation of Activation
net_i = Σ_j a_j w_ij
[Figure: sending units a_j activate receiving unit a_i through weights w_ij; a_i projects onward through weights w_ki]
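A minimal sketch of the forward rule, with arbitrary illustrative weights (w[i][j] is the weight from sending unit j to receiving unit i):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

a_j = np.array([1.0, 0.0, 1.0])            # sending-layer activations
w = np.array([[0.5, -0.3, 0.8],            # illustrative weights w[i][j]
              [0.2,  0.7, -0.1]])

net_i = w @ a_j                            # net_i = sum_j a_j * w_ij
a_i = sigmoid(net_i)                       # squash into (0, 1)
print(net_i, a_i)
```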
Back Propagation of Error (δ)
δ_i ≈ Σ_k δ_k w_ki, where at the output layer δ_k ≈ (t_k − a_k)
Error-correcting learning:
At the output layer: Δw_ki = ε δ_k a_i
At the prior layer: Δw_ij = ε δ_i a_j
…
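One full error-correcting step can be sketched as follows. The slide's rules omit the derivative factor of the squashing function; with sigmoid units it is a(1 − a), and it is included below. All weights and targets are invented for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

eps = 0.1                                    # learning rate (epsilon)
a_j = np.array([1.0, 0.0])                   # input activations
W1 = np.array([[0.5, -0.2],                  # weights w_ij into hidden units
               [0.3,  0.8]])
W2 = np.array([[0.4, 0.6]])                  # weights w_ki into output units
t = np.array([1.0])                          # target output

a_i = sigmoid(W1 @ a_j)                      # hidden activations
a_k = sigmoid(W2 @ a_i)                      # output activations

delta_k = (t - a_k) * a_k * (1 - a_k)        # output-layer error signal
delta_i = (W2.T @ delta_k) * a_i * (1 - a_i) # back-propagated error signal

W2 += eps * np.outer(delta_k, a_i)           # Delta w_ki = eps * delta_k * a_i
W1 += eps * np.outer(delta_i, a_j)           # Delta w_ij = eps * delta_i * a_j

a_k_new = sigmoid(W2 @ sigmoid(W1 @ a_j))    # error shrinks after the step
print(float(abs(t - a_k)[0]), float(abs(t - a_k_new)[0]))
```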
[Figure: the network's internal representations Early, Later, and Later Still in training, as a function of Experience]
Why Does the Model Show Progressive Differentiation?
Learning is sensitive to patterns of coherent covariation.
Coherent covariation: the tendency for properties of objects to co-vary in clusters.
What Drives Progressive Differentiation?
• Waves of differentiation reflect coherent covariation of properties across items.
• Patterns of coherent covariation are reflected in the principal components of the property covariance matrix.
• The figure shows attribute loadings on the first three principal components:
1. Plants vs. animals
2. Birds vs. fish
3. Trees vs. flowers
• Same color = features covary within the component
• Different colors = anti-covarying features
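The principal-components point can be illustrated with a toy item-by-property matrix (items and properties invented here; this is not the model's actual training set). The first principal component of the centered matrix separates the animals from the plants.

```python
import numpy as np

# Toy item-by-property matrix (invented): rows are items, columns are
# the properties grow, move, fly, swim, bark, petals.
items = ["robin", "canary", "salmon", "oak", "pine", "rose"]
P = np.array([[1, 1, 1, 0, 0, 0],    # robin
              [1, 1, 1, 0, 0, 0],    # canary
              [1, 1, 0, 1, 0, 0],    # salmon
              [1, 0, 0, 0, 1, 0],    # oak
              [1, 0, 0, 0, 1, 0],    # pine
              [1, 0, 0, 0, 0, 1]],   # rose
             dtype=float)

X = P - P.mean(axis=0)               # center each property column
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S                       # item coordinates on each component

# The first component splits animals from plants (its sign is arbitrary).
for name, s in zip(items, scores[:, 0]):
    print(f"{name:>7}: {s:+.2f}")
```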
Sensitivity to Coherence Requires Convergence
Conceptual Reorganization (Carey, 1985)
Carey demonstrated that young children 'discover' the unity of plants and animals as living things with many shared properties only around the age of 10.
She suggested that the coalescence of the concept of living thing depends on learning about diverse aspects of plants and animals, including:
• The nature of life-sustaining processes
• What it means to be dead vs. alive
• Reproductive properties
Can reorganization occur in a connectionist net?
Conceptual Reorganization in the Model
Suppose superficial appearance information, which is not coherent with much else, is always available, and that there is a pattern of coherent covariation across information that is contingently available in different contexts.
The model forms initial representations based on superficial appearances.
Later, it discovers the shared structure that cuts across the different contexts, reorganizing its representations.
Organization of Conceptual Knowledge Early and Late in Development
Proposed Architecture for the Organization of Semantic Memory
[Figure: modality-specific regions for name, action, motion, color, valence, and form, interconnected through the temporal pole; the medial temporal lobe is also labeled]
Metaphor in Connectionist Models of Semantics • By metaphor I mean: the application of a relation learned in one domain to a novel situation in another
Hinton's Family Tree Network
[Figure: network taking Person 1 and Relation as input and producing Person 2 as output]
Understanding Via Metaphor in the Family Trees Network Marco’s father is Pierro. Who is James’s father? Christopher’s daughter is Victoria. Who is Roberto’s daughter?
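Why can the network answer the second kind of question by analogy? Because the two family trees are homologous, items in analogous positions come to share role-like internal representations, so a relation learned in one tree transfers to the other. The sketch below replaces the learned distributed representations with explicit role labels; the role assignments, and the pairing of James with Andrew, are invented for illustration rather than taken from Hinton's data.

```python
# Sketch of metaphor as cross-domain transfer (hypothetical roles): items
# in analogous tree positions share a role code, so a relation learned in
# one family answers the analogous query in the other.
role = {"marco": "child", "pierro": "father-of-child",    # Italian tree
        "james": "child", "andrew": "father-of-child"}    # English tree

facts = {("marco", "father"): "pierro"}  # learned in the Italian domain

def answer(person, relation, domain_people):
    # Find a learned fact whose query shares the person's role, then map
    # the answer's role back into the query's own domain.
    for (p, r), v in facts.items():
        if r == relation and role[p] == role[person]:
            for q in domain_people:
                if role[q] == role[v]:
                    return q
    return None

print(answer("james", "father", ["james", "andrew"]))
```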
Emergence of Meaning and Metaphor
• Learned distributed representations that capture important aspects of meaning emerge through a gradual learning process in simple connectionist networks.
• Metaphor arises naturally as a byproduct of learning information in homologous domains in models of this type.