Reductive and Representational Explanation in Synthetic Neuroethology Pete Mandik Assistant Professor of Philosophy Coordinator, Cognitive Science Laboratory William Paterson University, New Jersey
Collaborators • Michael Collins, City University of New York Graduate Center • Alex Vereschagin, William Paterson University
My Thesis • Even for the simplest cases of intelligent behavior, the best explanations are both reductive and representational
Overview • Mental representation in folk-psychological explanation • Mental representation in non-humans • The problem of chemotaxis • Modeling the neural control of chemotaxis • What the representations are
Mental reps in folk-psych • George is opening the fridge because: • George desires that he drinks some beer • George sees that the fridge is in front of him • George remembers that he put some beer in the fridge • George’s psychological states cause his behavior • George’s psychological states have representational content
Mental reps in non-human animals • Rats and maze learning • After finding the platform the first time, rats remember its location and can swim straight to it on subsequent trials from novel starting positions. • Rats not only represent the location but also compute the shortest path to it.
Mental reps in non-human animals • Ducks’ representation of rate of return • Every day two naturalists go out to a pond where some ducks are overwintering and station themselves about 30 yards apart. Each carries a sack of bread chunks. Each day a randomly chosen one of the naturalists throws a chunk every 5 seconds; the other throws every 10 seconds. After a few days experience with this drill, the ducks divide themselves in proportion to the throwing rates; within 1 minute after the onset of throwing, there are twice as many ducks in front of the naturalist that throws at twice the rate of the other. One day, however, the slower thrower throws chunks twice as big. At first the ducks distribute themselves two to one in favor of the faster thrower, but within 5 minutes they are divided fifty-fifty between the two “foraging patches.” … Ducks and other foraging animals can represent rates of return, the number of items per unit time multiplied by the average size of an item. • (Gallistel 1990; emphasis mine)
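To make the arithmetic behind the quoted example explicit, here is a worked version of the numbers in the quote, with $s$ standing for the size of a standard chunk:

```latex
\[
\text{rate of return} \;=\; \frac{\text{items}}{\text{unit time}} \times \text{average item size}
\]
\[
R_{\text{fast}} = \frac{1\ \text{chunk}}{5\ \text{sec}} \times s = 0.2\,s \text{ per sec},
\qquad
R_{\text{slow}} = \frac{1\ \text{chunk}}{10\ \text{sec}} \times 2s = 0.2\,s \text{ per sec}
\]
```

Once the slower thrower doubles the chunk size, the two rates of return are equal, so a fifty-fifty split of the flock matches the returns, which is just where the ducks end up.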
Positive Chemotaxis • Movement toward the source of a chemical stimulus
2-D food finding • 2-Sensor Chemophile: • Steering muscles orient creature toward stimulus • Perception of stimulus being to the right fully determined by differential sensor activity • [Diagram: Sensors → Brain → Steering Muscles]
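As an illustration of the diagram, here is a minimal sketch (not the model from the talk; the gradient function, gain, and step sizes are invented for illustration) of how differential activity across two sensors can, by itself, steer a creature up a chemical gradient:

```python
import math

def concentration(x, y):
    """Toy chemical gradient: strongest at the origin, falling off with distance."""
    return 1.0 / (1.0 + math.hypot(x, y))

def step(x, y, heading, sensor_offset=0.1, gain=4.0, speed=0.05):
    """One update of a two-sensor chemophile: turn toward the stronger reading."""
    # Left and right sensor positions relative to the heading.
    left = (x + sensor_offset * math.cos(heading + math.pi / 2),
            y + sensor_offset * math.sin(heading + math.pi / 2))
    right = (x + sensor_offset * math.cos(heading - math.pi / 2),
             y + sensor_offset * math.sin(heading - math.pi / 2))
    # Differential sensor activity fully determines which side the stimulus is on.
    heading += gain * (concentration(*left) - concentration(*right))
    return x + speed * math.cos(heading), y + speed * math.sin(heading), heading

x, y, heading = 1.0, -1.0, 0.0
start = concentration(x, y)
for _ in range(400):
    x, y, heading = step(x, y, heading)
print(start, concentration(x, y))  # the final reading should be well above the initial one
```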
1-D food finding • 1-Sensor “Lost” Creature • left/right stimulus location underdetermined by sensor activity • only proximity perceived • Adding memory can help • [Diagram: Sensor → Brain → Steering Muscles]
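A companion sketch (again invented for illustration, not the evolved networks discussed later) shows how a single-sensor creature can still find the source once it remembers its previous reading: if the remembered value exceeds the current one, things are getting worse, so it picks a new direction.

```python
import math, random

def concentration(x, y):
    """Toy gradient with the source at the origin."""
    return 1.0 / (1.0 + math.hypot(x, y))

def run(steps=2000, speed=0.02, seed=1):
    """Single-sensor agent: memory of the previous reading stands in for a second sensor."""
    random.seed(seed)
    x, y, heading = 1.5, 0.0, 0.0
    previous = concentration(x, y)       # memory state
    for _ in range(steps):
        x += speed * math.cos(heading)
        y += speed * math.sin(heading)
        current = concentration(x, y)    # only proximity is sensed
        # Left vs. right is underdetermined by one sensor, but comparing the current
        # reading with the remembered one reveals whether the creature is climbing
        # or descending the gradient; turn randomly whenever it is descending.
        if current < previous:
            heading += random.uniform(-math.pi, math.pi)
        previous = current
    return concentration(x, y)

print(run())  # should end well above the starting reading of 0.4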
Things to note: • Single-sensor gradient navigation is a “representation hungry” problem • The folk-psychological explanation of how a human would solve the problem • In what follows, the resemblance to the explanation of the worm’s solution
C. elegans • Caenorhabditis elegans
C. elegans • Ferrée and Lockery (1999). “Computational Rules for Chemotaxis in the Nematode C. elegans.” Journal of Computational Neuroscience 6, 263–277
Zeroth Order • The simulations were run keeping only the terms up to zeroth order (sketched below) • This rule failed to produce chemotaxis for any initial position.
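The rule itself is given in Ferrée and Lockery (1999); as a rough sketch of its general shape (my notation, assuming the standard form in which the turning rate is expanded as a series in the sensed concentration), the zeroth-order truncation amounts to a constant turning rate that takes no account of the chemical signal at all:

```latex
\[
\frac{d\theta}{dt} \;\approx\; \kappa_0
\]
```

A constant turning bias cannot depend on where the chemical is, which fits the observation that this rule fails from every starting position.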
First Order • Next, the simulations were run keeping all terms up to first order (sketched below) • This rule accurately reproduced the successful chemotaxis performed by the network model.
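On the same hedged sketch, keeping the first-order term adds a linear dependence on the recent history of the sensed concentration $c$:

```latex
\[
\frac{d\theta}{dt} \;\approx\; \kappa_0 \;+\; \int_{0}^{T} \kappa_1(\tau)\, c(t-\tau)\, d\tau
\]
```

Because the kernel $\kappa_1$ weights recent concentrations, the rule is in effect sensitive to how the concentration changes along the worm’s path, which is enough to steer up the gradient; the exact kernels and notation are in the cited paper.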
Problems • Remaining open questions: • How the network controllers are working • What the networks themselves are representing and computing • Whether the networks are utilizing memory
Framsticks • 3-D Artificial Life simulator • By Maciej Komosinski and Szymon Ulatowski • Poznan University of Technology, Poland • http://www.frams.poznan.pl/
Memory in Chemotaxis • Experimental setup • 3 orientation networks: feed-forward, recurrent, and blind • Five runs each, for 240 million steps • Mutations allowed only for neural weights • Fitness defined as lifetime distance • Initial weights: evolved CPGs (central pattern generators) with un-evolved (zero-weight) orienting networks
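As a schematic of the protocol just listed (plain Python rather than Framsticks script, with a placeholder agent and made-up parameters), the essential structure is: start orienting-network weights at zero, mutate only those weights, and score each variant by the distance covered over its lifetime.

```python
import random

def mutate(weights, sigma=0.1):
    """Mutation touches only the neural weights, never the body or network topology."""
    return [w + random.gauss(0.0, sigma) for w in weights]

def lifetime_distance(weights, lifetime=500):
    """Placeholder fitness: distance covered by a trivial stand-in for the creature.
    In the experiments themselves this is the creature's lifetime distance in Framsticks."""
    x = 0.0
    for t in range(lifetime):
        out = weights[0] + weights[1] * ((t % 10) / 10.0)   # dummy controller output
        x += max(0.0, min(out, 0.05))                        # bounded step length
    return x

def evolve(n_weights=2, generations=50, pop_size=20):
    """Start from zero weights (un-evolved orienting network) and improve by mutation."""
    population = [[0.0] * n_weights for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=lifetime_distance, reverse=True)
        parents = ranked[: pop_size // 2]
        population = parents + [mutate(p) for p in parents]
    return max(population, key=lifetime_distance)

print(evolve())
```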
What the representations are • States of neural activation isomorphic to and causally correlated with environmental states • Sensory states • Memory states • Motor-command states
Representation and Isomorphism • Isomorphism • A one-to-one mapping between structures that preserves their relations • Structure = a set of elements plus a set of relations on those elements
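One standard way to make this precise (my notation, not the talk’s):

```latex
Let $\mathcal{A} = (A, R_A)$ and $\mathcal{B} = (B, R_B)$ be structures. A map
$f : A \to B$ is an isomorphism iff $f$ is a bijection and, for all $a_1, a_2 \in A$,
\[
R_A(a_1, a_2) \;\Longleftrightarrow\; R_B\bigl(f(a_1), f(a_2)\bigr).
\]
```

On the account in these slides, a structure of neural activation states can stand in this relation to a structure of environmental states; the later slides add the causal constraints that narrow down which such structure is represented.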
Representation and Isomorphism • Representation • Primarily: a relation between isomorphic structures • Secondarily: a relation between elements and/or relations in one structure and those in another
Isomorphisms between multiple structures • Of the many structures a given structure is isomorphic to, which one does it represent? • The range of choices is narrowed by the causal networks in which the structure is embedded
For further investigation • States of desire/motivation • Clearer in models of action selection, not intrinsic to the stimulus-orientation networks • Modeling representational error and falsity • Error and falsity are distinct, but this is clearer in non-assertoric attitudes
Summing up • Single-sensor chemotaxis is a “representation hungry” problem • Even explanations of adaptive behaviors as simple as chemotaxis benefit from psychological state ascriptions
Summing up • The psychological states in question are identical to neural states • The neural states in question are causally explanatory of intelligent behavior in virtue of isomorphisms between structures of neural activations and structures of environmental features
Summing up • Therefore… • Even for the simplest cases of intelligent behavior, the best explanations are both reductive and representational