CAUSATION, ORGANISATION & EMERGENCE
Fabio Boschetti and David Batten, CSIRO, Australia
Warnings
• Summary of lucubrations over many years
• Work in progress
• Clear conclusions need developing
• Speaker’s background:
  • Numerical optimisation
  • Modelling (physical, ecological, social)
  • Relation between computation and Complex System Science
• Can we do CSS on a computer at all? What CSS? What are the minimum ingredients I need to generate both causation and emergence?
[Diagram: a process modelled]
Ultimate question: what are the minimum ingredients I need to generate both causation and emergence?
Ultimate test: “You really understand an algorithm when you've programmed it” (Chaitin, 1997)
Understanding → Prediction
Outline: personal choices and their consequences
• not all behaviours are ‘causal’
  • it is useful to discriminate between entailment and causation
  • it is useful to identify causation with intervention
• there is a strong relation between causation and emergence
  • not all emergent processes are causal
  • all causal processes are emergent
• it is very hard to make sense of this picture only in terms of behaviours
  • it is easier in terms of interactions, relations, or organisation
• some relations act by constraining elements’ behaviour → symmetry breaking (maybe these can be modelled)
• some relations act by generating novelty (these require external intervention = an open system)
• not all behaviours are ‘causal’
• it is useful to discriminate between entailment and causation
• it is useful to identify causation with intervention
Entailment: logical necessity or physical inevitability (P ⊨ Q, or P → Q, or “if P then Q”)
Intervention: an action external to the system that produces an effect by altering the course of a process
Intervention: an action external to the system that produces an effect by altering the course of a process

Causation as intervention: by imposing a chosen perturbation on event a and observing the consequence on event b, we may be able to unravel the underlying causal relation between a and b (Pearl)

Causation as control: “Useful causation requires control. Clearly it is valuable to know that malaria results from mosquitoes. … While it is true that mosquitoes follow the laws of physics, we do not usually say that malaria is caused by the laws of physics (the universal cause). That is because we can hope to control mosquitoes, but not the laws of physics.” (Pattee, 1997)

Causation as agency: “an event A is a cause of a distinct event B just in case bringing about the occurrence of A would be an effective means by which a free agent could bring about the occurrence of B.” (Menzies and Price, 1993)

Causation as asymmetry: asymmetry in correlation, asymmetry in agency/control, Principle of Independence (Hausman, 1998)

Neither intervention nor agency implies a human intervener; both characterise causation as a relation.
Causation as asymmetry (Hausman, 1998):
• Multiple effects of common causes need to be correlated; multiple causes of a common effect do not
• We can intervene on the cause to alter the effect; we cannot intervene on the effect to alter the cause
• Independence principle: every effect must have at least two independent causes
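To make the interventional asymmetry concrete, here is a minimal sketch (not from the slides) using a toy structural model in which c causes e: forcing the cause shifts the effect, while forcing the effect leaves the cause untouched. The variable names and coefficients are illustrative assumptions.

```python
# Hedged illustration (assumed toy model, not the authors' example): c -> e.
import random

random.seed(0)

def sample(do_c=None, do_e=None):
    # structural equations: c is exogenous noise, e = 2*c + small noise,
    # unless an intervention (do_c / do_e) overrides the corresponding equation
    c = do_c if do_c is not None else random.gauss(0, 1)
    e = do_e if do_e is not None else 2.0 * c + random.gauss(0, 0.1)
    return c, e

n = 5000
mean_e_given_do_c = sum(sample(do_c=1.0)[1] for _ in range(n)) / n  # ~2.0: the effect tracks the forced cause
mean_c_given_do_e = sum(sample(do_e=1.0)[0] for _ in range(n)) / n  # ~0.0: the cause ignores the forced effect
print(round(mean_e_given_do_c, 2), round(mean_c_given_do_e, 2))
```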
Emergence
• there is a strong relation between causation and emergence
  • not all emergent processes are causal
  • all causal processes are emergent
• Pattern formation → prediction
• Intrinsic emergence → information processing for trade agents
• Emergence of causal power → we can intervene on the stock market and affect the economy
Causal emergence: “the arising of a system property on which intervention can be exerted without manipulating the system components” (Boschetti and Gray, 2007)
[Diagram: cellular automata → pattern formation; humans → causal power]
• not all emergent processes are causal
• all causal processes are emergent
[Diagram: a computer model (Input + Rules → Event a → Event b → Event c → … → Event z → Output) set against an experimental-economics run in which each event is followed by human decision making and human action, with the output observed. The model’s rules encode the behaviour of an economically rational agent; in the human run an intervention (e.g. a reminder of moral values) can alter the course of events.]
[Diagram: the model alone. Input + Rules → Event a → Event b → Event c → … → Event z → Output]
• Event c is not caused by event b: once b has happened, c follows as a logical necessity
• Whatever changes c also changes b (they are correlated)
• What causes c? What do I need to do to actuate a change in c?
Two alternatives:
• If I cannot interact with the run, I have to change the input or the code
  • Control lies only in the input and the code
  • Logical entailment (Rosen)
• If I can interact with the run, I need to preconceive all possible interventions, since they need to be written in the code → impossible (a toy sketch of both cases follows below)
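As a hedged sketch of the distinction drawn above (not the authors’ code), the toy run below makes every event follow from the previous one by a fixed rule: the only handles are the input and the rule itself, and any ‘intervention’ must already be written into the code as a pre-declared hook. The rule, step count, and hook names are illustrative assumptions.

```python
# Hedged sketch: a closed run (pure entailment) vs. a run with pre-coded intervention hooks.

def rule(event):
    # assumed update rule: each event entails the next
    return (3 * event + 1) % 17

def closed_run(initial_input, steps):
    events = [initial_input]
    for _ in range(steps):
        events.append(rule(events[-1]))
    return events                      # control lies only in initial_input and rule

def run_with_hooks(initial_input, steps, interventions=None):
    # every admissible intervention must be preconceived here, as an entry in 'interventions'
    interventions = interventions or {}
    events = [initial_input]
    for step in range(1, steps + 1):
        nxt = rule(events[-1])
        if step in interventions:
            nxt = interventions[step](nxt)   # externally chosen perturbation of event 'step'
        events.append(nxt)
    return events

print(closed_run(2, 6))                            # the entailed chain a, b, c, ...
print(run_with_hooks(2, 6, {3: lambda e: e + 5}))  # same chain, perturbed at event 3
```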
[Diagram: the model. Input + Rules → Event a → Event b → Event c → … → Event z → Output]
Causation as intervention: by imposing a chosen perturbation on event b and observing the consequence on event c, we may be able to unravel the underlying causal relation between b and c (Pearl)
Causation as agency: “an event b is a cause of a distinct event c just in case bringing about the occurrence of b would be an effective means by which a free agent could bring about the occurrence of c.” (Menzies and Price, 1993)
Causation as asymmetry: asymmetry in correlation, asymmetry in agency/control, Principle of Independence (Hausman, 1998)
[Diagram: in the model, events follow one another by logical entailment; in the experimental-economics run, human decision making and human action at each event provide effective control / causation, with the output observed.]
[Diagram: logical entailment along the model’s event chain; effective control / causation through human decision making and action at each event, with the output observed.]
“Useful causation requires control. Clearly it is valuable to know that malaria results from mosquitoes. … While it is true that mosquitoes follow the laws of physics, we do not usually say that malaria is caused by the laws of physics (the universal cause). That is because we can hope to control mosquitoes, but not the laws of physics.” (Pattee, 1997)
[Diagram: with the human actions removed, only the model’s event chain (Input + Rules → Event a → … → Event z → Output) remains: logical entailment, with no remaining locus for effective control / causation.]
[Diagram: modelling converts effective control into logical necessity. Human decision making and human action at each event are projected into the ‘rule subspace’, converting causal emergence into pattern formation.]
[Diagram: example. In neoclassical economic theory the rational economic agent is projected into the input and rules; the event chain then entails the Nash ‘optimal’ equilibrium / the Invisible Hand as output.]
[Diagram: example. Distributed sensors and a feature-detection algorithm are projected into the input and rules; the output is new discoveries / new scientific laws.]
[Diagram: example. In AI, rules are projected into the model, with intelligence as the output.]
Interactive identity machines: P = in(message).out(message).P
(Wegner, P., Why Interaction Is More Powerful Than Algorithms, Communications of the ACM 40(5), 80-91, 1997)

Machine 1: Words = {00, 01, 10, 11}; Transition = {00→01, 01→10, 10→11, 11→00}
  Output stream: 00011011000110110001101100011011…  (statistical complexity = C1)
Machine 2: Words = {00, 01, 10, 11}; Transition = {00→10, 01→11, 10→01, 11→00}
  Output stream: 01101100011011000110110001101100…  (statistical complexity = C2)
Unit of interaction: the exchanged word (message)
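The two machines above are simple enough to run directly. The sketch below is an assumed reconstruction (the start words are chosen for illustration): in isolation each machine cycles through its word set and produces a periodic, stationary stream, so its statistical complexity is fixed.

```python
# Hedged reconstruction of the two transition machines; start words are assumptions.

MACHINE_1 = {"00": "01", "01": "10", "10": "11", "11": "00"}
MACHINE_2 = {"00": "10", "01": "11", "10": "01", "11": "00"}

def run(machine, start, steps):
    word, stream = start, []
    for _ in range(steps):
        stream.append(word)
        word = machine[word]
    return "".join(stream)

print(run(MACHINE_1, "00", 8))   # 0001101100011011 : Machine 1's periodic stream
print(run(MACHINE_2, "00", 8))   # Machine 2's own periodic stream from an assumed start word
```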
When the two machines interact, each stream is perturbed by words received from the other:
Machine 1: …00011011000110110001101100011011 10 11…  (statistical complexity C1 → C1′)
Machine 2: …0110110001101100011011000110 11 00…  (statistical complexity C2 → C2′)
The change in statistical complexity marks non-stationarity.
Machine 1: Words = {00, 01, 10, 11}; Transition = {00→01, 01→10, 10→11, 11→00}
  Output stream: 00011011000110110001101100011011…
Machine 2: Words = {22, 23, 32, 33}; Transition = {22→23, 23→32, 32→33, 33→22}
  Output stream: 22233233222332332223323322233233…
Interactive identity machines: P = in(message).out(message).P (Wegner, 1997)
Machine 1: Words = {00, 01, 10, 11}; Transition = {00→01, 01→10, 10→11, 11→00}
Machine 2: Words = {22, 23, 32, 33}; Transition = {22→23, 23→32, 32→33, 33→22}
When the machines exchange words, each receives a word outside its own word set and halts:
Machine 1: 00011011000110110001101100011011 22 → halt
Machine 2: 22233233222332332223323322233233 00 → halt
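A minimal sketch of the halting behaviour depicted above, with an assumed coupling in which each machine simply hands its current word to the other: a word outside the receiver’s word set has no transition, so the receiver halts.

```python
# Hedged sketch: machines over disjoint alphabets halt on each other's words.

MACHINE_1 = {"00": "01", "01": "10", "10": "11", "11": "00"}
MACHINE_2 = {"22": "23", "23": "32", "32": "33", "33": "22"}

def step_or_halt(machine, word):
    # a word outside the machine's word set is not well formed for it: halt (None)
    return machine.get(word)

print(step_or_halt(MACHINE_2, "22"))   # '23' : a well-formed word, the machine continues
print(step_or_halt(MACHINE_2, "00"))   # None : received Machine 1's word, halts
print(step_or_halt(MACHINE_1, "22"))   # None : received Machine 2's word, halts
```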
Machine 1: Words = {00, 01, 10, 11}; Transition = {access memory 3 steps back and copy two consecutive symbols}
Machine 2: Words = {22, 23, 32, 33}; Transition = {22→23, 23→32, 32→33, 33→22}
Machine 1: 0100100101000110110001101100011 33 → emits ‘13’…
Machine 2: 22233233222332332223323322233233 01 → halt
What happened?
• ‘13’ is not a possible word for either Machine 1 or Machine 2
• It is not a wff (well-formed formula) for either system
• It is genuinely novel
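How ‘13’ can arise is reconstructed below as a hedged sketch; the exact splice point and indexing are assumptions. Once a word received from Machine 2 sits on Machine 1’s tape, the “copy two consecutive symbols from three steps back” rule can straddle the boundary between Machine 1’s own symbols and the received ones, emitting a pair that is a word of neither machine.

```python
# Hedged reconstruction: the copy-from-memory rule splices across the boundary
# between Machine 1's history (0/1 symbols) and a word received from Machine 2 (2/3 symbols).

def copy_rule(tape, back=3):
    # go 'back' symbols before the end of the tape and copy two consecutive symbols
    start = len(tape) - back
    return tape[start:start + 2]

tape = "0100100101000110110001101100011"   # Machine 1's own symbol history
tape += "33"                                # word just received from Machine 2

print(copy_rule(tape))                      # '13' : well formed for neither machine, genuinely novel
```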
Ingredients
• Some behaviour
• Some basic interaction
• Some ability to handle novel input
Types of behaviours
• Entailments
• Relations
• Generation of a higher-level unit
• Causation
Outline revisited: personal choices and their consequences
• not all behaviours are ‘causal’
  • it is useful to discriminate between entailment and causation
  • it is useful to identify causation with intervention
• there is a strong relation between causation and emergence
  • not all emergent processes are causal
  • all causal processes are emergent
• it is very hard to make sense of this picture only in terms of behaviours
  • it is easier in terms of interactions, relations, or organisation
• some relations act by constraining elements’ behaviour → symmetry breaking (maybe these can be modelled)
• some relations act by generating novelty (these require external intervention = an open system)
What are the minimum ingredients I need to generate both causation and emergence?
Summary
• Entities need to ‘do’ something: have properties or behaviours
• Entities need to interact, in order for anything ‘new’ to happen
• Interactions may happen as entailments, which creates a ‘new’ closed system/unit
• Some interactions may be causal; these are characterised by a special kind of relation and require certain asymmetries to occur
• At a different scale/scope, the relation allowing intervention may not be detected and the system may appear as an entailment
• The behaviour should not be fully determined, in order to generate ‘real’ novelty
• The behaviour should not be determined only in terms of structures already in the system; there should be some space to process structures not seen before
• Normally, in our models, we do not account for interaction and we fully specify behaviours and properties
Summary
• Limitations of formal systems: closed systems, no novelty, uncomputability, chaos
• Interaction: internal to the system / external to the system
• Causation as a relation between entities/processes: agency theory, Menzies and Price, Pattee, …
• Importance of organisation in generating new behaviours: self-organisation, Prigogine, Laughlin, …
• Causal asymmetries: Hausman (1998)
• Kinds of behaviour: statistically novel (non-causal or causal) and genuinely novel (non-causal or causal); generating genuinely novel behaviour requires the ability to handle novel situations
• Spectrum: closed systems | complex systems | open systems (far from equilibrium, energy and information flows, novelty)
Things to check
• Mathematical / formal tools to describe changes in context and structure (group theory and beyond)
Forward problem: ShadeLength = PoleHeight * F[SunAngle]   (ShadeLength ← PoleHeight * F[SunAngle])
Inverse problem: F[SunAngle] = ShadeLength / PoleHeight   (F[SunAngle] ← ShadeLength / PoleHeight)
Group = {A, Property, Property, …} → closed to interaction
(A small numeric sketch of the forward/inverse pair follows below.)
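The sketch below assumes one concrete form for F (the shadow factor of a vertical pole as a function of sun elevation); the functional form and the numbers are assumptions for illustration, not part of the slides.

```python
# Hedged numeric sketch of the forward and inverse problems; F's form is assumed.
import math

def F(sun_angle_deg):
    # assumed: horizontal shadow factor of a vertical pole at the given sun elevation
    return 1.0 / math.tan(math.radians(sun_angle_deg))

# Forward problem: pole height and sun angle entail the shadow length
pole_height, sun_angle = 2.0, 30.0
shade_length = pole_height * F(sun_angle)

# Inverse problem: recover F (and hence the sun angle) from the observed quantities
F_recovered = shade_length / pole_height
angle_recovered = math.degrees(math.atan(1.0 / F_recovered))

print(round(shade_length, 3), round(angle_recovered, 1))   # 3.464 30.0
```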
Things to check (continued)
• Mathematical / formal tools to describe changes in context and structure (group theory and beyond)
• Relation between hardware and software (computer science and biology)
• More on causal asymmetries and Hausman
• Intuitive perception of causality from shape and symmetries, in terms of the history of an entity
• In general, many of the things I do not know are surely well known in other fields…
Ultimate test: “You really understand an algorithm when you've programmed it” (Chaitin, 1997)
Ultimate question: what are the minimum ingredients needed to generate both causation and emergence?
References
• Hausman, D., 1998. Causal Asymmetries. Cambridge University Press, Cambridge.
• Laughlin, R., 2005. A Different Universe: Remaking Physics from the Bottom Down. Basic Books, New York.
• Menzies, P. and Price, H., 1993. Causation as a secondary quality. The British Journal for the Philosophy of Science 44: 187-203.
• Milner, R., 1993. Elements of interaction: Turing Award lecture. Communications of the ACM 36(1): 78-89.
• Pattee, H., 1997. Causation, control, and the evolution of complexity. In: P.B. Andersen, C. Emmeche, N.O. Finnemann and P.V. Christiansen (Editors), Downward Causation. University of Århus Press, Århus, pp. 322-348.
• Wegner, P., 1997. Why interaction is more powerful than algorithms. Communications of the ACM 40(5): 80-91.
• Wiedermann, J. and van Leeuwen, J., 2002. The emergent computational potential of evolving artificial living systems. AI Communications 15(4): 205-215.
References (continued)
• Boschetti, Causality, emergence, computation and unreasonable expectations. Synthese, in press.
• Prokopenko, Boschetti and Ryan, 2009. An information-theoretic primer on complexity, self-organisation and emergence. Complexity, DOI: 10.1002/cplx.20249.
• Batten, Salthe and Boschetti, 2008. Visions of evolution: self-organization proposes what natural selection disposes. Biological Theory 3(1): 17-29.
• Boschetti, McDonald and Gray, 2008. Complexity of a modelling exercise: a discussion of the role of computer simulation in Complex System Science. Complexity 13(6): 21-28.
• Boschetti and Gray, 2007. A Turing test for emergence. In: M. Prokopenko (Editor), Advances in Applied Self-organizing Systems. Springer-Verlag, London, pp. 349-364.
• Boschetti and Gray, 2007. Emergence and computability. Emergence: Complexity and Organization 9(1-2): 120-130.
For more information: Fabio.Boschetti@csiro.au
http://www.per.marine.csiro.au/staff/Fabio.Boschetti/