A Modified Backpropagation Learning Algorithm With Added Emotional Coefficients Qun Dai 2009-04-10
Outline • Introduction • Related Work • Motivation • The Key Idea • The Authors' Work in This Paper • Emotional Back Propagation Learning Algorithm • EmBP Application to Face Recognition • Experimental Results and Comparison • Conclusion
Introduction • Joy, sadness, anger, fear, disgust, and surprise • “machines with emotions” • A distinctive and challenging fact about human beings is the potential for both opposition and entanglement between will, emotion, and reason. • Just as emotions are critical to human behavior, they are equally critical for intelligent machines.
Related work • Perlovsky states that aesthetic emotions are inseparable from every act of perception and cognition… • Emotions play an important role in the human decision-making process… • Another researcher, Fellous, states: “For a long time, the ‘cognitive’ has been opposed to the ‘affective’.”
Related work • Picard states “In affective computing, we can separately examine functions that are not so easily separated in humans…” • “In principle, we might be able to program a computer to do what a human (or a human brain) does, …” • Cooke suggests that we should make machines think the way that machines should think and not the way we humans think:…
Related work • A number of models addressing emotion have been developed in cognitive science and AI. • Examples of integrated architectures focusing on emotion include the works of Davis and Lewis… • Moffat et al. [26] who examined an emotion model called ACRES, …
Related work • Gratch and Marsella, who developed EMA (emotion and adaptation), a general computational model of human emotion… • Bates proposed a believable agent using a model described by Ortony et al. … • Ushida et al. proposed an emotion model for life-like agents with emotions and motivations.
Related work • El-Nasr et al. proposed a fuzzy logic adaptive model of emotions (FLAME) as a system for generation of emotions in agents. • Gratch described one research effort that begins to address how behavior moderators such as stress and emotion can influence military command and control decision making, …
Related work • Kort et al. suggested a model by which they aim to conceptualize the impact of emotions upon learning… • Poel et al. introduced a modular hybrid neural network architecture, called SHAME, for emotion learning. • Doya presented a computational theory on the roles of the ascending neuro-modulatory systems…
Related work • More recent research works proposed the addition of emotions to intelligent agents, thus making them more human-like. • Clocksin explored the issues in memory and affect in connection with possible architectures for artificial cognition. • Martinez-Miranda and Aldea presented a review of some research work done on the inclusion of emotions into intelligent systems…
Related work • Recently, Abu Maria and Abu Zitar proposed and implemented a regular and an emotional agent architecture,… • Gobbini and Haxby proposed a model for distributed neural systems that participate in the recognition of familiar faces,…
Common Question • Perhaps the common question among all these works and also the work that is presented within this paper is as follows: Should we, and can we, develop machines with feelings? • The answer to the first part is yes, we should have emotional and intelligent machines. • The answer to the second part is yes and no.
Motivation • The motivation for the work in this paper comes from the authors' belief that certain human emotions can be artificially modeled in machines. • The authors choose to model specific emotional parameters with the objective of improving the learning capability of a “nonhuman” system.
Motivation • The authors’ aim in this paper is to model simulated emotions within a simple supervised neural network structure, and to investigate the effect of the added emotional factors on the learning and decision making capabilities of the neural network.
The key idea • They only consider two emotions (anxiety and confidence), which they believe have an effect on learning and decision making in humans.
The authors’ idea • When we learn a new task, the anxiety level is high at the beginning and the confidence level is low. After time, practice and getting positive feedback, the anxiety level decreases while the confidence level increases. Once learning is achieved, we tend to be less anxious and more confident performing a task that we have already experienced. Therefore, anxiety and confidence are two dependent dimensions, where confidence is defined as the negative rate of change of anxiety.
The authors’ work in this paper • In this paper, they propose a new emotional backpropagation (EmBP) learning algorithm based on incorporating two essential emotions (anxiety and confidence) during the processes of learning and decision making. • The distinctive, emotion-related elements of EmBP are twofold. First, there is pattern averaging, which attempts to mimic the human tendency for emotional judgments and preferences to be based on general impressions rather than on the precise details of the perceived objects (see the sketch below). Second, there are the confidence and anxiety variables, which are influenced by the perceived objects.
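As a rough illustration of the pattern-averaging step only (not the authors' exact implementation; the image size, the [0, 1] normalization, and the name pattern_average are assumptions of this sketch), the perceived pattern can be collapsed into a single "general impression" value that later feeds the anxiety coefficient:

```python
import numpy as np

def pattern_average(image: np.ndarray) -> float:
    """Collapse a perceived pattern into one 'general impression' value.

    Assumes `image` is a 2-D grayscale array scaled to [0, 1]; its mean
    pixel value stands in for the overall impression of the pattern and
    is later used by the anxiety coefficient.
    """
    return float(np.mean(image))

# Example with a hypothetical ORL-sized (112 x 92 pixel) face image.
face = np.random.rand(112, 92)
y_avg_pat = pattern_average(face)
```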
The authors’ work in this paper • This paper suggests that these two emotional responses can be successfully simulated and used during the training of a supervised neural network with the purpose of providing more effective learning. • The two emotional variables are dependent on each other, where confidence is measured as the difference between anxiety levels at two different iterations.
The authors’ work in this paper • The novel “emotional” neural network, which is based on EmBP, will be applied to a facial recognition problem using 400 face images from the Olivetti Research Laboratory (ORL) face database, and a comparison between this network and a conventional BP-based neural network will be provided (a sketch of one possible data split follows below).
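For context only, here is a minimal sketch of the kind of train/test split such a comparison implies. The ORL database holds 40 subjects with 10 images each; the 5/5 split per subject, the seed, and the function name split_orl are assumptions of this sketch, since the slides do not state the authors' exact protocol:

```python
import random

SUBJECTS, IMAGES_PER_SUBJECT = 40, 10  # ORL: 40 subjects x 10 images = 400

def split_orl(seed: int = 0):
    """Split ORL image indices into training and test sets (assumed 5/5 per subject)."""
    rng = random.Random(seed)
    train, test = [], []
    for subject in range(1, SUBJECTS + 1):
        images = list(range(1, IMAGES_PER_SUBJECT + 1))
        rng.shuffle(images)
        train += [(subject, i) for i in images[:5]]
        test += [(subject, i) for i in images[5:]]
    return train, test

train_set, test_set = split_orl()
assert len(train_set) == len(test_set) == 200
```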
EMOTIONAL BACK PROPAGATION LEARNING ALGORITHM • Input-Layer Neurons: Y_j = X_j, where X_j and Y_j are, respectively, the input and output values of neuron j in the input layer (input-layer neurons simply pass their inputs through).
Fig. 2. Input/output configuration of a hidden-layer neuron.
Fig. 3. Input/output configuration of an output-layer neuron.
The Emotional Backpropagation Parameters • The proposed emotional parameters are the anxiety coefficient (μ) and the confidence coefficient (k).
Assumption 1: The anxiety level is dependent on the input patterns, where new patterns cause higher anxiety. During the first iteration (new task learning), the initial anxiety coefficient value is set to “1.”
Assumption 2: The anxiety level is dependent on the difference (error) between the actual output of the neural network and the desired (target) output. This is a kind of feedback that the emotional neural network uses to measure how successful its learning is. Anxiety decreases with the minimization of the error.
Assumption 3: The confidence level increases with the decrease in anxiety level. During the first iteration (new task learning), the initial confidence coefficient value is set to “0.”
Based on the above assumptions, the anxiety coefficient is defined as a function of the presented pattern and of the error between the actual and desired outputs (one possible formulation is sketched below).
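A reconstruction consistent with Assumptions 1–3 and with the earlier statement that confidence is measured as the difference between anxiety levels at two iterations; this exact form is an assumption, not the authors' quoted equation:

\[
\mu_i \;=\; \overline{Y}^{\mathrm{PAT}} + E_i, \qquad k_i \;=\; \mu_{i-1} - \mu_i,
\]

with initial values \(\mu = 1\) and \(k = 0\) per Assumptions 1 and 3, where \(\overline{Y}^{\mathrm{PAT}}\) is the average value of the presented pattern (from the pattern-averaging step), \(E_i\) is the output error at iteration \(i\), \(\mu_i\) is the anxiety coefficient, and \(k_i\) is the confidence coefficient.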
For output-layer neurons, a quantity called the error signal, δ_k, is defined as the product of the output error (T_k − Y_k) and the derivative of the neuron's activation function, as in standard backpropagation; it drives the weight updates sketched below.
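Putting the pieces together, below is a minimal, speculative sketch of how the anxiety and confidence coefficients could be folded into a backpropagation weight update for a single sigmoid layer. Treating μ as an extra learning-rate-like factor and k as an extra momentum-like factor, as well as every name in the code, are assumptions made for illustration; they are not the authors' exact equations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def embp_step(W, x, target, prev_dW, prev_emo_dW, prev_anxiety,
              lr=0.1, momentum=0.6):
    """One EmBP-style update for a single sigmoid layer (illustrative sketch).

    Follows the slides' assumptions: anxiety depends on the perceived
    pattern and on the output error; confidence is the drop in anxiety
    between two iterations. How the coefficients scale the weight change
    is an assumption of this sketch, not the paper's exact rule.
    """
    y = sigmoid(W @ x)                      # forward pass
    error = target - y
    delta = error * y * (1.0 - y)           # standard BP error signal

    y_avg_pat = float(np.mean(x))           # pattern-averaging "impression"
    anxiety = y_avg_pat + float(np.mean(np.abs(error)))
    confidence = prev_anxiety - anxiety     # per the slides' definition

    grad = np.outer(delta, x)
    dW = lr * grad + momentum * prev_dW     # conventional BP terms
    emo_dW = anxiety * grad + confidence * prev_emo_dW  # assumed emotional terms
    W = W + dW + emo_dW
    return W, dW, emo_dW, anxiety

# A training loop would carry dW, emo_dW, and anxiety across iterations,
# initializing anxiety to 1.0 and the weight-change buffers to zeros
# (Assumptions 1 and 3).
```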
CONCLUSION • The EmBP-based neural network has been shown to outperform the BP-based neural network, achieving a higher correct face recognition rate and a significantly shorter run (execution) time for the trained network.
FUTURE WORK • Further work will focus on exploring different application areas where the proposed “emotional” neural network could be efficiently used. • Additionally, research work will continue exploring other human emotions and investigating the possibilities of simulating them artificially.
REFERENCES [1] D. S. Levine, “Neural network modeling of emotion,” Phys. Life Rev., vol. 4, pp. 37–63, 2007. [2] L. I. Perlovsky, “Toward physics of the mind: Concepts, emotions, consciousness, and symbols,” Phys. Life Rev., vol. 3, no. 1, pp. 23–55, Jan. 2006. [3] L. I. Perlovsky, “Integrated emotions, cognition, and language,” in Proc. Int. Joint Conf. Neural Netw., 2006, pp. 1570–1575. [4] D. I. Lewin, “Why is that computer laughing?,” IEEE Intell. Syst., vol. 16, no. 5, pp. 79–81, Sep./Oct. 2001. [5] R. W. Picard, Affective Computing. Cambridge, MA: MIT Press, 1997. [6] D. A. Norman, Emotional Design: Why We Love (or Hate) Everyday Things. New York: Basic Books, 2004. [7] J. Yoon, S. Park, and J. Kim, “Emotional robotics based on iT_Media,” in Proc. 30th Annu. Conf. IEEE Ind. Electron. Soc., Busan, Korea, 2004, pp. 3148–3153. [8] K. Boehner, R. DePaula, P. Dourish, and P. Sengers, “How emotion is made and measured,” Int. J. Human-Comput. Studies, vol. 65, pp. 275–291, 2007.