Affective Computing and Emotion Recognition Systems: The Future of Biometric Surveillance? Joseph Bullington, PhD Department of Information Systems Georgia Southern University jbullington@georgiasouthern.edu INFOSEC CD 2005
Welcome To The French Quarter: You Are Being Watched There are (were?) approximately 192 surveillance cameras in the French Quarter.
Times Square, New York City, September 2002
Times Square, New York City, May 2005
CCTV Images of the July 21 London Bombers
CCTV Image of July 7 London Bombers
9/11 Hijackers caught on airport surveillance camera
Arrested for child battery after being caught on surveillance camera (CNN, 2002)
Problem • Human observers are a critical feature of any surveillance system • The weakest link in any surveillance system is the human observer • Through maliciousness or lack of attention, critical events can be missed • Human observers will still be needed • How can more automated features be added to these systems?
A possible future for surveillance technology • Represents the confluence of several current technologies • Surveillance camera networks • Wireless networks • Storage technology • Ubiquitous and wearable computing and sensing devices • Biometrics (especially face recognition) • Emotion-recognition by computers (i.e., Affective Computing)
Affective Computing • Computing that relates to, arises from, or deliberately influences emotions (Picard, 1997) • Giving a computer the ability to recognize and express emotions (note: this is different from the question of whether computers can have ‘feelings’) • Developing the ability to respond intelligently to human emotion
Affective Computing • The recognition of emotion by computer is of primary importance for future surveillance applications
HAL
HAL analyzed voices to determine the emotional state of the crew …but he could also read lips!
Affective Computing Applications Call centers in Europe are experimenting with computers that analyze not what a caller is saying but how they say it (e.g., speaking rate, volume, frequency). A call is routed to a human agent for further analysis if something in the caller’s voice sounds ‘agitated’ or ‘suspicious’ (Wired Magazine, June 2005).
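The prosody-based routing described above can be sketched as a simple rule over acoustic features. This is a hypothetical illustration, not the actual call-center system: the feature names and thresholds are assumptions chosen for readability.

```python
# Hypothetical sketch of prosody-based call routing: route on HOW the
# caller speaks (rate, volume, pitch), not WHAT they say.
# All thresholds are illustrative assumptions, not values from a real system.

def classify_call(speaking_rate_wpm, volume_db, pitch_hz):
    """Flag a call as 'agitated' if enough prosodic features exceed rough thresholds."""
    score = 0
    if speaking_rate_wpm > 180:   # unusually fast speech
        score += 1
    if volume_db > 70:            # raised voice
        score += 1
    if pitch_hz > 250:            # elevated fundamental frequency
        score += 1
    # Two or more elevated features -> hand off to a human agent.
    return "route_to_human" if score >= 2 else "continue_automated"

print(classify_call(200, 75, 300))  # agitated caller
print(classify_call(140, 60, 120))  # calm caller
```

A deployed system would replace these hand-set thresholds with features extracted from the audio stream and a trained classifier, but the routing decision has the same shape.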
Affective Computing Applications (from Picard, 2000)
Applications to Surveillance • DARPA has called for proposals for an “Integrated System for Emotional State Recognition for the Enhancement of Human Performance and Detection of Criminal Intent.” • SRI included “Affective Computing” in its list of “Next Generation Technologies”
Applications to Surveillance An emotion-recognition system should, based on analysis of facial and bodily response data from a surveillance camera (supplemented perhaps by an audio recording and/or physiological data), be capable of: • deciding that a person is under severe stress, emotional distress, or impaired in some fashion, • deciding what the person’s intentions might be based on this determination, and • alerting a human observer and/or recommending or initiating a course of intervention.
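The three capabilities above form a pipeline: fuse the sensor signals into a state estimate, then map that estimate to a recommendation for the human observer. The sketch below is an illustrative assumption; the signal weights, thresholds, and labels are invented for the example, not drawn from any proposed system.

```python
# Illustrative sketch of the three-stage decision flow:
# (1) assess emotional state from fused signals, (2) classify its severity,
# (3) alert the observer / recommend an intervention.
# Weights, thresholds, and labels are hypothetical.

def assess_state(facial_stress, voice_stress, heart_rate):
    # Fuse camera, audio, and physiological signals (each in 0..1 after scaling)
    # into one stress estimate.
    combined = (0.5 * facial_stress
                + 0.3 * voice_stress
                + 0.2 * min(heart_rate / 120, 1.0))
    if combined > 0.7:
        return "severe_stress"
    if combined > 0.4:
        return "moderate_stress"
    return "normal"

def recommend(state):
    # Map the inferred state to an action for the human observer.
    actions = {
        "severe_stress": "alert observer; recommend immediate intervention",
        "moderate_stress": "flag for observer review",
        "normal": "no action",
    }
    return actions[state]

state = assess_state(facial_stress=0.9, voice_stress=0.8, heart_rate=130)
print(state, "->", recommend(state))
```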
Applications to Surveillance • The ultimate goal would be to complement (and perhaps eventually replace) human observers with computer-based systems • Unlikely that these systems will be utilized in public settings (e.g., airports, subway stations) any time soon. • The most likely applications will be in individualized settings such as automobiles, offices, plane cockpits.
Implementation Scenario #1 • Could include cameras in the engine room of a train (or cockpit of a plane) that could relay footage of a possibly sleepy, intoxicated or distressed driver/pilot to a central system • Central system could then either alert security personnel or deliver a warning to the driver/pilot and crew about the nature of the person’s emotional state. • The system might also include wearable sensors in the driver’s clothing to relay physiological data. • Were such an automated system in place, the train’s (or plane’s or ferry’s) crew could take over the control of the vehicle in such a situation, or perhaps an autopilot system could be engaged.
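The escalation logic in Scenario #1 can be sketched as a small decision rule run by the central system. Everything here (signal names, thresholds, action labels) is a hypothetical illustration of the scenario, not a specification.

```python
# Minimal sketch of the central system's response in Scenario #1 (assumed
# names and thresholds): warn the operator for moderate impairment, hand
# control to the crew / autopilot for severe impairment.

def respond(drowsiness, intoxication_flag):
    """drowsiness in 0..1 from camera/wearable fusion; flag from other checks."""
    if intoxication_flag or drowsiness > 0.8:
        return "engage_autopilot_and_alert_crew"
    if drowsiness > 0.5:
        return "warn_operator"
    return "monitor"
```

In practice the tiered structure (monitor, warn, intervene) matters more than the particular thresholds, which would need calibration against false-positive rates.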
Implementation Scenario #2 • Deployment scenario could involve the installation of emotion recognition systems in the offices of senior executives at major corporations in the financial industry, or in the offices of high-level government officials. • Because fraud and security violations perpetrated by ‘insiders’ are such a big problem, such a system might serve as a deterrent or provide early warning signs of potential problems. • This scenario, more so than Scenario #1, would probably generate more resistance because of its perceived invasive nature.
Privacy and Ethical Issues • Emotion-sensing systems, like face recognition systems, will be unobtrusive and non-invasive • Will they be thought of as more intrusive, because of their psychological nature, than face recognition systems? • What about false positives (i.e., system reports distress and the person reports no problem)? And possible faking or suppression on the part of individuals?
Privacy and Ethical Issues • Reynolds and Picard (2004) gave participants scenarios suggesting that a computer system would be capable of giving them music and news recommendations based on the ability of the computer to recognize emotion. • Half of the participants were also given a ‘contract’ outlining how the ‘emotional’ data would be gathered, and how it would be used.
Privacy and Ethical Issues • Participants were asked to report if they would use the system, how comfortable they would be with it, and whether they felt that their privacy would be affected by the system • Results indicated that the ‘contract’ group reported lower levels of ‘privacy invasion’ than the ‘no-contract’ group • Though the responses of the ‘contract’ group were still on the ‘invaded’ side of the scale
See-through scanner sets off alarms: ‘Virtual strip search’ testing provokes outcries (CNN.com, 2002) The screen of a ‘Rapiscan Secure 1000’ displays front- and rear-view images of a test subject’s body outline.
Rapiscan Secure 1000
Results of ‘Quickvote’ Survey on CNN.com • Mon Mar 18 2002 • Do you approve of airport security scanning equipment that can see through your clothing? • Yes. 63% (106,666 votes) • No. 37% (63,198 votes) • Total: 169,864 votes
A possible future for surveillance technology? • Represents the confluence of several current technologies • Surveillance camera networks • Wireless networks • Storage technology • Ubiquitous and wearable computing and sensing devices • Biometrics (especially face recognition) • Emotion-recognition by computers (i.e., Affective Computing)
Questions Remain • Assuming that technical problems can be solved, would emotion-recognition based surveillance systems be cost effective? • Would these systems provide more effective automated security or enhance the current human-centric systems? • How will the information generated by such systems be stored? And who will have access to it?
Questions Remain • Does the use of emotion recognition systems constitute a violation of a person’s right to privacy? • Should people be informed when they enter areas where this kind of surveillance is taking place?
Questions? • Thank you.