
Recognizing Human Activity from Sensor Data


Presentation Transcript


  1. Recognizing Human Activity from Sensor Data Henry Kautz, University of Washington, Computer Science & Engineering. Graduate students: Don Patterson, Lin Liao. CSE faculty: Dieter Fox, Gaetano Borriello. UW School of Medicine: Kurt Johnson. Intel Research: Matthai Philipose, Tanzeem Choudhury

  2. Converging Trends… • Pervasive sensing infrastructure • GPS enabled phones • RFID tags on all consumer products • Wireless motes • Breakthroughs in core artificial intelligence • After “AI boom” fizzled, basic science went on… • Advances in algorithms for probabilistic reasoning and machine learning • Bayesian networks • Stochastic sampling • Last decade: 10 variables → 1,000,000 variables • Healthcare crisis • Epidemic of Alzheimer’s Disease • Deinstitutionalization of the cognitively disabled • Nationwide shortage of caretaking professionals

  3. ...An Opportunity • Develop technology to • Support independent living by people with cognitive disabilities • At home • At work • Throughout the community • Improve health care • Long-term monitoring of activities of daily living (ADLs) • Intervention before a health crisis

  4. The University of Washington Assisted Cognition Project • Synthesis of work in • Ubiquitous computing • Artificial intelligence • Human-computer interaction • ACCESS • Support use of public transit • CARE • ADL monitoring and assistance

  5. This Talk • Building models of everyday plans and goals • From sensor data • By mining textual description • By engineering commonsense knowledge • Tracking and predicting a user’s behavior • Noisy and incomplete sensor data • Recognizing user errors • First steps toward proactive assistive technology

  6. ACCESS: Assisted Cognition in Community, Employment, & Support Settings. Supported by the National Institute on Disability & Rehabilitation Research (NIDRR) and the National Science Foundation (NSF). Learning & Reasoning About Transportation Routines

  7. Task • Given a data stream from a wearable GPS unit... • Infer the user’s location and mode of transportation (foot, car, bus, bike, ...) • Predict where user will go • Detect novel behavior • User errors? • Opportunities for learning?

  8. Why Inference Is Not Trivial • People don’t have wheels • Systematic GPS error • We are not in the woods • Dead and semi-dead zones • Lots of multi-path propagation • Inside of vehicles • Inside of buildings • Not just location tracking • Mode, Prediction, Novelty

  9. GPS Receivers We Used: a GeoStats wearable GPS logger and a Nokia 6600 Java cell phone with a Bluetooth GPS unit

  10. Geographic Information Systems • Street map (data source: Census 2000 TIGER/Line data) • Bus routes and bus stops (data source: Metro GIS)

  11. Architecture (diagram): a Learning Engine (goals, paths, modes, errors), a GIS Database, and an Inference Engine

  12. Probabilistic Reasoning • Graphical model: Dynamic Bayesian network • Inference engine: Rao-Blackwellised particle filters • Learning engine: Expectation-Maximization (EM) algorithm

  13. Graphical Model (Version 1) • Transportation Mode • Velocity • Location • Block • Position along block • At bus stop, parking lot, ...? • GPS Offset Error • GPS signal
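As a concrete illustration of the slide's state space, here is a minimal sketch of those variables as a Python data structure; the field names and types are illustrative assumptions, not the actual system's representation.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class FlatState:
    mode: str                       # transportation mode: "foot", "car", "bus", "bike", ...
    velocity: float                 # current speed estimate (m/s)
    block: int                      # street-map edge (block) the user is on
    offset: float                   # position along that block
    at_transfer_point: bool         # at a bus stop, parking lot, ...?
    gps_bias: Tuple[float, float]   # systematic GPS offset error (dx, dy)
    # the raw GPS signal is the observation conditioned on this hidden state
```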

  14. Rao-Blackwellised Particle Filtering • Inference: estimate current state distribution given all past readings • Particle filtering • Evolve approximation to state distribution using samples (particles) • Supports multi-modal distributions • Supports discrete variables (e.g.: mode) • Rao-Blackwellisation • Each particle includes a Kalman filter to represent distribution over positions • Improved accuracy with fewer particles
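A minimal sketch of the Rao-Blackwellised particle-filter idea, assuming a simplified one-dimensional location with Gaussian noise; `Particle`, `rbpf_step`, and all constants are illustrative, not the project's implementation. Each particle samples the discrete mode and carries a tiny Kalman filter over position.

```python
import math
import random
from dataclasses import dataclass

MODES = ["foot", "car", "bus"]
MODE_SPEED = {"foot": 1.5, "car": 12.0, "bus": 8.0}   # assumed mean speeds (m/s)
DT = 2.0                                              # GPS logged every 2 seconds
Q, R = 4.0, 25.0                                      # assumed process / GPS noise variances

@dataclass
class Particle:
    mode: str        # sampled discrete variable
    mu: float        # Kalman mean over 1-D position
    var: float       # Kalman variance
    weight: float

def rbpf_step(particles, z, switch_prob=0.05):
    """One filter step: sample mode, weight by predictive likelihood of z, Kalman-update, resample."""
    for p in particles:
        if random.random() < switch_prob:               # sample a mode transition
            p.mode = random.choice(MODES)
        mu_pred = p.mu + MODE_SPEED[p.mode] * DT        # motion model for this mode
        var_pred = p.var + Q
        s = var_pred + R                                # predictive variance of the observation
        p.weight *= math.exp(-(z - mu_pred) ** 2 / (2 * s)) / math.sqrt(2 * math.pi * s)
        k = var_pred / s                                # Kalman gain
        p.mu, p.var = mu_pred + k * (z - mu_pred), (1 - k) * var_pred
    total = sum(p.weight for p in particles) or 1e-300
    weights = [p.weight / total for p in particles]
    # resample particles in proportion to their normalized weights
    return [Particle(p.mode, p.mu, p.var, 1.0 / len(particles))
            for p in random.choices(particles, weights, k=len(particles))]
```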

  15. Tracking (map figure): blue = foot, green = bus, red = car

  16. Learning • User model = DBN parameters • Transitions between blocks • Transitions between modes • Learning: Monte-Carlo EM • Unlabeled data • 30 days of one user, logged at 2 second intervals (when outdoors) • 3-fold cross validation
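A hedged sketch of the Monte-Carlo EM loop described on this slide: the E-step runs the particle filter over the unlabeled GPS log and accumulates soft (weighted) transition counts, and the M-step renormalizes them into probabilities. The `run_filter` callback is an assumed stand-in for the tracker above, not the real code.

```python
from collections import defaultdict

def m_step(expected_counts):
    """Turn expected transition counts into a transition-probability table."""
    return {prev: {nxt: c / sum(nexts.values()) for nxt, c in nexts.items()}
            for prev, nexts in expected_counts.items()}

def monte_carlo_em(gps_log, run_filter, iters=10):
    trans = None                                   # start from uniform transitions
    for _ in range(iters):
        counts = defaultdict(lambda: defaultdict(float))
        # E-step: weighted particle trajectories yield expected transition counts
        for (prev_state, next_state), weight in run_filter(gps_log, trans):
            counts[prev_state][next_state] += weight
        trans = m_step(counts)                     # M-step: re-estimate parameters
    return trans
```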

  17. Results

  18. Prediction Accuracy (plot: probability of correctly predicting the future, by city blocks ahead). How can we improve predictive power?

  19. Transportation Routines (map showing bus stops A and B and the workplace) • Goals • work, home, friends, restaurant, doctor’s, ... • Trip segments • Home to Bus stop A on Foot • Bus stop A to Bus stop B on Bus • Bus stop B to workplace on Foot. “Learning & Inferring Transportation Routines”, Lin Liao, Dieter Fox, & Henry Kautz, AAAI-2004 Best Paper Award

  20. Hierarchical Model (DBN slice, variables at times k-1 and k): goal g_k, trip segment t_k, transportation mode m_k, state x_k = <Location, Velocity>, GPS reading z_k

  21. Hierarchical Learning • Learn flat model • Infer goals • Locations where user is often motionless • Infer trip segment begin / end points • Locations with high mode transition probability • Infer trip segments • High-probability single-mode block transition sequences between segment begin / end points • Perform hierarchical EM learning
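As an illustration of the first inference step above (goals as places where the user is often motionless), here is a toy sketch that buckets long stationary stretches of a GPS track into coarse grid cells; the thresholds and the grid clustering are assumptions for the example, not the paper's method.

```python
from collections import defaultdict

def candidate_goals(track, speed_thresh=0.5, min_dwell=300.0, cell=100.0):
    """track: list of (t, x, y, speed) samples; return grid cells ranked by stationary time."""
    dwell = defaultdict(float)
    still_since, prev_t = None, None
    for t, x, y, speed in track:
        if speed < speed_thresh:                           # user is (nearly) motionless
            still_since = t if still_since is None else still_since
            if prev_t is not None and t - still_since >= min_dwell:    # stationary >= 5 minutes
                dwell[(int(x // cell), int(y // cell))] += t - prev_t  # credit this 100 m cell
        else:
            still_since = None
        prev_t = t
    return sorted(dwell, key=dwell.get, reverse=True)      # best goal candidates first
```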

  22. Inferring Goals

  23. Inferring Trip Segments (figures: going to work, going home)

  24. Correct goal and route predicted 100 blocks away

  25. Novelty & Error Detection • Approach: model-selection • Run several trackers in parallel • Tracker 1: learned hierarchical model • Tracker 2: untrained flat model • Tracker 3: learned model with clamped final goal • Estimate the likelihood of each tracker given the observations
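A minimal sketch of the model-selection idea on this slide: run the trackers in parallel and compare how well each explains the incoming readings. The `observe` method is an assumed interface that performs one filter step and returns the log-likelihood of the reading; the tracker names are illustrative.

```python
def compare_trackers(trackers, readings):
    """Return each tracker's total log-likelihood over the observation stream."""
    loglik = {name: 0.0 for name in trackers}
    for z in readings:
        for name, tracker in trackers.items():
            loglik[name] += tracker.observe(z)   # assumed: one filter step + log p(z | model)
    return loglik

# Hypothetical interpretation: if the untrained flat model starts to explain the data
# better than the learned hierarchical model, the behavior is novel (or a possible error).
# scores = compare_trackers({"flat": flat, "learned": learned, "clamped-goal": clamped}, gps_stream)
```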

  26. Detect User Errors (plot comparing tracker likelihoods: untrained, trained, and goal-instantiated)

  27. Application: Opportunity Knocks Demonstration (by Don Patterson) at AAHA Future of Aging Services, Washington, DC, March 2004

  28. CARE: Cognitive Assistance in Real-world Environments. Supported by the Intel Research Council. Learning & Inferring Activities of Daily Living

  29. Research Hypothesis • Observation: activities of daily living involve the manipulation of many physical objects • Cooking, cleaning, eating, personal hygiene, exercise, hobbies, ... • Hypothesis: can recognize activities from a time-sequence of object “touches” • Such models are robust and easily learned or engineered
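A toy sketch of this hypothesis: score a stream of RFID object "touches" against per-activity object-usage models and pick the best-scoring activity. The object lists and probabilities below are made-up examples, not the project's learned models.

```python
import math

# Made-up object-usage probabilities for two example activities
ACTIVITY_OBJECTS = {
    "making tea":     {"kettle": 0.30, "teacup": 0.25, "tea bag": 0.25, "spoon": 0.10},
    "brushing teeth": {"toothbrush": 0.40, "toothpaste": 0.35, "faucet": 0.15},
}

def classify(touches, smoothing=0.01):
    """touches: sequence of tagged-object names read by the RFID reader."""
    scores = {activity: sum(math.log(probs.get(obj, smoothing)) for obj in touches)
              for activity, probs in ACTIVITY_OBJECTS.items()}
    return max(scores, key=scores.get)

print(classify(["kettle", "tea bag", "spoon", "teacup"]))   # -> "making tea"
```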

  30. Sensing Object Manipulation • RFID: Radio-frequency ID tags • Small • Semi-passive • Durable • Cheap

  31. Where Can We Put Tags?

  32. How Can We Sense Them? (coming: a wall-mounted “sparkle reader”)

  33. Example Data Stream

  34. Making Tea

  35. Building Models • Core ADLs amenable to classic knowledge engineering • Open-ended, fine-grained models: infer from natural language texts? • Perkowitz et al., “Mining Models of Human Activities from the Web”, WWW-2004
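In the spirit of the web-mining idea cited above, here is a hedged sketch of one way object models could be estimated from text: count how often an object co-occurs with an activity in instructional pages and normalize. `count_pages` is an assumed helper (e.g. a search-hit counter), not a real API.

```python
def mined_object_model(activity, objects, count_pages):
    """Rough P(object | activity) from co-occurrence counts in instructional text."""
    counts = {obj: count_pages(f'"{activity}" "{obj}"') for obj in objects}
    total = sum(counts.values()) or 1
    return {obj: c / total for obj, c in counts.items()}

# e.g. mined_object_model("making tea", ["kettle", "teacup", "spoon"], count_pages)
```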

  36. Experimental Setup • Hand-built library of 14 ADLs • 17 test subjects • Each asked to perform 12 of the ADLs • Data not segmented • No training on individual test subjects

  37. Results (figures): quantitative results (95/84) and anecdotal results for both the general solution and the point solution. Published in IEEE Pervasive Computing, Oct-Dec 2004

  38. Current Directions • Affective & physiological state • agitated, calm, attentive, ... • hungry, tired, dizzy, ... • Interactions between people • Human Social Dynamics • Principled human-computer interaction • Decision-theoretic control of interventions

  39. Why Now? • A goal of much AI work in the 1970s was to create programs that could understand the narrative of ordinary human experience • This area pretty much disappeared • Missing probabilistic tools • Systems not able to experience the world • Lacked focus – “understand” to what end? • Today: tools, grounding, motivation

  40. Challenge to Nanotechnology Community • Current sensors detect physical or physiological state: user mental state must be indirectly inferred • To what extent can nanotechnology afford direct access to a person’s emotions and intentions?
