Thrust IIA: Environmental State Estimation and Mapping. Dieter Fox (Lead), Nicholas Roy. MURI 8 Kickoff Meeting 2007
Task Objective: Human-Centered Maps • Observation: Automatic map-building (SLAM) is solved sufficiently well • Goal: Describe environments by higher-level concepts: • Places (room, hallway, street, walkway, parking lot, …) • Objects (tree, person, building, car, wall, …) • Key challenges: • Estimating concept types is mostly a discrete problem • Complex features and relationships
Existing Technology • Human-centered mapping requires • integration of high-dimensional, continuous features from multi-modal sensor data • reasoning about spatial and temporal relationships • Conditional Random Fields provide an extremely flexible probabilistic framework for learning and inference
Conditional Random Fields • Discriminative, undirected graphical model • Introduced for labeling sequence data to overcome weaknesses of Hidden Markov Models [Lafferty-McCallum-Pereira: ICML-01] • Applied successfully to • Natural language processing [McCallum-Li: CoNLL-03], [Roth-Yih: ICML-05] • Computer vision [Kumar-Hebert: NIPS-04], [Quattoni-Collins-Darrell: NIPS-05] • Robotics [Limketkai-Liao-Fox: IJCAI-05], [Douillard-Fox-Ramos: IROS-07]
Conditional Random Fields • Directly models the conditional probability p(x|z) (instead of modeling p(z|x) and p(x) and using Bayes rule to infer p(x|z)) • No independence assumption on the observations is needed! [Figure: graphical model with hidden states x and observations z]
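For reference, the standard CRF conditional distribution, written in terms of generic clique feature functions f_c and weight vectors w_c (symbols here are the textbook form, not the specific features used in this project):

```latex
% Standard CRF form: labels x, observations z, cliques c with features f_c and weights w_c
p(\mathbf{x} \mid \mathbf{z}) \;=\; \frac{1}{Z(\mathbf{z})}
  \exp\!\Big( \sum_{c \in \mathcal{C}} \mathbf{w}_c^\top \mathbf{f}_c(\mathbf{x}_c, \mathbf{z}) \Big),
\qquad
Z(\mathbf{z}) \;=\; \sum_{\mathbf{x}'} \exp\!\Big( \sum_{c \in \mathcal{C}} \mathbf{w}_c^\top \mathbf{f}_c(\mathbf{x}'_c, \mathbf{z}) \Big)
```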
Online Object Recognition [Douillard-Fox-Ramos: IROS-07, ISRR-07]
From Laser Scans to CRFs [Figure: CRF with one object-type node per laser beam (beam 1 through beam n), each attached to shape and appearance features extracted from the scan]
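As a rough illustration of the per-beam inputs such a model consumes, the sketch below computes simple shape features (range plus local range discontinuities) for each beam of a 2D scan. The feature choice and array layout are assumptions for illustration, not the features used in Douillard-Fox-Ramos:

```python
import numpy as np

def beam_shape_features(ranges):
    """Toy per-beam shape features for a 2D laser scan.

    Assumed layout: `ranges` is a 1-D array with one range reading per beam.
    Returns an (n_beams, 3) array: [range, left discontinuity, right discontinuity],
    an illustrative stand-in for the richer shape/appearance features attached
    to each object-type node in the CRF.
    """
    r = np.asarray(ranges, dtype=float)
    # Absolute range differences to the neighboring beams, padded at the scan borders.
    left = np.abs(np.diff(r, prepend=r[0]))
    right = np.abs(np.diff(r, append=r[-1]))
    return np.stack([r, left, right], axis=1)

# Example: a synthetic 8-beam scan with a range jump (an object edge) in the middle.
scan = [4.0, 4.1, 4.0, 1.2, 1.3, 1.2, 4.2, 4.3]
print(beam_shape_features(scan))
```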
Temporal Integration • Taking past and future scans into account can improve labeling accuracy. • Match consecutive laser scans using ICP. • Associated laser points are connected in the CRF (see the sketch below). • Can perform online filtering or offline smoothing via belief propagation (BP). [Figure: temporal CRF chain linking scans k-2, k-1, k, k+1]
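A minimal sketch of the association step, assuming consecutive scans have already been aligned into a common frame (e.g., by ICP). The KD-tree lookup and the gating threshold are illustrative choices, not the implementation from the cited work:

```python
import numpy as np
from scipy.spatial import cKDTree

def temporal_links(prev_pts, curr_pts, max_dist=0.3):
    """Associate each point of the current (already ICP-aligned) scan with its
    nearest point in the previous scan, keeping only matches closer than
    `max_dist` meters. Returns (prev_index, curr_index) pairs, i.e. the
    temporal edges that would connect label nodes of consecutive scans.
    """
    tree = cKDTree(prev_pts)
    dists, idx = tree.query(curr_pts)  # nearest previous point for each current point
    return [(int(i), j) for j, (d, i) in enumerate(zip(dists, idx)) if d <= max_dist]

# Example: two small synthetic scans of (x, y) points in the same frame.
prev_scan = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.1]])
curr_scan = np.array([[0.05, 0.0], [1.1, 0.05], [5.0, 5.0]])  # last point has no match
print(temporal_links(prev_scan, curr_scan))
```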
Example Trace: Car vs. Others • Trained on 90 labeled scans • Inference via filtering in CRF
Proposed Technical Advances • Integrate recognition results into maps • Improve results by leveraging web training data and high-level object detectors • Add object types suited to the target scenario • Improve CRF training
Situation Awareness via Wearable Sensors • Records 4 hours of audio, images (1/sec), GPS, and sensor data (accelerometer, barometric pressure, light intensity, gyroscope, magnetometer) [Figure: wearable sensor unit showing indicator LEDs, 2 GB SD card, light sensors, microphone, and camera]
Soldier Activity Recognition • Automatic generation of mission summaries • Motion type (linger, walk, run, drive, …) • Environment (inside, outside building) • Events (conversations, marked via keyword) • Technical challenges • High-dimensional, continuous observations / features • Different data rates (1 Hz to 256 Hz; see the resampling sketch below) • Getting labeled training data • Different persons / environments
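One of the listed challenges is fusing streams recorded at very different rates. A common preprocessing step (an assumed, generic approach, not necessarily the project's pipeline) is to interpolate every stream onto a shared timeline before feature extraction:

```python
import numpy as np

def resample_to_common_rate(streams, rate_hz=1.0):
    """Linearly interpolate multiple sensor streams onto one shared timeline.

    `streams` maps a sensor name to (timestamps_sec, values), each a 1-D array.
    Returns (common_timestamps, {name: resampled_values}). Generic multi-rate
    alignment sketch, not the pipeline used on the wearable platform.
    """
    t_start = max(t[0] for t, _ in streams.values())
    t_end = min(t[-1] for t, _ in streams.values())
    common_t = np.arange(t_start, t_end, 1.0 / rate_hz)
    return common_t, {name: np.interp(common_t, t, v) for name, (t, v) in streams.items()}

# Example: 256 Hz accelerometer magnitude and 1 Hz light intensity over 10 seconds.
t_acc = np.arange(0.0, 10.0, 1.0 / 256.0)
t_light = np.arange(0.0, 10.0, 1.0)
streams = {
    "accel": (t_acc, np.abs(np.sin(t_acc))),
    "light": (t_light, np.linspace(100.0, 200.0, t_light.size)),
}
common_t, aligned = resample_to_common_rate(streams, rate_hz=1.0)
print(common_t.shape, aligned["accel"].shape, aligned["light"].shape)
```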
Data Visualization / Summarization [Figure: visualization showing GPS traces, the image sequence (currently in car), and a timeline of soldier activities]
Milestones • Goals: • Real-time wearable interface on cell phone • Data sharing among soldiers and robots • Real-time display on remote laptop
Milestones • Year 1: • Real-time data sharing between wearable sensor platforms • Integration of object recognition into mapping • Year 2: • Real-time data sharing between soldiers, robots, and remote laptop • Detection of specific soldier states / activities (moving, incapacitated, ...)