Sentient recipes
First Year Talk
Simon Fothergill, DTG, Computer Laboratory, University of Cambridge
February 2006
Presentation Content
• Intent of Ph.D.
• What I have done
• The area I have carved out so far
• Ideas for how to continue
• What I am working on at the moment
• Tie up
• Comments, suggestions, criticisms…
Summary of Ph.D.: Inferring stuff!
• Sentient Computing, context awareness, sensor fusion
• Signal to symbol translation (stepwise, logical and statistical disambiguation) (sketch below)
• Extending the Sentient vocabulary
• Trying a number of different domains:
  • Location (x, y, z)
  • Sentient Lecture Theatre (x, y, z, sound, video)
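A minimal sketch of the stepwise signal-to-symbol idea for the location domain, assuming hypothetical room boundaries and a crude posture threshold; none of these names come from the Bat system itself.

```python
# Hypothetical sketch of stepwise signal-to-symbol translation for the
# location domain; room boundaries and thresholds are illustrative only.

ROOMS = {
    "corridor": ((0.0, 0.0), (10.0, 2.0)),  # ((x_min, y_min), (x_max, y_max))
    "office":   ((0.0, 2.0), (5.0, 8.0)),
}

def signal_to_symbol(x, y, z):
    """Translate one raw (x, y, z) sighting into a symbolic statement."""
    for room, ((x0, y0), (x1, y1)) in ROOMS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            posture = "standing" if z > 1.2 else "sitting"  # crude second step
            return f"user is {posture} in the {room}"
    return "user location unknown"

print(signal_to_symbol(3.1, 4.2, 0.9))  # -> "user is sitting in the office"
```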
What I have been doing
• Experimented with the Bat system:
  • Python programming
  • Filtering, logging and visualisation of my movements for ~3 weeks (sketch below)
  • Bat poster
  • Bat buttons
• Analysis of GPS data (inaccessible, not ready to commit!)
• Lecture theatre (abandoned the research proposal! - lighting, other ideas, microphone installation)
• Broadband phones (too complex)
• Background to signal to symbol translation taken from other fields
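Roughly the shape of the filtering-and-logging step; the (timestamp, x, y, z, error) sighting format and the thresholds are assumptions for illustration, not the real Bat system interface.

```python
import csv

# Illustrative sketch: filter and log a stream of Bat sightings before
# visualisation. The (timestamp, x, y, z, error) tuple format and the
# thresholds are assumptions, not the real Bat system interface.

MAX_ERROR_M = 0.15   # drop sightings reported with a large positional error
MAX_JUMP_M = 2.0     # drop physically implausible jumps between sightings

def plausible(prev, cur):
    """Reject a sighting that is impossibly far from the previous one."""
    if prev is None:
        return True
    dx, dy, dz = cur[1] - prev[1], cur[2] - prev[2], cur[3] - prev[3]
    return (dx * dx + dy * dy + dz * dz) ** 0.5 < MAX_JUMP_M

def filter_and_log(sightings, path="bat_trail.csv"):
    """Write the cleaned trail to a CSV file ready for visualisation."""
    prev = None
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "x", "y", "z"])
        for s in sightings:
            ts, x, y, z, err = s
            if err > MAX_ERROR_M or not plausible(prev, s):
                continue
            writer.writerow([ts, x, y, z])
            prev = s
```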
Vision
• Recipe analogy (a mapping for how to create ways of getting information about different phenomena):
  • Want machine understanding of a phenomenon (the dish) for better interaction
  • You need these sensors (the ingredients)
  • Analyse the data using these algorithms (combine the ingredients according to these instructions)
  • Stepwise procedure with sub-parts
  • Infer results (produce the dish)
• Plug and play (interface sketched below):
  • Plug any sensor into the local sensor infrastructure
  • Possibly need a driver/configuration/calibration phase (possibly long-term training)
  • Possibly specify constraints
  • Get some “high level” information on what it senses
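One way the plug-and-play side could be expressed as an interface; all class and method names are hypothetical, chosen only to mirror the steps listed above.

```python
from abc import ABC, abstractmethod

# Hypothetical plug-and-play interface mirroring the recipe analogy;
# every name here is illustrative, not an existing API.

class Sensor(ABC):
    @abstractmethod
    def calibrate(self):
        """Driver/configuration/calibration phase (possibly long-term training)."""

    @abstractmethod
    def read(self):
        """Return one raw sample from the sensor."""

class Recipe:
    """Combine ingredients (sensors) according to instructions (inference steps)."""

    def __init__(self, sensors, steps, constraints=None):
        self.sensors = sensors            # {"name": Sensor, ...}
        self.steps = steps                # ordered, stepwise sub-parts
        self.constraints = constraints or {}

    def cook(self):
        data = {name: s.read() for name, s in self.sensors.items()}
        for step in self.steps:           # each step refines the data
            data = step(data, self.constraints)
        return data                       # the "dish": high-level information
```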
Previous work
• Signal to symbol translation in other domains (NLP, emotions, protein folding)
• Stepwise, stack-based separation of concerns/levels
• Context awareness, context models, middleware infrastructure, programming paradigm, with simple logical examples (Lab assistant)
• Robust location systems, fusion of similar sensors, uncertain reasoning about topological information
• Not much “hard-core” detailed inference
Ideas for what to focus on in future
• Analysis of relevant verticals (one at a time) to find the best descriptions of, and exact words for, the phenomena/features that can be disambiguated – define the parameter space
• Try to recognise the items on these lists by building up inferences from raw data from different sensor systems
• For theoreticians / formal reasoners / linguists (sketches below):
  • Sensor metric: entropy of the signal; examine properties of the signal
  • What happens to the recognition graph when the sensors change
  • Percentage of time X is recognised with probability Y
• Sensor fusion: how to use the data appropriately
  • Compute P(evidence | witness)
• Semantic net: necessary and sufficient signal properties or data to infer a phenomenon, for example a meeting
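Sketches of the metrics mentioned above, assuming signals arrive as plain lists of numbers and that the probabilities involved have already been estimated; the bin count and the threshold Y are arbitrary.

```python
import math
from collections import Counter

# Sketches of the proposed metrics; the bin count, threshold and any example
# probabilities are arbitrary illustrations.

def signal_entropy(samples, n_bins=16):
    """Shannon entropy (in bits) of a discretised one-dimensional signal."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / n_bins or 1.0          # avoid /0 for a constant signal
    counts = Counter(int((s - lo) / width) for s in samples)
    total = len(samples)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def posterior(prior, p_evidence_given_x, p_evidence):
    """Bayes' rule: P(X | evidence) = P(evidence | X) * P(X) / P(evidence)."""
    return p_evidence_given_x * prior / p_evidence

def recognised_fraction(posteriors, y=0.9):
    """Percentage of time X is recognised with probability at least Y."""
    return 100.0 * sum(p >= y for p in posteriors) / len(posteriors)
```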
Example recognition graph
Diagram: phenomena recognisable with 1 bat versus 2 bats, each with a probability P(X), e.g. left/right and forwards/backwards movement; leaning.
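One possible representation of such a recognition graph, with made-up probabilities, showing how a second bat extends what can be recognised.

```python
# Illustrative recognition graph; the probabilities P(X) are made up.
RECOGNITION_GRAPH = {
    1: {"left/right movement": 0.8, "forwards/backwards movement": 0.8},
    2: {"leaning": 0.6},   # a second bat extends the vocabulary
}

def recognisable(n_bats):
    """Phenomena recognisable with at least n_bats bats, with their P(X)."""
    out = {}
    for needed, phenomena in RECOGNITION_GRAPH.items():
        if n_bats >= needed:
            out.update(phenomena)
    return out

print(recognisable(2))  # includes leaning as well as the movement symbols
```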
“An” ontology of ontologies (Sentient vocabulary)…
Diagram: a map of candidate vocabulary, including Location, Topography (corridor scale; office scale: running around, energy path), Direction, Length, Speed, Movement, Posture (Leaning, Slouching), Gesticulation, Activities (Meeting, Lecture Theatre, Theatre, Sport), Body (Ergonomics, Alexander Technique, RSI), Sound, Speech tracking, Mood, Smell, Pheromones, Social insects, Environmental conditions, Lighting, Cars.
…My current corner
• Using the sensors we’ve got, or simple extensions (multiple bats give much more information)
• Lecture Theatre:
  • Microphones, video, Ubisense kit
  • Speech tracking based on lecture notes, dialogue pattern, compression of: he goes THERE, THEN, in THIS way, saying THIS in THIS way
  • Interest level, sight lines, obscuration, lines of sight
• 1 + 2 bats:
  • Worn normally
  • Worn front and back
  • On lapels and in pockets
  • As a ring and wrist band
• 3D visualisation of trails; now machine clustering, classification?
• Upwards: add a temporal index and extend uncertainty beyond the sensor system: (Z+, X-) as (forwards, left); shapes
  • Up 5 cm means different things, depending on the history
• Detecting a slouch (sketch below). A good example: defined application area, extends the vocabulary, the granularity of the sensors supports it, enough variation
  • Standing, lowering, contact point, relax
  • Any user
  • Beeps if the polling rate is too low (a simple metric)
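A sketch of how the slouch detection might start, assuming a regularly sampled time series of bat heights; the drop threshold and window are illustrative, and the rolling baseline reflects the point that “up 5 cm” only means something relative to the recent history.

```python
# Hypothetical slouch detector over a time series of bat heights (z, metres).
# The drop threshold and window length are illustrative, not measured values.

def detect_slouch(heights, drop_m=0.05, window=10):
    """Return indices where a sustained drop below the recent baseline
    suggests a slouch; the same 5 cm change means different things
    depending on the history, hence the rolling baseline."""
    events = []
    for i in range(window, len(heights)):
        baseline = sum(heights[i - window:i]) / window
        if baseline - heights[i] > drop_m:
            events.append(i)
    return events

# e.g. a seated user whose lapel bat gradually lowers:
trail = [1.20] * 15 + [1.13] * 5
print(detect_slouch(trail))  # indices shortly after the drop begins
```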
Previous work on standing/sitting
Diagram: Eli Katsiri, Thesis
Titles
• Previous title: How to solve the meeting problem
• Current title: Sentient recipes
• Possible future title: {a specific recipe/type of dish(!)}
People • Andy Hopper • Sean Holden • Rob Harle • Alastair Beresford • Alastair Tse • Bo
Time to chat! • Comments? • Suggestions? • Criticisms? Thank you.