Natural User Interface Systems

Presentation Transcript


  1. Natural User Interface Systems Vijay Kumar Kolagani Yingcai Xiao

  2. Outline • Introduction • Hardware • Software • Device Drivers • Middleware • API • API Standardization • Examples

  3. Introduction

  4. NUI • A natural user interface (NUI) is a system for communicating with a computing device through natural interactions (gestures, voice, …) without touching a physical device.

  5. HCI Taxonomy (M. Karam and M.C. Schraefel)

  6. Evolution of HCI

  7. Expanding Dimensions • Real World: 3D • CLI: 0D, 2D • GUI: 2D • NUI: 3D

  8. From traditional to Non-traditional APPLICATIONS (architecture diagram: Display, User, Graphics, Application, Controller)

  9. Video Game • Interactive animation: user -> interface -> game object action -> feedback (A/V, haptic) • Game objects can represent data.
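
  The interaction loop on this slide can be sketched in a few lines of C#. Everything here is hypothetical (no particular engine or NUI SDK is assumed); console input stands in for the NUI device and console output stands in for A/V feedback.

    // Hypothetical sketch of the user -> interface -> game-object-action -> feedback loop.
    using System;

    class GameObject
    {
        public int Value;                     // a game object can represent data
        public void Apply(string action)      // interface -> game object action
        {
            if (action == "push") Value++;
        }
    }

    class Program
    {
        static void Main()
        {
            var obj = new GameObject();
            string gesture;
            // user -> interface: console input stands in for a gesture stream
            while ((gesture = Console.ReadLine()) != null)
            {
                obj.Apply(gesture);                                   // game object action
                Console.WriteLine("feedback: value = " + obj.Value);  // feedback (A/V stand-in)
            }
        }
    }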

  10. From traditional to Non-traditional APPLICATIONS

  11. NUI Hardware

  12. Key Features of NUI hardware • 3D Interaction • Point cloud • Structured light based • Time-of-flight based

  13. Key Features of NUI hardware • 3D Interaction • Point cloud • Structured light based • Time-of-flight based • Triangulation
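
  To illustrate the triangulation principle behind structured-light depth sensing (the approach used by Kinect-class sensors), here is a small C# sketch. The focal-length and baseline values are made up for illustration, not real device calibration.

    // Depth from triangulation: Z = f * b / d, where f is the focal length in pixels,
    // b the emitter-to-camera baseline, and d the observed disparity of the projected pattern.
    using System;

    class TriangulationDepth
    {
        const double FocalLengthPx = 580.0;   // assumed IR-camera focal length (pixels)
        const double BaselineM     = 0.075;   // assumed baseline (meters)

        static double DepthFromDisparity(double disparityPx)
        {
            return FocalLengthPx * BaselineM / disparityPx;   // depth is inversely proportional to disparity
        }

        static void Main()
        {
            foreach (double d in new[] { 10.0, 20.0, 40.0 })
                Console.WriteLine("disparity {0,5:F1} px -> depth {1:F2} m", d, DepthFromDisparity(d));
        }
    }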

  14. Key Features of NUI hardware • Multi-channels • Infrared • RGB (Fusion) • Audio (Synch) • Touch (Haptic) • Smell • Taste

  15. Key Features of NUI hardware • Specialized APIs • Lack of standardization at the current stage of development • OpenNI is a good starting point, but it has not kept up with the fast-advancing NUI development.

  16. NUI Hardware Examples

  17. Kinect • Kinect was code-named Project Natal during its development. • The Kinect for Windows SDK was released to the public in February 2012. • The SDK lets developers write Kinect apps in C++, C#, or Visual Basic.
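
  A minimal skeleton-tracking sketch against the Kinect for Windows SDK 1.x managed API (namespace Microsoft.Kinect). The member names follow that SDK generation; later SDKs (e.g., Kinect for Windows v2) use a different API.

    // Open the first Kinect sensor, enable skeleton tracking, and print the right-hand position.
    using System;
    using Microsoft.Kinect;

    class KinectSkeletonDemo
    {
        static void Main()
        {
            KinectSensor sensor = KinectSensor.KinectSensors[0];   // first connected sensor
            sensor.SkeletonStream.Enable();                        // turn on joint tracking
            sensor.SkeletonFrameReady += OnSkeletonFrame;          // per-frame callback
            sensor.Start();
            Console.ReadLine();                                    // keep the process alive
            sensor.Stop();
        }

        static void OnSkeletonFrame(object sender, SkeletonFrameReadyEventArgs e)
        {
            using (SkeletonFrame frame = e.OpenSkeletonFrame())
            {
                if (frame == null) return;
                Skeleton[] skeletons = new Skeleton[frame.SkeletonArrayLength];
                frame.CopySkeletonDataTo(skeletons);
                foreach (Skeleton s in skeletons)
                {
                    if (s.TrackingState != SkeletonTrackingState.Tracked) continue;
                    SkeletonPoint hand = s.Joints[JointType.HandRight].Position;   // meters, sensor-centered
                    Console.WriteLine("Right hand: {0:F2}, {1:F2}, {2:F2}", hand.X, hand.Y, hand.Z);
                }
            }
        }
    }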

  18. Kinect

  19. Leap Motion • The first product, originally called “The Leap”, launched in 2012. • Leap Motion is a small USB device that lets users control the computer with hand movements. • It can detect each finger of your hand individually.

  20. Leap Motion • The Leap Motion controller consists of three infrared light emitters and two cameras that receive the reflected IR light. • Leap Motion provides a powerful API that helps track and recognize fingers with high precision. • Positions are reported in a Cartesian coordinate system.
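
  A finger-tracking sketch using the classic Leap Motion C# bindings (namespace Leap). The member names (Controller, Frame, Fingers, TipPosition) follow the v2-era SDK and may differ in newer Ultraleap releases.

    // Poll the most recent tracking frame and print every detected fingertip position.
    using System;
    using System.Threading;
    using Leap;

    class LeapFingerDemo
    {
        static void Main()
        {
            Controller controller = new Controller();        // connects to the Leap Motion service
            while (true)
            {
                Frame frame = controller.Frame();             // most recent tracking frame
                foreach (Finger finger in frame.Fingers)
                {
                    Vector tip = finger.TipPosition;          // millimeters, device-centered Cartesian axes
                    Console.WriteLine("Finger {0}: ({1:F1}, {2:F1}, {3:F1})",
                                      finger.Id, tip.x, tip.y, tip.z);
                }
                Thread.Sleep(100);                            // simple polling; a Listener callback also works
            }
        }
    }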

  21. Leap Motion

  22. HoloLens • https://www.microsoft.com/en-us/hololens • Mixed Reality • Audio • RGB • Depth (ToF) • Fusion • VR

  23. HoloLens • Mixed Reality • Infrared • RGB • Fusion • VR/AR/MR <=> XR https://www.youtube.com/watch?v=EIJM9xNg9xs https://www.youtube.com/watch?v=6lxGU66w0NM

  24. HoloLens https://www.microsoft.com/en-us/hololens • HW Details https://docs.microsoft.com/en-us/windows/mixed-reality/hololens-hardware-details https://www.theverge.com/2016/4/6/11376442/microsoft-hololens-holograms-parts-teardown-photos-hands-on https://www.microsoft.com/en-us/hololens/buy

  25. NUI Input Abstraction

  26. Gesture Recognition • Gesture recognition follows three steps: extraction, feature estimation, and classification.
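
  The three steps can be sketched as one small pipeline class. Everything below is illustrative only (hypothetical names, trivial stand-in logic), not code from any NUI library.

    // Gesture recognition pipeline: extraction -> feature estimation -> classification.
    using System;

    class HandFrame { public float PalmX, PalmY, PalmZ; }   // stand-in for segmented hand data

    class GestureRecognizer
    {
        // Step 1: extraction - segment the hand out of the raw depth frame.
        HandFrame Extract(short[] depthFrame)
        {
            return new HandFrame { PalmX = depthFrame[0] };  // trivial placeholder
        }

        // Step 2: feature estimation - compute numeric features (e.g., palm position, finger angles).
        float[] Estimate(HandFrame hand)
        {
            return new[] { hand.PalmX, hand.PalmY, hand.PalmZ };
        }

        // Step 3: classification - map the feature vector to a gesture label.
        string Classify(float[] features)
        {
            return features[0] > 100 ? "push" : "none";      // trivial placeholder rule
        }

        public string Recognize(short[] depthFrame)
        {
            return Classify(Estimate(Extract(depthFrame)));
        }
    }

    class Program
    {
        static void Main()
        {
            Console.WriteLine(new GestureRecognizer().Recognize(new short[] { 250 }));  // prints "push"
        }
    }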

  27. Hand gesture • Thiery and Malek separate gestures into two types: • Intransitive gestures, performed in the absence of the interactive object • Transitive gestures, performed in the presence of the interactive object

  28. Abstract hand gestures can be designed according to application needs.

  29. API

  30. NUI API • Three parts of a NUI application: • Hardware: e.g., Kinect • Software: drivers, middleware • Application: integration of the hardware-enabling software with the application.

  31. OpenNI • (1) body imaging, (2) joint recognition, (3) gesture construction

  32. OpenNI

  33. Interface

  34. Interfaces • An interface is a group of zero or more abstract methods. • Abstract methods have no default implementation. • Abstract methods are to be implemented in a child class or child struct. • Subclassing an interface by a class or struct is called implementing the interface. • An interface can be implemented but not instantiated: you can’t use an interface type to create an object. • Interfaces can also include properties and events, but no data. • An interface defines a contract between a type and the users of that type; it is used to define software interface standards. • All interface methods are public; no access specifiers are needed. • A class can implement multiple interfaces.

  35. Interface Example

    interface ISecret
    {
        void Encrypt(byte[] inbuf, byte[] outbuf, Key key);
        void Unencrypt(byte[] inbuf, byte[] outbuf, Key key);
    } // no implementation, just prototyping

    class Message : ISecret
    {
        public void Encrypt(byte[] inbuf, byte[] outbuf, Key key) { /* implementation here */ }
        public void Unencrypt(byte[] inbuf, byte[] outbuf, Key key) { /* implementation here */ }
    }

    Message msg = new Message();
    // e.g., check whether object msg implements interface ISecret.
    // An object of a child type is also an object of the parent type, but not the other way around.
    if (msg is ISecret)                   // type checking
    {
        ISecret secret = (ISecret) msg;   // from child to parent, explicit cast
        secret.Encrypt(...);
    }

  36. Abstract Class • An abstract class is a class that can’t be instantiated, i.e., one can’t use an abstract class to create an object. • The definition of an abstract class looks like a regular class except for the preceding keyword “abstract”. • It can have member fields and methods. • It can only be used as a base class for subclassing. • Its subclasses can inherit its methods as default implementations. (They can override those methods too.) • It is not allowed to inherit from multiple abstract classes.

  37. Abstract Class vs. Interface • Neither can be instantiated. • Both define standards for their subclasses to implement. • An abstract class defines a minimal implementation for its subclasses. • An interface has no implementation at all. • A child class can’t subclass more than one abstract class: no multiple inheritance for abstract classes. • A child class can implement more than one interface: multiple inheritance is allowed for interfaces. • Abstract classes and interfaces can be used together.

  38. Abstract Class & Interface Examples

    abstract class DefaultTokenImpl
    {
        private readonly string name;
        protected DefaultTokenImpl(string name) { this.name = name; }
        public override string ToString() { return name; }   // default implementation for subclasses
    }

    // Minimal visitor interface, assumed here so the example compiles (not shown on the original slide).
    interface ITokenVisitor { void VisitKeyword(string keyword); }

    interface IToken { string ToString(); }
    interface IVisitable { void Accept(ITokenVisitor visitor); }
    interface IVisitableToken : IVisitable, IToken { }

    class KeywordToken : DefaultTokenImpl, IVisitableToken
    {
        public KeywordToken(string name) : base(name) { }
        void IVisitable.Accept(ITokenVisitor visitor) { visitor.VisitKeyword(ToString()); }
    }

  39. Abstract Class & Interface Examples • KeywordToken subclasses the abstract class DefaultTokenImpl. • It also implements the interface IVisitableToken (which inherits the interfaces IVisitable and IToken). • It implements the Accept abstract method specified in interface IVisitable (one parent of IVisitableToken). • It inherits the default implementation of ToString from the abstract class DefaultTokenImpl to implement the ToString abstract method specified in interface IToken (the other parent of IVisitableToken).

  40. OpenNI • Production Nodes: • a set of components that have a productive role in the data creation process required for Natural Interaction based applications. • the API of the production nodes only defines the language. • The logic of data generation must be implemented by the modules that plug into OpenNI. • E.g. for a production node that represents the functionality of generating hand-point data, the logic of hand-point data generation must come from an external middleware component that is both plugged into OpenNI, and also has the knowledge of how to produce such data.

  41. OpenNI: Sensor-Related Production Nodes •  Device: represents a physical device (a depth sensor, or an RGB camera). Its main role is to enable device configuration. •  Depth Generator: generates a depth-map. Must be implemented by any 3D sensor that wishes to be certified as OpenNI compliant. •  Image Generator: generates colored image-maps. Must be implemented by any color sensor that wishes to be certified as OpenNI compliant •  IR Generator: generates IR image-maps. Must be implemented by any IR sensor that wishes to be certified as OpenNI compliant. •  Audio Generator: generates an audio stream. Must be implemented by any audio device that wishes to be certified as OpenNI compliant.

  42. OpenNI: Middleware-Related Production Nodes • Gestures Alert Generator: Generates callbacks to the application when specific gestures are identified. • Scene Analyzer: Analyzes a scene, including the separation of the foreground from the background, identification of figures in the scene, and detection of the floor plane. The Scene Analyzer’s main output is a labeled depth map, in which each pixel holds a label that states whether it represents a figure or is part of the background. • Hand Point Generator: Supports hand detection and tracking. This node generates callbacks that provide alerts when a hand point (meaning, a palm) is detected, and when a hand point that is currently being tracked changes its location. • User Generator: Generates a representation of a (full or partial) body in the 3D scene.

  43. OpenNI: Recording Production Nodes • Recorder: Implements data recordings. • Player: Reads data from a recording and plays it. • Codec: Used to compress and decompress data in recordings.

  44. OpenNI: Capabilities • Supports the registration of multiple middleware components and devices. OpenNI is released with a specific set of capabilities, with the option of adding further capabilities in the future. Each module can declare the capabilities it supports. • Currently supported capabilities: •  Alternative View: Enables any type of map generator to transform its data to appear as if the sensor is placed in another location. •  Cropping: Enables a map generator to output a selected area of the frame. •  Frame Sync: Enables two sensors producing frame data (for example, depth and image) to synchronize their frames so that they arrive at the same time.

  45. OpenNI: Capabilities • Currently supported capabilities: •  Mirror: Enables mirroring of the data produced by a generator. •  Pose Detection: Enables a user generator to recognize when the user is posed in a specific position. •  Skeleton: Enables a user generator to output the skeletal data of the user. This data includes the location of the skeletal joints, the ability to track skeleton positions and the user calibration capabilities. •  User Position: Enables a Depth Generator to optimize the output depth map that is generated for a specific area of the scene.

  46. OpenNI: Capabilities • Currently supported capabilities: •  Error State: Enables a node to report that it is in "Error" status, meaning that on a practical level, the node may not function properly. •  Lock Aware: Enables a node to be locked outside the context boundary. •  Hand Touching FOV Edge: Alert when the hand point reaches the boundaries of the field of view.

  47. OpenNI: Generating and Reading Data • Production nodes that also produce data are called Generators. • Once created, they do not immediately start generating data, so that the application can set the required configuration first. • The xn::Generator::StartGenerating() function is used to begin generating data. • The xn::Generator::StopGenerating() function stops it. • Data generators "hide" new data internally until explicitly requested to expose the most updated data to the application, using an UpdateData request function. • OpenNI enables the application to wait for new data to be available and then update it, using the xn::Generator::WaitAndUpdateData() function.
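
  A sketch of this start/wait/read pattern in C#, written against the OpenNI 1.x .NET wrapper. The managed names used here (Context, DepthGenerator, StartGenerating, WaitAndUpdateData) are assumed to mirror the C++ xn:: calls named above; verify them against the wrapper shipped with your OpenNI installation.

    using System;
    using OpenNI;

    class OpenNIDepthDemo
    {
        static void Main()
        {
            Context context = new Context();                     // OpenNI runtime context
            DepthGenerator depth = new DepthGenerator(context);  // a data-generating production node

            depth.StartGenerating();         // nodes stay idle until the application starts them
            for (int i = 0; i < 30; i++)
            {
                depth.WaitAndUpdateData();   // block until a new frame is exposed to the application
                Console.WriteLine("received depth frame " + i);
                // the frame's pixels would now be read from the node's depth-map metadata
            }
            depth.StopGenerating();
            context.Dispose();
        }
    }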

  48. NUI Programming

  49. NUI Programming Styles • As part of an application: the NUI API is integrated (linked) into the application. • As a service: the NUI API runs as a service in the OS. • As an event-regenerator: map NUI events into GUI/CLI events. (1) Use the NUI API to regenerate events; (2a) send the regenerated events back to the OS, or (2b) send the regenerated events to the application through IPC (inter-process communication). Best for using NUI to control existing non-NUI applications. (Figure: Kinect design model.)
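
  A sketch of the event-regenerator style: a gesture reported by some NUI middleware (the callback below is hypothetical) is mapped onto ordinary keyboard events with the stock System.Windows.Forms SendKeys helper, so an existing non-NUI application, such as a slide viewer, can be driven by gestures.

    // Map recognized gestures to keystrokes sent to the currently focused window.
    using System.Windows.Forms;

    static class GestureToKeyboard
    {
        // Call this from whatever NUI middleware reports gestures (Kinect, Leap Motion, ...).
        public static void OnGesture(string gesture)
        {
            switch (gesture)
            {
                case "swipe-left":  SendKeys.SendWait("{LEFT}");  break;   // e.g., previous slide
                case "swipe-right": SendKeys.SendWait("{RIGHT}"); break;   // e.g., next slide
                case "push":        SendKeys.SendWait("{ENTER}"); break;
            }
        }
    }

    class Program
    {
        static void Main()
        {
            GestureToKeyboard.OnGesture("swipe-right");   // the keystroke goes to the focused window
        }
    }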

  50. NUI API as Part of an Application
