Visualisation and Authoring Lakshmi Prabha Nattamai Sekar CERN, Geneva, Switzerland
Objective To design, develop, implement and test an augmented reality visualisation and authoring tool for maintenance tasks in extreme environments. • Assumptions • Pose information from camera-based visual tracking (EPFL and Roma2) is given as input. • A head-mounted display from WP4 is used as the display device and user interface.
Problem definition • Visualisation • What, when and how to show the AR content • Where to show the AR content • Six pose parameters (3D position and 3D orientation; a minimal sketch follows below)
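To make the six pose parameters concrete, below is a minimal sketch in Python (not project code) of how a pose given as 3D position plus roll/pitch/yaw can be used to place AR content at a known anchor point on the equipment; the function names and numbers are illustrative only.

import numpy as np

def rotation_matrix(roll, pitch, yaw):
    # Z-Y-X (yaw-pitch-roll) rotation built from the three orientation parameters.
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def anchor_in_camera_frame(pose, anchor_world):
    # pose = (x, y, z, roll, pitch, yaw): the six parameters of the camera in world
    # coordinates. anchor_world is the point on the equipment where the AR content
    # should appear. Returns that point in camera coordinates, ready for rendering.
    x, y, z, roll, pitch, yaw = pose
    R = rotation_matrix(roll, pitch, yaw)
    t = np.array([x, y, z])
    return R.T @ (np.asarray(anchor_world, dtype=float) - t)

# Example: camera 2 m back from the part, with a slight yaw (illustrative values).
print(anchor_in_camera_frame((0.0, -2.0, 1.5, 0.0, 0.0, np.radians(5)), [0.1, 0.0, 1.4]))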
Visualisation • To study and design a usable visualisation scheme with improved error-handling techniques. • State-of-the-art studies • KARMA - http://graphics.cs.columbia.edu/projects/karma/karma.html • ARMAR - http://monet.cs.columbia.edu/projects/armar/ • ARTESAS - http://www.artesas.de/site.php?lng=en • ARVIKA - http://www.arvika.de/www/index.htm • JOINPAD - http://www.joinpad.net/prodotti/augmented-xp-2/ • AR on Tablet - http://www.igd.fraunhofer.de
Prototype • Defining what to display is a human-centric problem. • Prototype development helps in collecting user feedback. • To acquire experience in AR application development. • The first prototype was designed on a handheld Android mobile device using Ubitrack and Unity 3D. Ubitrack [2], [3] and Unity [4]
Visualisation • Error reduction and handling • To obtain accurate pose data using a group of sensors along with the camera, and to gain the benefits of fused data. • A visualisation scheme to handle minor registration errors or jitter. [Diagram: camera, IMU and other sensor inputs are combined by sensor fusion to give a result better than any individual source. Sensor fusion [1]]
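As an illustration of the sensor-fusion idea, here is a minimal sketch (not the project's algorithm) of a complementary filter that blends the low-rate, drift-free camera pose with a high-rate but drifting IMU prediction; the weight alpha is a hypothetical tuning parameter.

import numpy as np

def integrate_imu(position, velocity, accel, dt):
    # Dead-reckon the next position from the last fused estimate and IMU readings.
    velocity = velocity + np.asarray(accel, dtype=float) * dt
    position = position + velocity * dt
    return position, velocity

def fuse_position(camera_pos, imu_pos, alpha=0.9):
    # Complementary filter: trust the drift-free camera estimate with weight alpha
    # and the IMU prediction with 1 - alpha (alpha is a hypothetical tuning value).
    return alpha * np.asarray(camera_pos, dtype=float) + (1.0 - alpha) * np.asarray(imu_pos, dtype=float)

# Between camera frames the IMU prediction is used on its own; each new camera pose
# pulls the estimate back and removes the accumulated drift.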
Prototype assessment • Results • Warning messages or voice advice related to a scene procedure. • Options to select either voice or text feedback for the warning messages. • Avoid dense text descriptions.
Problem definition • Visualisation • What, when and how to show the AR content • Where to show the AR content • Six pose parameters (3D position and 3D orientation) • AR visualisation content • Authoring tool • Repeated maintenance actions
Authoring • Given a 3D model database, provide basic tools for labels, text messages, audio messages, arrows and animations for creating AR content (a minimal data sketch follows below). • Why do we need this? • Basic industrial maintenance actions. • Reduced time and no programming skills required to design AR content. • Easy adaptation to changes in the environment or maintenance procedure. Authoring Tool Survey
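A minimal sketch of what one authored maintenance step could look like as data; the field names and the JSON layout are assumptions for illustration, not the tool's actual format.

import json
from dataclasses import dataclass, asdict

@dataclass
class ARStep:
    # One maintenance step authored without programming: a label on a 3D model,
    # optional text/audio messages, an arrow and a simple animation.
    step_id: int
    model: str                    # name of the 3D model in the model database
    label: str                    # short label shown next to the part
    text_message: str = ""        # optional instruction text
    audio_file: str = ""          # optional voice instruction
    arrow_from: tuple = (0.0, 0.0, 0.0)   # arrow start, in model coordinates
    arrow_to: tuple = (0.0, 0.0, 0.0)     # arrow end (the part to act on)
    animation: str = ""           # e.g. "rotate_bolt", "slide_cover"

def save_procedure(steps, path):
    # Serialise an authored procedure so the online process can replay it step by step.
    with open(path, "w") as f:
        json.dump([asdict(s) for s in steps], f, indent=2)

save_procedure([ARStep(1, model="pump_cover", label="Cover bolts",
                       text_message="Loosen the four cover bolts.",
                       arrow_from=(0.0, 0.3, 0.1), arrow_to=(0.0, 0.05, 0.1),
                       animation="rotate_bolt")], "procedure_pump.json")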
Authoring [Diagram: offline process – a designer uses the authoring tool with the 3D model database (via the 3D model server) to create AR content, which is stored in the AR content database; online process – the server receives a request for a step of the procedure and sends the appropriate step visualisation to the AR client.]
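To make the online half of the workflow concrete, here is a minimal sketch of the server-side lookup: given a request for a step of a procedure, it returns the stored AR content for that step. The in-memory dictionary stands in for the AR content database and is purely illustrative.

# Illustrative in-memory stand-in for the AR content database.
ar_content_db = {
    ("pump_maintenance", 1): {"label": "Cover bolts", "animation": "rotate_bolt"},
    ("pump_maintenance", 2): {"label": "Lift cover", "animation": "slide_cover"},
}

def handle_step_request(procedure, step_number):
    # Online process: answer a "request for step procedure" from the AR client by
    # sending back the appropriate step visualisation, or None if it is missing.
    return ar_content_db.get((procedure, step_number))

print(handle_step_request("pump_maintenance", 1))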
Authoring [Diagram: game-development analogy – a game designer uses a game engine and a 3D model database to create game content; a server stores it in a game content database and delivers it to a multiplayer game. Game-making / game engine survey.]
AR Content • Visualisation: information presentation study; registration errors handled in rendering; gamma-imaging-related visualisation; pose calculation from a group of sensors (camera-based visual tracking and IMU - Inertial Measurement Unit). • Authoring: access to the 3D model DB; arrow, text and audio messages; primitive shapes and transformations; basic animation to design maintenance procedures; saving object or animation files in the appropriate format. • Features to be added based on feedback from users.
Summary • The first prototype helped in understanding the basics of AR visualisation and the authoring tool requirements. • Authoring-tool-related studies, feature design and an implementation-related survey. • We had a meeting with WP4 about the gamma-imaging-related visualisation. • A sensor fusion algorithm still needs to be established to fuse camera tracking and IMU data. • Attended and completed the Computer Vision summer school (ICVSS [5]) and a beginners' French course.
Future work • Planned courses • Mathematics • Computer graphics • Augmented reality • Sensor fusion • Wearable technology • Scientific writing and project management • Conferences and workshops • IEEE VR and 3DUI 2014 • ECCV 2014 • ISMAR 2014
References
[1] G. T. McKee, "What can be fused?", Multisensor Fusion for Computer Vision, NATO Advanced Studies Institute Series F, vol. 99, pp. 71-84, 1993.
[2] Fachgebiet Augmented Reality, TU München, Ubitrack library, http://campar.in.tum.de/ubitrack/webhome, accessed October 2013.
[3] J. Newman, M. Wagner, M. Bauer, A. MacWilliams, T. Pintaric, D. Beyer, D. Pustka, F. Strasser, D. Schmalstieg, and G. Klinker, "Ubiquitous tracking for augmented reality", Third IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 192-220, 2004.
[4] Unity Technologies, Unity3D, http://unity3d.com/, accessed October 2013.
[5] International Computer Vision Summer School (ICVSS 2013), http://svg.dmi.unict.it/icvss2013/, accessed October 2013.