
ITN Edusafe – WP2 General Layout



Presentation Transcript


  1. ITN Edusafe – WP2 General Layout. G. Aielli, ITN EDUSAFE Kickoff meeting, CERN, 17/9/2012

  2. WP2 summary
  • Description: development of present and future technologies for VR/AR applications based on very fast pattern recognition, in particular for the Personal Supervision System (PSS) being developed by this project, but suitable for general-purpose machine vision applications.
  • Very ambitious: aims to go beyond present HW and SW limits towards the next generation of performance.
  • Pivots on new hardware concepts (the WRM, weighting resistive matrix, developed by Roma2) and on state-of-the-art pattern recognition algorithmic techniques from EPFL.
  • Front end and back end: DAQ and final user.
  • Dedicated infrastructure to optimize the bi-directional video transmission: INPUT images to be recognized, OUTPUT augmented content on the original frames in real time.
  • The front-end HW infrastructure will be developed by NOCA, ensuring sufficient bandwidth and computing power to support the data transmission with the operator in the field. CERN will use the data of the recognized images to add the augmented content through the dedicated infrastructure.
  • High-risk, high-reward scheme, mitigated by a two-stage approach:
  • Short-term phase: software-driven methods will be optimised to provide sufficient AR support to the Personal Supervision System (PSS). The possibilities for hardware acceleration by means of a simplified WRM system will also be studied as a parallel development.
  • Advanced phase: image recognition studies from EPFL and mathematical studies of the WRM properties will then be used to drive the design of an advanced WRM prototype demonstrator.

  3. WP2 workflow
  • DAQ and transmission (NOCA)
  • HW edge detection (optionally software)
  • WRM 32x32 (minimum), SMD components
  • X, Y, X', Y' image edge coordinates sent to the server; the scale factor is also calculated
  • The augmented content is generated and superimposed at the right coordinates in the operator's field of vision (a sketch of this chain follows below)
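
Since the slide gives the stages only as a diagram, here is a minimal, illustrative Python sketch of one frame passing through the chain. Every function name and the edge-detection and matching logic are placeholders standing in for the real DAQ, HW edge detector and WRM stages, which are specified elsewhere in the project.

```python
import numpy as np

def detect_edges(frame):
    """Placeholder for the HW (or software fallback) edge-detection stage."""
    gy, gx = np.gradient(frame.astype(float))        # simple gradient magnitude
    return (np.hypot(gx, gy) > 30.0).astype(np.uint8)

def wrm_match(edges):
    """Stand-in for the 32x32 WRM stage: returns matched edge coordinates
    (x, y, x', y') and a scale factor for the recognized pattern."""
    ys, xs = np.nonzero(edges)
    if xs.size == 0:
        return None
    x, y, x2, y2 = xs.min(), ys.min(), xs.max(), ys.max()  # crude bounding box
    scale = (x2 - x) / 32.0                                 # relative to the 32x32 matrix
    return (x, y, x2, y2), scale

def augment(frame, coords, scale):
    """Superimpose placeholder augmented content at the matched coordinates."""
    out = frame.copy()
    x, y, x2, y2 = coords
    out[y, x:x2] = 255            # marker line where content would be rendered
    return out

# One frame through the chain: DAQ -> edge detection -> WRM -> augmentation.
frame = (np.random.rand(240, 320) * 255).astype(np.uint8)   # stand-in for a DAQ frame
edges = detect_edges(frame)
match = wrm_match(edges)
if match is not None:
    coords, scale = match
    augmented = augment(frame, coords, scale)
```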

  4. Implementation summary
  • Roma2 is responsible for WRM chip development (WRM chip and algorithm design, fast prototype development).
  • EPFL will develop the 3D tracking system and adapt it to the WRM technology.
  • NOCA will research the video and sensing data treatment and adaptation to computer vision requirements, as well as wireless transmission.
  • CERN is responsible for providing the environmental and usability requirements and for developing the authoring and visualizer software prototype optimized for AR technologies and methods.
  • Milestones:

  5. WP2 project title: WRM chip development (ER1, ESR1)
  • Partner recruiting: Roma2. Partners involved: NOCA, CERN, NTUA, EPFL, TUM
  • Relevance:
  • A new pattern recognition concept to overcome present state-of-the-art performance.
  • Will open brand new possibilities in areas such as personal electronic devices, robotics and artificial intelligence.
  • Derived directly from experimental research aimed at the LHC.
  • The idea is to exploit this extreme speed, applied to simple patterns, to build a sophisticated pattern recognition system working at rates not merely fast on a human scale but many orders of magnitude faster.
  • The core of the system is the WRM chip, a special resistive network able to perform, analogically, a very good approximation of the least-squares fit between the data and a given hypothetical pattern (see the software model sketched below).
  • FELLOW 1 (=ER1): development of the integrated WRM prototype: the WRM chip, the readout system based on a maximum selector, and the surrounding electronics and firmware to support the WRM functioning. The connected scientific goal is to measure the performance gain of such a device in terms of detection speed and efficiency, as a function of noise and environmental conditions, against standard benchmarks.
  • FELLOW 2 (=ESR1): the goal is to study the theoretical and mathematical aspects connected to the WRM, ranging from the best way of interconnecting the matrix nodes to the application of the generalized Hough transform in the WRM system. Part of the work will consist in adapting the algorithms developed in WP2 by Fellows 3 and 4 (EPFL) to the WRM; in particular, how to exploit the feature-point-based and image-patch-based techniques within the WRM framework.
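
The matching principle can be illustrated with a small software model, which assumes nothing about the analog circuit itself: each stored pattern is scored by its least-squares agreement with the input, and a maximum selector picks the best hypothesis. The function names and the 32x32 toy patterns are purely illustrative.

```python
import numpy as np

def wrm_score(data, pattern):
    """Least-squares agreement between the input data and one candidate pattern
    (the analog WRM approximates this with a resistive network)."""
    residual = data - pattern
    return -float(np.sum(residual ** 2))   # higher score = better fit

def maximum_selector(data, patterns):
    """Digital stand-in for the maximum-selector readout: index of the
    candidate pattern with the best least-squares score."""
    scores = [wrm_score(data, p) for p in patterns]
    return int(np.argmax(scores))

# Toy example on a 32x32 image and three hypothetical line patterns.
rng = np.random.default_rng(0)
patterns = [np.zeros((32, 32)) for _ in range(3)]
patterns[0][16, :] = 1              # horizontal line
patterns[1][:, 16] = 1              # vertical line
np.fill_diagonal(patterns[2], 1)    # diagonal line

data = patterns[1] + rng.normal(0, 0.1, (32, 32))   # noisy vertical line
best = maximum_selector(data, patterns)             # expected: 1
```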

  6. Expected results
  • First concept prototype of the WRM, tested on beam at RD5 (1993)

  7. 3D tracking system development (ESR2, ER2)
  • Partner recruiting: EPFL. Partners involved: Roma2, NOCA, CERN, NTUA, TUM
  • Relevance:
  • Development of a camera 3D tracking system relying on general image patches, generalizing the concept of feature points.
  • The 3D tracking system will determine the 3D camera pose using the data sent by the platform developed by NOCA.
  • The operational instructions and animations created by CERN will exploit this information to integrate them on the real image.
  • The challenge is to make this system work in difficult industrial environments.
  • FELLOW 3 (=ESR2): develop a fast image patch recognition method, which will be the core of the 3D tracking system, robust to lighting, perspective changes and image noise. Algorithmic solutions will be developed for fast, robust and scalable image patch search, combining previous work on fast image patch recognition and scalable Approximate Nearest Neighbor search methods. They will first be developed and tested in software, and then further developed and merged with the WRM system (Fellows 1, 2) for additional speed-up.
  • FELLOW 4 (=ER2): Fellow 4 will develop the 3D tracking system itself, which will estimate the camera 3D pose at run time, exploiting the 3D geometry and images acquired offline together with measurements from an accelerometer and a gyroscope provided by the video/sensing platform developed by Fellow 5 (NOCA). The Fellow will integrate the image patch recognition method (Roma2) so that the camera 3D pose can be computed from the image patch 2D locations and orientations (a pose-estimation sketch follows below). The system will provide the camera pose to the visualizer developed by Fellow 6 (CERN) for proper integration of the virtual contents into the image.
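
As a rough illustration of the pose-estimation step only (the slide does not prescribe an algorithm), the sketch below recovers a camera pose from hypothetical 2D-3D patch correspondences using OpenCV's solvePnP; in the real system the result would additionally be fused with the accelerometer and gyroscope data from the NOCA platform. The correspondences and intrinsics are made up.

```python
import numpy as np
import cv2  # OpenCV, used here only as a stand-in for the pose-estimation step

# 3D locations of recognized image patches in the (offline-acquired) scene model.
object_points = np.array([
    [0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0],
    [1.0, 1.0, 0.0], [0.5, 0.5, 0.5], [0.2, 0.8, 0.3],
], dtype=np.float64)

# 2D locations where the patch recognition method found them in the current frame.
image_points = np.array([
    [320.0, 240.0], [420.0, 238.0], [322.0, 150.0],
    [418.0, 148.0], [372.0, 190.0], [335.0, 170.0],
], dtype=np.float64)

# Hypothetical pinhole camera intrinsics (focal length in pixels, principal point).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)   # assume an undistorted image

# Estimate the camera 3D pose (rotation + translation) from the correspondences.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist_coeffs)
if ok:
    R, _ = cv2.Rodrigues(rvec)   # rotation matrix handed to the visualizer
```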

  8. Expected results
  • ESR2 expected results:
  • Development of a fast image patch recognition method, robust to lighting and perspective changes. Software implementation of the method as a library to be integrated into the 3D tracking system (related to D2.3; a descriptor-matching sketch follows below).
  • Adaptation, together with Roma2, of the patch recognition method as a low-level algorithm suitable for implementation in the WRM acceleration platform (related to D2.4).
  • ER2 expected results:
  • Development of a 3D tracking system exploiting the image patch recognition library implemented by ESR2 and inertial information to estimate the camera 3D pose and provide it to the visualization system developed by CERN (related to D2.3).
  • Evolution of the 3D tracking system to exploit the WRM-accelerated image patch recognition (related to D2.4).
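
For the patch recognition library, here is a minimal illustration of the descriptor-matching idea, using a brute-force nearest-neighbour search purely for clarity where the project targets scalable ANN methods; the descriptors, sizes and ratio-test threshold are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Descriptors of patches acquired offline (synthetic placeholders).
database = rng.normal(size=(10_000, 64))
database /= np.linalg.norm(database, axis=1, keepdims=True)

def match_patch(query, db, ratio=0.8):
    """Return the index of the best-matching patch descriptor, or None if the
    match is ambiguous (Lowe-style ratio test on the two closest distances)."""
    dist = np.linalg.norm(db - query, axis=1)
    best, second = np.argpartition(dist, 1)[:2]   # two smallest distances
    if dist[best] < ratio * dist[second]:
        return int(best)
    return None

# A noisy view of patch 42 should still be matched back to index 42.
query = database[42] + rng.normal(0.0, 0.05, 64)
idx = match_patch(query, database)
```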

  9. Mobile sensing and data transfer platform (ESR3)
  • Partner recruiting: NOCA. Partners involved: CERN, EPFL, Roma2, TUM
  • Associate partner providing PhD: AUTH
  • Relevance:
  • A special portable device permitting monitoring of personnel and their surroundings in hostile environments.
  • Adaptable to different environments with varying sensing and transmitting requirements; scalable, robust and cost-effective.
  • Real-time A/V communication between operator and supervision.
  • The main challenge is to develop a robust, high-performance transmission device allowing real-time, interactive A/V.
  • FELLOW 5 (=ESR3): Fellow 5 will design a system architecture modular and flexible enough to accommodate different sensing, computational, transmission and power-consumption requirements. A protocol will then be developed to provide an acknowledgement mechanism on top of UDP so as to enable re-transmission of lost packets (a minimal sketch follows below). It must be able to resolve problems with multiple radio waves arriving at the same antenna from different paths (multipath). Finally, to support the development of the aforementioned architecture and of the corresponding wireless data transmission mechanism, a systematic methodology is needed to determine the best configuration of component parts to include in the system architecture.
  • Expected results ESR3:
  • Design of a mobile sensing and A/V transmission platform, optimized for long and reliable operation.
  • Design and test of a high-bandwidth, robust, integrated wireless transmission system suitable for real-time data transmission.
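
A minimal sketch of the acknowledgement mechanism on top of UDP described above, written stop-and-wait style for clarity (a real A/V link would more likely use a sliding window); the addresses, timeout, retry count and packet format are hypothetical.

```python
import socket
import struct

def send_reliable(sock, addr, payload, seq, timeout=0.05, max_retries=5):
    """Send one datagram carrying a sequence number and wait for an ACK
    echoing the same number; retransmit on timeout."""
    packet = struct.pack("!I", seq) + payload
    sock.settimeout(timeout)
    for _ in range(max_retries):
        sock.sendto(packet, addr)
        try:
            ack, _ = sock.recvfrom(4)                  # 4-byte ACK: the sequence number
            if struct.unpack("!I", ack)[0] == seq:
                return True                            # acknowledged, move on
        except socket.timeout:
            continue                                   # lost packet or lost ACK: retransmit
    return False                                       # give up after max_retries

# Usage (addresses are placeholders):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_reliable(sock, ("192.0.2.10", 5000), b"video-chunk-0", seq=0)
```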

  10. Authoring & Visualizer Software (ESR4)
  • Partner recruiting: CERN. Partners involved: EPFL, NTUA, TUM, Roma2
  • Partner providing PhD: NTUA
  • Relevance:
  • Design and prototype of the authoring software generating the VR and AR data contents, and of the visualizer software merging this data onto the real image.
  • It will exploit the parametric data from the 3D tracking system.
  • Challenge: special care shall be taken not to introduce into the AR system delays above 10 ms, which would decrease the overall system efficiency.
  • This project is of crucial importance, as the efficiency and usability of the integrated system depend strongly on the image rendering quality and on the efficiency of the data content displayed on the visualizer device.
  • Develop user interfaces scalable to various types of display devices.
  • FELLOW 6 (=ESR4): the Fellow will study and support the complete development of the authoring and visualizer software:
  • design the SW system architecture compatible with the various display infrastructures;
  • analyse the state of the art of AR rendering technology and methods, and determine and further develop the most appropriate ones, exploiting the pose data from the 3D tracking system (an overlay sketch follows below);
  • study how to transfer the large quantity of video data to the display devices without degrading performance;
  • integrate the visualizer software in the global CS (Fellow 7, NTUA) together with the AR software developed by Roma2 and EPFL to optimize the AR system.
  • Results:
  • Authoring and visualizer SW system architecture development for the selected display infrastructures, incl. documentation (related to D2.2).
  • Development and system integration of the authoring and visualizer SW including AR features, incl. final documentation and validation test results.
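
To make the latency constraint concrete, a hedged sketch of the overlay step: virtual content anchors are projected into the frame using the pose from the 3D tracking system, and a simple timer checks the 10 ms budget. The intrinsics, anchors and the point-marker "rendering" are placeholders, not the project's rendering pipeline.

```python
import time
import numpy as np

LATENCY_BUDGET_S = 0.010   # the 10 ms limit quoted for added AR delay

def project_point(point_3d, R, t, K):
    """Project one 3D virtual-content anchor into image coordinates using
    the camera pose (R, t) supplied by the 3D tracking system."""
    p_cam = R @ point_3d + t
    p_img = K @ p_cam
    return p_img[:2] / p_img[2]

def render_frame(frame, anchors_3d, R, t, K):
    """Overlay placeholder AR markers and verify the latency budget."""
    start = time.perf_counter()
    out = frame.copy()
    for anchor in anchors_3d:
        u, v = project_point(anchor, R, t, K)
        if 0 <= int(v) < out.shape[0] and 0 <= int(u) < out.shape[1]:
            out[int(v), int(u)] = 255          # stand-in for the rendered content
    elapsed = time.perf_counter() - start
    if elapsed > LATENCY_BUDGET_S:
        print(f"warning: AR overlay took {elapsed * 1e3:.1f} ms (> 10 ms budget)")
    return out

# Example with identity pose and hypothetical intrinsics.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
frame = np.zeros((480, 640), dtype=np.uint8)
anchors = [np.array([0.1, 0.05, 2.0])]
out = render_frame(frame, anchors, np.eye(3), np.zeros(3), K)
```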

  11. Training

  12. Secondments of WP2

  13. Secondments and interaction scheme
  • Set-up of a WP2 technical board (TB) meeting online regularly
  • One or two people per partner
  • Encouraging mixed working groups, which will report their work progress to the TB
  • In total the 6 fellows will spend 27 months seconded to other partners
  • Most secondments are internal to WP2
  • These are crucial activities, since the activities of WP2 are strongly interconnected
  • The TB will:
  • check in advance for potential deadlocks due to WP2 inter-dependencies
  • update the project management on the general status and issues
  • periodically offer the opportunity for a status presentation and review of the work
  • Careful planning is needed so that the secondments bring benefit rather than inefficiency
  • Resources for transfers must be adequate
  [Interaction scheme diagram: ESR1, ESR2, ESR3, ESR4, ER1 and ER2 linked across Roma2, EPFL, CERN, NOCA and other partners]
