Augmented assembly technologies based on 3D bare-hand interaction S.K. Ong (2)*, Z.B. Wang Mechanical Engineering Department, Faculty of Engineering, National University of Singapore CIRP Annals - Manufacturing Technology 60 (2011) 1–4 2013/06/13
contents • Abstract • Introduction • 3D bare-hand interaction method • Assembly data management • Augmented assembly process • Assembly sequence evaluation and feedback • Implementation and case study • Conclusion and future work
Abstract • Augmented reality (AR) has been applied to develop augmented assembly systems. • However, most reported studies used pre-defined assembly information. • AR is predominantly used to display information, and interactions between users and the augmented environment are limited. 1/2
Abstract • This paper presents 3D bare-hand interaction in an augmented assembly environment to manipulate and assemble virtual components. • A hybrid method based on constraint analysis is presented, which interprets users’ manual assembly intents robustly without the need for auxiliary CAD information. 2/2
Introduction • In recent years, virtual reality and virtual prototyping techniques have been widely used to simulate and evaluate assembly in the early design stage. • The assembly planning experience is limited to a pure virtual environment due to a lack of real spatial feeling and suitable sensory feedback. 1/3
Introduction • Augmented assembly (AA) is an application of augmented reality in assembly where an augmented environment is created, in which virtual objects are combined with the real environment to enhance the assembly design and planning process. • This paper proposes an AA system that interprets users’ manual assembly intents, supports on-line constraint recognition, and provides a robust 3D bare-hand interface to allow realistic visual feedback during assembly. 2/3
Introduction • A bare-hand interaction augmented assembly (BHAA) system has been developed. 3/3
3D bare-hand interaction method • To achieve natural and intuitive human computer interaction (HCI), human hands can be used as interaction devices in AEs. • Computer vision (CV) based human hand detection and tracking techniques can identify bare-hand gestures from video streams and use them as commands for the systems. 1/3
3D bare-hand interaction method • In the 3DNBHI method, the users’ bare hands are tracked to extract the hand contours, determine the palm centers and detect the fingertips. • The hand centers are tracked using a matching algorithm that minimizes the displacement of the pair of hand centers over two successive frames, so that the two hands can always be differentiated from the live video stream. 2/3
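A minimal sketch of this frame-to-frame matching for two hands, assuming the tracker yields one palm-center point per detected hand per frame; the function names and the displacement cost below are illustrative, not the authors' implementation:

```python
import math

def match_hand_centers(prev_centers, curr_centers):
    """Keep the left/right labels of two palm centers across successive frames
    by choosing the pairing with the smaller total displacement
    (illustrative sketch, not the paper's code)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    prev_left, prev_right = prev_centers
    c0, c1 = curr_centers
    # Two possible assignments for two hands; keep the cheaper one.
    keep = dist(prev_left, c0) + dist(prev_right, c1)
    swap = dist(prev_left, c1) + dist(prev_right, c0)
    return (c0, c1) if keep <= swap else (c1, c0)

# Example: detections arrive in arbitrary order, but the labels are preserved.
prev = ((100.0, 200.0), (400.0, 210.0))   # (left, right) in the previous frame
curr = ((405.0, 212.0), (103.0, 198.0))   # current-frame detections, unordered
print(match_hand_centers(prev, curr))     # -> ((103.0, 198.0), (405.0, 212.0))
```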
3D bare-hand interaction method • To achieve interactions between the bare hands and virtual objects, a small virtual sphere is rendered on each fingertip. 3/3
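As an illustration of how these fingertip spheres could drive hand-object interaction, the sketch below tests each fingertip sphere against a part approximated by a bounding sphere; the radii and helper names are assumptions, not the paper's collision scheme:

```python
import math

FINGERTIP_RADIUS = 5.0  # mm, assumed radius of the virtual sphere on each fingertip

def spheres_overlap(center_a, radius_a, center_b, radius_b):
    """True when two spheres intersect, i.e. the fingertip touches the part."""
    return math.dist(center_a, center_b) <= radius_a + radius_b

def touching_fingertips(fingertips, part_center, part_radius):
    """Indices of the fingertip spheres currently in contact with the part."""
    return [i for i, tip in enumerate(fingertips)
            if spheres_overlap(tip, FINGERTIP_RADIUS, part_center, part_radius)]

# Example: three of five fingertips touch a part bounded by a 30 mm sphere.
tips = [(0, 0, 0), (10, 5, 0), (120, 40, 10), (28, 0, 4), (200, 90, 30)]
print(touching_fingertips(tips, (20, 5, 0), 30.0))  # -> [0, 1, 3]
```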
Assembly data management • A tri-layer assembly data structure (TADS) is used for assembly data management in BHAA. • The first layer consists of geometric information. • The second layer is the assembly sequence. • The third layer is the assembly structure, which stores the part-pair, surface-pair and constraint information. 1/1
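A minimal sketch of how such a tri-layer structure could be organized; the class and field names are assumptions made for illustration, not the paper's actual schema:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class SurfacePairConstraint:
    """Third-layer entry: a mating part pair, its surface pair and constraint type."""
    part_pair: Tuple[str, str]       # e.g. ("shaft", "housing")
    surface_pair: Tuple[str, str]    # e.g. ("cylinder_face_3", "bore_face_1")
    constraint: str                  # e.g. "coaxial", "coplanar"

@dataclass
class AssemblyModel:
    """Tri-layer assembly data structure (TADS) sketch."""
    geometry: Dict[str, str] = field(default_factory=dict)   # layer 1: geometric information
    sequence: List[str] = field(default_factory=list)        # layer 2: assembly sequence
    structure: List[SurfacePairConstraint] = field(default_factory=list)  # layer 3: assembly structure

# Example: one shaft-into-housing step with a coaxial constraint.
model = AssemblyModel(
    geometry={"shaft": "shaft.stl", "housing": "housing.stl"},
    sequence=["housing", "shaft"],
    structure=[SurfacePairConstraint(("shaft", "housing"),
                                     ("cylinder_face_3", "bore_face_1"),
                                     "coaxial")])
print(model.sequence)
```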
Augmented assembly process • With the 3DNBHI interface, users can manipulate and assemble two different parts more intuitively and realistically. • When these two parts are sufficiently close to each other, the user can adjust the positions and orientations of these parts easily and efficiently to trigger the assembly intent interpretation and constraint recognition functions. 1/6
Augmented assembly process: Assembly feature recognition • The surface contact query method is carried out as follows: • Step#1: Check the types of the surface pairs in contact. • Step#2: Check the parameters of the surface pairs in contact. 3/6
Augmented assembly process: Assembly feature recognition • When the difference Ti in each parameter for a surface pair is within a threshold range, this surface pair remains in the list of surface contacts; otherwise, this surface pair will be removed from the list of surface contacts. 4/6
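A hedged sketch of this two-step query: a candidate surface pair is kept only if the surface types can mate and every parameter difference Ti stays within its threshold. The surface types, parameters and threshold values below are illustrative assumptions:

```python
# Assumed compatible surface types and parameter tolerances (not from the paper).
COMPATIBLE_TYPES = {("cylinder", "cylinder"), ("plane", "plane")}
THRESHOLDS = {"radius": 0.5, "axis_angle_deg": 2.0}

def passes_query(surface_a, surface_b):
    # Step 1: check the types of the surface pair in contact.
    if (surface_a["type"], surface_b["type"]) not in COMPATIBLE_TYPES:
        return False
    # Step 2: check the parameters of the surface pair in contact.
    for name, threshold in THRESHOLDS.items():
        if name in surface_a and name in surface_b:
            if abs(surface_a[name] - surface_b[name]) > threshold:
                return False
    return True

def filter_contacts(candidate_pairs):
    """Remove surface pairs whose type or parameter check fails."""
    return [pair for pair in candidate_pairs if passes_query(*pair)]

# Example: the first pair survives (radii close enough), the second is removed.
pairs = [
    ({"type": "cylinder", "radius": 10.0}, {"type": "cylinder", "radius": 10.2}),
    ({"type": "cylinder", "radius": 10.0}, {"type": "plane"}),
]
print(len(filter_contacts(pairs)))  # -> 1
```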
Augmented assembly process: Constraint confirmation and refinement • For each constraint that has been recognized, the system can adjust the position and orientation of the components automatically to ensure that the constraint is met precisely. 5/6
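As a simplified illustration of such automatic refinement, the sketch below snaps a part's position exactly onto the mating axis of a recognized coaxial constraint; the formulation is an assumption (translation only, with orientation alignment omitted):

```python
import numpy as np

def snap_coaxial(part_position, axis_point, axis_direction):
    """Project a part's position onto a mating axis so that a recognized
    coaxial constraint is met precisely (translation-only sketch)."""
    d = np.asarray(axis_direction, dtype=float)
    d /= np.linalg.norm(d)
    p = np.asarray(part_position, dtype=float)
    a = np.asarray(axis_point, dtype=float)
    # Closest point on the axis to the part's current position.
    return a + np.dot(p - a, d) * d

# Example: a roughly placed shaft is pulled onto the bore axis (the z-axis).
print(snap_coaxial([1.2, -0.8, 50.0], [0, 0, 0], [0, 0, 1]))  # -> [ 0.  0. 50.]
```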
Augmented assembly process: Assembly tool operation • In BHAA, the user can select an assembly tool from the TADS to carry out an assembly operation. • The assembly tool operation process is carried out as follows: Step#1: Identification Step#2: Operation Step#3: Withdrawal 6/6
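The three steps above can be read as a simple state sequence; the sketch below is a hypothetical illustration, with state and tool names chosen for the example rather than taken from the paper:

```python
from enum import Enum, auto

class ToolState(Enum):
    IDENTIFICATION = auto()  # Step 1: select and identify the tool from the TADS
    OPERATION = auto()       # Step 2: apply the tool to the component
    WITHDRAWAL = auto()      # Step 3: retract the tool after the operation

def run_tool_operation(tool_name, log=print):
    """Walk an assembly tool through the three-step operation sequence (sketch)."""
    for state in (ToolState.IDENTIFICATION, ToolState.OPERATION, ToolState.WITHDRAWAL):
        log(f"{tool_name}: {state.name.lower()}")

run_tool_operation("screwdriver")
```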
Assembly sequence evaluation and feedback • To improve assembly efficiency and reduce assembly cost, changes in assembly directions and tools should be minimized. • During an assembly simulation using BHAA, the user can evaluate an assembly sequence to obtain a near-optimum plan considering the ease of assembly, tool and orientation changes. 1/1
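One way to express this evaluation criterion is a cost that counts tool changes and assembly-direction changes along a candidate sequence; the step representation and unit weights below are illustrative assumptions, not the paper's scoring function:

```python
def sequence_cost(steps, w_tool=1.0, w_direction=1.0):
    """Score an assembly sequence by its number of tool changes and
    assembly-direction changes; lower is better (illustrative weights)."""
    tool_changes = sum(1 for a, b in zip(steps, steps[1:]) if a["tool"] != b["tool"])
    dir_changes = sum(1 for a, b in zip(steps, steps[1:]) if a["direction"] != b["direction"])
    return w_tool * tool_changes + w_direction * dir_changes

# Example: the second candidate needs fewer tool and direction changes.
seq_a = [{"tool": "hand", "direction": "-z"},
         {"tool": "screwdriver", "direction": "-z"},
         {"tool": "hand", "direction": "+x"}]
seq_b = [{"tool": "hand", "direction": "-z"},
         {"tool": "hand", "direction": "-z"},
         {"tool": "screwdriver", "direction": "-z"}]
print(sequence_cost(seq_a), sequence_cost(seq_b))  # -> 3.0 1.0
```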
Implementation and case study • The BHAA system works well and consistently at about 15 frames per second for a 512 × 384 frame resolution. • The fingertip detection method has an RMS error of 1–2 mm in all axes. 1/2
Conclusion and future work • A 3D dual-handed interaction interface is provided to facilitate AA. • The limitations are the lack of force feedback, the limited realism of using only the fingertips for virtual object manipulation, and the fact that only three typical assembly constraints are considered. 1/1