Virtual Factory. Factory Automation Lab., SNU. Feb. 4, 1998. Koh, Do-sung
Overall contents • Factory Models Using Virtual Reality - The Industrial Virtual Reality Institute (IVRI) • Decision Models for Virtual Factory Layout and Material Flow • Data Input Model for Virtual Reality-Aided Facility Layout
The Industrial Virtual Reality Institute (IVRI) • Homepage: http://alpha.me.uic.edu/ • Participants: The University of Illinois at Chicago, Northwestern University, and Argonne National Laboratory • Sponsors: National Institute of Standards and Technology, National Science Foundation, Office of Naval Research, Motorola / University of Illinois - Manufacturing Research Center • Other projects: Remote Engineering, Product Simulation
Factory Models Using Virtual Reality • Objective: to develop a prototype factory floor • Current status: a gear factory model • Intent: to make it compatible with other ongoing projects • Goal: to allow the manufacturing engineer to manipulate 3-D objects in a 3-D environment
Decision Models for Virtual Factory Layout and Material Flow • Purpose: to assist a designer in evaluating the impact of local decisions on a factory floor on the global layout and material flow problem • Collaboration: to make these decision models callable from the virtual factory floor environment and to present the results from the models to the designer during the design process • Four layout decision algorithms • A genetic algorithm based block layout and material flow design algorithm • A minimization procedure for maximum material flow congestion • Collision Detection for Detailed Layout Design • A Fast Data Input Model for Facility Layout Design • Current status:
Virtual factory of IVRI
Other Issues • Virtual Reality Devices for a Computer-Aided-Design Environment • Development of a Virtual Configurable Flexible Manufacturing System • A Virtual Environment for Training Overhead Crane Operators • VEDAM: Virtual Environments for Design and Manufacturing
Data input model for virtual reality-aided facility layout. D. Zetu (1), P. Banerjee (1) and P. Schneider (2). (1) Department of Mechanical Engineering, (2) Department of Electrical Engineering and Computer Science, The University of Illinois at Chicago. IIE Transactions, Vol. 30, No. 7, 1998, pp. 597-620.
About the authors • Dan Zetu - working toward a Ph.D. in Industrial Engineering at The University of Illinois at Chicago • Prashant Banerjee - Associate Professor in Mechanical Engineering at The University of Illinois at Chicago; M.S. & Ph.D. from Purdue University and a B.Tech. from the Indian Institute of Technology, Kanpur • Paul Schneider - received an M.S. in Computer Science from the University of Illinois at Chicago in 1997; B.S. & M.S. in Mathematics and B.S. in Computer Science from the University of Stuttgart, Germany
Contents • Computer vision techniques for 3D facility layout design: advantages and obstacles • The overall MIRRORS (Methodology for Inputting Raw Recordings into 3D Object Renderings for Stereo) architecture and a camera auto-calibration method • The camera image understanding methodology, including points-of-interest (corner) extraction, stereo matching and depth recovery • The topology construction methodology using Delaunay triangulation and ray tracing techniques
2D Layout vs. 3D Layout • A block layout & material flow design model usually addresses decisions on cell placement & material flow paths between cells • 2D layout is limited to cell placement and material flow paths between cells • A 3D layout model can address additional problems, such as clearances around equipment, workplace aesthetics, and operator and supervisor convenience • Sources of information for a 3D layout model • Output of a block layout & material flow model • CAD model or blueprint of the facility • Camera shots of the actual facility • Other facility records
Limitations of Current Techniques • Lack of an integrated system for 3D object construction from 2D stereo images • Calibration problem: a calibration pattern must be present within the camera's field of view, which is tedious in most instances • Stereo matching algorithms are computationally expensive, error-prone and too general-purpose • Lack of a proven algorithm for shape recovery from a set of 3D scattered points that handles a wide range of geometrical features well
Contributions • An approach for detailed facility representation, with a focus on efficiently, economically and accurately extracting the 3D geometries and topologies of commonly encountered physical objects • Useful in situations where such 3D models are not readily available in a CAD database • An integrated system to build 3D object models from 2D images
A review and comparison of object extraction techniques • Range images • based on laser scanning • provide dense data • handle free-form surfaces well • do not handle large-sized objects • expensive • difficult to merge information from different range images • lack texture information • Stereo techniques • based on camera shots • handle large-sized objects well • cost-efficient • information from different images can be easily merged • do not provide dense data • do not handle free-form surfaces well
Stereo-Based Object Extraction Methodology
The Pinhole camera model • World coordinate system (WCS) • Image coordinate system (ICS) • Camera coordinate system (CCS) • P(xw, yw, zw) is a point in the WCS belonging to a visualized object and I(x, y) is its image on the image plane • Focal length (f): the distance along the z axis between the image plane and the origin of the CCS • I = M * P, where M is the perspective projection matrix (from 3D to 2D)
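A minimal sketch of the projection I = M * P in homogeneous coordinates; the focal length and the test point below are illustrative values, not taken from the slides.

```python
import numpy as np

# Pinhole projection I = M * P in homogeneous coordinates.
f = 0.05  # focal length in metres (assumed value for illustration)

# Perspective matrix mapping a point in the camera coordinate system (CCS)
# onto the image plane located at z = f.
M = np.array([[f, 0, 0, 0],
              [0, f, 0, 0],
              [0, 0, 1, 0]], dtype=float)

P = np.array([0.2, 0.1, 1.5, 1.0])   # object point (xc, yc, zc, 1) in the CCS
I_h = M @ P                          # homogeneous image point
x, y = I_h[0] / I_h[2], I_h[1] / I_h[2]
print(f"image coordinates: ({x:.4f}, {y:.4f})")
```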
Camera auto-calibration methodology • The electromagnetic tracker consists of a transmitter (antenna) and a receiver • Principle of operation
Geometry of the camera-tracker unit • TCS = Transmitter coordinate system • RCS1,2 = Receiver coordinate system in two consecutive positions • CCS1,2 = CCS in two consecutive positions • WT = transformation from WCS to TCS • TR1,2 = from TCS to RCS1,2 • WC1,2 = from WCS to CCS1,2 • CR = from CCS to RCS • R1R2 = from RCS1 to RCS2 • Chains: WCS -> TCS -> RCS1 and WCS -> TCS -> RCS2, so R1R2 * TR1 * WT = TR2 * WT
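A small sketch of composing the homogeneous transforms in this chain; the rotations and translations below are invented for illustration, and only the chain relation itself comes from the slide.

```python
import numpy as np

def transform(angle_deg, t):
    """Build a 4x4 homogeneous transform from a z-rotation (degrees) and a
    translation; purely illustrative, real values come from the tracker."""
    a = np.radians(angle_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0],
                 [np.sin(a),  np.cos(a), 0],
                 [0,          0,         1]]
    T[:3, 3] = t
    return T

WT  = transform(10, [0.5, 0.0, 0.2])   # WCS -> TCS (assumed)
TR1 = transform(25, [0.1, 0.3, 0.0])   # TCS -> RCS1 (assumed)
TR2 = transform(40, [0.2, 0.1, 0.1])   # TCS -> RCS2 (assumed)

# The chain R1R2 * TR1 * WT = TR2 * WT implies R1R2 = TR2 * TR1^-1
# (the WT factor cancels), so the relative receiver motion is:
R1R2 = TR2 @ np.linalg.inv(TR1)
print(np.allclose(R1R2 @ TR1 @ WT, TR2 @ WT))   # True
```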
Existing Stereo Matching Techniques • Area-based • exploit the fact that corresponding pixels in the two images have the same intensity • additional constraints are imposed to find matches, and the search for corresponding pixels is made within an area surrounding the pixel in question (a toy area-based matcher is sketched below) • Feature-based • first extract some significant features from the images and then attempt to find matches between the extracted features • more flexible with respect to surface discontinuities and less computationally expensive because the search space is reduced
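A toy area-based matcher, sketched under the assumption of rectified images (corresponding pixels lie on the same scanline); the window size, disparity range and function name are illustrative, not from the slides.

```python
import numpy as np

def sad_disparity(left, right, row, col, win=3, max_disp=32):
    """For a pixel in the left image, search along the same scanline of the
    right image and return the disparity with the smallest sum of absolute
    differences over a (2*win+1) x (2*win+1) window. The pixel must lie at
    least `win` pixels away from the image borders."""
    patch_l = left[row - win:row + win + 1, col - win:col + win + 1].astype(float)
    best_d, best_cost = 0, np.inf
    for d in range(max_disp):
        c = col - d
        if c - win < 0:
            break
        patch_r = right[row - win:row + win + 1, c - win:c + win + 1].astype(float)
        cost = np.abs(patch_l - patch_r).sum()
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```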
Image understanding methodology and experimental setup • Extracting the points-of-interest (corners) of the object from the 2D images • Matching the extracted points from each pair of stereo images • Computing the coordinates (depth) of the extracted points relative to the WCS
Stereo imaging system • An object point P is projected onto the image plane (assimilated with the camera lens) by a perspective projection with the focal center of the camera as the center of projection • To retrieve the 3D coordinates of the extracted object points, we need to relate their image coordinates to the corresponding world coordinates • This relationship is obtained through successive coordinate transformations between the WCS, the CCS and the ICS
Establishing stereo correspondence between the extracted corners (1/2) • The process of stereo matching in MIRRORS • The process is as follows: for each object corner in the left image, find the corresponding corner in the right image • Coplanarity constraint - the object point P is projected onto the left image plane at Il and onto the right image plane at Ir • The intersections of the line through the focal centers Vl and Vr with the two image planes are El and Er, respectively
Establishing stereo correspondence between the extracted corners (2/2) • Tangent orientation (slope) constraint: the tangent orientations of matched corners must agree within 30 degrees (see the sketch below) • Curvature ratio constraints • The stereo correspondence procedure • match a closed contour from the left image with a closed contour from the right image • select the contour in the left image which is closest to the image center and find a corresponding contour in the right image • contours are classified as correspondent if all the corners belonging to the left contour have a correspondent in the right contour • VP = zc * VI (an object point lies along the optical ray through its image point, scaled by its depth zc)
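A tiny sketch of the tangent orientation check on a candidate corner match; the 30-degree tolerance is taken from the slide, while the function name and interface are made up for illustration.

```python
def slope_match(tangent_left_deg, tangent_right_deg, tol_deg=30.0):
    """Accept a candidate corner match only if the tangent (slope) directions
    at the two corners agree within tol_deg degrees."""
    diff = abs(tangent_left_deg - tangent_right_deg) % 180.0
    diff = min(diff, 180.0 - diff)   # orientation is direction-insensitive
    return diff <= tol_deg

print(slope_match(10.0, 35.0))   # True  (25 degrees apart)
print(slope_match(10.0, 50.0))   # False (40 degrees apart)
```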
Points-of-interest (corner) extraction methodology • Object contours -> object corners • Edge detection algorithm • Contour following algorithm • Dealing with intersections
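The slides do not say which edge detector MIRRORS uses, so the sketch below uses Sobel gradients, one common choice, only to illustrate the edge-detection step that precedes contour following.

```python
import numpy as np
from scipy.ndimage import convolve

def sobel_edges(image, threshold=50.0):
    """Return a boolean edge map by thresholding the Sobel gradient magnitude;
    contours can then be followed along the True pixels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    gx = convolve(image.astype(float), kx)
    gy = convolve(image.astype(float), ky)
    return np.hypot(gx, gy) > threshold
```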
B-spline approximation of object contours and corner detection • The pixel chains are not a suitable representation • Why B-splines? • Properties of B-splines • Each segment of a B-spline curve has the form x(t) = a1*t^3 + b1*t^2 + c1*t + d1 = T*M*Gx and y(t) = a2*t^3 + b2*t^2 + c2*t + d2 = T*M*Gy
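The slide gives only the general cubic form x(t) = T M Gx, y(t) = T M Gy; the sketch below assumes a uniform cubic B-spline, whose basis matrix M is standard, and evaluates one segment from four control points.

```python
import numpy as np

# Uniform cubic B-spline basis matrix, so that a segment is x(t) = T M Gx,
# y(t) = T M Gy with T = [t^3, t^2, t, 1] and G the four control points.
M = (1.0 / 6.0) * np.array([[-1,  3, -3, 1],
                            [ 3, -6,  3, 0],
                            [-3,  0,  3, 0],
                            [ 1,  4,  1, 0]], dtype=float)

def bspline_segment(control_xy, t):
    """Evaluate one cubic B-spline segment at parameter t in [0, 1];
    control_xy is a 4x2 array of control points."""
    T = np.array([t**3, t**2, t, 1.0])
    return T @ M @ np.asarray(control_xy, dtype=float)

ctrl = [(0, 0), (1, 2), (3, 2), (4, 0)]
print(bspline_segment(ctrl, 0.5))   # point midway along the segment
```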
Corner detection • Corners are sought among the knots of the B-splines approximating the given sets of connected pixels • To overcome rounding effects, a threshold (dx, dy) is applied • Advantages: handles sharp & curved contours, and closed & open curves
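The slide only names a threshold (dx, dy) for overcoming the rounding effects; the test below is an assumed, simplified corner criterion at a knot, flagging it when the contour direction changes by more than the tolerance between the incoming and outgoing segments.

```python
import numpy as np

def is_corner(prev_knot, knot, next_knot, dx=2.0, dy=2.0):
    """Flag a B-spline knot as a corner if the per-axis change in contour
    direction across the knot exceeds the (dx, dy) tolerance; the exact
    criterion used in MIRRORS is not spelled out on the slide."""
    v_in = np.subtract(knot, prev_knot)
    v_out = np.subtract(next_knot, knot)
    change = np.abs(v_out - v_in)
    return bool(change[0] > dx or change[1] > dy)
```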
Depth recovery • Through the camera calibration and stereo matching processes • At least two images are needed • Il = Ml * P and Ir = Mr * P, which are solved for the 3D point P
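A standard linear (DLT) triangulation sketch for solving Il = Ml P, Ir = Mr P for the world point P; the slides do not spell out which solver MIRRORS uses.

```python
import numpy as np

def triangulate(Ml, Mr, il, ir):
    """Recover the 3D point P from its image points il = (x, y) and ir in the
    left/right views, given the 3x4 projection matrices Ml and Mr, by a
    least-squares solution of the stacked projection equations."""
    A = np.vstack([
        il[0] * Ml[2] - Ml[0],
        il[1] * Ml[2] - Ml[1],
        ir[0] * Mr[2] - Mr[0],
        ir[1] * Mr[2] - Mr[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    P = Vt[-1]
    return P[:3] / P[3]   # back to inhomogeneous world coordinates
```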
Experimental results (1/2)
Experimental results (2/2)
Dimension | Our method (mm) | Measured dimension (mm) | Difference (mm) | Percentage error (%)
1-2  | 767.243 | 762  | 5.243 | 0.68
2-3  | 1363.6  | 1365 | 1.4   | 0.1
4-2  | 789.03  | 790  | 0.7   | 0.08
5-6  | 600.335 | 600  | 0.335 | 0.05
3-12 | 763.223 | 765  | 1.78  | 0.23
3-11 | 785.18  | 790  | 4.82  | 0.61
4-5  | 250.794 | 250  | 0.794 | 0.31
5-8  | 738.35  | 730  | 8.35  | 1.14
8-9  | 530.38  | 535  | 4.62  | 0.86
7-13 | 19.92   | 20   | 0.08  | 0.4
6-12 | 528.65  | 530  | 1.35  | 0.25
Surface topology construction from a set of scattered 3D points • 3D scattered points -> object surface topology • To make explicit the connectivity relationships between points on the surface of the object, a geometrical structure has to be built on the set of points • Contribution: MIRRORS incorporates a novel method for surface topology construction for depth data obtained from stereo images
Overall methodology for surface topology recovery
• Start with the set of points
• If connectivity information is available, apply constrained Delaunay triangulation; otherwise apply unconstrained Delaunay triangulation
• Form the convex hull of the set of points
• Ray tracing from all the visible points (of the given set)
• Eliminate redundant triangles (those intersected by rays)
• Update the object surface until all redundant triangles are eliminated
Delaunay triangulation • Unconstrained Delaunay triangulation • Constrained Delaunay triangulation
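For the unconstrained case, an off-the-shelf Delaunay triangulation is enough to get started; the sketch below uses SciPy's implementation on random 2D points purely for illustration (the constrained variant used when connectivity information is available is not shown).

```python
import numpy as np
from scipy.spatial import Delaunay

points = np.random.default_rng(0).random((20, 2))   # scattered 2D points
tri = Delaunay(points)                               # unconstrained triangulation

print(tri.simplices.shape)      # (n_triangles, 3) vertex indices
print(tri.convex_hull.shape)    # boundary edges = convex hull of the point set
```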
Object shape recovery • carving the convex hull • optical ray tracing (an intersection-test sketch follows)
Example of an object recovered by MIRRORS
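A sketch of the intersection test that drives the carving step: a triangle of the current surface crossed by the optical ray from the camera to a visible point is redundant and can be eliminated. The slides describe the idea; the Moller-Trumbore routine below is just one standard way to implement the ray/triangle test.

```python
import numpy as np

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle intersection: return True if the ray from
    `origin` along `direction` crosses the triangle (v0, v1, v2) in front of
    the origin."""
    origin, direction, v0, v1, v2 = map(np.asarray, (origin, direction, v0, v1, v2))
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:               # ray parallel to the triangle plane
        return False
    t_vec = origin - v0
    u = np.dot(t_vec, p) / det
    if u < 0.0 or u > 1.0:
        return False
    q = np.cross(t_vec, e1)
    v = np.dot(direction, q) / det
    if v < 0.0 or u + v > 1.0:
        return False
    t = np.dot(e2, q) / det
    return t > eps                   # intersection in front of the origin
```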
Conclusions • MIRRORS - an integrated architecture for 3D object construction from 2D stereo images • Several new techniques were incorporated to improve performance • Camera auto-calibration technique • Feature extraction and matching from sequences of stereo images by processing 2D B-splines • A shape-recovery algorithm for a scattered set of 3D points
References • The following sources are available from http://alpha.me.uic.edu/ • Extended range tracking for remote virtual reality-aided facility management, D. Zetu, P. Schneider and P. Banerjee, The University of Illinois at Chicago • Fast data input model for virtual reality-aided factory layout and material handling decision, D. Zetu and P. Banerjee • and other web pages • Evaluation of virtual reality interface for product shape designs, Chi-Cheng P. Chu, Tushar H. Dani and Rajit Gadh, IIE Transactions (1998) 30, 629-643