A Distributed Ranging and Relative Attitude Determination Sensor for Spacecraft Docking. Cengiz Akinli, Christopher Hall, PhD. Virginia Space Grant Consortium, Department of Aerospace and Ocean Engineering, Virginia Polytechnic Institute and State University. November 8, 2004.
This presentation addresses: • Existing systems • A new approach • Progress
Many autonomous docking sensors already exist. All are optical in nature; radar is not sufficiently precise for docking. All cameras are chase-mounted, and all offer precision inversely proportional to range.
The AVGS (Advanced Video Guidance Sensor) uses the relative separation of reflectors within the field of view.
Chase attitude adjustments do not alter separation distance. When the chase turns, all reflector points in the FOV move together; their separation remains unchanged.
Target attitude changes (or chase position changes) alter relative separation. When the target turns, or the chase translates and turns, the point positions change relative to each other. But accuracy degrades as range increases.
The Docking Guidance Sensor (DGS) provides an alternative, distributed approach. • Distributed configuration provides precision proportional to range • Laser-target arrangement allows much simpler optics • Opens the possibility of use in other proximity operations
The DGS relies on the fundamental mathematics of conic sections. Four tracking lasers define a unique cone in space, and the intersection of the cone with a plane is an ellipse. The minor axis gives range; eccentricity and orientation give attitude.
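As a first-order sketch of the range step (the function name, parameters, and small-tilt approximation are illustrative, not from the source): when the target plane is nearly perpendicular to the cone axis, the semi-minor axis of the intersection ellipse approximates the radius of the cone's perpendicular cross-section, r = d·tan β, so range follows from the minor axis and the known cone half-angle.

```python
import math

def range_from_minor_axis(b_semi_minor, cone_half_angle_deg):
    """Estimate chase-target range from the ellipse semi-minor axis.

    Small-tilt approximation (illustrative): the perpendicular
    cross-section of the laser cone at range d has radius
    r = d * tan(beta), and for small target tilt the semi-minor
    axis b stays close to r, so d ~= b / tan(beta).
    """
    return b_semi_minor / math.tan(math.radians(cone_half_angle_deg))
```

A cone half-angle of 5 degrees and a measured semi-minor axis of 0.5 m, for example, would put the target at roughly 5.7 m.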
Advanced image processing techniques are required to maximize system precision. High-contrast imagery is easy to process and can be immediately thresholded, but the contrast must come from the hardware, not software preprocessing.
The challenge is in preprocessing noisy images, such as those in sunlight. Optical sensors must adapt to variable ambient lighting; while low-end CCDs are adaptive, the effect is limited.
Low-end CCDs struggle under partial shading because they adapt globally, not pixel by pixel.
Grayscale intensity becomes insufficient as a basis for shape recognition in noisy images
Spectral decomposition using native graphics formats is a first-cut approach, but hardware quality is the ultimate limiting factor.
Conventional digital image capture technology is entirely trichromatic • Essentially 3-byte resolution in color rendering • Equivalent to reproducing sounds by varying the intensities of sine waves at just 3 distinct frequencies • Negates the benefit of high-power, narrow-bandwidth laser light
Pure red laser light does not substantially increase the average intensity of the broad red channel when competing with white light, so this is not much of an improvement over grayscale processing.
Digital masking overcomes hardware limitations if lighting is constant. If the background pixels remain constant long enough to capture two frames, subtraction highlights the laser POIs (points of incidence).
Pixelwise subtraction of a control image yields a result which can be directly thresholded. Grayscaling the image discards too much information to be useful in a wide range of lighting conditions, so a single channel (red, green, or blue) is selected as the thresholding base to produce a two-color image.
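The subtraction-and-threshold step can be sketched as follows (a minimal illustration on single-channel frames; the function name and threshold value are assumptions, not from the source):

```python
def mask_and_threshold(frame_on, frame_off, threshold=60):
    """Pixelwise subtraction of a control image, then thresholding.

    frame_on / frame_off: 2-D lists of red-channel intensities (0-255),
    captured with lasers on and off under the same ambient lighting.
    Returns a binary image in which the laser POIs stand out.
    """
    h, w = len(frame_on), len(frame_on[0])
    return [[1 if frame_on[y][x] - frame_off[y][x] > threshold else 0
             for x in range(w)]
            for y in range(h)]
```

Because only the difference is thresholded, a bright but static background cancels out and only the laser points of incidence survive.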
POIs must be identified and centroided, but active pixels are returned in row-major order, so pixels must first be grouped by POI.
Graphs can encode pixel adjacency and connectedness to determine POIs. The graph data structure addresses exactly this type of problem: among other things, graphs can build groups of connected vertices (components). Adjacency logic is not built in, and can be customized externally to maximize performance.
Connected components are equivalence classes. Vertices and the edges between them form individual components, and just as equivalence classes can be based on any relation, so can components.
Graphs with high edge density use an adjacency matrix to build components.
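The grouping and centroiding described above can be sketched with a breadth-first search over 8-adjacent active pixels (an illustrative stand-in for the graph-component logic; names and the adjacency rule are assumptions):

```python
from collections import deque

def centroid_pois(binary):
    """Group active pixels into POIs via connected components
    (8-adjacency) and return each component's centroid as (x, y)."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                # BFS to collect this component's pixels
                comp, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and binary[ny][nx] and not seen[ny][nx]):
                                seen[ny][nx] = True
                                queue.append((ny, nx))
                cx_mean = sum(p[1] for p in comp) / len(comp)
                cy_mean = sum(p[0] for p in comp) / len(comp)
                centroids.append((cx_mean, cy_mean))
    return centroids
```

Scanning in row-major order and flooding each unseen active pixel groups the pixels by POI exactly as the graph-component argument requires; an adjacency matrix would serve the same role for denser graphs.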
POI location now tolerates a high level of noise. Reference centroids: (72, 44), (209, 43), (71, 180), (214, 176). Noisy-image centroids: (75, 44), (211, 43), (73, 180), (217, 175). Error: (3, 0), (2, 0), (2, 0), (3, -1).
Properties of the ellipse are easily calculated from the points of incidence of four lasers. A general ellipse is defined by the conic Ax² + Bxy + Cy² + Dx + Ey + F = 0, which we simplify (taking B = 0 for an axis-aligned ellipse) to Ax² + Cy² + Dx + Ey + F = 0. The semi-major and semi-minor axes a and b then follow by completing the square, and the eccentricity is e = √(1 − b²/a²).
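Under the axis-aligned (B = 0) simplification, the axis and eccentricity calculation can be sketched directly from the conic coefficients (a hedged illustration; the function name and the B = 0 assumption are mine, not the source's):

```python
import math

def ellipse_axes(A, C, D, E, F):
    """Semi-axes and eccentricity of the axis-aligned conic
    A x^2 + C y^2 + D x + E y + F = 0 (assumed to be an ellipse).

    Completing the square gives A(x - x0)^2 + C(y - y0)^2 = G,
    with G = D^2/(4A) + E^2/(4C) - F, so the semi-axes are
    sqrt(G/A) and sqrt(G/C).
    """
    G = D * D / (4 * A) + E * E / (4 * C) - F
    a, b = sorted((math.sqrt(G / A), math.sqrt(G / C)), reverse=True)
    e = math.sqrt(1 - (b / a) ** 2)
    return a, b, e
```

For example, x² + 4y² = 4 (i.e., A=1, C=4, F=−4) yields a = 2, b = 1, e ≈ 0.866.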
Relative attitude and position are calculated from the properties of the ellipse
The θ’-γ-α rotation is a 3-2-3 Euler angle set, which yields a simplified direction cosine matrix
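For reference, the textbook direction cosine matrix for a 3-2-3 Euler angle sequence, written here with (θ₁, θ₂, θ₃) = (θ′, γ, α) and the shorthand c = cos, s = sin, is:

```latex
R = R_3(\alpha)\,R_2(\gamma)\,R_3(\theta') =
\begin{bmatrix}
 c_\alpha c_\gamma c_{\theta'} - s_\alpha s_{\theta'} &
 c_\alpha c_\gamma s_{\theta'} + s_\alpha c_{\theta'} &
 -c_\alpha s_\gamma \\
 -s_\alpha c_\gamma c_{\theta'} - c_\alpha s_{\theta'} &
 -s_\alpha c_\gamma s_{\theta'} + c_\alpha c_{\theta'} &
 s_\alpha s_\gamma \\
 s_\gamma c_{\theta'} & s_\gamma s_{\theta'} & c_\gamma
\end{bmatrix}
```

The third row and column containing the lone angles γ and θ′ is what makes this sequence convenient to invert for attitude.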
In conclusion, the available CCDs are crude, but now workable in most environments. Spectral analysis and image processing of this type are fields unto themselves; double-sampling improves results to a usable degree. On to control!