
Advanced Cubesat Imaging Payload





Advanced Cubesat Imaging Payload
Robert Urberger, Joseph Mayer, and Nathan Bossart
ECE 490 – Senior Design I – Department of Electrical and Computer Engineering
http://acip.us

Background
• RASCAL: a two-spacecraft mission to demonstrate key technologies for proximity operations.
• The two spacecraft passively drift apart.
• Each spacecraft then uses its propulsion system in conjunction with the imaging payload to facilitate orbiting and docking.
• Imaging Payload: Computer Vision in Space. To achieve the goals of the RASCAL mission, each spacecraft must identify the other and estimate parameters such as relative distance.

The goal of this project is to design and implement an imaging payload that obtains raw image data and converts it into actionable high-level data.

Figure 1: RASCAL mission diagram

Progress
Our main focus this semester was software verification and interfacing with our camera. Software verification and LED pattern creation are nearly complete; the camera work will move to next semester, alongside hardware verification of the image processing pipeline. Preliminary progress has been made on assembling a bus interaction pipeline with the ARM processor on the Zynq. The camera components have been selected for the custom imager design, along with a standard lens housing that accepts multiple lens sizes and apertures. LED identification through image segmentation has been successfully demonstrated; the other stages of the image processing pipeline are under active development.

Figure 4: Preliminary software verification of LED pattern ID
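The LED identification described above relies on image segmentation: bright regions are isolated by thresholding and grouped into candidate LEDs. As a hedged illustration of that step only (the actual payload targets the Zynq hardware pipeline, and `count_led_blobs` plus all values below are hypothetical, not the team's implementation), a minimal pure-Python sketch might look like:

```python
# Illustrative sketch of LED segmentation: threshold a grayscale frame,
# then count connected bright regions (candidate LEDs) with a simple
# 4-connected flood fill. Names and parameters here are assumptions.

from collections import deque

def count_led_blobs(image, threshold=200):
    """Return the number of connected bright regions in a 2D grayscale image."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                blobs += 1
                # Flood-fill this blob so it is counted exactly once.
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return blobs

# Example: a synthetic 5x8 frame with two bright LED spots.
frame = [
    [0,   0,   0, 0, 0,   0,   0, 0],
    [0, 255, 255, 0, 0,   0,   0, 0],
    [0, 255, 255, 0, 0, 250, 250, 0],
    [0,   0,   0, 0, 0, 250,   0, 0],
    [0,   0,   0, 0, 0,   0,   0, 0],
]
print(count_led_blobs(frame))  # prints 2
```

In practice a library such as OpenCV (cited in the references) provides equivalent connected-component labeling; the blob count or spatial arrangement can then be matched against the known LED pattern for each face.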
Description
The Imaging Payload consists of three main components:
• Face Identification: LED patterns on the sides of each CubeSat are used for identification.
• Capturing Image Data: a single CMOS camera inside the CubeSat obtains raw images of its surroundings. The camera will be custom built (imager and lens) and space-ready, and a custom camera interface will be created for this project.
• Processing Image Data: low-level processing transforms raw image data into actionable high-level data.

Materials & Methods
Three main image processing goals:
1. Distance detection: determining the relative distance between the two CubeSats.
2. Object detection: identifying objects of interest.
3. Object classification: identifying the currently visible face of the other CubeSat.

Solutions to these goals will provide functionality for determining the relative distance and approximate attitude, and for identifying the other CubeSat (as well as its currently visible face). Processing will be performed in hardware on a Xilinx Zynq-7000 chip and in software on a bare-metal C or Linux core.

Figure 2: Preliminary LED pattern ideas
Figure 5: Gantt chart (as of 25 November 2013)
Figure 6: Functional decomposition — Image Capture → Preprocess Image → Face Classification → Depth Identification → Store / Output Data

Current Status
• Remainder of software verification in progress.
• Software verification code will be ported to a High Level Synthesis toolkit and converted directly into hardware.
• The camera board is being designed and constructed, and the camera interface to the FPGA is being finalized.
• LED patterns are being run through the processing pipeline in software to verify the visibility of different patterns.

Results
• LEDs have been obtained and patterns are being created.
• Software verification is in progress and is complete for object detection.
• A Bill of Materials has been created for the camera.
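One common way to realize the distance-detection goal, assuming the physical spacing of the LEDs on a face is known, is the pinhole camera model: two points a known baseline apart project to a pixel separation inversely proportional to range. This is a sketch of that idea under stated assumptions, not the team's actual method, and the focal length and baseline values are illustrative:

```python
# Hedged sketch of distance detection via the pinhole model:
#   range = focal_length_px * baseline_m / separation_px
# All parameter values below are illustrative, not the payload's.

def estimate_distance_m(focal_px, baseline_m, separation_px):
    """Estimate range (m) from a known LED baseline and its pixel separation."""
    if separation_px <= 0:
        raise ValueError("LED separation must be positive")
    return focal_px * baseline_m / separation_px

# Example: 1000 px focal length, LEDs 0.10 m apart, observed 50 px apart.
print(estimate_distance_m(1000.0, 0.10, 50.0))  # prints 2.0 (metres)
```

The same geometry shows why accuracy degrades with range: at large distances the LED separation shrinks toward one pixel, so small measurement errors translate into large range errors.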
References & Thanks
Figure 3: Previous in-space CubeSat imaging
Figure 7: Team photo — Joe Mayer, Bob Urberger, and Nathan Bossart

Special thanks to Dr. Kyle Mitchell, Dr. Jason Fritts, and Dr. Will Ebel.

[1] Jan Erik Solem, Programming Computer Vision with Python. Creative Commons.
[2] Milan Sonka, Vaclav Hlavac, and Roger Boyle, Image Processing, Analysis, and Machine Vision. Cengage Learning, 3rd edition.
[3] http://www.cs.columbia.edu/~jebara/htmlpapers/UTHESIS/node14.html (accessed October 29, 2013)
[4] http://cubesat.slu.edu/AstroLab/SLU-03__Rascal.html (accessed October 31, 2013)
[5] http://docs.opencv.org (accessed November 11, 2013)
[6] Gary Bradski and Adrian Kaehler, Learning OpenCV: Computer Vision with the OpenCV Library. O'Reilly Media, 1st edition.
[7] http://i.space.com/images/i/000/022/868/original/cubesat-satellites-launch-1.jpg?1350419887 (accessed November 25, 2013)
