MIDN 1/C Hansen, MIDN 1/C Fincher, MIDN 1/C Keith, MIDN 1/C Noyola, MIDN 1/C Topp Advisor: CAPT Nicholson, USN
Problem Statement To design an autonomous underwater vehicle to compete in the annual Association for Unmanned Vehicle Systems International (AUVSI) and Office of Naval Research (ONR) AUV competition in San Diego.
Background • Competition • 6th year competing • Placed highly in recent competitions • Current Strengths • Navigation by dead reckoning using DVL • Current Weaknesses • No mission devices (grabber, launcher, etc.) • Sensors are not fully integrated
Competition • 15th Annual RoboSub Competition • This year’s theme: The Ides of March • Consists of a series of underwater obstacles • Points are awarded for completing obstacles (partial credit is discretionary) • Teams are not required to attempt every obstacle
Research • Other Team Projects (Top Three) • Team SONIA (ÉTS) • Cornell • University of Florida • Experience of former team members and our advisor
Demonstration Plan • Follow Path • Navigate with Dead Reckoning • Implement cameras for primary navigation • Buoys • Use cameras to identify correct buoy • Use cameras to fine tune position • Gates • Navigate through gates using Dead Reckoning • Implement cameras for primary navigation • Bins • Actuator triggered by the cameras • Use the cameras to fine tune the position • PVC • Pick up the PVC and surface • Return PVC to original position and resurface • Surfacing through Octagon • Utilize SONAR (passive) to identify correct octagon • Utilize SONAR (passive) to navigate to correct octagon
Responsibility Breakdown Key: P = Primary S = Secondary
Frame and Actuators MIDN 1/C Hansen MIDN 1/C Noyola
Frame Design Increase adaptability Allow more room for actuators Allow for future modifications
Grabber Design
Figure 1: Pin design
Figure 2: Wheel design
Figure 3: Target to be picked up
Torpedo Design
Figure 4: Torpedo targets
Figure 5: Torpedo launcher
Dropper Design
Figure 6: Dropper design
Figure 7: Dropper targets (Bins)
Wiring MIDN 1/C Hansen
Wiring Example
Kill switch board diagram: connections to the thrusters (wire #1 from each: Fwd, Down Port, Aft, Down Stbd), the camera box light (#5), the kill switch relay, and kill switch power.
Software MIDN 1/C Topp
Background: Navigation • Programmed in C and run on Linux • In the past, teams have relied heavily on waypoint navigation • Essentially, a specific point was entered based on the vehicle’s fix, and the vehicle navigated to that point • Previous teams have attempted to use camera navigation but have been unsuccessful • Our goal is to successfully integrate camera vision into the system navigation
Basics of the Code • We use a shared-memory segment to store all of the necessary variables • This lets the same variables be read and written by several different programs through one common structure (see the sketch below) • Ex: In the “maneuver.c” program, there is a switch statement based on case numbers • case 0 = maintain position • case 1 = waypoint navigation • case 2 = camera navigation • case 3 = SONAR navigation • In the “forward camera.c” program, if a buoy or a bin is detected, the following line of code is executed: • shm_struc->positionControlMode = 2; • This stores “2” in the positionControlMode variable through shared memory. The variable can then be read in “maneuver.c”, activating camera navigation.
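The slides name shm_struc, positionControlMode, and the maneuver.c case numbers; everything else in the sketch below (the struct layout, the segment name "/romulus_shm", and the POSIX shm_open()/mmap() setup) is our assumption about how such a shared-memory segment could be wired up in C on Linux, not the team's actual code.

/* Minimal sketch of the shared-memory pattern described above (assumed
 * layout and setup).  Only positionControlMode and the case numbers come
 * from the slides; the segment name "/romulus_shm" is hypothetical. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

typedef struct {
    int    positionControlMode;   /* 0-3, dispatched on in maneuver.c   */
    double ord_head;              /* ordered heading, degrees           */
    double xRed, yRed;            /* buoy centroid reported by camera   */
} shm_t;

static shm_t *attach_shm(void)
{
    int fd = shm_open("/romulus_shm", O_CREAT | O_RDWR, 0666);
    ftruncate(fd, sizeof(shm_t));                 /* size the segment   */
    return mmap(NULL, sizeof(shm_t), PROT_READ | PROT_WRITE,
                MAP_SHARED, fd, 0);
}

int main(void)
{
    shm_t *shm_struc = attach_shm();

    /* maneuver.c-style dispatch on the mode stored by other programs */
    switch (shm_struc->positionControlMode) {
    case 0: puts("maintain position");   break;
    case 1: puts("waypoint navigation"); break;
    case 2: puts("camera navigation");   break;
    case 3: puts("SONAR navigation");    break;
    }
    return 0;
}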
Waypoint Navigation • Historically, this has been the most reliable method of navigation for the vehicle • Takes a reading from the DVL (using compass heading and speed over ground) and navigates the AUV to the desired waypoint • We will use this for most obstacles except the buoy and bin obstacles
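A minimal sketch of the dead-reckoning step this slide describes, assuming a flat north/east coordinate frame and a DVL that supplies speed over ground in m/s; the function and struct names are illustrative, not taken from the team's code.

/* Dead-reckoning sketch: integrate DVL speed over ground along the
 * compass heading, then compute the heading to the next waypoint.
 * Names and the flat north/east frame are assumptions. */
#include <math.h>
#include <stdio.h>

#define DEG_TO_RAD (3.14159265358979 / 180.0)

typedef struct { double north, east; } pos_t;   /* metres from start */

static void dead_reckon(pos_t *p, double heading_deg, double sog_mps, double dt_s)
{
    p->north += sog_mps * cos(heading_deg * DEG_TO_RAD) * dt_s;
    p->east  += sog_mps * sin(heading_deg * DEG_TO_RAD) * dt_s;
}

static double heading_to_waypoint(pos_t p, pos_t wp)
{
    /* atan2(east offset, north offset) gives a compass-style bearing */
    return atan2(wp.east - p.east, wp.north - p.north) / DEG_TO_RAD;
}

int main(void)
{
    pos_t fix = {0.0, 0.0}, wp = {10.0, 5.0};
    dead_reckon(&fix, 30.0, 0.5, 1.0);           /* 1 s at 0.5 m/s, hdg 030 */
    printf("bearing to waypoint: %.1f deg\n", heading_to_waypoint(fix, wp));
    return 0;
}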
Buoy Obstacle • The officials will release the order of colors to hit • A menu prompts the user to choose a color before the run • The choice of color selects which centroid variables (xRed, yRed, etc.) are used • Camera vision navigation is then used to navigate to the desired buoy
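A small sketch of the pre-run color prompt, assuming a plain console menu; the menu wording and the way the selection would be written back to shared memory are our assumptions.

/* Sketch of the pre-run color prompt.  In the real system the selection
 * would be written into the shared-memory struct so the camera code
 * knows which centroid (xRed/yRed, ...) to track first. */
#include <stdio.h>

int main(void)
{
    int choice = 0;
    printf("Buoy order - choose the first color to hit:\n"
           "  1) red\n  2) green\n  3) yellow\n> ");
    if (scanf("%d", &choice) != 1 || choice < 1 || choice > 3)
        return 1;
    printf("selected option %d\n", choice);
    return 0;
}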
Camera Vision: Basics • The forward camera outputs a string of numbers: • 1 = pass, 0 = fail • [row, col] of the centroid of the detected object • Color as the integer value of the corresponding ASCII character • Red = 114 • Green = 103 • Yellow = 121 • No Match = 78
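As an illustration, a sketch of parsing one such camera record, assuming the fields arrive comma-separated as pass, row, col, color; the slide lists the fields and the ASCII codes but not the exact delimiter.

/* Sketch of parsing one forward-camera record.  The comma-separated
 * layout "pass,row,col,color" is an assumption; the ASCII codes
 * (114 'r', 103 'g', 121 'y', 78 'N') come from the slide. */
#include <stdio.h>

int main(void)
{
    const char *record = "1,240,388,114";        /* example input */
    int pass, row, col, color;

    if (sscanf(record, "%d,%d,%d,%d", &pass, &row, &col, &color) == 4) {
        if (pass == 1 && color == 'r')           /* 114 = red     */
            printf("red buoy centroid at row %d, col %d\n", row, col);
        else if (color == 'N')                   /* 78 = no match */
            printf("no match\n");
    }
    return 0;
}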
Camera Vision Pseudocode Example • If the camera detects an object (output = 1) • shm_struc->positionControlMode = 2; which switches to camera vision navigation • We then read the x coordinate for the centroid and store it in variable xRed/xGreen/xYellow • The depth of the object is given at the competition, so it will be preprogrammed into the system. • We then calculated the pixels/degree of the camera • # columns = 640 • FOV = 15° • Pixels/degree = # columns/FOV • Pixels/degree = 42.7 pixels/1 degree
Camera Vision Navigation Logic • We then implemented the following line of code: • shm_struc->ord_head = 42.7/xRed; • This line takes the pixels-per-degree figure and divides it by the pixel position of the object • The output ord_head is a heading value in degrees used in the camera vision navigation portion of the code • This portion of the code simply orders Romulus to come to the ordered heading
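For comparison, a sketch of a pixel-to-heading conversion built from the 640-column, 15° FOV, 42.7 pixels-per-degree figures above; measuring the centroid offset from the image centre (column 320) and dividing by pixels per degree is our assumption and is not the one-liner quoted from the team's code.

/* Comparison sketch: centroid column -> relative heading, using the
 * 640-column, 15-degree FOV, 42.7 pixels/degree figures from the
 * previous slide.  Measuring from the image centre is an assumption. */
#include <stdio.h>

#define COLS       640
#define FOV_DEG    15.0
#define PX_PER_DEG (COLS / FOV_DEG)              /* about 42.7 px/deg */

static double pixel_to_heading(int x_centroid)
{
    double offset_px = (double)x_centroid - COLS / 2.0;
    return offset_px / PX_PER_DEG;               /* + = turn starboard */
}

int main(void)
{
    printf("centroid at col 400 -> %.1f deg offset\n", pixel_to_heading(400));
    return 0;
}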
Camera Vision Navigation • After the vehicle hits the correct buoy, the code switches back to waypoint navigation to move on to the next obstacle.
Camera Vision: Fail Check • I have added a “timeout” feature to the code: if the vehicle has switched to camera navigation and does not find a buoy or a bin within one minute, it switches back to waypoint navigation (see the sketch below).
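A sketch of that timeout check, using the mode codes from the “Basics of the Code” slide (1 = waypoint, 2 = camera navigation); the helper name and its parameters are hypothetical.

/* Sketch of the one-minute camera-navigation timeout.  The helper name
 * and parameters are hypothetical. */
#include <time.h>

#define CAMERA_TIMEOUT_SEC 60

void camera_fail_check(int *positionControlMode,
                       time_t camera_mode_start, int target_found)
{
    if (*positionControlMode == 2 && !target_found &&
        time(NULL) - camera_mode_start >= CAMERA_TIMEOUT_SEC) {
        *positionControlMode = 1;                /* fall back to waypoints */
    }
}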
Bins • This uses essentially the same logic as buoys but instead of color, the downward camera will output variables corresponding to shapes. • The code will then execute the appropriate sequence in order to drop the projectile into the correct bin.
Cameras MIDN 1/C Fincher
Cameras Cognex 5400C • Onboard processing • In-Sight Explorer software • C-mount lens
Buoys • Forward camera • Find curved edge first • Find color next • Bank of three colors • Pass depends on both fixtures • Trouble with thresholding
Bins • Downward camera • PatMax • Thresholding • Contrast • Rotation • Scale
SONAR MIDN 1/C Keith
Passive SONAR • Competition Requirements • ORE Multi-Beacon • SONAR Operation Basics • Four omni-directional hydrophones • Data Processing Circuit • Code
SONAR & the Competition • Two 9 ft diameter octagon-shaped surfacing areas • One of the pingers is turned on before each competition run • The goal is to surface completely inside the correct octagon • The practice and competition pingers are going at the same time
ORE 4330B Multi-Beacon • Transponder/responder modes • Same ‘pinger’ used in the competition • Set to a frequency between 22 kHz and 30 kHz • Requires a driving mechanism
Hydrophones • Reson TC4013 omni-directional hydrophone • Output….
SONAR Data Processing Circuit • AD605 variable gain amplifier • Multiple-feedback active band-pass filter • Voltage divider and comparator with hysteresis • Digital signal processing microcontroller • Three simultaneous outputs • RS232 UART • Serial Peripheral Interface (SPI) 64K serial memory • 10-bit quad DAC
SONAR Code • Written in C • Two programs • The sonar.c program computes azimuth, elevation, and status, and reports which pinger is being detected • The navigationcenter.c program filters data from multiple sensors to determine the most likely position
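A sketch of how sonar.c's outputs might be handed to navigationcenter.c through the same shared-memory pattern used elsewhere in the software; the field names and the simple "steer toward the pinger's azimuth" rule are our assumptions, since the slide only names the quantities produced.

/* Sketch of passing sonar.c's results to the rest of the system through
 * shared memory.  Field names and the steering rule are assumptions;
 * the slides only name azimuth, elevation, status, and pinger ID. */
#include <math.h>

typedef struct {
    double sonar_azimuth;     /* relative bearing to the pinger, degrees */
    double sonar_elevation;   /* elevation angle, degrees                */
    int    sonar_status;      /* nonzero when a valid ping was processed */
    int    sonar_pinger_id;   /* practice vs. competition pinger         */
    double ord_head;          /* ordered heading, degrees                */
} nav_shm_t;

/* navigationcenter.c-style step: if the SONAR fix is valid, order a
 * heading toward the pinger; otherwise leave the ordered heading alone. */
void sonar_navigate(nav_shm_t *shm, double current_heading_deg)
{
    if (shm->sonar_status)
        shm->ord_head = fmod(current_heading_deg + shm->sonar_azimuth + 360.0,
                             360.0);
}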
Special Thanks to Project Advisor Captain Nicholson, USN Systems TSD Rickover Machine Shop Rickover Hydro Lab