Team WORX
Journal Paper for the 2016 RobotX Competition

Bianca Mesa, Travis Moscicki, Winston Gresov, Harold Davis, Justin Koenig, Luke Ridley, Christina Tsai, Cullen Williams, Wesley Blummer, Thaddeus Cullina, Mary Spillane

Advisors: Manhar Dhanak (FAU), Karl Von Ellenrieder (FAU), Chandrasekhar Nataraj (Villanova), Garrett Clayton (Villanova)

Abstract—This document presents the design overview and challenges within the development of the 2016 WAM-V USV16 (16-foot Wave Adaptive Modular Vessel) as it is adapted to achieve all RobotX task requirements. The team exists as a partnership between Florida Atlantic University (FAU) and Villanova University (VU). The vessel's sensor suite employs a VideoRay tethered submarine as a secondary remotely operated vehicle, an RGB-D vision system robust to lighting variations, an underwater USBL (Ultra-Short Baseline) acoustic localization system, a GPS-aided MEMS-based inertial measurement unit, and a Hokuyo LIDAR. As a research platform, the FAU/VU vehicle has served to further the development of adaptive control and robot vision systems. For example, the controller is robust to various environmental disturbances, including wind force, current, lighting variations and rain, while the vision system implements a novel data fusion algorithm and integrates Bayesian color recognition. A hierarchical structure and finite state machine allow for the development of modular routines which can be rapidly implemented and are utilized to conduct mission-level control. Path planning, mapping, obstacle avoidance and navigation, and three-degree-of-freedom state estimation embody the focus of Team WORX's WAM-V USV16 platform.

I. INTRODUCTION

This document serves as the means to convey our team's approach to the RobotX 2016 Competition Tasks. A constructive Pacific Rim partnership between universities from five countries, the second biennial RobotX 2016 Challenge presents autonomous system development challenges for the maritime domain. The challenge is organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Office of Naval Research (ONR), and seeks to foster innovation and develop competitive engineers with a thirst for robotics in the maritime domain.

The scenarios presented for each competition task simulate problems encountered in real-world applications, while dynamic task association enforces modularity and the need for a robust state machine within the software development scope. Success with this year's competition requires the RobotX vessel to demonstrate proficiency in the following:

1. Vehicle propulsion strength and speed
2. Combined acoustic and vision navigation agility
3. Identify and communicate light sequences from a target object
4. Autonomous identification and navigation of a docking scenario
5. Deploy/retrieve, manage, communicate and acquire data from a secondary vehicle
6. Visually survey and deliver a package to a target of known dimension and location
7. All performance is independent of human involvement

The long-time partnership between Villanova University (VU) and Florida Atlantic University (FAU) leverages VU's skill in visual perception and high-level control with FAU's proficiency in systems integration, underwater acoustics navigation and low-level vehicle control. Faculty, staff members and students (both graduate and undergraduate) from both institutions have contributed to the technical and logistical challenges presented by the RobotX competition.

II. PAPER CONTENTS

A. Design Strategy

The interactive nature of the challenges for this year's competition marks a clear step in the direction of true automation, where the vessel is capable of taking dramatically different actions depending on how it perceives its environment. In order to accomplish this, Team WORX set out to make a mechanical system that is as robust as possible, an electrical system that provides as much system data as possible, and a Guidance, Navigation, and Control (GNC) system that is as modular as possible. This design strategy also meshes well with the team's commitment to research projects centered around the WAM-V outside of RobotX.

1) Vision Based Tasks

For the vision-based tasks that form the mainstay of AUVSI competitions, the team employs a coupled nodding LIDAR and webcam. The vessel uses the large field of view of the LIDAR to perform preliminary detection routines in the polar coordinate frame and, if color information about an object is needed, to position itself such that the object lies within the field of view of the camera. This provides a hardware low-pass filter for object detection, since the LIDAR will only return a value for objects on the water but not the water itself.
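The following is a minimal sketch, not the team's actual code, of this polar-frame pre-detection: because the LIDAR returns ranges only for objects and never for open water, contiguous runs of valid returns become candidate objects whose bearings can be used to point the camera. The range thresholds are illustrative placeholders.

    // Group adjacent valid LIDAR beams into candidate objects.
    #include <cmath>
    #include <cstddef>
    #include <vector>

    struct PolarReturn { double angle_rad; double range_m; };  // one LIDAR beam
    struct Candidate   { double bearing_rad; double range_m; };

    // maxRange: beyond this there is no return (open water or out of range).
    // rangeJump: a discontinuity this large splits two distinct objects.
    std::vector<Candidate> detectObjects(const std::vector<PolarReturn>& scan,
                                         double maxRange, double rangeJump) {
        std::vector<Candidate> out;
        std::size_t start = 0;
        bool inObject = false;
        for (std::size_t i = 0; i <= scan.size(); ++i) {
            bool valid = i < scan.size() && scan[i].range_m < maxRange;
            bool split = inObject && valid &&
                         std::fabs(scan[i].range_m - scan[i - 1].range_m) > rangeJump;
            if (valid && !inObject) { start = i; inObject = true; }
            else if ((!valid || split) && inObject) {
                std::size_t mid = (start + i - 1) / 2;  // central beam of the run
                Candidate c = { scan[mid].angle_rad, scan[mid].range_m };
                out.push_back(c);
                inObject = false;
                if (split) { start = i; inObject = true; }  // new object begins here
            }
        }
        return out;
    }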


2) Underwater Tasks

The find-the-break and underwater shape identification challenges will be completed using the VideoRay Pro 4 ROV. A tethered ROV was chosen because of the inherent safety factor included with launching an ROV. The ROV will perform a raster scan pattern at a constant depth after being launched from the WAMV. The Pro 4 has an internal camera and compass that can be used to send coupled still images and compass readings to the WAMV for post-processing.
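A minimal sketch of such a raster ("lawnmower") pattern is given below; the survey dimensions and lane spacing are illustrative placeholders, not mission values.

    #include <vector>

    struct Waypoint { double north_m; double east_m; double depth_m; };

    // Parallel survey lanes at a fixed depth, alternating direction each lane.
    std::vector<Waypoint> rasterScan(double width_m, double length_m,
                                     double laneSpacing_m, double depth_m) {
        std::vector<Waypoint> wps;
        bool leftToRight = true;
        for (double n = 0.0; n <= length_m; n += laneSpacing_m) {
            Waypoint a = { n, leftToRight ? 0.0 : width_m, depth_m };  // lane start
            Waypoint b = { n, leftToRight ? width_m : 0.0, depth_m };  // lane end
            wps.push_back(a);
            wps.push_back(b);
            leftToRight = !leftToRight;  // reverse direction on the next lane
        }
        return wps;
    }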

3) Projectile Task

The detect-and-deliver task required the design of a ball launcher that functions in the same way as a baseball pitching machine. The design consists of PVC pipe supported by a wooden framework to keep it stationary. There are two slits cut in the pipe where the wheels contact the racquetballs. Pipe with a two-and-a-half-inch inner diameter was selected because the diameter of a racquetball is slightly smaller than two and a half inches. The system employs two Ampflow E-30-400 motors, chosen because they were able to reach the RPM that we calculated would be necessary to launch the balls. This design was chosen for two reasons. First, it is simple: the only moving parts are the two wheels connected to the motors, and the framework is very basic and relatively easy to construct. Second, it is effective: balls are shot out of the cannon reliably at approximately the same speed and direction.

4) Acoustic Pinger-Based Transit

Coupling the vision, acoustic, and control systems with a path planner and a priori knowledge is one of the challenges Team WORX was most interested in, because the fusion of multi-environment data is a real-world challenge for USVs. Since the gate buoys are laid out in a persistent manner, the team is able to box the data received from the acoustic system and vision system into one of six possible choices for the entry and exit gates.

B. Vehicle Design

1) Requirement Analysis

The team developed a number of system requirements by analyzing the rules and task descriptions as they were updated, in addition to drawing on the experience of previous competitions. The requirements include:

• An acoustic navigation system
• An RGB-D acquisition system
• A tethered ROV + launch and recovery system
• A multi-computer network
• > 4 hours of run time
• Tilting thrusters
• Azimuthing thrusters
• A ball launching apparatus
• Data transmission in a saturated network

2) Functional Decomposition

To provide a blue-sky environment during design, the team laid out a functional decomposition with abstraction held paramount. The idea was to break down each challenge into the discrete steps that the vehicle would need to take in order to accomplish it, without prescribing how each step would actually be accomplished. Defining exactly what the vessel was required to do gave the designers a clear target for research.

3) Trade Studies

Once research and initial designs were completed, the team completed a series of trade studies based on the Kepner-Tregoe method. The major factors for weighing results were cost, reliability, and ease of implementation. Values were primarily assigned through thoughtful conversation and group consensus.

4) Functional Flow Block Diagram

After the selection of new systems to be integrated, the team developed functional flow block diagrams that highlight how each system connects to the others. These designs were then reviewed by senior FAU and VU faculty for approval.

5) Vessel Platform

Since the inaugural RobotX Challenge in 2014, the WAMV USV16 platform has been provided by ONR and Marine Advanced Research (MAR) to all competition teams as the primary development platform to support all competition tasks. The USV16 is a catamaran-style vessel that achieves maximum maneuverability with a differential thrust propulsion system. Its uniqueness lies in an independent suspension system for each pontoon, which allows the vehicle's payload tray to remain stable in a range of sea states. This stability and the dampened motion due to surface wave interaction serve to reduce noise observed by any instrumentation during missions. The vehicle's suspension system and simple configuration make the WAMV platform ideal for sensor integration and maneuvering around a range of obstacles.

Figure 1: Side View of the Team WORX WAMV USV16 Platform

6) Actuated Thruster Mounting System

The need for a robust propulsion system began when the vehicle platform was delivered, and the addition of stern buoyancy pods from MAR in 2016 created the need for a new thruster mounting design. Capabilities desired for the new mounting system included the reuse of the vehicle's two 24V MinnKota thrusters and the enhancement of the vessel's azimuthing capabilities. The previous configuration relied on linear actuators, which were prone to failure and limited the rotation of the thrusters to ±45 degrees. The azimuthing capability of the thrusters has been increased to ±90 degrees by integrating servos into the new design. Furthermore, the previous configuration did not allow the thrusters to tilt, which proved cumbersome when the vehicle was launched and recovered throughout a long testing day. The new mounting system includes the ability to raise and lower the thrusters at an angle, which reduces the potential for damaging the thrusters during the deployment and retrieval process.

The actuated system employs two GearWurx high-torque servos paired with external gears which are mounted directly to the thruster shaft.

7) Launch and Recovery System (LARS)

For tasks requiring the use of a secondary vehicle, the ROV Launch and Recovery System (LARS) was developed to integrate the chosen secondary vehicle with the WAMV platform. Inspired by the LARS for FAU's 2016 RoboBoat entry, this design boasts a reliable solution to ROV deployment and retrieval while having little interference with the acoustics, optics or propulsion subsystems.

Figure 2: Launch and Recovery System for Secondary Vehicle (VideoRay)

For this design, a winch (HD Premium Planetary Gear Motor running at 45rpm) mounted underneath the Owl’s Nest feeds tether wire to launch and recover the ROV. For the deployment process, a ratchet on the winch disengages allowing the tether to feed out from the pulley system on the main tray. Deployment occurs for ninety seconds.

Figure 3: Winch System for LARS

For ROV retrieval, the winch is activated to pull the ROV into the LARS housing. The winch system will only allow pulling the tether in. As the sub enters the cage, the side plates engage a limit switch (one switch per plate). When a limit switch is activated, the motor turns off and the ROV retrieval mission ends. Additionally, one foot of cable sheathing on the ROV required the design to implement a strain relief mechanism which activates from within the cage. Complex angles made this design difficult to machine by hand and required a CNC machine. The initial design underwent three iterations before the final design was reviewed. The main designer for this component was a senior mechanical engineering student, Justin Koenig.

8) Secondary Vehicle

Two of the challenges, Find the Break and Coral Survey, entail underwater exploration. Early versions of the rules suggested that the primary vehicle would not be permitted to enter the associated challenge areas, and that a secondary vehicle would therefore be needed. Accordingly, we arranged for the loan of a high-performance remotely operated vehicle (ROV) from a locally based sponsor, VideoRay Corporation.

Figure 4: VideoRay ROV

This ROV is specially designed for underwater operation, and features waterproof headlights, a video camera, a magnetometer and other sensors. The loan permitted us to dispense with constructing a vehicle and let us concentrate on programming it to carry out the necessary tasks, i.e., exploring the sea bed and capturing images of the bottom that could be used to identify the targeted symbols. In the event, the exclusion of the primary vehicle from the challenge area was not retained. The secondary ROV continues to be of value, however, because it enables examination of the sea floor from a much closer vantage point than could be achieved with, for example, an articulated boom, and represents a more robust design as well. Finally, the strategy can be extended to accommodate more complex undersea tasks, such as are likely to be featured in future RobotX competitions.

A challenge that presented itself was that the associated software, API and documentation were MS Windows compatible only, whereas all of our systems run under Linux (specifically Ubuntu) for compatibility with ROS and better performance in a real-time context. The appropriate control signals therefore had to be reverse engineered, and the control program written from scratch. Fortunately, this task was facilitated by the fact that the instrument ordinarily used for controlling the ROV is compatible with a standard Microsoft video game console controller.

Associated mechanical engineering consisted primarily of the construction and mounting of a launch and recovery mechanism: a winch winds the ROV's tether, which includes power, control lines and mechanical support in a single watertight package. A slip ring enables the various lines to be wound and unwound without twisting off their connections. Finally, a welded aluminum framework mounted on the rear of the primary vehicle provides support for the ROV while affording access to the water.

9) Owl's Nest

The Owl's Nest is an independent subsystem designed as an integration platform for the anemometer, light tower, GPS unit and the winch for the LARS. The components were fabricated and welded in house by students.

Figure 5: Owl's Nest Mounting Frame

10) Acoustics Boom

The need for a low-cost solution for the acoustics launch and recovery subsystem was fulfilled by utilizing components already present in the lab space: transducers, a carbon fiber pole and a linear actuator. The system was designed not to interfere with the LARS toward the vehicle stern. When retracted, the system allows the WAMV to clear a buoy. When deployed, the transducers are located two feet below the water surface. All components for this subsystem were machined by hand by team members in the FAU machine shop except for the triangular plates, which were fabricated via water jet. The spacers which mount to the carbon fiber boom are 3D printed from ABS and employ heat inserts for structural integrity.

Figure 6: Acoustic Boom and actuator configuration shown mounted beneath the WAMV payload tray

The acoustics launch and recovery system gracefully utilizes the existing piping support for the WAMV payload tray to clamp mounts for the linear actuator and the acoustics boom end. Aluminum piping mounted along the longitudinal payload tray provides additional structural support for both the linear actuator and the payload tray, which previous versions of this subsystem did not achieve. Four different versions of this design were evaluated by the team before a final design was selected. The acoustics DAQ cables run through the boom for clean cable management. This design engages a different perspective on subsystem integration, since it both contributes to and relies on the structural integrity of the host vessel. The main designer for this subsystem is Luke Ridley, a junior ocean engineering student.


Figure 7: Acoustic Boom end connected to the linear actuator, with limit switches to control the actuator motion

Figure 8: Physical Hydrophone Configuration for Acoustic Data Acquisition System

11) Acoustic Data Acquisition System

The design for the acoustic positioning system was based on the system from the 2014 RobotX Challenge, which used an ultra-short baseline (USBL) algorithm. The old acoustic system was rather large and required its own enclosure, which consumed a significant amount of available room on the platform. Thus, it became necessary to reduce the system footprint to fit in the main electronics enclosure. Since the old system required the use of a National Instruments Data Acquisition System (DAQ) and a host Windows computer capable of running LabVIEW, a total redesign was required. The new design uses an STM32F4 processor to perform both the data acquisition and the digital signal processing (DSP), as opposed to requiring a full embedded computer. Since the DAQ had also provided all the front-end signal conditioning, this needed to be designed into the new system as well. The only components unchanged from the original system were the hydrophones, which provided some known requirements for the new system, such as input voltage range, expected dynamic range of the input signal and the power requirements for the hydrophones. The hydrophones consist of two pairs of piezoelectric elements, with each pair tuned to a certain frequency which will be known before the mission run. Communication between the acoustic system and the high-level computer has also been changed from a LAN connection to a TTL serial connection, since the STM32F4 neither has nor needs an Ethernet MAC.

The current design went through two hardware iterations. The first design required the signal to be sampled constantly in order to search for the beginning of the signal. This proved to be a major limitation, because the sampling could be interrupted at any point by any of the service routines, such as communication, which would likely result in missing the beginning of the signal. Revision two incorporated a hardware trigger circuit: a threshold is set, and if a signal is detected, a high-priority interrupt service routine triggers ADC sampling, which then writes directly to memory via DMA without involving the processor. This means that signal acquisition can no longer be interrupted by another service routine. Another improvement over the first iteration is the addition of a hardware hold-off, which allows the processor to disable the input signal until the system is ready for a new waveform. Doing this in hardware solved some complications with the interrupt service routines used for the hold-off in the first revision. The last improvement over revision one was a change in the front end which fixed some voltage offsets caused by the amplifiers having slightly different analog references; the references have now been combined so that they are identical across all the amplifier stages.

The USBL algorithm performs a Fourier transform on the input data using the DSP library. Once the data is captured and the FFT is performed, the maximum-amplitude frequency is compared to the desired frequency, and if it matches, the algorithm proceeds to the next stage. Since the bearing is calculated from the phase angle between the signals on the two hydrophones in each pair, the angle can be calculated by converting the complex FFT to phase and magnitude and subtracting the phase information between each pair of hydrophones. A plane wave arriving at bearing θ produces a phase difference Δφ = (2πd/λ)sin θ across a pair separated by distance d, so for a half-wavelength spacing (d = λ/2) the resultant bearing can be estimated using the equation: bearing = arcsin(Δφ/π).
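The sketch below illustrates this computation under the stated half-wavelength assumption; the FFT itself (performed on the STM32F4 with its DSP library) is elided, and the inputs are the complex FFT bins of the two channels at the bin nearest the expected pinger frequency.

    #include <cmath>
    #include <complex>

    // Bearing of the source relative to a hydrophone pair, in radians.
    // Assumes the pair is spaced half a wavelength apart, so that
    // delta_phi = pi * sin(bearing). The left/right ambiguity discussed
    // below is not resolved here.
    double usblBearing(std::complex<double> binA, std::complex<double> binB) {
        const double kPi = 3.14159265358979323846;
        double dphi = std::arg(binA) - std::arg(binB);  // phase difference
        while (dphi >  kPi) dphi -= 2.0 * kPi;          // wrap into [-pi, pi]
        while (dphi < -kPi) dphi += 2.0 * kPi;
        return std::asin(dphi / kPi);                   // bearing = arcsin(phase/pi)
    }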

This does, however, leave one issue: the angle is ambiguous, since the source could be located on either side of the hydrophone pair and still create the same phase difference. In the future this could be eliminated by placing one pair perpendicular to the other, so that the two pairs of hydrophones give each other information on where the source is located. This would, however, require the hydrophones to be adjustable remotely, without manual operation.


12) Control Box

The electronics box consists of a number of components that are each designed for a specific purpose or mission. Some of the mission-critical components include the Jetson TK1 compute boards, the Bottom Level Manager microcontroller, the power distribution circuits, the telemetry systems, and the motor control interface. Each individual system needs to work independently while coexisting and communicating in a cooperative manner. For instance, each component was designed not to overload the power distribution system.

The Jetson TK1 compute boards act as the central processing units for the system. These boards need to continue to operate in all conditions and remain in contact with all other systems on the boat. The input/output interfaces that provide these communication channels include Ethernet, UART/RS232, I2C, and GPIO. One hardware component that is vital to smooth software execution is the Real Time Clock (RTC) designed into the motherboard and accessed over an I2C interface. The RTC keeps accurate time at all times, even when the system is off, using a small CR2032 coin cell battery. If these I2C interfaces were to fail, the Jetsons would have no way of knowing the correct time.

The Bottom Level Manager, or BLM, is a small STM32 microcontroller embedded in the motherboard with the task of managing low-level real-time devices. These low-level duties include GPIO, battery voltage and current measurements, power supply control, and control of peripherals excluding the thrusters. The light tower, for instance, is vital to the mission because it indicates what state the WAMV is currently in; the BLM is responsible for detecting the state of the boat and making this information available to the Jetson TK1 boards while controlling the state of the light tower. The BLM is also responsible for measuring the capacity of the battery supply and indicating the battery state to the system.

The power distribution system is perhaps the most critical part of the electronics system. Without effective power distribution, none of the systems on the boat would work as designed. The power distribution system was designed to accommodate LiFePO4 battery packs, whose voltage at the end of the discharge curve drops to about 10.5V. This made supplying a steady 12V to the electronics more challenging. Industrial-grade DC/DC regulators capable of taking an input voltage as low as 9V while still maintaining a 12V output made this battery pack topology possible. The system uses two of these DC/DC converters to supply the required power to all the systems: one regulator is responsible for large 12V high-power devices, while the other supplies low-power devices with 5V and 3.3V. The BLM has control over most of these supplies and can measure the current consumption of each rail individually.

The telemetry system consists of two channels: the primary networking link between the systems on the boat and the ground station, and a much slower backup channel for mission-critical data. The primary telemetry link consists of two Ubiquiti Rocket M5 modems with high-gain omnidirectional antennas; this link provides a data rate of up to 80 Mb/s over a standard Ethernet interface. The backup, or failsafe, telemetry system is a Digi XTend module capable of a data rate close to that of a high-speed UART connection, about 115 kb/s.

The motor control (PWM) interface is a high-reliability system designed to provide hardware control over the motors from shore via a standard RC remote. The Jetson TK1 uses an I2C interface to communicate with an application-specific IC that generates the RC-style PWM signals. The electronics box also houses a 12-channel RC receiver, and these 12 channels are multiplexed with the PWM outputs of the CPUs. The control line of the multiplexer comes from a small PIC microcontroller whose sole job is to monitor a channel on the RC receiver and detect which way a switch is set on the RC remote on shore. This gives the team members on shore a level of control over the boat that software control could not match. The shore remote is essentially the master of the control signals, and when enabled it passes control of the motors to the Jetson CPUs. An overview image of the motherboard can be found in Figure 9.

Figure 9: Rendering of Motherboard for main control box

13) Low Level Vehicle Controller

The ongoing research at FAU has provided a diverse and rich set of both nonlinear and linear controllers. Team WORX's vessel boasts an under-actuated coupled velocity and heading PID controller, a fully-actuated backstepping PD controller, a fully-actuated station keeping controller, and an ad hoc controller. In addition to the onboard controllers, the vessel also houses both a simple and a dynamic line-of-sight (LOS) trajectory generator for waypoint navigation.

Two controllers have been selected for primary use during competition: the velocity and heading PID controller, and the fully-actuated station keeping controller. This decision was made because of the complexity associated with the ad hoc and backstepping control styles. In 2014, Team WORX suffered a blow when all but one of the linear actuator control boards failed. Owing to Dr. Ivan Bertaska's familiarity with the control system, the team was able to rework the control allocation on the vessel to utilize a single actuated thruster on the fly. In order to simplify any mitigation of potential failures, the team decided to work with controllers that are easily understood and updated.

a) Simple LOS

The simpleLOS system generates headings for the vehicle to follow from the arctangent of the error in the east direction divided by the error in the north direction:

    θ_desired = atan2(E_desired − E_actual, N_desired − N_actual)

A minimal sketch of this computation follows Figure 10.

Figure 10: Simple LOS Heading Generator
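The sketch below uses illustrative type and function names. atan2 is used rather than a plain arctangent so that the heading stays in the correct quadrant for every sign combination of the two errors.

    #include <cmath>

    struct Ned { double north_m; double east_m; };  // local NED position

    // Desired heading in radians, measured from north.
    double simpleLosHeading(const Ned& desired, const Ned& actual) {
        return std::atan2(desired.east_m - actual.east_m,
                          desired.north_m - actual.north_m);
    }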

b) Dynamic LOS

The simpleLOS generator only calculates error based on the vehicle's current position. Because the challenges include densely populated buoy fields, Team WORX felt it beneficial to have the ability to navigate back to the original trajectory should the vehicle lose its bearing. For this reason, the dynamicLOS system was implemented. The dynamicLOS generator provides the vehicle a list of waypoints that lead it back to its initial bearing.

Figure 11: Dynamic LOS Generator

14) High Level Mission Management and Path Planner

The Team WORX software package is built entirely on Robot Operating System (ROS) Indigo Igloo. The transition to this new system interface allows for more standardized integration compared to the previously used Lightweight Communications and Marshalling (LCM).

The high-level mission planner, the_planner, leverages ROS's communication abilities on a C++ backbone. the_planner uses a class-based approach in which each RobotX task has its own derived class that inherits from a base class. This allows all missions to have access to a centralized node handle, publisher, subscriber, and common data while still having individual, task-specific data and ROS functions. At start-up, a ROS launch file specifies which missions are to be run, reads in a .yaml file with pertinent start data, and then creates a dynamically allocated array with an object of each required class. To solve the array initialization issue of each class being a different type, the array is initialized as a double pointer, where each index is in fact a base-class pointer to an object of the required derived class. A sketch of this layout is given below.
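In the sketch, the class and function names are hypothetical, not the_planner's actual API; it only illustrates the structure described above.

    #include <ros/ros.h>

    class Mission {                      // common base class for every task
    public:
        explicit Mission(ros::NodeHandle& nh) : nh_(nh) {}
        virtual ~Mission() {}
        virtual void run() = 0;          // task-specific behavior
    protected:
        ros::NodeHandle& nh_;            // centralized node handle shared by all tasks
    };

    class DockingMission : public Mission {
    public:
        explicit DockingMission(ros::NodeHandle& nh) : Mission(nh) {}
        void run() { ROS_INFO("docking mission logic"); }
    };

    class PingerTransitMission : public Mission {
    public:
        explicit PingerTransitMission(ros::NodeHandle& nh) : Mission(nh) {}
        void run() { ROS_INFO("acoustic transit mission logic"); }
    };

    int main(int argc, char** argv) {
        ros::init(argc, argv, "planner_sketch");
        ros::NodeHandle nh;

        // The "double pointer": an array of base-class pointers, each index
        // holding a pointer to an object of the required derived class. In
        // the real system the mission list comes from the launch file and
        // the .yaml start data.
        const int n = 2;
        Mission** missions = new Mission*[n];
        missions[0] = new DockingMission(nh);
        missions[1] = new PingerTransitMission(nh);

        for (int i = 0; i < n; ++i) missions[i]->run();

        for (int i = 0; i < n; ++i) delete missions[i];
        delete[] missions;
        return 0;
    }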

The planner is currently on its third major update. Versions 1 and 2 were originally developed for the FAU entries to the 2015 and 2016 RoboBoat competitions and utilized JSON for configuring the order of missions, whereas version 3 fully transitions to ROS for all runtime parameters.

The initial plan for the 2016 Challenge vision system and high-level mission management consisted of two objectives: first, to convert the existing code from MATLAB to C++, and second, to transition the sensor suite to a Velodyne LiDAR. The first objective was intended to enhance the vehicle's high-level performance and provide full compatibility with the Robot Operating System (ROS). However, a shortage of C++ programming resources resulted in using Python (2.7) instead. The second objective was to modify the program to accommodate the new Velodyne LiDAR, which was intended to replace the Hokuyo unit used in previous competitions. The Velodyne features 16 concurrent scan lines, in contrast to the four provided by the Hokuyo. Setting the nodding gimbal for four vertical angles, instead of the previous 16, results in the same 64 lines total as previously obtained, albeit with greater speed, resolution and reliability. The unanticipated failure of the Velodyne shortly before the competition date, with insufficient time remaining for repair, required a fallback modification of the new code to reaccommodate the old Hokuyo using Python and ROS.

15) Vision-Based Perception System

In addition to typical camera information such as color, morphology, and the angular location of objects within the camera's field of view (FOV), a LIDAR/video vision system provides depth information. Depth is extremely important for ASVs, as it enables the use of advanced vision algorithms like simultaneous localization and mapping (SLAM) and object-based color analysis [4].


The level of autonomy required for the 2016 RobotX Challenge tasks is achieved by using the LIDAR/video vision system mounted on a rotating gimbal, shown in Figure 12. In this configuration, the LIDAR scanner (Hokuyo UTM-30LX) scans on a plane and is used to obtain distance information, while the video camera (Logitech Pro 9000) is used to obtain color and morphological information about the objects in the vision system's FOV.

Figure 12: Vision System Package utilizing a Logitech webcam and Hokuyo LiDAR

As an alternative to high-cost LIDAR systems capable of obtaining a two-dimensional depth map, our vehicle employs a planar LIDAR system (a LIDAR which collects range information from one plane in the scene being measured). The vision sensor feedback can be extended to 3D by adding a secondary rotation to the sensor [1,6,12,13]. The need for a rotating LIDAR generated the primary mechanical component of the vision system: a gimbal which enables LIDAR scanning. The LIDAR device is mounted on a rotating cradle (Figure 12). The cradle is mounted on a waterproof box that houses the forward-looking camera as well as a stepper motor (NEMA-17 with a 14:1 gearbox) which actuates the gimbal through a belt.

A potentiometer allows measurement of the gimbal angle, while a 3D printed waterproof cover protects the potentiometer and timing belt from the environment.

The vehicle's stand-alone vision system is designed with the capability to control the gimbal, measure data, and perform image-processing. The primary computational resource for this task is a Dell Precision 5510 laptop computer running a Linux operating system. The laptop is programmed using ROS and OpenCV.  Both the Hokuyo LIDAR device and the Logitech camera are connected directly to the laptop. The stepper motor used to control the gimbal is driven by a Phidgets 1067 control board and the gimbal angle is measured using a potentiometer read by the laptop, to which the stepper driver is also connected.

Typically, gimbaled LIDAR systems utilize raster scan patterns, where LIDAR data is gathered one scan line at a time with motion of the gimbal occurring between scans or at a relatively slow rate. This process can be very time intensive, depending on the desired resolution and LIDAR FOV. With this type of scan pattern, objects in the LIDAR image can become distorted as the vehicle turns. An example of this can be seen in Figure 13(b), where the target buoy appears tilted. This occurs because points in the depth image are not all collected at the same instant as the LIDAR scans the environment. To overcome this issue, one of the innovations implemented in the presented gimbaled LIDAR system is the use of Lissajous-like scan patterns, which allow a trade-off between speed and resolution without constraining the FOV [1]; an illustrative nod profile is sketched after Figure 13.

Figure 13: Example (a) camera and (b) depth images showing issues associated with motion
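As a rough illustration of the idea in [1], and not the team's actual scan parameters, the following generates a sinusoidal nod command whose frequency is deliberately incommensurate with the LIDAR's own scan rate (25 ms per scan for the UTM-30LX), so that successive scan lines interleave rather than repeat:

    #include <cmath>

    // Gimbal nod angle command (radians) at time t_s (seconds).
    // amplitude_rad sets half the vertical FOV; nodHz is chosen so that it
    // is not an integer divisor of the LIDAR scan rate (e.g., 1.3 Hz against
    // 40 Hz), which makes the pattern precess and fill the FOV over time.
    double gimbalAngle(double t_s, double amplitude_rad, double nodHz) {
        const double kPi = 3.14159265358979323846;
        return amplitude_rad * std::sin(2.0 * kPi * nodHz * t_s);
    }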

The LIDAR and video sensors produce two outputs that must be fused: 1) depth information, with corresponding gimbal and LIDAR angles, enabling the production of a depth image (Figure 14(a)); 2) an RGB image (Figure 14(b)). Figure 14(c) shows a detail of the approximate RGB image region from (a).

Figure 14: Example vision system images. (a) Depth image obtained from the LiDAR system. (b) RGB image obtained from the camera. (c) Detail image of the approximate image region in (a)

For buoy identification, the LIDAR/video fusion algorithm uses the depth image to identify objects of interest (not just buoys, but anything in the FOV of the LIDAR). It is advantageous to carry out the object identification in the depth image because floating objects are automatically isolated, both from the water (as the LIDAR is unable to return the distance to the water surface) and from the background (since LIDAR range is limited). As seen in Figure 14(a), this typically results in a small subset of points of interest (POI) in the depth image.

In order to fuse the LIDAR and video images, the locations of the POI need to be transformed into RGB camera pixel coordinates. This is done by converting the LIDAR points to Cartesian XYZ coordinates, translating the origin to that of the video camera, and finally calculating the pixel values through use of an intrinsic model of the video camera. This typically results in a sparse mapping of depth points to the pixel frame, as the camera has significantly higher resolution. To better correlate the object being detected in the depth image with its corresponding object in the RGB image, the depth information is thresholded for a desired range and the remaining points are joined to make a continuous area using morphological operations (Figure 15(c)). Mathematical details can be found in [14,15]; a sketch of the geometric mapping follows Figure 15.

Figure 15: Example images showing (a) the original RGB image, (b) LIDAR data plotted in the camera coordinates, and (c) joined LIDAR data, creating a region of interest in the camera image
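The sketch below walks through that mapping under simplifying assumptions: a pinhole camera model, a pure translation between the LIDAR and camera origins, and placeholder values for the lever arm and intrinsics (the vehicle's real calibration and the full treatment are in [14,15]).

    #include <cmath>

    struct Pixel { double u; double v; };

    // r: LIDAR range; alpha: in-plane beam angle; beta: gimbal (nod) angle.
    Pixel lidarToPixel(double r, double alpha, double beta) {
        // Spherical -> Cartesian in the LIDAR frame (x forward, y left, z up).
        double x = r * std::cos(beta) * std::cos(alpha);
        double y = r * std::cos(beta) * std::sin(alpha);
        double z = r * std::sin(beta);

        // Translate into the camera frame (placeholder lever arm, metres).
        double xc = x - 0.00, yc = y - 0.05, zc = z + 0.10;

        // Pinhole projection with placeholder intrinsics fx, fy, cx, cy.
        const double fx = 700.0, fy = 700.0, cx = 320.0, cy = 240.0;
        Pixel p = { cx + fx * (-yc / xc),    // u grows rightward (= -y)
                    cy - fy * ( zc / xc) };  // v grows downward  (= -z)
        return p;
    }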

Heavily coupled to the vision system is the vehicle's Simultaneous Localization and Mapping (SLAM) capability. SLAM combines sensor measurements and control inputs to form an estimate of the robot's pose as well as of features within the robot's environment. For the purposes of the boat, these features are the buoys in the water. The particular variant of SLAM used in this implementation is FastSLAM [11,17]. Without an internal map of its surroundings, the vehicle can only react to sensor measurements (e.g., video, LIDAR) at the current time. To provide a higher level of functionality, the vehicle must be able to combine past sensor readings with current sensor data and form an estimate of important features in its environment. With an internal map of the environment, the vehicle can perform tasks such as path planning with obstacle avoidance and recursive color estimation for particular objects, and it will have more accurate pose estimates.

16) Anodizing

Our team eagerly took advantage of the opportunity to anodize aluminum parts. As parts were fabricated by team members in the machine shop, they were anodized in house to further shield them from corrosion. Parts were first sanded and polished to remove imperfections and to ensure the desired finish. Some parts had previously been tested on the vehicle and exposed to salt water; the sanding and polishing process removes the oxide layer acquired from exposure to the outside environment. All parts, including cut stock metal, must be sanded prior to being submerged in a strong NaOH (base) bath to prepare for the anodizing process. It was noted that aluminum forms a black layer (becomes one tone darker) once it is removed from the NaOH bath. Parts are then rinsed with distilled water and immediately placed into a separate bath of diluted sulfuric acid (battery acid) to be anodized. The part to be anodized is connected to the positive terminal of a power supply. The negative terminal is connected to a large lead plate, which covers an entire side of the bath. To facilitate even anodization, an air hose circulated the diluted solution around the lead plate and the parts being anodized. The surface area of each part determined the amperage and time required to coat it evenly.

Figure 16: Port side thruster mount showing products of the anodization process

While the part is being anodized, the desired color of dye in an aqueous solution is heated to above 160 but no greater than 200 degrees Fahrenheit. Using this process, our best results occurred at 180 degrees Fahrenheit. Once the part has been removed from the anodizing bath, it is once again rinsed with distilled water and placed into the dye bath for twenty minutes. After the part is removed from the dye bath, it is rinsed with distilled water and fully immersed in water kept above 200 degrees Fahrenheit. This portion of the process seals the anodizing and the dye to the metal part. While this procedure was repeated for multiple parts using the same process, one part in particular refused to accept dye; the anodizing process was completed and the part was cured, but it remains without color as an anomaly of our process. In addition, the dyeing process faced challenges in maintaining the dye bath at a consistent concentration for each part. This challenge contributed to the differing colors exhibited by some of the anodized components on the vehicle; even so, the capability has been a much-welcomed addition to the team's repertoire. Team members primarily responsible for the anodization process are ocean engineering students Christina Tsai and Luke Ridley.

C. Experimental Results

FAU's SeaTech campus provides the benefit of literally being on the water. Pushing toward water testing has been a driving force for the team, based on the experience of systems working beautifully in the confines of a lab and then failing mysteriously as soon as they are placed on the water. The process for launching the WAMV consists of bringing the vessel to a davit less than 100 yards from the lab in which it is stored.

To overcome the 1,000 miles between the Villanova and Florida Atlantic University campuses, team members from VU travelled to FAU twice during the lead-up to the competition. Prior to these trips, the two teams detailed milestones to be achieved during the water trials. In addition to testing at FAU, the VU team members also performed testing using the platform developed for the 2015 RoboBoat competition. This second USV afforded the ability to verify sensor drivers and algorithms in an environment nearly identical to the true use case.

In all, Team WORX performed approximately 120 hours of on-water testing. Much of this time was spent fine-tuning control algorithms and verifying the finite state machine, with significant portions also devoted to USBL testing and high-level mission logic.

Static analysis of the USBL launch and recovery system was conducted for the case in which the boom is held in the horizontal position, and the system was determined to have a safety factor of 4.5. This extra lift capacity is advantageous should the acoustic boom become entangled with a small object.

D. Acknowledgements

Team WORX is increasingly grateful for the technical expertise and support from the Florida Atlantic and Villanova University Staff: Ed Henderson, John Kielbasa, Tony Lavigne, and Fred Knapp of FAU provided invaluable guidance in both electrical and mechanical design. As always, thank you.

E. Appendix—Situational Awareness

Team WORX developed a ground station using ROS's RVIZ package. This GUI incorporates a dynamically updated map of the vehicle's position, on-screen readouts of the vehicle's NED location, and a dynamic model of the vessel. The incorporated dynamic model and base functionality of the GUI allow the team to select a point on the map, publish this point to the vehicle, and then view both the simulated response and the physical response.

F. References

[1] Anderson J. W. and Clayton G., "Lissajous-like scan pattern for a gimbaled LIDAR," 2014 Conference on Advanced Intelligent Mechatronics (AIM).

[2] Austin, Thomas. The Application of Spread Spectrum Signaling Techniques to Underwater Acoustic Navigation. Woods Hole, MA. Print.

[3] Burdic, William. Underwater Acoustic System Analysis. 2nd ed. Los Altos Hills, California: Peninsula Publishing, 2002. 361-380. Print.

[4] Huang S., Wang Z., and Dissanayake G., "Simultaneous Localization and Mapping: Exactly Sparse Information Filters," World Scientific Publishing Co., Danvers, MA, 1994.

[5] H. G. Sage, M. F. De Mathelin and E. Ostertag, "Robust control of robot manipulators: a survey," Int. J. Control, 1999.

[6] Huh S., Shim D. H., and Kim J., "Integrated navigation system using camera and gimbaled laser scanner for indoor and outdoor autonomous flight of UAVs," 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

[7] M. Miranda, P.-P. Beaujean, E. An, M. Dhanak, “Homing an Unmanned Underwater Vehicle Equipped with a DUSBL to an Unmanned Surface Platform: A Feasibility Study”, Proc. of the MTS/IEEE Oceans’2013, San Diego, CA, Sept. 2013, pp. 1-10.

[8] M. Miranda, “Mobile Docking of Remus-100 Equipped with USBL-APS To An Unmanned Surface Vehicle: A Performance Feasibility Study”, M.S. Thesis, Florida Atlantic University, Boca Raton, FL, May 2014.

[9] O. J. Sordalen, "Optimal thrust allocation for marine vessels," Control Eng. Prac., vol. 5, no. 9, pp. 1223-1231, 1997.

[10] T. I. Fossen and J. P. Strand, “Tutorial on nonlinear Backstepping: Applications to Ship Control,” Modeling, Identification and Control, 1999

[11] Thrun S., Burgard W., and Fox D., Probabilistic Robotics, Cambridge, The MIT Press, 2006

[12] Wulf O. and Wagner B., "Fast 3d scanning methods for laser measurement systems," 2014 International conference on control systems and computer science (CSCS14).

[13] Yoshida T., Irie K., Koyanagi E., and Tomono M., "3d laser scanner with gazing ability,” 2011 IEEE International Conference on Robotics and Automation (ICRA).

[14] Anderson Lebbad, Nicholas DiLeo, Chidananda Matada Shivananda, Frank Ferrese and C. Nataraj, "Probabilistic Vision Based Guidance & SLAM for Autonomous Surface Vehicles", Naval Engineers Journal, December 2014.

[15] J. Wes Anderson, Anderson Lebbad, C. Nataraj and Garrett Clayton, "An Integrated LIDAR/Video Vision System for Autonomous Surface Vehicles", Naval Engineers Journal, December 2014.

[16] Edward Zhu, Dylan DeGaetano, GinSiu Cheng, Priya Shah, Michael Benson, Gus Jenkins, Frank Ferrese, and C. Nataraj, "An Experimental Autonomous Surface Vehicle With Vision-Based Navigation, Obstacle Avoidance & SLAM", Naval Engineers Journal, December 2014.


[17] Nicholas DiLeo, “FastSLAM: Mapping for an Autonomous Surface Vehicle,” Independent Study report, Villanova University, May 2014.