
ECE 492/592 - Team A2

Final Report

Matthew Amar

Chris Brown

Robert Merholz

Matthew Richardson

Cecile Sadler


Abstract

The goal of our project was to design an aircraft that would autonomously detect and drop a payload on a target. Our design uses a fixed-wing FPV aircraft as the delivery platform. We chose an airplane because it offered an exciting challenge, a large carrying capacity, and a significantly longer flight time than a multicopter. The aircraft carries a PixHawk flight controller, GPS module, Pitot tube, camera, ODROID XU4 companion computer, 2.4 GHz wireless access point, RC receiver, and all of the components needed to power them. We chose OpenCV as our computer vision software to detect the target, with commands sent over MAVLink from a Python script. Once the target had been identified, we dropped the payload using a servo-actuated release mechanism.


Introduction

Our problem centers around the autonomous detection of a target and accurate payload deployment onto that target. This problem has numerous applications in areas such as disaster relief, humanitarian aid, and search and rescue. If fully realized, this platform could be used to deliver supplies to areas where organized support efforts might take too long to arrive or that are simply inaccessible to larger aircraft. Another potential use would be to attach cellular hardware to the aircraft and loiter over affected areas, providing temporary cellular coverage to disaster zones. The advantage of this system is that many platforms can be built quickly and cheaply, which means a larger relief area can be covered in a shorter amount of time compared to traditional options. In addition, because this is a fixed-wing aircraft it provides greater endurance and payload capacity than its multicopter counterparts, but it sacrifices maneuverability as a result.

Related Work

Amazon Prime Air Drone Service: The process of searching for a target and then dropping a payload at that specific location is already providing solutions in consumer package delivery, military operations, and disaster relief. Amazon Prime Air is an example of existing development in this realm. Its autonomous vehicles are helicopter/airplane hybrids designed for urgent orders, particularly to rural areas within a 15-mile line of sight. The first proof-of-concept flight occurred in December 2016, when an actual package was delivered to a customer in Cambridge, England [2]. The carrying capacity is five pounds, but the drone can accommodate a wide range of package sizes. It also uses an Amazon-branded landing pad that acts as a homing device and provides a safe delivery zone once the target has been locked. Prime Air is expected to continue to grow as Amazon expands its distribution sites and finds new ways to implement sensing and collision-avoidance technologies.

Humanitarian Aid Assistance: The concept of the "humanitarian drone" has been understood as a set of contested representations of technology, and of technological functions, intended to meet some of these assistance needs [3]. Drones are being used in similar ways to offer assistance in times of crisis. A company known as Matternet is working to deliver medical supplies and food to areas before assistance in the form of manpower can arrive. Currently, their UAVs can transport a two-kilogram load over a ten-kilometer distance in 15 minutes. The goal of the project is to improve healthcare service for poor and low-income populations in particular regions of the world. This use case for drones is becoming increasingly popular as more areas of need are discovered [1].

Aerial Mine Detection and Detonation: The removal of anti-personnel mines left over from past wars is of great concern to many of today's developing nations. Surveying landscapes with autonomous aerial vehicles reduces the risks inherent in performing minesweeping operations with either ground robots or trained individuals. A team of undergraduates from the Robotics Engineering Program at Worcester Polytechnic Institute created an autonomous multicopter capable of deploying sandbag payloads for the purpose of detonating marked PMN-1 landmines [4]. Another group of researchers, from George Mason University, has investigated the potential of having UAVs identify landmines on their own through the use of ground-penetrating radar [5]. The combination of such techniques in a cost-effective manner can potentially help improve the lives of populations working to reclaim old battlegrounds.


Approach

To accomplish the task of target identification and payload delivery, we assembled a fixed-wing aircraft capable of carrying and deploying a small toy football in a manner similar to a conventional level bomber. In the level bomber approach the drone flies parallel to the ground while both searching for the target and performing the bombing run, so the onboard computer must identify a rectangular target from level flight. This approach was the most feasible for our experience level and timeframe, but it offers lower payload accuracy than a dive-bombing method, which could potentially release the payload closer to the desired target.

For our airframe we chose an FPV platform, as FPV aircraft typically provide the most stable, level, and predictable flight characteristics. We needed the aircraft to be very stable so that the camera, mounted at the front of the aircraft, could produce the highest-quality video and still frames. After researching many different FPV platforms, we chose the Volantex Ranger EX because it was designed to hold a large payload underneath it and had plenty of clearance between the ground and the fuselage for mounting a payload.

Once the airframe was selected, we began researching which autopilot board would best fit our approach. We needed a board that could interface with all of the sensors and actuators on the plane while running the ArduPlane software without any risk of congestion. After looking at different BeagleBone boards, such as the BBMini and the BBBlue, we decided to use a PixHawk due to its ease of use, superior IMU, and the extensive documentation available for it. The PixHawk allowed us to connect most of our peripheral devices directly to preconfigured input buses. A number of standalone peripherals designed for the PixHawk, such as power regulators and airspeed sensors, were also available and could be tested quickly.

When initially exploring methods for releasing the payload, our first approach was a magnetic release that would be triggered upon target identification. Several issues resulted from this. A magnet strong enough to hold an ideal payload typically weighs more than the payload itself. We considered a 12 V DC electromagnet mounted on the body of the plane, with a smaller permanent magnet or piece of ferromagnetic material attached to the payload. Complications arise because current must flow continuously to keep the payload attached, drawing substantial current from the power source from takeoff until payload release. The weight of the magnet in addition to the payload would introduce further difficulties during takeoff, as the plane would be much heavier than necessary. Lastly, dropping large metal objects from a height of several meters in a public park's airspace posed a serious safety risk. All factors considered, the magnetic release approach was discarded; the fixed-wing autonomous bomber design favored a servo release instead.

Figure 1: Servo Release Mechanism


Our implementation of this approach uses the following components:

Part Name                     | Number | Link
Volantex Ranger EX Airframe   | 1      | Ranger
PixHawk 2.4.8                 | 1      | PixHawk
ODROID XU4 Companion Computer | 1      | ODROID
40A ESC                       | 1      | ESC
Flight Surface Servos         | 6      | Servo
Drop Mechanism Servo          | 1      | Servo
GPS Module                    | 1      | GPS
FrSky X8R Receiver            | 1      | X8R
Taranis X9D Transmitter       | 1      | Taranis
ELP USB Webcam                | 1      | Camera
TP-LINK Wireless USB Adapter  | 1      | WiFi Adapter
Pitot Tube                    | 1      | Pitot Tube
I2C Splitter                  | 1      | I2C Splitter
FPVDrone Power Module V1.0    | 1      | Power Module
5V/5A Step-Down Regulator     | 1      | Regulator
4000 mAh 4S Battery           | 1      | Battery
3DR Radio Telemetry           | 1      | Radio Telemetry
Motor                         | 1      | Motor
Propeller                     | 1      | Propeller
Payload                       | 1      | Payload

Table 1: UAV Hardware Components

Inside the cockpit of the aircraft we stored the PixHawk, ESC, ODROID, battery, and power regulators. Mounted to the nose of the aircraft was the GPS module; mounted underneath were the camera, bomb deployment mechanism, and RC receiver; and mounted to the sides of the aircraft were the wireless access point, radio telemetry module, and the Pitot airspeed sensor. Much of the cabling for these components was routed through pre-cut holes in the fuselage, and any excess was taped down to avoid interference.


Figure 2: Hardware Block Diagram

To offload work from the autopilot board and to allow us to perform real-time image processing on our camera data, we needed a secondary "companion computer" board. As with most of our electronics, we spent some time evaluating the various embedded microcomputers available to us. A relatively fast computer capable of running an easy-to-use Linux distribution was ideal for this project, as it would let us quickly configure all of the software and interfaces needed to complete the project. We also wanted to ensure that the device we selected would be easy to power from the same supply as our autopilot. Two of the boards we considered were the Raspberry Pi 3B and the ODROID XU4. The RPi is a fairly popular system with great compatibility with most of the project-critical Linux software, making it a safe choice if we needed a variety of different software packages. However, the RPi suffers from a lack of onboard memory and a relatively slow CPU, which can slow down image processing and make the desktop environment perform poorly. We instead selected the ODROID as our companion board, which uses the ARM big.LITTLE architecture to accelerate intensive applications while conserving power under low CPU load. This selection allowed us to configure a standard desktop environment for developing the software needed to complete the project.

We decided early on that the easiest target to detect consistently from the air would be a large, rectangular tarp with a solid, vibrant color. To identify the designated target, we used the OpenCV 3.4.1 libraries to process data from our camera. Common OpenCV techniques such as BGR-to-HSV image transformations and contour detection allowed us to rapidly take a frame and isolate any rectangular objects matching the target's color range.

Further processing could then be done to reject contours that are similar to the expected target but do not match the scene in which the target should appear.

Calibrating the camera with OpenCV had to be done several times throughout the project, and in multiple stages. The first calibration stage removed the radial distortion caused by the camera's fisheye lens. This was done using the "chessboard" method: taking a series of snapshots of a chessboard pattern and having OpenCV estimate the lens's distortion coefficients. After remapping the image, several additional filtering steps were used to eliminate hues outside of the desired range. Video recordings were made from the air to obtain test data, which we then used to derive a range of HSV values for the filter. This range had to be adjusted over time as we selected tarps of different colors to act as the target and shifted the desired altitude of the plane. We settled on a large blue tarp and a desired altitude of about 120 feet for our flight testing, which is where the final HSV filter values come from. The target detection algorithm is further detailed in Figure 3.
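As a concrete illustration of this calibration stage, the following is a minimal sketch of the chessboard method in OpenCV; the 9x6 interior-corner pattern and the calib_*.jpg snapshot filenames are assumptions for illustration, not our exact setup.

```python
import glob
import cv2
import numpy as np

# Interior corner count of the printed chessboard (9x6 assumed here).
PATTERN = (9, 6)

# 3D reference points for one chessboard view, lying on the z = 0 plane.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

obj_points, img_points, shape = [], [], None
for path in glob.glob("calib_*.jpg"):            # hypothetical snapshot names
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    shape = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Solve for the camera matrix and distortion coefficients of the fisheye lens.
ret, mtx, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, shape, None, None)

def undistort(frame):
    """Remap a raw distorted frame before any target detection is attempted."""
    return cv2.undistort(frame, mtx, dist)
```

The camera matrix and distortion coefficients only need to be computed once; every incoming frame can then be remapped before the color filtering steps.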

Figure 3: Target Detection Software Flowchart
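The per-frame detection loop summarized in Figure 3 can be sketched roughly as follows; the HSV bounds and the minimum-area cutoff shown here are illustrative placeholders, not the final tuned values obtained from our flight footage.

```python
import cv2
import numpy as np

# Illustrative HSV bounds for a blue tarp; the tuned values came from
# aerial test footage and are not reproduced here.
LOWER_BLUE = np.array([100, 120, 80])
UPPER_BLUE = np.array([130, 255, 255])

def find_target(frame):
    """Return the bounding box of the largest roughly rectangular blue blob, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_BLUE, UPPER_BLUE)
    # [-2] keeps this compatible with both OpenCV 3.x and 4.x return signatures.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    best = None
    for c in contours:
        if cv2.contourArea(c) < 500:             # ignore tiny specks of blue
            continue
        # A four-vertex polygon approximation suggests a tarp-like rectangle.
        approx = cv2.approxPolyDP(c, 0.04 * cv2.arcLength(c, True), True)
        if len(approx) == 4:
            if best is None or cv2.contourArea(c) > cv2.contourArea(best):
                best = c
    return cv2.boundingRect(best) if best is not None else None
```

Approximating each contour with a four-vertex polygon is what lets the loop prefer tarp-like rectangles over irregular patches of similar color.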

Our initial design called for a confirmation protocol before performing a bombing run once a target was detected. Once we had determined that the target was in view of the camera, we wanted to identify where the target was relative to the plane and create a series of waypoints that would let the plane fly directly over it or loiter around it. After a target was detected, a signal would be sent from the RC controller or a base station to the plane to confirm whether the identified target was correct. Once given the go-ahead, the plane would plot a course taking it directly over the target before releasing the payload at the appropriate time. However, due to various time constraints and setbacks, we decided to drop the user's involvement in the decision-making process as well as the waypoint path planning, and instead simply focused on dropping the payload when the target was detected directly below the plane. To ensure that we only dropped the payload on the appropriate target, we checked conditions such as the orientation of the plane and the number of valid frames containing the target found over a short time period.

This method is detailed in Figure 4 below. To communicate with the PixHawk, an instance of the MAVProxy interface was spawned on the ODROID. The DroneKit API was used to send and parse MAVLink messages to and from the MAVProxy instance. These messages included RC channel overrides for deploying the payload, queries for position and orientation data, and flight-mode checks. Additional message types, such as setting waypoints, altitude thresholds, and target speed, were available but could not be used within our development time.

Figure 4: Vehicle Control Software Flowchart
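A minimal DroneKit sketch of this control path might look like the following; the UDP endpoint, the attitude tolerances, and the PWM values are assumptions for illustration, and channel 7 is the release channel described in the results below.

```python
from dronekit import connect

# MAVProxy was configured to forward MAVLink locally; this endpoint is an
# assumption for illustration, not the exact string we used.
vehicle = connect("udp:127.0.0.1:14550", wait_ready=True)

def ok_to_drop():
    """Gate the release on the checks described above: AUTO mode and a level attitude."""
    att = vehicle.attitude                      # roll/pitch/yaw in radians
    level = abs(att.roll) < 0.15 and abs(att.pitch) < 0.15
    return vehicle.mode.name == "AUTO" and level

def release_payload():
    """Drive the drop-mechanism servo by overriding RC channel 7."""
    vehicle.channels.overrides['7'] = 2000      # open the hook (PWM value illustrative)

def reset_release():
    vehicle.channels.overrides['7'] = 1000      # close the hook again
```

In a sketch like this, release_payload() would only be called once ok_to_drop() and the valid-frame count check from the detection loop have both passed.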

Analysis and Results

Throughout the project the plane encountered several obstacles, mainly variables that changed with continued testing and accumulated over time. In early flight testing a single 3S battery powered the vehicle; the final design used a single 4S battery, which proved the better option for delivering the desired power to the motor. Consecutive flights caused the battery voltage to sag, resulting in decreased battery life and flight time, and the cells drained quickly despite being balanced prior to each flight.

Furthermore, finding an appropriate center of gravity became difficult as more components were added to the drone. During initial flights after constructing the frame, the drone was tail-heavy, which led us to shift parts toward the nose to balance the weight. Packing components into the front became problematic, as we discovered that the battery placement could make the plane nose-heavy. This fluctuating center of gravity contributed to issues with takeoff and landing, leading to several crashes. As the crashes accumulated, we observed impact damage to the landing gear and the drop mechanism. The USB camera also suffered wire strain and eventually had to be replaced. Additionally, after multiple failed takeoffs we discovered that the wheels on the landing gear would self-tighten upon landing, causing the plane to overturn. A combination of the skewed center of gravity, the uneven flight terrain, and the wheel irregularities led to variations in takeoffs and landings that resulted in component damage and inconsistent flight performance.

While the PixHawk hardware was overall very easy to use, certain aspects of Mission Planner gave us trouble as we tried to set up waypoint navigation and auto mode. For autonomous flight we used Mission Planner to define specific waypoints making up a mission for the plane to fly. We started with a grid pattern so that the plane could see the field in its entirety while searching for the target. The initial plan was to find the target and then have the plane automatically set waypoints back over it to drop the payload. Our plane was able to follow the waypoints to some extent, but not reliably enough for us to set up automatic waypoints. With a fixed-wing plane it was very difficult to hit certain waypoints because of the turn-radius limitations of our design. Our frame was designed for gliding, not aerial maneuvers, so making sharp turns within the small field was not feasible. Given the small field, we had to set waypoints that were close together; this caused some waypoints to be missed, which would then push the plane off track. We also tried to adjust the acceptance radius of the waypoints themselves, but we were unable to find the ideal value. If we made the radius too small, the plane would miss the waypoint and be forced to maneuver back to reach it. In an ideal implementation, smaller waypoint radii would allow for more accurate flight, though with wind and other factors this was never the case. On the other hand, if we made the radius too large we ran the risk of waypoints overlapping, causing the plane to potentially reach multiple waypoints at once and disrupt the mission.

Figure 5: Example Flight Plan
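For reference, a short waypoint leg and the acceptance radius could also be pushed from the companion computer rather than through the Mission Planner GUI. The sketch below uses DroneKit with placeholder coordinates, altitude, and radius; it assumes the same connection endpoint as the earlier sketch and is not the mission we actually flew.

```python
from dronekit import connect, Command
from pymavlink import mavutil

vehicle = connect("udp:127.0.0.1:14550", wait_ready=True)   # endpoint assumed, as above

# Waypoint acceptance radius in metres; the ideal value was never found in
# testing, so this number is purely a placeholder.
vehicle.parameters['WP_RADIUS'] = 30

cmds = vehicle.commands
cmds.clear()
# MAV_FRAME_GLOBAL_RELATIVE_ALT interprets altitude relative to the home point,
# avoiding the "Absolute" altitude pitfall described below.
frame = mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT
for lat, lon in [(35.7700, -78.6740), (35.7705, -78.6745)]:  # placeholder coordinates
    cmds.add(Command(0, 0, 0, frame,
                     mavutil.mavlink.MAV_CMD_NAV_WAYPOINT,
                     0, 0, 0, 0, 0, 0,
                     lat, lon, 37))                          # roughly 120 ft above home
cmds.upload()
```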


In a few consecutive test flights we also had a mysterious issue: when we flipped the plane into auto mode, the flight controller would cut all power to the motor and the aircraft appeared to want to glide down to an unknown altitude. Even after opening the .tlog (telemetry log) files and examining the sequence of events directly after switching into auto mode, we were unsure what was causing the plane to apparently glide toward its demise. Further investigation determined that the culprit was the frame of reference Mission Planner had been using for its altitude. In the top-left corner of the flight plan window there is a drop-down box in which you can select Absolute, Relative, or Terrain. It became obvious once we discovered these options that, for the previous flights, the selected option had been Absolute. This meant our plane had been trying to glide down to approximately 100 feet above sea level (an absolute altitude of 100 feet). As the city of Raleigh sits about 315 feet above sea level, the autopilot was effectively trying to reach a point roughly 200 feet below the ground.

The image processing aspect of our project worked surprisingly well, both in the lab and from the air. When properly calibrated, the camera was able to clearly differentiate the blue tarp from the surrounding ground while also selecting frames that met the criterion of containing a rectangular target, as seen in Figure 6. Issues did initially arise when we viewed scenes with large amounts of blue content, such as when the plane flew over a parking lot; this caused the color detection code to identify targets we did not anticipate. We corrected this by first applying a stricter filter that narrowed the allowed ranges for saturation and value. After this step, we evaluated how much of the desired hue was present in the frame and discarded frames that exceeded a maximum percentage of that hue.
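That frame-level rejection step can be sketched as follows; the 35% ceiling is an illustrative value, as the threshold we actually settled on is not recorded here.

```python
import cv2
import numpy as np

MAX_BLUE_FRACTION = 0.35   # illustrative ceiling; the tuned value is not documented

def frame_is_plausible(frame, lower, upper):
    """Reject frames (e.g. a parking lot) where far too much of the scene falls
    inside the target hue range to plausibly be a single tarp."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)
    fraction = cv2.countNonZero(mask) / float(mask.size)
    return fraction <= MAX_BLUE_FRACTION
```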

Figure 6: Processed Frame with Detected Object

As previously mentioned, there was a flight in which a parking lot was identified as a possible target, and an unexpected release occurred as a result, as seen in Figure 7. While we were never able to drop the payload from the air onto the desired target during our test flights, this was primarily because our auto-mode navigation was never properly configured, or because a flight had to be aborted before the target could be seen.

We were, however, able to demonstrate the entire codebase working from the ground when the plane was carried over a blue object.

Figure 7: Comparison of HSV values for Target and False Positive Frames (Note <10% difference in Hue)

In our implementation of the drop mechanism, channel 7 of the RC transmitter was linked to the release servo through the PixHawk. This provides a simple switch that opens or closes the hook holding the payload. When flying in manual mode, this feature proved a reliable way to release the payload at will. However, when testing in autonomous mode with the onboard code, there was a significant delay between setting the channel override and the actual release. This could yield a missed target, as the drop would occur moments after it should have.

Conclusion

Our project was designed to find and drop a payload on a target using a fixed-wing aircraft. While we were not able to achieve our overall goal, we made good progress on many of the individual components necessary for a project of this nature. Our computer-vision target detection worked well, as did our payload deployment mechanism. One area that needed improvement was the autonomous mission planning portion of the project: in none of the flights did the go-to-waypoint navigation work perfectly, though this was likely due to high winds and errors regarding altitude and waypoint radius size. If we were to continue this project, we would add a gimbal underneath the plane and mount the camera on it to reduce vibrations and ensure a constant viewing angle toward the ground.


References

[1] K. B. Sandvik and K. Lohne, "The Rise of the Humanitarian Drone: Giving Content to an Emerging Concept," Millennium: Journal of International Studies, vol. 43, pp. 145-164, 2014.

[2] L. Johnson, Digital Spy, 7 February 2017. [Online]. Available: http://www.digitalspy.com/tech/feature/a820748/amazon-prime-air-drone-delivery-service.

[3] "Aid & International Development Forum," 7 July 2016. [Online]. Available: www.aidforum.org/topics/disaster-relief/drone-technology-revolutionising-disaster-relief.

[4] A. Lockman et al., "De-Mining with UAVs," WPI Electronic Projects, n.d. [Online]. Available: https://web.wpi.edu/Pubs/E-project/Available/E-project-031216-115612/unrestricted/de-mining-with-uavs.pdf. [Accessed May 4, 2018].

[5] A. Goad et al., "Landmine detection utilizing an unmanned aerial vehicle," 2008 IEEE Systems and Information Engineering Design Symposium, Charlottesville, VA, 2008, pp. 231-236.