ECE 4902 Spring 2017 Final Report
Autonomous Battery Charging of
Quadcopter
Team 1722:
Thomas Baietto – Electrical Engineering
Gabriel Bautista – Computer Engineering
Ryan Oldham – Electrical Engineering
Yifei Song – Electrical Engineering
Sponsor: Ashwin Dani, Ph.D., Assistant Professor
Electrical and Computer Engineering
University of Connecticut
347 Fairfield Way, U-4157 ITE-Building
Storrs, CT 06269
E-mail: [email protected]
Contents
1. Introduction
   a. Summary
   b. Background
2. Platform: Parrot AR.Drone 2.0
   a. Choosing a Drone
   b. Hardware
   c. Application Programming Interface
3. Solution
   a. Design Limitations
   b. Software Technical Design
   c. Hardware Technical Design
4. Project Plan
   a. Budget
   b. Timeline
5. Conclusion
   a. Summary
   b. Future Work
   c. Appendix - Python Code
6. References
1. Introduction
Summary
Quadcopters have a limited flight time due to current battery supply limitations.
Commercially available quadcopters typically last anywhere from 20 to 30 minutes on a single
battery charge. Adding more batteries is not a desirable solution because increasing payload
weight will consequently consume more power, ultimately reducing battery life and flight time.
Our Design Team has developed a UAV system, specifically a quadcopter, which can
autonomously recharge itself. This means that the quadcopter is able to recharge its battery when
needed without any human interaction.
For this design project, we set out to develop an autonomous system that navigates
our quadcopter to a charging station and then recharges its battery. The project therefore splits
into two parts. The first is the software design, which enables autonomous flight and docking of
our quadcopter on the charging station. The second is the hardware design of the charging
system, which must recharge the drone's battery in a timely manner. Our main goal was to
design a fully autonomous battery-charging quadcopter system.
Background
Unmanned Aerial Vehicles (UAVs) provide aerial surveillance at an affordable cost with
easy-to-learn controls. UAVs have grown increasingly popular over the last decade, creating
high demand for continuous improvements to their functionality. Current UAV technology is
severely restricted by battery storage capacity and requires frequent human interaction between
sessions to manually recharge the battery.
A quadcopter is a class of UAV similar in design to a helicopter. Quadcopters consist
of four vertically oriented rotors attached to a frame, each driven by its own motor. Flight is
made possible by altering the thrust generated by each rotor according to feedback collected
from the onboard sensors. Software can be implemented to automatically send commands to the
quadcopter's motors based on this sensory information. Such an automated system can be
developed to navigate our quadcopter to a nearby recharging station. The quadcopter will then
connect with the recharging station, which supplies power to the battery. Finally, once fully
charged, our quadcopter will resume automated flight.
An important aspect of our project is the environment in which our system will
operate. Our first test will take place completely indoors in an empty room. The quadcopter
will be stationed at one end of the room and the charging station will be placed at the other.
The goal is to have the quadcopter successfully locate and dock itself upon the charging station
on the opposite side of the room. The drone will then recharge its battery and take off again. All
of this will be done without any human interaction.
After achieving this first task, we will have a platform on which to develop a more
refined procedure in future work. That is, we will then need to consider a design that handles
obstacles in the room, such as furniture, and, more importantly, a system that copes with
weather conditions.
For this project to reach real-world applications, weather must be considered;
otherwise the autonomous charging design is useless outdoors. To move forward with the
design, we first achieved a fully autonomous battery-charging quadcopter system in an empty
indoor room. Having solved this initial problem, we now have a solid foundation for the future
development of a system that works in all environmental conditions.
Below is a visual representation of our project.
Figure 1. Project Visual. The drone will be placed in an empty room on the opposite side of the charging station.
2. Platform: Parrot AR.Drone 2.0
Choosing a Drone
To begin the design, our team first needed to select a quadcopter to work with. The
Systems Lab at UConn had two quadcopters readily available for our use, the Parrot AR.Drone
2.0 and the Iris 3DR. We decided to choose between these two because they were free of cost
and easily accessible. A comparison of the two drones is shown below.
Figure 2. Comparison Chart of AR.Drone 2.0 and Iris 3DR
The chart highlights the most important factors. Both drones have an open-source
Application Programming Interface (API), providing third-party developers a platform for
application design. We chose the AR.Drone 2.0 for two main reasons. First, the AR.Drone 2.0 is
equipped with two onboard cameras, one facing forward in the horizontal direction and one
facing downward in the vertical direction. The Iris 3DR requires an external camera, such as a
GoPro attached to its payload, and only offers a horizontal field of view. The AR.Drone 2.0's
cameras are very useful for the object tracking used to navigate to and dock on the charging
station. The second major factor was GPS: the AR.Drone 2.0 Flight Recorder accessory adds a
GPS sensor and 4 GB of flash storage for recording GPS and flight data, while the Iris 3DR has
no GPS. Since our first design targets a quadcopter flying indoors, where GPS is not feasible,
only image processing was used. However, as discussed earlier, GPS could be used in an
outdoor environment with unknown weather conditions.
Hardware
The AR.Drone 2.0 is a remote-controlled
consumer quadcopter developed by Parrot.
The body is a carbon-fiber tube structure
with high-resistance plastic. A protective
hull made of expanded polypropylene foam
is both durable and lightweight, and
provides protection during indoor flights.
The propellers are driven by four 15 W
brushless motors (28,500 RPM). Energy is
provided by a lithium-polymer battery with
a capacity of 1,000 mAh, which allows a
flight time of approximately 10 minutes.
The AR.Drone 2.0 carries an internal computer, specifically a 1 GHz 32-bit ARM Cortex-
A8 processor running a custom Linux operating system. The onboard computer and sensors
allow automatic take-off, hovering at constant altitude, and landing. This is accomplished by
feeding sensor data back to the control system; the feedback is required because without it the
quadcopter is an unstable system. As mentioned earlier, the AR.Drone Flight Recorder is a
mini-USB accessory that adds a GPS sensor to the drone. An integrated 802.11g wireless card
provides network connectivity with the external device that controls the drone, making it
possible to control the AR.Drone from a Linux PC using the software provided for application
developers.
The AR.Drone is equipped with several types of sensors used for automatic
stabilization and for reporting navigation data such as status, altitude, attitude, and speed. It
features a six-degree-of-freedom inertial measurement unit, which provides the onboard
software with the pitch, roll, and yaw measurements used for automatic stabilization. The
AR.Drone also carries a 3-axis gyroscope (2,000 degrees/second precision), a 3-axis
accelerometer (+/- 50 mg precision), a 3-axis magnetometer (6 degrees precision), a pressure
sensor (+/- 10 Pa precision), and ultrasound sensors for ground-altitude measurement. Lastly,
the drone has two cameras. The 60 fps vertical QVGA camera can be used to measure ground
speed. The HD 720p, 30 fps front camera can stream video in real time over Wi-Fi to an
off-board device. Because every sensor has a tolerance, the measurements are passed through
algorithms that produce a relatively accurate state estimate for the drone. All of these onboard
electronics are useful for achieving autonomous flight and docking of the quadcopter.
Figure 3. AR.Drone cross-sectional measurements
Application Programming Interface
To support its many third-party developers, Parrot launched an open API for the
AR.Drone that can be accessed for research and application purposes. It includes the SDK
(Software Development Kit) source code, but not the software embedded on the AR.Drone
itself. Communication with the drone takes place over four main services, which are
implemented in the SDK and listed below.
1. Controlling and configuring the drone is done by sending AT commands on a regular basis. AT
commands are text strings sent to the drone to control its actions.
2. Information about the drone (status, altitude, attitude, speed) is called navdata. The navdata also
includes estimated sensor measurements. This information is sent by the drone to its off-board
computer approximately 200 times per second.
3. A video stream is sent by the AR.Drone to the off-board device. The SDK includes an algorithm
to decode this video for processing.
4. Critical data is communicated over a channel called the control port. It is used to retrieve
configuration data such as the state of the drone, and to acknowledge important information
such as the sending of configuration data.
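As a sketch of how service 1 looks in practice, the snippet below formats and sends an AT*REF command over UDP. The helper names are ours, and the port, address, and argument bits follow values commonly documented for the AR.Drone; treat them as assumptions rather than a definitive implementation.

```python
# Sketch of the AT-command interface: plain text strings sent to the
# drone over UDP. Helper names are hypothetical; port/address/bit values
# follow commonly documented AR.Drone defaults (assumptions).
import socket

DRONE_IP = "192.168.1.1"   # default AR.Drone access-point address
AT_PORT = 5556             # UDP port commonly used for AT commands

def at_ref(seq, takeoff):
    """Build an AT*REF command string; bit 9 of the argument selects takeoff."""
    arg = 290717696            # base flag bits required by the protocol
    if takeoff:
        arg |= 1 << 9          # bit 9 set -> takeoff, clear -> land
    return "AT*REF=%d,%d\r" % (seq, arg)

def send_command(sock, command):
    """Send one AT command string to the drone."""
    sock.sendto(command.encode("ascii"), (DRONE_IP, AT_PORT))

# e.g. send_command(socket.socket(socket.AF_INET, socket.SOCK_DGRAM),
#                   at_ref(1, True))   # request takeoff
```

Each command carries an incrementing sequence number, which is why `seq` is a parameter: the drone ignores commands whose sequence number is not greater than the last one it saw.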
Figure 4. Visual of Network Communications between the AR.Drone and the off-board computer
3. Solution
Design Limitations
The following limitations needed to be accounted for during the design process.
1. Network Connection. The response time between the quadcopter and the PC terminal over a
network can lead to crippling delays in our system. A wireless network is also limited in range.
2. Computational Speed. Both the quadcopter and the PC terminal introduce processing delays,
and image processing can require large amounts of processing power.
3. Sensor Accuracy. Sensors are designed to operate within certain limits; their readings are
estimates and may not be 100 percent accurate.
4. Charging Time. Current lithium-polymer battery technology limits the flow of current into and
out of the cells, so battery charging will take over 2 hours.
5. Weight Capacity. As payload weight increases, battery life decreases.
6. Budget. We were limited to roughly $1,000.00 for any additional parts.
Software Technical Design
Objective
Our main goal for the software design is a tracking system that allows the drone to
autonomously navigate to and dock on the charging station. We approached this problem by
breaking it into two parts. The first objective is a system that lets the drone detect the charging
station and fly into its general area. Once the drone is hovering above the charging station, the
second objective is a control system that steadily and accurately docks the drone on it. To
achieve autonomous flight, we wrote a set of functions that command the drone's movement
and tracking. These functions are written in Python using the Robot Operating System (ROS),
which runs on Ubuntu (a Linux-based operating system).
Robot Operating System
ROS is a collection of tools, packages, and libraries that act as a flexible framework for
writing robot software. All of the libraries and packages are open source, so ROS is free and
available to the public. ROS consists of nodes and topics. A node is a block of code that
performs one desired task: publisher nodes continuously broadcast messages, and subscriber
nodes continuously receive them. This simplifies programming the drone because each node
computes one specific task, and communication between nodes lets these separate tasks work
together toward the overall goal of the system. That communication happens over topics,
which are named buses over which nodes exchange messages. ROS also contains many open-
source implementations of common robotics algorithms, which serve as a useful structure for
our software design.
Autonomous Flight and Docking of the AR.Drone2.0
As mentioned before, the first objective is to autonomously fly the drone into the
general area of the charging station so that it can hover above it and prepare to dock. At first we
attempted to use the AR.Drone Flight Recorder to implement a GPS system, but we soon
realized that GPS was neither accurate nor feasible indoors. Instead, we decided it would be
best to approach the problem with an image-processing-based tracking system. The charging
station is placed on the opposite side of the room from the drone. The drone then uses object
detection to locate a vertical tag mounted above the charging station. Once the desired tag is
found, the drone locks onto its target and sends the motor commands needed to keep the tag in
the center of the camera's view while flying toward it.
To implement our object-tracking algorithm, we used the OpenCV (Open Source
Computer Vision) package. OpenCV allows real-time video processing and contains many
functions that we used in our code to track our object. In computer vision there are many
approaches to detecting desired regions in a digital image. A blob is a group of connected
pixels in an image that share some common property, and OpenCV provides a convenient way
to detect blobs and filter them by different characteristics. In our design, we filter by color and
size to remove every region that is not the object we want to track. The steps of the blob-
detection program are described in order below.
1. Set the target color range to track red pixels only.
2. Isolate the sections of the frame that contain the target color using thresholding.
3. The result is a purely black-and-white frame in which contours within our color range are
white and all other sections are black.
4. Apply OpenCV functions to remove as much noise and disturbance as possible.
5. Set a minimum area and use conditional statements to filter out red objects smaller than this
value.
6. Loop over the areas of all tracked contours and lock onto only the biggest one.
7. Use OpenCV functions to draw a bounding rectangle and circle along the perimeter of the
desired red object.
8. Draw a line from the center of the object to the center of the camera frame.
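As a library-free sketch of steps 1-6, the function below is a toy stand-in for OpenCV's thresholding and contour functions: it takes an already-thresholded True/False mask (True marks pixels inside the red color range), groups connected pixels into blobs, filters them by a minimum area, and keeps only the largest.

```python
# Toy version of the blob-detection filtering: connected-component search
# over a thresholded binary mask, keeping only the largest blob at or
# above min_area. (The real design runs OpenCV on live camera frames.)
from collections import deque

def largest_blob(mask, min_area):
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    best = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Breadth-first search to collect one connected blob.
                blob, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # Steps 5-6: enforce minimum area, keep only the biggest.
                if len(blob) >= min_area and len(blob) > len(best):
                    best = blob
    return best  # pixel list of the largest sufficiently large red blob
```

From the returned pixel list, a bounding rectangle and its center point (steps 7-8) follow directly from the minimum and maximum row/column values.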
Below are two figures that illustrate this object detection. The first figure contains two
pictures: the one on the left is the image as we view it, and the one on the right is the image as
the computer sees it. The second figure shows that the drone picks up only the largest red
object in the image view.
Figure xx. The left picture is how we perceive the camera image and the right picture is how the computer sees the
image. Notice that only the regions within our red color range are white and the rest of the image is shown in black.
Figure xx. Camera image frame viewed from external laptop. The drone only detects the largest red object and
places a bounding rectangle and circle around it needed for other calculations. An OpenCV function also creates a
line from the center of the object to the center of the screen.
Now that the drone can track the desired object, we needed to design two control
systems. Their objective is to keep the tracked object in the center of the camera's view and to
estimate how far away the object is so the drone can fly to it. Both control systems rely on
conditional statements that send motor commands to the drone. We installed and configured
libardrone, a Python library of functions for controlling the AR.Drone 2.0's movement. This
library also let us set parameters such as speed and the active camera. After importing the
necessary dependencies, we could call these movement commands from our own program to
control the drone. The functions include move_forward(), turn_left(), turn_right(), move_up(),
and move_down(), which we use to control both the orientation and the location of the drone.
With the desired object tracked, we next designed a control system that adjusts the
orientation of the drone to keep the target in the center of the camera's view. The drone has
self-stabilizing PID controllers built into its SDK, so it always adjusts itself to hover level; we
only needed a system that turns the drone left/right and moves it up/down when controlling the
orientation. To do so we used a coordinate system whose units are pixels. The camera's image
frame is set to a fixed width and height, from which we created two variables, screenMidX and
screenMidY, the center points of the width and height. Then, using simple OpenCV functions,
we calculated the x and y values of the center of the object. These two values serve as the
measured process variables in our control system and update 30 times per second. The goal of
the control system is to send motor commands that tell the drone to turn right/left or move
up/down to minimize the errors (screenMidX - x) and (screenMidY - y). When both errors are
within the allowable tolerance, the tracked target is in the center of the camera's view.
Below are the steps used in designing this control system.
1. The drone's camera frame is set to a width (W) of 640 pixels and a height (H) of 360 pixels,
which establishes our coordinate system.
2. Create the fixed variables (screenMidX = W/2 = 320 pixels) and (screenMidY = H/2 = 180
pixels).
3. Using OpenCV functions and simple geometry, calculate the x and y values of the center of the
object.
4. Set a center threshold, which is our allowable tolerance in pixels.
5. Use conditional statements that send motor commands to adjust the drone's orientation:
a. Along the x-axis, turn right or left to minimize the error (screenMidX - x)
b. Along the y-axis, move up or down to minimize the error (screenMidY - y)
Figure xx. Camera frame that shows the fixed variables screenMidX/screenMidY and the measured
variables x/y
Above is an illustration of how we control the orientation of the drone. It is important
to note that the origin (0,0) of the coordinate plane is the top-left corner of the frame. We want
the drone to keep the tracked object in the center of the camera's view (frame), so we send
motor commands telling the drone to turn right or left to make adjustments along the x-axis,
and to move up or down to make adjustments along the y-axis. Mathematically, we ideally want
the difference (screenMidX - x) to equal 0 and the difference (screenMidY - y) to equal 0. Since
screenMidX and screenMidY are fixed variables, we can substitute their known values and say
that we want (320 - x) and (180 - y) to equal 0.
For example, if (320 - x) < 0, the object is off to the right. The drone uses this condition
to enter a loop that calls the turn_right() function in libardrone. The loop calls this function
until the condition is false, which means the difference (screenMidX - x) is no longer negative.
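The conditional logic of this orientation controller can be sketched as follows. The function name and the tolerance value are illustrative; the real code calls the libardrone movement functions in a loop rather than returning command names.

```python
# Sketch of the orientation controller's conditionals (hypothetical
# helper; the real code calls libardrone's turn_left()/turn_right()/
# move_up()/move_down() inside such conditionals).
SCREEN_MID_X, SCREEN_MID_Y = 320, 180   # center of the 640x360 frame
TOLERANCE = 30                          # allowable error in pixels (assumed value)

def orientation_commands(x, y):
    """Return the motor commands needed to re-center the tracked object."""
    commands = []
    if SCREEN_MID_X - x < -TOLERANCE:    # object right of center
        commands.append("turn_right")
    elif SCREEN_MID_X - x > TOLERANCE:   # object left of center
        commands.append("turn_left")
    if SCREEN_MID_Y - y < -TOLERANCE:    # object below center (y grows downward)
        commands.append("move_down")
    elif SCREEN_MID_Y - y > TOLERANCE:   # object above center
        commands.append("move_up")
    return commands                      # empty list -> object is centered
```

Because the frame origin is the top-left corner, a y value larger than screenMidY means the object sits in the lower half of the frame, hence the move_down branch.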
Our next objective was to detect the distance between the drone and the charging station.
Once the distance is known, we can design a control system that sends motor commands telling
the drone to fly over and hover above the charging station. Our algorithm uses triangle
similarity to estimate the distance between the tracked object and the drone's camera.
We have a red object with a known width W of 3.5 inches, placed at a known distance D
of 30 inches from the camera. Using the drone's camera and the bounding rectangle, we
measured its apparent width in pixels, P, which came to 78 pixels. This allowed us to derive the
perceived focal length F of our camera using the equation
F = (P*D) / W = (78*30) / 3.5 = 668.57 pixels
Once the focal length F is calculated, we can apply triangle similarity to determine the
distance between the object and the camera, using D = (W*F) / P, where D is the measured
distance, W is the known width of the object, F is the calculated focal length, and P is the width
of the object in pixels. When the object moves farther from the camera, its perceived pixel
width becomes smaller than the 78 pixels measured at 30 inches. To find the pixel width at any
distance, we simply read the width of the bounding rectangle around the object's perimeter.
For example, suppose the object is moved 36 inches from the camera and its perceived width is
now 65 pixels; then D = (3.5*668.57) / 65 = 35.99 inches. Below are two pictures that illustrate
the distance-detection algorithm: we moved the drone closer to and farther from the object and
observed the changing distance measurements.
Figure xx. In the left picture the drone was placed 28 inches from the object and our distance algorithm measured
28.84 inches. In the right picture the drone was placed 77.5 inches from the object and our distance algorithm
measured 78.62 inches.
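The calibration and triangle-similarity arithmetic above can be written out as follows, together with a simple approach decision toward the 15-inch hover target used for docking. The 2-inch tolerance and command strings are illustrative assumptions.

```python
# Triangle-similarity distance estimation using the calibration numbers
# from the text; the tolerance and command names are illustrative.
KNOWN_WIDTH_IN = 3.5        # physical width of the red tag, inches
CALIB_DISTANCE_IN = 30.0    # distance used for calibration, inches
CALIB_PIXEL_WIDTH = 78.0    # apparent width measured at calibration, pixels

# Perceived focal length: F = (P*D) / W  ->  ~668.57 pixels
FOCAL_LENGTH = (CALIB_PIXEL_WIDTH * CALIB_DISTANCE_IN) / KNOWN_WIDTH_IN

def distance_inches(pixel_width):
    """Estimate camera-to-object distance: D = (W*F) / P."""
    return (KNOWN_WIDTH_IN * FOCAL_LENGTH) / pixel_width

def approach_command(pixel_width, target_in=15.0, tol_in=2.0):
    """Fly toward the tag until within the hover distance, then hover."""
    if distance_inches(pixel_width) > target_in + tol_in:
        return "move_forward"
    return "hover"   # close enough: hover and switch to the vertical camera
```

Running the worked example from the text, a 65-pixel apparent width yields a distance of about 36 inches, matching the 35.99-inch figure above up to rounding of the focal length.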
We found that our distance-detection algorithm was accurate to within +/- 2 inches. In
future work, we can improve this measurement by using two objects separated by a known
distance, giving the drone depth perception and thus more accurate measurements.
After calculating the distance between the object and the drone, we designed a simple
control system to minimize it. The tracked object is placed 15 inches behind the charging
station, so we used conditional statements that send motor commands telling the drone to fly to
a distance of 15 inches from the object. Once the drone is 15 inches from the target, it hovers in
place over the charging station and switches to the vertical camera to begin the docking
procedure.
Together, the orientation and distance control systems allow the drone to take off, find
the desired object, and fly over to it, where it hovers above the charging station and prepares to
dock.
Once the drone is hovering above the charging station, our objective is to design a control
system that will allow the drone to accurately descend and land on the charging station. In order
to accomplish this we will need to design an image processing based tracking system utilizing
the vertical camera on the drone. There will be a colored tag placed on the center of the charging
station. The drone will be programmed to identify this tag with the same tag detection algorithm
described above. Once the drone detects the tag on the charging station, we will need to
implement a similar control algorithm that will send motor commands to the drone. The motor
commands will steadily descend the drone and the controller will work to keep the tag in the
center of the camera’s view. Once the drone is docked on the charging station, the drone will be
programmed to wait and take off again once the battery is charged to the desired amount.
Summary
The goal of the software design is a system that allows the drone to navigate to and
dock itself on the charging station when the battery is low. First, the drone must identify and fly
over to the charging station when the battery drops below 30%. This is done using object
detection and two separate control systems that adjust the drone's orientation and location
relative to the tracked object. Once the drone is hovering above the charging station, the second
objective is to land on it. This uses the same object detection and tracking procedure with a few
tweaks: once the tag is detected, a similar control system sends motor commands that descend
the drone until it is docked on the charging station. All of this is processed on an off-board
computer connected to the drone through Wi-Fi. After accomplishing this, we will run tests
with the drone's on-board computer to determine whether the software can run completely on-
board. The procedure is completely autonomous, with no human interaction throughout the
process. Our control system was effective; however, in future work a PID controller could be
implemented to smooth the drone's trajectory. While testing our system, the drone would
sometimes make quick movements and lose the tracked object from the camera frame. In
technical terms, our control system minimized the steady-state error quickly but in doing so
overshot the target. To eliminate this problem, a PID controller could slow the drone's
movements, bringing the steady-state error to 0 while avoiding a large overshoot.
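A minimal sketch of such a PID controller follows; it is a future-work sketch with illustrative gains and timing, not our implemented controller. The idea is to replace the fixed-speed on/off turn commands with a command magnitude proportional to the error and damped by its derivative.

```python
# Future-work PID sketch (gains and update rate are assumptions).
# Output magnitude would be clamped to the drone's allowed speed range.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """One control step: P + I + D terms from the pixel error."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None \
            else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# e.g. yaw_rate = pid.update(screenMidX - x, 1/30.0) each camera frame,
# instead of calling turn_right() at a fixed speed until the sign flips.
```

The derivative term is what curbs the overshoot described above: as the error shrinks quickly, the derivative is negative and slows the commanded motion before the target is passed.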
Hardware Technical Design
Requirements and Assumptions
The goal of our hardware design was to construct a charging platform that the
AR.Drone can locate using image tracking within a predefined set of feedback loops. To keep
the system autonomous, the drone locates the station inside a continuous loop of predefined
software instructions operating on values extracted from camera video frames. Because the
charging station is part of an autonomous system, a high tolerance for unexpected sway
disturbances was a necessary design requirement from the preliminary planning onward.
After the station is targeted, drone movement anywhere within a predefined radius must
be assumed. The station must therefore provide elements that maximize the acquisition area in
which it can be confidently detected: unique target characteristics prevent false-positive target
acquisition, and a visually larger target, in both height and width, increases the detection
radius.
Additionally, autonomous flying vehicles have a short battery life, so multiple stations
become necessary as the desired radius of operation grows. The station should therefore be
cheap and fast to replicate. The station must also provide a DC LiPo battery-charging system
able to operate under ideal conditions for demonstration purposes. Future design work will
need to focus on weatherproofing the charging system.
Below is a list of hardware objectives and requirements, continuing what was discussed
above.
1. Design of an AR.Drone 2.0 Docking Station
Consistent and Dependable
❖ High tolerance for any landing error
❖ Repeatable docking/looping of the autonomous system
❖ Indefinite looping of the system
Accessible and Detectable
❖ The drone must visually identify the docking station
❖ The drone must be able to dock with the station
❖ The station must be accessible without any user intervention in the environment
❖ The station must be accessible indoors
Safe and Optimized Power Transfer
❖ Battery charging must be controlled
❖ The charging rate must provide optimal operation time
❖ The battery should not be too low when connected (no less than 20% charge)
2. Design of Battery Charging Circuit
❖ Recharge the onboard lithium battery pack, optimized for continuous operation
❖ Provide cell-voltage balancing of each cell individually, preventing damage to the battery
❖ Minimize heat dissipation during charging to avoid battery cell expansion
Determining a Tolerance Factor
We had to develop a charging station with enough tolerance to let the AR.Drone 2.0
dock autonomously with a near-100% probability of success. Determining the required
tolerance values was a challenge because the AR.Drone behaved unpredictably much of the
time. The tolerance factor was not finalized at +/- 6 inches until after many test flights and
observations. We concluded that, while the drone was quite inaccurate, it would remain within
about 0.5 feet of our desired hover spot.
Following this discussion of tolerance, we turn to our choice of power delivery to the
AR.Drone. Delivery methods such as solar cells or aerial power-line hooking are not discussed,
as their impracticality for real-world application was an obvious hindrance to any future
development of our work.
Choosing a Charging Method
To choose a charging method, we first researched the options available to us.
1. Inductive (wireless) power transfer
Problem: Initial research showed that inductive charging is highly sensitive to displacement,
which can lead to insufficient energy transfer and leave the drone with no charge at all. It also
has an inefficient, low power-transfer rate.
2. Battery swapping system
Problem: A battery-swapping system would provide instant battery recharging, but this design
requires many moving parts, and the number of parts needed would greatly increase
unreliability.
Finally, we looked at conductive power transfer and decided it would be best.
Conductive power transfer:
1. Provides consistent power flow compared to inductive charging.
2. Provides reliable charging compared to both battery swapping and inductive transfer.
3. The station is easily reproduced for multiple charging-station units.
4. Additional design features can be implemented more easily with this design than with
methods such as battery swapping.
Power Transfer Method | Reliability | Power Transfer Rate | Reproducibility
Inductive | Sensitive to displacement | Low efficiency in non-ideal systems | Coils can theoretically be placed anywhere, providing cheap and easy reproduction of the charging station
Battery Swapping | Many moving parts increase the chance of failure | Provides instant battery recharge; high efficiency | Servo-motor control and multiple sensor requirements make this an expensive option
Conductive | Will always charge when contact with the coils is made | Minimal losses expected through metal-to-metal contacts | Coils can be placed on most surfaces and will conduct; anything metal can work as a conduction point
Fig 8. Comparison of charging methods
Conductive charging is clearly the optimal choice for this design. It fulfilled all of our
design specifications with few meaningful trade-offs in the other categories. Battery swapping,
for instance, would not be easily reproducible, trading operation time for operational distance;
with flight times under 10 minutes, the AR.Drone can travel no more than 5 minutes in any
direction.
Next we discuss our initial design reasoning and diagrams. We compare the initial design
choices to our final outcome results.
Initial Design of Charging Station
To begin the design of the charging station, we first created a basic flow diagram to
follow.
Fig 9. Station Design Flow Chart
The charging station first requires an input power source. For ease of access and
reproducibility, we implemented the station for use with any AC power outlet at the
national standard of 120 V. This AC voltage must then be converted through an AC-DC
rectifier. Before the DC can reach our LiPo battery, the output of the rectifier must also be
run through a DC-DC buck converter to ensure that the voltage is matched proportionally
to our load requirement. The AC-DC stage uses a typical rectifier topology such as a full-bridge
SCR rectifier. The AC to DC power rectifier consists of:
Transformer: steps the 120 V AC input down to a lower AC voltage suitable for the
battery circuitry
Diode Bridge: rectifies the negative and positive half-cycles of the AC input into
pulsating DC, which is then smoothed to a near-steady DC value
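As a rough sketch of this conversion chain, the calculation below estimates the peak DC available after the transformer and bridge. The turns ratio and per-diode drop are assumed values for illustration, not our measured design parameters.

```python
import math

# Assumed values for illustration: the report does not specify a
# transformer ratio, so a 10:1 step-down is used here.
V_AC_RMS = 120.0    # wall outlet voltage (national standard, RMS)
TURNS_RATIO = 10.0  # hypothetical step-down ratio
DIODE_DROP = 0.7    # volts per diode; a full bridge conducts through two

# Secondary voltage after the step-down transformer
v_sec_rms = V_AC_RMS / TURNS_RATIO
v_sec_peak = v_sec_rms * math.sqrt(2)

# Peak rectified DC (two diode drops per half-cycle in a full bridge)
v_dc_peak = v_sec_peak - 2 * DIODE_DROP
print(f"Peak rectified DC: {v_dc_peak:.1f} V")
```

The buck converter that follows would then regulate this unfiltered peak down to the battery-side voltage.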
The rectifier is to be built after all other hardware development stages, since bench
power supplies can be substituted for it during testing of the system. The resulting output
of the AC to DC conversion, as seen in the Station Design Flow Chart above, leads to a
battery control circuit. This circuit controls the flow of current supplied to the battery
through four separate contact points.
PCB 3s LiPo Battery Charging
Using 2 LM7805 voltage regulators instead of a
buck converter, we can change our output voltage
by varying resistance values across a 1 kΩ
potentiometer. A buck converter uses a PWM signal
to control a duty ratio; because this is a low-power
application, PWM is not necessary.
TL431 voltage comparators are used at each line as
well to provide reference voltage values that control
the flow of current into each cell. This method of LiPo
battery balancing circumvents the use of a
microcontroller and PWM to control output ranges,
simplifying the design.
Figure 2 - Initial Battery Balancing Schematic on Eagle PCB
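The TL431's programmed threshold follows the standard relation Vka = Vref × (1 + R1/R2), with an internal reference of about 2.495 V. The helper below illustrates the calculation; the resistor values are hypothetical examples chosen to land near a LiPo cell's 4.2 V full-charge point, not the values on our board.

```python
V_REF = 2.495  # TL431 internal reference voltage (datasheet typical), volts

def tl431_cathode_voltage(r1_ohms, r2_ohms):
    """Voltage at which the TL431 begins conducting.

    R1 is the divider resistor from cathode to the REF pin,
    R2 from the REF pin to the anode.
    """
    return V_REF * (1 + r1_ohms / r2_ohms)

# Hypothetical divider: 6.8 kΩ / 10 kΩ gives a threshold near the
# 4.2 V full-charge voltage of a single LiPo cell.
print(f"{tl431_cathode_voltage(6800, 10000):.2f} V")
```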
Battery Controller to Conduction Coils
Each coil, as seen in Figure 2,
is connected individually to the
battery controller. As explained in
the Lithium Battery Charging section at the end of this
report, each cell is represented by a
specific coil.
In order of connection from our battery controller output, we
list the intermediary steps to the LiPo battery within the AR Drone:
1. The battery controller connects to 4 copper coils mounted on a piece of
plywood.
Figure 5- Plywood with our coils placed through to backside
2. The spacing of the copper coils will match the
cross-section spacing of the quadcopter.
3. Each copper coil has a funnel enclosed
around it to increase landing accuracy.
We originally planned PVC funnels, but
wooden funnels looked better and were
easier to shape into more effective funnel
designs. We also realized that the body of the AR
Drone would hit the funnel before the drone
leg, causing regular funnels to work counter-productively.
Figure 3 - Controller to Coil Power Transfer
Figure 4- Cross-Section of quadcopter measured on plywood
Figure 6- Wooden funnels
4. Copper contacts are attached to each leg
of the drone, located under each rotor.
5. The copper contacts on the AR Drone are wired to the
battery. Contacts cannot be interchanged with the
copper coils on the station.
Final Design
Our final design had many successful runs, with
results like those depicted in Figure 7. The drone
would sometimes overshoot or undershoot the charging
station entirely and would be unable to regain its
orientation on the defined target before bumping into
something. This was primarily a result of poor gain
values that needed additional tuning. We found that
error correction in error-rich closed loops such as this
is difficult to implement successfully without many
adjustments in between.
Figure 7- Drone with clips on each legs and wires wound to center
Figure 8- Completed station with drone charging onboard
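The gain-tuning difficulty described above can be illustrated with a minimal discrete controller of the general form used to steer the drone toward the target. This is a generic sketch; the gain values shown are placeholders, not our tuned constants.

```python
class PID:
    """Minimal PID controller sketch for position error correction.

    One instance per axis (e.g. x and y offset of the target in the
    camera frame). Poorly chosen gains cause exactly the overshoot
    and undershoot behavior described above.
    """

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        # Accumulate the integral term and estimate the derivative
        # from the previous error sample.
        self.integral += error * dt
        if self.prev_error is None:
            derivative = 0.0
        else:
            derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Placeholder gains for illustration only
pid_x = PID(kp=0.5, ki=0.0, kd=0.1)
command = pid_x.update(error=2.0, dt=0.1)  # error in arbitrary pixel units
```

Each call to `update` returns a correction command; too-high `kp` produces overshoot past the station, while too-low gains leave the drone unable to recover its orientation in time.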
Lithium Battery Charging
Cell Balancing
The AR Drone 2.0 uses a 3-cell lithium polymer battery. The battery is located centrally
within the drone and is accessed by removing a protective outer casing. To charge the battery
through our charging station, we must connect copper
wiring to each charging lead. Each leg on the
quadcopter is fitted with a copper contact that is
then wired to one of the 4 cell contact points.
Each leg on our drone is therefore correlated
to a specific cell that is being balance charged, which
fixes a specific orientation for landing on our charging
station. For this reason the AR Drone must connect
at the same orientation every time it lands.
The figure below is a visual representation of lithium
battery balancing.
Battery Balancer Schematic
To balance the cells, a circuit was
constructed to control the flow of current into
and out of each cell. The following figure
depicts the schematic design created for
this procedure. The PSpice ORCAD
simulation is located on the left and the EAGLE
PCB schematic is located on the right. These
schematics have been verified to work in
simulation.
Figure 10- PSpice Battery Balancing Schematic
Figure 9. Battery Cells
The schematic located above charges
the lithium polymer battery at a
constant current. This mode of
charging is the fastest way to charge
the battery and will, as a result,
maximize operation time.
Maximizing Operation Time
To maximize operation time, we
want to shorten the charging time and
increase the quadcopter's flying time. The
figure shown below [5] is the
characteristic charging graph of a typical LiPo battery. In order to charge our battery in about an
hour, we want to keep the battery in the red region shown below.
The reasons are as follows:
● If we discharge the battery below 15%, it will reduce the life of the battery, which goes
against our goal of maximizing operation time.
● Once the battery has been charged to about 80%, it requires another 2 hours to fully charge.
● During the 80% to 100% charging period, the battery temperature also increases
dramatically, and we want the temperature to remain under 65 °C.
● We therefore decided to charge the battery from 15% to 80%, which maximizes our
operation time.
Fig 14. LiPo Cell Characteristics
Figure 11 - Eagle PCB Battery Balance PCB
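A quick back-of-the-envelope check of the 15% to 80% charging window is sketched below, assuming the stock 1500 mAh pack and the roughly 780 mA constant-current rate quoted in the conclusion; taper charging and conversion losses are ignored for simplicity.

```python
CAPACITY_MAH = 1500.0     # stock AR.Drone 2.0 battery capacity
CHARGE_MA = 780.0         # approximate constant-current charge rate
START, STOP = 0.15, 0.80  # charge only within the fast, cool region

# Charge delivered over the 15% -> 80% window
charge_needed_mah = (STOP - START) * CAPACITY_MAH  # 975 mAh
hours = charge_needed_mah / CHARGE_MA
print(f"Estimated charge time: {hours:.2f} h")
```

At these assumed numbers the window takes about 1.25 hours, consistent with the roughly one-hour charging target above.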
4. Project Plan
Budget
For this project, our sponsor was willing to spend 1,000 dollars. The software design only
required the expense of the AR.Drone Flight Recorder; however, we decided not to use the GPS
navigation system after further testing. For the image processing, we used open-source
software that is free for public use. The AR.Drone was provided by the Department of
Electrical and Computer Engineering, so the only development cost came from the
hardware design of the battery charging station. Below is a table that outlines our spending. We
ended up spending about 300 dollars, which left 700 dollars that can be used for future
improvements to the design.
Figure 8. Budget Layout
The current budget is well within our given constraints. Our budget was
calculated in terms of completing our goal for this project with one charging station. Further
development, such as multiple charging stations, would increase our expected spending.
Timeline
Figure 17. Overview of our Timeline
Solving our problem required a great deal of planning, research, design,
development, and testing. This was a dense project with many parts that had to be handled
separately. To begin, we researched the two sides of our project and the possible
approaches we could take. After deciding the path we wanted to follow for designing a
charging station (conductive charging) and for autonomously flying and tracking the station with the
drone (image processing), we went on to figure out what algorithms we needed to control the
drone and came up with a parts list for the charging station.
The algorithms to autonomously fly the drone were coded using scripts in ROS and
Python, and we iterated on them throughout the year. Once the code was working, we
implemented it on the drone and finished developing and testing it. Image processing
using ROS nodes and online packages also had to be imported, developed, and
tested.
The outline for the hardware design was also created throughout the semester. Once we
came up with an approach to designing a station and a parts list, we made schematics for the
circuit design. Our team then ordered parts and assembled the station to get ready for testing.
We then ran numerous simulations and tweaked the design until it worked as desired.
5. Conclusion
Summary
Our team was assigned to customize a commercially available quad-rotor UAV and
demonstrate a functional autonomously charging aerial vehicle system that locates a charging
dock automatically when the battery level reaches 'low' status. The autonomous system navigates the
quadcopter to our charging station using only video frames captured onboard the drone and then
processed off board on a much faster processor.
The drone must take off and fly itself autonomously. To achieve autonomous flight, we
used ROS in Python with the OpenCV library and an ARlib library as well. OpenCV used
NumPy, a matrix algebra library, for image inversions and transformations. Comparing
frames pixel by pixel would typically be impractical and would bottleneck the wireless connection,
causing delays and, in some cases, crashing our drone. To solve this issue, we blur each frame
so the arrays can be searched faster, since we are only interested in a certain color range; this
improves response time dramatically.
The hardest part of the autonomous AR Drone system was predicting inconsistencies in
the AR Drone 2.0. The drone would occasionally not take off when our program started and
sometimes would not shut down when our program was terminated. We also tried implementing
controllers to reduce error, but found that tuning the controllers had little effect on the
overall performance of our system.
The charging station used a team-developed battery balancing circuit but was not
supplying the battery load with more than 138 mA of current at any given point. The
commercially developed battery balancing unit produced higher values, closer to 780 mA,
providing charge times of about 1 hour. The funnel design on our station seems effective but
could be optimized with shorter angles; sandpaper could be used to reshape the funnels. The
legs were weakly attached to allow for their removal and reattachment.
Future Work
In our autonomous system we used only color tracking with contour threshold restrictions
to define shapes. Many superior techniques exist as alternatives to our approach but require
additional processing power. The color tracking on our AR Drone used a large portion of the
available resources both on and off board, and we experienced slight delays and occasional
connection drops between the two devices (laptop and drone). Through further research, we
found that a SLAM algorithm could theoretically be implemented on newer drones. Mapping a
3D environment is processing intensive, so a filtering algorithm would be needed to save memory
and processing cycles.
6. References
1. Parrot AR.Drone 2.0 1500 mAh battery. Amazon. https://www.amazon.com/Parrot-AR-DRONE-2-0-1500mAh-Battery/dp/B00DAL5GD2
2. "7 Smart Techniques to Boost Smartphone Battery Life." http://penumbrachamber.com/7-smart-techniques-boost-smartphone-battery-life/
3. Kastar 2000 mAh upgrade battery for the Parrot AR.Drone 2.0. Amazon. https://www.amazon.com/Kastar-2000mAh-Upgrade-Ar-drone-Helicopter/dp/B01N2NX6PC/ref=sr_1_1?s=photo&ie=UTF8&qid=1480959074&sr=1-1&keywords=Parrot+AR.DRONE+2.0+-+2000mAh+LiPo+Battery
4. LiPo charging characteristic figure. Electronic Design. http://electronicdesign.com/site-files/electronicdesign.com/files/archive/electronicdesign.com/files/29/12195/figure_01.gif
5. Staff. "Design A Linear Li-ion Battery Charger For Portable Systems." Electronic Design, 7 May 2009. Web. 11 Nov. 2016.