
Project Portfolio∗

Qian Qian
[email protected]

July 2, 2017

Contents

1 Automatic takeoff and landing of the Songbird Vertical Takeoff and Landing (VTOL) Unmanned Aerial Vehicle (UAV)

2 Self-balance robot (Hobby Project)

3 Localization of Humanoid robot NAO in RoboCup Standard Platform League (SPL) (Master Thesis)

4 Hand Detection using OpenCV and Robot Arm Control

5 Virtual drum player using wireless sensor nodes equipped with accelerometer

6 Building, stabilization and vision control of a Quadcopter

7 JPEG decoding using three-core multiprocessor

8 GPU optimization for Viola-Jones face detection

9 Android Phone Based Robot with Internet Remote Control (Bachelor Thesis)

∗Projects are listed in most-recent-first order.


1 Automatic takeoff and landing of the Songbird VTOL UAV

Programming Language: C++, Javascript
Hardware: Arduino Mega2560, MPU6050, Pixhawk
Techniques: Attitude and Heading Reference System (AHRS), sensor fusion, PID control, real-time scheduler, Gazebo simulation
Duration: 1 year 8 months
Company and position: Germandrones GmbH, senior system engineer
Refer to: Demo video

From November 2015 to June 2017 I worked at Germandrones GmbH, a startup building vertical takeoff and landing UAVs. The UAV is designed as a fixed-wing plane with four tilting rotors, so it can take off and land vertically and still fly like a plane.

Figure 1: Songbird 1400 from Germandrones

My responsibility was to develop an autopilot that allows the UAV to take off, fly its mission and land automatically without human intervention. Worth mentioning is that I was the only software engineer in the company, so I took on great responsibility in problem analysis, software design, simulation and flight testing. For the autopilot software, I designed the software architecture, real-time scheduler, control algorithms, attitude estimation, etc. Through almost two years' effort I accomplished this task and converted the UAV from fully manually controlled to fully autonomous. Major tasks included the design of:

• Quaternion based attitude estimation.

• IMU calibration algorithm.

• Real time scheduler.

• PID controller for attitude, altitude and position control in copter mode.

• Software architecture design.

• Auto takeoff, transition and landing logic.

• Gazebo simulation interface.

• GCS communication interface.
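The quaternion attitude estimation above can be sketched as a complementary filter: integrate the gyro rates into a quaternion, then nudge the result so the predicted gravity direction agrees with the accelerometer. This is an illustrative Python sketch of the general technique, not the actual C++ flight code; the correction gain is made up.

```python
import math

def quat_mul(q, r):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def normalize(q):
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def integrate_gyro(q, gyro, dt):
    """First-order quaternion integration of body rates (rad/s)."""
    wx, wy, wz = gyro
    dq = quat_mul(q, (0.0, wx, wy, wz))
    return normalize(tuple(qi + 0.5 * dqi * dt for qi, dqi in zip(q, dq)))

def accel_tilt_correction(q, accel, gain):
    """Rotate the estimate slightly so the predicted gravity direction
    matches the (normalized) accelerometer reading."""
    w, x, y, z = q
    # predicted gravity direction in the body frame
    gx = 2.0 * (x*z - w*y)
    gy = 2.0 * (w*x + y*z)
    gz = w*w - x*x - y*y + z*z
    ax, ay, az = accel
    n = math.sqrt(ax*ax + ay*ay + az*az)
    ax, ay, az = ax/n, ay/n, az/n
    # error = measured x predicted, fed back as a small body rotation
    ex = ay*gz - az*gy
    ey = az*gx - ax*gz
    ez = ax*gy - ay*gx
    return integrate_gyro(q, (gain*ex, gain*ey, gain*ez), 1.0)
```

In the real system the gyro term runs at the IMU rate while the accelerometer correction uses a small gain, so fast motion is tracked by the gyro and the slow drift is removed by gravity.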


2 Self-balance robot (Hobby Project)

Programming Language: C++
Hardware: Arduino, IMU, Bluetooth
Techniques: PID control, Kalman filter for IMU, PWM, OpenSCAD 3D modeling
Duration: 1 month
Refer to: Demo video

I always wanted to build a Segway-like self-balancing robot, so I started this hobby project after my thesis.

Figure 2: Self-made balance robot

I designed the structure of the balance robot in OpenSCAD and printed it with a 3D printer. The structure holds the motors as well as the Arduino and the IMU. A Kalman filter computes the tilt angle of the robot from the gyro and accelerometer readings. The angle is then fed into a PID controller that controls the speed of the motors. The Bluetooth module lets me tune the PID parameters conveniently from my phone. For the full code and hardware list, please visit my GitHub.
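The balance loop described above follows the classic PID structure. Below is a minimal Python sketch of such a controller with a clamped integral term (the real firmware is Arduino C++; the gains and the toy plant in the usage note are invented for illustration):

```python
class PID:
    """Minimal PID controller with a clamped integral term -- a sketch
    of the balance-loop structure, not the exact firmware code."""

    def __init__(self, kp, ki, kd, i_limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i_limit = i_limit
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        # accumulate and clamp the integral to avoid wind-up
        self.integral += error * dt
        self.integral = max(-self.i_limit, min(self.i_limit, self.integral))
        # finite-difference derivative (zero on the first call)
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Driving a toy integrator plant (`angle += command * dt`) toward an upright setpoint of 0 shows the loop converging; on the robot the command instead sets the motor PWM duty cycle.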

3 Localization of Humanoid robot NAO in RoboCup SPL (Master Thesis)

Programming Language: C++, Python, Cython
Hardware: NAO robot V4 (Intel Atom @ 1.6 GHz, 1 GB RAM)
Techniques: Kalman filter, particle filter, feature detection and matching, robotics programming, Python optimization
Duration: 1 year
Refer to: my master thesis

This is the master thesis project I did with the DAInamite team at TU Berlin. In this project I investigated and implemented a multi-hypothesis Kalman-filter-based localization for the NAO robot in RoboCup.


The Kalman-filter-based algorithm takes motion and vision results as input and performs an odometry update followed by a measurement update. The update is performed for each Extended Kalman Filter (EKF) sample.

Figure 3: Algorithm tested in a real game in Frankfurt

In the end, based on how well the observed feature landmarks match their correspondences, the best sample is chosen and its state is used as the robot pose.

The main challenges in this project are that the robot has only a partial view of the field and the landmarks are ambiguous, which makes correspondences hard to find. To tackle these problems, methods like nearest-neighbor feature matching and L, T and X junction detection were deployed. Particle filter techniques like resampling by landmark are also used to help recover a kidnapped robot.
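The best-sample selection step can be illustrated as follows: project each observation into the world frame using a candidate pose, then score the pose by the nearest-neighbour distance to the known landmark map. This is a simplified Python sketch with a made-up landmark map, not the thesis code:

```python
import math

# Hypothetical landmark map in world coordinates (not the real SPL field model)
LANDMARKS = [(0.0, 0.0), (4.5, 0.0), (0.0, 3.0), (4.5, 3.0)]

def to_world(pose, obs):
    """Transform an observation from the robot frame to the world frame."""
    x, y, theta = pose
    ox, oy = obs
    return (x + ox * math.cos(theta) - oy * math.sin(theta),
            y + ox * math.sin(theta) + oy * math.cos(theta))

def matching_cost(pose, observations):
    """Sum of nearest-neighbour distances between observed landmarks
    (projected into the world) and the known landmark map."""
    cost = 0.0
    for obs in observations:
        wx, wy = to_world(pose, obs)
        cost += min(math.hypot(wx - lx, wy - ly) for lx, ly in LANDMARKS)
    return cost

def best_hypothesis(samples, observations):
    """Pick the EKF sample whose landmark correspondences fit best."""
    return min(samples, key=lambda pose: matching_cost(pose, observations))
```

The real implementation additionally weighs the ambiguity of each landmark type, which is what makes the multi-hypothesis bookkeeping necessary.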

4 Hand Detection using OpenCV and Robot Arm Control

Programming Language: C++
Hardware: Freescale 8-core embedded platform
Techniques: OpenCV image processing, background subtraction, convex hull, mean shift, UDP
Duration: 4 months
Refer to: Demo video

Figure 4: OpenCV detection result

During my internship at Fraunhofer FOKUS, I developed an algorithm that detects and tracks human hands using a down-facing camera. It was shown as part of Fraunhofer's demonstrations at the Internationale Luft- und Raumfahrtausstellung (ILA) in Berlin.

The algorithm first uses background subtraction to extract the contour of the hand. Then it finds the convex hull and convexity defects of the


hand contour. The number of convexity defects determines whether the hand is a palm or a fist. The mean-shift method is used to track the hand's motion.

Finally, the hand position in the image plane is sent out via UDP to control the movement of a robot arm. The robot arm can also grab items when the detected hand turns from a palm into a fist.
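The palm/fist decision can be sketched in pure Python: compute the convex hull, measure how deep the contour dips between consecutive hull points, and count the deep notches. In the real project this is done with OpenCV's convex hull and convexity-defect routines; the thresholds and classification rule here are illustrative:

```python
import math

def cross(o, a, b):
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def convex_hull(points):
    """Andrew's monotone chain; returns the hull counter-clockwise."""
    pts = sorted(set(points))
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def count_defects(contour, depth_thresh):
    """Count contour notches deeper than depth_thresh between consecutive
    hull points -- a stand-in for cv2.convexityDefects."""
    hull = convex_hull(contour)
    idx = sorted(contour.index(p) for p in hull)
    defects = 0
    for k in range(len(idx)):
        i, j = idx[k], idx[(k + 1) % len(idx)]
        a, b = contour[i], contour[j]
        chord = math.hypot(b[0]-a[0], b[1]-a[1])
        if chord == 0:
            continue
        # contour points lying between the two hull points
        between = contour[i+1:j] if j > i else contour[i+1:] + contour[:j]
        # deepest perpendicular distance from the hull chord
        depth = max((abs(cross(a, b, p)) / chord for p in between), default=0.0)
        if depth > depth_thresh:
            defects += 1
    return defects

def classify(contour, depth_thresh=1.0, palm_min_defects=3):
    """An open palm shows several finger-gap defects; a fist shows few."""
    return "palm" if count_defects(contour, depth_thresh) >= palm_min_defects else "fist"
```

A plain square contour yields no defects ("fist"), while a contour with deep notches between hull points yields one defect per notch.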

5 Virtual drum player using wireless sensor nodes equipped with accelerometer

Programming Language: nesC
Hardware: Telosb
Techniques: TinyOS, digital signal processing and filtering, broadcast, UDP communication, IoT
Duration: 1 month
Refer to: Demo video

This was a mini project for the Wireless Sensor Network course. I designed a virtual drum player in which a user plays the drum by holding a sensor node, and the end PC plays the drum sound according to the user's movement in real time. Another node in the system can record and replay the drum song.

Figure 5: Telosb hardware

To detect the specific motion of the user, the accelerometer data is first smoothed with an averaging filter. The drumming gesture is then detected by analysing the signal pattern of the accelerometer data. The detected result along with its timestamp is broadcast to the PC as well as to the intermediate node, so the drum pattern can be played in real time and also replayed later.
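The filtering-and-thresholding step can be sketched like this (an illustrative Python version; the real implementation is nesC on TinyOS, and the window size and threshold below are made up):

```python
from collections import deque

def moving_average(window):
    """Stateful boxcar filter over the last `window` samples."""
    buf = deque(maxlen=window)
    def step(x):
        buf.append(x)
        return sum(buf) / len(buf)
    return step

def detect_hits(samples, window=4, threshold=1.5):
    """Return the indices where the smoothed |acceleration| crosses the
    threshold upward -- one event per drum hit."""
    smooth = moving_average(window)
    hits, above = [], False
    for i, x in enumerate(samples):
        y = smooth(abs(x))
        if y > threshold and not above:
            hits.append(i)
        above = y > threshold
    return hits
```

Triggering only on upward crossings keeps a single swing from registering as several hits, which matters once the events are timestamped and broadcast.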

6 Building, stabilization and vision control of a Quadcopter

In this project we built and assembled a flying quadcopter from scratch. In the setup, an STM32F3Discovery board handles the low-level control and stabilization of the quadcopter, and a Raspberry Pi handles the higher-level control using camera vision.


Programming Language: C, C++
Hardware: STM32F3Discovery, Raspberry Pi and RPi camera
Techniques: Quadcopter assembly, ArUco marker detection, OpenCV face detection, PID tuning, FreeRTOS
Duration: 5 months

Figure 6: Self-made quadcopter

To achieve real-time control and rich functionality, the FreeRTOS-based OpenPilot firmware is deployed on the STM32 to fuse the sensor data and control the motors. To fly stably, the PID parameters of the control loop were carefully tuned for our quadcopter.

The vision processing part is implemented on the more computationally powerful Raspberry Pi. Two algorithms were ported to it: the OpenCV face tracking algorithm and the ArUco marker detection algorithm. The results are sent to OpenPilot to directly control the motion of the quadcopter.
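As a rough illustration of the vision-to-control hand-off, the tracked target's pixel position can be mapped to small attitude setpoints. This is a hypothetical Python sketch; the real system runs C/C++ against OpenPilot's own control inputs, and the normalization and limits here are invented:

```python
def offset_to_setpoint(cx, cy, width, height, max_angle=0.2):
    """Map the tracked target's pixel position to roll/pitch angle
    setpoints (rad). Gains and limits are made up for illustration."""
    ex = (cx - width / 2) / (width / 2)    # -1..1 horizontal error
    ey = (cy - height / 2) / (height / 2)  # -1..1 vertical error
    clamp = lambda v: max(-max_angle, min(max_angle, v))
    return clamp(ex * max_angle), clamp(ey * max_angle)
```

A target centered in the frame commands level flight; a target at the frame edge commands the maximum (clamped) lean toward it.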

7 JPEG decoding using three-core multiprocessor

Programming Language: C
Hardware: three-core MicroBlaze multiprocessor
Techniques: JPEG decoding pipeline, multi-core programming, synchronization between cores
Duration: 5 months

I ported a JPEG decoder to a three-core MicroBlaze multiprocessor, using data parallelism and function parallelism to speed up the program. After parallelization, the code runs up to 3 times faster.
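Function parallelism here means each decode stage runs on its own core and passes blocks downstream. That structure can be sketched with threads and queues (a Python stand-in for the C implementation; the stages are placeholders, not real JPEG stages):

```python
import queue
import threading

def pipeline(blocks, stages):
    """Run each stage on its own worker thread, connected by queues --
    the same function-parallel shape as a per-core decode pipeline."""
    qs = [queue.Queue() for _ in range(len(stages) + 1)]

    def worker(stage, qin, qout):
        while True:
            item = qin.get()
            if item is None:          # poison pill: propagate and exit
                qout.put(None)
                return
            qout.put(stage(item))

    threads = [threading.Thread(target=worker, args=(s, qs[i], qs[i + 1]))
               for i, s in enumerate(stages)]
    for t in threads:
        t.start()
    for b in blocks:
        qs[0].put(b)
    qs[0].put(None)
    out = []
    while (item := qs[-1].get()) is not None:
        out.append(item)
    for t in threads:
        t.join()
    return out
```

On the real hardware the queues are replaced by inter-core FIFOs, and the synchronization between cores is exactly the part that had to be designed carefully.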


8 GPU optimization for Viola-Jones face detection

Programming Language: C, CUDA
Hardware: Nvidia Tesla GPU
Techniques: GPU programming
Duration: 2 months

I ported x86 C code for Viola-Jones face detection to the GPU, using its parallel computing capability to make it run as fast as possible. After porting, the optimized running time of the face detection code dropped from 2.2 s to 130 ms.
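Much of Viola-Jones' speed comes from the integral image, which turns every box-filter feature into four memory reads; on the GPU the table itself can be built with parallel prefix sums over rows and then columns. Below is a serial Python sketch of the data structure (not the CUDA code):

```python
def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row = 0
        for x in range(w):
            row += img[y][x]                     # prefix sum along the row
            ii[y][x] = row + (ii[y - 1][x] if y else 0)  # add column above
    return ii

def rect_sum(ii, x0, y0, x1, y1):
    """Sum over the inclusive rectangle using four table look-ups --
    the O(1) box filter at the heart of Viola-Jones features."""
    total = ii[y1][x1]
    if x0:
        total -= ii[y1][x0 - 1]
    if y0:
        total -= ii[y0 - 1][x1]
    if x0 and y0:
        total += ii[y0 - 1][x0 - 1]
    return total
```

Once the table exists, every detection window and every feature can be evaluated independently, which is what maps so well onto one GPU thread per window.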

9 Android Phone Based Robot with Internet Remote Control (Bachelor Thesis)

Programming Language: C, Java
Hardware: STM32 micro-controller, Samsung Galaxy II, PC
Techniques: Android development, digital audio signal processing (FFT), DMA, servo and motor control, mechanical building of a robot car
Duration: 6 months
Refer to: Demo video

This is the bachelor thesis I did at Nanjing University. I implemented an innovative way of controlling a robot car using a combination of an Android phone and a PC.

Figure 7: The self-made robot with the Android phone mounted

Through the camera of the phone, the operator has a first-person view from the robot, i.e. they can monitor the objects in front of the robot car on the PC screen and wirelessly control the car's movement with the PC keyboard.

The communication between the Android phone and the micro-controller uses DTMF audio signals. After decoding via FFT, the control signals are used to control the speed of the robot's motors and the angle of the servo that


holds the phone. Moreover, the image data from the Android phone is sent over the network to the PC terminal, and the PC terminal can send commands back to the phone as well.
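The DTMF decoding step can be sketched with the Goertzel algorithm, which evaluates only the eight DTMF frequency bins instead of a full FFT (an illustrative Python version of the idea; the real decoder runs in C on the STM32 and uses the FFT):

```python
import math

# Standard DTMF frequency pairs (Hz) and the key matrix they index
LOW = [697, 770, 852, 941]
HIGH = [1209, 1336, 1477, 1633]
KEYS = ["123A", "456B", "789C", "*0#D"]

def goertzel(samples, freq, rate):
    """Signal power at `freq` via the Goertzel algorithm -- cheaper than
    a full FFT when only a handful of bins are needed."""
    w = 2.0 * math.pi * freq / rate
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2**2 + s_prev**2 - coeff * s_prev * s_prev2

def decode_digit(samples, rate=8000):
    """Pick the strongest low-group and high-group tone, then look up
    the key at that row/column."""
    row = max(range(4), key=lambda i: goertzel(samples, LOW[i], rate))
    col = max(range(4), key=lambda i: goertzel(samples, HIGH[i], rate))
    return KEYS[row][col]
```

Each key press sends one low-group plus one high-group tone through the phone's audio jack, so decoding reduces to finding the strongest bin in each group.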
