
Design and Prototype of a Self-Navigating Intelligent Programmable Mower

William Farner, Michael Kurdziel, John Martellaro, Przemyslaw Zalewski

RIT Multidisciplinary Senior Design, Fall 2006 - Team P06113

Design Objectives

• Prototype autonomous lawnmower
• Programmable operation
• Object avoidance
• GPS positioning
• Wireless operation
• Free of perimeter wire
• Safe

Objectives Achieved

• Motor interface

• Programmable path following

• Wireless operation through remote and wi-fi

• Grass detection through camera vision

• DGPS enhanced positioning

• Self-contained

• Basic safety features

Platform

• Evatech lawnmower platform

• Electric 24 V, 30 A motors

•Differential drive

•Near-zero turn radius

•Drivable without combustion engine

•Approximately 200 lb.

Odometry System

• Bourns EM-14 wheel encoder attached directly to each drive axle
• Each encoder is monitored by an Avago Technologies HCTL-2021 quadrature decoder

Tracking Wheel Motion

• Bourns EM-14 optical encoder
  • Two output signals, 90° out of phase
  • 64 counts per revolution
  • Four unique 'states'
• Avago Technologies HCTL-2021
  • Tracks displacement as well as direction
  • 4x decoder (one count per state transition)
  • 16-bit up/down counter
  • High noise immunity (digital noise filter)

64 encoder counts per rev. x 4 states per count = 256 decoder counts per rev.

99.5 cm wheel circumference / 256 decoder counts per rev. ≈ 0.389 cm motion resolution (see the sketch below)
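A minimal sketch (not the team's firmware) of turning raw HCTL-2021 counter readings into centimeters using the figures above. The rollover handling and the sample readings are assumptions added for illustration.

```c
/*
 * Convert HCTL-2021 decoder counts into linear wheel displacement.
 * The 16-bit up/down counter wraps around, so successive readings are
 * differenced with signed 16-bit arithmetic to stay correct across rollover.
 */
#include <stdint.h>
#include <stdio.h>

#define COUNTS_PER_REV   256.0   /* 64 encoder counts x 4 decoder states */
#define WHEEL_CIRC_CM     99.5   /* wheel circumference in cm            */
#define CM_PER_COUNT     (WHEEL_CIRC_CM / COUNTS_PER_REV)   /* ~0.389 cm */

/* Signed change between two raw 16-bit counter readings. */
static int16_t count_delta(uint16_t prev, uint16_t curr)
{
    return (int16_t)(curr - prev);   /* two's-complement wrap handling */
}

int main(void)
{
    uint16_t prev = 65500, curr = 120;           /* example readings */
    double   cm   = count_delta(prev, curr) * CM_PER_COUNT;

    printf("delta = %d counts -> %.3f cm\n", count_delta(prev, curr), cm);
    return 0;
}
```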

Compass Module

• Hitachi HM55B module
• Dual-axis magnetic field sensor
• 64-direction resolution (6-bit, 5.625°)
• Sends x and y components of the magnetic field

Heading is derived from the field components (see the sketch below):

θ = atan(-y / x)
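A minimal sketch of the heading calculation; atan2 is used here (an assumption) so the result covers all four quadrants, where the slide shows plain atan. The raw field readings are made up.

```c
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Heading in degrees, 0-360, from the HM55B's two field components. */
static double heading_deg(double x, double y)
{
    double theta = atan2(-y, x) * 180.0 / M_PI;   /* matches theta = atan(-y/x) */
    if (theta < 0.0)
        theta += 360.0;
    return theta;
}

int main(void)
{
    printf("heading = %.1f deg\n", heading_deg(120.0, -85.0));  /* example values */
    return 0;
}
```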

Sensor Interface

• MC9S12DT256 MCU communicates with the computer over a serial interface to relay sensor information
• Currently sends quadrature decoder data as well as the compass heading
• Computer can query the MCU at any time (one byte per query message)
• MCU responds with the query message (useful as a sanity check) followed by sensor data (see the sketch below)
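A minimal host-side sketch of the one-byte query exchange, assuming a POSIX serial port. The device path, baud rate, query byte value, and reply length are not given on the slides and are placeholders; only the "send one byte, read the echoed query byte followed by sensor data" shape comes from the description above.

```c
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyS0", O_RDWR | O_NOCTTY);   /* assumed port */
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B9600);                          /* assumed baud */
    cfsetospeed(&tio, B9600);
    tcsetattr(fd, TCSANOW, &tio);

    unsigned char query = 0x51;                        /* hypothetical query code */
    write(fd, &query, 1);

    unsigned char reply[16];
    ssize_t n = read(fd, reply, sizeof reply);         /* a real reader would loop
                                                          until the full frame arrives */
    if (n > 0 && reply[0] == query)                    /* echoed query = sanity check */
        printf("got %zd bytes of sensor data\n", n - 1);

    close(fd);
    return 0;
}
```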

Motor Controller

• One MC9S12DT256 MCU dedicated to receiving speed/steering commands from the computer (serial interface) and controlling the drive motors
• Motors controlled by emulating PWM signals from the Futaba RC module
• Multiplexer (controlled by the MCU) allows the MCU to choose whether the motors are controlled by the remote control or the computer
• MCU reverts control to the remote control when a command has not been received in the last second (see the sketch below)
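A minimal sketch of the one-second failover rule, written as portable C rather than HC12 firmware. mux_select() and the command source are hypothetical stand-ins; only the rule "no computer command in the last second, hand the motors back to the RC receiver" comes from the slide.

```c
#include <stdbool.h>
#include <stdio.h>
#include <time.h>
#include <unistd.h>

enum mux_source { MUX_COMPUTER, MUX_REMOTE };

/* Hypothetical stand-in for driving the multiplexer's select line. */
static void mux_select(enum mux_source src)
{
    printf("mux -> %s\n", src == MUX_COMPUTER ? "computer" : "remote");
}

int main(void)
{
    time_t last_cmd = time(NULL);
    enum mux_source active = MUX_COMPUTER;
    mux_select(active);

    for (int tick = 0; tick < 5; ++tick) {   /* stand-in for the MCU main loop */
        bool got_command = false;            /* would be set by the serial receive code */
        if (got_command)
            last_cmd = time(NULL);

        enum mux_source want =
            (time(NULL) - last_cmd > 1) ? MUX_REMOTE : MUX_COMPUTER;
        if (want != active) {
            active = want;
            mux_select(active);
        }
        sleep(1);
    }
    return 0;
}
```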

Turn Radius Testing

• Used to understand how the system would react to speed/steering combinations
• Automated test: sent a command continuously and obtained wheel displacements to find the turn radius
• Results used in the pure-pursuit algorithm

R = L_A / (d_o / d_i - 1)

R = turn radius, L_A = axle length, d_o and d_i = outer and inner wheel displacements (worked example below)
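A worked example of the relation above with made-up numbers, included only to show the arithmetic; the axle length and displacements are not measurements from the testing.

```c
#include <stdio.h>

/* R = L_A / (d_o/d_i - 1) */
static double turn_radius(double axle_cm, double d_outer, double d_inner)
{
    return axle_cm / (d_outer / d_inner - 1.0);
}

int main(void)
{
    double axle = 70.0;                /* assumed axle length, cm           */
    double d_o = 250.0, d_i = 200.0;   /* assumed outer/inner displacements */

    /* 70 / (250/200 - 1) = 70 / 0.25 = 280 cm turn radius */
    printf("R = %.1f cm\n", turn_radius(axle, d_o, d_i));
    return 0;
}
```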

Ideally, each plot would reach a vertical asymptote at a steering command of 5.

Turn Radius Testing

[Plot: Turn Radius (cm) vs. Steering Command]

Imaging

Logitech QuickCam-3000 Pro key features:
• 640 x 480 streaming video
• YUV color palette
• Worked with the Video4Linux 2 (V4L2) API
• Exceptional Linux driver support and documentation
• Currently the V4L2 API only supports frame grabbing of video (see the capture sketch below)
• Camera used the Philips web camera drivers (allowed fine tuning of contrast, brightness, color, hue, and whiteness)
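A minimal sketch of grabbing one frame through the V4L2 read() interface, which the Philips (pwc) driver supports. The device node and the choice of YUV420 are assumptions, and error handling is trimmed to keep the sketch short.

```c
#include <fcntl.h>
#include <linux/videodev2.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);              /* assumed device node */
    if (fd < 0) { perror("open"); return 1; }

    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof fmt);
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width       = 640;
    fmt.fmt.pix.height      = 480;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUV420;     /* planar YUV palette */
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return 1; }

    unsigned char *frame = malloc(fmt.fmt.pix.sizeimage);
    ssize_t n = read(fd, frame, fmt.fmt.pix.sizeimage); /* one grabbed frame */
    printf("grabbed %zd bytes\n", n);

    free(frame);
    close(fd);
    return 0;
}
```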

Grass Detection

Grass has two components that can be exploited for detection:
• Texture
• Color channel ratios

Algorithm:
• Bias the image with varying degrees of blue for portions that don't exhibit grass texture
• Pixelate the image into 40 x 40 blocks (down-sample)
• Analyze block colors for grass color characteristics (RGB values) (see the sketch below)

Works best with high-resolution images, but still successful with web camera quality. Highly dependent on fine-grained image detail.
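A minimal sketch of the block-analysis step under the 40 x 40 grid described above: average each block and flag blocks whose color ratios look like grass. The green-dominance thresholds are invented placeholders, not the team's tuned values, and the frame here is just a zeroed stub.

```c
#include <stdio.h>
#include <string.h>

#define W 640
#define H 480
#define GRID 40                       /* 40 x 40 grid -> 1600 blocks   */

static unsigned char img[H][W][3];    /* packed RGB frame (stub data)  */

/* Placeholder test: green channel dominates red and blue. */
static int looks_like_grass(double r, double g, double b)
{
    return g > 1.1 * r && g > 1.1 * b;    /* hypothetical thresholds */
}

int main(void)
{
    memset(img, 0, sizeof img);           /* stand-in for a captured frame */
    int bw = W / GRID, bh = H / GRID, grassy = 0;

    for (int by = 0; by < GRID; ++by)
        for (int bx = 0; bx < GRID; ++bx) {
            double r = 0, g = 0, b = 0;
            for (int y = by * bh; y < (by + 1) * bh; ++y)
                for (int x = bx * bw; x < (bx + 1) * bw; ++x) {
                    r += img[y][x][0];
                    g += img[y][x][1];
                    b += img[y][x][2];
                }
            double n = (double)bw * bh;   /* pixels per block */
            if (looks_like_grass(r / n, g / n, b / n))
                ++grassy;
        }

    printf("%d of %d blocks classified as grass\n", grassy, GRID * GRID);
    return 0;
}
```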

Important Finding

Problem:
• RGB-palette web cameras didn't work
• When used outdoors, the sun is too intense for RGB palette cameras
• Images become saturated, and color is lost

Solution:
• A YUV-palette camera was used
• Separating out the luminance channel corrects this problem
• Essential to the success of outdoor imaging

Acquired overhead:
• An additional 921,600 calculations had to be performed for conversion of YUV to RGB (see the sketch below)
• A large 50-megabyte lookup table was created to help performance
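For reference, 640 x 480 x 3 = 921,600, which matches the figure above if one conversion is counted per output channel per pixel. Below is a minimal sketch of a per-pixel YUV-to-RGB conversion using the common BT.601 formulas; the slides don't give the team's exact coefficients or lookup-table layout, so both are assumptions here.

```c
#include <stdio.h>

static unsigned char clamp(double v)
{
    if (v < 0)   return 0;
    if (v > 255) return 255;
    return (unsigned char)v;
}

/* One pixel's YUV sample to RGB, BT.601-style coefficients (assumed). */
static void yuv_to_rgb(unsigned char y, unsigned char u, unsigned char v,
                       unsigned char *r, unsigned char *g, unsigned char *b)
{
    double cu = u - 128.0, cv = v - 128.0;
    *r = clamp(y + 1.402 * cv);
    *g = clamp(y - 0.344 * cu - 0.714 * cv);
    *b = clamp(y + 1.772 * cu);
}

int main(void)
{
    unsigned char r, g, b;
    yuv_to_rgb(120, 100, 90, &r, &g, &b);     /* example sample */
    printf("RGB = (%u, %u, %u)\n", r, g, b);
    return 0;
}
```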

Imaging Phases

[Figure: original capture, biased image, and pixelated processed image]

The image was broken up into a 40 x 40 grid of 1600 blocks, the same blocks used during the pixelation phase.

Path Planning

• A script was written to generate paths depending on the overlaid colors
• Each path is scored and gauged for acceptable traversal

Results

Green Image Test

• Do green objects appear as grass?
• The texture phase corrects this problem, and does a relatively good job biasing out objects

Positioning

• RF trilateration: non-uniform radiation pattern
• Custom relative GPS solution: low accuracy (2-5 meters)
• RTKNav software with DGPS receivers: high accuracy, very expensive
• Surveyor-grade DGPS unit: good accuracy, relatively expensive

Trimble Pathfinder Pro XR

• Everest multi-path rejection
• Incorporates DGPS corrections
• < 50 cm real-time accuracy
• NMEA-0183 output format

Relative Positioning Implementation

• Connect to serial port
• Extract NMEA sentence from stream
• Parse GGA sentence for latitude/longitude and fix information (see the parsing sketch below)
• Capture initial latitude/longitude
• Compute distance and angle between initial and subsequent latitude/longitude pairs
• Decompose distance into x-y coordinates
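A minimal sketch of the GGA parsing step, assuming a standard NMEA-0183 GGA sentence. The sample sentence is fabricated, and a real reader would also verify the NMEA checksum; only the fields used above (latitude, longitude, fix quality) are extracted.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* NMEA encodes latitude as ddmm.mmmm and longitude as dddmm.mmmm. */
static double nmea_to_degrees(const char *field, char hemi)
{
    double v = atof(field);
    double deg = (double)((int)(v / 100.0));
    double dec = deg + (v - deg * 100.0) / 60.0;
    return (hemi == 'S' || hemi == 'W') ? -dec : dec;
}

int main(void)
{
    char gga[] = "$GPGGA,123519,4305.1234,N,07740.5678,W,2,07,1.1,160.2,M,-34.0,M,,";
    char *f[15] = {0};
    int   i = 0;

    for (char *tok = strtok(gga, ","); tok && i < 15; tok = strtok(NULL, ","))
        f[i++] = tok;

    double lat = nmea_to_degrees(f[2], f[3][0]);
    double lon = nmea_to_degrees(f[4], f[5][0]);
    int    fix = atoi(f[6]);                   /* 2 = DGPS fix */

    printf("lat %.6f  lon %.6f  fix %d\n", lat, lon, fix);
    return 0;
}
```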

Haversine Formula & Angle Calculation
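The slide names the haversine formula and an angle calculation without showing them; the sketch below gives the standard haversine distance, the standard initial-bearing formula, and the decomposition into x-y offsets. The Earth-radius constant, the east/north axis convention, and the sample coordinates are assumptions.

```c
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define EARTH_RADIUS_M 6371000.0
#define DEG2RAD(d) ((d) * M_PI / 180.0)

/* Great-circle distance (m) between two lat/lon points, haversine form. */
static double haversine_m(double lat1, double lon1, double lat2, double lon2)
{
    double dlat = DEG2RAD(lat2 - lat1), dlon = DEG2RAD(lon2 - lon1);
    double a = sin(dlat / 2) * sin(dlat / 2) +
               cos(DEG2RAD(lat1)) * cos(DEG2RAD(lat2)) *
               sin(dlon / 2) * sin(dlon / 2);
    return 2.0 * EARTH_RADIUS_M * atan2(sqrt(a), sqrt(1.0 - a));
}

/* Initial bearing (radians, clockwise from north) from point 1 to point 2. */
static double bearing_rad(double lat1, double lon1, double lat2, double lon2)
{
    double dlon = DEG2RAD(lon2 - lon1);
    double y = sin(dlon) * cos(DEG2RAD(lat2));
    double x = cos(DEG2RAD(lat1)) * sin(DEG2RAD(lat2)) -
               sin(DEG2RAD(lat1)) * cos(DEG2RAD(lat2)) * cos(dlon);
    return atan2(y, x);
}

int main(void)
{
    double lat0 = 43.0850, lon0 = -77.6760;    /* captured initial fix (example) */
    double lat1 = 43.0851, lon1 = -77.6758;    /* subsequent fix (example)       */

    double d  = haversine_m(lat0, lon0, lat1, lon1);
    double th = bearing_rad(lat0, lon0, lat1, lon1);

    /* Decompose into east (x) and north (y) offsets from the initial fix. */
    printf("d = %.2f m, x = %.2f m, y = %.2f m\n", d, d * sin(th), d * cos(th));
    return 0;
}
```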

GPS Data Acquisition

• Traveled a "square" path at a 5° angle
• Incorporates variation in both x and y directions
• Collected GPS data points at 1 Hz
• GPS data was converted into x and y coordinates on the fly

Accuracy Determination

• Determine the equation of the line with a 5° angle and a hypotenuse of 15.25 meters (50 feet)
• For example: y = (cos(5°)/sin(5°)) * x
• Compute the absolute difference between the specific equation of the line and the actual x-y coordinate (see the sketch below)
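The "absolute difference" step above can be read as the vertical gap between each collected point and the reference line y = (cos 5° / sin 5°) * x; the sketch below shows that reading with made-up sample points. Whether the team measured the gap vertically or perpendicularly is not stated on the slides.

```c
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

int main(void)
{
    double ang   = 5.0 * M_PI / 180.0;
    double slope = cos(ang) / sin(ang);            /* reference line y = slope * x */

    double pts[][2] = { {0.50, 5.60}, {1.00, 11.20}, {1.30, 15.10} };  /* example data, m */
    for (int i = 0; i < 3; ++i) {
        double err = fabs(slope * pts[i][0] - pts[i][1]);
        printf("x = %.2f m  |y_line - y| = %.3f m\n", pts[i][0], err);
    }
    return 0;
}
```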

Accuracy Results

• Horizontal accuracy is better than vertical
  • Spec: horizontal accuracy < 50 cm
  • Spec: vertical accuracy < 1 m
• Max error is < 50 cm for all directions

Artificial Intelligence

• Coordinates all mower functions

• Written entirely in C

• Multi-threaded

• Utilizes thread priorities

• Built for expansion

Path Planner

• Obtains position information from wheel encoders and GPS
• Records and replays a path
• Uses either the pure pursuit or the carrot method to follow the path (see the sketch below)

[Figure: carrot vs. pure pursuit path following]
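Neither algorithm is spelled out on the slides; below is a minimal sketch of one pure-pursuit steering step for a differential-drive base, projecting the goal point into the vehicle frame and turning the resulting arc curvature into left/right wheel speeds. The lookahead distance, track width, speed, and pose values are invented for illustration; the carrot method (steering directly at the goal point) is the simpler alternative named on the slide.

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    /* Vehicle pose: position (m) and heading (rad, 0 = +x axis). */
    double px = 0.0, py = 0.0, heading = 0.0;

    /* Goal point on the recorded path, roughly one lookahead away. */
    double gx = 2.0, gy = 0.5;
    double lookahead = sqrt((gx - px) * (gx - px) + (gy - py) * (gy - py));

    /* Goal point's lateral offset in the vehicle frame (x forward, y left). */
    double dx = gx - px, dy = gy - py;
    double y_local = -sin(heading) * dx + cos(heading) * dy;

    /* Pure pursuit: curvature of the arc that reaches the goal point. */
    double curvature = 2.0 * y_local / (lookahead * lookahead);

    /* Differential drive: split the commanded speed across the two wheels. */
    double speed = 0.5, track = 0.7;             /* m/s and m, assumed */
    double v_left  = speed * (1.0 - curvature * track / 2.0);
    double v_right = speed * (1.0 + curvature * track / 2.0);

    printf("curvature %.3f 1/m  ->  vL %.3f m/s, vR %.3f m/s\n",
           curvature, v_left, v_right);
    return 0;
}
```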

User Input

• Control movement through key commands
• Can signal the path planner to record or replay
• Provides access for debugging and monitoring
• Has access to all other thread controls

Command Arbiter

• Receives speed and angle votes from User Input, Imaging, and the Path Planner
• Combines all votes and makes a decision
• Sends the winning speed and angle to the motor controller
• Easily expandable to accept any number of voting modules

Command Fusion

• Check each module for new votes
• Normalize
• Linear decay
• Weighted sum
• Judge votes
• Send winning votes to motor control (see the sketch below)
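The slides list the fusion steps without code; the sketch below strings them together for a discretized set of steering commands. The module weights, decay horizon, and vote values are placeholders, and the same scheme would be repeated for the speed votes.

```c
#include <stdio.h>

#define N_CMDS    11     /* candidate steering commands 0..10  */
#define N_MODULES 3      /* user input, imaging, path planner  */

/* Normalize one module's votes so they sum to 1. */
static void normalize(double v[N_CMDS])
{
    double sum = 0.0;
    for (int i = 0; i < N_CMDS; ++i) sum += v[i];
    if (sum > 0.0)
        for (int i = 0; i < N_CMDS; ++i) v[i] /= sum;
}

int main(void)
{
    /* Votes: each row is one module's preference over the candidates. */
    double votes[N_MODULES][N_CMDS] = {
        {0, 0, 0, 0, 0, 4, 2, 0, 0, 0, 0},   /* user input   */
        {0, 0, 0, 1, 3, 3, 1, 0, 0, 0, 0},   /* imaging      */
        {0, 0, 0, 0, 2, 5, 2, 0, 0, 0, 0},   /* path planner */
    };
    double weight[N_MODULES] = {1.0, 0.6, 0.8};   /* placeholder weights     */
    double age_s[N_MODULES]  = {0.2, 0.0, 0.5};   /* seconds since last vote */
    double horizon_s = 1.0;                       /* votes fade out by 1 s   */

    double fused[N_CMDS] = {0};
    for (int m = 0; m < N_MODULES; ++m) {
        normalize(votes[m]);
        double decay = 1.0 - age_s[m] / horizon_s;    /* linear decay with age */
        if (decay < 0.0) decay = 0.0;
        for (int i = 0; i < N_CMDS; ++i)
            fused[i] += weight[m] * decay * votes[m][i];
    }

    /* Judge the fused votes: highest-scoring candidate wins. */
    int winner = 0;
    for (int i = 1; i < N_CMDS; ++i)
        if (fused[i] > fused[winner]) winner = i;

    printf("winning steering command: %d\n", winner);   /* sent to motor control */
    return 0;
}
```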

GUI

•Written in Java

•Connects to mower through TCP/IP

•Displays location, sensor, imaging, and vote information

•Currently used for debugging


Difficulties Encountered

• Countless motor problems

• Pure pursuit oscillation

• Multiple types of wireless interference

• Compass interference

• Uneven travel

• Limited battery life

Future Work

•Addition of SONAR obstacle avoidance

•Interactive GUI

•Sensor fusion using Kalman Filter

•Safety sensors

Questions?
