
Autonomous Navigation and Mapping Using Monocular Low-Resolution Grayscale Vision
VIDYA MURALI AND STAN BIRCHFIELD

CLEMSON UNIVERSITY

ABSTRACT

An algorithm is proposed to answer the challenges of autonomous corridor navigation and mapping by a mobile robot equipped with a single forward-facing camera. Using a combination of corridor ceiling lights, visual homing, and entropy, the robot is able to perform straight-line navigation down the center of an unknown corridor. Turning at the end of a corridor is accomplished using Jeffrey divergence and time-to-collision, while deflection from dead ends and blank walls uses a scalar entropy measure of the entire image. When combined, these metrics allow the robot to navigate in both textured and untextured environments. The robot can autonomously explore an unknown indoor environment, recovering from difficult situations like corners, blank walls, and an initial heading toward a wall. While exploring, the algorithm constructs a Voronoi-based topo-geometric map with nodes representing distinctive places like doors, water fountains, and other corridors. Because the algorithm is based entirely upon low-resolution (32 x 24) grayscale images, processing occurs at over 1000 frames per second.

ALGORITHM OVERVIEW

[Flowchart: navigate the corridor using ceiling lights, entropy, and homing against a stored home image; detect the end of the corridor using the Jeffrey divergence (J) and the time-to-collision (TTC); turn, then resume corridor navigation.]
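Every step of the pipeline operates on 32 x 24 grayscale images. Below is a minimal sketch of that preprocessing step, assuming frames arrive as 480 x 640 RGB numpy arrays so that each output pixel is a 20 x 20 block average; the camera resolution and the downsampling method are assumptions, not taken from the poster.

```python
import numpy as np

def to_low_res_gray(frame):
    """Reduce an RGB camera frame to the 32x24 grayscale image the algorithm
    works on.  Assumes a 480x640x3 uint8 frame, so each output pixel is the
    mean of a 20x20 block of graylevels."""
    gray = frame.astype(np.float32).mean(axis=2)            # simple luminance
    h, w = gray.shape                                       # 480, 640
    low = gray.reshape(24, h // 24, 32, w // 32).mean(axis=(1, 3))
    return low.astype(np.uint8)                             # 24 rows x 32 columns
```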

The mobile robot's navigational behavior is modeled by a set of paradigms that work in conjunction to correct its path in an indoor environment, each based on a different image metric.

Special emphasis is placed on low-resolution images for computational efficiency, and on metrics that capture information content and variety that cannot be represented by traditional sparse features and methods.

The resulting algorithm enables end-to-end navigation in unstructured indoor environments, with self-directed decision-making at corridor ends, without the use of any prior information or a map.

The system also forms the basis of an autonomous mapping system built on the same low-resolution metrics.

EXPERIMENTAL RESULTS

Results are shown for autonomous navigation in three complete corridors of Riggs Hall, Clemson University. The robot exhibits tropism toward ceiling lights: it turns at corridor ends and continues by searching for lights, with the search guided by entropy.

[Figure: entropy used to avoid blank walls.]

[Figure: entropy used to detect open corridors at a T-junction.]
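A minimal sketch of the scalar image entropy used in both situations above, assuming the standard Shannon entropy of the image's graylevel histogram; the bin count is an assumption, since the poster does not specify it.

```python
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (in bits) of the graylevel histogram of `img`.
    A low value indicates an uninformative scene such as a blank wall."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist[hist > 0].astype(np.float64)  # drop empty bins
    p /= p.sum()                           # normalize to a probability distribution
    return float(-(p * np.log2(p)).sum())
```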

END OF THE CORRIDOR

Time-to-collision (TTC): the time it would take the viewed surface to reach the camera's center of projection, estimated from G and E_t, the spatial and temporal image brightness derivatives respectively.

Relative entropy: the symmetric Jeffrey divergence J(p, q) is computed between two image graylevel histograms p and q. This metric measures how different the current scene is from a given image.
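Minimal sketches of the two end-of-corridor metrics follow. The Jeffrey divergence below uses the common form with the average histogram m = (p + q)/2, and the time-to-collision uses the classical least-squares brightness-derivative estimate with G = x*E_x + y*E_y, giving TTC = -sum(G^2) / sum(G*E_t); both are standard formulations assumed here, and the exact forms on the poster may differ.

```python
import numpy as np

def jeffrey_divergence(p, q, eps=1e-12):
    """Symmetric Jeffrey divergence between two graylevel histograms:
    J(p, q) = sum_i p_i*log(p_i/m_i) + q_i*log(q_i/m_i), with m = (p + q)/2."""
    p = np.asarray(p, dtype=np.float64)
    q = np.asarray(q, dtype=np.float64)
    p = p / (p.sum() + eps)
    q = q / (q.sum() + eps)
    m = 0.5 * (p + q)
    return float((p * np.log((p + eps) / (m + eps))
                  + q * np.log((q + eps) / (m + eps))).sum())

def time_to_collision(prev, curr):
    """Least-squares time-to-collision (in frames) for translation toward a
    roughly fronto-parallel surface: G*(1/TTC) + E_t = 0 with G = x*E_x + y*E_y,
    so TTC = -sum(G^2) / sum(G*E_t)."""
    prev = prev.astype(np.float64)
    curr = curr.astype(np.float64)
    Ey, Ex = np.gradient(curr)                      # spatial derivatives
    Et = curr - prev                                # temporal derivative
    h, w = curr.shape
    ys, xs = np.mgrid[0:h, 0:w]
    G = (xs - (w - 1) / 2.0) * Ex + (ys - (h - 1) / 2.0) * Ey
    denom = (G * Et).sum()
    return float(-(G * G).sum() / denom) if denom != 0 else float("inf")
```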

AUTONOMOUS MAPPING

Two landmark measures are recorded along the route: X, the image entropy, and Y, the Jeffrey divergence between consecutive images. Their joint probability distribution Pxy(X, Y) therefore represents distinctiveness. The regional maxima of Pxy(X, Y) give a topological set of landmarks, which are superimposed on the navigation path to yield a Voronoi-based map of the environment, where links represent the collision-free path and nodes represent the left/right landmarks (doors, water fountains, adjoining corridors).
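A minimal sketch of how such landmarks could be extracted from the measures recorded along a route: estimate Pxy with a joint histogram of (X, Y), find its regional maxima, and keep the frames that fall in those bins. The binning, the use of scipy's maximum_filter for regional maxima, and the mapping from maxima back to frames are assumptions for illustration; the poster does not spell these steps out.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def landmark_frames(entropy_seq, jd_seq, bins=16):
    """Indices of frames whose (entropy, Jeffrey divergence) pair falls in a
    regional maximum of the joint histogram estimating Pxy(X, Y)."""
    X = np.asarray(entropy_seq, dtype=np.float64)
    Y = np.asarray(jd_seq, dtype=np.float64)

    # Joint histogram as a crude estimate of Pxy(X, Y).
    P, xedges, yedges = np.histogram2d(X, Y, bins=bins)
    P /= P.sum()

    # Regional maxima: non-empty bins equal to the max of their 3x3 neighborhood.
    peaks = (P == maximum_filter(P, size=3)) & (P > 0)

    # Keep the frames that land in a peak bin.
    xi = np.clip(np.digitize(X, xedges) - 1, 0, bins - 1)
    yi = np.clip(np.digitize(Y, yedges) - 1, 0, bins - 1)
    return [i for i in range(len(X)) if peaks[xi[i], yi[i]]]
```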

AUTONOMOUS DRIVING DOWN THE CORRIDOR

[Flowchart, summarized here as decision rules; a code sketch of the same rules follows.]

Autonomous driving: if ceiling lights are visible and the image entropy exceeds H_low, the robot centers itself on the ceiling lights; otherwise it corrects its heading by homing on the stored home image.

Detecting the end of the corridor: the end is declared when the Jeffrey divergence exceeds J_th and the time-to-collision falls below T_min, or when the entropy drops below H_low (a blank wall ahead).

Action at corridor end: the robot turns in place and resumes corridor driving as soon as ceiling lights become visible, or when the entropy rises above H_high, indicating an open corridor.
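A minimal, self-contained sketch of these decision rules as pure functions of the per-frame metrics. The threshold names mirror the flowchart (H_low, H_high, J_th, T_min); their numeric values and the return conventions are assumptions for illustration, not values from the poster.

```python
# Illustrative thresholds only; the poster does not give numeric values.
H_LOW, H_HIGH = 4.0, 5.5          # entropy thresholds (bits)
J_TH, T_MIN = 1.0, 30.0           # Jeffrey-divergence / time-to-collision thresholds

def driving_mode(lights_visible, entropy):
    """Down the corridor: center on ceiling lights when they are visible and
    the scene is informative enough; otherwise home on the stored image."""
    return "center_on_lights" if (lights_visible and entropy > H_LOW) else "homing"

def end_of_corridor(jeffrey_div, ttc, entropy):
    """End detected when the scene differs strongly from the home image and a
    collision is imminent, or when the view degenerates to a blank wall."""
    return (jeffrey_div > J_TH and ttc < T_MIN) or entropy < H_LOW

def keep_turning(lights_visible, entropy):
    """At a corridor end, keep turning until lights reappear or the entropy
    rises enough to indicate an open corridor."""
    return not (lights_visible or entropy > H_HIGH)
```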