

Robotics at HMC: Summer 2008

Devin Smith '09, Hannah Hoersting '09, Lesia Bilitchenko '10 (Cal Poly Pomona), Sabreen Lakhani '11, Becky Green '11, Pam Strom '11, Kate Burgers '11, Zeke Koziol '10, Elaine Shaver '09, Peter Mawhorter '08, and Zachary Dodds

Students at Harvey Mudd have the opportunity to engage in robotics projects and research throughout their time at HMC. In the summer of 2008, for example, three different projects involving first-year, sophomore, junior, and senior students resulted in:

Three novel, low-cost robot platforms leveraging off-the-shelf hardware for both indoor (Qwerkbot, Create) and outdoor (PowerWheels) autonomy

Mapping implementations (FastSLAM) accessible enough to run on any platform with landmark-recognition capabilities, e.g., Scribbler and Create

An investigation of the capabilities of image profiles as a basis for vision-only robot autonomy, including odometry, control, and mapping

A successful entry into 2008's AAAI robot exhibition (Chicago; July 2008) and a (possibly successful) entry into the Tapia robotics contest (Portland; April 2009).

This work was made possible by funds from NSF DUE #0536173, the Institute for Personal Robots in Education (IPRE), the Baker Foundation, and funds and resources from Harvey Mudd College.

The Tapia 2009 Robotics Competition

AAAI '08 Robot Exhibition: Mapping for All

Our platforms

The Association for the Advancement of Artificial Intelligence (AAAI) sponsors one of the longest-running robot venues in the world. Elaine, Peter, and Zeke exhibited at AAAI this summer; their map-building atop three low-cost platforms earned the blue ribbon for education. The robots scale smoothly across pedagogical levels: they engaged middle-school girls who visited the event (below), and they will be used in HMC's Fall '08 seminar for incoming students, Robotics: hardware and software.

Acknowledgments

The power of image profiles

Closing the loop: mapping via image profiles

The vision: Vision-only vehicles

Indoor mapping platforms at HMC and an example run of FastSLAM's map-updating:

Do image profiles suffice to build consistent maps of the environment? Hannah Hoersting '09 and Lesia Bilitchenko '10 designed, implemented, and tested algorithms for matching locations through their image profiles. These matches, in turn, enable large-scale correction of the inevitable drift in incremental odometric data. Multiple-hypothesis tracking allows future observations to correct ambiguous past data. The result is a topologically consistent map over which shortest-path graph algorithms suffice for navigation.
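A minimal sketch of those two steps, with illustrative names and a made-up match threshold rather than the students' actual implementation: strongly correlated profiles link two places as one, and a shortest-path search then plans routes over the resulting topological map.

```python
# Sketch only: profile-based loop closing plus shortest-path planning.
import heapq
import numpy as np

def profile_similarity(p, q):
    """Normalized correlation between two 1-d image profiles."""
    p = (p - p.mean()) / (p.std() + 1e-9)
    q = (q - q.mean()) / (q.std() + 1e-9)
    return float(np.dot(p, q)) / len(p)

def close_loops(profiles, edges, threshold=0.9):
    """Link any two non-adjacent places whose profiles match closely."""
    for i in range(len(profiles)):
        for j in range(i + 2, len(profiles)):   # skip trivial neighbors
            if profile_similarity(profiles[i], profiles[j]) > threshold:
                edges.setdefault(i, {})[j] = 0.0   # treat as one place
                edges.setdefault(j, {})[i] = 0.0
    return edges

def shortest_path(edges, start, goal):
    """Dijkstra over the topological map; returns a list of place ids."""
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for nxt, w in edges.get(node, {}).items():
            if nxt not in visited:
                heapq.heappush(frontier, (cost + w, nxt, path + [nxt]))
    return None
```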

A team of four first-year students designed, built, and tested an iRobot Create-based entry for the upcoming Tapia robotics competition, held in Portland, OR in April 2009. The task is a search for "survivors" in a partially unknown environment. Their solution merges hardware and software through several new systems: an OpenCV-based landmark recognition system, a Python GUI to facilitate testing, and a Java mapping interface, all integrated into a state machine that guides and governs landmark-finding and returns the robot to the start entirely autonomously. How will it do? We'll see…
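For concreteness, here is one way such a governing state machine might be organized, in Python and with wholly hypothetical robot helpers; the states come from the behaviors named in the run snapshots below (wall-following, centering and approaching a marker, a parallel-park adjustment), not from the team's actual code.

```python
# Sketch only: a control-loop state machine for the marker search.
WALL_FOLLOW, APPROACH_MARKER, ADJUST, RETURN_HOME = range(4)

def step(state, robot, found, total_markers):
    """Run one control-loop iteration and return the next state."""
    if state == WALL_FOLLOW:
        robot.follow_wall()                 # hypothetical helper
        if robot.sees_marker():
            return APPROACH_MARKER
        if len(found) == total_markers:
            return RETURN_HOME              # all survivors located
    elif state == APPROACH_MARKER:
        robot.center_and_approach()         # hypothetical helper
        if robot.at_marker():
            found.add(robot.marker_id())
            return ADJUST
    elif state == ADJUST:
        robot.parallel_park_adjust()        # hypothetical helper
        return WALL_FOLLOW                  # marker ID'ed; revert
    elif state == RETURN_HOME:
        robot.navigate_to_start()           # drive back autonomously
    return state
```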

Visual input offers substantial promise as a robotic sensor -- indeed, there is too much information captured by the 2d images of a scene: disentangling the contributions of lighting, object characteristics, and 3d geometry is challenging, to say the least.

HMC earned the AAAI '08 blue ribbon for education among the 17 robot exhibits. Middle-school visitors to the conference tried out the robots.

The AAAI venue offers HMC students a chance to deeply consider AI. HMC alums now study AI, vision, and robotics at CMU, UCSD, UW, UCLA, Utah, Oregon State, and Duke.

An overhead map built from OpenCV images taken by a Create robot

Cameras are cheap and available sensors, and natural agents (us!) offer a tantalizing vision: autonomous navigation and spatial reasoning using visual data alone. Each of the platforms investigated at HMC uses pixels as its primary source of information about its surroundings.

2008's projects extend earlier HMC work on the foundations of vision-only vehicles. 2007's projects included revamping the OpenCV library's support for importing and exporting images on Mac OS X; three-dimensional reconstruction from image sequences; and deploying robots atop these capabilities.

We look forward to layering further capabilities atop this year's student work during the summer of 2009 and beyond.

Raw visual odometry (top), en route (left) to a corrected map (right), through profile matching

Indoor

Students have built, modified, and programmed several platforms for indoor exploration. At left are images of landmark-based mapping on an iRobot Create via an onboard laptop. Using the OLPC platform (left) yields a remarkably accessible One Robot Per Child. The Scribbler/Fluke combination (below, left) is even more cost-effective, while the Qwerk-based robot offers an array of sonars and camera-panning capabilities.

Map-building

With vision onboard, these systems build maps by integrating landmark sightings. FastSLAM uses both Kalman and particle filters in order to cull possibilities from a population of hypotheses. Such survival-of-the-fittest approaches can adapt to the computational resources currently available.
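A minimal sketch of FastSLAM's structure, with assumed names, noise values, and world-frame sightings standing in for the real implementation: each particle carries a pose hypothesis plus one small Kalman filter per landmark, and resampling culls the hypotheses that disagree with the observations.

```python
# Sketch only: the particle-plus-Kalman skeleton of FastSLAM.
import copy
import numpy as np

class Particle:
    def __init__(self, pose):
        self.pose = np.array(pose, dtype=float)   # (x, y, heading)
        self.landmarks = {}     # landmark id -> (mean, covariance)
        self.weight = 1.0

def fastslam_update(particles, motion, sightings, meas_var=0.1):
    """One step: propagate poses, Kalman-update landmarks, resample.

    `sightings` holds (landmark_id, observed_xy) pairs, assumed to be
    in world coordinates for brevity."""
    for p in particles:
        # 1. Sample a new pose from a noisy motion model.
        p.pose += motion + np.random.normal(0.0, 0.02, size=3)
        for lid, obs in sightings:
            obs = np.asarray(obs, dtype=float)
            if lid not in p.landmarks:
                p.landmarks[lid] = (obs, np.eye(2))   # first sighting
                continue
            # 2. Standard Kalman update for a known landmark.
            mean, cov = p.landmarks[lid]
            S = cov + np.eye(2) * meas_var     # innovation covariance
            K = cov @ np.linalg.inv(S)         # Kalman gain
            innov = obs - mean
            p.landmarks[lid] = (mean + K @ innov, (np.eye(2) - K) @ cov)
            # 3. Down-weight particles whose map disagrees with the data.
            p.weight *= float(np.exp(-0.5 * innov @ np.linalg.inv(S) @ innov))
    # 4. Resample in proportion to weight: survival of the fittest.
    w = np.array([p.weight for p in particles])
    chosen = np.random.choice(len(particles), size=len(particles), p=w / w.sum())
    survivors = [copy.deepcopy(particles[i]) for i in chosen]
    for p in survivors:
        p.weight = 1.0
    return survivors
```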

Outdoor

HMC has investigated accessible outdoor robotics via several PowerWheels vehicles. Equipped with cameras for path following, GPS for position tracking, and sonars for avoiding obstacles, these low-cost platforms highlight both the engineering and computational capabilities of HMC students. With these we run a local version of Penn State Abington's Mini Grand Challenge.

Devin Smith '09 set out to investigate the limits of image profiles, feature vectors of pixel sums. Matching these sums yields video-only odometry and enables a camera-based virtual compass. Intra-image differences suggest interesting destinations and can also control velocity and avoid walls, resulting in a vision-only system with full autonomy. Larger-scale mapping requires matching locations. That effort progressed in parallel in our REU:

Snapshots of real-time compass and map updating, along with the profile-based interest operator choosing the robot's next heading to be the hallway's end.
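One way to realize such a profile-based virtual compass is sketched below; the column-sum profile, the shift-search window, and the degrees-per-pixel constant are assumptions for illustration, not details of Devin's system.

```python
# Sketch only: estimate heading change from shifted image profiles.
import numpy as np

def image_profile(gray):
    """Sum the pixel intensities down each column of a grayscale frame."""
    return gray.sum(axis=0).astype(float)

def best_shift(prev, curr, max_shift=40):
    """Horizontal shift (pixels) minimizing the profiles' squared error."""
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        a = prev[max(0, s):len(prev) + min(0, s)]
        b = curr[max(0, -s):len(curr) + min(0, -s)]
        err = float(np.mean((a - b) ** 2))
        if err < best_err:
            best, best_err = s, err
    return best

DEGREES_PER_PIXEL = 0.25   # assumed camera calibration constant

def heading_change(prev_gray, curr_gray):
    """Rotation between frames, via the best-matching profile shift."""
    shift = best_shift(image_profile(prev_gray), image_profile(curr_gray))
    return shift * DEGREES_PER_PIXEL
```

Accumulating these per-frame changes over a video stream is what turns profile matching into a compass and, combined with a translation estimate, into video-only odometry.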

HMC's entry in the 2009 Tapia robotics competition, center. Other images show the robot's landmark recognition and the team's mapping interface, as well as the project in action at HMC and inaction at Panera.



On the left, the red circle represents the robot's (badly incorrect) odometry and the green circles show the estimated true positions among the mapped landmarks, as the One-Robot-Per-Child platform navigates among the landmarks at right.

Map-correction captions: the raw odometry from a four-loop run; after matching one segment; after matching all segments.

Tapia-run captions: centering and approaching a new marker; parallel-park adjustment; marker ID'ed, reverting to wall-following.

Elaine Shaver '09 guiding middle-school visitors to AAAI as they problem-solve with HMC's accessible platforms.

The plots below compare our visual odometry (right) with RatSLAM's original, at left. Our approach better estimates sharp corners and overall topology, as the robot did, in fact, intersect its path where shown:

The yellow curve is an image profile, the intensity sum of the image's pixels; the red curve measures the absolute changes in that yellow profile, offering a heuristic for identifying "interesting" headings for the robot to pursue.
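Those two curves are simple to compute; here is a small sketch, assuming a grayscale frame stored as a 2-d numpy array and illustrative function names:

```python
# Sketch only: the profile and its change-based interest operator.
import numpy as np

def interest_operator(gray):
    """Return the column-sum profile and its absolute differences."""
    profile = gray.sum(axis=0).astype(float)   # the "yellow" curve
    changes = np.abs(np.diff(profile))         # the "red" curve
    return profile, changes

def most_interesting_column(gray):
    """Column with the largest profile change, e.g., a hallway's end."""
    _, changes = interest_operator(gray)
    return int(np.argmax(changes))
```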

15 video frames and about 15° separate these images, creating a visual compass from profiles.

An image from leg 6… matching another image from leg 10.

GUI legend: robot, markers, map state, sonars.