
IEEE Robotics & Automation Magazine, December 2009. Digital Object Identifier: 10.1109/MRA.2009.934818

Achievements in Space Robotics

Expanding the Horizons of Service and Exploration

By Kazuya Yoshida

Outer space is an ultimate field for the application of robotics technology. As outer space is a harsh environment with extreme temperatures, vacuum, radiation, gravity, and great distances, human access is very difficult and hazardous and is therefore limited. To assist human activities in space for constructing and maintaining space modules and structures, robotic manipulators have been playing essential roles in orbital operations. Moreover, expanding the horizons of exploration beyond the areas of human access, robots that land and travel on planetary surfaces have been greatly contributing to our knowledge of the solar system. New challenges are expected in the future.

This article consists of three parts. In the first part, the achievements of orbital robotics technology in the last decade are reviewed, highlighting the Engineering Test Satellite (ETS-VII) and Orbital Express flight demonstrations. In the second part, some selected topics of planetary robotics are described from the field robotics research point of view. Recent achievements in the author's laboratory are added as an illustrative example. Finally, technological challenges of asteroid robotics are discussed. When designing a robot to explore the surface of an asteroid, microgravity raises an interesting problem of how to stick to and move on the surface. Some ideas to address these questions are introduced.

Orbital Robotics

Ideas to assemble space structures with a flying robot in orbit or to conduct servicing missions to existing satellites have been discussed since the 1980s. Figure 1 is an illustration describing the robotic assembly of a space structure, which was published in a National Aeronautics and Space Administration (NASA) report in 1983 [1]. One attractive scenario was to build a space station using teleoperated or autonomous free-flying robots. In reality, the current International Space Station (ISS) has not been constructed by such free-flying robots but by many hours of human extravehicular activity (EVA) with the assistance of the Shuttle Remote Manipulator System (SRMS, Canadarm) and the Space Station Remote Manipulator System (SSRMS, Canadarm2).

Another important scenario for an orbital free-flying space robot is to retrieve and dock with an existing satellite in orbit and then conduct servicing tasks. Such tasks include replacing components, resupplying expendables, refueling propellants, and repairing, rescuing, and reorbiting the satellite. These servicing scenarios have been receiving researchers' attention because of their potential for flexible operation of satellites, including the extension of mission life and orbital transfer at the end of life.

Space Debris

Space debris has become a growing concern in recent years. Collisions at orbital velocities can be highly damaging to functional satellites and can also produce even more space debris. Space debris is now a serious hazard to astronauts in the ISS; therefore, the ISS is armored to mitigate damage from collisions with small (less than 1 cm) debris, and it occasionally makes collision-avoidance maneuvers for larger (more than 10 cm) debris. Recently, there was a major collision between the deactivated Kosmos 2251 satellite and the operational Iridium 33 satellite on 10 February 2009. The relative speed of impact was more than 10 km/s, and both satellites were destroyed, creating and scattering a considerable amount of new debris. Since that event, the Space Shuttle and the ISS have had increased numbers of warnings and avoidance maneuvers.

To mitigate the generation of additional debris, various measures have been proposed. One straightforward idea is to remove satellites from orbit at the end of their operational life. Ideally, each satellite would conduct a controlled deorbit at the end of its life, but this is sometimes impractical. Rolling out an electrodynamic tether from a spacecraft is one idea to pull down the orbit. Another, more promising, idea is robotic reorbiting (as opposed to deorbiting). Robotic reorbiting can slightly pull down the orbit of satellites in low Earth orbit (LEO) and can also push satellites in geosynchronous Earth orbit (GEO) up to a so-called graveyard orbit. It can also change the orbital plane if the robot has enough propulsion capability. Furthermore, a satellite that happens to be in an unexpected orbit due to an anomaly of a launch vehicle or an orbital transfer system could be rescued by transferring it into a proper orbit.

Although the importance of such retrieval, reorbit, or rescue missions has been well understood for more than a couple of decades, the opportunities for in-flight technology demonstration are limited. The ETS-VII mission (Orihime and Hikoboshi), conducted by the National Space Development Agency of Japan (NASDA) in 1997–1999 [2]–[19], and the Orbital Express mission, conducted by the Defense Advanced Research Projects Agency (DARPA) and Boeing in 2007 [20], [21], are two outstanding examples.

Engineering Test Satellite

ETS-VII (Figure 2) is an unmanned spacecraft equipped with a 2-m-long, six-degree-of-freedom (6-DoF) manipulator arm. It was developed and launched by NASDA (now JAXA) in November 1997. The mission objective of ETS-VII was to test free-flying robotics technology and demonstrate its utility in unmanned orbital operation and servicing tasks. The mission consisted of two subtasks: autonomous rendezvous/docking (RVD) and robot (RBT) experiments.

For the RVD experiments, ETS-VII was separated into two satellites in orbit. The larger piece was named Hikoboshi (a prince in a classical Japanese tale) and served as the chaser. The smaller piece was named Orihime (a princess in the tale) and acted as the target. Both R-bar and V-bar approach and docking scenarios were conducted successfully using the Global Positioning System (GPS) [3], a rendezvous laser radar (RVR) [4], a vision-based proximity sensor (PXS), and onboard autonomy [5]. The know-how gained in sensor-based rendezvous control (particularly for the R-bar approach) has been applied to the guidance and control system of the H-II Transfer Vehicle (HTV) for unmanned cargo-supply missions to the ISS [6].

The RBT experiments using the onboard 6-DoF manipulator arm were conducted over a two-year period by many organizations, including NASDA [2], [7]–[9], the National Aerospace Laboratory (NAL) [10], the Electrotechnical Laboratory (ETL) [11], the Communications Research Laboratory (CRL) [12], the European Space Agency (ESA) [13], Deutsches Zentrum für Luft- und Raumfahrt (DLR, the German aerospace center) [14], Tohoku University [15]–[17], Tokyo Institute of Technology [18], and Kyoto University [19].

Figure 1. A free-flying space robot (telerobotic servicer) discussed in the Automation, Robotics, and Machine Intelligence Systems (ARAMIS) report [1].

Figure 2. The Japanese Engineering Test Satellite (ETS-VII): (a) Orihime and (b) Hikoboshi. (Courtesy of Space Robotics Lab, Tohoku University.)

The experiments include the following items.

1) Remote surface observation of the satellite was conducted using video cameras attached at the shoulder and hand of the manipulator arm.

2) Exchange of an orbital replacement unit (ORU) was conducted using the manipulator arm under teleoperation from the ground (Tsukuba Space Center, Japan). The teleoperation was conducted via the Tracking and Data Relay Satellite (TDRS) in GEO. ETS-VII orbits in LEO at an altitude of 550 km. The direct communication time window from a single ground station is limited, and each communication period is about 10 min at most. But a TDRS link ensures communication on every orbit and extends the window length up to 40 min. However, it requires a much longer path for signal transmission, and so the round-trip communication time delay was 5–7 s.

3) The ORU has connectors for electrical connection and fuel transfer. Signal transmission and fuel (water) transfer between the main satellite and the ORU were successfully demonstrated.

4) The main satellite is equipped with a task board to test the dexterity of the manipulator arm. Using the task board, the compliance and force controls of the arm were successfully demonstrated to follow a curved surface and insert a peg in a hole by onboard feedback control, based on a force/torque sensor at the wrist of the arm. Capture of a floating ball was also successful using onboard vision.

5) To assist the ground-based teleoperation with 5–7 s of time delay, two major approaches were tested. One is a model-based assistance system that predicts the motion of the arm and displays the predicted motion in real time. In this approach, not only the kinematic display but also virtual force reflection to the operator through a haptic device was verified to be effective in surface-following and peg-in-hole tasks [15]. The other is bilateral force feedback control. With proper selection of the force feedback gains, stable force feedback control was successfully achieved despite such a large time delay and was confirmed to be more effective than conventional move-and-wait teleoperation (or unilateral force control) in surface-following and peg-in-hole tasks [19].

6) Practical tasks such as handling a flexible wire and a solar cell sheet, mating and demating of an electric connector, fastening a bolt, extracting a deployable truss mechanism, connecting truss joints, assembling predefined components, and tightening the latches were also successfully demonstrated via teleoperation and/or autonomously [10]–[12].

7) For a free-flying space robot, dynamic coupling between the manipulator reaction and the base satellite attitude is also an important issue. To address this problem, two approaches were tested. One is coordinated control, in which the manipulator reaction is fed forward to the attitude control system [8]. The other is reactionless manipulation, in which the manipulator arm moves along a specific path that produces no reaction on the base. Both methods were successfully demonstrated [16], [17]. (A simplified momentum sketch illustrating this coupling is given after this list.)

8) Demate and mate (berthing) operation of the target satellite Orihime was conducted by the manipulator arm. For this and the following experiments, a handrail type of fixture was attached on Orihime together with an optical cue marker.

9) Autonomous chasing and capturing operation of Orihime was conducted. For safety reasons, Orihime was freely floating inside the open space made by partially released docking mechanisms (mechanical jaws). Onboard visual feedback tracking was conducted using the optical marker on Orihime, and the autonomous capture of the floating target was successful [9].
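To give a concrete feel for the dynamic coupling in item 7, the following is a minimal sketch that treats the manipulator as a single body rotating coaxially with the base, like a reaction wheel, and applies conservation of angular momentum. This is an illustration only, not the ETS-VII flight algorithm, and the inertia values are arbitrary assumed numbers.

    # Minimal sketch (not the ETS-VII flight code): angular-momentum bookkeeping
    # for a free-floating satellite with a single coaxial rotating appendage.
    # Assumptions: zero initial angular momentum, rotation about one axis,
    # no translational coupling; inertia values are arbitrary illustrative numbers.
    import math

    I_BASE = 800.0   # base satellite inertia about the common axis [kg m^2]
    I_ARM = 50.0     # effective arm inertia about the same axis [kg m^2]

    def base_rate_from_joint_rate(joint_rate):
        """Base angular rate induced by a joint rate [rad/s], from
        I_BASE*w_base + I_ARM*(w_base + joint_rate) = 0."""
        return -I_ARM * joint_rate / (I_BASE + I_ARM)

    def base_drift_angle(joint_angle):
        """Base attitude change after the joint sweeps joint_angle [rad]."""
        return -I_ARM * joint_angle / (I_BASE + I_ARM)

    sweep = math.radians(90.0)  # the arm sweeps 90 degrees
    print("base attitude drift [deg]:", math.degrees(base_drift_angle(sweep)))
    # Coordinated control feeds the predicted reaction forward to the attitude
    # control system; reactionless manipulation instead chooses arm paths whose
    # net momentum exchange with the base is zero.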

Orbital Express

The Orbital Express is a DARPA program developed to validate the technical feasibility of a safe and cost-effective approach to autonomous satellite servicing in orbit [20]. The system consists of an Autonomous Space Transport Robotic Operations (ASTRO) vehicle, developed by Boeing Integrated Defense Systems, and a prototype modular next-generation serviceable satellite, NextSat, developed by Ball Aerospace & Technologies Corp. The ASTRO vehicle is equipped with a 3-m-long robotic arm to perform satellite capture and ORU exchange operations (Figure 3). The arm was developed by MacDonald, Dettwiler and Associates Ltd., Canada.

After its launch in March 2007, various mission scenarios were conducted until early July. The demonstrated technologies and scenarios are similar to those of ETS-VII, such as items 1), 2), 3), 8), and 9) listed earlier, but with greater emphasis on autonomy [21]. As a remarkable advancement, Orbital Express conducted a sequence of practical subtask scenarios completely autonomously. First, ASTRO was separated from NextSat to a distance of 7 km; ASTRO then returned and performed a forced-motion fly-around inspection. After that, ASTRO approached within 1 m of NextSat, reached out with the robotic arm, grappled NextSat, and then berthed with it. After successful mating, propellant transfer and battery and computer ORU transfers were performed.

In both missions, autonomous docking and manipulator capturing/berthing were conducted with a stabilized and cooperative target. Here, cooperative means that the target is equipped with a dedicated fixture to be grasped securely and optical marks to be detected by the chaser. However, in practical cases, unstabilized or noncooperative targets may need to be handled, which requires a higher level of technology.

Figure 3. Orbital Express: (a) ASTRO and (b) NextSat. (Courtesy of Boeing/DARPA.)

For the issue of the cooperative target fixture, one feasible idea that has been studied for many years is to hold the nozzle cone of an apogee kick motor engine, which is a common structure in GEO satellites [22], [23]. On the other hand, if the target is in a tumbling motion, the problem becomes more difficult. First of all, the chaser arm must carefully follow the moving target. If there is residual relative velocity between the grappling point and the gripper at the time of contact, impact forces will occur, which can dramatically change the motion of the tumbling target. Also, sophisticated control is necessary to accommodate the angular momentum of the target even after a successful capture. This requires comprehensive momentum management throughout the entire process, from approach to postcontact stabilization [24], [25].
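As a rough feel for why the residual relative velocity matters, the following sketch computes the inertial velocity of a grapple fixture on a tumbling target as v = omega x r and compares it with an assumed allowable contact speed. The spin rate, fixture offset, and threshold are hypothetical numbers for illustration, not values from [24] or [25].

    # Minimal sketch: velocity of a grapple point on a tumbling target, v = w x r.
    # The chaser's end effector must track this point so that the relative
    # velocity at contact (and hence the contact impulse) stays small.
    # Numbers are hypothetical, not from any flight mission.
    import numpy as np

    omega = np.radians([0.0, 1.0, 3.0])     # target angular velocity [rad/s]
    r_fixture = np.array([0.8, 0.0, 1.2])   # grapple fixture position from target COM [m]

    v_grapple = np.cross(omega, r_fixture)  # inertial velocity of the grapple point
    speed = np.linalg.norm(v_grapple)

    allowable = 0.05  # assumed allowable residual contact speed [m/s]
    print(f"grapple point speed: {speed:.3f} m/s")
    print("active tracking required before contact:", speed > allowable)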

Planetary Robotics

For the exploration of the moon and other planets, robots have been contributing to expanding the frontiers of scientific knowledge and human access. The first robot to travel on the surface of an extraterrestrial body was Lunokhod (1970), developed by the former Soviet Union. It was remotely operated from Earth and traversed more than 10.5 km on the moon. The following Lunokhod-2 (1973) was also successful, with 37 km of teleoperated traversal. The Lunar Roving Vehicle (LRV), or moon buggy, was used in NASA's Apollo program (Apollo 15, 16, and 17, during 1971–1972). The moon buggy was an electrically driven four-wheel cart that could carry two astronauts and be driven manually like a golf cart. It was useful for expanding the area of human expedition from the landing sites.

As for the exploration of Mars, the first successful landers were NASA's Viking 1 and 2 (landed in 1976). Although they were static landers, they had a robotic arm to collect soil samples and conduct in situ analysis. A recent mission, NASA's Phoenix lander, was also successful in landing in the Martian arctic region. It is equipped with a 2.4-m-long, 4-DoF manipulator arm that can carry out dexterous tasks to interact with the terrain, such as digging, scraping, and sample acquisition [26]. In situ analysis of the soil confirmed the present existence of water ice and strongly suggested a possibly warmer and water-rich climate in the past.

As for mobile robots (rovers) on Mars, the Sojourner rover in the Mars Pathfinder mission (1997) and Spirit and Opportunity in the Mars Exploration Rover (MER) mission (2004–2009, see Figure 4) have had remarkable success. The benefits of mobility in remote exploration missions have been strongly highlighted in these missions, with rich scientific returns. ESA's ExoMars should be added as a planned rover mission. From a robotics technology point of view, the interesting issues are the design of mobility mechanisms and the algorithms for navigation control in natural rough terrain. In particular, the wheel slip and traction issues in a loose soil environment were highlighted by Opportunity during the exploration of Meridiani Planum. In late April 2005, Opportunity got stuck in a soft sand dune (named Purgatory Dune), and due to significant wheel slip, it took many weeks until it finally got back onto firm ground in early June 2005 [27]. Wheel slippage also degrades the accuracy of odometric measurement of the vehicle, and improved methods for robot odometry have been developed.

The Phoenix and ExoMars missions are elaborated elsewhere in this issue. This article provides a short review of wheeled robots for surface locomotion, with highlights on the technologies for environment mapping, odometric measurement, and slip and traction control.

Mapping

Topographic mapping of the environment is the first step before a rover starts the traverse of an unknown field. The MERs mount two stereo imaging systems on a camera bar on the rover mast. One is a panoramic camera (Pancam) dedicated to the mapping of medium-to-far objects in panoramic images. The other is a navigation camera (Navcam) with a best focus at 1 m and a field of view (FoV) of 45°. These stereo cameras were successfully used to obtain detailed three-dimensional (3-D) maps around the rover, and a number of techniques have been developed and applied to integrate multiple patches of maps and construct augmented maps of the environment traversed by the rover [28]–[30].

On the other hand, in the mobile robotics research community, direct measurement of ranging data using a laser range finder (LRF) or laser imaging detection and ranging (LIDAR) is becoming a common method as these hardware devices improve in ranging distance and measurement reliability. Mechanically moving parts, such as scanning mechanisms for the laser spot, have been considered less robust against launch vibrations and landing shocks. In addition, mass and power requirements and limitations of computational speed are issues in current space-qualified LIDARs. Recently, compact and durable commercial LIDAR products have been developed and are used in various ground applications. For example, SICK laser sensing systems were used as a primary sensor in the successful vehicles that completed the DARPA Grand Challenge 2005 mission of high-speed rough-terrain traversal [31]. As a compact example, the Hokuyo URG (UTM-30LX) sensor weighs 370 g and consumes 8.5 W, yet has a 30-m ranging capability in outdoor use. As flight processors improve, LIDARs will become a desirable option for space rover applications.

Figure 4. MER. (Courtesy of NASA/JPL.)

Simultaneous localization and mapping (SLAM) is a technique by which a mobile robot integrates its onboard sensor data to build a map of the environment around it. It is defined as the problem of building a model that leads to a map and repetitively improving it while keeping track of the current location (localization) of the robot in that map. The issues of mapping and localization are coupled to each other, and onboard sensor data contain inherent errors. Therefore, SLAM uses iterative mathematical solutions or statistical strategies to obtain maximum-likelihood estimates of the map and the localization simultaneously. Today, there are a large number of good tutorials (e.g., [32] and [33]), papers, and technical reports; therefore, a detailed review is not given here.
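The joint nature of the estimate can be made concrete with a toy one-dimensional example. In the sketch below, a robot moves along a line and measures the range to a single landmark, and a linear Kalman filter jointly estimates the robot and landmark positions. This is purely illustrative and far simpler than the nonlinear 3-D formulations cited above; all numbers are made up.

    # Toy 1-D SLAM with a linear Kalman filter: the state stacks the robot
    # position and one landmark position, so mapping and localization are
    # estimated jointly. Illustrative only; real rover SLAM is 3-D and nonlinear.
    import numpy as np

    rng = np.random.default_rng(0)

    x_true_robot, x_true_lm = 0.0, 10.0
    mu = np.array([0.0, 8.0])              # initial guess (landmark poorly known)
    P = np.diag([0.01, 16.0])              # initial covariance
    F = np.eye(2)
    Q = np.diag([0.05, 0.0])               # process noise (robot motion only)
    H = np.array([[-1.0, 1.0]])            # measurement model: z = x_lm - x_robot
    R = np.array([[0.25]])                 # range measurement noise

    for step in range(20):
        u = 0.5                                            # commanded forward motion
        x_true_robot += u + rng.normal(0, np.sqrt(Q[0, 0]))
        z = (x_true_lm - x_true_robot) + rng.normal(0, np.sqrt(R[0, 0]))

        # Prediction: apply the motion command to the robot part of the state.
        mu = F @ mu + np.array([u, 0.0])
        P = F @ P @ F.T + Q

        # Update: one range observation corrects robot and landmark together.
        y = z - H @ mu
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        mu = mu + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P

    print("estimated robot, landmark:", mu, " true:", x_true_robot, x_true_lm)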

An integrated demonstration for autonomous navigation of rover test beds with sensing, mapping, and localization was conducted at the International Conference on Robotics and Automation (ICRA) 2009 Robot Competition, Planetary Exploration Challenge, in Kobe, Japan.

Figure 5(a) depicts a rover test bed, El Dorado-II, developed by Tohoku University, in the competition field. The robot is equipped with a scanning laser range sensor (Hokuyo UTM-30LX), inertial sensors, and wheel encoders. Robots are to start from the top of the lander, go down the ramp onto the ground, travel to the goal position called the science station, come back to the lander, and finally climb up the ramp. The navigation field is a 10 × 10 m square area filled with gravel and scattered obstacles. The relative position of the goal with respect to the starting point is known in advance, but no a priori information about the obstacles is given. The robots are expected to navigate the field by detecting the obstacles and building a map completely autonomously. The El Dorado-II rover conducted all the required tasks successfully. Figure 5(b) depicts the digital elevation map of the competition field built by El Dorado-II at the end of the challenge mission, together with the trace of the rover position.

The scan matching or iterative closest point (ICP) algorithm [34] is one of the most useful methods to superimpose multiple geometric data sets, improve the accuracy, and expand the area of the map of the environment. The ICP algorithm was originally developed as a versatile method for the registration of 3-D shapes in computer graphics applications, but it has been widely adopted by the mobile robotics community to construct geometric maps of the environment. Today, a large number of successful applications are reported for mobile robots that explore two-dimensional (2-D) environments, where a flat floor is inherently assumed; in that case, only the x and y positions and the yaw angle of the robot need to be estimated. In 3-D rough terrain, however, the x, y, and z positions and the roll, pitch, and yaw angles (6-DoF coordinates) of the rover must be estimated simultaneously. The method then requires heavy mathematical computation that grows rapidly with the degrees of freedom, can take impractical time to converge, and, to make matters worse, is prone to falling into local minima (false matching). So far, only a limited number of examples have been reported successful in practical 3-D applications. One approach to improve the performance of the data association is to utilize discriminatory features such as plane segments in the environment model, but this idea is effective only for structured environments [35]. Another idea is a loop-closing method, with which the accumulated errors of sequential scan matching can be corrected (or compromised) by checking the global consistency of the map, but this idea is effective only when a rover makes a loop or round-trip type of navigation [36].
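For readers unfamiliar with ICP, the following is a bare-bones 2-D point-to-point variant: alternate nearest-neighbor data association with a closed-form SVD alignment until the transform converges. It is an illustrative sketch only, with brute-force matching and none of the robustness measures a rover mapping system would need.

    # Bare-bones 2-D point-to-point ICP (illustrative sketch, not flight software):
    # alternately (1) match each source point to its nearest target point and
    # (2) solve the best-fit rotation/translation in closed form via SVD.
    import numpy as np

    def best_fit_transform(src, dst):
        """Least-squares rigid transform (R, t) mapping src onto dst (N x 2 arrays)."""
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against a reflection solution
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = dst_c - R @ src_c
        return R, t

    def icp(source, target, iterations=30, tol=1e-6):
        src = source.copy()
        prev_err = np.inf
        R_total, t_total = np.eye(2), np.zeros(2)
        for _ in range(iterations):
            # Brute-force nearest-neighbor association (fine for small clouds).
            d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
            nn = target[d.argmin(axis=1)]
            R, t = best_fit_transform(src, nn)
            src = src @ R.T + t
            R_total, t_total = R @ R_total, R @ t_total + t
            err = np.mean(np.linalg.norm(src - nn, axis=1))
            if abs(prev_err - err) < tol:
                break
            prev_err = err
        return R_total, t_total

    # Usage: align a rotated, shifted copy of a synthetic scan to the original.
    target = np.random.default_rng(1).uniform(0, 5, size=(200, 2))
    theta = np.radians(10.0)
    R_true = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
    source = (target - [0.3, 0.1]) @ R_true
    R_est, t_est = icp(source, target)
    print("estimated rotation [deg]:", np.degrees(np.arctan2(R_est[1, 0], R_est[0, 0])))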

In the ICRA Robot Competition, El Dorado-II used the ICP algorithm to build the entire map of the field along with its navigation paths. The environment was intended to be 3-D terrain, but the average elevation of the gravel field was almost flat; therefore, this was a relatively easy example for applying 3-D SLAM technology. No loop-closing method was used. The author believes that an effective fusion technique combining inertial sensors, odometric measurement (elaborated in the following subsection), and 3-D scan matching will be the key for further challenges with much more difficult, completely unstructured terrains.

Odometry

Wheel slippage is a critical issue for mobile robots driving across loose soil, such as dry sand dunes and the like. It greatly affects the traction performance and energy consumption and leads to gradual deviation of the vehicle from the intended path, possibly resulting in large drift and poor performance of the localization and control systems. For example, the conventional dead-reckoning technique is largely compromised, since it is based on the assumption that wheel revolutions can be translated into corresponding linear displacements. Thus, if one wheel slips, the associated encoder will register revolutions even though these revolutions do not correspond to a linear displacement of the wheel. Conversely, if one wheel skids, fewer encoder pulses will be counted.
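This effect can be reproduced with a few lines of arithmetic. The sketch below is a toy differential-drive example with made-up parameters, not rover telemetry: it integrates a pose from encoder rates and shows how a constant slip on one side biases the estimate away from the true straight-line path.

    # Toy differential-drive dead reckoning: encoder-based pose integration and
    # how wheel slip corrupts it. All numbers are made up for illustration.
    import math

    WHEEL_RADIUS = 0.10   # m
    TRACK_WIDTH = 0.50    # m (distance between left and right wheels)
    DT = 0.1              # s

    def integrate(encoder_rates, slip_left=0.0, slip_right=0.0):
        """Integrate (x, y, yaw) from wheel angular rates [rad/s].
        slip_* is the fraction of wheel rotation that produces no ground motion."""
        x = y = yaw = 0.0
        for w_l, w_r in encoder_rates:
            # Encoders see w_l, w_r; the ground only sees the non-slipping part.
            v_l = WHEEL_RADIUS * w_l * (1.0 - slip_left)
            v_r = WHEEL_RADIUS * w_r * (1.0 - slip_right)
            v = 0.5 * (v_l + v_r)
            yaw_rate = (v_r - v_l) / TRACK_WIDTH
            x += v * math.cos(yaw) * DT
            y += v * math.sin(yaw) * DT
            yaw += yaw_rate * DT
        return x, y, math.degrees(yaw)

    # Commanded straight drive: both wheels at 5 rad/s for 20 s.
    rates = [(5.0, 5.0)] * 200
    print("odometry estimate (no slip assumed):", integrate(rates))
    print("actual motion with 30% slip on left:", integrate(rates, slip_left=0.3))
    # Dead reckoning reports a straight 10-m run, while the real vehicle
    # drives a shorter, curved path -- the drift described in the text.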

Figure 5. (a) El Dorado-II, developed by Tohoku University, in the demo field of the ICRA 2009 Planetary Exploration Challenge. (b) A map of the whole environment of the challenge field (10 × 10 m) and the round-trip navigation path executed by the rover. (Courtesy of Space Robotics Lab, Tohoku University.)


To cope with such problems of conventional dead reckoning, methods based on visual motion estimation have been proposed and studied [37]. This vision-based approach can be divided into two broad categories: landmark-based and optical-flow-based methods. The former methods recognize either natural or artificial landmarks in the environment and then infer the position of the robot, usually by triangulation. The latter approaches estimate the differences between successive images by extracting optical flow vectors, which allow changes in the robot's pose to be evaluated. Notable examples of this category include visual odometry methods, which have been developed and demonstrated for rough-terrain robots and planetary rovers. The purpose of visual odometry is to incrementally estimate the motion of a mobile robot by detecting and tracking interest-point features or image patterns over subsequent images.

An earlier successful example of visual odometry used an omnidirectional camera [38]. Based on the optical flow extraction, the x and y positions and yaw angle of the rover were successfully estimated. As a more advanced success, the MER rovers Spirit and Opportunity apply the visual odometry technique in their autonomous navigation. In these rovers, the stereo navigation cameras (Navcam) were used. First, image features are extracted, and the features' 3-D positions are computed by stereo matching. As the rover moves a short distance, the 3-D positions of the corresponding features are tracked and then put into kinematic equations to obtain estimates of the incremental change in the 6-DoF coordinates of the rover's pose. The method has proven effective for navigation of the rovers on Mars. However, because of limited onboard computational power, the implemented procedure required an average of nearly 3 min of computation time to update the rover's coordinates. Also, the onboard inertial measurement unit (IMU) exhibited a very small drift rate and therefore maintained attitude knowledge very well. Hence, visual odometry was typically used only to update the rover position [30].

In a recent study, the author's group at Tohoku University used a facedown camera that looks only at the ground surface. Optical flow vectors are extracted based on the texture of the ground (gravel or even sand grains) and averaged to estimate the translational velocity only. The 3-D orientation of the rover is measured by inertial sensors (an IMU). The rover's locomotion trace is then estimated by integrating the translational velocity along with the orientation obtained from the IMU. Experiments in various test fields have validated that the method provides good accuracy in 6-DoF localization [39].
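The geometry behind this facedown-camera idea can be sketched as follows; it is an illustration of the principle only, with hypothetical camera parameters, and is much simpler than the actual implementation in [39]. For a nadir-pointing pinhole camera at height h above locally flat ground, a mean pixel flow of du per frame corresponds to a ground displacement of roughly du*h/f, which is rotated into the world frame using the IMU attitude and integrated over time.

    # Sketch of facedown-camera velocity estimation (principle only; hypothetical
    # numbers). A nadir-pointing pinhole camera at height h sees ground texture
    # move by du pixels per frame; the ground displacement is ~ du * h / f.
    import numpy as np

    F_PIX = 700.0    # focal length in pixels (assumed calibration)
    HEIGHT = 0.45    # camera height above the ground [m]
    DT = 1.0 / 30.0  # frame period [s]

    def body_velocity(mean_flow_px):
        """Translational velocity in the camera/body frame from the mean optical
        flow (du, dv) in pixels per frame, assuming flat ground below the camera."""
        du, dv = mean_flow_px
        return np.array([du, dv]) * HEIGHT / (F_PIX * DT)   # [m/s]

    def integrate_track(flows, yaws):
        """Integrate a 2-D ground track from per-frame mean flows and IMU yaw [rad]."""
        pos = np.zeros(2)
        track = [pos.copy()]
        for flow, yaw in zip(flows, yaws):
            v_body = body_velocity(flow)
            c, s = np.cos(yaw), np.sin(yaw)
            v_world = np.array([[c, -s], [s, c]]) @ v_body   # rotate by IMU heading
            pos = pos + v_world * DT
            track.append(pos.copy())
        return np.array(track)

    # Example: 3 s of constant 6-pixel flow while slowly turning by 20 degrees.
    n = 90
    flows = [(6.0, 0.0)] * n
    yaws = np.linspace(0.0, np.radians(20.0), n)
    print("end of estimated track [m]:", integrate_track(flows, yaws)[-1])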

Slip Compensation and Traction Control

Lunar/planetary rovers that travel over sand dune-like terrain raise an interesting problem in wheel slippage and traction control. Mathematical models are constructed for the combination of a rigid wheel and deformable soil. This contrasts markedly with the models for conventional on-road motor vehicles, which are based on the combination of a rigid, paved surface and a deformable rubber tire. Well-known models were developed by Bekker [40] and Wong [41] and have been applied in various control schemes [42]. This article will not go into more detail on this issue, but the author would like to point out a simple yet important idea for wheeled robots to avoid critical slip and getting stuck in loose soil. The idea is to control the rotational velocity of each wheel so that the slip ratios of all the wheels are equal [43].

In a recent article from the author's group, both the longitudinal and lateral forces of a wheel for different slip ratios and slip angles were studied and successfully applied to traction and steering control to follow a given trajectory on soft sand slopes [39]. The visual odometry system using a facedown camera, introduced earlier, provides the slip ratios and slip angles of all wheels in real time; hence, online feedback control to compensate for the wheel slippage was implemented and validated in practice.
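A common definition of the driving slip ratio, and the simple compensation rule suggested by the equal-slip idea, can be written down directly. The sketch below is the author's illustration with an assumed wheel radius and target value; it is not the exact controller of [39] or [43].

    # Sketch of slip-ratio bookkeeping and a simple compensation rule (assumed
    # wheel radius and target slip; not the exact controller of [39] or [43]).

    WHEEL_RADIUS = 0.12  # m

    def slip_ratio(omega, v_ground):
        """Driving slip ratio s = (r*w - v) / (r*w); 0 = pure rolling, 1 = spinning in place."""
        circumferential = WHEEL_RADIUS * omega
        if circumferential <= 0.0:
            return 0.0
        return (circumferential - v_ground) / circumferential

    def wheel_rate_command(v_ground, target_slip):
        """Wheel angular rate that realizes the target slip ratio at the measured
        ground velocity; keeping the same target on all wheels equalizes slip."""
        return v_ground / (WHEEL_RADIUS * (1.0 - target_slip))

    # Example: visual odometry reports 0.20 m/s while a wheel spins at 2.5 rad/s.
    v = 0.20
    w = 2.5
    print("current slip ratio:", round(slip_ratio(w, v), 3))
    print("commanded rate for 10% slip [rad/s]:", round(wheel_rate_command(v, 0.10), 3))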

Asteroid Probes and Robotics

Asteroids and comets, which are referred to as minor planetary bodies, are also attractive destinations of robotic probes for scientific discovery. Examples of previous missions are as follows: 1) ESA's Giotto mission conducted a flyby observation of comet 1P/Halley in 1986 and unveiled the shape of the comet's nucleus; 2) NASA's NEAR-Shoemaker mission conducted a detailed orbital observation of asteroid 433 Eros in 1999–2000; 3) NASA's Stardust was launched in 1999, conducted a flyby of comet 81P/Wild 2 and captured particles from the comet's tail in 2004, and then brought them back to Earth in 2006; and 4) for the study of the interior composition of comet 9P/Tempel 1, NASA's Deep Impact probe successfully impacted the comet's nucleus, and a large amount of dust was then ejected from the surface of the nucleus.

Ongoing missions include Hayabusa and Rosetta. Hayabusa is a Japanese probe designed to visit asteroid 25143 Itokawa, collect material samples from its surface, and then bring them back to Earth. As of July 2009, the probe is on its way back to Earth. Rosetta is an ESA spacecraft mission launched in 2004 and intended to study the comet 67P/Churyumov-Gerasimenko. It consists of two main elements: the Rosetta space probe and the Philae lander. Philae is scheduled to land on the surface of the nucleus in 2014. Two harpoons will be fired into the comet to prevent the lander from bouncing off, and additional drills will be used to further secure the lander to the comet. Philae will then conduct an in situ analysis of the nucleus.

In this article, the Japanese Hayabusa mission is spotlighted, with a discussion of a tiny surface rover as a challenging design option.

Hayabusa and Minerva

Hayabusa was successfully launched in 2003 and encountered Itokawa in 2005. Detailed scientific observation was conducted, including imaging (see Figure 6) and mapping of the surface and identification of the gravitational field [44]. On 19 and 25 November 2005 (UT), Hayabusa conducted sample acquisition from the asteroid surface. Because of the extremely small size (just about 550 m along the longest axis) and small mass of the asteroid, the gravitational acceleration at Itokawa's surface is on the order of 10^-5 G. Such a small gravitational field makes a stable stay on the surface very difficult for the probe; it can easily jump off from the surface. For this reason, Hayabusa adopted a touch-and-go sampling sequence. The contact duration was estimated to be about 1–2 s, and in such a short period, a high-speed projectile is fired into the surface so that ejecta are collected into a sample container [45]–[47].
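A back-of-the-envelope calculation shows why touch-and-go was necessary. The sketch below treats Itokawa as a sphere with approximate literature values for its mass and mean radius (assumed figures inserted for illustration; the article itself quotes only the ~550 m length); surface gravity and escape velocity come out around 10^-5 G and a few tens of centimeters per second.

    # Back-of-the-envelope surface gravity and escape velocity for Itokawa,
    # treated as a sphere. Mass and mean radius are approximate literature
    # values used only for illustration.
    G = 6.674e-11         # gravitational constant [m^3 kg^-1 s^-2]
    G0 = 9.81             # terrestrial gravity [m/s^2]

    M_ITOKAWA = 3.5e10    # approximate mass [kg]
    R_MEAN = 170.0        # approximate mean radius [m]

    g_surface = G * M_ITOKAWA / R_MEAN**2
    v_escape = (2 * G * M_ITOKAWA / R_MEAN) ** 0.5

    print(f"surface gravity : {g_surface:.1e} m/s^2  (~{g_surface / G0:.0e} G)")
    print(f"escape velocity : {v_escape * 100:.0f} cm/s")
    # Any residual vertical velocity of more than a few cm/s after touchdown
    # would send the spacecraft (or a small rover) back off the surface,
    # which is why a 1-2 s touch-and-go sequence was adopted.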


Hayabusa carried a tiny minispacecraft (weighing only 590 g and measuring approximately 10 cm tall × 12 cm in diameter) named Minerva (see Figure 7). It was designed to take advantage of Itokawa's very low gravity by using an internal flywheel assembly to hop across the surface of the asteroid, obtaining images from its cameras and sending them back to Hayabusa whenever the two spacecraft were in sight of one another [48].

Minerva was deployed on 12 November 2005. The lander release command was sent from the Earth, but before the command could arrive, Hayabusa's altimeter measured its distance from Itokawa to be 44 m and thus started an automatic altitude-keeping sequence. As a result, Minerva was released while the probe was ascending and at a higher altitude than intended, so that it escaped Itokawa's gravitational pull. It did not reach the surface of Itokawa. If it had been successful, Minerva could have traveled on the microgravity surface by hopping and tumbling using the reaction wheel in its body.

Surface Mobility in Different Gravitational Fields

The design of Minerva raised an interesting problem of the locomotion mechanism in a small gravitational field. Figure 8 summarizes typical designs of surface locomotion systems. As evidenced by terrestrial motor vehicles and moon/Mars rovers, a wheel traction system is the best option in the terrestrial gravitational field or down to one tenth of it. But note that the smaller the gravitational reaction forces on the wheels, the smaller the traction forces.

In smaller gravitational fields, a Nanorover type of design can be advantageous. A Nanorover is a 1-kg-scale small robot equipped with four wheels at the ends of swingable struts. It was developed by NASA/Jet Propulsion Laboratory (JPL) and was intensely discussed as a candidate optional payload on Hayabusa, but development of the flight model was canceled for budgetary reasons. In its design principle, a Nanorover swings the struts down onto the surface while rotating the wheels. Such downswing actions produce forces that push the wheels onto the surface and then yield forces tangent to the surface. However, the downswing forces inevitably push the robot body up into space. Hence, the entire robot must hop up, and if the hopping velocity is smaller than the escape velocity of the planetary body, the robot will return to the surface after a while and then keep bouncing and tumbling until the kinetic energy is dissipated. Minerva does not have struts or wheels on its exterior, but it has a single-axis wheel inside. Because of the reaction of the wheel, the entire body starts tumbling and hops up when edges or spikes around its cylindrical body kick the planetary surface. Both the Nanorover and Minerva designs rely on opportunistic tumbling motion. Their traveling direction may be roughly indicated, but detailed point-to-point navigation would be difficult.

As an alternative design, articulated robots have a potential advantage in microgravity locomotion [46]. The idea is to hold the surface with articulated limbs or grippers and then walk on the surface. This idea was inspired by rock climbing; in principle, rock climbing becomes much easier in a microgravity field. The complexity of the articulated mechanism design will be an issue in realizing such a robot. However, even without a dexterous gripper, a robot with six 3-DoF legs, like an insect, can stably hold the surface using three legs while the other three legs swing forward. By repeating a three-holding and three-swinging gait, the robot will be able to maintain a static walk [49].

A key element of the holding and walking strategy is friction forces at the end tip of each limb. The smaller the friction is, the more difficult it is to maintain static holding. Effective use of electrostatic [50] or van der Waals forces [51] to generate additional sticking forces or control the magnitude of friction is an interesting option for future research.
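The benefit of such adhesion mechanisms follows from the basic Coulomb friction condition: the tangential force a limb tip can resist is the friction coefficient times the normal force. In microgravity, the robot's weight contributes almost nothing to that normal force, so nearly all of it must come from actively pressing or gripping. The sketch below uses purely illustrative numbers.

    # Static-hold check for a limb tip under Coulomb friction (illustrative
    # numbers only). In microgravity the weight term is negligible, so a
    # gripper or adhesion mechanism must supply the normal force itself.

    MU = 0.5             # assumed friction coefficient at the limb tip
    MASS = 10.0          # robot mass [kg]
    G_ASTEROID = 1e-4    # surface gravity [m/s^2], roughly Itokawa-like
    G_EARTH = 9.81

    def max_tangential_force(gravity, grip_force):
        """Largest tangential (shear) force the tip resists before sliding."""
        normal = MASS * gravity + grip_force   # weight plus active grip force [N]
        return MU * normal

    push = 0.2  # tangential reaction [N] from swinging the other limbs forward
    for label, g in [("on Earth", G_EARTH), ("on the asteroid", G_ASTEROID)]:
        passive = max_tangential_force(g, grip_force=0.0)
        active = max_tangential_force(g, grip_force=2.0)   # 2 N of grip/adhesion
        print(f"{label}: friction limit {passive:.4f} N (no grip), "
              f"{active:.4f} N (with 2 N grip); needs {push} N -> "
              f"{'holds' if active >= push else 'slides'}")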

Figure 6. (a) Pictures of asteroid Itokawa taken during Hayabusa's descending operation. The shadow of the spacecraft itself is clearly seen. (Courtesy of JAXA.) (b) Computer graphic image of Hayabusa in the simulation of its touch-and-go maneuver (Tohoku University). (Courtesy of Space Robotics Lab, Tohoku University.)

Figure 7. Minerva lander. (Courtesy of JAXA.)


Conclusions

In this article, three topics were elaborated. In the first part, the achievements of orbital robotics technology in the last decade were reviewed, highlighting the ETS-VII and Orbital Express flight demonstrations. In the second part, some selected topics of planetary robotics, motivated by the field robotics research point of view, were described. Recent achievements in the author's laboratory were presented as illustrative examples. Finally, technological challenges of asteroid robotics were discussed. When designing a robot to explore the surface of an asteroid, microgravity raises an interesting problem of how to stick to and how to move on the surface. Some ideas to address this question were introduced.

For further reading, the following textbooks would provide basic theories for modeling and control of space robots and their application examples: Space Robotics: Dynamics and Control (1992) [52], An Introduction to Space Robotics (2000) [53], Intelligence for Space Robotics (2006) [54], and Handbook of Robotics (2008) [55].

Acknowledgments

The author thanks Dr. Rick Wagner at Northrop Grumman Space Technology and Dr. Richard Volpe at JPL for their editorial assistance, and Dr. Kaiji Nagatani and Mr. Keisuke Sato for providing the navigation data of the El Dorado-II rover in the ICRA 2009 Planetary Exploration Challenge.

Keywords

Space robotics, free-flying robots, localization, mapping, asteroid robots.

References
[1] D. Akin, M. Minsky, E. Thiel, and C. Kurtzman, "Space applications of automation, robotics and machine intelligence systems (ARAMIS), phase II," Rep. NASA CR-3735, 1983.
[2] M. Oda, "Experiences and lessons learned from the ETS-VII robot satellite," in Proc. IEEE Int. Conf. Robotics and Automation (ICRA 00), 2000, pp. 914–919.
[3] I. Kawano, M. Mokuno, T. Kasai, and T. Suzuki, "First autonomous rendezvous using relative GPS navigation by ETS-VII," Navigation, vol. 48, no. 1, pp. 49–56, 2001.
[4] M. Mokuno, I. Kawano, and T. Suzuki, "In-orbit demonstration of rendezvous laser radar for unmanned autonomous rendezvous docking," IEEE Trans. Aerosp. Electron. Syst., vol. 40, no. 2, pp. 617–626, 2004.
[5] I. Kawano, M. Mokuno, T. Kasai, and T. Suzuki, "Result of autonomous rendezvous docking experiment of Engineering Test Satellite-VII," J. Spacecraft Rockets, vol. 38, no. 1, pp. 105–111, 2001.
[6] C. Bergin. (2009, Aug. 20). NASA ready for Japan's HTV via flight readiness review [Online]. Available: http://www.nasaspaceflight.com/2009/08/nasa-ready-for-japans-htv-via-flight-readiness-review/
[7] M. Oda, T. Doi, and K. Wakata, "Tele-manipulation of a satellite mounted robot by an on-ground astronaut," in Proc. IEEE Int. Conf. Robotics and Automation (ICRA 01), 2001, pp. 1891–1896.
[8] M. Oda and Y. Ohkami, "Coordinated control of spacecraft attitude and space manipulators," Control Eng. Pract., vol. 5, no. 1, pp. 11–21, 1997.
[9] N. Inaba and M. Oda, "Autonomous satellite capture by a space robot: World first on-orbit experiment on a Japanese robot satellite ETS-VII," in Proc. IEEE Int. Conf. Robotics and Automation (ICRA 00), 2000, pp. 1169–1174.
[10] K. Matsumoto, S. Wakabayashi, L. F. Penin, M. Nohmi, H. Ueno, T. Yoshida, and Y. Fukase, "Teleoperation control of ETS-7 robot arm for on-orbit truss construction," in Proc. Int. Symp. Artificial Intelligence, Robotics and Automation in Space (i-SAIRAS 99), 1999, pp. 313–318.
[11] K. Machida, T. Mikami, S. Komada, and K. Akita, "Precise EV robot: Flight model and telerobotic operation for ETS-VII," in Proc. IEEE Int. Conf. Intelligent Robots and Systems (IROS 96), Osaka, 1996, pp. 1550–1557.
[12] S. Kimura, S. Tsuchiya, and H. Morikawa, "Antenna assembly experiments using ETS-VII," in Proc. Int. Symp. Artificial Intelligence, Robotics and Automation in Space (i-SAIRAS 99), 1999, pp. 307–312.
[13] G. Visentin and F. Didot, "Testing space robotics on the Japanese ETS-VII satellite," ESA Bull., no. 99, pp. 61–65, Sept. 1999.
[14] K. Landzettel, B. Brunner, K. Deutrich, G. Hirzinger, G. Schreiber, and B. M. Steinmetz, "DLR's experiments on the ETS-VII space robot mission," in Proc. 9th Int. Conf. Advanced Robotics, 1999, pp. 347–353.
[15] W.-K. Yoon, T. Goshozono, H. Kawabe, M. Kinami, Y. Tsumaki, M. Uchiyama, M. Oda, and T. Doi, "Model-based space robot teleoperation of ETS-VII manipulator," IEEE Trans. Robot. Automat., vol. 20, no. 3, pp. 602–612, 2004.
[16] K. Yoshida, K. Hashizume, and S. Abiko, "Zero reaction maneuver: Flight validation with ETS-VII space robot and extension to kinematically redundant arm," in Proc. IEEE Int. Conf. Robotics and Automation (ICRA 01), 2001, pp. 441–446.
[17] K. Yoshida, "Engineering Test Satellite VII flight experiments for space robot dynamics and control: Theories on laboratory test beds ten years ago, now in orbit," Int. J. Robot. Res., vol. 22, no. 5, pp. 321–335, 2003.
[18] T. Kanzawa and S. Matsunaga, "On-orbit identification experiments for ETS-VII robotic arm vibration," in Proc. JSASS/JSME Structures Conf., 2000, vol. 42, pp. 157–160 (in Japanese).
[19] T. Imaida, Y. Yokokohji, T. Doi, M. Oda, and T. Yoshikawa, "Ground-space bilateral teleoperation experiment using ETS-VII robot arm with direct kinesthetic coupling," in Proc. IEEE Int. Conf. Robotics and Automation (ICRA 01), 2001, pp. 1031–1038.
[20] (2009, June). DARPA home [Online]. Available: http://www.darpa.mil/orbitalexpress/

Figure 8. Design choices for a surface locomotion system in different gravitational fields: simple wheel (MER, NASA/JPL), strut and wheel (Nanorover, NASA/JPL), internal reaction wheel (Minerva, JAXA/ISAS), and grasp and walk (multilimbed rover, Tohoku University).

[21] (2009, June). Boeing home—Integrated Defense Systems, Orbital Express mission updates [Online]. Available: http://www.boeing.com/ids/phantom_works/orbital/updates.html
[22] K. Landzettel, B. Brunner, and G. Hirzinger, "The telerobotic concepts for ESS," presented at the IARP Workshop on Space Robotics, Montreal, PQ, 1994.
[23] K. Yoshida, H. Nakanishi, H. Ueno, N. Inaba, T. Nishimaki, and M. Oda, "Dynamics, control and impedance matching for robotic capture of a non-cooperative satellite," Adv. Robot., vol. 18, no. 2, pp. 175–198, 2004.
[24] D. N. Dimitrov and K. Yoshida, "Momentum distribution in a space manipulator for facilitating the post-impact control," in Proc. IEEE Int. Conf. Intelligent Robots and Systems (IROS 2004), 2004, pp. 3345–3350.
[25] T. Oki, H. Nakanishi, and K. Yoshida, "Time-optimal manipulator control for management of angular momentum distribution during the capture of a tumbling target," Adv. Robot., vol. 24, no. 3, 2010.
[26] R. Bonitz, L. Shiraishi, M. Robinson, J. Carsten, R. Volpe, A. Trebi-Ollennu, R. E. Arvidson, P. C. Chu, J. J. Wilson, and K. R. Davis, "The Phoenix Mars Lander robotic arm," in Proc. 2009 IEEE Aerospace Conf., Mar. 2009, pp. 1–12.
[27] (2009, July). NASA/JPL home—Opportunity updates 2005 [Online]. Available: http://marsrovers.jpl.nasa.gov/mission/status_opportunityAll_2005.html
[28] R. Li, S. W. Squyres, R. E. Arvidson, B. A. Archinal, J. Bell, Y. Cheng, L. Crumpler, D. J. Des Marais, K. Di, T. A. Ely, M. Golombek, E. Graat, J. Grant, J. Guinn, A. Johnson, R. Greeley, R. L. Kirk, M. Maimone, L. H. Matthies, M. Malin, T. Parker, M. Sims, L. A. Soderblom, S. Thompson, J. Wang, P. Whelley, and F. Xu, "Initial results of rover localization and topographic mapping for the 2003 Mars Exploration Rover mission," Photogramm. Eng. Remote Sens. (Special Issue on Mapping Mars), vol. 71, no. 10, pp. 1129–1142, 2005.
[29] Y. Cheng, M. Maimone, and L. Matthies, "Visual odometry on the Mars Exploration Rovers," IEEE Robot. Automat. Mag., vol. 13, no. 2, pp. 54–62, June 2006.
[30] M. Maimone, Y. Cheng, and L. Matthies, "Two years of visual odometry on the Mars Exploration Rovers," J. Field Robot., vol. 24, no. 3, pp. 169–186, 2007.
[31] Special Issues on the DARPA Grand Challenge, J. Field Robot., vol. 23, no. 8–9, 2006.
[32] H. Durrant-Whyte and T. Bailey, "Simultaneous localization and mapping (SLAM): Part I," IEEE Robot. Automat. Mag., vol. 13, no. 2, pp. 99–110, 2006.
[33] T. Bailey and H. Durrant-Whyte, "Simultaneous localization and mapping (SLAM): Part II," IEEE Robot. Automat. Mag., vol. 13, no. 3, pp. 108–117, 2006.
[34] P. J. Besl and H. D. McKay, "A method for registration of 3-D shapes," IEEE Trans. Pattern Anal. Machine Intell., vol. 14, no. 2, pp. 239–256, 1992.
[35] J. Weingarten and R. Siegwart, "EKF-based 3D SLAM for structured environment reconstruction," in Proc. IEEE Int. Conf. Intelligent Robots and Systems (IROS 2005), 2005, pp. 3834–3839.
[36] D. Borrmann, J. Elseberg, K. Lingemann, A. Nüchter, and J. Hertzberg, "Globally consistent 3D mapping with scan matching," Robot. Autonom. Syst., vol. 56, no. 2, pp. 130–142, 2008.
[37] G. N. DeSouza and A. C. Kak, "Vision for mobile robot navigation: A survey," IEEE Trans. Pattern Anal. Machine Intell., vol. 24, no. 2, pp. 237–267, 2002.
[38] P. Corke, D. Strelow, and S. Singh, "Omnidirectional visual odometry for a planetary rover," in Proc. IEEE Int. Conf. Intelligent Robots and Systems (IROS 2004), 2004, pp. 4007–4012.
[39] G. Ishigami, K. Nagatani, and K. Yoshida, "Slope traversal controls for planetary exploration rover on sandy terrain," J. Field Robot., vol. 26, no. 3, pp. 264–286, 2009.
[40] M. G. Bekker, Introduction to Terrain-Vehicle Systems. Ann Arbor, MI: Univ. of Michigan Press, 1969.
[41] J. Y. Wong, Theory of Ground Vehicles, 3rd ed. New York: Wiley, 2001.
[42] K. Iagnemma and S. Dubowsky, Mobile Robots in Rough Terrain: Estimation, Motion Planning, and Control with Application to Planetary Rovers (Springer Tracts in Advanced Robotics, vol. 12). Berlin: Springer, 2004.
[43] K. Yoshida, H. Hamano, and T. Watanabe, "Slip-based traction control of a planetary rover," in Experimental Robotics VIII (Springer Tracts in Advanced Robotics, vol. 5), B. Siciliano and P. Dario, Eds. Berlin: Springer-Verlag, 2003, pp. 644–653.
[44] A. Fujiwara, J. Kawaguchi, D. K. Yeomans, M. Abe, T. Mukai, T. Okada, J. Saito, H. Yano, M. Yoshikawa, D. J. Scheeres, O. Barnouin-Jha, A. F. Cheng, H. Demura, R. W. Gaskell, N. Hirata, H. Ikeda, T. Kominato, H. Miyamoto, A. M. Nakamura, R. Nakamura, S. Sasaki, and K. Uesugi, "The rubble-pile asteroid Itokawa as observed by Hayabusa," Science, vol. 312, no. 5778, pp. 1330–1334, 2006.
[45] K. Yoshida, T. Kubota, S. Sawai, A. Fujiwara, and M. Uo, "MUSES-C touch-down simulation on the ground," in Proc. 11th Annu. AAS/AIAA Space Flight Mechanics Meeting, Santa Barbara, CA, 2001, pp. 481–490.
[46] K. Yoshida, Y. Nishimaki, T. Maruki, T. Kubota, and H. Yano, "Sampling and surface exploration strategies in MUSES-C and future asteroid missions," in Proc. 7th Int. Symp. Artificial Intelligence and Robotics and Automation in Space, 2003, pp. 1–8.
[47] H. Yano, T. Kubota, H. Miyamoto, T. Okada, D. Scheeres, Y. Takagi, K. Yoshida, M. Abe, S. Abe, O. Barnouin-Jha, A. Fujiwara, S. Hasegawa, T. Hashimoto, M. Ishiguro, M. Kato, J. Kawaguchi, T. Mukai, J. Saito, S. Sasaki, and M. Yoshikawa, "Touchdown of the Hayabusa spacecraft at the Muses Sea on Itokawa," Science, vol. 312, no. 5778, pp. 1350–1353, 2006.
[48] T. Yoshimitsu, T. Kubota, I. Nakatani, and J. Kawaguchi, "Robotic lander MINERVA, its mobility and surface exploration," in Proc. 11th Annu. AAS/AIAA Space Flight Mechanics Meeting, Santa Barbara, CA, 2001, pp. 491–501.
[49] M. Chacin, A. Mora, and K. Yoshida, "Motion control of multi-limbed robots for asteroid exploration missions," in Proc. IEEE Int. Conf. Robotics and Automation (ICRA 09), 2009, pp. 3037–3042.
[50] R. Wagner, "Grippers for space locomotion," presented at the Orbital Robotics Workshop, IEEE ICRA 08 Conf., Pasadena, CA.
[51] M. Sitti and R. S. Fearing, "Synthetic gecko foot-hair micro/nano-structures for future wall-climbing robots," in Proc. IEEE Int. Conf. Robotics and Automation (ICRA 03), 2003, pp. 1164–1170.
[52] Y. Xu and T. Kanade, Eds., Space Robotics: Dynamics and Control. Norwell, MA: Kluwer, 1992.
[53] A. Ellery, An Introduction to Space Robotics. Springer-Praxis, 2000.
[54] A. Howard and E. W. Tunstel, Eds., Intelligence for Space Robotics. San Antonio, TX: TSI Press, 2006.
[55] B. Siciliano and O. Khatib, Eds., "Space robots and systems," in Handbook of Robotics. New York: Springer, 2008, ch. 45.

Kazuya Yoshida received his B.E. and M.S. degrees in mechanical engineering science from the Tokyo Institute of Technology, Japan, in 1984 and 1986, respectively. He received his Dr.Eng. degree from the Tokyo Institute of Technology in 1990. He served as a research associate at the Tokyo Institute of Technology from 1986 to 1994 and as a visiting scientist at the Massachusetts Institute of Technology in 1994. From 1995 to 2003, he was an associate professor at Tohoku University, Japan, and since 2003, he has been a professor in the Department of Aerospace Engineering, Tohoku University. He has also been serving as a visiting lecturer of the International Space University since 1998 and as adjunct faculty of the International Space University since 2007. His research activities cover the dynamics and control of space robotic systems, ranging from free-flying robots to planetary exploration rovers. His activities also extend to the development of university-based microsatellites and terrestrial applications of space technology, such as remote exploration for search and rescue missions.

Address for Correspondence: Kazuya Yoshida, Tohoku University, Aoba 6-6-01, 980-8579 Sendai, Japan. E-mail: [email protected].
