
Annotation-Based Rescue Assistance System for Teleoperated Unmanned Helicopter
with Wearable Augmented Reality Environment

Masanao KOEDA, Yoshio MATSUMOTO, Tsukasa OGASAWARA
Nara Institute of Science and Technology
Graduate School of Information Science, Robotics Lab.
8916-5 Takayama, Ikoma, Nara, 630-0192 Japan.
Email: {masana-k, yoshio, ogasawar}@is.naist.jp

Abstract

In this paper, we introduce an annotation-based rescue assistance system for a teleoperated unmanned helicopter with a wearable augmented reality (AR) environment. In this system, an operator controls the helicopter remotely while watching an annotated view from the helicopter through a head mounted display (HMD) with a laptop PC in a backpack. Virtual buildings and textual annotations assist the rescue operation by indicating the positions to search rapidly and intensively. The position and attitude of the helicopter are measured by a GPS receiver and a gyroscope and sent to the operator's PC via a wireless LAN.

Fig. 1. Teleoperated Unmanned Helicopter

Using this system, we conducted experiments to find persons and verified the feasibility.

I. Introduction

Unmanned helicopters are currently used for various purposes, such as crop-dusting and remote sensing.

Fig. 2. Operator Wearing the Wearable Augmented Reality Environment

Fig. 3. System Diagram (helicopter side: omnidirectional camera, gyroscope, GPS, and note PC with wireless LAN, connected via IEEE1394, USB, PCMCIA, and RS-232C; data relay station: external antenna and wireless LAN access point, with NTSC video links; operator side: HMD, head tracker, DV converter, and note PC with wireless LAN)

Fig. 4. System Overview (GPS, omnidirectional camera, gyroscope, note PC, and wireless LAN mounted on the helicopter)

However, it is difficult for an operator to control an unmanned helicopter remotely. One reason is that the operator cannot be aware of its attitude when he/she is far away from the helicopter. Another reason is that the coordinate system between the helicopter and the operator changes drastically depending on the attitude of the helicopter. To solve these problems, several studies have been made on autonomous helicopters [1]-[5]. Autonomous helicopters need pre-determined landmarks or flight paths in order to fly, so they are not suitable for flight tasks where the situation changes every minute, such as in disaster relief. Additionally, many on-board sensors and computers are needed for control.

Fig. 5. Data Relay Station

Since the payload of a helicopter is severely limited, autonomous helicopters tend to be large, heavy, and expensive.

We proposed an immersive teleoperating system for unmanned helicopters using an omnidirectional camera (Figure 1) [6]. In this system, the operator controls the helicopter remotely by viewing the surroundings of the helicopter through an HMD (Figure 2). The advantage of this system is that only a camera and a transmitter need to be installed on the helicopter. Therefore it is possible to use a compact helicopter with a small payload and make it lightweight and cheap.

Fig. 6. Wearable Augmented Reality Environment (head mounted display, gyroscope, backpack, controller, laptop PC with wireless LAN)

Additionally, it becomes easy to control the unmanned helicopter because the coordinate system between the helicopter and the operator does not change even when the attitude of the helicopter changes. Furthermore, the operator can retain control even when the helicopter is out of the operator's sight, as long as the video image can reach the operator and the helicopter can receive the control signal.

However, in this system, control becomes impossible when the image transmission fails or visibility is poor. In addition, the resolution of the perspective image generated from the omnidirectional image is low, and the operator has trouble seeing distant objects.
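For concreteness, the following is a minimal sketch of how such a perspective view can be rendered. It assumes, for simplicity, that the captured frame has already been unwrapped into an equirectangular panorama; our actual sensor uses a hyperboloidal mirror, whose unwarping would follow the mirror geometry instead. All function and parameter names here are illustrative, not taken from our implementation.

import numpy as np

def perspective_from_panorama(pano, yaw, pitch, fov_deg=60.0, out_w=640, out_h=480):
    """Render a pinhole-perspective view from an equirectangular panorama.

    pano: H x W x 3 panorama (360 deg horizontal, 180 deg vertical).
    yaw, pitch: viewing direction in radians (e.g. from the head tracker).
    """
    H, W = pano.shape[:2]
    f = (out_w / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length in pixels

    # Pixel grid of the output image, centered at the principal point.
    x, y = np.meshgrid(np.arange(out_w) - out_w / 2,
                       np.arange(out_h) - out_h / 2)
    # Ray direction for each output pixel in camera coordinates.
    d = np.stack([x, y, np.full_like(x, f, dtype=float)], axis=-1)
    d /= np.linalg.norm(d, axis=-1, keepdims=True)

    # Rotate the rays by the viewing attitude (pitch about x, yaw about y).
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    d = d @ (Ry @ Rx).T

    # Convert ray directions to panorama coordinates (longitude/latitude).
    lon = np.arctan2(d[..., 0], d[..., 2])          # -pi .. pi
    lat = np.arcsin(np.clip(d[..., 1], -1, 1))      # -pi/2 .. pi/2
    u = ((lon / (2 * np.pi) + 0.5) * (W - 1)).astype(int)
    v = ((lat / np.pi + 0.5) * (H - 1)).astype(int)
    return pano[v, u]  # nearest-neighbor sampling

Because the output crops a fixed field of view out of the full panorama, its resolution is bounded by the panorama's angular resolution, which is why distant objects are hard to see.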

To solve these problems, we are developing an annotation-based assistance system for an unmanned helicopter.

II. Annotation-Based Assistance System for Unmanned Helicopter

We developed an annotation-based assistance system for an unmanned helicopter. Figure 3 and Table I show the configuration and the specifications of our system. On the helicopter, an omnidirectional camera and a gyroscope are mounted at the front, and a PC with a wireless LAN adapter and a GPS receiver are hung at the bottom (Figure 4). Position/attitude data and an omnidirectional image are sent to the operator during the flight via a wireless LAN through a data relay station (Figure 5). On the ground, a perspective image is generated from the received image and displayed on the HMD which the operator wears. The displayed image changes depending on the head direction, which is measured by the gyroscope attached to the HMD. The operator has an annotation database which consists of the names and position information of neighboring real objects. Using the database and the current position and attitude of the helicopter, annotations are overlaid on the perspective image which the operator is observing, as sketched below. The direction of the nose of the helicopter, the ground speed, a map, and the operator's head attitude are also displayed on the image. The operator holds a controller and controls the helicopter while wearing a backpack which contains a laptop PC (Figure 6).
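The sketch below illustrates the overlay step by projecting one annotation from the database into the current view. It assumes a flat-earth local approximation around the helicopter and, for brevity, handles only the heading (yaw); the real system also applies the roll and pitch from the gyroscope and the operator's head direction. All names and the focal length value are hypothetical.

import numpy as np

EARTH_R = 6_378_137.0  # equatorial radius [m], spherical-earth approximation

def enu_offset(lat0, lon0, lat, lon):
    """East/north offset [m] of (lat, lon) from (lat0, lon0), small-area approx."""
    dn = np.radians(lat - lat0) * EARTH_R
    de = np.radians(lon - lon0) * EARTH_R * np.cos(np.radians(lat0))
    return de, dn

def project_annotation(heli_lat, heli_lon, heli_alt, heli_yaw,
                       ann_lat, ann_lon, ann_alt=0.0,
                       f=500.0, img_w=640, img_h=480):
    """Project one annotation into the current perspective view.

    Returns (u, v) pixel coordinates, or None if behind the camera.
    heli_yaw: heading in radians, clockwise from north; a level camera
    is assumed for brevity.
    """
    de, dn = enu_offset(heli_lat, heli_lon, ann_lat, ann_lon)
    du = ann_alt - heli_alt
    # Rotate ENU into a camera frame looking along the heading:
    # x right, y down, z forward.
    cz, sz = np.cos(heli_yaw), np.sin(heli_yaw)
    xc = cz * de - sz * dn
    zc = sz * de + cz * dn
    yc = -du
    if zc <= 0:
        return None  # annotation is behind the camera
    u = img_w / 2 + f * xc / zc
    v = img_h / 2 + f * yc / zc
    return u, v

For example, an annotation 100 m due north of a helicopter heading north projects to the horizontal center of the image; as the helicopter turns, the label slides across the view accordingly.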

III. Experiment

Using this system, we carried out an experiment to assist the search for persons in the image captured by the camera mounted on the helicopter. The experiment was conducted at the HEIJYO Palace Site in Nara prefecture. Figure 7 shows an overview of the experimental environment. The helicopter took off at position A in Figure 8 and flew around it for a few minutes. Figure 8 also shows the positions of the textual annotations, which consist of "Fire Department", "Medical Center", "City Office", "Elementary School", "Police Box", "Library", and "Evacuation Area". Virtual buildings are overlaid on the captured image to restore the original state of the city destroyed by the disaster. Additionally, a map of the neighborhood is displayed at the lower left of the image to indicate the position of the helicopter.

TABLE I. System Specification

Helicopter
  Airframe:     JR Voyager GSR, payload 4.0 [kg]
  PC:           SONY PCG-U1 (CPU: Crusoe 867 [MHz], memory: 256 [MB], wireless LAN: WLI-USB-KS11G)
  Camera:       ACCOWLE Omnidirectional Vision Sensor (with hyperboloidal mirror), resolution 512x492 [pixel]
  Capture:      Canopus ADVC-55
  GPS:          eTrex Vista
  Gyroscope:    InterSense InertiaCube2

Data Relay Station
  Access Point: BUFFALO WHR2-A54F54
  Antenna:      BUFFALO WLE-HG-NDC

Operator
  PC:           TOSHIBA Dynabook G8 (CPU: Pentium4 2.0 [GHz], memory: 768 [MB], wireless LAN: embedded)
  HMD:          i-O Display Systems i-glasses! LC, resolution 450x266 [pixel]
  Gyroscope:    InterSense InterTrax2


Figure 9 shows the position and attitude data acquired from the GPS receiver and the gyroscope mounted on the helicopter; these data were transmitted to the operator through the wireless LAN during the flight. Figure 10 shows a sequence of snapshots as viewed by the operator. Around the textual annotation "Evacuation Area", a person could be found at the points marked by the blue circles in Figure 10-(d),(e).
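Note that the coordinates plotted in Figure 9(a) appear to be in the receiver's raw NMEA ddmm.mmmm format (degrees and decimal minutes) rather than decimal degrees: a latitude value of 3441.46 denotes 34 deg 41.46 min, i.e. roughly 34.691 deg N. A small conversion helper (illustrative, assuming standard NMEA output from the GPS receiver):

def nmea_to_decimal_degrees(value):
    """Convert an NMEA ddmm.mmmm coordinate to decimal degrees.

    e.g. 3441.46 -> 34 deg 41.46 min -> ~34.691 deg.
    """
    degrees = int(value // 100)
    minutes = value - degrees * 100
    return degrees + minutes / 60.0

# The ranges plotted in Fig. 9(a) correspond to roughly 34.691 N, 135.797 E,
# i.e. the HEIJYO Palace Site area in Nara.
print(nmea_to_decimal_degrees(3441.460))   # ~34.6910
print(nmea_to_decimal_degrees(13547.835))  # ~135.7973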

IV. Conclusion

In this paper, an annotation-based rescue assistance system for a teleoperated unmanned helicopter with a wearable augmented reality environment was proposed. We conducted an experiment to assist a search operation for persons in the image captured by the camera mounted on the helicopter. To support the operation, textual annotations and virtual buildings were overlaid on the captured image, and we could find a moving person in the search area.

Acknowledgement

This research is partly supported by the Core Research for Evolutional Science and Technology (CREST) Program "Advanced Media Technology for Everyday Living" of the Japan Science and Technology Agency (JST).

Fig. 7. HEIJYO Palace Site

Fig. 8. Location of Textual Annotations (Fire_Department, Medical_Center, City_Office, Elementary_School, Police_Box, Library, Evacuation_Area; A marks the takeoff position)

Fig. 9. Position and Attitude of Unmanned Helicopter during Experimental Flight: (a) Position (latitude vs. longitude); (b) Attitude (yaw, roll, and pitch angle [deg] vs. time [s])

References

[1] Ryan Miller, Omead Amidi, and Mark Delouis, "Arctic Test Flights of the CMU Autonomous Helicopter", Proceedings of the Association for Unmanned Vehicle Systems International, 1999.

[2] Kale Harbick, James F. Montgomery, and Gaurav S. Sukhatme, "Planar Spline Trajectory Following for an Autonomous Helicopter", Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 8, No. 3, pp. 237-242, 2004.

[3] Daigo Fujiwara, Jinok Shin, Kensaku Hazawa, Kazuhiro Igarashi, Dilshan Fernando, and Kenzo Nonami, "Autonomous Flight Control of Small Hobby-Class Unmanned Helicopter, Report 1: Hardware Development and Verification Experiments of Autonomous Flight Control System", Journal of Robotics and Mechatronics, Vol. 15, No. 5, pp. 537-545, 2003.

[4] Kensaku Hazawa, Jinok Shin, Daigo Fujiwara, Kazuhiro Igarashi, Dilshan Fernando, and Kenzo Nonami, "Autonomous Flight Control of Small Hobby-Class Unmanned Helicopter, Report 2: Modeling Based on Experimental Identification and Autonomous Flight Control Experiments", Journal of Robotics and Mechatronics, Vol. 15, No. 5, pp. 546-554, 2003.

[5] Hiroaki Nakanishi and Koichi Inoue, "Development of Autonomous Flight Control System for Unmanned Helicopter by Use of Neural Network", Proceedings of the World Congress on Computational Intelligence 2002, pp. 2400-2405.

[6] Masanao Koeda, Yoshio Matsumoto, and Tsukasa Ogasawara, "Development of an Immersive Teleoperating System for Unmanned Helicopter", Proceedings of IAPR Workshop on Machine Vision Applications 2002, pp. 220-223.

Fig. 10. Snapshots (a)-(f) of the generated image with overlaid annotations; a person was found around the annotation "Evacuation Area"