
FEATURE ARTICLE

Virtual Patient Simulator for Distributed Collaborative Medical Education

THOMAS P. CAUDELL,* KENNETH L. SUMMERS, JIM HOLTEN IV, TAKESHI HAKAMATA, MOAD MOWAFI, JOSHUA JACOBS, BETH K. LOZANOFF, SCOTT LOZANOFF, DAVID WILKS, MARCUS F. KEEP, STANLEY SAIKI, AND DALE ALVERSON

Drs. Caudell and Mowafi are in the Department of Electrical and Computer Engineering, University of New Mexico. Drs. Summers, Holten, and Hakamata are in the High Performance Computing, Education & Research Center, University of New Mexico. Dr. Jacobs is in the Department of Internal Medicine, University of Hawai'i School of Medicine. Drs. S. Lozanoff and Keep, and Ms. B. Lozanoff, are in the Department of Anatomy and Reproductive Biology, University of Hawai'i School of Medicine. Dr. Wilks is in the Department of Radiology, University of New Mexico School of Medicine. Dr. Saiki is in the Department of Internal Medicine, University of Hawai'i School of Medicine, and the Tripler Army Medical Center, Honolulu, Hawai'i. Dr. Alverson is in the Department of Pediatrics, University of New Mexico.

*Correspondence to: Thomas P. Caudell, Ph.D., Department of Electrical and Computer Engineering, University of New Mexico, Albuquerque, NM 87131. E-mail: [email protected]

DOI 10.1002/ar.b.10007. Published online in Wiley InterScience (www.interscience.wiley.com).

Project TOUCH (Telehealth Outreach for Unified Community Health; http://hsc.unm.edu/touch) investigates the feasibility of using advanced technologies to enhance education in an innovative problem-based learning format currently being used in medical school curricula, applying specific clinical case models, and deploying to remote sites/workstations. The University of New Mexico's School of Medicine and the John A. Burns School of Medicine at the University of Hawai'i face similar health care challenges in providing and delivering services and training to remote and rural areas. Recognizing that health care needs are local and require local solutions, both states are committed to improving health care delivery to their unique populations by sharing information and experiences through emerging telehealth technologies by using high-performance computing and communications resources. The purpose of this study is to describe the deployment of a problem-based learning case distributed over the National Computational Science Alliance's Access Grid. Emphasis is placed on the underlying technical components of the TOUCH project, including the virtual reality development tool Flatland, the artificial intelligence–based simulation engine, the Access Grid, high-performance computing platforms, and the software that connects them all. In addition, educational and technical challenges for Project TOUCH are identified. Anat Rec (Part B: New Anat) 270B:23–29, 2003. © 2003 Wiley-Liss, Inc.

KEY WORDS: virtual patient; artificial intelligence; medical education; anatomy; patient simulation; problem-based learning; PBL; access grid; traumatic head injury; TOUCH

INTRODUCTION

Project TOUCH is a multi-year program initiated in August of 2000 as a collaborative effort between the University of New Mexico and the University of Hawai'i and their associated high-performance computing centers (Alverson et al., 2001; Jacobs et al., 2003). The purpose of the project is to demonstrate the feasibility of using advanced computing methods, such as virtual reality, to enhance education in a problem-based learning (PBL) format currently being used in the curricula of the two schools (Kaufman et al., 1989; Anderson, 1991; Bereiter and Scardamalia, 2000). The demonstration case consists of a traumatic head injury, deployed to remote sites and associated workstations over the Next Generation Internet Access Grid (AG; http://www-fp.mcs.anl.gov/fl/). Recognizing that health care needs are local and require local solutions, both states are focused on improving health care delivery to their unique populations and have begun to benefit from sharing information and experiences. Emerging telehealth technologies can be applied by using existing high-performance computing and communications resources present in both states.

The primary objective of this project is to determine whether an integrated, collaborative, immersive virtual environment can be developed that facilitates enhanced human comprehension, and whether this system can be applied to PBL across distance. The first phase has been exploratory and has involved initial development of advanced computing tools by using immersive virtual reality and a newly developed virtual patient simulator, while making use of a completely novel virtual environment development tool called Flatland (http://www.ahpcc.unm.edu/homunculus/) distributed over the AG to distant learning sites (Jacobs et al., 2003). The purpose of this study is to describe a real-time artificial intelligence (AI) simulation engine, a real-time three-dimensional (3D) virtual reality environment, a system for human-simulation interaction, and finally, an Internet "teleconferencing" system to distribute the learning experience to remote sites.

DESCRIPTION OF THE SYSTEM

One primary objective of the TOUCH project is to develop a computing environment that facilitates student-directed learning within a group setting. The group consists of individuals located at remote sites, while the student-directed learning exercise generates learning issues resulting from the treatment of a virtual patient (Jacobs et al., 2003). Therefore, technical components of the TOUCH system are developed and integrated to achieve this objective. The system diagram in Figure 1 shows the relative location and interconnection of all system components from a network point of view.

The AG

The National Computational Science Alliance AG (NCSA AG) is an Internet-based conferencing system supporting real-time, multipoint, group-to-group communication and collaboration. AG nodes, or studios, are the meeting venues and typically combine large-screen multimedia displays using conventional projectors with high-end audio support (Figure 2). The AG substrate is the Internet, using Internet protocol (IP) multicast and middleware to feed the nodes live video and sound. AG users can share presentations, visualization environments, browsers, whiteboards, and graphics tablets. The AG nodes provide a research environment for the development of distributed data and visualization conduits, as well as for studying issues relating to collaborative work in distributed environments. The AG uses the Video Conferencing Tool (VIC; McCanne and Jacobson, 1995) for transmitting and receiving video. VIC is a multimedia tool built by Lawrence Berkeley National Laboratory for real-time video conferencing over the Internet. It is intended to link multiple sites with multiple simultaneous video streams over a multicast infrastructure.

VIC can perform two basic functions: (1) obtain information from video capture cards to which cameras or other video devices are attached and send it over the network; and (2) receive data from the network and display them on a video monitor or on some other attached video device, such as a video projector. VIC is based on the Real-time Transport Protocol (RTP; Schulzrinne et al., 1996), which is widely used on the Internet for the real-time transmission of encoded audio and video streams (International Telecommunication Union, 1993). Although VIC can be run point-to-point by using standard unicast IP addresses, it is primarily intended as a multiparty conferencing application. To use VIC's conferencing capabilities, systems must support IP multicast, and ideally, the network should be connected to the IP Multicast Backbone (MBone; Macedonia and Brutzman, 1994). The MBone is the multicast-capable backbone of the Internet. It currently consists of a network of tunnels linking the islands of multicast-capable subnetworks around the world.
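To make the multicast requirement concrete, the sketch below shows the receive-side step a VIC-like tool must perform: joining an IP multicast group so that the session's packets are delivered. This is our illustration of the standard sockets mechanism, not VIC source code; the group address and port are placeholders.

```c
/* Receive-side sketch: join an IP multicast group for a session. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int join_multicast_session(const char *group_addr, unsigned short port)
{
    struct sockaddr_in local;
    struct ip_mreq mreq;
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0)
        return -1;

    /* Bind to the session port so the kernel delivers group traffic. */
    memset(&local, 0, sizeof(local));
    local.sin_family = AF_INET;
    local.sin_addr.s_addr = htonl(INADDR_ANY);
    local.sin_port = htons(port);
    if (bind(sock, (struct sockaddr *)&local, sizeof(local)) < 0) {
        close(sock);
        return -1;
    }

    /* Ask the kernel (and, via IGMP, the network) to join the group. */
    mreq.imr_multiaddr.s_addr = inet_addr(group_addr);
    mreq.imr_interface.s_addr = htonl(INADDR_ANY);
    if (setsockopt(sock, IPPROTO_IP, IP_ADD_MEMBERSHIP,
                   &mreq, sizeof(mreq)) < 0) {
        close(sock);
        return -1;
    }
    return sock; /* ready for recvfrom() of the session's packets */
}
```

A symmetric IP_DROP_MEMBERSHIP call leaves the group when the tool exits the session.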

The TOUCH project is using the Internet for its underlying telecommunications infrastructure. The AG provides a collaborative environment for remote visualization and interactive applications. A Flatland application was developed that allows real-time graphics to be multicast out to the AG for viewing at remote sites. This strategy involves a coordinated process of copying the graphics out of Flatland, encoding them into video formats, and finally transmitting the images using the Flatland Transmitter.
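The first step of that pipeline, copying the graphics out of Flatland, can be pictured as a framebuffer read-back into an auxiliary buffer that an encoder then consumes. The following is a minimal sketch under our own assumptions, not the actual Flatland Transmitter source; only the standard OpenGL calls are real API.

```c
/* Sketch: read the rendered OpenGL frame into a buffer for encoding. */
#include <GL/gl.h>
#include <stdlib.h>

unsigned char *capture_frame(int width, int height)
{
    /* RGB, tightly packed; video encoders typically expect this. */
    unsigned char *pixels = malloc((size_t)width * height * 3);
    if (!pixels)
        return NULL;

    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    /* Read back the front buffer just after the frame is presented. */
    glReadBuffer(GL_FRONT);
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels);
    /* NOTE: OpenGL returns rows bottom-up; a video encoder usually
     * wants top-down, so the caller may need to flip the rows. */
    return pixels;
}
```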

Figure 2. A typical Access Grid (AG) studio consists of a meeting room with a multiprojector wall screen, multiview cameras, microphones, and speakers. On the screen are live images of remote collaborators, a Spycam view into Flatland, and a PowerPoint presentation of the TOUCH traumatic head injury storyboard. [Color figure can be viewed in the online issue, which is available at www.interscience.wiley.com.]


Figure 1. The components of the TOUCH system discussed in this study. A single student user is immersed in the Flatland environment. The artificial intelligence–based simulator interacts with the user and the environment, and controls the virtual patient. The Access Grid (AG) nodes are connected to Flatland through graphical image transmission and control transmission.



Flatland: Virtual Environments Tool

Flatland is a visualization/virtual reality application development environment created at the University of New Mexico (http://www.ahpcc.unm.edu/homunculus). It allows software authors to construct, and users to interact with, arbitrarily complex graphical and aural representations of data and systems. The system is described in more detail in Box 1. The end result is an immersive virtual-reality environment with sight and sound, in which students using joywands and virtual controls can interact with computer-generated learning scenarios that respond logically to user interaction. Virtual patients can be simulated in any of several circumstances, with any imaginable disease or injury. The activities of a participant can be monitored by faculty and other students for educational and instructional purposes.

Box 1. Flatland: Technical Details

Flatland is written in C/C++ and uses the standard OpenGL graphics language to produce all graphics. In addition, Flatland uses the standard GLUT library for window, mouse, joywand, and keyboard management. Flatland is object oriented and multithreaded, uses dynamically loaded libraries to build user applications in the virtual environment, and runs under the Linux and Irix operating systems. At the core of Flatland is an open, custom transformation graph data structure that maintains and potentially animates the geometric relationships between the objects contained in the graph. Graph objects contain all of the information necessary to draw, sound, and control the entity represented by the object. The transformation graph is one part of a higher-level structure referred to in Flatland as a universe. The universe contains the transformation graph, a flat database of objects in the graph, and a reference to the graph vertex that is currently acting as the root of a hierarchically organized tree. This root is usually the graphical camera viewpoint.
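As a concrete illustration of this description, the C sketch below models a graph object carrying its own draw, sound, and control callbacks plus a transformation to its parent, and a universe tying the graph to a flat object database and the current root. All type and field names here are our assumptions for illustration, not Flatland's actual declarations.

```c
/* Speculative sketch of the structures described above. */
typedef struct object Object;

struct object {
    char    *name;
    float    xform[16];                /* 4x4 transform to parent     */
    void   (*draw)(Object *self);      /* OpenGL calls, local coords  */
    void   (*sound)(Object *self);     /* point-source sound calls    */
    void   (*control)(Object *self, int key); /* designer-defined     */
    Object **children;                 /* graph edges                 */
    int      n_children;
    void    *app_data;                 /* application-private state   */
};

typedef struct {
    Object  *root;      /* current tree root, usually the camera      */
    Object **database;  /* flat list of every object in the graph     */
    int      n_objects;
} Universe;
```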

Flatland is intrinsically multithreaded, allowing the system to make use of computer systems with multiprocessors and shared memory. The main thread spawns an OpenGL graphics thread, a Flatland sound thread, and a real-time tracker thread. The optional tracker thread enables applications to use 3D interaction metaphors, such as head tracking and 3D joywands, or wands. An application, in the context of Flatland, is a relatively self-contained collection of objects, functions, and data that can be dynamically loaded (and unloaded) into the graph of an environment instantaneously. An application is responsible for creating and attaching its objects to the graph, and for supplying all object functionality. An application is added to Flatland through the use of a configuration file. This structured file is read and parsed when Flatland starts, and contains the name and location of the libraries that have been created for the application, as well as a formal list of parameters and an arbitrary set of arguments for the application.
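On Linux and Irix, this kind of dynamic loading is typically done with the dlopen family, so a plausible sketch of the loading step looks like the following. The entry-point name fl_init and its signature are invented for illustration, and Universe refers to the opaque type from the sketch above.

```c
/* Hedged sketch of config-driven dynamic application loading. */
#include <dlfcn.h>
#include <stdio.h>

typedef struct universe Universe;  /* opaque; see the sketch above */
typedef int (*init_fn)(Universe *u, int argc, char **argv);

int load_application(Universe *u, const char *libpath,
                     int argc, char **argv)
{
    void *handle = dlopen(libpath, RTLD_NOW);
    init_fn init;

    if (!handle) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return -1;
    }
    /* Look up the application's initialization entry point, which
     * creates its objects and attaches them to the graph. */
    init = (init_fn)dlsym(handle, "fl_init");
    if (!init) {
        fprintf(stderr, "dlsym: %s\n", dlerror());
        dlclose(handle);
        return -1;
    }
    return init(u, argc, argv);
}
```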

In Flatland, graphics and sound are treated symmetrically. Each object in the graph contains, among other things, a draw function and a sound function. The draw function contains or calls all of the code to draw and animate the graphics that represent the object. From an author's perspective, all object graphics are based on and drawn in a local coordinate system. Other structures in the graph handle the placement and orientation of the object's model coordinates relative to other objects in the graph and, subsequently, the camera. The sound function within an object contains all of the calls or code to make the sounds that represent that object. Flatland maintains a library of sound function calls that are designed to resemble OpenGL. Wave sound files are treated like OpenGL display lists and are called sound lists. In addition to opening sound lists, functions exist that allow the author to control the starting, looping, stopping, volume, and 3D location of the sound. All sound in Flatland is emitted from point sources in the 3D space. The author specifies the location of the sounds in the same model coordinate system used for the graphics.
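A hypothetical sound function written against such an interface might look like the snippet below. None of the flSound* identifiers are confirmed Flatland API; they are stand-ins that mirror the display-list analogy the text draws, and Object is the type from the earlier sketch.

```c
/* Invented prototypes, for illustration only. */
extern int  flSoundGenLists(const char *wavefile);
extern void flSoundPosition3f(int list, float x, float y, float z);
extern void flSoundVolume(int list, float v);
extern void flSoundLoop(int list, int on);
extern void flSoundStart(int list);

typedef struct object Object;  /* graph object from the sketch above */

/* Hypothetical sound function for a vital-signs monitor object. */
void monitor_sound(Object *self)
{
    static int beep = 0;
    (void)self;
    if (!beep)  /* load the wave file once, like building a display list */
        beep = flSoundGenLists("monitor_beep.wav");
    flSoundPosition3f(beep, 0.0f, 0.2f, 0.0f); /* point source, model coords */
    flSoundVolume(beep, 0.8f);
    flSoundLoop(beep, 1);   /* loop until explicitly stopped */
    flSoundStart(beep);
}
```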

Although position-tracking technology is not generally available on computers today, Flatland is designed to make use of it. A tracker is a multiple degree-of-freedom measurement device that can, in real time, monitor the position and/or orientation of multiple receiver devices in space relative to a transmitter device of some sort. As such, Flatland launches a tracker thread to sample the available tracker information and make it available for use by applications. In the standard Flatland configuration, trackers are used to locate hand-held wands and to track the position of the user's head. Head position and orientation are needed in cases that involve the use of head-mounted displays or stereo shutter glasses.
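A minimal pthreads rendering of such a tracker thread, with invented pose and device-read names, might look like this: the thread blocks on the device, then publishes the newest sample for applications to read under a lock.

```c
/* Sketch of a tracker-sampling thread; read_tracker_hardware() is a
 * placeholder for the device-specific (e.g., serial) read. */
#include <pthread.h>

typedef struct { float pos[3]; float quat[4]; } Pose;

static Pose            g_head_pose;   /* latest published sample */
static pthread_mutex_t g_pose_lock = PTHREAD_MUTEX_INITIALIZER;

extern void read_tracker_hardware(Pose *out);  /* device-specific */

static void *tracker_thread(void *arg)
{
    (void)arg;
    for (;;) {
        Pose p;
        read_tracker_hardware(&p);      /* blocking device read    */
        pthread_mutex_lock(&g_pose_lock);
        g_head_pose = p;                /* publish the newest pose */
        pthread_mutex_unlock(&g_pose_lock);
    }
    return NULL;
}
```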

User interaction is a central component of Flatland and, as such, each object is controllable in arbitrary ways defined by the designer. Currently, there are four possible methods for the control of objects: (1) GLUT pop-up menus in the main viewer window, (2) the console keyboard, (3) Flatland 2D control panels, either in the environment or in separate windows, and (4) external systems or simulations. In the future, 3D menus and controls in the virtual environment and voice recognition will also be available.

An array of controls may be defined when an object is coded by the designer. These controls are managed by Flatland and can be exercised through a designer-defined function that is invoked when either the keystroke is made or a menu item is selected. This function may be arbitrarily complex and may affect objects other than the owner of the control. The control functions associated with objects are the preferred method to change any internal states or data of the object. The mouse and keyboard interactions are provided through the GLUT libraries and a custom 2D widget library. The latter is available to the designer for the creation of standard 2D control panel windows. Finally, external systems may control an object, for example, through a threaded simulation, serial communication (trackers), or Unix sockets to another process.
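One plausible shape for such a per-object control array, using our own names rather than Flatland's, is sketched below: each entry binds a keystroke and a menu label to a designer-defined handler that mutates the object's state (Object is the type from the earlier sketch, assumed here to carry an app_data field).

```c
/* Speculative sketch of a designer-defined control table. */
typedef struct {
    char        key;                  /* console keystroke        */
    const char *menu_label;           /* GLUT pop-up menu entry   */
    void      (*handler)(Object *self); /* designer-defined action */
} Control;

/* Example handler: toggle the penlight object on a keystroke. */
static void toggle_penlight(Object *self)
{
    int *on = (int *)self->app_data;  /* application-private state */
    *on = !*on;
}

static Control penlight_controls[] = {
    { 'p', "Toggle penlight", toggle_penlight },
};
```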



THE TOUCH APPLICATION

Application systems, such as the TOUCH demonstration case, are dynamically loaded into the basic Flatland system, and associated objects are attached to the Flatland graph. The TOUCH demonstration case is composed of three Flatland application modules: (1) the Virtual Patient Environment, (2) the Artificial Intelligence simulator, and (3) the Spycam (Figure 3A). The immersed student interacts with the virtual patient through the virtual reality (VR) effector, represented as a floating hand (Figure 3B). The Virtual Patient Environment consists, for the current head trauma case, of a car accident scene and an emergency room (Jacobs et al., 2003). Following the development of AI rules governing the patient's condition after head trauma resulting from an automobile accident, a storyboard was developed as a visual timeline for the simulation (Jacobs et al., 2003). Graphical models were created for this scenario in the commercial modeling tool Maya and imported into Flatland. A virtual patient body model was created in another commercial tool, Poser, and imported into the case scenario (Figure 4). Medical tool kit models (e.g., otoscope, neck brace, penlight) were also produced in Maya and loaded onto patient-side trays (Figure 4). The system is driven by a VR operator, who tracks the user's activities while viewing a computer monitor and assists with virtual body position and movements when necessary (Figure 5).

The immersed student interacts with the virtual patient through a joywand equipped with a tracking system (six degrees of freedom), buttons, and a trigger (Figure 3). The user may pick up and place objects by moving the virtual hand and pulling the wand's trigger. The AI is a custom forward-chaining IF-THEN rule-based system that contains the knowledge of medical experts for this particular case and knowledge of how objects interact (Luger, 2002). The rules are coded, at this time, in a C computer language format as logical antecedents and consequences, and currently have limited human readability. The AI loops over the rule base, applying each rule's antecedents to the state of the system, including time, by using a double-buffering method to maintain consistency, and testing for logical matches. Matching rules are "fired," modifying the next state of the system and controlling the status of dynamically launched real-time control functions. These functions operate at the rate of the graphics engine to smoothly control all time-varying states of the patient, including physiology and interaction.
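The double-buffered inference pass can be sketched in C as follows. This is our reconstruction of the mechanism just described, with invented state fields: antecedents are matched against the current state only, while consequents write only to the next state, so rule order within a pass cannot produce inconsistent reads.

```c
/* Sketch of a forward-chaining pass over a double-buffered state. */
typedef struct {
    double time;      /* virtual time, driven by the adjustable clock */
    double spo2;      /* example vital sign (invented field)          */
    int    airway_in; /* example interaction state (invented field)   */
} PatientState;

typedef struct {
    int  (*antecedent)(const PatientState *s); /* IF-part: test      */
    void (*consequent)(PatientState *next);    /* THEN-part: modify  */
} Rule;

void ai_step(const Rule *rules, int n_rules,
             const PatientState *current, PatientState *next, double dt)
{
    int i;
    *next = *current;        /* by default, state persists            */
    next->time += dt;        /* virtual time advances with the pass   */
    for (i = 0; i < n_rules; i++)
        if (rules[i].antecedent(current))  /* test against old state  */
            rules[i].consequent(next);     /* "fire" into new state   */
    /* The caller swaps the current/next buffers after each pass. */
}
```

A rule for the cyanosis episode shown in Figure 4C might, for instance, test low oxygen saturation with no airway in place and set the patient's skin hue in the consequent.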

Figure 3. A: Diagram showing the relationships between the components of the TOUCH system in Flatland. The artificial intelligence contains all of the necessary knowledge extracted from medical experts to monitor and control the entire system. The user is virtually present in the scene and controls their viewpoint through a head tracking system. The user interacts with the scene through hand-held joywands represented as a floating hand with the corresponding orientation (B, inset). The outside world views the action in the scene through the Spycam, which can be moved arbitrarily in the environment by the virtual reality system operator or connected directly to the head of the user to share the view. [Color figure can be viewed in the online issue, which is available at www.interscience.wiley.com.]



Time is a special state of the system that is not directly modified by the AI but whose rate is controlled by an adjustable clock. Because the rate of inference within the AI is controlled by this clock, the operator is able to speed up, slow down, or stop the action controlled by the AI. The AI is currently represented in the TOUCH system as a crystal rotating synchronously with the passage of virtual time, providing the system's developers with a monitor of the AI's status. In the future, the immersed student is intended to interact directly with representations of the AI, which may therefore need to take on other forms, such as human avatars. For the current TOUCH project experiments, the AI representation was not visible to the student user during their interaction with the virtual patient.
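The adjustable clock amounts to scaling wall-clock time by an operator-set rate before it reaches the AI, as in this small illustrative sketch (names are ours):

```c
/* Operator-set rate: 0.0 pauses the simulation, 1.0 is real time,
 * and values above 1.0 accelerate it. */
static double g_time_rate = 1.0;

/* Advance virtual time by a wall-clock interval scaled by the rate;
 * the result is the dt fed to the AI's inference pass. */
double advance_virtual_time(double virtual_now, double wall_dt)
{
    return virtual_now + g_time_rate * wall_dt;
}
```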

The camera-probe application, Spycam, captures images from the Flatland environment and transmits them over the AG for viewing at remote sites (Figure 3). This camera is used to capture a third-person, independent view of the applications within Flatland. The Spycam can move around within Flatland and stop at any position. The image captured by the camera is copied into an auxiliary buffer and prepared for transmission. Multiple Spycams may be launched simultaneously and separately flown for multiview transmission into the AG.

The Flatland Transmitter is a VIC-based multimedia tool that translates the output from the virtual environment into a video stream for multicast. The transmitter receives the images to transmit from Flatland, encodes them, and then sends the video stream over the AG. The transmitter depends on the Multicast Backbone (MBone) to broadcast the video streams. To accommodate sites without multicast capabilities, we use a multicast/unicast bridge that provides MBone gateway services, so that users can run their MBone tools in unicast mode and join a multicast session (Lehman, 1999).
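On the transmit side, sending an encoded frame into a multicast session reduces to a UDP send to a group address, as sketched below with placeholder addresses; a real VIC/RTP stream would prepend an RTP header carrying sequence numbers and timestamps, which is omitted here.

```c
/* Transmit-side sketch: send one encoded frame to a multicast group. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stddef.h>
#include <string.h>
#include <sys/socket.h>

int send_frame(int sock, const char *group, unsigned short port,
               const unsigned char *encoded, size_t len)
{
    struct sockaddr_in dst;
    unsigned char ttl = 32;  /* keep traffic within a bounded radius */

    memset(&dst, 0, sizeof(dst));
    dst.sin_family = AF_INET;
    dst.sin_addr.s_addr = inet_addr(group);
    dst.sin_port = htons(port);

    setsockopt(sock, IPPROTO_IP, IP_MULTICAST_TTL, &ttl, sizeof(ttl));
    return (int)sendto(sock, encoded, len, 0,
                       (struct sockaddr *)&dst, sizeof(dst));
}
```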

DISCUSSION

The TOUCH project is determining the feasibility of using emerging technologies to overcome geographic barriers to delivery of medical education in the communities of need and to enhance the learning process with immersive virtual reality, patient simulation, and Internet-based distribution of knowledge. The project builds upon previous data supporting the PBL system of medical education as well as an AI tool initially conceived as a patient simulator (Stansfield et al., 2000). However, TOUCH has provided many unique applications and technological advances. Within this context, considerable advancement in distance learning is being achieved.

PBL was pioneered during the 1960s at Case Western Reserve University and McMaster University (Boud and Feletti, 1991), and it has been applied in numerous forms over the decades. Its initial intention was to provide a learning approach that facilitated knowledge integration across academic disciplines and promoted problem-solving skills (Barrows and Tamblyn, 1980). Although still debated, possibly owing to differing definitions of the process as well as its conceptual underpinnings (Maudsley, 1999), PBL has been shown to yield successful educational outcomes with measurable benefits (Blake et al., 2000). Small-group interaction provides an opportunity for students to work toward the understanding and resolution of a specific patient problem.

Figure 4. A: A view of the TOUCH system in a standard Flatland environment, showing the virtual patient with a blood pressure cuff, neck brace, and head bandage. On the medical trays are located an airway, an otoscope, a stethoscope, and a penlight. B: Vital signs and data are presented to the immersed user on their "heads-up" display; in this case, the stethoscope and blood pressure numbers as well as the time since the accident. C: The artificial intelligence timeline results in the virtual reality patient becoming cyanotic, at which point an airway must be inserted or death ensues. D: After the airway is inserted, the patient regains hue.



This problem serves as the focus for the establishment of hypothesis testing and the generation of learning issues, ultimately stimulating problem-solving and reasoning skills. Project TOUCH capitalizes on this philosophy, because a specific problem is presented to the student group. However, the presentation is novel, because a virtual patient is used, thus providing a sense of realism and urgency, particularly because the simulation responds to a timeline. In addition, students can dynamically determine the direction of the scenario, each path potentially resulting in a unique outcome.

The TOUCH project places students in a position of decision-making, requiring intergroup analysis and reasoning. Yet numerous effects remain untested. For example, the role of case distribution with the uncertainties of AG transmission must be examined. It is unknown whether the personal interaction within a PBL group is retained over the AG. Although the realism of a virtual patient should facilitate associative relationships, providing a more effective learning experience for the student, this hypothesis remains to be tested. The effectiveness of the AI system and Flatland in providing effective reification must be validated. Another source of uncertainty is the transmission capability of the AG, which can introduce issues of latency and jitter. In more extreme cases, network congestion may cause transmission interruptions and "downtime," with a potentially adverse effect on dynamic PBL tutorial group interaction.

Currently, experiments are being undertaken to compare the presentation of the virtual patient and distributed learning structure with a standard paper-case tutorial, to assess the effects of the virtual environment as well as remote distribution of the case over various iterations (Jacobs et al., 2003; Lozanoff et al., 2003). In particular, the AG has, until now, not been used to support these types of applications. If evaluation of AG distribution of PBL cases is successful, more uniform access to these enhancements would become possible regardless of location, as Internet access becomes more ubiquitous. Thus, a major goal of the project is to understand the impact of the TOUCH technology on learning dynamics and knowledge processing.

The system described here provides unique opportunities to navigate a case at numerous levels of anatomical complexity. A zoom capability is being developed that allows the participants in the virtual environment to maneuver across levels of the system while interacting within that environment. For example, the student interacts at the patient level, performing a physical examination and assessing the physical condition. The student could then change levels and investigate learning issues at the cellular or molecular level, exploring, for example, the effect of a drug given to the patient at the current level or at a level above or below. In addition, participants could zoom out to witness and interact with the consequences of the patient's condition on phenomena at a community, population, or even a global level. Thus, Project TOUCH provides a framework for an initial evaluation of the potential benefit of these methods to further enhance medical education, as well as a means of defining the strengths, weaknesses, and barriers to their use in medical education. In addition, this project sets the stage for future development and potential integration into a medical school curriculum and provides a "touchstone" for other applications using these methods.

Figure 5. A view of the Access Grid studio, screen, and a student user with the TOUCH system during one of the experimental sessions. The screen shows a mixture of live video images of the students participating in the learning session with the Spycam view of the virtual environment as seen by the immersed student (upper right corner of the screen). The dark-colored box over the user's head is part of the tracking system. The student holds a joywand in his right hand. The virtual reality operator is also seen below the screen. [Color figure can be viewed in the online issue, which is available at www.interscience.wiley.com.]

ACKNOWLEDGMENTS

The project described was partially supported by grant 2 DIB TM00003-02 from the Office for the Advancement of Telehealth, Health Resources and Services Administration, Department of Health and Human Services. The contents of this study are solely the responsibility of the authors and do not necessarily represent the official views of the Health Resources and Services Administration. The authors thank the Maui High Performance Computing Center, the UNM High Performance Computing and Research Center, the UNM Health Sciences Library, and the UNM Center for Telehealth for their support, as well as Dr. Sharon Stansfield of Ithaca College, Ithaca, NY, and her former team at Sandia National Laboratories in Albuquerque, NM, for their helpful discussions. Dr. Robert Trelease, UCLA, is thanked for providing helpful comments during the preparation of this manuscript.

LITERATURE CITED

Alverson DM, Saiki S, Buchanan H. 2001. Telehealth for Unified Community Health (TOUCH). 5th Annual Distributed Medical Intelligence Conference, Breckenridge, CO.

Anderson A. 1991. Conversion to problem-based learning in 15 months. In: Boud D, Feletti G, editors. The challenge of problem-based learning. New York: St. Martin's Press. p 72–79.

Barrows HS, Tamblyn RM. 1980. Problem-based learning: An approach to medical education. Medical education series, Vol. 1. New York: Springer Verlag.

Bereiter C, Scardamalia M. 2000. Commentary on part I: Process and product in problem-based learning (PBL) research. In: Evensen DH, Hmelo CE, editors. Problem-based learning: A research perspective on learning interactions. New Jersey: Lawrence Erlbaum Assoc. Publishers. p 185–195.

Blake RL, Hosokawa MC, Riley SL. 2000. Student performances on step 1 and step 2 of the United States Medical Licensing Examination following implementation of a problem-based learning curriculum. Acad Med 75:66–70.

Boud D, Feletti GI. 1991. Introduction. In: The challenge of problem-based learning. London: Kogan Page. p 13–20.

International Telecommunication Union. 1993. Video codec for audiovisual services at p×64 kb/s. ITU-T Recommendation H.261.

Jacobs J, Caudell TP, Wilks D, et al. 2003. Integration of advanced technologies to enhance experiential problem-based learning over distance: Project TOUCH. Anat Rec (New Anat) 270B:16–22.

Kaufman A, Mennin S, Waterman R, et al. 1989. The New Mexico experiment: Educational innovation and institutional change. Acad Med 64:285–294.

Lehman T. 1999. MBone multicast/unicast gateway and reflector (C program). Pasadena, CA: University of Southern California.

Lozanoff S, Lozanoff B, Sora M-C, et al. 2003. Anatomy and the Access Grid: exploiting plastinated brain sections for use in distributed medical education. Anat Rec (New Anat) 270B:30–37.

Luger G. 2002. Artificial intelligence. New York: Addison-Wesley.

Macedonia MR, Brutzman DP. 1994. MBone provides audio and video across the Internet. IEEE Comp 27:30–36.

Maudsley G. 1999. Do we all mean the same thing by "problem-based learning"? A review of the concepts and a formulation of ground rules. Acad Med 74:178–185.

McCanne S, Jacobson V. 1995. Vic: A flexible framework for packet video. ACM Multimedia. p 511–522.

Schulzrinne H, Casner S, Frederick R, Jacobson V. 1996. RTP: A transport protocol for real-time applications. IETF Audio-Video Transport Working Group RFC 1889.

Stansfield S, Shawver D, Sobel A, Prasad M, Tapia L. 2000. Design and implementation of a virtual reality system and its application to training medical first responders. Presence: Teleoperators and Virtual Environments 9:524–556.
