
Enabling discovery‐based learning in construction using telepresent augmented reality


Automation in Construction 33 (2013) 3–10


Amir H. Behzadan a,⁎, Vineet R. Kamat b

a Department of Civil, Environmental, and Construction Engineering, University of Central Florida, Orlando, FL 32816-2450, United States
b Department of Civil and Environmental Engineering, University of Michigan, Ann Arbor, MI 48109-2125, United States

⁎ Corresponding author. E-mail addresses: [email protected] (A.H. Behzadan), [email protected] (V.R. Kamat).

0926-5805/$ – see front matter © 2012 Elsevier B.V. All rights reserved. http://dx.doi.org/10.1016/j.autcon.2012.09.003

Article info

Article history: Accepted 9 September 2012; available online 3 October 2012.

Keywords: Augmented reality; Visualization; Construction engineering; Education; Learning

Abstract

Construction engineering students often complain about the lack of engagement and interaction with the learning environment. Nevertheless, many instructors still rely on traditional teaching methods that include the use of chalkboards, handouts, and computer presentations often filled with many words and few visual elements. Research shows that these teaching techniques are considered almost obsolete by many students, especially those who are visual learners or team workers. In addition, the influence of visual and social media has changed student perceptions and how they expect instructional materials to be presented in a classroom setting. This paper presents an innovative pedagogical tool that uses remote videotaping, augmented reality (AR), and ultra-wide band (UWB) locationing to bring live videos of remote construction jobsites to the classroom, create an intuitive interface for students to interact with the objects in the video scenes, and visually deliver location-aware instructional materials to them.

© 2012 Elsevier B.V. All rights reserved.

1. Introduction

In a recent report published by the Association for American Colleges and Universities (AAC&U), the National Leadership Council for Liberal Education and America's Promise identified “connecting knowledge with choices and action” as one of the seven principles of excellence [1]. Within engineering colleges and departments, this emphasis on creating a contextual link between knowledge and practice is even more prevalent. Although a large body of research has shown that specific teaching practices can improve student learning, engagement, and interest in engineering [2,3], and despite the fact that creativity and practicality are highly encouraged in academia, many engineering faculty have not been motivated to change their classroom practice and still rely on traditional methods to convey the theoretical knowledge to their students [4].

The curricula of these programs are heavily shaped around the concept of exposing students to basic science and engineering courses, and are often inadequate in preparing them for real life problem solving and critical thinking [5]. While engineering students need to pick up social and technical skills (e.g. critical thinking, decision-making, collaboration, and leadership), they also need to be competent in the digital age [6]. Mills and Treagust [7] discussed that although most students are graduating with good knowledge of abstract engineering science and computer literacy, they do not know how to apply this knowledge in practice.

In the area of construction education, students have historically lacked a comprehensive knowledge of onsite construction tasks and the dynamics and complexities involved in a typical construction project [8]. This can be directly attributed to the fact that, to a large extent, existing instructional methods fall short in including guidance from and interaction with construction experts, and thus provide students with very limited access to hands-on experience. The classroom experience is often passive and deductive in nature, as teachers communicate the fundamentals and students have to deduce derivations, examples, and applications in assignments [9]. Even site visits, which ideally form an important component of teaching and learning in many respects, may not always be possible due to issues such as schedule conflicts, access difficulties, weather conditions, and safety and liability concerns [10].

Recent figures show that today's digital native students (who are highly engaged with the technology around them) are more likely to choose scientific and engineering fields that are more flexible and have already embraced the latest technologies [11,12]. Tobias [13] discussed that introductory science courses are often responsible for driving away many students who have the initial intention and ability to earn science degrees but instead switch to nonscientific fields.

These and similar challenges highlight the role of new and innovative teaching techniques that use advanced computing and information technologies, simulation, and virtual learning environments to complement engineering education [14]. Due to recent advances in the development of pedagogical concepts, applications, and technology, and a simultaneous decline in hardware costs, the use of small-scale and mobile systems in education has received even more attention. Several researchers have reviewed the effectiveness of technology in the classroom. Their findings indicate that, when properly implemented, computer technology has a significant effect on student achievement, stimulates increased instructor–student interaction, and encourages cooperative learning, collaboration, problem-solving, and student inquiry skills [15,16].

Studies on particular types of technology use are still being conducted. For example, a recent study on the impact of electronic field trips, conducted by Maryland Public Television and the Johns Hopkins University Center for Technology in Education, found that participating students exhibited significantly higher levels of knowledge on three social studies units than students who had not participated. Participating students also demonstrated greater improvement in reading comprehension skills [17].

More recently, the introduction of computer technologies such as computer-aided design (CAD) and building information modeling (BIM) has aimed to improve the quality of learning in construction education. In a recent study, more than 60% of the students agreed that they had a better understanding of building structure after learning BIM [18]. Messner et al. [19] presented the results of a project aimed at improving construction education through the use of virtual reality (VR) and 4D CAD modeling. In particular, they integrated a 4D CAD visualization application into their undergraduate architectural engineering program, and experimented with a VR tool that allowed construction engineering students to interactively generate a construction sequence for a project in an immersive environment. The results of these experiments suggested that students can (1) understand construction projects and plans much better when advanced visualization tools are used, and (2) very quickly gain experience by developing and critiquing construction schedules in a full-scale virtual environment. These and similar studies indicate that the integration of advanced interactive 3D visualization into the curriculum can significantly help students relate their abstract (and mostly theoretical) knowledge to real practical problems in the field. At the same time, it is important to note that accumulating adequate skills and training to operate equipment and conduct engineering tasks through traditional training methods takes significant time and has proven to be costly and inefficient [20]. Thus, the overarching goal of this project is to provide a timely and effective education to students by integrating technology into core curricula and implementing it in a classroom setting, rather than only providing devices and software [21].

At the same time, professional development and collaboration between students and instructors need to be encouraged, and new forms of pedagogy and assessment must be accordingly created. What is essential is to make technology a ubiquitous resource in the learning process, personalize it based on students' individual needs and learning styles, and then ask instructors to mentor and facilitate the use of technology while students learn, direct, and collaborate in a technology-rich environment. Integrating technology into the curriculum in today's schools should not mean finding ways that computers can help instructors teach the same old topics in the same old ways. Instead, instructors must have the opportunity to combine technology with emerging models of teaching and learning to transform education. In an attempt to implement this philosophy, this paper presents the latest findings of an ongoing work which aims to explore the extent to which collaborative augmented reality (AR) can be effectively used as a transformative learning tool to improve the quality of education in engineering and science. The authors use construction education as a test bed by enabling students to learn the basics of equipment, processes, and operational safety in a learning environment that supports real time interaction with a remote jobsite.

2. Importance of research to construction education

Construction operations consist of human-machine interactions and high levels of exposure to equipment and tools in harsh environments. Although many construction firms have implemented strict jobsite safety measures and training, compared to other industries, construction still has the highest accident and fatality rate in the nation [22,23]. Research shows that inexperience and lack of knowledge among young and unskilled project personnel account for the highest number of work injuries and fatalities [24,25]. Although worker safety and health issues have been previously discussed at length by researchers [26], the inevitable risks associated with such projects, as well as the high costs of accident recovery, especially compared to the relatively lower pay scale in construction [27], have traditionally been an impediment to the involvement and recruitment of students and young people in construction projects.

On the other hand, the construction industry has been constantly facing increased national and international competition and stringent governmental and environmental regulations, while encountering issues such as organized labor, the challenges of new technologies and new materials, and the construction of complex projects [28]. These forces emphasize the importance of a steady supply of a competitive and strong workforce to the industry; as a result, attracting talented students and imparting the best possible education are critical to the future of the U.S. construction industry [12]. In reality, many construction programs fail to provide students with an environment where they can acquire the skills and experience necessary to be successful at professional practice and onsite performance. In a study conducted by the U.K. Institution of Structural Engineers, it was revealed that civil engineering education fails to include practicality and a feel for construction engineering [29,30]. Most engineers need to spend many years in the field in order to assimilate an adequate knowledge of actual construction performance.

Hence, given the high demand for a skilled workforce in the construction industry, the high level of risk associated with equipment operations on site, and the fact that appropriate operational and safety skills take a long time to accumulate, this research explores an innovative approach that integrates advanced technology into the teaching and learning experience. The authors have investigated visualization and sensing techniques to study, design, implement, and evaluate a potentially transformative pedagogical paradigm for engineering process education to impart the required training while providing flexibility, mobility, and ease of use. At the same time, these enhancements provide a more exciting and vivid experience for students and instructors while increasing the quality of learning through hands-on interaction with construction equipment, tools, and processes.

3. Methodology

In this section, the enabling technology is first reviewed, and a more detailed description of the developed methodology is then presented.

3.1. Enabling technology: augmented reality (AR)

The presented research aims to enhance the learning process by implementing AR visualization in an interactive learning environment. AR is the superimposition of computer-generated information over a user's view of the real world. By presenting contextual information in textual or graphical format, the user's view of the real world is enhanced or augmented beyond the normal experience [31]. The addition of such contextual computer-generated information, spatially located relative to the user, can assist in the performance of several scientific and engineering tasks. For this reason, AR enabling technologies have been researched in an increasing number of studies in recent years.


AR is different from VR, a visualization technology that has been around for several decades. Unlike VR, AR does not completely replace the real world; rather, the real world is supplemented with relevant synthetic information, and thus real and virtual objects coexist in an augmented space [32]. As shown in Fig. 1, the real advantage of AR is that the view of the real world is used as a readymade backdrop for displaying superimposed information (e.g. graphics, sound, diagrams) of interest.

This feature enables AR users to create and overlay only the information that needs to be augmented onto the real world view; as a result, recreating the surrounding environment is not a concern. At the same time, by bringing the real world into the visualization, the user (as a real object in the real world) will become part of the AR experience and hence can interact with both real and virtual objects in an intuitive manner. Collaborative AR takes this even further as it allows multiple users to access a shared space populated by virtual objects [33].

AR has been used by researchers in several fields of science and engineering. For example, visualization of medical information projected onto a patient's body is one of the established applications of AR technology [34]. AR has also been used in military applications. For instance, the Battlefield Augmented Reality System (BARS) is an AR system that can network multiple dismounted war fighters together with a command center and supports information gathering and human navigation for situation awareness in urban settings [35]. Previous studies have also explored AR for a number of architecture and construction applications. For example, previous research has used AR to overlay locations of subsurface utility lines onto real world views in order to demonstrate the potential of AR in helping maintenance workers avoid buried infrastructure and structural elements [36]. AR has also been used to study the extent of horizontal displacements sustained by structural elements due to extreme loading conditions [37], and to assist viewers in computer-aided drawing [38].

More recently, the authors designed and implemented ARVISCOPE, a general purpose 3D visualization environment capable of animating simulation models of dynamic engineering operations in outdoor AR [39]. ARVISCOPE supports real time communications with Global Positioning System (GPS) and motion tracking devices in order to create and constantly update live AR animations of an engineering operation displayed to a mobile observer. Another major outcome of this research was ROVER, a mobile computing apparatus designed to address the problem of geo-referenced registration, which has historically been a major research challenge in outdoor AR [40]. A proper registration results in objects in the real world and superimposed virtual objects being precisely aligned in the global coordinate system (i.e., longitude, latitude, and altitude) with respect to each other. When used together, ARVISCOPE and ROVER can create interactive AR animations of any length and complexity. The designed framework is compatible with commonly accepted data transfer protocols [41].

Fig. 1. Snapshots of experiments conducted by authors in which real world views were used as a readymade backdrop for displaying superimposed information in augmented reality.

The 2010 Horizon Report, a joint report prepared by the New Media Consortium and EDUCAUSE [42], predicts that the use of AR in education will be widespread in two to three years. AR can enhance the traditional learning experience since:

• The ability to learn concepts and ideas through interacting with a scene and building one's own knowledge (constructivist learning) facilitates the generation of knowledge and skills that otherwise would take too long to accumulate.

• Traditional methods of learning spatially-related content by viewing 2D diagrams or images create a cognitive filter. This filter exists even when working with 3D objects on a computer screen because the manipulation of the objects in space is done through mouse clicks. By using 3D immersive AR, a more direct cognitive path toward understanding the content can be made possible.

• Making mistakes during the learning process has literally no real consequence for the learner, whereas in traditional learning, failure to follow certain rules or precautions while operating machinery or handling a certain hazardous material could lead to serious safety and health related problems.

• AR supports discovery-based learning, which refers to a learning technique in which students take control of their own learning process, acquire information, and use that information in order to experience scenarios which may not be feasible to construct in reality given the time and space constraints of a typical engineering project.

• One of the most important objectives of all academic curricula is to promote social interactions among learners and to teach them to listen, respect, influence, and act. By providing multiple students access to a shared augmented space populated with real and virtual objects, they are encouraged to become involved in teamwork and brainstorming activities in order to solve a problem, which at the same time helps them improve their communication skills.

There are two major categories of AR visualization technologies: marker-less and marker-based. A marker is a 2D computer-recognizable graphical pattern or symbol printed on a sheet of paper to which a piece of information (e.g. video, audio, text, diagram, or graphics) is assigned. In marker-less AR, markers are not used and the condition under which a virtual object is displayed is defined by the user. For example, the user can specify that a virtual object is always displayed at a certain 2D coordinate on an AR display. In marker-based AR, a virtual object is displayed only if its corresponding marker pattern is visible. A well-known example of marker-based AR is the Lego Digital Box technology used in Disney World. As shown in Fig. 2, kids can hold a Lego box in front of a camera and the animated 3D model of the completed Lego set is superimposed and displayed on top of the box when the box is visible to the camera.

Fig. 2. Using marker-based AR visualization, kids can see a completed Lego set overlaid on a Lego Digital Box in Disney World.

In a more recent effort, the Public Broadcasting Station (PBS) used part of a $72 million grant from the U.S. Department of Education to test whether AR games and mobile devices can help young children with skills such as sorting and measuring [43]. In this research, marker-based AR was used. In particular, marker patterns were printed on sheets of paper bound together to form an “AR Book”. Fig. 3 shows sample pages of GEN-1, a first generation AR Book designed by the authors for prototype development and validation experiments. More details about the development of GEN-1 are discussed in Section 4.

3.2. Visual information delivery using augmented reality

Fig. 4 shows the conceptual framework developed in the presented research to deliver visual information from a remote jobsite to students in a classroom. As shown in this figure, each student is equipped with an AR head-mounted display (HMD) which enables viewing of augmented information and graphics overlaid on the markers inside the AR Book. When a marker is visible through the HMD, the corresponding information is shown to the student. The following subsections describe the necessary steps that were taken to deliver visual information to a student.

Fig. 3. GEN-1 “AR Book” prototype developed by authors contains AR markers (right page) used to display superimposed information, as well as supplementary equipment information and images (left page).

3.2.1. Capturing and transmission of remote scene information

Real time video streams of a remote construction jobsite, captured by an IP-addressable camera, are transmitted via the internet to the classroom and displayed on a large projection screen. The global position of the camera (i.e. longitude, latitude, and altitude) is also obtained using a GPS device mounted on top of the camera. In order to identify an object in the video (e.g. crane, excavator, hauler), it is essential to geo-reference that object. This is done by capturing the object's global position using a GPS device. Most modern construction equipment such as graders or dozers takes advantage of built-in GPS transmitters which can be used to obtain its position in the field. However, if necessary, site personnel can be asked to mount a GPS device on any object of interest whose position needs to be geo-referenced in the video. The positional information is constantly sent to a computer.
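For illustration only, the following Python sketch shows one way such a capture step could be organized: frames are read from an IP-addressable camera stream (here via OpenCV) and each frame is paired with the camera's GPS fix. The camera URL and the read_gps() helper are hypothetical placeholders, not details of the actual prototype.

# Illustrative sketch: pair each jobsite video frame with the camera's GPS fix.
import cv2

CAMERA_URL = "http://192.168.1.50/mjpg/video.mjpg"   # assumed camera address

def read_gps():
    # Placeholder for polling the GPS receiver mounted on top of the camera.
    # Values correspond to the camera position used in the worked example below.
    return {"lon": -81.34972, "lat": 28.46583, "alt_m": 28.0}

cap = cv2.VideoCapture(CAMERA_URL)
while cap.isOpened():
    ok, frame = cap.read()          # one video frame from the remote jobsite
    if not ok:
        break
    camera_fix = read_gps()         # global position of the camera viewpoint
    # frame and camera_fix would be forwarded to the classroom display and to
    # the geo-referencing step described next.
cap.release()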

Knowing the global positions of the camera (viewpoint) and an object of interest, the local position of the object inside the coordinate frame of the projection screen, with the camera located at the center point of the screen, is calculated using existing geo-referencing methods such as the algorithm introduced in [44] and used by the authors in [45]. For example, if the camera, located at 81° 20′ 59″ W and 28° 27′ 57″ N (elevation 28 m above mean sea level), is capturing views of an object (e.g. construction equipment) located at 81° 21′ 00″ W and 28° 28′ 00″ N (elevation 26 m above mean sea level), then using the formulation presented in [44], the planar distance and azimuth between the camera and the object will be 96 m and 343.84°, respectively. Hence, assuming that X values indicate points to the right (+) or to the left (−) of the camera's lens, Y values show the elevation difference (positive if the object is located above the camera's lens elevation, and negative otherwise), and the Z axis runs from the camera's lens into the depth of the field, the local position of the object in the camera's coordinate frame can be calculated as follows:

Z = 96 × cos(360° − 343.84°) = 92.21 m
X = 96 × sin(360° − 343.84°) = 26.72 m
Y = 28.00 − 26.00 = 2.00 m
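To make the arithmetic easy to reproduce, the short Python sketch below recomputes the same local coordinates from the planar distance, azimuth, and elevations quoted above. It is illustrative only and simply mirrors the paper's convention; computing the distance and azimuth themselves from the two GPS fixes would use a geodesic solution such as [44].

# Reproducing the worked example above in the paper's local camera frame
# (X: right of the lens, Y: elevation difference, Z: depth along the view axis).
import math

distance_m = 96.0          # planar camera-to-object distance from [44]
azimuth_deg = 343.84       # azimuth from the camera to the object
cam_elev_m, obj_elev_m = 28.0, 26.0

angle = math.radians(360.0 - azimuth_deg)
Z = distance_m * math.cos(angle)      # 92.21 m into the depth of the field
X = distance_m * math.sin(angle)      # 26.72 m to the right of the lens
Y = cam_elev_m - obj_elev_m           # 2.00 m, as in the worked example

print(round(X, 2), round(Y, 2), round(Z, 2))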


Fig. 4. Major steps in bringing the collaborative augmented reality learning experience to the classroom.


These calculated coordinate values will then be converted to orthogonal coordinates and used to determine the position of the object inside the live streaming video projected on the 2D screen. This contextual knowledge enables further interaction with the objects, as described in the following subsection.
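The paper does not specify the exact conversion used for this step; one plausible interpretation, sketched below in Python, is a standard pinhole projection that maps the local (X, Y, Z) coordinates onto a pixel position in the video frame. The focal length, principal point, and image size are illustrative assumptions.

# One plausible mapping from local camera-frame coordinates to a screen position.
def to_screen(X, Y, Z, f_px=800.0, cx=640.0, cy=360.0):
    # Project a point in the camera frame onto an assumed 1280x720 video image.
    u = cx + f_px * (X / Z)   # horizontal pixel position
    v = cy - f_px * (Y / Z)   # vertical pixel position (image y grows downward)
    return u, v

print(to_screen(26.72, 2.0, 92.21))   # approximate on-screen position of the object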

Fig. 5. As soon as the distance between a tagged finger and a video object is less than a minimum threshold, the augmented information is overlaid and displayed to the student.

3.2.2. Visual delivery of information

Students walk up to the projection screen while carrying their AR Books and watch the video stream. As shown in Fig. 5, each student wears a touch sensor (i.e. tag) on his or her index finger to have the ability to interact with the video scene and retrieve information about the objects of interest. As the student moves his or her tagged finger on the screen, the local position of the tagged finger is captured by a network of data receivers installed in predetermined positions. At least three data receivers are needed in order to precisely calculate the tagged finger position through triangulation in the local coordinate frame of the projection screen. An additional (fourth) data receiver is also used to increase accuracy and eliminate potential errors in locating the position of the tag on the projection screen.
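The linearized least-squares formulation below is one common way to solve this kind of range-based positioning problem. It is shown only to illustrate why at least three receivers are needed and how a fourth adds redundancy; it is not the proprietary solver used by the UWB platform, and the receiver positions and ranges are made-up values.

# Illustrative least-squares trilateration of the tag in the screen plane.
import numpy as np

def locate_tag(receivers, ranges):
    # receivers: (n, 2) known positions; ranges: (n,) measured tag-receiver distances.
    p = np.asarray(receivers, dtype=float)
    r = np.asarray(ranges, dtype=float)
    # Subtract the last receiver's equation to remove the quadratic unknown.
    A = 2.0 * (p[:-1] - p[-1])
    b = (r[-1] ** 2 - r[:-1] ** 2) + np.sum(p[:-1] ** 2, axis=1) - np.sum(p[-1] ** 2)
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    return xy

receivers = [(0.0, 0.0), (3.0, 0.0), (0.0, 2.0), (3.0, 2.0)]  # assumed corners around the screen (m)
ranges = [1.118, 2.062, 1.803, 2.500]                         # example distance measurements
print(locate_tag(receivers, ranges))                          # approx. (1.0, 0.5)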

When the student's tagged finger moves close to an object in the video (i.e. the linear distance between the tagged finger and the position of a video object is less than a certain threshold), relevant information (e.g. 2D or 3D models, manufacturer's data, loading charts, work schedule) is displayed to the student through the HMD on AR Book markers. Students can also move their AR Books around the room to form groups, virtually manage a project, discuss a certain scenario, and explore alternative solutions in a collaborative setting, while learning basic concepts such as equipment operations, jobsite safety, resource utilization, work sequencing, and site layout.
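The interaction rule itself reduces to a nearest-object threshold test. The Python fragment below sketches that logic; the object positions, threshold value, and content entries are illustrative, not those used in the prototype.

# Sketch of the proximity rule: show an object's content when the fingertip is close enough.
import math

THRESHOLD_M = 0.15   # assumed proximity threshold on the projection screen

objects_on_screen = {            # object name -> (x, y) position on the screen plane (m)
    "crane": (0.8, 1.2),
    "excavator": (2.1, 0.6),
}
content = {
    "crane": "3D model, manufacturer's data, loading chart",
    "excavator": "3D model, work schedule",
}

def content_for(finger_xy):
    # Return the content of the closest object if it lies within the threshold.
    name, dist = min(
        ((n, math.dist(finger_xy, p)) for n, p in objects_on_screen.items()),
        key=lambda item: item[1],
    )
    return content[name] if dist < THRESHOLD_M else None

print(content_for((0.85, 1.15)))   # close to the crane, so its content is returned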

4. Implementation and results

The authors have successfully created a first generation AR Book in order to test if contextual graphical information can be effectively presented to students in real time.

4.1. GEN-1: first generation AR Book prototype

The first generation AR Book developed in this research, GEN-1, is a prototype of an AR-enhanced book that was implemented in Visual Studio .NET, using the ARToolkit library. ARToolkit is one of the earliest object-oriented programming libraries that provide functionalities and methods to track fiducial markers. As shown in Fig. 6, a fiducial marker is a logo bounded by a thick black frame. The four corners of the frame are used to compute the camera pose, and the center logo is used to interpret the identity of the marker so it can be mapped to certain 2D/3D graphics. Because of its simplicity and fast tracking speed, ARToolkit has been popular for over a decade in numerous AR visualization applications.
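GEN-1 itself relies on ARToolkit's own tracker. Purely to illustrate the underlying idea of recovering a camera pose from the four detected corners of a square fiducial marker, the sketch below uses OpenCV's solvePnP as a stand-in; the corner pixel coordinates, marker size, and camera intrinsics are example values and do not come from the actual prototype.

# Illustrative pose recovery from the four corners of a square marker (OpenCV stand-in).
import numpy as np
import cv2

MARKER_SIZE = 0.08   # assumed marker side length in meters

# 3D corner positions in the marker's own coordinate frame (z = 0 plane).
object_pts = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float32)

# Corner pixel positions as they might be reported by a marker tracker.
image_pts = np.array([[300, 200], [420, 210], [410, 330], [290, 320]], dtype=np.float32)

camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, camera_matrix, dist_coeffs)
# rvec and tvec give the marker-to-camera transform used to place the virtual
# equipment model on top of the printed marker.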

Using the functionalities provided by ARToolkit, GEN-1 provides a fiducial marker-based AR interface for students and helps them gain a better understanding of construction equipment by overlaying 3D models of construction machinery on AR markers [46]. As shown in Fig. 3, GEN-1 consists of left hand pages each coupled with a corresponding right hand page. Each left hand page contains informative details and illustrations about a certain piece of construction equipment, which can include a wide range of information such as details about various parts of the equipment, the history of the equipment, major components, functions, and also its current manufacturers.

Fig. 6. 3D model of a snowman superimposed on a fiducial AR marker.

The corresponding right hand page contains one or more marker patterns. Fig. 7 shows snapshots of two validation experiments. As shown in this figure, GEN-1 uses a normal textbook as the main interface. Students can turn the pages of the book, look at the pictures, and read the text without any additional technology. However, when looking at the same pages through an AR display, students will see 3D virtual models of the equipment discussed on the left hand page on top of the marker depicted on the right hand page. The marker patterns are detectable by the designed AR visualization platform. Once a marker is detected, a virtual graphic model of construction equipment (previously assigned to that marker inside the AR application) is displayed on the marker. The output can be seen on a handheld display, through a HMD, or on the computer screen. The models appear to be attached to the real page, so students can see the AR scene from any perspective by moving themselves or the book.

The virtual content can be static or animated. This interface design supports collaborative learning, as several students can look at the same book through their AR displays and see the virtual models superimposed over the book pages from their own viewpoints. Since they can see each other and the real world at the same time as the virtual models, they can easily communicate using normal face-to-face communication cues. All of the students using the AR Book interface have their own independent view of the content, so any number of people can view and interact with a virtual model as easily as they could with a real object [47].

Fig. 7. Authors have successfully conducted preliminary validation experiments of the GEN-1 AR Book.

4.2. Collaborative learning in a multi-user AR environment

Recent research by the authors has also led to the development of ARVITA, an AR-based 3D visualization environment in which multiple users can view and interact with modeled engineering processes from different perspectives. This framework is an expansion of a VR-based visualization platform called VITASCOPE, an open, loosely-coupled, user-extensible 3D animation description language designed specifically for visualizing simulated construction processes and resulting products in 3D, and for developing higher-level construction visualization tools [48]. By integrating computer-detectable AR marker patterns and viewpoint tracking technology into VITASCOPE, the newly developed collaborative AR learning tool enables students to look at and immerse themselves in dynamic animations of simulation-based construction scenarios displayed to them on top of the markers. Fig. 8 shows a snapshot of an experiment where two students are observing a simulated earthmoving operation from two different perspectives. In this experiment, a camera mounted in front of each user's HMD sent a streaming video of the real world to the computer. The predefined AR marker pattern was then detected inside each video frame, and viewpoint transformation parameters were accordingly calculated in real time. The 3D modeled earthmoving operation was then overlaid on top of the marker and displayed on a separate computer screen to each user.

Fig. 8. Snapshot of a collaborative learning experiment conducted using marker-based AR visualization in which multiple users simultaneously viewed and interacted with the scene.

Table 1
Major components used to develop a large-scale test environment.

Component | Item | Purpose
IP-accessible camera | StarDot NetCam XL (wireless) | Capture and transmit real time videos of a remote location
Indoor locationing system | Ubisense ultra-wide band (UWB) platform (receivers and tags) | Locate the position of the student's tagged finger on the projection screen
Head mounted display (HMD) | eMagin Z800 3DVisor | Display graphical information to the student through a pair of AR goggles
HMD-mounted video camera | Microsoft LifeCam VX-5000 | Capture views of the student's surroundings to be used as the backdrop of the AR visualization

5. Future work

This paper focused on describing the research motivation and the technical aspects of an AR-based pedagogical framework developed by the authors. The next phases of this research will comprise full-scale usability experiments in classroom settings aimed at evaluating student learning in the context of the developed methodology. Currently, a full scale prototype is being developed to test the functionality of the presented methodology in a classroom setting. The major peripheral components of this test bed are listed in Table 1.

Given the rapid rate of technology advancement, it is conceivable that the long-term success and adoptability of this work will, to a large extent, not be limited by the cost of technology. Nowadays, students have easy access to digital media and tools for free or at a very low price. Many such existing devices and tools can be easily modified for use in this project. To this end, the authors will investigate and test the possibility of using existing Wi-Fi access points in classrooms or Bluetooth-enabled mobile cellphones as a replacement for the relatively expensive equipment necessary to build a network of data receivers around the projection screen.

6. Summary and conclusions

The main motivation behind this research was that, unlike several other scientific and engineering fields, many construction and civil engineering programs still heavily rely on traditional instructional methods and fall behind in terms of integrating state-of-the-art information delivery technologies into the classroom. In this paper, the latest results of an ongoing project which aims to investigate the requirements and develop a real time interactive visual information delivery framework were presented.

In this framework, real time video streams of a remote construction jobsite are captured and transmitted via the internet to the classroom, and displayed on a large projection screen. Each student can walk up to the screen while carrying an AR Book and watch the video stream. Students have the ability to interact with the scene and retrieve information about objects by wearing smart tags on their fingers. As the student moves his or her finger on the screen, a network of motion sensors captures the position of the student's finger on the projection screen and maps that position to the locations of objects in the video. When the student's finger moves close to an object in the video, relevant information (e.g. 2D or 3D models, manufacturer's data, loading charts) is augmented on the AR Book and displayed to the student. In order to see this overlaid information, each student is equipped with an AR HMD which enables viewing of augmented information and overlaid graphics. Students can also move their AR Books around the room to form groups, virtually manage a project, discuss specific planning scenarios, and explore alternative solutions in a collaborative setting.

Acknowledgments

The presented work is partially supported by a grant from the Office of Research and Commercialization (ORC) at the University of Central Florida (UCF). Activities that resulted in the development of GEN-1 and ARVITA were supported by the National Science Foundation (NSF) through grant CMS-0448762. The authors gratefully acknowledge the support from the UCF and the NSF. The authors would also like to thank Mr. Suyang Dong (Ph.D. Student at the University of Michigan) for his participation in the development of ARVITA, and Mr. Asif Iqbal (former Visiting Student Researcher at the University of Michigan) for his participation in the development of the first generation AR Book. Any opinions, findings, conclusions, and recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of the UCF, NSF, or the individuals named above.

References

[1] Association for American Colleges and Universities (AAC&U), Principles of excellence, available at: http://www.aacu.org/leap/principles_of_excellence.cfm, 2007 (accessed May 26, 2011).

[2] E. Seymour, N. Hewitt, Talking about Leaving, Westview Press, Boulder, CO, 1997.


[3] M. Prince, Does active learning work? A review of the research, in: Journal of Engineering Education, 93(3), American Society of Engineering Education (ASEE), Washington, DC, 2004, pp. 223–246.

[4] K. Friedrich, S. Sellers, J. Burstyn, Thawing the chilly climate: inclusive teaching resources for science, technology, engineering, and math, in: D. Robertson, L. Nilson (Eds.), To Improve the Academy: Resources for Faculty, Instructional, and Organizational Development, Volume 25, Jossey-Bass, Indianapolis, IN, 2009, pp. 133–141.

[5] R. Fruchter, The A/E/C virtual atelier: experience and future directions, in: Proceedings of the 4th Congress on Computing in Civil Engineering, Philadelphia, PA, 1997, pp. 395–402.

[6] J. Bowie, Enhancing classroom instruction with collaborative technologies, eSchool News, available at: http://www.eschoolnews.com/2010/12/20/enhancing-classroom-instruction-with-collaborative-technologies/, 2010 (accessed May 26, 2011).

[7] J.E. Mills, D.F. Treagust, Engineering education — is problem-based or project-based learning the answer, Australasian Journal of Engineering Education, available at: http://www.aaee.com.au/journal/2003/mills_treagust03.pdf, 2003 (accessed December 21, 2011).

[8] D. Arditi, G. Polat, Graduate education in construction management, in: Journal of Professional Issues in Engineering Education and Practice, 136(3), American Society of Civil Engineers (ASCE), Reston, VA, 2010, pp. 175–179.

[9] C.B. Tatum, Construction engineering education, need, content, learning approaches, in: Proceedings of the Construction Research Congress (CRC), Banff, Canada, 2010, pp. 183–193.

[10] D. Echeverry, Multimedia-based instruction of building construction, in: Proceedings of the 3rd Congress on Computing in Civil Engineering, Anaheim, CA, 1996, pp. 972–977.

[11] W. Johnson, R. Jone, Declining interest in engineering studies at a time of increased business need, available at: http://www.worldexpertise.com/DecliningInterestinEngineeringStudiesataTimeofIncreasedBusinessNeeds.htm, 2006 (accessed May 26, 2011).

[12] M. Nehdi, Crisis of civil engineering education in information technology age: analysis and prospects, in: Journal of Professional Issues in Engineering Education and Practice, 128(3), American Society of Civil Engineers (ASCE), Reston, VA, 2001, pp. 131–137.

[13] S. Tobias, They're Not Dumb, They're Different: Stalking the Second Tier, Research Corporation, Tucson, AZ, 1990.

[14] F. Liarokapis, N. Mourkoussis, M. White, J. Darcy, M. Sifniotis, P. Petridis, A. Basu, P.F. Lister, Web3D and augmented reality to support engineering education, in: World Transactions on Engineering and Technology Education, 3(1), UNESCO International Center for Engineering Education, Melbourne, Australia, 2004, pp. 11–14.

[15] D.S. Stratham, C.R. Torell, Computers in the Classroom: The Impact of Technology on Student Learning, Army Research Institute, Boise State University, 1996, available at: www.temple.edu/LSS/spot206.htm (accessed May 26, 2011).

[16] H. Wenglinsky, Does it Compute?: The Relationship between Educational Technology and Student Achievement in Math, Educational Testing Services (ETS), Princeton, NJ, 1998.

[17] H. Jennings, L. Lucca, Summary of Experimental Study of the Ready to Teach Developmental Electronic Field Trip Reader Project, ORC Macro Report, Calverton, MD, 2005.

[18] K.A. Wong, K.F. Wong, Building information modeling for tertiary construction education in Hong Kong, Journal of Information Technology in Construction (ITCon) 16 (2011) 467–476.

[19] J.I. Messner, S.C.M. Yerrapathruni, A.J. Baratta, V.E. Whisker, Using virtual reality to improve construction engineering education, in: Proceedings of the American Society for Engineering Education Annual Conference and Exposition, Nashville, TN, 2003.

[20] S.M. AbouRizk, A. Sawhney, Simulation and gaming in construction engineering education, in: Proceedings of the American Society of Engineering Education Conference, Edmonton, Canada, 1994.

[21] K. Ash, Effective use of digital tools seen lacking in most tech.-rich schools, Education Week, available at: http://www.edweek.org/ew/articles/2011/02/23/21computing.h30.html, 2011 (accessed May 26, 2011).

[22] M. Carliner, Life and death in construction industry, Housing Economics 108 (2001), available at: http://www.michaelcarliner.com/HE0108-Life-Death-in-Construction.pdf (accessed May 26, 2011).

[23] E.S. Pollack, R.T. Chowdhury, Trends in work-related death and injury rates among U.S. construction workers, The Center to Protect Workers' Rights (CPWR): Report D1-00, Silver Spring, MD, 2001.

[24] T.S. Abdelhamid, J.G. Everett, Identifying root causes of construction accidents, in: Journal of Construction Engineering and Management, 126(1), American Society of Civil Engineers (ASCE), Reston, VA, 2000, pp. 52–60.

[25] T.M. Toole, Construction site safety roles, in: Journal of Construction Engineering and Management, 128(3), American Society of Civil Engineers (ASCE), Reston, VA, 2002, pp. 203–210.

[26] M. Behm, Linking construction fatalities to the design for construction safety concept, in: Journal of Safety Science, 43(8), Elsevier Science, New York, NY, 2005, pp. 589–611.

[27] A.S. Aldosary, S.A. Assaf, Analysis of factors influencing the selection of college majors by newly admitted students, in: Journal of Higher Education Policy, 9(3), Taylor and Francis Group, 1996, pp. 215–220.

[28] A. Sawhney, A. Mund, J. Koczenasz, Internet-based interactive construction management learning system, in: International Journal of Construction Education and Research, 6(3), Associated Schools of Construction (ASC), Provo, UT, 2001, pp. 124–138.

[29] R. Shaw, Educational Requirements for Avoidance of Structural Adequacy, Institution of Structural Engineers, Structural Failures in Buildings, London, UK, 1981.

[30] A.C. Walker, Study and Analysis of the First 120 Failure Cases, Institution of Structural Engineers, Structural Failures in Buildings, London, UK, 1981.

[31] A.H. Behzadan, V.R. Kamat, Visualization of construction graphics in outdoor augmented reality, in: Proceedings of the 37th Conference on Winter Simulation, Orlando, FL, 2005, pp. 1914–1920.

[32] R. Azuma, A survey of augmented reality, in: Teleoperators and Virtual Environments, 6(4), MIT Press, Cambridge, MA, 1997, pp. 355–385.

[33] H. Kaufmann, D. Schmalstieg, Mathematics and geometry education with collaborative augmented reality, in: Computers and Graphics, 27(3), Elsevier Science, New York, NY, 2003, pp. 339–345.

[34] W. Barfield, T. Caudell, Fundamentals of Wearable Computers and Augmented Reality, Lawrence Erlbaum Associates, Mahwah, NJ, 2001.

[35] M.A. Livingston, L.J. Rosenblum, S.J. Julier, D. Brown, Y. Baillot, J.E. Swan II, J.L. Gabbard, D. Hix, An augmented reality system for military operations in urban terrain, in: Proceedings of the Interservice/Industry Training, Simulation, and Education Conference, National Training and Simulation Association, Arlington, Virginia, 2002, pp. 89–96.

[36] G.W. Roberts, A. Evans, A. Dodson, B. Denby, S. Cooper, R. Hollands, The use of augmented reality, GPS, and INS for subsurface data visualization, in: Proceedings of the 22nd International Conference of the FIG, Washington, D.C., 2002.

[37] V.R. Kamat, S. El-Tawil, Evaluation of augmented reality for rapid assessment of earthquake-induced building damage, in: Journal of Computing in Civil Engineering, 21(5), American Society of Civil Engineers (ASCE), Reston, VA, 2005, pp. 303–310.

[38] X. Wang, P.S. Dunston, Potential of augmented reality as an assistant viewer for computer-aided drawing, in: Journal of Computing in Civil Engineering, 20(6), American Society of Civil Engineers (ASCE), Reston, VA, 2006, pp. 437–441.

[39] A.H. Behzadan, V.R. Kamat, Automated generation of operations level construction animations in outdoor augmented reality, in: Journal of Computing in Civil Engineering, 23(6), American Society of Civil Engineers (ASCE), Reston, VA, 2009, pp. 405–417.

[40] A.H. Behzadan, B.W. Timm, V.R. Kamat, General-purpose modular hardware and software framework for mobile outdoor augmented reality applications in engineering, Journal of Advanced Engineering Informatics, 22(1), Elsevier Science, New York, NY, 2008, pp. 90–105.

[41] A.H. Behzadan, V.R. Kamat, Reusable modular software interfaces for outdoor augmented reality applications in engineering, in: Proceedings of the International Workshop on Computing in Civil Engineering, American Society of Civil Engineers, Pittsburgh, PA, 2007, pp. 825–837.

[42] L. Johnson, A. Levine, R. Smith, S. Stone, The 2010 Horizon Report, The New Media Consortium, Austin, TX, 2010.

[43] D. Rule, PBS uses augmented-reality games to help kids learn, available at: http://www.appedia.com/news/1889.html, 2011 (accessed May 26, 2011).

[44] T. Vincenty, Direct and inverse solutions of geodesics on the ellipsoid with application of nested equations, in: Survey Review, 33(176), Maney Publishing, Leeds, UK, 1975, pp. 88–93.

[45] A.H. Behzadan, V.R. Kamat, Georeferenced registration of construction graphics in mobile outdoor augmented reality, in: Journal of Computing in Civil Engineering, 21(4), American Society of Civil Engineers (ASCE), Reston, VA, 2007, pp. 247–258.

[46] A.H. Behzadan, A. Iqbal, V.R. Kamat, A collaborative augmented reality based modeling environment for construction engineering and management education, in: Proceedings of the 43rd Conference on Winter Simulation, Phoenix, AZ, 2011.

[47] M. Billinghurst, H. Kato, I. Poupyrev, The MagicBook: a transitional AR interface, in: Journal of Computer and Graphics, 25(5), Elsevier Science, New York, NY, 2001, pp. 745–753.

[48] V.R. Kamat, VITASCOPE: Extensible and scalable 3D visualization of simulated construction operations, Ph.D. Dissertation, Department of Civil and Environmental Engineering, Virginia Tech, Blacksburg, VA, 2003.