
REMOTE EXPERIMENTATION USING AUGMENTED REALITY

Salaheddin Odeh
Department of Computer Engineering, Faculty of Engineering, Al-Quds University, Abu Dies, Jerusalem, Palestine
[email protected]

Shatha Abu Shanab
Electronic and Computer Engineering Master Program, Faculty of Engineering, Al-Quds University, Abu Dies, Jerusalem, Palestine
[email protected]

ABSTRACT
In engineering and science education, laboratories are an essential part of the study curriculum, and a study program is incomplete without them. Laboratories cover the practical side of engineering studies, through which students consolidate their theoretical knowledge as a stable basis for the skills they need to deal with real problems after graduation. This research realizes the idea of combining internet laboratories with augmented reality, which we designate an Augmented Reality Internet Laboratory (ARI-Lab). By definition, augmented reality combines the interactive real world with a world generated by an interactive computer system in such a way that they appear as one environment: it shows the real world with an overlay of additional information so that the user cannot distinguish between the real world and the virtual augmentation [1], [2]. In our approach, web-based remote experimentation using augmented reality is achieved by letting students carry out an engineering experiment represented by real and virtual elements, components and equipment, through overlaying real kits with virtual (graphical) objects. AR technologies allow the interaction between students and the graphically represented remote experiment to be no longer restricted to a face-to-screen fashion; rather, we strive to immerse students in a remotely located real laboratory environment. Many modern laboratory instruments and devices can be controlled and accessed not only interactively but also through special kinds of interfaces, enabling us to control and implement the experiment via the internet. In this contribution, we show how an interactive web-based ARI-Lab can be designed, developed and implemented.

Keywords: Augmented Reality (AR), human-computer interaction, experimenting kit, remote laboratories, distributed systems, usability testing, C# and .NET.

1 INTRODUCTION
The main goal of an engineering or science laboratory is to provide students with practical skills through experimenting with real devices and instruments [3], offering them complete learning consisting of a mixture of theoretical and practical sessions that enables them to deal with and solve real-world problems after graduation. In recent times, the use of laboratory simulator programs has increased rapidly. Although students can build their circuits and test them with these programs, simulators are mostly based on mathematical models that give inaccurate calculations, deal only with graphically or virtually represented equipment, and lack interaction between students and instructors. Simulator programs serve as a supportive complement to traditional laboratories and cannot replace face-to-face laboratory rooms.

Nowadays, the creation of remote laboratories that students can access at any time and from any place is possible because internet technologies offer a new education and training system and contemporary laboratory instruments already support remote control via the internet. Conventional experimentation, in contrast, leaves several constraints unsolved: time restrictions complicate organizing sessions for evening part-time students; resource intensiveness increases teaching costs, since conventional labs require equipment, space, manpower, safety measures and so on; time limitations complicate course organization around an experiment; and, last but not least, shared equipment leads to the so-called "driver-passenger" syndrome. By contrast, internet remote experimentation has several advantages: it solves most of the above problems, the remote experiment is accessible through a standard web browser anytime and anywhere, and manuals, procedures, etc. are available and accessible at the same time.

Apart from these benefits, there are several complicating factors in the architecture of remote environments, such as time, transmission and operation delays as well as incomplete feedback. Additionally, many devices were originally designed for direct control and thus do not have adequate sensors or status information, so their integration into a remote experiment requires additional effort and cost. Unfortunately, most remote laboratories represent their instruments and equipment graphically, which reduces the realism of the remote lab [3], [4]. This research tries to compensate for this lack of information visualization by means of augmented reality (AR). Rather than showing either the actual world or a virtual world, augmented reality shows the real world with an overlay of additional information so that the user cannot distinguish between the real world and the virtual augmentation [1], [2]. Nowadays, augmented reality technology offers new applications in many areas of human life, ranging from medical and military systems [5] to entertainment [6], manufacturing, maintenance, and repair [7]. Analyzing and understanding how AR has been implemented and applied can therefore be of great significance for contributing new ideas on applying AR in internet laboratories. This paper shows the techniques and methods necessary to apply AR to internet laboratories in order to achieve web-based remote experimentation based on augmented reality, i.e. augmented reality internet laboratories. Students can carry out an engineering experiment represented by real and virtual elements, components and equipment through overlaying real kits with virtual (graphical) objects. The combined objects, consisting of both computer graphics of electronic elements and real electronic elements, are familiar to students. An ARI-Lab is a system composed of hardware and software components that enables students to perform their experiments via a conventional web browser, independently of time and place.

2 AUGMENTED REALITY: DEFINITION, COMPARISON AND APPLICATIONS

By definition, augmented reality is the combination of the interactive real world with an interactive computer-generated world in such a way that they appear as one environment, so that the user cannot distinguish between the real world and the virtual augmentation [1], [2]. While AR augments the user's view of the real world by composing virtual objects with their real-world counterparts, requiring that the user maintain a sense of presence in that world, virtual reality immerses the user inside a virtual world that completely replaces the real world outside [1], [2]; it breaks the physical limitations of space and allows users to act as though they were somewhere else [9]. Note that mixtures of the two approaches are possible, depending on the application field. Fig. 1 classifies AR on the reality-virtuality (RV) continuum. In recent years, AR has been successfully applied in various fields; one application of AR is to provide visualization of hidden objects [1], [2], [8], [10]. In the following, we mention some examples where AR has been successfully applied.

• Process control [11]: Video information is used as sensor data to control process variables that are difficult to measure. Operators use real-life process pictures along with an overlay of signal values visualized by virtual objects to control the technical process. The overlaying of signal values on the video pictures is what makes the system AR.

• Medical systems [12]: AR systems create 3D volume visualizations from CT (computed tomography) or MRI (magnetic resonance imaging) data for surgery. AR systems can show a volume-rendered image of the fetus overlaid on the abdomen of the pregnant woman, or present an overlay of an acoustic neuroma after resection, showing how it had entered the internal auditory meatus.

• Military systems [5]: AR systems display information to the pilot on the windshield of the cockpit or on the visor of the flight helmet. Soldiers receive information about their environment (e.g. the locations of friends and foes) and information that helps them coordinate their locations.

Figure 1: Reality-Virtuality (RV) Continuum [6]

• Entertainment [6]: AR can be applied to enhance games that people play. The image that the players see shows not only the game area and their real opponents but also virtual players playing along with them.

• Manufacturing, maintenance and repair [7]: For manufacturing, maintenance and repair, AR systems insert additional information into the field of view, such as labels displayed on the parts of a system along with operating instructions while a mechanic repairs the system.

3 DISTRIBUTED SYSTEM ARCHITECTURE

Figure 2: System architecture of the augmented reality environment for web-based remote experimentation

Distance laboratories enable remotely located students to complete lab experiments unconstrained by time or geographical considerations. They have the opportunity to practice their laboratory experiments by obtaining live data in real time, similar to a face-to-face classroom lab. In addition to offering students the possibility to carry out real-time labs remotely from any place and at any time [10], an ARI lab presents the experimental kit through real video combined with an overlay of virtual objects. In our approach, computer graphics of electronic elements and equipment overlaid on real-time video of the experimental kit are what make our distance learning environment augmented-realistic, aiming at providing engineering students with a quasi-real laboratory environment.

The application server (Fig. 2) is responsible for the application processing and data management, whereas a client runs the representation software, the web-based AR user-interface. The best architectural model for describing this arrangement of distributed subsystems is the thin-client model [13]. By contrast, in the fat-client model, the server undertakes only the data management and the client software implements the application logic and the interaction with the users. The events for controlling the system can be either interactive commands entered by the students or control signals given by the experimental kit on its interface; that is, our system is categorized as an event-processing system. The architecture proposes that our distributed application be made up of the following components (Fig. 2):

3.1 Web server
The web server is the communication middleware over the internet between the clients (the students) and the remote experiment, whose visualization on the client side is based on augmented reality technology. A web browser serves as mediator between the experimenting student and the lab server, representing the central unit of the e-learning environment and functioning along with the web server as a coordinator between the various components.


3.2 Lab Server and remote experiment
At the server end, laboratory instrumentation and the experimenting kit are connected to a server designated here as the lab server. The lab server communicates with the web server to exchange data with the remote, distributed clients via the internet. As previously discussed, internet remote experimentation aims at using real instruments and equipment in laboratories instead of simulations. Fortunately, most current instruments such as oscilloscopes and multimeters can be controlled through a PCI GPIB (General Purpose Interface Bus) card and a GPIB cable [14], allowing PCs to communicate with over 2000 instruments made by over 200 manufacturers. This technological progress is what makes internet remote experimentation possible, because these instruments make measured values available to other systems sharing the same GPIB. The main purpose of the GPIB is to send information between two or more devices. An internet remote experiment can be any engineering or science lab covering topics related to electric circuits or electronics, consisting of resistors, capacitors, inductors, electromechanical modules and so on. One drawback is that expanding conventional experiments is time- and cost-intensive, because remote experiment designers have to reform and extend conventional experiments so that they can be integrated easily into contemporary remote e-laboratories.

Before sending any data, GPIB devices must be configured to send the data in the proper order and according to the proper protocol. The electrical specifications as well as the cables, connectors, control protocol and messages required to allow information transfer between devices are defined by the IEEE-488 standard [15]. For instance, by chaining IEEE-488 cables from one device to the next, it is possible to connect up to 14 devices together, and IEEE-488 supports data transfer at up to 1 Mbyte/s. In addition to simple data transfers, the IEEE-488 standard defines a number of specialized commands for interface programming in the form of subroutines, available as programming libraries for different programming languages such as C, Pascal, C#, etc.

Augmented reality interfaces for web-based remote experimentation demand real-time video combined with virtual user-interface objects. Therefore, a high-quality webcam is necessary to send real-time video of the experiment and give realistic feedback to the students.
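To give an impression of how the lab server can drive such instruments, the following minimal C# sketch issues IEEE-488.2/SCPI-style commands; the IGpibSession interface and the command strings are illustrative assumptions and do not describe a particular vendor library or our actual implementation.

    // Minimal sketch (illustrative only): issuing commands to a GPIB instrument
    // from the lab server. IGpibSession is a hypothetical abstraction over
    // whatever IEEE-488 programming library is installed on the server.
    using System.Globalization;

    public interface IGpibSession
    {
        void Write(string command);   // send an ASCII command to the instrument
        string Read();                // read the instrument's ASCII response
    }

    public class GpibInstrument
    {
        private readonly IGpibSession session;

        public GpibInstrument(IGpibSession session)
        {
            this.session = session;
        }

        // "*IDN?" is the standard IEEE-488.2 identification query.
        public string Identify()
        {
            session.Write("*IDN?");
            return session.Read();
        }

        // Query a measured value; the concrete command depends on the instrument.
        public double MeasureVoltage()
        {
            session.Write("MEASURE:VOLTAGE?");
            return double.Parse(session.Read(), CultureInfo.InvariantCulture);
        }
    }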

The next section describes how the web-based AR user-interface allows students to control the experiment kit and instruments remotely by operating the virtual elements presented on the live-video frame, causing real responses of the remote physical experiment to be transferred back to the client; thus, students feel as if they were in a real classroom lab.

Another functionality of the lab server is to manage and schedule students' access to the system. On the lab server, every registered student has an account in which his or her login name and password, the results and marks of the experiment quizzes and tests, and the schedule data of accessible times are stored. When a student tries to access the experiment, the management and scheduling component examines whether he or she is allowed to do so according to a schedule timetable. Moreover, this component manages the registration procedures that enable students to execute the experiment. The entered data is stored temporarily and, after its verification by the experiment administrator, stored in the database permanently. The system informs the students about the failure or success of their registration attempts by email confirmation. Once the login name and password entered by a student are correct, the lab server establishes a connection between the user client and the remote experiment. Students and instructors have different web-based user interfaces tailored to their specific needs.
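As an illustration of this scheduling check, the following C# sketch decides whether a logged-in student currently holds a booked time slot; the types and their names are assumed for the example and are not part of the actual system.

    // Minimal sketch (illustrative types): checking whether a student's login
    // falls inside a booked time slot before the lab server connects the client
    // to the remote experiment.
    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class TimeSlot
    {
        public string StudentLogin { get; set; }
        public DateTime Start { get; set; }
        public DateTime End { get; set; }
    }

    public class ScheduleManager
    {
        private readonly List<TimeSlot> timetable = new List<TimeSlot>();

        public void Book(string login, DateTime start, DateTime end)
        {
            timetable.Add(new TimeSlot { StudentLogin = login, Start = start, End = end });
        }

        // True if the student has a reserved slot covering the current time.
        public bool IsAccessAllowed(string login, DateTime now)
        {
            return timetable.Any(slot =>
                slot.StudentLogin == login && now >= slot.Start && now <= slot.End);
        }
    }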

3.3 User Client: a web-based AR user-interface
One important goal of this research project is to access the web-based AR lab through the internet via a conventional web browser such as Microsoft Internet Explorer or Netscape, providing a suitable means for data exchange between the user-interface and the lab server. When a student activates the URL address of the web-based AR lab, the web browser loads the start web page of the experiment lab, which encompasses an authentication page for entering the system through a correct user name and password. When designing the web-based AR interface, various human-computer interaction rules for user-interface design have to be taken into account: consistency of data display (labeling and graphic conventions), efficient information assimilation by the user, minimal memory load on the user, compatibility of data display with data entry, flexibility for user control of data display, presentation of information graphically where appropriate, standardized abbreviations, and presentation of digital values only where knowledge of the numerical value is necessary and useful [16]. The section "User-Interface and Experimental Kit Development" shows how students interactively implement the electronic circuit via the AR lab interface managed by a conventional web browser. The interaction between the students and the user-interface takes place visually and through interaction devices such as a keyboard and mouse.

3.4 E-Instructor
Students might wrongly connect sensitive circuit elements such as ICs to power supply elements, causing these components to get damaged. Therefore, the system needs a further software component that protects it from damage by preventing dangerous circuit configurations such as short circuits or unwanted high-power connections. Since this software component undertakes some activities of a human instructor, it can be designated the e-instructor. In the future, the e-instructor can be expanded into a simplified rule-based system embracing conditional "if-then" rules about the correct experiment configuration. Johannsen [17] extended the rule-based system architecture presented by Raulefs [18] with additional components that make a rule-based system embeddable in a real-time system. The e-instructor evaluates the rules cyclically in order to react to the students' actions on the experimental kit.
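To make the cyclic rule evaluation more concrete, the following C# sketch shows how such an e-instructor could be organized; the rule shown and all type names are purely illustrative and are not taken from the actual implementation.

    // Minimal sketch (illustrative rule and types): the e-instructor as a small
    // rule-based checker evaluated cyclically against the current circuit
    // configuration; violated rules block or undo the offending connection.
    using System;
    using System.Collections.Generic;

    public class CircuitState
    {
        // Pairs of connectors that are currently wired together, e.g. ("Vcc", "GND").
        public HashSet<(string, string)> Connections { get; } = new HashSet<(string, string)>();
    }

    public class Rule
    {
        public string Description { get; set; }
        public Func<CircuitState, bool> Violated { get; set; }   // the "if" part
    }

    public class EInstructor
    {
        private readonly List<Rule> rules = new List<Rule>
        {
            new Rule
            {
                Description = "Power supply must not be short-circuited to ground",
                Violated = state => state.Connections.Contains(("Vcc", "GND"))
            }
        };

        // Called periodically; returns the descriptions of all violated rules
        // (the "then" part: report the violation and reject the configuration).
        public IEnumerable<string> Evaluate(CircuitState state)
        {
            foreach (var rule in rules)
                if (rule.Violated(state))
                    yield return rule.Description;
        }
    }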

4 SYSTEM REQUIREMENTS AND DESIGN
Sommerville [13] notes that, in most cases, the requirements and design issues of research projects are uncertain; therefore, while developing and implementing the system, we have followed the prototyping approach. The system software and hardware crystallize incrementally within an iterative process, in which each increment adds new system functionality. In this iterative process, the stages of specification, design, development and testing are not chained but interleaved and concurrent. As Fig. 2 shows, the augmented reality environment for web-based remote experimentation is not only composed of purely technical software and hardware components; it also encompasses ergonomic aspects [19] to allow effective human-computer interaction [16]. The user-interface plays a central role in achieving a harmonious interaction with the whole lab, which makes it necessary to create it with an interactive development system such as Microsoft Visual Web Developer 2005 Express Edition along with the following tools: Microsoft Visual C# 2005 Express Edition [20], Microsoft SQL Server 2005 and the IIS (Internet Information Services) web server. All these tools are integrated into a convenient .NET development environment with powerful user-interface tools and rich libraries for creating user-interface components that enable data to be displayed in many forms.

5 USER-INTERFACE AND EXPERIMENTAL KIT DEVELOPMENT

As previously discussed, on the augmented reality web-based user-interface, video-captured images are used along with graphical objects, so that users cannot distinguish between the real world and the virtual augmentation [1], [2]. Since we deal with remote e-laboratories for teaching the practical side of engineering or science studies, in order to strengthen the skills students need after graduation, the techniques and methods for developing and designing augmented reality web-based user-interfaces discussed here can be used to present any engineering or science lab covering topics related to electric circuits, electronics, mechanics, etc., in the form of mixed virtual and real (video-captured) laboratory elements. These can be not only simple elements such as resistors, capacitors, inductors and electromechanical modules, but also more complicated units such as oscilloscopes, DMMs and function generators. Fig. 3 illustrates a prototype of the augmented reality web-based user-interface, presenting an experiment on the series resonant circuit shown in Fig. 4. Additionally, the circuit shows how several instruments are employed to measure the desired voltage and current signals.

Figure 3: Prototype of the augmented reality web-based user-interface

Figure 4: A series resonant circuit.

One reason the selected circuit example is very simple is that our concern is to convey the features and functionalities of the user-interface and its experimental kit:

• The circuit-board image is captured by a video camera and processed by a video server in real time. On the circuit board, some of the RCL circuit elements (resistor R and capacitor C) are already installed.

• The other circuit element (the inductor L), which does not exist on the circuit board, can be selected graphically by means of a graphical component bar and then placed on the circuit board. In the bar, more than one resistor, capacitor, inductor, etc. can be selected by the experimenting student, depending on the decisions or calculations required in the experiment. Visually, the students must be able to distinguish between the real and the virtual experimental elements. The computer graphics, or virtual objects, are overlaid on the circuit-board image and displayed to the user.

• The instruments such as oscilloscopes and multimeters visualized on the augmented reality user-interface can be both real and virtual (graphical). Real-time video-captured images are used to visualize the remote real instruments, whereas the virtual equipment is shown by computer graphics. In the case of remote real instruments, the measured values are visually mediated to the students through the transferred video-captured images. By contrast, the virtual instruments on the user-interface receive their values from the lab server, measured by the equivalent real instruments that make their measured values available to other systems sharing the same GPIB.

• For wiring the electronic elements together and connecting the circuit board with the instruments, a wiring and connecting tool realized as a pop-up menu can be used. The pop-up menu offers various types of wires and cables, whose color and width properties can be changed.
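The following C# sketch indicates how a selected virtual component could be composited onto the captured circuit-board frame using the .NET GDI+ classes; the renderer shown here is a simplified illustration, not the project's rendering code.

    // Minimal sketch (GDI+/System.Drawing): compositing the graphic of a virtual
    // component onto the video-captured circuit-board frame so that real and
    // virtual elements appear together in one image.
    using System.Drawing;

    public static class AugmentedFrameRenderer
    {
        // videoFrame: the latest captured image of the circuit board.
        // componentGraphic: e.g. the drawing of the inductor L from the component bar.
        // position: where the student dropped the component on the board.
        public static Bitmap Compose(Bitmap videoFrame, Image componentGraphic, Point position)
        {
            var augmented = new Bitmap(videoFrame);          // copy of the real-world frame
            using (var g = Graphics.FromImage(augmented))
            {
                g.DrawImage(componentGraphic, position);     // overlay the virtual object
            }
            return augmented;
        }
    }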

The connection locations on the circuit board as well as the inputs and outputs of the instruments are marked with highlighted circles. There are two approaches to recognizing these locations:

• HSL filter: In this image-processing approach, an HSL filter [21] is used. A unique color such as pink marks these locations physically. The HSL filter removes all pixels in the image except the pink color-filled circles, whose x-y positions can then easily be determined by the software.

• Layered mask: On the graphical user-interface, an additional layer consisting of virtual highlighted circles positioned exactly on the connection points is placed. The layered mask is transparent except for the highlighted color-edged circles.

The latter technique is superior to the former because, on the one hand, the HSL filter is noise-sensitive and, on the other hand, every connection location in the experimental kit must be physically marked with a color-filled circle. These difficulties do not occur with the second approach, since the mask layer is a purely software-technical implementation. One drawback of the layered-mask approach is that the circuit board and the instruments visualized as video-captured images have to be fixed and must not be shifted, because this would impair the adjusted positions of the virtual edged circles on the connection points.
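With the layered mask, recognizing which connection point the student is pointing at reduces to a simple hit test of the cursor coordinates against the known circle centres, as the following illustrative C# sketch shows; the types and the circle radius are assumptions, not the actual implementation.

    // Minimal sketch (illustrative types): hit-testing the mouse position against
    // the fixed circle centres of the layered mask to find which connection
    // point, if any, the student is pointing at.
    using System.Collections.Generic;
    using System.Drawing;

    public class ConnectionPoint
    {
        public string Name { get; set; }    // e.g. "c1", "c3", "oscilloscope-in"
        public Point Centre { get; set; }   // fixed position on the mask layer
    }

    public static class MaskHitTester
    {
        private const int Radius = 8;       // radius of the highlighted circles, in pixels

        public static ConnectionPoint HitTest(IEnumerable<ConnectionPoint> mask, Point cursor)
        {
            foreach (var p in mask)
            {
                int dx = cursor.X - p.Centre.X;
                int dy = cursor.Y - p.Centre.Y;
                if (dx * dx + dy * dy <= Radius * Radius)
                    return p;               // cursor is inside this connection circle
            }
            return null;                    // no connection point under the cursor
        }
    }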

If the mouse cursor moves over such a circle, the software recognizes the corresponding connection through its x-y position. If the wiring and connecting mode is selected and the student moves the cursor while pressing the left mouse button, a connection between the desired locations is created graphically. After the e-instructor has verified the correctness of the created connection, the server closes the equivalent switches in the remote experimental kit (see Fig. 5).

As mentioned, the system architecture of the web-based augmented reality laboratory follows the thin-client model, because all of the application processing and data management is carried out on our application server, while the representation software interacts with the users, enabling them to perform real (physical) experiments remotely whenever they want and wherever they are. The user-interface represents the remote experimental kit using a mixture of virtual and real (video-captured) objects, and the students can connect these objects interactively by means of various virtual wires and cables (see Fig. 3). For example, in Fig. 3 the experimenting student is placing a virtual inductor L on the circuit board, where two circuit elements, the capacitor C and the resistor R, are already configured. A graphical component bar offers the virtual components, which can be placed on the circuit board and connected with other elements to achieve a certain circuit configuration. Once such an element is placed on the circuit board, the connectors of the corresponding physical element in the kit configuration are connected. For instance, after placing the graphically represented inductor L on the experimenting field of the user-interface, the switches S41 and S42 are closed, making this element connectable to other circuit elements through the grid. Note that the software program can control the IEEE-488 instruments sharing the same GPIB. The input of the oscilloscope is connected to the resistor R to show the voltage signal curve, and the elements R, C and L are connected with each other in such a way as to form the series resonant circuit. Given these facts, the experimental kit must be in a position to receive these commands and to respond accordingly: if the student intends to connect two objects or to place a virtual object on the user-interface, the equivalent physical objects in the experimental kit must be activated by connecting them or making them available for further connections.

Figure 5: The circuit configuration of the experimental kit located on the application side

Fig. 5 shows the circuit of the experimental kit represented to the students by the web-based augmented reality user-interface illustrated in Fig. 3. The lab server translates the virtual connections implemented on the user-interface into real connections by closing the equivalent switches in the kit. As Fig. 5 shows, a programmable grid consisting of horizontal and vertical lines can connect all components of the experimental kit with each other; each crossing of a vertical and a horizontal line can be bridged via a controllable switch. For example, to connect the resistor R with the capacitor C (connectors c1 and c3), the two highlighted switches (Fig. 5) must be closed.
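For illustration, the following C# sketch shows how the lab server could translate a virtual connection into the closing of two crosspoint switches on the programmable grid; the mapping of connectors to grid lines and all names are assumptions and do not reproduce the exact wiring of Fig. 5.

    // Minimal sketch (assumed kit layout): translating a virtual connection
    // between two connectors into closing two crosspoint switches so that both
    // connectors are joined by a common horizontal bus line of the grid.
    using System.Collections.Generic;

    public class SwitchGrid
    {
        // Maps each connector (e.g. "c1", "c3") to the vertical grid line it sits on.
        private readonly Dictionary<string, int> verticalLineOf;
        private readonly bool[,] switchClosed;   // [vertical line, horizontal bus line]

        public SwitchGrid(Dictionary<string, int> verticalLineOf,
                          int verticalLines, int horizontalLines)
        {
            this.verticalLineOf = verticalLineOf;
            switchClosed = new bool[verticalLines, horizontalLines];
        }

        // Example: Connect("c1", "c3", busLine: 0) joins resistor R and capacitor C
        // by closing the two switches where their vertical lines cross bus line 0.
        public void Connect(string connectorA, string connectorB, int busLine)
        {
            switchClosed[verticalLineOf[connectorA], busLine] = true;
            switchClosed[verticalLineOf[connectorB], busLine] = true;
            // A real implementation would now send the corresponding switching
            // command to the kit hardware, e.g. via a digital output card or GPIB.
        }
    }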

6 USABILITY TESTING
After implementing the web-based ARI lab, an extensive evaluation will be carried out to expose the strengths and weaknesses of augmented reality user-interfaces for web-based remote experimentation. In this research, the methods and techniques for evaluating e-learning software for school students in primary stages can be adopted [22]. To compare our system, which adopts augmented reality, with other systems implemented in actual or virtual reality, the students must interact with these systems. Each session represents one of the three different environments (actual, virtual and augmented), in which the same scenarios are offered to the students to cope with [23]. To avoid possible learning effects caused by keeping the same sequence of scenarios, every student gets a different order of the scenarios; the same phenomenon appears if the same order of environments is kept for all subjects, so the usability engineer must permute their order as well. Every subject (student) carries out the experiment in three different sub-sessions:

• Sub-session 1: web-based augmented reality interface

• Sub-session 2: web-based virtual reality interface

• Sub-session 3: actual reality experiment (face-to-face)

The different realities represent the independent variables, whereas the evaluation criteria such as transparency, navigation, etc. serve as dependent variables. It is of great importance to identify the features shared between the different environments and use them as evaluation criteria. Some aspects to measure are to what extent the following characteristics are realized: abstract versus concrete, user-centered versus technique-oriented, transparent versus unintelligible, incomplete versus complete feedback, etc. Student's t-test or one-way ANOVA are suitable means for comparing the mean values of two sets of numbers and obtaining an overview of the statistical significance of differences between means. While carrying out a usability-testing experiment, the usability engineer explains to the subject all operations related to the experiment and the interfaces. The usability engineer can process the raw data of the experiments statistically using Student's t-test [24] and then analyze the outcome with SPSS [25]. The statistical results should be analyzed and reviewed by the usability engineers and the system designers, so that the final results help to revise and optimize the design of the interactive software system on the one hand, and allow the system designers to define new or correct existing design guidelines for future systems on the other.
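For completeness, the following C# sketch computes the pooled two-sample t statistic used in such a comparison of mean values; the data handling is illustrative, and the actual analysis would be carried out in a statistics package such as SPSS, as noted above.

    // Minimal sketch: pooled two-sample Student's t statistic for comparing the
    // mean of a dependent variable (e.g. task completion time) between two of
    // the three environments; assumes equal variances in both groups.
    using System;
    using System.Linq;

    public static class UsabilityStats
    {
        public static double TwoSampleT(double[] a, double[] b)
        {
            double meanA = a.Average(), meanB = b.Average();
            double varA = a.Sum(x => (x - meanA) * (x - meanA)) / (a.Length - 1);
            double varB = b.Sum(x => (x - meanB) * (x - meanB)) / (b.Length - 1);

            // Pooled variance of the two samples.
            double pooled = ((a.Length - 1) * varA + (b.Length - 1) * varB)
                            / (a.Length + b.Length - 2);

            return (meanA - meanB) / Math.Sqrt(pooled * (1.0 / a.Length + 1.0 / b.Length));
        }
    }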

7 CONCLUSION
It has been shown how augmented reality can be applied to internet laboratories by combining the interactive real world with an overlay of virtually represented information in such a way that they appear as one environment. Video pictures of real kits are overlaid with virtual (graphical) objects of electronic elements and instruments. Through conventional web browsers, students can perform their laboratory experiments and obtain live data in real time, resembling face-to-face classroom laboratories. The software for web-based remote experimentation using augmented reality is developed with contemporary development environments such as .NET; thus, the software can be adapted easily to other platforms. Placing graphical objects such as electronic elements and modules on the experimenting field of the user-interface causes the connectors of the corresponding physical elements to become closed. After building and implementing the whole system, we expect various results to be achieved, such as: placing students in real laboratory environments, preserving equipment from getting damaged, providing safer experimental environments for students, saving money by reducing the number of instructors and pieces of equipment, and, last but not least, increasing collaboration between students and instructors, because accessing an experiment can take place at any time and from anywhere.

8 REFERENCES
[1] J. R. Vallino: Interactive Augmented Reality, doctoral diss., University of Rochester, New York (1998).

[2] T. Nilsen, S. Linton, J. Looser: Motivations for augmented reality gaming, Proc. the New Zealand Game Developers Conference, New Zealand, pp. 86-93 (2004).

[3] S. Das, L. N. Sharma, A. K. Gogoi: Remote Communication Engineering Experiments Through Internet, International Journal of Online Engineering, Vol. 2, No. 1 (2006).

[4] A. Bischoff, C. Rohrig: Remote Experimentation in Collaborative Virtual Environment, rep., Department of Electrical Engineering, University of Hagen, Germany (2001).

[5] J. Juhnke, T. Mills, J. Hoppenrath: Designing for augmented cognition - problem solving for complex environments, Foundations of Augmented Cognition, Berlin: Springer-Verlag, pp. 424-433 (2007).

[6] C. E. Hughes, C. B. Stapleton, D. E. Hughes, E. M. Smith: Mixed Reality in Education, Entertainment, and Training, IEEE Computer Graphics & Applications, Vol. 25, Issue 6, pp. 24-30 (2005).

[7] N. Navab: Developing killer apps for industrial augmented reality, IEEE Computer Graphics & Applications, Vol. 24, Issue 3, pp. 16-20 (2004).

[8] P. Milgram, H. Takemura, A. Utsumi, F. Kishino: Augmented Reality: A Class of Displays on the Reality Virtuality Continuum, Proc. Telemanipulator and Telepresence Technologies (1994).

[9] B. Shneiderman, C. Plaisant: Designing the User Interface: Strategies for Effective Human-Computer Interaction (4th Edition). Addison Wesley Longman (2004).

[10] H. C. Jones: E-Learning Designing Tomorrow's Education, Communication from the Commission, Commission of the European Communities, Brussels (2000).

[11] K. Zinser: Integrated multimedia and visualization techniques for process S&C, Proc. IEEE International Conference on Systems, Man and Cybernetics, Le Touquet, France, pp. 367-372 (1993).

[12] F. P. Vidal, F. Bello, K. W. Brodlie, N. W. John, D. Gould, R. Phillips & N. J. Avis, Principles and applications of computer graphics in medicine, Computer Graphics Forum, Vol. 25, Issue 1, pp. 113-137 (2006).

[13] I. Sommerville, Software Engineering (8th Edition), Addison Wesley (2007).

[14] IEEE 488.1-2003: Standard for Higher Performance Protocol for the Standard Digital Interface for Programmable Instrumentation, IEEE Instrumentation and Measurement Society (2003).

[15] CEC, CEC 488 programming and reference, Part number 370966A-01 (2003).

[16] B. Shneiderman, C. Plaisant, Designing the User Interface: Strategies for Effective Human-Computer Interaction (4th Edition). Addison Wesley Longman (2004).

[17] G. Johannsen: Mensch-Maschine-Systeme, Berlin: Springer (1993).

[18] P. Raulefs: Expertensysteme, Kuenstliche Intelligenz, Informatik-Fachbereiche, W. Bibel, J. H. Siekmann (Ed.), Proc. 59, Berlin: Springer, pp. 61-98 (1982).

[19] N. A. Streitz: Cognitive compatibility as a central issue in human-computer interaction: Theoretical framework and empirical findings, Cognitive engineering in the design of human-computer interaction and expert systems, in G. Salvendy (Ed.), Amsterdam: Elsevier, pp. 75-82 (1987).

[20] H. M. Deitel, P. J. Deitel: C# for programmers (2nd edition). Prentice Hall (2006).

[21] Z. Li, M. S. Drew: Fundamentals of Multimedia, Prentice-Hall (2004).

[22] S. Odeh, O. Qaraeen: Evaluation Methods and Techniques for E-Learning Software for School Students in Primary Stages, International Journal of Emerging Technologies in Learning, vol. 2, Nr. 3 (2007).

[23] M. B. Rosson, J. M. Carroll: Usability Engineering: Scenario-Based Development of Human-Computer Interaction, Morgan Kaufmann Publishers (2002).

[24] R. A. Johnson, G. K. Bhattacharyya: Statistics: Principles and Methods, Wiley (2000).

[25] J. Pallant: SPSS Survival Manual, Open University Press (2004).