On Spatial Perception Issues In Augmented Reality Based Immersive Analytics

Martin Luboschik, University of Rostock, 18051 Rostock, Germany, [email protected]
Philip Berger, University of Rostock, 18051 Rostock, Germany, [email protected]
Oliver Staadt, University of Rostock, 18051 Rostock, Germany, [email protected]

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
Copyright held by the owner/author(s).
ISS ’16 Companion, November 06-09, 2016, Niagara Falls, ON, Canada
ACM 978-1-4503-4530-9/16/11.
http://dx.doi.org/10.1145/3009939.3009947

Abstract
Beyond other domains, the field of immersive analytics makes use of augmented reality techniques to successfully support users in analyzing data. When displaying ubiquitous data integrated into everyday life, spatial immersion issues like depth perception, data localization, and object relations become relevant. Although there is a variety of techniques to deal with those, they are difficult to apply if the examined data or the reference space are large and abstract. In this work, we discuss observed problems in such immersive analytics systems and the applicability of current countermeasures to identify needs for action.

Author Keywords
Augmented reality; immersive analytics; spatial perception.

ACM Classification Keywords
H.5.1 [Multimedia Information Systems]: Artificial, augmented, and virtual realities

Introduction
Within recent years, the domain of augmented reality has been pushed by the availability of affordable and powerful hardware (e.g., HoloLens). Such systems are now extensively applied and play an important role, for example, in immersive analytics. Due to this development, the available rendering approaches have to be extended to deal

with broadened and new challenges. One of them is spatial perception. This issue has been extensively examined and discussed in the augmented reality community as it is a key for comprehensively relating virtual and real world objects (see e.g., [14, 23]).

In immersive analytics, spatial perception is fundamental due to the fact that spatial properties (e.g., position, size) are major attributes for encoding data [18]. Accordingly, they have to be perceived correctly in order to draw the right conclusions from the data.

Aware of this, augmented reality techniques have been applied mainly for the visual analysis of medical volume data [25], geo-spatial data [2], or industrial design studies, e.g., of cars [22]. These scenarios come with some practical benefits related to immersion and spatial perception. First, the visualized data (e.g., human organs, buildings, cars) are somewhat familiar to the user and thus, the data themselves ease the understanding of locations, sizes, and distances. Second, the visualized objects are geometrically structured in a sense that the users can determine where object parts begin and end, which partially helps to identify spatial object relations. Lastly, the data are often locally confined and/or related and thus can be virtually placed on real world operating tables, buildings, or platforms, providing an additional cue for spatial properties. Moreover, several visual cues exist that emphasize spatial properties and can be applied in the above scenarios.

In this work, we discuss common visual cues that support spatial perception in augmented reality and show to what extent they can be applied in a more generalized immersive analytics. In this context, we illustrate issues with regard to data that are abstract, evenly distributed, large, or hard to relate to the spatial reference.

The remainder of this paper is structured as follows. First, we present frequently used visual cues that support spatial perception in augmented reality. Then we give a brief excerpt of plausible large scale immersive analytics applications and discuss the applicability of the presented cues.

Spatial cues in augmented reality
Natural cues which help humans to perceive distances and locations correctly have been studied for a long time (see e.g., [6]). They cover different efficiency ranges (personal, action, and vista space, see [23]) and categories (pictorial, kinetic, physiological, or binocular depth cues, see [8]). Obviously, all of those should be reproduced in augmented reality systems to provide a distortion-free spatial perception. Kruijff et al. give a good overview of the different open problems in achieving this goal and of the respective research [14].

Among the fundamental aspects to be considered in augmented reality are linear perspective, relative sizes, motion parallax, and binocular disparity. In addition, other visual cues exist that, if implemented, can dramatically improve spatial perception. In the following, we describe the effects of common ones and how they have been utilized in augmented reality.

Shadows
The importance of cast and attached shadows on image perception is reflected by repeated research in perception science (see e.g., [1, 19, 31]). Figures 1a–c show a popular example of shadow effects on position and shape perception as well as the ambiguity caused by missing shadows.

The purpose of realistically implementing shadows in augmented reality is to provide information on how the virtual objects are related to a real world reference and to facilitate a natural appearance. The virtual shadows allow for a comparison against the cast and attached shadows of real world objects. Thus, the interpretation of the real world locations, orientations, and sizes allows the virtual objects' properties to be inferred indirectly.

Figure 1: Spatial ambiguity (a) resolved by visual cues (b)–(f).

The effects of shadows have been studied in 3D computer graphics as well (e.g., [11]). Starting with different offline processing approaches (e.g., in [7]), shadows have been consistently implemented for augmented reality purposes. Nowadays, several techniques are available that allow real-time rendering of shadows in augmented reality scenes (e.g., [10, 21]). They all capture and approximate the lighting conditions of the surrounding real world and apply this information to the virtual objects. The positive effect of virtual shadows on spatial perception in augmented reality applications has been reported repeatedly (e.g., in [4]).
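The cited systems estimate the real-world lighting; purely as an illustrative sketch of the underlying projection geometry (not the method of any cited technique), a hard cast shadow on a flat ground plane can be obtained by extending each vertex along the light direction until it reaches the plane:

```python
import numpy as np

def planar_shadow(points, light_dir, ground_y=0.0):
    """Project 3D points onto the plane y = ground_y along a directional
    light, yielding the positions of a hard cast shadow. Illustrative
    sketch only; AR systems additionally approximate real lighting."""
    pts = np.asarray(points, dtype=float)
    l = np.asarray(light_dir, dtype=float)
    # Solve p + t*l = ground plane for each point's parameter t.
    t = (ground_y - pts[:, 1]) / l[1]
    return pts + t[:, None] * l

# A virtual marker 2 m above the ground, light tilted in the x-direction;
# the shadow lands on the ground, offset opposite the light tilt.
shadow = planar_shadow([[1.0, 2.0, 3.0]], light_dir=[0.5, -1.0, 0.0])
```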

Occlusion
Occlusion is a rather general depth cue as it simply conveys which object is in front of another. Thus, mutual occlusion of virtual and real world objects is an important aspect to spatially relate both and to communicate depth correctly. The positive effect of incorporating occlusions on correct depth perception has been reported for instance in [27].

A specific problem related to augmented reality systems is how to determine the depth of real world objects. With that information available – for instance from attached markers or a given 3D model (e.g., in [5]) – it is relatively easy to identify occluders and occluded parts of virtual and real objects. Subsequently, these properties have to be conveyed comprehensibly. Different approaches are used to modify the appearance of occluding objects to uncover the hidden ones. They are basically similar to the illustrative techniques of ghosted or cut-away views (see e.g., [29] and Fig. 1d). Cut-aways can be found for instance in [5, 9], and transparencies in combination with masks are used in [20]. X-ray vision (e.g., [2, 12]) is a form of ghosted view showing phantom geometry of the occluders.
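Once a per-pixel depth estimate of the real scene is available, the basic occlusion decision reduces to a depth test. A minimal sketch (assuming already-aligned color and depth buffers, which real systems must first establish):

```python
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel depth test: the virtual layer is drawn only where it
    lies closer to the camera than the sensed real-world surface."""
    virtual_wins = virt_depth < real_depth
    out = real_rgb.copy()
    out[virtual_wins] = virt_rgb[virtual_wins]
    return out

# 1x2 image: the virtual object occludes the real scene at the left
# pixel (1.5 m < 2.0 m) but is itself occluded at the right (1.5 > 1.0).
real = np.array([[[0, 0, 0], [0, 0, 0]]], dtype=np.uint8)
virt = np.array([[[255, 0, 0], [255, 0, 0]]], dtype=np.uint8)
out = composite_with_occlusion(real, np.array([[2.0, 1.0]]),
                               virt, np.array([[1.5, 1.5]]))
```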

Additional Augmentations
The following paragraphs deal with different techniques that introduce some geometry or use object properties that do not originate from the real world reference scene. Generally, they have been designed to enhance spatial perception in specific applications. Some of them are inspired by illustration techniques.


One approach that is helpful not only to perceive depth but also to quantify distances is the superimposition of grids (Fig. 1f). Aligning the grid with the user's viewing direction (e.g., in [3, 30]) helps to determine egocentric distances. Moreover, embedding grids into the real-world reference supports the recognition of exocentric distances (between objects, see e.g., [16, 27]). But the spatial relation of virtual objects and the grid can be emphasized further. One approach is virtual projections of objects onto the grid, such as contours [30]. Another example is supportive lines that connect the virtual object with the grid or ground. This way, they convey the distance to the observer by the intersection point and also the altitude by their lengths (e.g., in [32]). In addition, they are independent of light sources (see Fig. 1f).
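The geometry behind such supportive lines is simple; as a hypothetical sketch (function name and the flat-ground assumption are ours, not taken from the cited work), each floating object is anchored by a vertical drop line:

```python
def supportive_line(obj_pos, ground_y=0.0):
    """Vertical drop line from a floating virtual object to the ground:
    the foot point conveys the distance to the observer via its position
    on the ground, and the line length encodes the object's altitude."""
    x, y, z = obj_pos
    foot = (x, ground_y, z)
    length = y - ground_y
    return foot, length

# An object 1.5 m above the ground is anchored directly below itself.
foot, length = supportive_line((2.0, 1.5, 4.0))
```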

A technique known from video games to convey locations is the top-down view (Fig. 1f). Such views provide an overview of all virtual and real world objects to support spatial perception in augmented reality (see e.g., [27, 30]).

Another way to communicate spatial properties is to encode depth information directly into the depicted virtual objects. An obvious approach is to use colors for representing the depth values [28, 30]. Related studies can be found in the volume visualization domain [24]. Variations of this method are the adjustment of opacity or intensity depending on depth [17] and aerial perspective for larger distances [23]. In the same way, other graphical attributes can be used to depict depth: depth of field [28] (see Fig. 1e) or symbolic frames (tunnel tool [3, 16]) are examples.
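As an illustration of such a depth-to-color encoding (the specific red-to-blue ramp below is our own choice, not the mapping used in [28, 30]):

```python
def depth_to_color(depth, near, far):
    """Map a depth value to an RGB color on a red-to-blue ramp,
    so that nearer objects appear warmer. One of many possible
    encodings; real designs must avoid clashing with data colors."""
    t = min(max((depth - near) / (far - near), 0.0), 1.0)
    return (1.0 - t, 0.0, t)

near_color = depth_to_color(1.0, near=1.0, far=10.0)   # nearest: pure red
far_color = depth_to_color(10.0, near=1.0, far=10.0)   # farthest: pure blue
```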

Applicability in Immersive Analytics
So far, augmented reality applications have generally focused on a rather small number of virtual objects (few buildings, machine parts, pipes, human organs) in some kind of bounded reference space. One idea of immersive analytics is to make the ubiquitous data that surrounds the user visually graspable. In particular, there are no restrictions on either the amount and structure of the data or the reference it is defined in. We can think for instance of volumetric environmental data sets (e.g., air pollution, weather forecasts) that are analyzed or queried on site. Other examples could be a room-filling streaming data set of air conditioning or a rendering of sound propagation.

In the following paragraphs, we illustrate how the relaxed constraints on the visualized data complicate the application of the aforementioned visual cues. Although there is a wide range of aspects to be considered, we focus on the amount of data and its associated reference space.

Large Data Sets
The figures in [30] provide a good starting point for discussing large data sets. Those figures show five spheres with different depth-enhancing visual cues. Inspecting the shadow planes reveals that the five cast shadows are partly fused. As a consequence, the shadows are somewhat ambiguous and thus less helpful (Fig. 2a). This problem obviously increases with larger data sets and, unfortunately, it also occurs with the other common visual cues presented above: occlusions can only be depicted for a limited number of occluders, projections onto grids or planes won't be assignable (Fig. 2a), supportive lines or depth encoding symbols will result in visual clutter (Fig. 2b), top-down views will bear severe overplotting (Fig. 2b), and so on. Other visual cues that encode depth, for instance into color or depth of field, are likely to interfere with the visualization design. They occupy the important visual variable color [26] for depth perception (Fig. 2d) or restrict the data's visibility (Fig. 2c).

Similar problems arise if the data are space-filling or reveal no visual structure that eases object distinction. Complex structures (e.g., turbulent streamlines) make even small data sets hard to relate to depth cues.


Figure 2: Issues with larger data sets and larger reference spaces.

Large Reference Spaces
Regardless of the data amount, the reference space may also be of large scale in immersive analytics. Such large reference spaces will likely result in problems related to visibility. Beyond that, the highly restricted field of view of today's augmented reality hardware may cut off virtual objects. Figure 2e shows an illustrative example: either due to the size of the reference space or due to a limited field of view, neither the cast shadows nor the base of the supportive lines are visible. Hence, these cues are not interpretable at all. An available countermeasure to this issue are artificial superimposed planes (see [30]). Unfortunately, such approaches come along with occlusion and distort the spatial perception themselves, as they are in front of nearby geometry.

Additional problems arise if the reference space covers distances above 30 m (vista space, see [23]). If the visualized data does not scale in size proportionally to the reference space, it will be difficult to perceive shadows, occlusion, supportive lines, or differences in color. Due to linear perspective, the visual cues will shrink together with the visual representation of the data, making it difficult to perceive spatial relations (Fig. 2f).
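This shrinkage under linear perspective can be quantified via the visual angle an object subtends; a small worked sketch (the object size is chosen arbitrarily for illustration):

```python
import math

def angular_size_deg(size, distance):
    """Visual angle in degrees subtended by an object of the given
    size viewed frontally at the given distance."""
    return math.degrees(2 * math.atan(size / (2 * distance)))

# A 0.3 m glyph shrinks from roughly 8.6 degrees at 2 m to about
# 0.6 degrees at 30 m (vista space), taking its depth cues with it.
near_angle = angular_size_deg(0.3, 2.0)
far_angle = angular_size_deg(0.3, 30.0)
```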

Abstract reference spaces also suffer from hardly visible spatial cues or hidden references. A recent example is graph layouts on an immersive hemispheric display [15]. Looking at the spatial perception – not challenging or criticizing the approach – the question arises how the distances between nodes should be interpreted if this becomes necessary. The distance of opposite nodes could be the geodesic distance on the hemisphere's surface or simply the real world Euclidean distance. Given such abstract spaces, there is no natural mapping between data-driven distances and physically perceived distances. Hence, it is difficult to interpret the data unambiguously.
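To make the ambiguity concrete, the two candidate readings can differ substantially. A sketch (our own illustration, not taken from [15]) comparing the straight-line chord and the geodesic distance between two points on a hemisphere:

```python
import math

def hemisphere_distances(radius, p1, p2):
    """Return (euclidean, geodesic) distance between two points given
    as (azimuth, elevation) in radians on a hemisphere of the given
    radius. Both are plausible readings of 'node distance'."""
    def to_xyz(az, el):
        return (radius * math.cos(el) * math.cos(az),
                radius * math.cos(el) * math.sin(az),
                radius * math.sin(el))
    a, b = to_xyz(*p1), to_xyz(*p2)
    chord = math.dist(a, b)
    # Central angle recovered from the chord; geodesic = radius * angle.
    angle = 2 * math.asin(min(1.0, chord / (2 * radius)))
    return chord, radius * angle

# Opposite nodes at the hemisphere's base: the straight-line distance is
# the diameter (2r), while the on-surface distance is half the
# circumference (pi * r) - more than 1.5 times longer.
chord, geodesic = hemisphere_distances(1.0, (0.0, 0.0), (math.pi, 0.0))
```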


Conclusion
In this work, we gave an overview of common techniques in augmented reality to support spatial perception. We illustrated problems of those techniques if they are applied to the complex field of immersive analytics without adaptations. Note that there are even more perception issues to be considered, but they are rather specific to certain augmented reality display devices like monocular handhelds. Although our work focuses on augmented reality in immersive analytics, the described problems partially apply to virtual reality systems or large high-resolution displays (see [13]) as well. In conclusion, there is a strong need for research on spatial depth cues in immersive analytics to make the overall goal of analyzing data by means of immersion achievable.

REFERENCES
1. B. P. Allen. 1999. Shadows as Sources of Cues for Distance of Shadow-Casting Objects. Perceptual and Motor Skills 89, 2 (1999), 571–584.
2. B. Avery, C. Sandor, and B. H. Thomas. 2009. Improving Spatial Perception for Augmented Reality X-Ray Vision. In Proc. of IEEE VR’09. 79–82.
3. R. Bane and T. Höllerer. 2004. Interactive Tools for Virtual X-Ray Vision in Mobile Augmented Reality. In Proc. of IEEE ISMAR’04. 231–239.
4. M. Berning, D. Kleinert, T. Riedel, and M. Beigl. 2014. A Study of Depth Perception in Hand-Held Augmented Reality Using Autostereoscopic Displays. In Proc. of IEEE ISMAR’14. 93–98.
5. C. Bichlmeier, T. Sielhorst, S. M. Heining, and N. Navab. 2007. Improving Depth Perception in Medical AR. Chapter 9, 217–221.
6. J. E. Cutting and P. M. Vishton. 1995. Perceiving Layout and Knowing Distances: The Integration, Relative Potency, and Contextual Use of Different Information about Depth. Chapter 3, 69–117.
7. P. Debevec. 1998. Rendering Synthetic Objects into Real Scenes: Bridging Traditional and Image-based Graphics with Global Illumination and High Dynamic Range Photography. In Proc. of ACM SIGGRAPH’98. 189–198.
8. D. Drascic and P. Milgram. 1996. Perceptual Issues in Augmented Reality. In Proc. of SPIE Stereoscopic Displays and Virtual Reality Systems III.
9. C. Furmanski, R. Azuma, and M. Daily. 2002. Augmented-Reality Visualizations Guided by Cognition: Perceptual Heuristics for Combining Visible and Obscured Information. In Proc. of IEEE ISMAR’02. 215–224.
10. M. Haller, S. Drab, and W. Hartmann. 2003. A Real-time Shadow Approach for an Augmented Reality Application Using Shadow Volumes. In Proc. of ACM VRST’03. 56–65.
11. G. S. Hubona, P. N. Wheeler, G. W. Shirah, and M. Brandt. 1999. The Relative Contributions of Stereo, Lighting, and Background Scenes in Promoting 3D Depth Visualization. ACM Transactions on Computer-Human Interaction 6, 3 (1999), 214–242.
12. D. Kalkofen, E. Mendez, and D. Schmalstieg. 2009. Comprehensible Visualization for Augmented Reality. IEEE TVCG 15, 2 (2009), 193–204.
13. E. Klein, J. E. Swan, G. S. Schmidt, M. A. Livingston, and O. G. Staadt. 2009. Measurement Protocols for Medium-Field Distance Perception in Large-Screen Immersive Displays. In Proc. of IEEE VR’09. 107–113.

14. E. Kruijff, J. E. Swan II, and S. Feiner. 2010. Perceptual Issues in Augmented Reality Revisited. In Proc. of IEEE ISMAR’10. 3–12.
15. O. Kwon, C. Muelder, K. Lee, and K. Ma. 2016. A Study of Layout, Rendering, and Interaction Methods for Immersive Graph Visualization. IEEE TVCG 22, 7 (2016), 1802–1815.
16. M. A. Livingston, Z. Ai, K. Karsch, and G. O. Gibson. 2011. User Interface Design for Military AR Applications. Virtual Reality 15, 2-3 (2011), 175–184.
17. M. A. Livingston, J. E. Swan, J. L. Gabbard, T. Höllerer, D. Hix, S. J. Julier, Y. Baillot, and D. Brown. 2003. Resolving Multiple Occluded Layers in Augmented Reality. In Proc. of IEEE ISMAR’03. 56–65.
18. J. Mackinlay. 1986. Automating the Design of Graphical Presentations of Relational Information. ACM Transactions on Graphics 5, 2 (1986), 110–141.
19. P. Mamassian, D. C. Knill, and D. Kersten. 1998. The Perception of Cast Shadows. Trends in Cognitive Sciences 2, 8 (1998), 288–295.
20. E. Mendez and D. Schmalstieg. 2009. Importance Masks for Revealing Occluded Objects in Augmented Reality. In Proc. of ACM VRST’09. 247–248.
21. D. Nowrouzezahrai, S. Geiger, K. Mitchell, R. Sumner, W. Jarosz, and M. Gross. 2011. Light Factorization for Mixed-frequency Shadows in Augmented Reality. In Proc. of IEEE ISMAR’11. 173–179.
22. H. Regenbrecht, G. Baratoff, and W. Wilke. 2005. Augmented Reality Projects in the Automotive and Aerospace Industries. IEEE Computer Graphics and Applications 25, 6 (2005), 48–56.
23. R. S. Renner, B. M. Velichkovsky, and J. R. Helmert. 2013. The Perception of Egocentric Distances in Virtual Environments - A Review. Comput. Surveys 46, 2 (2013), 23:1–23:40.
24. T. Ropinski, F. Steinicke, and K. H. Hinrichs. 2006. Visually Supporting Depth Perception in Angiography Imaging. In Proc. of SG’06. 93–104.
25. T. Sielhorst, C. Bichlmeier, S. M. Heining, and N. Navab. 2006. Depth Perception - A Major Issue in Medical AR: Evaluation Study by Twenty Surgeons. In Proc. of MICCAI’06. 364–372.
26. S. Silva, B. S. Santos, and J. Madeira. 2011. Using Color in Visualization: A Survey. Computers & Graphics 35, 2 (2011), 320–333.
27. T. Tsuda, H. Yamamoto, Y. Kameda, and Y. Ohta. 2005. Visualization Methods for Outdoor See-through Vision. In Proc. of ICAT’05. 62–69.
28. K. Uratani and H. Takemura. 2005. A Study of Depth Visualization Techniques for Virtual Annotations in Augmented Reality. In Proc. of IEEE VR’05. 295–296.
29. I. Viola and E. Gröller. 2005. Smart Visibility in Visualization. In Proc. of CA’05. 209–216.
30. J. Wither and T. Höllerer. 2005. Pictorial Depth Cues for Outdoor Augmented Reality. In Proc. of IEEE ISWC’05. 92–99.
31. A. Yonas, L. T. Goldsmith, and J. L. Hallstrom. 1978. Development of Sensitivity to Information Provided by Cast Shadows in Pictures. Perception 7, 3 (1978), 333–341.
32. S. Zollmann, C. Hoppe, T. Langlotz, and G. Reitmayr. 2014. FlyAR: Augmented Reality Supported Micro Aerial Vehicle Navigation. IEEE TVCG 20, 4 (2014), 560–568.