
State-of-Art Seminar

Explorations on Body-Gesture based Object Selection on HMD based VR Interfaces for Dense and Occluded Dense Virtual Environments

Presented by

Name: Shimmila Bhowmick

Roll no: 166105005

Supervisor

Dr. Keyur Sorathia

Embedded Interaction Lab, Department of Design, Indian Institute of Technology Guwahati, Assam-781039, India

Figure: Types of VR systems: desktop-based VR (VR workstation); HMD-based VR (dataglove transmitting finger movement, Oculus Rift with in-built screens, Google Cardboard); projection displays (CAVE-like environment)

Figure: Experiment environments: sparse, dense, and occluded dense

Figure: Local vs at-a-distance selection using ray-casting

Figure: The depth ray selects the intersected target that is closest to the depth marker, which is moved by moving the hand forward and backward.
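
A minimal sketch of the depth-marker idea, assuming a hypothetical sphere-based scene format; this is an illustrative simplification, not the technique's published implementation:

```python
import numpy as np

def depth_ray_select(ray_origin, ray_dir, targets, marker_depth):
    """Return the ray-intersected target whose depth along the ray is
    closest to the depth marker (marker_depth is driven by moving the
    hand forward and backward).

    targets: iterable of (center, radius) spheres -- a hypothetical
    scene representation used only for this sketch.
    """
    ray_origin = np.asarray(ray_origin, dtype=float)
    ray_dir = np.asarray(ray_dir, dtype=float)
    ray_dir /= np.linalg.norm(ray_dir)

    best, best_gap = None, float("inf")
    for center, radius in targets:
        to_center = np.asarray(center, dtype=float) - ray_origin
        along = float(np.dot(to_center, ray_dir))   # depth of closest approach
        perp = np.linalg.norm(to_center - along * ray_dir)
        if along < 0 or perp > radius:
            continue                                # the ray misses this target
        gap = abs(along - marker_depth)             # distance to the depth marker
        if gap < best_gap:
            best, best_gap = (center, radius), gap
    return best
```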

Figure: (a) Holding the button down locks the ray and displays the depth marker at its center. (b) The depth marker is controlled with the input device, selecting the closest intersected target. (c) The farthest object is selected.

Figure: Cone casting selects objects that fall inside the selection cone
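
A minimal sketch of the cone test, assuming objects are represented only by their center points and using a hypothetical half-angle parameter; an object is a candidate when the angle between the cone axis and the direction to the object is within the half-angle:

```python
import numpy as np

def cone_cast(apex, axis, half_angle_rad, object_centers):
    """Return the object centers that fall inside the selection cone."""
    apex = np.asarray(apex, dtype=float)
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)

    selected = []
    for center in object_centers:
        to_obj = np.asarray(center, dtype=float) - apex
        dist = np.linalg.norm(to_obj)
        if dist == 0.0:
            continue
        # Angle between the cone axis and the direction to the object.
        cos_angle = np.clip(np.dot(to_obj / dist, axis), -1.0, 1.0)
        if np.arccos(cos_angle) <= half_angle_rad:
            selected.append(center)
    return selected
```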

Figure: Flexible pointer adopts a bendable ray to select occluded objects

Figure: Aperture-based selection

Figure: Aperture with orientation selects “a” as it closely matches the orientation of the tracker

Figure: DynaSpot, the speed-dependent area cursor
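
A heavily simplified sketch of the speed-dependent idea; the actual DynaSpot behaviour (reference 8) also applies a reduction lag and specific transition curves, so this is illustrative only:

```python
def dynaspot_radius(pointer_speed, max_speed=1000.0, max_radius=16.0):
    """Simplified speed-dependent area cursor: the activation radius
    (pixels) grows with pointer speed (pixels/second) up to max_radius
    and collapses to a point cursor when the pointer is still.
    Parameter values are illustrative, not those of the paper."""
    return max_radius * min(pointer_speed, max_speed) / max_speed
```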

Figure: Region examination

Figure: Sticky ray based object selection

Figure: SQUAD

Figure: (a) Intersected targets are highlighted, (b) flowered out in a marking menu and the desired target is selected.

Figure: 2D Bubble cursor resizes to select nearest object
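
A minimal sketch of the 2D bubble cursor idea, with a hypothetical list-of-points scene format; the published technique also caps the bubble using the second-closest target so that only one target is ever enclosed, which is omitted here:

```python
import math

def bubble_cursor_pick(cursor_xy, target_centers):
    """2D bubble cursor sketch: the bubble expands just far enough to
    contain the closest target, so selection reduces to picking the
    nearest target center.

    target_centers: list of (x, y) points -- a hypothetical scene format.
    Returns (nearest_target, bubble_radius).
    """
    def dist(t):
        return math.hypot(t[0] - cursor_xy[0], t[1] - cursor_xy[1])

    nearest = min(target_centers, key=dist)
    return nearest, dist(nearest)
```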

Figure: Double, bound and depth bubble cursors. Objects are selected in accordance with a depth sphere, 2D ring, and cursor at center.

Figure: (a) Clicking activates the cursor and transforms covered targets into crossing arcs. (b) Crossing an arc selects a target.

Figure: (a) Crossing the red trigger arc activates the cursor and transforms targets into crossing arcs. (b) Crossing an arc segment selects that target.

Figure: Motion pointing by assigning different types of elliptical motions to each object

Figure: The Starfish selection technique.

Figure: (a) Head-Crusher technique, in which the object is placed between the user's finger and thumb in its image plane. (b) Sticky Finger technique uses the outstretched finger gesture to select objects.

Figure: Scope technique adapts to the environment by altering the activation area of the cursor.

Figure: (a) Zoom technique zooms in on the area inside the cursor, giving the user a larger area to aim at. (b) Expand technique uses zoom and SQUAD to select objects.

Figure: IntenSelect: selecting a moving atom in a heavily cluttered molecular dynamics environment.

Figure: The visual feedback shows the moving target that, according to cursor behavior, the system detects as being the target-of-interest.

Figure: Object follows the user's hand movement to its exact location.

Figure: Visualization of the 3D virtual hand avatar

Figure: The Go-Go technique, which non-linearly maps real hand extension to virtual arm length
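
The Go-Go mapping (reference 3) keeps a one-to-one mapping close to the body and grows the virtual arm non-linearly beyond a threshold distance; a small sketch with illustrative parameter values:

```python
def gogo_virtual_distance(real_dist, threshold=0.4, k=1.0 / 6.0):
    """Go-Go non-linear arm mapping (reference 3): within the threshold
    distance the virtual hand follows the real hand one-to-one; beyond
    it the virtual arm grows quadratically, letting the user reach
    distant objects. The threshold and gain values here are only
    illustrative."""
    if real_dist < threshold:
        return real_dist
    return real_dist + k * (real_dist - threshold) ** 2
```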

Figure: (a) Menu cone selects the intended target (in yellow) along with other targets. (b) A pull gesture confirms the selection and a menu appears. (c) A directional gesture is performed to select the target.

Figure: PRECIOUS technique: (a) selection cone intersecting various objects, (b) refinement phase, moving the user closer to the objects, (c) single object selection, (d) returning to the original position with the object selected

Figure: Visual feedback conditions: familiarization phase, no feedback, object coloring, connecting line, object halo, and shadow

Figure: Movement of EZCursor VR using Head-movement and/or external input device.

Figure: The IDS method adopts a proximity sphere along the path described by the motion of the hand, and the objects that are intersected by it are considered for selection.
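
A minimal sketch of the sphere-sweep step, assuming a hypothetical sphere-based scene format; the intent-ranking stage of the published IDS method is not shown:

```python
import numpy as np

def ids_candidates(hand_path, objects, proximity_radius):
    """Candidate-gathering step of the intent-driven idea: sweep a
    proximity sphere along the sampled hand path and collect every
    object it intersects.

    hand_path: sequence of 3D hand positions sampled over time.
    objects: iterable of (center, radius) spheres -- a hypothetical
    scene format used only for this sketch.
    """
    candidates = []
    for center, radius in objects:
        center = np.asarray(center, dtype=float)
        for p in hand_path:
            if np.linalg.norm(center - np.asarray(p, dtype=float)) <= proximity_radius + radius:
                candidates.append((center, radius))
                break  # one intersection is enough to keep this object
    return candidates
```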

References:

1. Bolt, R.A., 1980. Put-that-there: voice and gesture at the graphics interface. In: Proceedings of the Seventh Annual Conference on Computer Graphics and Interactive Techniques, Seattle, Washington, United States, pp. 262–270.

2. Bowman, D. A., & Hodges, L. F. (1997, April). An evaluation of techniques for grabbing and manipulating remote objects in immersive virtual environments. In Proceedings of the 1997 symposium on Interactive 3D graphics (pp. 35-ff). ACM.

3. Poupyrev, I., Billinghurst, M., Weghorst, S., & Ichikawa, T. (1996, November). The go-go interaction technique: non-linear mapping for direct manipulation in VR. In Proceedings of the 9th annual ACM symposium on User interface software and technology (pp. 79-80). ACM.

4. Fabio Buttussi and Luca Chittaro. 2018. Effects of different types of virtual reality display on presence and learning in a safety training scenario. In IEEE Transactions on Visualization and Computer Graphics, 24(2), 1063-1076.

5. Cockburn, A., & Firth, A. (2004). Improving the acquisition of small targets. In People and Computers XVII—Designing for Society (pp. 181-196). Springer, London.

6. Kopper, R., Bacim, F., & Bowman, D. (2011). Rapid and accurate 3D selection by progressive refinement. In Proceedings of the IEEE Symposium on 3D User Interfaces (pp. 67-74).

7. Rosa, D. A. W., & Nagel, H. H. (2010, November). Selection techniques for dense and occluded virtual 3D environments, supported by depth feedback: Double, bound and depth bubble cursors. In 2010 XXIX International Conference of the Chilean Computer Science Society (SCCC) (pp. 218-225). IEEE.

8. Chapuis, O., Labrune, J. B., & Pietriga, E. (2009, April). DynaSpot: speed-dependent area cursor. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1391-1400). ACM.

9. Bowman, D., Wingrave, C., Campbell, J., & Ly, V. (2001). Using Pinch Gloves for both natural and abstract interaction techniques in virtual environments.

10. Forsberg, A., Herndon, K., & Zeleznik, R. (1996, November). Aperture based selection for immersive virtual environments. In Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology (pp. 95-96). ACM.

11. Pierce, J. S., Forsberg, A. S., Conway, M. J., Hong, S., Zeleznik, R. C., & Mine, M. R. (1997). Image plane interaction techniques in 3D immersive environments. In Proceedings of the 1997 Symposium on Interactive 3D Graphics, Providence, Rhode Island, United States (pp. 39-43). ACM.

12. Vogel, D., & Balakrishnan, R. (2005). Distant freehand pointing and clicking on very large, high resolution displays. In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (UIST '05) (pp. 33-42). ACM Press.

13. Grossman, T., & Balakrishnan, R. (2004, April). Pointing at trivariate targets in 3D environments. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 447-454). ACM.

14. Zhang, H. (2017). Head-mounted display-based intuitive virtual reality training system for the mining industry. International Journal of Mining Science and Technology.

15. Bowman, D., & Wingrave, C. (2001). Design and evaluation of menu systems for immersive virtual environments. In Proceedings of IEEE VR 2001 (pp. 149-156).

16. Bowman, D., Wingrave, C., Campbell, J., Ly, V., & Rhoton, C. (2002). Novel uses of Pinch Gloves for virtual environment interaction techniques. Virtual Reality, 6(3), 122-129.

References:

17. Cashion, J., Wingrave, C., & LaViola Jr, J. J. (2012). Dense and dynamic 3d selection for game-based virtual environments. IEEE transactions on visualization and computer graphics, 18(4), 634-642.

18. Changyang Li, Wei Liang, Chris Quigley, Yibiao Zhao, Lap-Fai Yu. 2017. “Earthquake safety training through virtual drills". IEEE Transactions on Visualization and Computer Graphics, 23(4), 1275-1284

19. Rosa, D. A. W., & Nagel, H. H. (2010, November). Selection techniques for dense and occluded virtual 3d environments, supported by depth feedback: Double, bound and depth bubble cursors. In Chilean Computer Science Society (SCCC), 2010 XXIX International Conference of the (pp. 218-225). IEEE.

20. Bolt, R. A. (1980). "Put-that-there": Voice and gesture at the graphics interface (Vol. 14, No. 3, pp. 262-270). ACM.

21. Mikropoulos, T. A., & Natsis, A. (2011). Educational virtual environments: A ten year review of empirical research (1999-2009). Computers and Education, 56(3), 769-780.

22. Bowman, D. A., Johnson, D. B., & Hodges, L. F. (2001). Testbed evaluation of virtual environment interaction techniques. Presence: Teleoperators & Virtual Environments, 10(1), 75-95.

23. Lee, S., Seo, J., Kim, G. J., & Park, C. M. (2003). Evaluation of pointing techniques for ray casting selection in virtual environments. In International Conference on Virtual Reality and its Application in Industry (Vol. 4756, No. 1, pp. 38-44).

24. Nee, A. Y. C., & Ong, S. K. (2013). Virtual and augmented reality applications in manufacturing. In 7th IFAC Conference on Manufacturing Modelling, Management and Control (pp. 15-26).

25. De Haan, G., Koutek, M., & Post, F. H. (2005, October). IntenSelect: Using dynamic object rating for assisting 3D object selection. In IPT/EGVE (pp. 201-209).

26. Liang, J., & Green, M. (1994). JDCAD: A highly interactive 3D modeling system. Computers and Graphics, 18(4), 499-506.

27. Ni, T., Bowman, D. A., North, C., & McMahan, R. P. (2011). Design and evaluation of freehand menu selection interfaces using tilt and pinch gestures. International Journal of Human-Computer Studies, 69(9), 551-562.

28. Cashion, J., & LaViola, J. J. (2014, March). Poster: Dynamic adaptation of 3D selection techniques for suitability across diverse scenarios. In 2014 IEEE Symposium on 3D User Interfaces (3DUI) (pp. 165-166). IEEE.

29. Minocha, S., Tudor, A.-D., & Tilling, S. (2017). Affordances of mobile virtual reality and their role in learning and teaching. In Proceedings of the 31st British Computer Society Human Computer Interaction Conference (p. 44). BCS Learning & Development Ltd.

30. Lockwood, D. (2004). Evaluation of Virtual Reality in Africa: An educational perspective. Retrieved September 8, 2018 from http://unesdoc.unesco.org/images/0013/001346/134607e.pdf

31. Gutierrez, F., Pierce, J., Vergara, V., Coulter, R., Saland, L., Caudell, T. P., Goldsmith, T. E., & Alverson, D. C. (2007). The effect of degree of immersion upon learning performance in virtual reality simulations for medical education. In Medicine Meets Virtual Reality 15: In Vivo, In Vitro, In Silico: Designing the Next in Medicine, 125, 155.

32. Tassos A. Mikropoulos. 2006. Presence: a unique characteristic in educational virtual environments. Virtual Reality 10, 3-4 (2006), 197–206.

References:

33. Cobb, S. V., Nichols, S., Ramsey, A., & Wilson, J. R. (1999). Virtual reality-induced symptoms and effects (VRISE). Presence: Teleoperators & Virtual Environments, 8(2), 169-186.

34. Yee, N., & Bailenson, J. (2007). The Proteus effect: The effect of transformed self-representation on behavior. Human Communication Research, 33, 271-290.

35. Yee, N., Bailenson, J. N., & Ducheneaut, N. (2009). The Proteus effect: Implications of transformed digital self-representation on online and offline behavior. Communication Research, 36(2), 285-312.

36. Robertson, A. (2017). Google has shipped over 10 million Cardboard VR headsets. Retrieved April 14, 2018 from https://www.theverge.com/2017/2/28/14767902/googl-cardboard-10-million-shipped-vr-ar-apps

37. Svoronos, T., Mjungu, D., Dhadialla, P., Luk, R., Zue, C., Jackson, J., & Lesh, N. (2010). CommCare: Automated quality improvement to strengthen community-based health. Weston: D-Tree International.

38. Robertson, D. D. O. (2017). Virtual reality training in Southern Africa. Retrieved April 14, 2018 from https://www.unido.org/stories/virtual-reality-training-southern-africa#story-start

39. Argelaguet Sanz, F. (2011). Pointing facilitation techniques for 3D object selection on virtual environments. Universitat Politècnica de Catalunya.

40. Worden, A., Walker, N., Bharat, K., & Hudson, S. (1997). Making computers easier for older adults to use: area cursors and sticky icons. In Proceedings of CHI '97 (pp. 266-271). ACM.

41. Hasan, K., Grossman, T., & Irani, P. (2011, May). Comet and target ghost: techniques for selecting moving targets. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 839-848). ACM.

42. Yoo, S., & Parker, C. (2015, August). Controller-less interaction methods for Google Cardboard. In Proceedings of the 3rd ACM Symposium on Spatial User Interaction (pp. 127-127). ACM.

43. Cockburn, A., & Firth, A. (2004). Improving the acquisition of small targets. In People and Computers XVII—Designing for Society (pp. 181-196). Springer, London.

44. Frees, S., & Kessler, G. D. (2005, March). Precise and rapid interaction through scaled manipulation in immersive virtual environments. In Proceedings of IEEE Virtual Reality 2005 (pp. 99-106). IEEE.

45. Findlater, L., Jansen, A., Shinohara, K., Dixon, M., Kamb, P., Rakita, J., & Wobbrock, J. O. (2010, October). Enhanced area cursors: reducing fine pointing demands for people with motor impairments. In Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology (pp. 153-162). ACM.

46. Vanacken, L., Grossman, T., & Coninx, K. (2007, March). Exploring the effects of environment density and target visibility on object selection in 3D virtual environments. In 2007 IEEE Symposium on 3D User Interfaces. IEEE.

47. Mine, M. (1995). Virtual environment interaction techniques. UNC Chapel Hill Computer Science Technical Report TR95-018.

48. Chatterjee, I., Xiao, R., & Harrison, C. (2015, November). Gaze+Gesture: Expressive, precise and targeted free-space interactions. In Proceedings of the 2015 ACM International Conference on Multimodal Interaction (pp. 131-138). ACM.

49. Ren, G., & O'Neill, E. (2013). 3D selection with freehand gesture. Computers & Graphics, 37(3), 101-120.

50. Brewster, S., Lumsden, J., Bell, M., Hall, M., & Tasker, S. (2003). Multimodal 'eyes-free' interaction techniques for wearable devices. In Proceedings of the CHI '03 Conference on Human Factors in Computing Systems, Ft. Lauderdale, Florida (pp. 473-480). ACM.

51. Krueger, M. (1991). VideoPlace and the interface of the future. In B. Laurel (Ed.), The Art of Human Computer Interface (pp. 417-422). Addison Wesley.

References:

52. Ren, G., & O'Neill, E. (2013). 3D selection with freehand gesture. Computers & Graphics, 37(3), 101-120.

53. Brewster, S., Lumsden, J., Bell, M., Hall, M., & Tasker, S. (2003). Multimodal 'eyes-free' interaction techniques for wearable devices. In Proceedings of the CHI '03 Conference on Human Factors in Computing Systems, Ft. Lauderdale, Florida (pp. 473-480). ACM.

54. Krueger, M. (1991). VideoPlace and the interface of the future. In B. Laurel (Ed.), The Art of Human Computer Interface (pp. 417-422). Addison Wesley.

55. Lucas, J. F. (2005). Design and evaluation of 3D multiple object selection techniques (Doctoral dissertation, Virginia Tech).

56. Teather, R. J. (2018). EZCursorVR: 2D selection with virtual reality head-mounted displays. environments, 4, 31.

57. Periverzov, F., & Ilies, H. (2015, March). IDS: the intent driven selection method for natural user interfaces. In 2015 IEEE Symposium on 3D User Interfaces (3DUI) (pp. 121-128). IEEE.

58. Moore, A. G., Hatch, J. G., Kuehl, S., & McMahan, R. P. (2018). VOTE: A ray-casting study of vote-oriented technique enhancements. International Journal of Human-Computer Studies, 120, 36-48.

59. Mendes, D., Medeiros, D., Sousa, M., Cordeiro, E., Ferreira, A., & Jorge, J. A. (2017). Design and evaluation of a novel out-of-reach selection technique for VR using iterative refinement. Computers & Graphics, 67, 95-102.

60. Vosinakis, S., & Koutsabasis, P. (2018). Evaluation of visual feedback techniques for virtual grasping with bare hands using Leap Motion and Oculus Rift. Virtual Reality, 22(1), 47-62.

61. Lin, J., & Schulze, J. P. (2016). Towards naturally grabbing and moving objects in VR. Electronic Imaging, 2016(4), 1-6.

62. Grossman, T., & Balakrishnan, R. (2006, October). The design and evaluation of selection techniques for 3D volumetric displays. In Proceedings of the 19th Annual ACM Symposium on User Interface Software and Technology (pp. 3-12). ACM.

63. Zhai, S., Buxton, W., & Milgram, P. (1994). The "Silk Cursor": investigating transparency for 3D target acquisition. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '94) (pp. 459-464).

64. Matulic, F., & Vogel, D. (2018, April). Multiray: Multi-finger raycasting for large displays. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 245). ACM.

65. Mayer, S., Schwind, V., Schweigert, R., & Henze, N. (2018, April). The effect of offset correction and cursor on mid-air pointing in real and virtual environments. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 653). ACM.

66. Wyss, H. P., Blach, R., & Bues, M. (2006, March). iSith: Intersection-based spatial interaction for two hands. In 2006 IEEE Symposium on 3D User Interfaces (3DUI) (pp. 59-61). IEEE.

67. Ortega, M. (2013, March). Hook: Heuristics for selecting 3D moving objects in dense target environments. In 2013 IEEE Symposium on 3D User Interfaces (3DUI) (pp. 119-122). IEEE.

68. Wonner, J., Grosjean, J., Capobianco, A., & Bechmann, D. (2012, December). Starfish: a selection technique for dense virtual environments. In Proceedings of the 18th ACM Symposium on Virtual Reality Software and Technology (pp. 101-104). ACM.