
Walk, Look and Smell Through

Valentina Cozza 1
[email protected]

Gianni Fenu 1
[email protected]

Riccardo Scateni 1
[email protected]

Alessandro Soro 1,2
[email protected]

1 Department of Mathematics and Computer Science, University of Cagliari
Via Ospedale 72, 09124 Cagliari, Italy

2 CRS4 - Center for Advanced Studies, Research and Development in Sardinia
POLARIS Science Park, Ed. 1, 09010 Pula, Italy

ABSTRACT
Human-Computer Interaction is typically constrained to the use of sight, hearing, and touch. This paper describes an attempt to move beyond these limitations. We introduce smell into the interaction, with the aim of conveying information through scents, i.e. giving meaning to odours, and of understanding how people appreciate such extensions. We discuss the design and implementation of our prototype system, which represents and manages an immersive environment where the user interacts through visual, auditory, and olfactory information. We have implemented an odour emitter controlled by a presence-sensing device. When the system perceives the presence of a user, it activates audio/visual content to encourage engagement; a specific scent is then diffused in the air to augment the perceived reality of the experience. We discuss technical difficulties and initial empirical observations.

Categories and Subject Descriptors
H5.2 [Information interfaces and presentation]: User Interfaces - Graphical user interfaces.

General Terms
Design, Measurement.

Keywords
Multi-sensory interaction.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
CHItaly '11, September 13-16, 2011, Alghero, Italy
Copyright is held by the owner/author(s).

1. INTRODUCTION
In real life, human beings interact with the world and with each other in a multisensory way: they perceive their surroundings through all five senses, leading to a deeply detailed perception. The history of HCI is mainly built on the visual communication channel. Auditory and haptic interaction came later and are still regarded as optional companions of vision. Only in rare cases have attempts been made to model multisensory environments in which the user is encouraged to get involved at 360°. There are several application fields where receiving information only through the visual, auditory, and tactile channels is an important limitation. Examples include: simulation environments, e.g. for firefighter training [8]; the use of olfactory icons to signal events (Microsoft's "smicons"); wellness applications, such as aromatherapy [5] and multisensory rooms designed for the care of newborn babies or elderly people [9]; and applications in cultural settings, such as interactive multisensory museums [4]. However, many aspects are still open in terms of technological research, design, and evaluation. This work introduces a prototype of an interactive multisensory environment, built on computer vision techniques for motion detection, multimedia content, and actuators able to spray aromas into the air. Our main goal is to experiment with this setup while collecting problems and exploring opportunities in deploying a fully immersive walk, look and smell path. We are developing two specific scenarios:

• The implementation of an interactive wine museum;

• The representation of the whole year cycle, catching the sensations of the changing seasons.

We claim that laying out these interaction scenarios without the olfactory component would not be satisfactory enough for a full sensorial experience. We support this claim by observing that, to create such a full sensorial experience, it is necessary to reproduce a substantial proportion of the stimuli present in real situations [10]. Moreover, the sense of smell has the important function of adding a profusion of information to perception and, above all, of modifying the human emotional state [2, 6]. It is thus essential to complement widely developed and used applications with the sense of smell, the most primitive of the human perceptions. The rest of the work is organized as follows: in Section 2 we describe the state of the art in the field of olfactory interaction; in Section 3 we specify the system architecture and its components; we outline possible future developments in Section 4.

Figure 1: Representation of the virtual path.

2. STATE OF THE ART
Making an application that includes the sense of smell among the channels of communication with the user requires one to venture into a largely unexplored area. This is due mainly to the difficulty of understanding how the brain elaborates the information received through the sense of smell [12] and, subsequently, of determining in which application areas it is useful to add this kind of interaction.

Smell easily draws our attention and focuses it on what is considered important. Kaye suggests a sort of similarity between olfactory displays and "Calm Technologies" [13], noting that the perception of a scent changes quickly from the background to the foreground of our attention, and that changes in environmental scents can quickly attract it [8]. Typically, there are two macro areas of HCI in which smell can be found:

1. Using smell as input, with the creation of sensors able to identify the constituent molecules of a complex scent by means of pattern recognition algorithms, and to return a representation of the scent in a digital format. This is the main task of electronic noses, which obtained large success in the past two decades and found wide usage in the food and wine industries [3];

2. Using smell as output, reproducing scents by means of electronic devices. This is the main task of olfactory displays and their developments; as an output channel, smell is the focus of our interest here, and we give an overview of olfactory displays only.

Olfactory displays perform the inverse task of electronic noses: they are designed with the goal of synthesizing scents from a digital description [7]. Olfactory displays have long been used in real-life applications such as, for example, the "Incense Clock" of Japanese and Chinese tradition [8]. They are also widely used as therapeutic devices in the area of aromatherapy, or "aroma-chology" [5]. These projects are based on the idea that some scents stimulate good feelings in human beings, such as positivity, creativity, and perspicacity, and it is this quality of scents that such electronic systems mostly try to reproduce. Attempts to use smell in interactive systems date back to the 1950s, with its introduction in cinemas [8] as an advertising campaign to bring people back to the movie theatre after the advent of television. Films were complemented with 3D goggles, vibrating seats, and scent emitters, the aim being to provide people with a new approach to the plot. Even if these "scented films" were not a success, the same technologies have been applied in virtual reality environments. A seminal example is Sensorama, "an immersive virtual reality" [11], using 3D vision, vibrating seats, and scents to reproduce real-life scenes, such as a motorbike ride or a walk in a flower garden. Another example, already cited, is the firefighter training system developed by Carter and described by Zybura and colleagues in [14]. Carter focused his work on managing scent emission in quantity and quality: "olfactory output is completely proportional from hint of odor to a stench that makes you want to rip the mask off ...". The sense of smell was also introduced in closed spaces, such as museums and exhibitions, with the goal of creating scents that could help visitors more easily remember the information acquired during the exhibition [1]. This idea was used in famous museums such as the Natural History Museum in London, the Bow Street Old Whiskey Distillery in Dublin, and the Jorvik Viking Museum in York. Pletts Haque realized an artistic installation that used colours and scent to mark off an exhibition area [8]. These are examples in which smell completes the information received from the other senses. The idea of providing information through the sense of smell was also realized in other applications such as the "smicon" [7], where every scent is conventionally associated with the semantics of a piece of information. For example, Microsoft used this idea to create an extension of Outlook in which the sender of an email can be recognized by a smell emitted when the message arrives [7].

3. OUR PROPOSAL
As mentioned before, we describe here a prototype of an interactive multisensory system, supported by presence sensors, multimedia content, and actuators capable of spreading specific aromas in the air (see an example in Figure 1). We aim at exploring the problems and opportunities involved in the realization of immersive paths. The project can be split into the investigation of two different scenarios, interrelated since they use the same basic components.

An interactive wine museum. This scene describes an immersive multisensory experience inspired by the desire to spread knowledge of the regional wine-making culture. The user visits a multimedia exhibition; when she enters a predefined "interaction area", she is involved by several sensations, including sounds, images, and odours. The stimuli have an evocative and engaging nature and enhance the message transmitted by the installation.

The representation of changing seasons. This scene describes an experience inspired by the cultural heritage of Sardinia. The user visits a multimedia exhibition; when arriving at a transition point, for example an airport, she is caught by typical images of Sardinian wildlife, accompanied by typical scents of the underwood. The user gets involved in the cycle of the seasons, and can watch how earth and sea, colours and scents change in time and space.

3.1 Implementation
The whole project consists of the design and implementation of an indoor environment including two different components: a video camera that captures the user's movements, and a mechanically activated device, driven by the first, able to spray scent and thus reproduce olfactory experiences. The two stages are linked by software that performs several tasks in a pipeline:

1. The area in front of the camera is constantly monitoredand the video is recorded at a slow pace of one frameper second;

2. Each frame is compared with the previous N (where N is a user-defined parameter) to determine whether a user is really interested in interacting with the experience;

3. A signal is sent to the actuator that releases a puff ofscent which is sprayed towards the user to complementthe experience with the olfactory channel.
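The pipeline above can be sketched in a few lines. The following is a minimal frame-differencing trigger; since the paper gives no code or thresholds, the values of DIFF_THRESHOLD and MIN_CHANGED_PIXELS and the function name are illustrative assumptions:

```python
import numpy as np

DIFF_THRESHOLD = 30      # per-pixel intensity change counted as "motion" (assumed value)
MIN_CHANGED_PIXELS = 50  # changed pixels needed before we treat it as a user (assumed value)

def user_detected(prev_frame: np.ndarray, frame: np.ndarray) -> bool:
    """Compare two grayscale frames and decide whether to fire the emitter."""
    # Cast to a signed type so the subtraction of uint8 frames does not wrap around.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > DIFF_THRESHOLD)
    return changed >= MIN_CHANGED_PIXELS
```

In the actual pipeline this test would run once per captured frame (i.e. once per second), and a positive result would trigger the signal to the actuator described in step 3.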

Our main initial focus was the realization of the device that reproduces scents. The project was intended to model a kind of virtual path, an enabling technology on which to lay out different applicative models for our studies. As mentioned before, the first two examples we are planning are: an implementation of a wine museum in a natural confined environment, and the sensorial experience of the seasons changing across the year. The details of the detection device, which identifies human presence in the interaction zone, and of the hardware device, which controls the scent emitter, follow.

3.1.1 The video device
The video device is realized using a webcam, positioned above the interaction zone between the user and the virtual representation of the object, which detects changes in that zone. To do so, the device continually extracts information from the video by means of image processing techniques. Such a simple scheme, capturing one frame per second of video and comparing frames two by two, can easily be deployed to an embedded architecture at a later time. From the comparison we obtain differential connected regions from pairs of black-and-white images; these connected regions allow us to understand whether a user is in the interaction zone. When connected regions are identified and the presence of a user is recognized in the area, the scent emitter is fired.
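The extraction of connected regions from a binary difference mask can be illustrated with a small 4-connectivity labelling routine. This is only a sketch; the paper does not specify which image-processing library or labelling algorithm the prototype actually uses:

```python
import numpy as np

def connected_regions(mask: np.ndarray) -> list:
    """Label 4-connected regions of True pixels in a binary mask.
    Returns the pixel count of each region found."""
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    sizes = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                # Flood-fill this region with an explicit stack (no recursion limit issues).
                stack, size = [(y, x)], 0
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    size += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return sizes
```

A region whose pixel count exceeds a predefined size would then be taken as evidence of a user in the interaction zone.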

Figure 2: The realized prototype.

3.1.2 The hardware device
The hardware device is realized by connecting a DC motor to an external microcontroller, programmed to provide the necessary pressure to a scent spray (see Figure 2). We can think of the motor as a sort of remotely controlled electronic finger, which sprays scent by pressing a button. The motor can be in three different states:

1. A state NOP, in which it does not receive enough energy and remains in a no-motion condition;

2. A state RIGHT, in which it receives energy with a given polarity and moves in the right direction;

3. A state LEFT, in which the input energy has inverted polarity and it moves in the opposite (left) direction.
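The three states amount to a mapping from the polarity of the applied voltage. The enum and sign test below are an illustrative sketch of that mapping, not the actual firmware:

```python
from enum import Enum

class MotorState(Enum):
    NOP = 0    # no power: motor at rest
    RIGHT = 1  # forward polarity: press the spray button
    LEFT = 2   # reversed polarity: release / return stroke

def drive(voltage: float) -> MotorState:
    """Map the applied voltage to one of the three motor states."""
    if voltage > 0:
        return MotorState.RIGHT
    if voltage < 0:
        return MotorState.LEFT
    return MotorState.NOP
```

A spray cycle is then a RIGHT pulse (press), followed by a LEFT pulse (release), followed by NOP.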

Figure 3: Representation of the project architecture and the communication between devices.

3.1.3 Communication between devices
As said before, the scent emitter works only when it receives an external signal, which provides energy to the motor. This signal is managed by the video device: when it identifies the presence of an object of a predefined size in the interaction zone, it sends a signal to the actuator (the high-level scheme is in Figure 3). At the moment, in the prototype, the signal is sent via a serial interface. In the future it will travel over the network, so that the webcam and the emitter device will not be obliged to be attached to the same computer.
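The serial trigger can be sketched as follows. The one-byte FIRE_COMMAND protocol is a hypothetical stand-in, since the paper does not describe the actual message format, and an in-memory stub replaces the real port, which with pyserial would be opened as serial.Serial("/dev/ttyUSB0", 9600):

```python
import io

FIRE_COMMAND = b"F"  # hypothetical one-byte command understood by the microcontroller

def send_fire_signal(port) -> int:
    """Write the fire command to any serial-like object exposing write().
    Returns the number of bytes written."""
    return port.write(FIRE_COMMAND)

# A BytesIO stub stands in for the serial port so the logic can be tested off-device.
stub = io.BytesIO()
send_fire_signal(stub)
```

Keeping the transport behind a write() interface is also what would make the planned move from a serial link to a network socket a localized change.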

4. CONCLUSION AND FUTURE WORK
Smell is one of the most important senses in daily life. It provides information necessary to understand the surrounding environment and to complete and complement the information coming from the other senses. In our implementation, smell is the fundamental vehicle of interaction of the whole work. The idea is to realize a localized environment where videos, sounds, and olfactory sensations can be placed side by side. The installation invites the user to interact with the environment without forcing her to do so; this way she can decide to get captured by the surrounding information. As of today, the system is only at a proof-of-concept stage; in the future, the prototype will be improved along two directions:

1. We want to set up the video acquisition system so that it can figure out how long a user stays in the interaction area. This information will be used to determine how many times scent must be emitted, and in what quantity;

2. A second main improvement will be to modify thehardware device, replacing the “electronic finger” witha piezoelectric device.
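The first improvement, measuring how long a user stays in the interaction area, could build directly on the existing one-frame-per-second sampling by counting consecutive motion frames. The sketch below is a hypothetical illustration; at 1 fps the longest run of motion frames equals the dwell time in seconds:

```python
def dwell_frames(motion_flags) -> int:
    """Longest run of consecutive frames in which motion was detected.
    At one frame per second, this count equals the dwell time in seconds."""
    best = run = 0
    for present in motion_flags:
        run = run + 1 if present else 0  # reset the run on any frame without motion
        best = max(best, run)
    return best
```

The returned dwell time could then be mapped to the number and quantity of scent emissions.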

On the prototype we will perform measurements to choose the optimal positioning of the emitters in the environment. Moreover, we will introduce methods of post-ventilation of the environment, so as to leave the perception unchanged for users not involved in the current interaction.

5. REFERENCES
[1] J. Aggleton and L. Waskett. The ability of odours to serve as state-dependent cues for real-world memories: can Viking smell aid the recall of Viking experiences? British Journal of Psychology, 90(1):1-7, 1999.

[2] S. Chu and J. Downes. Odour-evoked autobiographical memories: psychological investigations of Proustian phenomena. Chemical Senses, 25(1):111-116, 2000.

[3] J. Gardner and P. Bartlett. Electronic Noses: Principles and Applications. Oxford University Press, 1999.

[4] N. Inoue. Application of ultra-realistic communication research to digital museum. In Proceedings of the 9th ACM SIGGRAPH Conference on Virtual-Reality Continuum and its Applications in Industry, VRCAI '10, pages 29-32, New York, NY, USA, 2010. ACM.

[5] A. Isen, F. Ashby, and E. Waldron. The sweet smell of success. The Aroma-Chology Review, 4(3):1, 1997.

[6] T. Jacobs. The role of smell. http://www.cf.ac.uk/biosi/staff/jacob/teaching/sensory/olfact1.html, 2007.

[7] J. Kaye. Symbolic olfactory display. Master's thesis, Media Lab, Massachusetts Institute of Technology, 2001.

[8] J. Kaye. Making scents: aromatic output for HCI.interactions, 11:48–61, January 2004.

[9] P. Marti, H. Lund, M. Bacigalupo, L. Giusti, and C. Mennecozzi. Blending senses: a multi-sensory environment for the treatment of dementia affected subjects. Journal of Gerontechnology, 6(1):33-41, January 2007.

[10] B. Ramic-Brkic and A. Chalmers. Virtual smell: authentic smell diffusion in virtual environments. In Proceedings of the 7th International Conference on Computer Graphics, Virtual Reality, Visualisation and Interaction in Africa, AFRIGRAPH '10, pages 45-52, New York, NY, USA, 2010. ACM.

[11] H. Rheingold. Virtual Reality. Summit Books, 1991.

[12] S. S. Schiffman and T. C. Pearce. Introduction to Olfaction: Perception, Anatomy, Physiology, and Molecular Biology, pages 1-31. Wiley-VCH Verlag GmbH and Co. KGaA, 2004.

[13] M. Weiser and J. S. Brown. The coming age of calm technology, pages 75-85. Copernicus, New York, NY, USA, 1997.

[14] M. Zybura and G. A. Eskeland. Olfaction for virtualreality. University of Washington, 1999.