Hajukone: Developing an Open Source Olfactory Device

Abstract In comparison to our other senses, there has been relatively little work on how our sense of smell can be effectively utilised in a Human-Computer Interface. We argue that the lack of easy access to ‘off-the-shelf’ computer controlled scent delivery devices restricts research in this area, and that without understanding what smell can be used for, there is little commercial case to make such devices available. In considering these issues, we have developed Hajukone: a smell delivery device that is both open source and can be built with low technical skills, yet provides high quality olfactory capabilities. We outline the design of Hajukone, showing how it overcomes critical design requirements that have restricted prior research, before outlining our future plans for its development and use.

Author Keywords Olfaction; Open-Source; Scent-Delivery; Making

ACM Classification Keywords H.5.m. Information interfaces and presentation: Miscellaneous;

Introduction In 2004 Kaye [12,13] noted that although smell plays significant and important roles in daily life, in “HCI smell is an almost entirely unexplored medium” [12]

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s). Copyright is held by the author/owner(s). CHI'16 Extended Abstracts, May 7–12, 2016, San Jose, CA, USA. ACM 978-1-4503-4082-3/16/05. http://dx.doi.org/10.1145/2851581.2892339

David McGookin Department of Computer Science Aalto University Helsinki, Finland [email protected] www.davidmcgookin.net Dariela Escobar School of Arts and Communication Malmö University Malmö, Sweden [email protected] www.darielaescobar.com


(p. 50). Whilst the last ten years have seen significant advances in touch, haptic and auditory interaction, relatively few papers have considered the use of olfaction in HCI [3,5,7,18,21,24]. Despite the varied roles smell plays in our daily lives [18], its potential in HCI remains significant but largely unrealised. Whilst researchers in other emergent areas over the last decade (such as pressure interaction [4], recently incorporated into the Apple iPhone [1] and Watch [11]) have been able to easily appropriate commercial devices (e.g. Wacom drawing tablets [19]) to support research, olfactory researchers must start by designing and building an olfactory delivery device – a device capable of producing a scent on demand. This can require significant engineering skills, placing a barrier in the way of researchers looking to understand how smell can be used in a human-computer interface.

These issues persist in spite of significant research effort into the development of a range of olfactory delivery technologies and systems [3,7,10,17,21–23,25]. However, such work largely focuses on technical contributions to scent delivery devices, not on how they are used. It is rarely reported in enough detail to be technically reproduced by third parties, and is often developed within companies, so it is difficult for researchers without engineering skills to build comparable devices. This leads to HCI researchers studying the role and use of smell on less than “state of the art” technology. For example, Bodnar, Corbett and Nekrasovski [3] used an Ethernet-enabled power socket and a powered room freshener fragrance device to study olfactory notifications (see Figure 1).

Such approaches also lead to devices with widely varying characteristics. Unlike in audio or haptics, where the time between triggering a stimulus and that stimulus being physically generated and delivered is predictable (and often so short that it is ignored), the delivery time of a scent varies widely (from several milliseconds to several seconds) depending on the delivery mechanism. Ultrasonic devices (such as the Scentee Balloon [20] – Figure 3) are significantly faster than thermal approaches [7], where a solid is heated to emit a scent. This can have a significant effect on results, but such characteristics are often not reported. Warnock, McGee-Lennon and Brewster [24] report the time taken to respond to olfactory notifications when the user is engaged in a secondary task, but provide no information on how long scent delivery took. Bodnar, Corbett and Nekrasovski [3] provide no details on the technology used within their commercial room fragrance delivery mechanism. Such characteristics make comparison of results on the use of smell in HCI challenging.

A commercial off-the-shelf computer-controlled smell device would be a solution, and Kaye [12] was optimistic in 2004 that these were coming. However, the devices he mentioned never reached market. In the last decade some devices have been commercially released, but have not lasted long. This is in part due to a lack of knowledge about what smell is actually useful for in a computer interface. Currently there are two commercial devices that researchers can employ. The Dale Air Vortex [8] (see Figure 2) costs approx. $500 USD and uses alcohol-based scents soaked onto cotton pads. A CPU fan blows air across the pad, carrying the scent towards the user. However, as the unit is open, scent is always released through evaporation, and the fan provides only crude and limited control over the direction of scent. As the device can

Figure 1: An example ad-hoc device used to study olfaction. Taken from Bodnar, Corbett and Nekrasovski [3].

Figure 3: The commercial Scentee Balloon provides a single scent controlled via a proprietary API.

Figure 2: The Dale Air Vortex [8] uses CPU fans to blow air over discs. The discs are soaked in alcohol-based scent that evaporates into the air.


contain 4 scents, good ventilation is also required to stop smells combining in the atmosphere into an unpleasant mix [5]. The Scentee Balloon [20] (see Figure 3) costs $50 USD and uses ultrasonic transducers to provide faster and more controlled release of a contained liquid scent. However, it delivers only a single scent, in a proprietary capsule, via a proprietary audio connection interface on a mobile device, and is compatible with only a few smartphone models. It cannot support multiple scents, or combinations of modalities (e.g. smell and sound) concurrently, and its proprietary API limits the amount of scent that can be released.

We argue that there is a gulf between those who primarily study smell delivery technology and those who wish to study the role of smell in HCI. The former do not (for whatever reason) make their technologies or designs reproducibly available, and the latter generally (with notable exceptions such as [7,21]) do not have the technical ability to design and build relatively complex devices, so resort to “least effort” options. Whilst there are commercial devices, these often do not last long on the market, and until research on the use of smell has developed, there are no compelling use cases to support commercial development.

Hajukone It is clear that a common, relatively advanced platform for researchers to employ would be desirable. To address this we have begun to develop a scent delivery device called Hajukone. Hajukone is designed to be made widely and freely available to researchers, and to provide a common platform on which findings can be more easily compared. However, as discussed, it is not enough to build a device; we must also consider how others will be able to access it. Based on the previous discussion we identified the following key design challenges:

Ability to Produce Multiple Scents: We cannot automatically mix a scent from a set of primary components in the same way as we might mix a colour [2]. Therefore, smell delivery devices can only emit pre-mixed scents. Current approaches to producing multiple scents require duplication of devices, which increases cost, size and complexity. For example, Bodnar, Corbett and Nekrasovski [3] (see Figure 1), in order to study desktop notifications, used two room fragrance devices with two X.10 networked power points to generate two scents, and required IP networking infrastructure to control them. The commercial Scentee Balloon (see Figure 3) requires an iPhone and a Balloon device for each scent, and networking between iPhones to control scent release. Hajukone therefore needs to support multiple scents, but without simply duplicating devices for each and expecting researchers to manage the significant technical complexity this can bring.

Open Source: Although there is extensive research in smell delivery [7,10,17,21–23,25], such devices are not directly available or presented in enough detail for them to be reproduced. Ad-hoc solutions in existing research vary significantly in their scent release characteristics. With an open-source design, whose characteristics can be well understood, it will be easier to compare findings across studies.

Easily Reproducible: Whilst existing researchers in smell interaction may not be able to design a smell delivery device (thus turning to ad-hoc solutions), it is increasingly likely that they can construct one. Many

Figure 4: Our initial design ideas came from a multi-colour pen (left). This led to a cartridge based delivery concept (right).

Figure 5: Our initial designs focused on scent cartridges that could be pushed up in the device to emit a scent.


universities now have fabrication laboratories (Fab-Labs) [14], which provide the technical skills and devices to support making. However, it is important that the components used are widely available: providing open-source solutions is not helpful if the components they use are specialised or hard to obtain. By using widely available components and making designs available in standard formats, it should be possible to reproduce Hajukone with minimal technical skills in hardware design and engineering.

Mobility and Portability: The majority of proposed commercial devices and ad-hoc solutions require static infrastructure (e.g. a USB connection to a desktop computer, or mains power sockets). This restricts researchers to desktop-like scenarios, even if their work considers more mobile and ubiquitous ones. For example, Warnock, McGee-Lennon and Brewster [24] studied the role of smell in providing reminders to support elderly users living at home, but had to use a Dale Air Vortex, which required a USB connection to a desktop PC. Research such as Choi et al.’s olfactory glasses [7] illustrates the potential of mobile olfaction, yet the technologies available to most researchers (e.g. the mains-powered devices used by Bodnar, Corbett and Nekrasovski [3]) preclude it. Hajukone must be able to work in mobile scenarios. This also somewhat excludes more localised scent possibilities, such as ‘firing’ scent vortex rings at users, as these are sensitive to occlusion from other users getting in the way of the device [17].

Initial Designs In considering how to apply these goals, it became clear that the largest challenge is in producing multiple scents without needing multiple instances of the device. To do this we need to be able to dynamically disconnect and reconnect different scents to whatever device or technology is used to deliver them. In solving this we were inspired by the design of multi-colour pens (see Figure 4 (left)), which have small sliders that push the pen nib with the corresponding colour out of the pen to write with. A similar approach could push capsules of scent up to a delivery device (see Figure 4 (right)). Whilst this works in theory, and would overcome the need to duplicate equipment, it introduces the problem of contamination and undesired mixing of smells with each other. This is an issue for all prior smell work. Brewster, McGookin and Miller [5], in their study of a tangible olfactory photo browser that used multiple plastic cubes each containing a scent, noted: “the smells were very powerful and created a nasty mixture (which infused our whole department). We put the cubes in zip-loc bags and then in a sealed plastic box to stop the smells escaping”. Smells must therefore be contained, and allowing them to mix in the common parts of the delivery mechanism will lead to undesirable consequences. A potential solution is to use solid scents that only produce scent when melted by exposure to heat [7], and can therefore be easily switched without contamination. However, these require special skills to create. In line with our goal of easy reproducibility, we decided that scents should be liquid-based. This allows a wide range of existing scents to be used (e.g. essential oils or fragrances).

To overcome this issue, we decided to minimize the amount of common infrastructure between scents that each would be exposed to. For parts of the device that would be exposed to multiple scents, we would create a cleaning capsule containing water that would wash the

Figure 6: A disassembled Scentee Balloon [20]. The Ultrasonic transducer (on the left) is used to emit scent from the scent reservoir disassembled in the top of the picture.

Figure 7: Our final concept focused on a rotating container to deliver scents to the humidifier’s ultrasonic transducer.


common parts after each scent delivery event, avoiding smells mixing. This led to an initial prototype of a ‘spritzer’, where cartridges containing scent could be pushed up to a delivery mechanism (see Figure 5).
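The cleaning-capsule idea amounts to a simple scheduling rule: every scent delivery event is followed by a wash of the shared delivery path. A minimal sketch in Python (illustrative only; the function and token names are our own, not from the released Hajukone source):

```python
def schedule_emissions(scent_requests, clean_token="water"):
    """Interleave a cleaning step after every scent delivery, so the
    shared parts of the delivery path are washed before the next scent.
    `scent_requests` is an ordered list of scent names; `clean_token`
    names the dedicated water capsule."""
    events = []
    for scent in scent_requests:
        events.append(("emit", scent))        # deliver the requested scent
        events.append(("clean", clean_token))  # wash the common parts
    return events

# For example, requesting lemon then pine yields:
# emit lemon -> clean -> emit pine -> clean
```

The design choice here is that cleaning is unconditional; a more refined scheduler might skip the wash when the same scent is delivered twice in a row.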

Delivering the Scent Our initial prototype assumed that the liquid-based scent would be sprayed out, in a similar way to scent being emitted by a perfume bottle. However, during bodystorming [15] of the mock-up device (see Figure 4 (right)) we quickly realized that such sprays are designed to impregnate something or someone with scent that is then released; the scent cannot be easily cleared. This means the device would either spray multiple scents onto the user (creating an unpleasant merging of scent [5]), or spray others nearby whilst remaining undetectable to the user. Work such as Nakaizumi et al.’s [17] scent projectors overcomes this, projecting a vortex of scent at the user’s nose from an olfactory cannon. However, this requires camera tracking, and has not been studied in multi-user or open mobile interaction scenarios: others walking in front of the “line of fire” would cause the same problem as the perfume bottle.

In re-evaluating the problem, we again rejected heat-based solid scents (see previous section) and solutions that rely on evaporation into the atmosphere (such as the cubes used by Brewster, McGookin and Miller [5] and the Dale Air Vortex [8]), as their scent emission cannot be stopped. Instead, we considered ultrasonic transducers. These are, in effect, small speakers that vibrate at particular ultrasonic frequencies. As with all speakers, they generate a pressure wave that permeates through the air. Such devices have begun to be investigated in HCI, such as Carter et al.’s work [6] to support physical levitation of lightweight objects. A variant of these devices are ultrasonic atomisers, which use a semi-permeable transducer and capillary action to emit a fine mist from a fluid on demand. This produces a more ambient infusion of scent into the air than the directional “perfume bottle” model. Such devices have already been used in smell delivery (e.g. this is how the Scentee Balloon (see Figure 6) emits a scent [20]).

Although atomisers produce scent instantly, in a controlled quantity, on demand, and can quickly spread it through the air, there are practical issues with this approach as it relates to our design goals. Although widely used in industrial applications, atomisers are not commonly used by the making community (nor stocked by making stores such as http://www.adafruit.com/), and so are often only available in bulk. In addition, an atomiser requires a relatively complex signal generator to produce the ultrasonic signal; this must be set to the resonant frequency of the transducer and is non-trivial to construct. As such, using raw ultrasonic atomisers would significantly compromise our easy reproduction goal.

To overcome this we began to look for devices that contain both the transducers and drivers, and which would be easy to modify. We found that USB humidifiers (commonly and cheaply available on web stores such as www.amazon.com) contain both. Such devices are cheap ($4-5 USD each) and simple, running whenever power is supplied via a USB port. They can therefore be treated as components: by using a transistor as a switch to turn power on and off, computer control is very easy to obtain. Whilst this approach compromises our open source design goal to some

Figure 9: We developed a cardboard working prototype of the final design.

Figure 8: A sketch of the final design, intended to be small enough to be attachable by clip to a backpack shoulder strap.


extent, the simplicity and common characteristics of USB humidifiers (i.e. they do only one thing, without additional complexity such as on/off buttons) and their wide availability mean that one model can be relatively easily swapped for another without complex re-engineering of the device. Whilst it is arguably technically simpler to use the humidifiers as-is, beyond a small number of scents it would be necessary to attach a new device to the user for each additional scent. This would quickly cover the user in devices, which may not emit scent in an ideal location.

Final Design We developed sketches of what would become the final design (see Figure 7). All the humidifiers had an absorbent fibre that pulled water from a container to the ultrasonic transducer. Rather than pushing cartridges up towards the humidifier, the humidifier can be pulled down, engaging with the absorbent fibre and scented liquid in a container. The container can then be rotated to provide different scents, with a dedicated segment filled with water to clean the ultrasonic transducer. Although the scents will evaporate at the top of the fibres, as these must be left exposed, this is not part of the delivery mechanism (as it is with the Dale Air Vortex [8]), so the unit can be sealed (see Figure 10), the only way for scents to escape being via the ultrasonic device under computer control.
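The rotating container can be thought of as a circle of equal segments, one reserved for water, with the servo turning the container until the requested segment sits under the transducer. The following is a hedged sketch of that indexing arithmetic (segment count and the convention of segment 0 for water are our assumptions for illustration, not taken from the released design):

```python
SEGMENTS = 7  # assumed: six scent segments plus one water segment
WATER = 0     # assumed index of the dedicated cleaning segment

def rotation_degrees(current, target, segments=SEGMENTS):
    """Degrees the servo must turn (in one fixed direction) to bring
    segment `target` under the transducer, starting from `current`."""
    step = 360.0 / segments          # angular width of one segment
    return ((target - current) % segments) * step
```

For example, with six segments, moving from segment 0 to segment 3 is half a turn (180 degrees), and moving from segment 5 back to segment 0 is a single 60-degree step. In the actual device this indexing is confirmed by a switch at the container base rather than by dead reckoning alone.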

To test our approach, we built a working cardboard prototype (see Figure 9) with a 3D printed container to hold the liquid. We found that the device could be effectively controlled by an Arduino Uno microcontroller (www.arduino.cc) using only a few very basic circuits. A servo was used to rotate the scent container, with a small switch at the base used to index the container for each scent. Another servo pushed the ultrasonic transducer down, whilst a transistor was used to switch power to the transducer to activate it. The device can be controlled over a serial interface by a laptop via the Arduino USB cable, or wirelessly by an iPhone or Android device via a Bluetooth Low Energy shield on the Arduino.
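Serial control of this kind typically reduces to a small ASCII command protocol sent to the Arduino. As a purely illustrative sketch (the command letters and framing below are our own invention, not the protocol of the released Hajukone firmware), a host could encode commands like so:

```python
def cmd_select(segment):
    """Ask the device to rotate the container so `segment` sits
    under the transducer."""
    return f"S{segment}\n".encode("ascii")

def cmd_emit(duration_ms):
    """Ask the device to power the transducer (via the switching
    transistor) for `duration_ms` milliseconds."""
    return f"E{duration_ms}\n".encode("ascii")

def cmd_clean():
    """Ask the device to select the water segment and run a
    cleaning burst."""
    return b"C\n"

# Hypothetical host-side use over the Arduino's USB serial port
# (requires the pyserial package and attached hardware):
#   import serial
#   with serial.Serial("/dev/ttyUSB0", 9600) as port:
#       port.write(cmd_select(2))
#       port.write(cmd_emit(500))
#       port.write(cmd_clean())
```

A newline-terminated single-letter protocol like this is easy to parse with `Serial.readStringUntil('\n')` on the Arduino side, which is one reason such framings are common in maker projects.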

Current Device We replaced the cardboard components with Computer Numerical Control (CNC) machined wooden parts. In building the final version, the 3D printed container used to hold the scents was found to be unreliable and leaky. 3D printing is still at an early stage, and careful design of objects is required to ensure they can be printed reliably. This required us to redesign the container to be bigger, and thus more reliably printable. As a result, the device moved from being a small shoulder-mounted device as originally intended (see Figure 8) to a chest-worn device (see Figure 10), but is still mobile.

Discussion Informal testing of the current device has shown it can reliably emit up to six scents, and that any liquid-based scent can be used. This opens up a significant range of off-the-shelf scents. We have released the 3D printer (.stl) and CNC (.ai) files, circuit diagrams and source code (.pde) [16] under a GPL v3 licence [9]. Our future work focuses on more formal testing and use of Hajukone, as well as having others reproduce the device (e.g. in student projects). This will allow us to validate the technical skill needed to build the device, and make building it easier. By doing so we will be able to validate whether our approach can help end the impasse in olfactory HCI research.

Figure 10: The final prototype version. Top: The overall assembly. Middle: illustrating a top view of the device without its lid. Bottom: The sealed unit can be chest worn.


References 1. Apple Inc. http://www.apple.com/iphone/.

2. Benjamin Auffarth. 2013. Understanding smell—The olfactory stimulus problem. Neuroscience & Biobehavioral Reviews 37, 8: 1667–1679. http://doi.org/10.1016/j.neubiorev.2013.06.009

3. Adam Bodnar, Richard Corbett, and Dmitry Nekrasovski. 2004. AROMA: Ambient Awareness Through Olfaction in a Messaging Application. Proceedings of the 6th International ACM Conference on Multimodal Interfaces (ICMI 2004), ACM, 183–190. http://doi.org/10.1145/1027933.1027965

4. Stephen A Brewster and Michael Hughes. 2009. Pressure-based Text Entry for Mobile Devices. Proceedings of the 11th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI 2009), ACM, 9:1–9:4. http://doi.org/10.1145/1613858.1613870

5. Stephen Brewster, David McGookin, and Christopher Miller. 2006. Olfoto: Designing a Smell-based Interaction. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2006), ACM. http://doi.org/10.1145/1124772.1124869

6. Tom Carter, Sue Ann Seah, Benjamin Long, Bruce Drinkwater, and Sriram Subramanian. 2013. UltraHaptics: Multi-point Mid-air Haptic Feedback for Touch Surfaces. Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology (UIST 2013), ACM, 505–514. http://doi.org/10.1145/2501988.2502018

7. Yongsoon Choi, Adrian David Cheok, Xavier Roman, The Anh Nguyen, Kenichi Sugimoto, and Veronica Halupka. 2011. Sound Perfume: Designing a Wearable Sound and Fragrance Media for Face-to-face Interpersonal Interaction.

Proceedings of the 8th International Conference on Advances in Computer Entertainment Technology (ACE 2011), ACM, 4:1–4:8. http://doi.org/10.1145/2071423.2071428

8. Dale Air. http://www.daleair.com/dispensers/machines/vortex-activ.

9. GNU General Public License v3. 2016. http://www.gnu.org/licenses/gpl-3.0.en.html.

10. Nicolas S Herrera and Ryan P McMahan. 2014. Development of a Simple and Low-Cost Olfactory Display for Immersive Media Experiences. Proceedings of the 2nd ACM International Workshop on Immersive Media Experiences, ACM, 1–6. http://doi.org/10.1145/2660579.2660584

11. Apple Inc. 2016. http://www.apple.com/watch/.

12. Joseph “Jofish” Kaye. 2004. Making Scents: Aromatic Output for HCI. interactions 11, 1: 48–61. http://doi.org/10.1145/962342.964333

13. Joseph Nathaniel Kaye. 2001. Symbolic Olfactory Display. Master’s Thesis, Massachusetts Institute of Technology.

14. Fab Labs. http://www.fablabs.io.

15. Bella Martin and Bruce Hanington. 2012. Universal Methods of Design: 100 Ways to Research Complex Problems, Develop Innovative Ideas, and Design Effective Solutions. Rockport Publishers, Beverly, MA.

16. David McGookin and Dariela Escobar. 2016. Hajukone GitHub Repository. https://github.com/davidmcgookin/Haju.

17. Fumitaka Nakaizumi, Yasuyuki Yanagida, Haruo Noma, and Kenichi Hosaka. 2006. SpotScents: A Novel Method of Natural Scent Delivery Using Multiple Scent Projectors. IEEE Virtual Reality Conference (VR 2006): 207–214. http://doi.org/10.1109/VR.2006.122


18. Marianna Obrist, Alexandre N Tuch, and Kasper Hornbaek. 2014. Opportunities for Odor: Experiences with Smell and Implications for Technology. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2014), ACM, 2843–2852. http://doi.org/10.1145/2556288.2557008

19. Gonzalo Ramos, Matthew Boulos, and Ravin Balakrishnan. 2004. Pressure Widgets. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2004), ACM, 487–494. http://doi.org/10.1145/985692.985754

20. Scentee. 2015. https://scentee.com.

21. Sue Ann Seah, Diego Martinez Plasencia, Peter D Bennett, et al. 2014. SensaBubble: A Chrono-sensory Mid-air Display of Sight and Smell. Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems (CHI 2014), ACM, 2863–2872. http://doi.org/10.1145/2556288.2557087

22. Sayumi Sugimoto, Daisuke Noguchi, Yuichi Bannai, and Kenichi Okada. 2010. Ink Jet Olfactory Display Enabling Instantaneous Switches of Scents. Proceedings of the ACM International Conference on Multimedia (MM 2010), ACM, 301–310. http://doi.org/10.1145/1873951.1873994

23. Risa Suzuki, Shutaro Homma, Eri Matsuura, and Ken-ichi Okada. 2014. System for Presenting and Creating Smell Effects to Video. Proceedings of the 16th ACM International Conference on Multimodal Interaction (ICMI 2014), ACM, 208–215. http://doi.org/10.1145/2663204.2663269

24. David Warnock, Marilyn McGee-Lennon, and Stephen Brewster. 2011. The Role of Modality in Notification Performance. Human-Computer Interaction – INTERACT 2011: 572–588.

25. Tomoya Yamada, Satoshi Yokoyama, Tomohiro Tanikawa, Koichi Hirota, and Michitaka Hirose. 2006. Wearable olfactory display: Using odor in outdoor environment. Proceedings - IEEE Virtual Reality 2006: 28. http://doi.org/10.1109/VR.2006.147