
Organic User Interfaces Assignment

1. History of interfaces leading to Organic user interfaces

Human-computer interaction has been developed by many people over the decades; no one person should be credited with its invention or success, as that would be much too simple a view (Reimer, 2005). Figure 1 is a visual representation of the development of the windows, icons, menus and pointers (WIMP) paradigm. The graphics have improved, but the WIMP paradigm has largely stagnated since its inception. In recent years this has been changing, and with the commercial success of the iPhone a new term, the organic user interface, has been coined that could take the place of WIMP.

Organic user interface is a recently coined term; however, the paradigm has been around for decades. The multi-touch three-dimensional touch-sensitive tablet developed in 1985 embodies many of the qualities that are becoming commercially available today (Lee, 1985). Despite the ideas and technologies having been around for so long, the standard input devices to date are still the mouse and keyboard, owing to the high cost of implementing newer ways of interacting with the interface. So even though breakthroughs in human-computer interaction happen only after major advances in technology, people outside the academic community only start to see the benefits once the technology has reached an acceptable price.

Figure 1: History of Interfaces, development of WIMP paradigm (Reimer, 2005)

2. Organic user interfaces defined

Organic user interfaces are computers with displays that can be manipulated and navigated by touch or hand gesture; the display then shows the information on the same screen or conveys it through the display itself moving (Holman & Vertegaal, 2008). Organic user interfaces have three main principles. The first is input equals output: the user interacts with the display directly, meaning 'the input device is the output device' (Vertegaal & Poupyrev, 2008). The second is function equals form, meaning the display can take any shape and the entire object should be engineered to fit the function that object is to carry out. The third is form follows flow, meaning the display can change shape according to the data it is displaying, or can be moved by the user to convey a command.

3. Technologies and their applications using the organic user interface paradigm

3.1. SmartSkin technology

SmartSkin is an interactive surface that can also sense the distance and shape of a hand above the surface, opening up more interaction possibilities. It improves on earlier interactive surfaces because it negates the need for a camera to track hand movements, which means the hardware is simpler. It achieves this by putting a grid of wires under the object that is to act as a sensor; the horizontal wires are receivers and the vertical wires are transmitters. The hand can be measured at the crossing points of these wires through fluctuations in the signals. The tighter the grid, the more accurately SmartSkin can identify intricate hand gestures (Rekimoto, 2002).
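The scanning scheme can be sketched in a few lines of Python. This is an illustrative reconstruction, not code from the SmartSkin paper: the baseline signal level, the threshold and the `read_signal` interface are all invented, but the loop mirrors the idea of driving each transmitter wire in turn and sampling every receiver for a drop in coupled signal.

```python
# Illustrative sketch of SmartSkin-style grid sensing (hypothetical values).
# A hand near a crossing point lowers the capacitively coupled signal
# measured between a transmitter wire and a receiver wire.

BASELINE = 100.0   # signal level with no hand present (arbitrary units)
THRESHOLD = 20.0   # drop in signal that counts as "hand detected"

def scan_grid(read_signal, n_tx, n_rx):
    """Return a proximity map: True where a hand is sensed."""
    hits = []
    for tx in range(n_tx):          # drive one transmitter wire at a time
        row = []
        for rx in range(n_rx):      # sample every receiver wire
            drop = BASELINE - read_signal(tx, rx)
            row.append(drop > THRESHOLD)
        hits.append(row)
    return hits

# Simulated sensor: a hand hovering near crossing point (1, 2)
def fake_signal(tx, rx):
    return 70.0 if (tx, rx) == (1, 2) else 100.0

grid = scan_grid(fake_signal, n_tx=3, n_rx=4)
```

A tighter grid simply means larger `n_tx` and `n_rx`, which is why finer grids can resolve more intricate gestures.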

3.2. SmartSkin-based applications

Three prototype applications use SmartSkin technology: the interactive table, the gesture-recognition pad and the SmartPad. The first two were built to examine types of hand movement, gestures and positions of the hands and fingers. The gesture-recognition pad also incorporates 'capacitance tags', which makes SmartSkin a tangible user interface (the physical and the digital merging together, with the user manipulating the digital through the physical) (Rekimoto, 2002). The SmartPad is the grid woven behind a mobile phone keypad; it enables the user to hover over a button and get a preview of what pressing it would do (Oba, Rekimoto, & Ishizawa, 2003). It also allows for gesture recognition (see figure 2), much like that of the iPhone.

Figure 2: Gesture interface with SmartPad. (left) wiping, (right) virtual jog. (Oba, Rekimoto, & Ishizawa, 2003)

SmartSkin suffers from a lack of tactile feedback (Rekimoto, 2002), meaning that the user's eyes always have to be on what they are doing. Both hands have to stay in view, which presents a usability problem if they are too far apart.

3.3. Organic light-emitting diode technology

An organic light-emitting diode (OLED) display is a technology that can be made paper thin, lightweight, flexible and viewable at any angle. It has been, or will be, used in many applications, which are discussed further below. An OLED display is made up of organic polymer molecules printed on a flexible surface such as polyester film or bendable metal foil (Co & Pashenkov, 2008). OLED displays work by passing an electric current through the surface, making electrons jump between the conductive and emissive layers; when the electrons recombine with holes, light is emitted (Freudenrich, 2005).

3.4. Organic light-emitting diode based applications

Digital watches have benefited from OLED displays, due to the economy of space on a watch: OLED has helped to create a much higher-resolution image than the previous LCD screens. The team at IBM's T.J. Watson Research Center concluded that the higher-resolution display increased usability and readability (Raghunath & Narayanaswami, 2002). This research may be technically dated, because MP4 video watches are now available (Gadgets Arcade, 2008); however, the paper does raise many organic-user-interface usability issues. The research concluded that, for its versatility and elegance (over buttons on the side), the watch should have a touch screen. This moved away from the existing paradigm (buttons and wheels) into the organic user interface paradigm of input equals output. Another example, albeit mentioned very briefly, is the use of the GesturePad to navigate the watch menu: the idea is that the pad is placed somewhere on the user's clothes, and the user then interacts with the watch via the pad (Rekimoto, 2001).

Figure 3: Bending states and events (Schwesig, Poupyrev, & Mori, 2004)

Gummi is a bendable computer that would use an OLED display. As a prototype it currently uses a rigid TFT display with the bendable parts underneath. The prototype was made to test the OUI paradigm of new interaction techniques and interfaces that go beyond WIMP. The interaction (shown in figure 3) is a select/deselect method: the user selects (target down) by bending the Gummi computer concave and deselects (target up) by bending it the opposite way (Schwesig, Poupyrev, & Mori, 2004). User reaction to the prototype was good, with users understanding the concept within minutes; however, it suffers from the lack of a keyboard, and users were not able to type as fast as they wanted to. Couple this with the fact that, as this was a rapid prototype, the test users were technical people, so their comprehension of such gadgetry would be higher than average; if a technical person had trouble inputting text, the ordinary user would find it much harder. Overall, Gummi is still in its infancy, and arguably, for it to become a true working PDA-style computer, a way of making text input as quick as a PDA keyboard would have to be found. It will amount to no more than a novelty if what it is supposed to replace is better than the original.

PaperWindows is another prototype that could one day incorporate OLED displays. This particular prototype uses actual paper, tracked by motion capture and projected onto from overhead, to simulate paper that can be interacted with through deformation. Several sheets of interactive paper are in use at any one time, along with the PC, to create a desk environment. PaperWindows works by mapping what we do naturally with paper onto tasks we carry out on the computer; for example, in books we browse by flipping the page, so flipping a PaperWindow initiates a scroll (Holman, Vertegaal, Altosaar, Troje, & Johns, 2005). PaperWindows shows how we can more intuitively manipulate the technology around us to navigate through graphical user interfaces; this is the concept of function equals form. The advantage is knowledge transferred from real-world experience to the technology. This, however, could lead to greater anxiety when the system crashes, due to the user's suspension of disbelief; an actual book is never going to blue-screen. While it is important for new ways of interacting to be developed, it cannot be at the cost of a decreased understanding of how computers work.

3.5. PreSense technology

PreSense is an interaction technique based on SmartPad, in that it gives the user a preview function. There is also a command-assignment function that goes further than pressing Num Lock or holding down the Shift key on a keyboard, using a technique called touch-shift. The technique has three input states (no-contact, touch and press), as opposed to the two input states of current keypads, thus allowing more types of input per key (Rekimoto, Ishizawa, Schwesig, & Oba, 2003).
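The three input states can be illustrated with a small Python sketch. The pressure thresholds and units below are hypothetical (the paper's sensor values are not given here); the point is simply that one sensor reading is mapped onto three states instead of the usual two.

```python
# Hypothetical three-state classifier for a PreSense-style keypad.
# Thresholds are illustrative, not taken from the PreSense paper.

TOUCH_LEVEL = 5     # any contact above this counts as a touch
PRESS_LEVEL = 50    # firm pressure above this counts as a press

def classify(pressure):
    """Map a raw pressure reading onto one of the three input states."""
    if pressure >= PRESS_LEVEL:
        return "press"       # commits the action, like a key press
    if pressure >= TOUCH_LEVEL:
        return "touch"       # shows a preview of the action
    return "no-contact"      # finger is off the key

states = [classify(p) for p in (0, 10, 80)]
```

The "touch" state is what enables previews: the system knows which key the finger is resting on before the user commits with a press.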

3.6. PreSense-based applications

Three applications have been created and user-tested using the PreSense and SmartPad concepts. The applications are mobile-phone based, and the buttons fitted with sensors were 2 (up), 5 and 8 (middle) and 0 (down). The first application is for traversing the phonebook and getting information about calls to a contact when a middle button is pressed. The second is almost identical, but pop-ups with pictures of the selected person appear, which can then be traversed. The third is a picture-gallery application in which the user traverses the pictures using 2 and 0 and selects a picture using 5 or 8; the user can then zoom in and out of the selected picture (Holleis, Huhtala, & Häkkilä, 2008). These applications were well received by users, with increased touch speed and general usefulness being mentioned. The disadvantages of such a system are that it can increase cognitive load when using a phone, and that users 'rest' their thumb on the keypad without intending to do anything; with this system, the resting thumb will activate something.

3.7. PreSense2 technology

The paper introducing PreSense lacked a concrete technology that enabled all the different interaction techniques to be used at the same time. PreSense has now been superseded by PreSense2, which has a suitable underlying technology that can incorporate them all. The technology used, as shown in figure 4, is a set of pressure sensors under the touchpad to measure how much force is being applied, and a piezo-actuator to give vibration feedback. The amount of feedback depends on how much pressure, and what type of pressure (e.g. click, drag and drop), is applied (Rekimoto & Schwesig, 2006). This technology allows for feedback and bi-directional control. Bi-directional control measures the contact area against a threshold in order to differentiate between types of touch, in this case called positive and negative pressure: positive pressure is the finger cushion touching the touchpad, and negative pressure is the fingertip touching the touchpad (Rekimoto & Schwesig, 2006).

Figure 4: PreSense2 sensor configuration (Rekimoto & Schwesig, 2006)
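The contact-area test behind bi-directional control can be sketched as follows. The area threshold and units are invented for illustration and do not come from the PreSenseII paper; the idea is only that a flat finger cushion covers more sensor area than a fingertip.

```python
# Illustrative contact-area test for PreSenseII-style bi-directional
# control. A flat finger cushion produces a large contact area
# ("positive pressure"), a fingertip a small one ("negative pressure").

AREA_THRESHOLD = 40.0  # mm^2, hypothetical value

def pressure_direction(contact_area_mm2):
    """Classify one touch by its measured contact area."""
    if contact_area_mm2 >= AREA_THRESHOLD:
        return "positive"   # flat finger cushion on the pad
    return "negative"       # fingertip on the pad

cushion = pressure_direction(60.0)
fingertip = pressure_direction(12.0)
```

Because both readings come from the same touch surface, the device distinguishes two "directions" of pressure without any extra hardware beyond the area measurement.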

3.8. PreSense2-based applications

There are two prototype applications. The first is map navigation: the user touches and moves their finger to pan around the map, and uses positive and negative pressure to zoom in and out, so the finger never has to be released from the touchpad to change modes. The other application is scrolling through lists, where positive and negative pressure define the speed at which the list is traversed (Rekimoto & Schwesig, 2006).
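The map-zoom behaviour can be sketched on top of the positive/negative pressure distinction. Everything here (function name, step size) is hypothetical; the point it demonstrates is that one continuous touch both pans and zooms, with no release to change modes.

```python
# Hypothetical sketch of PreSenseII-style zooming: positive pressure
# (flat finger cushion) zooms in, negative pressure (fingertip) zooms
# out, and a light touch leaves the zoom level alone (pan only).

def update_zoom(zoom, direction, step=0.1):
    """Return the new zoom level for one pressure reading."""
    if direction == "positive":
        return zoom * (1 + step)   # zoom in
    if direction == "negative":
        return zoom / (1 + step)   # zoom out
    return zoom                    # plain touch: pan, no zoom change

level = update_zoom(1.0, "positive")
```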

3.9. iPhone technology

The underlying technology of the iPhone is a layer of capacitive material arranged in a grid. The points on the grid each have their own signal (much like SmartSkin), meaning the screen can be touched in multiple places simultaneously (Wilson, 2007). Although not as 'futurist' as some of the other concepts in this paper, the iPhone is 'now the number one smartphone in the US' (Thomson, 2008) and incorporates the OUI paradigm of input equals output; it could be considered by technological historians as the breakthrough system of organic user interfaces. The iPhone has several applications (web browsing, iPod, phone, text, camera and many more) that all use multi-touch and gesture technology. None of these applications is new on a mobile phone, but they were broken and Apple fixed them (Grossman, 2007). Apple could do this because many of the 'broken' things about mobile devices are due to an economy of screen space and insufficient input devices. The iPhone solved the screen-space problem with a big 3.5-inch screen, but this meant there were no buttons, so Apple employed a multi-touch screen that could handle gestures. The gestures are (Block, 2007):

Tap – to start an application

Double-Tap – different for different applications

Pinch/unpinch – zooms in and out

Drag – scroll up/down

Flick – quick scroll up/down

Swipe – change panes or delete items
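The gesture set above amounts to a small dispatch table from recognised gesture to action. The mapping below is a sketch based only on the list in the text; the action strings are paraphrases, not Apple API names.

```python
# Illustrative gesture dispatch table for the iPhone gesture set listed
# above. Gesture and action names follow the text, not any real API.

GESTURES = {
    "tap": "start application",
    "double-tap": "application-specific action",
    "pinch": "zoom out",
    "unpinch": "zoom in",
    "drag": "scroll",
    "flick": "quick scroll",
    "swipe": "change pane / delete item",
}

def handle(gesture):
    """Look up the action for a recognised gesture; ignore unknown input."""
    return GESTURES.get(gesture, "ignored")
```

A dispatch table like this is all the application layer needs once the touchscreen driver has classified the raw multi-touch input into a named gesture.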

What the Apple iPhone has managed to do, whilst not pushing the boundaries of the organic user interface paradigm, is create something successful, and because of this it has changed the business dynamic between mobile carriers and mobile developers. The shift is that mobile developers are now concentrating on the development of a perfect mobile phone instead of 'disposable' ones (Vogelstein, 2008), and the touch screen is at the forefront of this shift, with the creation of the Prada and Viewty by LG, the Samsung F480 and F700, Nokia's XpressMusic and the BlackBerry Storm.

3.10. BlackBerry Storm's Clickthrough technology

The BlackBerry Storm is much like the iPhone in that it is a touch-screen mobile phone with all the same applications (media player, camera, phone, text, email, web browser). The difference between the two is that the Storm combines the SmartPad idea with the iPhone approach, using a technology called 'clickthrough'. Clickthrough is a touch screen with springs underneath, which when depressed gives the touch screen a clickable feeling (Beaumont, 2008). As an example of how this is implemented: if a user is surfing the web, placing a finger on the touch screen and dragging it around moves the web page, while pressing the screen clicks on the link (AP, 2008). This is the next step for organic user interfaces (especially on mobile phones): the computer is starting to be movable by us to convey a command, which is form following flow.
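The touch-versus-press distinction can be expressed as a tiny decision function. This is an illustrative model, not BlackBerry's actual event handling; the event fields are invented for the sketch.

```python
# Sketch of the Storm's clickthrough distinction (hypothetical model):
# a light touch that moves pans the page, a full press (the screen
# physically depressing on its springs) activates the link under the
# finger.

def on_touch_event(is_depressed, moved):
    """Decide what one touch event does in the web browser."""
    if is_depressed:
        return "click link under finger"   # screen pressed all the way down
    if moved:
        return "pan page"                  # light touch, finger moving
    return "hover"                         # light touch, finger still
```

The spring layer is what lets one surface carry both behaviours: position comes from the touch layer, commitment from the physical click.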

4. Organic user interface paradigm vs. the WIMP paradigm

There are two schools of thought when it comes to interaction. The first is the organic user interface, the theme of this literature review; the second is the more traditional user interface (e.g. WIMP), where the user uses physical tools (mouse, touchpad, keyboard) to interact with the screen and computer.

A tabletop display computer is a new arena of interaction, and research was carried out to see whether the mouse or direct touch was the more suitable style of interaction. The experiments were unimanual (a mouse pointer against a hand directly manipulating the screen) and bimanual (two mouse pointers against two hands manipulating the screen). The experiments concluded that if designers are to create multi-input (bimanual) displays, then two hands work better than two mouse pointers; if one-handed (unimanual) input is to be adopted (which, given the multi-touch success of the iPhone, is highly unlikely), then a mouse would be quicker (Forlines, Wigdor, Shen, & Balakrishnan, 2007). This research will become more relevant with the introduction of the Microsoft Surface, a multi-touch tabletop display expected to be released in 2012 (Williams, 2008).

An example of the paradigms conflicting is two new ways to interact with laptops. One is called ThumbSense, which senses the user's hand on the touchpad and, when it does, changes buttons on the keyboard into mouse buttons (see figure 5) (Rekimoto, 2003). The other is ThinSight, an interface technique that allows the sensing of multi-touch on or near the screen for manipulation purposes (Hodges, Izadi, Butler, Rrustemi, & Buxton, 2007). The reason these techniques are being compared is that both can be implemented in existing laptop technology. The advantage of ThinSight over ThumbSense is that with ThinSight the touchpad can be removed from the laptop entirely, whereas ThumbSense removes only the touchpad buttons; this matters for laptops because of the economy of space. ThinSight also allows for bimanual (multi-touch) interaction and, as discussed earlier, direct manipulation is more natural than using a touchpad. The advantages of ThumbSense over ThinSight are that the touchpad is a tool we are already familiar with on a laptop, so no new way of interacting has to be learnt, just an adaptation of an existing one; the switch between typing and pointer manipulation is quicker than with ThinSight; with ThinSight the user's hands may inadvertently get in the way of what they are trying to look at; and ThinSight may use too much battery, shortening the laptop's life between recharges. As this comparison shows, the organic user interface is not always the best method of interacting; in this particular context of laptops, there seem to be more advantages to ThumbSense than to ThinSight.

Figure 5: Comparison of hand positions (A, B: without ThumbSense, the user's hands have to switch positions. C, D: with ThumbSense, the user can transparently switch between the text (C) and mouse (D) modes.) (Rekimoto, 2003)
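The ThumbSense mode switch can be modelled as a key lookup gated on touchpad contact. The specific key-to-button mapping below is hypothetical; the paper's actual assignments may differ.

```python
# Illustrative ThumbSense-style mode switch: while a finger rests on
# the touchpad, selected keyboard keys are reinterpreted as mouse
# buttons. The key choices here are made up for the sketch.

REMAP = {"f": "left-click", "j": "right-click"}

def interpret_key(key, finger_on_touchpad):
    """Return the action a key press produces in the current mode."""
    if finger_on_touchpad and key in REMAP:
        return REMAP[key]    # mouse mode: the key acts as a mouse button
    return f"type '{key}'"   # text mode: normal typing
```

This is why the switch between typing and pointing is so fast: the hands never leave the home position, and the mode is implied by where the thumb is resting.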

5. Future direction of organic user interfaces

At present the iPhone and other mobile phones lead the way in the application of organic user interfaces. In the future, as prototypes are refined and technologies become cheaper, applications such as the Microsoft Surface (estimated 2012) will be released. These new types of 'base PCs' will, after more refinement, be supplemented by PaperWindows and Gummi (bendable computers). As these are more mobile in design, there are more technological barriers to overcome, and they will take longer than the static technologies (Surface) to complete.

6. Conclusions

This article has given an overview of technologies using organic user interfaces at this moment in time: SmartSkin, organic light-emitting diodes, PreSense2, the iPhone and Clickthrough. It has discussed how these technologies could be applied to our everyday lives by giving example applications and describing how they would be used. A comparison of the organic user interface and WIMP paradigms has been presented, concluding that which style of interaction a technology adopts perhaps depends on its context. Lastly, it has given a glimpse into how the organic user interface paradigm is going to shape computing and human-computer interaction for the decade to come.

Bibliography

AP. (2008, October 8). Blackberry's iPhone rival hits the market. Retrieved October 25, 2008, from Independent: http://www.independent.co.uk/life-style/gadgets-and-tech/news/blackberrys-iphone-rival-hits-the-market-954880.html

Beaumont, C. (2008, October 8). BlackBerry touch screen launched to take on the Apple iPhone. Retrieved October 25, 2008, from Telegraph: http://www.telegraph.co.uk/news/3152214/BlackBerry-touch-screen-launched-to-take-on-the-Apple-iPhone.html

Block, R. (2007, July 3). iPhone review, part 1: Hardware, interface, keyboard. Retrieved October 18, 2008, from engadget: http://www.engadget.com/2007/07/03/iphone-review-part-1-hardware-interface-keyboard/

Holman, D., & Vertegaal, R. (2008). Organic User Interfaces: Designing Computers in Any Way, Shape, or Form. Communications of the ACM, volume 51, number 6, 48-55.

Co, E., & Pashenkov, N. (2008). Emerging display technologies for organic user interfaces. Communications of the ACM, volume 51, number 6, 45-47.

Forlines, C., Wigdor, D., Shen, C., & Balakrishnan, R. (2007). Direct-Touch vs. Mouse Input for Tabletop Displays. CHI 2007 Proceedings - Mobile Interaction Techniques I , 647-656.

Freudenrich, P. C. (2005, March 24). How OLEDs Work . Retrieved October 12, 2008, from How Stuff Works: http://electronics.howstuffworks.com/oled2.htm

Gadgets Arcade (division of Arcade International Limited). (2008). MP4 Video Watch - Chrome (2GB). Retrieved October 12, 2008, from gadgets arcade: https://www.gadgetsarcade.com/video-watch-chrome-p-460.html

Grossman, L. (2007, January 9). Apple's New Calling: The iPhone. Retrieved October 18, 2008, from Time: http://www.time.com/time/business/article/0,8599,1575410-1,00.html

Hodges, S., Izadi, S., Butler, A., Rrustemi, A., & Buxton, B. (2007). ThinSight: Versatile Multi-touch Sensing for Thin Form-factor Displays. UIST’07 , 259-268.

Holleis, P., Huhtala, J., & Häkkilä, J. (2008). Studying Applications for Touch-Enabled Mobile Phone Keypads. Second International Conference on Tangible and Embedded Interaction , 15-18.

Holman, D., Vertegaal, R., Altosaar, M., Troje, N., & Johns, D. (2005). PaperWindows: Interaction Techniques for Digital Paper. CHI 2005, PAPERS: Physical Interaction , 591-599.

Raghunath, M. T., & Narayanaswami, C. (2002). User Interfaces for Applications on a Wrist Watch. Personal and Ubiquitous Computing, 17-30.

Oba, H., Rekimoto, J., & Ishizawa, T. (2003). SmartPad: A Finger-Sensing Keypad for Mobile Interaction. CHI 2003: NEW HORIZONS , 850-851.

Reimer, J. (2005, May 5). A History of the GUI. Retrieved October 8, 2008, from Ars Technica: http://arstechnica.com/articles/paedia/gui.ars/8

Rekimoto, J. (2001). GestureWrist and GesturePad: unobtrusive wearable interaction devices. Wearable Computers , 21-27.

Rekimoto, J. (2002). SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces. CHI 2002: changing the world, changing ourselves , 113-120.

Rekimoto, J. (2003). ThumbSense: Automatic Input Mode Sensing for Touchpad-based Interactions. CHI 2003: New Horizons , 852-853.

Rekimoto, J., & Schwesig, C. (2006). PreSenseII: Bi-directional Touch and Pressure Sensing Interactions with Tactile Feedback. CHI 2006. Work-in-Progress , 1253-1258.

Rekimoto, J., Ishizawa, T., Schwesig, C., & Oba, H. (2003). PreSense: Interaction Techniques for Finger Sensing Input Devices. CHI Letters, Volume 5, Issue 2, 203-212.

Vertegaal, R., & Poupyrev, I. (2008). Organic User Interfaces. Communications of the ACM, volume 51, number 6, 26-30.

Schwesig, C., Poupyrev, I., & Mori, E. (2004). Gummi: A Bendable Computer. CHI 2004, Volume 6, Number 1 , 263-270.

Lee, S. K., Buxton, W., & Smith, K. C. (1985). A Multi-Touch Three Dimensional Touch-Sensitive Tablet. CHI '85 Proceedings, 21-25.

Thomson, I. (2008, October 7). Apple iPhone gains market share. Retrieved October 18, 2008, from vnunet: http://www.vnunet.com/vnunet/news/2227601/apple-iphone-gains-market-share

Vogelstein, F. (2008, October 1). The Untold Story: How the iPhone Blew Up the Wireless Industry. Retrieved October 25, 2008, from wired: http://www.wired.com/gadgets/wireless/magazine/16-02/ff_iphone?currentPage=1

Williams, W. (2008, October 25). Future Technology. Web User , pp. 22-30.

Wilson, T. V. (2007, June 20). How the iPhone Works. Retrieved October 18, 2008, from HowStuffWorks: http://electronics.howstuffworks.com/iphone.htm
