Tangible User Interfaces II
Early examples
Tangible bits, blocks and tiles
Ambient interfaces
Gesture interfaces
Interactive designs
Ina Wagner, Institut für Gestaltungs- und Wirkungsforschung, TU Wien
Early examples - overview of concepts
Early examples
• Forgotten predecessors, 1975 - 1985; first designs 1993
• Marble Answering Machine, 1992, Durell Bishop (augmented product design)
• Graspable Bricks, 1995, Fitzmaurice, Ishii and Buxton
• Tangible Bits, Ishii and Ullmer 1997
Basic ideas:
• Graspable objects for interaction with digital representations and data
• 3D haptic-tactile interfaces
• Graspable objects for representing and controlling digital information
• Direct coupling of material objects with digital representations
• Explicit interaction
Games: CardBoard Box Garden
Sound installation piece made up of fifteen boxes, varying in shape and size.
The “Recording group” contains three boxes. When opened, each of the boxes records a separate audio sample. By analysing its frequency and amplitude, the input sound is converted into one of three musical instruments: piano, xylophone or percussion.
The “Play group” is again made up of three boxes, each paired with a “Recording” box. The three pairs of boxes are each assigned a particular musical instrument. Opening a “Play” box causes the sound initially generated by the child to be played back. Once the desired musical sounds are playing, the children can also alter the volume and tempo by stacking and pushing other specific cardboard boxes (Ferris & Bannon 2002).
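The frequency/amplitude mapping described above can be sketched roughly as follows. The zero-crossing pitch estimate, the RMS loudness measure and all thresholds are illustrative assumptions, not details of Ferris & Bannon's implementation.

```python
# Hypothetical sketch of the Recording-box mapping: an audio sample is
# reduced to a rough pitch estimate (zero-crossing rate) and a loudness
# estimate (RMS amplitude), then assigned to one of three instruments.
import math

def classify_sample(samples, sample_rate=8000):
    """Map a recorded audio sample to 'piano', 'xylophone' or 'percussion'."""
    # RMS amplitude as a loudness estimate
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Zero-crossing rate as a crude frequency estimate
    crossings = sum(1 for a, b in zip(samples, samples[1:])
                    if (a < 0) != (b < 0))
    freq = crossings * sample_rate / (2 * len(samples))
    if rms > 0.5:                      # loud input -> percussion
        return "percussion"
    return "xylophone" if freq > 500 else "piano"
```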
Tangible Viewpoints (Mazalek 2002): interactive storytelling - elements of stories ‘stick’ to the aura of the figurines
Designer‘s Outpost
Web design tool based on image recognition (identifies Post-its)
Blocks and post-its
Webstickers (Holmquist 1999)
• Use objects as bookmarks
• Transfer the post-it to the virtual world
• Augment objects with barcodes
• Use barcodes as keys in a networked database and map them to URLs (the user predefines the mapping)
• Example: associate a printed document with its web version
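The barcode-to-URL idea can be sketched as a simple key-value store; the class, method names and example URL below are invented for illustration and are not from Holmquist's system.

```python
# Minimal sketch of the Webstickers idea: the barcode printed on a
# sticker is used as a key in a shared database, and the user
# predefines which URL it maps to.
class StickerDatabase:
    def __init__(self):
        self._urls = {}

    def define(self, barcode, url):
        """User predefines the barcode -> URL mapping."""
        self._urls[barcode] = url

    def resolve(self, barcode):
        """Scanning a sticker looks up the associated web page."""
        return self._urls.get(barcode)

db = StickerDatabase()
# Associate a printed document with its web version
db.define("0012345", "https://example.org/report.html")
```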
Tangible mediaBlocks
• mediaBlocks are electronically tagged wooden blocks serving as physical icons (phycons)
• Capture, containment and transport of media across diverse physical world devices
• A media sequencer supports construction of multimedia sequences and presentations
• ‘Slot’: e.g. capturing a drawing session on a digital whiteboard - inserting a block into the slot triggers a musical chime indicating the start of recording
Tangible bits (Ishii/Ullmer, CHI’97)
• To grasp and manipulate bits (digital information) by coupling them with everyday objects
• To be aware of background bits at the periphery of perception using ambient media
• Bridge the gaps between digital and physical world
• Examples:
– Interactive surfaces - e.g. metaDESK
– Coupling of bits with objects - e.g. transBOARD
– Ambient media - e.g. ambientROOM
Tangible bits - TransBOARD
• One-way interactive surface absorbing information from the physical world (strokes) and transforming them into bits
• Strokes are stored digitally on hyperCARDS
• hyperCARDS are the physical representations of the stored data
Tangible bits - ambientROOM
• Surround user with augmented environment
• Ambient media (light, shadow, sound, airflow, waterflow) as carrier of information at the periphery of human perception
• Activity of a distant loved one rendered as the sound of rain or as water ripples projected on the ceiling
• Active wallpaper - patterns of illuminated projected patches as indicators of low or high activity
• Ambient sound - use of whiteboard - eraser rubbing against the board
Interactive surfaces - Wellner’s DigitalDesk
• Desk with top-projected video and camera tracking from above
• Digitized tablet
• Interact with real paper
– digitize with camera
– recognize action and gestures
– overlay with video
First larger Tangible Bits project, 1997
Two campus buildings
Anchored on a map; the map can be moved and distorted
Uses VR tracking techniques - lenses that make things visible
(instruments, unfortunately wired)
Insight:
Still too strongly oriented toward GUI metaphors, poorly transferable
Tangible bits - metaDESK
DataTiles (Rekimoto et al. 2001)
RFID-tagged transparent tiles working like a window onto digital information and triggering actions
Interaction ideas:
• Tagged transparent objects as interaction modules
• Mixed physical and graphical interaction modules
- fuse printed images with dynamically displayed graphics
- digitizer pen (mouse and widget control)
- engraved grooves
• Combination of multiple tiles creates complex interaction possibilities
AlgoBlocks (Suzuki & Kato 1995, NEC)
A programming language for children to steer a submarine on the monitor (Logo-like). LEDs light up on the blocks that are currently active.
Programming as a cooperative activity, visible to everyone. Close fine-grained coordination, high concentration.
Triangles (Gorbet & Orth 1997, MIT)
New metaphors and forms of interaction
A plug-together system of triangles that report their configuration to an application on the computer
Versatile plug connections (conductive Velcro made of metal wool, and magnets)
Non-linear storytelling - interactive exhibition piece (The Digital Veil, Ars Electronica 1997)
One of the first systems for “abstract” content (without a 1:1 mapping)
Application domain not spatial in nature
Handling of abstract information
Ambient media
• Make invisible processes in the virtual world visible in the real world
• Visible means noticeable in an ambient way
– No exact statement about virtual processes but impression about what/how much is going on
– Ambient information is taken up in subtle ways, e.g. at a glance
– Observer’s attention can change between foreground activity and information in the background/at the periphery
• Ambient - through (changes of) light, sound, smell, movement ...
Ambient displays
Information moved to the environment, manifesting itself as subtle changes in form, movement, sound, colour, smell, temperature, light
To keep us aware (weather, traffic, etc.)
“We are turning each state of physical matter - not only solid matter, but also liquids and gases - within everyday architectural spaces into ‘interfaces’” (Ishii/Ullmer 1997)
Ambient media - examples
Dangling String, Natalie Jeremijenko (Xerox PARC)
- Motor with a 2.4 m string
- Electrically connected to an Ethernet cable
- Each passing bit causes a tiny twitch
Lets observers discriminate between heavy and light network traffic
BodyScanner
Student project, RCA London, 2002
• Uses people’s perception of their own body for giving abstract medical images an intuitively understandable frame of reference
• Images from ‘Visible Human’ project
• Visitors ignore evident inconsistencies between their own body or posture and the images
Ambient media - examples
Traffic lights (Schmidt et al. TecO)
- Light bulbs in front of posters indicate web access on topics shown on the posters
- Light intensity changes according to hits per time period
Passing observer can discriminate between web accesses on various topics
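The hits-to-intensity mapping might look like the following minimal sketch; the linear mapping and the saturation parameter are assumptions, not details from Schmidt et al.

```python
# Sketch of the poster "traffic lights" mapping: web hits per time
# period are turned into a bulb intensity in [0, 1]. The saturation
# point (hits_full) is an assumed parameter.
def light_intensity(hits, hits_full=100):
    """Linear ambient mapping: more hits -> brighter bulb, capped at 1.0."""
    return min(hits / hits_full, 1.0)
```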
Ambient media - examples
Informative Art (Holmquist et al. PLAY)
Aggregate different information such as weather data from several locations in a changing representation of an art work
Gesture interfaces
• Input by gestures, e.g. hand signs
• Convenient
• Natural
• Unobtrusive
• Personal
• Usable in difficult situations, e.g. mobile scenarios
Gesture interfaces in interactive art work I
E-Spark, Plancton. The creatures, modelled on a real existing species of plankton, appear as three-dimensional shapes pulsating and fluctuating with movements similar to the real plankton, animated by social behaviour visually similar to swarm formations. The installation creates an environment in which apparently living artificial beings develop behaviour, absorb language from the people visiting the installation, and try to enter into dialogue with them. Three-dimensional "digital creatures" move inside a "real-virtual hybrid world" projected on a wall screen.
In front of the screen, visitors can interact through gestures and voice. The creatures are equipped with "sensors" that enable them to see and listen to the visitors, and with an "artificial brain" that allows them to process visual and sound information, to control their movements, to learn words from the visitors and to produce vocalizations. The "eye" is attached to a video camera that captures the shadows of the visitors projected on the screen. The "ear" is attached to a microphone into which the visitors can speak. The brain of the creature is built using associative memory and neural networks, the basis of its learning capabilities.
Gesture interfaces in interactive art work II
Pegaso3, Gruppo Làbun
Based on the sculpture Pegaso by Paolo Minoli, the installation makes use of sound and colored lights in order to define an artificial environment that can be modified by visitors using gestures.
People can navigate the boundaries between space, light and sound simply by interacting with three theremins (i.e. antennas used as musical instruments, invented at the beginning of the 20th century).
The environment defined by the users' interaction surrounds both the audience and the sculpture, in coherent reference to Minoli's idea about the interaction of sculptural works with the landscape, expanding the potential of this relationship. In collaboration with Conservatorio di Musica di Como and CASAPERLARTE - fondazione Paolo Minoli.
Definition of gestures I
Gesture dichotomies
• Act/symbol
- Action gesture - e.g. a person chopping wood
- Symbolic gesture - e.g. the OK sign
• Opacity/transparency
- Refers to the ease with which others can interpret gestures
- Gesture meanings are culturally dependent
• Centrifugal/centripetal
- Centrifugal gestures are directed towards an object, centripetal gestures are not
• Autonomous-semiotic/multi-semiotic
- Autonomous semiotic systems are those used in a gesture language, e.g. ASL
- Others are created as part of a multi-semiotic activity - gestures which accompany other languages
Definition of gestures II
Gesture typologies
• Arbitrary
- Gestures whose interpretation must be learned due to their opacity
- Useful because they can be specifically created for use in device control
• Mimetic
- Form an object’s main shape or representative feature
- Are intended to be transparent
• Deictic
- Used to point at objects - each gesture transparent in its given context
- These gestures can be specific (pointing to one object), general (referring to a class of objects) or functional (representing intentions, e.g. pointing to a chair to ask for permission to sit)
Handwriting as gesture
• On-line recognition machines identify handwriting as a user writes
- capture the dynamic information of writing (number of strokes, ordering of strokes, direction and velocity profile of each stroke)
- are interactive, allowing users to correct recognition errors, etc.
- on-line tablets capture writing as a sequence of coordinate points
- pre-process characters (different characters can look quite similar) and perform some sort of shape recognition
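A typical pre-processing step for such coordinate sequences is size and position normalization; this sketch assumes strokes arrive as (x, y) tuples from the tablet and is not taken from any particular recognizer.

```python
# Normalize a stroke into a unit box so that visually similar
# characters written at different sizes and positions become
# comparable before shape recognition.
def normalize_stroke(points):
    """Scale and translate a stroke into the unit square [0,1] x [0,1]."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, min_y = min(xs), min(ys)
    # Uniform scale preserves the stroke's aspect ratio
    scale = max(max(xs) - min_x, max(ys) - min_y) or 1.0
    return [((x - min_x) / scale, (y - min_y) / scale) for x, y in points]
```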
Presentation, recognition, exploitation
Existing experimental systems
• Use their own definition of gesture - the technology used to recognize gestures and the response derived further complicate the issue
• Many use symbolic command gestures for e.g. stop, start, turn ...
• Gestures may also be interpreted as letters of an alphabet or words
Research question: Is it possible to consistently capture information expressed in a gesture in a usable form?
• What is the gesture lexicon?
• How are gestures generated?
• How does the system recognize gestures?
And for using gestures as device control -
• What devices are the gestures intended to control?
• What system control commands are represented by gestures?
• How are the gestures transformed into commands?
Issues
•How consistent are people in their use of gestures?
• What are the most common gestures used in a given domain and how easily can they be recalled?
•Do gestures contain identifiable spatial components which correspond to the functional components of command (action to be performed), scope (object to which the command is applied) and target (the location where the object is moved, inserted, copied)?
Empirical studies (Wolf 1987, Hauptmann 1989)
•People consistently use the same gestures for the same commands
•People are good at learning new arbitrary gestures
• People use very similar gestures for the same operations
• In all studies humans (and not machines) interpreted the gestures
Gesture recognition
• Which features to use - hand configuration, palm orientation, movement direction
• Gesturing - preparatory, actual, retraction phase
• Humans cannot perform e.g. perfect circles - they hesitate, start from different initial positions
• Slow and fast gestures
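One way to turn such observations into features is sketched below; the accelerometer input format, the chosen features and the slow/fast threshold are all assumptions for illustration, not from any cited system.

```python
# Illustrative feature extraction for gesture recognition, assuming a
# gesture is a list of (x, y, z) accelerometer samples.
import math

def gesture_features(samples, fast_threshold=1.5):
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean_mag = sum(mags) / len(mags)
    # Frame-to-frame change as a rough speed/energy measure,
    # distinguishing slow from fast gestures
    speed = sum(abs(b - a) for a, b in zip(mags, mags[1:]))
    return {
        "mean_magnitude": mean_mag,
        "speed": speed,
        "tempo": "fast" if speed > fast_threshold else "slow",
    }
```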
Gesture interfaces - examples
Gesture Wrist (J. Rekimoto, Sony CSL)
- Worn like a wrist watch
- Measures changes in wrist shape via capacitive sensing
- Movements are measured by accelerometers
- Used as a command and input device
- The user gets tactile feedback when a gesture has been recognized
Gesture interfaces - examples
Gesture Pad (J. Rekimoto, Sony CSL)
- Capacitive sensor pad embedded in clothes
- Clothes as interactive surface
- Detects and reads finger motions applied onto clothes
Combined with Gesture Wrist:
- Select and adjust
- Transmit data from wrist to pad via body
Interactive design - examples
Pillow mate 3000 (Dahlberg et al.)
- Interactive pillow as a replacement for a cat, for lonesome people or people under stress
- Pillow heats up
- Touching the corner results in sound and vibration like a purring cat
Interactive design - examples
VooDoo (Anderson et al.)
VooDoo doll as input device
- Needles with symbols have particular meanings
- Symbols are identified via RFID transponder
- Needles work as penetration sensors
Application: trigger certain messages as SMS, Email or post them on public notice boards
Interactive design - examples
Grynet (Antonsson et al.)
- Silent wake-up
- Dawn simulation by building a lamp connected to an alarm clock
- After the wake-up time, if the user has not left the bed, the lamp begins to blink