AWE 2014 - The Glass Class: Designing Wearable Interfaces


Tutorial taught at the AWE 2014 conference by Mark Billinghurst and Rob Lindeman on May 27th, 2014. It provides an overview of how to design interfaces for wearable computers such as Google Glass.


The Glass Class: Designing Wearable Interfaces

May 27th, AWE 2014

Mark Billinghurst
HIT Lab NZ, University of Canterbury
mark.billinghurst@canterbury.ac.nz

Rob Lindeman
HIVE Lab, Worcester Polytechnic Institute
gogo@wpi.edu

1: Introduction

Mark Billinghurst
▪  Director of HIT Lab NZ, University of Canterbury

▪  PhD Univ. Washington

▪  Research on AR, mobile HCI, Collaborative Interfaces

▪  More than 250 papers in AR, VR, interface design

▪  Sabbatical in Glass team at Google [x] in 2013

Rob Lindeman
▪  Director of HIVE Lab, Worcester Polytechnic Institute

▪  PhD The George Washington Univ.

▪  Research on 3DUI, VR, Gaming, HRI since 1993

▪  Been wearing Glass non-stop (mostly, anyway) since Sept. 2013

▪  Sabbatical at HIT Lab NZ in 2011-12

▪  Program Co-Chair, ISMAR 2014

▪  Love geocaching, soccer, skiing

How do you Design for this?

Course Goals
In this course you will learn:

▪  Introduction to head mounted wearable computers

▪  Understanding of current wearable technology

▪  Key design principles/interface metaphors

▪  Relevant human cognition/perception principles

▪  Rapid prototyping tools

▪  Overview of native coding/application development

▪  Areas for future research

▪  Hands on experience with the technology

What You Won’t Learn
▪  Who the companies/universities in this space are

▪  See the AWE exhibit floor

▪  Designing for non-HMD based interfaces

▪  Watches, fitness bands, etc

▪  How to develop wearable hardware

▪  optics, sensor assembly, etc

▪  Evaluation methods

▪  Experimental design, statistics, etc

Schedule
1:30 1. Introduction (Mark + Rob)

1:35 2. History and Technology (Mark)

1:55 3. User Experience and Design Principles (Mark)

2:15 4. Prototyping Tools (Mark)

2:50 Break/Demo

3:15 5. Native Programming (Rob)

4:00 6. Application Case studies (Det)

4:30 7. Technical Q & A (Everyone)

5:00 8. Research Directions (Mark + Rob)

5:30 Finish

Display Demos You Can Try
▪  Google Glass: Glass UI, AR demos, games, multimedia capture
▪  Vuzix M-100: monocular display
▪  Epson BT-100, Epson BT-200: see-through displays
▪  AR Rift: Oculus Rift for AR
▪  Recon Snow: micro-display integrated into ski goggles

More at the AWE Exhibits

2: History and Technology

A Brief History of Time

▪  Trend: smaller, cheaper, more functions, more intimate
▪  Time pieces moved from public space onto the body
(Images: 13th, 18th, and 20th century timepieces)

A Brief History of Computing

Trend
▪  Smaller, cheaper, faster, more intimate
▪  Moving from fixed to handheld and onto the body
(Images: computers from the 1950s to the 1990s; Room → Desk → Lap → Hand → Head)

What is a Wearable Computer?
▪  A computer that is:
▪  Portable while operational
▪  Enables hands-free/hands-limited use
▪  Able to get the user’s attention
▪  Always on, acting on behalf of the user
▪  Able to sense the user’s current context

Rhodes, B. J. (1997). The wearable remembrance agent: A system for augmented memory. Personal Technologies, 1(4), 218-224.

In Other Words...
▪  A computer that is:
▪  Eudaemonic: user considers it part of themselves
▪  Existential: user has complete control of the system
▪  Ephemeral: system always operating at some level

Mann, S. (1997). Wearable computing: A first step toward personal imaging. Computer, 30(2), 25-32.

Wearable Computing
▪  Computer on the body that is:
▪  Always on
▪  Always accessible
▪  Always connected

▪  Other attributes
▪  Augmenting user actions
▪  Aware of user and surroundings

The Ideal Wearable
▪  Persists and Provides Constant Access: Designed for everyday and continuous use over a lifetime.
▪  Senses and Models Context: Observes and models the user’s environment, mental state, and its own state.
▪  Augments and Mediates: Information support for the user in both the physical and virtual realities.
▪  Interacts Seamlessly: Adapts its input and output modalities to those most appropriate at the time.

Starner, T. E. (1999). Wearable computing and contextual awareness (Doctoral dissertation, Massachusetts Institute of Technology).

Wearable Attributes


Augmented Interaction

Rekimoto, J., & Nagao, K. (1995, December). The world through the computer: Computer augmented interaction with real world environments. In Proceedings of the 8th annual ACM symposium on User interface and software technology (pp. 29-36).

● Mixed Reality Continuum

Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS on Information and Systems, 77(12), 1321-1329.

● Mediated Reality


Mann, S. (1994). Mediated reality.

● Augmediated Reality (Mann)


History of Wearables
▪  1960-90: Early Exploration
▪  Custom-built devices

▪  1990 - 2000: Academic, Military Research ▪  MIT, CMU, Georgia Tech, EPFL, etc ▪  1997: ISWC conference starts

▪  1995 – 2005+: First Commercial Uses ▪  Niche industry applications, Military

▪  2010 - : Second Wave of Wearables ▪  Consumer applications, Head Worn

Thorp and Shannon (1961)

▪  Wearable timing device for roulette prediction ▪  Audio feedback, four button input

Ed Thorp

Thorp, E. O. (1998, October). The invention of the first wearable computer. In Second International Symposium on Wearable Computers (pp. 4-8). IEEE.

Keith Taft (1972)

▪  Wearable computer for blackjack card counting ▪  Toe input, LED in Glasses for feedback

Belt computer Shoe Input Glasses Display

Steve Mann (1980s - )

http://wearcomp.org/

MIT Wearable Computing (1993-)

http://www.media.mit.edu/wearables/

Enabling Technologies (1989+)
▪  Private Eye display (Reflection Technologies)
▪  720 x 280 display
▪  Red LED
▪  Vibrating mirror

▪  Twiddler (Handykey) ▪  Chording keypad ▪  Mouse emulation

MIT Tin Lizzy (1993)
▪  General purpose wearable
▪  Doug Platt, Thad Starner
▪  150 MHz Pentium CPU
▪  32-64 MB RAM
▪  6 GB hard disk
▪  VGA display
▪  2 PCMCIA slots
▪  Cellular modem

http://www.media.mit.edu/wearables/lizzy/lizzy/index.html

Thad Starner 1998

Early Wearable Computing

Early Technology ▪  Computing ▪  Belt or Backpack

▪  Displays ▪  Head Mounted, LCD Panel, Audio

▪  Input Devices ▪  Chording Keyboard, Speech, Camera

▪  Networking ▪  Wireless LAN, Infra-Red, Cellular

US Military Wearables (1989- ) ▪  Early experimentation ▪  386 computer, VGA display ▪  GPS, mapping software

▪  Land Warrior (1991-) ▪  Integrated wearable system ▪  Camera, colour display, radio ▪  Navigation, reports, photos

Zieniewicz, M. J., Johnson, D. C., Wong, C., & Flatt, J. D. (2002). The evolution of army wearable computers. IEEE Pervasive Computing, 1(4), 30-40.

CMU Wearables (1991–2000) ▪  Industry focused wearables ▪  Maintenance, repair

▪  Custom designed interface ▪  Dial/button input

▪  Rapid prototyping approach ▪  Industrial design, ergonomic

http://www.cs.cmu.edu/afs/cs/project/vuman/www/frontpage.html

Early Commercial Systems ▪  Xybernaut (1996 - 2007) ▪  Belt worn, HMD, 200 MHz

▪  ViA (1996 – 2001) ▪  Belt worn, Audio Interface ▪  700 MHz Crusoe

■  Symbol (1998 – 2006) ■  Wrist worn computer ■  Finger scanner

Prototype Applications ▪  Remembrance Agent ▪  Rhodes (97)

▪  Augmented Reality ▪  Feiner (97), Thomas (98)

▪  Remote Collaboration ▪  Garner (97), Kraut (96)

■  Maintenance ■  Feiner (93), Caudell (92)

▪  Factory Work ▪  Thompson (97)

Mobile AR: Touring Machine (1997)
▪  Columbia University
▪  Feiner, MacIntyre, Höllerer, Webster

▪  Combines ▪  See through head mounted display ▪  GPS tracking ▪  Orientation sensor ▪  Backpack PC (custom) ▪  Tablet input

Feiner, S., MacIntyre, B., Höllerer, T., & Webster, A. (1997). A touring machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment. Personal Technologies, 1(4), 208-217.

MARS View ▪  Virtual tags overlaid on the real world ▪  “Information in place”

Backpack/Wearable Systems
1997 Backpack Wearables

▪  Feiner’s Touring Machine ▪  AR Quake (Thomas) ▪  Tinmith (Piekarski) ▪  MCAR (Reitmayr) ▪  Bulky, HMD based

Piekarski, W., & Thomas, B. (2002). ARQuake: the outdoor augmented reality gaming system. Communications of the ACM, 45(1), 36-38.

Example self-built working solution with PCI-based 3D graphics
(Diagram labels: PCI 3D graphics board, hard drive, serial ports, CPU, PC104 sound card, PC104 PCMCIA, GPS antenna, RTK correction antenna, HMD controller, tracker controller, DC-to-DC converter, battery, wearable computer, GPS RTK correction radio)

Columbia Touring Machine

Mobile AR - Hardware

HIT Lab NZ Wearable AR (2004)

▪  Highly accurate outdoor AR tracking system ▪  GPS, Inertial, RTK system ▪  HMD

▪  First prototype ▪  Laptop based ▪  Video see-through HMD ▪  2-3 cm tracking accuracy

2008: Location Aware Phones

Nokia Navigator, Motorola Droid

2009 - Layar (www.layar.com)
•  Location based data
–  GPS + compass location
–  Map + camera view
•  AR layers on real world
–  Customized data
–  Audio, 3D, 2D content
•  Easy authoring
•  Android, iPhone

Wearable Evolution
▪  1997: Backpack + HMD, 10+ kg
▪  2003: Handheld + HMD, separate sensors, UMPC 1.1 GHz, 1.5 kg, still >$5K
▪  2007: Scale it down more: smartphone, $500, integrated, 0.1 kg, billions of units

Google Glass (2011 - )

▪  Hardware
▪  CPU: TI OMAP 4430, 1 GHz
▪  16 GB SanDisk flash, 1 GB RAM
▪  570 mAh battery

▪  Input
▪  5 MP camera, 720p recording, microphone
▪  GPS, InvenSense MPU-9150 inertial sensor

▪  Output
▪  Bone conducting speaker
▪  640x360 micro-projector display

Google Glass Specs

Other Wearables
▪  Vuzix M-100
▪  $999, professional
▪  Recon Jet
▪  $600, more sensors, sports
▪  Optinvent
▪  500 Euro, multi-view mode
▪  Motorola Golden-i
▪  Rugged, remote assistance

Ex: Recon Instruments Snow

Ski display/computer ▪  Location, speed, altitude, phone headset

http://www.reconinstruments.com/

Projected Market

● Wearables Market Size


● Samsung Galaxy Gear

▪ Watch based wearable

● Nike Fuelband

▪ Activity/sleep tracking

Summary
▪  Wearables are a new class of computing
▪  Intimate, persistent, aware, accessible, connected
▪  Evolution over a 50 year history
▪  Backpack to head worn
▪  Custom developed to consumer ready devices
▪  Enables new applications
▪  Collaboration, memory, AR, industry, etc.
▪  Many head worn wearables are coming
▪  Android based, sensor packages, micro-displays

Wearable Technologies

Wearable System

Display Technologies

Key Properties of HMDs
▪  Field of View
▪  Human eye: 95 deg. horizontal, 60/70 deg. vertical
▪  Resolution
▪  > 320x240 pixels
▪  Refresh Rate
▪  Focus
▪  Fixed/manual
▪  Size, Weight
▪  < 350 g for long term use
▪  Power
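One way to compare such displays is angular resolution, i.e. pixels per degree of field of view. The sketch below computes this for two of the displays covered later in the tutorial; treat the numbers as rough figures derived from the quoted specs, not vendor measurements.

```java
// Angular resolution (pixels per degree) for some HMDs discussed in this
// tutorial. A rough comparison sketch; figures come from the quoted specs.
public class HmdResolution {

    // Horizontal pixel count divided by horizontal field of view in degrees.
    static double pixelsPerDegree(int hPixels, double hFovDegrees) {
        return hPixels / hFovDegrees;
    }

    public static void main(String[] args) {
        // Vuzix M-100: 852 px across a ~15 degree FOV
        System.out.printf("Vuzix M-100: %.1f px/deg%n", pixelsPerDegree(852, 15));
        // Epson Moverio BT-200: 960 px across a ~23 degree FOV
        System.out.printf("Moverio BT-200: %.1f px/deg%n", pixelsPerDegree(960, 23));
    }
}
```

By this measure the narrow-FOV M-100 packs more pixels per degree than the wider Moverio, which is the usual trade-off between field of view and sharpness.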

Types of Head Mounted Displays

Occluded See-thru

Multiplexed

Optical see-through HMD
(Diagram: virtual images from monitors are combined with the real world view through optical combiners)

Optical See-Through HMD

Epson Moverio BT-200

▪  Stereo see-through display ($700) ▪  960 x 540 pixels, 23 degree FOV, 60Hz, 88g ▪  Android Powered, separate controller ▪  VGA camera, GPS, gyro, accelerometer

Strengths of optical see-through ▪  Simpler (cheaper) ▪  Direct view of real world ▪  Full resolution, no time delay (for real world) ▪  Safety ▪  Lower distortion

▪  No eye displacement ▪  see directly through display

Video see-through HMD
(Diagram: video cameras capture the real world; graphics and video are merged in a combiner and shown on monitors)

Video See-Through HMD

Vuzix Wrap 1200DXAR

▪  Stereo video see-through display ($1500) ■ Twin 852 x 480 LCD displays, 35 deg. FOV ■ Stereo VGA cameras ■ 3 DOF head tracking

Strengths of Video See-Through ▪  True occlusion ▪  Block image of real world

▪  Digitized image of real world ▪  Flexibility in composition ▪  Matchable time delays ▪  More registration, calibration strategies

▪  Wide FOV is easier to support ▪  wide FOV camera

Multiplexed Displays ▪  Above or below line of sight ▪  Strengths ▪  User has unobstructed view of real world ▪  Simple optics/cheap

▪  Weaknesses ▪  Direct information overlay difficult

•  Display/camera offset from eyeline ▪  Wide FOV difficult

Vuzix M-100

▪  Monocular multiplexed display ($1000) ■ 852 x 480 LCD display, 15 deg. FOV ■ 5 MP camera, HD video ■ GPS, gyro, accelerometer

Display Types

▪  Curved Mirror ▪  off-axis projection ▪  curved mirrors in front of eye ▪  high distortion, small eye-box

▪  Waveguide ▪  use internal reflection ▪  unobstructed view of world ▪  large eye-box

See-through thin displays

▪  Waveguide techniques for thin see-through displays ▪  Wider FOV, enable AR applications ▪  Social acceptability

Optinvent Ora

Waveguide Methods

See: http://optinvent.com/HUD-HMD-benchmark#benchmarkTable

▪  Holographic: hologram diffracts light; limited FOV; colour bleeding
▪  Diffractive: slanted gratings, total internal reflection; costly, small FOV

Waveguide Methods

See: http://optinvent.com/HUD-HMD-benchmark#benchmarkTable

▪  Clear-Vu Reflective: several reflective elements; thinner light guide; large FOV and eye-box
▪  Reflective: simple reflective elements; lower cost; size is a function of FOV

Input Technologies

Input Options ▪  Physical Devices ▪  Keyboard ▪  Pointer ▪  Stylus

▪  Natural Input ▪  Speech ▪  Gesture

▪  Other ▪  Physiological

Twiddler Input

▪  Chording or multi-tap input ▪  Possible to achieve 40 - 60 wpm after 30+ hours ▪  Chording input about 50% faster than multi-tap ▪  cf 20 wpm on T9, or 60+ wpm for QWERTY

Lyons, K., Starner, T., Plaisted, D., Fusia, J., Lyons, A., Drew, A., & Looney, E. W. (2004, April). Twiddler typing: One-handed chording text entry for mobile phones. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 671-678). ACM.
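The chording idea above, where several keys pressed together map to a single character, can be modeled in a few lines of code. This is an illustrative sketch only: the key constants and the tiny chord table are our own and do not reproduce the actual Twiddler layout.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative chording sketch (NOT the real Twiddler layout): each finger
// key contributes one bit, and a combination of simultaneously pressed
// keys (a chord) decodes to a single character.
public class ChordKeyboard {
    static final int KEY_INDEX  = 1; // index finger
    static final int KEY_MIDDLE = 2; // middle finger
    static final int KEY_RING   = 4; // ring finger

    private final Map<Integer, Character> chords = new HashMap<>();

    public ChordKeyboard() {
        chords.put(KEY_INDEX, 'a');              // single key
        chords.put(KEY_MIDDLE, 'e');
        chords.put(KEY_INDEX | KEY_MIDDLE, 't'); // two-key chord
        chords.put(KEY_INDEX | KEY_RING, 'n');
    }

    // Returns the character for a chord, or (char) 0 if the chord is unmapped.
    public char decode(int chord) {
        return chords.getOrDefault(chord, (char) 0);
    }
}
```

Because a chord is entered in one hand motion rather than several taps, a practiced user can beat multi-tap speeds, which is the effect the Twiddler study above measured.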

Virtual Keyboards

▪  In-air text input
▪  Virtual QWERTY keyboard: up to 20 wpm
-  On a real keyboard: around 45-60+ wpm
▪  Word-gesture: up to 28 wpm
-  On tablet/phone, word-gesture: up to 47 wpm
▪  Handwriting: around 20-30 wpm

A. Markussen et al., Vulture: A Mid-Air Word-Gesture Keyboard (CHI 2014)

Unobtrusive Input Devices

▪  GestureWrist ▪  Capacitive sensing ▪  Change signal depending on hand shape

Rekimoto, J. (2001). Gesturewrist and gesturepad: Unobtrusive wearable interaction devices. In Wearable Computers, 2001. Proceedings. Fifth International Symposium on (pp. 21-27). IEEE.

Unobtrusive Input Devices

▪  GesturePad ▪  Capacitive multilayered touchpads ▪  Supports interactive clothing

Skinput

Using EMG to detect muscle activity

Tan, D., Morris, D., & Saponas, T. S. (2010). Interfaces on the go. XRDS: Crossroads, The ACM Magazine for Students, 16(4), 30-34.

Issues to Consider
▪  Fatigue
▪  “Gorilla arm” from free-hand input

▪  Comfort
▪  People prefer small gestures at waist level

▪  Interaction on the go ▪  Can input be done while moving?

Interaction on the Go

▪  Fitts’ law still applies while interacting on the go
▪  E.g. tapping while walking reduces speed by > 35%
▪  Increased errors while walking

Lin, M., Goldman, R., Price, K. J., Sears, A., & Jacko, J. (2007). How do people tap when walking? An empirical investigation of nomadic data entry. International Journal of Human-Computer Studies, 65(9), 759-769.

3: User Experience and Design Principles

●  Early prototyping

● View Through Google Glass

Always available peripheral information display Combining computing, communications and content capture

Google Glass User Interface


Google Glass Demo

Timeline Metaphor

User Experience

• Truly Wearable Computing – Less than 46 ounces

• Hands-free Information Access – Voice interaction, Ego-vision camera

• Intuitive User Interface – Touch, Gesture, Speech, Head Motion

• Access to all Google Services – Map, Search, Location, Messaging, Email, etc

Living Heads Up vs. Heads Down

Sample Applications

• https://glass.google.com/glassware

Glassware Applications

Virtual Exercise Companion

• GlassFitGames – http://www.glassfitgames.com

Vipaar Telemedicine

• Vipaar + UAB - http://www.vipaar.com • Endoscopic view streamed remotely • Remote expert adds hands – viewed in Glass

● CityViewAR

▪ Using AR to visualize Christchurch city buildings ▪ 3D models of buildings, 2D images, text, panoramas ▪ AR View, Map view, List view ▪ Available on Android/iOS market

● CityViewAR on Glass

▪ AR overlay of virtual buildings in Christchurch

Design Principles

The Now Machine
Focus on location, contextual and timely information, and communication.
(Timeline: last year ... last week ... now ... forever)

It's like a rear view mirror.

Don't overload the user. Stick to the absolutely essential, avoid long interactions. Be explicit.

Micro-Interactions

▪  On mobiles people split attention between display and real world

● Time Looking at Screen

Oulasvirta, A. (2005). The fragmentation of attention in mobile interaction, and what to do with it. interactions, 12(6), 16-18.

● Dividing Attention to World

▪ Number of times looking away from mobile screen

Design for MicroInteractions ▪  Design interaction less than a few seconds ▪  Tiny bursts of interaction ▪  One task per interaction ▪  One input per interaction

▪  Benefits ▪  Use limited input ▪  Minimize interruptions ▪  Reduce attention fragmentation

● Design for Cognitive Load

▪  Cognitive continuums: (a) input, (b) output
▪  Cognitive load increases from left to right

Design for Interruptions

▪  Gradually increase engagement and attention load ▪  Respond to user engagement

Receiving SMS on Glass
▪  Glass: plays a “Bing” notification sound
▪  User: looks up at the display
▪  Glass: shows the message
▪  User: taps to start a reply
▪  User: says the reply
(Inputs: tap, swipe)
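The pattern behind this flow, where each step asks a little more of the user's attention and the interaction backs off if the user does not respond, can be sketched as a small state machine. All state and method names here are our own illustration; none of this comes from the Glass SDK.

```java
// Minimal sketch of "gradually increase engagement": every notification
// step costs a little more attention, and the flow only advances when the
// user opts in. Names are illustrative, not from any Glass API.
public class InterruptionFlow {
    enum State { IDLE, CHIMED, MESSAGE_SHOWN, REPLYING }

    private State state = State.IDLE;

    public State state() { return state; }

    public void onIncomingMessage() {   // cheapest cue: audio only
        if (state == State.IDLE) state = State.CHIMED;
    }

    public void onLookUp() {            // user opts in: show the content
        if (state == State.CHIMED) state = State.MESSAGE_SHOWN;
    }

    public void onTap() {               // deeper engagement: start a reply
        if (state == State.MESSAGE_SHOWN) state = State.REPLYING;
    }

    public void onIgnore() {            // no response: back off completely
        state = State.IDLE;
    }
}
```

The key property is that ignoring the chime costs the user nothing: the flow returns to idle instead of escalating.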

Nomadic Radio (2000)

▪  Spatial audio wearable interface

Sawhney, N., & Schmandt, C. (2000). Nomadic radio: Speech and audio interaction for contextual messaging in nomadic environments. ACM Transactions on Computer-Human Interaction (TOCHI), 7(3), 353-383.

Spatial Audio Metaphor

Messages/Events arranged depending on time of day

Notification Interruptions

▪  Dynamic scaling of incoming messages based on interruptibility of the user
▪  Busy = silence
▪  Available = preview

Layered Audio Notifications

▪  Background ambient audio
▪  Notification scaled depending on priority
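This dynamic-scaling idea can be sketched as a simple policy function: the presentation level grows with message priority and shrinks with how busy the user is. The level names and the scoring rule below are our own illustration, not taken from the Nomadic Radio paper.

```java
// Sketch of notification scaling in the spirit of Nomadic Radio:
// higher-priority messages get richer presentation, busier users get
// quieter ones. Levels and thresholds here are illustrative only.
public class NotificationScaler {
    enum Level { SILENCE, AMBIENT_CUE, SUMMARY, FULL_PREVIEW }

    // priority and busyness are both expected in [0, 1]
    static Level scale(double priority, double busyness) {
        double score = priority - busyness;
        if (score < -0.5) return Level.SILENCE;      // busy user, trivial message
        if (score < 0.0)  return Level.AMBIENT_CUE;  // subtle background hint
        if (score < 0.5)  return Level.SUMMARY;      // short spoken summary
        return Level.FULL_PREVIEW;                   // free user, urgent message
    }
}
```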

● 1. Design For the Device

▪ Simple, relevant information ▪ Complement existing devices

● 2. Don’t Get in the Way

▪ Enhance, not replace, real world interaction

● 3. Keep it Relevant

▪ Information at the right time and place

● 4. Avoid the Unexpected

▪ Don’t send unexpected content at wrong times ▪ Make it clear to users what your application does

● 5. Build for People

▪ Use imagery, voice interaction, natural gestures ▪ Focus on a fire-and-forget interaction model

Other Guidelines ▪  Don’t design a mobile app ▪  Design for emotion ▪  Make it glanceable ▪  Do one thing at a time ▪  Reduce number of information chunks ▪  Design for indoor and outdoor use

●  As technology becomes more personal and immediate, it can start to disappear.

Distant Intimate

4: Prototyping Tools

How can we quickly prototype wearable experiences with little or no coding?

Why Prototype? ▪  Quick visual design ▪  Capture key interactions ▪  Focus on user experience ▪  Communicate design ideas ▪  “Learn by doing/experiencing”

Prototyping Tools ▪  Static/Low fidelity ▪  Sketching ▪  User interface templates ▪  Storyboards/Application flows ▪  Screen sharing

▪  Interactive/High fidelity ▪  Wireframing tools ▪  Mobile prototyping ▪  Native Coding

Important Note
▪  Most current wearables run Android OS
▪  e.g. Glass, Vuzix, Atheer, Epson, etc.

▪  So many tools for prototyping on Android mobile devices will work for wearables

▪  If you want to learn to code, learn ▪  Java, Android, Javascript/PHP

Typical Development Steps ▪  Sketching ▪  Storyboards ▪  UI Mockups ▪  Interaction Flows ▪  Video Prototypes ▪  Interactive Prototypes ▪  Final Native Application

Increased Fidelity & Interactivity

Sketched Interfaces

▪  Sketch + Powerpoint/Photoshop/Illustrator

GlassSim – http://glasssim.com/

▪  Simulate the view through Google Glass ▪  Multiple card templates

GlassSim Card Builder ▪  Use HTML for card details ▪  Multiple templates ▪  Change background ▪  Own image ▪  Camera view

GlassSim Samples

Screen Sharing

▪  Android Design Preview
▪  Tool for sharing screen content onto Glass
▪  https://github.com/romannurik/AndroidDesignPreview/releases

(Diagram: Mac screen mirrored onto the Glass display)

Glass UI Templates

▪  Google Glass Photoshop Templates ▪  http://glass-ui.com/ ▪  http://dsky9.com/glassfaq/the-google-glass-psd-template/

Sample Slides From Templates

Application Storyboard

▪  http://dsky9.com/glassfaq/google-glass-storyboard-template-download/

ToolKit for Designers

▪  Vectorform Google Glass Toolkit for Designers
▪  http://blog.vectorform.com/2013/09/16/google-glass-toolkit-for-designers-2/

▪  Sample cards, app flows, icons, etc

Application Flow

Limitations ▪  Positives ▪  Good for documenting screens ▪  Can show application flow

▪  Negatives ▪  No interactivity/transitions ▪  Can’t be used for testing ▪  Can’t deploy on wearable ▪  Can be time consuming to create

Transitions

▪ Series of still photos in a movie format
▪ Demonstrates the experience of the product
▪ Discovers where the concept needs fleshing out
▪ Communicates experience and interface
▪ Use whatever tools you like, from Flash to iMovie

Video Sketching

See https://vine.co/v/bgIaLHIpFTB

Example: Video Sketch of Vine UI

UI Concept Movies

Interactive Wireframes

Interactive Wireframing ▪  Developing interactive interfaces/wireframes ▪  Transitions, user feedback, interface design

▪  Web based tools ▪  UXpin - http://www.uxpin.com/ ▪  proto.io - http://www.proto.io/

▪  Native tools ▪  Justinmind - http://www.justinmind.com/ ▪  Axure - http://www.axure.com/

UXpin - www.uxpin.com

▪  Web based wireframing tool ▪  Mobile/Desktop applications ▪  Glass templates, run in browser

https://www.youtube.com/watch?v=0XtS5YP8HcM

Proto.io - http://www.proto.io/ ▪  Web based mobile prototyping tool ▪  Features ▪  Prototype for multiple devices ▪  Gesture input, touch events, animations ▪  Share with collaborators ▪  Test on device

Proto.io - Interface

Demo: Building a Simple Flow

Gesture Flow
(Diagram: screens Scr1-Scr6 linked by Tap and Swipe transitions from a Start screen)

Demo

Justinmind - http://www.justinmind.com/

▪  Native wireframing tool ▪  Build mobile apps without programming ▪  drag and drop, interface templates ▪  web based simulation ▪  test on mobile devices ▪  collaborative project sharing

▪  Templates for Glass, custom templates

User Interface - Glass Templates

Web Simulation Tool

Wireframe Limitations ▪  Can’t deploy on Glass ▪  No access to sensor data ▪  Camera, orientation sensor

▪  No multimedia playback ▪  Audio, video

▪  Simple transitions ▪  No conditional logic

▪  No networking

Processing for Wearables

Processing ▪  Programming tool for Artists/Designers ▪  http://processing.org ▪  Easy to code, Free, Open source, Java based ▪  2D, 3D, audio/video support

▪  Processing For Android ▪  http://wiki.processing.org/w/Android ▪  Strong Android support ▪  Generates Android .apk file

Processing - Motivation ▪  Language of Interaction ▪  Sketching with code ▪  Support for rich interaction

▪  Large developer community ▪  Active help forums ▪  Dozens of plug-in libraries

▪  Strong Android support ▪  Easy to run on wearables

http://processing.org/

http://openprocessing.org/

Development Environment

Basic Parts of a Processing Sketch

/* Notes comment */

//set up global variables
float moveX = 50;

//Initialize the sketch
void setup() {
}

//draw every frame
void draw() {
}

Importing Libraries
▪  Can add functionality by importing libraries
▪  Java archives - .jar files
▪  Include import code:

import processing.opengl.*;

▪  Popular Libraries
▪  Minim - audio library
▪  OCD - 3D camera views
▪  Physics - physics engine
▪  bluetoothDesktop - bluetooth networking

http://toxiclibs.org/

Processing and Glass
▪  One of the easiest ways to build rich interactive wearable applications
▪  Focus on interactivity, not coding

▪  Collects all sensor input
▪  Camera, accelerometer, touch

▪  Can build native Android .apk files
▪  Side load onto Glass

Example: Hello World

//called initially at the start of the Processing sketch
void setup() {
  size(640, 360);
  background(0);
}

//called every frame to draw output
void draw() {
  background(0);
  //draw a white text string showing Hello World
  fill(255);
  text("Hello World", 50, 50);
}

Demo

Hello World Image

PImage img; // Create an image variable

void setup() {
  size(640, 360);
  //load the ok glass home screen image
  img = loadImage("okGlass.jpg"); // Load the image into the program
}

void draw() {
  // Display the image at its actual size at point (0,0)
  image(img, 0, 0);
}

Demo

Touch Pad Input
▪  Tap recognized as DPAD input

void keyPressed() {
  if (key == CODED) {
    if (keyCode == DPAD) {
      // Do something ..
    }
  }
}

▪  Java code to capture rich motion events:

import android.view.MotionEvent;

Motion Event

//Glass touch events - reads from the touch pad
public boolean dispatchGenericMotionEvent(MotionEvent event) {
  float x = event.getX(); // get x/y coords
  float y = event.getY();
  int action = event.getActionMasked(); // get code for action
  switch (action) { // let us know which action code shows up
    case MotionEvent.ACTION_DOWN:
      touchEvent = "DOWN";
      fingerTouch = 1;
      break;
    case MotionEvent.ACTION_MOVE:
      touchEvent = "MOVE";
      xpos = myScreenWidth - x * touchPadScaleX;
      ypos = y * touchPadScaleY;
      break;
  }
  return true;
}

Demo

Sensors
▪  Ketai library for Processing
▪  https://code.google.com/p/ketai/

▪  Supports all phone sensors
▪  GPS, compass, light, camera, etc.

▪  Include Ketai Library ▪  import ketai.sensors.*; ▪  KetaiSensor sensor;

Using Sensors
▪  Setup in the setup() function:

sensor = new KetaiSensor(this);
sensor.start();
sensor.list();

▪  Event based sensor reading:

void onAccelerometerEvent(...) {
  accelerometer.set(x, y, z);
}

Sensor Demo

Using the Camera
▪  Import the camera library:

import ketai.camera.*;
KetaiCamera cam;

▪  Setup in the setup() function:

cam = new KetaiCamera(this, 640, 480, 15);

▪  Draw the camera image:

void draw() {
  //draw the camera image
  image(cam, width/2, height/2);
}

Camera Demo

Timeline Demo
▪  Create a Card class
▪  Load image, card number, children/parent cards

▪  Timeline demo
▪  Load cards in order
▪  Translate cards with finger motion
▪  Swipe cards in both directions
▪  Snap cards into position
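In plain Java, the card model behind this demo might look like the sketch below. The Card class and method names are our own, image loading and the Processing drawing code are omitted, and "snapping" is reduced to clamping the selection at the ends of the list.

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java sketch of the timeline demo's card model (our own class, not
// from any SDK): cards are kept in order and a swipe moves the selection,
// snapping to the nearest valid card at either end of the list.
public class Timeline {
    static class Card {
        final String name; // in the real demo this would hold an image
        Card(String name) { this.name = name; }
    }

    private final List<Card> cards = new ArrayList<>();
    private int current = 0;

    public void add(String name) { cards.add(new Card(name)); }

    // delta = +1 for a swipe forward, -1 for a swipe back; clamps at the ends.
    public void swipe(int delta) {
        current = Math.max(0, Math.min(cards.size() - 1, current + delta));
    }

    public String currentCard() { return cards.get(current).name; }
}
```

In the actual Processing sketch, finger motion on the touch pad would translate the cards continuously and only commit a swipe once the motion passes a threshold.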

Timeline Demo

Hardware Prototyping

Fake Display

3D print Thingiverse model see http://www.thingiverse.com/thing:65706 Have the social impact of Google Glass without the cost

Build Your Own Wearable

▪  MyVu display + phone + sensors

Beady-i

▪  http://www.instructables.com/id/DIY-Google-Glasses-AKA-the-Beady-i/

Raspberry Pi Glasses

▪  Modify video glasses, connect to a Raspberry Pi
▪  $200 - $300 in parts, simple assembly
▪  https://learn.adafruit.com/diy-wearable-pi-near-eye-kopin-video-glasses

Physical Input Devices
▪  Can we develop unobtrusive input devices?
▪  Reduce need for speech, touch pad input
▪  Socially more acceptable

▪  Examples
▪  Ring, pendant, bracelet, gloves, etc.

Prototyping Platform

Arduino Kit Bluetooth Shield Google Glass

Example: Glove Input

▪ Buttons on fingertips ▪ Map touches to commands

Example: Ring Input

▪ Touch strip, button, accelerometer ▪ Tap, swipe, flick actions

How it works
(Diagram: bracelet, armband, and glove prototypes; button inputs 1-4 are mapped to values/output)

Summary
▪  Prototyping for wearables is similar to mobiles
▪  Tools for UI design, storyboarding, wireframing
▪  Android tools to create interactive prototypes
▪  App Inventor, Processing, etc.
▪  Arduino can be used for hardware prototypes
▪  Once prototyped, native apps can be built
▪  Android + SDK for each platform

Other Tools ▪  Wireframing ▪  pidoco ▪  FluidUI

▪  Rapid Development ▪  Phone Gap ▪  AppMachine

▪  Interactive ▪  App Inventor ▪  Unity3D ▪  WearScript

App Inventor - http://appinventor.mit.edu/

▪  Visual Programming for Android Apps ▪  Features ▪  Access to Android Sensors ▪  Multimedia output

▪  Drag and drop web based interface ▪  Designer view – app layout ▪  Blocks view – program logic/control

App Inventor Designer View

App Inventor Blocks View

Orientation Demo

▪  Use wearable orientation sensor

● Unity for Glass Dev

 Unity has built-in low-level support for sensors on Android devices. Third-party plugins like GyroDroid provide high-level access to every sensor:

 rotation vector

 gyroscope

 accelerometer

 linear acceleration

 gravity

 light

 proximity

 orientation

 pressure

 magnetic field

 processor temperature

 ambient temperature

 relative humidity

● Unity for Glass Dev

 Screenshots

● Unity for Glass Dev

 Unity + GDK for Glass Touchpad

 Use the AndroidInput.touchCountSecondary property to get the number of touches on the Glass touchpad.

 Use the AndroidInput.GetSecondaryTouch() static method to get a specific touch on the Glass touchpad.

 Use AndroidInput.GetSecondaryTouch().phase to detect the touch gesture on the Glass touchpad.

● Unity for Glass Dev

 Example

if (AndroidInput.touchCountSecondary == 2) {
  ...... // if there are two touches
}

if (AndroidInput.GetSecondaryTouch(0).phase == TouchPhase.Moved) {
  ...... // if the first touch is moving
}

float pos1X = AndroidInput.GetSecondaryTouch(1).position.x;
// get the second touch position x value

● Android API in Unity for Glass

 Supports Touchpad Input for Google Glass

 API:

// Indicates whether the system provides secondary touch input.
AndroidInput.secondaryTouchEnabled
// Indicates the height of the secondary touchpad.
AndroidInput.secondaryTouchHeight
// Indicates the width of the secondary touchpad.
AndroidInput.secondaryTouchWidth
// Number of secondary touches.
AndroidInput.touchCountSecondary
// Returns an object representing the status of a specific touch on a secondary touchpad.
AndroidInput.GetSecondaryTouch

● Android API in Unity for Glass

 Example:

/* Detect the touch count on the Glass touchpad */
Debug.Log("Touch count: " + AndroidInput.touchCountSecondary);
if (AndroidInput.touchCountSecondary >= 2) {
  ......
}

/* Detect a touch gesture on the Glass touchpad */
if (AndroidInput.GetSecondaryTouch(0).phase == TouchPhase.Moved) {
  ......
}

http://docs.unity3d.com/Documentation/ScriptReference/TouchPhase.html
http://stackoverflow.com/questions/20441090/how-can-create-touch-screen-android-scroll-in-unity3d

● Android API in Unity for Glass

 Detect Google Glass in Unity C# Script

 API: SystemInfo.deviceModel

 Functionality: Provides the model of the device.

● Android API in Unity for Glass

 Example:

/* Show different GUIs for different devices */
Debug.Log("Android model: " + SystemInfo.deviceModel);

if (SystemInfo.deviceModel.Contains("Glass")) {
  Debug.Log("Google Glass detected");
  // Activate GUI for Glass
  ......
} else {
  Debug.Log("Phone or Tablet detected");
  // Activate GUI for Phone or Tablet
  ......
}

WearScript

▪  JavaScript development for Glass ▪  http://www.wearscript.com/en/

▪  Script directory ▪  http://weariverse.com/

● WearScript Features ▪ Community of Developers ▪ Easy development of Glass Applications ▪ GDK card format ▪ Support for all sensor input

▪ Support for advanced features ▪ Augmented Reality ▪ Eye tracking ▪ Arduino input

● WearScript Playground

▪ Test code and run on Glass ▪ https://api.wearscript.com/

5: Native Coding

Overview
▪  For best performance, native coding is needed
▪  Low level algorithms, etc.

▪  Most current wearables are based on Android OS

▪  Need Java/Android skills

▪  Many devices have a custom API/SDK
▪  Vuzix M-100: Vuzix SDK
▪  Glass: Mirror API, Glass Development Kit (GDK)

Mirror API + Glass GDK

Glassware and Timeline

▪  Static Cards
▪  Static content with text, HTML, images, and video
-  e.g. notification messages, news clips

▪  Live Cards
▪  Dynamic content updated frequently
-  e.g. compass, timer

▪  Immersions
▪  Takes over full control, outside the timeline
-  e.g. interactive games

Glassware Development ▪  Mirror API ▪  Server programming, online/web application ▪  Static cards / timeline management

▪  GDK ▪  Android programming, Java (+ C/C++) ▪  Live cards & Immersions

▪  See: https://developers.google.com/glass/

▪  REST API ▪  Java servlet, PHP, Go, Python, Ruby, .NET

▪  Timeline based apps ▪  Static cards

-  Text, HTML, media attachment (image & video) -  Standard and custom menu items

▪  Manage timeline -  Subscribe to timeline notifications -  Sharing with contacts -  Location based services
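Under the hood, inserting a static card through the Mirror API is an authenticated HTTP POST of a JSON timeline item to Google's servers. A minimal sketch of building that request in plain Java (the endpoint and the `text` field follow the public Mirror API docs; `buildTimelineItem` is a hypothetical helper, and a real call additionally needs an OAuth 2.0 bearer token and a proper JSON library):

```java
public class TimelineItemSketch {
    // Mirror API endpoint for timeline items (per the public docs):
    // POST https://www.googleapis.com/mirror/v1/timeline
    static final String TIMELINE_ENDPOINT =
            "https://www.googleapis.com/mirror/v1/timeline";

    // Build the JSON body for a simple static text card.
    // (Hypothetical helper for illustration only.)
    static String buildTimelineItem(String text) {
        return "{\"text\":\"" + text.replace("\"", "\\\"") + "\"}";
    }

    public static void main(String[] args) {
        System.out.println("POST " + TIMELINE_ENDPOINT);
        System.out.println(buildTimelineItem("Hello from the Glass Class"));
    }
}
```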

Mirror API

GDK ▪  Glass Development Kit ▪  Android 4.4.2 KitKat + Glass specific APIs ▪  Use standard Android Development Tools

▪  GDK add-on features ▪  Timeline and cards ▪  Menu and UI ▪  Touch pad and gesture ▪  Media (sound, camera and voice input)

GDK

Glass Summary ▪  Use Mirror API if you need ... ▪  Use GDK if you need ... ▪  Or use both

● An Introduction to Glassware Development

- GDK -

Rob Lindeman gogo@wpi.edu

Human Interaction in Virtual Environments (HIVE) Lab Worcester Polytechnic Institute

Worcester, MA, USA http://www.cs.wpi.edu/~gogo/hive/

* Images in the slides are from variety of sources, including http://developer.android.com and http://developers.google.com/glass

● Thanks to Gun Lee! ▪ Most of this material was developed by Gun Lee at the HIT Lab NZ. ▪ He’s a rock star!

Rob Lindeman ▪  Director of HIVE Lab, Worcester

Polytechnic Institute

▪  PhD The George Washington Univ.

▪  Research on 3DUI, VR, Gaming, HRI since 1993

▪  Been wearing Glass non-stop (mostly, anyway) since Sept. 2013

▪  Sabbatical at HIT Lab NZ in 2011-12

▪  Program Co-Chair, ISMAR 2014

▪  Love geocaching, soccer, skiing

● Glassware Development ▪ Mirror API ▪ Server programming, online/web application ▪ Static cards / timeline management

▪ GDK ▪ Android programming, Java (+ C/C++) ▪ Live cards & Immersions

https://developers.google.com/glass/

● Mirror API ▪ REST API ▪ Java servlet, PHP, Go, Python, Ruby, .NET

▪ Timeline based apps ▪ Static cards

- Text, HTML, media attachment (image & video) - Standard and custom menu items

▪ Manage timeline - Subscribe to timeline notifications - Sharing with contacts - Location based services

● GDK ▪ Glass Development Kit ▪ Android 4.4.2 + Glass specific APIs ▪ Use standard Android Development Tools

● GDK ▪ GDK add-on features ▪ Timeline and cards ▪ Menu and UI ▪ Touch pad and gesture ▪ Media (sound, camera and voice input)

● Development Environment Setup

▪ JDK (1.6 or above, using 1.8 for the tutorial) ▪ http://www.oracle.com/technetwork/java/javase/downloads/index.html

▪ ADT Bundle (Eclipse + Android SDK) ▪ http://developer.android.com/sdk/index.html ▪ With Android SDK Manager (select Window>Android SDK Manager from Eclipse menu) install:

- Tools > Android Build-tools (latest version) - Android 4.4.2 (API 19) SDK Platform, ARM System Image, Google APIs, Glass Development Kit Preview

- Extras > Google USB Driver (only for Windows Platform)

● Create an Android App Project

▪ In Eclipse ▪ File > New > (Other > Android>) Android Application Project

▪ Fill in the Application name, Project name, and Java package namespace to use ▪ Choose SDK API 19: Android 4.4.2 for all SDK settings ▪ Use default values for the rest

● Virtual Device Definition for Glass ▪ Window > Android Virtual Device Manager > Device Definitions > New Device ▪ 640x360px ▪ 3 in. (hdpi) ▪ Landscape

● Live Cards vs. Immersions ▪ Live cards display frequently updated information to the left of the Glass clock. ▪ Integrate rich content into the timeline ▪ Simple text/images to full-blown 3D graphics

▪ Immersions let you build a user experience outside of the timeline. ▪ Build interactive experiences ▪ Extra control, fewer user input constraints

● Live Cards

● Immersions

● Live Cards vs. Immersions

● Develop with GDK ▪ Android 4.4.2 (API 19) SDK and GDK Preview from the Android SDK Manager. ▪ Project settings: ▪ Minimum and Target SDK Versions: 19 ▪ Compile with: GDK Preview ▪ Theme: None (allows the Glass theme to be applied.)

▪ GDK samples ▪ File > New Project > Android Sample Project

▪ On Glass, turn on USB debugging ▪ Settings > Device Info > Turn on debug

● Hello World - Immersion ▪ App/Activity without theme ▪ Allows the Glass theme to be applied.

▪ Add voice trigger for launching ▪ Touch input and Menu ▪ Voice recognition for text input

● Voice Trigger for Launching ▪ Add intent filter to your main Activity in AndroidManifest.xml

▪ Add xml/voice_trigger.xml to res folder

▪ Can use additional follow up voice recognition prompts if needed

<uses-permission android:name="com.google.android.glass.permission.DEVELOPMENT" />
…
<intent-filter>
    <action android:name="com.google.android.glass.action.VOICE_TRIGGER" />
</intent-filter>
<meta-data android:name="com.google.android.glass.VoiceTrigger"
    android:resource="@xml/voice_trigger" />

<?xml version="1.0" encoding="utf-8"?>
<trigger keyword="hello world" />

https://developers.google.com/glass/develop/gdk/input/voice

● Official Voice Triggers on MyGlass ▪ listen to ▪ take a note ▪ post an update ▪ show a compass ▪ start a run ▪ start a bike ride ▪ find a recipe ▪ record a recipe ▪ check me in

•  start a stopwatch •  start a timer •  start a round of golf •  translate this •  learn a song •  tune an instrument •  play a game •  start a workout

https://developers.google.com/glass/develop/gdk/input/voice

● Touch Input as Key Input ▪ Touch input translated as DPAD key input ▪ Tap => KEYCODE_DPAD_CENTER ▪ Swipe down => KEYCODE_BACK ▪ Camera button => KEYCODE_CAMERA

@Override
public boolean onKeyDown( int keycode, KeyEvent event ) {
    if( keycode == KeyEvent.KEYCODE_DPAD_CENTER ) {
        // user tapped touchpad, do something
        return true;
    }
    …
    return false;
}

https://developers.google.com/glass/develop/gdk/input/touch

● Touch Input ▪ onGenericMotionEvent( MotionEvent event )

@Override
public boolean onGenericMotionEvent( MotionEvent event ) {
    switch( event.getAction( ) ) {
        case MotionEvent.ACTION_DOWN: break;
        case MotionEvent.ACTION_MOVE: break;
        case MotionEvent.ACTION_UP:   break;
    }
    return super.onGenericMotionEvent( event );
}

https://developers.google.com/glass/develop/gdk/input/touch

● Touch gestures ▪ GDK provides GestureDetector for Glass ▪ com.google.android.glass.touchpad.GestureDetector

- NOT android.view.GestureDetector ▪ BaseListener, FingerListener, ScrollListener, TwoFingerScrollListener

▪ Pass MotionEvent from onGenericMotionEvent( ) ▪ gestureDetector.onMotionEvent( event );

https://developers.google.com/glass/develop/gdk/input/touch https://developers.google.com/glass/develop/gdk/reference/com/google/android/glass/touchpad/GestureDetector

● Live Demo •  Handling Key input •  Touch input and Detecting gestures

● Menu ▪ Open options menu on tap ▪ openOptionsMenu( )

▪ Add 50x50 pixel icons in the menu resource XML ▪ android:icon="@drawable/icon"

- https://developers.google.com/glass/tools-downloads/menu_icons.zip

▪ Show/hide/update menu items if needed ▪ onPrepareOptionsMenu( )

https://developers.google.com/glass/develop/gdk/ui/immersion-menus

● Live Demo •  Menu

● Voice Input ▪ Start activity for result with system action

▪ Customize prompt with intent extra

▪ Recognized strings returned in intent data of onActivityResult( )

Intent intent = new Intent( RecognizerIntent.ACTION_RECOGNIZE_SPEECH );
startActivityForResult( intent, 0 );

intent.putExtra( RecognizerIntent.EXTRA_PROMPT, "What is your name?" );

intentData.getStringArrayListExtra( RecognizerIntent.EXTRA_RESULTS );

https://developers.google.com/glass/develop/gdk/input/voice http://developer.android.com/reference/android/speech/RecognizerIntent.html

● Live Demo •  Voice Input

● Hello World - Immersion ++ ▪ Play Sounds & Text-to-speech ▪ Take a picture with camera ▪ Card based info page

● Playing Sounds & TTS ▪ Glass system sounds

▪ Text-to-speech ▪ Create/destroy TTS in onCreate/onDestroy( )

https://developers.google.com/glass/develop/gdk/reference/com/google/android/glass/media/Sounds

AudioManager am = ( AudioManager )getSystemService( Context.AUDIO_SERVICE );
am.playSoundEffect( Sounds.ERROR );
// DISALLOWED, DISMISSED, ERROR, SELECTED, SUCCESS, TAP

TextToSpeech tts = new TextToSpeech( context, ttsOnInitListener );
…
tts.speak( "Hello world!", TextToSpeech.QUEUE_FLUSH, null );
tts.shutdown( );

http://developer.android.com/reference/android/speech/tts/TextToSpeech.html

● Playing Custom Sounds ▪ Put sound files in res/raw ▪ Load sounds to SoundPool object to play

soundPool = new SoundPool( MAX_STREAM, AudioManager.STREAM_MUSIC, 0 );

int soundOneID = soundPool.load( context, R.raw.sound1, 1 );
int soundTwoID = soundPool.load( context, R.raw.sound2, 1 );
…
// play( int soundID, float leftVolume, float rightVolume, int priority, int loop, float rate )
soundPool.play( soundOneID, 1.0f, 1.0f, 1, 0, 1.0f );

http://developer.android.com/reference/android/media/SoundPool.html

● Live Demo •  Playing Sounds and Text-to-Speech

● Camera Input ▪ Call the Glass built-in camera activity with startActivityForResult( ) and an action Intent; the file path to the image/video is returned through Intent extra data.

▪ Low level access to camera with the Android Camera API. ▪ http://developer.android.com/reference/android/hardware/Camera.html

https://developers.google.com/glass/develop/gdk/media-camera/camera

● Camera with Action Intent

private void takePicture( ) {
    Intent intent = new Intent( MediaStore.ACTION_IMAGE_CAPTURE );
    startActivityForResult( intent, TAKE_PICTURE );
}

@Override
protected void onActivityResult( int requestCode, int resultCode, Intent data ) {
    if( requestCode == TAKE_PICTURE && resultCode == RESULT_OK ) {
        String picturePath = data.getStringExtra( CameraManager.EXTRA_PICTURE_FILE_PATH );
        // smaller picture available with EXTRA_THUMBNAIL_FILE_PATH
        processPictureWhenReady( picturePath );  // file might not be ready for a while
    }
    super.onActivityResult( requestCode, resultCode, data );
}

● Live Demo •  Taking a Picture

● Scrolling Cards in Activity ▪ Set a CardScrollView as the content view ▪ Use a custom class extending the CardScrollAdapter class to populate the CardScrollView

https://developers.google.com/glass/develop/gdk/ui/theme-widgets

https://developers.google.com/glass/develop/gdk/reference/com/google/android/glass/widget/package-summary

CardScrollView cardScrollView = new CardScrollView( this ); cardScrollView.setAdapter( new InfoCardScrollAdapter( ) ); cardScrollView.activate( ); setContentView( cardScrollView );

● Scrolling Cards in Activity ▪ In your custom CardScrollAdapter class ▪ Create a list of cards ▪ Implement abstract methods in your custom CardScrollAdapter class

- int getCount( ) => return the number of cards (items) - Object getItem( int position ) => return the card at position - View getView( int position, View convertView, ViewGroup parentView ) => return the view of the card at position

- int getPosition( Object item ) => find and return the position of the given item (card) in the list. (return -1 for error)

https://developers.google.com/glass/develop/gdk/ui/theme-widgets

https://developers.google.com/glass/develop/gdk/reference/com/google/android/glass/widget/package-summary
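The four-method adapter contract above can be illustrated in plain Java. The real CardScrollAdapter and View classes only exist in the GDK on Glass, so this sketch uses hypothetical stand-in types purely to show what each method must return:

```java
import java.util.Arrays;
import java.util.List;

public class InfoCardAdapterSketch {
    // Stand-in for a GDK card (the real widget classes only exist on Glass).
    static class Card {
        final String text;
        Card(String text) { this.text = text; }
    }

    // Mirrors the four abstract methods of CardScrollAdapter.
    static class InfoCardScrollAdapter {
        private final List<Card> cards = Arrays.asList(
                new Card("Card one"), new Card("Card two"), new Card("Card three"));

        public int getCount() { return cards.size(); }                // number of cards
        public Object getItem(int position) { return cards.get(position); }
        // The real getView( position, convertView, parentView ) builds an
        // Android View; here we just hand back the card itself.
        public Card getView(int position) { return cards.get(position); }
        public int getPosition(Object item) { return cards.indexOf(item); } // -1 on error
    }

    public static void main(String[] args) {
        InfoCardScrollAdapter adapter = new InfoCardScrollAdapter();
        System.out.println(adapter.getCount());           // 3
        System.out.println(adapter.getPosition("nope"));  // -1
    }
}
```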

● Live Demo •  Scrolling Cards •  Live Cards •  Vuforia •  NyARToolkit

● More Information ▪ Website ▪ https://developers.google.com/glass ▪ http://developer.android.com ▪ http://arforglass.org ▪ http://www.hitlabnz.org

6: Case Studies

BRICKSIMPLE AND GLASS Introduction

BUILDING GLASS EXPERIENCES

Development

WEARABLE IS DIFFERENT Design

“You mustn't be afraid to dream a little bigger, darling”

THANK YOU

Det Ansinn President & Founder BrickSimple LLC det@bricksimple.com Twitter: @detansinn G+: +detansinn Cell: 215.771.8680

7: Technical Q & A

8: Research Directions

Challenges for the Future (2001) ▪  Privacy ▪  Power use ▪  Networking ▪  Collaboration ▪  Heat dissipation ▪  Interface design ▪  Intellectual tools ▪  Augmented Reality systems

Starner, T. (2001). The challenges of wearable computing: Part 1. IEEE Micro,21(4), 44-52. Starner, T. (2001). The challenges of wearable computing: Part 2. IEEE Micro,21(4), 54-67.

Interface Design

Gesture Interaction

Capturing Behaviours

▪  3 Gear Systems ▪  Kinect/Primesense Sensor ▪  Two hand tracking ▪  http://www.threegear.com

Gesture Interaction With Glass ▪  3 Gear Systems ▪  Hand tracking

▪  Hand data sent to Glass ▪  WiFi networking ▪  Hand joint positions ▪  AR application rendering ▪  Vuforia tracking

Performance

▪  Full 3D hand model input ▪  10-15 fps tracking, 1 cm fingertip resolution

User Study

▪  Gesture vs. Touch pad vs. Combined input ▪  Gesture 3x faster, no difference in accuracy

● Meta Gesture Interaction

▪ Depth sensor + Stereo see-through ▪ https://www.spaceglasses.com/

Collaboration

Social Panoramas

Ego-Vision Collaboration

▪  Wearable computer ▪  camera + processing + display + connectivity

Current Collaboration

▪  First person remote conferencing/hangouts ▪  Limitations

-  Single POV, no spatial cues, no annotations, etc

Sharing Space: Social Panoramas

▪  Capture and share social spaces in real time ▪  Enable remote people to feel like they’re with you

Key Technology

▪  Google Glass ▪  Capture live panorama (compass + camera) ▪  Capture spatial audio, live video

▪  Remote device (desktop, tablet) ▪  Immersive viewing, live annotation

Awareness Cues

▪  Where is my partner looking? ▪  Enhanced radar display, Context compass

Interaction

▪  Glass Touchpad Input/Tablet Input ▪  Shared pointers, Shared drawing

User Evaluation

▪  Key Results
-  Visual cues significantly increase awareness
-  Pointing cues preferred for collaboration
-  Drawing on Glass difficult, ranked low in usability

Cognitive Models

Resource Competition Framework ▪  Mobility tasks compete with other tasks for cognitive resources ▪  the most important tasks are given higher priority

▪  RCF is a method for analyzing this, based on: ▪  task analysis ▪  modelling cognitive resources ▪  a resource approach to attention

Oulasvirta, A., Tamminen, S., Roto, V., & Kuorelahti, J. (2005, April). Interaction in 4-second bursts: the fragmented nature of attentional resources in mobile HCI. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 919-928). ACM.

RCF Key Assumptions Four Key Assumptions 1. Functional Modularity: cognitive system divided into functionally separate systems with diff. representations

2. Parallel Module Operation: cognitive modules operate in parallel, independent of each other

3. Limited Capacity: cognitive modules are limited in capacity with respect to time or content

4. Serial Central Operation: central coordination of modules (eg monitoring) is serial

Cognitive Interference ▪  Structural interference ▪  Two or more tasks compete for limited resources of a peripheral system -  e.g. two cognitive processes needing vision

▪  Capacity interference ▪  Total available central processing overwhelmed by multiple concurrent tasks -  e.g. trying to add and count at the same time

Cognitive Resources & Limitations


Using RCF

1. Map cognitive faculty to task

2. Look for conflicts/overloads

3. Analyse competition for attention

4. Look for opportunities for technology to reduce conflicts/competition
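Steps 1 and 2 amount to mapping each concurrent task to the cognitive modules it occupies and flagging any module claimed by more than one task (structural interference). A toy illustration in Java (the task and module names are hypothetical examples, not taken from the RCF papers):

```java
import java.util.*;

public class RcfSketch {
    // Return the modules claimed by more than one concurrent task --
    // structural interference in RCF terms.
    static List<String> conflicts(Map<String, List<String>> tasks) {
        Map<String, Integer> load = new LinkedHashMap<>();
        for (List<String> modules : tasks.values())
            for (String m : modules)
                load.merge(m, 1, Integer::sum);
        List<String> out = new ArrayList<>();
        for (Map.Entry<String, Integer> e : load.entrySet())
            if (e.getValue() > 1) out.add(e.getKey());
        return out;
    }

    public static void main(String[] args) {
        // Step 1: map each concurrent task to the cognitive modules it occupies.
        Map<String, List<String>> tasks = new LinkedHashMap<>();
        tasks.put("walking down a busy street", Arrays.asList("vision", "motor", "spatial"));
        tasks.put("reading the Glass display",  Arrays.asList("vision", "verbal"));

        // Step 2: look for conflicts/overloads.
        System.out.println(conflicts(tasks));  // [vision]
    }
}
```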

Example: Going to work ..

Which is the most cognitively demanding?

Application of RCF

Busy street > Escalator > Café > Laboratory. But what if you made Wayfinding, Path Planning, Estimating Time to Target, and Collision Avoidance easier?

Social Perception

How is the User Perceived?

TAT Augmented ID

● The Future of Wearables

9: Resources

Online Wearables Exhibit

Online at http://wcc.gatech.edu/exhibition

Glass Resources ▪  Main Developer Website ▪  https://developers.google.com/glass/

▪  Glass Apps Developer Site ▪  http://glass-apps.org/glass-developer

▪  Google Design Guidelines Site ▪  https://developers.google.com/glass/design/index

▪  Google Glass Emulator ▪  http://glass-apps.org/google-glass-emulator

Other Resources ▪  AR for Glass Website ▪  http://www.arforglass.org/

▪  Vandrico Database of wearable devices ▪  http://vandrico.com/database

● Glass UI Design Guidelines

▪ More guidelines ▪ https://developers.google.com/glass/design/index

Books ▪  Programming Google Glass ▪  Eric Redmond

▪  Rapid Android Development: Build Rich, Sensor-Based Applications with Processing ▪  Daniel Sauter

▪ Microinteractions: Designing with Details ▪ Dan Saffer ▪ http://microinteractions.com/

● Conclusions

● Conclusions ▪ Wearable computing represents a fourth generation of computing devices ▪ Google Glass is the first consumer wearable ▪ Lightweight, usable, etc

▪ A range of wearables will appear in 2014 ▪ Ecosystem of devices

▪ Significant research opportunities exist ▪ User interaction, displays, social impact

Contact Details Mark Billinghurst ▪  email: mark.billinghurst@hitlabnz.org ▪  twitter: @marknb00

Rob Lindeman ▪  email: gogo@wpi.edu

Feedback + followup form ▪  goo.gl/6SdgzA
