Break-out Group # D Research Issues in Multimodal Interaction


Page 1

Break-out Group # D

Research Issues in Multimodal Interaction

Page 2

What are the different types of modalities?

• Speech

• Haptics

• Gesture

• Deictic (pointing)

• Head and eye movement

• EEG (electroencephalograms)

• Physiological measurements

Page 3

What has been done so far?

• Semantic fusion of information – Speech and Gesture (a minimal fusion sketch follows this list)

• Preliminary efforts to determine which types of modalities to integrate; this is application dependent.

• Standardization is needed at the level of devices and of the types of information to be fused.
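As an illustration of what semantic-level fusion of speech and gesture can look like, here is a minimal sketch in the spirit of "put that there": a partially specified speech command has its open slots filled by time-aligned deictic pointing events. The SpeechFrame and PointingEvent classes, the fuse function, and the 0.5-second alignment window are hypothetical choices for illustration, not something prescribed by the group.

```python
# Minimal sketch of semantic-level fusion of speech and deictic gesture.
# All names and the alignment window are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional, List, Tuple


@dataclass
class SpeechFrame:
    """Partial command recovered from the speech channel."""
    action: str                                    # e.g. "move"
    object_id: Optional[str]                       # None if the utterance only said "that"
    target: Optional[Tuple[float, float, float]]   # None if it only said "there"
    timestamp: float                               # seconds


@dataclass
class PointingEvent:
    """Deictic gesture already resolved against the scene."""
    referent_id: str                               # object or surface pointed at
    location: Tuple[float, float, float]
    timestamp: float


def fuse(speech: SpeechFrame, pointing: List[PointingEvent],
         window: float = 0.5) -> dict:
    """Fill the unresolved speech slots from gestures that are close in time."""
    command = {"action": speech.action,
               "object": speech.object_id,
               "target": speech.target}
    # Only gestures within `window` seconds of the utterance are considered,
    # taken in the order they occurred ("that" before "there").
    nearby = sorted((p for p in pointing
                     if abs(p.timestamp - speech.timestamp) <= window),
                    key=lambda p: p.timestamp)
    for p in nearby:
        if command["object"] is None:
            command["object"] = p.referent_id
        elif command["target"] is None:
            command["target"] = p.location
    return command


# "Move that there": speech leaves both slots open, two pointing events fill them.
speech = SpeechFrame(action="move", object_id=None, target=None, timestamp=10.0)
points = [PointingEvent("chair_3", (1.0, 0.0, 2.0), 10.1),
          PointingEvent("floor", (4.0, 0.0, 1.5), 10.3)]
print(fuse(speech, points))
# -> {'action': 'move', 'object': 'chair_3', 'target': (4.0, 0.0, 1.5)}
```

Real systems replace the nearest-in-time rule with probabilistic alignment and richer semantic frames, but the slot-filling structure is the idea the bullet above refers to.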

Page 4

Open Research Problems:

• Should we stay with current paradigms or invent new methodologies?

• There is no unifying framework for interaction in terms of devices or semantic integration, largely because there is no general-purpose application.

• Instead, we see specific applications, e.g., simulation and medical training.

Page 5

Open Research Questions

• How to deal with specific tasks in terms of fusing channels: how should channels be fused?

• How to handle transitions between tasks, e.g., manipulation vs. locomotion

• More experimentation is needed, along with a theory of where VR is actually needed.

Page 6

Open Research Questions

• Formal study of tasks within applications (e.g., manipulation, selection, navigation, changing of attributes, numerical input)

• More research is needed on output; so far it has been mostly visual and auditory.

Page 7

First Breakout Group

• Taxonomy

• Semantics

• Cross-modal Representations (actions/perceptions)

Page 8

Applications/Output Group Second Meeting

New issues we discussed in the afternoon

Page 9

DM: Third Breakout Group: Applications/Output

• Human Perception of the environment

• Integration with Input

• Relationships to basic principles

Page 10

Human Perceptual Abilities

• Vision technology: limitations in terms of lighting and real-time rendering

• Limitations for other channels: Haptics, audio, olfaction, taste

• The type/mix of output depends on the application

• This is related to the internal representation

Page 11

Continued

• Using multiple modalities to offset the limitations of each individual modality
– Right now we do not have enough research data to support this.

• Do we need to represent the environment exactly?
– Application dependent

Page 12

Continued

• Abstraction vs. exact representation
– Application dependent

• Exact physical simulation vs. fake physics: OK or not to fool the user?
– Probably application and technology dependent.

Page 13

Other Human Perceptual Modalities

• Olfaction and taste: very little research

• Some modalities are better understood than others (e.g., visual vs haptic or olfaction)

Page 14

Continued – Summary

• Big issues:
– Sensory substitution
– Level of detail (variable resolution)
– Sampled vs. synthetic generation
– Online or offline computation
– Preservation (or not) of individuality, e.g., two people with different senses of taste or heat
– Higher-level emotional augmentation

Page 15

Integration with Input

• Haptics is the most widely used output sense that is also used for input
– Head orientation, whole-body position, and eye gaze are also used for both

• Some output must be tightly coupled to input (it operates at the physical level)
– Head motion driving view changes, 3D audio (a minimal coupling sketch follows)
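To make "tightly coupled to input" concrete, the sketch below shows a per-frame loop in which the tracked head pose is applied directly to the view transform and to the 3D-audio listener, with no higher-level interpretation in between. The tracker, renderer, and audio objects are hypothetical placeholders rather than a real device API.

```python
# Minimal sketch of physically coupled input/output: the tracked head pose
# drives the view transform and the 3D-audio listener every frame.
# `tracker`, `renderer`, and `audio` are hypothetical interfaces.
import numpy as np


def view_matrix(position: np.ndarray, rotation: np.ndarray) -> np.ndarray:
    """Build a 4x4 view matrix from head position (3,) and orientation (3x3)."""
    view = np.eye(4)
    view[:3, :3] = rotation.T               # inverse of the head's orientation
    view[:3, 3] = -rotation.T @ position    # inverse of the head's translation
    return view


def render_frame(tracker, renderer, audio) -> None:
    """Per-frame loop: head pose drives the view and the audio listener directly."""
    position, rotation = tracker.head_pose()            # sampled every frame
    renderer.set_view(view_matrix(position, rotation))  # view follows the head
    audio.set_listener(position, rotation)               # 3D audio follows the head too
    renderer.draw()
```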

Page 16

Integration with Input (cont.)

• Eye gaze-based control requires some interpretation

• Intentional vs. unintentional movement
– When is a gesture a gesture? (a naive segmentation sketch follows)
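One naive way to make "when is a gesture a gesture?" operational is to treat motion as a candidate gesture only when hand speed stays above a threshold for a minimum duration, and everything below that as unintentional movement. The sketch below is purely illustrative; the threshold and window values are made-up assumptions, not validated numbers.

```python
# Naive gesture segmentation by motion energy. Threshold and minimum duration
# are illustrative assumptions; intentionality is not really this simple.
import numpy as np


def segment_gestures(hand_positions: np.ndarray, dt: float,
                     speed_threshold: float = 0.25,    # m/s, made-up value
                     min_frames: int = 10) -> list:
    """Return (start, end) frame-index pairs of candidate intentional gestures."""
    # Frame-to-frame hand speed; slower motion is treated as unintentional drift.
    speeds = np.linalg.norm(np.diff(hand_positions, axis=0), axis=1) / dt
    segments, start = [], None
    for i, fast in enumerate(speeds > speed_threshold):
        if fast and start is None:
            start = i                                   # motion burst begins
        elif not fast and start is not None:
            if i - start >= min_frames:                 # long enough to count
                segments.append((start, i))
            start = None
    if start is not None and len(speeds) - start >= min_frames:
        segments.append((start, len(speeds)))
    return segments
```

This kind of heuristic is exactly what the open question challenges: intentionality is not reducible to motion energy alone.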

Page 17

Relationship to Basic Principles

• Mapping semantics to output
– One or multiple representations for all modalities? E.g., language and visual output may share a common representation that is translated differently for each output (see the sketch below).
– Spatio-temporal synchronization
– Cross-modal representation (actions/perceptions)

• Account for individual differences
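The "common representation, different translations" idea can be sketched as a single semantic event that two hypothetical renderers turn into language and into a visual animation directive, sharing a timestamp for spatio-temporal synchronization. The event schema and renderer functions are illustrative assumptions, not an established framework.

```python
# Sketch of one shared semantic representation rendered differently per modality.
# The event schema and both renderer functions are hypothetical illustrations.

def render_language(event: dict) -> str:
    """Translate the shared semantic event into a sentence for speech or text."""
    return f"The {event['object']} {event['action']}s to {event['target']}."


def render_visual(event: dict) -> dict:
    """Translate the same event into an animation directive for the renderer."""
    return {"animate": event["object"],
            "motion": event["action"],
            "destination": event["target"],
            "start_time": event["time"]}   # shared timestamp keeps modalities in sync


event = {"action": "move", "object": "robot", "target": "the doorway", "time": 12.4}
print(render_language(event))   # The robot moves to the doorway.
print(render_visual(event))
```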

Page 18

Future Paper Topics

• All the previously mentioned open problems

• Short term
– Update the NRC report on modalities

• Medium term
– Modeling, coupling, and output of modalities
– In particular, model smell and taste

Page 19

Future Paper Topics

• Long term
– Further modeling and coupling
– Advanced display technology
– Personalization of output