Machine learning applied to multi-modal interaction, adaptive interfaces and ubiquitous assistive technologies
December 10, 2009
Jaisiel Madrid Sánchez, R&D Consultant, INREDIS project

This presentation addresses some of the research lines that apply machine learning to foster accessibility in ICT design.

Page 1: Inredis And Machine Learning Nips

Machine learning applied to multi-modal interaction, adaptive interfaces and ubiquitous assistive technologies

December 10, 2009

Jaisiel Madrid Sánchez, R&D Consultant, INREDIS project

Page 2

• Technology company belonging to the ONCE’s Foundation

• Over 70% of Technosite’s staff are people with disabilities.

• It is precisely this aspect that has allowed us to boost our competitive edge:

• Our technological development follows accessibility criteria

• Business area focusing on social studies:

• users’ needs

• preferences

• expectations

• Social Spaces for Research and Innovation (SSRIs): spaces where users, designers and stakeholders exchange information and network for ICT development.

Technosite (who are we?…)

Page 3

Transforming the Assistive Technology Ecosystem

• The INREDIS project is developing basic technologies for communication and interaction channels between people with disabilities and their ICT environment (INterfaces for RElationships between people with DISabilities and their ICT environment).

• Accessibility: technologies must be designed for diversity (design for all).

• Interoperability.

• Adaptability.

• Multimodality.

• Ubiquity

Page 4
Page 5

• Interoperability and ubiquity (cloud computing): structured data sharing.

• Adaptability → machine learning:

• Adaptive user interfaces (personalization): accessibility becomes a special case of adaptation.

• Multimodality → machine learning:

• Multimodal interaction (detection): accessibility becomes a natural interaction according to user capabilities.

• We have little to say about particular learning methods, but much about the specific setups in which to apply them.

Accessibility and Machine Learning

INREDIS

Page 6

• Multimodal interaction is achieved by multimodal assistive technologies (executed in local/remote services):

• they vary the interaction channel or perform a code translation;

• they are considered “interaction resources” of the user interface (to be adapted).

Adaptive user interfaces and multimodal assistive technologies

• Text to Speech.

• Speech to Text.

• ECA (Embodied Conversational Agents).

• Text to Augmentative Communication

• Text to Sign Language.

• Sign Language to text.

• etc.

Page 7

• Levels of adaptation of user interface (accessibility resources on the user interface):

• Lexical level: navigation windows, button sizes, figures with reduced detail, textual description of non-textual resources, etc.

• Interaction level: multimodal assistive technologies

Adaptive user interfaces and multimodal assistive technologies

• Selection of:

• Type of multimodal AT.

• Configuration options: “ready from the first moment”
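The selection step above, choosing a multimodal AT and a configuration that is "ready from the first moment", can be sketched as a rule lookup. This is a minimal illustration: the profile fields, AT names and rule table are assumptions, not the INREDIS implementation.

```python
# Hypothetical sketch: selecting a multimodal assistive technology (AT)
# and its default configuration from a persistent user profile.
# The rule table below is illustrative only.

AT_RULES = {
    "blind":      ("text_to_speech",        {"rate": "normal", "verbosity": "high"}),
    "low_vision": ("screen_magnifier",      {"zoom": 2.0, "high_contrast": True}),
    "deaf":       ("text_to_sign_language", {"language": "LSE"}),
    "motor":      ("speech_to_text",        {"push_to_talk": False}),
}

def select_assistive_technology(profile):
    """Return (AT name, config) for the first matching need; a generic
    fallback keeps the interface usable when no rule applies."""
    for need in profile.get("needs", []):
        if need in AT_RULES:
            return AT_RULES[need]
    return ("default_ui", {})

print(select_assistive_technology({"needs": ["low_vision"]}))
```

A real system would of course learn and refine this mapping rather than hard-code it; the table only shows the shape of the decision.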

Page 8

• Data for adaptation:

• Persistent features (off-line adaptation):

• User profile: needs, preferences*, expectations*.

• Technological profile: user device, target service/device.

• Non-persistent features (on-line adaptations):

• User profile: user experience, affective detection (and other activity response systems: brain, eye,…)

• Context profile: wearable sensors, complex event processing (INREDIS platform-level).

Adaptive user interfaces
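The persistent/non-persistent split above can be made concrete as two data structures, one per adaptation mode. The field names are assumptions for illustration, not the actual INREDIS schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Illustrative data structures only: field names are assumptions,
# not the INREDIS ontology schema.

@dataclass
class PersistentProfile:
    """Off-line adaptation data: changes rarely."""
    needs: List[str] = field(default_factory=list)
    preferences: Dict[str, str] = field(default_factory=dict)
    expectations: Dict[str, str] = field(default_factory=dict)
    user_device: str = ""
    target_device: str = ""

@dataclass
class NonPersistentProfile:
    """On-line adaptation data: updated during interaction."""
    experience_events: List[str] = field(default_factory=list)
    detected_emotion: str = "neutral"
    context_readings: Dict[str, float] = field(default_factory=dict)

p = PersistentProfile(needs=["hypoacusis"], preferences={"modality": "visual"})
print(p.needs, p.preferences["modality"])
```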

Page 9

• Knowledge organization for data-adaptation matching:

• INREDIS ontology: organizes concepts, their properties and their relations.

• Populating the ontology is a difficult task: machine learning as a tool to discover instances and enrich the ontology.

• Persistent features:

• User profile: needs, preferences, expectations.

- Implicit interaction systems (vs. explicit user input: e.g., on-line form).

• Non-persistent features:

- User profile: user experience, affective detection.

- Context profile: wearable sensors, complex event processing

• Evolving the ontology: new concepts and relations according to experience by means of machine learning.

Adaptive user interfaces
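The population/evolution step above can be sketched with a toy ontology store: a learned extractor (not shown) would propose instances and relations, and the store accepts only novel ones. All concept and instance names here are hypothetical.

```python
# Minimal sketch of ontology enrichment: concepts map to sets of
# instances, plus a set of (subject, predicate, object) relations.
# A learned extractor would supply the proposed additions.

class Ontology:
    def __init__(self):
        self.instances = {}     # concept name -> set of instance names
        self.relations = set()  # (subject, predicate, object) triples

    def add_instance(self, concept, instance):
        """Add an instance; return True only if it was new."""
        added = instance not in self.instances.get(concept, set())
        self.instances.setdefault(concept, set()).add(instance)
        return added

    def relate(self, subj, pred, obj):
        self.relations.add((subj, pred, obj))

onto = Ontology()
onto.add_instance("AssistiveTechnology", "text_to_speech")
# A learned extractor might surface a new instance from interaction data:
onto.add_instance("AssistiveTechnology", "sign_language_avatar")
onto.relate("sign_language_avatar", "supports", "hypoacusis")
print(sorted(onto.instances["AssistiveTechnology"]))
```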

Page 10

Persistent user features: implicit interaction systems

persistent user profile

multimodal games

social analysis

interaction logs

Page 11

Persistent user features: implicit interaction systems

• Multimodal (natural) interaction games:

“Tell me and I forget, show me and I remember, involve me and I understand”: Chinese proverb

• Goals:

• Capture of persistent user profile: needs and preferred adaptations (provide personal predictions for each user).

• Reflect user’s actual practices, not user’s beliefs (forms, etc.).

• “Static over time”: explicitly reconfigured by user.

• Multimodal: accessible from the first interaction

• The game covers visual, auditory, motor and cognitive problems.

Page 12

• The game actively interacts with user to generate queries and examples to evaluate user needs and preferences (following a consistent goal).

• The system collects traces of user decisions and applies machine learning to these traces to construct a persistent user profile model (needs, preferences and expectations).

• This profile will be used for future interface adaptations (non-persistent updates).

• Dynamic modeling:

• users provide different feedback for similar situations according to their needs, preferences and expectations.

• the agent might ask questions to learn more effectively according to the feedback given, and select a subset of observed samples.

Persistent user features: implicit interaction systems
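The trace-to-profile step above can be sketched very simply: from game traces recording which modality the user chose among those offered, estimate a per-modality preference. A frequency model stands in here for the richer learners the slides allude to; the trace format is an assumption.

```python
from collections import Counter

# Hedged sketch: infer a persistent modality preference from game traces.
# Each trace is (modalities offered, modality chosen); the model is just
# the empirical selection rate of each modality when it was offered.

def learn_preferences(traces):
    offered, chosen = Counter(), Counter()
    for options, choice in traces:
        for m in options:
            offered[m] += 1
        chosen[choice] += 1
    return {m: chosen[m] / offered[m] for m in offered}

traces = [(("audio", "visual"), "visual"),
          (("audio", "visual", "haptic"), "visual"),
          (("audio", "haptic"), "audio")]
prefs = learn_preferences(traces)
print(max(prefs, key=prefs.get))  # most preferred modality
```

The resulting scores would seed the persistent profile, to be refined later by the on-line (non-persistent) updates the deck describes.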

Page 13

• Complexity of the tasks can be extended:

• Additional modalities (incorporated to the model).

• Media contents.

• Real time.

• Choosing the right problems: designers choose different questions depending on user profiles and agent performance, maintaining minimal interactions.

• Measure of efficiency: number of interactions (clicks, etc.) to complete the game.

• Measures of quality: several criteria (different users differ in the relative importance they assign to these criteria, according to their expectations).

• ML Literature (connections): advisory systems by information filtering, multi-task learning, etc.

Persistent user features: implicit interaction systems

Page 14

• Social network analysis:

• Finding relevant information from social network monitoring.

• Relevant information: accessibility and usability features.

• Helps increase the accuracy of the persistent user profile, so that more relevant interface resources can be matched to the user.

• Feedback focuses on user interests, feelings, needs, preferences and expectations about accessibility features (instead of functionality features):

• At the level of single experience in 2.0 portals and blogs (targeting of individuals based on expressed preferences).

• At the level of related user groups: improve relevancy and trustworthiness of opinion data for interface resources recommendation.

Persistent user features: implicit interaction systems

Page 15

• Incorporating the experience of those who used particular accessibility resources before. Opinion mining.

• Grouping of 2.0 content based on natural-language expressions of what users like and dislike about accessibility and usability features: categorization of interests.

• Taking into account inconsistencies in the opinions of conflicting authors (by determining the reputation of authors).

• Requires specific semantic technology to represent the original semantic structure of the authors’ information (with different needs and reputations): a parse tree plus semantic rules that navigate these trees.

• ML connections: text categorization using Support Vector Machines.

Persistent user features: implicit interaction systems
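The slides cite Support Vector Machines for this text categorization step. To stay dependency-free, the sketch below uses a tiny perceptron over bag-of-words features, the same linear-model family as a linear SVM; the training sentences and label convention (+1 positive opinion, -1 negative) are invented for illustration.

```python
# Toy opinion categorizer over bag-of-words features. A perceptron stands
# in for the SVM named in the slides; both learn a linear separator.

def featurize(text):
    return set(text.lower().split())

def train(samples, epochs=10):
    w = {}  # word -> weight
    for _ in range(epochs):
        for text, label in samples:          # label: +1 or -1
            score = sum(w.get(f, 0.0) for f in featurize(text))
            if score * label <= 0:           # misclassified: perceptron update
                for f in featurize(text):
                    w[f] = w.get(f, 0.0) + label
    return w

def predict(w, text):
    return 1 if sum(w.get(f, 0.0) for f in featurize(text)) > 0 else -1

samples = [("the captions were clear and helpful", 1),
           ("contrast too low could not read", -1),
           ("screen reader worked great", 1),
           ("buttons too small to press", -1)]
w = train(samples)
print(predict(w, "captions helpful"))
```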

Page 16

• User interaction logs:

• Within the symposium schedule:

“Data Mining based user modeling systems for web personalization applied to people with disabilities”. J. Abascal, O. Arbelaitz, J. Munguerza and I. Perona.

Persistent user features: implicit interaction systems

Page 17

• User experience.

• The first adaptation of the interface has already been done (using persistent features): off-line adaptation.

• The learned knowledge should reflect preferences for individual interface resources: personalized assistive technologies.

• On-line adaptation of the user interface according to user experience: each time interaction with the interface occurs (on-line learning, which contrasts with work on data mining).

• INREDIS aims to construct an interaction manager that makes recommendations to the user or generates actions on the interface resources (both lexical and interaction) that the user can always override: these overrides update the persistent user profile.

• Collaborative filtering: find similar user profiles and suggest on-line accessibility resources that they liked but the current user has not yet used.

Non-persistent user features
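The collaborative-filtering step above can be sketched with cosine similarity over users' ratings of accessibility resources, recommending resources the most similar user liked that the current user has not tried. The user names, resources and ratings are invented for illustration.

```python
from math import sqrt

# Hedged sketch of user-based collaborative filtering for accessibility
# resources. Ratings are dicts: resource name -> rating.

def cosine(a, b):
    common = set(a) & set(b)
    num = sum(a[k] * b[k] for k in common)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def recommend(current, others):
    """Pick the most similar other user; suggest their liked, unused resources."""
    best = max(others, key=lambda u: cosine(current, others[u]))
    return [r for r, score in others[best].items()
            if r not in current and score > 0]

current = {"text_to_speech": 1, "magnifier": 1}
others = {
    "userA": {"text_to_speech": 1, "magnifier": 1, "high_contrast": 1},
    "userB": {"sign_language": 1, "captions": 1},
}
print(recommend(current, others))  # userA is most similar
```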

Page 18

• Affective detection (attentive interfaces):

• Goal: the ability to simulate empathy, for natural interaction.

• To accept or reject on-line modifications (from explicit interactions) on the interface resources according to an implicit feedback (user’s behaviour), in order to improve user experience.

• To generate new modifications from implicit (emotional) user interaction, in order to better meet dynamic usability goals.

• INREDIS affective intelligent agent:

• Multimodal: speech and facial detection (hypoacusis, cognitive, etc.).

• Combined with eye activity detection and brain response.

• Negative, neutral and positive emotions (Litman and Forbes-Riley, 2004).

Non-persistent user features

Page 19

• Video, audio and fusion classifiers (to resolve ambiguity).

• Support vector machines.

• ML literature: detection of up to 40 emotions.

• Essential step: training on specific users (multimodal games may provide this off-line information).

• Affective visual output system.

Non-persistent user features
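The fusion classifier above can be sketched as late fusion: each modality's classifier (the SVMs named earlier) emits class probabilities, and a weighted average picks the final emotion. The weights and probability values below are invented; a real system would train both.

```python
# Hedged sketch of late fusion for affective detection. Each per-modality
# classifier outputs a probability per emotion class; fusion is a weighted
# average followed by argmax. Weights are illustrative assumptions.

EMOTIONS = ("negative", "neutral", "positive")

def fuse(video_probs, audio_probs, w_video=0.6, w_audio=0.4):
    fused = {e: w_video * video_probs[e] + w_audio * audio_probs[e]
             for e in EMOTIONS}
    return max(fused, key=fused.get)

video = {"negative": 0.1, "neutral": 0.3, "positive": 0.6}
audio = {"negative": 0.2, "neutral": 0.6, "positive": 0.2}
print(fuse(video, audio))
```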

Page 20

• Wearable sensors:

• Context-awareness: interface adaptation should behave in a context-sensitive way (with respect to the person or the computing device).

• Remember: INREDIS focuses on lexical and interaction adaptations!

• To collect data from a dynamic and unknown environment: the context (of user or device).

• Standard machine learning methods are generally used to integrate and interpret the collected sensor traces from multiple sources of information (see “learning from multiple sources” papers…).

• Context-sensitive adaptations: non-persistent disabilities…

Non-persistent context features
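The sensor-integration step above can be sketched as windowed averaging over traces from several wearables, followed by a threshold that flags a context condition. Sensor names, values and thresholds are assumptions for illustration only.

```python
from statistics import mean

# Hedged sketch: integrate traces from multiple wearable sensors into one
# context feature vector by averaging a sliding window, then flag a noisy
# context with a simple threshold. All numbers are illustrative.

def context_vector(traces, window=3):
    """traces: sensor name -> list of readings; average the last `window`."""
    return {s: mean(vals[-window:]) for s, vals in traces.items()}

def detect_noisy_context(vector, noise_db=70.0):
    return vector.get("microphone_db", 0.0) > noise_db

traces = {"microphone_db": [40, 65, 80, 85, 82],
          "light_lux": [300, 310, 305, 900, 950]}
vec = context_vector(traces)
print(detect_noisy_context(vec))
```

In practice the integration is a learning problem (the "learning from multiple sources" literature the slide points to), not a fixed threshold; this only shows the data flow.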

Page 21

• Context-sensitive adaptations: “non-persistent disabilities”:

• Noisy context: hypoacusis → visual alternative (text, graphic).

• Reflecting light on screen: low vision → magnifier / auditory alternative.

• Cold temperature/gloves or walking/driving: motor impairment → voice interaction.

• Surrounding people (ATM): hearing impairment → visual alternative.

• etc.

Non-persistent context features
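The mapping from detected context to a "non-persistent disability" and its adaptation, listed above, can be written as a small rule table. The condition keys are illustrative names, not INREDIS identifiers.

```python
# The slide's context -> ("non-persistent disability", adaptation) mapping
# as a lookup table. Condition names are illustrative assumptions.

CONTEXT_RULES = {
    "noisy":             ("hypoacusis",         "visual alternative (text, graphic)"),
    "screen_glare":      ("low vision",         "magnifier / auditory alternative"),
    "gloves_or_driving": ("motor impairment",   "voice interaction"),
    "people_nearby_atm": ("hearing impairment", "visual alternative"),
}

def adapt_for_context(context):
    """Return the adaptation pair, or a no-op when the context is unknown."""
    return CONTEXT_RULES.get(context, (None, "no adaptation"))

print(adapt_for_context("noisy"))
```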

Page 22

Non-persistent features → “non-persistent disabilities”

“Every day we can have the same needs as a person with disabilities”

Page 23

• INREDIS: multimodal remote services

• Image/text/audio/haptic processing.

• Fusion and synchronization of multimodal streams.

• High-dimensional data: SVMs.

• E.g.: a Spanish sign language classifier.

Multimodal assistive technologies
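The synchronization step above, which precedes fusion, can be sketched as merging timestamped events from several streams into one time-ordered sequence. The stream contents are invented for illustration.

```python
import heapq

# Hedged sketch of multimodal stream synchronization: merge timestamped
# events from several already-sorted streams, preserving global time order.

def synchronize(*streams):
    """Each stream: list of (timestamp, modality, payload), sorted by time."""
    return list(heapq.merge(*streams, key=lambda e: e[0]))

video = [(0.0, "video", "frame0"), (0.4, "video", "frame1")]
audio = [(0.1, "audio", "chunk0"), (0.3, "audio", "chunk1")]
merged = synchronize(video, audio)
print([m[1] for m in merged])
```

Real fusion would then window the merged sequence so that features from different modalities referring to the same instant are classified together.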

Page 24

interoperability · adaptability · multimodality · ubiquity

Page 25

Thank you for your attention

<jaisiel madrid sánchez>[email protected]

www.technosite.es
