© Fraunhofer-Institut für Angewandte Informationstechnik FIT
How to achieve context sensitivity in mobile applications.
Context in mobile applications
Martin Wolpers
Agenda
- Introduction of context
- Sensors in mobile devices
- Conclusions based directly on sensor data
- Aggregating sensor data to derive conclusions
- Advanced sensor data processing to create higher order conclusions
Introduction of context
Any ideas?
Context awareness: the essence of adaptability
- Resource awareness: adapt to available resources (connectivity, nearby devices)
- Situation awareness: adapt to the situation (mode, location, time, event)
- Intention awareness (?): adapt to what the user wants to do
Context awareness is found in humans: we always adapt our behavior and actions according to the context (i.e. the situation). Pervasive computing devices that ubiquitously accompany humans (such as smartphones) must adapt accordingly, or risk being disruptive and annoying.
Taken from lecture slides CSE494/598
Defining Context
One definition [Schilit et al. 1994]:
- Computing context: connectivity, communication cost, bandwidth, nearby resources (printers, displays, PCs) ...
- User context: user profile, location, nearby people, social situation, activity, mood ...
- Physical context: temperature, lighting, noise, traffic conditions ...
- Temporal context: time of day, week, month, year ...
Context history can also be useful.
Another definition [Abowd & Mynatt]:
Social context: user identity and human partner identities
Functional context: what is being done, what needs to be done
Location context: where it is happening
Temporal context: when it is happening
Motivation context: why it is happening (purpose)
Dictionary definition: "the interrelated conditions in which something exists or occurs"
Definition for pervasive computing: "any parameters that the application needs to perform a task without being explicitly given by the user"
Schilit, B., Adams, N. and Want, R. (1994), Context-aware computing applications. In First International Workshop on Mobile Computing Systems and Applications, pp. 85-90.
GREGORY D. ABOWD and ELIZABETH D. MYNATT (2000). Charting Past, Present, and Future Research in Ubiquitous Computing. ACM Transactions on Computer-Human Interaction, Vol. 7, No. 1, March 2000, Pages 29–58.
An operational context definition
Based on Zimmermann et al. 2007, Proceedings of Context 2007
Definition:
Context is any information that can be used to characterise the situation of an entity (Dey, 2001).
Elements used for the description of context information fall into five categories: individuality, activity, location, time, relations
The activity predominantly determines the relevancy of other context information in specific situations.
Location and time primarily drive the establishing of relations to other entities enabling the exchange of context information among entities.
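The five categories can be sketched as a simple data structure. This is an illustrative sketch only; the field names and types are assumptions, not part of the definition by Zimmermann et al.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Context:
    """Context of an entity along the five categories (illustrative types)."""
    individuality: dict = field(default_factory=dict)  # entity-specific attributes
    activity: Optional[str] = None    # current task/goal; drives relevancy of the rest
    location: Optional[tuple] = None  # e.g. (lat, lon) or a symbolic place name
    time: Optional[float] = None      # e.g. a UNIX timestamp
    relations: list = field(default_factory=list)      # links to other entities

# Hypothetical example: a student attending a lecture
ctx = Context(individuality={"name": "student-42"},
              activity="attending-lecture",
              location=(50.75, 7.20),
              time=1_700_000_000.0,
              relations=["lecturer", "course-peers"])
```

A context-aware application would consult `ctx.activity` first, since the activity determines which of the other elements are relevant in the current situation.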
Elements of context
Context Information: Individuality
captures contextual information strongly related to the entity
several types of entities possible:
- active and passive
- real and virtual
- mobile, movable, stationary
- human, natural, artificial, group entities
Context Information: Time
covers temporal information related to the entity
- current time
- alternative representations
- overlay models: time intervals, recurring events, process-oriented view
- historical context information: access and analyse past contextual information
Context Information: Location
covers spatial information related to the entity
- physical or virtual
- absolute or relative
- quantitative (geometric) and qualitative (symbolic) representations
- overlay models
one entity possesses:
- one physical quantitative location
- several different qualitative locations
- several different virtual locations
Context Information: Activity
covers information about activities the entity is involved in
- described by goals, tasks and actions
- tasks are goal-oriented activities and small, executable units
- task models structure tasks into subtask hierarchies
- goals potentially change very frequently
- low-level and high-level goals
- determines the relevancy of other contextual information
Context Information: Relations
covers information about relations the entity has established to other entities
- expresses semantic dependencies between two entities
- spatio-temporal coordinates of two entities are the key driver
- several relations can be established to the same entity
- each entity plays a specific role in a relation
- static and dynamic relations
several types of relations: social, functional, compositional
Context (cont’d)
Other classifications of context:
- low-level vs. high-level context
- active vs. passive context
Putting it all together:
- gather low-level context
- process and generate high-level context
- separate active from passive context
- adjust
[Figure: sensor data feeds low-level context (individuality, time, relations, location, activity); context processing turns it into high-level context, which the context-aware application consumes as active and passive context]
Context-Aware Application Design
How to take advantage of this context information? Schilit's classification of CA applications:
1. Proximate selection: closely related objects and actions are emphasized or made easier to choose.
2. Automatic contextual reconfiguration: adding/removing components or changing relationships between components based on context, e.g. switching to a different operation mode, or enabling/disabling functionality.
3. Context-triggered actions: rules specify how the system should adapt.
4. Contextual information and commands: produce different results according to the context in which they are issued, either narrowing down or broadening the output to the user using the context.
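Context-triggered actions reduce naturally to a list of (predicate, action) rules evaluated against the current context. A minimal sketch, with all rule contents and state keys being illustrative assumptions rather than anything from Schilit et al.:

```python
# Each rule pairs a predicate over the current context with an action on
# the device state. All names and thresholds here are made up for illustration.

def make_rules():
    return [
        # silence the ringer during calendar events in a meeting room
        (lambda ctx: ctx.get("location") == "meeting-room" and ctx.get("in_calendar_event"),
         lambda dev: dev.update(ringer="silent")),
        # save power when the battery runs low
        (lambda ctx: ctx.get("battery", 100) < 15,
         lambda dev: dev.update(gps="off")),
    ]

def apply_rules(rules, ctx, device_state):
    """Run every rule whose predicate matches the context."""
    for predicate, action in rules:
        if predicate(ctx):
            action(device_state)
    return device_state

state = apply_rules(make_rules(),
                    {"location": "meeting-room", "in_calendar_event": True, "battery": 80},
                    {"ringer": "loud", "gps": "on"})
# the ringer rule fires; the battery rule does not
```

Real rule engines (e.g. Tasker, mentioned later in these slides) follow the same shape, with conditions over sensor-derived context and actions on device settings.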
Problems with processing sensor data
From Junehwa Song. Mobile and Sensor OS. MobiSys 2008/TMC 2010/PerCom 2010
The usual approach
Requires costly operations for:
- continuous data updates from sensors
- continuous context processing
- complex feature extraction and context recognition
- continuous change detection
- repeated examination of numerous monitoring requests
From Junehwa Song. Mobile and Sensor OS. MobiSys 2008/TMC 2010/PerCom 2010
Introducing feedback loops
- Early detection of context changes: remove the processing cost of continuous context recognition
- Utilize the locality of feature data in change detection: reduce processing cost by evaluating queries in an incremental manner
- Turn off unnecessary sensors for monitoring results: reduce energy consumption for wireless data transmission
From Junehwa Song. Mobile and Sensor OS. MobiSys 2008/TMC 2010/PerCom 2010
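The core of the feedback-loop idea can be sketched as a cheap change detector that gates the expensive recognition step: recompute the context only when the feature vector moves beyond a threshold, otherwise reuse the cached result. The threshold and the toy classifier below are stand-ins, not details from Song's papers.

```python
# Run expensive context recognition only when the cheap change detector
# sees the feature vector move; otherwise return the cached context.

def make_monitor(recognize, threshold=0.5):
    last_features, last_context = None, None

    def on_sample(features):
        nonlocal last_features, last_context
        if last_features is not None:
            change = max(abs(a - b) for a, b in zip(features, last_features))
            if change < threshold:
                return last_context, False   # cached, recognition skipped
        last_features = features
        last_context = recognize(features)   # expensive step runs here
        return last_context, True            # recomputed

    return on_sample

# toy recognizer: "moving" when the feature sum is large
monitor = make_monitor(lambda f: "moving" if sum(f) > 1.0 else "still")
monitor((0.10, 0.10))   # first sample: always recomputed
monitor((0.15, 0.10))   # tiny change: cached result, no recognition cost
monitor((2.00, 1.00))   # large change: recomputed
```

The same gate can also decide when to switch sensors off entirely, since a stable feature stream means the monitored result cannot have changed.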
Sensors in mobile devices
Touch screen, several accelerometers, gyroscope, GPS, WiFi, microphone, camera, Bluetooth, light, telephone (call, SMS)
Navigation, browser history, social networks, calendar, contacts, address resolver, music player
Conclusions based directly on sensor data
Sensor data generate first-level observation data.
Examples:
- Accelerometer -> indication that someone might be moving
- Localization + accelerometer -> track of movement activity
- Localization + time -> indication that someone might be moving
- Localization + feedback button -> someone confirms an activity (e.g. the app asks the student to state that he attended a course after attending it)
- Time + light sensor -> indication that someone might be outside
Real-world examples:
- Location + accelerometer + time -> wake-up timer
- Location + time + calendar -> silence the mobile phone, e.g. Tasker
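A first-level observation from a single sensor is usually a simple threshold test on the raw reading. For the accelerometer example, movement can be indicated by the magnitude deviating from gravity; the 1.5 m/s^2 margin below is an illustrative choice, not a value from the slides.

```python
# Direct first-level conclusion: "might be moving" when the accelerometer
# magnitude deviates noticeably from gravity (device at rest measures ~9.81).
import math

GRAVITY = 9.81  # m/s^2

def might_be_moving(ax, ay, az, margin=1.5):
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return abs(magnitude - GRAVITY) > margin

might_be_moving(0.0, 0.0, 9.8)    # device lying still
might_be_moving(3.0, 4.0, 12.0)   # device being carried or shaken
```

Note that this is only an indication, not a certainty: a phone on a vibrating table triggers the same observation, which is why the next step aggregates several sensors.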
Aggregating sensor data to derive conclusions
Combine sensor data to derive second-level observation data.
Examples:
- Location + contacts + Bluetooth log -> buddies near you; buddy phone status
- Location + calendar + time + sound -> identify if in a conversation
- Location + accelerometers -> identify if someone is moving indoors or outdoors
- Time + location + SMS activity -> identify if someone is waiting for someone else
Real-world examples: ContextPhone, VibN, CenceMe, physical activity measurement, time tracking (how to figure out if a task is completed).
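The second example above (location + calendar + sound -> conversation) can be sketched as a conjunction of first-level signals. The place names, the calendar flag and the 40-75 dB "speech-like" band are illustrative assumptions, not values from any of the cited systems.

```python
# Second-level observation: combine place, calendar state and sound level
# into "in a conversation". All thresholds and names are made up.

def in_conversation(location, calendar_busy, sound_level_db,
                    meeting_places=("office", "meeting-room"),
                    speech_db=(40, 75)):
    at_meeting_place = location in meeting_places
    speechlike = speech_db[0] <= sound_level_db <= speech_db[1]
    return at_meeting_place and calendar_busy and speechlike

in_conversation("meeting-room", True, 55)   # all three signals agree
in_conversation("street", True, 55)         # wrong place
in_conversation("office", True, 90)         # too loud for speech
```

The value of aggregation is exactly this conjunction: each individual signal is ambiguous, but their combination is a much stronger indicator.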
The ContextPhone framework
(From 2004/2005: runs on Symbian OS 6 and 7, so really old; now part of Google Jaiku, http://www.jaiku.com/)
Already then, most of today's ideas had been addressed, e.g. using Bluetooth connections to determine how busy an environment is, or accessing the status of a friend's mobile phone:
http://www.cs.helsinki.fi/group/context/
[Figure: screenshots of my phone and a friend's phone]
VibN
Using the microphone to collect environment information: tagging of places with audio and statistics of the people present. (To ensure privacy, voices are removed from the recording.)
http://sensorlab.cs.dartmouth.edu/vibn/
http://www.youtube.com/watch?v=U37G6uzTu5k
Points of interest are identified by sound recording and time of stay. Uses microphone, localization and accelerometers.
Note that the accelerometer is shut down on iOS if the app is in the background (not so on Android).
Good paper showing the implementation: http://sensorlab.cs.dartmouth.edu/pubs/sci906e-miluzzo.pdf
Sound with iOS; sound with Android; sound with HTML5 (careful, some problems)
CenceMe – sensing and sharing presence
Sensing presence captures a user's status in terms of his activity (e.g., sitting, walking, meeting friends), disposition (e.g., happy, sad, doing OK), habits (e.g., at the gym, coffee shop today, at work) and surroundings (e.g., noisy, hot, bright, high ozone).
Use of sensors:
- Accelerometers identify the activity of the user (sit, run, walk, etc.).
- Microphone identifies conversation, quiet place, loud location, etc.
- Localization delivers web-based additional info like weather, etc.
- Access to contacts and calendar provides indications of with whom you are in a conversation.
http://metrosense.cs.dartmouth.edu/projects.html#cenceme
http://cenceme.org/
http://www.youtube.com/watch?v=8rDFbTF47PA
iPhone access to calendar; Android access to calendar
Example problem: Physical Activity Measurement using the iPhone
Task: identify the physical activity in terms of standing, sitting, walking, jogging, moving upstairs and downstairs.
Sensor: accelerometer in the mobile device at different places.
Problem: the place where the mobile device is carried on the body is unclear.
Solution: the best place is the waist. If that is not possible, use transition tables from research, e.g.:
Jennifer R. Kwapisz, Gary M. Weiss, Samuel A. Moore. Activity Recognition using Cell Phone Accelerometers. SensorKDD '10, July 25, 2010, Washington, DC, USA.
Yuichi Fujiki. iPhone as a Physical Activity Measurement Platform. CHI 2010, April 10-15, 2010, Atlanta, Georgia, USA.
Accelerometer on the iPhone; accelerometer on Android
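The standard pipeline in this line of work (e.g. Kwapisz et al.) windows the accelerometer stream, extracts features per window, and classifies. The threshold classifier below is a toy stand-in for the learned models used in the papers; the cut-off values are assumptions for illustration only.

```python
# Window -> features -> classify, the usual activity recognition pipeline.
# pstdev is used because each window is treated as a complete population.
import math
import statistics

def features(window):
    """Mean and standard deviation of per-sample magnitudes (m/s^2)."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    return statistics.mean(mags), statistics.pstdev(mags)

def classify(window):
    mean, std = features(window)
    if std < 0.5:
        return "standing/sitting"   # almost no variation around gravity
    elif std < 3.0:
        return "walking"            # moderate, periodic variation
    return "jogging"                # large swings

# synthetic windows of (x, y, z) samples
still = [(0.0, 0.0, 9.81)] * 50
walk = [(0.0, 0.0, 9.81 + (1.5 if i % 2 else -1.5)) for i in range(50)]
```

A real implementation would replace the thresholds with a classifier trained per body position, which is exactly why the papers above publish transition tables for different carrying places.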
Time tracking – How to...
Solution 1: ask the worker.
Solution 2: (semi-)automatic detection (one possible solution):
- Identify starting and ending events/activities of tasks or assignments
- Ask the user to press a button when starting a task
- Ask the user to define a task in terms of sensor input (change of location, result sent, stop button pressed, participating partners, collaboration events, etc.)
- Integrate with the calendar to ensure pausing at unrelated events
- Integrate with telephone, microphone and calendar to identify ongoing face-to-face collaboration
- Integrate with SMS to detect asynchronous collaboration
- ...
- Use the Facebook timeline to upload/store data and to visualize activities
Advanced sensor data processing to create higher order conclusions
- Emoticon analysis
- Learning resource context
- Basic learning analytics
Emoticon Analysis – Goals and Idea
Detecting positive sentiments from computer mediated communication (CMC) between chat partners to qualify the degree of positivity in a relationship
Positive emoticons in CMC do convey positivity and respective emotions
Take emoticons as a substitute for non-verbal communication. Disregard all verbal information -> ease and speed of processing
Question: Does positivity as calculated by emoticon extraction correlate with sympathy?
Emoticon Analysis: Experimental Setup & Indicators
Extract chats from Skype for test users. Anonymize contacts and user information and store the emoticon parameters in a central DB.
The calculated positivity value combines three indicators:
- PEQ = Positive Emoticon Quotient: relates the positive emoticons per chat session to all chat sessions.
- GEQ = Global Emoticon Quotient: relates the emoticon usage per chat session to all chat sessions.
- EMQ = Emoticon Mimicry Quotient: captures the amount of mimicked emoticons between chat partners.
A scalar weight vector (G) is open for modification.
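The slides name the three quotients and the weight vector G but not the exact formula, so the sketch below assumes a weighted sum with made-up weights and an illustrative per-session record shape; the real normalization may differ.

```python
# Hedged sketch: positivity as a weighted sum of PEQ, GEQ and EMQ.
# The weights in G and the quotient definitions are assumptions.

def positivity(peq, geq, emq, G=(0.5, 0.3, 0.2)):
    """Combine the three quotients with the (modifiable) weight vector G."""
    return G[0] * peq + G[1] * geq + G[2] * emq

def quotients(sessions):
    """sessions: list of per-session counts, illustrative record shape:
    {'emoticons': int, 'positive': int, 'mimicked': int, 'messages': int}"""
    total = sum(s["emoticons"] for s in sessions)
    peq = sum(s["positive"] for s in sessions) / max(total, 1)   # share of positive emoticons
    geq = total / max(sum(s["messages"] for s in sessions), 1)   # emoticons per message
    emq = sum(s["mimicked"] for s in sessions) / max(total, 1)   # share of mimicked emoticons
    return peq, geq, emq

chat = [{"emoticons": 4, "positive": 3, "mimicked": 1, "messages": 20},
        {"emoticons": 6, "positive": 3, "mimicked": 2, "messages": 30}]
peq, geq, emq = quotients(chat)
score = positivity(peq, geq, emq)
```

Keeping G as a parameter mirrors the slide's point that the weight vector is open for modification, e.g. to tune how strongly mimicry counts.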
Emoticon Analysis: Evaluation & Results
Questionnaires for participants (N=6):
- Top-ten ranking of Skype contacts, with pseudonyms to guarantee anonymity
- Pairs of partners built to detect differences in relationship interpretation
Results:
- The algorithm's calculated top-ten ranking includes 50% of the most sympathetic Skype contacts.
- Pairing leads to very interesting results, showing that emoticon use and mimicry can differ widely in chat communication, hinting at personal tendencies and inequalities in relationships.
Paradigmatic Relations
[Figure: two usage contexts UC1 and UC2, each consisting of pre-contexts and post-contexts around the object]
Background (corpus linguistics): words that occur in similar contexts are commonly semantically related. Example: beer and wine.
Research question: do (learning) objects with similar usage contexts have similar content?
Approach: each object holds a usage context profile comprising all its usage contexts. A usage context (UC) consists of a pre-context and a post-context.
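The approach can be sketched as follows: represent each object by the set of neighbours seen in its pre- and post-contexts, and score the overlap between two profiles. The Jaccard coefficient below is an illustrative stand-in; the similarity measure actually used on the MACE/CAM data is not specified in the slides.

```python
# Usage context profiles and a simple overlap score (illustrative).

def usage_profile(occurrences, window=2):
    """occurrences: list of (sequence, position) where the object appears;
    the profile collects neighbours from the pre- and post-context."""
    profile = set()
    for seq, pos in occurrences:
        profile.update(seq[max(0, pos - window):pos])  # pre-context
        profile.update(seq[pos + 1:pos + 1 + window])  # post-context
    return profile

def context_similarity(profile_a, profile_b):
    """Jaccard overlap of two usage context profiles."""
    if not profile_a or not profile_b:
        return 0.0
    return len(profile_a & profile_b) / len(profile_a | profile_b)

# hypothetical course sequences containing learning objects A and B
course1 = ["intro", "A", "exercise", "quiz"]
course2 = ["intro", "B", "exercise", "summary"]
pa = usage_profile([(course1, 1)])
pb = usage_profile([(course2, 1)])
sim = context_similarity(pa, pb)  # A and B share part of their usage contexts
```

High `sim` for A and B would suggest, by analogy to the corpus linguistics result, that the two objects are semantically related even if their metadata never says so.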
Paradigmatic Relations
First results using CAM collected in the MACE project:
- Medium correlation between metadata similarity and object context similarity (0.32), significant due to the large sample size (> 65,000,000 object pairs).
- Manual comparison: 92% of the 100 object pairs with the highest object context similarity are strongly related.
- The found context similarity was in many cases not entailed in the metadata.
PPP – Data Collection
- Engineering program at Universidad Carlos III de Madrid
- C programming course from Sep 6 - Dec 16, 2010 (244 students) and Sep 5 - Oct 19, 2011 (342 students)
- virtual machine with all tools needed, configured by teaching staff
- learning management system (.LRN, then Moodle) for forums, course material, etc.
- reminder about data collection at every start of the VM (should be used for course-related work only)
- existence of a concrete folder functions as a switch (students can move it easily)
- people in charge can be contacted; requests for insight and deletion are possible
PPP – Analysis and Results
Extracting key actions to identify user patterns and tendencies throughout the whole course: as keywords semantically represent the text they are taken from, key actions represent the session they are taken from.
Year 1: ~120,000 events and 19 event types
- visualization of key actions showed key action sequences, clearly pointing to corrective actions to be deployed
- analysis of errors also showed problems to discuss in class
Year 2: ~125,000 events and 34 event types
- teachers consider key actions a very useful form of data distillation and use the results for course evaluation
- teachers liked getting better information from the key actions than from the logs themselves
PPP – Example Visualizations Year 1
[Figure: error counts per day from 12/9/2010 to 29/11/2010, y-axis 0-400; series: number of times the error occurred, number of students getting the error]
PPP – Example Visualizations Year 2
A final word...
About social media apps: used to communicate context, used to consume context. Respect privacy and ensure the security of the data.
Don't be overly ambitious:
- Semi-automatic rule-based volume control is an app that sells for 6 US-$. Don't try to duplicate it; use it (if possible).
- Joint to-do lists including calendar access already exist, e.g. Family Organizer.
Follow the principles of architecture design: copy and improve rather than re-invent.
Questions?