
UNIVERSITY OF OSLO
Department of Informatics

CommonSens
A Multimodal Complex Event Processing System for Automated Home Care

PhD thesis

Jarle Søberg


© Jarle Søberg, 2011

Series of dissertations submitted to the Faculty of Mathematics and Natural Sciences, University of Oslo
No. 1089
ISSN 1501-7710

All rights reserved. No part of this publication may be reproduced or transmitted, in any form or by any means, without permission.

Cover: Inger Sandved Anfinsen.
Printed in Norway: AIT Oslo AS.

Produced in co-operation with Unipub. The thesis is produced by Unipub merely in connection with the thesis defence. Kindly direct all inquiries regarding the thesis to the copyright holder or the unit which grants the doctorate.


Abstract

Automated home care is an application domain of rapidly growing importance for society. By applying sensors in the homes of the elderly it is possible to monitor their well-being and activities of daily living (ADLs). Automated home care increases quality of life by letting the elderly, i.e., the monitored person, live in a familiar environment. In this thesis we present CommonSens, a multimodal complex event processing (CEP) system for detecting ADLs from sensors in the home. CommonSens is modelled and designed to simplify the work of the application programmer, i.e., the person who writes the queries that describe the ADLs.

System supported personalisation simplifies the work of the application programmer, and CommonSens adapts to any environment and sensor configuration. During an instantiation phase CommonSens analyses queries and uses late binding to select the sensors in the environment that are relevant for the query. In order to realise the personalisation, CommonSens is based on three separate models: (1) an event model to identify states and state transitions that are of interest in the real world, (2) a sensor model to describe the capabilities of physical and logical sensors, their coverage and the signal types they use, and (3) an environment model to describe the physical dimensions of the environment and the impact they have on various signal types. In order to approximate coverage of locations of interest (LoIs) in the home, CommonSens uses multimodality and can combine readings from different sensor types. In addition to traditional query processing, CommonSens supports the concept of deviation detection, i.e., the queries are interpreted as statements, or rules, which describe the desired behaviour as events. When CommonSens detects that these rules are not followed, it sends a notification about the deviation.

Through the implementation of CommonSens we evaluate three claims: CommonSens (1) detects complex events and deviations, (2) processes data tuples in near real-time, and (3) is easy to use and provides personalisation. We show these claims by using simulations based on synthetic workload and trace files, as well as real-world experiments using real sensors and real-time CEP. We show that CommonSens provides personalisation by instantiating the queries differently depending on the current environment.



Acknowledgements

First of all, I would like to thank my excellent and conscientious supervisors Professor Dr. Vera Goebel and Professor Dr. Thomas Plagemann.

Second, I would like to thank the whole gang at the Distributed Multimedia Systems research group. I hope I will still be allowed to join the salary beers. I would especially like to thank Azadeh Abdolrazaghi and Dr. Sebastien F. Mondet for proofreading and asking insightful questions about things I had never thought of. I would also like to thank Viet Hoang Nguyen for helping me with the MICAz experiments. Through the years, I have also had many interesting discussions with Morten Lindeberg. Some were also related to research. Dr. Matti Siekkinen and Dr. Katrine Stemland Skjelsvik have helped me with their research experience, and have also contributed in joint work. I also want to thank Radioresepsjonen for their podcasts. They helped me to fall asleep during periods of high stress and to laugh out loud in situations where such behaviour is considered rather eccentric, e.g. when sitting on the bus during rush hour.

Finally, I would like to thank my wonderful wife Ingerid Skjei Knudtsen and my family (also wonderful).



Contents

Abstract

Acknowledgements

1 Introduction
  1.1 Problem Statement
  1.2 A Brief Look at Additional Issues
  1.3 Claims, Methods and Approach
  1.4 Contributing Papers
  1.5 Structure of the Thesis

2 Background and Related Work
  2.1 Automated Home Care
    2.1.1 Roles
    2.1.2 Requirement Analysis
  2.2 Sensor Technology and Sensor Models
  2.3 Events
    2.3.1 Event Models
    2.3.2 Query Languages
    2.3.3 Complex Event Processing and Personalisation
    2.3.4 Spatial Issues
    2.3.5 Deviation Detection
  2.4 Discussion and Conclusion

3 CommonSens Data Model
  3.1 Event Model
  3.2 Environment Model
  3.3 Sensor Model
  3.4 Query Based Event Language
    3.4.1 Semantics
    3.4.2 Syntax
  3.5 Discussion and Conclusion

4 Instantiation and Event Processing
  4.1 Sensor Placement
    4.1.1 Coverage Area Calculation
    4.1.2 Sensor Placement
  4.2 Query Instantiation
  4.3 Event Processing
    4.3.1 Query Evaluator
    4.3.2 Query Pool
    4.3.3 Data Tuple Selector
  4.4 Discussion and Conclusion

5 Implementation
  5.1 Overview
    5.1.1 Environment Model
    5.1.2 Sensor Model
    5.1.3 Query Language
  5.2 Functionality
    5.2.1 System Control
    5.2.2 Physical Sensor Creation and Placement
    5.2.3 Event Processing Model Creation
    5.2.4 Event Processing
  5.3 Discussion and Conclusion

6 Evaluation
  6.1 Detecting Complex Events and Deviations
    6.1.1 Functionality Tests
    6.1.2 Real-world Evaluation
    6.1.3 Trace File Evaluation
  6.2 Scalability and Near Real-Time Event Processing
  6.3 Personalisation and User Interface
  6.4 Discussion and Conclusion

7 Conclusion
  7.1 Summary of Contributions
  7.2 Critical Review of Claims
  7.3 Open Problems and Future Work
    7.3.1 Open Problems
    7.3.2 Future Work

Bibliography

A Appendix
  A.1 calculateError
  A.2 reduceRay
  A.3 Functionality Tests Configuration
    A.3.1 Data Files from the Last Experiment in Section 6.1.2
  A.4 Trace Files from Cook and Schmitter-Edgecombe



List of Figures

3.1 The relation of the core elements in our conceptual model of the real world.
3.2 Concurrent and consecutive atomic events.
3.3 Overview of the V, N and E sets.
3.4 Example of how a wall reduces the coverage area of a camera.
3.5 Examples of capability hierarchies for detecting falls and taking medication.
3.6 Examples of allowed sequences in a P-registered query.
3.7 Our query language as written in EBNF.
4.1 Life cycle phases and concepts of CommonSens.
4.2 Signals that are sent through the objects in two directions and which create intervals.
4.3 The REDUCE algorithm.
4.4 Coverage range divided into several intervals with different permeability values.
4.5 A model of a circle with a set of rays.
4.6 The rays are affected by an object and the coverage area is reduced.
4.7 Using physical sensors to approximate LoIA.
4.8 Examples of relations between sensors that give equivalent results.
4.9 The FINDSENSOR algorithm.
4.10 Overview of the query processor in CommonSens.
5.1 Key classes in the environment package.
5.2 Key classes in the sensing package.
5.3 Key classes in the language package.
5.4 The parsed version of IqC1.
5.5 Key classes in the modelViewController package.
5.6 Main window in CommonSens.
5.7 Environment creator in CommonSens.
5.8 Classes involved in the calculation of reduced coverage area.
5.9 Before and after the reduceRay() method has been called.
5.10 Key classes in the eventProcessor package.
5.11 Instantiation of a box with atomic queries.
5.12 Overview of the event processing phase.
5.13 Mixed matching versus uniform matching.
6.1 Environment instances used in functionality tests.
6.2 LoIs used in functionality tests.
6.3 Movement pattern classes used in functionality tests.
6.4 Nine possible locations in the environments.
6.5 The environment in CommonSens and in the real world.
6.6 Comparison of received and calculated signal strength.
6.7 Real-world experiments with only one camera covering LoI1 and LoI2.
6.8 Real-world experiments with three cameras covering LoI1 and LoI2.
6.9 Overview of the hallway and location of cameras.
6.10 Results from the hallway experiment.
6.11 Processing time with 6, 66, 126, 186, and 246 sensors in the environment.
6.12 Processing time with an increasing number of concurrent queries.
6.13 Average processing time for atomic queries in the functionality tests.
6.14 Two environments with different setup.
6.15 Excerpts from the hallway with the new LoI Hallway.
6.16 Results from the four LoIs that are turned into Hallway.



List of Tables

6.1 Workload types and the sections in which they are used.
6.2 Return values from the functionality tests and their meaning.
6.3 Mapping between movement pattern classes and movement patterns.
6.4 Results from functionality tests 178 to 182.
6.5 Results from functionality tests 172 and 173.
A.1 Regression tests.
A.2 Complex queries cq1.qry to cq23.qry, which are used in the regression tests.
A.3 Complex queries cq24.qry to cq34.qry, which are used in the regression tests.
A.4 Complex queries cq35.qry to cq74.qry, which are used in the regression tests.
A.5 Regression test results.



Chapter 1

Introduction

The increasing ratio of elderly people in the world requires alternatives to the traditional home care that we have today, since this changing ratio means that there are many more persons to be taken care of and fewer persons to perform this task. Hence, there is a need for alternatives that automate the home care application domain. Recent developments in sensor technology have paved the way for these alternatives. Automated home care, or ambient assisted living, consists of placing sensors in the homes of the elderly and using systems that obtain and process the information from the sensors. If the system detects alarming situations, it can send notifications either to the person being monitored or to the helping personnel. Automated home care systems can raise the threshold for hospitalisation and for placing elders in retirement homes. Instead of being placed in retirement homes, the elderly can live in their own homes and be monitored by sensors. This means they can live in familiar environments, i.e., their homes, while feeling safe. The placement and capabilities of the sensors are customised for the well-being of the elderly.

Sensor readings are low-level data created by converting analogue signals to digital values. Even though these signals are the foundation of the information about the elderly, it is important to have a framework that manages all the sensor signals and filters out the signals that are not interesting. In addition, the framework must have a simple higher-level interface that abstracts away the low-level sensor data. One possible approach is to use the concepts from complex event processing (CEP). CEP is a technology that allows us to write declarative queries that are used to continuously filter interesting events from a stream of data tuples from sensors.
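To make this idea concrete, the following minimal Java sketch shows the core of such continuous filtering: a registered query is evaluated against every tuple that arrives on the stream. All class and field names are our own illustration, not the CommonSens implementation.

    import java.util.List;
    import java.util.function.Predicate;

    // Minimal sketch of continuous filtering in a CEP engine.
    // All names are illustrative, not taken from CommonSens.
    final class DataTuple {
        final String sensorId;
        final double value;
        final long timestamp;
        DataTuple(String sensorId, double value, long timestamp) {
            this.sensorId = sensorId; this.value = value; this.timestamp = timestamp;
        }
    }

    final class ContinuousQuery {
        private final Predicate<DataTuple> filter;
        ContinuousQuery(Predicate<DataTuple> filter) { this.filter = filter; }

        // Called for every tuple on the stream; true means "interesting event".
        boolean matches(DataTuple t) { return filter.test(t); }
    }

    public class CepSketch {
        public static void main(String[] args) {
            // "Report when the kitchen temperature exceeds 50 degrees C."
            ContinuousQuery q = new ContinuousQuery(
                    t -> t.sensorId.equals("kitchen-temp") && t.value > 50.0);

            List<DataTuple> stream = List.of(
                    new DataTuple("kitchen-temp", 21.0, 1L),
                    new DataTuple("kitchen-temp", 55.0, 2L));

            for (DataTuple t : stream) {
                if (q.matches(t)) {
                    System.out.println("event at t=" + t.timestamp);
                }
            }
        }
    }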

In this thesis we present CommonSens, a multimodal CEP system for automated home care. CommonSens detects complex events and deviations from complex events. In addition, CommonSens provides personalisation, i.e., it adapts a query plan to the current home, sensors and monitored person. Personalisation is an important contribution to automated home care, because there are many similarities between the instances and one type of event may apply to many different persons. Fall detection and medication taking, i.e., making sure that the person remembers to take his medication, are examples of such common, yet important, events. By personalising, it is possible to reuse or alter queries that describe the same events. Personalisation can be done by adapting the event description to the environment, the available sensors, and the needs of the monitored person.

Automated home care today is either based on closed and proprietary solutions, or there is limited communication between the sensors, i.e., the water leakage sensor starts beeping when it recognises water and the motion detector starts beeping if it is night time and someone is in the living room. Our goal is to achieve an open solution that uses simple, comprehensive and domain-related models. As part of the personalisation goal, the models facilitate reuse of model instances and separation of concerns, which are important design principles in computer science.

1.1 Problem Statement

In automated home care there are several issues that have to be investigated. When using an open system, one cannot expect a homogeneous set of sensors. If the system were proprietary, we could have defined explicitly how the sensors should be configured. We could also have decided that only certain types of sensors could be used for detecting given events. For instance, only one type of camera could be allowed, simplifying the process considerably. With an open system, we cannot make such assumptions. The set of sensors is expected to be heterogeneous, and given heterogeneity, we need models that address the important properties of the sensors that are used in our domain. In order to handle heterogeneity, it might be easier to address what type of events we want instead of addressing the sensors directly. The system identifies the sensors that are most appropriate by performing a late binding between the event type and the sensors that are located in the home.

The homes and the persons in the homes are different as well. Therefore, it is important to have a system that manages to adapt to the various instances, i.e., homes, sensors, and persons. It is unrealistic to have one simple system that is completely reconfigured for each home. Such work consumes too many resources and takes too much time. However, there are similarities between homes and persons as well. Homes consist of the same types of rooms and areas that might be interesting to monitor. Persons might suffer from similar conditions. These aspects have to be addressed and used as part of the system deployment.

The issue of sensor placement has to be considered as well. For instance, the coverage areas of sensors are affected by the environment. A concrete wall reduces the coverage of a camera since the camera uses light, whereas a radio-based sensor can send signals that pass through the wall. Given the fact that the sensors are heterogeneous, it is important that there exist ways of modelling any coverage area and still be able to use the system to investigate how these sensors should be placed in the home. This is done in order to obtain as much information as possible. Sometimes we also need to combine the information from several heterogeneous sensors in order to obtain sufficient information about the monitored person, for example his location.

In automated home care, there are issues regarding networking and data collection. Relying on only one type of communication simplifies the data collection. On the other hand, it reduces the possibilities of using other types of communication. For instance, having only wired sensors in the home is simple. However, if the monitored person wears a sensor, the communication has to be wireless as well. Therefore, a system for automated home care has to be general and has to support all possible types of networking and data collection.

Heterogeneous sensors and homes complicate the data processing, and, as noted above, it is not possible to use an automated home care system that does not take these issues into consideration. Sensors only report lower-level information, and it is not sufficient to rely on only one type of sensor. Therefore, it is imperative that the system manages to handle the lower-level sensor readings, and combines and relates these readings into meaningful events. This can be achieved by using a CEP system. However, choosing a CEP system to solve these tasks is a challenge, since there exists no CEP system that handles all the issues related to heterogeneity, sensor placement, networking, data collection, and processing. On the other hand, CEP is a promising technology, since it allows us to write queries that describe the observed behaviour of the monitored persons.

Using the concepts from CEP also simplifies the work of the application programmer. The application programmer does not necessarily have a computer science background. Hence, lower-level sensor programming is not an acceptable solution. In addition, we assume a wide deployment of automated home care solutions in the years to come. Therefore, it is important to provide a simple and declarative query language that allows this wide deployment of queries. By utilising many of the similarities in the home care application domain it is possible to provide the simple and declarative query language and still provide personalisation, i.e., adapt a query plan to the current home, sensors and monitored person. This means that the CEP system has to handle the issues that we have described above, while still simplifying the work for the application programmer. The discussion leads us to the following problem statement:

In automated home care, there is a need for a CEP system that handles the domain-specific challenges related to automated home care. The CEP system should simplify the work for the application programmer, and automatically handle personalisation by adapting to the current sensors and homes.

The application programmer should be part of the planning, the query writing, and the sensor placement. This means that the work done by the application programmer is very important for the well-being of the elderly person. In order to further simplify the work for the application programmer, we have to investigate alternatives to traditional query processing. Traditional query processing consists of writing queries that describe what we want. For instance, if the application programmer wants to detect falls, he has to describe the fall through a query. An alternative is to use deviation detection, i.e., the application programmer writes queries that describe expected behaviour. Only if there are deviations from the expected behaviour does CommonSens send a notification.
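The deviation detection idea can be sketched as follows. This is a minimal illustration under assumed names, not the CommonSens API: the query describes the expected behaviour inside a time window, and a notification is produced only when the window closes without the expected event having been observed.

    // Sketch of deviation detection: stay silent while expectations are met,
    // notify only when the window expires without the expected event.
    final class DeviationMonitor {
        private final String expectedEvent;   // e.g. "medication-taken"
        private final long windowEndMillis;   // e.g. today at 09:00
        private boolean observed = false;

        DeviationMonitor(String expectedEvent, long windowEndMillis) {
            this.expectedEvent = expectedEvent;
            this.windowEndMillis = windowEndMillis;
        }

        void onEvent(String event, long timestamp) {
            if (event.equals(expectedEvent) && timestamp <= windowEndMillis) {
                observed = true;   // expected behaviour happened; no output
            }
        }

        // Called when the window expires; only a deviation triggers a message.
        void onWindowClose() {
            if (!observed) {
                System.out.println("DEVIATION: " + expectedEvent + " did not occur");
            }
        }

        public static void main(String[] args) {
            DeviationMonitor m = new DeviationMonitor("medication-taken", 9_000L);
            m.onEvent("door-opened", 1_000L);  // unrelated event, ignored
            m.onWindowClose();                 // prints the deviation notification
        }
    }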

1.2 A Brief Look at Additional Issues

Automated home care is a large application domain, and in this section we present two of the many issues that are too broad for the scope of this thesis. The first important issue is related to the correctness of the sensor readings, i.e., how to obtain correct information from the sensors. Sensor readings can be erroneous and noisy, which means that one has to apply signal processing techniques to the readings before the results can be presented. However, in our work we assume that all the sensor signals and the information we obtain from the sensors are correct. For instance, when the temperature sensor reports that it is 21°C we assume that this is the actual temperature. On the other hand, this assumption may compromise the immediate utilisation of certain sensors, i.e., we depend on a layer between the sensor and the creation of data tuples which processes and corrects the signals before the values are reported. Despite this assumption, we show in Chapter 6 that it is still possible to perform real experiments with real sensors.

The second important issue is related to the psychosocial effects that constant monitoring might have on people, especially privacy issues related to the fact that the monitoring is performed inside homes. In addition, monitoring can be used without the consent of the individuals. For instance, people who suffer from dementia might not be aware that they are monitored. Another example is that sensor readings might be distributed to people who should not have access to the data. Our field of expertise is not psychosocial issues and privacy considerations. However, given the way CommonSens is designed, we implicitly take some precautions. For instance, we want much of the data from the sensors to be processed by a computer that is located inside the home. Only relevant information should leave the home, i.e., notifications about the events or deviations from events that the application programmer has defined. This means that raw sensor data is never visible to humans. For instance, video streams are never sent out of the home, only the notifications, if any.

1.3 Claims, Methods and Approach

State of the art in automated home care consists of systems that do not integrate concepts like event processing, sensors, and sensor placement. For instance, research on event processing in automated home care [SBR09] does not discuss the practical issues that follow when the sensors are placed in the home, and how objects like walls and furniture affect the sensor readings. These issues are discussed in other work [BF07]; however, the authors do not discuss how this information can be used to place sensors more appropriately in the environment. In our work, we integrate models for events, sensors, and the environment the sensors are placed in. This gives a holistic view of the domain and the issues involved. This also includes providing a query language that extends related work by facilitating personalisation and by letting the application programmer query complex events and deviations from complex events.

In this section we present our claims, scientific methods and approach. They are derived from the problem statement and depend on each other. We argue for the importance of our claims, which are related to the requirements (presented in Chapter 2). Our claims are as follows:

1. CommonSens detects complex events and deviations. It is important that the event processing works as intended. This happens when CommonSens manages to correctly identify sensor readings that match a set of queries. Although we assume that the sensor readings are correct, the processing of the sensor readings has to be correct as well, and this has to be handled by CommonSens. The application programmer might want to be notified about complex events, which means that a single event is not sufficient to create a notification. Complex events typically consist of sets of events that can be described through logical operators, concurrency classes and consecutive relations. A complex event might be that the monitored person opens the kitchen door and turns on the light before making breakfast. While the monitored person makes breakfast, he also has to be located in the kitchen (a sketch of such a composition follows this list). If one of these events is not discovered and processed correctly, the processing of the whole complex event might fail since the event processing depends on this one single event. Traditional complex event processing consists of writing queries that describe the complex events whose occurrence we want to know about [esp]. Deviation detection, on the other hand, turns the whole process upside down. Only if there are deviations from the complex events does CommonSens send a notification. This is a strong addition to traditional complex event processing, and simplifies the work of the application programmer, since he does not have to write queries about everything that can go wrong. It is sufficient to describe the expected events. If the event processing does not work correctly, i.e., single events might not be processed correctly, a consequence is that many unnecessary notifications are sent.

2. CommonSens processes data tuples in near real-time. It is important that CommonSens returns notifications about the events as soon as possible after the events have occurred, i.e., that the event processing happens in near real-time. If the event processing is too slow, it may happen that not all sensor readings can be obtained, or they might be lost due to buffer overloads. This can result in situations where events are not detected even if they occurred. This is related to Claim 1; all the events have to be processed correctly.

3. CommonSens simplifies the work for the application programmer and provides personalisation. As stated in the problem statement, CommonSens is designed with personalisation and simplicity in mind. Personalisation allows the application programmer to make small changes to already existing queries. These small changes adapt a query to a new environment, other types of sensors and other persons. Related work provides personalisation as well [WT08], but does not include all the issues related to adapting to different homes and sensors. The deviation detection mentioned in Claim 1 also simplifies the work of the application programmer, and the implementation of CommonSens allows the application programmer to create virtual environments, test sensor placement and emulate the event processing as well.
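The breakfast example from Claim 1 can be sketched as the composition below. The event types, timestamps and operator names are our own illustration; the operators actually supported by CommonSens are defined in Chapter 3.

    import java.util.List;

    // Sketch of a complex event: three consecutive atomic events, with one
    // additional event required to hold concurrently with the last of them.
    record AtomicEvent(String type, long start, long end) {}

    class ComplexEventSketch {
        // True if each event starts only after the previous one has ended.
        static boolean consecutive(List<AtomicEvent> seq) {
            for (int i = 1; i < seq.size(); i++) {
                if (seq.get(i).start() < seq.get(i - 1).end()) return false;
            }
            return true;
        }

        // True if 'during' spans the whole duration of 'outer'.
        static boolean concurrent(AtomicEvent outer, AtomicEvent during) {
            return during.start() <= outer.start() && during.end() >= outer.end();
        }

        public static void main(String[] args) {
            AtomicEvent openDoor  = new AtomicEvent("open-kitchen-door", 0, 5);
            AtomicEvent lightOn   = new AtomicEvent("kitchen-light-on", 6, 7);
            AtomicEvent breakfast = new AtomicEvent("make-breakfast", 8, 30);
            AtomicEvent inKitchen = new AtomicEvent("located-in-kitchen", 8, 30);

            boolean match = consecutive(List.of(openDoor, lightOn, breakfast))
                         && concurrent(breakfast, inKitchen);
            System.out.println("complex event detected: " + match);  // true
        }
    }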

We achieve our claims through methods like modelling, designing, implementing and experimentally evaluating CommonSens. The modelling part consists of defining a set of models for events, environments and sensors. By using models, we can identify the important properties that apply to our application domain. For instance, it is important to know about the coverage area of a sensor and how the objects in the environment, e.g. walls, affect the coverage area when the sensor is placed there. In order to simplify the work of the application programmer, we define a declarative query language that uses abstractions instead of directly addressing sensors. This means that the application programmer only needs to address locations of interest (LoIs), e.g. the kitchen, the type of events he wants from the kitchen, and the temporal properties, i.e., when and for how long the events are supposed to occur. The application programmer can use the language to describe that events should occur concurrently and consecutively. In addition, the application programmer can state that the query processor should only return a notification if there are deviations from the query. CommonSens automatically instantiates a query by investigating the environment and the sensors in the environment and performs late binding between the query and the sensors. This is done by only addressing capabilities in the queries. CommonSens also detects if sensors that provide these capabilities are available in the current home.

CommonSens is designed as a complex event processing system, i.e., its main purpose is to efficiently process data tuples that it pulls from the sensors. In order to avoid a large number of data tuples, CommonSens only pulls those sensors that are relevant for the current queries. When the set of data tuples matches a complex query, a notification is sent. If the application programmer has stated that he is only interested in deviations instead, a notification is sent only if the query is not matched.

The proof-of-concept implementation of CommonSens supports most of the concepts that are defined during modelling and design. We evaluate our claims by testing use-cases, queries, environments, sensors and personalisation in our implementation. In addition to emulating user patterns and sensors, our implementation allows us to perform experiments with real sensors. This shows that the concepts that CommonSens is based on can be utilised in real scenarios and not only through simulations.
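A minimal sketch of the late binding step, under assumed names: a query states only a capability and a LoI, and the system selects whichever available sensors provide that capability and cover that LoI. The algorithm CommonSens actually uses, FINDSENSOR, is presented in Chapter 4.

    import java.util.List;

    // Sketch of late binding between a query and the sensors in a home.
    // The query never names a sensor; it names what it needs.
    record Sensor(String id, String capability, List<String> coveredLoIs) {}

    class LateBindingSketch {
        static List<Sensor> bind(List<Sensor> available, String capability, String loi) {
            return available.stream()
                    .filter(s -> s.capability().equals(capability))
                    .filter(s -> s.coveredLoIs().contains(loi))
                    .toList();
        }

        public static void main(String[] args) {
            List<Sensor> home = List.of(
                    new Sensor("cam-1", "detect-person", List.of("kitchen")),
                    new Sensor("rfid-1", "detect-person", List.of("hallway")),
                    new Sensor("temp-1", "temperature", List.of("kitchen")));

            // Instantiating "detect a person in the kitchen" binds to cam-1.
            System.out.println(bind(home, "detect-person", "kitchen"));
        }
    }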

1.4 Contributing Papers

In order to achieve our claims, we have made important contributions to the automated home care application domain. Our contributions are as follows:

• We define models for events, environments and sensors and show how these models facilitate personalisation through our query language.

• We introduce a way to detect events by letting CommonSens make use of the sensor placement and the environment properties.

• We introduce the concept of deviation detection as an alternative to explicitly describing dangerous and critical situations through queries.

These three contributions are presented in the following four papers, which form the basis of this thesis. Below we briefly summarise each of the four papers.

• To Happen or Not to Happen: Towards an Open Distributed Complex Event Processing System [SGP08]. This paper paves the way for CommonSens by introducing our declarative and multimodal approach. In addition, we stress the importance of support for consecutive and concurrent events, as well as deviation detection. Being an early contribution in our portfolio of papers, the paper discusses issues like distribution and state automata, which we have not included in the final version of CommonSens.

• CommonSens: Personalisation of Complex Event Processing in Automated Home Care [SGP10a]. In this paper we present our models and show how CommonSens supports personalisation. We evaluate personalisation by using two environments and showing how the same query can be instantiated in both, even though the two environments use different types of sensors. We believe that the personalisation simplifies the work of the application programmer since he only needs to make small changes to the queries. In addition, the queries address LoIs, which is an abstraction that applies to many homes.



• Detection of Spatial Events in CommonSens [SGP10b]. Spatial events occur in the home and are detected by sensors that cover certain areas. We show how to combine readings from several sensors in order to approximate LoIs in the home. In addition, we show how CommonSens can use different propagation models to determine how the coverage areas of the sensors are affected by objects in the environment (a sketch of such a model follows this list). The application programmer can model these environments on a computer and find optimal sensor placement before placing the sensors in the home. This simplifies the work of the application programmer since it allows him to simulate different situations in the environment.

• Deviation Detection in Automated Home Care Using CommonSens [SGP11]. Many systems focus on describing the dangerous situations that can occur in the home, e.g., that the monitored person falls or does not get up in the morning. In this paper, we show that it is possible to turn this issue around. Instead of describing everything that can go wrong, we focus on describing the expected events instead. Only when the expected events do not occur might something be wrong and need special attention. This contribution simplifies the work of the application programmer since he only has to explicitly describe the expected events. CommonSens automatically detects the deviations.
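To illustrate what a propagation model computes, the sketch below uses a standard log-distance path-loss model with a per-wall attenuation term. The model choice and all constants are our illustration, not necessarily the models used in the paper.

    // Received signal strength under a log-distance path-loss model with a
    // per-wall attenuation term. All constants are illustrative.
    class PathLossSketch {
        /**
         * @param txPowerDbm  transmit power in dBm (relative to a 1 m reference)
         * @param d           distance from the sender in metres
         * @param pathLossExp path-loss exponent n (2 in free space, higher indoors)
         * @param walls       number of walls the signal passes through
         * @param lossPerWall attenuation per wall in dB
         */
        static double receivedPowerDbm(double txPowerDbm, double d,
                                       double pathLossExp, int walls,
                                       double lossPerWall) {
            double pathLoss = 10.0 * pathLossExp * Math.log10(d);
            return txPowerDbm - pathLoss - walls * lossPerWall;
        }

        public static void main(String[] args) {
            // 0 dBm sender, 5 m away, indoor exponent 3, one concrete wall.
            System.out.printf("%.1f dBm%n", receivedPowerDbm(0.0, 5.0, 3.0, 1, 10.0));
        }
    }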

1.5 Structure of the Thesis

In Chapter 2 we present the background for our work and relate our contributions to other work. We argue for automated home care by referring to documentation of the increasing ratio of elderly people. We present the requirements for successful deployment of automated home care systems. We discuss sensor technology and different concepts related to events, and point out the shortcomings of related work. In Chapter 3 we present our models for events, environments and sensors. In addition, we introduce our query language. Chapter 4 shows how the models are used and how they interact. The context of the presentation is related to the life cycle of CommonSens, from utilisation of model instances to placement of sensors, instantiation of queries and complex event processing. The concepts of CommonSens are realised through a running implementation, which is described in Chapter 5. Our three claims are evaluated in Chapter 6. Finally, we conclude and address future work in Chapter 7.



Chapter 2

Background and Related Work

In this chapter we describe the technologies this thesis builds on. Throughout the chapter, we relate our contributions, i.e., models for events, sensors, and environments, a query language, detection of spatial events, and deviation detection, to other work. First, we give an overview of automated home care. We define relevant terms and roles for this application domain. We discuss the user and system requirements that we consider relevant for our problem statement. Second, we discuss sensor technology and give an overview of different types of sensors. We also relate our sensor model to related work. Finally, we present the concept of events, which plays an important part in our work. The section discusses topics like event models and query languages. In addition, the section discusses complex event processing, spatial issues, i.e., detecting spatial events, and deviation detection. For each of these topics, we also refer to related work and point out where our contributions extend the current state of the art in the field.

2.1 Automated Home Care

In 2009, the U.S. Department of Health and Human Services issued a report concerning the ageing of the world population [KH09]. In the whole world the proportion of the older population increases, and by 2040 it is estimated that 14 percent of the global population will be aged 65 and older. Today, the developed world has the largest population of people aged 65 and older. However, it is estimated that the developing countries will follow, and between 2008 and 2040, several developing countries will experience a growth of over 100 percent concerning people in that age span. These numbers show that there will be challenges related to taking proper care of the elderly, since care giving requires that there are enough people and resources available. Given this uneven distribution of ages, care giving institutions have to change and investigate alternative solutions in order to maintain the quality of life for the ageing population.

A possible solution is automated home care, or ambient assisted living (AAL). The recent advances in sensor technology have made it possible to envision scenarios that involve sensors which are placed in a home and which monitor the persons living there. If carefully and correctly used, these sensors can obtain information that can be used to save the person from harmful conditions, or at least send alarms when such conditions occur or are about to occur. Examples of harmful conditions are falls or situations where the person does not behave as expected, e.g. not getting up in the morning. The fact that harmful conditions can be identified by ubiquitous and ambient sensors increases the quality of living because the person feels safe in the home [ML08].

Using sensors in the home to monitor persons does not only apply to older persons. Such solutions can also be used to raise the threshold for hospitalisation in general. People suffering from chronic diseases or other illnesses can benefit from sensors that monitor their well-being and health. In addition, preventing unnecessary hospitalisation is economically beneficial for society.

Due to its relevance for the ageing population in the whole world, automated home care is a well-accepted research field. For instance, the European AAL Joint Programme (http://www.aal-europe.eu/), which is implemented by funding authorities of several European countries, has a total budget of €700 million for the years 2008 to 2013. The focus and resources invested in automated home care have resulted in several systems and system designs that address the various issues related to this domain.

In the following section, we define the roles in automated home care and analyse the requirements of the application domain.

2.1.1 Roles

In this section we present the roles that we find relevant in our work. We have found three roles that are especially important in our application domain:

1. Monitored person. The monitored person is the individual who lives in the home and needs special care. The monitored person is observed and monitored by the sensors that are placed in the home. Our main focus in this work is on monitored persons who are elderly. However, automated home care does not solely apply to the elderly. It also applies to persons who suffer from various types of illnesses, like chronic diseases, and who need to be constantly monitored.

2. Application programmer. The application programmer is the person who implements the automated home care application. The application is a set of queries that are adapted to fit the needs of a monitored person. The application programmer does not have extensive knowledge of low-level programming of sensors and needs a high-level interface to write queries and instantiate the environment, i.e., the home. The instantiation of the environment consists of defining the walls, furniture, doors, and other important objects that belong to the home. The application programmer places the sensors in the home of the monitored person, and by instantiating the environment on a computer, the application programmer knows where to place the sensors in the real home. This is important since the placement of sensors decides what type of information they obtain. In addition, several sensors can cooperate in order to obtain even more information, i.e., the sensor readings can be aggregated.

3. Helping personnel. The helping personnel are the persons who interact with the monitored person in order to obtain knowledge about his/her personal requirements. The helping personnel communicate with the application programmer in order to inform him about which queries to use. If it is not possible to reuse queries, the helping personnel give the application programmer enough information to write new queries. If the sensors in the home detect events that indicate that the monitored person needs any assistance, a notification should be sent to the helping personnel so that they can interact with the monitored person.

Our main goal is to provide a complex event processing system that simplifies the work for the application programmer. However, it is important to be aware of the existence of the helping personnel as well, since their work is to interact with both the monitored person and the application programmer. Although there might be situations where several monitored persons live together in one home, we focus on scenarios where only one monitored person lives in a home.

2.1.2 Requirement Analysis

Our requirement analysis is performed by investigating related work in automated home care. Current research and observations in the field of automated home care emphasise four requirements: two user requirements and two system requirements. The requirements coincide with existing research [TBGNC09, MWKK04, WT08, WSM07, ML08] as well as official considerations in the field [LAT05].

The simplification of the work for the application programmer should not happen at the expense of the well-being of the monitored persons. The user requirements are as follows:

1. Safety. The motivation for using automated home care is to provide safety for the monitored person. We assume that feeling safe in a familiar environment increases the quality of life for the monitored person. This requirement explicitly facilitates solutions for detecting possibly dangerous situations [TBGNC09].

2. Non-intrusive deployment. The deployment of a system that provides automated home care should not interfere with the daily life of the monitored person [MWKK04]. It is important that, by staying in the home, the monitored person feels independent [LAT05] and in control of his environment.

Safety is maintained by verifying that the system correctly detects possibly dangerous and critical situations. If a dangerous and critical situation is detected, e.g., that the monitored person falls, the helping personnel are alarmed immediately. Other situations are not dangerous, and a reminder to the monitored person might be sufficient. This applies to situations where the monitored person, for instance, has forgotten to take his medicine. The reminder can be an alarm in the home or a pre-recorded voice telling the monitored person what he has forgotten to do. In other cases it is sufficient that the helping personnel call the monitored person by phone or other communication devices.

To provide a non-intrusive deployment, communication between the monitored person and the helping personnel should, unless an emergency situation occurs, be limited to telephone and web pad (a pad connected to the Internet using the Web) [MWKK04]. A web pad solution is for instance implemented in the work presented by Walderhaug et al. [WSM07], where persons suffering from dementia are reminded about daily activities like taking medicine, as well as being told whether it is day or night. The monitored person should not directly provide data about his status. Instead, the monitored person should be reminded about certain tasks if he does not respond as expected. Such a solution is also suggested by Laberg et al. [LAT05], who state that only established and well-known interfaces should be used. This might exclude using technical devices like web pads, since elders today might not be familiar with such an interface. With non-intrusive deployment, it is important to note that the deployment needs special consent, especially if the monitored person suffers from dementia or similar cognitive disabilities [LAT05]. Along with this point, it is also important that the control and ownership of the data collected by the automated home care system is well-defined [ML08], since there are privacy issues that have to be taken into consideration during monitoring. Finally, non-intrusive deployment should not require cognitive efforts from the monitored person, e.g. that he has to instantiate the system himself [MWKK04].

In recent deployments of automated home care solutions, it has been reported that "the work has been more calm and structured" for the helping personnel [LAT05]. In order to sustain this impression among the helping personnel, it is essential that they can obtain knowledge about the current system deployment when a critical situation has happened and they arrive in a home. In addition to what the system reports, this might for instance be information about the location of the person. When the helping personnel arrive in an apartment or house, they should be able to obtain documentation of the system functionality [LAT05] or have knowledge about this beforehand.

The system requirements are related to how we expect CommonSens to behave. One system requirement relates to the application programmer and one system requirement relates to the event detection. The system requirements are as follows:

1. Personalisation. Even though successful development of automated home care solutions implies a large scale deployment, every person is different and has special needs and requirements that have to be met [WT08]. These needs may also change over time. The effort required to develop a solution for monitoring a single monitored person should be reduced.

2. Near real-time detection of dangerous and critical situations. When something dangerous or critical occurs, it is important that the system detects this immediately and that the delay is minimal.

In addition to increasing the quality of life for the monitored person, automated home care should simplify the work for the helping personnel and the application programmer. For instance, we cannot require that the application programmer has extensive knowledge about low-level sensor programming. Such a requirement might limit the number of people who can interact and work in the automated home care application domain, since application programmers and helping personnel need to be educated in health care as well.

One important argument for personalisation is that custom solutions for every monitored person require more economical resources, i.e., it costs more money. For instance, personalisation might be implemented by providing a repository of query templates. These templates can be adapted and modified depending on the needs of the monitored person [LAT05] (a sketch of this idea follows below). However, in order to save money this setup might need to cover large parts of the population that needs to be monitored. In addition, personalisation needs to be supervised by professionals like the helping personnel. This is important since there may be a difference between what the monitored persons need and what they want [ML08]. Hence, the monitored person should not communicate directly with the application programmer. A final issue with personalisation is that the automated home care application domain has to adapt to changing user behaviour over time [WT08]. New user patterns need to be detected, and this means that new sensors have to be placed in the home. In addition, if the sensors are battery powered, the batteries have to be replaced occasionally.
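As a sketch of how such a template repository could work, the example below fills person-specific parameters into a generic template. The template syntax is invented for illustration; it is not the query language defined in Chapter 3.

    import java.util.Map;

    // Sketch of personalisation via query templates: a generic template is
    // adapted to one home and person by substituting its parameters.
    class QueryTemplateSketch {
        static String instantiate(String template, Map<String, String> params) {
            String query = template;
            for (Map.Entry<String, String> e : params.entrySet()) {
                query = query.replace("${" + e.getKey() + "}", e.getValue());
            }
            return query;
        }

        public static void main(String[] args) {
            String template = "EXPECT medication-taken AT ${loi} BEFORE ${deadline}";
            System.out.println(instantiate(template,
                    Map.of("loi", "kitchen", "deadline", "09:00")));
        }
    }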

2.2 Sensor Technology and Sensor ModelsIn recent years sensor technology has evolved into a resource which can be used in many ap-plication domains. For instance, given the efforts in efficient utilisation of research fields likesensor networks [ASSC02], sensors have become small and efficient. The size of sensors hasdecreased drastically, and commercial sensor based solutions like RFID have been successfullydeployed in many every-day situations, like registering tickets on the bus or unlocking doors inoffice buildings. Sensors can also be used to measure air temperature and relative humidity inforests [TPS+05] and vineyards [GIM+10]. Modern cars are equipped with several sensors thatmeasure oxygen level for tuning the ratio of air and fuel, as well as vehicle motion [SH09]. Inaddition, meteorological sensors can measure solar radiation and air pressure [BR07], and thereeven exist glucose sensors that can be implanted under the skin and harvest energy from, e.g. awrist watch [HJ08].This section explains sensor technology and places sensors into three categories: (1) RFID

tags/readers, (2) programmable sensors, and (3) sensors that need to substantially process thedata to give meaningful results. Sensors can be defined differently, but one possible interpreta-tion is that sensors are “devices for the measurement of physical quantities” [BR07]. Depend-ing on the sensor type, this means that a sensor measures various types of physical values inthe environment, for instance light signals and temperature. In our work we especially focuson sensor types that can be used within the automated home care application domain and themain resource in our application domain is the smart home. The smart home is equipped withsensors that obtain information states in the home. A state can for instance be associated withthe current temperature in a given room, or the location of the monitored person.The first category is based on RFID (radio-frequency identification). RFID is based on the

The first category is based on RFID (radio-frequency identification). RFID is based on the communication between a reader and a tag. When the tag is located within the coverage area of the reader, it responds to a radio signal that the reader sends. The tag can either be active, semi-active, or passive. The active and semi-active tags are attached to a power source, e.g. a battery. The passive tag harvests energy from the reader and uses that energy in order to respond. The response contains information that identifies the tag. This identification can be a unique number, which can be used to identify objects in the kitchen [NBW07] or persons who wear the tag. We have chosen to put RFID technology in its own category, since it always requires the reader/tag pair in order to work.
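
To make the reader/tag interaction concrete, the following minimal sketch models a reader polling the tags inside its coverage area. All class and method names are our own hypothetical illustration, not an actual RFID API.

    import java.util.List;
    import java.util.Optional;

    // Hypothetical model of the reader/tag pair: the reader emits a radio
    // signal, and every tag inside its coverage area answers with its ID.
    record RfidTag(long id, boolean passive) {
        // A passive tag can only answer while it harvests energy from the
        // reader's signal; active and semi-active tags carry their own power.
        Optional<Long> respond(double harvestedEnergy) {
            if (passive && harvestedEnergy <= 0.0) {
                return Optional.empty(); // no harvested energy, no response
            }
            return Optional.of(id);
        }
    }

    class RfidReader {
        // One polling round: collect the IDs of all responding tags.
        List<Long> poll(List<RfidTag> tagsInCoverage, double signalEnergy) {
            return tagsInCoverage.stream()
                    .map(tag -> tag.respond(signalEnergy))
                    .flatMap(Optional::stream)
                    .toList();
        }
    }

The sketch also reflects why the pair is inseparable: a passive tag produces no identification at all without the reader's signal to harvest.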


The second category is the programmable sensor. A programmable sensor mostly consists of three main parts: (1) the sensing device, i.e., an AD (analogue-to-digital) converter that measures the physical quantity and turns it into a digital value, which is a simple data type like an integer, e.g., the temperature; (2) the computation on the digital value; and (3) the communication with other sensors or a base station. The sensing device can range from simple temperature sensors to accelerometers and humidity sensors. The information from the sensing device is computed by a processor with memory attached. The processor can be a simple but power-efficient CPU or more complex and power demanding. The communication between the sensors can either be wireless or wired. In addition, the sensors can either be powered by batteries or plugged into the power line. Since the sensor is programmable, it is possible to either program the sensor directly or flash the memory with a bootable image. The former can be done by using the Contiki operating system [con], while the latter can be done during the deployment of TinyOS [tin]. Examples of sensor types that fall into the second category are the TelosB, MICA2 and MICAz motes [xbo] and the FLEX mini motes [evi]. Mobile phones are examples of devices that contain several programmable sensors. Modern mobile phones have sensors like accelerometers and temperature sensors, and provide applications that allow the user to read the information from the sensors. Users can program their own applications and in that way utilise the sensors that the mobile phone provides. In addition, the applications can use the communication part to send the data to other mobile phones or computers and receive data.
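
The three-part structure can be summarised in a small sketch where each part is a pluggable function; the class and member names are our own illustration, not an actual mote API.

    import java.util.function.Consumer;
    import java.util.function.IntUnaryOperator;
    import java.util.function.Supplier;

    // Illustrative three-part model of a programmable sensor:
    // (1) sense via the AD converter, (2) compute on the digital value,
    // (3) communicate the result to a base station or another sensor.
    class ProgrammableSensor {
        private final Supplier<Integer> adConverter;   // (1) sensing device
        private final IntUnaryOperator computation;    // (2) computation
        private final Consumer<Integer> communication; // (3) communication

        ProgrammableSensor(Supplier<Integer> adConverter,
                           IntUnaryOperator computation,
                           Consumer<Integer> communication) {
            this.adConverter = adConverter;
            this.computation = computation;
            this.communication = communication;
        }

        // One sampling cycle: sense, compute, communicate.
        void sampleOnce() {
            int raw = adConverter.get();
            int value = computation.applyAsInt(raw);
            communication.accept(value);
        }
    }

A temperature mote could, for instance, be instantiated with an AD converter stub, a calibration function, and a radio send operation.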

The third category is related to sensors that need substantial processing of the digital values. While the sensors in the second category produce single data values when they are pulled, the sensors in the third category return more complex data types, like arrays or data objects. Multimedia sensors like cameras and microphones fall into this category. Cameras produce images, which can be represented as a sequence of two-dimensional arrays. The sound that comes from a microphone does not give any meaning before it is represented as an array of sound samples.

Since sensors might have limited processing power and a limited supply of energy, utilisation of the communication part is an important issue in sensor technology. Research on wireless sensor networks addresses communication as one of the main challenges in the domain [YG03], since the process of sending one packet takes much energy. This means that it is not appropriate to keep resending packets that are lost. In addition, since the packets are sent over wireless channels, there might be collisions if the communication is not synchronised. Several communication protocols have been developed for sensors, including standards like IEEE 802.15.4 [iee] and ZigBee [zig]. In the home care application domain it is possible to avoid these communication issues by connecting the sensors to the power line in the house. Commercial industry standards like X10 use the power line for communication as well. On the other hand, in automated home care, the monitored person might have to wear sensors, and this means we have to consider that the sensors might use both wireless and wired communication.

Sensor models are used to identify relevant properties of sensors. For instance, the OpenGIS Sensor Model Language (SensorML) is used to describe sensors and sensing processes [BR07]. The language describes processes and process chains related to sensors. SensorML aims to cover many of the issues that we have chosen to relate to the environment model. For instance, in SensorML, the location and orientation of the sensor can be defined. In addition, the coverage of a sensor is defined either by being in-situ or remote. The in-situ sensors measure the object they are attached to, for instance a temperature sensor in a room. The remote sensors measure physical quantities that are distant to the sensor, i.e., they have a coverage area. SensorML uses the term capability about the type of output a sensor produces. In our sensor model, a capability is a description of what the sensor can observe. However, the capability in SensorML also contains temporal properties like sampling frequency. Since the application domain is not explicitly defined, the resulting sensor model is complex and comprehensive. SensorML and its components are defined using UML and XML. Our sensor model uses some of the elements in SensorML, and it is still unclear whether there exist elements in our models that SensorML does not cover. However, our sensor model is application domain specific and contains mostly concepts that are relevant for automated home care. For instance, we define sampling frequency as a separate property and not as part of a capability. In addition, by combining our sensor model with our environment model, we can define orientation and placement of sensors. In CommonSens, we extend the traditional view of sensors by explicitly dividing our sensor model into three different types. Note that these types should not be confused with the three categories presented above. The first sensor type is the physical sensor, which transforms analogue signals into digital values. The second sensor type is called a logical sensor, which aggregates data from other sensors, and depends on input from the other sensors to work. The third type of sensor is the external source, which stores static data. This distinction of sensor types has not been made in related work, and gives a better overview of the functionality that is expected from the different types of sensors.
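
To illustrate the division, the following sketch represents the three sensor types behind a common interface. The interface and class names are our own illustration, not the CommonSens implementation.

    import java.util.List;

    // Illustrative hierarchy for the three CommonSens sensor types.
    interface Sensor {
        double read(); // produce the next data value
    }

    // A physical sensor transforms analogue signals into digital values.
    class PhysicalSensor implements Sensor {
        public double read() {
            return sampleAdConverter();
        }
        private double sampleAdConverter() { return 0.0; } // hardware stub
    }

    // A logical sensor aggregates data from other sensors and depends on
    // their input to work, here by averaging the readings.
    class LogicalSensor implements Sensor {
        private final List<Sensor> inputs;
        LogicalSensor(List<Sensor> inputs) { this.inputs = inputs; }
        public double read() {
            return inputs.stream().mapToDouble(Sensor::read).average().orElse(Double.NaN);
        }
    }

    // An external source stores static data, e.g. a value from a database.
    class ExternalSource implements Sensor {
        private final double value;
        ExternalSource(double value) { this.value = value; }
        public double read() { return value; }
    }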

2.3 Events

CommonSens is a complex event processing system that obtains data tuples from sensors and processes them according to a set of queries. In order to understand how CommonSens works it is important to have a fundamental understanding of events and issues related to this concept. Hence, in this section we first introduce events by referring to different interpretations, definitions, and models. Second, we discuss query languages that are used to describe complex events. Third, we introduce the concept of complex event processing systems and discuss personalisation. The fourth topic is related to spatial issues and sensor placement. Finally, we discuss the concept of deviation detection and show the novelty that CommonSens provides concerning this issue.

2.3.1 Event Models

In this section, we first discuss general event definitions in order to give a common understanding of the concept. Second, we relate our work to event models used in automated home care.

The term event is hard to define, as the term is used in many different application domains, from philosophy to computer science. According to the Merriam-Webster dictionary, an event can be an occurrence, i.e., “something that happens”, “a noteworthy happening”, “a social occasion or activity”, or “an adverse or damaging medical occurrence <a heart attack or other cardiac event>”. The first definition is a common divisor for the three remaining definitions, since they all refer to something that happens. The only difference between the four definitions is the strictness related to the event occurrence. If an event is considered as something that happens, everything can be an event. The second definition points to a possible restriction, since it states that an event is only a noteworthy happening or occurrence. This means that only a subset of everything that happens can be considered as events.

Events are defined differently within the field of computer science as well, and in this section we focus our discussion on atomic and complex events. Throughout this thesis we use the term event for both atomic and complex events. Luckham [Luc01] defines an event as follows: “An event is an object that is a record of an activity in a system”. According to Luckham, an event has three aspects: form, significance and relativity. The form of any event is an object. The object can be anything from a simple string to a tuple of attributes that tell where and when the event occurred. The significance of an event relates to the activity. For instance, in an ordering system every event related to new orders is significant. The relativity aspect denotes how the event is related to other events. The relation can be temporal or causal. In addition, two or more events can be aggregated. A definition similar to Luckham’s is used by Carlson [Car07]. Carlson states that “an event represents a particular type of action or change that is of interest to the system, occurring either internally within the system or externally in the environment with which the system interacts”. Luckham’s definition is restricted to a system, whereas Carlson’s definition also includes the environment, which might be more than a system, e.g. a home where things can happen. This fits very well with the event definition from Etzion and Niblett [EN10], who include important considerations from both Luckham and Carlson: “An event is an occurrence within a particular system or domain [...] The word event is also used to mean a programming entity that represents such an occurrence in a computing system”.

An atomic event is an event that cannot be divided into any other events. For instance, Atrey [Atr09] states that an atomic event is “exactly one object having one or more attributes [and which] is involved in exactly one activity over a period of time”. Since the atomic event has temporal properties, it can have two timestamps. The first timestamp denotes the point of time when the event begins, and the second timestamp when the event ends. Atomic events can occur concurrently or consecutively, and a set of atomic events that are consecutively and concurrently related is called a complex event. For instance, the routine of making breakfast can be considered as a complex event. When the monitored person opens the door to the kitchen, opens and closes the fridge, prepares food, etc., these occurrences can be interpreted as temporally related atomic events. Complex events are described by many terms, for instance composite events [RGJ09] or compound events [Car07, AKJ06]. However, the common divisor is that they contain related atomic events.

Events can be structured in hierarchies and abstractions. We use an example from a fabrication line, inspired by the examples that Luckham uses. The lowest level of events is related to the assembly of the product equipment, e.g. components in computer chips. The next level is related to the completion of one chip; an event is created when a chip is assembled. The following events at this level relate to moving the chips from the assembly line to packing and shipping. The whole process of packing and shipping can also be considered as one single higher level event.

Logical sensors introduce event hierarchies, since a logical sensor can be an abstraction and aggregation of lower level sensor readings. A hierarchical approach is an abstraction that is quite often investigated in complex event processing. For instance, Luckham and Frasca [LF98] introduce event hierarchies as a way of handling different levels of events in their RAPIDE complex event processing system [LKA+95]. An approach they investigate is detection of higher level events in fabrication process management systems. For the multimedia application domain, Atrey [Atr09] introduces a model for hierarchical ordering of multimedia events, where a complex event can consist of several atomic events, and finally transient events.

In addition to atomic and complex events, there exist many other event types and structures. For instance, Atrey includes an event type that is at a lower level than atomic events. These events are called transient events and have only one timestamp. According to Atrey’s model, the atomic event can consist of two to N transient events that happen consecutively. Consequently, the atomic event still has two timestamps: the first timestamp is from the first transient event and the other timestamp is from the last transient event. An interesting addition in Atrey’s event model is the silent event, which is “the event which describes the absence of a meaningful transient event”. This type of event is meant to cover the interval between two transient events.

Events can have many properties. For instance, they can occur in time and space. In addition to temporal and spatial properties, Westermann and Jain [WJ07] introduce the informational, experiential, structural, and causal properties, or aspects, of events in their event model E. The informational properties of an event are information about the event. For instance, if an event is that a person walks into a room, the informational properties about this person might be the person’s identity or how many times the person has walked into the room this day. This information can be available from combining the data from the event with database records. Other information from events might be based on experience from earlier readings, so that the intervals and spatial properties can be adapted. This also includes obtaining information from several media to gain more information. This is denoted experiential properties. Structural properties of events help deciding which level of abstraction we want to investigate, which is similar to event hierarchies. Finally, complex events can have information about causal properties, i.e., that events depend on other events in order to happen. For instance, if a light switch is located inside a room, it cannot be turned on before someone actually enters the room.

In their Event-Model-F [SFSS09], Scherp et al. use ontologies to obtain a common understanding about events that occur in a given domain. Instead of using queries to define events beforehand, the events in the Event-Model-F happen related to a context, for instance an emergency-response scenario. This scenario might have participating roles like officers in the emergency control centre, firemen and police. In the Event-Model-F, events are only entities that exist in time and not in space. Therefore, it is common to relate events to objects, like a person opening a door. In the Event-Model-F, the identification of causality is important. In the emergency-response scenario, an emergency control centre might get calls from people complaining that the electricity is gone and other people complaining that there is water in their cellars. When investigating the houses having water in the cellars, the forward liaison officers observe that these events are caused by flooding. Since the complaints about the electricity were registered almost at the same time, the officers in the emergency control centre can use their event based system to reason about the cause of the electricity failure. One of the possible causes is the flood having snapped a power cable. However, causality is hard to determine, and the electricity failure might have been caused by other events as well.

In the discussion above, we show that events are used in many application domains and that events are defined and interpreted differently. In the following discussion, we present event models for automated home care. Even though the term event is used in several papers in this application domain, there are few papers that present separate event models. The event model is usually integrated in communication [RH09] and inherits many of the properties from the publish/subscribe research domain. CommonSens is a centralised solution where one computer has direct contact with all the sensors. Hence, we do not relate to publish/subscribe event models, which are usually distributed and related to larger scenarios and application domains.

In their event-driven context model, Cao et al. [CTX09] define events as context switches. The overall goal of their work is to close the semantic gap between environment context, which can be sensor readings, and human context, which denotes the activities that the monitored persons are part of. In our work we use sensor readings to reason about the activities of daily living. Cao et al. define two types of events: body events and environment events. The body event lie down causes a shift from the state standing to lying. The environment events are the results of the interaction between the user and the environment, e.g. an object is used or household appliances are manipulated by the user. The semantic gap is closed by using an ontology, and the classes in their ontology include and relate all the aspects of their automated home care scenario. For instance, an activity has entities like cooking and watching TV, whereas a sofa is a furniture object with properties like size and function. In addition, the activity has properties like services, destination, and description syntax. The events are part of this ontology and are reported from sensors as events when the transitions occur. By using the ontology, Cao et al. integrate events in a different way than we do in CommonSens. In CommonSens, an event is a state or state transition in which someone has declared interest, i.e., addressed in one or more queries. However, since only context switches can be events according to Cao et al., it is not possible to identify the monitored person sitting on the sofa as an event. In our application domain it is important to detect such states, since, e.g., they can help locating the monitored person or find out for how long the monitored person has been sitting on the sofa. According to Cao et al., such information can be deduced from the switches that lead to the context. If we want to know that the monitored person is sitting on the sofa, it is possible to identify this context from the event sits down on the sofa. Cao et al. do not investigate the concept of complex events, although they relate to concurrent and consecutive occurrences of context switches. This is expressed through a first-order logic language that we discuss in Section 2.3.2.

Park et al. [PBV+09] define events to be responses of sensors when the sensor readings satisfy certain conditions, which is similar to our definition. A basic event does not contain information about the sensor that produces the event. This allows the users to address events like motion in the home without specifying which sensors they expect to obtain the data from. On the other hand, an event includes information about the sensor that produced the event. An episode is a series of events which are ordered by the production timestamp. This is similar to our complex events, which are sets of atomic events. However, it seems that the events that Park et al. define only contain one timestamp, in contrast to our two. This implies that concurrency might not be supported unless two sensors produce readings concurrently. Park et al. use a statistical approach to obtain the events that form an episode. The conditions are simple: an accelerometer sensor reports an event if the values exceed a pre-determined threshold. Park et al. only refer to three types of basic events, which are caused by different types of accelerometer readings: getting in and out of bed, moving objects, and moving around in the home. They do not include any other sensors or types of events, and do not discuss the possibility of extending the system with more events. By using only accelerometers, it might be hard to add any other types of events as well.

Storf et al. [SBR09] also use an event model that is similar to ours. Their events are states and state transitions; however, it is not clear whether all states and state transitions are events or not. Events can occur in sequences as well, and they use a general rule language to describe the events. Furthermore, their events can be ordered in hierarchies where sensor events form aggregated events. An example of an aggregated event is that the monitored person prepares a meal; this event is based on readings from sensors that are defined to be related to meal preparation, e.g. a sensor that reports that the fridge is opened.

All three event models for automated home care are defined informally; however, they show that there are different interpretations of how events can be defined in our application domain. To the best of our knowledge, there exists no other event model for automated home care that matches our model in both formality and generality.

2.3.2 Query Languages

Declarative query languages are inspired by SQL, and it is well accepted that a declarative approach is simpler than a procedural approach when it comes to describing complex events. For instance, consecutiveness can easily be described by using an arrow between two events A and B: (A -> B). This means that A should be followed by B. This approach is for instance used in the Esper complex event processing library [esp]. In addition, a declarative query language facilitates personalisation, since it removes technical issues like sensor selection from the application programmer. In CommonSens, the queries never address sensors directly; only the capabilities that the sensors provide are addressed. At a later stage in the life cycle of the system, the real sensors and the queries are combined. This approach makes it possible to write general queries which can be reused in several homes with different types of sensors.
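
As an illustration of the arrow notation, the following sketch registers a followed-by pattern with Esper. It assumes the Esper 5.x Java API, and the two event classes are placeholders defined only for the example.

    import com.espertech.esper.client.*;

    public class FollowedByExample {
        // Placeholder event classes, defined only for this example.
        public static class EventA {}
        public static class EventB {}

        public static void main(String[] args) {
            Configuration config = new Configuration();
            config.addEventType("EventA", EventA.class);
            config.addEventType("EventB", EventB.class);
            EPServiceProvider engine = EPServiceProviderManager.getDefaultProvider(config);

            // (A -> B): match every A that is later followed by a B.
            EPStatement stmt = engine.getEPAdministrator()
                    .createEPL("select * from pattern [every a=EventA -> b=EventB]");
            stmt.addListener((newEvents, oldEvents) ->
                    System.out.println("A followed by B detected"));

            // Sending an A and then a B triggers the listener once.
            engine.getEPRuntime().sendEvent(new EventA());
            engine.getEPRuntime().sendEvent(new EventB());
        }
    }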

Our query language is formally defined, and supports consecutiveness, concurrency, deviations, and abstractions. In our application domain, there are some other languages as well that provide some similar properties; however, concurrency and deviations are seldom discussed.

In order to describe context and context switches, Cao et al. [CTX09] use a first-order logic based query language which includes the followed-by relation →. However, they do not provide any formal definition of the language and its syntax. In addition, it is not clearly stated how they translate the locations in their queries to the sensors, even though they claim to use their ontology. Concurrency and deviations are not discussed, although they allow concurrent contexts or events by using the ∧ operator. On the other hand, this might not be sufficient if they plan to capture all classes of concurrency, e.g. that an event occurs during the occurrence of another event. Since they do not support concurrency classes and have limited documentation of how the language works, we conclude that our query language is better suited to express complex events.

In their personalised automated home care system, Wang and Turner [WT08] use policy-based XML to define conditions. If a sensor reading matches the condition, a predefined action is performed. The condition and action are part of a policy, which describes a given situation. For instance, the policy prepare bath water applies to one given person in a home at one given point of time, e.g. 20:00hr. When the time is 20:00hr, an action is defined to tell the home care system to start heating bath water. The conditions are simple parameter-operator-value triples, which is similar to the conditions in CommonSens queries. However, it is not clear whether Wang and Turner manage to express complex events. In their examples, they use time as the condition, i.e., at certain points of time there are different events that should occur. This approach is also similar to ours; however, we explicitly state the possibilities for expressing both consecutive and concurrent events, something which Wang and Turner do not define.

In their query language, Qiau et al. [QZWL07] use concepts from temporal logic to describe events. Temporal logic can be used to state that an atomic event should occur during a defined interval. Complex events can also be described using temporal logic, and Qiau et al. use the followed-by relation → to describe that two or more events should occur consecutively. On the other hand, as with Cao et al., only the logical operator ∧ is used to describe concurrent events, which is not sufficient. In addition, deviation detection is not included in their language, making it hard to detect deviations properly.

Storf et al. [SBR09] use XML to describe activities, i.e., aggregated events. An activity can consist of several events, and the relationship between these events can be defined. The relationship can be used to define temporal and consecutive properties of the events. The language gives the application programmer the possibility to state that some events are not obligatory. For instance, in order to prepare a meal, not all the events have to be fulfilled every time. Each event, if not defined as obligatory, is defined with a weight value. Therefore, it is sufficient that the sum of the weight values exceeds a certain threshold in order for the aggregated event to occur (see the sketch below). This functionality is not supported in CommonSens; currently all the events have to occur in order for the complex event to occur. On the other hand, it is not clear whether or not their XML can express concurrency and deviations.
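
A minimal sketch of this weighted-threshold idea follows; it is our illustration of the mechanism, not Storf et al.'s implementation.

    import java.util.Map;
    import java.util.Set;

    // Sketch of the weighted-threshold idea: an aggregated event such as
    // "prepare a meal" occurs when the weights of the observed sub-events
    // sum above a threshold, so not every sub-event has to be seen.
    class WeightedActivity {
        private final Map<String, Double> weights; // sub-event name -> weight
        private final double threshold;

        WeightedActivity(Map<String, Double> weights, double threshold) {
            this.weights = weights;
            this.threshold = threshold;
        }

        boolean occurs(Set<String> observedEvents) {
            double sum = observedEvents.stream()
                    .mapToDouble(e -> weights.getOrDefault(e, 0.0))
                    .sum();
            return sum >= threshold;
        }
    }

With illustrative weights such as fridge opened = 0.4, stove used = 0.5 and cupboard opened = 0.3 and a threshold of 0.7, any two of the three sub-events would be sufficient for the aggregated event to occur.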

The language that is provided by the complex event processing library Esper [esp] contains much functionality related to detecting complex events and performing aggregations over a set of events. In CommonSens, aggregations are supported as well, and can be performed by using a custom function in the logical sensors. Even though Esper claims to support queries for complex events, it does not support any concurrency operators other than ∧. It does not support deviation detection either.

2.3.3 Complex Event Processing and Personalisation

In order to fully benefit from the concept of events, it is required that there exist systems that manage and process the events. There exist many types of such systems, ranging from those that handle single atomic events to complex event processing systems. In this section, we first focus on the complex event processing (CEP) systems, but we also mention some of the related systems. Second, we look at related work concerning personalisation. We discuss CEP and personalisation in the same section because, in order to simplify the work of the application programmer, it is important that the CEP systems used in automated home care support personalisation. However, as we show in the discussion, the combination of CEP, personalisation and automated home care is novel, and there exists no related work concerning this combination.

In computer science, events have traditionally been associated with event handling by the graphical user interface in modern operating systems. When the user moves, drags or clicks the mouse, the operating system generates an event that contains information about the action.


Several running applications have installed event handlers, and the event information is sent to those applications that are available. If the application has defined an event handler for this event, the event handler is called and performs the corresponding operations. In the publish/subscribe system paradigm, the user can subscribe to certain events. For instance, in rescue and emergency scenarios, a medical doctor can subscribe to events, e.g. the heart condition of certain patients [SSGP07].

CEP systems handle consecutive and concurrent data tuples. The data tuples contain information about something that has happened, e.g., that a key is pressed on the keyboard, or that the monitored person has eaten breakfast while being in the kitchen. The origin of the data tuples, i.e., the data sources, can for example be sensors, stock tickers or computer hardware like a mouse. The data tuples can either be pushed to the CEP system as a data stream, or the CEP system can be instructed to pull the data sources that are of interest. In some application domains it is assumed that all the data tuples in the data stream are events. For instance, an event in a stock trading system is a data tuple that informs about a company, the price of that company’s stock, as well as a timestamp that tells when the trade has been performed. This approach is acceptable for such domains, and is used in systems like Cayuga [DGH+06, DGP+07, WRGD07], SASE [WDR06], and SASE+ [GADI08]. However, these systems only consider those data tuples that match a set of rules. In other systems the amount of information can be considerable. This applies to sensor based systems, since data streams from all the sensors might lead to a high load. Therefore, it is much more resource efficient to let the CEP system pull the relevant data sources instead.
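
A minimal sketch of the pull-based alternative follows, under the assumption that every data source exposes an on-demand read operation; the names are illustrative.

    import java.util.List;

    // Sketch of pull-based acquisition: the engine polls only the data
    // sources that are relevant for the instantiated queries, instead of
    // receiving the full data stream from every sensor.
    class PullingEngine {
        interface DataSource {
            double pull(); // read the current value on demand
        }

        private final List<DataSource> relevantSources;

        PullingEngine(List<DataSource> relevantSources) {
            this.relevantSources = relevantSources;
        }

        // One polling round over the relevant sources only.
        void pollOnce() {
            for (DataSource source : relevantSources) {
                evaluateQueries(source.pull());
            }
        }

        private void evaluateQueries(double tuple) {
            // match the tuple against the query conditions (omitted)
        }
    }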

In order to decide if a data source is relevant or not, the CEP system needs to work according to a set of queries, which tell what the CEP system should do when certain data tuples are obtained. The data tuples might match a set of conditions, like values, or temporal and spatial properties. The queries are written by application programmers who work within a certain application domain. This also means that the application programmers have domain knowledge and know what sort of conditions are required. The complex event processor and query language provide personalisation, and our CEP system easily adapts to new sensor types and environments. Since it is hard to find related work concerning CEP, personalisation and automated home care, we relate the personalisation issue to two systems that use a different approach than CEP.

Challenges related to personalisation of home care systems are addressed by Wang and Turner [WT08], who use a policy-based system with event, condition, action (ECA) rules where certain variables can be changed for each instance. They provide the possibility of addressing sensor types instead of sensors directly, realised by a configuration manager that finds the correct sensors. However, they do not provide separate models for the events, sensors and the environment to show how one capability can be provided by several different types of sensors in different environments. This also applies to Mao et al. [MJAC08], who decouple sensors and events, but do not relate these issues to the automated home care application domain. As stated in the introduction of this section, there exists no other automated home care system that combines the aspects of CEP and personalisation in a structured way as CommonSens does.


2.3.4 Spatial Issues

For spatial issues like detecting spatial events, CommonSens differs from related work since it aims for a general and open approach that supports all possible sensors in a standard home. The only important requirement is that the sensors provide the capabilities that are addressed in the queries and that they have a coverage area. By combining the readings from several sensors, CommonSens can approximate the areas in the environment where the events should occur. These areas in the environment are called locations of interest (LoIs) and are addressed in the queries. CommonSens provides a separate environment model, which is used to solve these spatial issues.

Related work mainly uses specific technologies to locate events and objects in the environment. For instance, Koile et al. [KTD+03] use a multi-camera stereo-based tracking system to track people in activity zones in indoor environments. The activity zones are similar to our concept of LoIs. However, they do not discuss utilisation of other types of sensors and abstractions through capabilities. On the other hand, they investigate how the activity zones can be used to trigger different types of activities in the environment. For instance, if a person is located in an activity zone that is defined to be a quiet zone, the system can be set to not accept telephone calls.

CommonSens supports multimodality through capabilities, which means that one capability can be provided by many different types of sensors. Atrey et al. [AKJ06] propose a multimodal framework that identifies whether or not complex events occur, based on user defined thresholds and probability based on earlier occurrences. An important issue that they address is the synchronisation of the events and sampling frequency. In our work, we also address this issue, but we use a pull-based approach and assume that the data tuples we obtain are synchronised when they are pulled. However, this assumption might not always hold in real world scenarios. For instance, we experienced (see Chapter 6) that there were some synchronisation issues related to obtaining data tuples from cameras. On the other hand, Atrey et al. do not discuss the concept of late binding between sensors and capabilities, since this is not the focus of their multimodal approach. Sensor selection is investigated by Saini and Kankanhalli [SK09], where metrics like confidentiality and cost related to processing time and memory requirements are considered. In our work, we have not discussed issues related to sensor selection; we simply use sensors based on the capabilities they provide.

Using other types of sensors than cameras and microphones for detecting spatial events has also been investigated. Chen et al. [CCC+05] assist WiFi based positioning with RFID readings to cope with environmental factors like doors, humidity and people. In sensor network research, k-coverage problems consist of finding areas in the field that are covered by at least k sensors [YYC06].

Coverage of sensors has been investigated in the Art Gallery Problem [O’R87], where a minimum number of guards should cover every part of an environment. CommonSens provides a general solution for detection of spatial events which is independent of the sensor types and environment instance, and binds together many of the already existing solutions while providing an application programmer-friendly solution.

Even though there exists much work in the field of signal propagation in environments, e.g. [SR92, BP00, LC07], not many works address the issue of how to make use of the sensors’ signal propagation for placement. Among other contributions, Huang and Tseng [HT03] observe and discuss the issues related to the fact that sensor coverage ranges are not always circular. This includes sensors that do not originally have circular coverage areas, like cameras, or sensors whose coverage areas are reduced due to placement. Boukerche and Fei [BF07] also focus on irregularities in coverage areas. These irregularities may be objects, as we have defined them in our work. Their approach is to generate simple polygons, i.e., polygons whose sides do not cross, from the physical sensor’s coverage area. If an object is within the coverage area, a shape similar to the object’s shape is removed from the polygon. They do not discuss permeability, i.e., how an object affects different types of signals, or the utilisation of signal propagation models.

Permeability and signals are investigated in research on propagation modelling. For instance, in the field of user tracking based on signal strength from, e.g. WLAN base stations, an established model is the one suggested by Seidel and Rappaport [SR92]. This model is called the Wall Attenuation Factor (WAF) model, where the user specifies the number of walls and an attenuation factor. This factor is derived empirically and is not pro-actively used to place sensors in the environment. In addition, the model leaves it to the user to specify details like shape. Factors like the depth of the wall might be important for how the signal strength is attenuated, and there might be several types of walls with different depth and permeability values that the signal passes. The model of Seidel and Rappaport is for example the fundamental model in Bahl and Padmanabhan’s RADAR system [BP00] and the environment model presented by Xiang et al. [XZH+05]. Another issue with these works is that they assume an already fixed position of the base stations. We point out that objects in the environment affect the coverage of the sensors. This means that proper environment models are imperative for sensor placement.

Using coordinates to describe objects and environments is widespread in applications like computer-aided design (CAD). We have adapted this approach in our environment model and added properties like permeability in order to support signal propagation and facilitate sensor placement. In this section we focus on systems that combine sensors and environment models.

In their multimodal surveillance system architecture, Saini et al. [SKJ09] use an environment model that allows system porting between environments. Their environment model consists of geometric information, contextual information and sensor parametric information. The geometric information is for instance realised through a 3D model, and the locations of objects in the environment have semantic labels. For instance, a semantic label can state that the object is a glass door. The contextual information concerns dynamic qualities in the environment. Saini et al. focus on the office space application domain, and give examples of information related to office/working hours and prohibited regions. The sensor parametric information contains information about sensors, i.e., capabilities and limitations, as well as parameters like location and coverage area.

The data is obtained through a data acquisition module, which organises the available sensors. Each type of sensor provides data level event detectors (DLED), which describe what types of events the sensor can detect. For instance, a video DLED might have a face detector and a motion detector. This approach is similar to our concept of capabilities. However, it is not clear if Saini et al. use logical sensors to provide the data level events. In our work, video is provided by a camera, which is a physical sensor. Face detection can only be provided by a logical sensor, since it needs both the video stream and the face detection description, e.g. Haar classifiers [WF06].

It is not clear whether the environment model uses properties like permeability values or not when sensors are added or removed from the environment. However, in their experiments, Saini et al. run the same queries in two different environments. They show how the system can reason about the possibilities of false positives given the number of sensors whose data tuples are fused. This is similar to our approach to LoI approximation and query instantiation, i.e., the late binding between the sensors and the queries.

2.3.5 Deviation Detection

The concepts of deviation detection and anomaly detection refer to approaches that identify unexpected patterns. A common approach is to use statistics to detect deviations. This approach is used in fraud detection for credit cards, intrusion detection in networks, fault detection in safety critical systems, and detection of abnormal patient conditions and disease outbreaks [CBK09, PP07]. CommonSens uses queries which describe complex events that we want to detect deviations from. In contrast to CommonSens, rule-based systems identify deviations bottom-up. These systems mine training data to obtain rules; patterns that do not match these rules are considered as anomalies or deviations [Agg05]. However, a training phase restricts the application programmer from stating other rules than those that are mined from the training set. For instance, the monitored person may suffer from dementia, and CommonSens notifications can be used to guide the monitored person. In such scenarios it can be hard to initiate a successful training phase, since the monitored person may be guided by another person. The presence of another person will affect the sensor readings, and it is hard to extract the sensor readings from the training set that relate only to the monitored person.

CEP systems use declarative queries or a GUI to define rules that describe complex events [SSS10]. This approach does not fully support the concept of deviation detection. For instance, the commonly used CEP library Esper [esp] supports detection of complex events. However, Esper and similar systems only return results if there exist matching events. This contradicts our concept of deviation detection, since they do not report anything if the query is not matched. For instance, it is not possible to prefix each atomic query with the logical operator NOT, since this might violate the near real-time requirements of automated home care. In Esper, when the data tuples do not match the condition in the query, i.e., a deviation, Esper will simply continue the evaluation of the query. Only if none of the atomic queries are matched does Esper send a message that the query is matched, i.e., that a deviation has occurred.

2.4 Discussion and Conclusion

With the recent development of sensor technology, application domains like automated home care can heighten the threshold for hospitalisation and for putting the elderly in retirement homes. Unfortunately, sensors are not easy to program. In automated home care, there will be application programmers who are responsible for programming the sensors so that they can monitor the persons correctly. In order to simplify the work of the application programmer, it is important to find alternatives so that the application programmer does not have to manually program each sensor. One possible approach is to use the concept of events and complex event processing.


Complex event processing allows the application programmer to write high level declarative queries. The sensors themselves do not have to be programmed; instructing the sensors to send data is done automatically by the complex event processing system. This simplifies the work of the application programmer, and this is our overall goal.

This chapter introduces important concepts that our work builds on, and relates our work to other contributions in the field, especially sensors and concepts related to events. As stated in Chapter 1, CommonSens has three main contributions: models for events, sensors, and environments together with a query language; detection of spatial events; and deviation detection. For sensors, we show that there exists an extensive model from which we extract some interesting features, and we extend related work by explicitly dividing sensors into three different types. With respect to events, the main focus is to show that there are several different interpretations of events. Personalisation is an important focus in our work. However, there is not much related work in automated home care concerning this issue. Hence, we relate personalisation to the discussion of complex event processing and conclude that there does not yet exist any other complex event processing system that provides personalisation in the automated home care application domain. When we discuss the spatial issues, we present related works that combine readings from several sensors in order to locate objects. Our environment model is inspired by how coordinates are used to model 3D environments, and the model is used to perform proper placement of sensors in the homes. Finally, we show how our interpretation of deviation detection differs from statistical approaches, since we simply use the complex queries as ground truth for what we regard as correct.

Finding related work for our application domain is not a challenge, since there are several research projects that focus on these issues. However, we have noted that there are many different approaches in this domain, and it is sometimes unclear whether or not a system actually supports given functionality. Based on the related work study that we have made, it is certain that there exists no other automated home care system that supports all the aspects that are covered by CommonSens.


Chapter 3

CommonSens Data Model

In this chapter we present the fundamental models in CommonSens. These models are defined in order to identify relevant attributes of the elements in the application domain, i.e., the environment and sensors. It will be easier for the application programmer to instantiate and reuse environments, sensors and queries if there exist models that the concrete instances of environments and sensors can be mapped to. In addition, events are not uniquely defined in related work, and researchers give many different meanings to that term. Hence, we have to explicitly define an event model that fits our requirements.

First, we define the three models that identify and describe the concepts and semantics of events, the environment where events happen, and the sensors that are used to obtain information about the environment. Second, we define the semantics and the syntax of the query language that is used to describe events.

3.1 Event Model

In our conceptual model of the real world, everything that happens can be modelled through states and state transitions. For instance, when a door is opened, it transitions from a state with the value ‘door closed’ to a state with the value ‘door open’. The current temperature is a single state with a value. When the temperature changes, the current state transitions to another state with another value. Based on this conceptual model of the real world, we define an event as:

Definition 3.1.1 An event e is a state or state transition in which someone has declared interest. A state is a set of typed variables and data values. A state transition is when one or more of the data values in a state change.

The type of variables to use depends on the instantiation of the system. Not all states and state transitions in the real world are of interest. Therefore, we view events as subsets of these states and state transitions. Figure 3.1 relates the core elements of the approach to our conceptual model of the real world. The application programmer uses declarative queries to describe events. The sensors detect states in the real world, and if the state values match the conditions in the queries, these are identified as events. State transitions become events if all the states that are involved in the transition match a set of queries.

Figure 3.1: The relation of the core elements in our conceptual model of the real world: queries describe events, sensors detect states, and states and state transitions that match the queries are identified as events.

Our event definition works well with existing event processing paradigms. For example, in publish/subscribe systems, events can be seen as publications that someone has subscribed to.

To identify when and where an event occurs and can be detected, it is important to specify temporal and spatial properties. For addressing temporal properties we use timestamps, which can be seen as discrete subsets of the continuous time domain, and time intervals [PS06].

Definition 3.1.2 A timestamp t is an element in the continuous time domain T: t ∈ T. A time interval τ ⊂ T is the time between two timestamps tb (begin) and te (end). τ has duration δ = te − tb.

For instance, we may want to ensure that the monitored person’s dinner has lasted for at least 20 minutes, i.e., δ ≥ 20 min.

To distinguish events from other states and state transitions, it is important to have knowledge about the spatial properties of the events, i.e., where in the environment they happen. These spatial properties are specified through locations of interest, denoted LoIs.

Definition 3.1.3 A location of interest (LoI) is a set of coordinates describing the boundaries of an interesting location in the environment.

The LoI can be seen as a shape, which is defined in Definition 3.2.1.

In CommonSens we distinguish between atomic events and complex events, and an event that cannot be further divided into lower level events is called an atomic event.

Definition 3.1.4 An atomic event eA consists of four attributes: eA = (e, loi, tb, te). e is the event, loi is the LoI where (|loi| = 1 ∨ loi = ∅), and (tb, te ∈ T) ∨ (tb = te = ∅). If the timestamps are used, they are ordered so that tb ≤ te.

loi, tb and te do not necessarily need to be instantiated. An atomic event without loi, tb and te is, for example, the state where a sensor that monitors a lamp returns a reading telling that the lamp is turned on. If we want to address that one certain lamp in the home is turned on at a certain time of the day, the temporal properties and the LoI need to be instantiated.

In this section we do not define ways of describing the order and relations that the atomic events have. Here, we simply state that the atomic events in eC can happen concurrently or consecutively. To classify this we define five classes of concurrency and one class for consecutiveness.

Figure 3.2: Concurrent and consecutive atomic events: pairs of intervals A and B illustrating the concurrency classes equals, starts, finishes, during, and overlaps, and the consecutive class before.

These classes are inspired by Carlson’s six interval relations [Car07] and Allen’s temporal intervals [All83], and are shown in Figure 3.2. The five concurrency classes are equals, starts, finishes, during, and overlaps. The consecutive class is called before. In the figure, the interval between the atomic event’s two timestamps is shown as an arrow, and we show pairs of atomic events, which are called A and B. Concurrency is formally defined as:

Definition 3.1.5 Two atomic events eAi and eAj are concurrent iff ∃tu such that

(tu ≥ eAi.tb) ∧ (tu ≥ eAj.tb) ∧ (tu ≤ eAi.te) ∧ (tu ≤ eAj.te)

For two atomic events to be concurrent there must be a point in time, i.e., a timestamp, where both atomic events overlap. Note that in the previous and the following definitions we use dot notation to denote the attributes, i.e., eAi.tb refers to the tb attribute of eAi. Two events are consecutive when they do not overlap.

Definition 3.1.6 Two atomic events eAi and eAj are consecutive iff eAi.te < eAj.tb.
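
The two definitions translate directly into code: two intervals are concurrent exactly when max(tb_i, tb_j) ≤ min(te_i, te_j), and consecutive when the first ends strictly before the second begins. The following sketch is our own illustration, using long integers as timestamps; it also carries the duration δ from Definition 3.1.2.

    // Direct transcription of Definitions 3.1.5 and 3.1.6; the record is
    // our own illustration of an atomic event's temporal attributes.
    record Interval(long tb, long te) {        // tb <= te (Definition 3.1.4)
        long duration() { return te - tb; }    // delta = te - tb (Definition 3.1.2)
    }

    class TemporalRelations {
        // Definition 3.1.5: some timestamp tu lies within both intervals,
        // which holds exactly when max(tb_i, tb_j) <= min(te_i, te_j).
        static boolean concurrent(Interval i, Interval j) {
            return Math.max(i.tb(), j.tb()) <= Math.min(i.te(), j.te());
        }

        // Definition 3.1.6: i ends strictly before j begins.
        static boolean consecutive(Interval i, Interval j) {
            return i.te() < j.tb();
        }
    }

With minutes as the time unit, the dinner constraint above becomes dinner.duration() >= 20.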

A set of atomic events can be part of a complex event.

Definition 3.1.7 A complex event eC is a set of N atomic events: eC = {eA0, . . . , eAN−1}.

Since eC is a set, the atomic events can be unordered. However, the atomic events have timestamps that state exactly when they occur. For instance, eA1 and eA2 both happen at 16:54hr, i.e., eA1.tb = 16:54hr ∧ eA1.te = 16:54hr ∧ eA2.tb = 16:54hr ∧ eA2.te = 16:54hr. The complex event eC1 contains the two atomic events. Hence, eC1 = {eA1, eA2}.

Figure 3.3: Overview of the V, N and E sets: V contains all sequences of data tuples related to all queries, N the sequences related to one particular query, and E the sequences that match that query.

It is unnecessary to use operators or relations between the atomic events; eC implicitly contains all consecutive and concurrent relations between any two atomic events. However, when the application programmer wants to describe that two atomic events are concurrent, he is required to use a query language. The query language that uses the concepts of CommonSens is described in Section 3.4.

Finally, an event might not occur when it is supposed to, i.e., when someone has declared interest in it. In CommonSens, a non-occurrence of an event is called a deviation. Deviation detection is the process of identifying the states and state transitions which do not match the queries, i.e., the process of identifying the non-occurrence of events. For instance, it might be expected that the monitored person takes his medicine between 08:00hr and 09:00hr every day. This complex event is stated through a query, and if the monitored person does not take his medicine during the defined interval, this is interpreted as a deviation from the complex event. The application programmer explicitly writes in the query that he is interested in a notification about the deviation, and he is given a notification if the complex event does not occur. In the remaining text we use the term event for both complex and atomic events. In the following we explain our interpretation of deviations, and how they differ from events.

We use terms from set theory to separate state values, events and deviations. This approach defines state values, complex and atomic events, and deviations from events as three related sets E ⊆ N ⊆ V. The subset relation of the three sets is illustrated in Figure 3.3.

V is defined by all the sensors that are used in the queries and contains all state values that might theoretically be read from the real world. This means that V contains all possible sequences of data tuples within the range of state variables the sensors can read. Consequently, V can be very large. However, the temporal properties in the queries limit the time interval that defines V.

We want to identify the possible deviations from any given event that is described in a complex query. Therefore, we need to identify a subset of V that is defined by each running instantiated complex query. The set N contains all possible sequences of state values that might be read by the sensors that instantiate a given complex query. Since N contains all possible sequences of state values, there might exist sequences in N that do not match the query.

a subset of N that contains all the sequences of data tuples that actually match a query. These sequences belong to the set E. The deviation from an event depends on the N and E sets and is


defined as:

Definition 3.1.8 A deviationD from an event is a set of data tuple sequences D = N \ E.

D is the difference between the N and E sets. Deviation detection is an important addition to traditional event processing because it provides a simpler way to solve the challenges related to detecting dangerous situations. In addition, deviation detection helps simplify the work of the application programmer. Instead of explicitly describing everything that can go wrong, the application programmer only has to describe normal behaviour. If instructed to, CommonSens automatically detects the deviations if the events do not occur, and sends a notification.
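
As an illustration, over finite, enumerable sets of sequences, deviation detection reduces to a plain set difference. The following Python sketch makes this concrete; all names in it are ours, not CommonSens APIs, and real N and E sets are of course far too large to enumerate like this:

    # A minimal sketch (not the thesis implementation): deviations as the
    # set difference D = N \ E over a toy universe of sequences.

    def deviations(possible_sequences, query_matches):
        """Return D = N \\ E: every sequence the instantiated query could
        observe (N) that does not match the query (E)."""
        return possible_sequences - query_matches

    # Toy example: three possible readings, one of which matches the query.
    N = {("Medicine", "taken"), ("Medicine", "not_taken"), ("Medicine", "dropped")}
    E = {("Medicine", "taken")}

    for d in deviations(N, E):
        print("deviation:", d)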

3.2 Environment Model

The environment, or home, is the physical space where the monitored person lives. CommonSens uses an environment model to identify how the sensor signals are affected when the sensors are placed in the home. The goal is that all LoIs specified in the queries are covered by corresponding sensors. The environment is described by a configuration of objects. These objects could be rooms, walls, furniture, etc. Once an object is defined, it can be reused in any other instantiation of any other home. Objects have two core properties: their shape, and how they impact different signal types (e.g. radio and light), i.e., their permeability.

Definition 3.2.1 A shape s is a set of coordinates:

s = {(x, y, z)0, . . . , (x, y, z)N−1}

The triplets in s describe the convex hull (boundary) of the shape. All triplet values are relative to (x, y, z)0, i.e., the base of the coordinate system.

While each object, e.g. a wall, has one shape, it can have several permeability values for different signal types. For instance, a wall stops light signals, while a radio signal might only be partially reduced by the wall. Therefore, it is important to identify how permeable objects are with regard to different types of signals.

Definition 3.2.2 Permeability p is a tuple: p = (val, γ). val ∈ [−1, 1] is the value of the permeability. γ denotes which signal type this permeability value is valid for.

The lower p.val is, the lower the permeability is. If the permeability value is 0, the signal does not pass. If p.val is negative, the signal is reflected. The object is defined as follows.

Definition 3.2.3 An object ξ is a tuple: ξ = (P, s). P = {p0, . . . , pN−1} is a set of permeability tuples. s is the shape.

We use ξ.P to support that an object can have permeability values for many different signal types. Finally, the environment is defined as an instance of a set of related objects.


Definition 3.2.4 An environment α is a set of objects: α = {ξ0, . . . , ξN−1}. Every ξi ∈ α \ {ξ0} is relative to ξ0.

In the definition of the shape s we state that all triplet values in the shape ξi.s of an object are relative to ξi.s.(x, y, z)0. In an environment αy where ξi is located, ξi.s.(x, y, z)0 is relative to ξ0.s.(x, y, z)0, which is set to (0, 0, 0). Since all the triplet values in the shapes are relative to a base, only the base values have to be changed when the objects are reused in different environments.

In the environment, it is possible for two objects to overlap or share the same space. This is

especially true in the case where objects are placed in a room, since the rooms are also objects. In cases where two such objects have permeability values for the same type of signal, the lowest value of the two applies to both. For example, a table is located in a room, and the table stops light signals. Therefore, regarding light signals, in the coordinates that are shared by the table and the room, only the permeability values from the table apply.

Instances of the environment model can be reused when several objects are made of the same

material. A wall made of a given type of concrete can be defined once, i.e., the permeability values for the various types of signals need to be set only once. When using the concrete wall in an instance of an environment, only the physical dimensions need to be changed. In case there are several similar apartments that need to be instantiated, the same wall object type can be reused. If one defines permeability tuples for another signal type, these can be added to the template wall. The instances inherit the new permeability tuples if new sensors are added.
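
A minimal sketch of Definitions 3.2.1 to 3.2.4, including the lowest-value-wins rule for overlapping objects, could look as follows. The discrete coordinate grid and the class names are our own simplifications, not the CommonSens data structures:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Permeability:
        val: float   # in [-1, 1]; 0 blocks the signal, negative values reflect it
        gamma: str   # signal type this value is valid for, e.g. "light" or "radio"

    @dataclass
    class Obj:
        shape: set           # set of (x, y, z) triplets, relative to the object base
        permeability: dict   # signal type -> Permeability

    def permeability_at(environment, coord, gamma):
        """Overlap rule from the text: where objects share coordinates, the
        lowest permeability value for the signal type applies to both."""
        vals = [o.permeability[gamma].val
                for o in environment
                if coord in o.shape and gamma in o.permeability]
        return min(vals) if vals else 1.0  # open space: signal unaffected

    # A room and a table sharing one coordinate: the table's value wins for light.
    room = Obj({(x, y, 0) for x in range(5) for y in range(5)},
               {"light": Permeability(0.9, "light")})
    table = Obj({(2, 2, 0)}, {"light": Permeability(0.0, "light")})
    print(permeability_at([room, table], (2, 2, 0), "light"))  # -> 0.0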

3.3 Sensor Model

Traditionally, sensors read analogue signals from the environment and convert these into data tuples. Hence, a data tuple is the information that a sensor has obtained about a state in the real world. The data tuple is defined later in this section.

Our sensor model should achieve three objectives. First, when we have described the events,

CommonSens has to determine which type of sensors to use. Second, events might happen over time, so we want the sensors to utilise historical and stored data together with recent data tuples. Third, we want to aggregate data tuples from several sources of information. In order to meet the first objective, each sensor should provide a set of capabilities.

Definition 3.3.1 A capability c is the type of state variables a sensor can observe. This is given by a textual description: c = (description).

Capabilities like temperature reading or heart frequency reading return values of type integer. However, capabilities might be much more complex, like face recognition or fall detection. To capture all these possibilities in our model we use a string to describe sensors, such that the description in a particular implementation can be anything from simple data types, such as integers, to XML and database schemas [GMUW08]. The application programmer should not address particular sensors, but rather sensor capabilities. To enable CommonSens to bind capabilities to the correct sensors it is necessary to describe capabilities based on a well-defined vocabulary. The capabilities play an important role in how the queries and the sensors are mapped.


Figure 3.4: Example of how a wall reduces the coverage area of a camera.

The data tuples that the sensors produce are used to send information about the reading the sensor has performed.

Definition 3.3.2 A data tuple d = <φ, c, val, tb, te> consists of the sensor φ that produced the data tuple, the capability c, and the value val of the capability. tb is the timestamp that denotes the beginning of the event and te denotes the end time, where tb ≤ te.

For instance, a data tuple from a motion detector Mot_A23 that provides the capability MotionDetect and which reports true at 1:44.34pm could look like <Mot_A23, MotionDetect, true, 1:44.34pm, 1:44.34pm>.
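
As a sketch, the tuple of Definition 3.3.2 can be mirrored directly in code. The field names follow the definition; the timestamps are kept as strings (the 24-hour form of the time above) purely for brevity:

    from collections import namedtuple

    DataTuple = namedtuple("DataTuple", ["sensor", "capability", "val", "tb", "te"])

    d = DataTuple(sensor="Mot_A23", capability="MotionDetect",
                  val=True, tb="13:44:34", te="13:44:34")
    assert d.tb <= d.te  # Definition 3.3.2 requires tb <= te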

In order to meet the last two objectives, we define three distinct types of sensors. The physical sensor is responsible for converting the analogue signals to data tuples. The external source only provides stored data, and the logical sensor processes data tuples from all three sensor types and provides data tuples that result from this processing.

Definition 3.3.3 A physical sensor φP is a tuple: φP = (cov, γ, f, C). cov denotes the coverage of the physical sensor. The coverage of a physical sensor can either address specific objects or an area. γ is the type of signal this sensor sends or receives. f is the maximal sampling frequency, i.e., how many data tuples the physical sensor can produce every second. C = {c0, . . . , cN−1} is the set of capabilities that the sensor provides.

The physical sensor is limited to observing single states in the home. When the coverage of a physical sensor covers an area, this is denoted the coverage area. Usually, the producer of the physical sensor defines the coverage area as it is in an environment without obstacles. When the physical sensor is placed in the environment, the coverage area might be reduced due to objects in the environment that have permeability tuples that match the signal type of the current physical sensor. When a physical sensor covers the power line from a light switch, the capability describes the physical sensor's ability to tell whether or not the light is turned on. This coverage is not an area, and is not affected by the permeability tuples in the environment.

An example of how an object in the environment reduces the coverage area of a sensor is

shown in Figure 3.4. Figure 3.4 a) shows the coverage area of a video camera as it is defined by the producer. When the video camera is placed in the environment, its coverage area is reduced by e.g. a wall (see Figure 3.4 b)).

The external source φE includes data that is persistently stored. The main purpose of an

external source is to return data tuples from the storage.


[Figure 3.5 shows three capability hierarchies (PS: PhysicalSensor, ES: ExternalSource, LS: LogicalSensor): a logical FallDetected(personID) sensor fed by PS: Accelerometer(value) and ES: User(personID); a logical FallDetected(personID) sensor fed by PS: Camera(matrix), ES: FaceRecognition(personID) and ES: FallDetected(true/false); and a logical TakingMedication(type) sensor fed by PS: Camera(matrix) and ES: MedicationTaken(type), next to a plain PS: TakingMedication(type).]

Figure 3.5: Examples of capability hierarchies for detecting falls and taking medication.

Definition 3.3.4 The external source φE is an attribute: φE = (C). C is the set of the capabilities it provides.

In contrast to the physical sensor, the external source does not obtain readings directly from the environment. Instead, it utilises historical and stored data, which can later be aggregated with recent data tuples. This is practical for applications like face recognition, e.g. to identify the monitored person when he is taking medicine. Thus, we do not include attributes like coverage. For instance, we allow DBMSs and file systems to act as external sources, as long as they provide data tuples that can be used by CommonSens.

The information that can be obtained from an external source can for instance be Haar

classifiers for object and face recognition [WF06]. The information from the external source can be aggregated with images from a camera in order to detect objects or faces.

In order to perform aggregations between, for instance, historical and recent data tuples or

Haar classifiers and images, we finally define the logical sensor, which performs computations on data tuples from other sensors.

Definition 3.3.5 The logical sensor is a tuple: φL = (Cd, aggL, Cp, f). Cd is the set of all the capabilities it depends on and aggL is a user-defined function that aggregates the data tuples from Cd. Cp is the set of capabilities it provides. f is defined as for physical sensors but depends on aggL and Cd.

It should be noted that the definition of logical sensors does not specify which sensors provide the input data tuples. Instead it specifies the capabilities required to provide the input data tuples. These capabilities can in turn be capabilities of external sources, physical sensors or logical sensors. This leads to a hierarchical dependency structure of capabilities with arbitrary depth, which we call a vocabulary tree. The leaves in a vocabulary tree are capabilities provided either by physical sensors or by external sources. Each capability is a unique keyword describing what a sensor is able to sense. As such, the keywords bring semantics into the system, and the vocabulary tree can be seen as a simplified version of an ontology. Vocabulary trees are our main mechanism to model logical sensors that detect complex events, like fall detection or object recognition, which cannot be modelled by standard data types like integer.

Figure 3.5 shows an example of three simple vocabulary trees with capabilities for detecting

falls and detecting that medication is taken. FallDetected is provided by a logical sensor that either


depends on the Accelerometer capability together with information about the monitored person (personID), or on the combination of the Camera capability and capabilities for face recognition and fall detection. The available sensors define which of the two FallDetected and TakingMedication configurations is chosen. For instance, in one environment there are already installed sensors that provide the capabilities Accelerometer and User. This makes it more convenient for CommonSens to instantiate the FallDetected that depends on these two capabilities. The figure shows a possible instance with sensors that provide the needed capabilities. The two capabilities Accelerometer and User are provided by a physical sensor and an external source, respectively. TakingMedication is provided by a physical sensor, which e.g. uses an RFID tag/reader pair to detect that the monitored person holds the medication, or by a logical sensor that depends on the capabilities Camera and MedicationTaken.

3.4 Query Based Event Language

This section defines the query language that the application programmer can use to describe events. The basic idea is that the application programmer formulates queries that describe complex events in the environments. Our query language supports the reusability concept of CommonSens by letting the application programmer address only LoIs, capabilities and temporal properties. The application programmer can explicitly denote whether he is interested in the events or in the deviations from the events. First, we present the semantics of the query language. Second, we present the syntax.

3.4.1 Semantics

The query language uses capabilities from the sensor model, and LoIs and timestamps from the event model, to support reuse. In order to detect an event, the application programmer assigns conditions to the capabilities. A query that addresses one atomic event is called an atomic query.

Definition 3.4.1 An atomic query qA is described by a tuple: qA = (cond, loi, tb, te, preg). cond is a triplet (c, op, val), where c is the capability, op ∈ {=, ≠, <, ≤, >, ≥} is the operator, and val is the expected value of the capability. If set, loi, tb, te and preg specify the spatial and temporal properties.

Since a capability is a type of state variable, cond is used to describe a state. For instance, in order to detect motion, which is a capability, the condition can be Motion = True. If the application programmer wants to detect motion in the kitchen, he sets loi = Kitchen. In addition, the application programmer can set the temporal properties in order to specify when motion is expected to be detected in the kitchen. The temporal properties are discussed below. When complex events need to be described, two or more atomic queries have to be used. These are called complex queries.


[Figure 3.6 plots four accepted combinations of hourly readings between 08:00 and 23:00, each reading being either Temperature > 19C or Temperature <= 19C.]

Figure 3.6: Examples of allowed sequences in a P-registered query.

Definition 3.4.2 A complex query qC is a list of atomic queries, and logical operators and relations ρi between them: qC = {qA0 ρ0 . . . ρN−2 qAN−1}. If the complex query only consists of one atomic query, ρ0 = ∅.

The logical operators in the query language are ∧, ∨ and ¬. If any other logical operator is needed, these three operators can be used to define it. Complex queries describe complex events and state transitions. In the following, the term query, denoted q, covers both atomic and complex queries. An example of a complex event is that the monitored person is expected to get out of bed in the morning, take medication and make breakfast. Taking medication consists of standing close to the medication cupboard and taking two different medicines. The breakfast consists of opening the fridge, taking out food, preparing food, eating and cleaning up within a certain period of time.

In order to describe consecutive events, like the order of the events while making breakfast,

we use one temporal relation, i.e., the followed by query relation (→).

Definition 3.4.3 The followed by relation→ between two queries qz and qx:

{qz → qx ⇔ qz.te < qx.tb }.

With respect to the temporal properties of the events, the application programmer should be able to write two types of queries. The first type should explicitly denote when an event should happen, e.g. between 16:00hr and 18:00hr. The other type should denote how long an event lasts once it is detected. These two types of queries are called timed queries and δ-timed queries. A timed query has defined tb and te values, which means that the two timestamps denote exactly the duration of the event. δ-timed queries define the duration of the events when they first start. These are denoted by overloading the semantics and setting only tb. A timed or δ-timed query can be percent-registered (P-registered) or not. This is defined in preg.

A P-registered query accepts (1) atomic events that last for minimum or maximum the

specified duration of the query, and (2) sequences of atomic events where the duration from the last state value to the first state value equals the specified time interval or duration, and where the percentage of the atomic events that satisfy the condition matches the maximum or minimum specified in the P-registration. preg is therefore a tuple (m, val), where m = min ∨ max and


val is a percentage value between 0% and 100%. In order to explain P-registration we define the data tuple sequence seq, a temporally ordered list of data tuples.

Definition 3.4.4 seq = {d0 ≺ . . . ≺ dn | ∀di, di+1 : di.tb ≤ di+1.tb}

A sequence may contain pairs of data tuples that share a begin time, i.e., the data tuples in a sequence occur concurrently or consecutively.

Concerning Point (2) of what a P-registered query accepts, CommonSens accepts all sequential combinations of events where the ratio and duration are correct. A δ-timed or timed atomic query qA where the P-registration is not defined is equivalent to a similar query with qA.preg = {min, 100%}.

We show the semantics of P-registration with a simple example.

We assume that the environment consists of only one sensor, which provides the capability Temperature. The sensor covers the LoI kitchen. A query states that the temperature in the kitchen should be above 19◦C at least 80% of the time between 08:00hr and 23:00hr.

To simplify the example, we assume that the sensor is pulled once every hour. This gives 15

temperature state readings in total. According to the query, 12 temperature readings (80% of 15) have to be above 19◦C. Figure 3.6 shows a small set of the legal combinations of state values that match the query. Combination 1 shows a consecutive sequence of accepted readings that match the condition 12 times. Combinations 2 and 3 show accepted combinations of events which still satisfy the P-registration. Since the query states that a minimum of 80% of the events should match the condition, Combination 4 is also accepted. In addition, a complex query can be timed or δ-timed, and P-registered.
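
To make these semantics concrete, the following Python sketch (illustrative, not CommonSens code) checks a complete sequence of hourly readings against a P-registration preg = (m, val):

    def p_registered_ok(readings, condition, m, val):
        """Check a complete sequence of readings against a P-registration
        preg = (m, val), where m is "min" or "max" and val is a percentage."""
        ratio = 100.0 * sum(1 for r in readings if condition(r)) / len(readings)
        return ratio >= val if m == "min" else ratio <= val

    # 15 hourly readings between 08:00hr and 23:00hr; exactly 12 (80%) are above 19.
    readings = [20, 21, 20, 18, 20, 22, 21, 20, 19, 20, 19, 20, 21, 20, 20]
    print(p_registered_ok(readings, lambda t: t > 19, "min", 80.0))  # -> True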

It is not required that an atomic or complex query is P-registered. If a timed or δ-timed query is not P-registered, all the data tuples that arrive during the evaluation have to match the condition. If an atomic query is not timed, it cannot be P-registered. In that case it is sufficient that the condition is matched once for the evaluation to be successful. It is impossible to get deviations from such atomic queries, because they have no temporal properties. On the other hand, this does not apply to complex queries that are not timed or P-registered: there, the temporal properties of the atomic queries apply.

In addition to using logical operators and the → relation to describe complex events, the

application programmer can use five concurrency operators to describe temporal dependency between queries. These concurrency operators match the temporal intervals introduced by Allen [All83]; however, the two intervals in Allen's work that support consecutiveness (takes place before and meets) are covered by the followed by relation.

Definition 3.4.5 The concurrency operators:

{EQUALS(qz, qx) | qz.tb = qx.tb ∧ qz.te = qx.te}

{STARTS(qz, qx) | qz.tb = qx.tb ∧ qz.te < qx.te}

{FINISHES(qz, qx) | qz.tb < qx.tb ∧ qz.te = qx.te}

{DURING(qz, qx) | qz.tb > qx.tb ∧ qz.te < qx.te}

{OVERLAPS(qz, qx) | qz.tb > qx.tb ∧ qz.te > qx.te}
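
As an illustration, the following Python sketch evaluates the five operators and the followed by relation on queries reduced to their (tb, te) timestamps. The inequalities mirror Definitions 3.4.3 and 3.4.5 exactly; the helper names are ours, not the CommonSens API:

    def equals(qz, qx):      return qz[0] == qx[0] and qz[1] == qx[1]
    def starts(qz, qx):      return qz[0] == qx[0] and qz[1] < qx[1]
    def finishes(qz, qx):    return qz[0] < qx[0] and qz[1] == qx[1]
    def during(qz, qx):      return qz[0] > qx[0] and qz[1] < qx[1]
    def overlaps(qz, qx):    return qz[0] > qx[0] and qz[1] > qx[1]
    def followed_by(qz, qx): return qz[1] < qx[0]  # Definition 3.4.3

    # Taking medicine (minute 605-607) during standing at the cupboard (604-610).
    print(during((605, 607), (604, 610)))  # -> True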


The five concurrency classes correspond to the classes shown in Figure 3.2, and are meant to be used by the application programmer to describe concurrent events. For timed statements, the application programmer explicitly writes the timestamps. If the application programmer wants to state that one event should last 5 seconds longer than the other, we denote this by setting te = 5 seconds. However, by using the concurrency classes, the application programmer does not have to specify explicit timestamps. For instance, if the application programmer wants to state that the medication should be taken while the monitored person stands close to the medicine cupboard, there is no need to specify the timing of the event. The system detects that the person stands close to the cupboard by reading the relevant sensors. If the monitored person starts taking medicine, CommonSens starts evaluating the complex query. If the application programmer has specified that the deviations from this query are interesting, CommonSens sends a notification if the monitored person brings his medicine from the medication cupboard and takes it elsewhere. An example of a query that describes this complex event is shown in the following section.

3.4.2 Syntax

The grammar of our query language is based on contributions from existing complex event processing systems. For instance, we use the consecutiveness relation →, which is used for detecting consecutive patterns in the Esper CEP library [esp].

The grammar is shown in extended BNF (EBNF) in Figure 3.7. We begin with a bottom-up

presentation of the grammar; however, we do not include definitions of strings and numbers. Timing involves using timestamps to state how long events should occur. If only one timestamp is used, the query is δ-timed. If the query contains two timestamps, the query is timed and the event should happen within the given interval. Finally, the P-registration is indicated by max or min and a double that sets the percentage. The atomic events are described by atomic queries, which are defined through Atomic. An atomic query always contains the capability, one of the operators in Operator, and the expected value of the capability. In addition, the atomic query can contain the LoI and temporal properties. The query language contains the common logical operators, &&, ||, and !, and the consecutiveness relation ->. The symbols in ConcOp match the five concurrency classes based on Allen's temporal classes [All83]. The grammar for concurrency is defined in QConcurrent. It takes as arguments two chains of queries.

A chain of queries, denoted Chain, contains Queries and possibly Timing. Queries

can contain one or more atomic or concurrent queries. The atomic queries are related either with a relation or with one or more logical operators. If the dev operator is applied to a chain, the application programmer is interested in notifications only if there are deviations from the chain of queries. The root of the parsing tree is the symbol QComplex, which denotes the complex query. QComplex can consist of one or more chains, which are connected with relations.



QComplex= ’[’ Chain ’]’ (Relation ’[’ Chain ’]’ )*;

Chain= Queries (’,’ Timing)? | ’dev(’ Queries (’,’ Timing)? ’)’;

Queries= (Atomic | QConcurrent) ((Relation | (LogicalOperator)+) (Atomic | QConcurrent))*;

QConcurrent= ConcOp ’(’ ’[’ Chain ’]’ ’,’ ’[’ Chain ’]’ ’)’;

ConcOp= ’equals’ | ’starts’ | ’finishes’ | ’during’ | ’overlaps’;

Relation= ’->’;

LogicalOperator= ’&&’ | ’||’ | ’!’;

Atomic= ’(’ Capability Operator Value ’)’ |’(’ Capability Operator Value ’,’ LoI ’)’ |’(’ Capability Operator Value ’,’ Timing ’)’;’(’ Capability Operator Value ’,’ LoI ’,’ Timing ’)’;

Operator= ’==’ | ’!=’ | ’<’ | ’>’ | ’<=’ | ’>=’;

Timing= Timestamp |
        Timestamp ',' Timestamp |
        Timestamp ',' ('max' | 'min') ' ' Double '%' |
        Timestamp ',' Timestamp ',' ('max' | 'min') ' ' Double '%';

Figure 3.7: Our query language as written in EBNF.


An example is a complex query that describes that the monitored person Adam is supposed to be located at the LoI MedCupboard (the medicine cupboard) when taking the two medicines he is prescribed:

[dev(during([(TakingMedication == Med1) ->
             (TakingMedication == Med2)],
            [(DetectPerson == Adam, MedCupboard)]))]

When the monitored person starts taking Med1, the query evaluation starts, and both medicines should have been taken before Adam moves out of the LoI MedCupboard. If he only takes Med1 and moves out of the LoI, CommonSens interprets this as a deviation, since dev is used in the query.

3.5 Discussion and Conclusion

In this chapter we define the three fundamental models in CommonSens: the event model, the environment model and the sensor model. The event model shows our definition and interpretation of events. Our event definition limits events to states and state transitions in the environment that someone has declared interest in, i.e., events are concrete states and state transitions that occur in time and space. This means that events can have duration. The spatial property of an event is defined by a LoI, which is a set of coordinates. An atomic event cannot be divided into smaller events. A complex event contains a set of consecutive and concurrent events. A deviation is the non-occurrence of an event, i.e., it occurs when the event does not occur.

In automated home care, events can only be detected by sensors, and these sensors are

affected by the environment they are in. The environment is a general description of the home where the monitored person lives and where the events occur. The environment consists of objects with shapes and permeability values. The permeability values are included in order to define how the sensor signals are affected. In order to simplify the work of the application programmer, the objects can be reused. This means that once an object is defined it can be reused in several environments.

We want sensors to support three main features. The sensors should read states from the

environment, utilise historical data, and be able to aggregate data from several sensors. This gives three different types of sensors: (1) the physical sensor, which reads analogue states from the environment and turns them into data tuples, (2) the external source, which returns stored data, and (3) the logical sensor, which aggregates and processes data tuples from different sensors. In addition, we want sensors only to provide capabilities, i.e., the state variables they can observe. The capabilities a sensor provides and the sensors themselves are loosely coupled. This means that one sensor can provide several capabilities and one capability can be provided by several different sensors. This allows the application programmer to write general queries that do not address the sensors directly. Only during an instantiation phase are the sensors connected with the capabilities. This allows queries to be reused in many environments with many different types of sensors.

The query language allows the application programmer to describe complex events. Instead

of addressing concrete instances, the queries only address capabilities and LoIs. In addition, they have temporal properties.

In the following chapter we show how the models are used. We show how the queries are

instantiated, i.e., how the general query is mapped to a given instance.


Chapter 4

Instantiation and Event Processing

In order to show the combination and interworking of the concepts in CommonSens, we have created a simple workflow schema that shows the life cycle phases of CommonSens. This chapter explains how CommonSens uses the data model concepts and the query language during the three system life cycle phases (see Figure 4.1): (1) a priori, (2) event processing, and (3) system shut down.

In the a priori phase, the environment is modelled with new objects, or objects are reused

from a repository. The sensor model is independent of a particular application instance and describes the available sensors. Depending on the needs of the monitored person, the application programmer writes queries or reuses queries from a repository. The system uses the capabilities and LoIs to select the relevant sensor types during query instantiation. The results of this step are instantiated queries that refer to particular sensors in the home instead of capabilities and LoIs. Furthermore, the system supports the sensor placement process and calculates whether sensors are appropriately placed to detect all events. When all the queries are instantiated, the queries are turned into an event processing model that can be evaluated by CommonSens. The event processing phase consists of evaluating data tuples from the sensors against the conditions in the current atomic queries. The third phase in the life cycle is the system shut down, where e.g. adjustments to the queries can be performed.

[Figure 4.1 shows the workflow: in the a priori phase (1), the environment model, the sensor model and the queries feed sensor placement, query instantiation and event processing model creation; the event processing phase (2) consists of data gathering and evaluation; phase (3) is the system shut down.]

Figure 4.1: Life cycle phases and concepts of CommonSens.


Section 4.1 discusses how to place the sensors in order to detect spatial events, i.e., events that occur in specific LoIs in the environment. In Section 4.2 we define the query instantiation. Finally, the event processing model and event processing are discussed in Section 4.3.

4.1 Sensor Placement

In this section we present how CommonSens uses the environment model and the sensor model to support sensor placement. Sensor placement is important for the detection of spatial events, i.e., events that occur in the environment and inside the coverage areas of the sensors. Sensors that cover single objects, e.g. light switches, are not considered in this section. We first describe how to use existing signal propagation models and describe an algorithm that calculates how the signal is affected by the permeability values of objects. In addition we present how coverage areas are modelled in CommonSens. Second, we discuss how to approximate the LoI by using multiple sensors and how to calculate the probability of false positives.

4.1.1 Coverage Area Calculation

In order to calculate how a particular sensor installation is affected by the environment, we use existing signal propagation models for various types of signals. Signal processing is a complex issue which is out of the scope of our work. On the other hand, CommonSens is designed to accept all types of signal propagation models. The only requirement is that the signal model returns the signal strength when the initial signal strength, the current permeability value, and the distance from the source are given as input. We assume that the signal strength decreases with the distance from the source. When the signal strength reaches a certain threshold m we regard the signal as too weak to guarantee a correct reading. Determining the reliability of the measurements made by the sensor is a signal processing issue and out of the scope of our work. We assume that every state measured by the sensor is correctly measured. Thus, the distance between the sensor and the coordinate where the signal strength reaches m defines the real coverage range. We use the term range for the distance between two coordinates. To exemplify how our system handles signal propagation we adapt a simple model from Ding et al. [DLT+07].

Signal Model 1 The signal strength S at distance d from the sensor:

    S(P0, β, d) = P0                if d < d0
    S(P0, β, d) = P0 · (d/d0)^β     otherwise

d0 denotes the physical size of the sensor, and can be deduced from the sensor model. P0 is the signal strength at distance 0. The exponent β is based on the permeability value of the current object.

Since the permeability value is between −1 and 1, it sometimes has to be converted so that it matches the signal model. For instance, for Signal Model 1, the β value is based on the permeability value. However, since β is an exponent, the permeability value cannot be used directly: β = 0 gives P0 rather than 0. Hence, we need additional functions that map the signal model to our concept of permeability.
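
As an illustration, here is a minimal Python sketch of Signal Model 1. The thesis leaves the mapping from permeability to β to the concrete signal model, so the linear compute_beta below is purely an assumption of ours, and none of the names are CommonSens APIs:

    def compute_beta(permeability):
        """Hypothetical mapping from a permeability value in (0, 1] to a
        (negative) exponent beta: full permeability gives mild decay,
        lower permeability steeper decay. Reflection (p < 0) and p == 0
        must be handled outside the model, as discussed in the text."""
        return -2.0 * (2.0 - permeability)

    def signal_strength(p0, beta, d, d0=0.1):
        """Signal Model 1: S(P0, beta, d), with d0 the sensor's physical size."""
        if d < d0:
            return p0
        return p0 * (d / d0) ** beta

    # Strength 3 m from the sensor, through material with permeability 0.9.
    print(signal_strength(1.0, compute_beta(0.9), 3.0))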


[Figure 4.2 shows a physical sensor whose signals pass through an object in two directions, a) perpendicular and b) oblique, creating the intervals i1 to i6.]

Figure 4.2: Signals that are sent through the objects in two directions and which create intervals.

Signal Model 1 also needs to be overridden with additional logic when the permeability value of an object is 0. Note that Signal Model 1 does not handle reflection, i.e., permeability values

lower than 0. In addition, multipath fading issues are regarded as future work.

It is important to identify all objects that could reduce the coverage area of the sensor. This

includes all objects that are located between the physical sensor and the maximal range of the signal, i.e., where the signal strength reaches m. From the environment model all objects with their locations and permeability values are known. With this information, it is possible to divide the range of the signal into a number of intervals. Each interval corresponds to an object the signal may pass through. The length of the interval is the distance between the coordinate where the signal enters the object and the coordinate where the signal leaves the object. Figure 4.2 shows the separate intervals and objects. The signals that are sent from a physical sensor pass through an object in many directions. In the figure this is illustrated by showing two signals that go in two different directions from a physical sensor. Signal a) is perpendicular to the object and signal b) is not. Therefore, the distance through the object for signal a) is shorter than the distance for signal b). Hence, the two signals generate two separate intervals through the object. These two intervals are denoted i2 and i5 and have different lengths.

To calculate the signal propagation through objects, we apply concepts from ray tracing

and model the signal as a set of rays. A ray denotes the coordinates the signal covers when it passes through objects until its signal strength reaches m. A coordinate is a triplet (x, y, z) in the environment. Hence, the set of coordinates in a ray defines the range of the signal.

Definition 4.1.1 A ray is a straight line from coordinate (x, y, z)B to (x, y, z)E that consists of an ordered set of intervals:

ray((x, y, z)B, (x, y, z)E) = {val0, . . . , valN−1}

val = ((x, y, z)b, (x, y, z)e, p), where the Euclidean distance between (x, y, z)b and (x, y, z)e denotes the length of the interval and p denotes the permeability value. We set val0.(x, y, z)b = (x, y, z)B.

The Euclidean distance between two coordinates (x, y, z)i and (x, y, z)j is defined as a function DIST, which takes the two coordinates as arguments.

DIST((x, y, z)i, (x, y, z)j) = √((xi − xj)² + (yi − yj)² + (zi − zj)²)

Henceforth, when referring to the Euclidean distance between two coordinates (x, y, z)i and (x, y, z)j, we use the term distance or use DIST.

For each of the intervals a signal passes, the signal model is applied with the permeability

value of the object and the length of the interval. This makes it possible to have a simple algorithm that calculates how much the signal strength is reduced. To begin with, P0 is set to the initial signal strength and d equals the length of the first interval. This gives the signal strength when the signal leaves the sensor. For the next interval, P0 and d are set to the current signal strength and interval length. This operation continues until the signal strength reaches a value that is lower than m or the signal reaches an object with permeability value 0.

The algorithm is defined in a function called REDUCE, which takes as arguments a signal

type γex, the threshold value m, and the coordinate (x, y, z)φPex where the physical sensor φPex is located. From the environment model we get all the objects between the sensor and the point where the signal strength initially reaches m. From the sensor model it is possible to obtain the current direction of the signal. The algorithm consists of a loop that adds new val tuples to the ray and runs until P0 < m. The function returns a ray with the real coverage range. The algorithm is presented in Figure 4.3. The algorithm uses several helper functions in order to adapt to the current signal strength model.

Initially, REDUCE uses a function MAXSTRENGTH, which takes as argument the signal type

and sets P0 to the maximum signal strength. For any coordinate in the environment the system uses the environment model to identify which objects are located at that point. To do this the system uses a function called OBJECT, which returns the current objects. This is done in Line 4.

In Lines 6 and 7 the coordinates where the signal enters and leaves the object are obtained

from the environment model by using two functions called GETB and GETE. In order to obtain the permeability value of the object, a function GETPERMEABILITY returns the permeability value for the signal type in the object. This is done in Line 8. In Line 10 the β value is set by a function COMPUTEBETA, which maps the current permeability value to the β value used in the signal model. In the next line, P0 is calculated by evaluating the current signal model, using the previous P0 as input. While the signal strength is higher than or equal to m, new val tuples are added to the ray. This is done by a function called ASSIGN. The new P0 value is set by calling S(P0, β, d) in Line 11.

In Line 12 the algorithm checks whether the signal strength has dropped below m. If so, the

signal reaches m inside the object. In Line 13 the algorithm uses a function GETREMAININGDISTANCE, which returns the distance from the edge of the object to the coordinate where the signal strength reaches m. (x, y, z)e is set by calling a function COORD that uses the environment model to find this coordinate. Finally, REDUCE returns the ray. The real coverage range equals the length of the ray and is given by DIST(ray.val0.(x, y, z)b, ray.valN−1.(x, y, z)e), where |ray| = N.


Require: γex, m, (x, y, z)φPex

1: P0 ← MAXSTRENGTH(γex)
2: (x, y, z)t ← (x, y, z)s
3: i ← 0
4: objtmp ← OBJECT
5: while P0 ≥ m do
6:   vali.(x, y, z)b ← GETB(objtmp)
7:   vali.(x, y, z)e ← GETE(objtmp)
8:   vali.p ← GETPERMEABILITY(objtmp.p, γex)
9:   d ← DIST(vali.(x, y, z)b, vali.(x, y, z)e)
10:  β ← COMPUTEBETA(vali.p)
11:  P0 ← S(P0, β, d)
12:  if P0 < m then
13:    d ← GETREMAININGDISTANCE
14:    vali.(x, y, z)e ← COORD(vali.(x, y, z)e, d)
15:  end if
16:  ASSIGN(ray((x, y, z)s), vali)
17:  i ← i + 1
18: end while
19: return ray((x, y, z)s)

Figure 4.3: The REDUCE algorithm.

An example is illustrated in Figure 4.4. Every object except the last corresponds to one interval,

and in the figure there are five objects, Object A to Object E. Object A, Object C and Object E represent the open space of three rooms. These three objects all have the same permeability value, p1. The two remaining objects, Object B and Object D, represent walls between those rooms and have permeability values p2 and p3, respectively. These permeability values are lower than p1. The signal strength starts at 1 and decreases in Object A. Since p2 is lower than p1, the signal strength decreases faster in Object B. In Object C, p1 slows the decrease again. Up to Object E, the objects and intervals correspond. The interval i5 starts in Object E, but is shorter than the object, because the signal strength reaches m there.
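
The interval-wise attenuation that REDUCE performs can be sketched as follows. This is a deliberate simplification of Figure 4.3 under stated assumptions: it works on one ray whose intervals are given as (length, β) pairs, and it replaces GETREMAININGDISTANCE with a coarse numeric search:

    def reduce_range(p0, m, intervals, d0=0.1):
        """Return the covered distance along one ray, cutting the last
        interval short where the signal strength drops below the threshold m."""
        covered = 0.0
        for length, beta in intervals:
            strength_after = p0 * (max(length, d0) / d0) ** beta  # Signal Model 1
            if strength_after < m:
                # Walk into the interval until the strength reaches m
                # (coarse numeric stand-in for GETREMAININGDISTANCE).
                step = length / 100.0
                d = step
                while d < length and p0 * (max(d, d0) / d0) ** beta >= m:
                    d += step
                return covered + d
            covered += length
            p0 = strength_after
        return covered

    # Room (mild decay), wall (steep decay), room again.
    print(reduce_range(1.0, 1e-4, [(3.0, -1.0), (0.3, -4.0), (3.0, -1.0)]))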

The coverage area of the physical sensor is modelled as a set of rays. The rays go in the directions in which the sensor sends its signals, or respectively from which it receives signals.

Definition 4.1.2 The coverage area cov is a set of rays:

    cov = { ray((x, y, z)0, (x, y, z)1),
            ray((x, y, z)1, (x, y, z)2),
            . . . ,
            ray((x, y, z)N−2, (x, y, z)N−1),
            ray((x, y, z)0, (x, y, z)N−1) }

The rays that originate from (x, y, z)0 denote the signals and they sample the coverage area. The rays that do not originate from (x, y, z)0 connect the ends of two neighbouring signal rays to


[Figure 4.4 plots the signal strength, starting at 1 and decreasing towards the threshold m, across Object A to Object E with permeability values p1, p2, p1, p3, p1, creating the intervals i1 to i5.]

Figure 4.4: Coverage range divided into several intervals with different permeability values.


Figure 4.5: A model of a circle with a set of rays.

define the outer bounds of the coverage area. The result is an approximation of the real coverage area, and the higher the number of samples, i.e., rays, the more accurate this approximation is.

Note that Definition 4.1.2 allows coverage areas to be instantiated in both two and three

dimensions. Figure 4.5 shows an example of how a circle is modelled with a set of rays in a) and where the outer bounds are defined in b). The circle it models is shown in c).

The real coverage area is found by applying the REDUCE algorithm to the rays in cov that

are not the outer bounds. Figure 4.6 illustrates how a reduced coverage area is created. In a) the rays model a circle, in b) some of the rays are reduced by an object, and in c) the final rays are added to define the outer range of the coverage area.

4.1.2 Sensor Placement

It is important that the LoIs are covered by sensors that provide the correct capabilities. Often it is not possible to reduce the coverage areas of sensors so that they cover only the LoIs. If more accurate detection of whether an object is in the LoI is needed, the readings from several sensors have to be combined. These could be sensors that confirm that an object is in their coverage area, which overlaps with the LoI, or sensors that do not cover the LoI and confirm that the object is not in their coverage area. A motion detector is an example of such a sensor. If there is motion somewhere in the coverage area of the motion detector, the motion detector



Figure 4.6: The rays are affected by an object and the coverage area is reduced.

[Figure 4.7 shows, in panels a) and b), coverage areas around LoIA, with parts of the ISEC and NOISEC sets marked.]

Figure 4.7: Using physical sensors to approximate LoIA.


[Figure 4.8 shows a) an RFID reader whose coverage area contains a passive RFID tag in three equivalent locations, and b) two sensors within each other's coverage areas in equivalent spatial relations.]

Figure 4.8: Examples of relations between sensors that give equivalent results.

reports this. However, the motion detector cannot report where in the coverage area the motion is observed. This issue also applies to radio based sensors, for instance RFID readers. Figure 4.8 a) shows an example where an RFID reader with a coverage area and a passive RFID tag are used. The RFID reader sends signals and responds to signals from RFID tags that are located within its coverage area. However, the RFID reader cannot determine the exact location of the tag. From the RFID reader's point of view, the three locations of the RFID tag are equivalent. MICAz motes or similar physical sensors that use radio signals face a similar issue. This is illustrated in Figure 4.8 b). Even though the two sensors have different spatial relations, both are within each other's coverage area and can notice the presence, but not the location, of the other. During event detection this issue might cause the system to report false positives. In the worst case, the monitored person might be located on the other side of a wall but mistakenly be interpreted as being inside the same room as the sensor. Therefore, it is important to approximate the LoIs as precisely as possible.

In order to approximate the LoIs, we first combine the readings from physical sensors whose

coverage areas cover the LoI. This gives an intersection of the coverage areas, which covers the LoI. Note that in the following definitions we use set-theoretical notation for the geometrical concepts. The intersection is defined as:

Definition 4.1.3 For all φPi in the set of physical sensors, ISEC(loi) gives the intersection of all the coverage areas that cover loi:

    ISEC(loi) = ⋂ {φPi.cov | loi ⊆ φPi.cov}


However, there may be setups with physical sensors whose coverage areas intersect with ISEC(loiy) but not with loiy. If these physical sensors provide the correct capabilities and do not report any readings while the ones that are part of ISEC(loiy) do, the probability that the event occurs inside the LoI is even higher.

Definition 4.1.4 NOISEC(loi) is the union of all the coverage areas that cover ISEC(loi) but not loi:

    NOISEC(loi) = ⋃ {φPj.cov | (φPj.cov ∩ loi = ∅) ∧ (φPj.cov ∩ ISEC(loi) ≠ ∅)}

The approximation of loi is therefore found using the following equation:

LOIAPPROX(loi) = ISEC(loi) \ NOISEC(loi)    (4.1)

The equation uses the difference between the two sets to remove the part of the shape that ISEC and NOISEC have in common.

A false positive can happen when an event is recognised outside loiy but inside the area

given by LOIAPPROX(loiy). In order to reduce the number of false positives, it is important to minimise the difference between the real LoI and the approximation. The probability of a false positive can be found using the following equation:

FPPROB(loi) = 1 − AREA(loi) / AREA(LOIAPPROX(loi))    (4.2)

AREA is a function that calculates the size of a surface, i.e., of the LoI and its approximation. A perfect match between the LoI and the approximation is when FPPROB(loi) = 0. Unfortunately, it is not possible to assume that there are always enough sensors to perfectly match all LoIs.
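
To make Equations 4.1 and 4.2 concrete, the following sketch rasterises coverage areas and LoIs onto a grid of cells, so that AREA becomes a cell count. The grid discretisation is our own simplification (CommonSens works with shapes and rays), and the sketch assumes at least one sensor covers the LoI:

    def isec(loi, coverages):
        """ISEC(loi): intersection of all coverage areas that contain the LoI."""
        covering = [c for c in coverages if loi <= c]  # assumes at least one
        area = covering[0].copy()
        for c in covering[1:]:
            area &= c
        return area

    def noisec(loi, coverages, isec_area):
        """NOISEC(loi): union of coverage areas that miss the LoI but
        intersect ISEC(loi)."""
        area = set()
        for c in coverages:
            if not (c & loi) and (c & isec_area):
                area |= c
        return area

    def fp_prob(loi, coverages):
        """FPPROB(loi) = 1 - AREA(loi) / AREA(LOIAPPROX(loi))."""
        i = isec(loi, coverages)
        approx = i - noisec(loi, coverages, i)   # Equation 4.1
        return 1.0 - len(loi) / len(approx)      # Equation 4.2

    loi = {(x, y) for x in range(4, 6) for y in range(4, 6)}    # 2x2 LoI
    cov_a = {(x, y) for x in range(3, 8) for y in range(3, 8)}  # covers the LoI
    cov_b = {(x, y) for x in range(0, 4) for y in range(0, 8)}  # misses the LoI
    print(fp_prob(loi, [cov_a, cov_b]))  # -> 0.8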

An issue with the approximation and physical sensors is that, in practice, the sensor signals sometimes interfere. It is important that the system provides a synchronisation scheme that controls when the physical sensors actually sample their coverage areas, so that sensors whose coverage areas intersect are not pulled concurrently.

During the life cycle phases of CommonSens, the needs of the monitored person might

change. This means that there will be new LoIs and a possible need for new capabilities. In these situations it is important to know whether it is possible to utilise the existing setup of physical sensors. The system investigates the current sensors, the capabilities they provide, and their coverage areas. If sensors exist that provide the right capabilities and cover the new LoI, the system reports this. In addition, the system calculates the probability of false positives.

If the current setup does not cover the new LoI and the capabilities, a new

configuration of the setup is needed. This is done either by adding new physical sensors, or by changing the location of the physical sensors that are already in the environment. In the latter case, this affects the probability of false positives in the existing LoIs. In these situations the probability of false positives has to be recalculated for all LoIs that are affected by changing the placement of the physical sensors.


The prototype implementation of CommonSens provides a graphical user interface that supports manual placement of the physical sensors so that they cover the LoIs. FPPROB is called and works out the probability of false positives given the current approximation. Covering the LoIs is an optimisation problem, where the cost associated with each sensor and its placement has to be taken into consideration. However, automatically optimising the sensor placement is considered future work.

For an atomic query, all the data tuples that arrive from the sensors in ISEC and NOISEC are

relevant. Hence, an instantiated atomic query has to include all the sensors in the two sets. Each of the atomic queries in a complex query is instantiated, and the ISEC and NOISEC sets contain the sensors that approximate the LoI and that provide the correct capabilities. The query instantiation is discussed in the following section.

4.2 Query Instantiation

The main purpose of the instantiation process is to select the correct sensors based on the LoI and capability in an atomic query. In other words, the instantiation process personalises the complex queries by adapting to the environment, the sensors and the monitored person. We focus on the instantiation of atomic queries, since combining instantiated atomic queries into complex queries is straightforward.

The sensor model describes the capabilities of each sensor. Thus, a look-up in the set of

sensors for a given home results in those sensor types that provide the specified capability. If the sensors are not already in the home, they can be obtained from a repository. In case a capability is provided only by physical sensors or external sources, these sensors can be used directly. Logical sensors depend on input from other sensors, and the required capabilities of the input sensors are part of the logical sensor description. These capabilities can be used to identify the corresponding sensor types. This process needs to be iterated until all necessary sensors are identified. The result is a tree in which the logical sensor is the root and the physical sensors and external sources are leaves. Intermediate nodes in this tree are also logical sensors. The tree describes the hierarchical dependency between all sensors that are needed to provide the requested capability. Simple examples of such trees are shown in Figure 3.5.

In general, such a tree can be of arbitrary size. The procedure is defined in Figure

4.9 and uses a recursive function called FINDSENSOR. The function takes as argument a set C of capabilities. The first time the function is called, the capability is the one that is part of an atomic query. For each capability in C, the function obtains one of the sensors that provide this capability. This is done by a function called CHOOSESENSOR. We do not go into the details of how this function works, but it uses availability and cost metrics to decide which sensor to return. In situations where the system needs to be extended with more queries, CHOOSESENSOR investigates the current setup and returns sensors that are already placed in the environment and approximate the LoI. For instance, it is appropriate to reuse a sensor in the environment if it already provides the capability and is placed correctly with respect to the LoI. In other cases it investigates cost and availability metrics. If several alternative sensors exist that provide a given capability, a possible choice would be the one that costs least. However, more expensive sensors might provide even more capabilities, which may


Require: C

1: for all ci ∈ C do
2:   φ ← CHOOSESENSOR(ci)
3:   if φ = φL then
4:     sensors ← sensors ∪ FINDSENSOR(φ.Cd)
5:   else
6:     sensors ← sensors ∪ φ
7:   end if
8: end for
9: return sensors

Figure 4.9: The FINDSENSOR algorithm.

facilitate reuse. In addition, the current queries have to be taken into consideration when trying to choose sensors more optimally. The sensor returned by CHOOSESENSOR is mapped to the general sensor φ. If the sensor is a logical sensor, the search continues. This is done in Line 4, where the capabilities the logical sensor depends on are used as arguments for a new call to FINDSENSOR. The function returns the set sensors, which contains all the physical sensors and external sources for the capability.
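
The recursion of Figure 4.9 can be sketched in Python as follows. The sensor catalogue and the trivial choose_sensor policy are illustrative assumptions of ours; the real CHOOSESENSOR also weighs cost and availability:

    # Catalogue entries mirror the first FallDetected tree of Figure 3.5.
    CATALOGUE = {
        "FallDetected":  {"kind": "logical",  "depends": ["Accelerometer", "User"]},
        "Accelerometer": {"kind": "physical", "depends": []},
        "User":          {"kind": "external", "depends": []},
    }

    def choose_sensor(capability):
        # Trivial stand-in for CHOOSESENSOR: one sensor per capability.
        return capability, CATALOGUE[capability]

    def find_sensor(capabilities):
        sensors = set()
        for c in capabilities:
            name, sensor = choose_sensor(c)
            if sensor["kind"] == "logical":
                # Recurse into the capabilities the logical sensor depends on.
                sensors |= find_sensor(sensor["depends"])
            else:
                sensors.add(name)
        return sensors

    print(find_sensor(["FallDetected"]))  # -> {'Accelerometer', 'User'}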

In addition to letting the system choose physical sensors at system setup, the intention of the vocabulary tree is to reduce the number of new sensors required when the monitored person's needs change. As stated in the previous section, if the system is extended with a query that involves a new LoI and capability, there is a possibility that the current setup of sensors already provides the capabilities that are needed at the new LoI. Based on the vocabulary tree, it is possible to discover new usages of the current setup.

The instance of the hierarchy is defined globally. New sensors are added to the instance

with the capabilities they provide and possibly depend on. When the LoIs and the capabilities are chosen for a certain home, the system consults the instance of the global hierarchy to find the correct sensors. In cases where a capability is provided by several alternative sensors, issues like cost and availability can be taken into consideration as well.

As indicated for FINDSENSOR, there exist ways of optimising the search and choosing

ing the sensors that are most appropriate for the current environment. However, this is consid-ered future work.For an atomic query all the data tuples that arrive from the sensors in ISEC and NOISEC

are relevant. Hence, an instantiated atomic query has to include all the sensors in the two sets.

Definition 4.2.1 An instantiated atomic query of an atomic query qA is a tuple:

    IqA = (qA.cond, ISEC(qA.loi), NOISEC(qA.loi), qA.tb, qA.te, preg)

All the atomic queries in a complex query are instantiated. During the event processing phase the sensors in ISEC should produce data tuples that match the condition, and the sensors in NOISEC should produce data tuples that do not match the condition.


4.3 Event Processing

As shown in Figure 4.1, the last part of the a priori phase is the event processing model creation. The a priori phase is followed by the event processing phase, and in this section we discuss the concepts related to the event processing model creation and event processing.

There are many alternative ways of designing an event processing model based on stream-based event processing systems. The event processing model should consist of an element that receives data tuples from sensors and evaluates these data tuples against one or more complex queries that are evaluated concurrently. In order to do so, the event processing model should use a query evaluator component. Since CommonSens should support many concurrently evaluated queries, we need a query pool, i.e., a data structure that keeps track of all the running queries. In our work we do not discuss optimisation of concurrent queries, like operator scheduling [CcR+03] or eddies [AH00]. Not all the sensors are interesting all the time. If a sensor is not part of the ISEC or NOISEC sets of an instantiated query, it is assumed that the sensor does not provide any important data tuples. CommonSens needs a component that only pulls data tuples from the sensors that are in the ISEC and NOISEC sets of the instantiated queries. This component is called the data tuple selector.

The query evaluator, query pool and data tuple selector components are discussed in the following sections.

4.3.1 Query Evaluator

The query evaluator is the fundamental component in all query processing systems, for instance DBMSs [GMUW08], DSMSs [CCD+03], and CEP systems [Luc01]. In general, a query evaluator receives a set of data tuples and a query, and returns whether the condition in the query was matched or not. Our query evaluator adopts this basic functionality. However, there are two differences between the query processing in CommonSens and the other query processing systems.

The first difference is that, in addition to containing one condition, an atomic query has temporal properties that have to be evaluated. The data tuple has to contain timestamps that explicitly tell when the state value was read, and these timestamps have to be within the duration specified in the atomic query. In addition, the atomic queries can be timed, δ-timed and P-registered. This means that the query evaluator has to keep the progress of the evaluation of the atomic queries in memory. Since the complex queries can also be timed, δ-timed, and P-registered, the query evaluator needs to keep this information in memory as well. In other words, the query evaluator has to support stateful evaluation of the queries, e.g. counters that track the P-registration. For instance, a query that corresponds to the P-registration example in Section 3.4 is stated like this: [(Temperature > 19C, kitchen, 08:00, 23:00, min 80%)]. Between 08:00hr and 23:00hr, the temperature in the LoI kitchen should be higher than 19°C, and this should apply to at least 80% of the data tuples. Only when the value of the counter reaches 80% and the current time has not passed 23:00hr does the query evaluator confirm that the atomic query is processed according to the conditions.
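As an illustration of this stateful evaluation, the following sketch shows a counter for a min P-registration; the class and method names are hypothetical, not taken from the CommonSens source.

    // Sketch of stateful min P-registration tracking: count matching data
    // tuples and confirm only while the end time has not passed and the
    // matched ratio has reached the required minimum (80% in the example).
    final class PRegistrationCounter {
        private final double minRatio;      // e.g. 0.8 for "min 80%"
        private final long endMillis;       // e.g. 23:00hr as absolute time
        private long matched;
        private long total;

        PRegistrationCounter(double minRatio, long endMillis) {
            this.minRatio = minRatio;
            this.endMillis = endMillis;
        }

        // Called for every data tuple that falls inside the query duration.
        void register(boolean conditionMatched) {
            total++;
            if (conditionMatched) {
                matched++;
            }
        }

        // True once the counter satisfies the P-registration before the deadline.
        boolean confirmed(long nowMillis) {
            return nowMillis <= endMillis
                    && total > 0
                    && (double) matched / total >= minRatio;
        }
    }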

The second difference is related to how the query processor should respond to the result. Since CommonSens should respond with, e.g., alarms if the monitored person does not wake up in the morning or if the monitored person falls and does not get up, the query evaluator should start a user-defined action that depends on the query processing results. For instance, if there is a deviation and the application programmer has explicitly asked CommonSens to report deviations, the query evaluator sends an alarm to the helping personnel or a notification to the monitored person. This functionality corresponds to the event-condition-action (ECA) concept in other event-driven architectures and active database systems [QZWL07]. The definition of the notifications is left for future work.
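As a sketch of how such a hook could look in Java, assuming a hypothetical callback interface (the thesis leaves the notification design open):

    // Hypothetical ECA-style hook: the query evaluator would invoke the
    // action when it detects a deviation from a complex query.
    interface DeviationAction {
        void onDeviation(String queryName, long timestampMillis);
    }

    // Example action: in a real deployment this could page the helping
    // personnel or notify the monitored person; here it only prints.
    final class LogAlarmAction implements DeviationAction {
        @Override
        public void onDeviation(String queryName, long timestampMillis) {
            System.out.println("Deviation in " + queryName + " at " + timestampMillis);
        }
    }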

4.3.2 Query Pool

CommonSens should support processing of several queries in parallel. This is a convenient feature, since there are many different events that can be important to know about in the home. For instance, some queries can constantly check the temperature in the home, while other queries investigate the well-being of the monitored person. In addition, there can be several monitored persons in the same home, even though we do not focus on this. In order to support this, the event processing model needs a pool of the current queries. It should also be possible to add queries to and remove queries from the query pool. Note that adding new queries has to be performed during the system shut down phase, since this operation might require new sensors, sensor placement, and query instantiation.

The query pool consists of the complex queries that are processed in the current environment. The instantiated complex queries are directly mapped to boxes and arrows. We have chosen to represent the instantiated queries as boxes and arrows since this is a simple and intuitive paradigm. Currently, the atomic queries that are connected with logical operators are placed in one box. For instance, the following atomic queries would be placed together in one box: [(Temperature > 19C, kitchen, 08:00, 23:00) && (Light == ON, kitchen, 08:00, 23:00) || (DetectPerson == Adam, livingroom, 08:00, 23:00, 10%)]. A user-defined action is started if the temperature in the kitchen is above 19°C and the light is on in the kitchen, or if the monitored person is in the living room. In the query pool, the arrows in the boxes-and-arrows paradigm denote the followed-by relation. When the conditions and temporal requirements in a box are matched, the consecutive box is evaluated. If the complex query uses one of the concurrency classes, there will be two chains of consecutive boxes. These two chains are evaluated concurrently by the query evaluator.

4.3.3 Data Tuple Selector

There may be many sensors in the environment, and we assume that not all of them can measure relevant states all the time. Only data tuples from the sensors in the ISEC and NOISEC sets of the current atomic queries are relevant. Therefore, CommonSens needs a component that pulls only the relevant sensors. This is the third component in the event processing model and is called the data tuple selector.

The data tuple selector obtains the current atomic queries from the query pool. First, it investigates the ISEC and NOISEC sets of each of the running queries. Second, it pulls the sensors and receives the data tuples. Third, it sends the data tuples to the query evaluator. Sometimes there are atomic queries that require data tuples from the same sensors.


Figure 4.10: Overview of the query processor in CommonSens. (The figure shows the environment with its sensors, the data tuple selector, the query evaluator with the instantiated atomic queries IAQ1 to IAQ6 of the instantiated complex query ICQ1, the application programmer, and the helping personnel.)

For instance, one complex query investigates the temperature value in the kitchen, while another query waits for the temperature in the kitchen to reach 15°C. The data tuple selector then pulls the temperature sensor only once and sends the data tuple to the query evaluator.
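A minimal sketch of this de-duplicating pull loop, under assumed types and method names, could look as follows:

    // Sketch of one round of the data tuple selector: build the union of all
    // ISEC and NOISEC sensors of the running queries, pull each sensor
    // exactly once, and forward every data tuple to the query evaluator.
    interface DataTuple {}

    interface PullableSensor {
        DataTuple pull();
    }

    interface RunningAtomicQuery {
        java.util.Set<PullableSensor> isec();
        java.util.Set<PullableSensor> noIsec();
    }

    final class DataTupleSelectorSketch {
        void pullRound(java.util.Collection<RunningAtomicQuery> queries,
                       java.util.function.Consumer<DataTuple> queryEvaluator) {
            java.util.Set<PullableSensor> toPull = new java.util.HashSet<>();
            for (RunningAtomicQuery q : queries) {
                toPull.addAll(q.isec());       // set semantics remove duplicates,
                toPull.addAll(q.noIsec());     // so a shared sensor appears once
            }
            for (PullableSensor s : toPull) {
                queryEvaluator.accept(s.pull());   // one pull, shared by all queries
            }
        }
    }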

A simple example of the event processing model is shown in Figure 4.10. To show an example of the components of the event processing model, the figure shows a snapshot of the evaluation of an instantiated complex query IqC1, which consists of the instantiated atomic queries IqA1 to IqA6. We only address the query names: IqC1 = Dev(IqA1 → during(IqA2 → IqA4, IqA3 → IqA5) → IqA6). The application programmer wants CommonSens to report only deviations from the complex query. Note that the illustration of the query is simplified; a more detailed illustration is shown in Figure 5.4 in Section 5.1.3.

First, IqA1 is evaluated. Then, two subsets of the complex query are evaluated concurrently: IqA2 is followed by IqA4, and during the processing of these two queries, the atomic query IqA3 should be followed by IqA5. After the concurrent queries are processed successfully, IqA6 is processed. The snapshot shows the processing of IqA2 and IqA3 after IqA1 has been successfully evaluated.

4.4 Discussion and Conclusion

This chapter explains how CommonSens uses the data model to create an event processing model that consists of instantiated complex queries that can be evaluated by a query processor. CommonSens uses signal propagation models to help the application programmer place the sensors properly. This is done because the objects in the environment reduce the coverage area of physical sensors. The amount of reduction depends on the permeability values of the objects.

An issue with using signal propagation models is that even small changes in the environment affect the signal considerably. This issue especially applies to radio signals. Many state transitions, e.g. turning on the water, change the coverage area of sensors that use radio signals.


In addition, when the location of the objects in the environment changes, this affects the coverage area. For instance, if the monitored person moves furniture or opens a door, the approximation of the LoIs might change. It is important that the application programmer is aware of these possible pitfalls: signal propagation models and approximation of LoIs do not guarantee correct readings. However, they can help the application programmer to be aware of possible sources of error and to avoid unforeseen state values. This especially applies to signals that go through walls.

When the complex queries are correctly instantiated, they are mapped to a query pool in the event processor. In this chapter we have only presented the concepts of the event processor through the event processing model. In the following chapter, we explain how the event processing and the other phases of the system life cycle are implemented.


Chapter 5

Implementation

This chapter describes the implementation of CommonSens. Through the implementation we can realise the system life cycle phases and evaluate our three claims. We have chosen to implement CommonSens in Java, since Java facilitates fast development and is a common programming language that is well understood by many developers. This simplifies further shared development of the code base. In addition, the data model concepts and the query language map intuitively to the concepts of an object-oriented programming language. For instance, the three sensor types in the sensor model are implemented as subclasses of an abstract class called Sensor. The implementation is a proof-of-concept prototype, which means that there are still many features that are not implemented. Some of these features are addressed during the presentation of the implementation.

We have chosen to separate the implementation into six Java packages with dedicated responsibilities. The implementation is inspired by the three-tiered model-view-controller (MVC) pattern for separation of concerns [Ree79]. However, there are some overlaps between the view and the controller classes, and we have not used any MVC frameworks during the implementation. Following is a list of the packages:

• environment. Holds the classes that implement the properties from the environment model.

• sensing. Holds the classes that relate to the sensor model.

• language. This package contains the classes that support the event processing language.

• modelViewController. This package contains the classes that are responsible for the user interface, storing the data, and running the system.

• eventProcessor. The classes are responsible for the event processing phase.

The structure of this chapter follows the three system life cycle phases. Section 5.1 focuses on how the models are implemented as classes and how these classes are structured. The functionality and the remaining packages, e.g., how the event processing is implemented, are presented in Section 5.2.


Figure 5.1: Key classes in the environment package.

5.1 Overview

This section discusses and gives an overview of the implementation of the environment model, the sensor model and the query language. We exclude the event model from the discussion, since states in the environment can only be observed by the sensors that are placed there. The states become sensor readings, which are turned into data tuples and become events if and only if they match a condition in an atomic query.

We have chosen to use UML to show the class hierarchy and the associations between the classes that make up the implementation. To keep the diagrams simple, we have not included all the attributes and methods. If two or more associations point to the same class type, we use one association and give it several names, separated by commas.

5.1.1 Environment Model

The environment package includes all the classes that are required in order to instantiate an environment. The classes in the package are shown in Figure 5.1. In order to correspond to the models, the key class in the package is Environment. The environment consists of objects and contains LoIs. In order to implement this, the Environment class has associations to objects of both the CommonSensObject and the LocationOfInterest classes. The environment also contains sensors, and we have implemented this as an association sensors that points to the sensors. The sensors are part of the sensing package and are discussed in Section 5.1.2.

The objects, as defined in Definition 3.2.3, are implemented in the class CommonSensObject. We have chosen this name in order to avoid confusion with objects in object-oriented programming. A CommonSensObject object has associations that point to one or more objects of the classes Shape and Permeability. Permeability contains the permeability value and an association to an object of the SignalType class. As with the sensors, SignalType is also part of the sensing package. LocationOfInterest is a simple class that only has one association to the Shape class.


Figure 5.2: Key classes in the sensing package.

LoIs, the environment and the physical sensors all have shapes. Hence, these classes have associations to the Shape class. In addition, although not explicitly mentioned in Chapter 3, the coverage area of a physical sensor can be interpreted as a shape as well. According to Definition 3.2.1, a shape is only a set of coordinates. However, when the shape is a coverage area, it might be reduced by the objects in the environment. Therefore, we have chosen to put all the logic that is related to shapes and calculation of shapes in the Shape class. This includes the methods that calculate the reduced coverage area of physical sensors, which is reflected through the two associations boundary and boundaryReduced that point to sets of Triple objects. The Triple class contains the x, y and z coordinates that define a spatial coordinate. In addition, the implementation uses the GPC (General Polygon Clipper) library [gpc] and represents the shapes as polygons. This additional library supports set operations on polygons and simplifies the operations related to approximation of LoIs. The approximation of LoIs is further discussed in Section 5.2.2. In addition, the objects made from the classes in the GPC library can easily be mapped to java.awt.Polygon, which simplifies the process of representing the environment graphically through the GUI. In order to keep the UML class diagram in Figure 5.1 as simple as possible, the figure does not include any polygon references.

According to Definition 4.1.1, a ray consists of a set of coordinate-strength tuples. These tuples are implemented through the class CoordinateStrength. However, in our implementation, Ray objects are only used when calculating the real coverage ranges of the sensor coverage area. When the real coverage ranges are calculated, they are mapped to the boundaryReduced set.

5.1.2 Sensor Model

The sensing package contains the classes that make up the implementation of the sensor model. The key classes and their relations are shown in Figure 5.2.


The most important class in the sensing package is the Sensor class, which is a superclass for PhysicalSensor, LogicalSensor, and ExternalSource. During the event processing, CommonSens always relates to objects of the three sensor types. The interaction with real sensors is always implemented behind the interface that one of the three classes provides.

The Sensor class will never be instantiated; hence, the class is defined as abstract. Since all three sensor types provide a set of capabilities, the Sensor class points to one or more objects of the class Capability through the association providedCapabilities. Even though capabilities play an important role in the instantiation of CommonSens queries, we have not unambiguously defined their properties, and this is reflected in the implementation: the Capability class is associated with the CapabilityDescription interface. The current implementation only uses a simple textual description combined with a value range. For example, CapabilityDescription can be instantiated by the class SensorDetected, which returns true or false. SensorDetected can, for instance, be provided by a motion detector that reports true if there is motion in the coverage area and false otherwise.

As noted in the previous section, the PhysicalSensor class uses the Shape class in order to model the cov property if it is defined as a coverage area. In case the cov property of the physical sensor is an object, the PhysicalSensor class is associated with a CommonSensObject. An alternative solution would have been to provide a class Cov with associations to the two classes. The current design clearly tells a developer that the coverage is either a Shape or a CommonSensObject. Currently, much of the logic related to cov is implemented in the PhysicalSensor class and the Shape class. The PhysicalSensor class also contains an association to the SignalType class, which is currently implemented as a String that describes the signal type. The samplingFrequency attribute is implemented as a double value in the PhysicalSensor class.

According to Definition 3.3.4, the external source is only a set of capabilities. This is reflected through the ExternalSource class, which inherits the Capability association from the Sensor class.

The logical sensor depends on a set of capabilities. This dependency is reflected in the UML class diagram: the class LogicalSensor has an association dependedCapabilities with one or more instances of the Capability class. According to Definition 3.3.5, the logical sensor uses a custom function in order to perform the aggregations. This is implemented through an association customFunction that points to the CustomFunction interface. CustomFunction is an interface for all the custom functions that the logical sensors can use. The custom functions should be simple to use and simple to change, and the most appropriate way of implementing them is as plug-ins. In Java, plug-in functionality can be provided through dynamic class loading. The only requirement is to have a compiled version of the custom function implementation. This .class file is referred to as part of the configuration of the logical sensor. When the logical sensor is instantiated, the .class file is dynamically loaded and is ready to be used. The dynamic loading is done during the a priori phase.
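The following sketch shows how such plug-in loading can be done with standard Java dynamic class loading; the directory, class name and the aggregate() signature are assumptions for illustration, not the CommonSens configuration format:

    import java.io.File;
    import java.net.URL;
    import java.net.URLClassLoader;

    // Assumed plug-in interface; the aggregation entry point is illustrative.
    interface CustomFunction {
        Object aggregate(Object[] inputs);
    }

    final class CustomFunctionLoader {
        // Load a compiled custom function from the directory named in the
        // logical sensor's configuration, e.g.
        // load(new File("plugins"), "AverageTemperature").
        static CustomFunction load(File classDir, String className) throws Exception {
            URL[] urls = { classDir.toURI().toURL() };
            // The loader is kept open so the plug-in can load helper classes later.
            URLClassLoader loader = new URLClassLoader(urls);
            Class<?> clazz = loader.loadClass(className);
            // The plug-in is assumed to have a public no-argument constructor.
            return (CustomFunction) clazz.getDeclaredConstructor().newInstance();
        }
    }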

In retrospect, we see that dynamic class loading could have been used more widely. For instance, capabilities and capability descriptions might change over time, and with dynamic class loading, changing them would have been much more flexible. In addition, the implementing class could have been referred to in the sensor configuration.


Figure 5.3: Key classes in the language package.

Currently, the classes that implement the Capability interface have to be explicitly instantiated in the code. This solution is inconvenient and makes CommonSens more static than necessary. An open problem is to redesign the code so that dynamic class loading is used more extensively.

5.1.3 Query Language

The language package contains the classes that are processed by the event processor. The structure is very simple: every class in the language is a subclass of the abstract QueryElement class. The classes and their associations are shown in Figure 5.3. Given this structure, a complex query is a consecutive list of QueryElement subclass objects. This potential list-based structure is shown through the self-association nextQueryElement of the QueryElement class. Due to polymorphism, the event processor only has to point to QueryElement objects; only during event processing does it identify the subclass type and process the element accordingly. A possible solution would have been to use method overloading so that the event processor does not have to investigate the subclass at all. However, the subclasses are very different, and we need to use the instanceof comparison operator to identify the correct subclass and use the specific methods in that class. The consecutive QueryElement objects form a list, and this list corresponds to the nonterminal Queries in the syntax of the query language. In this section we first discuss the AtomicQuery class. Second, we discuss the TimedListOfAtomicQueries and ConcurrencyOperator classes. Third, we present the operator classes and the relation class and show an example of an instance of a complex query.

The AtomicQuery class implements the atomic query as defined in Definition 3.4.1 and in the syntax. It has an association loi to a LocationOfInterest object and an association capability that points to the Capability class. Capability is used as part of the condition together with the Object class, which is pointed to via the association value. We use the general Object class, since the value in a condition triple depends on the capability.


The capability is very general, and its value can range from single values like integers to XML specifications like Haar classifiers for face recognition [WF06]. Therefore, the Java Object class is general enough to model all the possible formats a capability value can have. The operator in the condition is a String that can have one of the following values: ==, !=, <, >, <= or >=. We have not included the range of possible values in the diagram. With respect to the condition triple, an alternative design would have been a class called Condition that includes all the logic related to the condition; currently, this logic is located in the AtomicQuery class. The variables that are used in the P-registration are also located in the AtomicQuery class. The P-registration is explicitly defined by the boolean variable isMin and the double variable percentage. If isMin is true, the P-registration of the current atomic query is min; otherwise it is max.

An instantiated atomic query contains references to the sensors that approximate the LoI through the ISEC and NOISEC sets. The AtomicQuery class implements the approximation through the associations isec and noIsec, which point to two sets of sensors. In addition, the variable fPProb corresponds to the probability of false positives, as defined by FPPROB. Finally, the associations begin and end point to two Timestamp objects. The Timestamp class belongs to the eventProcessor package and holds a long value that denotes timing in CommonSens. The eventProcessor package is described in Section 5.2.3.
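Collecting the attributes named above gives roughly the following skeleton; this is a simplified sketch, not the actual source:

    // Simplified skeleton of AtomicQuery, collecting the associations and
    // variables described above; the real class also contains the condition
    // evaluation logic.
    abstract class QueryElement {
        QueryElement nextQueryElement;      // self-association in Figure 5.3
    }

    class AtomicQuery extends QueryElement {
        LocationOfInterest loi;             // association loi
        Capability capability;              // condition: capability ...
        String operator;                    // ... one of ==, !=, <, >, <=, >= ...
        Object value;                       // ... compared against a general value
        boolean isMin;                      // P-registration: min if true, else max
        double percentage;                  // P-registration percentage
        double fPProb;                      // probability of false positives
        java.util.Set<Sensor> isec;         // ISEC set of the instantiated query
        java.util.Set<Sensor> noIsec;       // NOISEC set of the instantiated query
        Timestamp begin, end;               // temporal scope
    }

    // Minimal stand-ins so the sketch is self-contained.
    class LocationOfInterest {}
    class Capability {}
    class Sensor {}
    class Timestamp { long value; }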

The class TimedListOfAtomicQueries is both a QueryElement subclass and contains an association theList that points to a list of QueryElement objects. It corresponds to the nonterminal Chain in the syntax of the query language. The purpose of the TimedListOfAtomicQueries class is to manage a list of queries during the event processing phase and to manage the temporal properties of the list, i.e., timing, δ-timing and P-registration. As with the AtomicQuery class, the class has two associations begin and end that point to Timestamp objects.

ConcurrencyOperator is a class that handles the concurrency and corresponds to the nonterminal ConcOp in the syntax. ConcOp takes two lists as arguments, and this is reflected in the UML class diagram through the two associations firstChain and secondChain that point to TimedListOfAtomicQueries objects. In addition, the ConcurrencyOperator class inherits the association to the next QueryElement object.

The three operator classes OrOperator, AndOperator and NotOperator do not contain any logic, simply because they are only used to identify the logical relation between two atomic queries.

In order to show how the classes in the language package relate when they are parsed, we use the example query IqC1 from Section 4.3.3 and Figure 4.10. Note that the figure does not show the instantiation process where the ISEC and NOISEC sets are used; it only shows the parsed complex query. The parsed version of IqC1 is shown in Figure 5.4. The complex query is managed by an object of TimedListOfAtomicQueries, which points to the AtomicQuery object IqA1. IqA1 points to the ConsecutiveRelation object r1, which points to the ConcurrencyOperator object during. As explained above, the ConcurrencyOperator class refers to two TimedListOfAtomicQueries objects, each of which refers to the atomic queries that describe concurrent events. Finally, during points to the ConsecutiveRelation object r4, which points to the atomic query IqA6.

An alternative solution would have been to link the objects so that their configuration is similar to the one in Figure 4.10. However, we need an object that monitors the processing of the concurrent events and verifies that the concurrency is correct. This is most convenient when the two concurrent TimedListOfAtomicQueries objects are handled by the ConsecutiveRelation object. Hence, the event processing continues if the concurrent complex events occur as specified.


Figure 5.4: The parsed version of IqC1.

The three packages environment, sensing and language contain the classes that implement the data model in CommonSens. In the following section, we show how these classes are used during the a priori phase.

5.2 Functionality

This section discusses how the sensor placement, query instantiation and query processing model creation are implemented, i.e., it discusses the implementation of the stages in the system life cycle phases (see Figure 4.1). Finally, the event processing phase is discussed. The modelViewController package combines many of the features that CommonSens provides; therefore, it is important to present the classes that this package consists of. Note that this section does not discuss all the algorithms and functionality in CommonSens, only those that are important for obtaining sufficient understanding of how the system works.

5.2.1 System Control

The modelViewController package is responsible for the user interface, storing/retrieving data from configuration files, and running CommonSens. In this section we discuss the most important classes: CommonSens, MainView, Core, QueryParser, SensorCreator, EnvironmentCreator, LoICreator, Environment, EnvironmentPanel, and MovementCreator. Figure 5.5 gives an overview of these classes in the modelViewController package.

The main class is called CommonSens. Its most important task is to read the command line arguments and configuration files, as well as to start up the core of the system, i.e., the GUI (view) and the controller.


Figure 5.5: Key classes in the modelViewController package.

Figure 5.6: Main window in CommonSens.


The CommonSens class can take a set of command line arguments, which can be used to instruct the system to e.g. perform regression tests or run experiments. The command line arguments are associated with experiments and regression tests, and are further discussed in Chapter 6.

The CommonSens class has an association mainPanel that points to one instance of the class MainView. MainView is the GUI, and it is required that the GUI is simple for the application programmer to understand and use. This implies that the application programmer should be able to open files that describe environments and sensors, to create and edit environments and sensors, and to open and save queries. The application programmer should be able to place sensors in the instantiated environment and to obtain information about the probability of false positives from the approximation of LoIs. Finally, the GUI should report the result of the query instantiation and let the application programmer start the event processing phase. When necessary, the GUI should let the application programmer stop the event processing and transfer CommonSens to the system shut down phase.

The current MainView layout is shown in Figure 5.6. In the following we only present the buttons that are related to the application programmer's tasks. The current layout was designed during the development of CommonSens; hence, it contains several buttons related to experimentation. These buttons are discussed in Chapter 6. Designing a user interface that is easier for application programmers to use requires an extensive study, including a representative selection of application programmers who give feedback about the layout through an iterative process with the GUI designers. Conducting such a study is too time consuming for a prototype, so we consider this iterative process future work.

CommonSens has modes for simulation (‘Start Simulation’ button), step-by-step simulation (‘Step Simulation’), and real-world monitoring (‘Start Monitoring’). The step-by-step and full simulation modes are mostly used by the application programmer when testing a new environment. The real-world monitoring mode is used when the environment is fully tested and the monitoring should commence. Although they have different purposes, the modes are implemented quite similarly.

MainView is associated with the Core class. When the application programmer has given an instruction, Core is responsible for controlling the system so that it runs the correct operations, i.e., that it provides the application programmer with sensor placement, instantiates the queries and performs event processing.

Currently, Core only manages one environment. This means that one running instance of CommonSens refers to one single home, which is currently sufficient. A possible extension is to let the application programmer handle several instances at the same time, for instance for housing cooperatives with many apartments and monitored persons. The environment is specified in a configuration file and can be opened by pushing the ‘Open Environment’ button. The ‘New Environment’ button initiates the creation of a new environment. The environment is accessed through the class EnvironmentCreator, which uses a GUI that shows the current environment. The GUI is instantiated through the EnvironmentPanel class (see Figure 5.7). Currently, the application programmer is restricted to instantiating the environment in two dimensions. On the other hand, as explained in Section 5.1.1, the class Triple contains the values x, y and z to define any spatial coordinate.


Figure 5.7: Environment creator in CommonSens.

A future extension of CommonSens could use three dimensions to instantiate the environment, which would give an even more realistic model of the environment. When the environment is fully created, the application programmer can save it by pressing the ‘Save Environment’ button.

To show what the environment instance looks like for the application programmer, we display the environment that is used in [SGP10b]. The GUI that shows the environment allows the application programmer to create simple environments by adding objects (‘Add Object’ button), sensors (‘Add Sensor’ button) and LoIs (‘Add LoI’ button). In addition, the application programmer is allowed to resize and rotate any objects in the environment. The button ‘Add Person’ allows the application programmer to add a monitored person to the environment so that certain movement patterns can be investigated. ‘Toggle Coverage’ switches between the coverage area as specified by the producer of the physical sensors and the reduced coverage areas. The figure shows the reduced coverage areas.

The application programmer can open queries with the ‘Open Query’ button. Figure 5.6 shows the complex query [(DetectPerson == Person1, LoI2, 10steps, min 50%) -> (DetectPerson == Person1, LoI1, 10steps, min 50%)]. CommonSens does not yet provide a lexical analyser [Bor79] like lex, so the queries have to be written in a very simple form to be accepted by the query parser. The QueryParser class is discussed in Section 5.2.3.

Currently, the application programmer can create new objects and new physical sensors. New objects can be created by pushing the ‘New Object’ button, which allows the application programmer to set the shape and add permeability tuples to the object. In order to create physical sensors, the application programmer can push the ‘New Physical Sensor’ button and set the properties of the physical sensor, e.g. define the coverage. The coverage area is simply defined by setting the angle; for instance, a circular coverage area has an angle of 360°, while a camera might have a coverage area of 45°.


Since the GUI only provides support for creating physical sensors, both logical sensors and external sources have to be defined manually through configuration files. Currently, more complex coverage areas also have to be defined manually in configuration files.

Finally, in order to investigate how the instantiation works with a monitored person, the application programmer is allowed to create movement patterns, i.e., patterns that show where the monitored person moves inside the environment. The creation of movement patterns is handled by the MovementCreator class. Movement patterns are easily created by letting the application programmer decide at which coordinates in the environment the monitored person should be. When all the coordinates are defined, CommonSens automatically creates a movement pattern. The movement pattern can also be defined manually in a configuration file.

CommonSens provides configuration files for the environments, sensors, queries, and movement patterns. The configuration files act as repositories that allow reuse of, e.g., already defined environment objects or sensors. They are written in a human readable format, which allows the application programmer to manually adjust and edit them. Currently the formats are very simple: a tag tells what type of value the configuration file parser should expect. The current formats can easily be transformed into XML and parsed by an XML parser. It is also possible to store the configuration files in a database if needed.

5.2.2 Physical Sensor Creation and Placement

CommonSens allows the application programmer to place sensors in the instantiated environment. This process helps the application programmer to investigate how the coverage areas of the physical sensors are affected by the objects and how the LoIs are approximated. In addition, when the sensors are chosen, the ISEC and NOISEC sets are defined as well, which means that the placement of sensors is part of the query instantiation. This section explains how this functionality is implemented.

First, the sensors have to be created or retrieved from a sensor repository. Physical sensors can be created using the GUI or manually by setting their properties in a configuration file. Logical sensors and external sources can only be created manually. Second, the application programmer has to choose the number of sensors that should approximate the LoIs. If there already exist sensors in the environment that approximate the LoI, these can be included as well.

CommonSens calculates the FPPROB values in two calculation steps. The first step calculates the FPPROB value during the placement of sensors, and the second step calculates the FPPROB value when the queries are instantiated. The first step is simpler than the second, since it does not map capabilities to the coverage areas. The second step is related to the atomic queries and maps the capabilities of the sensors to the LoI. We explain the reason for this two-step process in the following.

As noted above, the first step investigates the approximation of the LoI and ignores whether the physical sensors provide the correct capabilities. This step corresponds directly to the definition of LOIAPPROX. The capabilities are ignored since there is no link between a LoI and the capabilities that the application programmer wants to detect. The link is only defined in an atomic query, where both the cond triple and the LoI are used. Hence, during placement, the link between the LoI and the capability has to be known implicitly by the application programmer.


For instance, an environment can have a LoI called medicineCloset, which covers the medicine closet in a home. The application programmer moves the sensors so that their coverage areas approximate medicineCloset as described in LOIAPPROX. When the application programmer moves the mouse pointer over a LoI in the environment, CommonSens automatically generates the FPPROB value with the first calculation step. The calculation is performed by a method calculateError() in the Environment class, which returns a double that indicates the FPPROB value. The calculateError() method is discussed later in this section. Note that the methods sometimes take arguments; to keep this presentation simple, we do not show the arguments.

The second calculation step maps the LoI to the capability in the atomic query. This calculation step is more accurate and is performed automatically during the query instantiation. Even though it does not correspond directly to the definition of LOIAPPROX, it has more practical value. Currently, the second calculation step cannot be used by the application programmer to manually approximate the LoIs. For instance, in the field ‘Current condition/state:’ in the GUI in Figure 5.6, the FPPROB value is shown next to the ISEC and NOISEC sets. This value is obtained from the second calculation step, which is also performed by the method calculateError(). The method finds the relevant sensors and populates the ISEC and NOISEC sets. In addition, it sets the fPProb value in the AtomicQuery object.

Even though the two calculation steps take two different sets of arguments, they are very similar. First, the methods traverse all the physical sensors in the environment and choose those that cover the LoI; the second calculation step additionally requires that a chosen sensor provides the capability. All these physical sensors are placed in an ISEC set. The method uses the polygon operations of the GPC library and calculates the intersection between the coverage areas. Second, the method chooses the physical sensors that belong in the NOISEC set. This choice is made by creating two temporary intersections: the first between the coverage area of the physical sensor and the coverage areas in the ISEC set, and the second between the coverage area of the physical sensor and the LoI. If the area of the first intersection is larger than 0 and the area of the second intersection is 0, the physical sensor is placed in the NOISEC set. When the ISEC and NOISEC sets are defined, the method runs an XOR operation on the intersections of the coverage areas in the ISEC set and the intersections of the coverage areas in the NOISEC set. The result of this operation is the approximation. Finally, the FPPROB value is calculated by subtracting the quotient of the area of the LoI and the area of the final intersection from 1. If the LoI is not defined in the atomic query, the method simply chooses all the sensors in the environment that provide the capability and adds them to the ISEC set. This requires that all the sensors send data tuples that match the condition. We have not yet defined the behaviour of CommonSens when the LoI is not specified, and a consistent definition remains an open issue. The method calculateError() is shown in Appendix A.1.
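The geometric core of this calculation can be sketched as follows. The sketch uses java.awt.geom.Area as a stand-in for the GPC polygons used by the actual implementation (Appendix A.1), and the helper areaOf() is assumed rather than shown:

    import java.awt.geom.Area;
    import java.util.List;

    final class FpProbSketch {
        // isecCoverages/noIsecCoverages are the (reduced) coverage areas of
        // the sensors in the ISEC and NOISEC sets; loi is the LoI's shape.
        static double fpProb(List<Area> isecCoverages, List<Area> noIsecCoverages, Area loi) {
            // Intersection of the ISEC coverage areas.
            Area isec = new Area(isecCoverages.get(0));
            for (Area a : isecCoverages.subList(1, isecCoverages.size())) {
                isec.intersect(a);
            }
            // XOR with the NOISEC coverages yields the approximation of the LoI.
            Area approximation = new Area(isec);
            for (Area a : noIsecCoverages) {
                approximation.exclusiveOr(a);
            }
            // FPPROB = 1 - area(LoI) / area(approximation).
            return 1.0 - areaOf(loi) / areaOf(approximation);
        }

        static double areaOf(Area shape) {
            // Assumed helper: e.g. flatten the outline and sum polygon areas.
            throw new UnsupportedOperationException("helper not shown");
        }
    }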

The issue with the current two-step calculation process is that the application programmer can choose sensors that do not provide the correct capabilities. An alternative and more precise approach for calculating the FPPROB values is to let the application programmer choose which atomic query he is interested in matching the approximation for. This approach lets the application programmer approximate the LoI for each atomic query, which excludes the possibility of including sensors in the ISEC and NOISEC sets that do not provide the correct capabilities.


Figure 5.8: Classes involved in the calculation of the reduced coverage area.

During the sensor placement it is important for the application programmer to see how the permeability values of the objects affect the coverage areas. In the following we show how the calculation is implemented. As mentioned in Section 5.1.1, the Shape class contains the associations boundary and boundaryReduced, which point to two sets of Triple objects. In order to define boundaryReduced, CommonSens uses a signal propagation model, e.g. Signal Model 1. To move a physical sensor, the application programmer uses the GUI to point at the coverage area of the physical sensor. CommonSens notices if the physical sensor has been moved by the application programmer; if it has, the calculation of the new reduced coverage area starts.

The classes that are involved in the calculation of the reduced coverage area are shown in the UML sequence diagram in Figure 5.8. The EnvironmentPanel object calls the method getPolygonReduced() in the PhysicalSensor object, which calls a similar method in its Shape object. The Shape object, which already has a set of Triple objects in boundary, uses these Triple objects to define the reduced range for each ray in the boundaryReduced set. Note that in Definition 4.1.1, there are two types of rays that form the coverage area: the first type models the distance from the sensor to the edge, and the second type builds the boundary between the edges (see Definition 4.1.2). Since we use the Polygon class from the GPC library, we are only required to model the coverage area in the first way; the Polygon class automatically generates a polygon from the boundary points. The Ray class simply performs the calculations using the REDUCE algorithm on each ray (see Figure 4.3).

In fact, our implementation of the REDUCE algorithm iterates through every coordinate along the ray. For each point, this linear algorithm investigates the current object and permeability value. This is simpler than identifying intervals, but more time consuming. If we wanted to match the REDUCE algorithm exactly, the implemented method would have needed to identify intervals instead of iterating.

When the signal strength value reaches a predefined value or the permeability value is 0, the algorithm returns a new Triple object, which is used to define the new reduced coverage range.


Figure 5.9: Before and after the reduceRay() method has been called. (In a), the original Triple object defines the coverage range of the ray; in b), reduceRay() has returned a new Triple object that defines the reduced coverage range, while the starting point is not changed.)

This is illustrated in Figure 5.9. In a), the original Triple object defines the boundary of the ray; the method reduceRay() returns the new Triple object, as shown in b). The implementation of the method is shown in Appendix A.2.
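Our reading of this iteration can be summarised in the following sketch; the types, helper methods and the attenuation rule are assumptions, and the actual method is listed in Appendix A.2:

    // Assumed minimal types for the sketch.
    interface Point3 {
        Point3 stepTowards(Point3 target);      // next coordinate along the ray
    }

    interface SignalEnvironment {
        double permeabilityAt(Point3 p);        // 1.0 in free space, 0.0 if blocked
    }

    final class RaySketch {
        // Walk from the sensor towards the original boundary point, attenuate
        // the signal by the permeability of whatever covers each coordinate,
        // and stop where the strength falls below a predefined cut-off or the
        // permeability is 0. The returned point takes the role of the new Triple.
        static Point3 reduceRay(Point3 origin, Point3 boundary,
                                SignalEnvironment env, double minStrength) {
            double strength = 1.0;                       // normalised start strength
            Point3 current = origin;
            while (!current.equals(boundary)) {
                double perm = env.permeabilityAt(current);
                if (perm == 0.0) {
                    return current;                      // fully blocked here
                }
                strength *= perm;                        // simple attenuation model
                if (strength <= minStrength) {
                    return current;                      // cut-off reached
                }
                current = current.stepTowards(boundary);
            }
            return boundary;                             // full original range kept
        }
    }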

5.2.3 Event Processing Model Creation

This section presents the implementation of the event processing model and the event processing model creation. The classes that are related to the event processing model belong to the eventProcessor package. The event processing model creation is initiated when the application programmer chooses to start simulation, step simulation or start monitoring. We have chosen to let CommonSens postpone the event processing model creation until the application programmer starts simulation or monitoring, since it is a complex process.

The idea behind the creation of the boxes-and-arrows structure is that it provides a framework that extends the complex queries with the instance specific information. In addition, the structure arranges the atomic queries so that they are simpler to evaluate and process. This especially applies to the logical operators, which combine a set of atomic queries that should be evaluated concurrently. The event processing model creation is closely related to the second calculation step described in the previous section, in which the ISEC and NOISEC sets are populated based on the sensors that provide the correct capabilities and approximate the LoI that is used in the atomic queries. As noted, the approximation of the LoIs is still not automated; the application programmer has to manually choose sensors and place them in the environment.

The key classes in the eventProcessor package and their relations are shown in the UML class diagram in Figure 5.10. One or more objects of the class QueryPoolElement implement one complex query in the query pool. QueryPoolElement has an association currBox that points to a Box object. Box is a superclass for the classes ConcurrencyBox and TimedListOfBoxes, and it also has associations to two other classes: the association nextBoxTransition points to a Transition object, which points back to a new Box object. This structure of associations makes it possible to create the boxes and arrows, where the arrows are represented by the Transition objects. In the current implementation, the Transition class has no other task than to point to the next Box object. Hence, an alternative solution would have been to skip the Transition class and have an association transition between the Box objects.

Box has an association elements, which points to zero or more ArrayList objects. Each of these ArrayList objects points to a set of QueryElement objects. We describe this structure more thoroughly, since it is essential for how the current event processing is performed.


Figure 5.10: Key classes in the eventProcessor package.

Figure 5.11: Instantiation of a box with atomic queries.


The idea behind this structure is to provide simple support for the logical operators ∧ and ∨. The current syntax allows the application programmer to write simple logical expressions like qA1 ∧ qA2 ∧ qA3 ∨ qA4 ∧ qA5 ∨ qA6. The expression is true if one of the sets {qA1, qA2, qA3}, {qA4, qA5} or {qA6} is true. All the atomic queries that are related with the ∧ operator have to be true, and we place these atomic queries together, as shown in Figure 5.11. A list of atomic queries that are related with ∧ is called an ∧-list. In the current implementation, all atomic queries in an ∧-list have to have similar timing and P-registration. For example, queries like [(Temperature == 20, kitchen, 3hours) && (Temperature == 18, bedroom, 08:00hr, 09:00hr, 50%)] are not allowed. Currently, all the succeeding atomic queries, i.e., those that are related with ∧, inherit the temporal attributes of the first atomic query. We have chosen this because it simplifies the semantics of the ∧ operator.
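Evaluating a box then reduces to a disjunction over its ∧-lists, as in the following sketch (names assumed):

    import java.util.List;

    // Assumed view of an atomic query for this sketch.
    interface EvaluatedAtomicQuery {
        boolean isSatisfied();
    }

    final class BoxEvaluationSketch {
        // A box matches when at least one of its ∧-lists has all atomic
        // queries satisfied, i.e. an ∨ over ∧-lists.
        static boolean boxMatches(List<List<EvaluatedAtomicQuery>> andLists) {
            for (List<EvaluatedAtomicQuery> andList : andLists) {
                boolean allTrue = true;
                for (EvaluatedAtomicQuery q : andList) {
                    if (!q.isSatisfied()) {
                        allTrue = false;
                        break;
                    }
                }
                if (allTrue) {
                    return true;    // one satisfied ∧-list suffices
                }
            }
            return false;
        }
    }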

Figure 5.11 also shows the pointers to the sensors that are part of the ISEC and NOISEC sets. Since the QueryElement classes are instantiated as AtomicQuery objects, we have used these in the figure. Even though the UML diagram shows that the ArrayList objects point to a set of QueryElement objects, the current implementation only points to AtomicQuery objects. However, future implementations might use other subclasses of QueryElement as well, for instance to solve the following issue with the current structure. The general drawback of having such a simple structure is that it does not allow complex logical structures. For instance, expressions like ((qA1.0 ∨ qA1.1) ∧ (qA2.0 ∨ qA2.1) ∧ qA3) ∧ (qA4 ∨ qA5) are currently not supported. One question that needs to be answered is how complex logic we should expect; answering it requires a more thorough requirement analysis. On the other hand, extending the support for more complex logical expressions would give CommonSens stronger expressiveness than it has today, which is generally important. It might be sufficient to actively use other subclasses of QueryElement, but at this point we consider solving this issue as future work. The Box object has an association andLists that points to a set of AndChainCalculations objects. These objects are used to organise the event processing, i.e., to find out which of the lists are true. When one of the lists is true, the event processor can start the transition to the next box.

As with the ConcurrencyOperator and TimedListOfAtomicQueries classes in the language package, the event processing model uses ConcurrencyBox and TimedListOfBoxes objects to tell the event processor how to handle the complex queries. The objects are chosen during the instantiation, which is discussed later in this section. ConcurrencyBox points to two QueryPoolElement objects, one for each list. This corresponds to how the ConcurrencyOperator class in the language package uses the two associations firstChain and secondChain to point to two TimedListOfAtomicQueries objects. TimedListOfBoxes points to QueryPoolElement through theBoxList.

The data tuple filter is an important structure in the event processing model, since it chooses the sensors that should be pulled. The data tuple filter consists of two classes: DataTupleFilter and MainDataTupleFilter. We have divided the data tuple filter into two classes to meet future extensions of CommonSens. The current implementation of CommonSens processes one query at a time, i.e., when the application programmer opens a query from file, CommonSens removes the old one and inserts the new. This functionality is implemented in the Core class in the modelViewController package.


MainDataTupleFilter has an association currentFilters that points to a set of DataTupleFilter objects. One DataTupleFilter object is responsible for one complex query. This responsibility is illustrated through the association currBox that points to the current Box object, i.e., the current set of atomic queries that should be evaluated.

There is no communication between the complex queries. If two atomic queries need to pull the same sensor concurrently, there is no functionality in the Box class that makes sure that the sensor is not pulled twice within a very short time interval. We assume that concurrent pulls are unnecessary, since they will provide the same value in the data tuple. Hence, the main task of the MainDataTupleFilter is to reduce the number of sensor pulls: it pulls the sensor once and distributes the data tuple to both atomic queries. This process is further discussed in Section 5.2.4.

The communication between the sensors and the data tuple filter is done through the current environment, i.e., the Environment object. When the sensors are pulled, they return a data tuple. The data tuple is added to the ConcurrentLinkedQueue object. The data tuple filter accesses the same object, removes the data tuple from the queue and sends it to the DataTupleFilter object, which sends it to the current atomic query.
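The hand-over between a pulled sensor and the data tuple filter can be sketched as follows. The DataTuple fields and the surrounding class are our own illustration, while add() and poll() are the standard ConcurrentLinkedQueue operations used in the implementation:

import java.util.concurrent.ConcurrentLinkedQueue;

// Illustrative data tuple: sensor identifier, value and timestamp.
record DataTuple(String sensorId, double value, long timestamp) {}

class SharedQueueExample {
    // The queue shared between the sensors and the data tuple filter.
    private final ConcurrentLinkedQueue<DataTuple> queue = new ConcurrentLinkedQueue<>();

    // Sensor side: a pulled sensor publishes its data tuple on the queue.
    void pullThisSensor(String sensorId, double value) {
        queue.add(new DataTuple(sensorId, value, System.currentTimeMillis()));
    }

    // Filter side: drain the queue and forward each tuple to the atomic query.
    void drainQueue() {
        DataTuple tuple;
        while ((tuple = queue.poll()) != null) {   // poll() returns null when empty
            // forward the tuple to the DataTupleFilter responsible for the query
        }
    }
}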

The second part of this section briefly discusses how the classes in the eventProcessor package are instantiated. We first discuss the creation of the query pool elements. The constructor of QueryPoolElement receives the complex query and returns the boxes-and-arrows structure, i.e., the Box and Transition objects. This functionality is implemented in QueryPoolElement, which uses the method instantiateBoxes(). instantiateBoxes() is recursive and creates the boxes-and-arrows structure by calling itself in new QueryPoolElement objects. It iterates through the QueryElement objects in the parsed complex query and performs operations that depend on the QueryElement subclass. In every new instance of a QueryPoolElement object, instantiateBoxes() performs a set of if-tests and works as follows (a minimal sketch follows the list).

• TimedListOfBoxes objects initiate a new set of QueryPoolElement objects.

• AtomicQuery objects are added to the current list of atomic queries in the current Box object. When the AtomicQuery objects are separated by ∧ operators, they are pushed into the current ArrayList of atomic queries.

• When the QueryElement is an ∨ operator, the Box object creates a new list of atomic queries, i.e., a new ArrayList.

• If the QueryElement is instantiated as a ConsecutiveRelation object, new Transition and Box objects are created.

• ConcurrencyBox objects initiate two lists of TimedListOfBoxes objects.
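The sketch below condenses these cases into one recursive method. It assumes the parsed complex query is a flat list of QueryElement objects, which is a simplification of the real class structure:

import java.util.List;

interface QueryElement {}
class AtomicQuery implements QueryElement {}
class OrOperator implements QueryElement {}
class ConsecutiveRelation implements QueryElement {}
record TimedListOfBoxes(List<QueryElement> elements) implements QueryElement {}
class ConcurrencyBox implements QueryElement {}

class Box {
    void startNewAndList() { /* see the previous sketch */ }
    void addToCurrentAndList(AtomicQuery q) { /* see the previous sketch */ }
}

class QueryPoolElement {
    void instantiateBoxes(List<QueryElement> parsedQuery, Box currentBox) {
        for (QueryElement element : parsedQuery) {
            if (element instanceof AtomicQuery q) {
                currentBox.addToCurrentAndList(q);        // ∧-related queries share a list
            } else if (element instanceof OrOperator) {
                currentBox.startNewAndList();             // ∨ starts a new ArrayList
            } else if (element instanceof ConsecutiveRelation) {
                // → : create new Transition and Box objects and continue in the new box
            } else if (element instanceof TimedListOfBoxes t) {
                new QueryPoolElement().instantiateBoxes(t.elements(), currentBox);  // recursion
            } else if (element instanceof ConcurrencyBox) {
                // initiate two lists of TimedListOfBoxes objects, one per chain
            }
        }
    }
}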

As we have shown above, the instantiation process consists of distributing the elements of the parsed complex query into the event processing structure. In practice, the parsed complex query is inserted into the Box objects and their subclasses. This corresponds to Definition 4.2.1: the instantiated query inherits most of the attributes from the query that is written by the application programmer. In the following section, we show how the instantiated query is used in the event processing phase.


5.2.4 Event Processing

This section describes the event processing phase, i.e., the steps CommonSens takes in order to evaluate an instantiated complex query. The event processing phase consists of pulling data tuples from the sensors in the ISEC and NOISEC sets, and of comparing the values in the data tuples with the conditions in the atomic queries.

The event processing phase begins when the application programmer chooses to start a simulation or monitoring. The simulation mode is convenient when the application programmer wants to test the system and run regression tests. For example, several of the experiments in Chapter 6 are run in simulation mode. The main difference between simulation and monitoring is the timing. A simulation uses virtual sensors which are pulled as fast as possible, which means that the simulations are not real-time. Both simulation and monitoring are initiated by the method startEventProcessing() in Core. Core, which is the controller class, handles most of the functionality related to simulations and monitoring. If the event processing is started in monitoring mode, the monitoring runs as a separate thread. This makes it possible for the application programmer to use the system during the event processing phase, and to stop the monitoring thread when he wants to initiate the system shut-down phase. If the event processing is started as a simulation, the whole process locks the system until the simulation has finished.

For both simulation and monitoring, the event processing phase has to be set up. The setup is done by calling the method setupEventProcessingPhase(), which is located in the Core class. First, the method starts the timer. If the timer is told to start in simulation mode, it sets the time to 0. An alternative would have been to let the application programmer choose the timestamp at which the simulation should start. In monitoring mode, CommonSens uses the system clock to set the time to the current point in time. Second, the method starts all the sensors. This process mainly consists of setting the timing and the shared queue. If the Sensor object is connected to a real sensor, it wakes up the sensor and makes it ready to send data. Currently, the functionality related to waking up the real sensors is implemented in the CommonSens class, but future work consists of moving this functionality to the sensor classes. We describe the process of waking up the real sensors as part of the evaluation in Chapter 6. The third step in setupEventProcessingPhase() is to create and return the MainDataTupleFilter object. Throughout the event processing phase, Core only interacts with the MainDataTupleFilter object.
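The three steps can be summarised in a short sketch. Apart from setupEventProcessingPhase() itself, the types and method names below are illustrative assumptions about the surrounding classes:

import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;

record DataTuple(String sensorId, double value, long timestamp) {}

interface Sensor {
    void setSharedQueue(ConcurrentLinkedQueue<DataTuple> queue);
    void wakeUp();   // only meaningful for real sensors
}

class MainDataTupleFilter {
    MainDataTupleFilter(long startTime) { /* ... */ }
}

class Core {
    MainDataTupleFilter setupEventProcessingPhase(boolean simulationMode,
            List<Sensor> sensors, ConcurrentLinkedQueue<DataTuple> sharedQueue) {
        // Step 1: start the timer; 0 in simulation mode, the system clock otherwise.
        long startTime = simulationMode ? 0 : System.currentTimeMillis();

        // Step 2: start all sensors by setting the timing and the shared queue.
        for (Sensor s : sensors) {
            s.setSharedQueue(sharedQueue);
            s.wakeUp();
        }

        // Step 3: create and return the only object Core interacts with afterwards.
        return new MainDataTupleFilter(startTime);
    }
}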

An overview of the event processing is shown in Figure 5.12. Note that there are several details that are not included in the figure; we describe these details during the following presentation. The evaluation is performed by calling the method pullAndEvaluate() in the MainDataTupleFilter class. First, the method calls the local method pullSensors(), which goes through all the DataTupleFilter objects. Note that the current implementation accepts one complex query and thus uses only one DataTupleFilter object. However, we refer to the DataTupleFilter objects in the plural to make the presentation more general and more representative of future extensions.

For each of the DataTupleFilter objects, pullAndEvaluate() calls the method getSensorsToPull(). The method is located in DataTupleFilter and calls the method getSensorsToPull() in the current Box object.


Figure 5.12: Overview of the event processing phase (sequence diagram over :MainDataTupleFilter, :DataTupleFilter, :Box, :AtomicQuery, :Sensor and :ConcurrentLinkedQueue, showing the calls pullAndEvaluate(), getSensorsToPull(), pullThisSensor(), add(:DataTuple), poll(), getBatch() and evaluateBatch()).

The Box object iterates through all the atomic queries that are currently evaluated. These atomic queries are identified by their timestamps. The two timestamps in timed atomic queries tell when the atomic query should be evaluated. If the current time, either simulated or real, is within the interval of an atomic query, the Box object extracts the sensor references in the ISEC and NOISEC sets of the atomic query. δ-timed atomic queries are different, but when the evaluation of a δ-timed atomic query starts, the timestamps in the atomic query are set. Since the tb and te attributes of the atomic query are defined as soon as the evaluation of the δ-timed query starts, this process turns the δ-timed atomic query into a timed atomic query. The evaluation of the atomic queries is discussed later in this section. The sensor references are returned as hash maps. The hash map data type removes redundant sensors and provides a simple way to avoid pulling the same sensor twice. CommonSens uses a pull-based model, i.e., it obtains information from the sensors when they are pulled.
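The deduplication idea can be sketched as follows. Keying the map on a sensor identifier is our assumption, since the text only states that hash maps are used to remove redundant sensor references:

import java.util.HashMap;
import java.util.List;

interface Sensor { String id(); }

interface AtomicQuery {
    long tb();                        // begin timestamp of the evaluation interval
    long te();                        // end timestamp of the evaluation interval
    List<Sensor> isecSensors();
    List<Sensor> noisecSensors();
}

class SensorSelection {
    HashMap<String, Sensor> getSensorsToPull(List<AtomicQuery> queries, long now) {
        HashMap<String, Sensor> toPull = new HashMap<>();
        for (AtomicQuery q : queries) {
            if (q.tb() <= now && now <= q.te()) {   // query is currently evaluated
                for (Sensor s : q.isecSensors()) toPull.put(s.id(), s);
                for (Sensor s : q.noisecSensors()) toPull.put(s.id(), s);
            }
        }
        return toPull;   // duplicates collapse on the key: each sensor is pulled once
    }
}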

The current implementation of CommonSens does not fully support variations in the sampling frequency. The sampling frequency is statically defined and is used by all the sensors. This approach avoids many of the issues that come from synchronising the evaluation of data tuples from sensors with different sampling frequencies. Future work consists of investigating the sampling frequency issues further.

Each of the sensors referred to in the hash map is accessed through the Environment object. In order to keep Figure 5.12 simple, it only shows the direct connection to the Sensor object. The sensor is pulled with the method pullThisSensor(). The sensor obtains the value, creates the data tuple object and adds the current timestamp. The resulting data tuple is added to the shared queue. When all the sensors have been pulled, MainDataTupleFilter polls the data tuples from the shared queue. We use the term poll, since it corresponds to the poll() method that obtains the data tuple at the front of the shared queue.

An alternative to using the shared queue is to let the method pullThisSensor() return the data tuple directly. This would have been sufficient in the current implementation. However, we use the shared queue in order to handle sensors that do not produce data tuples immediately when they are pulled.


Simple sensors like temperature sensors report immediately when they are pulled, but more complex sensors might exist. For instance, a logical sensor might need to perform complex computations on the data tuples it obtains. A logical sensor that provides face recognition might need time to process the camera image, detect the face and compare the face with features in a database. In order to avoid that such a logical sensor forces the pulling process to wait, it is more convenient to use a shared queue: the sensor adds the data tuple to the shared queue when it is ready. The current implementation does not support delayed data tuples.

We assume that all the data tuples are added to the queue. If one of the sensors is delayed, the data tuple will simply not show up in the current sensor pull. As described in the following discussion, an absent data tuple affects the evaluation of the related atomic query.

Figure 5.12 is simplified, and it appears as if only the method pullAndEvaluate() in MainDataTupleFilter performs all the computation related to the event processing. In reality, this process is performed by several methods in MainDataTupleFilter. However, to maintain a simple overview of the event processing phase, we keep this simplification. Further details are shown in the code.

When a data tuple is polled from the shared queue, it is important to know which atomic query should evaluate it. Currently, the shared queue is polled until it is empty, and the data tuples are put into a hash map. For each of the data tuple filters, the sensors are identified once more, which repeats the previous calls. A more efficient solution would be to keep the overview of the box-sensor relationship from the first time this information is obtained.

The bottom part of Figure 5.12 shows the methods in the query evaluation component. If the box contains many atomic queries, the box receives a batch of data tuples. MainDataTupleFilter sends the batch to DataTupleFilter, which sends it to the box by using the method evaluateBatch(). evaluateBatch() iterates through all the atomic queries. This is done with a double for-loop that first iterates through the ArrayList objects and, for each ArrayList object, iterates through the AtomicQuery objects. Finally, all the data tuples in the batch have to be evaluated against the current atomic query. If the data tuple comes from a sensor in the ISEC set, it should match the condition in the atomic query. If the data tuple comes from a sensor in the NOISEC set, it should not match the condition in the atomic query. If a batch of data tuples matches the ISEC set and does not match the NOISEC set, we say that the condition in the atomic query is matched.
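The nesting behind the cubic complexity discussed next can be sketched like this; the predicate names on AtomicQuery are illustrative:

import java.util.List;

record DataTuple(String sensorId, double value, long timestamp) {}

interface AtomicQuery {
    boolean isecContains(String sensorId);
    boolean noisecContains(String sensorId);
    boolean conditionMatches(DataTuple t);
}

class BatchEvaluation {
    // Outer two loops: the ∧-lists and the atomic queries inside them; the call
    // to queryMatched() adds the third loop over the data tuples in the batch.
    boolean evaluateBatch(List<List<AtomicQuery>> andLists, List<DataTuple> batch) {
        for (List<AtomicQuery> andList : andLists) {      // ∨ between the lists
            boolean listMatched = true;
            for (AtomicQuery q : andList) {               // ∧ inside a list
                if (!queryMatched(q, batch)) { listMatched = false; break; }
            }
            if (listMatched) return true;                 // one matched ∧-list suffices
        }
        return false;
    }

    // An atomic query is matched when an ISEC tuple matches its condition
    // and no NOISEC tuple does.
    boolean queryMatched(AtomicQuery q, List<DataTuple> batch) {
        boolean isecMatched = false;
        for (DataTuple t : batch) {
            if (q.isecContains(t.sensorId()) && q.conditionMatches(t)) isecMatched = true;
            if (q.noisecContains(t.sensorId()) && q.conditionMatches(t)) return false;
        }
        return isecMatched;
    }
}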

Our current algorithm gives evaluateBatch() a complexity of O(N³). Such a complexity is very time consuming for large values of N, i.e., if the Box object contains a large number of AtomicQuery objects and there is one data tuple for each atomic query. However, as we show in Chapter 6, the current system utilisation does not suffer from this complexity. On the other hand, O(N³) is generally not good, and optimising this algorithm is future work.

The conditions in all the atomic queries in an ∧-list have to be matched. If one of the conditions is not matched, the algorithm jumps to the next set of atomic queries. As soon as every condition in one of the lists is matched, the algorithm starts updating the statistics for this list. The process involves setting the current list in evaluation mode. If the atomic queries are δ-timed, they become timed: the tb attribute is set to the current timestamp, and te is set to the sum of tb and the δ-time duration.
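In code, the conversion amounts to two assignments. The setter names are our own; tb, te and the δ duration come from the text:

interface AtomicQuery { void setTb(long tb); void setTe(long te); }

class DeltaTiming {
    // Turn a δ-timed atomic query into a timed one when its evaluation starts.
    void startDeltaTimedEvaluation(AtomicQuery q, long now, long deltaDuration) {
        q.setTb(now);                   // tb := current timestamp
        q.setTe(now + deltaDuration);   // te := tb + δ-time duration
    }
}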


Figure 5.13: Mixed matching versus uniform matching (the panels show the atomic queries qA1 and qA2 matched by two different sequences of data tuples and by one uniform sequence).

Since the P-registration is inherited by all the atomic queries in the ∧-list, the current implementation does not allow mixed matching of the atomic queries. By mixed matching we mean that atomic queries in an ∧-list are matched by two different sequences of data tuples. An example of mixed matching and uniform matching of the complex query qA1 ∧ qA2 is shown in Figure 5.13. In order to realise mixed matching, the application programmer has to use the EQUALS concurrency operator instead.

When the box has identified which list of atomic queries has matched the condition and updated the statistics, the data tuple filter continues by evaluating whether it should perform a transition to the next box or not. The evaluation is performed by the method evaluateBox(), which looks at each of the ∧-lists that are under evaluation. If one of the ∧-lists has met the temporal requirements, the data tuple filter starts to send data tuples to the next box. If there are no more boxes, the complex query is finished.

evaluateBox() also investigates if there are any ∧-lists that are timed and where the current time has exceeded the tb-timestamp. In this case, it updates the statistics and sets a flag that indicates that the evaluation of the ∧-list has started. Note that such an operation is allowed since the atomic queries can be P-registered; they do not have to be matched by 100% of the data tuples. In case the current time exceeds the value in te, evaluateBox() checks if the P-registration is satisfied. If not, the evaluation of the complex query stops. If the application programmer has stated that he is interested in deviations from the complex query, evaluateBox() sends a notification about the deviation.
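A compact sketch of this decision logic, under the assumption that the per-list state is available through an interface (all names except evaluateBox() are illustrative):

enum BoxResult { CONTINUE, TRANSITION, DEVIATION }

interface AndList {
    long tb();
    long te();
    boolean started();
    void markStarted();
    boolean pRegistrationSatisfied();   // enough of the data tuples matched?
}

class BoxEvaluation {
    BoxResult evaluateBox(AndList list, long now, boolean deviationRequested) {
        if (now > list.tb() && !list.started()) {
            list.markStarted();               // the evaluation window has opened
        }
        if (now > list.te()) {                // window closed: check the P-registration
            if (list.pRegistrationSatisfied()) {
                return BoxResult.TRANSITION;  // start sending data tuples to the next box
            }
            if (deviationRequested) {
                notifyDeviation(list);        // the behaviour deviated from the rule
            }
            return BoxResult.DEVIATION;       // the evaluation of the complex query stops
        }
        return BoxResult.CONTINUE;
    }

    void notifyDeviation(AndList list) { /* send the notification */ }
}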

5.3 Discussion and Conclusion

This chapter presents the prototype implementation of CommonSens. The implementation is mainly made to show that the concepts from Chapters 3 and 4 can be implemented in a real system. The structure of this chapter follows the life cycle phases of CommonSens, i.e., we give an overview of how the models are implemented before showing the functionality and the placement and creation of the physical sensors. Finally, we discuss the elements that are part of the implementation of the event processing.

During the presentation of the implementation we have pointed out that parts of the design could have been different. However, the implementation has been developed concurrently with the concepts, as a proof-of-concept during the design phase, and this has affected the final implementation. This means that there are many steps that can be taken to optimise the performance of the code.


On the other hand, in this chapter we show that it is possible to implement the models and concepts of CommonSens. In the following chapter, we evaluate our claims by using our implementation.


Chapter 6

Evaluation

This chapter evaluates the claims that we made in Chapter 1. We claim that CommonSens (1) manages to detect complex events and deviations from complex events, (2) is scalable and manages to detect complex events in near real-time, i.e., it processes as fast as possible, and (3) simplifies the work of the application programmer and provides personalisation. We split this chapter into three sections, one per claim, and discuss in each section the approach and method we have chosen in order to evaluate and support the claim. We end this chapter by summing up the conclusions and discussing whether or not our claims are sufficiently evaluated.

In order to support the first two claims, the query language must be able to describe complex events and deviations correctly. This involves performing a systematic evaluation of the query language constructs, e.g., ∧ and →, and verifying that they work as specified, i.e., that they are correctly implemented. To do this, it is important to be able to choose the language constructs we want to evaluate. CommonSens takes as input an environment, a complex query and a workload. The workload can come either from real sensors or from emulated sensors. In order to evaluate only some language constructs at a time, we have to customise the input. Customisation is easier with synthetic environments, complex queries and workloads, i.e., input that is not necessarily related to a real automated home care scenario.

It is only appropriate to evaluate CommonSens in real scenarios when we know that the language constructs work correctly. Therefore, in addition to synthetic workloads, we also use real workloads in our evaluation. The real workload is either obtained from sensors in real-time or from trace files from related work. When the workload is obtained in real-time, CommonSens should detect states and state transitions when they happen in the environment. This takes more time than with synthetic workloads, since it includes the time it takes to pull the sensors and obtain the data tuples. However, real-time workload is not a requirement for evaluation of the event processing.

CommonSens always performs the processing of the batches of data tuples in near real-time, i.e., it processes them as fast as possible. This makes it possible to use synthetic workloads to evaluate our second claim as well, i.e., that the complex events are detected in near real-time. In addition, synthetic workloads and trace files make it possible to skip sequences where there are no data tuples, which makes the evaluation of CommonSens faster. When the data tuples are not obtained in real-time, this is called playback.


                     Real-time                 Playback
Real workload        Real-world: 6.1.2, 6.3    Trace file: 6.1.3
Synthetic workload   -                         Simulation: 6.1.1, 6.2

Table 6.1: Workload types and the sections in which they are used.

Table 6.1 shows the three combinations of real workload, synthetic workload, real-time and playback that we use in our evaluation. The table also shows in which sections the combinations are used. For each table cell we present the term that we use for the combination and the sections in which this combination is used. Simulation, the combination of playback and synthetic workload, is used in Sections 6.1.1 and 6.2. Section 6.1.1 evaluates the functionality of CommonSens, i.e., the detection of complex events and deviations. Section 6.2 evaluates the second claim, i.e., whether CommonSens manages to detect complex events and deviations in real-time. Playback of workload from real sensors is called trace file evaluation. This is done in Section 6.1.3, where we evaluate whether CommonSens manages to read trace files from related work. Finally, the real-world evaluation is performed by using real sensors that produce data tuples in near real-time. This workload is used in Sections 6.1.2 and 6.3.

There is one slot in the table that we do not use: the slot for real-time synthetic workload. We conclude that real-time synthetic workloads do not add any additional value to our evaluation. This is because CommonSens already processes the data tuples as fast as it can, and since real-time synthetic workloads do not include the time it takes to pull the sensors, using this type of workload only takes an unnecessary amount of time. We can obtain equivalent results through simulation.

6.1 Detecting Complex Events and Deviations

CommonSens is based on complex event processing concepts, and detection of complex events and deviations are two important concepts that CommonSens should support. The method we use to evaluate these two concepts consists of designing and running complex queries with different workloads and comparing the results with the expected results.

The evaluation of our first claim consists of three separate parts. The first part systematically evaluates the constructs that the language provides, for instance concurrency and temporal properties. For the first part we use synthetic workloads and synthetic environments, which are designed to investigate and isolate single language constructs. The details are explained thoroughly in the corresponding section.

The second part of our evaluation uses real-time generated workloads to show that CommonSens manages to detect complex events using real sensors. The second part also evaluates the LoI approximation. The LoI approximation depends on correct instantiation of the atomic queries, both atomic queries that contain a LoI and atomic queries that do not. It is very important that the LoIs are approximated correctly, i.e., that the ISEC and NOISEC sets contain the correct sensors. If the approximation is not correct, CommonSens might report wrong results. Unlike in the first part, we do not aim to isolate the language constructs.

The third part uses real-world trace files from related work. It is important to show that CommonSens can detect complex events and deviations from sources other than the ones we provide.


In addition to acid-testing CommonSens and evaluating whether it can use different types of sensors, reading trace files might be useful when comparing CommonSens with other automated home care systems. To the best of our knowledge, there does not yet exist any benchmark for automated home care and CEP systems, e.g., one similar to Linear Road for DSMSs [ACG+04].

In order to detect complex events and deviations, CommonSens depends on a set of language constructs. We have identified six language constructs that have to be evaluated. These language constructs are part of the query language.

I Timed and δ-timed atomic and complex queries. The temporal properties of atomic and complex queries have to be correct. This also applies to queries that are not timed, i.e., they should be successfully matched only once.

II P-registration. P-registration should work with both max and min. P-registration plays an important role in the event processing, since it indirectly supports unreliable sensors. One of our assumptions is that the sensors are reliable and produce correct state values. However, with P-registration one can state that only a given percentage of the state values have to be correct.

III The logical operators ∧, ∨ and ¬. The logical operators are used between the atomic queries. The current implementation of CommonSens does not support all types of logical expressions. However, it is important to show that the ∧-lists work as expected.

IV The consecutiveness relation →. It should be possible to detect consecutive atomic and complex events.

V Concurrency. Currently, we have only implemented the concurrency classes DURING and EQUALS. However, these classes should work as defined.

6.1.1 Functionality Tests

Throughout the development process of CommonSens we have designed a set of regression tests. These regression tests are used to verify that the language constructs are implemented correctly. In addition, the regression tests have been used to verify that new functionality has not affected already working code. In general, regression tests are designed and implemented with backwards compatibility in mind [Dus02]. However, the overall goal in this section is to show that CommonSens detects complex events and deviations from complex queries. The regression tests have a very simple design that allows us to define the input to CommonSens while knowing the expected output. Since the regression tests evaluate functionality as well as backwards compatibility, we refer to them as functionality tests throughout this chapter.

Test Overview

The workloads should either (1) match the complex queries or (2) not match the complex queries. If the workload matches the complex queries, the conditions in a sufficient number of the atomic queries are matched, i.e., the workload is a sequence of data tuples with which the complex query finishes successfully.


Value  Meaning
0      The evaluation of the complex query has finished correctly.
3      A deviation is caused by a temporal mismatch in one atomic query.
6      The evaluation of the complex query has not started.
7      A deviation is caused by a mismatch in the concurrency.
8      A deviation is caused by a temporal mismatch in a list of atomic queries.

Table 6.2: Return values from the functionality tests and their meaning.

Since CommonSens supports deviation detection, we must also use workloads that do not match the complex query, to see if CommonSens successfully manages to detect deviations. An additional workload type, which corresponds to (2), is the workload that does not start the event processing. These workloads correspond to the large number of data tuples that the data tuple filter has to send to the Box objects but that do not qualify for further evaluation.

In CommonSens, the functionality tests are fully automated. We have chosen this approach since a large number of tests can be designed and run when they are needed. Automation also removes the time it takes to prepare the tests, as well as human errors. We only need to instruct CommonSens to start the functionality tests and wait for the results.

In addition to isolating and evaluating single language constructs, it is important to evaluate complex queries with several language constructs. For instance, we need to evaluate complex queries that consist of atomic queries with different timing, related with ∧, ∨ and →. On the other hand, some language constructs, like timing and P-registration, always have to be evaluated together, since P-registration depends on the current timing. However, it is hard to combine all the language constructs and sizes of complex queries. It is not possible to evaluate all possible combinations, so we have to make a selection.

In the following we give an overview of how the functionality tests are designed. The input to our functionality tests is a triple of parameters that consists of an environment instance, a movement pattern and a complex query. In order to evaluate the language constructs, we combine different environment instances, movement patterns and complex queries. By applying different combinations, we show that the language constructs work correctly with different input.

When a functionality test has finished, it returns a value that reports the outcome of the test. The values and their meanings are presented in Table 6.2. The table shows the values 0, 3, 6, 7 and 8. These values belong to a set of public constants that are used by the data tuple filter. In total, the data tuple filter uses 12 values that inform about the current complex query, e.g., telling the data tuple filter to start sending data tuples to the next Box object. A drawback with the way we have used the return values is that they cannot specify which atomic query caused a deviation; the functionality tests simply return the value 3 when a temporal mismatch is detected. However, the tests are designed in such a way that we know which atomic query will return a deviation: if a deviation occurs, the test returns a number that shows in which step of the simulation the deviation occurred. In addition, if there is a mismatch between the result and the expected result, this has to be investigated more thoroughly. All the functionality tests are described in a file, i.e., all the 'environment instance'-'movement pattern'-'query' triples are written down together with the expected return value.
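A minimal harness in this spirit is sketched below; the test triple record and the method names are hypothetical, and only the return values from Table 6.2 are taken from the implementation:

// Hypothetical test triple: environment instance, movement pattern, complex query.
record FunctionalityTest(String envFile, String movFile, String qryFile, int expected) {}

class FunctionalityTestRunner {
    static final int FINISHED_OK = 0, TEMPORAL_MISMATCH = 3, NOT_STARTED = 6,
                     CONCURRENCY_MISMATCH = 7, LIST_TEMPORAL_MISMATCH = 8;

    // A test passes when the returned value equals the expected value,
    // reported as, e.g., "True (3 = 3)" in the result tables.
    boolean run(FunctionalityTest test) {
        int result = simulate(test.envFile(), test.movFile(), test.qryFile());
        return result == test.expected();
    }

    int simulate(String env, String mov, String qry) {
        // Load the triple, run the simulation and return one of the constants above.
        return FINISHED_OK;   // placeholder
    }
}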


Figure 6.1: Environment instances used in functionality tests: a) e1.env with sensors S1 to S6, b) e2.env with the mutually overlapping sensors S1 to S4, and c) e3.env with the overlapping pairs S1, S2 and S3, S4.

The functionality tests are extensible, i.e., we can write additional information in these files if this is required or if new language constructs are added to CommonSens. In the following we discuss the parameters of the input triple.

Environment Instances

The environment instances that we use in the functionality tests are synthetic and designed to simplify the evaluation of the language constructs. They are not inspired by any real environment and contain a set of sensor objects and a room object. In order to evaluate different language constructs, we have created three environments with different numbers of sensors and different sensor placements. The environments are shown in Figure 6.1. For example, concurrency can be evaluated by having sensors whose coverage areas cover each other, since both sensors are then supposed to report data tuples at the same time. Sensors with adjacent coverage areas can be used to investigate consecutiveness, i.e., that the sensors should report data tuples that match a condition in a given order.

All the sensors in the environments have circular coverage areas that are not affected by any of the other objects in the environment. This is done because we do not want to evaluate the coverage area calculation in this section. The sensors simulate an RFID reader/tag scenario, which makes it easy to identify the location of the monitored person by using the coverage areas of the physical sensors. The sensors that are placed in the environments act as RFID readers while the monitored person wears an RFID tag. The RFID readers provide the capability DetectPerson. Occasionally, the RFID tag is inside the coverage area of a sensor. When this occurs, the RFID reader reports Person1 when it is pulled. Otherwise, the RFID reader does not send a data tuple at all. It is irrelevant what type of sensor we use, which corresponds to our concept of capabilities: capabilities and sensors are only coupled during the query instantiation. However, we assume that the sensor is reliable and returns correct results.

Environment instance e1.env consists of six sensors, S1 to S6. These sensors are meant to approximate LoIs that are used in the complex queries. The LoIs are discussed later in the functionality test description. S1 to S4 are located in the corners of the environment. S6 has a larger coverage area than the other sensors; in addition, it covers the coverage area of S5. Environment instance e2.env consists of four sensors, S1 to S4. All four sensors have coverage areas that cover each other. This means that when the monitored person moves into the coverage area shown in the figure, all the sensors report Person1 if they are pulled.


Class  Filename  Movement pattern
a)     m5.mov    1 centre, 6 bottom left, 1 centre
       m6.mov    1 centre, 5 bottom left, 2 centre
       m8.mov    1 centre, 3 bottom left, 4 centre
       m7.mov    1 centre, 1 bottom left, 6 centre
       m11.mov   1 centre, 1 bottom left, 10 centre
b)     m2.mov    2 centre, 2 left, 2 centre
c)     m3.mov    1 centre, 1 top centre, 1 centre
d)     m4.mov    1 centre, 1 top left, 1 centre
       m12.mov   1 centre, 1 top left, 10 centre
e)     m13.mov   1 centre, 6 top right, 1 centre
       m14.mov   1 centre, 5 top right, 2 centre
       m15.mov   1 centre, 3 top right, 4 centre
       m16.mov   1 centre, 1 top right, 6 centre
       m9.mov    1 centre, 1 top right, 10 centre
f)     m1.mov    2 centre, 2 bottom right, 2 centre
       m10.mov   1 centre, 1 bottom right, 10 centre
g)     m13.mov   1 centre, 3 top left, 1 centre, 6 bottom right, 1 centre
h)     m17.mov   1 centre, 1 bottom left, 1 centre, 4 top right, 5 centre
i)     m18.mov   1 centre, 1 bottom left, 1 top left, 1 top right, 6 bottom right, 2 centre
       m19.mov   1 centre, 1 bottom left, 1 top left, 1 top right, 1 bottom right, 7 centre
       m20.mov   1 centre, 3 bottom left, 3 top left, 3 top right, 3 bottom right, 1 centre
       m21.mov   1 centre, 2 bottom left, 3 top left, 3 top right, 3 bottom right, 1 centre
       m22.mov   1 centre, 3 bottom left, 3 top left, 2 top right, 3 bottom right, 1 centre
j)     m23.mov   1 bottom right, 1 bottom centre, 5 centre, 1 top centre, 1 top left
k)     m24.mov   2 bottom left, 6 bottom centre, 2 bottom right
l)     m25.mov   3 top right, 17 bottom right, 3 bottom left, 7 bottom right, 10 top left
       m26.mov   7 top right, 17 bottom right, 3 bottom left, 7 bottom right, 10 top left
       m28.mov   3 top right, 6 bottom right, 6 bottom left, 7 bottom right, 10 top left
m)     m27.mov   3 top right, 7 top left, 2 bottom right, 30 top left
n)     m29.mov   3 top right, 17 bottom right, 3 bottom left, 21 bottom right
o)     m30.mov   10 bottom centre, 10 centre, 10 bottom centre
p)     m31.mov   10 bottom centre, 10 centre, 10 top right

Table 6.3: Mapping between movement pattern classes and movement patterns.


Figure 6.2: LoIs used in functionality tests (LoI1/LoI7, LoI2, LoI3/LoI8 and LoI4 in the corners; LoI5 in the middle, covered by the larger LoI6).

In environment instance e3.env, S1 and S2 overlap in the upper right corner while S3 and S4 overlap in the lower left corner.

Locations of Interest

The LoIs and their locations in the environment are presented in Figure 6.2. In total, the queries use eight LoIs. LoI1 to LoI4, LoI7 and LoI8 are defined to be in the corners of the environment. LoI5 is defined to be in the middle of the environment, while LoI6 is larger than the other LoIs and fully covers LoI5. LoI1 and LoI7, and LoI3 and LoI8, are defined to be in the same areas.

This type of spatial definition allows us to investigate ∧ and concurrency, since these language constructs require that two or more atomic queries are at some point in time processed concurrently. For instance, we can state in two atomic queries that LoI1 and LoI7 should be evaluated concurrently. If the monitored person is located in the approximation of these two LoIs and the temporal specifications in the queries are matched, both atomic queries are matched. On the other hand, one can state that LoI1 and LoI8 should be matched concurrently, which we know is an impossible state since the two LoIs are located apart from each other in Figure 6.2. However, it is important that CommonSens reports this correctly as well.

The placement of LoIs and sensors is not random. When a LoI is addressed in one or more of the atomic queries, it is approximated by one or more sensors in the environment instance. However, in this section we do not evaluate the LoI approximation, only the language constructs. The evaluation of the LoI approximation is performed in Section 6.1.2.

Workloads

The workloads we use for our simulations are movement patterns of a virtual monitored person. Since all the sensors provide the capability DetectPerson, it is sufficient to use movement patterns when evaluating complex event processing and deviation detection. This is because they provide data tuples that indicate whether the monitored person is located inside the approximation of a LoI or not. This is implicitly supported when we use the synthetic workloads. When we want to investigate P-registration, we have to let the monitored person stay inside the coverage area for a sufficient amount of time.


Figure 6.3: Movement pattern classes a) to p) used in functionality tests.


Figure 6.4: Nine possible locations in the environments: top left, top centre, top right, left, centre, right, bottom left, bottom centre and bottom right.

The movement patterns consist of discrete steps, and we simply count simulation steps, which are equal to epochs, i.e., the time it takes to collect all the data tuples and process them [MFHH05].

In order to evaluate the query processing and deviation detection, we have created 16 movement pattern classes that aim to show movement between different coordinates in the environment. These classes are shown in Figure 6.3. Each class contains one or more movement patterns, i.e., the monitored person moves between the same coordinates as in the class but stays at the coordinates for different numbers of epochs. This means that timing and P-registration can be evaluated. In addition, all the other language constructs can be evaluated by similar movement patterns. For instance, a complex query that investigates the → relation can be evaluated by sending the monitored person to different coordinates in the environment. Each of these coordinates can be inside the approximation of the LoIs that are addressed in the complex query.

In total, the current functionality tests use 31 different movement patterns. The possible coordinates correspond to one of nine locations in an environment. These locations are intuitively named top left/centre/right, left, centre, right, and bottom left/centre/right, and are shown in Figure 6.4. The movement patterns are simple; they do not contain the coordinates that are located on the line between any two coordinates. This means that the virtual monitored person actually moves discretely between the coordinates. Even though this is not natural movement, it is sufficient for our evaluation. If the monitored person moves to a coordinate that is covered by one or more sensors, the monitored person can be detected by these sensors.

Table 6.3 shows the mapping between the movement pattern classes and the movement patterns. Some classes, e.g., e) and l), contain many movement patterns, while other classes, e.g., m) and b), contain only one movement pattern. One may extend the list with additional movement patterns.


Complex Queries

The final part of the functionality test triple is the complex query. We have created a set of 47 complex queries that are used to investigate the language constructs. We begin with simple queries that only address a capability with a value ([(DetectPerson==Person1)]) and move to more complicated queries that combine temporal properties and consecutiveness. All the complex queries are shown in Tables A.2, A.3 and A.4. As with all the other parameters in the input triple, the list of complex queries can be extended.

Test Examples

Currently, we have performed 182 functionality tests. All the functionality tests with their expected values are shown in Table A.1. The functionality tests are divided into sets that relate to each language construct or combination of language constructs. In order to show how the tests are implemented, we describe two subsets of the functionality tests more thoroughly than the rest. A discussion involving all the functionality tests is unnecessary, since Table A.1 contains information about each set. After the two subsets are presented, we discuss the results. We finally present a critical assessment of the functionality tests and how they are designed.

The syntax of our query language allows the application programmer to explicitly state interest in deviations or not. This is done by surrounding a list of atomic queries with dev(). However, to simplify the automation we have changed the semantics in the functionality tests. We do not use dev() in any of the queries. Instead, the query evaluator is instructed to report a deviation if the workload deviates from the complex query (values 3, 7 and 8 in Table 6.2), or to report that the query evaluation was successful (0) or did not start (6).

Timing is essential in CommonSens, and language construct I is evaluated in each set, because even atomic queries that do not have temporal specifications are included in the tests: they should be processed differently than atomic queries with temporal specifications. In addition to investigating that the results from the functionality tests are correct, we investigate the time consumption of the query processing. We do this in order to evaluate whether CommonSens manages to process data tuples in near real-time. The time consumption is reported as the average processing time in milliseconds for each of the atomic queries, for each of the workloads. Since the functionality tests use synthetic sensors, we cannot include the time it takes to gather data tuples from the sensors over a network. The event processing time we measure is the time interval from the point in time when all the relevant data tuples are collected until CommonSens has evaluated the data tuples. The time consumption results, i.e., the average, minimum and maximum time consumption, are shown in the tables. The issues related to time consumption are discussed more thoroughly in Section 6.2.

The first examples are functionality tests 178 to 182. These tests evaluate timing and consecutiveness. We use complex query cq46.qry in Table A.4, the environment instance e1.env and the five workloads m25.mov to m29.mov. The complex query uses the followed-by relation → to describe consecutive atomic events.

[(DetectPerson==Person1, LoI1, 10, max 50%) ->
 (DetectPerson==Person1, LoI2, 10, min 50%) ->
 (DetectPerson==Person1, LoI3, 21, 30, max 50%) ->
 (DetectPerson==Person1, LoI4, 31, 40, min 50%)]

The complex query addresses LoI1 to LoI4. To match the complex query, the monitored person should move between the corners of the environment instance and stay in the corners for a certain amount of time. This amount depends on the current atomic query. The first atomic query is δ-timed and states that the sensors that cover LoI1 should report that Person1 is detected. The simulation steps, i.e., epochs, are denoted by time units from t1 to tx. For instance, the atomic event that should match the first atomic query should occur for at most 50% of ten time units. The second atomic query is similar to the first, but the event should occur for a minimum of five time units in LoI2. The two last atomic queries are timed and should occur for a maximum of 50% of the time between t21 and t30, and a minimum of 50% between t31 and t40.

m25.mov matches the complex query. Table 6.3 shows that m25.mov first contains three steps in the top right corner of the environment (see Figure 6.4). The P-registration of the first atomic query is set to max 50% of 10 steps. Therefore, the first three steps of the workload are sufficient for successful evaluation. These steps are followed by 17 steps in the bottom right corner, three steps in the bottom left corner, seven steps in the bottom right corner and ten steps in the top left corner. Translated to the LoIs addressed in the complex query, this means that the monitored person moves from LoI1 through LoI2 to LoI3. Then, the monitored person moves back to LoI2 before stopping in LoI4. This movement pattern matches the complex query, even though the monitored person moves to LoI4 via LoI2. The complex query does not state that this is disallowed, as long as LoI4 is visited for a minimum of 50% of the time steps between t31 and t40. Note that CommonSens waits until the end of the time window before it identifies a deviation. An alternative solution would have been to stop the evaluation of the atomic query as soon as the maximum P-registration value has been exceeded.

The succeeding workloads each give a deviation in one of the atomic queries. Movement pattern m26.mov, which is used in test 179, begins with seven steps in the top right corner. This means that the maximum limit of five steps is already broken here. The remaining workloads aim to make the query evaluator report a deviation in each of the three consecutive atomic queries. The third workload (m27.mov), which only reaches LoI2 after six steps and moves on to LoI3 at t9, gives a deviation at t19, when the minimum P-registration value in the second atomic query is not matched. The fourth workload (m28.mov) gives a deviation at t25, since the DetectPerson==Person1 event in LoI3 occurs for more than five time units. Finally, the fifth workload (m29.mov) gives a deviation at t40, since the event does not occur when it should.

The second examples are functionality tests 172 and 173. These tests evaluate timing and the concurrency class DURING. We use complex query cq47.qry in Table A.4, the environment instance e1.env and the workloads m30.mov and m31.mov. The complex query is as follows:

[during([(DetectPerson==Person1, LoI5)],[(DetectPerson==Person1, LoI6)])]

Person1 should first be detected in LoI6, and during this event Person1 should also be detected in LoI5. Finally, and in order to preserve the conditions of the DURING class, Person1 should be detected only in LoI6. The workload m30.mov matches the complex query correctly. The monitored person is first located in the bottom centre of the environment for ten steps.


Test number  Average  Minimum  Maximum  End time  Result
178          0.18     0.05     0.32     39        True (0 = 0)
179          0.37     0        0.8      4         True (3 = 3)
180          0.16     0        0.3      19        True (3 = 3)
181          0.22     0.08     0.38     25        True (3 = 3)
182          0.16     0.05     0.34     40        True (3 = 3)

Table 6.4: Results from functionality tests 178 to 182 (processing times in milliseconds).

Test number  Average  Minimum  Maximum  End time  Result
172          0.74     0.53     1.1      40        True (7 = 7)
173          1.01     0.61     1.48     30        True (0 = 0)

Table 6.5: Results from functionality tests 172 and 173 (processing times in milliseconds).

This position starts the evaluation of the second atomic query, since it is inside LoI6. Afterwards, the monitored person moves to LoI5, which is inside LoI6, i.e., both atomic queries are evaluated concurrently. Finally, the monitored person moves back to LoI6. m31.mov gives a deviation, since the monitored person moves to the top right corner. This is outside LoI6, which violates the definition of the DURING concurrency class.

The results from the first five functionality tests are shown in Table 6.4. The results are as expected, i.e., the first workload matches the complex query, whereas the remaining workloads deviate from the complex query at different steps. The results from functionality tests 172 and 173 are shown in Table 6.5. As expected, the first workload does not give any deviation, and the evaluation stops as soon as LoI6 is reached for the second time. The second workload gives a deviation as soon as LoI1 is reached instead of LoI6.

The average processing time of an atomic query in the first complex query is less than 0.37 milliseconds. The minimum processing time is 0, which means that it takes less than a millisecond to process the atomic query. For the second complex query, which processes two complex queries concurrently, the average processing time is higher. The maximum processing time is 1.48 milliseconds. This indicates that CommonSens is fast, and that the query processing is not a bottleneck for detecting deviations in near real-time.

All the results from the functionality tests are shown in Table A.5. All 182 tests are successfully evaluated, i.e., the expected value and the return value match. We have tried to identify a representative set of complex queries with matching and deviating workloads; however, we cannot conclude that CommonSens detects complex events and deviations from all types of workloads. We can only state that the functionality tests are successfully evaluated. On the other hand, we combine environment instances, workloads and complex queries that address only certain language constructs, and we also use workloads that deviate from the complex queries in order to show that deviation detection works. However, with respect to P-registration, only 25% and 50% are evaluated. In addition, one P-registration test is missing: we do not have a workload that investigates P-registration matching the conditions as in Figure 3.6. However, the experiments in Section 6.1.2 indicate that this works correctly. The timing and δ-timing are also static, i.e., δ-timing is mostly evaluated with atomic queries that state that the event should last for five steps, e.g., cq26.qry.


Figure 6.5: The environment in CommonSens and in the real world.

one to step six, e.g., cq29.qry. Only the final complex queries use other temporal specifica-tions, e.g. cq46.qry. Timing of complex queries needs more evaluation. Functionality tests174 to 176 only investigate a P -registration with max 100%. A thorough evaulation of theseissues is required in future work. Despite these remarks, the results from the functionality testsindicate that CommonSens manages to detect complex events and deviations. In the followingsection, we strengthen this indication by evaluating CommonSens in real-world scenarios.

6.1.2 Real-world Evaluation

In this section we evaluate the real-world detection of complex events in CommonSens. We use real sensors that CommonSens pulls in real-time. In real-world evaluation it is important that the complex queries are instantiated correctly, and that the FPPROB values are as low as possible. Therefore, in the real-world evaluation we focus on the functional aspects of CommonSens that relate to spatial issues, including coverage area calculation, sensor placement and LoI approximation.

Reducing the probability of false positives is not trivial, especially when applying sensors that use radio signals, e.g., RFID tags and readers. Even though the functionality tests in Section 6.1.1 emulate RFID tags and readers, applying these types of sensors in the real world is not straightforward, and one must assume that there are many false positives and false negatives [NBW07]. In addition, the sensors have to be placed so that the ISEC and NOISEC sets are unique for each LoI. Otherwise, CommonSens will give wrong results, because it is not clear which LoI has activity.

The real-world evaluation is done through a use-case, which consists of an environment that is instantiated in CommonSens and in the real world. The environments are an office and an office hallway. Both environments are equipped with cameras. We first evaluate spatial issues and complex event processing in the office environment. Second, we investigate complex event processing in the office hallway environment.


Office Environment Design, Method and Results

Although the office environments are not similar to homes, we can use them since they can contain sensors and LoIs. This is a strength of CommonSens: it can be used in many different application domains. Future work aims to evaluate CommonSens in homes as well. The use-case consists of a series of experiments:

1. Demonstrate that radio signals propagating through walls can lead to false positives.

2. Investigate how our calculation of coverage area matches the real-world coverage area of a radio-based sensor.

3. Show how we reduce the probability of false positives.

4. Increase the number of sensors to approximate the LoIs with a lower probability of false positives.

5. Place an obstacle in the environment and show how this obstacle reduces the coverage area of one of the sensors, which again leads to a higher probability of false positives.

The office environment consists of two rooms, Room A and Room B. The queries address two LoIs: LoI1 and LoI2. Both LoIs are located in Room A. The room also has an area called Area A, which is defined by dotted lines. The virtual CommonSens instance is illustrated in Figure 6.5 a). The real-world instance is shown in Figure 6.5 b).

In the first experiment we use two MICAz motes [xbo], M1 and M2, to emulate an RFID reader (M2) and an active RFID tag (M1). RFID readers and tags are commonly used in automated home care for localisation of events [NBW07]. We place M1 in the middle of LoI2. By applying the algorithm REDUCE for calculating coverage areas, we get an indication of how the signals pass through the wall. In order to show that this results in false positives, we investigate three scenarios: (1) we place M2 inside LoI1, (2) we move M2 to LoI2, and (3) we move M2 to predefined coordinates on the other side of the wall. The metric we investigate is the success rate, i.e., the ratio of packets received at M2 to packets transmitted from M1. Since the signal passes through the walls, the success rate is 1 during the entire experiment. This means the system reports many false positives.

In the second experiment, we increase the distance between M1 and M2. The second metric we use is the received signal strength indicator (RSSI) value, which we use together with the success rate from the first experiment. M1 is still placed in LoI2. We first put M2 next to M1, and for each measurement we increase the distance between M1 and M2 by one meter, until M2 is located next to the wall. It is important to know the permeability value of the wall between the two rooms, since this value is used to predict how much the signal strength is reduced when the signal passes through the wall. The permeability value of the wall is not available, so it has to be estimated empirically. To estimate it, we perform an additional measurement next to the wall in Room B. The thickness of the wall is approximately 12.5 cm, but we do not know the material. We set the permeability values for the rooms and the wall by using the experimental values as input. We set m = −90 dBm, which is the receive sensitivity of MICAz, i.e., the lowest signal strength that the MICAz motes can receive.


[Plot: RSSI (dBm) versus distance from sensor (meters) across Room A, the wall, and Room B; model values versus experimental results.]

Figure 6.6: Comparison of received and calculated signal strength.

We use Signal Model 1, which means that we have to set the initial value P0. The initial RSSI value is obtained by measuring while the two sensors are next to each other. Based on this measurement we set P0 to 6.26 dBm.

Figure 6.6 shows the RSSI values computed with our algorithm for coverage area calculation using Signal Model 1. The dashed line shows the experimental results, while the other line shows the expected results. In Room B the measured RSSI values increase slightly while the model values decrease. We assume that this effect is due to multipath fading. This result indicates that we can find a good match between measured values and model values in Room A, but the simple signal propagation model provided by Signal Model 1 is not sufficient to correctly model the signal strength in Room B. The conclusion from the two signal propagation experiments is that radio-based signals require more extensive study. In addition, our results indicate that using radio-based sensors can lead to a considerable number of false positives, and the application programmer has to be aware of these pitfalls.

We use the experience from the first two experiments to try sensors that do not sense via radio signals. In the third experiment we exchange the MICAz motes with one web-camera. The web-camera is called Cam1_0 and is located at the bottom (Figure 6.5 a)). The walls have a permeability value of 0 for light, i.e., a monitored person in Room B cannot create false positives. We have measured the angle of the coverage area to be 45.8◦, and place the camera so that it points towards the corner where LoI1 and LoI2 are situated.

A web-camera only provides capabilities that are related to streams of images, e.g. a capability ImageStream. We are interested in more than just a stream of images. Hence, in order to provide additional information, the stream has to be aggregated with other sensors.

In order to keep the implementation of the following experiments simple, we introduce a capability called DetectMotion. DetectMotion returns a boolean value that tells whether motion is detected or not. We assume that there exist at least two different types of sensors that provide this capability: physical sensors like motion detectors, and logical sensors that use cameras merged with motion detection functionality. We focus on the logical sensor.

The logical sensor depends on the capabilities MotionDescription and ImageStream. We use the computer vision library OpenCV [ope] to provide the capability MotionDescription. OpenCV is an open-source C++ library that simplifies image processing by abstracting away the image processing algorithms; the user only needs to call a set of


[Plot: three panels (Motion in LoI1, Motion in LoI2, Motion in Area A) showing sensor Cam1_0 over time in seconds.]

Figure 6.7: Real-world experiments with only one camera covering LoI1 and LoI2.

predefined functions. Motion detection is supported by one of the sample programs in the OpenCV source code, while ImageStream is provided by the web-camera. All we need to do is extend this sample program so that it communicates with the logical sensor; the two communicate over non-blocking sockets. OpenCV obtains the image stream from the web-camera. If the OpenCV application detects motion, a boolean value is set to true; when the motion stops, the boolean value is set back to false.
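As an illustration, here is a minimal sketch of such a logical sensor source: simple frame differencing stands in for the motion detection in the OpenCV sample program, and the boolean result is written to a socket that the logical sensor reads. Camera index, host, port and thresholds are illustrative assumptions, not the thesis implementation.

import socket
import cv2

# Hedged sketch of the DetectMotion source: OpenCV-based motion detection
# feeding a boolean over a non-blocking socket. Names and thresholds are
# illustrative assumptions.
def run_motion_source(host="localhost", port=9999, camera=0):
    cap = cv2.VideoCapture(camera)            # the ImageStream capability
    sock = socket.create_connection((host, port))
    sock.setblocking(False)                   # non-blocking, as in the thesis
    prev = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.GaussianBlur(
            cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
        motion = False
        if prev is not None:
            delta = cv2.absdiff(prev, gray)   # pixel-wise frame difference
            changed = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1]
            motion = cv2.countNonZero(changed) > 500   # illustrative threshold
        prev = gray
        try:
            sock.send(b"true" if motion else b"false")
        except BlockingIOError:
            pass                              # receiver busy; drop this sample

We apply a simple query that detects motion in the two LoIs.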

[(DetectMotion == true, LoI1, 10, min 50%) ->
 (DetectMotion == true, LoI2, 10, min 50%)]

The query states that first there should be motion in LoI1, registered in at least 50% of the specified 10 seconds, i.e., 5 seconds. Afterwards, motion should be detected in LoI2 for at least 5 seconds. Note that in the real-world evaluation we specify the time in seconds and not in epochs. The instantiation of the complex query gives the following ISEC and NOISEC sets:

ISEC(LoI1) = {Cam1_0}    NOISEC(LoI1) = ∅
ISEC(LoI2) = {Cam1_0}    NOISEC(LoI2) = ∅    (6.1)
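For intuition, a minimal sketch of how instantiation could populate such sets, under the simplifying assumption that coverage areas and LoIs are represented as sets of grid cells; the data types and helper names are ours, not the thesis's:

from dataclasses import dataclass

# Hedged sketch of late binding: ISEC(LoI) collects sensors whose reduced
# coverage area intersects the LoI; NOISEC(LoI) collects sensors that provide
# the capability but do not cover the LoI. Representation is an assumption.
@dataclass
class Sensor:
    name: str
    capabilities: frozenset
    coverage: frozenset        # grid cells covered, after REDUCE

def instantiate(loi_cells, capability, sensors):
    isec, noisec = set(), set()
    for s in sensors:
        if capability not in s.capabilities:
            continue
        if s.coverage & loi_cells:   # coverage intersects the LoI
            isec.add(s.name)
        else:                        # relevant capability, outside the LoI
            noisec.add(s.name)
    return isec, noisec

With only Cam1_0 providing DetectMotion and covering both LoIs, this procedure returns the identical sets shown in (6.1).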

FPPROB for both LoIs is 0.9. Since the two ISEC sets are identical, the system will report false positives. We confirm this with three tests. First, we create motion only in LoI1, then only in LoI2. Finally, we provoke false positives by creating movement in Area A. The results are shown in Figure 6.7.


[Plot: two panels (Motion from LoI1 to LoI2, Motion from LoI1 to Area A) showing sensors Cam1_0, Cam2_0 and Cam3_0 over time in seconds.]

Figure 6.8: Real-world experiments with three cameras covering LoI1 and LoI2.

The x-axis shows the duration of the experiment and the y-axis shows the sensors. CommonSens samples the sensors once per second. The data tuples that contain the value true are shown as black squares. The plots also contain arrows that show the start time of the evaluation, when the condition is fulfilled, and when the time window of 10 seconds is finished. As expected, the query finishes successfully in all three tests. This means the spatial events are detected, but since we only use one sensor, CommonSens also reports false positives.

In the fourth experiment we use all three web-cameras. CommonSens reports that FPPROB(LoI1) is 0.61 and that FPPROB(LoI2) is 0.80. The instantiation of the complex query gives the following ISEC and NOISEC sets:

ISEC(LoI1) = {Cam1_0, Cam2_0, Cam3_0}    NOISEC(LoI1) = ∅
ISEC(LoI2) = {Cam1_0, Cam3_0}    NOISEC(LoI2) = {Cam2_0}    (6.2)
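The FPPROB computation itself is defined earlier in the thesis. One plausible reading, shown purely for intuition and not as the thesis's formula, treats FPPROB as the share of the region that would trigger the query (covered by every ISEC sensor and by no NOISEC sensor) that lies outside the LoI:

# Hedged sketch, an assumed reading rather than the thesis's definition:
# the probability of a false positive as the share of the "firing region"
# outside the LoI, with areas represented as sets of grid cells.
def fpprob(loi_cells, isec_coverages, noisec_coverages):
    firing = set.intersection(*isec_coverages)   # seen by every ISEC sensor
    for cov in noisec_coverages:
        firing -= cov                            # ...and by no NOISEC sensor
    if not firing:
        return 0.0
    return len(firing - loi_cells) / len(firing)

Under this reading, adding Cam2_0 and Cam3_0 shrinks the firing region for LoI1 and hence lowers its FPPROB, consistent with the drop from 0.9 to 0.61.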

For the first set of experiments we create motion in LoI1 until the first atomic query is satisfied. We then create motion in LoI2 until the second part of the query is satisfied. In the second set of experiments we move from LoI1 to Area A instead of LoI2.

The results of our experiments are shown in Figure 6.8. Both plots show that the two atomic

queries in the complex query are satisfied. The plots also show that the P-registration works as specified: 50% of the data tuples should match the condition, but they do not have to form a consecutive run of matching data tuples. The plots in Figures 6.7 and 6.8 show this. We conclude that even with several sensors, it is important that the LoIs are approximated carefully.
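To make the P-registration semantics concrete, here is a sketch of the window check; the per-second boolean representation and function name are our simplification of the thesis's evaluator:

# Hedged sketch of P-registration: at least min_fraction of the samples in a
# window must match the condition; the matches need not be consecutive.
def p_registration(samples, window, min_fraction):
    """samples: per-second condition results (booleans) for one atomic query."""
    observed = samples[:window]
    if len(observed) < window:
        return False                       # window not complete yet
    return sum(observed) / window >= min_fraction

# "10 seconds, min 50%" as in the office query: 5 scattered matches suffice.
matches = [True, False, True, True, False, True, False, True, False, False]
assert p_registration(matches, window=10, min_fraction=0.5)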


The fifth experiment is performed to show how obstacles in the environment stop the signals. We put a cardboard box in front of Cam3_0. When updated with this new information, CommonSens changes the ISEC sets for both LoIs to Cam1_0 and Cam2_0. We perform similar movement patterns as in the fourth experiment. The system does not manage to differentiate between the two workloads, since both remaining web-cameras cover both LoIs. The result is as expected.

With our first five experiments, we first show an obvious effect of radio-based sensors: their signal passes through walls and can lead to false positives. Furthermore, our results show that a simple signal propagation model is not sufficient to perfectly calculate coverage areas of radio-based sensors. However, for objects with permeability 0, coverage area calculation works correctly. Finally, we show that CommonSens manages to handle sensor readings in close to real-time, and automatically chooses sensors based on the current sensor setup and query.

The experiments have shown that if the FPPROB values for the LoIs are too high, we simply cannot trust the results from the complex queries. This also applies to situations where the ISEC and NOISEC sets are not unique, i.e., two or more LoIs are approximated by equivalent ISEC and NOISEC sets. CommonSens relies on correct sensor placement. A future extension of CommonSens is to include a test in the instantiation phase of the complex queries that verifies unique LoI approximation. However, there is still a probability of false positives. Given the environment instances, CommonSens managed to detect the complex events correctly. Therefore, we can also conclude that the complex events are detected correctly in these experiments.
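A sketch of the uniqueness test proposed above, flagging LoIs whose ISEC and NOISEC sets coincide; function and type names are ours:

# Hedged sketch: two LoIs approximated by identical (ISEC, NOISEC) pairs
# cannot be distinguished by the instantiated query, so report such pairs.
def find_ambiguous_lois(approximations):
    """approximations: dict mapping LoI name -> (frozenset ISEC, frozenset NOISEC)."""
    seen = {}
    ambiguous = []
    for loi, sets in approximations.items():
        if sets in seen:
            ambiguous.append((seen[sets], loi))   # pair of indistinguishable LoIs
        else:
            seen[sets] = loi
    return ambiguous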

Hallway Environment Design, Methods and Results

With the hallway experiment we show that CommonSens manages to handle a larger number of sensors. We have equipped the office hallway with nine IP-cameras, named Cam1_0 to Cam9_0, which provide the capability ImageStream. We use the same logical sensor as in the previous section, with OpenCV as the provider of the capability MotionDescription. An overview of the hallway and the cameras is shown in Figure 6.9. The overview of the hallway is taken directly from the environment creator in CommonSens. We have included pictures that show the views from the cameras as well. For example, Cam1_0 is located on top of the entrance to the hallway, while Cam9_0 is directed towards a wall, covering the LoI CoffeeMachine. In addition to the LoI CoffeeMachine, the environment consists of the LoIs HallwayInner, HallwayMain and HallwayTurn.

We use a simple complex query that aims to investigate movement between the four LoIs:

[(DetectMotion == true, HallwayInner, 2, min 50%) ->
 (DetectMotion == true, CoffeeMachine, 2, min 50%) ->
 (DetectMotion == true, HallwayMain, 2, min 50%) ->
 (DetectMotion == true, HallwayTurn, 2, min 50%)]

For each atomic query it is sufficient to observe movement for one second. The instantiation of the complex query gives the following ISEC and NOISEC sets:


Figure 6.9: Overview of the hallway and location of cameras.


[Plot: sensors Cam1_0 to Cam9_0 on the y-axis against time (seconds 20 to 280) on the x-axis.]

Figure 6.10: Results from the hallway experiment.

ISEC(HallwayInner) = {Cam2_0, Cam7_0, Cam8_0}    NOISEC(HallwayInner) = {Cam9_0}
ISEC(CoffeeMachine) = {Cam2_0, Cam7_0, Cam8_0, Cam9_0}    NOISEC(CoffeeMachine) = ∅
ISEC(HallwayMain) = {Cam2_0, Cam5_0, Cam6_0, Cam8_0}    NOISEC(HallwayMain) = {Cam1_0, Cam3_0, Cam4_0}
ISEC(HallwayTurn) = {Cam1_0, Cam5_0, Cam6_0, Cam8_0}    NOISEC(HallwayTurn) = ∅    (6.3)

The probabilities for false positives are as follows:

FPPROB(HallwayInner) = 0.79
FPPROB(CoffeeMachine) = 0.56
FPPROB(HallwayMain) = 0.70
FPPROB(HallwayTurn) = 0.55    (6.4)

We perform the experiment by creating motion in the four LoIs. With nine IP cameras, we experience that the response is slow. Even though CommonSens is told to pull the cameras once


every second, it often takes more time to obtain a data tuple. Figure 6.10 shows the results from the experiment. The black squares show when the sensors return true, i.e., we do not include the data tuples that report false. The figure shows that there is a match between the positive readings and the ISEC and NOISEC sets. The atomic queries are matched at t43, t128, t224 and t275. The arrows in the figure are created manually and simply show when the processing of the next atomic query started. There is no movement before t28, hence this is where the plot starts. The experiment lasts for 280 seconds and returns large trace files. We have included the trace files for this experiment in A.3.1 to show what the data tuples look like when they are pulled from the sensors.

The hallway experiment indicates that CommonSens manages to handle larger scenarios

as well. However, we have experienced that a complex logical sensor, like the one providing DetectMotion based on an IP camera and OpenCV, has a lower sampling frequency. The trace files in Section A.3.1 confirm this. CommonSens has to adapt to this sampling frequency in order to obtain more accurate results. An additional issue relates to the complex query: it only required one matching set of data tuples in order to match the atomic queries. We are aware that this P-registration and δ-timing is simple and does not sufficiently show that the P-registration works well. Despite this issue, we experience that CommonSens manages to detect the complex event and to communicate with real sensors.

6.1.3 Trace File Evaluation

We use trace files obtained from the work of Cook and Schmitter-Edgecombe [CSE09]. We perform this evaluation to show that CommonSens also manages to read trace files from related work and detect complex events and deviations. If CommonSens can use these trace files, it is possible to compare CommonSens with other automated home care systems. In addition, reading trace files is also a way to acid-test the system and to identify new issues that we have not yet addressed in CommonSens.

Several sensors are placed inside a home, and subjects are told to follow given patterns. For

instance, one of the patterns relates to talking on the telephone. The monitored person should look up a specified number in a phone book that is taken from the shelf, call the number, and write down the cooking directions given on the recorded message. In addition, the monitored person should put the phone book back on the shelf. These trace files are training data created in order to use statistical methods, e.g. for detecting deviations. This means that Cook and Schmitter-Edgecombe have another approach than CommonSens, but we can still use their trace files.

One important issue is that CommonSens uses the concept of LoIs. This concept is not

supported by related work. Hence, the data set from Cook and Schmitter-Edgecombe does not include LoIs, and we could not include this concept in the queries that describe the order of the phone event. In addition, it is not clear how the motion detectors in the environment work, whether they have a coverage area or whether they are touch-based. Touch-based motion detectors are actually switches that are turned on if the monitored person steps on them. Such sensors are usually placed in a carpet or underneath the floor boards. We have interpreted the motion detectors as having coverage areas. Therefore, we have created a LoI ByTheTelephone, which is covered


by a motion detector.

We have used one of the trace files as a template to define the duration of the correct pattern.

The expected duration of the complex event is set to be 119 seconds, but only 10% of the pulls result in a matching data tuple. We use the during concurrency class to define the complex event:

[during([(DetectPhoneBookPresent == ABSENT) ->
         (PhoneUsage == START) ->
         (PhoneUsage == END) ->
         (DetectPhoneBookPresent == PRESENT)],
        [(DetectMotion == ON, ByTheTelephone, 119, min 10%)])]

When we evaluate the template trace file, the query is successfully processed. In order to detect a deviation, we have used another trace file that does not contain the same pattern; the monitored person does not put the phone book back on the shelf. This should give a deviation. After 119 seconds the window ends with a match of 7.6%, which results in a deviation.
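As an illustration of the replay mechanism, here is a sketch of turning such a trace file into a stream of data tuples. The whitespace-separated date/time/sensor/value line layout is an assumption about the trace format, made for illustration only, as is the hypothetical evaluator call at the end.

from datetime import datetime

# Hedged sketch: replay a trace file as (timestamp, sensor, value) tuples.
# The "date time sensor value" line layout is an assumed format.
def read_trace(path):
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) < 4:
                continue                  # skip malformed lines
            date, time, sensor, value = parts[:4]
            yield datetime.fromisoformat(f"{date} {time}"), sensor, value

# Feeding the evaluator then amounts to iterating the tuples in order:
# for ts, sensor, value in read_trace("phone_task.trace"):
#     evaluator.process(ts, sensor, value)   # hypothetical evaluator API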

Although CommonSens manages to read the trace files, we can only use one trace file to create a complex query. This forces the monitored person to follow very strict patterns, and humans tend to be more dynamic than what our complex queries support. In the example above, it would have been necessary to create one complex query for each of the trace files and run all the complex queries concurrently. However, the trace files from Cook and Schmitter-Edgecombe are used to generate statistical evaluations of patterns, which is not currently supported by CommonSens. We conclude that CommonSens manages to read trace files. On the other hand, human behaviour is not static, and there is a need to extend CommonSens with functionality that supports this, e.g. by supporting the complex queries with training data that includes the variations in behaviour. This is further discussed in Section 7.3.

6.2 Scalability and Near Real-Time Event Processing

It is important that the system detects the events in near real-time, i.e., that the system detects events when they happen. In this section we evaluate CommonSens with respect to near real-time detection of events, performance and scalability. We want to answer two questions through our experiments:

1. How does the number of sensors to be evaluated influence the processing time? In some applications there might be a need for a considerable number of sensors.

2. How does the complexity of queries affect the processing time?

Design and Method

To answer the first question, we need to increase the number of sensors that an instantiated complex query uses. We select one of the complex queries that we have already used in the functionality tests, cq46.qry, since it provides consecutiveness, timing, δ-timing and P-registration. We use environment e1.env and the workload m25.mov. We choose m25.mov since it matches the


[Plot: processing time (milliseconds) versus time steps for Experiments #1 to #5.]

Figure 6.11: Processing time with 6, 66, 126, 186, and 246 sensors in the environment.

complex query correctly. The parameter triple corresponds to the triple that functionality test 176 uses.

To ensure that the number of evaluated sensors increases at every evaluation step, for each

experiment we add ten additional sensors per original sensor. Thus, in the first experiment we start with the six sensors that provide the capability DetectPerson, and we perform four additional experiments with 66, 126, 186, and 246 sensors in total. The new sensors inherit the shapes and capabilities from the sensors that are already there.

The second question is answered by increasing the number of concurrent queries that are

processed. To answer this question we use the complex query cq47.qry together with the environment e1.env and workload m30.mov. We increase the number of ∧ operators and atomic queries by adding ∧ between copies of the atomic query (DetectPerson == Person1, LoI5). In practice this means that when we add ∧ operators and atomic queries, we get the following complex query:

[during([(DetectPerson == Person1, LoI5) && ... &&
         (DetectPerson == Person1, LoI5)],
        [(DetectPerson == Person1, LoI6)])]

We increase from 0 to 50 ∧ operators. Even though it is not realistic that the monitored person should be detected in the same LoI up to 51 times, we show that our system manages to handle that number of concurrent atomic queries. In this experiment we keep the number of sensors fixed and use the original installation with two sensors.


We run each of the experiments 10 times to get an average processing time. The average processing times for the first experiment are shown in Figure 6.11. The x-axis shows the time steps of the simulation and the y-axis shows the processing time in milliseconds (ms).

Results

The results in Figure 6.11 show the processing time, and it is clear that the processing starts at once. This corresponds with the movement pattern, which spends the first three steps in LoI1. At the fourth step the monitored person moves to LoI2 and stays there for 17 steps. However, the processing of the second atomic query does not start until the tenth step in the experiment. This corresponds with how the max operator is implemented: the query evaluator never stops evaluating an atomic query until te has been reached, because the data tuple selector might send new data tuples that match the condition in the atomic query and perhaps violate the max limit. However, there are no additional matching data tuples, which means that there is not much processing done. The evaluation of the second atomic query starts at t10 and continues until t19. Note how the evaluation of the second atomic query continues even though the min value is reached. This is because the data tuples match the condition, which is a result of how the query evaluation is implemented: the batch of data tuples is matched against the condition before the temporal conditions and P-registration are evaluated. This needs to be fixed in future upgrades of the implementation. For timed queries the data tuple selector does not start sending data tuples before the timestamp matches tb. This is seen at t20, where there is time-consuming query processing. At t19, the monitored person moves to LoI3 and stays there for three steps. The max in the third atomic query is correctly matched at t22, when the monitored person moves back to LoI2. The monitored person stays in LoI2 for seven steps before moving to LoI4. According to cq46.qry the fourth atomic query is timed and the evaluation should start at t31. This corresponds to the plot in Figure 6.11.

The maximum processing time for 246 sensors is 4 milliseconds. This happens at t39 when

the complex query is finishing. However, since the fourth atomic query only investigates a single LoI, the maximum number of sensors involved is 41, which is the maximum number of sensors that cover one single LoI. The data tuple selector sends a batch of 41 data tuples to the current Box object. Since there is only one atomic query, the time consumption shown in the plot is the time it takes to iterate through the batch and compare it to the condition. Since all the data tuples match the condition in the atomic query, the whole batch of data tuples is iterated.

The results from the second experiment are shown in Figure 6.12. According to m30.mov,

the first 10 steps are covered by S6 only. The data tuple selector has to send the data tuples to a ConcurrencyBox object. Note that there is a slight difference in processing time between the five experiments. At this point we cannot explain why this happens; the processing time should have been similar for all the experiments, since only one atomic query is evaluated at that time. For the first experiment the average time consumption between t1 and t10 is 2.4 milliseconds, while for the fifth experiment it differs slightly. At t10, the time consumption increases considerably, but it seems like the time consumption increases linearly with the number of ∧ operators. Except for an outlier at t18, which is at 27 milliseconds, the average processing time in the fifth experiment is approximately 11 milliseconds, while for the first experiment it is 0.58 milliseconds. The linear increase can be


[Plot: processing time (milliseconds) versus time steps for 1, 11, 21, 31, 41 and 51 concurrent atomic queries.]

Figure 6.12: Processing time with an increasing number of concurrent queries.

explained by how the Box object handles ∧-lists. As stated in Chapter 5, all the atomic queries that are related by ∧ or ∨ operators are located in a Box object that the data tuple selector sends data tuples to. What we see is the effect of the iteration through the ∧-list. This means that we do not see the iteration through the batch of data tuples, because the data tuple selector only sends one data tuple to the box. In addition, we do not see any iteration through the ∨ operators, since there is only one. This means that the experiments have not shown the full complexity of the event processor, i.e., the worst-case processing time.
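As an illustration of that iteration, a sketch of a Box evaluating an ∧-list; class and method names are ours, not the thesis's API:

# Hedged sketch: one incoming data tuple is checked against every conjunct,
# so the work per tuple grows linearly with the number of ∧ operators.
class AndBox:
    def __init__(self, conditions):
        self.conditions = conditions           # one predicate per atomic query

    def on_tuple(self, data_tuple):
        # Uniform matching: the tuple must satisfy every conjunct.
        return all(cond(data_tuple) for cond in self.conditions)

# 51 copies of the same atomic condition, as in the scalability experiment.
cond = lambda t: t.get("DetectPerson") == "Person1" and t.get("loi") == "LoI5"
box = AndBox([cond] * 51)
print(box.on_tuple({"DetectPerson": "Person1", "loi": "LoI5"}))   # True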

All the functionality tests are also timed. The time consumption is evaluated by measuring the average processing time of each atomic query. The plot is shown in Figure 6.13. The average processing time is 0.2 milliseconds, with an outlier at functionality test 87 of 12 milliseconds. However, the average processing time for functionality test 87 is 0.64 milliseconds, which does not differ much from the total average.

Based on the experiments we conclude that the processing time is sufficient for real-time

detection of complex events. The experiments show that even for the highest workload, the event processing part of CommonSens handles real-time processing of data tuples very well. In our application domain it is not CommonSens that is the bottleneck. However, we have not investigated the time consumption of pulling the sensors, since CommonSens supports all types of sensors. If the sensors are slow, i.e., they have a low sampling frequency, this will affect CommonSens. Note that the current implementation of CommonSens does not fully support varying sampling frequencies.


[Plot: average processing time (milliseconds) versus functionality test number.]

Figure 6.13: Average processing time for atomic queries in the functionality tests.

6.3 Personalisation and User Interface

Our final claim is that CommonSens simplifies the work for the application programmer and provides personalisation. CommonSens does this by, e.g., reducing the amount of work related to sensor placement, query writing and detection of complex events and deviations. In this section we first investigate the personalisation claim: CommonSens queries can easily be used in many different environments. We demonstrate the low effort required from the application programmer to personalise the queries through a use-case study and an example taken from the hallway experiment in Section 6.1.2. Second, we discuss the simplicity of the user interface.

Personalisation and Support for the Application Programmer

Personalisation is one of the strengths of CommonSens. To simplify the work of the application programmer, CommonSens aims to reuse complex queries in several environments. This is possible because the query language allows the application programmer to address abstract concepts like capabilities and LoIs instead of specific sensors. This means that the application programmer has to perform only small changes in the queries, if any, to match a new home. In the following, we study two scenarios to evaluate the personalisation and to show how this process works in CommonSens.

First, we show that the personalisation can be performed by changing only a few parameters.

We focus on queries related to detecting the activities falling and taking medication (see Figure 3.5). These activities should be detected in two different instances with minimal rewriting of


the queries. The instances are excerpts from two environments taken from related work: the WSU smart home project [CSE09] and MIT's PlaceLab apartment [ILB+05]. The instances are shown in Figure 6.14 and are realised through our proof-of-concept implementation. The instances are equipped with sensors from our sensor model. The walls are all objects with permeability value 0 for light and 0.01 for radio signals.

For fall detection, timing is not relevant because a fall can happen at any time of the day.

Hence, the temporal properties are not specified. The fall can also happen anywhere in the home, and CommonSens has to constantly pull the sensors that provide FallDetected. The query that detects the fall is then very simple: (FallDetected == personID). The value personID identifies the person, and must be updated for the particular instance. The personalisation process is done when CommonSens investigates the available sensor configurations that provide the capability FallDetected and checks whether the current instance provides these sensors. If the sensors are provided, CommonSens instantiates the query and starts reading the data tuples from the relevant sensors. If not, CommonSens informs the application programmer that the query cannot be instantiated and shows the list of sensors that need to be in the environment. Note that, as described in Section 5.2.2, the current implementation of CommonSens requires that when the LoI is not specified, all the sensors that provide the capability have to send data tuples that match the condition. For the example above, it would have been more appropriate to require that only one of the sensors sends data tuples that match the condition. A consistent definition of this behaviour remains an open issue.

In order to detect that the monitored person takes medications, it is sufficient to have sensors

that provide the capability TakingMedication, and which return the type of medication that has been taken. If the monitored person should take several medications, it is sufficient to use the ∧ operator between each of the medication types, as long as they are taken at the same time. This is because of the way we have implemented ∧, i.e., it only accepts uniform matching. If they should be taken in a given order, the → relation can be used. The DURING concurrency class can be used to describe the temporal relation. The first part of the query identifies that the medications are taken, while the second part identifies that the monitored person is within the LoI related to the medication cupboard. This LoI is called MedCupboard and is defined with different coordinates for the two environments. The query that the application programmer writes is based on this template:

during(((TakingMedication == Med1_0, timestamps) -> ... ->
        (TakingMedication == MedN_0, timestamps)),
       (DetectPerson == personID, MedCupboard, timestamps))

In order to show that two different types of sensors can provide the same capabilities, we have placed two cameras in the kitchen (Figure 6.14 a)) and RFID tags in the bathroom (Figure 6.14 b)). We also show the LoI that the complex query addresses. The camera in the medicine cupboard covers the LoI MedCupboard. The coverage area of the two cameras has an angle of 90◦, and the coverage area of the camera inside the cupboard is reduced by the panels. In the bathroom, MedCupboard is covered by three active RFID tags named Med1_0, Med2_0 and


Figure 6.14: Two environments with different setups.

Med3_0. The three tags are attached to the medication and provide the capabilities TakingMedication and DetectPerson. Affected by the walls in the medication cupboard, the coverage areas of the tags are shown as small irregular circles. This is automatically calculated by CommonSens based on the signal type and the permeability values of the walls. The wrist-worn RFID reader of the monitored person returns the correct readings when it is within the coverage areas of the tags.

By using the query above, the application programmer only needs to personalise the types of

medications, the timestamps and the coordinates of the LoIs. For instance, the monitored person Alice should take her medication in a given order between 08:00h and 09:00h every day. This process should take a maximum of six minutes, and taking each medication should take one minute. The last part of the query can then be rewritten as (DetectPerson == Alice, MedCupboard, 08:00h, 09:00h, min 6%). The timestamps in each of the queries addressing TakingMedication are rewritten to, for instance, (TakingMedication == Med1_0, 1m). For the monitored person Bob, each of the medications should be taken at different times during the day. The application programmer needs to write one query for each of the medications. For each of the queries the timestamps and coordinates of the LoIs are simply updated to match the required temporal behaviour. Finally, CommonSens performs the late binding and defines the ISEC and NOISEC sets.
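For illustration, personalisation of such a template can be thought of as simple parameter substitution; the helper below and its parameter names are our own sketch, not part of CommonSens:

# Hedged sketch: instantiate a shared medication template for one person.
MED_TEMPLATE = ("during(((TakingMedication == {med}, {med_time})), "
                "(DetectPerson == {person}, {loi}, "
                "{t_begin}, {t_end}, min {pct}%))")

def personalise(person, loi, t_begin, t_end, pct, med, med_time):
    return MED_TEMPLATE.format(person=person, loi=loi, t_begin=t_begin,
                               t_end=t_end, pct=pct, med=med, med_time=med_time)

# Alice: medication at the cupboard between 08:00h and 09:00h.
print(personalise("Alice", "MedCupboard", "08:00h", "09:00h", 6,
                  "Med1_0", "1m"))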

Personalisation in Real-world Experiments

Finally, we show a simple personalisation example from the hallway experiment. We show how we can define the coordinates for a LoI and place the LoI in different locations. CommonSens adapts the query to the environments by changing the approximation of the LoI. We use a complex query that addresses a LoI called Hallway. Figure 6.15 shows excerpts from the


Figure 6.15: Excerpts from the hallway with the new LoI Hallway.

hallway where the LoI is placed. Note that the coordinates are the same as the coordinates for the LoIs in Figure 6.9. The complex query investigates movement in Hallway. The condition only needs to be true in one data tuple in order for the query to finish correctly.

[(DetectMotion == true, Hallway, 2, min 50%)]

The ISEC and NOISEC sets are equal to the ones for the original LoIs. We run the experiments by generating motion in the LoI Hallway. All queries are matched successfully, which implies that CommonSens manages to adapt to different environments. In addition, the results in Figure 6.16 show that the query processing works as well. Each plot shows the data tuples until the query evaluation stops. Figure 6.16 a) shows the result from the LoI that was originally HallwayMain. The query evaluation stops when Cam2_0, Cam5_0, Cam6_0 and Cam8_0 report true. This corresponds with the ISEC set of HallwayMain. Figure 6.16 b) corresponds to HallwayTurn. Figure 6.16 c) shows the activity in the NOISEC set as well; the query evaluation stops only when Cam9_0 stops reporting true at t23. Finally, in d) the query is actually matched at t19, where all the sensors in the ISEC set report true. However, the evaluation does not stop until t20, since this is how min is implemented in CommonSens. The difference between Figure 6.16 d) and the others is that we were still in the area of the LoI at t20. This explains why we do not see any of the other three queries continue the evaluation after the query is matched: there was no motion in the ISEC and NOISEC sets after the condition was matched.

Based on the use-case study and the hallway experiment, we conclude that CommonSens

manages to adapt to different environments.


[Plot: four panels showing sensors Cam1_0 to Cam9_0 over time: a) Hallway = HallwayMain, b) Hallway = HallwayTurn, c) Hallway = HallwayInner, d) Hallway = CoffeeMachine.]

Figure 6.16: Results from the four LoIs that are turned into Hallway.

User Interface

This section explains the steps the application programmer takes to evaluate a complex query in an environment in CommonSens. We end this section by referring to a small user study that we have performed in our labs.

The user interface is GUI-based and is designed with simplicity and intuition in mind. Even

though the current implementation only allows the application programmer to create the environment in 2D (see Figure 5.7), it shows the possibilities of creating the environment. Currently, the application programmer has to write the complex queries outside of CommonSens and load them by pushing the 'Open Query' button. Therefore, future work consists of supporting query writing through the CommonSens implementation and allowing auto-completion of the queries, i.e., letting CommonSens suggest capabilities, values and LoIs based on their availability in the repositories. When the environment is chosen, the sensors are placed, and the complex query is chosen, the application programmer can choose whether to run simulations or monitoring. When running simulations, CommonSens pulls the virtual sensors as fast as possible using an internal clock rather than the real point of time. Monitoring is real-time and uses real workload. By using the GUI to create simple movement patterns, the application programmer can investigate whether the queries work as they should or need to be modified. CommonSens sends messages about the query if the parser encounters any problems.

We evaluate the simplicity of the user interface by running a small user study where five

users are requested to perform four simple tasks:


1. Open an environment file, a query and a movement pattern. Simulate the movement pattern in the environment.

2. Move around the objects in the environment and toggle the coverage of the sensors. Toggling will show the real coverage area.

3. Change the size and rotation of some of the objects in the environment.

4. Save the environment after working with it.

The response is positive, and the test users report that the GUI is simple and intuitive. The first task is performed without any problems. On the other hand, the users report some limitations in how the objects can be moved, and it is hard to see which object is chosen. In addition, the animation of the movement pattern is not shown. These issues are easy to improve.

6.4 Discussion and Conclusion

We have evaluated the three claims that we made in Chapter 1.

Claim 1: CommonSens Detects Complex Events and Deviations

We divide the evaluation of Claim 1 into three parts. The first part is a functionality test of five language constructs. The query language is used to describe events, and if each of these language constructs works as designed, CommonSens manages to detect complex events and deviations. We combine different environment instances and synthetic workloads with complex queries that address combinations of language constructs. Timing is very important in CommonSens, and the related language construct is indirectly evaluated in every complex query. The workloads either match the complex queries, deviate from the complex queries or do not start the evaluation of the complex query at all. We compare the results from 182 functionality tests to the expected results and show that all tests completed successfully.

Since we evaluate Claim 1 by running measurements on a set of functionality tests, we

cannot conclude more than what the results show, but by combining the language constructs we evaluate a representative selection of complex events. An analytical approach would probably have shown that CommonSens manages to detect all types of complex events and deviations [Jai91] which the implementation supports.

The second part of the evaluation of Claim 1 is to evaluate whether CommonSens manages to

detect complex events from real sensors. This includes an evaluation of the LoI approximation, which shows that radio-based sensors can return unexpected results since they send signals through walls. It is also important that the approximation for each LoI is unique, i.e., that two or more LoIs are not approximated by two equivalent sets of sensors. Despite these experiences, CommonSens manages to detect the complex events correctly.

In the third part we show that CommonSens manages to read trace files from related work

and detect complex events and deviations from these sets. This feature is important when we want to acid-test CommonSens and compare it with other automated home care systems.


Claim 2: CommonSens Processes Data Tuples in Near Real-Time

We evaluate the near real-time support of the query processor by running two separate tests. First, we increase the number of sensors that approximate the LoIs and show that CommonSens manages to handle input from several sensors while still detecting complex events in near real-time. Second, we increase the number of concurrent queries. The results show that the increasing number of concurrent queries does not affect the processing time significantly.

Claim 3: CommonSens simplifies the work for the application programmer and provides personalisation

Simplicity is hard to measure, but we support our claim by performing a user test in our labs. We evaluate the personalisation by first showing a use-case involving complex events like fall detection and medication taking. Second, we indicate through a real-world experiment that CommonSens manages to automatically adapt to different environment instances. Finally, the user test confirms that the user interface is simple and intuitive. We discuss the user interface, compare it to the requirements from the application programmer, and show that the requirements are met.

Based on the evaluation of CommonSens, our conclusion is that our three claims are sufficiently evaluated and supported.


Chapter 7

Conclusion

In this chapter we conclude this thesis and summarise our contributions. In addition, we present a critical review of our claims. We finally point out open problems and directions regarding future work.

7.1 Summary of Contributions

Based on our overall goal of simplifying the work for the application programmer, we have modelled, designed, implemented and evaluated CommonSens, a multimodal complex event processing system for automated home care. Automated home care is an emerging application domain, and there exist many approaches that aim to solve the issues related to this domain. For instance, there exist proprietary solutions that use sensors to detect ADLs in the home, and there exist simpler solutions where single sensors report incidents that they are programmed to detect. An example of the latter is an accelerometer that detects whether the monitored person has fallen: if the monitored person falls, the sensor reacts because the accelerometer values reach a predefined threshold. In the chapter concerning background material and related work we have presented the most relevant work in the field. In addition, we have presented the technologies that CommonSens relies on: sensor technology and complex event processing. To the best of our knowledge, there exist no systems that provide solutions for the issues that we have addressed in this thesis.

A worst-case scenario for the application programmer is to manually write queries for all

the monitored persons, who might have homes that are equipped with different types of sensors. One very important aspect to consider is the fact that there are many similarities between the instances. In many instances, fall detection is important, so in the worst-case scenario the same type of query has to be written several times. In order to avoid addressing all types of sensors that detect falls, the query language in CommonSens allows the application programmer to address the capability of the sensors instead. CommonSens automatically investigates a virtual instance of the home and binds the query to the sensors that provide the capabilities. Hence, in this thesis we have shown that it is possible to provide an open and extensible automated home care system which still simplifies the work for the application programmer.

Throughout our work we have made important contributions to the automated home care


application domain. The first contributions in our work are models for events, environments and sensors, introduced in Chapter 3. The event model distinguishes between atomic and complex events and introduces the concept of LoIs, i.e., the spatial properties of events. In contrast to related work, we have an explicitly defined event model: events are only those states and state transitions that someone has declared interest in. The event model is simple, yet it covers the aspects that we are interested in, i.e., determining whether states or state transitions match conditions in queries. We divide the sensor model into three types of sensors: the physical sensor obtains state values from the environment, the external source provides stored data, and the logical sensor aggregates data tuples from a set of other sensors. For the application programmer, the sensors are addressed through their capabilities. This means that when the application programmer writes queries, the sensors are never addressed directly. This allows for abstract queries that can apply to many different homes.

The properties of the environment model define the physical shapes of the objects that make

up a home. In addition, the properties contain information about how the objects affect signals. Our second contribution is to let the application programmer use the environment properties in a proactive way when placing the sensors in the home. In contrast to related work, we operate with sensor coverage areas that are reduced by the environment, which makes sensor placement more realistic. First, we show how to model that the permeability values of the objects affect the signal strength. Second, we show how CommonSens uses the reduced coverage areas to calculate the probability of false positives. Third, the instantiation of the queries is the process where the capabilities that are used in the queries are bound to the sensors in the home. Fourth, we show how CommonSens does this by populating the two sets ISEC and NOISEC. Note that, in order for the sensor placement to be correct, it is required that the environment is sufficiently instantiated and that the signals are modelled with realistic propagation models. This is especially relevant for sensors using radio signals, e.g. RFID reader/tag pairs.

Through the query language, the application programmer can write queries that address

complex events. Through our third contribution, i.e., deviation detection, the query language allows the application programmer to write queries that instruct CommonSens to detect deviations. When the ADLs deviate, CommonSens should report that something is wrong. This is a simpler approach than using queries to describe everything that can go wrong in the home: the application programmer simply has to query the expected ADLs, and when the monitored person does not follow these rules, CommonSens interprets this as a deviation.

CommonSens operates in a life cycle that consists of using the models for sensor placement,

query instantiation, and query processing and evaluation. When there is a need for change, e.g. when the monitored person needs more monitoring, CommonSens enters a system shutdown phase before the life cycle continues. This cyclic approach allows CommonSens to be extended when required.

7.2 Critical Review of Claims

We have developed a set of new concepts and models. This set is the foundation for the design and implementation of CommonSens. The implementation is used in a set of experiments in order to evaluate whether we achieve our claims. Through the evaluation we use simulations


based on synthetic workload and trace files. We also use real-world experiments with real sensors and real-time CEP. Our claims are as follows:

Claim 1: CommonSens detects complex events and deviations

In order to support this claim, we have used CEP and combined concepts from this technology with our models. This is especially related to the introduction of concepts like our query language, coverage areas, LoIs and approximation through ISEC and NOISEC sets. All these concepts are domain specific and related to detection of events and deviations in the home. First, in order to evaluate that the query language can be used to support our claim, we systematically design tests for each query language construct. The tests use workloads that both match and do not match the queries. With this approach, we verify that the detection of the complex queries is correct and that deviations are correctly detected as well. Second, we evaluate the claim by performing experiments with real sensors. These experiments show that the complex events are detected; however, we have not investigated the deviation detection in these experiments. In all the experiments, coverage areas and LoI approximation are evaluated by using more than one sensor to approximate the LoIs and to detect the events and the deviations. We show that, given an environment with sensors, CommonSens manages to instantiate the ISEC and NOISEC sets correctly.

Claim 2: CommonSens processes data tuples in near real-time

In order to support this claim, we have designed CommonSens so that the number of data tuples that have to be evaluated is minimised. This is done through a pull-based model, i.e., CommonSens only pulls those sensors that are relevant for the query. The relevant sensors are identified by the data tuple selector by investigating the ISEC and NOISEC sets. Through a set of two separate experiments we show that CommonSens is scalable with respect to an increasing number of sensors and queries. We show that CommonSens manages to process all queries in near real-time. In addition, this claim is also evaluated and supported during the evaluation of Claim 1: those experiments show that all the data tuples are processed in near real-time as well.

Claim 3: CommonSens simplifies the work for the application programmer and provides personalisation

In order to support this claim we have introduced abstractions that help simplify the work for the application programmer. First, we provide a query language that lets the application programmer address LoIs and capabilities. Through LoIs, the application programmer only has to describe general spatial properties, and through capabilities, the application programmer does not have to address the sensors directly. This simplifies the personalisation, i.e., adapting a query plan to the current home, sensors and monitored person. CommonSens investigates the environment and available sensors, and instantiates the queries based on late binding. The late binding is done automatically by finding the proper sensors based on their capabilities and placement. The sensor placement can be done interactively, i.e., the application programmer gets feedback from CommonSens regarding the probability of false positives. In addition, all


the sensors and objects can be obtained and reused from a repository. In addition to showing that CommonSens supports these concepts, we have performed a small user study where we asked a number of persons to use CommonSens to perform a simple task and evaluate the experience. The response from the user study was positive; however, there is a need for more extensive user studies to fully support our claim.

7.3 Open Problems and Future Work

Despite all our contributions and the fact that we have solved our problem statement, there are still some interesting issues that have to be investigated further. In this section we present some of these issues. First, we address the open problems, e.g. concepts that are not yet supported in CommonSens. Second, we address new issues, i.e., interesting new directions based on the achievements we have made in CommonSens.

7.3.1 Open Problems

Due to time limitations, we have focused on implementing the core functionality needed to evaluate our claims. In order to provide a fully fledged product, some functionality remains to be implemented. For instance, only one complex query can run at the same time. Note that this does not involve concurrency; the concurrency classes DURING and EQUALS are implemented, and form a template for the implementation of the remaining concurrency classes.

As noted in Chapter 5, the support for expressions using the ∧ and ∨ operators is limited.

An open problem is to exchange the current ∧-list structure with a structure that is better suited for supporting complex first-order logical expressions. During the evaluation of deviations, CommonSens was instructed to report both successful event processing and deviations. This is sufficient for evaluation of the event processing and the deviation detection. However, what remains to be implemented is the dev operator, so that the application programmer can explicitly specify that deviations are of interest.

Through our experiments we have observed that there remains some work related to signal

propagation. We do not support multipath fading, which includes reflection of signals. This is a very important issue, since most signals are reflected by objects; some signals, like radio signals, both reflect from and pass through an object. Signal propagation is complex, and correct models have to be implemented in order for sensors to be placed more appropriately. Regarding sensor placement, CommonSens does not yet automatically approximate the LoIs; the sensor placement has to be done by the application programmer through the CommonSens GUI. Automatic sensor placement requires cost factors like available sensors and price, since a perfect approximation of a LoI will use a considerable number of sensors, which is not practical in a real scenario.

Finally, we need to define capabilities more precisely, i.e., we need to find a data model

that can describe the properties of capabilities. Currently we use a simple text-based approach. However, there is a need for more sophisticated data structures to describe the capabilities more adequately. A possible approach is to investigate service-oriented architecture (SOA) and see how loosely coupled interfaces are defined to provide different kinds of services.


7.3.2 Future Work

Through our work, we have found two issues that need to be investigated further and that build on the current state of CommonSens.

First, we assume that all the readings from the sensors are correct. In the real world, this is not a realistic assumption. Sensors are unreliable, and in order to obtain correct readings, several operations have to be performed. For instance, it is important to identify when the quality of information is sufficient. This means that even though the information is not complete, the missing readings from the sensors can be reconstructed by performing calculations on the readings that we already have. In CommonSens we use several sensors to approximate LoIs, but sensor fusion, i.e., combining the information from sensors that cover the same areas in the environment, can also increase the quality of information, which indicates how good the information from the sensors is compared to the ground truth. The ground truth is defined by the states and state transitions in the real world.
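As a minimal sketch of such a fusion step (the SensorFusion class and fuse method are hypothetical and not part of CommonSens), readings from sensors covering the same area could be combined by majority vote, so that a single faulty reading does not determine the outcome:

import java.util.List;

final class SensorFusion {
    // Hypothetical majority vote over boolean readings from sensors that
    // cover the same area; illustrative only.
    static boolean fuse(List<Boolean> readings) {
        int trueVotes = 0;
        for (boolean reading : readings) {
            if (reading) {
                trueVotes++;
            }
        }
        return trueVotes * 2 > readings.size();
    }
}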

Second, it would be interesting to use CommonSens as an actuator. We have not discussed how CommonSens should behave when events and deviations are detected, and currently the only action performed by CommonSens is to send simple notifications. This can be extended, and by using different types of actions, we can let CommonSens start other processes in the home. For instance, we can use the concept of capabilities and logical sensors to define complex actuators. An example of a complex actuator is a notification that adapts to the environment: the application programmer only has to state that he wants the notification to be a wake-up call. CommonSens investigates the environment, finds an alarm clock located in the bedroom, and automatically binds the notification to this alarm. The alarm is started at a point in time that is defined in a query. Actuators can also be used as part of robot technology, i.e., a robot uses sensors and performs actions based on a set of queries. Therefore, since it is designed and modelled for automated home care, CommonSens can be extended and used as a component in future home care robots as well.
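A minimal sketch of how such a complex actuator could reuse the capability concept (the names below are hypothetical and not part of the current implementation):

interface Actuator {
    // Hypothetical counterpart to a logical sensor: an actuator advertises
    // the action it provides, and CommonSens binds a notification to it.
    String providedAction();        // e.g. "WakeUpCall"
    void trigger(long timestamp);   // start the action at the time given in the query
}

The late binding would then mirror the sensor case: CommonSens searches the environment model for an actuator whose providedAction matches the requested notification type.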



Appendix A

Appendix

A.1 calculateError

/**
 * Calculates the area of all the intersections of the sensors that cover
 * a LoI in the AtomicQuery tmpAQuery. If the LoI == null, all the sensors
 * in the environment that provide the capability have to be included.
 */
public void calculateError(AtomicQuery tmpAQuery) {

    LocationOfInterest tmpLoI = tmpAQuery.getLoi();
    Capability capability = tmpAQuery.getCapability();
    ArrayList<Sensor> isec = new ArrayList<Sensor>();
    ArrayList<Sensor> noIsec = new ArrayList<Sensor>();
    Poly intersection = null;

    // Obtain all the tuple sources that provide the capability.
    ArrayList<Sensor> providesCapability = findSensor(capability);

    if (providesCapability.isEmpty()) {
        return;
    }

    // Add the sensors that have an intersection with LoI.
    if (tmpLoI != null) {
        for (Sensor tmpSource : providesCapability) {
            if (tmpSource instanceof PhysicalSensor) {
                PhysicalSensor tmpSens = (PhysicalSensor) tmpSource;
                intersection = tmpLoI.getShape().getPoly().intersection(
                        tmpSens.getPolyReduced(this, tmpSens.getSignalType()));

                // The sensor covers the complete LoI.
                if (intersection.getArea() == tmpLoI.getShape().getPoly().getArea()) {
                    isec.add(tmpSens);
                }
            }
        }

        // Create the actual intersection.
        if (!isec.isEmpty()) {
            intersection = ((PhysicalSensor) isec.get(0)).getPolyReduced(
                    this, ((PhysicalSensor) isec.get(0)).getSignalType());

            for (int i = 1; i < isec.size(); i++) {
                intersection = ((PhysicalSensor) isec.get(i))
                        .getPolyReduced(this, ((PhysicalSensor) isec.get(i)).getSignalType())
                        .intersection(intersection);
            }

            // Add the sensors that have an intersection with the
            // "intersection" but not the LoI (noIsec).
            for (Sensor tmpSource : providesCapability) {
                if (tmpSource instanceof PhysicalSensor) {
                    PhysicalSensor tmpSens = (PhysicalSensor) tmpSource;
                    if (tmpSens.getIsCapabilityProvided(capability.getName())) {
                        Poly tmpIntersection = tmpSens
                                .getPolyReduced(this, tmpSens.getSignalType())
                                .intersection(intersection);

                        Poly testIntersection = tmpSens
                                .getPolyReduced(this, tmpSens.getSignalType())
                                .intersection(tmpLoI.getShape().getPoly());

                        if (tmpIntersection.getArea() != 0
                                && testIntersection.getArea() == 0) {
                            // The LoI is not covered, but the intersection is.
                            noIsec.add(tmpSens);
                        }
                    }
                }
            }

            // Run an XOR on the intersection and the elements in noIsec.
            for (Sensor tmpTS : noIsec) {
                PhysicalSensor tmpSens = (PhysicalSensor) tmpTS;
                Poly tmpIntersection = intersection.intersection(
                        tmpSens.getPolyReduced(this, tmpSens.getSignalType()));

                intersection = intersection.xor(tmpIntersection);
            }
            tmpAQuery.setLoIApprox(intersection);

            tmpAQuery.setFPProb(tmpLoI.getShape().getPoly().getArea()
                    / intersection.getArea());

            tmpAQuery.setIsec(isec);
            tmpAQuery.setNoIsec(noIsec);
        } else /* isec was empty. */ {
            tmpAQuery.setLoIApprox(null);
            tmpAQuery.setFPProb(0);
        }
    } else /* tmpLoI == null */ {
        tmpAQuery.setLoIApprox(null);
        tmpAQuery.setFPProb(0);
        tmpAQuery.setIsec(providesCapability);
    }
}
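Note how the interactive feedback on the probability for false positives is derived in this method: setFPProb receives the ratio between the area of the LoI and the area of the approximated region, i.e., the fraction of the approximation that actually belongs to the LoI. A value close to 1 therefore indicates that the selected sensors approximate the LoI tightly and that few false positives are to be expected.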

A.2 reduceRay

/**
 * Reduces the ray due to objects the ray meets. Currently it uses the
 * Ding algorithm.
 *
 * @param tmpEnv
 * @param signalType
 * @return The new boundary triple
 */
public Triple reduceRay(Environment tmpEnv, SignalType signalType) {

    int d0 = CommonSens.ANTENNA_LENGTH;
    int range = numElements;
    double p0 = CommonSens.INITIAL_SIGNAL_STRENGTH;
    double m = CommonSens.THRESHOLD;
    double betaAir = Math.log(p0 / m)
            / Math.log(((double) range) / (double) d0);

    double strength = p0;
    int i = 0;
    int r = 0;
    double prevPerm = betaAir;
    double currPerm = betaAir;

    while (strength > m && r < range) {
        currPerm = perm(tmpEnv, triples.get(r).getTriple(), signalType);
        if (prevPerm != currPerm) {
            // The ray has entered a material with a different permeability:
            // restart the decay from the current strength.
            p0 = strength;
            i = d0;
            prevPerm = currPerm;
        }
        strength = strength(tmpEnv, signalType, d0, p0, r, i);
        r += 1;
        i += 1;
    }

    if (r == range)
        return triples.get(range - 1).getTriple();

    return triples.get(r).getTriple();
}
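A short note on the constants, as we read the code: betaAir is chosen such that log(p0/m) = betaAir · log(range/d0), i.e., under a power-law decay p(d) = p0 · (d0/d)^betaAir the signal strength falls from p0 at the antenna to exactly the threshold m at the maximum range, consistent with a log-distance path loss model. When perm() reports a different permeability, p0 and the distance counter are reset so that, assuming the strength() helper applies the same power law with the permeability of the current element, the decay continues from the entry point of the new material.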

A.3 Functionality Tests Configuration

Evaluates a complex query with no LoIs and no temporal properties. Since no LoIs are addressed, all the sensors in the environment instance that provide the capability have to report a match. This should only happen in e2.env, since all the sensors cover the same area in the environment. Once all the sensors report a match, the complex query evaluation stops. Language construct I is evaluated.

Test number  Environment  Movement pattern  Complex query  Expected result
1    e1.env  m1.mov   cq1.qry  6
2    e2.env  m2.mov   cq1.qry  0
3    e2.env  m3.mov   cq1.qry  6

Evaluates Language construct I, but the complex query addresses LoI3. This means that only the sensors that approximate LoI3 are included in the instantiated complex query. Since the temporal properties are not defined, the complex query is matched as soon as the sensors in the approximation report correct data tuples.

Test number  Environment  Movement pattern  Complex query  Expected result
4    e1.env  m1.mov   cq2.qry  0
5    e1.env  m2.mov   cq2.qry  6
6    e1.env  m4.mov   cq2.qry  6

Evaluates Language construct I by letting tb = 5 epochs. This means that the complex query is δ-timed, and all the data tuples have to match the condition once the evaluation has started. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
7    e1.env  m5.mov   cq3.qry  0
8    e1.env  m6.mov   cq3.qry  0
9    e1.env  m7.mov   cq3.qry  3

Evaluates Language construct I by letting tb = 1 and te = 6. This means that the complex query is timed between epochs 1 and 6, and all the data tuples have to match the condition between these two timestamps. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
10   e1.env  m5.mov   cq4.qry  0
11   e1.env  m6.mov   cq4.qry  0
12   e1.env  m7.mov   cq4.qry  3
13   e1.env  m3.mov   cq4.qry  3

Evaluates Language constructs I and II by letting tb = 5 epochs. The P-registration is set to min 25%. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
14   e1.env  m5.mov   cq5.qry  0
15   e1.env  m6.mov   cq5.qry  0
16   e1.env  m8.mov   cq5.qry  0
17   e1.env  m7.mov   cq5.qry  3
18   e1.env  m3.mov   cq5.qry  6

Evaluates Language constructs I and II by letting tb = 1 and te = 6. This means that the complex query is timed between epochs 1 and 6, and a minimum of 25% of the data tuples have to match the condition. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
19   e1.env  m5.mov   cq6.qry  0
20   e1.env  m6.mov   cq6.qry  0
21   e1.env  m8.mov   cq6.qry  0
22   e1.env  m7.mov   cq6.qry  3
23   e1.env  m3.mov   cq6.qry  3

Evaluates Language constructs I and II by letting tb = 5 epochs. The P-registration is set to max 25%. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
24   e1.env  m5.mov   cq7.qry  3
25   e1.env  m6.mov   cq7.qry  3
26   e1.env  m8.mov   cq7.qry  3
27   e1.env  m7.mov   cq7.qry  0
28   e1.env  m3.mov   cq7.qry  6

Evaluates Language constructs I and II by letting tb = 1 and te = 6. The P-registration is set to max 25%. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
29   e1.env  m5.mov   cq8.qry  3
30   e1.env  m6.mov   cq8.qry  3
31   e1.env  m8.mov   cq8.qry  3
32   e1.env  m7.mov   cq8.qry  0
33   e1.env  m3.mov   cq8.qry  0

Evaluates Language constructs I and III. The logical operator ∧ is used between two atomic queries that are similar. Note that we can do this since CommonSens does not optimise the queries and does not know about this similarity. It simply evaluates both atomic queries as if they were different. Also note that ∧ requires that both data tuple sequences are similar (Figure 5.13).

Test number  Environment  Movement pattern  Complex query  Expected result
34   e1.env  m1.mov   cq9.qry  6
35   e2.env  m2.mov   cq9.qry  0
36   e2.env  m3.mov   cq9.qry  6

Evaluates Language constructs I and III. The logical operator ∧ is used between two atomic queries that are similar. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
37   e1.env  m1.mov   cq10.qry  0
38   e1.env  m2.mov   cq10.qry  6
39   e1.env  m4.mov   cq10.qry  6

Evaluates Language constructs I and III. The logical operator ∧ is used between two atomic queries that are similar. The duration is 5 epochs. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
40   e1.env  m5.mov   cq11.qry  0
41   e1.env  m6.mov   cq11.qry  0
42   e1.env  m7.mov   cq11.qry  3

Evaluates Language constructs I and III by letting tb = 1 and te = 6. The logical operator ∧ is used between two atomic queries that are similar. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
43   e1.env  m5.mov   cq12.qry  0
44   e1.env  m6.mov   cq12.qry  0
45   e1.env  m7.mov   cq12.qry  3
46   e1.env  m3.mov   cq12.qry  3

Evaluates Language constructs I, II and III by letting tb = 5 epochs. The P-registration is set to min 25%. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
47   e1.env  m5.mov   cq13.qry  0
48   e1.env  m6.mov   cq13.qry  0
49   e1.env  m8.mov   cq13.qry  0
50   e1.env  m7.mov   cq13.qry  3
51   e1.env  m3.mov   cq13.qry  6

Evaluates Language constructs I, II and III by letting tb = 1 and te = 6. The P-registration is set to min 25%. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
52   e1.env  m5.mov   cq14.qry  0
53   e1.env  m6.mov   cq14.qry  0
54   e1.env  m8.mov   cq14.qry  0
55   e1.env  m7.mov   cq14.qry  3
56   e1.env  m3.mov   cq14.qry  3

Evaluates Language constructs I, II and III by letting tb = 5 epochs. The P-registration is set to max 25%. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
57   e1.env  m5.mov   cq15.qry  3
58   e1.env  m6.mov   cq15.qry  3
59   e1.env  m8.mov   cq15.qry  3
60   e1.env  m7.mov   cq15.qry  0
61   e1.env  m3.mov   cq15.qry  6

Evaluates Language constructs I, II and III by letting tb = 1 and te = 6. The P-registration is set to max 25%. LoI3 is addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
62   e1.env  m5.mov   cq16.qry  3
63   e1.env  m6.mov   cq16.qry  3
64   e1.env  m8.mov   cq16.qry  3
65   e1.env  m7.mov   cq16.qry  0
66   e1.env  m3.mov   cq16.qry  0

Evaluates Language constructs I and III. The logical operator ∧ is used between two atomic queries that are similar. LoI3 and LoI8 are addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
67   e3.env  m1.mov   cq17.qry  0
68   e3.env  m2.mov   cq17.qry  6
69   e3.env  m4.mov   cq17.qry  6

Evaluates Language constructs I and III. The logical operator ∧ is used between two atomic queries that are similar. The duration is 5 epochs. LoI4 and LoI3 are addressed (cf. cq18.qry in Table A.2).

Test number  Environment  Movement pattern  Complex query  Expected result
70   e3.env  m5.mov   cq18.qry  0
71   e3.env  m6.mov   cq18.qry  0
72   e3.env  m7.mov   cq18.qry  3

Evaluates Language constructs I and III by letting tb = 1 and te = 6. The logical operator ∧ is used between two atomic queries that are similar. LoI3 and LoI8 are addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
73   e3.env  m5.mov   cq19.qry  0
74   e3.env  m6.mov   cq19.qry  0
75   e3.env  m7.mov   cq19.qry  3
76   e3.env  m3.mov   cq19.qry  3

Evaluates Language constructs I, II and III by letting tb = 5 epochs. The P-registration is set to min 25%. LoI3 and LoI8 are addressed (cf. cq20.qry in Table A.2).

Test number  Environment  Movement pattern  Complex query  Expected result
77   e3.env  m5.mov   cq20.qry  0
78   e3.env  m6.mov   cq20.qry  0
79   e3.env  m8.mov   cq20.qry  0
80   e3.env  m7.mov   cq20.qry  3
81   e3.env  m3.mov   cq20.qry  6

Evaluates Language constructs I, II and III by letting tb = 1 and te = 6. The P-registration is set to min 25%. LoI3 and LoI8 are addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
82   e3.env  m5.mov   cq21.qry  0
83   e3.env  m6.mov   cq21.qry  0
84   e3.env  m8.mov   cq21.qry  0
85   e3.env  m7.mov   cq21.qry  3
86   e3.env  m3.mov   cq21.qry  3

Evaluates Language constructs I, II and III by letting tb = 5 epochs. The P-registration is set to max 25%. LoI3 and LoI8 are addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
87   e3.env  m5.mov   cq22.qry  3
88   e3.env  m6.mov   cq22.qry  3
89   e3.env  m8.mov   cq22.qry  3
90   e3.env  m7.mov   cq22.qry  0
91   e3.env  m3.mov   cq22.qry  6

Evaluates Language constructs I, II and III by letting tb = 1 and te = 6. The P-registration is set to max 25%. LoI3 and LoI8 are addressed.

Test number  Environment  Movement pattern  Complex query  Expected result
92   e3.env  m5.mov   cq23.qry  3
93   e3.env  m6.mov   cq23.qry  3
94   e3.env  m8.mov   cq23.qry  3
95   e3.env  m7.mov   cq23.qry  0
96   e3.env  m3.mov   cq23.qry  0

Evaluates Language constructs I and III. Uses ∨ between atomic queries that address LoI1 to LoI4. The monitored person only needs to move to one of the LoIs.

Test number  Environment  Movement pattern  Complex query  Expected result
97   e1.env  m9.mov   cq24.qry  0
98   e1.env  m10.mov  cq24.qry  0
99   e1.env  m11.mov  cq24.qry  0
100  e1.env  m12.mov  cq24.qry  0

Evaluates Language constructs I and III. Uses ∨ between pairs of atomic queries that are related with ∧. The complex queries address LoI1, LoI3, LoI7 and LoI8.

Test number  Environment  Movement pattern  Complex query  Expected result
101  e3.env  m9.mov   cq25.qry  0
102  e3.env  m9.mov   cq26.qry  3

Evaluates Language constructs I, II and III. Uses ∨ between pairs of atomic queries that are related with ∧. The complex queries address LoI1, LoI3, LoI7 and LoI8.

Test number  Environment  Movement pattern  Complex query  Expected result
103  e3.env  m5.mov   cq27.qry  0
104  e3.env  m6.mov   cq27.qry  0
105  e3.env  m7.mov   cq27.qry  3
106  e3.env  m8.mov   cq27.qry  3
107  e3.env  m13.mov  cq27.qry  0
108  e3.env  m14.mov  cq27.qry  0
109  e3.env  m15.mov  cq27.qry  3
110  e3.env  m16.mov  cq27.qry  3
111  e3.env  m3.mov   cq27.qry  3
112  e3.env  m5.mov   cq28.qry  0
113  e3.env  m6.mov   cq28.qry  0
114  e3.env  m8.mov   cq28.qry  0
115  e3.env  m7.mov   cq28.qry  3
116  e3.env  m13.mov  cq28.qry  0
117  e3.env  m14.mov  cq28.qry  0
118  e3.env  m15.mov  cq28.qry  0
119  e3.env  m16.mov  cq28.qry  3
120  e3.env  m3.mov   cq28.qry  6
121  e3.env  m5.mov   cq29.qry  0
122  e3.env  m6.mov   cq29.qry  0
123  e3.env  m8.mov   cq29.qry  0
124  e3.env  m7.mov   cq29.qry  3
125  e3.env  m13.mov  cq29.qry  0
126  e3.env  m14.mov  cq29.qry  0
127  e3.env  m15.mov  cq29.qry  0
128  e3.env  m16.mov  cq29.qry  3
129  e3.env  m3.mov   cq29.qry  3
130  e3.env  m5.mov   cq30.qry  3
131  e3.env  m6.mov   cq30.qry  3
132  e3.env  m8.mov   cq30.qry  3
133  e3.env  m7.mov   cq30.qry  0
134  e3.env  m13.mov  cq30.qry  3
135  e3.env  m14.mov  cq30.qry  3
136  e3.env  m15.mov  cq30.qry  3
137  e3.env  m16.mov  cq30.qry  0
138  e3.env  m3.mov   cq30.qry  6
139  e3.env  m5.mov   cq31.qry  3
140  e3.env  m6.mov   cq31.qry  3
141  e3.env  m8.mov   cq31.qry  3
142  e3.env  m7.mov   cq31.qry  0
143  e3.env  m13.mov  cq31.qry  3
144  e3.env  m14.mov  cq31.qry  3
145  e3.env  m15.mov  cq31.qry  3
146  e3.env  m16.mov  cq31.qry  0
147  e3.env  m3.mov   cq31.qry  0

Evaluates Language constructs I, II and III. At least one of the ∧-lists is δ-timed or timed.

Test number  Environment  Movement pattern  Complex query  Expected result
148  e3.env  m5.mov   cq32.qry  0
149  e3.env  m6.mov   cq32.qry  0
150  e3.env  m8.mov   cq32.qry  3
151  e3.env  m7.mov   cq32.qry  3
152  e3.env  m13.mov  cq32.qry  0
153  e3.env  m14.mov  cq32.qry  0
154  e3.env  m15.mov  cq32.qry  3
155  e3.env  m16.mov  cq32.qry  3
156  e3.env  m3.mov   cq32.qry  3
157  e3.env  m17.mov  cq32.qry  0
158  e1.env  m18.mov  cq33.qry  0
159  e1.env  m19.mov  cq33.qry  3

Evaluates Language constructs I and IV.

Test number  Environment  Movement pattern  Complex query  Expected result
160  e1.env  m19.mov  cq34.qry  0
161  e1.env  m20.mov  cq35.qry  0
162  e1.env  m21.mov  cq35.qry  3
163  e1.env  m20.mov  cq36.qry  0
164  e1.env  m21.mov  cq36.qry  3
165  e1.env  m22.mov  cq36.qry  3

Evaluates Language constructs I and III. The logical operator is ¬.

Test number  Environment  Movement pattern  Complex query  Expected result
166  e1.env  m1.mov   cq37.qry  0
167  e1.env  m1.mov   cq38.qry  0

Evaluates Language constructs I and V. The concurrency class is EQUALS.

Test number  Environment  Movement pattern  Complex query  Expected result
168  e3.env  m1.mov   cq39.qry  0
169  e3.env  m1.mov   cq40.qry  7

Evaluates Language constructs I and V. The concurrency class is DURING.

Test number  Environment  Movement pattern  Complex query  Expected result
170  e1.env  m23.mov  cq41.qry  0
171  e1.env  m24.mov  cq41.qry  7
172  e1.env  m31.mov  cq47.qry  7
173  e1.env  m30.mov  cq47.qry  0

Evaluates Language construct I with focus on timing of complex queries.

Test number  Environment  Movement pattern  Complex query  Expected result
174  e1.env  m23.mov  cq42.qry  8
175  e1.env  m23.mov  cq43.qry  8
176  e1.env  m23.mov  cq44.qry  0

Evaluates Language constructs I and IV with a combination of timing and δ-timing.

Test number  Environment  Movement pattern  Complex query  Expected result
177  e1.env  m12.mov  cq45.qry  3
178  e1.env  m25.mov  cq46.qry  0
179  e1.env  m26.mov  cq46.qry  3
180  e1.env  m27.mov  cq46.qry  3
181  e1.env  m28.mov  cq46.qry  3
182  e1.env  m29.mov  cq46.qry  3

Table A.1: Regression tests.

Test number  Average  Minimum  Maximum  End time  Result
1    0.15  0     0.67  6   True (6 = 6)
2    0.38  0     2     2   True (0 = 0)
3    0.09  0     0.5   12  True (6 = 6)
4    0.35  0     2.33  2   True (0 = 0)
5    0.07  0     0.5   6   True (6 = 6)
6    0.07  0     0.67  3   True (6 = 6)
7    0.32  0     0.83  5   True (0 = 0)
8    0.34  0     1.17  5   True (0 = 0)
9    0.26  0     1     5   True (3 = 3)
10   0.29  0     1.5   5   True (0 = 0)
11   0.29  0     0.83  5   True (0 = 0)
12   0.23  0     0.67  5   True (3 = 3)
13   0.16  0     0.71  6   True (3 = 3)
14   0.3   0     2     5   True (0 = 0)
15   0.29  0     1     5   True (0 = 0)
16   0.25  0     1     5   True (0 = 0)
17   0.25  0     0.83  5   True (3 = 3)
18   0.04  0     0.33  12  True (6 = 6)
19   0.27  0     1.17  5   True (0 = 0)
20   0.3   0     0.83  5   True (0 = 0)
21   0.22  0     0.5   5   True (0 = 0)
22   0.23  0     0.67  5   True (3 = 3)


Filename   Complex query
cq1.qry    [(DetectPerson==Person1)]
cq2.qry    [(DetectPerson==Person1,LoI3)]
cq3.qry    [(DetectPerson==Person1,LoI3,5)]
cq4.qry    [(DetectPerson==Person1,LoI3,1,6)]
cq5.qry    [(DetectPerson==Person1,LoI3,5,min25%)]
cq6.qry    [(DetectPerson==Person1,LoI3,1,6,min25%)]
cq7.qry    [(DetectPerson==Person1,LoI3,5,max25%)]
cq8.qry    [(DetectPerson==Person1,LoI3,1,6,max25%)]
cq9.qry    [(DetectPerson==Person1)&&(DetectPerson==Person1)]
cq10.qry   [(DetectPerson==Person1,LoI3)&&(DetectPerson==Person1,LoI3)]
cq11.qry   [(DetectPerson==Person1,LoI3,5)&&(DetectPerson==Person1,LoI3,5)]
cq12.qry   [(DetectPerson==Person1,LoI3,1,6)&&(DetectPerson==Person1,LoI3,1,6)]
cq13.qry   [(DetectPerson==Person1,LoI3,5,min25%)&&(DetectPerson==Person1,LoI3,5,min25%)]
cq14.qry   [(DetectPerson==Person1,LoI3,1,6,min25%)&&(DetectPerson==Person1,LoI3,1,6,min25%)]
cq15.qry   [(DetectPerson==Person1,LoI3,5,max25%)&&(DetectPerson==Person1,LoI3,5,max25%)]
cq16.qry   [(DetectPerson==Person1,LoI3,1,6,max25%)&&(DetectPerson==Person1,LoI3,1,6,max25%)]
cq17.qry   [(DetectPerson==Person1,LoI3)&&(DetectPerson==Person1,LoI8)]
cq18.qry   [(DetectPerson==Person1,LoI4,5)&&(DetectPerson==Person1,LoI3,5)]
cq19.qry   [(DetectPerson==Person1,LoI3,1,6)&&(DetectPerson==Person1,LoI8,1,6)]
cq20.qry   [(DetectPerson==Person1,LoI3,5,min25%)&&(DetectPerson==Person1,LoI8,5,min25%)]
cq21.qry   [(DetectPerson==Person1,LoI3,1,6,min25%)&&(DetectPerson==Person1,LoI8,1,6,min25%)]
cq22.qry   [(DetectPerson==Person1,LoI8,5,max25%)&&(DetectPerson==Person1,LoI3,5,max25%)]
cq23.qry   [(DetectPerson==Person1,LoI3,1,6,max25%)&&(DetectPerson==Person1,LoI8,1,6,max25%)]

Table A.2: Complex queries cq1.qry to cq23.qry, which are used in the regression tests.


Filename   Complex query
cq24.qry   [(DetectPerson==Person1,LoI1)||(DetectPerson==Person1,LoI2)||(DetectPerson==Person1,LoI3)||(DetectPerson==Person1,LoI4)]
cq25.qry   [(DetectPerson==Person1,LoI1)&&(DetectPerson==Person1,LoI7)||(DetectPerson==Person1,LoI3)&&(DetectPerson==Person1,LoI8)]
cq26.qry   [(DetectPerson==Person1,LoI4,5)&&(DetectPerson==Person1,LoI3,5)||(DetectPerson==Person1,LoI1,5)&&(DetectPerson==Person1,LoI2,5)]
cq27.qry   [(DetectPerson==Person1,LoI1,1,6)&&(DetectPerson==Person1,LoI7,1,6)||(DetectPerson==Person1,LoI3,1,6)&&(DetectPerson==Person1,LoI8,1,6)]
cq28.qry   [(DetectPerson==Person1,LoI3,5,min25%)&&(DetectPerson==Person1,LoI8,5,min25%)||(DetectPerson==Person1,LoI1,5,min25%)&&(DetectPerson==Person1,LoI7,5,min25%)]
cq29.qry   [(DetectPerson==Person1,LoI1,1,6,min25%)&&(DetectPerson==Person1,LoI7,1,6,min25%)||(DetectPerson==Person1,LoI3,1,6,min25%)&&(DetectPerson==Person1,LoI8,1,6,min25%)]
cq30.qry   [(DetectPerson==Person1,LoI1,5,max25%)&&(DetectPerson==Person1,LoI7,5,max25%)||(DetectPerson==Person1,LoI8,5,max25%)&&(DetectPerson==Person1,LoI3,5,max25%)]
cq31.qry   [(DetectPerson==Person1,LoI3,1,6,max25%)&&(DetectPerson==Person1,LoI8,1,6,max25%)||(DetectPerson==Person1,LoI1,1,6,max25%)&&(DetectPerson==Person1,LoI7,1,6,max25%)]
cq32.qry   [(DetectPerson==Person1,LoI3,5)||(DetectPerson==Person1,LoI1,4,5)]
cq33.qry   [(DetectPerson==Person1,LoI1,5)||(DetectPerson==Person1,LoI2,5)||(DetectPerson==Person1,LoI3,5)||(DetectPerson==Person1,LoI4,5)]
cq34.qry   [(DetectPerson==Person1,LoI3)->(DetectPerson==Person1,LoI4)->(DetectPerson==Person1,LoI1)->(DetectPerson==Person1,LoI2)]

Table A.3: Complex queries cq24.qry to cq34.qry, which are used in the regression tests.


Filename   Complex query
cq35.qry   [(DetectPerson==Person1,LoI3,3)->(DetectPerson==Person1,LoI4,3)->(DetectPerson==Person1,LoI1,3)->(DetectPerson==Person1,LoI2,3)]
cq36.qry   [(DetectPerson==Person1,LoI3,1,3)->(DetectPerson==Person1,LoI4,5,7)->(DetectPerson==Person1,LoI1,8,10)->(DetectPerson==Person1,LoI2,11,13)]
cq37.qry   [!(DetectPerson==Person1)]
cq38.qry   [(DetectPerson==Person1,LoI3)&&!(DetectPerson==Person1,LoI4)]
cq39.qry   [equals([(DetectPerson==Person1,LoI3)],[(DetectPerson==Person1,LoI8)])]
cq40.qry   [equals([(DetectPerson==Person1,LoI3)],[(DetectPerson==Person1,LoI1)])]
cq41.qry   [during([(DetectPerson==Person1,LoI5)],[(DetectPerson==Person1,LoI6,5)])]
cq42.qry   [(DetectPerson==Person1,LoI2)->(DetectPerson==Person1,LoI4),1,5,max100%]
cq43.qry   [(DetectPerson==Person1,LoI2)->(DetectPerson==Person1,LoI4),0,5,max100%]
cq44.qry   [(DetectPerson==Person1,LoI2),0,5,max100%]
cq45.qry   [(DetectPerson==Person1,LoI2,1,max100%)->(DetectPerson==Person1,LoI4,5,7,max100%)]
cq46.qry   [(DetectPerson==Person1,LoI1,10,max50%)->(DetectPerson==Person1,LoI2,10,min50%)->(DetectPerson==Person1,LoI3,21,30,max50%)->(DetectPerson==Person1,LoI4,31,40,min50%)]
cq47.qry   [during([(DetectPerson==Person1,LoI5)],[(DetectPerson==Person1,LoI6)])]

Table A.4: Complex queries cq35.qry to cq47.qry, which are used in the regression tests.


Test number  Average  Minimum  Maximum  End time  Result
23   0.14  0     0.57  6   True (3 = 3)
24   0.38  0     1     2   True (3 = 3)
25   0.5   0     4     2   True (3 = 3)
26   0.42  0     1.67  2   True (3 = 3)
27   0.18  0     0.67  5   True (0 = 0)
28   0.04  0     0.17  12  True (6 = 6)
29   0.45  0     1.67  2   True (3 = 3)
30   0.4   0     1.33  2   True (3 = 3)
31   0.45  0     1.33  2   True (3 = 3)
32   0.2   0     0.83  5   True (0 = 0)
33   0.14  0     1.29  6   True (0 = 0)
34   0.15  0     0.5   6   True (6 = 6)
35   0.41  0     1.67  2   True (0 = 0)
36   0.05  0     0.17  12  True (6 = 6)
37   0.34  0     2     2   True (0 = 0)
38   0.04  0     0.5   6   True (6 = 6)
39   0.04  0     0.33  3   True (6 = 6)
40   0.34  0     1.17  5   True (0 = 0)
41   0.31  0     0.67  5   True (0 = 0)
42   0.26  0     2     5   True (3 = 3)
43   0.28  0     0.67  5   True (0 = 0)
44   0.27  0     0.83  5   True (0 = 0)
45   0.23  0     0.67  5   True (3 = 3)
46   0.15  0     0.43  6   True (3 = 3)
47   0.29  0     0.67  5   True (0 = 0)
48   0.32  0     3.5   5   True (0 = 0)
49   0.27  0     0.67  5   True (0 = 0)
50   0.22  0     2     5   True (3 = 3)
51   0.05  0     0.33  12  True (6 = 6)
52   0.28  0     0.67  5   True (0 = 0)
53   0.3   0     0.67  5   True (0 = 0)
54   0.27  0     1     5   True (0 = 0)
55   0.23  0     0.67  5   True (3 = 3)
56   0.15  0     0.43  6   True (3 = 3)
57   0.46  0     2.33  2   True (3 = 3)
58   0.45  0     1.67  2   True (3 = 3)
59   0.43  0     1.67  2   True (3 = 3)
60   0.17  0     1.17  5   True (0 = 0)
61   0.06  0     0.42  12  True (6 = 6)
62   0.46  0     1.67  2   True (3 = 3)


Test number  Average  Minimum  Maximum  End time  Result
63   0.48  0     1.33  2   True (3 = 3)
64   0.47  0     1.67  2   True (3 = 3)
65   0.18  0     0.67  5   True (0 = 0)
66   0.11  0     0.71  6   True (0 = 0)
67   0.29  0     1.33  2   True (0 = 0)
68   0.08  0     0.5   6   True (6 = 6)
69   0.06  0     0.67  3   True (6 = 6)
70   0.37  0     0.83  5   True (0 = 0)
71   0.38  0     3.17  5   True (0 = 0)
72   0.24  0     0.67  5   True (3 = 3)
73   0.35  0     1     5   True (0 = 0)
74   0.37  0     1.17  5   True (0 = 0)
75   0.26  0     0.83  5   True (3 = 3)
76   0.15  0     0.57  6   True (3 = 3)
77   0.4   0     2.33  5   True (0 = 0)
78   0.35  0     0.83  5   True (0 = 0)
79   0.29  0     0.67  5   True (0 = 0)
80   0.25  0     1.33  5   True (3 = 3)
81   0.06  0     0.25  12  True (6 = 6)
82   0.36  0     1.5   5   True (0 = 0)
83   0.38  0     8     5   True (0 = 0)
84   0.31  0     3     5   True (0 = 0)
85   0.26  0     0.83  5   True (3 = 3)
86   0.17  0     0.43  6   True (3 = 3)
87   0.64  0     12    2   True (3 = 3)
88   0.55  0     1.67  2   True (3 = 3)
89   0.55  0     2.67  2   True (3 = 3)
90   0.17  0     0.5   5   True (0 = 0)
91   0.06  0     0.25  12  True (6 = 6)
92   0.5   0     1.33  2   True (3 = 3)
93   0.46  0     1     2   True (3 = 3)
94   0.45  0     1.67  2   True (3 = 3)
95   0.21  0     3.33  5   True (0 = 0)
96   0.12  0     0.57  6   True (0 = 0)
97   0.49  0     2     1   True (0 = 0)
98   0.44  0     3.5   1   True (0 = 0)
99   0.41  0     3.5   1   True (0 = 0)
100  0.4   0     1     1   True (0 = 0)
101  0.46  0     1.5   1   True (0 = 0)
102  0.27  0     0.67  5   True (3 = 3)


Test number  Average  Minimum  Maximum  End time  Result
103  0.34  0     0.67  5   True (0 = 0)
104  0.37  0     0.83  5   True (0 = 0)
105  0.23  0     0.5   5   True (3 = 3)
106  0.32  0     0.67  5   True (3 = 3)
107  0.38  0     0.83  5   True (0 = 0)
108  0.38  0     0.83  5   True (0 = 0)
109  0.34  0     0.67  5   True (3 = 3)
110  0.22  0     0.5   5   True (3 = 3)
111  0.19  0     0.57  6   True (3 = 3)
112  0.39  0     0.83  5   True (0 = 0)
113  0.39  0     1     5   True (0 = 0)
114  0.27  0     0.67  5   True (0 = 0)
115  0.26  0     0.67  5   True (3 = 3)
116  0.38  0     0.83  5   True (0 = 0)
117  0.37  0     1     5   True (0 = 0)
118  0.33  0     1.5   5   True (0 = 0)
119  0.25  0     0.67  5   True (3 = 3)
120  0.05  0     0.25  12  True (6 = 6)
121  0.39  0     0.83  5   True (0 = 0)
122  0.36  0     0.67  5   True (0 = 0)
123  0.28  0     0.67  5   True (0 = 0)
124  0.27  0     0.5   5   True (3 = 3)
125  0.37  0     0.83  5   True (0 = 0)
126  0.36  0     0.83  5   True (0 = 0)
127  0.29  0     0.67  5   True (0 = 0)
128  0.25  0     0.67  5   True (3 = 3)
129  0.21  0     1.43  6   True (3 = 3)
130  0.53  0     1.33  2   True (3 = 3)
131  0.5   0     1     2   True (3 = 3)
132  0.52  0.33  1     2   True (3 = 3)
133  0.21  0     0.5   5   True (0 = 0)
134  0.56  0.33  5.33  2   True (3 = 3)
135  0.53  0.33  1     2   True (3 = 3)
136  0.5   0     1     2   True (3 = 3)
137  0.2   0     0.5   5   True (0 = 0)
138  0.05  0     0.25  12  True (6 = 6)
139  0.54  0.33  4.67  2   True (3 = 3)
140  0.52  0.33  1.33  2   True (3 = 3)
141  0.56  0     2.67  2   True (3 = 3)
142  0.23  0     0.5   5   True (0 = 0)


Test number  Average  Minimum  Maximum  End time  Result
143  0.49  0.33  1     2   True (3 = 3)
144  0.48  0     1.67  2   True (3 = 3)
145  0.5   0     1     2   True (3 = 3)
146  0.21  0     0.5   5   True (0 = 0)
147  0.15  0     0.43  6   True (0 = 0)
148  0.31  0     1.33  5   True (0 = 0)
149  0.36  0     1     5   True (0 = 0)
150  0.33  0     0.67  5   True (3 = 3)
151  0.29  0     0.83  5   True (3 = 3)
152  0.24  0     1.6   4   True (0 = 0)
153  0.2   0     0.6   4   True (0 = 0)
154  0.22  0     1.17  5   True (3 = 3)
155  0.19  0     1.67  5   True (3 = 3)
156  0.19  0     0.5   5   True (3 = 3)
157  0.27  0     0.6   4   True (0 = 0)
158  0.29  0     0.67  8   True (0 = 0)
159  0.32  0.11  1.89  8   True (3 = 3)
160  0.53  0     1.6   4   True (0 = 0)
161  0.32  0     0.62  12  True (0 = 0)
162  0.35  0     0.75  3   True (3 = 3)
163  0.28  0.08  0.54  12  True (0 = 0)
164  0.29  0     0.57  6   True (3 = 3)
165  0.31  0.1   0.7   9   True (3 = 3)
166  0.63  0     3     0   True (0 = 0)
167  0.26  0     1     2   True (0 = 0)
168  0.53  0     2.33  2   True (0 = 0)
169  0.22  0.17  0.67  6   True (7 = 7)
170  0.68  0.38  1.25  7   True (0 = 0)
171  0.61  0.25  0.92  12  True (6 = 6)
172  0.74  0.53  1.1   40  True (7 = 7)
173  1.01  0.61  1.48  30  True (0 = 0)
174  0.21  0     0.67  9   True (8 = 8)
175  0.17  0     0.44  9   True (8 = 8)
176  0.64  0     4     0   True (0 = 0)
177  0.17  0     0.71  6   True (3 = 3)
178  0.18  0.05  0.32  39  True (0 = 0)
179  0.37  0     0.8   4   True (3 = 3)
180  0.16  0     0.3   19  True (3 = 3)
181  0.22  0.08  0.38  25  True (3 = 3)
182  0.16  0.05  0.34  40  True (3 = 3)



Table A.5: Regression test results.

A.3.1 Data Files from the Last Experiment in Section 6.1.2

The data files show the data tuples from the hallway experiment. The first attribute is the timestamp when the data tuple arrived. The second attribute is the sensor. The third and fourth attributes show the capability and value, and the two last attributes show the tb and te timestamps. All the timestamps are in milliseconds.

131000 Cam1_0 DetectMotion false 131000 131000
135000 Cam1_0 DetectMotion false 135000 135000
138000 Cam1_0 DetectMotion false 138000 138000
142000 Cam1_0 DetectMotion false 142000 142000
145000 Cam1_0 DetectMotion false 145000 145000
149000 Cam1_0 DetectMotion false 149000 149000
152000 Cam1_0 DetectMotion false 152000 152000
156000 Cam1_0 DetectMotion false 156000 156000
161000 Cam1_0 DetectMotion false 161000 161000
165000 Cam1_0 DetectMotion false 165000 165000
170000 Cam1_0 DetectMotion false 170000 170000
174000 Cam1_0 DetectMotion false 174000 174000
179000 Cam1_0 DetectMotion false 179000 179000
184000 Cam1_0 DetectMotion false 184000 184000
189000 Cam1_0 DetectMotion false 189000 189000
193000 Cam1_0 DetectMotion false 193000 193000
196000 Cam1_0 DetectMotion false 196000 196000
200000 Cam1_0 DetectMotion false 200000 200000
201000 Cam1_0 DetectMotion false 201000 201000
203000 Cam1_0 DetectMotion false 203000 203000
207000 Cam1_0 DetectMotion false 207000 207000
212000 Cam1_0 DetectMotion false 212000 212000
216000 Cam1_0 DetectMotion false 216000 216000
220000 Cam1_0 DetectMotion false 220000 220000
222000 Cam1_0 DetectMotion false 222000 222000
224000 Cam1_0 DetectMotion false 224000 224000
228000 Cam1_0 DetectMotion false 228000 228000
229000 Cam1_0 DetectMotion false 229000 229000
232000 Cam1_0 DetectMotion false 232000 232000
234000 Cam1_0 DetectMotion false 234000 234000
236000 Cam1_0 DetectMotion false 236000 236000
239000 Cam1_0 DetectMotion false 239000 239000
241000 Cam1_0 DetectMotion false 241000 241000
243000 Cam1_0 DetectMotion false 243000 243000
245000 Cam1_0 DetectMotion false 245000 245000
247000 Cam1_0 DetectMotion false 247000 247000
250000 Cam1_0 DetectMotion false 250000 250000
252000 Cam1_0 DetectMotion false 252000 252000
255000 Cam1_0 DetectMotion false 255000 255000
256000 Cam1_0 DetectMotion false 256000 256000
257000 Cam1_0 DetectMotion false 257000 257000


258000 Cam1_0 DetectMotion false 258000 258000
260000 Cam1_0 DetectMotion false 260000 260000
261000 Cam1_0 DetectMotion false 261000 261000
262000 Cam1_0 DetectMotion false 262000 262000
265000 Cam1_0 DetectMotion false 265000 265000
267000 Cam1_0 DetectMotion false 267000 267000
270000 Cam1_0 DetectMotion false 270000 270000
272000 Cam1_0 DetectMotion false 272000 272000
275000 Cam1_0 DetectMotion true 275000 275000

1000 Cam2_0 DetectMotion false 1000 1000
2000 Cam2_0 DetectMotion false 2000 2000
6000 Cam2_0 DetectMotion false 6000 6000
7000 Cam2_0 DetectMotion false 7000 7000
8000 Cam2_0 DetectMotion false 8000 8000
10000 Cam2_0 DetectMotion false 10000 10000
12000 Cam2_0 DetectMotion false 12000 12000
14000 Cam2_0 DetectMotion false 14000 14000
17000 Cam2_0 DetectMotion false 17000 17000
19000 Cam2_0 DetectMotion false 19000 19000
22000 Cam2_0 DetectMotion false 22000 22000
26000 Cam2_0 DetectMotion false 26000 26000
28000 Cam2_0 DetectMotion false 28000 28000
30000 Cam2_0 DetectMotion false 30000 30000
32000 Cam2_0 DetectMotion false 32000 32000
34000 Cam2_0 DetectMotion false 34000 34000
36000 Cam2_0 DetectMotion false 36000 36000
39000 Cam2_0 DetectMotion true 39000 39000
41000 Cam2_0 DetectMotion true 41000 41000
43000 Cam2_0 DetectMotion true 43000 43000
46000 Cam2_0 DetectMotion true 46000 46000
49000 Cam2_0 DetectMotion true 49000 49000
51000 Cam2_0 DetectMotion true 51000 51000
54000 Cam2_0 DetectMotion false 54000 54000
56000 Cam2_0 DetectMotion false 56000 56000
59000 Cam2_0 DetectMotion false 59000 59000
61000 Cam2_0 DetectMotion true 61000 61000
63000 Cam2_0 DetectMotion true 63000 63000
66000 Cam2_0 DetectMotion true 66000 66000
68000 Cam2_0 DetectMotion true 68000 68000
70000 Cam2_0 DetectMotion false 70000 70000
73000 Cam2_0 DetectMotion false 73000 73000
75000 Cam2_0 DetectMotion false 75000 75000
77000 Cam2_0 DetectMotion false 77000 77000
80000 Cam2_0 DetectMotion false 80000 80000
82000 Cam2_0 DetectMotion false 82000 82000
85000 Cam2_0 DetectMotion false 85000 85000
86000 Cam2_0 DetectMotion false 86000 86000
89000 Cam2_0 DetectMotion false 89000 89000
90000 Cam2_0 DetectMotion false 90000 90000
91000 Cam2_0 DetectMotion false 91000 91000
93000 Cam2_0 DetectMotion false 93000 93000
96000 Cam2_0 DetectMotion false 96000 96000

144

Page 157: CommonSens - folk.uio.nofolk.uio.no/jarleso/jarleso_phd_thesis.pdf · 2011-06-09 · I also want to thank Radioresepsjonen for their podcasts. They helped me to fall asleep during

98000 Cam2_0 DetectMotion false 98000 98000100000 Cam2_0 DetectMotion false 100000 100000102000 Cam2_0 DetectMotion true 102000 102000104000 Cam2_0 DetectMotion false 104000 104000108000 Cam2_0 DetectMotion false 108000 108000111000 Cam2_0 DetectMotion false 111000 111000115000 Cam2_0 DetectMotion false 115000 115000119000 Cam2_0 DetectMotion false 119000 119000121000 Cam2_0 DetectMotion false 121000 121000123000 Cam2_0 DetectMotion false 123000 123000126000 Cam2_0 DetectMotion false 126000 126000128000 Cam2_0 DetectMotion true 128000 128000131000 Cam2_0 DetectMotion true 131000 131000135000 Cam2_0 DetectMotion true 135000 135000138000 Cam2_0 DetectMotion true 138000 138000142000 Cam2_0 DetectMotion true 142000 142000145000 Cam2_0 DetectMotion false 145000 145000149000 Cam2_0 DetectMotion false 149000 149000152000 Cam2_0 DetectMotion false 152000 152000156000 Cam2_0 DetectMotion false 156000 156000161000 Cam2_0 DetectMotion true 161000 161000165000 Cam2_0 DetectMotion true 165000 165000170000 Cam2_0 DetectMotion true 170000 170000174000 Cam2_0 DetectMotion true 174000 174000179000 Cam2_0 DetectMotion true 179000 179000184000 Cam2_0 DetectMotion false 184000 184000189000 Cam2_0 DetectMotion false 189000 189000193000 Cam2_0 DetectMotion false 193000 193000196000 Cam2_0 DetectMotion false 196000 196000200000 Cam2_0 DetectMotion false 200000 200000201000 Cam2_0 DetectMotion false 201000 201000203000 Cam2_0 DetectMotion false 203000 203000207000 Cam2_0 DetectMotion false 207000 207000212000 Cam2_0 DetectMotion false 212000 212000216000 Cam2_0 DetectMotion false 216000 216000220000 Cam2_0 DetectMotion false 220000 220000222000 Cam2_0 DetectMotion true 222000 222000224000 Cam2_0 DetectMotion true 224000 224000

131000 Cam3_0 DetectMotion false 131000 131000
135000 Cam3_0 DetectMotion false 135000 135000
138000 Cam3_0 DetectMotion false 138000 138000
142000 Cam3_0 DetectMotion false 142000 142000
145000 Cam3_0 DetectMotion false 145000 145000
149000 Cam3_0 DetectMotion false 149000 149000
152000 Cam3_0 DetectMotion false 152000 152000
156000 Cam3_0 DetectMotion false 156000 156000
161000 Cam3_0 DetectMotion false 161000 161000
165000 Cam3_0 DetectMotion false 165000 165000
170000 Cam3_0 DetectMotion false 170000 170000
174000 Cam3_0 DetectMotion false 174000 174000
179000 Cam3_0 DetectMotion false 179000 179000
184000 Cam3_0 DetectMotion false 184000 184000
189000 Cam3_0 DetectMotion false 189000 189000
193000 Cam3_0 DetectMotion false 193000 193000
196000 Cam3_0 DetectMotion false 196000 196000
200000 Cam3_0 DetectMotion false 200000 200000
201000 Cam3_0 DetectMotion false 201000 201000
203000 Cam3_0 DetectMotion false 203000 203000
207000 Cam3_0 DetectMotion false 207000 207000
212000 Cam3_0 DetectMotion false 212000 212000
216000 Cam3_0 DetectMotion false 216000 216000
220000 Cam3_0 DetectMotion false 220000 220000
222000 Cam3_0 DetectMotion false 222000 222000
224000 Cam3_0 DetectMotion false 224000 224000

131000 Cam4_0 DetectMotion false 131000 131000
135000 Cam4_0 DetectMotion false 135000 135000
138000 Cam4_0 DetectMotion false 138000 138000
142000 Cam4_0 DetectMotion false 142000 142000
145000 Cam4_0 DetectMotion false 145000 145000
149000 Cam4_0 DetectMotion false 149000 149000
152000 Cam4_0 DetectMotion false 152000 152000
156000 Cam4_0 DetectMotion false 156000 156000
161000 Cam4_0 DetectMotion false 161000 161000
165000 Cam4_0 DetectMotion false 165000 165000
170000 Cam4_0 DetectMotion false 170000 170000
174000 Cam4_0 DetectMotion false 174000 174000
179000 Cam4_0 DetectMotion false 179000 179000
184000 Cam4_0 DetectMotion false 184000 184000
189000 Cam4_0 DetectMotion false 189000 189000
193000 Cam4_0 DetectMotion false 193000 193000
196000 Cam4_0 DetectMotion false 196000 196000
200000 Cam4_0 DetectMotion false 200000 200000
201000 Cam4_0 DetectMotion false 201000 201000
203000 Cam4_0 DetectMotion false 203000 203000
207000 Cam4_0 DetectMotion false 207000 207000
212000 Cam4_0 DetectMotion false 212000 212000
216000 Cam4_0 DetectMotion false 216000 216000
220000 Cam4_0 DetectMotion false 220000 220000
222000 Cam4_0 DetectMotion false 222000 222000
224000 Cam4_0 DetectMotion false 224000 224000

131000 Cam5_0 DetectMotion false 131000 131000
135000 Cam5_0 DetectMotion false 135000 135000
138000 Cam5_0 DetectMotion false 138000 138000
142000 Cam5_0 DetectMotion false 142000 142000
145000 Cam5_0 DetectMotion false 145000 145000
149000 Cam5_0 DetectMotion false 149000 149000
152000 Cam5_0 DetectMotion false 152000 152000
156000 Cam5_0 DetectMotion false 156000 156000
161000 Cam5_0 DetectMotion false 161000 161000
165000 Cam5_0 DetectMotion true 165000 165000
170000 Cam5_0 DetectMotion true 170000 170000
174000 Cam5_0 DetectMotion true 174000 174000
179000 Cam5_0 DetectMotion false 179000 179000
184000 Cam5_0 DetectMotion false 184000 184000
189000 Cam5_0 DetectMotion false 189000 189000
193000 Cam5_0 DetectMotion false 193000 193000
196000 Cam5_0 DetectMotion false 196000 196000
200000 Cam5_0 DetectMotion false 200000 200000
201000 Cam5_0 DetectMotion false 201000 201000
203000 Cam5_0 DetectMotion false 203000 203000
207000 Cam5_0 DetectMotion false 207000 207000
212000 Cam5_0 DetectMotion false 212000 212000
216000 Cam5_0 DetectMotion false 216000 216000
220000 Cam5_0 DetectMotion false 220000 220000
222000 Cam5_0 DetectMotion false 222000 222000
224000 Cam5_0 DetectMotion true 224000 224000
228000 Cam5_0 DetectMotion true 228000 228000
229000 Cam5_0 DetectMotion true 229000 229000
232000 Cam5_0 DetectMotion true 232000 232000
234000 Cam5_0 DetectMotion true 234000 234000
236000 Cam5_0 DetectMotion true 236000 236000
239000 Cam5_0 DetectMotion true 239000 239000
241000 Cam5_0 DetectMotion true 241000 241000
243000 Cam5_0 DetectMotion true 243000 243000
245000 Cam5_0 DetectMotion true 245000 245000
247000 Cam5_0 DetectMotion false 247000 247000
250000 Cam5_0 DetectMotion false 250000 250000
252000 Cam5_0 DetectMotion false 252000 252000
255000 Cam5_0 DetectMotion false 255000 255000
256000 Cam5_0 DetectMotion false 256000 256000
257000 Cam5_0 DetectMotion false 257000 257000
258000 Cam5_0 DetectMotion false 258000 258000
260000 Cam5_0 DetectMotion false 260000 260000
261000 Cam5_0 DetectMotion false 261000 261000
262000 Cam5_0 DetectMotion false 262000 262000
265000 Cam5_0 DetectMotion false 265000 265000
267000 Cam5_0 DetectMotion false 267000 267000
270000 Cam5_0 DetectMotion false 270000 270000
272000 Cam5_0 DetectMotion true 272000 272000
275000 Cam5_0 DetectMotion true 275000 275000

131000 Cam6_0 DetectMotion false 131000 131000
135000 Cam6_0 DetectMotion false 135000 135000
138000 Cam6_0 DetectMotion false 138000 138000
142000 Cam6_0 DetectMotion false 142000 142000
145000 Cam6_0 DetectMotion false 145000 145000
149000 Cam6_0 DetectMotion false 149000 149000
152000 Cam6_0 DetectMotion false 152000 152000
156000 Cam6_0 DetectMotion false 156000 156000
161000 Cam6_0 DetectMotion false 161000 161000
165000 Cam6_0 DetectMotion true 165000 165000
170000 Cam6_0 DetectMotion true 170000 170000
174000 Cam6_0 DetectMotion true 174000 174000
179000 Cam6_0 DetectMotion false 179000 179000
184000 Cam6_0 DetectMotion false 184000 184000
189000 Cam6_0 DetectMotion false 189000 189000
193000 Cam6_0 DetectMotion false 193000 193000
196000 Cam6_0 DetectMotion false 196000 196000
200000 Cam6_0 DetectMotion false 200000 200000
201000 Cam6_0 DetectMotion false 201000 201000
203000 Cam6_0 DetectMotion false 203000 203000
207000 Cam6_0 DetectMotion false 207000 207000
212000 Cam6_0 DetectMotion false 212000 212000
216000 Cam6_0 DetectMotion false 216000 216000
220000 Cam6_0 DetectMotion false 220000 220000
222000 Cam6_0 DetectMotion false 222000 222000
224000 Cam6_0 DetectMotion true 224000 224000
228000 Cam6_0 DetectMotion true 228000 228000
229000 Cam6_0 DetectMotion true 229000 229000
232000 Cam6_0 DetectMotion true 232000 232000
234000 Cam6_0 DetectMotion true 234000 234000
236000 Cam6_0 DetectMotion true 236000 236000
239000 Cam6_0 DetectMotion true 239000 239000
241000 Cam6_0 DetectMotion true 241000 241000
243000 Cam6_0 DetectMotion true 243000 243000
245000 Cam6_0 DetectMotion false 245000 245000
247000 Cam6_0 DetectMotion false 247000 247000
250000 Cam6_0 DetectMotion false 250000 250000
252000 Cam6_0 DetectMotion false 252000 252000
255000 Cam6_0 DetectMotion false 255000 255000
256000 Cam6_0 DetectMotion false 256000 256000
257000 Cam6_0 DetectMotion false 257000 257000
258000 Cam6_0 DetectMotion false 258000 258000
260000 Cam6_0 DetectMotion false 260000 260000
261000 Cam6_0 DetectMotion false 261000 261000
262000 Cam6_0 DetectMotion false 262000 262000
265000 Cam6_0 DetectMotion false 265000 265000
267000 Cam6_0 DetectMotion false 267000 267000
270000 Cam6_0 DetectMotion false 270000 270000
272000 Cam6_0 DetectMotion true 272000 272000
275000 Cam6_0 DetectMotion true 275000 275000

1000 Cam7_0 DetectMotion false 1000 1000
2000 Cam7_0 DetectMotion false 2000 2000
6000 Cam7_0 DetectMotion false 6000 6000
7000 Cam7_0 DetectMotion false 7000 7000
8000 Cam7_0 DetectMotion false 8000 8000
10000 Cam7_0 DetectMotion false 10000 10000
12000 Cam7_0 DetectMotion false 12000 12000
14000 Cam7_0 DetectMotion false 14000 14000
17000 Cam7_0 DetectMotion false 17000 17000
19000 Cam7_0 DetectMotion false 19000 19000
22000 Cam7_0 DetectMotion false 22000 22000
26000 Cam7_0 DetectMotion false 26000 26000
28000 Cam7_0 DetectMotion true 28000 28000
30000 Cam7_0 DetectMotion true 30000 30000
32000 Cam7_0 DetectMotion true 32000 32000
34000 Cam7_0 DetectMotion true 34000 34000
36000 Cam7_0 DetectMotion true 36000 36000
39000 Cam7_0 DetectMotion true 39000 39000
41000 Cam7_0 DetectMotion true 41000 41000
43000 Cam7_0 DetectMotion true 43000 43000
46000 Cam7_0 DetectMotion false 46000 46000
49000 Cam7_0 DetectMotion false 49000 49000
51000 Cam7_0 DetectMotion false 51000 51000
54000 Cam7_0 DetectMotion false 54000 54000
56000 Cam7_0 DetectMotion false 56000 56000
59000 Cam7_0 DetectMotion false 59000 59000
61000 Cam7_0 DetectMotion false 61000 61000
63000 Cam7_0 DetectMotion false 63000 63000
66000 Cam7_0 DetectMotion false 66000 66000
68000 Cam7_0 DetectMotion true 68000 68000
70000 Cam7_0 DetectMotion true 70000 70000
73000 Cam7_0 DetectMotion true 73000 73000
75000 Cam7_0 DetectMotion false 75000 75000
77000 Cam7_0 DetectMotion false 77000 77000
80000 Cam7_0 DetectMotion false 80000 80000
82000 Cam7_0 DetectMotion false 82000 82000
85000 Cam7_0 DetectMotion false 85000 85000
86000 Cam7_0 DetectMotion true 86000 86000
89000 Cam7_0 DetectMotion true 89000 89000
90000 Cam7_0 DetectMotion true 90000 90000
91000 Cam7_0 DetectMotion true 91000 91000
93000 Cam7_0 DetectMotion true 93000 93000
96000 Cam7_0 DetectMotion true 96000 96000
98000 Cam7_0 DetectMotion true 98000 98000
100000 Cam7_0 DetectMotion true 100000 100000
102000 Cam7_0 DetectMotion true 102000 102000
104000 Cam7_0 DetectMotion false 104000 104000
108000 Cam7_0 DetectMotion false 108000 108000
111000 Cam7_0 DetectMotion false 111000 111000
115000 Cam7_0 DetectMotion false 115000 115000
119000 Cam7_0 DetectMotion false 119000 119000
121000 Cam7_0 DetectMotion false 121000 121000
123000 Cam7_0 DetectMotion true 123000 123000
126000 Cam7_0 DetectMotion true 126000 126000
128000 Cam7_0 DetectMotion true 128000 128000

1000 Cam8_0 DetectMotion false 1000 1000
2000 Cam8_0 DetectMotion false 2000 2000
6000 Cam8_0 DetectMotion false 6000 6000
7000 Cam8_0 DetectMotion false 7000 7000
8000 Cam8_0 DetectMotion false 8000 8000
10000 Cam8_0 DetectMotion false 10000 10000
12000 Cam8_0 DetectMotion false 12000 12000
14000 Cam8_0 DetectMotion false 14000 14000
17000 Cam8_0 DetectMotion false 17000 17000
19000 Cam8_0 DetectMotion false 19000 19000
22000 Cam8_0 DetectMotion false 22000 22000
26000 Cam8_0 DetectMotion false 26000 26000
28000 Cam8_0 DetectMotion true 28000 28000
30000 Cam8_0 DetectMotion true 30000 30000
32000 Cam8_0 DetectMotion true 32000 32000
34000 Cam8_0 DetectMotion true 34000 34000
36000 Cam8_0 DetectMotion true 36000 36000
39000 Cam8_0 DetectMotion true 39000 39000
41000 Cam8_0 DetectMotion true 41000 41000
43000 Cam8_0 DetectMotion true 43000 43000
46000 Cam8_0 DetectMotion false 46000 46000
49000 Cam8_0 DetectMotion true 49000 49000
51000 Cam8_0 DetectMotion true 51000 51000
54000 Cam8_0 DetectMotion false 54000 54000
56000 Cam8_0 DetectMotion false 56000 56000
59000 Cam8_0 DetectMotion false 59000 59000
61000 Cam8_0 DetectMotion false 61000 61000
63000 Cam8_0 DetectMotion false 63000 63000
66000 Cam8_0 DetectMotion false 66000 66000
68000 Cam8_0 DetectMotion true 68000 68000
70000 Cam8_0 DetectMotion true 70000 70000
73000 Cam8_0 DetectMotion true 73000 73000
75000 Cam8_0 DetectMotion false 75000 75000
77000 Cam8_0 DetectMotion false 77000 77000
80000 Cam8_0 DetectMotion false 80000 80000
82000 Cam8_0 DetectMotion false 82000 82000
85000 Cam8_0 DetectMotion false 85000 85000
86000 Cam8_0 DetectMotion false 86000 86000
89000 Cam8_0 DetectMotion true 89000 89000
90000 Cam8_0 DetectMotion true 90000 90000
91000 Cam8_0 DetectMotion true 91000 91000
93000 Cam8_0 DetectMotion true 93000 93000
96000 Cam8_0 DetectMotion true 96000 96000
98000 Cam8_0 DetectMotion true 98000 98000
100000 Cam8_0 DetectMotion true 100000 100000
102000 Cam8_0 DetectMotion true 102000 102000
104000 Cam8_0 DetectMotion false 104000 104000
108000 Cam8_0 DetectMotion false 108000 108000
111000 Cam8_0 DetectMotion false 111000 111000
115000 Cam8_0 DetectMotion false 115000 115000
119000 Cam8_0 DetectMotion false 119000 119000
121000 Cam8_0 DetectMotion false 121000 121000
123000 Cam8_0 DetectMotion false 123000 123000
126000 Cam8_0 DetectMotion true 126000 126000
128000 Cam8_0 DetectMotion true 128000 128000
131000 Cam8_0 DetectMotion true 131000 131000
135000 Cam8_0 DetectMotion true 135000 135000
138000 Cam8_0 DetectMotion true 138000 138000
142000 Cam8_0 DetectMotion true 142000 142000
145000 Cam8_0 DetectMotion true 145000 145000
149000 Cam8_0 DetectMotion true 149000 149000
152000 Cam8_0 DetectMotion false 152000 152000
156000 Cam8_0 DetectMotion false 156000 156000
161000 Cam8_0 DetectMotion true 161000 161000
165000 Cam8_0 DetectMotion false 165000 165000
170000 Cam8_0 DetectMotion false 170000 170000
174000 Cam8_0 DetectMotion false 174000 174000
179000 Cam8_0 DetectMotion true 179000 179000
184000 Cam8_0 DetectMotion true 184000 184000
189000 Cam8_0 DetectMotion false 189000 189000
193000 Cam8_0 DetectMotion false 193000 193000
196000 Cam8_0 DetectMotion false 196000 196000
200000 Cam8_0 DetectMotion false 200000 200000
201000 Cam8_0 DetectMotion false 201000 201000
203000 Cam8_0 DetectMotion false 203000 203000
207000 Cam8_0 DetectMotion false 207000 207000
212000 Cam8_0 DetectMotion false 212000 212000
216000 Cam8_0 DetectMotion false 216000 216000
220000 Cam8_0 DetectMotion true 220000 220000
222000 Cam8_0 DetectMotion true 222000 222000
224000 Cam8_0 DetectMotion true 224000 224000
228000 Cam8_0 DetectMotion false 228000 228000
229000 Cam8_0 DetectMotion false 229000 229000
232000 Cam8_0 DetectMotion false 232000 232000
234000 Cam8_0 DetectMotion false 234000 234000
236000 Cam8_0 DetectMotion false 236000 236000
239000 Cam8_0 DetectMotion false 239000 239000
241000 Cam8_0 DetectMotion false 241000 241000
243000 Cam8_0 DetectMotion false 243000 243000
245000 Cam8_0 DetectMotion true 245000 245000
247000 Cam8_0 DetectMotion true 247000 247000
250000 Cam8_0 DetectMotion true 250000 250000
252000 Cam8_0 DetectMotion false 252000 252000
255000 Cam8_0 DetectMotion false 255000 255000
256000 Cam8_0 DetectMotion false 256000 256000
257000 Cam8_0 DetectMotion false 257000 257000
258000 Cam8_0 DetectMotion false 258000 258000
260000 Cam8_0 DetectMotion false 260000 260000
261000 Cam8_0 DetectMotion false 261000 261000
262000 Cam8_0 DetectMotion false 262000 262000
265000 Cam8_0 DetectMotion false 265000 265000
267000 Cam8_0 DetectMotion false 267000 267000
270000 Cam8_0 DetectMotion true 270000 270000
272000 Cam8_0 DetectMotion true 272000 272000
275000 Cam8_0 DetectMotion true 275000 275000

1000 Cam9_0 DetectMotion false 1000 1000
2000 Cam9_0 DetectMotion false 2000 2000
6000 Cam9_0 DetectMotion false 6000 6000
7000 Cam9_0 DetectMotion false 7000 7000
8000 Cam9_0 DetectMotion false 8000 8000
10000 Cam9_0 DetectMotion false 10000 10000
12000 Cam9_0 DetectMotion false 12000 12000
14000 Cam9_0 DetectMotion false 14000 14000
17000 Cam9_0 DetectMotion false 17000 17000
19000 Cam9_0 DetectMotion false 19000 19000
22000 Cam9_0 DetectMotion false 22000 22000
26000 Cam9_0 DetectMotion false 26000 26000
28000 Cam9_0 DetectMotion false 28000 28000
30000 Cam9_0 DetectMotion false 30000 30000
32000 Cam9_0 DetectMotion false 32000 32000
34000 Cam9_0 DetectMotion false 34000 34000
36000 Cam9_0 DetectMotion true 36000 36000
39000 Cam9_0 DetectMotion true 39000 39000
41000 Cam9_0 DetectMotion true 41000 41000
43000 Cam9_0 DetectMotion false 43000 43000
46000 Cam9_0 DetectMotion false 46000 46000
49000 Cam9_0 DetectMotion false 49000 49000
51000 Cam9_0 DetectMotion false 51000 51000
54000 Cam9_0 DetectMotion false 54000 54000
56000 Cam9_0 DetectMotion false 56000 56000
59000 Cam9_0 DetectMotion false 59000 59000
61000 Cam9_0 DetectMotion false 61000 61000
63000 Cam9_0 DetectMotion false 63000 63000
66000 Cam9_0 DetectMotion false 66000 66000
68000 Cam9_0 DetectMotion false 68000 68000
70000 Cam9_0 DetectMotion false 70000 70000
73000 Cam9_0 DetectMotion false 73000 73000
75000 Cam9_0 DetectMotion false 75000 75000
77000 Cam9_0 DetectMotion false 77000 77000
80000 Cam9_0 DetectMotion false 80000 80000
82000 Cam9_0 DetectMotion false 82000 82000
85000 Cam9_0 DetectMotion false 85000 85000
86000 Cam9_0 DetectMotion false 86000 86000
89000 Cam9_0 DetectMotion true 89000 89000
90000 Cam9_0 DetectMotion true 90000 90000
91000 Cam9_0 DetectMotion true 91000 91000
93000 Cam9_0 DetectMotion true 93000 93000
96000 Cam9_0 DetectMotion true 96000 96000
98000 Cam9_0 DetectMotion true 98000 98000
100000 Cam9_0 DetectMotion true 100000 100000
102000 Cam9_0 DetectMotion false 102000 102000
104000 Cam9_0 DetectMotion false 104000 104000
108000 Cam9_0 DetectMotion false 108000 108000
111000 Cam9_0 DetectMotion false 111000 111000
115000 Cam9_0 DetectMotion false 115000 115000
119000 Cam9_0 DetectMotion false 119000 119000
121000 Cam9_0 DetectMotion false 121000 121000
123000 Cam9_0 DetectMotion false 123000 123000
126000 Cam9_0 DetectMotion true 126000 126000
128000 Cam9_0 DetectMotion true 128000 128000
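
For reference, each record in the trace files above consists of six whitespace-separated fields: a timestamp in milliseconds, the sensor identifier, the capability that produced the reading, the sensed value, and two further timestamps that appear to mark the start and end of the interval the reading covers (both equal to the reading timestamp in these traces). The following minimal Python sketch parses such records under that reading of the format; the names Reading and parse_trace are illustrative helpers, not part of the CommonSens implementation.

from dataclasses import dataclass

@dataclass
class Reading:
    timestamp: int   # reading timestamp in milliseconds
    sensor_id: str   # e.g., "Cam1_0"
    capability: str  # e.g., "DetectMotion"
    value: bool      # the sensed value
    start: int       # apparent interval start (equals timestamp in these traces)
    end: int         # apparent interval end (equals timestamp in these traces)

def parse_trace(lines):
    """Parse records of the form:
    <timestamp> <sensor id> <capability> <value> <start> <end>."""
    readings = []
    for line in lines:
        fields = line.split()
        if len(fields) != 6:
            continue  # skip blank or malformed lines
        ts, sid, cap, val, start, end = fields
        readings.append(Reading(int(ts), sid, cap, val == "true",
                                int(start), int(end)))
    return readings

# Example: the first Cam1_0 record above.
print(parse_trace(["131000 Cam1_0 DetectMotion false 131000 131000"]))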

A.4 Trace Files from Cook and Schmitter-Edgecombe

The trace files are available from [mav]. The following trace file (adlnormal/p04.t1) matches the complex query:

2008-03-03 14:13:52.600541 M23 ON
2008-03-03 14:13:54.456505 M01 ON
2008-03-03 14:13:55.250537 M07 ON
2008-03-03 14:13:55.398526 M08 ON
2008-03-03 14:13:56.445215 M09 ON
2008-03-03 14:13:57.12047 M14 ON
2008-03-03 14:13:58.241716 M07 OFF
2008-03-03 14:13:58.363754 M01 OFF
2008-03-03 14:13:58.485715 M08 OFF
2008-03-03 14:13:58.873547 M23 OFF
2008-03-03 14:13:58.997577 M13 ON
2008-03-03 14:13:59.144543 M09 OFF
2008-03-03 14:14:01.33088 M14 OFF
2008-03-03 14:14:04.578001 I08 ABSENT
2008-03-03 14:14:09.120758 M13 OFF
2008-03-03 14:14:09.622629 M13 ON
2008-03-03 14:14:12.413874 M13 OFF
2008-03-03 14:14:13.99698 M13 ON
2008-03-03 14:14:14.487348 M13 OFF
2008-03-03 14:14:14.911205 M13 ON
2008-03-03 14:14:15.950932 M13 OFF
2008-03-03 14:14:19.244063 M13 ON
2008-03-03 14:14:20.561695 M13 OFF
2008-03-03 14:14:22.337199 M13 ON
2008-03-03 14:14:24.844515 M13 OFF
2008-03-03 14:14:33.514688 M13 ON
2008-03-03 14:14:35.631879 M13 OFF
2008-03-03 14:14:36.387969 M13 ON
2008-03-03 14:14:40.667864 M13 OFF
2008-03-03 14:14:41.618625 M13 ON
2008-03-03 14:14:42.63431 M13 OFF
2008-03-03 14:14:46.396516 M13 ON
2008-03-03 14:14:55.316742 M13 OFF
2008-03-03 14:14:55.434818 M13 ON
2008-03-03 14:14:56.446848 M13 OFF
2008-03-03 14:14:57.306908 M13 ON
2008-03-03 14:15:00 asterisk START
2008-03-03 14:15:00.476208 M13 OFF
2008-03-03 14:15:03.787079 M13 ON
2008-03-03 14:15:08.595738 M13 OFF
2008-03-03 14:15:18.95146 M13 ON
2008-03-03 14:15:19.568756 M13 OFF
2008-03-03 14:15:44.5496 M13 ON
2008-03-03 14:15:47 asterisk END
2008-03-03 14:15:52.554399 I08 PRESENT
2008-03-03 14:15:57.127844 M13 OFF
2008-03-03 14:15:57.712722 M13 ON

The following trace file (adlerror/p20.t1) does not match the complex query:

2008-04-04 12:31:59.786012 M07 ON
2008-04-04 12:32:00.578721 M09 ON
2008-04-04 12:32:01.441472 M14 ON
2008-04-04 12:32:01.817373 M23 OFF
2008-04-04 12:32:02.585161 M07 OFF
2008-04-04 12:32:02.747191 M01 OFF
2008-04-04 12:32:03.562919 M13 ON
2008-04-04 12:32:03.562919 M08 OFF
2008-04-04 12:32:03.873881 M09 OFF
2008-04-04 12:32:06.544226 M14 OFF
2008-04-04 12:32:06.865179 M14 ON
2008-04-04 12:32:07.888211 M14 OFF
2008-04-04 12:32:10.45401 I08 ABSENT
2008-04-04 12:32:15.606277 M13 OFF
2008-04-04 12:32:16.385045 M13 ON
2008-04-04 12:32:18.664438 M13 OFF
2008-04-04 12:32:20.495014 M13 ON
2008-04-04 12:32:23.804 M13 OFF
2008-04-04 12:32:23.94962 M13 ON
2008-04-04 12:32:24.966414 M13 OFF
2008-04-04 12:32:26.293044 M13 ON
2008-04-04 12:32:27.460747 M13 OFF
2008-04-04 12:32:32.432286 M13 ON
2008-04-04 12:32:40.164729 M13 OFF
2008-04-04 12:32:44.595454 M13 ON
2008-04-04 12:32:45.596171 M13 OFF
2008-04-04 12:32:53.832791 M13 ON
2008-04-04 12:33:01.91064 M13 OFF
2008-04-04 12:33:05.221913 M13 ON
2008-04-04 12:33:15 asterisk START
2008-04-04 12:33:17.888646 M13 OFF
2008-04-04 12:33:39.866439 M13 ON
2008-04-04 12:33:46.72581 M13 OFF
2008-04-04 12:34:08.509267 M13 ON
2008-04-04 12:34:13.129167 M13 OFF
2008-04-04 12:34:13.448077 M13 ON
2008-04-04 12:34:15.968381 M13 OFF
2008-04-04 12:34:31.511776 M13 ON
2008-04-04 12:34:32 asterisk END
2008-04-04 12:34:38.307348 M13 OFF
2008-04-04 12:34:55.500391 M13 ON
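
Each record in these traces consists of four whitespace-separated fields: a date, a time with optional fractional seconds, a sensor identifier (motion sensors such as M13, the item sensor I08, or the literal asterisk, whose START and END lines appear to delimit the annotated activity), and a state. The following minimal Python sketch reads such records under those assumptions; parse_cook_trace is an illustrative helper, not part of the dataset's tooling.

from datetime import datetime

def parse_cook_trace(lines):
    """Parse records of the form: <date> <time> <sensor id> <state>."""
    events = []
    for line in lines:
        fields = line.split()
        if len(fields) != 4:
            continue  # skip blank or malformed lines
        date, time, sensor, state = fields
        # Fractional seconds are optional, cf. the asterisk START/END lines.
        fmt = "%Y-%m-%d %H:%M:%S.%f" if "." in time else "%Y-%m-%d %H:%M:%S"
        events.append((datetime.strptime(date + " " + time, fmt), sensor, state))
    return events

# Example: the first record of adlerror/p20.t1 above.
print(parse_cook_trace(["2008-04-04 12:31:59.786012 M07 ON"]))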

That’s all, folks!
