
Page 1: Kalman Filtering, Sensor Fusion, and Eye Tracking

Kalman Filtering, Sensor Fusion, and Eye Tracking

Pramod P. Khargonekar
EECS Department
UC Irvine

ECCV OpenEyes Workshop
August 23, 2020

Page 2: Kalman Filtering, Sensor Fusion, and Eye Tracking

Outline

• Introduction to Kalman Filtering

• Multi-sensor, multi-rate fusion

• Distributed/decentralized Kalman filtering

• Thoughts on potential for eye tracking

Page 3: Kalman Filtering, Sensor Fusion, and Eye Tracking

R. E. Kalman 1930-2016

Page 4: Kalman Filtering, Sensor Fusion, and Eye Tracking

Kalman Filtering: Linear System + Gaussian Noise

• Linear time-varying discrete-time system, T sampling period
• State variable x
• Known input u (= 0 for this talk)
• Process noise w
• Measurement noise v
• We will simplify by assuming that the known input is 0

x((k+1)T) = A(k) x(kT) + B(k) u(kT) + w(kT),   x(0) = x_0

y(kT) = C(k) x(kT) + D(k) u(kT) + v(kT)

E[x_0 x_0^T] = P(0),   E[w(kT) w(kT)^T] = Q(k),   E[v(kT) v(kT)^T] = R(k)

Dynamic System Model
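As a purely illustrative sketch, the model above can be simulated in a few lines of Python/NumPy. The matrices A, C, Q, R, the dimensions, and the sampling period below are assumptions made for the example, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2-state example (values chosen for the sketch, not from the talk):
T = 0.01                          # sampling period
A = np.array([[1.0, T],
              [0.0, 1.0]])        # A(k), taken time-invariant here
C = np.array([[1.0, 0.0]])        # C(k): observe only the first state
Q = 1e-4 * np.eye(2)              # E[w(kT) w(kT)^T] = Q
R = np.array([[1e-2]])            # E[v(kT) v(kT)^T] = R

x = np.array([0.0, 1.0])          # x(0) = x0; known input u taken as 0
trajectory, measurements = [], []
for k in range(100):
    v = rng.multivariate_normal(np.zeros(1), R)   # measurement noise v(kT)
    w = rng.multivariate_normal(np.zeros(2), Q)   # process noise w(kT)
    measurements.append(C @ x + v)                # y(kT) = C x(kT) + v(kT)
    trajectory.append(x)
    x = A @ x + w                                 # x((k+1)T) = A x(kT) + w(kT)
```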

Page 5: Kalman Filtering, Sensor Fusion, and Eye Tracking

Kalman Filter

Optimal estimate of the state consists of a time update and a measurement update:

x̂((k+1)T|k) = A(k) x̂(kT|k)

x̂(kT|k) = x̂(kT|k-1) + L(k) [ y(kT) - C(k) x̂(kT|k-1) ]

Kalman gain L and estimation error covariances evolve according to:

P^{-1}(k|k) = P^{-1}(k|k-1) + C(k)^T R(k)^{-1} C(k)

L(k) = P(k|k) C(k)^T R(k)^{-1}

P(k+1|k) = A(k) P(k|k) A(k)^T + Q(k)
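A minimal NumPy sketch of one filter cycle, written from the equations above. It uses the equivalent innovation-covariance form of the gain, L(k) = P(k|k-1) C^T (C P(k|k-1) C^T + R)^{-1}, and is meant as an illustration rather than a numerically robust implementation.

```python
import numpy as np

def kalman_step(x_pred, P_pred, y, A, C, Q, R):
    """One cycle: measurement update at step k, then time update to k+1.

    x_pred, P_pred correspond to x̂(kT|k-1) and P(k|k-1).
    Sketch only; a production filter would use a square-root or Joseph form.
    """
    # Measurement update
    S = C @ P_pred @ C.T + R                              # innovation covariance
    L = P_pred @ C.T @ np.linalg.inv(S)                   # Kalman gain L(k)
    x_filt = x_pred + L @ (y - C @ x_pred)                # x̂(kT|k)
    P_filt = (np.eye(P_pred.shape[0]) - L @ C) @ P_pred   # P(k|k)

    # Time update
    x_next = A @ x_filt                                   # x̂((k+1)T|k)
    P_next = A @ P_filt @ A.T + Q                         # P(k+1|k)
    return x_next, P_next, x_filt, P_filt
```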

Page 6: Kalman Filtering, Sensor Fusion, and Eye Tracking

Rich Heritage from Kalman

• The most important and impactful result from control theory, and one of the most important in engineering

• Equivalent to Bayes' theorem (under the specified conditions)

• Extensions to continuous-time, nonlinear, non-Gaussian, distributed, networked, …

• Real-time, on-line, recursive algorithms and implementations

• Computationally efficient implementations

• Applications in many fields of engineering and sciences (including machine vision)

• Huge literature

Page 7: Kalman Filtering, Sensor Fusion, and Eye Tracking

Multi-Sensor Data Fusion

• Combine data from multiple, disparate sensors to arrive at a unified estimate of the unknown system/signal

• Wide variety of techniques to address disparate challenges related to the system, the sensors, and the data characteristics: probabilistic, Dempster-Shafer, fuzzy, …

• Many application domains:

Ø Aerospace, naval, and defense applications
Ø Sensor networks, IoT
Ø Digital twins, manufacturing, monitoring, …
Ø Robotics
Ø Medical
Ø Physical sciences: weather, environment, air, water, …

Page 8: Kalman Filtering, Sensor Fusion, and Eye Tracking

Kalman Filtering and Multi-Sensor Data Fusion

“Nonetheless, the Kalman filter is one of the most popular fusion methods mainly due to its simplicity, ease of implementation, and optimality in a mean-squared error sense. It is a very well-established data fusion method whose properties are deeply studied and examined both theoretically and in practical applications. On the other hand, similar to other least-square estimators, the Kalman filter is very sensitive to data corrupted with outliers. Furthermore, the Kalman filter is inappropriate for applications whose error characteristics are not readily parameterized.”

Khaleghi et al., Multisensor Data Fusion, Information Fusion (2011)

Page 9: Kalman Filtering, Sensor Fusion, and Eye Tracking

Related Techniques and Extensions

• Extended Kalman filter (EKF)

• Unscented Kalman filter (UKF)

• Particle filters (Monte Carlo algorithms)

• Other related methods

Page 10: Kalman Filtering, Sensor Fusion, and Eye Tracking

Multi-Rate Kalman Filtering

• Setting: Multiple sensors operating at different sampling rates

• Sensor 1 sampling time T1 = NT

• Sensor 2 sampling time T2 = MT

• …

• Sensor N sampling time …

• Kalman filter equations generalize to this case fairly easily, although the notation can become quite complicated (a minimal sketch follows below)
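One common way to realize this in code is to run the filter at the base period T and apply each sensor's measurement update only at the steps where that sensor actually delivers a sample. The sketch below assumes sensor i samples every N_i base steps and that its samples are stored in a list; the names and the data layout are assumptions made for the illustration.

```python
import numpy as np

def multirate_filter(x, P, A, Q, sensors, num_steps):
    """Multi-rate sketch: base-rate time updates, per-sensor measurement updates
    only when a sample is available.

    `sensors` is an assumed list of dicts {'C': ..., 'R': ..., 'every': N_i, 'data': [...]};
    sensor i delivers data[k // N_i] at steps k divisible by N_i. Illustrative only."""
    for k in range(num_steps):
        for s in sensors:
            if k % s['every'] == 0:                       # sensor has a sample at step k
                y = s['data'][k // s['every']]
                S = s['C'] @ P @ s['C'].T + s['R']
                L = P @ s['C'].T @ np.linalg.inv(S)
                x = x + L @ (y - s['C'] @ x)
                P = (np.eye(len(x)) - L @ s['C']) @ P
        x = A @ x                                         # time update at period T
        P = A @ P @ A.T + Q
    return x, P
```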

Page 11: Kalman Filtering, Sensor Fusion, and Eye Tracking

Lifting Approach

Start with a discrete-time sequence z(0), z(1), … z(N), …

Z_N = Z_N(0) = [ z(0); z(1); … ; z(N-1) ],   Z_N(1) = [ z(N); z(N+1); … ; z(2N-1) ],   …

A multi-rate system can be rewritten as a standard discrete-time system using the lifted representation of the various signals.
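A small sketch of the lifting operation itself, in Python; the function name and the choice to drop an incomplete final block are assumptions for the example.

```python
import numpy as np

def lift(z, N):
    """Group a sequence z(0), z(1), ... into lifted blocks
    Z_N(j) = [z(jN); ...; z((j+1)N - 1)]; a trailing incomplete block is dropped."""
    z = np.asarray(z)
    return [z[j * N:(j + 1) * N] for j in range(len(z) // N)]

# Example: with N = 3, z(0..8) becomes Z_3(0), Z_3(1), Z_3(2)
blocks = lift(np.arange(9), 3)   # [array([0, 1, 2]), array([3, 4, 5]), array([6, 7, 8])]
```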

There are many papers in the control systems literature on such lifting-based approaches.

Page 12: Kalman Filtering, Sensor Fusion, and Eye Tracking
Page 13: Kalman Filtering, Sensor Fusion, and Eye Tracking

Distributed/Decentralized Kalman Filtering

1976 IEEE Conference on Decision and Control

Page 14: Kalman Filtering, Sensor Fusion, and Eye Tracking

Distributed Kalman Filtering and Sensor Fusion

Page 15: Kalman Filtering, Sensor Fusion, and Eye Tracking

Key Ideas

Suppose we have J different sensors described by:

y_1(kT) = C_1(k) x(kT) + v_1(kT)

y_2(kT) = C_2(k) x(kT) + v_2(kT)

. . .

y_J(kT) = C_J(k) x(kT) + v_J(kT)

Centralized optimal solution: combine these measurements and solve the resulting KF problem:

y(kT) = [ y_1(kT); y_2(kT); … ; y_J(kT) ],   v(kT) = [ v_1(kT); v_2(kT); … ; v_J(kT) ]
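In code, the centralized solution amounts to stacking the sensor models and feeding the result to a single standard Kalman filter. The sketch below assumes the sensor noises v_i are mutually uncorrelated, so the stacked noise covariance is block diagonal; that independence assumption is mine for the example.

```python
import numpy as np
from scipy.linalg import block_diag

def stack_sensors(C_list, R_list, y_list):
    """Build the centralized measurement model y = C x + v from J sensors.

    Assumes uncorrelated sensor noises, so R = blockdiag(R_1, ..., R_J);
    the stacked (y, C, R) then feed one standard KF measurement update."""
    C = np.vstack(C_list)            # C = [C_1; C_2; ...; C_J]
    R = block_diag(*R_list)          # R = diag(R_1, ..., R_J)
    y = np.concatenate(y_list)       # y = [y_1; y_2; ...; y_J]
    return y, C, R
```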

Page 16: Kalman Filtering, Sensor Fusion, and Eye Tracking

Local Kalman Filters and Fusion of Results

• Create a Kalman Filter at each sensor and generate state estimates and covariance matrices

• Variety of algorithms for exchanging information between sensors with or without a central processor

• Analytical results on the performance of the resulting estimates and convergence to the centralized estimator

• Voluminous literature on these themes

Page 17: Kalman Filtering, Sensor Fusion, and Eye Tracking

Example: Fusion without Feedback

Hashemipour et al, 1988

P^{-1}(k|k) = P^{-1}(k|k-1) + Σ_{i=1}^{J} [ P_i^{-1}(k|k) - P_i^{-1}(k|k-1) ]

P^{-1}(k|k) x̂(kT|k) = P^{-1}(k|k-1) x̂(kT|k-1) + Σ_{i=1}^{J} [ P_i^{-1}(k|k) x̂_i(kT|k) - P_i^{-1}(k|k-1) x̂_i(kT|k-1) ]
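A direct transcription of these fusion equations into NumPy. The argument layout (central predicted pair plus per-sensor filtered/predicted pairs) is an assumption for the sketch, and explicit matrix inverses are used only for readability.

```python
import numpy as np

def fuse_without_feedback(x_pred, P_pred, local_results):
    """Fuse local-filter outputs in information (inverse-covariance) form.

    x_pred, P_pred : central x̂(kT|k-1), P(k|k-1)
    local_results  : list of (x_i_filt, P_i_filt, x_i_pred, P_i_pred) from the J local KFs
    Returns the fused x̂(kT|k), P(k|k)."""
    info_mat = np.linalg.inv(P_pred)
    info_vec = info_mat @ x_pred
    for x_i, P_i, x_i_pred, P_i_pred in local_results:
        info_mat += np.linalg.inv(P_i) - np.linalg.inv(P_i_pred)
        info_vec += np.linalg.inv(P_i) @ x_i - np.linalg.inv(P_i_pred) @ x_i_pred
    P_fused = np.linalg.inv(info_mat)
    return P_fused @ info_vec, P_fused
```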

Page 18: Kalman Filtering, Sensor Fusion, and Eye Tracking

Combining Estimates in a Sensor Network using Consensus Algorithms

Olfati-Saber, 2005

Page 19: Kalman Filtering, Sensor Fusion, and Eye Tracking

“The first published account of the use of a Kalman filter in the context of VR appears to be Rebo’s master’s thesis (Rebo, 1988).”

“Information and associated databases will be organized by physical location and time, allowing users to both store and retrieve past, present, and future information in the context of physical locality and direction of gaze. The Kalman filter will undoubtedly play a role in this vision, no matter what the underlying sources of signals.”

Page 20: Kalman Filtering, Sensor Fusion, and Eye Tracking

Control theory (Kalman filtering) is heavily based on mathematical models of the dynamics and observation processes.

Newer machine learning methods avoid such models and rely on data and learning algorithms.

How can we combine model-based approaches with machine learning approaches?

Page 21: Kalman Filtering, Sensor Fusion, and Eye Tracking

Recent Example

Page 22: Kalman Filtering, Sensor Fusion, and Eye Tracking

Rambach et al, 2016

Page 23: Kalman Filtering, Sensor Fusion, and Eye Tracking

Results

Rambach et al, 2016

Page 24: Kalman Filtering, Sensor Fusion, and Eye Tracking

Eye Tracking Issues

• Eye movements

Ø Fixations

Ø Saccades

Ø Dynamic Stimuli: Smooth Pursuits

• Pupil detection

• Tracker calibration

• Slippage or calibration drift

Santini et al., 2019 Symposium on Eye Tracking Research and Applications

Page 25: Kalman Filtering, Sensor Fusion, and Eye Tracking

Use of KF in eye tracking goes back to Sauter et al, 1991

Page 26: Kalman Filtering, Sensor Fusion, and Eye Tracking

A Few Publications in the Control Theory Literature

Page 27: Kalman Filtering, Sensor Fusion, and Eye Tracking
Page 28: Kalman Filtering, Sensor Fusion, and Eye Tracking

Control Theory has Potential to Provide Useful Tools

Main Idea: Incorporate sensor drift via a state variable in the system dynamics model and use variants of KF for estimation

“major remaining challenge hindering a wider adoption of ubiquitous eye-tracking seems to be device slippage.”
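A sketch of the main idea above: append a slippage/drift term to the state so that a standard KF estimates and compensates it. The random-walk drift model, all dimensions, and the q_drift value here are illustrative assumptions, not a method taken from the cited work.

```python
import numpy as np

def augment_with_drift(A, C, Q, q_drift=1e-6):
    """Append a slowly varying drift d (one component per measurement) to the state:
    d((k+1)T) = d(kT) + drift noise,   y(kT) = C x(kT) + d(kT) + v(kT).
    q_drift (assumed value) sets how fast the filter believes the slippage can change."""
    n, m = A.shape[0], C.shape[0]
    A_aug = np.block([[A,                np.zeros((n, m))],
                      [np.zeros((m, n)), np.eye(m)       ]])   # drift follows a random walk
    C_aug = np.hstack([C, np.eye(m)])                           # measurement offset by drift
    Q_aug = np.block([[Q,                np.zeros((n, m))],
                      [np.zeros((m, n)), q_drift * np.eye(m)]])
    return A_aug, C_aug, Q_aug
```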

Page 29: Kalman Filtering, Sensor Fusion, and Eye Tracking

Conclusions

• Rich body of literature on Kalman filtering and myriad extensions

• Rich body of literature on multi-rate, multi-sensor fusion leveraging KF

• Combination with newer ML techniques to leverage their strengths

• Potential for application to eye and gaze tracking problems

Page 30: Kalman Filtering, Sensor Fusion, and Eye Tracking

Thank you!

[email protected]

http://faculty.sites.uci.edu/khargonekar/