Ivy Zhu, Research Scientist, Intel at MLconf SEA - 5/01/15


Model-based machine learning for real-time brain decoding

Ivy Zhu

Intel Labs


Why bother?


Functional MRI (fMRI)


(Figure: metabolic brain vs. anatomical brain images.)

• Non-invasive observation
• Observation-based inference

Brain Image Analysis/Decoding


• Huge amount of data
  • 1 volume per scan period (1-2 s)
  • 100K-150K voxels per volume
  • 100s-1000s of scans per experiment

• Sophisticated preprocessing is needed to denoise
  • Thermal and system noise from the scanner hardware
  • Physiological processes: head motion, respiration, heartbeat, etc.
  • Neuronal activity related to non-task-related brain processes

• Prone to overfitting: typically the number of observations is smaller than the number of features
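A back-of-envelope calculation of the data volume these figures imply, assuming single-precision (4-byte) voxel values, an assumption not stated in the talk:

```python
# Rough fMRI data-volume estimate from the figures on this slide.
# Assumes 4-byte (single-precision) floats, which is an assumption.
voxels_per_volume = 150_000       # upper end of 100K-150K
scans_per_experiment = 1_000      # upper end of 100s-1000s
bytes_per_voxel = 4

total_bytes = voxels_per_volume * scans_per_experiment * bytes_per_voxel
print(f"{total_bytes / 1e9:.1f} GB per experiment")  # 0.6 GB per experiment
```

Even a single experiment approaches a gigabyte, and the number of features (voxels) dwarfs the number of observations (scans).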


General Linear Model (GLM)

(Figure: the standard SPM pipeline. The image time-series is realigned, normalised to a template, and smoothed with a kernel to denoise; the general linear model with design matrix S_m then yields a statistical parametric map (SPM) via statistical inference.)

The GLM models each voxel's time-series as stimulus regressors convolved with a response, plus noise:

    Y = ( Σ_m h_m ∗ S_m ) + ε,    h_m = Σ_i b_i β_{m,i}

where ∗ denotes convolution and h_m is a linear combination of basis functions b_i: the Haemodynamic Response Function (HRF) and its partial derivatives.
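As a sketch of how such a GLM is fit in practice, the following builds a toy design matrix from a boxcar stimulus convolved with an illustrative gamma-shaped HRF, then recovers β by ordinary least squares; all shapes, values, and the HRF form are synthetic, not taken from the talk:

```python
import numpy as np

# Minimal GLM sketch: convolve a boxcar stimulus with a toy HRF to
# build the design matrix X, then solve Y = X @ beta + eps by OLS.
rng = np.random.default_rng(0)

T = 200                                   # number of scans
stim = np.zeros(T)
stim[20:30] = stim[100:110] = 1.0         # two task blocks

t = np.arange(15)
hrf = (t / 4.0) * np.exp(-t / 4.0)        # illustrative gamma-like HRF
regressor = np.convolve(stim, hrf)[:T]    # stimulus convolved with HRF

X = np.column_stack([regressor, np.ones(T)])   # task regressor + intercept
true_beta = np.array([2.0, 0.5])
Y = X @ true_beta + 0.1 * rng.standard_normal(T)

beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_hat)                           # close to [2.0, 0.5]
```

In a real analysis, X would hold one convolved regressor per condition (plus HRF-derivative columns), and the fitted β maps feed the statistical inference that produces the SPM.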


Voxels are not independent.

Haxby et al. (2001), Science


Brain networks are complicated and dynamic.

Turk-Browne, N.B. (2013) Functional interactions as big data in the human brain. Science 342, 580-584.


Can we have a model that describes local and global spatial dependencies, as well as dynamic brain networks?


Topographic Factor Analysis (TFA)

Manning JR, Ranganath R, Norman KA, Blei DM (2014) Topographic Factor Analysis: A Bayesian Model for Inferring Brain Networks from Neural Data. PLoS ONE 9(5): e94914. doi:10.1371/journal.pone.0094914


TFA Matrix Representation

(Figure: the TFA matrix decomposition. The factor matrix captures local spatial dependencies; the weight matrix captures global dependencies, i.e., brain networks.)

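The decomposition on this slide can be sketched as Y ≈ W F where, following Manning et al. (2014), each factor image is a radial basis function over voxel locations; all shapes and values below are illustrative:

```python
import numpy as np

# Sketch of the TFA decomposition Y ~= W @ F: each factor (row of F)
# is a radial basis function over voxel coordinates, parameterized by
# a center and a width. The closed form ties nearby voxels together
# (local spatial dependency); the weights W link factors across time
# (global dependencies / networks). All sizes are illustrative.
rng = np.random.default_rng(0)

V, T, K = 500, 50, 3                       # voxels, timepoints, factors
R = rng.uniform(0, 1, size=(V, 3))         # 3-D voxel coordinates

centers = rng.uniform(0, 1, size=(K, 3))   # factor centers (mu)
widths = np.full(K, 0.1)                   # factor widths

# Closed-form RBF factor images: F[k, v] = exp(-||r_v - mu_k||^2 / width_k)
d2 = ((R[None, :, :] - centers[:, None, :]) ** 2).sum(-1)  # (K, V)
F = np.exp(-d2 / widths[:, None])

W = rng.standard_normal((T, K))            # per-timepoint factor weights
Y = W @ F                                  # reconstructed image time-series
print(Y.shape)                             # (50, 500)
```

Because each factor is described by only a center and a width, the model is far more compact than a free V-dimensional component, and correlations among the columns of W describe the network structure.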

TFA discovers latent factors.



TFA discovers brain networks.



How can we discover factors common amongst humans while preserving key individual differences?


Hierarchical Topographic Factor Analysis (HTFA)

Manning JR, Stachenfeld K, Ranganath R, Turk-Browne N, Norman KA, Blei DM. A probabilistic approach to full-brain functional connectivity. Submitted to PNAS.


Graphical Model for HTFA


Variables: subjects, trials, V voxels; y: observed voxel activations; latent factors parameterized by centers (µ) and widths; factor weights. Global factors are shared across subjects, while the per-subject level captures individual differences.


HTFA Inference Algorithm

while global template not converged and nIter < maxOuterIter do
    for subject = 1 to S do
        while individual factors not converged and mIter < maxInnerIter do
            Estimate new weight matrix based on existing centers/widths
            Estimate new centers/widths based on existing weights
            mIter++
        end
        Update global template based on subject's new centers/widths
    end
    nIter++
end

for subject = 1 to S do
    Update weight matrix based on converged global template
end
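The loop above can be sketched in Python. The real HTFA steps (nonlinear least squares on factor centers/widths, typically with constraints) are stubbed here as plain least-squares updates on factor images, purely to show the control flow; all names, sizes, and tolerances are illustrative:

```python
import numpy as np

# Toy skeleton of the HTFA block-coordinate loop: an outer loop over
# the shared global template, an inner per-subject alternation between
# weight and factor updates, and a final weight pass against the
# converged template. Both update steps are stubbed as ordinary least
# squares here; real HTFA fits centers/widths, not full factor images.
rng = np.random.default_rng(0)
n_subjects, T, V, K = 3, 40, 100, 5
data = [rng.standard_normal((T, V)) for _ in range(n_subjects)]

global_template = rng.standard_normal((K, V))    # shared factor images
max_outer, max_inner, tol = 5, 3, 1e-6

for n_iter in range(max_outer):
    prev_template = global_template.copy()
    subject_factors = []
    for Y in data:
        F = global_template.copy()               # start from the template
        for _ in range(max_inner):
            # weights given factors, then factors given weights
            W = np.linalg.lstsq(F.T, Y.T, rcond=None)[0].T   # (T, K)
            F = np.linalg.lstsq(W, Y, rcond=None)[0]         # (K, V)
        subject_factors.append(F)
    # update the global template from the subjects' new factors
    global_template = np.mean(subject_factors, axis=0)
    if np.linalg.norm(global_template - prev_template) < tol:
        break

# final pass: per-subject weights against the converged global template
weights = [np.linalg.lstsq(global_template.T, Y.T, rcond=None)[0].T
           for Y in data]
print(weights[0].shape)                          # (40, 5)
```

The nesting mirrors the pseudocode: the inner alternation is cheap per subject, while the outer loop slowly pulls every subject's factors toward a common template.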


In essence, TFA/HTFA is a type of factor analysis. How does it compare with other factor analyses?


TFA/HTFA vs PCA vs ICA

• Commonality
  • All decompose observed brain images into a weighted sum of components
• Differences
  • PCA and ICA emphasize the orthogonality or independence of components; they cannot capture dynamic brain networks
  • TFA/HTFA relax the orthogonality/independence requirement and, with a closed-form factor function, can discover richer information from brain images:
    • local dependencies
    • global dependencies
    • dynamic brain networks
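A quick numerical check of the constraint that PCA imposes and TFA/HTFA relax: components returned by an SVD-based PCA are mutually orthogonal by construction (the data here is random, for illustration only):

```python
import numpy as np

# PCA components are forced to be orthogonal: the rows of Vt from the
# SVD of the centered data are orthonormal, so their Gram matrix is
# the identity. TFA/HTFA drop this constraint in favor of a spatial
# closed form for each factor.
rng = np.random.default_rng(0)
Y = rng.standard_normal((100, 50))         # timepoints x voxels
Yc = Y - Y.mean(axis=0)                    # center before PCA

_, _, components = np.linalg.svd(Yc, full_matrices=False)
gram = components @ components.T           # pairwise inner products

off_diag = gram - np.diag(np.diag(gram))
print(np.abs(off_diag).max())              # effectively zero
```

ICA imposes statistical independence instead of orthogonality, but the effect is similar: the component shapes are dictated by the constraint rather than by an explicit spatial model.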


How can we bring HTFA into reality?


Intel-Princeton Collaboration


Bringing HTFA to Reality

Two initiatives:

• Reduce the reconstruction error with a small number of factors (K < 10) to below 5%
• Reduce the overall execution time of a key case study (10 subjects, 10 sources, 200 images/subject) to under 5 minutes


HTFA reconstruction error was …

• More optimization is needed when the number of factors is small
• Results are quite good when the number of factors is large


HTFA reconstruction error is smaller.

(Figure: global centers (x, y) before optimization vs. after optimization.)

HTFA reconstruction error is smaller.

(Figure: true connectivity vs. estimated connectivity before and after optimization, shown as 5×5 factor-by-factor matrices.)

Methods for Speeding up HTFA

• Used Intel Math Kernel Library (MKL) where appropriate, e.g., single/double-precision nonlinear least-squares solvers with and without constraints
• Used thread-level parallelism
• Optimized matrix operation order to better utilize cache locality
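A small sketch of the last point: matrix multiplication is associative, but the grouping changes the arithmetic and the size of intermediates, which drives cache behavior. Sizes and the FLOP model below are illustrative, not taken from the talk:

```python
import numpy as np

# For W (T x K), F (K x V), and a vector x (V x 1):
#   (W @ F) @ x  costs about T*K*V + T*V multiply-adds and
#                materializes a T x V intermediate
#   W @ (F @ x)  costs about K*V + T*K multiply-adds with only
#                a K x 1 intermediate
T, K, V = 200, 10, 10_000

cost_left = T * K * V + T * V       # (W @ F) @ x
cost_right = K * V + T * K          # W @ (F @ x)
print(cost_left / cost_right)       # > 200x more work for the left grouping

rng = np.random.default_rng(0)
W = rng.standard_normal((T, K))
F = rng.standard_normal((K, V))
x = rng.standard_normal((V, 1))
assert np.allclose((W @ F) @ x, W @ (F @ x))  # same result either way
```

With V in the 100K-voxel range, choosing the grouping that avoids the large intermediate matters both for FLOPs and for keeping working sets in cache.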


HTFA Speedup Results

(Chart: normalized execution time before vs. after optimization for three raw-data configurations, labeled by #factors, #subjects, #img/subject.)

3X to 10X speedup after optimization


Recap

Real-time brain decoding can save lives!

Bayesian model-based HTFA is promising for decoding real-time fMRI data

Intel is working with Princeton to bring real-time full-brain decoding closer to reality
