Automated Web-Based Behavioral Test for Early Detection of Alzheimer’s Disease


Automated Web-Based Behavioral Test for Early Detection of Alzheimer’s Disease

Eugene Agichtein*, Elizabeth Buffalo, Dmitry Lagun, Allan Levey, Cecelia Manzanares, JongHo Shin, Stuart Zola

Intelligent Information Access Lab
Emory University

2

Emory IR Lab: Research Directions

• Modeling collaborative content creation for information organization and indexing.

• Mining search behavior data to improve information finding.

• Medical applications of search, NLP, and behavior modeling.

3

Mild Cognitive Impairment (MCI) and Alzheimer’s Disease

• Alzheimer’s disease (AD) affects more than 5M Americans, a number expected to grow in the coming decade

• Memory impairment (amnestic MCI, or aMCI) indicates the onset of AD, which affects the hippocampus first

• The Visual Paired Comparison (VPC) task is promising for early diagnosis of both MCI and AD before they are detectable by other means

4

VPC: Familiarization Phase

5

VPC: Delay Phase

Delay

6

VPC: Test Phase

7

VPC Task: Eye Tracking Equipment

8

Subjects with Normal Visual Recognition Memory Spend > 66% of Viewing Time on Novel Images
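As a concrete illustration, the following is a minimal sketch (not the authors' code) of how novelty preference could be computed from test-phase gaze samples; the bounding-box format and fixed sampling rate are assumptions.

```python
# Minimal sketch (not the authors' code): estimate novelty preference as the
# fraction of test-phase looking time spent on the novel image. Assumes gaze
# is sampled at a fixed rate; box coordinates are illustrative.

def inside(point, box):
    """box = (left, top, right, bottom) in screen coordinates."""
    x, y = point
    left, top, right, bottom = box
    return left <= x <= right and top <= y <= bottom

def novelty_preference(gaze_samples, novel_box, familiar_box):
    """gaze_samples: iterable of (x, y) positions from the VPC test phase."""
    novel = sum(1 for p in gaze_samples if inside(p, novel_box))
    familiar = sum(1 for p in gaze_samples if inside(p, familiar_box))
    total = novel + familiar
    return novel / total if total else float("nan")

# Per the slide, normal visual recognition memory corresponds to a
# novelty preference above roughly 0.66.
```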

9

VPC: Low Performance Indicates Increased Risk for Alzheimer’s Disease

1. Detects onset earlier than previously possible

2. Sets stage for intervention


10

Behavioral Performance on the VPC test is a Predictor of Cognitive Decline


[Zola et al., AAIC 2012]

Scores on the VPC task accurately predicted, up to three years prior to a change in clinical diagnosis, MCI patients who would progress to AD, and Normal subjects who would progress to MCI

11

VPC: Gaze Movement Analysis [Lagun et al., Journal of Neuroscience Methods, 2011]

Visual examination behavior in the VPC test phase. In this representative example, the familiar image is on the left (A), and the novel image is on the right (B), for a normal control subject. The detected gaze positions are indicated by blue circles, with the connecting lines indicating the ordering of the gaze positions.

12

Technical Contribution: Eye Movement Analysis [Lagun et al., Journal of Neuroscience Methods, 2011]

13

Significant Performance Improvements

Method    Features       Accuracy       Sensitivity    Specificity   AUC
Baseline  NP             0.667          0.600          0.734         0.667
LR        NP+SO+RF+FD    0.710          0.712          0.707         0.710
SVM       NP+SO+RF+FD    0.869* (+30%)  0.967* (+61%)  0.772* (+5%)  0.869* (+30%)

Lagun et al., Journal of Neuroscience Methods, 2011
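For readers who want to reproduce this kind of comparison, the sketch below runs the logistic-regression and SVM classifiers with cross-validated AUC. It assumes per-subject feature vectors built from NP and the SO/RF/FD eye-movement features, stored in hypothetical .npy files; it is an illustration, not the paper's code.

```python
# Illustrative comparison of a logistic-regression and an SVM classifier on
# per-subject eye-movement features (NP, SO, RF, FD), mirroring the table
# above. File names and feature layout are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X = np.load("vpc_features.npy")  # shape (n_subjects, n_features), hypothetical
y = np.load("vpc_labels.npy")    # 1 = impaired, 0 = normal control

for name, clf in [("LR", LogisticRegression(max_iter=1000)),
                  ("SVM", SVC(kernel="rbf"))]:
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: AUC = {auc.mean():.3f} (+/- {auc.std():.3f})")
```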

14

Our Big Idea: Web-Based VPC Task (VPC-W), with E. Buffalo, D. Lagun, S. Zola

• Web-based version of VPC without an eye tracker

• Can be administered anywhere in the world on any modern computer.

• Can adapt classification algorithms to automatically interpret the viewing data collected with VPC-W.

15

VPC-W Architecture

16

VPC-W: basic prototype demo

Demo screens: Familiarization (identical images), Delay, Test (novel image on the left); the viewport position is recorded throughout.
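The exact rendering used by the prototype is not shown in the slides; the sketch below assumes a mouse-contingent "viewport" that keeps only the region around the cursor in sharp focus while the rest of the stimulus stays blurred (an assumption on my part), using Pillow with illustrative file names.

```python
# Rough sketch of one way a viewport stimulus could be rendered (assumption:
# a sharp circular window around the cursor over a blurred background).
from PIL import Image, ImageDraw, ImageFilter

def viewport_frame(image_path, cursor_xy, radius=120):
    """Return the stimulus with only the region near the cursor in focus."""
    img = Image.open(image_path).convert("RGB")
    blurred = img.filter(ImageFilter.GaussianBlur(12))
    # Circular mask centered on the current cursor position.
    mask = Image.new("L", img.size, 0)
    cx, cy = cursor_xy
    ImageDraw.Draw(mask).ellipse(
        (cx - radius, cy - radius, cx + radius, cy + radius), fill=255)
    # Paste the sharp region onto the blurred background.
    blurred.paste(img, (0, 0), mask)
    return blurred

# Example (illustrative file name):
# viewport_frame("stimulus_left.jpg", cursor_xy=(320, 240)).show()
```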

17

Experiment Overview

• Step 1: Optimize VPC-W on (presumably) Normal Control (NC) subjects

• Step 2: Analyze VPC-W subject behavior with both gaze tracking and viewport tracking simultaneously

• Step 3: Validate VPC-W prediction on discriminating Impaired (MCI/AD) vs. NC

18

VPC-W: Novelty Preference Preserved

Delay (seconds)   Mean novelty preference, VPC (N=30)   Mean novelty preference, VPC-W (N=34)
10                67%                                   65%
60                68%                                   69%

Self-reported elderly NC subjects tested with VPC-W over the internet exhibit similar novelty preference to that of VPC.

Single-factor ANOVA reveals no significant difference between VPC and VPC-W subjects

19

VPC vs. VPC-W: Similar Areas of Interest


Quantifying viewing similarity (coarse measure): divide each stimulus into 9 regions (3x3) and rank the regions by VPC and VPC-W viewing time. The Spearman rank correlation between the two rankings varies between 0.56 and 0.72 across stimuli.


Areas of attention: the heat map for VPC-W (viewport-based) is concentrated in similar areas to that for VPC (unrestricted eye tracking).
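A minimal sketch of this coarse similarity measure, under the assumption that both the gaze and viewport logs can be reduced to (x, y, dwell) samples in stimulus coordinates:

```python
# Sketch of the coarse similarity measure: accumulate viewing time per cell
# of a 3x3 grid for VPC (gaze) and VPC-W (viewport), then rank-correlate.
import numpy as np
from scipy.stats import spearmanr

def cell_times(samples, width, height, grid=3):
    """samples: list of (x, y, dwell_ms). Returns a flat vector of 9 cell times."""
    times = np.zeros((grid, grid))
    for x, y, dwell in samples:
        col = min(int(x / width * grid), grid - 1)
        row = min(int(y / height * grid), grid - 1)
        times[row, col] += dwell
    return times.ravel()

def viewing_similarity(gaze_samples, viewport_samples, width, height):
    rho, _ = spearmanr(cell_times(gaze_samples, width, height),
                       cell_times(viewport_samples, width, height))
    return rho  # the slides report rho between 0.56 and 0.72 across stimuli
```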

20

Actual Gaze vs. Viewport Position

Attention w.r.t. ViewPort

21

Eye-Cursor Time Lag Analysis

XY: minimum at -75.00 ms (199.86); X: minimum at -90.00 ms (161.85); Y: minimum at -35.00 ms (116.37)
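A hedged sketch of this kind of lag analysis, assuming the minimized quantity is mean eye-cursor distance and that both traces are resampled onto a common uniform time base (the exact procedure behind the slide may differ):

```python
# Illustrative lag analysis: shift the cursor trace against the gaze trace and
# report the lag (in ms) that minimizes mean eye-cursor distance. Assumes both
# traces are (n, 2) arrays sampled on the same uniform time base.
import numpy as np

def best_lag(eye, cursor, dt_ms=5.0, max_lag_ms=200.0):
    max_shift = int(max_lag_ms / dt_ms)
    best_lag_ms, best_dist = 0.0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        if shift >= 0:
            e, c = eye[shift:], cursor[:len(cursor) - shift]
        else:
            e, c = eye[:shift], cursor[-shift:]
        dist = np.linalg.norm(e - c, axis=1).mean()
        if dist < best_dist:
            best_lag_ms, best_dist = shift * dt_ms, dist
    # With this sign convention, a negative lag means the eye leads the cursor.
    return best_lag_ms, best_dist
```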

22

Viewport Movement ~ Eye Movement

Normal elderly subject (NP=88%, novel image is on left). Impaired elderly subject (NP=49%, novel image is on left).

23

Exploiting Viewport Movement Data

Features: novelty preference + fixation duration distribution

24

VPC-W Results: Detecting MCI

21 subjects (11 NC, 10 aMCI), recruited at the Emory ADRC:

Accuracy on the pilot data is comparable to the best reported values for a manually administered cognitive assessment test (MC-FAQ, with reported accuracy, specificity, and sensitivity of 0.83, 0.9, and 0.89, respectively) (Steenland et al., 2009).

Classification method    5-fold CV                   10-fold CV                  Leave-one-out
                         Acc.  Sens.  Spec.  AUC     Acc.  Sens.  Spec.  AUC     Acc.  Sens.  Spec.  AUC
Baseline: NP >= 0.58     0.81  0.80   0.82   0.81    0.81  0.80   0.82   0.81    0.81  0.80   0.82   0.81
SVM (VPC-W)              0.81  0.80   0.83   0.81    0.85  0.80   0.90   0.86    0.86  0.80   0.91   0.86

Accuracy, Sensitivity, Specificity, and AUC (area under the ROC curve) for automatically classifying patients tested with VPC-W, using 5-fold, 10-fold, and leave-one-out cross validation.
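The evaluation protocol can be sketched as follows. Assumptions: per-subject VPC-W features live in hypothetical .npy files with novelty preference in the first column, and the baseline scores a subject as normal when NP >= 0.58; this is an illustration, not the study's code.

```python
# Sketch of a leave-one-out evaluation of an SVM against the fixed
# novelty-preference threshold baseline shown in the table above.
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVC

X = np.load("vpcw_features.npy")   # per-subject VPC-W features (hypothetical file)
y = np.load("vpcw_labels.npy")     # 1 = aMCI, 0 = normal control
np_score = X[:, 0]                 # assumption: column 0 holds novelty preference

# Baseline: NP >= 0.58 is scored as normal, below the threshold as impaired.
baseline = (np_score < 0.58).astype(int)
print("Baseline accuracy:", accuracy_score(y, baseline))

# SVM evaluated with leave-one-out cross-validation.
probs = np.empty(len(y), dtype=float)
for train, test in LeaveOneOut().split(X):
    clf = SVC(probability=True).fit(X[train], y[train])
    probs[test] = clf.predict_proba(X[test])[:, 1]
print("SVM accuracy:", accuracy_score(y, (probs > 0.5).astype(int)))
print("SVM AUC:", roc_auc_score(y, probs))
```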

25

Current Work

• Analysis:
– Applying deep learning and “motif” analysis for more accurate analysis of trajectories
– Incorporating visual saliency signals

• Data collection:
– Longitudinal tracking of subjects
– “Test/retest”: effects of repeated testing
– Sensitivity analysis: for possible use in drug trials
– Wide range of “normative” data using the MTurk worker pool

26

Future Directions and Collaboration Possibilities

• Can we apply similar or the same techniques for cost-effective and accessible detection of:
– Autism (previous work on differences in gaze patterns)
– ADHD
– Stroke / brain trauma
– Other possibilities?

• What can we learn about the searcher from their natural search and browsing behavior?
– Image search and examination preferences (anorexia)
– Correlating behavior with biomarkers (Health 101 cohort)

27

VPC-W Summary

VPC-W, administered over the internet, elicits viewing behavior in normal elderly subjects similar to that of the eye-tracking-based VPC task administered in the clinic.

Preliminary results show automatic identification of amnestic MCI subjects with accuracy comparable to best manually administered tests.

VPC-W and its associated classification algorithms could facilitate cost-effective and widely accessible screening for memory loss by simply logging on to a computer.

Other potential applications: online detection and monitoring of ADD, ADHD, Autism and other neurological disorders.

This project has the potential to dramatically enhance the current practice of Alzheimer’s clinical and translational research.

28

Eye Tracking for Interpreting Search Behavior

• Eye tracking gives information about searcher interests:
– Eye position
– Pupil diameter
– Saccades and fixations

Figures: example recordings for reading and search tasks; eye-tracking camera shown.
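As a side note on how saccades and fixations are typically derived from raw gaze samples, here is a minimal dispersion-threshold (I-DT style) sketch; the thresholds are illustrative, not the lab's actual settings.

```python
# Minimal dispersion-threshold (I-DT style) fixation detector: a fixation is a
# run of samples whose combined x/y dispersion stays under a threshold for at
# least a minimum duration. Thresholds and sampling rate are illustrative.
def _dispersion(window):
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=30.0, min_duration_ms=100, dt_ms=5.0):
    """samples: list of (x, y) gaze points. Returns (start, end) index pairs."""
    min_len = max(1, int(min_duration_ms / dt_ms))
    fixations, start = [], 0
    while start + min_len <= len(samples):
        end = start + min_len
        if _dispersion(samples[start:end]) <= max_dispersion:
            # Grow the window while the dispersion stays under the threshold.
            while end < len(samples) and _dispersion(samples[start:end + 1]) <= max_dispersion:
                end += 1
            fixations.append((start, end))  # samples[start:end] is one fixation
            start = end
        else:
            start += 1
    return fixations
```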

29

We Will Put an Eye Tracker on Every Table! - E. Agichtein, 2010

• Problem: eye tracking equipment is expensive and not widely available.

• Solution: infer searcher gaze position from searcher interactions.

30

Inferring Gaze from Mouse Movements

Eye-mouse coordination patterns (actual vs. predicted): no coordination (35%), bookmarking (30%), eye follows mouse (35%)

Guo & Agichtein, CHI WIP 2010
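A hedged sketch in the spirit of this idea, not the Guo & Agichtein model: predict gaze coordinates from simple cursor features with an off-the-shelf regressor; the feature choices are illustrative.

```python
# Illustration only (not the published model): predict gaze position from
# cursor position and recent cursor velocity using a multi-output regressor.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def cursor_features(cursor, dt_ms=50.0):
    """cursor: (n, 2) array of cursor positions sampled every dt_ms."""
    velocity = np.vstack([np.zeros((1, 2)), np.diff(cursor, axis=0)]) / dt_ms
    return np.hstack([cursor, velocity])

# cursor_train / gaze_train would come from sessions where both the mouse and
# an eye tracker were recorded; cursor_test is a mouse-only session.
# model = RandomForestRegressor(n_estimators=200).fit(
#     cursor_features(cursor_train), gaze_train)
# gaze_pred = model.predict(cursor_features(cursor_test))
```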

31

Post-click Page Examination Patterns

• Two basic patterns: “Reading” and “Scanning”
– “Reading”: consuming or verifying once (seemingly) relevant information has been found
– “Scanning”: the relevant information has not yet been found; the user is still visually searching

32

Cursor Heatmaps (Reading vs. Scanning) [Task: “verizon helpline number”]

Relevant (dwell time: 30s) vs. Not Relevant (dwell time: 30s)

On the relevant page, the user moves the cursor horizontally (“reading”); on the non-relevant page, the cursor moves passively (“scanning”).

33

Typical Viewing Behavior (Complex Patterns) [Task: “number of dead pixels to replace a Mac”]

Relevant (dwell time: 70s) vs. Not Relevant (dwell time: 80s)

On the relevant page, the user actively moves the cursor with pauses (“reading” dominant); on the non-relevant page, the cursor stays still while the user scrolls (“scanning” dominant).
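A simple illustrative heuristic (not the authors' classifier) that turns the cues on the last two slides into a label from cursor and scroll logs; the thresholds are arbitrary placeholders.

```python
# Illustrative heuristic: horizontal cursor motion suggests "reading", while a
# mostly still cursor with scrolling suggests "scanning". Thresholds are
# arbitrary placeholders, not tuned values.
def label_examination(cursor_moves, scroll_count, dwell_s):
    """cursor_moves: list of (dx, dy) cursor displacements during the page view."""
    horizontal = sum(abs(dx) for dx, _ in cursor_moves)
    vertical = sum(abs(dy) for _, dy in cursor_moves)
    cursor_activity = (horizontal + vertical) / max(dwell_s, 1e-6)  # px per second
    if horizontal > vertical and cursor_activity > 20:
        return "reading"
    if scroll_count > 0 and cursor_activity <= 20:
        return "scanning"
    return "mixed"

# Example: mostly horizontal cursor motion over a short dwell -> "reading".
# label_examination([(40, 2), (35, 1), (50, 3)], scroll_count=0, dwell_s=4)
```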
