
The Green Button Project
Physician Symposium Session #3, February 11, 2019

Alison Callahan, MISt, PhD, Research Scientist, Stanford University

Conflict of interest

Alison Callahan, PhD, has no real or apparent conflicts of interest to report.

Agenda

• The Green Button origin story

• Anatomy of the consult service

• Methods and challenges

• Learning from the first 100 consults

• Deploying the service at a new site

• Ongoing efforts


Learning objectives

• Define use cases for an informatics consult service

• Describe requirements for setting up an informatics consult service

• Plan deployment of an informatics consult service at a new site


Acknowledgements

Informatics Consult team and Stanford Health Care partners: David Entwistle, Tip Kim, Christopher Sharp, Nigam Shah, Saurabh Gombar, Robert Harrington, Alison Callahan, Vladimir Polony, Rob Tibshirani, Ken Jung, Trevor Hastie

Funding: NLM, NIGMS, Stanford School of Medicine, Department of Medicine, Department of Biomedical Data Science, Center for Population Health Sciences, an anonymous donor

Meet Laura

A teenager with systemic lupus erythematosus

• proteinuria

• antiphospholipid antibodies

• pancreatitis


Source: Mayo Foundation for Medical Education and Research


Managing Laura’s care

Source: Mayo Foundation for Medical Education and Research


The Origin of the Green Button

Finding patients with "X": 2-3 weeks to generate a cohort

The Informatics Consult Service

Given a specific case, provide a report with a descriptive summary of similar patients in Stanford's clinical data warehouse, the common treatment choices made, and the observed outcomes after specific treatment choices.

An institutional review board approved study (IRB #39709) over one year.

http://greenbutton.stanford.edu


An example report

The process

Roles: requesting physician, informatics physician, EHR data specialist, data scientist.

Workflow (24 to 72 hours): request consult → refine clinical question → create definitions for exposures and outcomes → build patient cohorts → perform statistical analysis → write consult report → review results → apply evidence to clinical decision.

Methods and challenges

• Building patient cohorts accurately and quickly

• Asking the right question

• Controlling for confounding

• Ensuring quick turnaround


Building patient cohorts

From timelines to data frames: phenotyping converts each person's timeline into a row of a persons × features data frame, with features (procedures, devices, diseases, drugs) coded as 1, 0, or -1.

1. How will you handle time?
2. What features will you use?
3. How will you state your phenotype definition?

Phenotyping choices trade off cost and utility.
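As a purely illustrative sketch of the "timelines to data frames" step (this is not the Stanford code; the table, column names, and concepts below are invented), the following R snippet collapses a long-format event table into a persons × features data frame, coding each feature as 1 if it appears on or after a person's index date, -1 if it appears only before it, and 0 if it never appears:

```r
# Illustrative only: toy events table with assumed columns
# (person_id, concept, event_date, index_date).
events <- data.frame(
  person_id  = c(1, 1, 2, 2, 3),
  concept    = c("lupus_nephritis", "rituximab", "lupus_nephritis",
                 "cyclophosphamide", "rituximab"),
  event_date = as.Date(c("2018-03-01", "2018-04-15", "2017-11-20",
                         "2018-01-05", "2018-06-30")),
  index_date = as.Date(c("2018-04-01", "2018-04-01", "2018-02-01",
                         "2018-02-01", "2018-05-01"))
)

# 1 if the event falls on/after the person's index date, -1 if strictly before
events$signed <- ifelse(events$event_date >= events$index_date, 1, -1)

# One value per (person, concept); +1 dominates -1 when both occur
agg <- aggregate(signed ~ person_id + concept, data = events, FUN = max)

# Persons x features matrix, 0 where a concept never appears for a person
persons  <- sort(unique(events$person_id))
concepts <- sort(unique(events$concept))
feat <- matrix(0, nrow = length(persons), ncol = length(concepts),
               dimnames = list(persons, concepts))
for (i in seq_len(nrow(agg))) {
  feat[as.character(agg$person_id[i]), as.character(agg$concept[i])] <- agg$signed[i]
}
feat
```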


The search engine

• Diagnosis and procedure codes

• Clinical notes

• Lab results

• Vital signs

• Inpatient and outpatient visits

www.tinyurl.com/search-ehr
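The deck does not show the search engine's query interface, so as a purely hypothetical illustration of the idea (combining structured diagnosis codes with free-text note search to find a cohort), here is a toy R snippet; the diagnoses and notes tables, their columns, and the sample data are all invented:

```r
# Hypothetical toy data -- not the actual search engine or its API
diagnoses <- data.frame(
  person_id = c(1, 2, 3),
  icd10     = c("M32.14", "E11.9", "I10")    # M32.x = systemic lupus erythematosus
)
notes <- data.frame(
  person_id = c(2, 3, 4),
  note_text = c("history of systemic lupus erythematosus with nephritis",
                "hypertension, well controlled",
                "SLE flare complicated by pancreatitis")
)

# Patients found by structured code OR by note text
by_code    <- diagnoses$person_id[grepl("^M32", diagnoses$icd10)]
by_text    <- notes$person_id[grepl("lupus|\\bSLE\\b", notes$note_text, ignore.case = TRUE)]
cohort_ids <- sort(unique(c(by_code, by_text)))
cohort_ids   # persons 1, 2, 4
```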


Asking the right question

Controlling for confounding

• Identify subsets of patient cohorts that are "similar"

– Matching on age, gender, record length, year

– Using propensity score matching
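For the propensity-score bullet above, a minimal generic sketch in R (simulated data; the covariates follow the slide, but this is not the Green Button implementation) fits a logistic propensity model and performs greedy 1:1 nearest-neighbor matching without replacement:

```r
set.seed(42)
cohort <- data.frame(
  treated      = rbinom(200, 1, 0.4),
  age          = rnorm(200, 45, 15),
  gender       = rbinom(200, 1, 0.5),
  record_years = runif(200, 1, 10),            # length of record
  index_year   = sample(2010:2018, 200, TRUE)
)

# Propensity score: P(treatment | covariates) via logistic regression
ps_model  <- glm(treated ~ age + gender + record_years + index_year,
                 data = cohort, family = binomial)
cohort$ps <- predict(ps_model, type = "response")

treated_idx <- which(cohort$treated == 1)
available   <- which(cohort$treated == 0)      # assumes controls outnumber treated

# Greedy 1:1 nearest-neighbor matching on the propensity score, without replacement
matched_controls <- integer(0)
for (i in treated_idx) {
  j <- available[which.min(abs(cohort$ps[available] - cohort$ps[i]))]
  matched_controls <- c(matched_controls, j)
  available <- setdiff(available, j)
}
matched <- cohort[c(treated_idx, matched_controls), ]

# Quick balance check on one covariate
tapply(matched$age, matched$treated, mean)
```

CRAN packages such as MatchIt provide more robust matching diagnostics; the loop above only shows the idea.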

What we do to not be wrong

• Use negative controls for empirical calibration

• E-values to quantify the degree of confounding that can produce the observed effect

• Ask the question using multiple datasets

• Schedule an in-person debrief

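The E-value mentioned above has a standard closed form (VanderWeele and Ding, 2017): for an observed risk ratio RR ≥ 1, E = RR + sqrt(RR × (RR − 1)), with protective ratios first inverted. A small generic R helper (not code from the consult service):

```r
# E-value: minimum strength of association (on the risk-ratio scale) that an
# unmeasured confounder would need with both treatment and outcome to fully
# explain away an observed risk ratio.
e_value <- function(rr) {
  rr <- ifelse(rr < 1, 1 / rr, rr)   # treat protective effects symmetrically
  rr + sqrt(rr * (rr - 1))
}

e_value(1.8)    # observed RR 1.8  -> E-value 3.0
e_value(0.55)   # observed RR 0.55 -> E-value ~3.04 (computed on 1/0.55)
```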

Ensuring quick turnaround

• Search engine API available in CRAN

• R library for data pre-processing

• Semi-automated pipeline for survival and causal analyses, report generation

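As a hedged sketch of what the survival step of such a pipeline can look like (standard CRAN survival package on simulated, invented data; not the actual Green Button library or report generator):

```r
library(survival)

set.seed(7)
matched <- data.frame(
  treated = rep(c(1, 0), each = 100),
  time    = rexp(200, rate = 0.1),     # follow-up time (months), simulated
  event   = rbinom(200, 1, 0.6)        # 1 = outcome observed, 0 = censored
)

# Kaplan-Meier curves by treatment group
km <- survfit(Surv(time, event) ~ treated, data = matched)
summary(km, times = c(6, 12, 24))

# Cox proportional hazards model; hazard ratio = exp(coef)
cox <- coxph(Surv(time, event) ~ treated, data = matched)
summary(cox)
```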

Decision flow: Is a guideline available? Yes → use the level A guideline. No → is a large cohort of similar patients present? Yes → use the Green Button. No → use professional judgement.

Analysis + Report:

• The question as posed

• How we asked the question

• Our interpretation

• Research walkthrough

Useful byproduct: a list of clinical situations whose priority is incremented with each request; high-priority situations become candidates for further study via point-of-care randomization or a large simple trial.


Learning from the first 100 consults

• How many? 55%

• Which treatment? 30%

• How long? 15%

Learning from the first 100 consults

"likely to recommend" was 100%

Deploying the service at your site

• Institutional support

• Data science expertise

• Marketing

• A process to sanity-check the data and consult findings



We’re not the first to provide an on-demand informatics consult service

Now versus then

Stanford: 3 million

Optum: 55 million

Truven: 126 million

Open questions and ongoing efforts

• What is really useful?

– Description of what happened

– Estimation: Population or Individual level

– Patient level prediction

• Financial viability – who can pay for this “test”?

• Informatics research

– Phenotyping (how do I know the patient had X)

– Representation learning

– Matching and population-level inference

– Personalized effect estimates

• Deploying as a hospital-side service at Stanford Health Care


http://greenbutton.stanford.edu


Thank you! Questions?

@clssfr

acallaha@stanford.edu

Please complete the online session evaluation!
