1
Benchmarking Improvement Program to Prioritise and Improve
Uber-Customer and Customer Outcomes
Date: May 10, 2016 CDO Forum Europe – Thanks to the Organisers!
Program Lead: Eugene (Gene) Kolker, CDO Executive Sponsor: Mark Del Beccaro, SVP/CMO
Team: Libby Montague, Roger Higdon, Winn Haynes, Jeneen Coydan,
Rob Arnold, Beth Stewart, Bill Broomall, Imre Janko, Natali Kolker
Contact: eugene.kolker@seattlechildrens.org, egnklkr@gmail.com
2
Agenda: 3 Parts and 3 Data Stories
A. Introduction
B. 3 Data Stories:
1. Improving Chai Experience (EEE)
2. Introduction to Benchmarking Improvement Program (Uber-customer, customer and business Outcomes)
3. Why the War Room? (Process Improvement)
C. Lessons Learned
3
A. 3 Key Strategies to Empower People with PPT-DAM Approach
3 Strategies:
1. Identify the needs and aspirations of your uber-customers, customers and business, people’s pain points, and business growth opportunities (People - First)
2. Establish processes that support and are supported by the people (Process – Second)
3. Select the appropriate technology to align with your people and processes (Technology – Third)
PPT-DAM Approach: 1. People, 2. Process, 3. Technology (empowered by Data, Analytics & Metrics)
4
3 Magic Triangles
(Triangle 1: People – Process – Technology; Triangle 2: Experiment – Execute – Evaluate)
5
DAM Are Your Customers, Your Business, YOU!
¤ Data, Analytics & Metrics (DAM) are key strategic assets for:
  ¤ improvement
  ¤ growth
  ¤ competitive advantage
¤ Utilising DAM to Experiment, Execute and Evaluate (EEE) on the strategic objectives of your uber-customers (e.g. in Healthcare: patients, individuals, n=1), customers (doctors, nurses) and business
¤ DAM should be thought of as a core service similar to IT, HR and Finance. We are in the consulting business.
¤ DAM are synergistic with and complementary to IT, not redundant
For more details, please visit cdoanalytics.org
6
PPT-DAM Approach
¤ People: Uber-customers (n=1), customers and business
  ¤ People's needs, pain points and wishes (challenges/opportunities)
  ¤ Business competitive advantage, long-term changes and survival
¤ Process
  ¤ Best practices in consulting and business
¤ Technology
  ¤ R, Java, Python, SQL, EDW (IBM/NZ), Hadoop, NLP, D3, Spark
¤ Data
  ¤ Internal, external, structured, unstructured and social media data
¤ Analytics
  ¤ A/B testing, regression analysis, modelling, optimisation
¤ Metrics
  ¤ (Clinical) Outcomes, process and financial metrics
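The Analytics bullet above lists A/B testing among the toolkit. As a minimal sketch of what that involves, here is a two-proportion z-test comparing conversion rates between two variants. This is a generic illustration, not SCH's actual method, and the counts are invented:

```python
# Minimal A/B test sketch: two-proportion z-test.
# H0: the conversion rates of variants A and B are equal.
# All counts below are made-up illustration data.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for H0: rate_a == rate_b."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, built from erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

In practice a library routine (e.g. from a statistics package) would replace the hand-rolled normal CDF; the point is only that "Experiment" produces counts and "Evaluate" turns them into a decision.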
7
Data to Information to Action and Outcomes
Beyond Data Surfing - Towards Data Acting with EEE (Experiment, Execute & Evaluate)
8
History of CDO: 1st Approximation
Overall:
1. First CDO: Cathryne Doss, Capital One, Jan 2002–Dec 2005; Multifamily Business Data Officer, Fannie Mae, from Nov 2014
2. Second CDO, VP: Joseph Bugajski, Visa, Aug 2002–Jan 2008; Managing VP, CDO, Info and Analytics Res., Gartner, from Nov 2011
In Healthcare and Life Sciences:
1. First CDO, VP: Inderpal Bhandari, Medco (Express), Dec 2006–Jan 2014; Global CDO, IBM, from Dec 2015
2. Second CDO: Eugene Kolker, Seattle Children's, Feb 2007–present
Question for YOU: What about You?! Your Industry?!
Please email me your info: egnklkr@gmail.com
9
B. 3 Data Stories*
1. Improving Chai Experience (with EEE)
2. Introduction to Benchmarking Improvement Program (Uber-customer, customer and business Outcomes)
3. Why the War Room? (Process Improvement)
*Becoming/making FDAs – Friends of Data & Analytics
10
1. Improving Chai Experience with EEE
11
Improving Chai Experience with EEE
EXPERIMENT!
12
Improving Chai Experience with EEE
EXECUTE!!
13
Improving Chai Experience with EEE
EVALUATE!!!
14
2. Benchmarking Improvement Program
¤ Each year SCH participates in the USNWR (U.S. News & World Report) rankings:
  ¤ The rankings help our uber-customers (patients and families) to make data-informed decisions about their care.
  ¤ We use the results to benchmark SCH against top industry performers and identify areas for improvement.
¤ In 2013, CDO Team did not complete the project due to lack of resources*
¤ In 2014, CDO Team’s analysis identified a downward trend that led to the creation of a new enterprise-wide program
*My 2 cents on your next reading:
1. T. Nunno, "The Wolf in CIO's Clothing" (defensive & offensive methods)
2. Kolker, Kolker, Big Data, 2014, 2, 50-4 (benchmarking improvement)
15
How does USNWR Work?
¤ Rankings reflect the interrelationship between:
  Structure (16–19 metrics), Process (3 metrics) and Outcomes (3–6 metrics)
16
Ranking Trends and 2015 Predictions
In 2014, we discovered the following trends in SCH's performance:
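The 2015 predictions above extrapolate observed ranking trends. As a hedged sketch of one way such a forecast could be made, here is an ordinary least-squares trend line extended one year ahead. The years and ranks below are invented for illustration and are not SCH's actual USNWR numbers:

```python
# Sketch of a trend-based rank forecast: fit an OLS line through
# (year, rank) points and evaluate it at the target year.
# Illustrative data only -- not SCH's actual rankings.
def ols_forecast(years, ranks, target_year):
    """Fit rank = slope*year + intercept by least squares; predict target_year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(ranks) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, ranks))
             / sum((x - mean_x) ** 2 for x in years))
    intercept = mean_y - slope * mean_x
    return slope * target_year + intercept

# Illustrative downward trend: rank slipping over time (higher = worse).
pred = ols_forecast([2011, 2012, 2013, 2014], [5, 6, 8, 9], 2015)
print(f"Forecasted 2015 rank: {pred:.1f}")
```

A real forecast would, of course, weigh the underlying Structure/Process/Outcomes metrics rather than the composite rank alone; this only shows the shape of the extrapolation step.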
17
Program Background: The USNWR submission requires considerable time and effort. The process had a low ROI, as it was not aligned with strategic planning and did not fill the need for system-wide benchmarking.
The new Benchmarking Improvement Program was created to address these issues, with a focus on (1) Uber-Customer, Customer and Business (Clinical) Outcomes and (2) Process Improvement.
Program Description:
1. What is the baseline of SCH's current USNWR process?
2. Where are the opportunities for process improvement?
3. How can SCH use the USNWR results to improve care?
Program Overview
18
Forecasted vs. Actual Ranks, 2015
¤ Compared to the 2015 Forecasted ranks (slide 16 & below, left), Actual ranks (below, right) improved dramatically.
¤ This change was primarily driven by Outcomes.
[Charts: Forecasted Ranks (left) vs. Actual Ranks (right)]
19
Improved Outcomes
¤ There are two system-wide Outcomes: ICU Infections and Pressure Ulcers.
¤ Improvements in ICU infections increased the scores in 6 (out of 10) service lines.
¤ Many small changes contributed to improvements in Outcomes within each service line.
"The improvements that the CDO Team identified were applicable system-wide as well as on a medical service line level. Their work is of immense value because it helped us improve patient care." - David Fisher, M.D., SVP, Chief Medical Officer (Ret.)
20
3. Why the War Room?
1. 3 Improvement Goals
2. Proposed Strategy
3. Discussion and Agreement
22
3 Improvement Goals
Improve:
1. Customer Experience
2. SOP and Reproducibility
   - Increase standardisation
   - Decrease error rate
   - Increase reproducibility
3. Data Quality
Uber-goal: transforming customers into FDAs!
23
Our Collective Experience
Our approach is analyst-driven and collaborative:
1. It is based on feedback from all of us
2. We are summarising our collective knowledge here, and it is an approach we should all agree to
3. Please share your thoughts and feelings about this process, and let us know one to three things, good and bad, about your experience in prior years
24
EEE Strategy: Communication
Our Strategies for Improved Customer Experience:
¤ Consider yourself a partner with, or in service to, your assigned service lines (customers)
¤ Decide on question interpretation and expected deliverables at first meeting
¤ Take an iterative approach, ask for feedback and accept criticism
¤ Communicate clearly and frequently
¤ Prioritise face-to-face meetings over emails
¤ Promptly make and communicate changes to close the feedback loop. Get a verbal or written "thumbs up" before moving on
25
EEE Strategy: Data Quality
¤ Team approach in the War Room: half-days every day until the data collection, analysis and reporting are complete
¤ Arrange face-to-face meetings twice a week with assigned service lines’ leaders (customers) to ensure access and get faster turn-around time with questions
¤ Document process and queries
EXPERIMENT, EXECUTE and EVALUATE!!!
26
C. Data to Information to Action & Outcomes
Beyond Data Surfing - Towards Data Acting with EEE and Friends of Data & Analytics!
27
Key Determinants of Project Success

Determinant           Failed Project               Successful Project
Process               Not defined                  Well-defined
Customers Identified  Concerned stakeholders       Clear, engaged customers
Core Question         Free-floating anxiety        Hypothesis
Shared Vision         No expectations              Fully articulated
Communication         Opaque                       Transparent
Data Science (DAM)    High quality                 High quality
Actionable Insights   Yes, but too much to handle  Actionable

(Data Science reflects DAM; all other determinants reflect People.)
Bottom line: People make the Difference!
28
What Each Party Brings to the Project

Client Brings:
¤ Experience and Data
¤ Final Decision on Project Direction
¤ Implementation and Outcomes

DAM Team Brings:
¤ Data Processing and Analytics Expertise
¤ Optional Decision Pathways and Predicted Outcomes
¤ Ongoing Measurements and Comparative Effectiveness
29
Hybrid Architecture (adapted from Gartner)
[Diagram: EDW (IBM/NZ), Hadoop clusters, advanced analytics, external data, repositories and databases]
30
Abbreviations & Details
PPT = People, Process, Technology
DAM = Data, Analytics, Metrics
EEE = Experiment, Execute, Evaluate
FDAs = Friends of Data & Analytics

Interested in details:
1. Kolker, Healthcare Tech Outlook, 2015, June, 8-9
2. Koster, Stewart, Kolker, Academic Medicine, 2016, 91(2), 165-7
3. Kolker, Healthcare Tech Outlook, 2016, June, in press
4. Kolker, Ozdemir, Kolker, OMICS JIB, 2016, 20(6), in press

Interested in more details:
1. Check our site: cdoanalytics.org
2. Email me at: egnklkr@gmail.com
31
3 Key Lessons Learned: DAM are YOU!
1. Identify uber-customer (n=1), customer and business questions for a well-defined project.
2. Maintain communication and high sponsor engagement, with cadence and transparency, throughout the process.
3. Deliver actionable insights through great data science, and execute towards the shared vision.
PPT-DAM Approach: People-Process-Technology (empowered by DAM with EEE & FDAs)
32
3 Magic Triangles
(Triangle 1: People – Process – Technology; Triangle 2: Experiment – Execute – Evaluate)
33
Many THANKS To You and My FDAs: TOP 10 CDO: Inderpal Bhandari (Global CDO, IBM),
Cortnie Abercrombie (Emerging Roles Leader for CDOs, IBM), Ursula Cottone (CDO, Citizens Bank), Mark Ramsey (CDO, GSK), Maria Villar (Global VP, SAP), Derek Strauss (CDO, TDAmeritrade), Andrew Salesky (Global CDO, Charles Schwab), Larry Smarr (UCSD), Wes Hunt (SVP/CDO, MetLife), Vural Ozdemir (DELSA).
TOP 10 SCH: Mark Del Beccaro, Tom Hansen, Jeff Sperring, Drex DeFord, Peter Tarczy-Hornoch, Gerald van Belle, Skip Smith, Craig Jackson, Roy Diaz, and Jack Faris.
TOP 12 EK: Eugene Luskin, Jacob Grinberg, John Koster, Paul Buehrens, Dave Eckert, Keith Marton, Jeff Snedden, Tom Martin, Raif Khassanov, Dmitri Frishman, Andrey Lisitsa, and Doron Lancet.