
DDMA Data Quality Award 2010 - Presentatie T-Mobile Netherlands - Jos Leber


Page 1

Event: DDMA Data Quality Awards 2010

Speaker: Jos Leber – T-Mobile Netherlands

Date: 14 October 2010, Klooster Noordwijk

www.ddma.nl

Page 2

How to get Data Quality Management started?

T-Mobile Netherlands

Jos Leber, Sr. Data Manager

Date: October 14th, 2010

Page 3

Contents

• Introduction T-Mobile
• How does Data Quality start?
• What is Data Quality within T-Mobile Netherlands?
• Data Quality monitoring and tools
• Cost of poor Data Quality
• TMNL Data Quality mission statement
• Data Management Maturity Models

Page 4

T-Mobile offers the worldwide network.

• Started in 1999
• T-Mobile since 2003
• Leading company in mobile communications.
• One of the three strategic business units of Deutsche Telekom.
• Almost 151 million mobile communications customers worldwide.
• In the Netherlands in 2009: annual revenue of 1.807 billion euro and over 2,000 employees.

In terms of Data Quality Management:
• More than 100 systems
• And 60 tools

Page 5

T-Mobile international network.

• T-Mobile USA
• T-Mobile UK
• T-Mobile Germany
• T-Mobile Czech Republic
• T-Mobile Hungary
• T-Mobile Austria
• T-Mobile Croatia
• T-Mobile Slovenia
• T-Mobile Macedonia
• T-Mobile Montenegro
• ERA Poland
• OTE group (Greece, Romania, Bulgaria, Albania)

• In addition, T-Mobile has roaming agreements with over 400 roaming partners in more than 185 countries.

Page 6

Complexity of Mobile communications

• Approx. 100 different barrings and SMS services
• Block dialling of all 0900 numbers
• Block 0900 18+ numbers
• Block downloading of games, etc.
• Block in cases of bad debt
• Voice mail on / off?
• Caller ID ("nummerweergave")
• Etc., etc.
• Family plan
• Mobile to Mobile (business customers)
• Information technology
• Network technology

Page 7

The start for data quality in 2001

Page 8

NEN 5825 – Standard for street and city names

[Figure: file scan ("bestandsscan") showing fill percentages per attribute for name data (titles, initials, prefixes, surname, suffixes, full business name), telephone data (area code, subscriber number, combined number) and address data (street name, PO box, house/PO box number, house number suffix, postcode, city).]

Page 9

Customer Data Inconsistencies (2004) – Differences between the CRM and Billing system

[Chart: number of open and solved cases per week, December 2003 through October 2004; x-axis: date, y-axis: amount; series: Open Cases, Solved cases.]

Page 10

Steps during the Phoenix project (Q4 2004 – Q1 2006)

0. Definition of scope Phoenix project

1. Data Cleaning
  • Norm definition
  • Data Cleaning ('X' issues)
  • Criteria for data cleaning
  • Data mapping versus data standard
  • Aftercare

2. Data Migration
  • DAT Data Acceptance Test plan
  • Database Attributes List
  • Business rule book
  • Data Display Tests
  • Special Test Cases
  • Sanity Check

3. Tooling & Process
  • Compare & Quality tools development
  • Daily DQM meetings
  • Reporting to management (IPB)

Go Live

Page 11

Page 12

Data Quality definition

Data are of high quality if they are fit for their intended uses in operations, decision making, and planning (after Joseph Juran).

Data that are fit for use are:
  Free of defect: accessible, accurate, timely, complete, consistent with other sources, etc.
  Possess desired features: relevant, comprehensive, proper level of detail, easy to read, easy to interpret, etc.

What is Data Quality for T-Mobile Netherlands?
Definition of Data Quality according to a simple keyword: A.C.C.U.

• Actual – is the data still 'up-to-date'?
  (e.g. outdated data is corrected to the new/changed data standards)
• Correct – data is filled in within the confirmed standards
  (e.g. empty, or not in the agreed format (numeric, NEN conform, etc.))
• Complete – is any information missing?
• Unique – is it unique, with no duplicate relations (within a single system)?

What is Inconsistency?
• The same information differs in two or more systems.

Data are only of high quality if those who use them say so.
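The A.C.C.U. definition above translates naturally into simple automated record-level checks. The following is a minimal illustrative Python sketch, not T-Mobile's actual tooling; the record layout, field names and the one-year freshness threshold are assumptions made only for the example.

```python
import re
from datetime import date, timedelta

# Hypothetical customer records; field names are illustrative only.
customers = [
    {"id": 1, "zipcode": "1234 AB", "name": "Jansen", "last_verified": date(2010, 9, 1)},
    {"id": 2, "zipcode": "9999ZZ",  "name": "",       "last_verified": date(2008, 1, 1)},
    {"id": 2, "zipcode": "1234 AB", "name": "Jansen", "last_verified": date(2010, 9, 1)},
]

NL_ZIP = re.compile(r"^[1-9][0-9]{3}\s[A-Z]{2}$")   # format taken from the data standard on page 13

def accu_report(records, max_age_days=365):
    """Score each A.C.C.U. dimension as the fraction of records that pass."""
    today = date(2010, 10, 14)                       # presentation date used as reference day
    actual   = sum(today - r["last_verified"] <= timedelta(days=max_age_days) for r in records)
    correct  = sum(bool(NL_ZIP.match(r["zipcode"])) for r in records)
    complete = sum(all(str(v).strip() for v in r.values()) for r in records)
    unique   = len({r["id"] for r in records})       # distinct keys: duplicates within a single system lower this
    n = len(records)
    return {"Actual": actual / n, "Correct": correct / n,
            "Complete": complete / n, "Unique": unique / n}

print(accu_report(customers))
```

In practice each dimension would of course be driven by the agreed data standard rather than a hard-coded rule, but the structure (one pass/fail test per attribute per dimension) stays the same.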

Page 13

Example of a data standard

In a data standard, attributes (or fields) are defined, e.g. for the Dutch Postcode:
• how it is named
• for what purpose we use and maintain this attribute
• what is the master?
• its length
• its validation rules

Entity: ADDRESS

Attribute: Postcode
  Standard NL name: Postcode / Attribute name (Logical Data Model): ZIPcode / Screen name (Clarify): Postcode
  Description and objective: the postal code of the formal physical location where a Customer is settled/established.
  Norm: Y / Measured: Y / Format: Text20 / Master: x / Mandatory: Yes
  Rules and values: capitals; an NL postcode is stored as "dddd AA" (with a single space): [1-9][0-9]{3}\s[A-Z]{2}. In case of a foreign postal code (ZIP) the format is free text with a maximum of 20 characters.

Attribute: Overruled
  Standard NL name: Afwijzing (overschrijven) / Screen name: Overruled (not displayed)
  Description and objective: indicates whether the postcode check is overruled by a supervisor.
  Norm: Y / Measured: N / Format: Boolean / Master: x / Mandatory: No

Attribute: X-coordinate
  Standard NL name: X-coördinaat / Attribute name: X_X_COORDINATE / Screen name: Coordinates x/y
  Description and objective: geographical x-coordinate of the location identified by postcode and house number; used to calculate the location of the "Home Zone" in the GSM network.
  Format: Number / Master: x / Mandatory: No
  Rules and values: selected from the Geo-tool table. The X-coordinate is specified using the 'abc' notation; according to this standard it can be at most 6 digits long (unconfirmed).

Attribute: Y-coordinate
  Standard NL name: Y-coördinaat / Attribute name: X_Y_COORDINATE / Screen name: Coordinates x/y
  Description and objective: geographical y-coordinate of the location identified by postcode and house number; used to calculate the location of the "Home Zone" in the GSM network.
  Format: Number / Master: x / Mandatory: No
  Rules and values: selected from the 'xyx'-tool table. The Y-coordinate is specified using the 'abc' notation; according to this standard it can be at most 7 digits long (unconfirmed).
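The "Rules and values" entry for the postcode can be enforced directly with the regular expression given in the standard. A small illustrative Python sketch follows; the function name and the country parameter are assumptions for the example, not part of the standard itself.

```python
import re

# NL postcode rule from the data standard: "dddd AA" with a single space
NL_POSTCODE = re.compile(r"^[1-9][0-9]{3}\s[A-Z]{2}$")

def postcode_is_correct(value: str, country: str = "NL") -> bool:
    """Check a postcode against the data standard: NL postcodes must match the
    pattern above; foreign postal codes are free text of at most 20 characters."""
    value = value.strip()
    if country == "NL":
        return bool(NL_POSTCODE.match(value))
    return 0 < len(value) <= 20

print(postcode_is_correct("2595 AK"))          # True  - valid NL format
print(postcode_is_correct("2595AK"))           # False - missing the single space
print(postcode_is_correct("SW1A 1AA", "GB"))   # True  - foreign, free text up to 20 characters
```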

Page 14

Data Monitoring and Data Inspection tools

[Chart: bi-weekly Quality Monitoring, November 2005 through November 2006; y-axis: percentage of customers with an inconsistency (0.00% – 0.60%); series: % active customers with an inconsistency issue without customer impact, % active customers with an inconsistency issue with direct customer impact; target: 0.50%.]

• To measure is to know
• Meten = weten
• Messen ist Wissen

Page 15

Technical concept DQ Dashboard

[Diagram: System A and System B each deliver an extract into a Compare & Quality database. Tuned queries against this database produce CSV files with persistent inconsistencies (daily) and spot inconsistencies. These feed the weekly Active Compare statistics and Active Quality statistics (Excel), each with a details file used to correct the data, and a monthly Summary Sheet (Excel) and management report (PowerPoint) on quality issues.]

[Chart: quality issues during 2008; y-axis: percentage of customers with an inconsistency (0% – 6%); series: % active customers with a non-blocking data quality error, % active customers with a blocking data quality error, target line.]

Date        # active customers  # with impact  % blocking error  # without impact  % non-blocking error  Target
2008-02-08  592496              13279          2.2412%           22365             3.7747%               0.50%
2008-03-17  581846              12240          2.1036%           15708             2.6997%               0.50%
2008-04-23  730445              13451          1.8415%           14149             1.9370%               0.50%
2008-05-14  735347              12390          1.6849%            8871             1.2064%               0.50%
2008-06-17  747582              13897          1.8589%           12229             1.6358%               0.50%
2008-07-23  761830              13258          1.7403%           12264             1.6098%               0.50%
2008-08-11  766990              11818          1.5408%            8490             1.1069%               0.50%
2008-08-21  767645               5607          0.7304%            8991             1.1712%               0.50%
2008-09-08  768004               4424          0.5760%            5984             0.7792%               0.50%
2008-09-18  768244               3628          0.4722%            2624             0.3416%               0.50%
2008-10-22  768690               3034          0.3947%            1581             0.2057%               0.50%
2008-11-22  769156               2988          0.3885%             961             0.1249%               0.50%
2008-12-08  769156               2905          0.3777%             883             0.1148%               0.50%

(% blocking error = % active customers with a blocking data quality error; % non-blocking error = % active customers with a non-blocking data quality error.)
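The compare flow sketched above comes down to diffing the same attributes for the same customer key across two system extracts and writing every difference to a details file. Below is a minimal illustrative Python sketch under that assumption; the CSV layout, the primary_key column and the file names are hypothetical, not the actual T-Mobile interfaces.

```python
import csv
from datetime import date

def load_extract(path):
    """Load a system extract keyed by the customer primary key."""
    with open(path, newline="") as f:
        return {row["primary_key"]: row for row in csv.DictReader(f)}

def compare_extracts(path_a, path_b, attributes, out_path):
    """Write one row per attribute whose value differs between system A and
    system B (the kind of 'details file' that feeds the compare statistics)."""
    a, b = load_extract(path_a), load_extract(path_b)
    today = date.today().isoformat()
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["attribute", "primary_key", "value_in_system_a",
                         "value_in_system_b", "first_detected_date"])
        for key in a.keys() & b.keys():          # customers present in both systems
            for attr in attributes:
                if a[key].get(attr, "") != b[key].get(attr, ""):
                    writer.writerow([attr, key, a[key].get(attr, ""),
                                     b[key].get(attr, ""), today])

# Hypothetical usage with illustrative file names and attributes:
# compare_extracts("system_a.csv", "system_b.csv",
#                  ["account_number", "billing_address.zipcode"], "inconsistencies.csv")
```

Counting the rows in such a details file per attribute and per run date gives exactly the kind of compare statistics shown on the next page.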

Page 16

Compare statistics & details file examples

Compare statistics (active customers)

Attribute                              01-08-10  08-08-10  15-08-10  21-08-10  29-08-10  Delta  Impact
Customer records compared              2190973   2195402   2199179   2203809   2211593   7784
Inconsistencies for category Customer  1823      1773      1766      1760      3366      1606
Customer.type                          8         8         8         8         8         0
Customer.account_number                118       118       120       120       1694      1574   wrong bank account number on acceptgiro or direct debit
Customer.payment_method                96        99        100       97        122       25     wrong payment method, acceptgiro or direct debit
Customer.status                        33        39        39        38        39        1      customer inactive in system A and active in system B
Customer.name                          508       505       505       504       501       -3
Customer.billing_address.street        224       218       213       213       215       2      mail not sent to the correct address (SWL, invoice)
Customer.billing_address.housenr       178       173       171       171       172       1      mail not sent to the correct address (SWL, invoice)
Customer.billing_address.housenr_add   62        63        63        63        62        -1     mail not sent to the correct address (SWL, invoice)
Customer.billing_address.city          256       255       254       254       256       2      mail not sent to the correct address (SWL, invoice)
Customer.billing_address.zipcode       222       217       214       213       214       1      mail not sent to the correct address (SWL, invoice)
Customer.billing_address.country       10        10        11        11        13        2      mail not sent to the correct address (SWL, invoice)
Customer.billing_address.bill_line_2   108       68        68        68        70        2

Details file example

attribute                primary_key  found_in_system_a  found_in_system_b  value_in_system_a  value_in_system_b  first_detected_date  last_detected_date  customer_status
Customer.account_number  <<number>>   Y                  Y                  494291400          603238416          14-Aug-10            29-Aug-10           active
Customer.account_number  <<number>>   Y                  Y                  3243325                               14-Aug-10            29-Aug-10           active
Customer.account_number  <<number>>   Y                  Y                  420657096                             21-Aug-10            29-Aug-10           active
Customer.account_number  <<number>>   Y                  Y                  6287953                               28-Aug-10            29-Aug-10           active
Customer.account_number  <<number>>   Y                  Y                  3674570            2838689            28-Aug-10            29-Aug-10           active
Customer.account_number  <<number>>   Y                  Y                  2195502            463859987          28-Aug-10            29-Aug-10           active
Customer.account_number  <<number>>   Y                  Y                  546149928          559308787          28-Aug-10            29-Aug-10           active

Page 17

Data Quality check examples

attribute              primary_key  parent_key/ext_ref  source    value       first_detected_date  last_detected_date  customer_status
Cla_contact.birthname  123456677    1.11591288/0        System a  01/07/1955  03-Oct-10            04-Oct-10           active
Cla_contact.birthname  269859164    1.11689312/0        System a  BUSCH       03-Oct-10            04-Oct-10           active
Cla_contact.birthname  269859164    1.11846553/0        System a  CHEN        03-Oct-10            04-Oct-10           active
Cla_contact.birthname  269859164    1.11870069/0        System a  ROUS        03-Oct-10            04-Oct-10           active
Cla_contact.birthname  270042435    1.11872380/0        System a  VERHAGEN    03-Oct-10            04-Oct-10           active
Cla_contact.birthname  270060987    1.11890909/0        System a  SEWPERSAD   03-Oct-10            04-Oct-10           active
Cla_contact.birthname  270070652    1.11900568/0        System a  PIETERSE    03-Oct-10            04-Oct-10           active

attribute              primary_key  parent_key/ext_ref   source    value  first_detected_date  last_detected_date  customer_status
Cla_contact.initials   272300706    1.11580066/1905523   System b  JJ     03-Oct-10            04-Oct-10           active
Cla_contact.initials   272300752    1.11580066/1905569   System b  DJ     03-Oct-10            04-Oct-10           active
Cla_contact.initials   272300754    1.11580066/1905571   System b  MCA    03-Oct-10            04-Oct-10           active
Cla_contact.initials   272300810    1.11580066/1905627   System b  HJM    03-Oct-10            04-Oct-10           active
Cla_contact.initials   272300982    1.11580066/1905799   System b  HWA    03-Oct-10            04-Oct-10           active
Cla_contact.initials   272300989    1.11580066/1905806   System b  PFM    03-Oct-10            04-Oct-10           active
Cla_contact.initials   272301111    1.11580067/1905928   System b  r      03-Oct-10            04-Oct-10           active
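The slides do not spell out which rule flagged each of these records, but attribute-level quality checks of this kind are typically simple pattern tests on single fields. The sketch below is purely illustrative Python; the patterns and field handling are assumptions for the example, not T-Mobile's actual validation rules.

```python
import re

# Assumed example rules for the kinds of records shown above.
LOOKS_LIKE_DATE = re.compile(r"^\d{2}/\d{2}/\d{4}$")   # e.g. a birth date stored in a name field
VALID_INITIALS  = re.compile(r"^[A-Z]{1,6}$")          # initials: 1 to 6 capital letters

def check_birthname(value: str):
    if LOOKS_LIKE_DATE.match(value):
        return "birthname contains a date"
    return None

def check_initials(value: str):
    if not VALID_INITIALS.match(value):
        return "initials not in the agreed format"
    return None

records = [("Cla_contact.birthname", "01/07/1955"), ("Cla_contact.birthname", "BUSCH"),
           ("Cla_contact.initials", "JJ"), ("Cla_contact.initials", "r")]

for attribute, value in records:
    check = check_birthname if attribute.endswith("birthname") else check_initials
    issue = check(value)
    if issue:
        print(f"{attribute}: {value!r} -> {issue}")
```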

Page 18

Data Quality & Inconsistency Monitor – System Overview
Data Quality & Inconsistency Monitor – Total System Overview – Direct Impact

[Chart: monthly overview (Jan – Dec) of the number of customers and the percentage of total customers with inconsistencies per compare, with values anonymized as "X %" and "<<amount>>". Compares include: System A versus B, System C versus D, dbA versus dbB, System E / F retrospectively, System X Quality, System h / K Barrings, System Z Quality, Other (a, b, c, d, e, f), System L versus H, System K retrospectively, System L versus P, plus new compares. The overall target percentage is shown as a reference line.]

Page 19

DQM "driehoeksoverleg" (three-way consultation) and Tooling

[Diagram: three-way consultation between Functional management (+ users management), IT Service Management / IT Enablers and the business, governed by SLAs, KPIs and OLAs and covering cooperation, communication, structure, monitoring and problem management. A supporting table lists per system: originator, business partner, IT/NT Ops, development, DQM fix tools, products/services and tools under development; out-of-scope systems and projects are marked separately.]

Page 20

Data Quality improvement and the Six Sigma DMAIC methodology – How to systematically improve data quality

What is the problem in data quality? (Define)
• Direct customer impact?
• Indirect customer impact?
• No customer impact?

How big is the data quality problem? (Measure)

What is the root cause of the data problem? (Analyze)

What is the fix to the data problem? (Improve)
• Work-around required?
• Communicate solution/work-around to customers
• Create known error (KER)
• Design and develop data fix
• Implement data fix

How can we make the data improvement sustainable? (Control)
• Document new process
• Set in place monitoring
• New controls required?

Page 21

Cost of poor Data Quality (summary)

• In the period from July 13th till August 13th 2009, 'x amount' calls to Customer Service were related to data quality issues.
• 'y amount' of these calls needed a case to 2nd line.
• The majority of these calls were related to:
  • Incorrect bills
  • Loyalty (e.g. not receiving gifts or points)
  • Customers unable to use certain services (e.g. outgoing calls, service 'b')
• The total costs for TMNL in 31 days are € 'z amount'.
• This means that the direct initial costs on a yearly base for TMNL/Customer Service caused by data quality issues are € 'y amount'.

Page 22

TMNL Data Quality mission statement 2007

TMNL Mission statement
The goal of Data Quality Management is to initiate, stimulate, coordinate and support activities that improve and maintain the quality of data of T-Mobile Netherlands, so that data can be trusted and used to support company business processes internally and externally in the most efficient and effective way.

= € ?

Key areas of the function are:

• Representing the business interest in data quality for customer, contract and product data in all parts of T-Mobile Netherlands where data quality is involved, to ensure that the elements of data management are part of operational processes.
• Ensuring that the appropriate tools are in place to measure data quality, regularly reporting on the status of data quality, and making sure that where there are problems with data quality or inconsistencies, the appropriate measures are taken to solve them.
• Evangelising the information culture, representing the right behaviours for a mature information-based organization, and raising the profile of Data Quality as a business issue by making the business value of data quality clear.
• Continuously looking for new opportunity areas where data quality can be improved, and gaining the support of the relevant departments to undertake new data quality initiatives.

Page 23

Page 24

Data Management Maturity Models

[Figure: maturity curve plotted against Risk and Reward.]

Page 25

Maslow Maturity Model: Hierarchy of Needs (1943)

Page 26

Maturity Models Overview 2010

• Maslow's Hierarchy of Needs (1943); the psychologist proposed such a model for 5 levels of human needs.
• Richard Nolan's SGM (Stages of Growth Model), 1970 – 1979; maturity of automation.
• CMM – Capability Maturity Model for Software (also known as CMM and SW-CMM), published by the Software Engineering Institute (SEI) at Carnegie Mellon University; defines the software development maturity of organizations based on procedures and processes.
• CMMI-SE/SW CMM Integration (CMMI); successor of CMM. http://iea.wikidot.com/cmmi
• CMM - ITSM; IT Service Management
• Data Warehouse Maturity models
• VDC Maturity Model – the Virtual Data Center (VDC) of tomorrow: the data center where virtualization technologies work together to deliver applications.
• Internet Marketing Maturity Models
• Gartner's web analytics maturity model, presented by Bill Grassman at eMetrics San Francisco, analyzes the vendors themselves in comparison to the data they can provide.
• The Architecture Maturity Model is organised into 5 levels, based on the Carnegie Mellon Software Engineering Institute's Capability Maturity Model for Software.

Page 27

Maturity Models 2010 (continued)

• New Services Maturity Model – technology professional services maturity model.
• The Professional Services Maturity Model – the study has been developed to measure the correlation between process maturity and service performance excellence.
• Project management maturity model
• Corporate Sustainability: Capability Maturity Model – the first step in developing a sustainability program is to assess where your firm is and where you want it to be on the following five-level corporate sustainability capability maturity model.
• BPM Maturity Model – alignment to a BPM maturity model helps to ensure that the overall organisational BPM initiative is in alignment with a solid internal BPM Architecture Framework.
• SOA Maturity Model – has become a great foundation for companies worldwide that have approached application integration using a service-oriented architecture (SOA). It provides IT decision makers with a simple framework for benchmarking the strategic value of their SOA planning and implementation, and a model for visualizing future success.
• E-Business Maturity Model

Page 28

Why do you need a Data Maturity Model?

Page 29

Data Quality Management – From Reactive to Adaptive Data Management

Our approach to managing data quality is to continue the operative cleaning started with Phoenix and, in parallel, establish conceptual data management to reduce the required cleaning effort.

Reactive – Incident & problem management; clean/repair data when problems become visible.
  • Data Monitoring
  • Create incidents/problems
  • Work-around scripts
  • Clean data manually or via script
  • Find and fix the root cause
Current focus is on reactive data management: troubleshooting when problems get identified.

Proactive – Preventive testing & data inconsistency monitoring in order to proactively identify and correct errors/problems.
  • ITT and UAT testing
  • End-to-end testing
  • Data Acceptance testing
  • Monitor data quality
  • KPIs for data quality in SLAs & PMs

Adaptive – Make sure new projects and changes are in line with the business and data model.
  • Develop business model
  • Logical data model
  • Technical data model
  • Data Distribution matrix
  • Glossary of terms
  • Data standard
  • GUI design standards
  • Interface architecture
  • Business & validation rules
  • Contact/channel matrix

[Chart: monthly percentage of customers with an inconsistency (0% – 7%, Jan – Dec) for Clarify Quality, My T-Mobile, BSCS-HLR, ADB-BSCS and the Clarify – BSCS inconsistency.]

Page 30

T-Mobile Enterprise Data Management Maturity model 2006

[Figure: maturity stages Initial, Repeatable, Defined, Managed, Optimizing plotted against Risk and Reward.]

People – Who is involved and what contributions must they make?
Process – What activities must be performed?
Technology – What investments in technology must be made?
Risk and Reward – What risks does the organization face at the current stage and what could it gain from progressing forward?

Page 31

The IBM model for Data Management Maturity (2008)
Stages: Stage 1: Uncertainty (ad hoc) / Stage 2: Awakening (repeatable) / Stage 3: Enlightenment (defined) / Stage 4: Wisdom (managed) / Stage 5: Certainty (optimizing)

1. Strategy and Understanding (Executive Interest; Alignment of Business and Information Strategy; Communication on Data)
  Stage 1: Execs are not aware of data governance. No coordinated information strategy. Projects executed in an ad hoc way. No communication on data projects or results.
  Stage 2: IT execs support data management. Limited, informal talk on data initiatives. Elements of an information strategy exist. Initiatives coordinated on a stand-alone basis.
  Stage 3: IT and business teaming on data-related projects. Cross-department information strategy in place, aligned with business strategy. Regular communications on data projects & results.
  Stage 4: Execs support data governance financially, incl. personal emphasis. Benefits tracked; strategy adjusted to maximize benefits and support business priorities.
  Stage 5: Execs manage data assets as a driver of efficiency, performance and competitive differentiation. Partners support the information strategy. Data is 'talk of the town'.

2. Organization (Business & IT roles in Information Lifecycle; Data Skills, Learning and Training)
  Stage 1: Business/IT roles in data management and projects not clearly defined. Inconsistent business participation. Data skills not always available.
  Stage 2: Data management roles and responsibilities in business/IT are defined. Data management skills and training available across the IT organization.
  Stage 3: Roles & responsibilities assigned, not always executed. Business directing data management priorities. Consistent development of data skills.
  Stage 4: Strategic business planning leads efforts to bring information innovation into business plans. Deep role-based training on data management in business & IT.
  Stage 5: Business/IT roles implemented and adaptive. Changing internal and external environments supported by ongoing development of internal and external data skills.

3. Processes (Processes for obtaining information; Customer, Service, Product Data Definition Processes; Alignment of Business Processes and Data Management)
  Stage 1: Data collection takes up most of the time. Sources of data often silo-ed. Information is non-integrated. Changes to data are uncontrolled. No common data definitions.
  Stage 2: Some data integration. Controls developed around changes of data definitions. Some common data definitions. Different guidelines and processes around definitions and requirements gathering.
  Stage 3: Services-based data apps. Integration of data silos. Key data available. Data management processes rationalized. Common data definitions, shared between business and IT. Controlled changes.
  Stage 4: Business process integration via information services. Data is seamless, shared and available throughout processes, enabling process innovation. Definitions shared and centrally managed.
  Stage 5: Internal and external data shared and readily available. Additional sources easily added. High level of standards-centric information definition, creation and use across business and IT.

4. Governance (Data Governance Organization; Stewardship & Ownership; Policies & Procedures; Data aspects in projects/processes)
  Stage 1: No data governance organization, policies or standards. Data aspects of business & IT projects seldom linked or addressed. No data stewardship. Data owned at departmental level.
  Stage 2: Stronger, informal governance role and policies exist. Departmental processes address data aspects of/between IT/business projects. Data stewardship and ownership at departmental level.
  Stage 3: Governance organization in place. Standard processes to address data aspects of projects. Data stewardship implemented. Accountability and authority over data definitions and changes coordinated.
  Stage 4: Data governance in place, linked to key internal processes. Preventive action. Deliveries of projects that address data aspects are reviewed. Data linked to exec performance. Policies stored and accessible.
  Stage 5: Governance extended to business partners. Prevention is the main focus. All demand and supply processes address data aspects. Stewardship extended to business partners. Adherence to policies is enforced and trained.

5. Data and Data Quality (Data Architecture & Standards; Master Data Management; DQ Management; DQ Metrics & Standards; Metadata Management)
  Stage 1: Decisions cannot be made due to unreliable data: no quality checks. Ad hoc efforts to meet quality needs. Manual effort to coordinate master data. Capture of metadata when it adds value. No version of the truth.
  Stage 2: Data quality monitored. Preventative data quality processes. Ad hoc correction efforts. Loose, not uniform, master data management. Silos of metadata. High-level architectural standards. Multiple versions of the truth.
  Stage 3: Enterprise data architecture developed and managed. Quality requirements governed by business/IT. Processes to validate data quality compliance. Master data owned and controlled across processes and departments. Metadata captured and used consistently.
  Stage 4: Flexible data architecture: information as a service. DQ metrics embedded in processes and systems. DQ approach adjusted when business strategy changes. Metadata integrated across processes/technologies. Single version of the truth.
  Stage 5: Partners managed to use the data architecture. Master data controlled across business partners. DQ meets industry quality standards. Self-healing DQ capabilities. Metadata capturing and exchange with business partners.

Page 32

Page 33

Key Elements of Data Maturity

• Level 1: Ad Hoc (1998 – 2004)
  • Executives are not aware of data management
  • No data management organisation, policies or standards
  • Ad hoc efforts to meet quality needs (project oriented)

• Level 2: Repeatable (2005 – 2009)
  • Full-time Data Manager (role)
  • Some common data definitions
  • Data stewardship
  • Data Quality monitoring: "To measure is to know"

• Level 3: Defined (2009 – 201x)
  • Data Quality Management/Governance processes in place
  • Meta Data

• Level 4: Managed
  • Data Quality Budget
  • Preventive

• Level 5: Optimized

Page 34

DQM maturity timelines (2002 – 2010)

Stage 1 "Ad hoc" → Stage 2 "Repeatable" → Stage 3 "Defined"

[Timeline diagram; milestones along the 2002 – 2010 axis:]
• Data Quality background
• Data Quality targets officialized
• New Data Quality compares were created
• Data Manager role was defined
• Data inconsistency reports were created
• Data inconsistency meetings were held
• The first draft for a Customer Data Quality standard was created
• Financial effect of DQ issues for Customer Service made visible
• First draft of Product and Contract Data Standard
• Customer Data Standard realized and officialized
• Data Quality Dashboard introduced
• Data cleaning was started to prepare for customer data migration
• A start was made with measuring data inconsistencies
• Data Manager role officialized
• Standardization in reporting
• Second Data Manager appointed

Page 35

Summary: Why do we need a data maturity model?

• You need to know at what stage you currently are and why you are there (as-is)
• You can understand the risks associated with undervalued data management practices
• It helps you understand the benefits and costs associated with a move to the next stage
• To improve, you have to change the entire culture of your organization – from personnel to technology to management strategies
• You can accurately set goals for data maturity (and it takes time)
• This will help you to move to the next stage (to-be)

Current Stage + Best practice → Roadmap for mature data management

Page 36

Thank you for your attention.

Page 37

Any Questions?