Source: DAMA-NCR, dama-ncr.org/DamaDay/MonetizingDataManagement.pdf

Monetizing Data Management


[email protected]



Peter Aiken

• DoD Computer Scientist
  – Reverse Engineering Program Manager/Office of the Chief Information Officer (1992-1997)

• Visiting Scientist
  – Software Engineering Institute/Carnegie Mellon University (2001-2002)

• DAMA International President (http://dama.org)

– 2001 DAMA International Individual Achievement Award (with Dr. E. F. "Ted" Codd)
– 2005 DAMA Community Award

• Founding Advisor/International Association for Information and Data Quality (http://iaidq.org)

• Founding Advisor/Meta-data Professionals Organization (http://metadataprofessional.org)

• Founding Director Data Blueprint 1993

• BS VCU 1981 Information Systems & Management
• MS VCU 1985 Information Systems
• PhD GMU 1989 Information Technology Engineering
• Full time in information technology since 1981
• IT engineering research and project background
• University teaching experience since 1979
• Seven books and dozens of articles
• Research Areas

– reengineering, data reverse engineering, software requirements engineering, information engineering, human-computer interaction, systems integration/systems engineering, strategic planning, and DSS/BI

• Director
  – George Mason University/Hypermedia Laboratory (1989-1993)


Monetizing Data Management
• Why is it important?
• Concretizing:
  – State Agency Time & Leave Tracking – $10 million USD annually
  – ERP Implementation – $1 million USD on a large project
  – Data Warehouse Quality Analysis – $5 billion USD US DoD (prevention)
  – MDM British Telecom rollout – £250 (small investment)
• Non-Monetized Example – different measures
• ERP Implementation Legal Case – $5,355,450 CAN damages/penalties


Traditional Systems Life Cycle
• As the user requested
• As the sales executive described it
• As published in the functional description
• As specified in the request for proposals
• As developed by the winning contractor
• As installed at the operational sites
• As accredited for operation
• How the project was documented
• How the help desk supported it
• How the customer was billed


[Image: demotivational poster from http://www.despair.com/]


IT Project Failure Rates
Recent IT project failure rate statistics can be summarized as follows:
– Carr (1994): 16% of IT projects completed on time, within budget, with full functionality
– OASIG Study (1995): 7 out of 10 IT projects "fail" in some respect
– The Chaos Report (1995):
  • 75% blew their schedules by 30% or more
  • 31% of projects will be canceled before they ever get completed
  • 53% of projects will cost over 189% of their original estimates
  • 16% of projects are completed on time and on budget
– KPMG Canada Survey (1997): 61% of IT projects were deemed to have failed
– Conference Board Survey (2001): only 1 in 3 large IT project customers were very "satisfied"
– Robbins-Gioia Survey (2001): 51% of respondents viewed their large IT implementation project as unsuccessful
– McDonald's Innovate (2002): automating the fast-food network, from fry temperature to number of burgers sold; $180M USD write-off
– Ford Everest (2004): replacing internal purchasing systems; $200 million over budget
– FBI (2005): blew $170M USD on a suspected-terrorist database; "start over from scratch"

http://www.it-cortex.com/stat_failure_rate.htm (accessed 9/14/02)
New York Times, 1/22/05, p. A31


Organizations Surveyed


• Results from more than 500 organizations
• 32% government
• Appropriate public company representation
• Enough data to demonstrate that European organizations' DM practices are generally more mature

[Pie chart: Public Companies 58%; State Government Agencies 17%; Federal Government 11%; International Organizations 10%; Local Government 4%]

• In 25 years: % of DM organizations labeled "successful"
[Bar chart comparing 1981 and 2007 survey responses: Successful; Don't know/too soon to tell; Does not exist]


DM Origins – Which arrives first – DM or DBMS?
• A key indicator of organizational awareness
• 75% reacting instead of anticipating
• Best practices are obvious

[Chart, 1981 vs. 2007: DM 1st 26% → 9%; DBMS 1st 68% → 75%; Simultaneously 6% → 6%]


Data Management Involvement
[Chart: DM's role (Initiative Leader / Initiative Involvement / Not Involved) across initiatives]
• Data Warehousing
• XML
• Data Quality
• Customer Relationship Management
• Master Data Management
• Customer Data Integration
• Enterprise Resource Planning
• Enterprise Application Integration


Why Data Projects Fail, by Joseph R. Hudicka
• Assessed 1,200 migration projects
  – Surveyed only experienced migration specialists who had done at least four migration projects
• The median project costs over 10 times the amount planned
• Biggest challenges: bad data, missing data, duplicate data
• The survey did not consider projects that were cancelled largely due to data migration difficulties
• "... problems are encountered rather than discovered"

[Chart: median project expense vs. median project cost, $0-$500,000 scale]

Joseph R. Hudicka, "Why ETL and Data Migration Projects Fail," Oracle Developers Technical Users Group Journal, June 2005, pp. 29-31


Monetizing - from Wikipedia


• Monetization is the process of converting or establishing something into legal tender.

• It usually refers to the printing of banknotes by central banks, but things such as gold, diamonds and emeralds, and art can also be monetized.

• Even intrinsically worthless items can be made into money, as long as they are difficult to make or acquire.


Root Cause Analysis


• Symptom of the problem – the weed; above the surface; obvious
• The underlying cause – the root; below the surface; not obvious
• Poor information management practices – did not hire Adastra!

Expanding DM Scope
• Database Administration (DBA), 1950-1970: database design, database operation
• Data Administration (DA), 1970-1990: data requirements analysis, data modeling
• Enterprise Data Administration (EDA), 1990-2000: organization-wide DM coordination, organization-wide data integration
• Data Management (DM), 2000-: data stewardship, data use, and a growing scope spanning data governance, data quality, data security, analytics, data compliance, data mashups, business rules (more ...)


A Model Specifying Relationships Among Important Terms
[Built on a definition by Dan Appleton, 1983]

1. Each FACT combines with one or more MEANINGS.
2. Each specific FACT and MEANING combination is referred to as a DATUM.
3. An INFORMATION is one or more DATA that are returned in response to a specific REQUEST.
4. INFORMATION REUSE is enabled when one FACT is combined with more than one MEANING.
5. INTELLIGENCE is INFORMATION associated with its USES.

Wisdom and knowledge are often used synonymously.
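The five numbered statements describe a small data model. A minimal illustrative sketch (the Python names Datum and information are mine, not the deck's):

```python
# Sketch of the term model: a Datum pairs a Fact with a Meaning; Information
# is the data returned for a specific Request.
from dataclasses import dataclass

@dataclass(frozen=True)
class Datum:
    fact: str      # e.g. "76"
    meaning: str   # e.g. "patient pulse in beats per minute"

def information(data: list[Datum], request: str) -> list[Datum]:
    """Return the data whose meaning answers a specific request."""
    return [d for d in data if request.lower() in d.meaning.lower()]

# One fact combined with two meanings enables information reuse.
data = [Datum("76", "patient pulse in beats per minute"),
        Datum("76", "patient weight in kilograms")]
print(information(data, "pulse"))   # -> [Datum(fact='76', meaning='patient pulse ...')]
```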



[Diagram: Data Management framework relating Data Program Coordination, Organizational Data Integration, Data Stewardship, Data Development, and Data Support Operations; labeled flows include organizational strategies, goals, direction, guidance, feedback, integrated models, standard data, business data, application models & designs, implementation, data asset use, and business value]


Data Management Practice Areas
• Data Program Coordination – manage data coherently
• Organizational Data Integration – share data across boundaries
• Data Stewardship – assign responsibilities for data
• Data Development – engineer data delivery systems
• Data Support Operations – maintain data availability




Data Management Capability Maturity Model Levels
• Initial (1): Our DM practices are ad hoc and dependent upon "heroes" and heroic efforts.
• Repeatable (2): We have DM experience and have the ability to implement disciplined processes.
• Defined (3): We have experience that we have standardized so that all in the organization can follow it.
• Managed (4): We manage our DM processes so that the whole organization can follow our standard DM guidance.
• Optimizing (5): We have a process for improving our DM capabilities.

The CMM is one concept for process improvement; others include Norton Stage Theory, TQM, TQdM, TDQM, and ISO 9000. All focus on understanding current processes and determining where to make improvements.


Assessment Components

Data Management Practice Areas
• Data program coordination – DM is practiced as a coherent and coordinated set of activities.
• Organizational data integration – delivery of data in support of organizational objectives; the currency of DM.
• Data stewardship – designating specific individuals as caretakers for certain data.
• Data development – efficient delivery of data via appropriate channels.
• Data support – ensuring reliable access to data.

Capability Maturity Model Levels (examples of practice maturity)
• 1 – Initial: Our DM practices are ad hoc and dependent upon "heroes" and heroic efforts.
• 2 – Repeatable: We have DM experience and have the ability to implement disciplined processes.
• 3 – Documented: We have standardized DM practices so that all in the organization can perform them with uniform quality.
• 4 – Managed: We manage our DM processes so that the whole organization can follow our standard DM guidance.
• 5 – Optimizing: We have a process for improving our DM capabilities.


Data Management Practices Assessment (DMPA)
[Diagram: the five practice areas (Data Program Coordination, Organizational Data Integration, Data Stewardship, Data Development, Data Support Operations) assessed against five maturity levels (Initial I, Repeatable II, Documented III, Managed IV, Optimizing V); focus areas: implementation and access; guidance and facilitation]
• CMU's Software Engineering Institute (SEI) collaboration
• Results from hundreds of organizations in various industries, including public companies, state government agencies, the federal government, and international organizations
• Defined industry standard
• Steps toward defining the data management "state of the practice"

Academic Research Findings
[Bar chart, values by industry: Retail 49%; Consulting 39%; Air Transportation 21%; Food Products 20%; Construction 20%; Steel 20%; Automobile 19%; Publishing 18%; Industrial Instruments 18%; Telecommunications 17%]
A 10% improvement in data usability raises productivity (sales per employee) by 14.4%, or $55,900.

"Measuring the Business Impacts of Effective Data," by Anitesh Barua, Deepa Mani, and Rajiv Mukherjee


Academic Research Findings
[Chart: projected increase in sales (in $M) due to a 10% improvement in data usability on productivity (sales per employee)]

"Measuring the Business Impacts of Effective Data," by Anitesh Barua, Deepa Mani, and Rajiv Mukherjee

Academic Research Findings
[Chart: projected impact of a 10% improvement in data quality and sales mobility on Return on Equity]

"Measuring the Business Impacts of Effective Data," by Anitesh Barua, Deepa Mani, and Rajiv Mukherjee


Academic Research Findings
[Chart: projected impact of a 10% increase in intelligence and accessibility of data on Return on Assets]

"Measuring the Business Impacts of Effective Data," by Anitesh Barua, Deepa Mani, and Rajiv Mukherjee



Monetization: Time & Leave Tracking
At least 300 employees are spending 15 minutes per week tracking leave/time.


Capture Cost of Labor/Category


Computer Labor as Overhead: Routine Data Entry

District-L (as an example)           | Leave Tracking | Time Accounting
Employees                            | 73             | 50
Number of documents                  | 1,000          | 2,040
Timesheets/employee                  | 13.70          | 40.8
Time spent (hours/timesheet)         | 0.08           | 0.25
Hourly cost                          | $6.92          | $6.92
Additive rate                        | $11.23         | $11.23
Semi-monthly cost per timekeeper     | $12.31         | $114.56
Total semi-monthly timekeeper cost   | $898.49        | $5,727.89
Annual cost                          | $21,563.83     | $137,469.40
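The table's arithmetic can be reproduced directly. A minimal sketch (function and argument names are mine); small differences from the slide's totals are rounding:

```python
# Reproduce the District-L timekeeper cost arithmetic from the table above.
def annual_cost(timekeepers, documents, hours_per_timesheet, additive_rate,
                periods_per_year=24):  # semi-monthly -> 24 pay periods/year
    timesheets_per_keeper = documents / timekeepers
    per_keeper = timesheets_per_keeper * hours_per_timesheet * additive_rate
    total_per_period = per_keeper * timekeepers
    return per_keeper, total_per_period, total_per_period * periods_per_year

# Leave tracking: 73 timekeepers, 1,000 documents, 0.08 h each, $11.23/h loaded rate
print(annual_cost(73, 1000, 0.08, 11.23))   # ~($12.31, $898, $21,562)
# Time accounting: 50 timekeepers, 2,040 documents, 0.25 h each
print(annual_cost(50, 2040, 0.25, 11.23))   # ~($114.55, $5,727, $137,455)
```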

Annual Organizational Totals
• Range $192,000 - $159,000/month
• $100,000 Salem
• $159,000 Lynchburg
• $100,000 Richmond
• $100,000 Suffolk
• $150,000 Fredericksburg
• $100,000 Staunton
• $100,000 NOVA
• $800,000/month, or $9,600,000 annually
• Awareness of the cost of things considered overhead




ERP Implementation Success
• Most ERP implementations today result in cost and schedule overruns (courtesy of the Standish Group):
  – On time, within budget, as planned: 10%
  – Cancelled: 35%
  – Overrun: 55%

[Chart: against 100% baselines, overrun projects averaged 178% of planned cost and 230% of planned schedule, while delivering 59% of planned functionality (41% short)]


Why?



Predicting Engineering Problem Characteristics

Legacy System #1: Payroll
• Platform: UniSys; OS: OS; age in 1998: 21 years
• Data structure: DMS (network)
• Physical records: 4,950,000; logical records: 250,000
• Relationships: 62; entities: 57; attributes: 1,478

Legacy System #2: Personnel
• Platform: Amdahl; OS: MVS; age in 1998: 15 years
• Data structure: VSAM/virtual database tables
• Physical records: 780,000; logical records: 60,000
• Relationships: 64; entities: 4/350; attributes: 683

New System
• Platform: WinTel; OS: Win'95; age in 1998: new
• Data structure: client/server RDBMS
• Records: 250,000 logical / 600,000 physical
• Relationships: 1,034 logical / 1,020 physical
• Entities: 1,600 logical / 2,706 physical
• Attributes: 15,000 logical / 7,073 physical


"Extreme" Data Engineering• 2 person months = 40 person days• 2,000 attributes mapped onto 15,000• 2,000/40 person days = 50 attributes

per person dayor 50 attributes/8 hour = 6.25 attributes/hour

and• 15,000/40 person days = 375 attributes

per person dayor 375 attributes/8 hours = 46.875 attributes/hour

• Locate, identify, understand, map, transform, document, QA at a rate of -

• 52 attributes every 60 minutes or .86 attributes/minute!

35
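As a quick sanity check on those rates, a small worked calculation (mine, not from the deck):

```python
# Verify the "extreme" attribute-mapping rates quoted above.
person_days = 2 * 20          # 2 person-months at ~20 working days each = 40
src, dst = 2_000, 15_000      # source attributes mapped onto target attributes

src_per_hour = src / person_days / 8      # 6.25
dst_per_hour = dst / person_days / 8      # 46.875
combined = src_per_hour + dst_per_hour    # attributes handled per hour

print(src_per_hour, dst_per_hour, combined, combined / 60)
# 6.25 46.875 53.125 0.885...  (the slide rounds to ~52/hour, ~0.86/minute)
```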


Challenge
• "Green screen" legacy system to be replaced with a Windows Icons Mice Pointers (WIMP) interface; and
• Major changes to operational processes – 1 screen to 23 screens
• Management didn't think the workforce could adjust to simultaneous changes
• Question: "How big a change will it be to replace all instances of person_identifier with social_security_number?"
• Answer (from "big" consultants): "Not a very big change."


Reverse Engineering PeopleSoft
Inputs analyzed and integrated via the MAT: the installed PeopleSoft system, queries to PeopleSoft internals, PeopleSoft external RDBMS tables, and the printed PeopleSoft data model, yielding component metadata (system structure, data, workflow, and post-derivation metadata) from the implementation representation.

Metadata uses:
• System structure metadata – requirements verification and system change analysis
• Data metadata – data conversion, data security, and user training
• Workflow metadata – business practice analysis and realignment


PeopleSoft Process Metadata
Hierarchy: Home Page → Business Process → Business Process Component → Business Process Component Step, where each level relates to one or more entries at the next level down.


Example Query Outputs
Recovered object counts: processes (39), home pages (7), menu groups (8), components (180), step names (822), menu names (86), panels (1,421), menu items (1,149), menu bars (31), fields (7,073), records (2,706), parents (264), reports (347), children (647)

Metadata PeopleSoft Does Not Possess
[Diagram: the same object model annotated with counts recovered by reverse engineering but absent from PeopleSoft's own metadata: (41), (8), (182), (847), (949), (86), (281), (1,259), (1,916), (5,873), (264), (647), (708), (647), (25,906), (347)]


Resolution

Quantity | System component                 | Time to make change | Labor hours
1,400    | Panels                           | 15 minutes          | 350
1,500    | Tables                           | 15 minutes          | 375
984      | Business process component steps | 15 minutes          | 246

Total: 971 hours × $200/hour = $194,200; × 5 upgrades ≈ $1,000,000
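A compact reproduction of the table's arithmetic (names are mine):

```python
# Cost of replacing person_identifier with social_security_number everywhere,
# at 15 minutes (0.25 h) per affected component.
components = {"panels": 1_400, "tables": 1_500, "process steps": 984}

hours = sum(components.values()) * 0.25               # 971.0 labor hours
cost_per_upgrade = hours * 200                        # $194,200 at $200/hour
print(hours, cost_per_upgrade, cost_per_upgrade * 5)  # ~$971,000 over 5 upgrades
```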



Legacy System Migration to ERP: MDM
• Challenge
  – Millions of NSNs/SKUs
  – Key and other data stored in clear-text/comment fields
  – Original suggestion was a manual approach to text extraction
  – Left the MDM structuring problem unsolved
• Solution
  – Proprietary, improvable text extraction process
  – Converted non-tabular data into tabular data
  – Saved a minimum of $5 million
  – Literally person-centuries of work


An Iterative Approach to MDM Structuring

Rev# | Unmatched items (% total) | Ignorable NSNs (% total) | Items matched | Avg extracted/item | Matched (% total) | Items extracted
1    | 329,948 (31.47%)  | 14,034 (1.34%)    | N/A     | N/A       | N/A    | 264,703
2    | 222,474 (21.22%)  | 73,069 (6.97%)    | N/A     | N/A       | N/A    | 286,675
3    | 216,552 (20.66%)  | 78,520 (7.49%)    | N/A     | N/A       | N/A    | 287,196
4    | 340,514 (32.48%)  | 125,708 (11.99%)  | 582,101 | 1.1000222 | 55.53% | 640,324
...  | ...               | ...               | ...     | ...       | ...    | ...
14   | 94,542 (9.02%)    | 237,113 (22.62%)  | 716,668 | 1.1142914 | 68.36% | 798,577
15   | 94,929 (9.06%)    | 237,118 (22.62%)  | 716,276 | 1.1139282 | 68.33% | 797,880
16   | 99,890 (9.53%)    | 237,128 (22.62%)  | 711,305 | 1.1153008 | 67.85% | 793,319
17   | 99,591 (9.50%)    | 237,128 (22.62%)  | 711,604 | 1.1154392 | 67.88% | 793,751
18   | 78,213 (7.46%)    | 237,130 (22.62%)  | 732,980 | 1.2072812 | 69.92% | 884,913
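The deck does not show the proprietary extraction process itself. Purely as an illustrative sketch of the general idea, with hypothetical field names and patterns: each revision of rules lifts structured fields out of clear-text comment fields, and whatever stays unmatched drives the next revision:

```python
import re

# Hypothetical rules: each revision adds patterns that lift key fields out of
# free-text comment fields into tabular columns; items no rule matches are
# reviewed and drive the next revision of the rules.
RULES = {
    "voltage": re.compile(r"(\d+(?:\.\d+)?)\s*V(?:OLTS?)?\b", re.I),
    "thread":  re.compile(r"\b(\d+/\d+-\d+)\s*THREAD\b", re.I),
}

def extract(comment: str) -> dict[str, str]:
    """Return the tabular fields recoverable from one comment field."""
    return {field: m.group(1)
            for field, rx in RULES.items() if (m := rx.search(comment))}

comments = ["ADAPTER, 28 VOLTS, 3/8-16 THREAD", "WIDGET, OLIVE DRAB"]
rows = [extract(c) for c in comments]
unmatched = [c for c, r in zip(comments, rows) if not r]  # feeds next revision
print(rows, unmatched)
```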


Quantitative Benefits

Time needed to review all NSNs once over the life of the project:
  NSNs: 2,000,000
  Average time to review & cleanse: 5 minutes
  Total time: 10,000,000 minutes

Time available per resource over a one-year period:
  48 work weeks/year × 5 work days/week × 7.5 work hours/day = 450 work minutes/day
  Total work minutes/year: 108,000

Person-years required to cleanse each NSN once prior to migration:
  10,000,000 minutes needed / 108,000 minutes available per person-year = 92.6 person-years

Resource cost to cleanse NSNs prior to migration:
  Average SME salary/year (not including overhead): $60,000
  Projected years required to cleanse / total DLA person-years saved: 93
  Total cost to cleanse / total DLA savings to cleanse NSNs: $5.5 million

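The same arithmetic in a compact sketch (mine):

```python
# Person-year and cost arithmetic for cleansing 2M NSNs manually.
nsns, minutes_each = 2_000_000, 5
minutes_needed = nsns * minutes_each                 # 10,000,000

minutes_per_person_year = 48 * 5 * 7.5 * 60          # 108,000
person_years = minutes_needed / minutes_per_person_year
print(person_years, person_years * 60_000)           # ~92.6 years, ~$5.56M at $60k/yr
```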


Business Implications
• National Stock Number (NSN) discrepancies
  – If NSNs in LUAF, GABF, and RTLS are not present in the MHIF, these records cannot be updated in SASSY
  – Additional overhead is created to correct data before performing the real maintenance of records
• Serial number duplication
  – If multiple items are assigned the same serial number in RTLS, the traceability of those items is severely impacted
  – Approximately $531 million of SAC 3 items have duplicated serial numbers
• On-hand quantity discrepancies
  – If the LUAF O/H QTY and the number of items serialized in RTLS conflict, there can be no clear answer as to how many items a unit actually has on hand
  – Approximately $5 billion of equipment does not tie out between the LUAF and RTLS



Seven Sisters from British Telecom


Thanks to Dave Evans



Friendly Fire Deaths Traced to Dead Battery

Date: Tue, 26 Mar 2002 10:47:52 -0500
From: Jamie McCarthy <[email protected]>
Subject: Friendly Fire deaths traced to dead battery

In one of the more horrifying incidents I've read about, U.S. soldiers and allies were killed in December 2001 because of a stunningly poor design of a GPS receiver, plus "human error."

http://www.washingtonpost.com/wp-dyn/articles/A8853-2002Mar23.html

A U.S. Special Forces air controller was calling in GPS positioning from some sort of battery-powered device. He "had used the GPS receiver to calculate the latitude and longitude of the Taliban position in minutes and seconds for an airstrike by a Navy F/A-18."

According to the *Post* story, the bomber crew "required" a "second calculation in 'degree decimals'" -- why the crew did not have equipment to perform the minutes-seconds conversion themselves is not explained.

The air controller had recorded the correct value in the GPS receiver when the battery died. Upon replacing the battery, he called in the degree-decimal position the unit was showing -- without realizing that the unit is set up to reset to its *own* position when the battery is replaced.

The 2,000-pound bomb landed on his position, killing three Special Forces soldiers and injuring 20 others.

If the information in this story is accurate, the RISKS involve replacing memory settings with an apparently-valid default value instead of blinking 0 or some other obviously-wrong display; not having a backup battery to hold values in memory during battery replacement; not equipping users to translate one coordinate system to another (reminiscent of the Mars Climate Orbiter slamming into the planet when ground crews confused English with metric); and using a device with such flaws in a combat situation.
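The conversion the bomber crew reportedly could not perform is a one-liner; a minimal illustrative sketch (mine):

```python
def dms_to_decimal(degrees: int, minutes: int, seconds: float) -> float:
    """Convert degrees/minutes/seconds to decimal degrees ('degree decimals')."""
    sign = -1 if degrees < 0 else 1
    return sign * (abs(degrees) + minutes / 60 + seconds / 3600)

print(dms_to_decimal(34, 31, 12.0))   # 34.52 -- e.g. a latitude in decimal degrees
```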



Messy Sequencing Towards Arbitration
• April – Plaintiff (Company X) requests a recommendation from the ERP vendor; Defendant (Company Y) responds indicating "Preferred Specialist" status.
• July – Plaintiff contracts Defendant to implement the ERP and convert legacy data; Defendant begins implementation.
• January – Plaintiff realizes a key milestone has been missed; Defendant stammers an explanation of "bad" data.
• July – Plaintiff slows, then stops, Defendant invoice payments; Defendant removes its project team.
• Plaintiff files an arbitration request as governed by its contract with Defendant.


Points of Contention
• Who owned the risks?
• Who was the project manager?
• Was the data of poor quality?
• Did the contractor (Company Y) exercise due diligence?
• Was their methodology adequate?
• Were required standards of care followed, and were the work products of required quality?


Discovery

• In documents and pre-trial testimony, Company Y blamed conversion failure on "bad" data

• Expert witnesses were introduced by both X & Y



Expert Report
Ours provided evidence that:
1. Company Y's conversion code introduced errors into the data.
2. Some data that Company Y converted was of measurably lower quality than the quality of the data before the conversion.
3. Company Y caused harm by not performing an analysis of Company X's legacy systems, and the required analysis was not a part of any project plan used by Company Y.
4. Company Y caused harm by withholding specific information relating to the on-site consultants' views on potential project success.


FBI & Canadian Social Security Gender Codes
1. Male
2. Female
3. Formerly male, now female
4. Formerly female, now male
5. Uncertain
6. Won't tell
7. Doesn't know
8. Male soon to be female
9. Female soon to be male

Hypothesized extensions contributed by a Chicago DAMA member:
10. Psychologically female, biologically male
11. Psychologically male, biologically female
12. Both soon to be female
13. Both soon to be male

The conversion logic in dispute, by contrast, allowed for only two values:

If column 1 in source = "m"
  then set value of target data to "male"
  else set value of target data to "female"
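A minimal sketch (mine) of why that else branch is dangerous: every code other than "m", including all the non-male values listed above, silently becomes "female". A defensive version maps only known codes and flags the rest:

```python
# Flawed conversion: anything that isn't "m" is silently coerced to "female".
def convert_flawed(code: str) -> str:
    return "male" if code == "m" else "female"

# Defensive conversion: map only known codes; route everything else to review.
KNOWN = {"m": "male", "f": "female"}

def convert_safe(code: str) -> str:
    try:
        return KNOWN[code]
    except KeyError:
        raise ValueError(f"unmapped gender code {code!r}; send to data steward")

print(convert_flawed("3"))   # 'female' -- wrong: code 3 means formerly male
print(convert_safe("f"))     # 'female'
# convert_safe("3") raises ValueError instead of corrupting the target data
```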


The defendant knew to prevent duplicate SSNs

!************************************************************************
! Procedure Name: 230-Assign-PS-Emplid
!
! Description : This procedure generates a PeopleSoft Employee ID
!               (Emplid) by incrementing the last Emplid processed by 1.
!               First it checks if the applicant/employee exists on
!               the PeopleSoft database using the SSN.
!
!************************************************************************
Begin-Procedure 230-Assign-PS-Emplid

move 'N' to $found_in_PS   !DAR 01/14/04
move 'N' to $found_on_XXX  !DAR 01/14/04

BEGIN-SELECT -Db'DSN=HR83PRD;UID=PS_DEV;PWD=psdevelopment'
NID.EMPLID
NID.NATIONAL_ID

  move 'Y' to $found_in_PS          !DAR 01/14/04
  move &NID.EMPLID to $ps_emplid

FROM PS_PERS_NID NID
!WHERE NID.NATIONAL_ID = $ps_ssn
WHERE NID.AJ_APPL_ID = $applicant_id
END-SELECT

if $found_in_PS = 'N'               !DAR 01/14/04
  do 231-Check-XXX-for-Empl         !DAR 01/14/04
  if $found_on_XXX = 'N'            !DAR 01/14/04
    add 1 to #last_emplid
    let $last_emplid = to_char(#last_emplid)
    let $last_emplid = lpad($last_emplid,6,'0')
    let $ps_emplid = 'AJ' || $last_emplid
  end-if
end-if                              !DAR 01/14/04

End-Procedure 230-Assign-PS-Emplid

(AJHR0213_CAN_UPDATE.SQR)

The exclamation point comments out the WHERE clause on NID.NATIONAL_ID, so no check is made for a duplicate SSN/National ID. The lookup instead matches on AJ_APPL_ID, and the legacy systems' business rules allowed employees to have more than one AJ_APPL_ID.
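In effect, the shipped code keyed its existence check on the legacy applicant ID rather than the SSN. A minimal illustrative sketch (mine; the record layout is hypothetical) contrasting the promised check with the delivered one:

```python
# The procedure's comment block promises an existence check by SSN; the shipped
# code matched on the legacy applicant ID instead. Sketch of both lookups.
people = [
    {"emplid": "AJ000001", "national_id": "123-45-6789", "aj_appl_id": "A1"},
    {"emplid": "AJ000002", "national_id": "123-45-6789", "aj_appl_id": "A2"},
]  # same person under two legacy applicant IDs -> duplicate SSN already present

def exists_by_ssn(ssn):            # what the comment block describes
    return any(p["national_id"] == ssn for p in people)

def exists_by_appl_id(appl_id):    # what the code actually did
    return any(p["aj_appl_id"] == appl_id for p in people)

# A new record for the same SSN arriving under a third applicant ID:
print(exists_by_ssn("123-45-6789"))    # True  -> would block a third Emplid
print(exists_by_appl_id("A3"))         # False -> yet another duplicate is created
```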


Identified & Quantified Risks



Risk Response
"Risk response development involves defining enhancement steps for opportunities and threats." – Duncan, W., A Guide to the Project Management Body of Knowledge, PMI, 1996, p. 119

"The go-live date may need to be extended due to certain critical path deliverables not being met. This extension will require additional tasks and resources. The decision of whether or not to extend the go-live date should be made by Monday, November 3, 20XX so that resources can be allocated to the additional tasks."

Tasks                              | Hours
New Year conversion                | 120
Tax and payroll balance conversion | 120
General Ledger conversion          | 80
Total                              | 320

Resource                        | Hours/week
G/L Consultant                  | 40
Project Manager                 | 40
Receivables Consultant          | 40
HRMS Technical Consultant       | 40
Technical Lead Consultant       | 40
HRMS Consultant                 | 40
Financials Technical Consultant | 40
Total                           | 280

Delay               | Weekly resources | Weeks | Tasks | Cumulative
January (5 weeks)   | 280              | 5     | 320   | 1,720
February (4 weeks)  | 280              | 4     |       | 1,120
Total               |                  |       |       | 2,840
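The delay arithmetic, reproduced in a short sketch (mine):

```python
# Extra hours from slipping go-live: weekly consulting burn plus one-time tasks.
weekly_hours = 280            # seven consultants at 40 hours/week
one_time_task_hours = 320     # conversions that must be redone for the new year

january  = weekly_hours * 5 + one_time_task_hours    # 1,720 hours
february = weekly_hours * 4                          # 1,120 hours
print(january, february, january + february)         # 1720 1120 2840
```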


Professional & Workmanlike Manner


Defendant warrants that the services it provides hereunder will be performed in a professional and workmanlike manner in accordance with industry standards.


The Defense's "Industry Standards"
• Question: What are the industry standards that you are referring to?
• Answer: There is nothing written or codified, but it is the standards which are recognized by the consulting firms in our (industry).
• Question: I understand from what you told me just a moment ago that the industry standards that you are referring to here are not written down anywhere; is that correct?
• Answer: That is my understanding.
• Question: Have you made an effort to locate these industry standards and have simply not been able to do so?
• Answer: I would not know where to begin to look.


Published Industry Standards Guidance
Examples from the:
• IEEE (365,000 members) – Institute of Electrical and Electronics Engineers
  – 150 countries, 40 percent outside the United States
  – 128 transactions, journals and magazines
  – 300 conferences
• ACM (80,000+ members) – Association for Computing Machinery
  – 100 conferences annually
• ICCP (50,000+ members) – Institute for Certification of Computing Professionals
• DAMA International (3,500+ members) – Data Management Association
  – Largest data/metadata conference

IEEE Code of Ethics

We, the members of the IEEE, in recognition of the importance of our technologies in affecting the quality of life throughout the world, and in accepting a personal obligation to our profession, its members and the communities we serve, do hereby commit ourselves to the highest ethical and professional conduct and agree:

1. To accept responsibility in making engineering decisions consistent with the safety, health and welfare of the public, and to disclose promptly factors that might endanger the public or the environment;
2. To avoid real or perceived conflicts of interest whenever possible, and to disclose them to affected parties when they do exist;
3. To be honest and realistic in stating claims or estimates based on available data;
4. To reject bribery in all its forms;
5. To improve the understanding of technology, its appropriate application, and potential consequences;
6. To maintain and improve our technical competence and to undertake technological tasks for others only if qualified by training or experience, or after full disclosure of pertinent limitations;
7. To seek, accept, and offer honest criticism of technical work, to acknowledge and correct errors, and to credit properly the contributions of others;
8. To treat fairly all persons regardless of such factors as race, religion, gender, disability, age, or national origin;
9. To avoid injuring others, their property, reputation, or employment by false or malicious action;
10. To assist colleagues and co-workers in their professional development and to support them in following this code of ethics.

[Approved by the IEEE Board of Directors, August 1990]

http://www.ieee.org/portal/site/mainsite/menuitem.818c0c39e85ef176fb2275875bac26c8/index.jsp?&pName=corp_level1&path=about/whatis&file=code.xml&xsl=generic.xsl (accessed 4/10/04)


ACM Code of Ethics and Professional Conduct

1. General Moral Imperatives – 1.2 Avoid harm to others
• Well-intended actions, including those that accomplish assigned duties, may lead to harm unexpectedly. In such an event the responsible person or persons are obligated to undo or mitigate the negative consequences as much as possible. One way to avoid unintentional harm is to carefully consider potential impacts on all those affected by decisions made during design and implementation.
• To minimize the possibility of indirectly harming others, computing professionals must minimize malfunctions by following generally accepted standards for system design and testing. Furthermore, it is often necessary to assess the social consequences of systems to project the likelihood of any serious harm to others. If system features are misrepresented to users, coworkers, or supervisors, the individual computing professional is responsible for any resulting injury.

http://www.acm.org/constitution/code.html

Outcome
• Three days after the hearing, the panel issued a one-page decision awarding damages of $5 million to Company X.

[Image: mock check dated Sep 20, 2010, from Defendant to Plaintiff for "$5,000,000.00 -- Five million Dollars and 00/100", memo: "one big mistake!"]


Contact Information

Peter Aiken, Ph.D.
Department of Information Systems, School of Business
Virginia Commonwealth University
1015 Floyd Avenue - Room 4170
Richmond, Virginia 23284-4000

Data Blueprint
Maggie L. Walker Business & Technology Center
501 East Franklin Street
Richmond, VA 23219
804.521.4056
http://datablueprint.com

Office: +1.804.883.759
Cell: +1.804.382.5957
E-mail: [email protected]
Web: http://peteraiken.net


Using Codes of Conduct to Resolve Legal Disputes
Peter Aiken, Virginia Commonwealth University; Robert M. Stanley and Juanita Billings, Data Blueprint; Luke Anderson, Duane Morris LLC
Computer (IEEE Computer Society), April 2010, pp. 29-31. © 2010 IEEE

In the absence of other published standards of care, it is reasonable for contractual parties to rely on an applicable, widely available code of conduct to guide expectations.

When legal disputes arise, the primary focus of judges, juries, and arbitration panels is on interpreting facts. In cases of alleged underperformance, they must evaluate facts against contract language, which typically states that services will be provided in accordance with industry standards. Legal arbiters seek well-articulated "standards of care" against which to evaluate the behavior of contractual parties and, in the absence of other published standards, increasingly rely on codes of conduct (CoCs) to establish an objective context. In fact, they have successfully applied CoCs, including the ACM/IEEE-CS CoC, in instances where the parties were not even affiliated with the CoC-sponsoring organization.

We illustrate the current application of CoCs with a fictional enterprise resource planning (ERP) system implementation failure that is a compilation of real-life cases. Subject to binding panel arbitration, the plaintiff and defendant in the case presented conflicting interpretations of the same facts: from the plaintiff's perspective, the defendant failed to migrate the ERP system as promised; the defendant countered that defective and poor-quality data delayed the migration. Using the ACM/IEEE-CS CoC as a reference, expert testimony convinced the arbitration panel that the defendant's position was untenable, and the panel accordingly awarded the plaintiff a multimillion-dollar judgment.

CASE STUDY
Acme Co. received a directive from its parent corporation mandating replacement of its legacy pay and personnel systems with a specific ERP software package designed to standardize payroll and personnel processing enterprise-wide. Upon the vendor's "referred specialist" recommendation, Acme Co. contracted with ERP Systems Integrators to implement the new system and convert its legacy data for $1 million.

The contracted timeline was six months, beginning in July and wrapping up with a "big bang" conversion at the end of December. The year-end conversion failed, allegedly due to ERP Systems Integrators' poor data migration practices, and Acme Co. had to run the old and new systems in parallel, a complex and expensive situation that it had carefully planned to avoid and that ERP Systems Integrators had assured them would not occur. When the conversion was pushed into April of the following year, Acme Co. slowed and then ceased paying ERP Systems Integrators' invoices. In July, ERP Systems Integrators pulled its implementation team and Acme Co. initiated arbitration.

Most IT projects are governed by contracts that assign responsibilities to each party and provide specific remedies for delayed implementation or project failure. Such contracts require the parties to submit to private, binding arbitration to resolve disputes. As the "Arbitration versus Civil Suits" sidebar indicates, this process slightly differs from civil litigation in a court of law. However, the use of CoCs applies equally to both settings.

Almost a year passed before the arbitration hearing. Meanwhile, Acme Co. and ERP Systems Integrators deposed witnesses, and experts scrutinized sales materials, project artifacts (e-mails, status reports, project plans, and so on), contract documents, application software, migration tools, and contents of the shared-drive implementation environment.

THE "STANDARD OF CARE" DILEMMA
The arbitration panel had to resolve three key issues:

• Who was responsible for project management? Acme Co. produced paperwork indicating that responsibility rested with ERP Systems Integrators. The plaintiff claimed that it had no idea how to implement such a system and had hired the defendant to provide such expertise, including project management.

• What standards applied to the programming used for data conversion? Acme Co. attacked specific conversion software changes as harmful in that they increased the amount of incorrect data within the converted database by an unnecessarily complicated order of magnitude. ERP Systems Integrators responded that the referenced software changes did not constitute "software engineering" and thus were not subject to CoC guidance.

• How significant were project communication failures? During discovery, numerous intracompany e-mails from ERP Systems Integrators described the project in a markedly more pessimistic tone than the communications delivered to Acme Co. in compliance with contract provisions.

These issues collectively fell under the "standard of care" portion of the contract. The dilemma Acme Co. faced, one common to companies in the same position, was detailing the standard of care it expected from ERP Systems Integrators. The contract language specified that ERP Systems Integrators "warrants that the services it provides hereunder will be performed in a professional and workmanlike manner in accordance with industry standards."

As the following exchange shows, ERP Systems Integrators could not provide more detail regarding the warranty statement:

Question: What are the industry standards that you are referring to?
Defense: There is nothing written or codified, but they are the standards recognized by the consulting firms in our industry.
Question: I understand that the industry standards that you are referring to here are not written down anywhere; is that correct?
Defense: That is my understanding.
Question: Have you made an effort to locate these industry standards and have simply not been able to do so?
Defense: I would not know where to begin to look.

For its part, Acme Co. argued that suitable Internet-based CoCs were available to guide various behaviors. The "Online Codes of Conduct" sidebar describes one useful resource, the Online Ethics Center, that aggregates numerous CoCs. In particular, the plaintiff referenced the ACM/IEEE-CS Software Engineering Code of Ethics and Professional Practice. The "SECEPP" sidebar provides a brief history of this CoC.

Acme Co. successfully argued to the arbitration panel that, when faced with obscure or publicly available standards, contracting parties should expect the accessible standards to apply. The plaintiff then cited objective, concrete portions of SECEPP that directly supported its positions on the three disputed issues.

ARBITRATION VERSUS CIVIL SUITS
In contrast to civil suits tried in a court of law, arbitration vests the functions of judge and jury in a panel of arbitrators, typically lawyers or industry professionals, whose time is paid for by the involved parties. Arbitration is private and frequently subject to confidentiality and nondisclosure agreements; because only the participants know the details, arbitration influences future litigation at a slower rate than do public court proceedings.

An arbitration panel does not issue an opinion; instead, it hands down a one-page decision to award damages (or not), typically monetary, to one party or the other. The reasoning supporting any judgment thus must be inferred from the parties' arguments. Arbitration decisions are final and generally cannot be appealed.

Remaining rules of court are much like those in a trial. A preliminary phase is dedicated to evidence gathering, motion exchanges, depositions, and other discovery forms. Lawyers for each party try to convince the panel of the validity of their client's position.



PROJECT MANAGEMENT RESPONSIBILITY

In pre-arbitration depositions, ERP Systems Integrators asserted that it was not the project manager and that the contract specified its performance solely at the direction of Acme Co. The plaintiff contended that, while contract language did exist, overall project management lay with the defendant because it assumed that role in spite of its denials. In response, Acme Co. cited Section 2.7 of SECEPP, which states that “computing professionals have a responsibility to share technical knowledge with the public by encouraging understanding of computing, including the impacts of computer systems and their limitations.”

The panel members understood SECEPP to be analogous to a building code that, because of its broad wording, applied to the project in general and not specifically to its software engineering aspects.

Expert testimony supported Acme Co.’s claims. Referencing widely published and accepted principles1 that supplemented SECEPP, the plaintiff prepared a framework of project management behaviors as shown in Table 1. Specific evidence included a timesheet signed weekly by ERP Systems Integrators charging approximately 2,000 hours against the job category “Project Manager” and the task “Project Management.”

It was obvious from the evidence that ERP Systems Integrators was hired in a specialist capacity and that Acme Co. had no ability to provide oversight. The arbitration panel determined that the defendant acted as, and clearly was, the project manager.

RELEVANT PROGRAMMING STANDARDS

ERP Systems Integrators blamed the conversion failure on “bad data.” However, Acme Co. provided evidence of flawed programming practices, missing analysis data, and measurably lower-quality converted data.

Failure to test for other values

The plaintiff alleged that the defendant failed to follow generally accepted testing standards. Evidence consisted of instances involving poorly implemented conversion software.

For example, the conversion software was supposed to check a specific field value for one of two possible values—say, “1” and “2” corresponding to the values “male” and “female,” respectively. The software executed by ERP Systems Integrators checked to see if the source field value was “1” and, if so, assigned the value “male” to the converted field; if the source field value was not “1,” it assigned the value “female” to the converted field without determining and reporting possible nonconforming values.

ONLINE CODES OF CONDUCT

The Online Ethics Center for Engineering and Research (http://onlineethics.org) is a joint project of the Center for Ethics, Engineering, and Society at the National Academy of Engineering and the Ethics Education Library at the Center for the Study of Ethics in the Professions at the Illinois Institute of Technology. Funded by a grant from the National Science Foundation, the website brings together more than 50 CoCs from organizations including:

• Institute of Electrical and Electronics Engineers (IEEE): 365,000 members; 150 countries, 40 percent outside the US; 128 transactions and journals/magazines; 300 conferences annually; ACM/IEEE-CS Software Engineering Code of Ethics and Professional Practice (SECEPP)
• Association for Computing Machinery (ACM): 52,000+ members; 100 conferences annually; Code of Conduct and SECEPP
• Institute for Certification of Computing Professionals (ICCP): 50,000+ members; Code of Conduct; Code of Good Practice
• Data Management Association (DAMA): 3,500+ members; largest data/metadata conference; Code of Ethics

While not unified, the collection exhibits a striking cohesiveness.

SECEPP

Originally adopted in 1972 by the ACM and the IEEE Computer Society, the Software Engineering Code of Ethics and Professional Practice principally served as a method of “self-regulation,” listing violations and accompanying sanctions. In 1993, SECEPP was revised to “clarify and formally state” the consensus of professional ethical requirements for which “the profession (is) accountable to the public.” The more comprehensive code was also designed to serve “as an aid to individual decision making.”1 In the years since adoption, SECEPP (www.computer.org/computer/code-of-ethics.pdf) has provided the foundation for numerous subsequent guidelines. Knowledgeable persons are aware of the code’s positive impact on professionalism within the IT industry.2

References
1. R.E. Anderson et al., “Using the New ACM Code of Ethics in Decision Making,” Comm. ACM, Feb. 1993, pp. 98-107.
2. S. Rogerson, “An Ethical Review of Information Systems Development and the SSADM Approach,” Centre for Computing and Social Responsibility, De Montfort Univ., UK, 2 Jan. 2008; www.ccsr.cse.dmu.ac.uk/staff/Srog/teaching/ssadm.htm.





Section 1.2 of SECEPP states that “to minimize the possibility of indirectly harming others, computing professionals must minimize malfunctions by following generally accepted standards for system design and testing.” Accepted software engineering programming standards would call for testing for positive confirmation of “2” before setting the converted value to “female” and for reporting incoming values and numbers of values not “1” or “2.” The defendant’s failure to follow these standards permits “3” in the source data to be assigned the value “female” after conversion, resulting in demonstrably lower-quality converted data.
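To make the testing failure concrete, the following Python sketch (with hypothetical field values; the actual conversion code was not published) contrasts the defendant's silent if/else mapping with the positive-confirmation logic that generally accepted testing standards call for:

# Flawed logic as described above: any code that is not "1" silently
# becomes "female", so a stray "3" passes through unreported.
def convert_gender_flawed(code):
    return "male" if code == "1" else "female"

# Defensive logic: positively confirm both expected values and tally
# every nonconforming code instead of guessing.
def convert_gender_defensive(code, report):
    if code == "1":
        return "male"
    if code == "2":
        return "female"
    report[code] = report.get(code, 0) + 1  # record the unexpected value
    return None  # leave the converted field unset for later review

report = {}
converted = [convert_gender_defensive(c, report) for c in ["1", "2", "3", "2"]]
print(converted)  # ['male', 'female', None, 'female']
print(report)     # {'3': 1}: evidence that a "3" lurked in the source data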

Failure to prevent duplicate record insertion

Acme Co. demonstrated that ERP Systems Integrators’ software produced other structure-related conversion errors.1 The defendant’s e-mail traffic revealed an urgent need to get records in the system “even if they weren’t the correct ones.” Instead of attempting to determine why the conversion programs would not successfully complete, ERP Systems Integrators identified the lines of code prohibiting the insertion of duplicate records and “commented them out,” thereby inactivating the software functionality.

Consequently, there were 63,131 customers instead of approximately 6,000 and 100,236 employee records instead of approximately 10,000 in the system after conversion. This in turn increased the data clean-up cost. Due to the inherent complexities of working in a multiflawed environment,2 the cost to clean up 10 times more data is often much greater than 10 times the cost of cleaning up the original data.

Section 6.08 of SECEPP states that software engineers should “take responsibility for detecting, correcting and reporting errors in software and associated documents.” The arbitration panel agreed with the plaintiff that the code provided an objective measure to assess responsibility for minimizing software malfunctions and correcting errors, and as such could reasonably guide Acme Co.’s expectations.

COMMUNICATION FAILURE RESPONSIBILITY

Acme Co. alleged that ERP Systems Integrators withheld important information from the on-site consulting team. The plaintiff presented e-mails of defendant personnel exchanging dire predictions about the project’s fate. One message warned it could become “our biggest mess!” These starkly contrasted with the rosy reports presented by ERP Systems Integrators during status meetings.

On the subject of a client’s obligation to communicate project failure indicators, SECEPP is unambiguous: According to Section 2.06, “any signs of danger from systems must be reported to those who have opportunity and/or responsibility to resolve them.” The evidence clearly showed a pattern by the defendant of communicating one message internally (project failure) and a second message to the plaintiff (everything okay).

A second communication failure occurred at a more systemically significant level. All project management guidelines stress the importance of treating project planning diagrams as living documents, and most are managed via specialized software that permits determination of planned versus actual. Drawing on the Project Management Body of Knowledge (PMBOK), Acme Co. demonstrated that by never updating the project plan shown in Figure 1, developed using a simple spreadsheet, ERP Systems Integrators was unable to report fact-based measures of progress and thus failed to meet expected standards.

Figure 1. Project plan maintained as a read-only spreadsheet.

Project statistic metadata lets stakeholders and implementers respond to challenges with all parties speaking the same vocabulary. Static project plans are out of date as soon as any task deviates from the plan and, as a result, management cannot determine the status of and impact on subsequent tasks.
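As a minimal sketch of the "planned versus actual" determination the panel found missing (Python, with made-up task data rather than the project's real plan), a living plan recomputes slippage the moment any task deviates:

from datetime import date

# Hypothetical task records: (task, planned finish, actual finish or None if open).
tasks = [
    ("extract legacy data", date(2004, 3, 1), date(2004, 3, 12)),
    ("map source fields",   date(2004, 4, 1), None),
]

def slip_days(planned, actual, today=date(2004, 4, 15)):
    # Open tasks are measured against today, so status never goes stale.
    return ((actual or today) - planned).days

for task, planned, actual in tasks:
    print(task, "slipped", slip_days(planned, actual), "days")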

Additional evidence indicating a vastly overbooked resource pointed to a project that was out of control. Figure 2 indicates a “plan” for one individual to accomplish the work of 18 others. This kind of error occurs in projects where the existing environment has not been understood well enough to properly plan.
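A routine resource-loading check, of the kind most project management tools perform against a plan's metadata, would have flagged Figure 2's problem immediately. A sketch with invented numbers:

# Hours booked in one week against a 40-hour individual capacity.
bookings = {"consultant_a": 18 * 40, "consultant_b": 38}

def overbooked(bookings, capacity=40):
    # Report each person booked beyond capacity, as a multiple of capacity.
    return {who: hours / capacity for who, hours in bookings.items() if hours > capacity}

print(overbooked(bookings))  # {'consultant_a': 18.0}: one person planned at 18x capacity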

Table 1. Summary evidence of project management behaviors.

Process area                  Defendant lead  Plaintiff lead  Methodology  Demonstrated
Scope planning                ✓                               ✓
Scope definition              ✓                               ✓
Activity definition           ✓                               ✓
Activity sequencing           ✓                               ✓
Activity duration estimation  ✓                               ✓
Schedule development          ✓                               ✓
Resource planning             ✓                               ✓
Cost estimating               ✓                               ✓
Cost budgeting                ✓                               ✓
Project plan development      ✓                               ✓
Quality planning              ✓                               ✓            ?
Communication planning        ✓                               ✓
Risk identification           ✓                               ✓            ✓
Risk quantification           ✓                               ✓
Risk response development     ✓                               ✓            ?
Organizational planning       ✓                               ✓
Staff acquisition             ✓                               ✓

Page 41: Monetizing Data Management (DAMA-NCR)dama-ncr.org/DamaDay/MonetizingDataManagement.pdf · Monetizing Data Management • Why is it important? – Concretizing • State Agency Time

Figure 2. Indicator of project failure: a “plan” for one individual to accomplish the work of 18 others.



Acme Co. proved that ERP Systems Integrators had not performed legacy system analysis3 and that it failed to adequately manage project risk. These activities are subsumed under Section 2.5 of SECEPP: “Give comprehensive and thorough evaluations of computer systems and their impacts, including analysis of possible risks.”

The arbitration panel concluded that ERP Systems Integrators’ failure to update project plans, communicate responsibly, and manage risk appropriately constituted an inadequate standard of care. The defendant had obviously foreseen failure and hid from the plaintiff information indicating that the project had no possibility of succeeding.

RESOLUTION AND DISCUSSION

Days after the arbitration hearing concluded, the panel issued a one-page decision awarding $5 million to Acme Co.: five times the project’s worth. The decision was particularly hard-hitting because ERP Systems Integrators’ insurance carrier denied coverage for the incident based on the evidence of its “failure to perform in a workmanlike manner.”

The ruling favorable to the plaintiff indicated overwhelming support for its CoC-based case. The arguments Acme Co. presented are deciding factors in a growing number of real-life judicial disputes. In technology contexts where key issues revolve around competing interpretations of behaviors, CoCs are influencing various contracting parties as well as the IT, business, and consulting communities.

Because arbitration results are private, word-of-mouth has been the chief means of propagating the success of comparing litigant behavior against CoCs. Moving from arbitration into case law, CoCs will be increasingly applied. In spite of limited current awareness, SECEPP is well on its way to becoming a de facto standard as it enjoys growing awareness throughout the legal community and increasing compliance in the IT profession.4

More extensive application of publicly available standards—and growing awareness of them—will positively impact the IT industry. Five initial benefits accrue to organizations capitalizing on CoC knowledge:



• Increasing use and broader applicability. As awareness of CoCs grows, so also will their popularity. Moreover, as the legal community becomes more familiar with CoC-based arguments, they will be applied more broadly. CoC expertise could become ubiquitous and perhaps as branded as the PMP designation from the Project Management Institute (www.pmi.org/CareerDevelopment/Pages/Obtaining-Credential.aspx). CoCs are not only easy to use, they are unambiguous about specific, holistic IT professional responsibilities—to the project, stakeholders, our profession, and society.
• Public codification of conduct standards by IT professionals. Evidence is mounting that public CoCs serve as standards for evaluating the performance and determining the responsibilities not only of IEEE members, but IT professionals in general. Organizations and professionals are using CoCs to determine specific attributes of compliance and noncompliance.
• Preventing and resolving disputes. Guidance to first prevent and subsequently settle disputes is generally welcome. CoCs provide objective particulars that litigants can use in a proactive manner. Following a CoC is one way to promote a successful project environment and insulate contracting parties from potential legal liability. In doing so, it is possible to identify critical prelitigation and other decision points that allow parties to better deal with or entirely avoid disputes.
• Better understanding of IT project implementation. The current dispute resolution process favors contractors. Understanding CoC utility enables organizations to rethink their relationships with clients. This could impact how organizations evaluate, select, and interact with IT professionals.
• Organization-wide CoC applicability. SECEPP is guided by the philosophy that CoCs generally apply to organizations—including those that do not have members belonging to the ACM or IEEE Computer Society—as well as to their leaders. The potential implications for organizations and leadership are staggering.

Seeking a legal resolution to a dispute over contracted IT services is a growing trend. This is not surprising given the alarming statistic that up to 70 percent of IT projects fail (www.it-cortex.com/stat_failure_rate.htm). Litigation of software-intensive endeavors has been called a “major growth industry,” with forecasted legal costs rising “faster than any other aspect of software development.”5

As companies increasingly rely on IT systems to drive their business, failures and delayed implementations can cause costly ripples throughout their organizations. Many are unwilling to absorb these costs and, consequently, expect IT professionals and especially their vendor partners to share responsibility.

The evidence speaks for itself. Courts, juries, and arbitration panels are finding that failure to follow generally accepted public standards for design and testing of software is grounds for seeking damages. A wider understanding of the existence and usefulness of existing ethical and professional standards will represent added value to in-house IT managers and enhance the stature of IT professionals. An organization’s ability to evaluate conduct and, when appropriate, consider potential legal matters more knowledgeably is paramount to imposing accountability on all participants.

Acknowledgments

The authors express gratitude to the anonymous reviewers for their comments. They also thank the hundreds of unnamed data management and legal professionals with whom they have worked and who have contributed to a better understanding of this field as it is practiced and as it should be practiced.

References
1. V.Y. Yoon, P. Aiken, and T. Guimaraes, “Managing Organizational Data Resources: Quality Dimensions,” Information Resources Management J., July-Sept. 2000, pp. 5-13.
2. P. Aiken and P. Piper, “Estimating Data Reverse Engineering Projects,” Proc. 5th Ann. Systems Reengineering Workshop, Naval Surface Warfare Center, 1995, pp. 133-145.
3. P.H. Aiken, “Reverse Engineering of Data,” IBM Systems J., Apr. 1998, pp. 246-269.
4. D. Gotterbarn, “How the New Software Engineering Code of Ethics Affects You,” IEEE Software, Nov./Dec. 1999, pp. 58-64.
5. T. DeMarco and T. Lister, “Both Sides Always Lose: Litigation of Software-Intensive Contracts,” CrossTalk, Feb. 2000; www.stsc.hill.af.mil/crossTalk/2000/02/demarco.html.

Peter Aiken is an associate professor of information systems at Virginia Commonwealth University and founding director of Data Blueprint, a data management consulting firm based in Richmond, Virginia. Contact him at [email protected].

Robert M. Stanley is a senior member of the technical staff and a project lead specializing in project and quality management for Data Blueprint. Contact him at [email protected].

Juanita Billings is a senior member of the technical staff and a project lead specializing in legal support, systems analysis, and project management for Data Blueprint. Contact her at [email protected].

Luke Anderson is a law partner specializing in technology and intellectual property matters with Duane Morris LLC in Atlanta, Georgia. Contact him at [email protected].



Page 43: Monetizing Data Management (DAMA-NCR)dama-ncr.org/DamaDay/MonetizingDataManagement.pdf · Monetizing Data Management • Why is it important? – Concretizing • State Agency Time


Measuring Data Management Practice Maturity: A Community’s Self-Assessment

Peter Aiken, Virginia Commonwealth University/Institute for Data Research
M. David Allen, Data Blueprint
Burt Parker, Independent consultant
Angela Mattia, J. Sergeant Reynolds Community College

Increasing data management practice maturity levels can positively impact the coordination of data flow among organizations, individuals, and systems. Results from a self-assessment provide a roadmap for improving organizational data management practices.

As increasing amounts of data flow within and between organizations, the problems that can result from poor data management practices are becoming more apparent. Studies have shown that such poor practices are widespread. For example,

• PricewaterhouseCoopers reported that in 2004, only one in three organizations were highly confident in their own data, and only 18 percent were very confident in data received from other organizations. Further, just two in five companies have a documented board-approved data strategy (www.pwc.com/extweb/pwcpublications.nsf/docid/15383D6E748A727DCA2571B6002F6EE9).
• Michael Blaha1 and others in the research community have cited past organizational data management education and practices as the cause for poor database design being the norm.
• According to industry pioneer John Zachman,2 organizations typically spend between 20 and 40 percent of their information technology budgets evolving their data via migration (changing data locations), conversion (changing data into other forms, states, or products), or scrubbing (inspecting and manipulating, recoding, or rekeying data to prepare it for subsequent use).
• Approximately two-thirds of organizational data managers have formal data management training; slightly more than two-thirds of organizations use or plan to apply formal metadata management techniques; and slightly fewer than one-half manage their metadata using computer-aided software engineering tools and repository technologies.3

When combined with our personal observations, these results suggest that most organizations can benefit from the application of organization-wide data management practices. Failure to manage data as an enterprise-, corporate-, or organization-wide asset is costly in terms of market share, profit, strategic opportunity, stock price, and so on. To the extent that world-class organizations have shown that opportunities can be created through the effective use of data, investing in data as the only organizational asset that can’t be depleted should be of great interest.



DATA MANAGEMENT DEFINITION AND EVOLUTION

As Table 1 shows, data management consists of six interrelated and coordinated processes, primarily derived by Burt Parker from sponsored research he led for the US Department of Defense at the MITRE Corporation.4

Figure 1 supports the similarly standardized definition: “Enterprise-wide management of data is understanding the current and future data needs of an enterprise and making that data effective and efficient in supporting business activities.”4

The figure illustrates how organizational strategies guide other data management processes. Two of these processes—data program coordination and organizational data integration—provide direction to the implementation processes—data development, data support operations, and data asset use. The data stewardship process straddles the line between direction and implementation. All processes exchange feedback designed to improve and fine-tune overall data management practices.

Data management has existed in some form since the 1950s and has been recognized as a discipline since the 1970s. Data management is thus a young discipline compared to, for example, the relatively mature accounting practices that have been practiced for thousands of years. As Figure 2 shows, data management’s scope has expanded over time, and this expansion continues today.

Ideally, organizations derive their data management requirements from enterprise-wide information and functional user requirements. Some of these requirements come from legacy systems and off-the-shelf software packages. An organization derives its future data requirements from an analysis of what it will deliver, as well as future capabilities it will need to implement organizational strategies. Data management guides the transformation of strategic organizational information needs into specific data requirements associated with particular technology system development projects.

Figure 1. Interrelationships among data management processes (adapted from Burt Parker’s earlier work4). Blue lines indicate guidance, red lines indicate feedback, and green lines indicate data.

Table 1. Data management processes.4

Process: Data program coordination
Description: Provide appropriate data management process and technological infrastructure
Focus: Direction
Data type: Program data: descriptive propositions or observations needed to establish, document, sustain, control, and improve organizational data-oriented activities (such as vision, goals, policies, and metrics)

Process: Organizational data integration
Description: Achieve organizational sharing of appropriate data
Focus: Direction
Data type: Development data: descriptive facts, propositions, or observations used to develop and document the structures and interrelationships of data (for example, data models, database designs, and specifications)

Process: Data stewardship
Description: Achieve business-entity subject area data integration
Focus: Direction and implementation
Data type: Stewardship data: descriptive facts about data documenting semantics and syntax (such as name, definition, and format)

Process: Data development
Description: Achieve data sharing within a business area
Focus: Implementation
Data type: Business data: facts and their constructs used to accomplish enterprise business activities (such as data elements, records, and files)

Process: Data support operations
Description: Provide reliable access to data
Focus: Implementation

Process: Data asset use
Description: Leverage data in business activities
Focus: Implementation




All organizations have data architectures, whether explicitly documented or implicitly assumed. An important data management process is to document the architecture’s capabilities, making it more useful to the organization.

In addition, data management

• must be viewed as a means to an end, not the end itself. Organizations must not practice data management as an abstract discipline, but as a process supporting specific enterprise objectives—in particular, to provide a shared-resource basis on which to build additional services.
• involves both process and policy. Data management tasks range from strategic data planning to the creation of data element standards to database design, implementation, and maintenance.
• has a technical component: interfacing with and facilitating interaction between software and hardware.
• has a specific focus: creating and maintaining data to provide useful information.
• includes management of metadata artifacts that address the data’s form as well as its content.

Although data management serves the organization, the organization often doesn’t appreciate the value it provides. Some data management staffs keep ahead of the layoff curve by demonstrating positive business value. Management’s short-term focus has often made it difficult to secure funding for medium- and long-term data management investments. Tracing the discipline’s efforts to direct and indirect organizational benefits has been difficult, so it hasn’t been easy to present an articulate business case to management that justifies subsequent strategic investments in data management.

Viewing data management as a collection of processes, each with a role that provides value to the organization through data, makes it easier to trace value through those processes and point not only to a methodological “why” of data management practice improvement but also to a specific, concrete “how.”

RESEARCH BASIS

Mark Gillenson has published three papers that serve as an excellent background to this research.5-7 Like earlier works, Gillenson focuses on the implementation half of Figure 1, adopting a more narrow definition of data administration. Over time, his work paints a picture of an industry attempting to catch up with technological implementation. Our work here updates and confirms his basic conclusions while changing the focus from whether a process is performed to the maturity with which it is performed.

Three other works also influenced our research: Ralph Keeney’s value-focused thinking,8 Richard Nolan’s six-stage theory of data processing,9 and the Capability Maturity Model Integration (CMMI).10,11

Keeney’s value-focused thinking provides a methodological approach to analyzing and evaluating the various aspects of data management and their associated key process areas. We wove the concepts behind means and fundamental objectives into our assessment’s construction to connect how we measure data management with what customers require from it.

In Stage VI of his six-stage theory of data processing, Nolan defined maturity as data resource management. Although Nolan’s theory predates and is similar to the CMMI, it contains several ideas that we adapted and reused in the larger data management context. However, CMMI refinement remains our primary influence.

Most technologists are familiar with the CMM (and its upgrade to the CMMI), developed at Carnegie Mellon’s Software Engineering Institute with assistance from the MITRE Corporation.10,11 The CMMI itself was derived from work that Ron Radice and Watts Humphrey performed while at IBM. Dennis Goldenson and Diane Gibson presented results pointing to a link between CMMI process maturity and organizational success.12 In addition, Cyndy Billings and Jeanie Clifton demonstrated the long-term effects for organizations that successfully sustain process improvement for more than a decade.13

CMMI-based maturity models exist for human resources, security, training, and several other areas of the software-related development process. Our colleague, Brett Champlin, contributed a list of dozens of maturity measurements derived from or influenced by the CMMI. This list includes maturity measurement frameworks for data warehousing, metadata management, and software systems deployment. The CMMI’s successful adoption in other areas encouraged us to use it as the basis for our data management practice assessment.

Figure 2. Data management’s growth over time. The discipline has expanded from an initial focus on database development and operation in the 1950s to 1970s to include additional responsibilities in the periods 1970-1990, 1990-2000, and from 2000 to the present.



Whereas the core ideas behind the CMMI present a reasonable base for data management practice maturity measurement, we can avoid some potential pitfalls by learning from the revisions and later work done with the CMMI. Examples of such improvements include general changes to how the CMMI makes interrelationships between process areas more explicit and how it presents results to a target organization.

Work by Cynthia Hauer14 and Walter Schnider and Klaus Schwinn15 also influenced our general approach to a data management maturity model. Hauer nicely articulated some examples of the value determination factors and results criteria that we have adopted. Schnider and Schwinn presented a rough but inspirational outline of what mature data management practices might look like and the accompanying motivations.

RESEARCH OBJECTIVES

Our research had six specific objectives, which we grouped into two types: community descriptive goals and self-improvement goals.

Community descriptive research goals help clarify our understanding of the data management community and associated practices. Specifically, we want to understand

• the range of practices within the data management community;
• the distribution of data management practices, specifically the various stages of organizational data management maturity; and
• the current state of data management practices—in what areas are the community data management practices weak, average, and strong?

Self-improvement research goals help the community as a whole improve its collective data management practices. Here, we desire to

• better understand what defines current data management practices;
• determine how the assessment informs our standing as a technical community (specifically, how does data management compare to software development?); and
• gain information useful for developing a roadmap for improving current practice.

The CMMI’s stated goals are almost identical to ours: “[The CMMI] was designed to help developers select process-improvement strategies by determining their current process maturity and identifying the most critical issues to improving their software quality and process.”10

Similarly, our goal was to aid data management practice improvement by presenting a scale for measuring data management accomplishments. Our assessment results can help data managers identify and implement process improvement strategies by recognizing their data management challenges.

DATA COLLECTION PROCESS AND RESEARCH TARGETS

Between 2000 and 2006, we assessed the data management practices of 175 organizations. Table 2 provides a breakdown of organization types.

Students from some of our graduate and advanced undergraduate classes largely conducted the assessments. We provided detailed assessment instruction as part of the course work. Assessors used structured telephone and in-person interviews to assess specific organizational data management practices by soliciting evidence of processes, products, and common features. Key concepts sought included the presence of commitments, abilities, measurements, verification, and governance.

Assessors conducted the interviews with the person identified as having the best, firsthand knowledge of organizational data management practices. Tracking down these individuals required much legwork; identifying them was often more difficult than securing the interview commitment.

The assessors attempted to locate evidence in the organization indicating the existence of key process areas within specific data management practices. During the evaluation, assessors observed strict confidentiality—they reported only compiled results, with no mention of specific organizations, individuals, groups, programs, or projects. Assessors and participants kept all information to themselves and observed proprietary rights, including several nondisclosure agreements.

All organizations implement their data management practice in ways that can be classified as one of five maturity model levels, detailed in Table 3. Specific evidence, organized by maturity level, helped identify the level of data management practiced.


Table 2. Organizations included in data management analysis, by type.

Organization type           Percent
Local government            4
State government            17
Federal government          11
International organization  10
Commercial organization     58



For each data management process, the assessment used between four and six objective criteria to probe for evidence. Assessed outside the data collection process, the presence or absence of this evidence indicated organizational performance at a corresponding maturity level.

ASSESSMENT RESULTS

The assessment results reported for the various practice areas show that overall scores are repeatable (level 2) in all data management practice areas.

Figure 3 shows assessment averages of the individual response scores. We used a composite chart to group the averages by practice area. Such groupings facilitate numerous comparisons, which organizations can use to plan improvements to their data management practices.

We present sample results (blue) for an assessed organization (disguised as “Mystery Airline”), whose management was interested in not only how the organization scored but also how it compared to other assessed airlines (red) and other organizations (white).

We grouped 19 individual responses according to the five data management maturity levels in the horizontal bar charts. Most numbers are averages. That is, for an individual organization, we surveyed multiple data management operations, combined the individual assessment results, and presented them as averages. We reported assessments of organizations with only one data management function as integers.

For example, the data program coordination practice area results include:

• Mystery Airline achieved level 1 on responses 1, 2, and 5, and level 2 on responses 3 and 4.
• The airline industry performed above both Mystery Airline and all respondents on responses 1 through 3.
• The airline industry performed below both Mystery Airline and all respondents on response 4, and Mystery Airline performed well below all respondents and just those in the airline industry on response 5.

Figure 3f illustrates the range of results for all organizations surveyed for each data management process—for example, the assessment results for data program coordination ranged from 2.06 to 3.31.

The maturity measurement framework dictates that a data program can achieve no greater rating than the lowest rating achieved—hence the translation to the scores for Mystery Airline of 1, 2, 2, 2, and 2 combining for an overall rating of 1. This is congruent with CMMI application.

Although this might seem a tough standard, the rating reflects the adage that a chain is only as strong as its weakest link. Mature data management programs can’t rely on immature or ad hoc processes in related areas. The lowest rating received becomes the highest possible overall rating.

Table 3. Data management practice assessment levels.

Level 1: Initial
Practice: The organization lacks the necessary processes for sustaining data management practices. Data management is characterized as ad hoc or chaotic.
Quality and results predictability: The organization depends entirely on individuals, with little or no corporate visibility into cost or performance, or even awareness of data management practices. There is variable quality, low results predictability, and little to no repeatability.

Level 2: Repeatable
Practice: The organization might know where data management expertise exists internally and has some ability to duplicate good practices and successes.
Quality and results predictability: The organization exhibits variable quality with some predictability. The best individuals are assigned to critical projects to reduce risk and improve results.

Level 3: Defined
Practice: The organization uses a set of defined processes, which are published for recommended use.
Quality and results predictability: Good quality results within expected tolerances most of the time. The poorest individual performers improve toward the best performers, and the best performers achieve more leverage.

Level 4: Managed
Practice: The organization statistically forecasts and directs data management based on defined processes and selected cost, schedule, and customer satisfaction levels. The use of defined data management processes within the organization is required and monitored.
Quality and results predictability: Reliability and predictability of results, such as the ability to determine progress or six sigma versus three sigma measurability, is significantly improved.

Level 5: Optimizing
Practice: The organization analyzes existing data management processes to determine whether they can be improved, makes changes in a controlled fashion, and reduces operating costs by improving current process performance or by introducing innovative services to maintain its competitive edge.
Quality and results predictability: The organization achieves high levels of results certainty.


This minimum-rating rule also explains why many organizations are at level 1 with regard to their software development practices. While the CMMI process results in a single overall rating for the organization, data management requires a more fine-grained feedback mechanism. Knowing that some data management processes perform better than others can help an organization develop incentives as well as a roadmap for improving individual ratings.

Taken as a whole, these numbers show that no data management process or subprocess measured on average higher than the data program coordination process, at 3.31. It’s also the only data management process that performed on average at a defined level (greater than 3). The results show a community that is approaching the ability to repeat its processes across all of data management.

Results analysis

Perhaps the most important general fact represented in Figure 3 is that organizations gave themselves relatively low scores. The assessment results are based on self-reporting and, although our 15-percent validation sample is adequate to verify accurate industry-wide assessment results, 85 percent of the assessment is based on facts that were described but not observed. Although direct observables for all survey respondents would have provided valuable confirming evidence, the cost of such a survey and the required organizational access would have been prohibitive.

We held in-person, follow-up assessment validation sessions with about 15 percent of the assessed organizations. These sessions helped us validate the collection method and refine the technique. They also let us gauge the assessments’ accuracy.


Figure 3. Assessment results useful to Mystery Airline: (a) data program coordination, (b) enterprise data integration, (c) data stewardship, (d) data development, (e) data support operations, and (f) assessment ranges. Horizontal bar charts compare Mystery Airline, the airline industry, and all respondents across 19 responses on a zero-to-four maturity scale.



Although the assessors strove to accurately measure each subprocess’s maturity level, some interviews inevitably were skewed toward the positive end of the scale. This occurred most often because interviewees reported on milestones that they wanted to or would soon achieve as opposed to what they had achieved. We suspected, and confirmed during the validation sessions, that responses were typically exaggerated by one point on the five-point scale.

When we factor in the one-point inflation, the numbers in Table 4 become important. Knowing that the bar is so low will hopefully inspire some organizations to invest in data management. Doing so might give them a strategic advantage if the competition is unlikely to be making a similar investment.
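Assuming the correction is the uniform one-point subtraction the validation sessions suggested (a sketch; the raw averages shown are illustrative back-calculations, not published survey data), deriving Table 4-style figures is mechanical:

raw_averages = {"response_5": 3.31343, "response_12": 1.57463}  # illustrative only

def adjust_for_inflation(raw, inflation=1.0):
    # Subtract the observed one-point self-reporting inflation,
    # flooring at zero so no adjusted score goes negative.
    return {k: round(max(v - inflation, 0.0), 5) for k, v in raw.items()}

print(adjust_for_inflation(raw_averages))
# {'response_5': 2.31343, 'response_12': 0.57463}, matching Table 4's adjusted values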

The relatively low scores reinforce the need for this data management assessment. Based on the overall scores in the data management practice areas, the community receives five Ds. These areas provide immediate targets for future data management investment.

WHERE ARE WE NOW?

We address our original research objectives according to our two goal categories.

Community descriptive research goals

First, we wanted to determine the range of practices within the data management community. A wide range of such practices exists. Some organizations are strong in some data management practices and weak in others (the range of practice is consistently inconsistent). The wide divergence of practices both within and between organizations can dilute results from otherwise strong data management programs. The assessment’s applicability to longitudinal studies remains to be seen; this is an area for follow-up research. Although researchers might undertake formal studies of such trends in the future, evidence from ongoing assessments suggests that results are converging. Consequently, we feel that our sample constitutes a representation of community-wide data management practices.

Next, we wanted to know whether the distribution of practices informs us specifically about the various stages of organizational data management maturity. The assessment results confirm the framework’s utility, as do the postassessment validation sessions. Building on the framework, we were able to specify target characteristics and objective measurements. We now have better information as to what comprises the various stages of organizational data management practice maturity. Organizations do clump together into the various maturity stages that Nolan originally described. We can now determine the investments required to predictably move organizations from one data management maturity level to another.

Finally, we wanted to determine in what areas the community data management practices are weak, average, and strong. Figure 4 shows an average of unadjusted rates summarizing the assessment results. As the figure shows, the data management community reports itself relatively and perhaps surprisingly strong in all five major data management processes when compared to the industry averages for software development. The range and averages indicate that the data management community has more mature data program coordination processes, followed by organizational data integration, support operations, stewardship, and then data development. The relatively lower data development scores might suggest data program coordination implementation difficulties.

Figure 4. Average of unadjusted rates for the assessment results, by process.

Process                      Initial   Repeatable   Defined
Data program coordination    2.06      2.71         3.31
Enterprise data integration  2.18      2.44         2.66
Data stewardship             1.98      2.18         2.40
Data development             1.57      2.12         2.46
Data support operations      2.04      2.38         2.66

Self-improvement research goals

Our first objective was to produce results that would help the community better understand current best practices. Organizations can use the assessment results to compare their specific performance against others in their industry and against the community results as a whole. Quantities and groupings indicate the relative state and robustness of the best practices within each process. Future research can use this information to identify specific practices that can be shared with the community. Further study of these areas will provide leverageable benefits.

Table 4. Assessment scores adjusted for self-reporting inflation.

Response   Adjusted average
1          1.72388
2          1.57463
3          1.0597
4          1.8806
5          2.31343
6          1.66418
7          1.33582
8          1.57463
9          1.1791
10a        1.40299
10b        1.14925
10c        0.97761
10d        1.20896
10e        1.23134
10f        1.12687
11         1.32836
12         0.57463
13         1.00746
14         1.46269
15         1.24627
16         1.65672
17         1.66418
18         1.04478
19         1.17164



Next, we wanted to determine how the assessment informs our standing as a technical community. Our research gives some indication of the claimed current state of data management practices. However, given the validation session results, we believe that it’s best to caution readers that the numbers presented probably more accurately describe the intended state of the data management community.

As it turns out, the relative number of organizations above level 1 for both software and data management is approximately the same, but a more detailed analysis would be helpful. Given the belief that investment in software development practices will result in significant improvements, it’s appropriate to anticipate similar benefits from investments in data management practices.

Finally, we hoped to gain information useful for developing a roadmap for improving current practice. Organizations can use the survey assessment information to develop roadmaps to improve their individual data management practices. Mystery Airline, for example, could develop a roadmap for achieving data management improvement by focusing on enterprise data integration, data stewardship, and data development practices.
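The roadmap logic amounts to ranking process-area scores and targeting the weakest first. A minimal sketch (Python, with hypothetical scores chosen to mirror Mystery Airline's pattern, not the actual assessment data):

# Hypothetical per-process maturity scores for a Mystery Airline-like organization.
area_scores = {
    "data program coordination": 2.9,
    "enterprise data integration": 1.1,
    "data stewardship": 2.0,
    "data development": 1.6,
    "data support operations": 2.7,
}

def roadmap(scores, n=3):
    # Improvement targets: the n lowest-scoring practice areas.
    return sorted(scores, key=scores.get)[:n]

print(roadmap(area_scores))
# ['enterprise data integration', 'data development', 'data stewardship']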

SUGGESTIONS FOR FUTURE RESEARCH

Additional research must include a look at relationships between data management practice areas, which could indicate an efficient path to higher maturity levels. Research should also explore the success or failure of previous attempts to raise the maturity levels of organizational data management practices.

One of our goals was to determine why so many organizational data management practices are below expectations. Several current theses could spur investigation of the root causes of poor data management practices. For example,

• Are poor data management practices a result of the organization’s lack of understanding?
• Does data management have a poor reputation or track record in the organization?
• Are the executive sponsors capable of understanding the subject?
• How have personnel and project changes affected the organization’s efforts?

Our assessment results suggest a need for a more formalized feedback loop that organizations can use to improve their data management practices. Organizations can use this data as a baseline from which to look for, describe, and measure improvements in the state of the practice. Such information can enhance their understanding of the relative development of organizational data management. Other investigations should probe further to see if patterns exist for specific industry or business focus types.

Building an effective business case for achieving a certain level of data management is now easier. The failure to adequately address enterprise-level data needs has hobbled past efforts.4 Data management has, at best, a business-area focus rather than an enterprise outlook. Likewise, applications development focuses almost exclusively on line-of-business needs, with little attention to cross-business-line data integration or enterprise-wide planning, analysis, and decision needs (other than within personnel, finance, and facilities management). In addition, data management staff is inexperienced in modern data management needs, focusing on data management rather than metadata management and on syntaxes instead of semantics and data usage.

Few organizations manage data as an asset. Instead, most consider data management a maintenance cost. A small shift in perception (from viewing data as a cost to regarding it as an asset) can dramatically change how an organization manages data. Properly managed data is an organizational asset that can’t be exhausted. Although data can be polluted, retired, destroyed, or become obsolete, it’s the one organizational resource that can be repeatedly reused without deterioration, provided that the appropriate safeguards are in place. Further, all organizational activities depend on data.

To illustrate the potential payoff of the work presented here, consider what 300 software professionals applying software process improvement over an 18-year period achieved:16

• They predicted costs within 10 percent.
• They missed only one deadline in 15 years.
• The relative cost to fix a defect is 1X during inspection, 13X during system testing, and 92X during operation.





• Early error detection rose from 45 to 95 percent between 1982 and 1993.
• Product error rate (measured as defects per 1,000 lines of code) dropped from 2.0 to 0.01 between 1982 and 1993.

If improvements in data management can produce similar results, organizations should increase their maturity efforts. ■

Acknowledgments

We thank Graham Blevins, David Rafner, and Santa Susarapu for their assistance in preparing some of the reported data. We are greatly indebted to many of Peter Aiken’s classes in data reengineering and related topics at Virginia Commonwealth University for the careful work and excellent results obtained as a result of their various contributions to this research. This article also benefited from the suggestions of several anonymous reviewers. We also acknowledge the helpful, continuing work of Brett Champlin at Allstate in collecting, applying, and assessing CMMI-related efforts.

References
1. M. Blaha, “A Retrospective on Industrial Database Reverse Engineering Projects—Parts 1 & 2,” Proc. 8th Working Conf. Reverse Eng., IEEE Press, 2001, pp. 147-164.
2. J. Zachman, “A Framework for Information Systems Architecture,” IBM Systems J., vol. 26, 1987, pp. 276-292.
3. P.H. Aiken, “Keynote Address to the 2002 DAMA International Conference: Trends in Metadata,” Proc. 2002 DAMA Int’l/Metadata Conf., CD-ROM, Wilshire Conf., 2002, pp. 1-32.
4. B. Parker, “Enterprise Data Management Process Maturity,” Handbook of Data Management, S. Purba, ed., Auerbach Publications, CRC Press, 1999, pp. 824-843.
5. M. Gillenson, “The State of Practice of Data Administration—1981,” Comm. ACM, vol. 25, no. 10, 1982, pp. 699-706.
6. M. Gillenson, “Trends in Data Administration,” MIS Quarterly, Dec. 1985, pp. 317-325.
7. M. Gillenson, “Database Administration at the Crossroads: The Era of End-User-Oriented, Decentralized Data Processing,” J. Database Administration, Fall 1991, pp. 1-11.
8. R.L. Keeney, Value-Focused Thinking—A Path to Creative Decisionmaking, Harvard Univ. Press, 1992.
9. R. Nolan, “Managing the Crisis in Data Processing,” Harvard Business Rev., Mar./Apr. 1979, pp. 115-126.
10. Carnegie Mellon Univ. Software Eng. Inst., Capability Maturity Model: Guidelines for Improving the Software Process, 1st ed., Addison-Wesley Professional, 1995.
11. M.C. Paulk and B. Curtis, “Capability Maturity Model, Version 1.1,” IEEE Software, vol. 10, 1993, pp. 18-28.
12. D.R. Goldenson and D.L. Gibson, “Demonstrating the Impact and Benefits of CMM: An Update and Preliminary Results,” special report CMU/SEI-2003-SR-009, Carnegie Mellon Univ. Software Eng. Inst., 2003, pp. 1-55.
13. C. Billings and J. Clifton, “Journey to a Mature Software Process,” IBM Systems J., vol. 33, 1994, pp. 46-62.
14. C.C. Hauer, “Data Management and the CMM/CMMI: Translating Capability Maturity Models to Organizational Functions,” presented at National Defense Industrial Assoc. Technical Information Division Symp., 2003; www.dtic.mil/ndia/2003technical/hauer1.ppt.
15. W. Schnider and K. Schwinn, “Der Reifegrad des Datenmanagements” [The Data Management Maturity Model], KPP Consulting, 2004 (in German); www.kpp-consulting.ch/downloadbereich/DM%20Maturity%20Model.pdf.
16. H. Krasner, J. Pyles, and H. Wohlwend, “A Case History of the Space Shuttle Onboard Systems Project,” Technology Transfer 94092551A-TR, Sematech, 31 Oct. 1994.

Peter Aiken is an associate professor of information systems at Virginia Commonwealth University and founding director of Data Blueprint. His research interests include data and systems reengineering. Aiken received a PhD in information technology from George Mason University. He is a senior member of the IEEE, the ACM, and the Data Management Association (DAMA) International. Contact him at [email protected].

M. David Allen is chief operating officer of Data Blueprint. His research interests include data and systems reengineering. Allen received an MS in information systems from Virginia Commonwealth University. He is a member of DAMA. Contact him at [email protected].

Burt Parker is an independent consultant based in Washington, DC. His technical interests include enterprise data management program development. Parker received an MBA in operations research/systems analysis (general systems theory) from the University of Michigan. He is a member of DAMA. Contact him at [email protected].

Angela Mattia is a professor of information systems at J. Sergeant Reynolds Community College. Her research interests include data and systems reengineering and maturity models. Mattia received an MS in information systems from Virginia Commonwealth University. She is a member of DAMA. Contact her at [email protected].