BIG DATA KICK START
Troy Christensen
December 2013
1. Define the Target Operating Model
2. Develop Implementation Scope and Approach
3. Progress Key Data Management Capabilities
4. Transition to Operate
Big Data Roadmap
• Target Data Operating Model – the Data Solution
• Target Operating Model – begin with the end in mind
• Data Solution – define a content-based scope, then standardize to common process requirements
Data Processes
1. Manage Data Strategy, Performance & Capability: Define and execute the data management strategy, including governance, performance management, and organizational discipline.
2. Manage Data Issues & Risks: Identify, mitigate, and resolve issues and risks within the Data Solution. This process covers only the data-related issues and risks that impact the Data Solution (other issues are managed through the normal issue-resolution process).
3. Manage Data Objects: Define data objects and manage metrics, policies, metadata, data attributes, and data standards.
4. Manage Data Template: Enable data to be bulk loaded into the Data Solution in compliance with defined standards.
5. Manage Data Content: Manage data content via an agreed-upon hierarchy (e.g., global vs. regional).
6. Manage Data Audit: Enable the data audit requirements of the business processes to be traced and analyzed.
7. Monitor Data Quality: Enable the data quality of each data object to be managed to the requirements of the collective business processes.
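As an illustration of the Monitor Data Quality process, the sketch below scores a set of records against defined standards. It is a hedged example only: the vendor field names and rules (name, DUNS number, address) are hypothetical, not the company's actual standards.

```python
# Minimal sketch of "Monitor Data Quality": score a data object's
# records against a set of defined standards. All field names and
# rules below are hypothetical illustrations.

def check_record(record, standards):
    """Return the list of standards a single record fails."""
    return [name for name, rule in standards.items() if not rule(record)]

def quality_score(records, standards):
    """Fraction of records that pass every standard."""
    if not records:
        return 1.0
    passing = sum(1 for r in records if not check_record(r, standards))
    return passing / len(records)

# Hypothetical vendor-data standards (vendor data is the pilot scope
# proposed later in this deck).
VENDOR_STANDARDS = {
    "has_name": lambda r: bool(r.get("name", "").strip()),
    "has_duns": lambda r: len(r.get("duns", "")) == 9 and r["duns"].isdigit(),
    "has_address": lambda r: bool(r.get("address")),
}

vendors = [
    {"name": "Acme Ltd", "duns": "123456789", "address": "1 Main St"},
    {"name": "", "duns": "12345", "address": None},
]
print(quality_score(vendors, VENDOR_STANDARDS))  # 0.5
```

A per-standard breakdown (rather than a single score) is what feeds the scorecards described in the implementation section.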
Data Roles
Draft Data Management Organization (role assignments as of 18-Sep-13):

GBS Roles
• Global Data Director; Global Data Specialist; Data Steward
• Global Process Owner and Global Specialist
• Global P2P Process Owner and Global P2P Specialist; P2P Data Steward; P2P Vendor Relationship Management
• Global O2C Process Owner and Global O2C Specialist; O2C Data Steward
• Global R2R Process Owner and Global R2R Specialist; R2R Data Steward

Segment Roles (Segment P2P, O2C, R2R, and Process)
• Process & Data Owners and Process Leads for P2P, O2C, R2R, and the overall process

Regional Roles (Region P2P, O2C, R2R, and Process)
• Process & Data Owners and Data Stewards for P2P, O2C, R2R, and the overall process
• Super Users
• Users

IT&S Roles
• Global Service Manager, Service Delivery Manager, User Management, Support Levels 1-3, Application SMEs, Functional SMEs
Without defined organizational roles, the data processes will not execute.
Data Tools
Big Data tools will likely enable the target operating model, but we cannot define tool requirements until the operating model requirements are landed.
Big Data Implementation
Implementation involves four objectives:
• Standardization – The standards, policies, or rules that define the target state (what good data looks like) are gathered and documented. Standards are then mapped to the business processes that use them.
• Harmonization – Data sources are documented, and each data source is mapped to the data standards. Scorecards are developed to compare data sets from disparate sources against the common standards.
• Alignment – Scorecards are analyzed to identify gaps between each current data source and the common data standards. Gaps are prioritized into remediation projects, and funding is secured.
• Operate – Harmonization and alignment activities are periodically repeated as data standards continue to mature. Tools are implemented to improve SLAs and reduce costs.
(Cycle: Standardize → Harmonize → Align → Operate)
Key - Defining the Start Point
Pilot an Achievable Objective
• Data is a critical input to every business process we execute daily.
• We can ensure it does not cause our business processes to fail by managing our core data to specific standards.
• We can demonstrate its value by deploying a proven model to a limited set of data subject areas.
Land Initial Scope
Vendor data is a good data subject area candidate for piloting a data management operating model because:
• Relevancy – Vendor data impacts many business processes across many different segments and functions. There is value in ensuring it does not impede anyone's ability to quickly procure the right materials or services from the most cost-effective source.
• Reusability – Vendor data standards and data quality scorecards already exist, so we can quantify and address the vendor data quality gap with fewer resources.
Data silos weren’t built overnight and they won’t be removed overnight, but common
processes won’t function properly until data silos are removed.
Big Data Dependency Matrix
Maturity Level | Common Process | Data Requirements | Organization/Governance | Data Tools | Architecture

Level 1 | Common Business Processes Documented | Data Process Requirements Defined | Common Data Maturity Model Landed | Data Tool Requirements Documented | Data Flows Diagrammed

Level 2 | Data Scope Defined | Data Standards | Common Data Organization Landed | Tool Strategy Defined | Data Architecture Strategy Defined

Level 3 | Data Sustain and Support Processes Defined | Data Metrics and Scorecards | Common Data Governance Landed | Common Data Tool Gaps Identified | Data Architecture Gaps Identified

Level 4 | Data Sustain and Support Processes Integrated | Data Quality Performance | Functional Data Management Capabilities Developed | Common Data Tool Roadmap Developed | Data Architecture Roadmap Developed

Level 5 | Common Business Processes Optimized | Data Quality Enforcement | Functional Data Management Capabilities Optimized | Common Data Tool Set Implemented | Common Data Architecture Implemented
Dependencies generally run left to right, top to bottom. To understand the requirements for a common data architecture and tool set, we first need to develop common data processes,
data standards and governance capabilities.
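The matrix can also be captured as a simple data structure so progress can be checked programmatically. A minimal sketch, using track and milestone names from the matrix; the set of completed capabilities is a hypothetical current state, and only two tracks are shown for brevity.

```python
# Sketch of using the dependency matrix programmatically: given which
# capabilities are complete, report the highest maturity level fully
# achieved in each track. The "done" set below is hypothetical.

MATRIX = {
    "Common Process": [
        "Common Business Processes Documented",
        "Data Scope Defined",
        "Data Sustain and Support Processes Defined",
        "Data Sustain and Support Processes Integrated",
        "Common Business Processes Optimized",
    ],
    "Data Requirements": [
        "Data Process Requirements Defined",
        "Data Standards",
        "Data Metrics and Scorecards",
        "Data Quality Performance",
        "Data Quality Enforcement",
    ],
}

def maturity_level(track, done):
    """Highest level N such that levels 1..N of the track are all done."""
    level = 0
    for milestone in MATRIX[track]:
        if milestone not in done:
            break
        level += 1
    return level

done = {"Common Business Processes Documented", "Data Scope Defined",
        "Data Process Requirements Defined"}
print(maturity_level("Common Process", done))     # 2
print(maturity_level("Data Requirements", done))  # 1
```

Because dependencies run left to right, a gap in an early track (e.g., Data Requirements stuck at level 1) flags work that must land before the tool and architecture tracks can advance.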
Big Data Activity Sets
Standardization
• Gather data standards across the domain
• Determine a common format for capturing global data standards
• Document data standards using the common format
• Share with segment data stewards and gain consensus on group standards
• Map standards to global processes
Harmonization
• Identify all data sources
• Map data sources to group data standards
• Develop scorecards to measure data quality gaps
• Run scorecards against each data source
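The harmonization steps above can be sketched as running one scorecard over several sources. This is an illustrative assumption, not actual tooling: the source names, records, and standards below are invented for the example.

```python
# Hedged sketch of harmonization: run the same scorecard against
# several data sources and tabulate the failure rate per standard.

from collections import defaultdict

def scorecard(records, standards):
    """Per-standard failure rate (0.0 - 1.0) for one data source."""
    fails = defaultdict(int)
    for record in records:
        for name, rule in standards.items():
            if not rule(record):
                fails[name] += 1
    total = len(records)
    return {name: fails[name] / total for name in standards} if total else {}

# Hypothetical group data standards.
STANDARDS = {
    "name_present": lambda r: bool(r.get("name")),
    "country_coded": lambda r: len(r.get("country", "")) == 2,  # ISO alpha-2
}

# Hypothetical disparate sources (e.g., two regional ERP instances).
sources = {
    "ERP_EU": [{"name": "Acme", "country": "DE"}, {"name": "", "country": "FR"}],
    "ERP_US": [{"name": "Beta", "country": "USA"}],
}

for source, records in sources.items():
    print(source, scorecard(records, STANDARDS))
```

Comparing the per-source results against the common standards is what exposes the silos the deck describes: the same rule can pass cleanly in one source and fail everywhere in another.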
Alignment
• Identify data quality gaps from each scorecard
• Use global process mapping to assess the impact and risks of each gap
• Prioritize remediation activities
• Build a scorecard to describe the impact of improved vendor data quality
• Secure funding, execute remediation, and monitor progress
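A minimal sketch of the prioritization step: rank each gap by a simple risk score (business impact x failure rate) to order the remediation backlog. The gap names, failure rates, and impact weights are hypothetical.

```python
# Sketch of alignment: prioritize data quality gaps by risk.
# risk = failure rate observed in the scorecards x business impact
# weight (here 1 = low, 5 = high); all numbers are illustrative.

gaps = [
    {"standard": "valid_duns", "fail_rate": 0.12, "impact": 5},
    {"standard": "ship_address", "fail_rate": 0.30, "impact": 3},
    {"standard": "contact_info", "fail_rate": 0.45, "impact": 1},
]

for gap in gaps:
    gap["risk"] = gap["fail_rate"] * gap["impact"]

# Highest-risk gaps become the first remediation projects.
backlog = sorted(gaps, key=lambda g: g["risk"], reverse=True)
for g in backlog:
    print(f'{g["standard"]}: risk={g["risk"]:.2f}')
```

Note how the ranking differs from raw failure rates: a rarely failing but high-impact standard can still outrank a frequently failing low-impact one.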
Operate
• Monitor benefits and costs
• Develop a data management strategy
• Identify cost-effective data management tools
• Prioritize implementation across the remaining data objects
Transition to Operate
Measure the impact of the pilot program in terms of the business case. For vendor data, for example, metrics could include:

Safety and Cost Drivers
• Invalid D&B numbers result in orders from vendors who are not authorized to do business with the company.
• Incorrect shipping addresses cause delays in the receipt of supplies, resulting in increased down or idle time.

Safety Drivers
• Incorrect vendor data results in non-OEM parts or services being delivered in lieu of OEM parts or services.

Cost Drivers
• Missing vendor data results in increased invoice cycle time, adding processing costs and preventing the realization of payment discounts.
• Maintaining a large number of inactive vendors results in erroneous analytic reports and higher maintenance costs.
• Improper vendor addresses lead to incorrect tax calculations and collections.
• Incomplete vendor data requires additional resources to maintain and process orders and invoices.
• Poor data quality in vendor names results in an inability to search for and find the appropriate vendor in a pick list.
• Invalid contact information results in an inability to contact vendors to correct errors, which causes processing delays.
Summary - Big Data Kick Off Milestones
• Define the Target Operating Model – Solution, Process, Roles, Tools
• Develop Implementation Scope and Approach – Standardize, Harmonize, Align, Operate
• Progress Key Data Management Capabilities – Navigate the Dependency Matrix, Execute the Activity Set
• Transition to Operate – Ensure Big Data benefits are measurable and traceable
Final Takeaway
For most companies, kicking off a Big Data initiative is really about developing foundational data management capabilities (standardization, harmonization, alignment, and operations) across the respective business lines. Unless these core foundational capabilities exist or are developed as part of the program, Big Data can't deliver its full potential.
QUESTIONS?