IAM/IAG Maturity Assessment Dos & Don'ts

Thursday, May 15th, 15:30 – 16:00

Dr. Horst Walther, Senior Analyst, KuppingerCole, [email protected]



Rating the maturity of IAM/IAG programs is not easy. Who is the right party to perform such a rating? Which input is required? How do you ensure that the rating does not become more complex than the rest of the program? What should you look at – what are the Key Performance Indicators (KPIs) & Key Risk Indicators (KRIs) to track, & how can you do it without a years-long collection of such indicators? What are the right benchmarks to use – who can help you with benchmarking? Which lessons should be drawn from the results? In this session, Dr. Horst Walther will talk about the Dos & Don'ts of Maturity Assessments.

IAM/IAG Maturity Assessment Dos & Don’ts

• Maturity models are among the most widespread tools for improving organizational performance.

• They identify organizational strengths & weaknesses & provide benchmarking information.

• There are many maturity models like OPM3, CMMI, P3M3, PRINCE, BPMM, Kerzner's model, SPICE, COBIT etc.

• These models differ from each other in their factors & characteristics.

• There is no common standard underlying these models.

• It is important for organizations to be able to assess their situation by a comprehensive and useful model.

Maturity models

5/20/2014 © KuppingerCole 3

• P3M3: Portfolio, Programme & Project Management Maturity Model
• CMMI®: Capability Maturity Model® Integration
• OPM3®: Organizational Project Management Maturity Model
• SPICE: Software Process Improvement & Capability Determination
• COBIT: Control Objectives for Information & Related Technology

CMM – The forefather of all maturity models


• Initial (chaotic, ad hoc) – the starting point for use of a new or undocumented repeat process.

• Repeatable – the process is at least documented sufficiently to enable repeating the same steps.

• Defined – the process is defined/confirmed as a standard business process.

• Managed – the process is quantitatively managed in accordance with agreed-upon metrics.

• Optimizing – process management includes deliberate process optimization/improvement.

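As a sketch, the five levels form an ordered scale that can be modeled directly; the Python names & helper below are illustrative assumptions, not part of any official CMM tooling:

```python
from enum import IntEnum

class CmmLevel(IntEnum):
    """The five CMM maturity levels as an ordered scale."""
    INITIAL = 1      # chaotic, ad hoc
    REPEATABLE = 2   # process documented well enough to repeat
    DEFINED = 3      # confirmed as a standard business process
    MANAGED = 4      # quantitatively managed against agreed metrics
    OPTIMIZING = 5   # deliberate, continuous process improvement

def meets_target(current: CmmLevel, target: CmmLevel) -> bool:
    """True if the assessed level reaches the target level."""
    return current >= target
```

Because the levels are an `IntEnum`, they compare numerically, which makes target checks trivial.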

CMM gave maturity models a kick start

• In 1986, triggered by the U.S. DoD (Department of Defense), the SEI (Software Engineering Institute) at Carnegie Mellon University started the development of a system for assessing the maturity of software processes.

• In 1991, the model was issued as Capability Maturity Model 1.0.

• CMMI® (Capability Maturity Model Integration) was released in early 2002.

• CMM led to a proliferation of CM models.

• Popular models based on the original CMU CMM® are SPICE for the maturity assessment of software processes & COBIT for IT governance processes … & many others.

• The notion of maturity models remains tied to one name: Watts S. Humphrey.

Assessment according to a common maturity model enables …

• Positioning - current achievements in a framework

• Benchmarking - to compare with others (competitors, best of breed, …)

• Quantification - of otherwise qualitative information

• Evidence - for compliance & certification purposes

• Orientation - to define the starting point for change activities.

• Reputation - as it is fashionable not to rely on gut feelings.

• Transparency - serving as the foundation for any good governance.

Why Maturity Assessments?


Maturity Models for IAM / IAG & related


• There is an overwhelming number of models around; you could well craft your own.

• We deem it prudent to assess IAM / IAG processes for maturity too.

• However, this discipline is inherently immature in itself.

• Terms (like authorisation, provisioning, …) are weakly defined & poorly understood.

• IT depts carry the burden to solve business tasks – without being mandated.

• Few standards or generic practices have been established.

• Hence, maturity assessments have to be undertaken with some extra care.

• Nevertheless, a huge number of maturity models is around.

• Tailored approaches currently appear to be most promising.

• KuppingerCole pioneered this discipline.

Maturity assessments & IAM / IAG


• in-depth knowledge of the status of the technology market segment the programs relate to

• knowledge about the status of other organizations, both in the industry of the organization and in other industries

• good understanding of trends that will have an impact on the program and investments

• a rigorous methodological approach based on reliable information


What it takes to do Maturity Assessments

• Define goals: Define what should be achieved & how the initiative relates to other initiatives in the organization. There should be one consistent risk management approach in the organization, even when starting small & distributed.

• Define metrics: The KRIs/KPIs to be used have to be defined. That includes the definition of thresholds which should be met.

• Define responsibilities: In the beginning, the responsibilities for providing the current values of metrics, the aggregation of these metrics into scorecards & the reporting structures including alerting & escalations have to be defined.

• Define actions: The approach has to result in predefined actions in case a risk increases beyond the defined threshold.

Build the assessment on top of KPIs / KRIs

The following generic approach for deriving KPIs/KRIs is recommended …

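The four definition steps above can be sketched in code; a minimal, hypothetical example (the `Kri` class, its field names & the sample action are assumptions for illustration, not a KuppingerCole artifact):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Kri:
    """A Key Risk Indicator with a defined threshold and a predefined action."""
    name: str
    threshold: float   # values at or below this are acceptable
    action: str        # predefined action when the threshold is exceeded

def evaluate(kri: Kri, value: float) -> Optional[str]:
    """Return the predefined action if the measured value breaches the threshold."""
    return kri.action if value > kri.threshold else None
```

For example, `evaluate(Kri("orphaned accounts", 10, "review & deprovision"), 25)` yields the predefined action, while a value within the threshold yields `None`.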

It is most important to choose the appropriate KRIs/KPIs:

1. Choose valid indicators: Indicators have to be directly related to a risk. Changes in the value of the indicator have to indicate increasing/decreasing risks.

2. Choose indicators which can be influenced directly: There have to be actions defined for every indicator. Indicators which can be influenced (& improved) easily are a good choice.

3. Choose indicators which are easy to collect: If you need special tools or increased staff to collect raw data, you may have chosen the wrong metric – collection has to be easy.

How to choose KRIs & KPIs

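The three selection criteria above can be expressed as a simple filter; a hypothetical sketch (class & field names are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A candidate KRI/KPI scored against the three selection criteria."""
    name: str
    risk_related: bool     # 1. directly related to a risk
    influenceable: bool    # 2. defined actions can influence it
    easy_to_collect: bool  # 3. no special tools or extra staff needed

def select_indicators(candidates):
    """Keep only the candidates that satisfy all three criteria."""
    return [c.name for c in candidates
            if c.risk_related and c.influenceable and c.easy_to_collect]
```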

Worked example: Digital identities per physical person


Indicator: Average number of digital identities per physical person.
Group(s) of indicators: IAM, GRC

Interpretation: Defines the ratio of digital identities (e.g. identifiers to which accounts are mapped) to the number of physical persons (internal, external).

Unit type: Ratio
Direction: Minimize (Optimum: 1)

IT risks associated:
• Security risks: Situations in which one person has several digital identities often lead to unmanaged accounts. There are also security risks in preferring elevated accounts or insecure authentication approaches. From a GRC perspective, these situations make it very difficult to analyse & control security.
• Efficiency risks: Having to deal with several identities is more complex & might lead to an increasing number of password losses.

Operational risks associated: Due to the security risks, these situations might lead to undetected SoD conflicts in case the relation of several digital identities to one physical person isn't identified.

How to optimize: Use global identifiers as an abstraction level or map all accounts to one physical identity (if applicable).

Annotations: Some IAM & GRC tools can't deal with multiple layers of identities, e.g. accounts, digital identities & additional global identifiers as an additional mapping layer.
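The indicator described above can be computed from an account-to-person mapping; a minimal sketch (function & data names are assumptions):

```python
from collections import defaultdict

def identities_per_person(identity_to_person):
    """Average number of digital identities per physical person (optimum: 1.0).

    identity_to_person maps each digital identity (e.g. an account
    identifier) to the physical person it belongs to.
    """
    persons = set(identity_to_person.values())
    return len(identity_to_person) / len(persons)

def multi_identity_persons(identity_to_person):
    """Persons with more than one digital identity - candidates for unmanaged accounts."""
    grouped = defaultdict(list)
    for identity, person in identity_to_person.items():
        grouped[person].append(identity)
    return {p: ids for p, ids in grouped.items() if len(ids) > 1}
```

With a mapping such as `{"acct-1": "alice", "acct-2": "alice", "acct-3": "bob"}`, the ratio is 1.5 and "alice" is flagged for review.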

• Visibility & Acceptance

• Guidelines & Policies

• Organisational Structure

• Status of Organisation Deployment

• Scope & Coverage

• Risk Awareness

Select KPIs / KRIs from these activity domains


A typical assessment will evaluate KPIs / KRIs from the following activity domains against Best Practice:

• Technical Master Plan

• Access & Governance Analytics

• Identity Management & Provisioning

• Support for the Extended Enterprise

• Privilege Management & SIEM

• Authentication & Authorisation

• KC proposes Maturity Level Matrices for IAM/IAG for 7 major areas:

1. Access Governance

2. Access Management & Federation

3. Authentication

4. Cloud Identity Management

5. Dynamic Authorization Management

6. Identity Provisioning

7. Privilege Management

• These matrices cover the most important areas of IAM/IAG.

• They also include some minor segments, such as Enterprise Single Sign-On.

• Some of the matrices cover a fairly broad range of topics.

• E.g. Authentication includes strong authentication, risk- & context-based authentication & authorization, & versatile authentication.

Where to assess the Maturity? e.g. in the 7 KC IAM/IAG Maturity domains



Maturity Levels tailored to the domain
KC example for Access Management / Governance

How to visualise the results
Evaluation sample 1 (table)


[Table: the twelve activity domains – Visibility & Acceptance, Guidelines & Policies, Organisational Structure, Penetration of the Organisation, Scope & Coverage, Risk Awareness, Technical Master Plan, Access Governance/Analytics, Identity Management, Extended Enterprise, Privilege Management & SIEM, Authentication & Authorisation – rated against the columns Maturity Level 5, Maturity Level 3, Best of Class, Good in Class & Current Average.]

[Radar chart: the twelve activity domains plotted on a 0–9 scale, comparing the Customer's scores with Best of Class.]

Maturity Assessment – Example of evaluation

The customer's status compared to "Best of Class"

How to visualise the results
Evaluation sample 2 (graph)
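The comparison visualised in the radar chart can also be computed directly; a hypothetical sketch (the domain names & 0–9 scores below are invented for illustration):

```python
def maturity_gaps(customer, best_of_class):
    """Per-domain gap between the customer's scores and Best of Class."""
    return {d: best_of_class[d] - customer[d] for d in best_of_class}

def domains_needing_action(customer, best_of_class, tolerance=1):
    """Domains where the customer trails Best of Class by more than `tolerance`."""
    return sorted(d for d, gap in maturity_gaps(customer, best_of_class).items()
                  if gap > tolerance)
```

Such a gap list is a natural starting point for the action plan on the next slide: domains within tolerance need no action, the rest get a recommended measure.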


The recommended actions
Example working plan until the next maturity assessment


• Visibility & Acceptance: No actions required

• Guidelines & Policies: Consolidate & harmonise the existing stack

• Organisational Structure: Shift IAG responsibility to business

• Penetration of the Organisation: Extend current practices to a 2nd business line

• Scope & Coverage: Consider including customer direct access

• Risk Awareness: No actions recommended

• Technical Master Plan: Consolidate isolated projects to a controlled program

• Access Governance/Analytics: Employ a big data approach to enable analytics

• Identity Management: No actions required

• Extended Enterprise: Actions recommended – postponed due to low priority

• Privilege Management & SIEM: Apply SIEM to privileged Access Management

• Authentication & Authorisation: Include dynamic authorisation in the enterprise concept

1. Tailor 'oversize' maturity models to your specific needs.

2. There is currently no way to avoid proprietary models.

3. They provide (limited) knowledge bases & hence comparability.

4. IAM / IAG's inherent immaturity limits the applicability of benchmarking.

5. Accept IAM / IAG purely as a business task.

6. Invest some effort into a clear, rigorous & logical terminology.

7. You may well define your own custom KPIs / KRIs.

7 Dos & recommendations


1. No overkill – assessments must not become huge projects.

2. Not for the shelf – assessments should result in actions.

3. Not a one-time effort – assess regularly, at least every 2–3 years.

4. Not just IT – consider business and technology.

5. No introspection – look for an outside view: experts, external knowledge.

5 Don’ts & warnings
