Evaluating Software Architectures
Stakeholders, Metrics, Results, Migration Strategies

Ingolf H. Krueger, [email protected]

California Institute for Telecommunications and Information Technologies, La Jolla, CA 92093-0405, USA
Department of Computer Science & Engineering, University of California, San Diego, La Jolla, CA 92093-0114, USA

© Ingolf H. Krueger, March 06, 2003, CSE/CAL·(IT)2

Overview

• Background: Why Evaluate Software Architectures?

• Goal-Orientation

• Stakeholders and Architectural Views

• Evaluation Results

• Evaluation Methods and Metrics

• Migration Strategies

• Summary and Outlook


The Role of SW-Architecture

• Bounds leeway for implementation

• Fosters (or impedes!) system quality

• Supports the critical system services

• Defines starting point for
– Change management
– Product families
– Division of work


Why Evaluate an Architecture?

• Stakeholders have different views/opinions on
– what the system does
– how the system should be structured
– what documentation should look like
– how the company/suppliers should conduct their business
– …

• Architecture Evaluation
– brings different stakeholders together
– forum for voicing concerns (can, but need not, be related to the software architecture under consideration)
– forum for establishing common understanding of important system aspects across development/business teams
– means for risk identification and analysis


Why Evaluate an Architecture?

• Find errors early: 55% of all errors are made during requirements capture and analysis, but less than 10% are detected there

• Find out if the architecture is adequate wrt.
– desirable/mandatory requirements

– critical use cases

– anticipated changes/extensions

– budget constraints

• Find out if state-of-the-art development and documentation approaches were used

• Find interfaces for coupling existing architecture with legacy or new neighboring systems

• Decide among several competing architectural alternatives

• Determine migration path towards target architecture



Goal-Orientation

• First Step of Architecture Evaluation: determine the specific goals for the system under consideration

• Prioritize goals!

• What are the critical properties/requirements the architecture must support?

• Is the architecture suitable wrt. these goals/properties/requirements?

• Determine the points in the architecture that influence the critical goals



Who is involved?

• Customer
• End User
• Manager
• Administrator
• Marketing
• Maintainer
• Tester
• Operator
• Implementer
• Evaluation Team
• Architect


Architectural Aspects: What to Look at?

• Logical view: functional requirements; models of structure and behavior
• Implementation view: source code organization; file structure; configuration information; …
• Process view: aspects of distribution and concurrency; response times; throughput; …
• Deployment view: mapping of executables to processors; system platform; installation; …
• Service view

see also: [RUP98]
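The view-to-concern mapping above can be sketched as a small lookup table, so an evaluation team can check what to inspect per view. View and concern names follow the slide; the data layout and function are our own illustration:

```python
# Hypothetical mapping of architectural views to evaluation concerns.
ARCHITECTURAL_VIEWS = {
    "logical": ["functional requirements",
                "models of structure and behavior"],
    "implementation": ["source code organization", "file structure",
                       "configuration information"],
    "process": ["distribution and concurrency", "response times",
                "throughput"],
    "deployment": ["mapping of executables to processors",
                   "system platform", "installation"],
    "service": [],  # service-level concerns; not detailed on the slide
}

def concerns_for(view: str) -> list[str]:
    """Return the evaluation concerns associated with a view."""
    return ARCHITECTURAL_VIEWS.get(view, [])
```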


Key Elements of Architecture Evaluation

• Determine goals for the evaluation
– why does it happen?
– who has an interest in it?

• Build understanding of the application domain
– where are the difficulties in building this and similar applications?
– are standard solutions available/used?

• Build a domain model if it doesn’t exist already

• Determine and prioritize goals for the system/organization

• Identify and prioritize scenarios (“critical”, “medium”, “low”)

• Play through scenarios, record adequacy of the architectural decisions

• Discuss alternative architectures and migration strategies
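The scenario identification and play-through steps can be sketched as a small bookkeeping structure: scenarios carry a priority tag and collect adequacy findings as the team plays them through. The names and structure are our own illustration, not a prescribed format:

```python
from dataclasses import dataclass, field

# Priority categories from the slide, highest first.
PRIORITIES = ("critical", "medium", "low")

@dataclass
class Scenario:
    description: str
    priority: str                                 # one of PRIORITIES
    findings: list = field(default_factory=list)  # adequacy notes per run

def evaluation_order(scenarios):
    """Sort scenarios so critical ones are played through first."""
    return sorted(scenarios, key=lambda s: PRIORITIES.index(s.priority))
```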



Example: Architecture Evaluation

Build Understanding of the Existing Architecture

[Diagram: example layered architecture with Hardware, Hardware Abstraction, Middleware, Application Server, GUI-Coupling, GUI, and Legacy systems]


Example: Architecture Evaluation

• Evaluation goals:
– Determine if the prototypic architecture can be transferred to production
– Determine adequate means for architecture documentation

• Prioritized system goals:
– extensibility (applications/internal services)
– understandability
– short time-to-market
– support for emerging standards

• Domain model

• Alternative architectures:
– use of off-the-shelf middleware vs. a proprietary one
– distribution of application “knowledge” between client and host of service


Example: Results of Architecture Evaluation

• Overview documentation of the software architecture
– Short description of the system
– Domain model
– Essential use cases (including exceptions)
– Component structure (possibly taken from the domain model)
– Interaction patterns
– Other relevant views (process view, distribution view, … – see above)

• Prioritized list of quality attributes/goals and rationale

• Rationale for the architecture’s adequacy

• Discussion of alternative architectures

• Risk/Cost/Tradeoff Analysis



Evaluation Methods: Example

ATAM: Architecture Tradeoff Analysis Method*

• Presentation
1. Present the ATAM
2. Present business drivers
3. Present the architecture

• Investigation and Analysis
4. Identify architectural approaches
5. Generate quality-attribute utility tree
6. Analyze architectural approaches

• Testing
7. Brainstorm and prioritize scenarios
8. Analyze architectural approaches

• Reporting
9. Present the results

*P. Clements, R. Kazman, M. Klein: Evaluating Software Architectures. Methods and Case Studies, Addison Wesley, 2002


Evaluation Methods: Example

ATAM: Architecture Tradeoff Analysis Method

Utility Tree: the root node “utility” branches into quality attributes (extensibility, availability, usability, …), which refine into concrete, prioritized scenarios such as “offline-time/year < 5 sec.” and “handle DOS attacks”.
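A utility tree of this shape can be represented as plain nested nodes: “utility” refines into quality attributes, whose leaves carry concrete scenarios. The class is our own sketch (not from Clements et al.), and the attachment of each scenario to an attribute is an assumption, since the slide does not spell it out:

```python
# Hypothetical representation of an ATAM-style utility tree.
class UtilityNode:
    def __init__(self, name, children=None, scenario=None):
        self.name = name
        self.children = children or []
        self.scenario = scenario  # concrete requirement at a leaf

    def scenarios(self):
        """Collect all concrete scenarios below this node."""
        found = [self.scenario] if self.scenario else []
        for child in self.children:
            found.extend(child.scenarios())
        return found

utility = UtilityNode("utility", [
    UtilityNode("extensibility"),
    UtilityNode("availability", [
        UtilityNode("offline time", scenario="offline-time/year < 5 sec."),
        UtilityNode("robustness", scenario="handle DOS attacks"),
    ]),
    UtilityNode("usability"),
])
```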


Evaluation Methods: Example

ATAM: Architecture Tradeoff Analysis Method

• Phase 0:
– Create evaluation team
– Form partnership between evaluation organization and customer

• Phase 1:
– Steps 1-6 (architecture-centric)

• Phase 2:
– Steps 7-9 (stakeholder-centric)

• Phase 3:
– Produce final report
– Plan follow-on actions


Watchlist

Be wary if …

• Architecture follows the customer’s organization

• Complexity: too many components on every hierarchical level (rule of thumb: ≤ 7±2)

• Unbalanced set of requirements

• Architecture depends on specifics of an operating system

• Standards and standard components neglected

• Architecture follows hardware design

• Inappropriate redundancy to cover indecision
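The component-count rule of thumb lends itself to a simple automated check. A minimal sketch, assuming a hypothetical nested-dict component model (the input format is our own choice, not from the slides):

```python
# Flag hierarchy levels with more components than the 7±2 rule of thumb
# allows (upper bound 9 by default).
def oversized_levels(component, path="root", limit=9):
    """Yield (path, child_count) for every level exceeding `limit`."""
    children = component.get("children", {})
    if len(children) > limit:
        yield path, len(children)
    for name, child in children.items():
        yield from oversized_levels(child, path + "/" + name, limit)
```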


Watchlist

Be wary if …

• Exceptions drive the architecture

• No architect exists

• Stakeholders are difficult to identify

• Architecture = class diagrams

• Outdated documentation

• Disparity in perception of the architecture between developers and architects


Metrics: Measure and Control Progress and Quality

• “You can’t control what you can’t measure” – is this true for software development?

• Metrics define how characteristics of a software system are measured

• Examples:
– interface complexity (# ports/component, # messages/protocol, …)
– structural complexity (# components/modules, # components/hierarchical level, # levels, # data classes, height of inheritance tree, …)
– behavioral complexity (# calls to other components, # states, # transitions, # synchronization points, # choice-points, # loops, …)
– test complexity (# branches covered, # boundary conditions covered, # data values covered, # loop iterations covered, …)
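Two of the listed metrics can be computed mechanically over a component model. A sketch, assuming a toy dict-based model of our own devising (interface complexity as # ports per component, structural complexity as # components per hierarchical level):

```python
# Hypothetical component model: name -> {"ports": [...], "level": int}.
def ports_per_component(model):
    """Interface complexity: number of ports per component."""
    return {name: len(spec.get("ports", []))
            for name, spec in model.items()}

def components_per_level(model):
    """Structural complexity: number of components per hierarchy level."""
    counts = {}
    for spec in model.values():
        level = spec.get("level", 0)
        counts[level] = counts.get(level, 0) + 1
    return counts
```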


Metrics: Measure and Control Progress and Quality

• Metrics are a tool for complexity estimation

• Different techniques for complexity estimation can differ widely in their predictions for the same system

• Initial estimates are almost always wrong – iterative updating is required

• Using prototypes to determine adequacy may be preferable



Migration Strategies

[Diagram: current architecture (Hardware, Hardware Abstraction, GUI-Coupling, Application Server, GUI, Middleware, Legacy) next to a target architecture built from GUI, GUI-Coupling Services, Application Services, Middleware, Hardware Abstraction, Hardware, and Legacy Systems]

• What is the improvement?

• What will it cost?

• What qualifications are needed?

• How long will it take?

• What are the intermediate stages?

• …


Migration Strategies

• Consider a sequence of manageable, smaller migration steps as opposed to one “big bang”
– depends on application context
– influential factors: time-to-market, competitive advantage, maintenance pressures

• Sketch out architectures of intermediate steps

• Discuss “top” scenarios for each intermediate architecture

• Balance
– cost and benefit
– available technologies/standards and product delivery schedules
– capabilities of development team and ambition
– clarity and performance
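The cost/benefit balancing step can be sketched as a simple ordering: rank candidate migration steps by benefit-to-cost ratio, which tends to favor a sequence of small, high-value steps over one “big bang”. The scoring rule and step names are illustrative assumptions, not part of the method:

```python
# Order candidate migration steps by benefit/cost ratio, best first.
def plan_migration(steps):
    """steps: list of (name, benefit, cost); returns names, best ratio first."""
    return [name for name, benefit, cost in
            sorted(steps, key=lambda s: s[1] / s[2], reverse=True)]
```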



Summary and Outlook

• Architecture Evaluation
– brings different stakeholders together
– increases understanding of goals and their architectural (mis)representation across team boundaries
– exposes risks

• The result should provide
– overview documentation of the software architecture
– prioritized list of quality attributes/goals and rationale
– rationale for the architecture’s adequacy
– discussion of alternative architectures
– risk/cost/tradeoff analysis

• Metrics aid in measuring and controlling the process only if estimates are regularly updated and compared with reality

• Migration strategies enable the transition between the current and the target architecture
– orient along goals
– “big bang” vs. a sequence of small steps