
“Model-Driven Performance Evaluation for Service Engineering”

Seminar Emerging Web Services Technology

David Jaeger


Agenda

1. Service- and Model-Driven Engineering

2. Performance Evaluation

3. Empirical Model-Driven Performance Evaluation


4. Monitoring

5. Evaluation Framework


Background Information


■ Published in the Proceedings of the European Conference on Web Services, November 2007


Dr. Claus Pahl, Dublin City University: Service and Software Engineering

Marko Boskovic, University of Oldenburg: Model-Driven Engineering, Performance Engineering

Prof. Dr. Wilhelm Hasselbring, University of Oldenburg: Software System Quality, Distributed Systems


Service Engineering

■ Services are getting more complex over time

■ Composition of services is a major topic in research and business


■ Architectural questions are becoming more important

□ Hard to keep an overview of all technologies and code

□ Developers do not want to cope with implementation details anymore

■ Shift focus to problem domain

■ Introduction of models as abstraction


Model-Driven Engineering (MDE)

■ Key points:

□ Discourage algorithmic and code concepts

□ Prefer models as abstraction


■ Advantages

□ Formal analysis and evaluation of the model

□ Generation of the implementation from models

■ Employment of Model-Driven Architecture (MDA)


[Metaphor by Johan den Haan]


Model-Driven Architecture (MDA)

■ Popular MDE Approach by the Object Management Group (OMG)

■ Guidelines

1. Focus on the problem domain rather than on technologies

2. Automation of the relation between problem and implementation domain

3. Open standards for interoperability

■ Definition of models with domain-specific languages (DSL)

□ BPMN (Web Services)

□ UML


Performance and Quality of Service

■ One of the Quality of Service (QoS) attributes

□ Alongside reliability, availability, and others

■ Covered metrics


□ Response time

□ Throughput

□ Resource utilization


Motivation for Performance Evaluation

■ Performance is a critical property of today’s business software

□ Demand for quality software

□ Clients do not want to wait a long time (timeliness)

■ Measurement of certain key properties

□ Durations in service composition


◊ Single service action

◊ End-to-End latency

□ Responsiveness

□ Number of concurrent users

□ Resource consumption

⇒ Reveal performance bottlenecks and improve the service


Problems with Performance Evaluation of SOAs

■ Services are deployed remotely

□ No direct access

□ Cannot measure performance on one host

□ Measurement results must be collected from multiple locations


□ Network delay can influence performance

■ Service implementation is probably not available

□ Neither as a binary nor as source code

□ Cannot easily inject performance measurement code

□ The WSDL file is the only resource available


Evaluation Methods

Simulation
• Imitation of program execution, focusing on a certain aspect
• Pros: flexible
• Cons: lack of accuracy

Analysis
• Mathematical description of the system
• Pros: easy to construct
• Cons: lack of accuracy (because of abstraction)

Empirical Evaluation
• Measurements and metrics calculation on the real system
• Pros: very accurate


Model-based empirical evaluation

■ Evaluation approach chosen in paper

■ Model-based

□ MDE fits the requirements of services

□ Empirical evaluation has already been researched on the code level

■ Empirical

□ Accuracy benefits

□ Lacking research at the model level


Sensors

■ Monitoring is performed by means of sensors

□ Collect information about the state of the system

■ Two types of sensors exist (see the sketch below)

Traced
• Requires code in the traced software
• Influences performance

Sampled
• Performance not influenced
• Infrequent state changes could be omitted
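To make the distinction concrete, here is a minimal Java sketch of both sensor styles (my own illustration, not code from the paper; all names are made up):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Supplier;

public class SensorSketch {

    // Traced sensor: the measurement code sits in the monitored call path,
    // so every invocation is captured, but the measurement adds overhead.
    static <T> T traced(String name, Supplier<T> action) {
        long start = System.nanoTime();
        try {
            return action.get();
        } finally {
            long end = System.nanoTime();
            System.out.printf("[trace] %s took %d us%n", name, (end - start) / 1_000);
        }
    }

    // Sampled sensor: the monitored code is untouched; a separate thread polls
    // the system state periodically, so short-lived state changes may be missed.
    static ScheduledExecutorService sampleEverySecond(Runnable probe) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(probe, 0, 1, TimeUnit.SECONDS);
        return scheduler; // caller shuts the scheduler down when monitoring ends
    }
}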


Recording Monitoring Data

■ Recording of data emitted by sensors

□ Data: time-varying relationships between entities of a computation

■ Conventional relational databases are static


□ Record the state at a single moment in time

□ The current state of the database is a snapshot of the system

■ Extend relational databases

□ Record facts with corresponding time information


Temporal Databases

■ Two distinct types of databases support recording of data with time information

□ Rollback database: records transaction time (start and end of a transaction)

□ Historical database: records validity time (start and end of validity of the data)

□ Temporal database: combines both transaction time and validity (event) time


Overview of Framework Workflow

Workflow steps (from the diagram):

1. Plain UML activity diagram
2. Annotated UML activity diagram
3. Services with instrumentation code (traced sensors, service monitor)
4. Temporal database with monitoring data (event and interval traces)
5. Performance evaluation


Step 1: Plain UML Activity Diagram


Plain UML Activity Diagram

■ Model of the service process

□ Created by user/software designer

□ Modeled as UML activity diagram

◊ Best fits requirements of extensibility


Step 2: Monitoring Annotation for the Model


Annotation Entities

■ Two types of annotations proposed

□ Each stands for certain trace type (Event, Interval)

■ Events used for control nodes, Intervals used for action nodes


Monitoring Annotation for the Model

■ Add annotations for instrumentation to the plain model

□ Automatically or manually

■ Each decision and action node gets a corresponding trace annotation


Step 3: Instrumentation of the Code


Implementation: Package Structure


Tracing Package

■ Actions

□ Intercepted at services

□ Collect start and end time of service

□ Send to temporal database


■ Control nodes

□ Intercepted at process engine

□ Take single timestamp

□ Send to temporal database
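As an illustration of the two trace kinds (my own sketch, not code from the paper; record and type names only mirror the examples in these slides), the collected data could be represented in Java like this:

import java.time.Instant;

// Interval trace for an action: start and end time of one service call.
record ActionTrace(String type, Instant start, Instant end) {}

// Event trace for a control node: a single timestamp taken at the process engine.
record ControlNodeTrace(String type, Instant eventTime) {}

public class TracingSketch {
    public static void main(String[] args) {
        Instant start = Instant.now();
        // ... the intercepted service call would run here ...
        Instant end = Instant.now();

        ActionTrace action = new ActionTrace("balance", start, end);
        ControlNodeTrace decision = new ControlNodeTrace("decision", Instant.now());

        // In the framework both traces would be sent to the temporal database;
        // this sketch only prints them.
        System.out.println(action);
        System.out.println(decision);
    }
}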


Instrumentation of the Code

■ Inject sensors into the services

□ Easy to realize

□ No significant performance overhead

Aspect-Oriented Programming (AOP)
• Controlled environment with access to the code
• Separation of instrumentation from code

Interceptors
• Open environment with service black boxes
• Interception of method invocations with proxies
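A minimal sketch of the AOP variant, written in AspectJ annotation style, might look as follows (the service class and pointcut are hypothetical examples, not taken from the paper):

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

// Hypothetical aspect that records an interval trace around every operation
// of a made-up service implementation class.
@Aspect
public class ActionTracingAspect {

    @Around("execution(* com.example.BankService.*(..))")
    public Object traceAction(ProceedingJoinPoint joinPoint) throws Throwable {
        long start = System.currentTimeMillis();
        try {
            return joinPoint.proceed(); // run the actual service operation
        } finally {
            long end = System.currentTimeMillis();
            // In the framework this interval would be written to the temporal database.
            System.out.printf("%s: %d ms%n",
                    joinPoint.getSignature().toShortString(), end - start);
        }
    }
}

The interceptor variant would achieve the same effect by wrapping the service stub with a proxy that intercepts method invocations, without touching the service code.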


Generation of Code

■ Instrumentation code generated automatically

■ Employ ATLAS Transformation Language (ATL)


□ Input: UML activity diagrams with annotations

◊ Service locations needed

□ Output: AOP-based code


Step 4: Temporal Database


Temporal Database

■ Two major implementations available

□ TimeDB

□ Oracle servers

■ Database Structure


□ Single table for every sensor


TransferTrace

startPeriod   endPeriod
2:22          2:45
3:03          3:12
3:15          3:29

DecisionTrace

eventTime
2:19
2:50
3:01
3:10


Temporal Database


■ Database Structure


□ Single table per sensor type


ActionTraces

type      startPeriod   endPeriod
login     2:22          2:45
balance   2:47          2:50
logout    2:52          2:54

ControlNodeTraces

type       eventTime
start      2:21
decision   2:46
merge      2:51
end        2:55
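Purely as an illustration (the paper shows no such code, and a real temporal database such as TimeDB would use its own period types and TSQL2 insert syntax), writing an action trace into a table shaped like ActionTraces could look roughly like this with plain JDBC; the connection URL is a placeholder:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Timestamp;
import java.time.Instant;

public class TraceRecorder {
    public static void main(String[] args) throws Exception {
        // Placeholder JDBC URL; in the framework this would point to the
        // temporal database (e.g. TimeDB or an Oracle server).
        try (Connection con = DriverManager.getConnection("jdbc:example:tracedb")) {
            String sql = "INSERT INTO ActionTraces (type, startPeriod, endPeriod) VALUES (?, ?, ?)";
            try (PreparedStatement stmt = con.prepareStatement(sql)) {
                stmt.setString(1, "balance");
                stmt.setTimestamp(2, Timestamp.from(Instant.now().minusSeconds(3)));
                stmt.setTimestamp(3, Timestamp.from(Instant.now()));
                stmt.executeUpdate();
            }
        }
    }
}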


Step 5: Evaluation of Results


Evaluation of Monitoring Data

■ Performance queries can be run on the temporal database

■ A special query language is required (TSQL2, TQuel)

■ Evaluate the response time of a single service (total valid time of all 'balance' traces divided by their number, i.e. the average duration):

SELECT CAST(VALID(AT) TO INTERVAL SECOND) / COUNT(AT.type)
FROM ActionTraces(type) AS AT
WHERE AT.type = 'balance'

■ Evaluate the frequency of called services (number of 'balance' calls relative to the number of 'decision' nodes passed):

SELECT COUNT(AT.type) / COUNT(CNT.type)
FROM ActionTraces(type) AS AT, ControlNodeTraces(type) AS CNT
WHERE AT.type = 'balance' AND CNT.type = 'decision'


Conclusion

■ New approach for performance evaluation of Web Services

□ Focus on abstract model-layer

□ Evaluation by empirical analysis

■ Good overview of the time spent in a single action and of the relations between certain control points

□ However…

◊ Cannot associate measurement results of the same walkthrough

◊ No association between control points and actions

■ No further work on the topic


Literature

[1] Pahl, C.; Boskovic, M.; Hasselbring, W.: Model-Driven Performance Evaluation for Service Engineering, 2007

[2] Snodgrass, R.: A Relational Approach to Monitoring Complex Systems, 1988

[3] Pahl, C. et al.: Quality-Aware Model-Driven Service Engineering, in: Model Driven Software Development: Integrating Quality Assurance, 2009

[4] Debusmann, M. et al.: Measuring End-to-End Performance of CORBA Applications using a Generic Instrumentation Approach, 2002

[5] Liao, Y.; Cohen, D.: A Specificational Approach to High Level Program Monitoring and Measuring, 1992

[6] Snodgrass, R.: The TSQL2 Temporal Query Language, 1995


Questions?
