Performance Metrics for your Delivery Pipeline - Wolfgang Gottesheim


1 #Dynatrace

gottesheim

wolfgang.gottesheim@dynatrace.com

Wolfgang Gottesheim

Performance Metrics for your Delivery Pipeline
JAX London, October 14

2 #Dynatrace

Is 10PM a good time to find out about performance problems?

3 #Dynatrace

When do YOU find performance problems?

4 #Dynatrace

(Pipeline diagram: Developers → Unit/Integration Tests → Acceptance Tests → Capacity Tests → Release, with a question mark over each stage)

5 #Dynatrace

When to find them?

(Diagram: release timelines shrinking from the long waterfall phases of Requirements/Specification/Design, Development, (Load) Test/QA/Acceptance, and Deployment/Production/Maintenance down to ever shorter, repeating Dev → Test → Deploy cycles)

6 #Dynatrace

7 #Dynatrace

The Challenge

» Performance is not a band-aid!

» Architecture has enormous influence!

You have to continuously ensure your performance requirements are met!

"I couldn't help but notice your pain."

"My pain?"

"It runs deep. Share it with me!"

(Star Trek V)

8 #Dynatrace

“But we have tests”

9 #Dynatrace

Software testing tells us that our system

» meets the requirements that guided its design and development,

» responds correctly to all kinds of inputs,

» performs its functions within an acceptable time,

» is sufficiently usable,

» can be installed and run in its intended environments, and

» achieves the general result its stakeholders desire.

Source: Wikipedia

What are we learning from our tests?

10 #Dynatrace

Let’s look at the tests we run

(Matrix: Unit Tests, Integration Tests, Acceptance Tests, and Load Tests mapped against the six qualities: meets requirements, responds correctly to input, performs in acceptable time, usability, deployment, achieves correct result)

High effort (tests have to be created and maintained)

Only possible at a rather late development phase

11 #Dynatrace

12 #Dynatrace

The Goal

(Matrix: the same test types, Unit, Integration, Acceptance, and Load, now covering all six qualities: meets requirements, responds correctly to input, performs in acceptable time, usability, deployment, achieves correct result)

13 #Dynatrace

What you usually get

Measuring Performance of Unit and Integration Tests

[junit] Running com.dynatrace.sample.tests.FastUnitTest

[junit] Tests run: 15, Failures: 0, Errors: 0, Time elapsed: 34 sec

[junit] Running com.dynatrace.sample.tests.SlowUnitTest

[junit] Tests run: 17, Failures: 0, Errors: 1, Time elapsed: 2,457 sec

14 #Dynatrace

15 #Dynatrace

Basic: Test duration

I don’t like endsWith – I like regex!
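The slide's point, test duration as the most basic metric, can be sketched without any framework. This is a minimal illustration; the workload and the 1000 ms budget are assumed values, not from the talk:

```java
// Minimal sketch: treat test duration itself as a performance metric.
// The workload and the 1000 ms budget are illustrative values.
public class DurationGate {

    // Run a task and return its elapsed wall-clock time in milliseconds.
    static long timeMillis(Runnable task) {
        long start = System.nanoTime();
        task.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    // A test "passes" the gate only if it stays under its budget.
    static boolean withinBudget(Runnable task, long budgetMillis) {
        return timeMillis(task) <= budgetMillis;
    }

    public static void main(String[] args) {
        boolean ok = withinBudget(() -> {
            long sum = 0;
            for (int i = 0; i < 1_000_000; i++) sum += i;
        }, 1000);
        System.out.println(ok ? "within budget" : "FAIL: over budget");
    }
}
```

Duration alone is noisy across machines, which is exactly why the following slides move on to count-based metrics.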

16 #Dynatrace

N+1 Queries

Metrics:
• # SQL Executions / Request
• # of “same” SQL Executions
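One way to surface the N+1 smell in a test is a counting wrapper around statement execution. This is a hypothetical sketch, not a Dynatrace API; a real agent would hook into JDBC instead:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the two metrics above: total SQL executions
// per request, and how often the "same" statement text repeats.
public class SqlMetrics {
    final List<String> executed = new ArrayList<>();

    void execute(String sql) {
        executed.add(sql); // ...the actual query would run here...
    }

    int totalExecutions() {
        return executed.size();
    }

    // Highest repeat count of any single statement text; a large value
    // on a parameterized statement is the classic N+1 signature.
    int maxSameStatement() {
        Map<String, Integer> counts = new HashMap<>();
        for (String sql : executed) counts.merge(sql, 1, Integer::sum);
        return counts.values().stream().max(Integer::compare).orElse(0);
    }

    public static void main(String[] args) {
        SqlMetrics m = new SqlMetrics();
        m.execute("SELECT * FROM orders WHERE user_id = ?");
        for (int i = 0; i < 5; i++)
            m.execute("SELECT * FROM items WHERE order_id = ?"); // N+1 pattern
        System.out.println(m.totalExecutions() + " executions, worst repeat: "
                + m.maxSameStatement()); // 6 executions, worst repeat: 5
    }
}
```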

17 #Dynatrace

Ignoring Architectural Rules

Metrics:
• # SQL Executions / Request

18 #Dynatrace

High Number of Requests to Backend System

Metrics:
• # Calls to 3rd party system

19 #Dynatrace

Memory Leak

(Chart annotations: “Fixed Version Deployed”, “Problem fixed!”, “Still crashing…”)

Metrics:
• Heap Size
• # Objects allocated
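A rough, JDK-only sketch of the heap-size metric. Runtime numbers are a coarse proxy (GC timing skews them), and an APM agent would track allocations per transaction instead; the simulated leak below is illustrative:

```java
// Rough sketch of the heap-size metric using only the JDK. Runtime
// figures are approximate; garbage collection timing affects them.
public class HeapMetrics {

    static long usedHeapBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        long before = usedHeapBytes();

        // Simulated "leak": references kept alive across iterations.
        byte[][] retained = new byte[50][];
        for (int i = 0; i < retained.length; i++) {
            retained[i] = new byte[1024 * 1024]; // 1 MB each
        }

        long after = usedHeapBytes();
        System.out.println("Heap grew by roughly "
                + (after - before) / (1024 * 1024) + " MB while "
                + retained.length + " MB stayed referenced");
    }
}
```

Tracking this number build over build is what exposes the slide's scenario: a "fixed" version whose heap still climbs.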

20 #Dynatrace

Too Many Exceptions

Metrics:
• # Exceptions

21 #Dynatrace

Unnecessary work

Caching framework updates content that is never used

22 #Dynatrace

23 #Dynatrace

Performance Metrics in your CI

What you currently measure:
• # Test Failures
• Overall Duration

What you could measure:
• # Calls to API
• # Executed SQL Statements
• # Web Service Calls
• # JMS Messages
• # Objects Allocated
• # Exceptions
• # Log Messages
• Execution Time of Tests
• …

24 #Dynatrace

We should not forget about ACCEPTANCE tests

25 #Dynatrace

Large Web Sites

17! JS Files – 1.7MB in Size

Useless information! It might even be a security risk!

26 #Dynatrace

Missing Resources Cause Delays

46! HTTP 403 Requests for images on the landing page

Lots of time “wasted” due to roundtrips that just result in a 403

Metrics:
• # HTTP 4xx & 5xx
• Total Number of Resources
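Both metrics can be computed over a list of per-resource HTTP status codes. In this minimal sketch the hard-coded list stands in for data you would capture from a HAR file or a proxy during the acceptance test:

```java
import java.util.List;

// Minimal sketch of the two metrics above, computed over a list of
// per-resource HTTP status codes (here hard-coded sample data).
public class ResourceAudit {

    static long countErrors(List<Integer> statuses) {
        // 4xx and 5xx responses both count as wasted roundtrips.
        return statuses.stream().filter(s -> s >= 400).count();
    }

    public static void main(String[] args) {
        List<Integer> statuses = List.of(200, 200, 403, 403, 200, 500);
        System.out.println("Total resources: " + statuses.size()
                + ", 4xx/5xx responses: " + countErrors(statuses));
        // Total resources: 6, 4xx/5xx responses: 3
    }
}
```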

27 #Dynatrace

SLOW or Failing 3rd Party Content

28 #Dynatrace

Performance Metrics in your CI

What you currently measure:
• # Test Failures
• Overall Duration

What you could measure:
• # Calls to API
• # Executed SQL Statements
• # Web Service Calls
• # JMS Messages
• # Objects Allocated
• # Exceptions
• # Log Messages
• Execution Time of Tests
• # HTTP 4xx/5xx
• Request/Response Size
• Page Load/Rendering Time
• …

29 #Dynatrace

Starting from…

(Diagram: Developers → CI Server → Testing Environment → Production Environment → Release, with question marks between the stages)

30 #Dynatrace

…or maybe…

(Diagram: Developers → CI Server → Testing Environment → Production Environment → Release)

31 #Dynatrace

We get to…

(Pipeline: Developers → Commit Stage → Automated Acceptance Testing → Automated Capacity Testing → Release)

32 #Dynatrace

Performance as a Quality Gate

» Automated collection of performance metrics in test runs

» Comparison of performance metrics across builds

» Automated analysis of performance metrics to identify outliers

» Automated notifications on performance issues in tests

» Measurements accessible and shareable across teams

» Actionable data through deep transactional insight

» Integration with build automation tools and practices
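Comparing metrics across builds can be sketched as a simple tolerance check that the CI server runs as the gate. Metric names, values, and the 20% tolerance below are illustrative assumptions:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Sketch of a cross-build comparison: flag any metric that got worse
// than the previous build by more than a tolerance. Metric names and
// the 20% tolerance are illustrative.
public class BuildComparison {

    static List<String> regressions(Map<String, Double> previous,
                                    Map<String, Double> current,
                                    double tolerance) {
        List<String> regressed = new ArrayList<>();
        for (Map.Entry<String, Double> e : current.entrySet()) {
            Double old = previous.get(e.getKey());
            if (old != null && old > 0 && e.getValue() > old * (1 + tolerance)) {
                regressed.add(e.getKey());
            }
        }
        return regressed;
    }

    public static void main(String[] args) {
        Map<String, Double> build41 = Map.of("sqlPerRequest", 12.0, "exceptions", 3.0);
        Map<String, Double> build42 = Map.of("sqlPerRequest", 55.0, "exceptions", 3.0);
        // A non-empty result would fail the build at the quality gate.
        System.out.println("Regressed metrics: "
                + regressions(build41, build42, 0.2));
        // Regressed metrics: [sqlPerRequest]
    }
}
```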

33 #Dynatrace

And finally make PERFORMANCE part of our Continuous Delivery Process

(Pipeline: Developers → Commit Stage → Automated Acceptance Testing → Automated Capacity Testing → Release)

34 #Dynatrace

Performance Scalability

35 #Dynatrace

Collaborate · Verify · Measure

36 #Dynatrace

When CAN we find performance problems?

37 #Dynatrace

(Pipeline diagram: Developers → Unit/Integration Tests → Acceptance Tests → Capacity Tests → Release)

38 #Dynatrace

Who Cares About Performance?

Developers?

Architects?

Testers?

Operators?

Business?

39 #Dynatrace

Everyone!

Developers

Architects

Testers

Operators

Business

40 #Dynatrace

But remember:

41 #Dynatrace

Check out our trial: http://bit.ly/jaxtrial

Stop by the Dynatrace booth!
