Continuous performance: Load testing for developers with Gatling


Continuous performance testing
Tim van Eijndhoven - JPoint
@timveijndhoven


Tim van Eijndhoven

Software craftsman @ JPoint


Performance testing should be part of the process

Performance testing traditionally…

…happens several times per year

…and/or at major releases

…is performed by specialists

This has some cons:

- changes were made long ago
- many different code changes
- at a certain moment in time
- when is a test required?

Performance testing traditionally:

design → write code → test code → performance test → release
(unit tests and integration tests run during the test code step)

With continuous delivery:

design → write code → test code → release
(unit tests, integration tests and performance tests all run during the test code step)

Continuous Delivery demands code to be… …with regards to performance:

- Always production ready: performance has to be under control
- Short feedback cycles: effects of changes should be clear ASAP
- Maintained by self-supporting teams: no external specialists

Part of the process

With the same level of support as:

- Unit tests and integration tests

- Continuous Integration

- Zero-downtime deploys

Performed by the development team

Performance testing process

Design → Record → Operationalise → Execute → Report

Designing scenarios

Generic tests used to test core functionality

Specialised tests used to test specific features
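To make this concrete, here is a minimal sketch of what the two kinds of scenarios could look like in Gatling's Scala DSL, assuming the computer-database demo application used later in the talk; the paths, scenario names, and user counts are illustrative assumptions, not material from the slides.

package computerdatabase

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._

class ScenarioDesignSketch extends Simulation {

  val httpConf = http.baseURL("http://computer-database.gatling.io")

  // Generic scenario: covers core functionality that every release must keep fast.
  val browse = scenario("Browse computers")
    .exec(http("home").get("/"))
    .pause(2)
    .exec(http("next page").get("/computers?p=1"))

  // Specialised scenario: targets one specific feature.
  val search = scenario("Search computers")
    .exec(http("search").get("/computers?f=macbook"))

  setUp(
    browse.inject(atOnceUsers(10)),
    search.inject(atOnceUsers(5))
  ).protocols(httpConf)
}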


Tool support is key for performance test adoption

Gatling

package computerdatabase // 1

import io.gatling.core.Predef._ // 2
import io.gatling.http.Predef._
import scala.concurrent.duration._

class BasicSimulation extends Simulation { // 3

  val httpConf = http // 4
    .baseURL("http://computer-database.gatling.io") // 5
    .acceptHeader("text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8") // 6
    .doNotTrackHeader("1")
    .acceptLanguageHeader("en-US,en;q=0.5")
    .acceptEncodingHeader("gzip, deflate")
    .userAgentHeader("Mozilla/5.0 (Windows NT 5.1; rv:31.0) Gecko/20100101 Firefox/31.0")

  val scn = scenario("BasicSimulation") // 7
    .exec(http("request_1") // 8
      .get("/")) // 9
    .pause(5) // 10

  setUp( // 11
    scn.inject(atOnceUsers(1)) // 12
  ).protocols(httpConf) // 13
}
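To fit the continuous delivery pipeline described earlier, the same kind of simulation can ramp up more users and declare assertions, so a CI build fails when a threshold is breached. A minimal sketch assuming a Gatling 2.x style API; the injection profile and thresholds are made-up illustrations, not values from the talk.

package computerdatabase

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._

class CiGateSketch extends Simulation {

  val httpConf = http.baseURL("http://computer-database.gatling.io")

  val scn = scenario("CiGateSketch")
    .exec(http("home").get("/"))
    .pause(1)

  setUp(
    scn.inject(
      atOnceUsers(10),                  // small initial burst
      rampUsers(100) over (60 seconds)  // then ramp up gradually
    )
  ).protocols(httpConf)
   .assertions(
     global.responseTime.max.lessThan(2000),           // max response time in ms (illustrative)
     global.successfulRequests.percent.greaterThan(95) // at least 95% of requests must succeed
   )
}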

Alternative tools

And many more…

Gatling core concepts

Scenario: a sequence of HTTP requests used to simulate application usage

Feeder: a tool used to fill request parameters

Recorder: the tool used to record HTTP requests, or to take a HAR file and convert it to Gatling DSL
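For example, a feeder can pull request parameters from a CSV file so that each virtual user sends different data. A minimal sketch assuming a Gatling 2.x style API; the file search_terms.csv and its searchTerm column are hypothetical.

package computerdatabase

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class FeederSketch extends Simulation {

  val httpConf = http.baseURL("http://computer-database.gatling.io")

  // Hypothetical CSV with a "searchTerm" header row and one term per line,
  // picked in random order for each virtual user.
  val searchTerms = csv("search_terms.csv").random

  val scn = scenario("Search with feeder")
    .feed(searchTerms)                    // puts "searchTerm" into the virtual user's session
    .exec(http("search")
      .get("/computers?f=${searchTerm}")) // Gatling EL resolves the value per user

  setUp(scn.inject(atOnceUsers(10))).protocols(httpConf)
}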

Scenario to outcome

Browser → HAR file → Gatling recorder → DSL → custom changes → DSL → Gatling → Report
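The "custom changes" step typically means editing the recorded DSL by hand, for example replacing a hard-coded id captured by the recorder with a value extracted from an earlier response. A hedged sketch of such an edit, assuming the computer-database demo application; the regex and ids are illustrative, not recorded output from the talk.

package computerdatabase

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class EditedRecordingSketch extends Simulation {

  val httpConf = http.baseURL("http://computer-database.gatling.io")

  // As recorded (illustrative), the request carries a hard-coded id:
  //   .exec(http("details").get("/computers/381"))
  // After the custom change, the id is extracted from the listing page and
  // reused, so the scenario survives changes in the data set.
  val scn = scenario("Browse a computer")
    .exec(http("list")
      .get("/computers")
      .check(regex("""/computers/(\d+)""").saveAs("computerId")))
    .exec(http("details")
      .get("/computers/${computerId}"))

  setUp(scn.inject(atOnceUsers(5))).protocols(httpConf)
}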

Reports


Demo

Interpretations

We can:
- See how changes affect performance
- Get feedback on performance within a short amount of time

We cannot:
- Determine which load the application can endure

Results so far

- Prevented issues in production at least once
- Discovered a configuration error in the test infrastructure
- Helped test performance fixes
- Helped track down and validate a workaround for a memory leak

Further development

Automate recording process

Reuse functional test scenarios as performance tests

for example by leveraging tools like Cucumber/Protractor/FitNesse/BrowserStack to specify and record a scenario

Eliminate custom code changes

Some take-aways

- Performance testing should be part of the development cycle, at the same level as unit and integration tests
- Take changes in your frontend into account with regards to application performance
- Gatling is an awesome, programmer-friendly tool for load testing
- Using the approach from this presentation we can only monitor trends in performance, not determine which load the application can endure


Questions?
