

Automated Performance Testing


DESCRIPTION

If you are like most test-driven developers, you write automated tests for your software to get fast feedback about potential problems. Most of the tests you write verify the functional behaviour of the software: when we call this function or press this button, the expected result is that value or that message. But what about non-functional behaviour, such as performance: when we run this query, the results should come back within that many milliseconds. It is important to be able to write automated performance tests as well, because they can give us early feedback about potential performance problems. But expected performance is not as clear-cut as expected results. Expected results are either correct or wrong. Expected performance is more like a threshold: if the performance is worse than this, we want the test to fail.
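For example, a threshold-style performance test in C# with NUnit might look like the following minimal sketch; the 200 ms threshold and the RunSearchQuery call are illustrative assumptions, not from the talk:

using System.Diagnostics;
using NUnit.Framework;

[Test]
public void SearchQuery_MeetsPerformanceThreshold()
{
    // Fail only when performance is worse than the agreed threshold
    var stopwatch = Stopwatch.StartNew();
    RunSearchQuery(); // hypothetical call to the system under test
    stopwatch.Stop();
    Assert.That(stopwatch.ElapsedMilliseconds, Is.LessThan(200));
}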


Page 1: Automated Performance Testing

Automated Performance Testing
Lars Thorup
ZeaLake Software Consulting

May 2012

Page 2: Automated Performance Testing

Who is Lars Thorup?

● Software developer/architect
● C++, C# and JavaScript
● Test Driven Development

● Coach: teaches agile and automated testing

● Advisor: assesses software projects and companies

● Founder and CEO of BestBrains and ZeaLake

Page 3: Automated Performance Testing

We want to know when performance drops
● ...or improves :-)

● Examples
  ● this refactoring means the cache is no longer used for lookups
  ● introducing this database index on that foreign key is way faster

● Write a test to measure performance

● When and how should the test fail?

// client: a WebClient pointed at the service under test
var stopwatch = Stopwatch.StartNew();
for (int i = 0; i < 200; ++i)
{
    var url = string.Format("Vote?text={0}", Guid.NewGuid());
    var response = client.DownloadString(url);
    Assert.That(response, Is.Not.Empty); // response is a string, so check it is non-empty
}
stopwatch.Stop();
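The elapsed time can then be turned into a throughput figure; presumably something like the following, where 200 matches the loop count above, yields the requestsPerSecond value asserted on the next slide:

var requestsPerSecond = 200 * 1000.0 / stopwatch.ElapsedMilliseconds;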

Page 4: Automated Performance Testing

We cannot use assert for this

● But the resulting time can vary widely
  ● range too narrow: the test often fails even though performance is fine
  ● range too broad: real performance regressions slip through unnoticed

Assert.That(requestsPerSecond, Is.InRange(40, 50));

Page 5: Automated Performance Testing

Use trend curves instead

● Does not fail automatically :-(
  ● unless we add automated trend line analysis (see the sketch after this list)

● Need manual inspection
  ● weekly, before every release
  ● and it takes only 10 seconds

● So the feedback is not fast
  ● but it shows which commit caused the performance issue
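What automated trend line analysis could look like, as a sketch: assume the elapsed times of recent builds are available as a list, and fail when the newest measurement regresses more than a chosen tolerance. The 20% factor here is an illustrative assumption:

using System.Collections.Generic;
using System.Linq;
using NUnit.Framework;

public static class TrendCheck
{
    // Compare the newest measurement against the average of the
    // earlier runs; the tolerance factor absorbs normal noise.
    // Assumes at least two measurements are available.
    public static void AssertNoRegression(IReadOnlyList<double> elapsedMs,
                                          double maxRegressionFactor = 1.2)
    {
        var latest = elapsedMs.Last();
        var baseline = elapsedMs.Take(elapsedMs.Count - 1).Average();
        Assert.That(latest, Is.LessThan(baseline * maxRegressionFactor));
    }
}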

Page 6: Automated Performance Testing

Demo: TeamCity from JetBrains
● Make the tests output their results

● In an .xml file

● Configure TeamCity to convert the data to graphs

● Read more here: http://www.zealake.com/2011/05/19/automated-performance-trends/

stopwatch.Stop();
PerformanceTest.Report(stopwatch.ElapsedMilliseconds);

<build>
  <statisticValue key='Voting' value='667'/>
  <statisticValue key='PerfGetEvent' value='3689'/>
</build>
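PerformanceTest.Report is a helper not shown in the slides; one plausible implementation, assuming it appends statisticValue entries to TeamCity's teamcity-info.xml file, and taking the key as an explicit parameter here, whereas the slide's one-argument call presumably derives the key from the test name:

using System.IO;
using System.Xml.Linq;

public static class PerformanceTest
{
    // Append one measurement as a custom TeamCity build statistic
    public static void Report(string key, long elapsedMilliseconds)
    {
        const string path = "teamcity-info.xml";
        var doc = File.Exists(path)
            ? XDocument.Load(path)
            : new XDocument(new XElement("build"));
        doc.Root.Add(new XElement("statisticValue",
            new XAttribute("key", key),
            new XAttribute("value", elapsedMilliseconds)));
        doc.Save(path);
    }
}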

Page 7: Automated Performance Testing

Demo: Jenkins
● Make the tests output their results in CSV files

● Use the Plot plugin
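A sketch of what the test output could look like, assuming one small CSV file per metric in the two-line format the Plot plugin reads (a header row with the series label, then a row with the value); the file name is illustrative:

using System.IO;

// stopwatch: the Stopwatch from the measurement above.
// Write one measurement as a two-line CSV file for the Plot plugin.
File.WriteAllLines("perf-voting.csv", new[]
{
    "Voting",                                  // series label
    stopwatch.ElapsedMilliseconds.ToString()   // measured value in ms
});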