Automated Performance Testing
Lars Thorup, ZeaLake Software Consulting
May, 2012
Who is Lars Thorup?
● Software developer/architect
● C++, C# and JavaScript
● Test Driven Development
● Coach: Teaching agile and automated testing
● Advisor: Assesses software projects and companies
● Founder and CEO of BestBrains and ZeaLake
We want to know when performance drops
● ...or improves :-)
● Examples
● this refactoring means the cache is no longer used for lookups
● introducing this database index on that foreign key is way faster
● Write a test to measure performance
● When and how should the test fail?
var stopwatch = Stopwatch.StartNew();
for (int i = 0; i < 200; ++i)
{
    var url = string.Format("Vote?text={0}", Guid.NewGuid());
    var response = client.DownloadString(url);
    Assert.That(response, Is.Not.Empty);
}
stopwatch.Stop();
We cannot use assert for this
● But the resulting time can vary widely
● range too narrow: many spurious failures
● range too broad: many missed regressions
Assert.That(requestsPerSecond, Is.InRange(40, 50));
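The requestsPerSecond figure asserted above can be derived from the stopwatch in the measurement loop; a minimal sketch (the helper name is an assumption, and 200 matches the loop's request count):

```csharp
using System;

public static class Throughput
{
    // Convert a request count and an elapsed time in milliseconds
    // into a requests-per-second figure.
    public static double RequestsPerSecond(int requestCount, long elapsedMilliseconds)
    {
        return requestCount * 1000.0 / elapsedMilliseconds;
    }
}

// Usage with the earlier loop of 200 requests:
// var requestsPerSecond = Throughput.RequestsPerSecond(200, stopwatch.ElapsedMilliseconds);
```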
Use trend curves instead
● Does not fail automatically :-(
● unless we add automated trend line analysis
● Need manual inspection
● weekly, before every release
● and it takes only 10 seconds
● So the feedback is not fast
● but shows which commit caused the performance issue
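The automated trend line analysis mentioned above could, for example, fit a least-squares line through recent measurements and flag a sustained slowdown; a hypothetical sketch (the class and method names and the slope threshold are assumptions, not from the slides):

```csharp
using System;
using System.Linq;

public static class TrendAnalysis
{
    // Least-squares slope of a series of timings, in ms per build.
    public static double Slope(double[] timingsMs)
    {
        int n = timingsMs.Length;
        double meanX = (n - 1) / 2.0;
        double meanY = timingsMs.Average();
        double covariance = 0, variance = 0;
        for (int x = 0; x < n; ++x)
        {
            covariance += (x - meanX) * (timingsMs[x] - meanY);
            variance += (x - meanX) * (x - meanX);
        }
        return covariance / variance;
    }

    // Flag a regression when timings rise faster than
    // thresholdMsPerBuild; the threshold must be tuned per test.
    public static bool IsRegressing(double[] timingsMs, double thresholdMsPerBuild)
    {
        return Slope(timingsMs) > thresholdMsPerBuild;
    }
}
```

A build could run this over the last N reported values and fail only on a sustained upward slope, which avoids the false alarms of a fixed assert range.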
Demo: TeamCity from JetBrains
● Make the tests output their results
● In an .xml file
● Configure TeamCity to convert the data to graphs
● Read more here
● http://www.zealake.com/2011/05/19/automated-performance-trends/
stopwatch.Stop();
PerformanceTest.Report(stopwatch.ElapsedMilliseconds);
<build>
  <statisticValue key='Voting' value='667'/>
  <statisticValue key='PerfGetEvent' value='3689'/>
</build>
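PerformanceTest.Report itself is not shown in the slides; one possible implementation (all names, the two-argument signature and the output path are assumptions — the slide's one-argument call presumably derives the key from the test name) collects the values and writes the <build> XML above:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Xml.Linq;

public static class PerformanceTest
{
    static readonly Dictionary<string, long> Stats = new Dictionary<string, long>();

    // Record one measurement under a statistic key.
    public static void Report(string key, long elapsedMilliseconds)
    {
        Stats[key] = elapsedMilliseconds;
    }

    // Write all recorded statistics as the <build>/<statisticValue>
    // XML that TeamCity reads from the build.
    public static void Save(string path)
    {
        var build = new XElement("build");
        foreach (var stat in Stats)
        {
            build.Add(new XElement("statisticValue",
                new XAttribute("key", stat.Key),
                new XAttribute("value", stat.Value)));
        }
        build.Save(path);
    }
}
```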
Demo: Jenkins
● Make the tests output their results in CSV files
● Use the Plot plugin
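For Jenkins, the CSV output could be one small file per measurement, in a header-row-plus-value-row layout that the Plot plugin can read; a minimal sketch (the helper name and file layout are assumptions):

```csharp
using System.IO;

public static class CsvReport
{
    // Write one measurement as a two-line CSV file: a header row with
    // the series label and a data row with the measured value.
    public static void Write(string path, string label, long elapsedMilliseconds)
    {
        File.WriteAllLines(path, new[] { label, elapsedMilliseconds.ToString() });
    }
}

// Usage after a test run:
// CsvReport.Write("perf/Voting.csv", "Voting", stopwatch.ElapsedMilliseconds);
```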