
Connecting Business and Testing with Test Analytics

Twitter: @paul_gerrard

Paul Gerrard
paul@gerrardconsulting.com
gerrardconsulting.com

Agenda

• Agile is Transitional
• Business Experimentation and Feedback
• Which Methodology?
• Things are Changing (everything)
• Big Data and a Definition
• A New Model for Testing
• A Model for Test Analytics
• Getting Started/Baby Steps
• Close

Agile is Transitional

Intro

• Online and mobile businesses have used highly automated processes for some years
  – SMAC environments (social, mobile, analytics, cloud)

• Spec by Example, BDD, ATDD, DevOps Continuous Delivery gaining momentum

• But (familiar story?) ...
  – Most successes in greenfield environments
  – Some people are faking it
  – Not much yet in legacy.

Agile is for some, not all

• Agile (with a capital A)
  – works for teams with clear goals, autonomy, business insight and the skills to deliver
  – supports team-based, manual and what might also be called social processes

• The automated processes of Continuous Delivery and DevOps require a different perspective.

Post-Agile?

• Business agility, based on IT needs:
  – Connected processes ...
  – ... supported by high levels of automation ...
  – ... driven by analytics

• This won’t be achieved until we adopt what might be called a Post-Agile approach

• This is what I want to explore.

Business Experimentation and Feedback

Emerging trend in business: data-driven experimentation

Where to put the paths?

• A new college campus was built, but one thing was still debated:
  – Where in the grass should we put the paved walkways?
  – Some felt the walkways should be around the edges
  – Some felt the walkways should cut diagonally, connecting all buildings to all buildings

• One professor had the winning idea...

The winning idea

• "Don't make any walkways this year. At the end of the year, look at where the grass is worn away, showing us where the students are walking. Then just pave those paths"
• A/B, Split, Bucket, Canary tests use the same principle
• Give people a choice; then eliminate the least popular.
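That principle can be sketched in a few lines of Python. This is not from the talk: the variant names and session records are made up, purely to illustrate "offer a choice, measure usage, keep the most popular".

```python
# Minimal sketch (illustrative data): an A/B comparison in the spirit of
# "pave the worn paths" - show variants, measure use, keep the winner.
from collections import Counter

# One record per session: which variant the user saw and whether they converted
sessions = [
    ("A", True), ("A", False), ("B", True), ("B", True),
    ("A", False), ("B", True), ("A", True), ("B", False),
]

shown = Counter(variant for variant, _ in sessions)
converted = Counter(variant for variant, ok in sessions if ok)

# Conversion rate per variant; the least popular path gets "paved over"
rates = {v: converted[v] / shown[v] for v in shown}
print(max(rates, key=rates.get), rates)
```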

Feedback

"We all need people who will give us feedback. That's how we improve." Bill Gates

"I think it's very important to have a feedback loop, where you're constantly thinking about what you've done and how you could be doing it better."Elon Musk

"The shortest feedback loop I know is doing Q&A in front of an audience."Me.

Business Experiments are just tests, aren't they?

Can you see the clown?

What are they testing here?

Business experiments aren't so dramatic

• In the context of online businesses, experiments are a way of evolving systems
• "Survival of the fittest"
• Experiments are designed
• Measurement process defined
• System is built, tested and deployed
• Measurements are captured, analysed and acted on.

Which Methodology?

Feedback loops (NOT to scale)

[Diagram: feedback loops and their cadences – Sprint (2-4 weekly), Daily Scrum (daily), Continuous Integration (hourly), Unit Test (minutes), Pair Programming (seconds)]

Feedback ranges from personal observations (pairing), through automated unit, build/integration tests, user tests and team retrospectives

Three development patterns

[Diagram: the three patterns – Structured, Agile, Continuous – positioned against the labels Goal-Based, Hi-Process and Autonomous]

Profiles of the three patterns

Characteristic   Structured          Agile              Continuous
Structure        Managed             Autonomous         Production Cell
Pace/cadence     Business decision   Team decision      Feedback
Leadership       Project Managed     Guided Research    Line Managed
Definition       Fixed spec          Dynamic spec       Live Specs
Testing          Scripted            Exploratory        Automated
Auto. Test       Retrospective       Developer led      Pervasive
Measurement      Pervasive           Avoided            Analytics
Governance       Bureaucratic        Trust-based        Electronic

Things are Changing (Everything)

In the Internet of Everything, everything is a thing

• In the IoE, things are sensors, actuators, controllers, aggregators, filters, mobile devices, apps, servers, cars, aeroplanes, cows, sheep, cups, spoons, knives, forks and ... people
• Our perception of systems is changing
• Our software projects are systems too
• Our development processes are things.

Our processes are things

• DevOps processes are triggered, controlled, monitored

• Process dependency and communication similar to Machine to Machine (M2M) communications

• ‘Publish and subscribe’ messaging model is more appropriate than 'command and control'

Our processes have (or are) sensors

• DevOps outputs report outcomes and status
• These outcomes are the payloads of messages from processes, collected by instrumentation in development and production
• The data collected is the basis of the analytics that drive software and business stakeholder decision-making
• (BTW we've created our own MQTT/MongoDB server at gerdle.com)
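To make the "processes as sensors" idea concrete, here is a minimal sketch of a test process publishing its outcome as a message. It is not from the talk: it assumes a reachable MQTT broker (the hostname is a placeholder) and uses the paho-mqtt client library; the topic name and payload fields are illustrative.

```python
# Minimal sketch (assumptions: broker hostname, topic and payload fields are
# placeholders): a test process reporting its outcome over MQTT so that an
# analytics collector can subscribe to it.
import json
import time

import paho.mqtt.publish as publish

# Payload: the outcome and status reported by a (hypothetical) test process
payload = {
    "process": "integration-tests",
    "build": "1234",
    "passed": 412,
    "failed": 3,
    "timestamp": time.time(),
}

# Publish a single message; anything listening on the topic collects it
publish.single("devops/test-results", json.dumps(payload),
               hostname="broker.example.com", qos=1)
```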

Using message buses for DevOps

http://sawconcepts.com

Big Data (and a Definition)

Big Data is what Test Analytics collects

• Most isn't BIG, it's just bigger than before
• More interesting is its lack of structure
• Mayer-Schonberger and Cukier:
  1. Traditionally, we have dealt with small samples
  2. We had to be 'exact'
  3. We were obsessed with causality
• Looking at the entire data set allows us to see details that we never could before
• Correlation, not causality.

Performance testers do 'Big Data'

• Load/response time graphs
• Graphs of system resource (network, CPU, memory, disk, DB, middleware, services etc.)
• Usually, the tester or operations analyst has to manage and merge data from a variety of sources in a variety of unstructured formats
• Common theme: it is all 'time-series' data
• Performance testing and analysis is a Big Data discipline and an example of Test Analytics.
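As an illustration of that merging step (not from the talk), here is a minimal pandas sketch that puts two unrelated data sources onto one time axis. The file names and column names are assumptions; any load-test log and monitoring export with timestamps would do.

```python
# Minimal sketch (file and column names are assumptions): merging time-series
# data from two sources - load-test response times and CPU samples.
import pandas as pd

# Response times logged by the load-test tool
responses = pd.read_csv("response_times.csv", parse_dates=["timestamp"])
# CPU utilisation sampled by a monitoring agent
cpu = pd.read_csv("cpu_samples.csv", parse_dates=["timestamp"])

# Resample both streams onto one-minute intervals so they can be joined
resp_1m = responses.set_index("timestamp")["response_ms"].resample("1min").mean()
cpu_1m = cpu.set_index("timestamp")["cpu_pct"].resample("1min").mean()

merged = pd.concat({"response_ms": resp_1m, "cpu_pct": cpu_1m}, axis=1)

# A single correlation figure: do response times rise with CPU load?
print(merged.corr())
```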

My thesis

• Data captured throughout a test and assurance process should be merged and integrated with:
  – Definition data (requirements and design information)
  – Production monitoring data
• … and analysed in interesting and useful ways
• Monitoring starts in development and continues into production.

A (my) Definition of Test Analytics

"The capture, integration and analysis of test and production monitoring data to inform business and software development decision-making".

Test Analytics

[Diagram: development and testing, plus 'testing' in production, feed analysis; analysis yields insight; insight drives decisions – insight to action]

Example: dictionary

[Diagram: a glossary of terms and its index – a requirement such as "A customer may search for an item using any text string. If a customer selects an item the system will …", its business story, features and scenarios are scanned, and every usage of a term such as 'customer' is indexed back to its glossary entry ("A customer is…"); a batch update refreshes the index for all requirements and stories. Glossary key: [Proposed Term], Defined Term - Not Approved, Defined Term - Approved, <Data Item> (scenarios only).]

A New Model for Testing

Test Analytics is based on models

[Diagram: judgement, exploring and testing – exploring (sources) creates test models; judgement decides whether our model(s) are adequate; testing (the system) uses the test models; inadequate models send us back to exploring]

We explore sources of knowledge to build test models that inform our testing

BTW – Do Developers explore the same way? I think so.

Feedback loops

New Model Testing

29-page paper: http://dev.sp.qa/download/newModel

Don't ALM tools already do TA?

• Not really.

A Model for Test Analytics

A model for Test Analytics

• There is no fixed view
  – What does Test Analytics cover?
  – How should tools modularise their functionality to provide the required features?
• This is a vision for how the data MIGHT be captured across what are often siloed areas of an organisation.
• Six main areas
  – Approximately sequential, but parallel changes in all six areas are possible
  – Silo is a convenient term for the data contained within it.

Data silos for test analytics (an illustration)

Stakeholder
• Stakeholders
• Business Goals and Measures
• Stakeholder involvement/engagement
• Risk

Requirements
• Requirements
• Story/feature descriptions
• Glossary of terms and Term Usage
• Processes
• Process Paths (workflows)
• Feature change history

Development
• Code commits
• Unit Test Count
• Code Merges
• Code Complexity
• Code Coverage
• Fault density
• Code Churn

Assurance
• Manual Tests
• Generated test code (unit, integration, system)
• Application Instrumentation
• Automated Test Execution History
• Test Status

Production Application Monitoring
• Application Process Flows
• User Accesses/Activities
• Feature Calls, Response Times
• Interface calls
• Application Alerts/Failures
• Database Content
• Production Failures

Production Environment Monitoring
• System Assets
• Resource Usage Logs
• Performance Data
• System Events
• System Alerts/Failures/Incidents
• Outages

Getting Started

I'm sorry – I wish I had the time

Method

• What kind of analyses are possible?
• What decisions could these analyses support?
• Imagine you have captured data in the areas noted above
  – In each silo the data items can be related
  – In principle, we can link any data to any other data (through a potentially complicated set of relations)
• But in a Big Data analysis, you are not obliged to figure out these relations
• The history of two aspects of this data over time might reveal a pattern
• Think, "correlation rather than causality"
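A minimal sketch of that kind of look (not from the talk): correlate two aspects of the captured history, such as weekly code churn and defects found. The file name and column names are assumptions.

```python
# Minimal sketch (file and column names are assumptions): a pattern hunt over
# two aspects of the data - weekly code churn and defect counts.
import pandas as pd

history = pd.read_csv("weekly_metrics.csv", parse_dates=["week"])

# Pearson correlation: does churn move with defects found? (correlation,
# not causality - a high value is a prompt to investigate, not a verdict)
print(history["code_churn"].corr(history["defects_found"]))

# A lagged view: does this week's churn correlate with next week's defects?
print(history["code_churn"].corr(history["defects_found"].shift(-1)))
```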

Start with baby-steps

• Create a glossary of terms
• Scan your requirements documents, search for each glossary term and load the usage references to another database table (a sketch of this step follows below)
• Repeat for your stories and tests
• You can now link requirements, stories and tests. If you can't – you have an interesting requirements anomaly:
  – How are requirements and examples communicated if they don't share common terminology or language?
  – Cross reference (or use common names for) your features in requirements and stories and your tests.
• Ask developers to generate a log of usage of all or key features in their application code
• Use the same common feature names or cross reference.
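Here is a minimal sketch of the scanning step. It is not from the talk: the glossary entries, file layout, database, table and column names are all illustrative assumptions.

```python
# Minimal sketch (all names are assumptions): scan requirements documents for
# each glossary term and load the usage references into a database table.
import re
import sqlite3
from pathlib import Path

conn = sqlite3.connect("test_analytics.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS term_usage "
    "(term TEXT, document TEXT, line_no INTEGER, line TEXT)"
)

glossary = ["customer", "item", "order"]   # taken from your glossary of terms
requirements_dir = Path("requirements")    # folder of plain-text requirements

for doc in requirements_dir.glob("*.txt"):
    for line_no, line in enumerate(doc.read_text().splitlines(), start=1):
        for term in glossary:
            # Whole-word, case-insensitive match for the glossary term
            if re.search(rf"\b{re.escape(term)}\b", line, re.IGNORECASE):
                conn.execute(
                    "INSERT INTO term_usage VALUES (?, ?, ?, ?)",
                    (term, doc.name, line_no, line.strip()),
                )

conn.commit()
# Repeat the same scan for stories and tests; the term_usage table then links
# requirements, stories and tests through shared terminology.
```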

Some simple reports that you could produce from your database

I'd rather show you some video:

https://www.youtube.com/watch?v=cNBtDstOTmA

Close

• DevOps leads the way but I think most organisations have an opportunity here

• Treat your processes as 'things' with sensors

• Treat your data as fallible and explore it
• Experiment; ask questions of your data
• Forget scientific analyses until your data is 100% automatically generated 100% reliably

• It (probably) never will be, so don’t worry.

Is Test Analytics Worth the Effort?

I'm afraid you'll have to try it for yourself

Connecting Business and Testing with Test Analytics

Twitter: @paul_gerrard

Paul Gerrard
paul@gerrardconsulting.com
gerrardconsulting.com
