Welcome to the Keep Houston Agile Workshop
• Please put your phone on silent mode
• Note: This is an intermediate class
• Q&A will be taken at the close of this presentation
Lean Principle #2: Build Quality In
Speaker: Allan Watty
Key Objectives
We will cover:
• What do we mean by the phrase “Build Quality In”?
• Why is this important?
• Key Practices to get us there
• Obstacles along the way
• Reference information
Concepts
Lean Software Principles are derived from Lean Manufacturing Principles:
1. Eliminate Waste
2. Build Quality In
3. Create Knowledge
4. Defer Commitment
5. Deliver Fast
6. Respect People
7. Optimize the Whole
Concepts
Why is this important?
Bank Statements sent to the Wrong People
Bar exam software failure sets off wave of lawsuits
London Airspace Closed Due to Software Malfunction
Software Glitch Accidentally Releases Prisoners
Software errors cost the U.S. economy $60 billion annually in rework, lost productivity and actual damages.
Why is this important?
• Avoid a lengthy end-of-development cycle: test, find defects, fix, re-test, repeat…
• Reduce the costs and delays associated with recalls, rework, and defect fixing.
• Avoid trying to test quality in at the end.
• Find ways to build quality into our products and projects from the start.
• “Quality is not something that can be added to a product. Trying to add quality after the product has been built would be like adding baking powder to a cake after the cake has been baked. It doesn’t work.” (Alexander Tarlinder)
[Timeline: Sprints 1–4, two weeks each of implementation plus sprint review, retro, and planning, followed by a multi-week Hardening Sprint of final regression, bug fixing, and re-testing]
Assumptions
[Chart: features A–F accumulate incrementally, sprint by sprint, from Sprint 1 through Sprint 4 to GA]
We are doing Agile development:
• Incremental, iterative, adaptive
• 2–3 week sprints include development and testing
• Plan as you go; progress is shown with working software
• Features are decomposed into Epics and Stories
• Based on an empirical process control model
[Timeline: Sprints 1–4, two weeks each; every sprint contains implementation plus sprint review, retro, and planning]
Whole Team Approach to Quality
The stereotypical animosity between dev and QA is counterproductive!
Whole Team Approach to Quality
Foundation of A Great Team
Patrick Lencioni: The Five Dysfunctions of a Team
A Whole Team Approach To Quality
• Whole team is responsible for quality and testing
• Daily collaboration
• Testing is a first-class, central practice
• Building shared understanding of features
• Release & Sprint planning
• No silos!
• Start and finish together
• Guiding development with examples and tests
• Learning together from customers
• Common definition of “READY”
• Common definition of “DONE”
Proactive collaboration
Close collaboration among Developers, QE, and Product Owners:
• The power of 3+
• Shared ownership of the story
• Shepherd a story/feature from inception to completion
• Technical reviews: stories, design, code, test cases, automation scripts, defect triage, etc.
• Everyone tests!
• Shared understanding of what is being built and the conditions of success
• Nothing is thrown over the wall – there is no wall!
• Work at a sustainable pace
• Start and finish together
Acceptance Criteria
• Story Grooming is key
• Time for grooming must be allocated each sprint
• Grooming sessions are led by the PO with Dev and QE, plus other roles as needed
• Reviewed by the Three Amigos before the sprint starts
• Engineering stories need to be groomed!
• Additional story info: wireframes, constraints, architecture runway
• Backlog of “ready” stories two sprints ahead.
Use Gherkin format for Acceptance Criteria
As an internet banking customer, I want to see a rolling balance for my everyday accounts, so that I know the balance of my account after each transaction is applied.
Acceptance Criteria 1:
Given that I have accessed my account
When I click on my account balance
Then the latest rolling balance is displayed
Acceptance Criteria 2:
Given that I have accessed my account
And my balance was $10,500.00
When I withdraw $500.00 from my account
Then the balance is recalculated to display the latest total of $10,000.00
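Assuming the intent of Acceptance Criteria 2 is that the rolling balance is recalculated to $10,000.00 after the withdrawal, the criteria map directly onto an automated check. The `Account` class below is invented for illustration and is not part of the presentation:

```python
# Hypothetical Account class illustrating how the acceptance
# criteria above translate into automated checks.
class Account:
    def __init__(self, balance=0.0):
        self.balance = balance

    def withdraw(self, amount):
        self.balance -= amount

    def rolling_balance(self):
        return self.balance


# Acceptance Criteria 2: withdrawing $500.00 from a $10,500.00
# balance recalculates the rolling balance to $10,000.00.
account = Account(balance=10_500.00)
account.withdraw(500.00)
assert account.rolling_balance() == 10_000.00
```

Encoding each criterion as an executable check like this is what lets the team treat acceptance criteria as tests rather than prose.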
Care about code quality
• “Lowering quality lengthens development time.” (First Law of Programming)
• The quality of code is inversely proportional to the effort it takes to understand it
• Simple Design:
➢ Passes the tests
➢ Reveals intention
➢ No duplication
➢ Fewest elements
➢ https://martinfowler.com/bliki/BeckDesignRules.html
• Good code should read like a story, not like a puzzle.
• Code must be written for people to read.
“We can’t be agile if our code sucks.” (Venkat Subramaniam)
Test Driven Development
Cycle:
• Write a failing test
• Write the code to make the test pass
• Refactor the code to make it clean
Goal:
• TDD is a practice not a process
• TDD is about good design
• Automated tests are a side effect
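One red-green-refactor pass might look like this (FizzBuzz is a stand-in example, not from the talk): the test is written first and fails, the simplest code is written to make it pass, and the code is then cleaned up while the tests stay green.

```python
import unittest

# Step 1 (red): write the failing tests first.
class TestFizzBuzz(unittest.TestCase):
    def test_multiples_of_three_return_fizz(self):
        self.assertEqual(fizzbuzz(3), "Fizz")

    def test_multiples_of_five_return_buzz(self):
        self.assertEqual(fizzbuzz(5), "Buzz")

    def test_other_numbers_return_the_number(self):
        self.assertEqual(fizzbuzz(7), "7")

# Step 2 (green): write the simplest code that makes the tests pass.
# Step 3 (refactor): clean up while keeping the tests green.
def fizzbuzz(n):
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)
```

The tests that accumulate from each cycle are the "side effect": a regression suite that documents the intended behavior.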
Design for Testability
• How can I make it easy for all features to be:
➢ Tested
➢ Debugged in production
• How can I make maintenance, enhancements, and integration easy?
➢ SOLID design principles
➢ Layered architecture
➢ Interface programming
➢ API contracts
• How do I avoid creating the big ball of mud?
➢ Domain Driven Design
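As one small illustration of designing for testability (the `Session` class and its clock are hypothetical, not from the talk): injecting a collaborator through the constructor lets a test substitute a deterministic fake for the real system clock.

```python
from datetime import datetime, timedelta

class Session:
    """A session that expires after a fixed time-to-live.

    The clock is injected rather than called directly, so tests
    can control "now" instead of depending on the real time.
    """
    def __init__(self, started_at, clock=datetime.utcnow, ttl_minutes=30):
        self.started_at = started_at
        self.clock = clock  # injected dependency
        self.ttl = timedelta(minutes=ttl_minutes)

    def is_expired(self):
        return self.clock() - self.started_at > self.ttl

# In a test, inject a deterministic "clock":
start = datetime(2016, 1, 1, 12, 0)
fake_now = lambda: datetime(2016, 1, 1, 12, 45)
assert Session(start, clock=fake_now).is_expired()
```

The same injection point also makes production debugging easier: the collaborator can be swapped for a logging or recording variant.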
SOLID Principles of Software Design
• Single responsibility
➢ A class should have one, and only one, reason to change.
• Open-closed
➢ You should be able to extend the behavior of a class, without modifying it.
• Liskov substitution
➢ Derived classes must be substitutable for their base classes.
• Interface segregation
➢ Make fine grained interfaces that are client specific.
• Dependency inversion
➢ Depend on abstractions, not on the concrete details of other classes.
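The open-closed principle, for example, can be sketched like this (the shape classes are illustrative): `total_area` gains support for new shapes by adding classes, never by editing existing code.

```python
from abc import ABC, abstractmethod
import math

class Shape(ABC):
    @abstractmethod
    def area(self): ...

class Rectangle(Shape):
    def __init__(self, w, h):
        self.w, self.h = w, h
    def area(self):
        return self.w * self.h

# New behavior arrives as a new class; total_area never changes.
class Circle(Shape):
    def __init__(self, r):
        self.r = r
    def area(self):
        return math.pi * self.r ** 2

def total_area(shapes):
    # Depends only on the Shape abstraction (dependency inversion),
    # so it is closed to modification but open to extension.
    return sum(s.area() for s in shapes)

assert total_area([Rectangle(2, 3), Rectangle(1, 1)]) == 7
```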
Domain Driven Design
• An approach that places emphasis on the domain model and carrying it into implementation.
• A shared ubiquitous language based on the problem domain binds analysis and code models.
• Focuses on core domain model and collaboration between business experts and development team.
• Maintaining domain model integrity helps avoid the Big Ball of Mud (BBOM).
Domain Driven Design
Body of Knowledge:
• Cliff Notes:
➢ Domain-Driven Design Quickly, by Abel Avram & Floyd Marinescu
• Original idea:
➢ Domain Driven Design by Eric Evans
• Concrete examples:
➢ Implementing Domain-Driven Design, by Vaughn Vernon
➢ Patterns, Principles, and Practices of Domain-Driven Design by Scott Millett and Nick Tune
Continuous Integration
• Frequent System-Level Integration and Testing
• Plan for integration tests with external systems
• Maintain consistent integration and testing infrastructure
Continuous Integration
• Implement automated builds on demand and on code check-in:
➢ Unit tests
➢ Acceptance tests
➢ Static code analysis
➢ Code coverage reports
➢ Successful deployment
➢ Automated UI smoke test
• Use thresholds to drive build failure
• Broken automated acceptance tests = broken build
• Stop the line for broken builds
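A sketch of how thresholds can fail a build (the metric names and numbers are assumptions, not from the talk): a gate script compares measured values against agreed minimums and exits non-zero, which the CI server treats as a broken build.

```python
import sys

# Agreed quality thresholds (illustrative values).
THRESHOLDS = {"coverage_pct": 80.0, "acceptance_pass_pct": 100.0}

def gate(report):
    """Return the list of threshold violations for a build report."""
    failures = []
    for metric, minimum in THRESHOLDS.items():
        if report.get(metric, 0.0) < minimum:
            failures.append(f"{metric}: {report.get(metric)} < {minimum}")
    return failures

if __name__ == "__main__":
    # In a real pipeline these numbers would come from the build's
    # coverage and test reports.
    report = {"coverage_pct": 83.5, "acceptance_pass_pct": 100.0}
    violations = gate(report)
    if violations:
        print("BUILD FAILED:", "; ".join(violations))
        sys.exit(1)  # non-zero exit = broken build, stop the line
    print("Build passed quality gates")
```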
Automation Focus
• Automated acceptance tests for all new features
• Behind the UI (API layer) tests
• Large scale performance and reliability tests
• Small UI Smoke Test to run against every build
• Reduce manual regression test time
➢ Test Data, deployment and environment preparation
➢ Automation of repetitive manual tests based on ROI
• Research:
➢ Progress on UI Automation tools (avoid snake oil salesmen)
➢ Google Test Automation Conference
➢ Embrace DevOps tools
Automating Acceptance Tests
[Diagram: ATDD/BDD cycle – the Customer drives acceptance criteria and their review; automated acceptance tests (via SBE/ATDD/BDD tools such as FitNesse, Cucumber, JBehave, Python behave, SpecFlow, Robot Framework) sit above code, unit tests, and refactoring (TDD)]
Automating Acceptance Tests
• Collaborative effort by developers and testers
• Should be done as part of story implementation
• Use any one of these methods:
➢ SBE: Specification by Example
➢ ATDD: Acceptance Test Driven Development
➢ BDD: Behavior Driven Development
• Broken automated acceptance tests = broken build!
• Automated tests become part of your regression suite
• Reduces risk when late changes are necessary
• Early warning sign of broken functionality
• Allows QE to focus on deeper, more difficult manual tests
Automating Acceptance Tests
• BDD is about communication and driving development
• Tests represent expectations of behavior the software should have, created before coding begins.
• Tests provide feedback to the team about how close we are to “Done” and provide living documentation for the application.
• Tests are written in plain, descriptive English
• Tests use examples that clarify requirements
Behavior Driven Development in Action
Behavior Driven Development in Action
BDD cycle key artifacts:
1. Feature files
2. Step/scenario definitions
Tools:
1. Java: JBehave
2. C#: SpecFlow
3. Ruby: Cucumber
4. Python: behave
Sample Feature File
As a [Shopper] I want [to put items in my shopping cart] so that [I can manage items before I check out]
Acceptance Criteria N (scenario)
Given I'm a logged-in User
When I go to the Item page
And I click "Add item to cart"
Then the quantity of items in my cart should go up
And my subtotal should increment
And the warehouse inventory should decrement
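With Python's behave (one of the tools listed above), each Given/When/Then line binds to a step definition. The sketch below uses an invented in-memory cart and warehouse model; in a real project these functions would live in a steps module and use behave's `@given`/`@when`/`@then` decorators with a shared context object.

```python
# Hypothetical model backing the shopping-cart scenario above.
class Warehouse:
    def __init__(self, stock):
        self.stock = stock

class Cart:
    def __init__(self):
        self.items = 0
        self.subtotal = 0.0

class Context:
    """Stand-in for behave's context, shared between steps."""
    pass

# Step definitions, one per Gherkin line:
def given_logged_in_user(context):
    context.cart = Cart()
    context.warehouse = Warehouse(stock=10)

def when_add_item_to_cart(context, price=25.0):
    context.cart.items += 1
    context.cart.subtotal += price
    context.warehouse.stock -= 1

def then_cart_reflects_the_add(context):
    assert context.cart.items == 1
    assert context.cart.subtotal > 0
    assert context.warehouse.stock == 9

# Running the scenario end to end:
ctx = Context()
given_logged_in_user(ctx)
when_add_item_to_cart(ctx)
then_cart_reflects_the_add(ctx)
```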
Agile Quadrants
Test Automation Pyramid
Exploratory Testing
Key practice to help drive, verify product quality
• Utilizes the concept of charters or tours to guide the work
• Time boxed into multiple sessions
• Provides feedback to stakeholders on observations
• Can lead to formal test cases, new defects, design changes
• Ideas from testing come from conversations in Release Planning, Sprint Planning, Story reviews:
➢ No one is sure how something works
➢ What happens if a change is made to an existing feature?
What to do with Manual Test Backlog?
[Diagram: Risk = Probability of failure × Damage (cost of failure); probability of failure is driven by quality (failure rate / defect density), and damage by usage frequency × damage per use]
• Utilize support incidents and Root Cause Analysis output as additional test ideas
• Review the likely use of each feature by customers
• Analyze the likelihood and the impact of failure of key features
• Prioritize regression testing based on risk
• Surround key functionality with unit and system tests
• Plan to automate key repetitive tests based on ROI
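The risk model above reduces to simple arithmetic: risk = probability of failure × damage, and the regression backlog is worked in descending risk order. The feature names and 1–5 scores below are made up for illustration:

```python
# Illustrative 1-5 scores; in practice these come from defect
# density, support incidents, and customer usage analysis.
features = [
    # (name, probability_of_failure, damage_if_it_fails)
    ("login", 2, 5),
    ("report export", 4, 2),
    ("payment processing", 3, 5),
    ("help page", 1, 1),
]

def risk(feature):
    _, probability, damage = feature
    return probability * damage

# Highest-risk features get regression-tested first.
prioritized = sorted(features, key=risk, reverse=True)
assert prioritized[0][0] == "payment processing"  # 3 * 5 = 15
```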
Root Cause Analysis
Pair work
Pairing team members may increase productivity:
• Many different styles of pairing work:
➢ Two developers for all code development
➢ Developer and Tester on a story
➢ Spontaneous pairing: refactoring, automation, interface definitions
➢ Tester and Product Owner
• Focus on solving a problem or delivering faster
• Cultural Challenges
• "Joy, Inc.: How We Built a Workplace People Love"
Agile Metrics
The whole team agrees on a set of metrics to guide their work:
• Velocity
• Build Thresholds exceeded
• Build failures over time
• Customer defects over time
• Sprint Velocity
• CLOC over time
• Bug count
• Mean Time To Repair
• Total number of tests; % passing; %failing
• Test Code coverage
Agile Metrics
Lead time and MTTR in the delivery process
Measuring Team Productivity?
Mike Cohn:
My advice is to abandon the pointless pursuit for a metric by which to judge the performance of knowledge workers such as product development teams and instead observe and evaluate those teams. You may never be able to pinpoint which team or person is “the best of all time,” but you don’t need to know that to succeed with agile.
Key Challenges on Agile teams
• Untested legacy code
• Mini-waterfalls
• Testers are always behind
• Short-cuts
• Cone of Uncertainty
• Clarity on requirements
• Huge manual regression backlog
• Poor estimates
• What should be automated?
• Applications not designed for testability
• Walls between different roles
• Waiting to do all testing at the end does not work
• Moving / unclear definition of “DONE”
Adopt Regular Release Cadence
2016 release calendar (Jan–Dec):
• Q1 PSI: Release Planning 1/4/2016; Release Review 3/31/2016
• Q2 PSI: Release Planning 4/4/2016; Release Review 6/29/2016
• GA Release: Release Planning 7/11/2016; Beta Release 7/14/2016; Release Review 9/23/2016; GA Release 9/28/2016
• Q4 PSI: Release Planning 10/3/2016; Release Review 12/16/2016
Summary
• Whole Team Approach
• Power of 3
• Agreement on acceptance criteria before writing code
• Test-first development
• Good design still matters
• Automated acceptance tests
• Exploratory and risk-based testing
• Continuous Integration
• Utilize key metrics to assess where the team is at any point
• Pair work
• Learning from customers
Key References
• Agile Testing, by Lisa Crispin
• ATDD by Example: A Practical Guide to Acceptance Test-Driven Development, by Markus Gärtner
• Lean-Agile Acceptance Test-Driven Development, by Ken Pugh
• BDD in Action, by John Ferguson Smart
• Essential Scrum, by Ken Rubin
• Continuous Delivery, by Jez Humble and David Farley
• The Five Dysfunctions of a Team, by Patrick Lencioni
• Developer Testing, by Alexander Tarlinder
• Patterns, Principles, and Practices of Domain-Driven Design, by Scott Millett and Nick Tune
• Specification by Example, by Gojko Adzic
Agile Leadership Network – Houston Chapter
• Venue: Sysco Corporation, 1390 Enclave Parkway.
• Meetings are held the 3rd Thursday of each month unless otherwise noted on the chapter’s website.
• Meeting format:
o 6:00pm – 6:30pm – Network with your fellow Agile practitioners
o 6:30pm – 8:00pm – Announcements and introductions, followed by our Featured Program
• Admission is Free
• Registration: http://alnhouston.org
Speaker Bio
SoftNAS, 2016: Sr. Director, Quality Assurance
ABB Inc., 2012–2015: Senior Development Manager
PROS, 2005–2012: Development Manager, QA Manager
Aspen Technology, 2003–2005: Development Manager
BMC Software, 1995–2003: Senior Development Manager
Education:
• B.S. Math & Biology, Lehman College, CUNY
• M.S. Computer Science, NYU Polytechnic
• M.B.A., University of Houston Bauer College
Certifications: PMP, CSM, SAFe Agilist
President, Agile Leadership Network Houston Chapter
Former life: Musician, Math Teacher
Any Questions?