
User-Centered Agile Development at NASA - One Group's Path to Better Software

Jay Trimble NASA Ames Research Center

For Balanced Team

11-3-13

My Background

• Missions

• NASA Johnson Space Center, Houston

• Shuttle Mission Control, Payloads

• Jet Propulsion Lab

• Robotic - Voyager Neptune

• Shuttle - Space Radar Lab, Lead Ops Director

• Current

• Mission Operations & Ground Data System Manager, Resource Prospector Lunar Rover

Space Radar Lab-1 Ops Director

Internship in Mission Control (A long time ago)

My Background

• Software Technology

• Human Centered Computing for Mars Rovers

• Founded User Centered Technology Group

• User centered technologies for mission control

One Story of Agile at NASA

• This is a bottom up story of how a group at NASA applied agile methods to software development for mission control

• This was approved by, but not initiated by, management

The Project

• Our task was to build an architecture for mission control user applications, with a primary focus on developing interaction paradigms and technology for user-composable software


• See the results at https://github.com/nasa/mct

The Collaboration

• Design and Development Team at NASA Ames

• The Customer

• Mission Control Users at NASA

• Using Participatory Design, we created an integrated team that included customer representation

Issues and Mandates

• Some customers want a new product, others do not

• The product must have new capability, but must also not be disruptive within the organization

• Functional and visual connection to legacy product

The Journey

• We began with a six month software delivery cycle

• By iteratively fixing issues, we got the delivery cycle down to three weeks

• It took close to two years to complete the transition

Where we started

• Four six-month deliverables

• One User Experience Spec

[Timeline: four subsystem deliverables of six months each, rolling up into Module 1]

Issues we faced

• Long delivery cycle

• Difficult to manage feature prioritization and development, integration and testing

• Progress invisible to customer, lack of meaningful ongoing customer interaction to drive design

• Mismatch in expectations between design/dev team and customer

• Difficult for the development team to know state of progress relative to goals

• Deliveries focus on subsystems rather than meaningful end user functionality

• Two-year final deliverable created a tendency to defer key issues


Initiating Internal Change

• Fix the problems iteratively, without a broad proclamation of methodology, i.e. “we are going to be agile” or “we are going to be lean”

• Just fix the problems

First Step - Six Week Cycle

• We took the six month cycle and divided it into smaller pieces

• This was a start, but still left many issues

[Timeline: the six-month cycle divided into six-week iterations 1 through x]


Incremental Improvements

• Six-week delivery cycle

• Prioritization of work at the start of each six-week iteration

• User Experience spec for every iteration due one week before iteration start

• UE testing and design session during coding period of each iteration

Six Week Cycle

[Six-week cycle diagram: pre-stack rankings and JIRA updates/priorities from iteration n-1 feed the stack rank and UE spec before the iteration n kickoff; engineering design & spec (3 days); coding with UE spec revisions and daily acceptance testing (3.5 weeks), in parallel with UE testing of iteration n-1 (delivered software), UE design/testing for iteration n+1 (paper), and test plan development; demo of new features for QA; testing (2 weeks); pre-ship review with exit criteria and customer demo; release docs; delivery and de-brief, leading into iteration n+1.]


Almost There

• Better, but still not where we need to be

• Six-week iterations are focused on subsystem capabilities; they lack user focus

• Customers see progress every six weeks, which is not often enough

Next Steps

• Identify the issues

• After each iteration we had a team de-brief where we identified issues and discussed fixes

• Fixing the issues, one step at a time

• Some issues we fixed with policy changes based on team de-briefs

• Many of the changes were bottom up within the team, such as

• Daily communication between user experience designers and the customer as new features rolled out, and QA testing of features on rollout

• Some changes were top down, such as the length of an iteration (or sprint) and the release cycle

Agile

• We shortened the cycle to three weeks

• Replaced discrete events with integrated interactions

• Integrated strategic and tactical priorities into our ranking process

• Each iteration had a clear purpose, goals, and ranked priorities

• Daily Build, Iterations, Release

• Strategic road map

Designing with the Users

• Participatory Design & Analysis

• Customers are part of the design team

• Designers facilitate, customers are the domain experts

• Shared ownership

Design Artifacts

• Triggers/Results

• Really big picture

• Big Picture

• Task Flows

• Blue sky

• Real world

Design Artifacts

• Task Objects

• User Objects

• Windows


Agile Cycle

• Nightly Build

• Iteration delivered every 3 weeks

• Release every 3 months

[Release cycle diagram: Release n spans four three-week iterations; iterations 1-3 each end with a release to the Mission Control user test community at 3, 6, and 9 weeks; iteration 4 ends with a release to Mission Control Ops at 12 weeks.]


The Three-Week Cycle

[Three-week cycle diagram for iteration n: coding with nightly builds and internal testing as features roll out; issue tracking updates, priorities, and rankings, with UE & tech spec dates driven by coding dependencies; a daily iteration n build goes to the customer, who tests feature mods/additions and bug fixes and provides feedback; meanwhile the customer installs iteration n-1, triages the issues it discovered, verifies closed JIRA issues, and runs acceptance tests, with an optional hot patch; an optional mid-iteration hackathon tests big features; feature freeze at -7 days, code freeze at -3 days, a 24-hour test starting at -2 days, a pre-ship hackathon, then delivery to the customer and on to iteration n+1.]

The Release Cycle

[Release cycle diagram: four three-week iterations (weeks 3, 6, 9, 12) of coding/UE specs, issue tracking updates/priorities/rankings, and builds with internal testing as features roll out; iterations 1-3 each end with customer feature verification and a release to the Mission Control user test community; iteration 4 covers bugs, usability, and more testing, and ends with a release to the customer for Mission Control certification.]

Agile Release Into Operations

Strategic Road Map

The Team

• Traditional: Developers (5-9), User Experience Design (2), QA/Process Engineers (2), Project Manager (1), Principal Investigator (part time), Interns

• Agile 1: Developers (7), User Experience Design (2), QA/Process Engineers (2), Project Manager (1), Principal Investigator (part time), Interns

• Agile 2*: Developers (4), User Experience Design (1), QA (0.5), developers rotate the PM role, Principal Investigator (part time), Interns

*Reduced budget

Focus

• Work on issues in order of priority

• Easier said than done

• JIRA/GreenHopper for issue tracking and ranking (a query sketch follows this list)

• Developers should know what their priorities are

• Priorities should be achievable

• Don’t over-manage ranking, or over-assign
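As an illustration only, here is a minimal sketch of what pulling a stack-ranked iteration backlog out of JIRA could look like through its REST search API. This is not the team's actual tooling; the server URL, project key, fixVersion name, and credentials are hypothetical placeholders, and the Rank field assumes GreenHopper/JIRA Agile is installed.

```python
# Hypothetical example: list an iteration's open issues in stack-rank order.
import requests

JIRA_URL = "https://jira.example.nasa.gov"  # placeholder server
JQL = ('project = MCT AND fixVersion = "Iteration 12" '
       'AND resolution = Unresolved ORDER BY Rank ASC')  # Rank comes from GreenHopper/JIRA Agile


def ranked_backlog(max_results=25):
    """Return (key, summary, assignee) tuples for the iteration, highest rank first."""
    resp = requests.get(
        f"{JIRA_URL}/rest/api/2/search",
        params={"jql": JQL, "fields": "summary,assignee", "maxResults": max_results},
        auth=("user", "api-token"),  # placeholder credentials
        timeout=30,
    )
    resp.raise_for_status()
    return [
        (issue["key"],
         issue["fields"]["summary"],
         (issue["fields"].get("assignee") or {}).get("displayName", "unassigned"))
        for issue in resp.json()["issues"]
    ]


if __name__ == "__main__":
    for rank, (key, summary, assignee) in enumerate(ranked_backlog(), start=1):
        print(f"{rank:2d}. {key}  {summary}  ({assignee})")
```

A short, ordered list like this is usually enough for developers to know their priorities without anyone over-managing the ranking.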

Where are we?

• There is one, and only one, measurement of progress, and that is working code

• Replace presentations, code line counts, and other management metrics with the nightly build (a sketch of such a build follows below)

• For progress relative to the strategic and tactical situation, see the issue tracking system (we use JIRA)
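To make "the nightly build as the progress report" concrete, here is a hypothetical driver script, a sketch only: the repository path, the Maven build command, the packaging output, and the publish location are assumptions rather than the project's actual setup.

```python
# Hypothetical nightly-build driver: build, test, and publish the artifact
# where the team and the customer can always pick up the latest working software.
import datetime
import pathlib
import shutil
import subprocess
import sys

REPO = pathlib.Path("/builds/mct")             # assumed checkout location
PUBLISH_DIR = pathlib.Path("/shares/nightly")  # assumed drop point for testers


def run(cmd, cwd):
    """Run a command, echoing it, and fail the nightly build if it fails."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, cwd=cwd, check=True)


def main():
    stamp = datetime.date.today().isoformat()
    run(["git", "pull", "--ff-only"], cwd=REPO)
    run(["mvn", "clean", "verify"], cwd=REPO)  # assumes a Maven build with tests
    out = PUBLISH_DIR / f"mct-nightly-{stamp}"
    out.mkdir(parents=True, exist_ok=True)
    for artifact in (REPO / "target").glob("*.zip"):  # assumed packaging output
        shutil.copy2(artifact, out)
    print(f"Nightly build published to {out}")


if __name__ == "__main__":
    try:
        main()
    except subprocess.CalledProcessError as exc:
        sys.exit(f"Nightly build failed: {exc}")
```

The point is not the script itself but that anyone who wants to know where the project stands can simply pick up last night's build and run it.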

Testing

• Internal QA tests features as they roll out

• Our customer tested features daily to provide feedback

• Our customer used iteration deliveries and releases for final feature verification

• “Hackathons” tested scalability in a lab environment

Some Lessons Learned

• The train leaves the station on time

• A feature that misses one train just gets on the next one

• This requires frequent departures

• Do not ever delay a shipment unless the software does not work

It Takes Time

• Our journey was driven by need, i.e. we addressed issues as they came up, rather than being driven by a formal methodology

• We iteratively refined our methods over two years

Lessons Summary

• The measure of progress is working code

• Work on highest priorities first, avoid the temptation to do the easier things first

• Demonstrations, not presentations

• Customer interaction over extensive documentation

• Progress always visible, nightly build available

• The train leaves the station on time, only working features ship

• Do not delay shipment for features - if a feature is not ready it goes into the next iteration


Conclusion

• There is no one right way to do agile

• Fit and evolve the solution to your context of work
