
Prepared by: Karen Richey DM Number: 2158289

Date Prepared: 5/12/17 DM Library: All Staff

Reviewed by (optional): Job Code: 100782

Record of Interview

Title: GAO Agile Expert Meeting Minutes

Purpose: To Discuss Best Practices Related to Agile Development

Contact Method: In Person

Contact Place: GAO Headquarters

Contact Date: 5/4/2017

Participants: See spreadsheet at the end of the document

Comments/Remarks:

Discussion from Agile Expert Meeting on May 4, 2017

Mat started by welcoming everyone to the meeting and asking those in attendance to introduce themselves. He told participants that GAO was recording the meeting in order to accurately capture everyone’s comments. Mat added that everyone attending by phone/WebEx or in person would have an opportunity to review the minutes and make updates before they were sent out to all experts. Finally, participants were encouraged to send additional comments by email after the meeting, once they had had some time to think about what was discussed.

Tim Persons kicked off the meeting by thanking everyone for attending; the experts’ input is critical, and GAO recognizes that attendees were volunteering their time. Tim mentioned that GAO wanted to listen to what the experts had to say about each agenda item and synthesize the information before formally capturing comments in the meeting minutes. He said that GAO best practice guides, like the Agile guide discussed today, are rare products because they are the only GAO reports that are crowd-sourced among a distinguished group of experts. Tim acknowledged that many participants from the executive branch have most often been on the other side of auditing, with GAO analyzing their work. In this meeting, however, the purpose is to help GAO determine fair best practices that make sense for holding agencies accountable when it comes to employing Agile methods for developing software.

Tim announced that GAO has been holding expert meetings like this one since 2005, beginning with the GAO Cost Estimating and Assessment Guide (www.gao.gov/products/GAO-09-3SP). GAO released the original exposure draft in 2007 and collected comments for one year before finalizing the updated Cost Guide in 2009. Tim noted that GAO is currently in the process of updating the 2009 Cost Guide since it is now 8 years old. Three years after publication of the final Cost Guide, GAO released an exposure draft of the Schedule Assessment Guide in 2012. Following the same process as the Cost Guide, GAO collected comments on the Schedule Guide, reviewed and incorporated accepted comments, and published a final version last year (www.gao.gov/products/GAO-16-89G). In addition, in August 2016, GAO published an exposure draft of its Technology Readiness Assessment Guide (www.gao.gov/products/GAO-16-410G) and is collecting comments until August 2017. The Agile guide will follow the same process as GAO’s other best practices guides. Tim discussed how GAO’s brand of reports must be trustworthy, non-partisan, non-ideological, and fact based, so having experts participate in the review of draft chapters and offer their comments during these meetings will help GAO meet those goals.

The mention by participants of a specific commercial product, service, or methodology is not to be construed as an endorsement of that product, service, or methodology by GAO or the U.S. Government.

Agenda Item #1: Update From the Last Expert Meeting
GitHub Update (Martin/Juaná)
Updated Draft High Level Outline (Mat)

Mat discussed the status of GitHub and mentioned that only 10 experts had expressed an interest in using it for sharing comments on the guide. Given the low number of responses (there are over 200 experts), GAO did not think it made sense to use GitHub. Mat explained that GAO’s internal IT policies create some restrictions on using GitHub, which weakens the business case as well. He mentioned that GAO could continue to look into a free GitHub license to collect comments on the exposure draft of the guide because it would make things easier, but from a security standpoint GAO would probably require a private repository. As a result, unless more experts speak up about the usefulness of GitHub, GAO will not pursue that approach and will stick with its standard method of sending out Word files for collecting comments.

Dane Weber liked the idea of using GitHub so that he could see whether GAO had rejected or accepted his comments. Mat explained how GAO vets comments for its guides: GAO collects and tracks every comment in a spreadsheet and holds internal working group meetings to discuss each one. If a comment is accepted, it is incorporated into the next update. If it is rejected, the reason why is captured in the spreadsheet. Zach Cohn acknowledged that he is a constant thorn in GAO’s side about using GitHub and thinks GAO should determine whether GitHub will meet its needs instead of waiting for a large call for adoption among the group. (Zach responded by comment to the draft minutes that just because only a few people said they wanted to use GitHub does not mean everyone else who did not respond was against using it. Zach also said that if something like GitHub would save many hours of labor collecting, consolidating, discussing, and resolving comments (which he believes it would), he strongly believes GAO should use this tool. He very much agreed with Jim Barclay’s email after the meeting; see below.)

Karen Richey discussed how GAO values everyone’s input and reviews every single comment. She added that GAO relies on group consensus regarding which ones are accepted, modified, or rejected. She said that the experts would see whether their comments were incorporated when new updates come out and if they wish to understand why a comment was not accepted they could always contact us to discuss in more detail. In addition, Karen said that GAO can get conflicting comments from various experts. When that happens, GAO adds the comment as an agenda topic discussion at a formal expert meeting like this.


Mat said that for security reasons, GAO management would most likely insist on having a private GitHub repository which would cost money depending on how many people were actually using it to provide comments. If GAO decided to use GitHub, we would need to look at the IT budget to see if there were enough funds for this effort.

Neil Chaudhuri provided these comments after the meeting:

You might also want to check out GitLab, which is free for private repositories, but lacks the cachet of GitHub.

James Barclay emailed the following after the meeting:

I am disappointed in the lack of interest in using GitHub which is a very useful tool. It allows for the transparency which is fundamental to the practices of agile. I look at the use of GitHub as walking the walk with agile behavior. 

Debra Dennie shared these thoughts by email after the meeting:

For collecting comments, you may want to consider an approach we are using. LMI is supporting a DHS Grant on Information Sharing and Analysis Organizations (ISAO) Standards Organization. We developed a website following an Agile Scrum approach that includes an online process to capture and consolidate comments using WordPress. Our working groups are developing guidelines and post drafts for public comment. The working groups then get a consolidated MS Excel spreadsheet with all submitted comments to adjudicate internally. The public can either register on the site to provide comments, or submit them anonymously using an online form. Feel free to register to try the process to submit comments. See https://www.isao.org/resources/draft-products/

Agenda Item #3: Group Announcements
SPAWAR Data Collection Effort

In the interest of time, SPAWAR asked to speak next about its data collection effort. Jeremiah Hayden and Omar Mahmoud discussed the attached presentation, which provides an overview of their organization (SPAWAR 1.6 Data Science Division). Jeremiah Hayden (Jerry) introduced himself as the government lead for SPAWAR’s cost division. Omar Mahmoud is a contractor supporting Jerry’s effort to collect Agile metrics within the Navy. Jerry discussed how SPAWAR started this effort in 2015 after noticing that software implementation was a big cost driver for ship installations, while headcount was a cost driver for Enterprise Resource Planning (ERP) systems. To improve their cost estimating efforts, they developed a dashboard of Agile metrics that the Navy can use. This dashboard only includes Navy data, so they wanted to poll the rest of the community to see if they could collaborate with others to share data that could help them develop estimating methods to cross-check Agile projects. They are also interested in applying big data concepts to derive broader analyses.

Access to the ERP system provides a large database of information from which they can collect data. Right now, they are collecting headcount information that can be used to validate future estimates. Currently, SPAWAR has only a few (about five) Agile programs underway. Jerry added that he and Omar have started creating a database of Agile metrics using data from these programs. The database includes information from programs that just began as well as some that are midway through development. They are happy to share what they have collected so far with anyone who is interested. Jerry mentioned that the IT Dashboard has hundreds of programs with a lot of valuable information, such as large test data sets that reflect cost data and Agile metrics. One of the main metrics they would like to collect from other programs is the number of requirements/features completed.

Jerry Frese informed the group that Bill Pratt from the Department of Homeland Security was doing something similar to SPAWAR and that they should contact him; he agreed to send Bill’s information to the speakers. Omar Mahmoud said that they were working with the Navy systems commands to collect data, but this was very challenging because most programs do not have an Agile-suitable database of metrics. Their goal is to collect data and track progress so they can have a repository of historical actuals to improve their software cost estimating. To do this they need a uniform method for collecting data, and their vision is to use it not only to collect data but also to track progress while a program is using Agile development. Their hope is that if everyone collected the same type of data, then as a community they could build and sustain a database that could be used for cost estimating, program management, and software development.

Tricia Hall asked whether they were also collecting non-cost metrics and whether they could show trending over time. Jerry Hayden said that they were collecting many Agile metrics, such as the number of story points per sprint, technical debt associated with fixing bugs, human systems integration (HSI) efforts, as well as improvement and enhancement information. The idea is that when you are capturing the cost of developing requirements, you will also have information regarding the expected amount of technical debt. They would also like to show on their dashboard information related to hours per feature, hours per sprint, hours per story, and so on. Therefore, they will be collecting both quantitative and qualitative information for each program. For instance, they want to capture information about the software environment, such as what kind of program the Agile development is supporting (e.g., a radar system, command and control system, etc.). In addition, they want to collect information on the acquisition category (ACAT) level, contract type, etc. While cost is a small part of the data collection effort, having non-cost information is imperative too since cost is usually fixed; when requirements shift, Agile programs have to adapt quickly to those changes.
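To illustrate the kind of per-program metrics Jerry described (hours per story, hours per sprint, story points per sprint), a minimal sketch follows. The record layout, field names, and values are invented for illustration; they are not SPAWAR’s actual schema.

```python
# Hypothetical sketch: deriving Agile cost metrics of the kind discussed
# (hours per story, hours per sprint, story points per sprint) from raw
# per-sprint program records. Field names and data are illustrative only.

def summarize_program(sprints):
    """Aggregate per-sprint records into simple program-level metrics."""
    total_hours = sum(s["hours"] for s in sprints)
    total_stories = sum(s["stories_completed"] for s in sprints)
    total_points = sum(s["story_points"] for s in sprints)
    n = len(sprints)
    return {
        "hours_per_story": total_hours / total_stories if total_stories else None,
        "hours_per_sprint": total_hours / n if n else None,
        "avg_points_per_sprint": total_points / n if n else None,
    }

# Example: three sprints from a notional program.
sprints = [
    {"hours": 400, "stories_completed": 10, "story_points": 21},
    {"hours": 380, "stories_completed": 8,  "story_points": 18},
    {"hours": 420, "stories_completed": 12, "story_points": 24},
]
metrics = summarize_program(sprints)
print(metrics)  # {'hours_per_story': 40.0, 'hours_per_sprint': 400.0, 'avg_points_per_sprint': 21.0}
```

A shared database of such records across programs is what would let the community cross-check estimates by analogy, as SPAWAR proposed.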

Regarding trend information, Omar said that their contract data requirements lists require contractors to provide trend graphics that they would import into their data collection tool to show progress over time. Trends would be dynamically updated, making program monitoring simple. SPAWAR recognizes that its dashboard is very Navy weapons centric, which is why they want to invite other programs to share their data so they can have more information on complexity and can develop better analogies for estimating. Jim Barclay asked whether they were collecting information on the plan and the extent to which they were meeting their planned efforts. Omar responded that Earned Value Management (EVM) analysis for Agile programs could collect this information using data from the roadmap as well as sprint plans that describe the specific features planned for completion. They would use this data to track progress against the plans. Manik Naik said that in addition to traditional Agile metrics, SPAWAR needed to collect management- and executive-level metrics as well. He offered to share the metrics he was collecting with SPAWAR, and Karen Richey asked him to also share them with her so she could include them in the minutes for everyone to see. Manik said that he is trying to automate the download of JIRA data into an executive-level dashboard that displays the information decision makers need to see about a program.
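The kind of automation Manik described could be sketched as follows. In practice the issue data would come from JIRA’s REST API; this offline sketch shows only the roll-up step, and the issue fields ("status", "story_points") and keys are assumptions, not a real project’s data.

```python
from collections import Counter

# Hypothetical sketch of rolling JIRA-style issue records up into an
# executive-level summary. Real data would come from JIRA's REST API;
# the field names and sample records below are invented for illustration.

def executive_summary(issues):
    """Roll issue-level records up to status counts and percent of points done."""
    status_counts = Counter(i["status"] for i in issues)
    done_points = sum(i["story_points"] for i in issues if i["status"] == "Done")
    total_points = sum(i["story_points"] for i in issues)
    return {
        "issues_by_status": dict(status_counts),
        "percent_points_done": round(100 * done_points / total_points, 1) if total_points else 0.0,
    }

issues = [
    {"key": "PROG-1", "status": "Done",        "story_points": 5},
    {"key": "PROG-2", "status": "Done",        "story_points": 3},
    {"key": "PROG-3", "status": "In Progress", "story_points": 8},
    {"key": "PROG-4", "status": "To Do",       "story_points": 4},
]
print(executive_summary(issues))  # {'issues_by_status': {'Done': 2, 'In Progress': 1, 'To Do': 1}, 'percent_points_done': 40.0}
```

Refreshing such a summary on a schedule, rather than by hand, is what would turn raw issue tracking into the kind of dashboard decision makers could monitor.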

Mat explained that he wanted SPAWAR to go first on the agenda because GAO is trying to figure out where metrics will be addressed in the guide. For instance, should metrics be captured in various chapters, or are they important enough to have their own chapter? To that end, Mat next opened the discussion for the remainder of Agenda Item 1 below, which included a discussion of the proposed Agile Guide outline.

Updated Outline:

Chapter 1: Background
Chapter 2: Compliance and Past Work
Chapter 3: Agile Adoption Best Practices
    Team Activity Best Practices
    Program Processes Best Practices
    Organizational Environment Best Practices
Chapter 4: Agile Implementation Challenges
Chapter 5: Requirements Decomposition
Chapter 6: Agile and the Federal Acquisition Process
    Agile and the Federal Contracting Process
    Agile and the Federal Budget Process
Chapter 7: Agile Program Management Factors
    Program Planning and Trade-Offs
    Team Composition
Chapter 8: Agile Program Control Best Practices
    Cost Estimating Best Practices
    Schedule Best Practices
    Earned Value Management Best Practices
Appendixes
    Glossary/Rosetta Stone
    Effects of Not Following Best Practices
    Agile Methodologies
    Debunking Agile Myths
    Questions for Auditors and Managers

Dane Weber liked that GAO included the idea of having cross-functional teams, but felt that this needed more emphasis in Chapter 3 since the discussion was interspersed throughout various paragraphs. He said it was important that the team can collectively complete tasks, not so much that individuals can. He discussed how you should not have an Agile team of developers and an Agile team of requirements analysts; instead, you need one team with all of these talents working together as a whole. He gave an example from U.S. Citizenship and Immigration Services (USCIS), where one of the big things the agency had to do outside of its Agile teams was Section 508 compliance testing. The Agile team would develop the stories and then send the code off to the 508 testing team and wait for feedback that would then have to be incorporated. These handoffs between teams created a slow feedback loop that was contrary to Agile principles. They made changes to the teams to bypass this loop, and now there is a 508 tester on each Agile team so that every story that is developed undergoes 508 testing. This now happens multiple times every iteration.

James Barclay sent the following comments after the meeting:

In general, I believe continuous improvement is missing or not emphasized enough as a best practice. It also seems that inspect and adapt may not have gotten the attention it needs as a best practice.

Suggest looking at Agile from the perspective of Contracting, Program Management, and Management Oversight because each part supports the others to create the whole. You have to address all three of these aspects when using Agile.

Managers need to be asking the right kind of questions to support the best practices of agile. See file attached below for ideas on what needs to be asked.

We need to be very careful we do not force Agile measures into Waterfall reporting. Mapping Agile artifacts to typical management reporting is not going to give the most benefit and may encumber agile behavior. For example, Story Points were only intended to be used by the team for estimating the amount of work they thought they could accomplish. Because teams are not equal in their ability to complete a specific number of story points, management should not rely on story points as a way to normalize against hours or cost. The focus should instead be on whether the agreed-upon plan was accomplished (i.e., were all of the stories in the 2-week plan completed, or were all of the features agreed upon in the 10-week increment plan completed?). Management can use burn-up or burn-down charts for the sprints and increments to determine what was completed.
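Barclay’s point, judging a sprint by whether the agreed-upon plan was completed rather than by normalized story points, can be illustrated with a minimal burn-down calculation. The sprint length and daily completion figures below are invented for illustration.

```python
# Illustrative burn-down sketch: track remaining story points across a sprint
# and check whether the agreed-upon plan was completed. Data is invented.

def burn_down(planned_points, completed_per_day):
    """Return remaining points after each day, plus whether the plan was met."""
    remaining = [planned_points]
    for done in completed_per_day:
        remaining.append(remaining[-1] - done)
    return remaining, remaining[-1] <= 0

planned = 30                                     # points the team committed to
completed_per_day = [3, 4, 2, 5, 3, 4, 3, 2, 3, 1]  # 10 working days (2-week sprint)

remaining, plan_met = burn_down(planned, completed_per_day)
print(remaining)  # [30, 27, 23, 21, 16, 13, 9, 6, 4, 1, 0]
print(plan_met)   # True
```

The `remaining` series is exactly what a burn-down chart plots; the yes/no `plan_met` answer is the team-internal signal Barclay suggests management focus on, without comparing point totals across teams.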

Debra Dennie sent the following comments after the meeting:

LMI was appraised at CMMI Level 3 for Development and Services using an Agile approach. Although CMMI is agnostic on the Software Development Lifecycle (SDLC), many of the first projects assessed when CMMI was introduced were Waterfall. Now that many companies have embraced Agile, CMMI has published a mapping of Agile to the CMMI framework. This may help some who are trying to grasp Agile concepts compared to Waterfall. See http://cmmiinstitute.com/cmmi-and-agile


I will reach out internally to consider what Agile metrics we find most valuable. We use the full suite of Atlassian tools including JIRA to manage the work flow and have dashboards that show burndown charts and provide metrics to track project health. Other metrics we track are “escaped defects” which are bugs that make it all the way to production. Using Agile, we believe we have greatly reduced “escaped defects.”

Matthew Kennedy sent the following comments after the meeting:

I consider the change from traditional development to agile development a revolutionary change versus an evolutionary change; it’s like going from a gas engine to an electric engine. Due to this large change, it requires a complete revamping of our tactics, techniques, and procedures (TTPs) versus an adaptation of our current TTPs. If the group accepts this to be true, it will dramatically influence the content in the guide. Since there is such a disparity between the two models (traditional/waterfall and agile), we may consider ensuring our terminology does not use old (waterfall) nomenclature or practices. For example, “Earned Value Management Best Practices” under “Agile Program Control Best Practices.” There is an agile/EVM working group, and some of its members may also be on this committee, but I am not convinced EVM is an agile best practice. In my current implementation of agile development, I do not find any use for EVM-like reporting due to these factors:

1) EVM reports are usually delayed by several weeks, by the time I receive a report the information would be old as we continually adapt on a bi-weekly basis.

2) More importantly, I recently released a Firm-Fixed-Price capacity-based contract, negating the cost portion of EVM since my costs for agile teams are fixed each month.

3) I have seen a demo of a solid end-to-end implementation of agile/EVM from the Raytheon Corporation, but I am not sure the overhead of agile/EVM yields a positive return on investment (ROI) since there are lighter-weight methods to accurately track an agile project’s progress.

4) Though EVM is a tool to track progress, I am not sure it is a best practice. New evidence can easily convince me otherwise, but right now, I struggle to see the ROI. My overarching concern is that if we place it as a “best practice” it may be applied blindly and cause undue burden (both cost and time).

Another example may be the “Requirements decomposition” section.   Maybe we should title this “User Story Generation” as the content will likely be referring to the traceability of the “Vision->Epic/Feature->story.” Though this is a semantic change, it could have a large impact.  Using the traditional approach, the requirements department would come up with the requirements and my fear is that the requirements department will just create the user stories which would break the core agile principle of collaboration.  Changing the name to something like “User Story generation” may further clarify it is not business as usual with a new name.

Agenda Item #2: Draft Chapter 3: Agile Best Practices (Mat)


Organizational Environment:

Program Processes:

Team Activities: TBD

Comment stats:
Total: 227 and counting (still entering comments into the tracker)
Vetted: 10 (mostly comments from the glossary)

Mat started off the discussion by saying GAO had sent out draft sections of Chapter 3 related to the organizational environment and program processes that should be implemented for an agency to successfully adopt Agile methods. The team activities section was not ready in time for the meeting but would be sent out for everyone to review and comment on soon. Mat said that toward the end of the month or in early June, GAO would send out the whole chapter with all three sections included (team activities, program processes, and organizational environment), along with the preface and background, so that everyone could see how the structure and format of the guide were evolving. He stressed that these drafts do not include any of the comments received to date because the team has been busy writing and has not yet begun vetting comments.

Tricia Hall said that, in general, she was pleased with the tenor of the draft chapter and thought GAO had included many of the Agile principles needed for this approach to be successful. However, the first two paragraphs turned her off: for someone hesitant to try Agile, the introduction as currently written dwells on the risks of this new method and the big learning curve everyone will have to overcome to successfully adopt Agile. She said that reading the first page gave her enough pause to question whether Agile would be a good idea. Tricia also commented that there were placeholders for graphics in this chapter, and she would like to see GAO include a graphic that shows how program teams should be organized to support an organization’s enterprise portfolio. This is imperative because all programs should be addressing gaps in strategic enterprise needs. A graphic that shows this general relationship will drive home the idea that Agile teams are focused on delivering value to the customer that is clearly aligned to strategic business vision, goals, and objectives. Tricia said that her company has a generic framework for organizing teams that it uses when training customers. This framework clearly shows workflows among the business owners, the Agile team, and portfolio teams. She said she would share the graphic with GAO for consideration.


Zach Cohn commented next that cross-functional and self-organizing teams are not the same thing. Karen responded that during the January 2017 Agile expert meeting GAO received feedback that we should flip the order of the Agile adoption best practices so that they built up from the team to the program and then finally the organization. The idea was that Agile involves big changes so it is easier to start slow with a pilot or two first before going all Agile throughout the whole organization. The reason for not rushing Agile adoption is that an architecture needs to be in place to support Agile methods so that the development teams have an environment from which to plug and play small modular pieces of functionality into existing systems. Karen also reiterated that the next version of Chapter 3 will not include the over 200 comments received to date on the program and organizational adoption best practices.

Jerry Frese said it was important for the guide to emphasize the team including people from all parts of the agency (i.e., team, program, and organization) because, in his opinion, that is the number one reason Agile projects fail. He said that without integrating all three parts, people would go back to their silos immediately. Jim Wade commented that he would like the guide to address both the good and the bad associated with Agile, because Agile is not perfect and there will be risks associated with it. Tricia agreed with Jim that there is definitely risk with Agile, but she did not think the guide should start out with that fact. She added that the guide could present both the pros and cons of Agile, just as there are with any methodology. Jim stated that he wanted the guide to be comprehensive, discussing the good and bad of Agile methods, and to avoid trying to sell a particular method to the reader. He said it is important not to lose sight of the goal of Agile, which is to save money and have successful projects. The goal is not to do Agile, but to see Agile deliver benefits to the users.

Karen discussed how every GAO report has four main elements: criteria, condition, cause, and effect. The criteria are what the Agile guide will represent: the benchmark of what programs should be doing to meet best practices. The condition is the state of reality for a program. If there is a difference between the criteria and the condition, then GAO will report an audit finding and will reflect the cause, or reason, the condition is not in line with the criteria. Chapter 3, which discusses Agile adoption best practices, will address many of the causes behind audit findings. Finally, the effect is the “so what?”, which captures why falling short of the criteria is a problem.

Jim Barclay mentioned that GAO has published many best practices outlining what agencies should do to be successful, so many best practices for Agile already exist in current GAO reports, and the team should take advantage of that. Mat Bader responded that Chapter 4 (Agile Implementation Challenges) will draw heavily from GAO's 2012 report (www.gao.gov/assets/600/593091.pdf), which discusses at a high level many general issues programs have encountered when using Agile methods. The vision for this Agile best practices guide is to provide much more rigor regarding what practices can make Agile successful, as well as to present a structured way an agency can approach a transition to Agile methods. Zach Cohn agreed with Mat, saying the purpose of the guide should be to determine whether a program is using Agile methods in line with best practices, and that it should present both good and bad case studies. Suzi Miller said she could see the guide being a helpful resource to agencies just starting to adopt Agile because they can turn to it to see what the best practices are. She suggested putting the many available Agile frameworks in an appendix so the guide does not look like it is advocating any one way to do the work. Suzi added that Agile is not a silver bullet and our guide should
be careful about making a sales pitch, but we do not want to steer people away from doing Agile either. While Agile can solve many problems, it can only do so if implemented effectively.

Zach Cohn said that the reason everyone is at this meeting is to help GAO build a guide for its auditors to assess Agile projects. Therefore, we do not need to sell Agile because the guide assumes that the decision has already been made to use Agile methods and he did not see GAO auditors using the guide to tell a program that it should have done Waterfall because of all its risk factors. Karen added that to address Jim’s concern, the guide will contain case studies from GAO reports, none of which will be reporting good news. These case studies will highlight programs that were doing Agile haphazardly and will identify the causes based on gaps in best practices. She mentioned that for balance, GAO also wanted to include success stories, but since GAO has not audited any programs with that outcome, we need to gather information from the experts to highlight what worked well from their experience. The purpose would be to show that Agile can be done well, but this does not happen without commitment from everyone, investment in training and infrastructure, and a willingness to change the culture. In addition, the aging federal workforce is not comfortable with the big culture changes that Agile demands. For example, older staff may fear technology, not trust automation, and worry that their jobs could be threatened. None of these factors are easy to deal with which is why there are so many challenges associated with Agile.

Suzi Miller commented that most people doing Agile want to do the best job possible, and the guide could help them understand what practices are feasible, possible, and relevant to their situation. She gave an example of just how much technology is changing things: a lieutenant was speaking to a deputy Program Executive Officer (PEO) in the Department of Defense about his program's Agile approach and what his team was learning from it. The deputy PEO asked him how Agile differed from the Waterfall method, and he replied that he had only been in the service for three years and had never seen a Waterfall program. In her opinion, Waterfall will eventually become outdated as Agile becomes the norm.

Dane Weber addressed the risks involved in doing Agile and acknowledged Trisha's concern that the first two paragraphs of Chapter 3 could scare people away from even attempting Agile. He said that Agile is not an end in itself but a strategy for mitigating risks so that the team can create value sooner. He sees Agile as a good way to counterbalance software development risks despite the initial pain points that must be endured as an agency adopts the method. In particular, Agile works to avoid the risk of complete failure, where a program delivers something over budget that nobody wants. Mat agreed with Dane and mentioned that Chapter 1 (Background) will discuss why many people are transitioning to Agile as a way to mitigate typical risks associated with software development. The experts agreed that people need to see Agile as an evolutionary process and should go into the transition with their eyes wide open, realizing that it may not be simple. Karen Richey said that for the next review, GAO will send out an incremental build of the guide with the preface, Chapter 1 (Background), and Chapter 3 (Agile Adoption Best Practices) so everyone can get a better idea of how the guide is developing. Michael Holland said that he printed out draft Chapters 4-7, which address Agile Implementation Challenges, Requirements Decomposition, Agile and the Federal Acquisition Process, and Agile Program Management Factors (Program Planning and Trade-offs and Team Composition). When he looked at these draft chapters together, the pieces started to form a better picture.
He agreed that GAO should send out a more complete package of the initial chapters so the experts could get a better sense of where the guide is going. Doing so could render some of the outstanding questions moot.

Rohit Gupta stated that every time this group meets, the first 30 minutes repeat the same discussion because new people show up each time. He added that this is not an Agile methodology guide but a best practices guide for those who are choosing to use Agile methods on their programs. Jerry Frese agreed that if you have to fight about doing Waterfall versus Agile, you are already dead in the water. Neil Chaudhuri used a football analogy: merely running the Patriots' offense does not mean you will win, but if you have Tom Brady as your quarterback, you are more likely to win because of his impressive record. Said another way, it is not enough to know what the practices are; you also have to believe in them, understand why they work, and know how to execute them well. That said, Neil commented that having cross-functional teams performing small batches of limited work should minimize program risk, and that is why Agile is so successful. However, Agile can go wrong when people misapply Scrum principles, when leadership support is lacking, or when the organizational mechanisms needed for DevOps are not in place. These are the areas where Agile risks occur. One thing that could make Jim Wade feel more comfortable is that many of the experts in this group have decades of experience across multiple industries showing how Agile can work when done properly, so it should not really be a matter of debate now. What each program still needs to determine is which best practices apply to it, whether the teams actually follow those practices, and what assumptions, methodologies, and organizational support are in place. Without these things, Agile will fall apart quickly. Therefore, Neil emphasized how important it is to have cross-functional teams that self-organize how they will accomplish limited work during each iteration. If this is happening, a program is unlikely to fail.

Tom Williams said that "best practice" is an often misused term because people do not understand the environment. He gave an example using Amazon: when the company first started, it modeled the best practice for its warehouse and operations on a large retailer's distribution center, and trying to convince Amazon that it should not be in that business would have been a big mistake. Therefore, while the guide needs clarity around its best practices, it also needs clarity around the environment people will be working in, to know whether applying certain best practices is right for a particular situation. This clarity is important when assessing change management risks.

Bob Schatz agreed with Neil that the biggest risk with Agile is not knowing the outcomes a program is going after, because he has seen programs that had everything in place and ended up going very fast in the wrong direction. Neil Chaudhuri added that Eric Ries, in The Lean Startup, calls this "achieving failure": executing the project perfectly only to have built something no one wants. Bob said that keeping an eye on the end game of what a program is strategically trying to accomplish is key. Trisha Hall said that the only risk she did not see GAO adequately address was at the program and organizational level: how those in management and leadership are going to understand their new duties in an Agile environment. This is a big issue because having cross-functional, self-organizing teams means that management does not need to direct teams in their daily tasks. Instead, managers and leaders need to understand that their role is now to successfully lead people.
Many managers do not have these leadership skills, and that is a major risk the guide should address. Trisha's company teaches management that their new role is to be subject matter experts, great people enhancers, and process improvement aficionados. She offered to share information with GAO about the culture change that management will need to embrace.

Mat agreed with Trisha's observations that GAO should also address management and leadership understanding their changing responsibilities in an Agile environment and how they need to shift from coaching and facilitating to embracing the necessary culture change. Suzi Miller said that one way they teach people how life is different in an Agile world is through exercises that show management the different questions they should be asking. These questions are not micromanagement based, but more along the lines of asking teams how their collaboration is going, what barriers they are running into, how they are getting things done, and so on. Therefore, with Agile, you end up with a fresh set of questions to go along with a new management role.

Laura Bier commented that sometimes projects do not choose to do Agile because it is not applicable for the work they are doing, but the contract dictates that this is the method to use. As a result, a company may not even bid on the contract because they do not think their organization is ready for Agile or they do not think it is the right method. She has seen companies forced into doing Agile because an agency is under the impression that this is the way to go despite having done nothing on their part to embrace the changes and set up the right environment.

Mat shifted the topic of discussion back to these specific questions about metrics that were under Agenda item 1:

Discussion: Where should we include a discussion of metrics in the guide? Embedded in the three subsections in Chapter 3? As a separate subsection in Chapter 3? As a part of Chapter 7 (Agile Program Management Factors)? Somewhere else?

Mat explained that GAO has been grappling with the fact that metrics are not specifically called out anywhere in the guide as their own chapter. He recognized that Chapter 8 (Agile Program Control Best Practices) will include a lot of discussion about metrics, but that is far along in the guide, and the reader has to review a lot of material before reaching the subject. He stated that while Chapter 3 touches on some metrics lightly and without much detail, GAO was wondering whether metrics should go in a separate chapter so that they are easy to locate, or whether they should be addressed in another chapter besides Chapter 8.

Karen added to Mat’s comments that the Agile metrics that relate to cost estimating, scheduling, and earned value management (EVM) will likely overlap quite a bit. She added that our intent is not to map Agile metrics back to specific cost / schedule / EVM metrics, but to show how existing Agile metrics can provide similar information just packaged in a different way. Karen said that many programs using Agile are subprograms within a bigger acquisition that require formal reporting like EVM. She asked Jonathan Kiser how Boeing handles reporting Agile status metrics for the C-17 program and how these metrics tie into program EVM reports and updates to the integrated master schedule. Jonathan said that many of
Boeing’s Agile teams struggle with EVM reporting and it is even worse now that his company has started implementing scaled Agile. Jonathan said that it is important for GAO to call out metrics separately because without metrics, Agile is not transparent and people from all levels of the organization lose visibility. He said that the only way to know if a program is successful and its teams are managing themselves well is to review Agile metrics. He said that many kinds of metrics are needed including team, program, management, leadership, and executive level metrics. In addition, management needs to understand what these metrics mean and what they represent such as velocity. Jonathan stipulated that each team should tailor what metrics are applicable and work best for them and management should request what metrics they need to gain insight into progress.

Jim Barclay shared his concern that he has seen people force-fitting Agile metrics into DOD or government measures in an effort to jam them together. There is a big struggle in this area because Agile makes people look at things differently and use new approaches that require different questions to be asked. While he agrees each program needs to be accountable for its results, when a program uses Agile, the results will be viewed in a new way. Trying to force-fit Agile metrics into what you already know and are comfortable with creates a horrible environment. Therefore, he would like the guide to keep the best practices as pure as possible and encourage programs to choose the metrics that make the most sense for their approach. Karen responded that GAO could dedicate a chapter to discussing Agile metrics in detail and then follow up in Chapter 8 on Program Control Best Practices by showing how to tie Agile metrics into bigger programs with formal reporting requirements. She said GAO could discuss which Agile artifacts and metrics could feed progress measurements in a schedule or EVM system. She mentioned that a burndown chart can provide information similar to a schedule by showing accomplishments over time.
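Karen's point that a burndown chart can convey schedule-like information can be illustrated with a minimal sketch (the function name, sample velocities, and point counts below are invented for illustration, not drawn from any program discussed at the meeting):

```python
# Sketch: deriving a schedule-style completion forecast from burndown data.
# All names and numbers are hypothetical.

def forecast_iterations_remaining(remaining_points, recent_velocities):
    """Estimate iterations left using the average of recent velocities."""
    if not recent_velocities:
        raise ValueError("need at least one completed iteration")
    avg_velocity = sum(recent_velocities) / len(recent_velocities)
    if avg_velocity <= 0:
        return float("inf")  # no measurable progress yet
    # Round up: a partial iteration still costs a full iteration.
    return -(-remaining_points // avg_velocity)

# A burndown with 120 points remaining and recent velocities of 20, 25, and 15
# (average 20) forecasts 6 more iterations -- the same "when will we finish"
# answer a schedule would provide, derived from an Agile artifact.
print(forecast_iterations_remaining(120, [20, 25, 15]))
```

The forecast, like a schedule, shows accomplishments over time and projects a finish date, which is the kind of mapping the proposed metrics chapter could describe.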

Mat interjected, saying that he thought two different discussions were going on. The guide will identify specific metrics that typical Agile programs need and collect, and then there is a more general need for metrics at an executive level to show overall progress and accountability. Mat said that the organizational environment best practices discussion in Chapter 3 covers the need for clear alignment of goals and objectives to Agile team efforts, and he wondered if that was an appropriate place to talk about transparency metrics that could be presented to executive governance boards. Trisha responded that everyone should have business outcomes tied specifically to all of the work they are doing in order to show traceability to some kind of value (e.g., increased revenue, improved customer satisfaction, faster delivery, etc.). The main point is that the work should be described at a high enough level to show the value of what you are doing. Therefore, if you are tying your work to EVM, the performance measurement baseline has to be high enough that the teams have the ability and freedom to change scope and do their work without constantly developing change orders. Organizations need to understand that if they are going to do Agile, they have to trust the teams to have the capacity and authority to decide how they will do the work. The main thing is that the work links to business outcomes, and there should be leading indicators that tell the team and management whether they are on the right track, as well as lagging indicators that capture what has already happened.

Trey Wiesenburg said that he wanted to make sure that we look at the Capital Planning Investment Control (CPIC) Office of Management and Budget (OMB) business case reporting and how the metrics
we are talking about align with those required reports, because these metrics drive federal decisions about which programs will be funded. As a result, we should also review how our metrics align with these reporting requirements. Mat Bader responded that Chapter 6 (Agile and the Federal Acquisition Process) captures best practices related to Agile and the federal budget process and may be a good place for that discussion. He said the guide envisions talking about how business outcomes need to be linked to Agile efforts, and it can also address the CPIC process and the need to identify upcoming work for the next budget cycle. Trey said that some Agile metrics should be reported all the way up into the formal CPIC reports.

Rohit Gupta suggested that instead of making metrics a separate chapter, we could address them in Chapter 8 (Agile Program Control Best Practices). Karen replied that it made sense to have a chapter on Agile metrics before Chapter 8 that describes the metrics in detail. That way, Chapter 8 does not have to explain each metric; it only has to link back, noting that questions about a program's cost, schedule, and performance can be answered by looking at various metrics, and then discussing which ones. She added that most of these metrics would already be captured in Agile tool suites, and managers need to know which metrics can provide information similar to traditional approaches. Jerry Frese said that management needs specific information, such as how much money is being spent and what we are getting for it. He used an analogy to parenting: the first couple of years are really hard because you are not getting much sleep and are adjusting to a whole new way of life, responsible for a tiny person who is totally dependent on you for everything. One way to measure success during that transition is whether you get a full night's sleep. You have to keep the metrics at a high level, though, because the change that is occurring is a total life change. Using an integrated master schedule to manage an Agile program is meaningless because it is a lot of work to develop and maintain, yet it tells you basically nothing.

Neil Chaudhuri commented that if GAO is going to evaluate Agile programs, the only way to do that is with metrics. For example, you need metrics to determine how well a contractor is doing Agile and whether it is effective. You also need metrics that can be rolled up to the organizational level. He added that programs using automation will have access to numerous metrics that are generated automatically and can be downloaded for review. In his opinion, metrics need to be a huge focus of this guide, and they need to cover the whole life cycle of a project, from initial contract award through final deployment of all roadmap features. Neil offered to provide GAO with some common metrics to include and to show how many of these metrics can be rolled up and combined to create new measurements of what you want to know. Zach Cohn responded that he is leaning toward a chapter all about metrics, because it could go into a lot of detail about the various metrics that accompany Agile methods. In addition, a chapter dedicated to metrics is easier to find than a discussion interspersed among many chapters without much depth. He added that the chapter should conclude with the questions auditors should be asking. Examples of metrics include sprint duration and release cadence, program status information, and oversight metrics for leadership and auditors to determine whether a program is on a good trajectory. He warned that the guide should not dictate specific thresholds for any metrics. Instead, metrics should be collected on a regular basis and reviewed for trends over the last six months or so, to see whether things are improving or declining. Zach was also sensitive that the guide should not punish people for not yet reaching an ideal state as long as they are showing progress and trending
towards meeting agency goals. The guide should therefore have programs identify what an ideal state looks like, so teams can strive for those goals and there is less wiggle room for people to move the goalposts. One approach is to ask questions about progress in a way that makes it hard to misrepresent. For example, GAO auditors could ask how long it takes the team to deploy code into production, how long it takes to provision a server, which processes are automated versus manual, and so on. The idea is to see how quickly the team can provide an answer backed up by evidence. Auditors should also see trends showing deployment times decreasing and the amount of automation increasing. He added that things like Scrum or Kanban may wax or wane in popularity, but continuous integration, automated one-button deployments, and a high percentage of test coverage will always be important.
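The trend-over-snapshot review Zach describes might look like the following sketch (hypothetical; the metric name and monthly values are invented for illustration):

```python
# Sketch: checking whether a regularly collected metric is trending in the
# right direction over ~6 months, rather than judging a single snapshot.

def trend_slope(values):
    """Least-squares slope of a metric over evenly spaced observations."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Monthly deployment lead time in days; a negative slope means deployments
# are getting faster, the direction an auditor would want to see.
lead_times = [14, 12, 11, 9, 8, 6]
print("improving" if trend_slope(lead_times) < 0 else "not improving")
```

Reviewing the slope rather than any single month's value avoids punishing a program that has not yet reached an ideal state but is clearly moving toward it.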

Karen asked the attendees to share whatever metrics they are using, what each one tells them, which metrics they find most valuable, and which metrics they relay up to management and executive-level staff. With that information, GAO will compile a list of metrics to discuss at the next meeting. She suggested GAO could create a table of the metrics in this new chapter, with a detailed discussion and examples for each one. Neil Chaudhuri said that top-level questions should also be presented in the guide, such as, "How long did it take you to deploy your last feature to production?" While it may seem a simple question, it addresses more than one would think. For instance, where would one get this information? Who provided it? Has deployment time been shrinking or growing over time? If deployment time is trending badly, do you know where to look to find the cause? What about defects? This one question addresses the availability of the data, how to read and interpret it, and any actions taken to address problems. He noted that the question has nothing to do with the Agile methodology the teams use; it does not matter whether the team uses Scrum or Kanban or anything else. What is important is that if someone can answer the question capably, it implies the team is agile as intended, and the question conveys that without any need to get into overloaded terminology or Agile specifics.

Mat asked the group if anyone had collected metrics comparing Waterfall to Agile and whether any improvements had been seen by shifting to Agile. Anthony Burley said that it was also important to include qualitative questions in the guide, such as how happy the customer is now compared to several months ago. He felt it was important for auditors to also meet with customers to get their point of view, because just pulling metrics out of a VersionOne Agile tool is not as beneficial as also meeting with the product owners and having a conversation. His organization takes a more people-centered focus in measuring progress, asking product owners whether the services being delivered are meeting customer needs and whether they see quality in their Agile process.

Zach Cohn said that GSA’s 18F has been working with the State of California helping them to replace their outdated child welfare system. This work was originally planned to cost $500 million and expected to take 7-9 years before the new system would be available to users. The request for proposal (RFP) for this major overhaul was over 1,400 pages long. Due to the high cost and time to deliver, California decided to try Agile. 18F started working with that group about 16 months ago and helped them to break this massive list of requirements down into smaller, more feasible modules. California relied on modular procurement to get the work done and contracted the work with Agile software development
vendors. Now, less than a year and a half later, working software is already in production and delivering value. According to the original plan, the contract would only now be getting awarded. So, using modular procurement and Agile development, they are currently six years ahead of plan in terms of users getting value out of the software. Mat Bader asked the group whether they had similar experiences and, if so, whether they could send that information along with the metrics those programs were collecting. Examples like this could help GAO show quantitative benefits from implementing Agile methods.

Anthony Burley said to be wary of people gaming the numbers and skewing the data. Karen agreed, saying she had seen articles recommending against using velocity as a metric because the number can be manipulated: by inflating story point estimates, teams can look like they are improving. Auditors need to see a stable story point baseline so that the metrics reveal trends that can be acted upon.
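The velocity-gaming concern Karen describes can be made concrete with a small sketch (all numbers invented): re-estimating the same work with inflated story points raises measured velocity without any additional output, which is why a stable estimating baseline matters.

```python
# Sketch of why raw velocity is easy to game. Hypothetical numbers.

def velocity(points_completed_per_sprint):
    """Average story points completed per sprint."""
    return sum(points_completed_per_sprint) / len(points_completed_per_sprint)

honest = [20, 22, 18]               # stable estimating baseline
inflated = [p * 2 for p in honest]  # same work, every estimate doubled

print(velocity(honest))    # 20.0
print(velocity(inflated))  # 40.0 -- apparent "improvement" with zero added value
```

Because the inflated series describes exactly the same delivered work, only a consistent point baseline makes the velocity trend meaningful to an auditor.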

Jonathan Kiser said that Boeing needs to justify its budget every year, and since starting to use Agile, it has had to make big investments in training people. Top management started asking how much Agile was costing the company and what savings to expect. His team has had to continuously educate high-level management about Agile. He added that they do not look at just one metric but at a suite of metrics, because if they looked only at software lines of code, the team could be writing software like crazy yet not doing any peer reviews or unit testing. Therefore, they look at many metrics at once, such as defect containment, which allows problems to be discovered early and fixed on the spot rather than six months later. Doing nightly builds allows them to isolate problems, because you only have to look at what changed the day before to find what caused an issue. (Zach Cohn commented on a draft of the minutes that while nightly builds are better than monthly or annual builds, it is important for the guide to highlight continuous integration. With nightly builds, something can get committed that breaks the build, and you will not find out until the next day. With continuous integration, a commit that would break the build cannot move forward until it has passed testing, so the build does not break and the team does not have to spend time tracking down why a build broke and how to fix it.) Regarding the comments about mitigating risks with Agile, part of the problem is qualitative, which means you should constantly be asking whether you have built the right product and whether the amount of rework is decreasing. Jonathan offered to share the metrics they use.
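Zach's distinction between nightly builds and continuous integration can be sketched as a simple gating rule (hypothetical names; this illustrates the idea, not any particular CI tool):

```python
# Sketch of the CI gating rule: a commit whose tests fail never reaches
# the shared build, so the build cannot break. Names are invented.

def ci_merge(build, commit, tests_pass):
    """Continuous integration: only merge commits whose tests pass."""
    if tests_pass:
        build.append(commit)
        return True
    return False  # rejected before it can break the shared build

build = ["c1"]
ci_merge(build, "c2", tests_pass=True)   # accepted
ci_merge(build, "c3", tests_pass=False)  # kept out; no next-day surprise
print(build)  # ['c1', 'c2']
```

Under a nightly-build model, "c3" would have landed and the break would surface a day later; under CI, the gate keeps the shared build green continuously.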

Suzi Miller said the guide should discuss the before and after aspects of moving to Agile. She used the California example, where the program shifted away from insisting on 1,400 requirements that would have taken an estimated seven years to accomplish. By moving to Agile, the team broke the work down into bigger conceptual modules, with only near-term work having detailed requirements identified. The program is no longer the same once the decision was made to move to Agile, and you cannot compare the before and after because the program is now completely different. The 1,400 requirements may never all get done, so trying to compare the two approaches is an apples-and-oranges problem. As soon as the program moved to Agile, it became an orange, and there is no longer a practical way to compare it to a traditional apple. Doing work incrementally and iteratively is a completely new method that is not equivalent to Waterfall.

Page 16 Record of Interview. The mention by participants of a specific commercial product, service, or methodology is not to be construed as an endorsement of that product, service, or methodology by GAO or the U.S. Government.

Jerry Frese commented that while metrics are important, trends are even better and more meaningful to management. He said that IRS has been doing Agile for two years now and when they first started, it took 14 months to develop the first release and now it only takes about two months. They also have discovered that they find errors much faster than before. Finally, the skill level of all team members is increasing due to the cross-functional aspects and this even includes the auditors because they have to be smarter at evaluating whether things are getting better. Jerry said that everyone has to rise higher and perform better when doing Agile. No one can slack off for too long without the rest of the team confronting the person who is not pulling his or her own weight.

Crystal Taylor spoke next and said she was from the California Project Management Office that worked with Zach. She completely agreed with Jerry, saying that Agile metrics need to be iterated on and should be refined as they are collected. They are continually reassessing what they measure to see if they need to change anything. They focus on what they are measuring and whether it is valuable, along with examining the results. Neil Chaudhuri said this discussion reflects what Eric Ries calls the difference between vanity metrics and actionable metrics in his book The Lean Startup.1 These metrics are described really well on the TV show "Silicon Valley," where near the end of last season the developers were worried about downloads when daily active users were what really mattered most. He added that vanity metrics are numbers that make you feel good but do not actually tell you anything useful that can be acted upon.

Neil went back to the example about velocity, because it can be very subjective since every team has its own view of what a story point is. Comparing velocity could be dangerous because a team could be producing tons of code with many features while the code is buggy. As a result, velocity can create the illusion of productivity even though, because the quality is so low, the software keeps breaking. For this reason, Neil agrees with Jonathan that a suite of metrics, along with trends, is much more useful. Also, comparing velocity across teams is meaningless precisely because that metric is very team-specific. Agile literature is quite clear that velocity is intentionally vague: a way of estimating the "size" of features relative to each other as perceived by the team developing them. However, because story point values are typically drawn from fancy sequences like Fibonacci, people often mistakenly believe there is sophisticated math or an algorithm behind them. In reality, story points are simply (or not so simply, as it turns out) a loose way to estimate features that works better than hours, because you do not know the hours until it is time to build the features.
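Neil's caution about velocity can be made concrete with a small sketch. The point scale and the numbers below are illustrative assumptions, not data from any team discussed at the meeting.

```python
from statistics import mean

# A Fibonacci-style point scale, as mentioned above; the values
# express relative size as perceived by one team, not hours or quality.
STORY_POINT_SCALE = (1, 2, 3, 5, 8, 13)

def velocity(points_completed_per_sprint):
    """Average story points a single team completes per sprint.

    Useful for tracking that one team's trend over time; meaningless
    for comparing teams, since each team calibrates its own scale."""
    return mean(points_completed_per_sprint)

team_a = velocity([21, 18, 24])  # Team A's trend: about 21 points/sprint
team_b = velocity([55, 60, 50])  # Team B is not "2.5x better"; its
                                 # point scale is simply different
```

As the discussion notes, the only safe use of such a number is watching one team's own trend, ideally alongside quality metrics such as defect counts.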

Neil further stated that automation is a great way to collect data because the data are generated in a systematic way, unlike story points, which are estimated by teams based on their unique point of view. You can go look at the Agile tool and see what everyone has committed to doing for an iteration, and that is an objective metric. DevOps metrics are all easily obtained through automation and are not subjective. In summary, GAO needs to say that metrics are important to collect, but the metrics need to be actionable and not vanity metrics. Neil agreed to send GAO a great deal of information about metrics.

1 http://theleanstartup.com/principles

Mat changed the subject to whether piloting Agile on small projects first is a best practice, because it gives people time to figure out how they are going to shift to this new method, versus a whole organizational rollout as a result of a policy change. He asked the experts to provide their opinions on small-scale Agile implementations eventually building up to a whole organization embracing this method. Trisha Hall said she is a huge proponent of piloting first because every agency and organization has a different culture that takes time to change. To be successful, Agile needs to fit an organization, and you may have to try several different things before you know what is going to work well. Piloting allows agencies to see how Agile could be incorporated and where flexibility is needed within a specific framework. Once you try it on one program, you can move to another program, and eventually you can try Agile at a portfolio level before you mandate Agile across an organization. Starting small and adding more programs allows you to help the culture change and filter out what works and what does not. Teams are empowered to say what works best for them, and that control makes the change easier for those involved.

Suzi Miller said she could send excerpts from her book on this subject that recommend two types of pilots, elaborating on what Trisha discussed today: technical feasibility pilots and adoption feasibility pilots. Technical feasibility addresses whether a team can even do Agile successfully, meaning the necessary changes are in place, the contract is amenable to Agile principles, and so on. Adoption feasibility has to do with the culture and with being able to do Agile methods repeatedly. The technical changes are much easier than the adoption changes because people tend to resist change. Adoption feasibility addresses training aspects, the kinds of measures needed, policy changes and waivers, and other things that will enable Agile to really take root. She mentioned an example in the Capability Maturity Model Integration (CMMI) Survival Guide; the book does not discuss Agile, but many of its concepts still apply to this subject.

Zach agreed with Suzi, saying that it is not logistically feasible to do Agile in one fell swoop, as there are just too many people who have been doing things a certain way for too long. It takes time for people to buy into a new process, and it is important to have empathy for those going through these changes and to assume that each one has the best intent to do good work, although it may take time for the team to gel and become productive. It will also take a significant amount of time to work with and convince every part of the organization, like legal, finance, and IT, to let the team do things differently. In his opinion, trying to adopt Agile universally presents excessive risk. As a result, they push agencies to pilot first. Trisha agreed with Zach, saying that teams piloting Agile should be held in positive regard even if they make many mistakes at first, because what they are doing is not easy. They have to deal with financial and contracting staff saying they cannot do things differently, and there are several battles up front. It is important for auditors to understand this tension and how it takes time to break down these barriers before Agile can work well within an organization.

Trey Wiesenburg said you absolutely have to allow for piloting when transitioning to Agile. Too often, management is resistant to change, and you need to start with the federal development staff initially on internal legacy systems that they are already very familiar with to see if Agile will even work. Top-down management tends to be very risk averse and will not allow Agile to be used on mission-critical systems until they see the benefits from it on a smaller level and can gain confidence in the outcomes. In addition, support structures need to be in place from functional areas like contracting, legal, etc., before you can go organization wide with Agile. Suzi Miller said that allowing people to make incremental improvements, rather than demanding that everything change all at once, is actually quite Agile.

Mat brought up the topic of delegation of authority and trying to make decisions at the lowest possible level. Anthony Burley said that in order to do Agile you need someone to be a champion and say that traditional governance artifacts will not work with Agile because people have to be free to pick what they are going to do. That is a scary concept for some government folks. Trisha responded that you need an executive sponsor who allows teams the freedom to do things differently during a pilot. However, there must be a stipulation that whatever work they are doing has to track back to a strategic objective and a budget line item so you can account for the funding needed. Executive sponsorship is key because it provides the team top cover to make changes and not follow all of the rules. When you scale Agile across an organization, you end up funding teams rather than projects, and that is a new thing for agencies to accept. However, you still track how much is being spent and what you are getting in return. Creating an enterprise clarity room can help track features and epics to strategic objectives, making it much easier to assign business value to individual pieces of work. You can also add monetary values, and management can rank the work according to risk, revenue, money saved, and so on. All work should track back to these items so that when a feature is done you can quantify its value. When you have all of these things in place, it is very easy to govern an Agile program and make decisions.

Jerry Frese said that during the web apps development the commissioner was very concerned about public opinion and wanted to be involved in all of the decisions before anything went into production. When they started shortening the frequency of delivery, management started to grumble. When they shortened the delivery cycle from nine weeks to just three weeks, management said they were crazy because they could not even get together every three weeks. At this point, it became very clear to management that they could not be involved in every decision when it came to Agile, but the point was that management came to this conclusion by themselves and the team did not have to argue with them about it. Crystal Taylor said that delegation of authority with governance is good because it recognizes the organization's comfort level and provides a kind of readiness assessment. Jerry Frese said that they called those things guardrails, and as long as teams stay within the boundaries, they are fine. Over time, the teams tend to widen the guardrails as they earn more autonomy.

Suzi Miller said that teams should have authority to make decisions about anything that affects only them. However, if a decision affects the definition of done then you need to reach out to whomever it affects. DevOps has a major advantage in this area due to the automated governance aspect that allows code to go into production if it passes certain tests without any errors. DevOps vastly streamlines decision making because you have automated mechanisms and criteria in place.
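Suzi's point about automated governance can be sketched as a release gate whose criteria are encoded rather than decided in a meeting. The specific criteria and the 80% coverage floor below are illustrative assumptions, not thresholds discussed by the participants.

```python
def release_gate(tests_passed: bool, coverage: float,
                 open_critical_defects: int) -> bool:
    """Automated governance in a DevOps pipeline: code is promoted
    to production only when every predefined criterion is met, so
    no manual sign-off is needed for routine releases.

    The 80% coverage floor is an illustrative threshold only."""
    return (tests_passed
            and coverage >= 0.80
            and open_critical_defects == 0)

ok = release_gate(tests_passed=True, coverage=0.91,
                  open_critical_defects=0)       # promoted
blocked = release_gate(tests_passed=True, coverage=0.72,
                       open_critical_defects=0)  # held back
```

Because the criteria are explicit and machine-checked, the decision to deploy is both fast and auditable, which is the streamlining Suzi described.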

Neil Chaudhuri added that the Agile program management office should ensure that the team builds the right thing. They can do this by focusing on the minimum viable product (MVP) and how users react to it. The team takes the users' feedback and either pivots or perseveres with its work, depending on what the users think and what they actually want. Trisha Hall wanted to make sure that governance includes a regular cadence of demos with the customers along with retrospectives. By having a demo at the end of each quarter where the executives can see what the team has been doing and getting feedback from them, the team can ask management whether it is still providing value. If so, the team can continue along the same path and continue planning the next iteration. Neil added by email that while it is perfectly fine to have a grand demo for executives on a quarterly basis, we should keep in mind that the Scrum process calls for a "Sprint Review" at the end of each sprint (typically every two weeks). This review should include all kinds of metrics, but also a demo. It certainly would be acceptable to have the executives show up only for the demo at the end of the release rather than the more frequent demos at Sprint Review, but there should be key stakeholders at the "mini-demos" too. Trisha said that it is important not only to do these reviews after every sprint, but also to do them with executive involvement for every release at the end of the quarter. Doing so allows executives to confirm that the team is working on the right features. Having a cadence of executive-level reviews like this ensures quick course corrections will take place for the next release, and the feedback provided will be reflected in adjustments to the plan for the next release. Therefore, adding executive involvement at release reviews into governance will help ease concerns.

Tim Persons asked Trisha what can be done when the customer does not know what they really want; if you show them a demo they will always find something wrong with it, or they may ask for new features without realizing the disruption this will cause. While Agile is all about pleasing the customer, sometimes the team will have to push back. Trisha responded that having a conversation helps to clear up some misconceptions. For example, if a customer says they want something to be blue and the team gives them blue, but their response is that they think green would be better, the team needs to ask them why. This way, the customer is not just barking out orders that the team must follow. They are having a two-way conversation that helps the team understand what needs to be changed and how it ties to business value. Trisha added that you need to have solid acceptance criteria and a clear definition of done, both of which a good product owner can help write, along with business analysts who can help groom the backlog.

Bob Schatz said that the team is not doing everything that the customer wants; they are focused on the biggest thing that the customer needs, including understanding why there is a problem that should be solved. It helps to have a conversation about what does not work and how to make adjustments to reach a certain goal. Tim Persons added that GAO evaluates programs all of the time, and sometimes programs get in trouble because instead of ordering the cake, they are saying, "here is the recipe for baking my cake." In a sense, the government is presupposing what the answer is even though it does not know what it wants. Trisha Hall said that is why sprint reviews are important: they help the team refine requirements. Zach Cohn said that at the end of the day it all comes down to asking customers what they want, which is really an art. Therefore, it is better to show the customer what you built often, so if it is not right, you fail early and do not end up too far down the wrong path.

Neil Chaudhuri said that the MVP helps to minimize risk because you are showing people features before you have actually committed a lot of resources to development. Rohit Gupta said that teams can use the roadmap to keep iterating and getting feedback, but he finds that some customers keep saying they are not happy no matter what the team does. Trisha said that it is not the customer's job to tell you what they want; it is their responsibility to tell you what the problem is, and the team has the flexibility to figure out a solution. Neil said that sometimes it is not even as simple as a customer wanting blue and then changing their mind to green. It is more often the case that the customer did not even know that fuchsia was a color, and when they find out, they discover that they really love fuchsia. The customer could never have imagined what could possibly be done. Agile processes enable customers to open their minds to completely new possibilities because the development team has the creativity to solve the problem in ways the customer could never think of in the first place. He added that the idea of using an MVP to get user feedback on whether the team is on the right track and therefore should pivot is actually more of a Lean concept than an Agile one.

Bob Schatz sent the following comment after the meeting:

Great session today. I just got this in my email and thought it would be great info for the metrics discussion. There are many others out there that I am sure we can draw on. I have a bunch of stuff to send you. https://martinfowler.com/articles/useOfMetrics.html

Closing Remarks and Next Expert Meeting Date

Mat thanked everyone for their time and a great discussion, and said that GAO would compile the meeting minutes within a few weeks and send them out to the participants for their review and comment first, before sending the minutes out to the rest of the distribution list. He reminded everyone to send GAO their metrics and why they think they are beneficial. He also asked the participants to send any qualitative metrics or anecdotal examples that could be useful for the guide. Mat added that GAO would soon send out the preface, Chapter 1 (Background), and a full Chapter 3 (Agile Adoption best practices) for review and comment. Everyone should keep in mind that GAO has received several hundred comments on these early chapters that have not yet been vetted or incorporated. However, in the spirit of Agile, we are sending out a more complete set so readers can see how the guide is taking shape.

The date for the next expert meeting has already been scheduled: August 24, 2017, from 2:00 to 4:00 p.m. Eastern Time. GAO will send out more chapters for review prior to this next meeting. Please mark your calendars. An invite with the agenda will be sent two weeks prior.

List of Attendees

The following is a list of who attended the May 4, 2017 meeting.

Last Name First Name Agency Phone Number Email
Affo Harold Prometheus Computing LLC 240-898-8391 [email protected]
Matthew GAO 202-512-3874 [email protected]
James DOD [email protected]
Jen GAO 202-512-7330 [email protected]
Laura CGI Federal 760-814-1410 [email protected]
Brian GAO 202-512-4969 [email protected]
Anthony DOJ 202-616-1578 [email protected]
Ed CGI Federal 703-861-4894 [email protected]
Neil Vidya LLC 703-785-8855 [email protected]
Zachary Consultant 410-370-8000 [email protected]
Chris DAU 256-922-8765 [email protected]
Juana GAO 202-512-3024 [email protected]
Debra LMI 210-526-8101 [email protected]
Ty DOE [email protected]
Jerome IRS 240-613-4671 [email protected]
Rohit Artemis Consulting 703-598-0077 [email protected]
Trisha Agile Transformation Inc. 402-507-6223 [email protected]
Jerry Navy SPAWAR [email protected]
Michael GAO 214-777-5686 [email protected]
Matthew Treasury 202-649-6547 [email protected]
Jonathan Boeing 714-616-0187 [email protected]
Pete MITRE 443-742-7399 [email protected]
Phil Leidos 703-336-2328 [email protected]
Omar Cask LLC 714.421.1231 [email protected]
Suzanne SEI [email protected]
Manik NSF 703-292-4213 [email protected]
Sherli NARA 301-837-1896 [email protected]
Tim GAO 202-512-6522 [email protected]
Karen GAO 202-512-4784 [email protected]
Bob Agile Infusion, LLC 215-435-3240 [email protected]
Deepak California State University - Fullerton 657-278-3450 [email protected]
Maria DOC 301-713-0262 [email protected]
Zack GSA 202-412-0231 [email protected]
Martin GAO 202-512-4915 [email protected]
Myke IBM 262-825-8442 [email protected]
Crystal State of California 916-403-9611 [email protected]
Jim OMB 202-395-2181 [email protected]
Dane Excella Consulting 571-289-0000 [email protected]
Trey Education 202-377-3510 [email protected]
Tom GAO 202-512-5007 [email protected]
