A Case Study on Effectively Identifying Technical Debt

Nico Zazworka
Rodrigo O. Spínola
Antonio Vetró
Forrest Shull
Carolyn Seaman

EASE 2013, Porto de Galinhas
Abstract
A study focused on the identification of Technical Debt
Compared human elicitation of Technical Debt with tool-assisted identification
Results revealed that there is little overlap in the Technical Debt reported by different members of a development team
Results also showed that tools can identify some of the same types of Technical Debt that human developers consider important, but not all
Contribution to the Technical Debt Landscape.
What is Technical Debt?
Imperfections in a software system…
That were caused by lack of time…
And that run the risk of causing higher future maintenance cost

Examples:
Classes that need refactoring (design debt)
Known defects that were never fixed (defect debt)
Requirements that were only partially implemented (requirement debt)
Inadequacies in the test suite (testing debt)

Some debt is obvious and explicit; other types are hidden and need to be identified, or detected.
Current research focuses on techniques for identifying and managing Technical Debt
This study focuses on identification
Research Goal and Questions
Goal: understand the human elicitation of Technical Debt and compare it to automated Technical Debt identification

Research Questions:
Do the Technical Debt identification tools find issues that are similar to or different from those reported by developers?
How much overlap is there between the Technical Debt items reported by different developers?
How hard is the Technical Debt item template to fill in?
Study Procedure
Group of 5 developers working on a small (25 KLOC) database-driven web application for the sea transportation domain

We asked them:
If you were given a week to work on this application, and were told not to add any new features or fix any bugs, but only to address Technical Debt, what would you spend your time on?

Resulted in 21 Technical Debt items, documented using a template
At the same time, we ran an automated static analysis tool, a code smell detector, and a metrics calculator on the current code base
Compared the results
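To make the tool side of the procedure concrete, here is a minimal sketch of metric-based code smell detection of the kind a code smell detector or metrics calculator performs. It is not the tool used in the study; the metric names and thresholds are illustrative assumptions, flagging oversized "god class" candidates as potential design debt.

```python
# Illustrative sketch (not the study's actual tool): flag classes whose
# size metrics exceed thresholds, a common heuristic for "god class"
# design debt. Metric names and thresholds are assumptions.

def find_god_classes(class_metrics, max_methods=20, max_loc=500):
    """class_metrics maps class name -> (method_count, lines_of_code)."""
    flagged = []
    for name, (methods, loc) in class_metrics.items():
        if methods > max_methods or loc > max_loc:
            flagged.append(name)
    return flagged

# Hypothetical measurements for two classes in a code base.
metrics = {
    "OrderManager": (34, 812),  # large class, many responsibilities
    "DateUtils": (6, 120),      # small utility class, unflagged
}
print(find_god_classes(metrics))  # -> ['OrderManager']
```

Real detectors combine several metrics (coupling, cohesion, complexity), but the principle is the same: thresholds over measurements, which is why such tools surface design and defect debt more readily than, say, documentation debt.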
Technical Debt Template

ID: TD identification number
Responsible: Person or role who should fix this TD item
Type: design, documentation, defect, testing, or other type of debt
Location: List of files/classes/methods or documents/pages involved
Description: Describes the anomaly and possible impacts on future maintenance
Estimated principal: How much work is required to pay off this TD item, on a three-point scale (High/Medium/Low)
Estimated interest amount: How much extra work will need to be performed in the future if this TD item is not paid off now, on a three-point scale (High/Medium/Low)
Estimated interest probability: How likely it is that this item, if not paid off, will cause extra work to be necessary in the future, on a three-point scale (High/Medium/Low)
Intentional?: Yes/No/Don't Know
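The template above is essentially a record type, and can be sketched as one. The field names mirror the template; the types and the example values are illustrative assumptions, not data from the study.

```python
# Sketch of the Technical Debt item template as a record type.
# Field names follow the template; example values are invented.
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional

class Scale(Enum):
    """Three-point scale used for principal and interest estimates."""
    LOW = "Low"
    MEDIUM = "Medium"
    HIGH = "High"

@dataclass
class TechnicalDebtItem:
    id: str
    responsible: str                       # person or role who should fix it
    debt_type: str                         # design, documentation, defect, testing, other
    location: List[str]                    # files/classes/methods involved
    description: str                       # anomaly and its maintenance impact
    estimated_principal: Scale             # work to pay off the item
    estimated_interest_amount: Scale       # extra future work if not paid off
    estimated_interest_probability: Scale  # likelihood of that extra work
    intentional: Optional[bool] = None     # Yes / No / Don't Know

# A hypothetical filled-in item:
item = TechnicalDebtItem(
    id="TD-07",
    responsible="backend developer",
    debt_type="design",
    location=["ShipmentDao.java"],
    description="Duplicated query logic across DAO classes",
    estimated_principal=Scale.MEDIUM,
    estimated_interest_amount=Scale.HIGH,
    estimated_interest_probability=Scale.HIGH,
)
```

Note that principal and interest are captured only on a coarse High/Medium/Low scale; even so, developers in the study found these the hardest fields to fill in.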
Results I
Results II
Discussion
Tools would have facilitated the discovery of all the important defect debt and about half of the design debt, as reported by developers
Tools would not have found any of the other types of debt deemed important by developers
Developers reported spending a reasonable amount of time documenting the Technical Debt items, but found principal and interest hard to assess

Bottom line: no silver bullet. We need a variety of tools and humans:
To identify Technical Debt
To interpret Technical Debt
To decide what's important
Conclusion
A study investigating and comparing different approaches to identifying Technical Debt:
Tools: automated static analysis (ASA), code smells, and metrics
Human elicitation: developers reporting the most important instances

Tools can't replace humans, but may find things that humans miss
Aggregation, not consensus, is an appropriate approach to combining the Technical Debt items reported by different developers

Next step: a focus group with the developers to get their feedback on the results
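The aggregation-versus-consensus point above can be sketched with set operations: consensus keeps only items every developer reported (which the overlap result shows is nearly empty), while aggregation pools every item anyone reported. The item descriptions below are invented for illustration, not taken from the study.

```python
# Contrast consensus (intersection) with aggregation (union) when
# combining TD items reported by several developers. Items are invented.
reports = {
    "dev_a": {"refactor DAO layer", "missing unit tests"},
    "dev_b": {"refactor DAO layer", "outdated documentation"},
    "dev_c": {"missing unit tests", "hard-coded configuration"},
}

consensus = set.intersection(*reports.values())  # items all three reported
aggregation = set.union(*reports.values())       # items any of them reported

print(sorted(consensus))  # -> [] : no item is reported by everyone
print(len(aggregation))   # -> 4 distinct items survive when pooled
```

With little overlap between individual reports, consensus would discard almost everything, which is why pooling the items is the more sensible way to build a team-level Technical Debt list.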
Thank you!
Questions?