Measurement Standards for System –ilities
(or is it System ill-at-ease?)
Marc Jones, Federal Outreach, CISQ
Bill Curtis, Director, CISQ
www.it-cisq.org
The ‘ill-at-ease’ Premise
In his keynote, ‘Balancing a System’s –ilities’, Barry Boehm stated:
– “Adding a functional requirement will generally produce an additive effect on the project’s budget and schedule, while adding an –ility, such as reliability, availability, scalability, usability, interoperability, secureability, adaptability, or affordability, will generally produce a system-wide multiplicative effect.”
How do we measure the –ilities?
– Gut feel … whose gut?
– ISO 25023 or 9126 … too conceptual
– Past operational behavior … what about lurking monsters?
We need strong measures of –ilities!
– Quality assurance and benchmarking
– System risk analysis
– Contract SLAs and acceptance criteria
CISQ History and Objectives
• Consortium for IT Software Quality mission:
– Define automatable measures for software characteristics
– Deploy as OMG Approved Specifications
– Influence public policy on software quality
– Become an industry resource on structural quality
• Launched by SEI & OMG in Executive Forums held in Frankfurt, Germany; Washington, DC; and Bangalore, India
• Participants include the co-founders, IT executives, technical experts, and CISQ sponsors (initial members and current sponsors)
CISQ & OMG Standardization Process
CISQ Executive Forum → CISQ specifications for automated measures (Automated Function Points; Reliability; Performance Efficiency; Security; Maintainability) → OMG Approved Specifications → ISO fast track, supported by OMG deployment workshops
Automated FP Specification
• OMG Approved specification for Automated Function Points
• Mirrors IFPUG counting guidelines, but is automatable
• Specification developed by an international team led by David Herron of the David Consulting Group

ISO 25010 and CISQ –ility Measures
• ISO 25010 was the starting point for CISQ work
– 25010 defines quality characteristics and sub-characteristics
– CISQ defined quality attributes and measurable elements
– CISQ sponsors selected 4 quality characteristics for definition
CISQ Security Measure
Objective: Develop automated source code measures that predict the vulnerability of source code to external attack. The measure is based on the Top 25 of the Common Weakness Enumeration (CWE).
Team Lead: Robert Martin, MITRE, supported by the US Dept. of Homeland Security.
CISQ Automated Quality Metrics
CISQ Quality Characteristic measures are based on ISO 25010:
• Quality Issues are defined as the root causes of problems in each quality characteristic domain
• Quality Rules and Measurable Elements are defined that address these issues for each quality characteristic
• For each CISQ measure, the result is the number of rule violations in the source code
(Partial example: the derivation of measurable elements aggregated into the CISQ Reliability measure.)
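Since each CISQ measure reduces to a count of rule violations, the aggregation can be sketched in a few lines. The two rules below (a bare exception handler as a Reliability issue, and an oversized function as a Maintainability proxy) are illustrative stand-ins, not actual CISQ Quality Rules:

```python
import ast

def count_violations(source: str) -> dict:
    """Count rule violations per quality characteristic in Python source.

    The rules here are hypothetical examples; a real CISQ measure
    aggregates many rules per quality characteristic.
    """
    tree = ast.parse(source)
    counts = {"Reliability": 0, "Maintainability": 0}
    for node in ast.walk(tree):
        # Reliability rule: exception handler that silently swallows errors.
        if isinstance(node, ast.ExceptHandler):
            if len(node.body) == 1 and isinstance(node.body[0], ast.Pass):
                counts["Reliability"] += 1
        # Maintainability rule: oversized function (crude complexity proxy).
        if isinstance(node, ast.FunctionDef) and len(node.body) > 50:
            counts["Maintainability"] += 1
    return counts

sample = """
def f():
    try:
        risky()
    except Exception:
        pass
"""
print(count_violations(sample))  # {'Reliability': 1, 'Maintainability': 0}
```

The measure itself is then just the violation total (or a score derived from it, normalized by application size).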
Measuring –ilities by Violations
Structure of ISO 25023 measures, with examples incorporated into CISQ measures:
– Software Quality Characteristics (e.g., Maintainability)
– Quality Sub-Characteristics (e.g., Modularity, Changeability)
– Software Quality Attributes
– Quality Measure Elements
– Quality Rule Violations (e.g., dead code, copy/paste, excessive coupling)
System-Level –ilities Matter
Unit-level flaws account for 92% of all defects and 48% of total repair effort, but only 10% of downtime. System-level flaws are just 8% of all defects, yet they consume 52% of total repair effort and cause 90% of downtime.
A software risk prevention process need not be compromised to accelerate the test/release cycle:
– Focus on the critical violations that matter
– Focus resources on the areas of highest impact, not the highest number of flags
“Tracking programming practices at the Unit Level alone may not translate into the anticipated business impact, … most devastating defects can only be detected at the System Level.” – OMG
Use –ility Measures as Acquisition SLAs
CISQ measures can be used in contracts to establish objective, measurable agreements on quality priorities that comply with industry standards.
What is the “Rea-lity”?
Industry-leading repository on structural quality:
– 745 applications
– 160 companies, 14 countries
– 321,259,160 lines of code; 59,511,706 violations
Sectors represented: Financial, Insurance, Government, Retail, IT, Telecom, and Other.
Estimated Technical Debt by –ility
– Transferability: 40%
– Changeability: 40%
– Robustness: 18%
– Security: 7%
70% of technical debt is in IT cost (Transferability, Changeability); 30% of technical debt is in business/mission risk (Robustness, Performance, Security).
–ilities Consistent Across Technologies
The proportion of technical debt by health factor is reasonably consistent across technologies.
–ilities: Measure Often, Minimize Rework
Unfixed defects compound across releases: Release N ships with unfixed defects, so Develop N+1 must carry Rework N alongside new work; Release N+1 then adds its own unfixed defects, and Develop N+2 carries both Rework N and Rework N+1.
Productivity is often overstated because quality-related rework is measured as new work.
Recommendations for Acquisition
1. Set structural quality objectives
2. Set compliance targets
3. Measure unacceptable violations
4. Use a common vocabulary
5. Measure quality at the system level
6. Define roles and responsibilities
7. Follow a measurement process
8. Evaluate contract deliverables
9. Use rewards and penalties wisely
1. Set Structural Quality Objectives
Set quality priorities (Reliability, Performance, Security, Maintainability) per system, weighted by whether the system is mission-facing or internal support. In the example, an online system spanning logistic information, online fulfillment, and delivery logistics is assigned per-area targets such as:
– Reliability 3.5, Performance 3.5, Security 3.9, Maintainability 2.5
– Reliability 3.8, Performance 3.9, Security 2.5, Maintainability 3.0
– Reliability 3.5, Performance 3.0, Security 3.9, Maintainability 2.5
2. Set Compliance Targets
Set a quality score target by release for each quality characteristic. In the example chart, Reliability, Performance, Security, and Maintainability scores are targeted to rise across releases on a scale from 2.5 to 3.9.
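The threshold and target scores drive a simple acceptance decision at each release. A minimal sketch of such an SLA gate, with invented threshold values rather than figures taken from the chart:

```python
# Illustrative SLA gate. The per-characteristic thresholds below are
# invented for the example; real values would come from the negotiated SLA.
MINIMUM_THRESHOLD = {"Reliability": 3.1, "Performance": 2.9,
                     "Security": 3.5, "Maintainability": 2.5}
EXPECTED_TARGET = {"Reliability": 3.7, "Performance": 3.3,
                   "Security": 3.9, "Maintainability": 3.1}

def evaluate_delivery(scores: dict) -> str:
    """Accept, accept with an action plan, or reject a delivery."""
    if any(scores[c] < MINIMUM_THRESHOLD[c] for c in MINIMUM_THRESHOLD):
        return "reject"            # below minimum: rejection or penalties
    if all(scores[c] >= EXPECTED_TARGET[c] for c in EXPECTED_TARGET):
        return "accept"
    return "accept-with-plan"      # met minimum; plan toward the target

print(evaluate_delivery({"Reliability": 3.8, "Performance": 3.4,
                         "Security": 3.9, "Maintainability": 3.2}))  # accept
```

The three-way outcome mirrors the vocabulary defined later in the deck: minimum threshold for acceptance, expected target to be achieved and sustained over releases.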
3. Measure Unacceptable Violations
Analyze the application (evaluating architectural and coding rules, plus application meta-data) in three steps:

Parse application layers. Supported technologies include Oracle PL/SQL, Sybase T-SQL, SQL Server T-SQL, IBM SQL/PSM, C, C++, C#, Pro*C, COBOL, CICS, Visual Basic, VB.Net, ASP.Net, Java/J2EE, JSP, XML, HTML, JavaScript, VBScript, PHP, PowerBuilder, Oracle Forms, PeopleSoft, SAP ABAP/NetWeaver, Tibco, and Business Objects, with a universal analyzer for other languages.

Detect violations, for example: expensive operation in a loop; static vs. pooled connections; complex query on a big table; large indices on a big table; empty CATCH block; uncontrolled data access; poor memory management; opened resource not closed; SQL injection; cross-site scripting; buffer overflow; uncontrolled format string; unstructured code; misuse of inheritance; lack of comments; violated naming conventions; highly coupled components; duplicated code; index modified in a loop; high cyclomatic complexity.

Measure quality on the health factors: Transferability, Changeability, Robustness, Performance, and Security.
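A production analyzer parses each layer into a full syntax model; a toy textual probe for just one rule from the list, SQL built by string concatenation (a classic SQL-injection risk), gives the flavor. The regex and sample lines below are illustrative only and would miss many real cases:

```python
import re

# Rough textual probe for one rule from the list above: a SQL statement
# literal immediately concatenated with a variable. A real static analyzer
# works on a parsed syntax model, not on regexes over raw text.
SQLI_PATTERN = re.compile(
    r'"\s*(SELECT|INSERT|UPDATE|DELETE)\b[^"]*"\s*\+', re.IGNORECASE)

def flag_sql_concat(lines):
    """Return (line_number, line) pairs that concatenate SQL with variables."""
    return [(i, ln.strip()) for i, ln in enumerate(lines, 1)
            if SQLI_PATTERN.search(ln)]

java_snippet = [
    'String q = "SELECT * FROM users WHERE name = \'" + userName + "\'";',
    'PreparedStatement ps = conn.prepareStatement(SAFE_QUERY);',
]
print(flag_sql_concat(java_snippet))  # flags line 1 only
```

Counting such hits per health factor is what turns individual rules into the aggregate quality scores used in the SLA.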
4. Use a Common Vocabulary
Software Quality: measured by the CISQ Quality Characteristic measures for Reliability, Security, Performance Efficiency, and Maintainability.
Application Size: measured by the CISQ specification for Automated Function Points; the work performed on an application is measured by the CISQ Enhanced Function Point measure.
Violation: application source code that violates a rule incorporated into one of the four CISQ Quality Characteristic measures, or a coding standard incorporated into an SLA.
Unacceptable Violations: violations specified in an SLA that must not exist in delivered code; all unacceptable violations must be remediated before delivery will be accepted.
Minimum Threshold: the minimum score established in an SLA that must be attained on one or more CISQ Quality Characteristic measures for a delivery to be accepted; failure to achieve the minimum threshold can result in rejection of the delivery or financial penalties.
Expected Target: the score established in an SLA on one or more CISQ Quality Characteristic measures that the supplier should seek to achieve and sustain over releases; a schedule over several releases may be established for achieving the expected target.
5. Measure Quality at the System Level
Quality must be assessed at three levels:

1. Unit Level (developer level): code style and layout, expression complexity, code documentation, class or program design, basic coding standards.

2. Technology Level (development team level): a single language/technology layer (e.g., Java, JSP, ASP.NET, APIs): intra-technology architecture, intra-layer dependencies, inter-program invocation, security vulnerabilities.

3. System Level (IT organization level): spans technologies and layers (e.g., EJB, Hibernate, Spring, Struts, .NET, PL/SQL, T-SQL, Oracle, SQL Server, DB2, Sybase, COBOL, IMS, messaging), covering architecture compliance, data flow, and transaction risk: integration quality, risk propagation, application security, resiliency checks, transaction integrity, function point and effort estimation, data access control, SDK versioning, and calibration across technologies.
6. Define Roles and Responsibilities
Customer:
– For each application under contract, develop SLAs in consultation with the supplier that clearly define the minimum threshold and expected target of measures, as well as all unacceptable violations that may not be in the source code at delivery.
– In the SLAs, clearly define the measures to be used in assessing these thresholds and targets and how the scores will be determined.
– Conduct a defined analysis and measurement process when code is delivered, and provide a report to the supplier that clearly describes the results and any actions expected of the supplier.
– Host a post-analysis meeting with a supplier representative to discuss measurement results and plan for future actions and expectations regarding application size and structural quality.
Supplier:
– Make all developers, testers, and other staff involved with the application aware of the SLA requirements and how they are to be achieved and sustained.
– For each application, commit to meet the minimum threshold and provide an action plan for achieving the expected target scores defined in the SLAs.
– Conduct quality assurance activities during the development period to ensure that SLA agreements will be achieved.
– Upon request, deliver the whole source code of the application, including SQL structure definition files, to support an application audit by the customer, even if the new release does not modify all components of the application.
– Meet with the customer to discuss the results of analysis and measurement activities, identify any remedial action required immediately, and plan for future actions and expectations regarding application size and structural quality.
7. Follow a Measurement Process
Establish a baseline → set thresholds and targets → monitor and review results → determine remediation → update SLAs.
–ilities in a Mission-Critical Portfolio
Client study over 24 months: system test defects (0 to 3,500 per release) and new critical violations tracked across releases R1 through R14E of four applications: Order Management, Inventory Management, Billing, and Customer Service. The chart contrasts the period before structural quality analysis and management with the releases after the CAST analysis starting point: measured –ilities in a complex, enhancement-heavy environment.
–ilities Reducing Maintenance Effort
Correlation of maintenance effort with incident tickets across 20 customers of a global system integrator: an increase in TQI of 0.24 corresponds to a 50% decrease in maintenance activity (R² = 0.3365).
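The reported fit is an ordinary least-squares regression of maintenance activity on TQI. A sketch with synthetic data (the study's actual 20-customer data are not reproduced in this deck):

```python
# Simple linear regression from scratch; the (tqi, tickets) points below
# are synthetic stand-ins for the study's data, chosen only to show the
# expected downward trend.
def linear_fit_r2(xs, ys):
    """Return (slope, intercept, R^2) for a least-squares line fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

tqi = [2.6, 2.8, 3.0, 3.1, 3.3, 3.5]        # synthetic TQI scores
tickets = [420, 390, 310, 340, 250, 230]    # synthetic incident-ticket counts
slope, intercept, r2 = linear_fit_r2(tqi, tickets)
print(slope < 0, 0 <= r2 <= 1)  # negative slope: higher TQI, fewer tickets
```

As with the study's modest R² of 0.3365, a real fit explains only part of the variance; the slope direction, not the exact line, carries the message.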
Join: www.it-cisq.org (free membership)
Marc Jones, Federal Outreach, CISQ
[email protected], 703 863 9908
Dr. Bill Curtis, Director, CISQ