Wilhelm-Schickard-Institut - Technische Informatik
Dr. Jürgen Ruf
Introduction
Software Quality In Industry and Practice
© 2012 University of Tuebingen 2 | Jürgen Ruf
Outline
Organization
Motivation
Introduction & Preliminaries
Overview
Organization
This Course
• Applied Computer Science (Praktische Informatik)
• Credits: 4 LP (class + exercises)
• SWS: 2 + 1
• Exam: oral or written
• Exercises: every week, theoretical or hands-on
• Lecturers:
  - Dr. Jürgen Ruf, Verification Engineer and Team Lead at IBM
  - Patrick Heckeler
  - Stefan Huster
  - Sebastian Burg
• Related Seminar in summer term (associated topics)
Exercises
• Register at CIS: https://cis.informatik.uni-tuebingen.de/sq-ws-1213/ by 1.11.2012
  - Download weekly exercise sheets
  - News about lecture and exercises
• Weekly exercise sheets
  - 20 points per sheet
  - 50% of all points required to pass
  - Corrected by tutor Fotios Hatziioannidis
  - Return digitally in the CIS system
  - Groups of 2 students
• Pool room B203 in Sand 13
  - Send your student ID-card number to [email protected]
Training group
• Weekly training group
  - Compulsory attendance
• Possible dates
  - Monday: 2 pm – 3 pm, 3 pm – 4 pm (after 6 pm)
  - Wednesday: whole day possible
  - Thursday: 10 am – 11 am, 11 am – 12 pm
  - Friday: 2 pm – open end
Grading
• Course grading: 20% exercises, 80% exam
• Exercise grading: 50% of the points gives mark 4 (pass)
• Exam grading: 50% of the points gives mark 4 (pass)
• Both parts must be passed with at least mark 4!
• Exam date: 7.2.2013 at lecture time
Motivation

“People often say that motivation doesn’t last. Well, neither does bathing – that’s why we recommend it daily.” [Zig Ziglar]
Software Yesterday and Today
Yesterday (~30 years ago)            | Today
Hardware dominated the system        | Software dominates the system
Hardware defined the system behavior | Software defines the behavior
SW only in computers                 | SW everywhere, also controlling safety-critical systems

“I think there is a world market for maybe five computers.” (attributed to Thomas J. Watson)
“640K ought to be enough for anyone.” (attributed to Bill Gates)
Safety Critical Systems
• Medical devices
• Automotive (drive-by-wire)
• Air and space flight (fly-by-wire)
• Power plants
• …
SW dominated Systems
SW in the 21st Century
• More safety-critical, real-time software …
• Embedded software is ubiquitous … check your pockets
• Enterprise applications mean bigger programs, more users
• Paradoxically, free software increases our expectations!
• The web offers a new deployment platform
  - Very competitive and very available to more users
  - Web apps are distributed
  - Web apps must be highly reliable
Industry desperately needs our inventions and expertise!
© Ammann & Offutt
Dependability Aspects
Dependability covers:

• Reliability: The ability of a system or component to perform its required functions under stated conditions for a specified period of time.
• Availability: Ability of a system to perform its agreed function when required, e.g. 24/7 availability, 42 years MTBF.
• Safety: Avoid catastrophic consequences for the environment.
• Security: Protecting information from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording or destruction.

Definitions: www.wikipedia.com
The First Bugs
“It has been just so in all of my inventions. The first step is an intuition, and comes with a burst, then difficulties arise—this thing gives out and [it is] then that 'Bugs'—as such little faults and difficulties are called—show themselves and months of intense watching, study and labor are requisite. . .” – Thomas Edison
“an analyzing process must equally have been performed in order to furnish the Analytical Engine with the necessary operative data; and that herein may also lie a possible source of error. Granted that the actual mechanism is unerring in its processes, the cards may give it wrong orders. ” – Ada, Countess Lovelace (notes on Babbage’s Analytical Engine)
Hopper’s “bug” (moth stuck in a relay on an early machine)
© Ammann & Offutt
Costly Software Failures
NIST report, “The Economic Impacts of Inadequate Infrastructure for Software Testing” (2002):
- Inadequate software testing costs the US alone between $22 billion and $59 billion annually
- Better approaches could cut this amount in half

Huge losses due to web application failures:
- Financial services: $6.5 million per hour (just in the USA!)
- Credit card sales applications: $2.4 million per hour (in the USA)

In Dec 2006, amazon.com’s BOGO offer turned into a double discount.

2007: Symantec says that most security vulnerabilities are due to faulty software.

The world-wide monetary loss due to poor software is staggering.
© Ammann & Offutt
Spectacular Software Failures
Major failures: Ariane 5 explosion, Mars Polar Lander, Intel’s Pentium FDIV bug

• Poor testing of safety-critical software can cost lives: THERAC-25 radiation machine, 3 dead
• Ariane 5: exception-handling bug (64-bit to 16-bit conversion) forced self-destruct on maiden flight, about $370 million lost
• NASA’s Mars lander: September 1999, crashed due to a units integration fault
• Toyota brakes: dozens dead, thousands of crashes

We need our software to be dependable. Testing is one way to assess dependability.
© Ammann & Offutt
Airbus 319 Safety Critical Software Control
• Loss of autopilot
• Loss of both the commander’s and the co-pilot’s primary flight and navigation displays!
• Loss of most flight deck lighting and intercom
© Ammann & Offutt
Northeast Blackout of 2003
© Ammann & Offutt
• Affected 10 million people in Ontario, Canada
• Affected 40 million people in 8 US states
• Financial losses of $6 billion USD
• 508 generating units and 256 power plants shut down
• The alarm system in the energy management system failed due to a software error, and operators were not informed of the power overload in the system
What Does This Mean?
© Ammann & Offutt
Software Quality is getting more and more important
“Do we have enough confidence in computer systems to commit to them what is most valuable for us, our money and our lives?” (J.C. Laprie)
Introduction and Preliminaries
Software Lifecycle
Concept → Requirements → Architecture → Design → Implementation → Test → Deploy → Support/Maintenance
SW Quality in the SW Lifecycle
Phase               | Participants                        | Aspects                                        | Methods/Outcome
Concept             | Marketing, PM, management           | Feasibility, cost, risk                        | Cost-benefit analysis, risk-management plan
Requirements        | PM, clients                         | Client’s needs                                 | Functional requirements document, use cases
Architecture        | Architect, developer, tester        | Dependability, maintainability, testability, … | Architecture document
Design              | Architect, developer, QA            | Transform requirements into system design      | Design documentation and development plan
Implementation      | Developer, QA                       | Implement design in SW                         | Software + SW documentation
Test                | Developer, QA                       | Design and execute tests based on requirements | Various validation and verification methods, reviews
Deploy              | Developer, architects, sales        | Manufacturing, distribution, installation      |
Support/Maintenance | Developer, sales, customer support  | Ensure the system’s dependability in operation | Post-implementation and in-process reviews
SW Quality Needs Structured Processes

SW quality is not coincidence or luck; it is a matter of structured processes, good tools and hard work!

• Project management processes and plans for
  - Requirements, communication, cost, risk and change management
  - Project tracking
• Software development
  - Software development process (waterfall, V-model, agile, …)
• Software implementation
  - Coding guidelines
  - Version control
  - Documentation
• Software validation/verification
  - Bug tracking
  - Static checks and reviews
  - Test plan
  - SW test techniques
  - Test automation
Content of Lecture 10
SW Development: Waterfall
Requirements → Architecture → Design → Implementation → Test → Deploy → Support/Maintenance
SW Development: V-Model
Specification side (top-down): Requirements → Architecture → Design → Implementation
Test side (bottom-up): Module test → Integration test → System test → Acceptance test

Each specification level yields the test cases for the corresponding test level: use cases drive the acceptance test, and test cases derived from requirements, architecture and design drive system, integration and module test respectively.
SW Development: Agile Process (Scrum)

• Requirements and architecture feed the product backlog
• The sprint backlog is selected from the product backlog
• Sprint (2–6 weeks): design, implementation, module test and integration test, with a daily SCRUM meeting every 24 hours
• Each sprint ends with a potentially shippable product increment
• System test and acceptance test follow
SW Quality Needs Structured Processes

Software quality is not coincidence or luck; it is a matter of structured processes, good tools and hard work!

• Project management processes and plans for
  - Requirements, communication, cost, risk and change management
  - Project tracking
• Software development
  - SW development process (waterfall, V-model, agile, …)
• Software implementation
  - Coding guidelines
  - Version control
  - Documentation
• Software validation/verification
  - Bug tracking
  - Static checks
  - Reviews
  - Test plan
  - SW test techniques
  - Test automation
Coding Guidelines
• Unification of different coding styles - Quicker understanding of “foreign” code - Increases maintainability
• Error reduction
  - Avoiding error-prone constructs (dynamic memory allocation, pointer arithmetic, …)
• Mandatory in safety-critical applications
  - e.g. MISRA C/MISRA C++ in automotive applications
  - http://www.ristancase.com/html/dac/manual/3.12.02%20MISRA-C%202004%20Rules.html
• Naming of identifiers, indentation, white space, declarations (e.g. avoid shadowing), comments, statements (e.g. one statement per line)
Indent Styles (from www.wikipedia.de)
• K&R style (Kernighan & Ritchie)
• 1TBS (“the one true brace style”)
• GNU style
SW Quality Needs Structured Processes

Software quality is not coincidence or luck; it is a matter of structured processes, good tools and hard work!

• Project management processes and plans for
  - Requirements, communication, cost, risk and change management
  - Project tracking
• Software development
  - SW development process (waterfall, V-model, agile, …)
• Software implementation
  - Coding guidelines
  - Version control
  - Documentation
• Software validation/verification
  - Bug tracking
  - Static checks
  - Reviews
  - Test plan
  - SW test techniques
  - Test automation
Version Control Scenarios

• Scenario 1:
  - Multiple developers work on one product at the same time
  - How do you manage multiple “locally changed” versions, and how do you merge them into one unit?
Version Control: Concurrency

Two developers each check out a sandbox from the shared repository:

• Check-out: creates a local copy (sandbox) of the repository
• Commit: moves changed files back to the repository
• If the repository changed since the last check-out, a commit fails the up-to-date check
• Update: merges the repository changes into the sandbox, or leads to a conflict that must be resolved before committing
Version Control Scenarios

• Scenario 1:
  - Multiple developers work on one product at the same time
  - How do you manage multiple “locally changed” versions, and how do you merge them into one unit?
• Scenario 2:
  - Version 1 of your product is shipped
  - You are working on Version 2
  - Now a customer finds a severe bug in Version 1 which needs to be fixed ASAP
  - How do you manage multiple levels of your product?
Version Control: Tagging

Each file (fileA, fileB, fileC) has its own revision history (1.0, 1.1, 1.2, 1.3, …). The tag command labels one revision of every file as belonging to a named release state (e.g. REL_X, REL_Y), so that exactly this state can be checked out again later.
Version Control: Branching

• Ongoing work on fileA continues on the development branch (revisions 1, 2, 3, 4, …)
• A branch tag (2.1) marks the version delivered to the customer on a separate shipped-product branch
• The bugfix for the customer is committed on that branch (2.2)
• Finally the branch is merged back to the head (revision 5)
Version Control Scenarios

• Scenario 1:
  - Multiple developers work on one product at the same time
  - How do you manage multiple “locally changed” versions, and how do you merge them into one unit?
• Scenario 2:
  - Version 1 of your product is shipped
  - You are working on Version 2
  - Now a customer finds a severe bug in Version 1 which needs to be fixed ASAP
  - How do you manage multiple levels of your product?
• Scenario 3:
  - You’re working on a large-scale development project. You believe one of your team members has added malicious code to the project
  - How do you monitor what they’ve changed in your 10,000-line or 100,000-line project?
Version Control: diff, stat, log

• Information about file status
• Information about file history
• Different versions of files can be compared:
  - Two versions (same or different branches)
  - A version with the sandbox
Showing Branches and Tags
Showing Differences of Versions
SW Quality Needs Structured Processes

Software quality is not coincidence or luck; it is a matter of structured processes, good tools and hard work!

• Project management processes and plans for
  - Requirements, communication, cost, risk and change management
  - Project tracking
• Software development
  - SW development process (waterfall, V-model, agile, …)
• Software implementation
  - Coding guidelines
  - Version control
  - Documentation
• Software validation/verification
  - Bug tracking
  - Static checks
  - Reviews
  - Test plan
  - SW test techniques
  - Test automation
Bug Tracking/Defect Management

The test team found a bug:
• How to document it and communicate it to the development team?
• How to track the progress of the bug correction?
• How does management know what bugs are “out there”?

A central defect database connects test engineers, designers and the project manager, who draws report statistics from it.
Bug states in Bugzilla
www.bugzilla.org
SW Quality Needs Structured Processes

Software quality is not coincidence or luck; it is a matter of structured processes, good tools and hard work!

• Project management processes and plans for
  - Requirements, communication, cost, risk and change management
  - Project tracking
• Software development and architecture
  - SW development process (waterfall, V-model, agile, …)
• Software implementation
  - Coding guidelines
  - Version control
  - Documentation
• Software validation/verification
  - Bug tracking
  - Static checks
  - Reviews
  - Test plan
  - SW test techniques
  - Test automation
Static Code Analysis
• Software measurement
• Coding style analysis or pretty printer
• Syntax checking
  - What a compiler needs to do anyway
• Semantic analysis
  - Lint: stricter type checking, detection of unused variables, …
  - Many compilers do lint-like checking: g++ -Wall -ansi -pedantic …
• Exploit analysis
  - Syntactic analysis reports usage of insecure functions (e.g. gets, strcpy)
  - Semantic analysis methods
• Formal software verification (proof, calculus)
• Anomaly analysis
More details in lectures 8 & 9
More details in lecture 11
Anomaly Detection in Dataflows
• Static analysis of
  - Definition (d) of variables: x = 5
  - Reference (r) of variables: if (x > 3), y = x + 2
  - Variable becomes undefined (u), e.g. at the end of its scope; a variable is also undefined directly after its declaration
Program code                    inOutMin  inOutMax  help
void MinMax(int& inOutMin,         d
            int& inOutMax)                   d
{
  int help;                                            u
  if (inOutMin > inOutMax) {       r         r
    inOutMax = help;                         d         r    <- ur-anomaly (help)
    inOutMax = inOutMin;           r         d              <- dd-anomaly (inOutMax)
    help = inOutMin;               r                   d
  }
}                                                      u    <- du-anomaly (help)

• ur-anomaly: help is referenced before it was ever defined
• dd-anomaly: inOutMax is defined twice without being referenced in between
• du-anomaly: help is defined and then becomes undefined at the end of its scope without being referenced
SW Quality Needs Structured Processes

Software quality is not coincidence or luck; it is a matter of structured processes, good tools and hard work!

• Project management processes and plans for
  - Requirements, communication, cost, risk and change management
  - Project tracking
• Software development
  - SW development process (waterfall, V-model, agile, …)
• Software implementation
  - Coding guidelines
  - Version control
  - Documentation
• Software validation/verification
  - Bug tracking
  - Static checks
  - Reviews
  - Test plan
  - SW test techniques
  - Test automation
Reviews

             | Walkthrough                               | Review                                                         | Inspection
What?        | Ask your peer to have a look at your code | Like a walkthrough, but with checklists (often more attendees) | Moderated review process using project-management techniques such as phases, documents and clearly defined roles
Where?       | In front of the code owner’s monitor      | Workplace or team room; also possible as an offline review     | Team room
Formal?      | No                                        | Yes                                                            | Yes
Spontaneous? | Yes                                       | Yes                                                            | No
Moderated?   | No                                        | No                                                             | Yes
SW Quality Needs Structured Processes

Software quality is not coincidence or luck; it is a matter of structured processes, good tools and hard work!

• Project management processes and plans for
  - Requirements, communication, cost, risk and change management
  - Project tracking
• Software development
  - SW development process (waterfall, V-model, agile, …)
• Software implementation
  - Coding guidelines
  - Version control
  - Documentation
• Software validation/verification
  - Bug tracking
  - Static checks
  - Reviews
  - Test plan
  - SW test techniques
  - Test automation
Test Plans
• Why write test plans?
  - Documentation of what needs to be done
  - Good starting point for new team members
  - Reference for testers
  - Documentation of what has been done (good to know in case of field bugs, and to improve testing for the next iteration)
• It’s the contents that are important, not the structure
  - Good testing is more important than proper documentation
  - However, documentation of testing can be very helpful
• Most organizations have a list of topics, outlines, or templates
© Ammann & Offutt
Types of Test Plans
• Mission plan – tells “why”
  - Usually one mission plan per organization or group
  - Least detailed type of test plan
• Strategic plan – tells “what” and “when”
  - Usually one per organization, or perhaps one for each type of project
  - General requirements for coverage criteria to use
• Tactical plan – tells “how” and “who”
  - One per product
  - More detailed
  - Living document, containing test requirements, tools, results and issues such as integration order
© Ammann & Offutt
Test Plan Contents – System Testing
• Purpose
• Target audience and application
• Deliverables
• Information included:
  - Introduction
  - Test items
  - Features tested
  - Features not tested
  - Test criteria
  - Pass/fail standards
  - Criteria for starting testing
  - Criteria for suspending testing
  - Requirements for restarting testing
  - Hardware and software requirements
  - Responsibilities for severity ratings
  - Staffing & training needs
  - Test schedules
  - Risks and contingencies
  - Approvals
© Ammann & Offutt
Test Plan Contents – Tactical Testing
• Purpose
• Outline
• Test-plan ID
• Introduction
• Test reference items
• Features that will be tested
• Features that will not be tested
• Approach to testing (criteria)
• Criteria for pass/fail
• Criteria for suspending testing
• Criteria for restarting testing
• Test deliverables
• Testing tasks
• Environmental needs
• Responsibilities
• Staffing & training needs
• Schedule
• Risks and contingencies
• Approvals
© Ammann & Offutt
SW Quality Needs Structured Processes

Software quality is not coincidence or luck; it is a matter of structured processes, good tools and hard work!

• Project management processes and plans for
  - Requirements, communication, cost, risk and change management
  - Project tracking
• Software development
  - SW development process (waterfall, V-model, agile, …)
• Software implementation
  - Coding guidelines
  - Version control
  - Documentation
• Software validation/verification
  - Bug tracking
  - Static checks
  - Reviews
  - Test plan
  - SW test techniques
  - Test automation
Content of lectures 2-7
Software Testing Terms
Like any field, software testing comes with a large number of specialized terms that have particular meanings in this context.

• Some of the following terms are standardized; some are used consistently throughout the literature and the industry, but some vary by author, topic, or test organization
• The definitions here are intended to be the most commonly used ones
© Ammann & Offutt
Validation & Verification (IEEE)
• Validation : The process of evaluating software at the end of software development to ensure compliance with intended usage
• Verification : The process of determining whether the products of a given phase of the software development process fulfill the requirements established during the previous phase
© Ammann & Offutt
Test Engineer & Test Managers
• Test Engineer : An IT professional who is in charge of one or more technical test activities
- designing test inputs - producing test values - running test scripts - analyzing results - reporting results to developers and managers
• Test Manager : In charge of one or more test engineers
- sets test policies and processes - interacts with other managers on the project - otherwise helps the engineers do their work
© Ammann & Offutt
Static and Dynamic Testing
• Static Testing: Testing without executing the program
  - This includes software inspections and some forms of analysis
  - Very effective at finding certain kinds of problems, especially “potential” faults, that is, problems that could lead to faults when the program is modified
• Dynamic Testing: Testing by executing the program with real inputs
© Ammann & Offutt
Software Faults, Errors & Failures
• Software Fault: A static defect in the program code of the software
  - Example: an infinite program loop
• Software Failure: External, incorrect behavior with respect to the requirements or another description of the expected behavior. Failures are observed when the software is executed
• Software Error: An incorrect internal state that is the manifestation of some fault; or: a human interaction which produces an incorrect result
• Testing: Finding inputs that cause the software to fail
• Debugging: The process of finding a fault given a failure

© Ammann & Offutt
Fault & Failure Model
Three conditions necessary for a failure to be observed
1. Reachability : The location or locations in the program that contain the fault must be reached
2. Infection : The state of the program must be incorrect
3. Propagation : The infected state must propagate to cause some output of the program to be incorrect
© Ammann & Offutt
Test Case
• Test Case Values: The values that directly satisfy one test requirement

• Expected Results: The result that will be produced when executing the test if the program satisfies its intended behavior
© Ammann & Offutt
Observability and Controllability
• Software Observability: How easy it is to observe the behavior of a program in terms of its outputs and its effects on the environment and on other hardware and software components
  - Software that affects hardware devices, databases, or remote files has low observability
• Software Controllability: How easy it is to provide a program with the needed inputs, in terms of values, operations, and behaviors
  - Software with inputs from keyboards is easy to control
  - Inputs from hardware sensors or distributed software are harder
  - Data abstraction reduces controllability and observability
© Ammann & Offutt
Course Overview
Overview

• Lecture 1: Software Quality Introduction
• Lecture 2: General Testing Techniques (Unit Test, Regression)
• Lecture 3: Functional Testing (ISP & SBT)
• Lectures 4–6: Graphs (Source, Spec & Design)
• Lecture 7: Logic
• Lectures 8–9: Formal Verification I & II
• Lectures 10–11: Processes & Measuring, Norms
• Lecture 12: Conclusion
Bibliography
• Peter Liggesmeyer, Software-Qualität
  - Chapter 1 (Introduction)
  - Chapter 8.5 (Static Code Analysis)
• Dirk W. Hoffmann, Software-Qualität
  - Chapters 1–2 (Introduction)
  - Chapter 5 (Static Code Analysis)
  - Chapter 7 (Software Life Cycle)
  - Chapter 8 (Software Infrastructure)
• P. Ammann & J. Offutt, Introduction to Software Testing
  - Chapter 1 (Introduction)
  - Chapter 6.4 (Test Plans)