Testing Review
Spencer Smith & Mark Lawford
Software Quality Research Laboratory, McMaster University
Hamilton, ON, Canada
January 18, 2008
Smith & Lawford (SQRL) Testing January 18, 2008 1 / 56
Outline
1. Test Plan
   - Motivation
   - Test Plan Template
   - Test Factors & Techniques
2. System Functional and Structural Testing
   - Overview
   - Fault Testing
   - Testing Technique Selection
3. Unit Testing
   - Challenges for Testing
   - White Box Testing
   - Black Box Testing
   - Other Analyses
Test Plan
Introduction to Writing a Test Plan
- Computer system strategic risks
- Economics of system testing
- A testing policy
- Test factors and test strategies
Test Plan: Motivation
Overview of Test Plan
- You will be building and demonstrating your final system this term
- The system is too important to allow this to occur in an ad hoc manner
- Need to plan how the system will be tested
- Time and resources are limited, so you will not be able to produce a perfect system
- You need to use your risk analysis to determine where your limited resources are best employed
- Each team should complete a test plan
- The test plan will help you produce the test report that is due at the end of the term
- Successful teams in the past devoted significant time to testing
System Test Plan
- Be specific and include details:
  - Possibly including which team member will do what
  - Your planned schedule
  - Specific test cases, etc.
- The document should be specific to your project (no need to plan for portions you will not be implementing)
- Questions to consider:
  - What are the special issues for a pacemaker and its programming interface?
  - How are you going to test the pacemaker? The interface? Their interaction?
  - How are you going to test the filtered sense signals? The pace output? The accelerometer?
  - How are you going to test the reed switch?
  - How are you going to test each of the modes?
  - How are you going to check proper calibration of the pacemaker?
  - What are the differences between your testing environment and the actual environment?
  - Etc.
- Like other documents, the test plan should be kept alive
- The test plan should facilitate making decisions now
- "Testing" includes code walkthroughs, formal proof, test cases, etc.
References on Test Plans
- Dr. Maibaum's SFWR ENG 3S03: Software Testing and Management
- In the past, Dr. Khedri's course notes for SE 3R03
- Perry, William E., Effective Methods for Software Testing, Second Edition, John Wiley and Sons, Inc., 2000.
Computer System Strategic Risks
- A risk is a condition that can result in loss
- Risk analysis involves looking at how bad the loss can be and at the probability of the loss occurring
- Risks cannot be eliminated, but the development process can reduce the probability of loss associated with risks to an "acceptable" level
- Many teams already looked at risks as part of the system requirements specification
- Want to reduce risks, such as:
  - Incorrect results being produced
  - Unauthorized transactions being accepted by the system
  - Loss of computer file integrity
  - Etc.
Economics of System Testing
- Each of the risks can affect the proper functioning of a system
- Need to identify and evaluate the risks to the system
- The risks will contribute to the test factors
- The test factors are selected because they represent the highest risks
- The design of the test plan is based on economic considerations
- Want to avoid over- and under-testing, and instead perform optimal testing
Establishing a Testing Policy
- Ideally this would have been done earlier in the year
- The team's definition of testing is their testing policy
- Involves four criteria:
  - Definition of testing: determination of the validity of the system for solving an engineering problem
  - Testing system: development and execution of a test plan and satisfaction of the system requirements specification
  - Evaluation: cost (marks and the IBM judges' opinions) of undetected defects
  - Standards: one defect per 250 executable program statements
- The first step toward the test plan
Test Factors
- Correctness
- Authorization: for actions
- File integrity: data will be unaltered
- Audit trail: process of saving supporting evidential matter
- Continuity of processing
- Service levels: desired results available within an acceptable time frame
- Access control: for system resources
- Compliance: with organizational strategies, policies, etc.
- Reliability: perform correctly over an extended time period
- Ease of use
- Maintainability
Test Factors Continued
- Portability
- Coupling: effort to interconnect components
- Performance: computing resources required
- Ease of operation
Examples
- Correctness
  - Paces after the correct delay
  - Enters "programming" mode when the magnet is "present"
- Continuity of processing
  - Pacemaker continues to function even if it loses the comm link during programming
  - Notifies the doctor if it detects a problem (loose lead? low battery? etc.)
- Performance
  - System is completed within time and budget constraints
  - System achieves performance acceptance criteria
Test Strategy
- Identify the concerns that will become the focus of test planning and execution
- The test strategy must identify the test factors (the risk or issue that needs to be addressed) and the test phase (the phase of the system development)
- Rank the risks by importance (low, medium, high)
- Not all test factors are applicable to your software system
- You may need to come up with new factors, or refine the listed ones
- Select and rank the test factors
Steps in Developing a Test Strategy
1. Select and rank test factors
   - Usually 3 to 7 test factors are needed
   - Other factors will be implicitly addressed in a manner consistent with supporting the key factors
2. Identify the development phases: the development phases roughly correspond to the deliverables in 4GP6
3. Identify the business risks associated with the system: brainstorm
4. Look at the risks you identified as part of the SRS
5. Build the test factor/test phase matrix
   - Test factors listed previously
   - Test phases include requirements, system architecture, detailed design, coding, unit testing, integration testing, system testing, maintenance
Test Factor/ Test Phase Matrix
TF \ TP                    SRS   MG    MIS   Impl.  UT    IT    ST
Safety                      1     1     1
Correctness                 2     2     2     2
Continuity of processing    3     3     3     3     3
...

Risks:

1. Pacemaker hurts/kills patient
2. System loses comm link with control/interface computer
3. Lead comes loose
4. ...
Build the Test Plan
- Now that the test policy and the test strategy have been established, write the test plan
- The test plan shows how to address each concern/risk identified in the test factor/test phase matrix
- There should be a test plan (includes system tests)
- There should be a unit test plan for each unit
- Borrow from these suggestions pieces you missed, or techniques that can force you to look at the system from a new perspective
Test Plan: Test Plan Template
A Suggested Template for the Test Plan (Perry 2000)
1. General Information
   - Summary
   - Environment and pretest background
   - Test objectives
   - Expected defect rates
   - References:
     - Project request authorization
     - Previously published documents on the project
     - Documentation concerning related projects
   - Outline of report: roadmap
Template Continued
2. Test Strategy
3. Test Plan
   - Software description
   - Test team
   - Milestones
   - Budgets (in terms of time and resources)
Template Continued
3. Test Plan (continued)
   - Testing <system milestone 1>
     - Schedule (and budget), including training
     - Requirements (resource requirements: equipment, software, personnel)
     - Testing material (system documentation, software to be tested, test inputs, test documentation, test tools)
     - Tests to be conducted (reference specific tests to be conducted)
   - Testing <system milestone 2>
   - ...
Template Continued
4. Specification and Evaluation
   - Specifications (scenarios, nonfunctional requirements, test matrix, test progression)
   - Methods and constraints (overview of methodology, test tools, data recording, extent of testing (total or partial), constraints)
   - Evaluation (criteria and data reduction)
5. Test Descriptions
   - Test <identity1>: describe the test to be performed (control, inputs, outputs, procedures)
   - Test <identity2>: describe the test to be performed (control, inputs, outputs, procedures)
   - ...
6. Appendix: Checklist for test plan
Test Matrix
      T1   T2   T3   ...
R1    X
R2    X    X
R3    X    X    X
...
Unit Test Plan
1. Plan
   - Unit description
   - Test approach: the general method or strategy (test scaffolding?)
   - Functions not tested
   - Test constraints
2. Functional testing
   - List the functional requirements included in this unit
   - Test descriptions
   - Expected test results
   - Conditions to stop tests
3. Interface test descriptions
   - Interface
   - Test description
   - Expected test results
4. Test progression (from system test plan)
Start of a Checklist
- Spell checked?
- Grammar reviewed?
- Title page
- Author names on title page
- Table of contents
- List of figures
- List of tables
- Roadmap of the report
- Double spaced
- Page numbers
- Every figure has a caption
Checklist Continued
- Every table has a title
- Specific about technology
- Specific about test scaffolding
- Specific about test types
- Explain how to test the calibration of the sensors
- Etc.
Test Plan: Test Factors & Techniques
Test Factors and Test Techniques
- Reliability: Execution, Recovery, Requirements, Error Handling, Unit Testing
- Authorization: Security, Requirements, Unit Testing
- File Integrity: Recovery, Requirements, Error Handling, Unit Testing
- Audit Trail: Recovery, Requirements, Unit Testing
- Continuity of Processing: Stress, Recovery, Operations, Unit Testing
- Service Level: Stress, Execution, Operations
Test Factors and Test Techniques Continued
- Access Control: Security
- Methodology: Compliance
- Correctness: Requirements, Regression, Error Handling, Manual Support, Inter-systems, Control, Parallel, Unit Testing
- Ease of Use: Compliance, Requirements, Manual Support, Unit Testing
- Maintainability: Compliance, Unit Testing
- Portability: Operations, Compliance
Test Factors and Test Techniques Continued
- Coupling: Operations, Inter-systems, Control
- Performance: Stress, Execution, Compliance, Unit Testing
- Ease of Operation: Operations, Compliance
Example Test Plans
- Examples from the 2003–2004 class
- Examples from the 2004–2005 class
- You should attempt to include more detail
- You should include specific tests
- Many teams have already done system test plans; please revisit them
- http://www.soften.ktu.lt/~virga/mag_atmintine/3sem/informacines.html
  - A non-English web page
  - The first two links have some useful information
System Functional and Structural Testing: Overview
System Functional and Structural Testing
- Structural versus functional testing
- Dynamic versus static testing
- Manual versus automated testing
- Fault seeding
- Structural system testing techniques: stress, execution, recovery, operations, compliance, security
- Functional system testing techniques: requirements/acceptance, regression, error-handling, manual-support, intersystems, control, parallel testing
- Testing technique selection
Structural Versus Functional Testing
- Structural testing is derived from the program's internal structure
- Functional testing is derived from a description of the program's function
- Should perform both structural and functional testing
- Functional testing:
  - Uncovers errors that occur in implementing requirements or design specifications
  - Not concerned with how processing occurs, but with the results
  - Focuses on the functional requirements for the system
  - Focuses on normal behaviour of the system
Structural Testing Continued
- Structural testing:
  - Uncovers errors that occur during implementation of the program
  - Concerned with how processing occurs
  - Evaluates whether the structure is sound
  - Focuses on the nonfunctional requirements for the system
  - Focuses on abnormal or extreme behaviour of the system
Dynamic Versus Static Testing
- Use a combination of dynamic and static testing
- Dynamic analysis:
  - Requires the program to be executed
  - Test cases are run and results are checked against expected behaviour
  - Exhaustive testing is the only dynamic technique that guarantees program validity
  - Exhaustive testing is usually impractical or impossible
  - Reduce the number of test cases by finding criteria for choosing representative test cases
Static Testing Continued
- Static analysis:
  - Does not involve program execution
  - Testing techniques simulate the dynamic environment
  - Includes syntax checking
  - Generally static testing is used in the requirements and design stages, where there is no code to execute
  - Document and code walkthroughs
  - Document and code inspections
Manual Versus Automated Testing
- Manual testing:
  - Has to be conducted by people
  - Includes by-hand test cases, structured walkthroughs, code inspections
- Automated testing:
  - The more automated the development process, the easier it is to automate testing
  - Less reliance on people
  - Necessary for regression testing
  - Test tools can assist, such as JUnit, CppUnit, CuTest, etc.
  - Can be challenging to automate GUI tests
  - The test suite for Maple has 2,000,000 test cases, run on 14 platforms, every night, with automated reporting
Automated Testing at Maple
- Three steps:
  1. Write the problem description
  2. result := solver(problem)
  3. assert(result == expected)
- Assert writes out code to reproduce any failures
- Track failures:
  - Source code management (like CVS or Subversion)
  - Database of test cases and the functions they call
  - Database of source files and the functions they define
  - Database of 40 days of timings and resources used
- Automatically sends an e-mail to the programmer and his/her boss
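The three-step pattern above can be sketched as follows. This is a minimal sketch, not Maple's actual harness: `solver` is an invented stand-in (here it just sums a list), and the failure message plays the role of the reproduction information written out by `assert`.

```python
def solver(problem):
    """Stand-in for the unit under test; here it simply sums a list."""
    return sum(problem)

def run_test(problem, expected):
    """One automated test.  On failure, report enough information
    to reproduce the failing call (the slide's step 3)."""
    result = solver(problem)
    if result == expected:
        return "PASS"
    return "FAIL: solver(%r) == %r, expected %r" % (problem, result, expected)

print(run_test([1, 2, 3], 6))  # PASS
print(run_test([1, 2, 3], 7))  # FAIL: solver([1, 2, 3]) == 6, expected 7
```

Because every failure message contains the exact call that failed, a nightly run can be triaged without re-deriving the inputs by hand.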
System Functional and Structural Testing: Fault Testing
Fault Testing
- A common analogy: stock marked fish in a lake to estimate the fish population
  - T = number of native (unmarked) fish in the lake (to be estimated)
  - N = marked fish stocked in the lake
  - M = total number of fish caught in the lake
  - M' = number of marked fish caught
  - T = (M - M') * N / M'
- Artificially seed faults, discover both seeded and new faults, and estimate the total number of real faults
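The seeding estimate can be computed directly from the slide's formula; the numbers in the example below are invented for illustration.

```python
def estimate_real_faults(seeded, found, found_seeded):
    """Capture-recapture estimate T = (M - M') * N / M', where
    N = seeded faults, M = total faults found, and M' = seeded
    faults among those found."""
    if found_seeded == 0:
        raise ValueError("no seeded faults found; the estimate is undefined")
    return (found - found_seeded) * seeded / found_seeded

# Seed 10 faults; testing finds 25 faults, 5 of them seeded:
# estimated real faults = (25 - 5) * 10 / 5 = 40.
print(estimate_real_faults(seeded=10, found=25, found_seeded=5))  # 40.0
```

The estimate rests on the assumption the next slide states: real and seeded faults must be equally easy to find.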
Fault Testing Continued
- The method assumes that the real and seeded faults have the same distribution
- It is hard to seed faults:
  - By hand (not a great idea)
  - Independent testing by two groups; obtain the faults from one group for use by the other
- Want most of the discovered faults to be seeded faults
- If many faults are found, this is a bad thing
- The probability of errors is proportional to the number of errors already found
Structural System Testing
- Stress testing: determines if the system can function when subject to large volumes
- Execution: determines if the system achieves the desired level of proficiency in production status (performance)
- Recovery: determines if the system has the ability to restart operations after integrity has been lost
- Operations: determines if the operating procedures and staff can properly execute the system (documentation)
- Compliance (to process): determines if the system has been developed in accordance with information technology standards, procedures and guidelines
- Security: determines if the system can protect confidential information
Functional System Testing
- Requirements: determines if the system can perform its function correctly and that the correctness can be sustained over a continuous period of time
- Regression: determines that changes to the system do not invalidate previous positive testing results
- Error handling: determines the ability of the system to properly process incorrect transactions
- Manual support: determines that the manual support procedures are documented and complete, where manual support involves procedures, interfaces between people and the system, and training procedures
- Inter-systems: determines that the interconnections between systems function correctly
Functional System Testing Continued
- Control: determines if the processing is performed in accordance with the intents of management
  - Includes data validation, file integrity, audit trail, backup and recovery, documentation, and other aspects related to integrity
  - Controls are designed to reduce risks
- Parallel: determines that the results of the new application are consistent with the processing of the previous application or version of the application
System Functional and Structural Testing: Testing Technique Selection
Testing Technique Selection
1. Start with a test factor
2. Determine the software development life cycle phase to be tested
3. Identify the type of test:
   - System structural
   - System functional
   - Unit testing
4. Select dynamic or static testing
5. Select manual testing or automated tool support
6. Select the test method
Unit Testing
- Challenges for testing
- Black box versus white box testing
- White box testing criteria: statement coverage, edge coverage, condition coverage, path coverage
- Black box testing
- Analysis of units
- Reference information
Unit Testing: Challenges for Testing
Challenges for Testing
- The purpose of testing is to detect program errors, not to show that the program is free of errors
- In general it is impossible to test under all possible operating conditions
- Need to find suitable test cases to provide evidence that the desired behaviour will be exhibited in the remaining cases
- The analogy with engineering does not apply:
  - No continuity property
  - Small differences in operating conditions can result in dramatically different behaviour
- It is difficult to thoroughly test a module when it is installed in a production system
Challenges Continued
- Testing should be systematic, not ad hoc
- Need to know the expected results (requirements)
- Testing can show the presence of bugs, but it cannot show their absence (Dijkstra)
- A formal specification can help the reliability of testing
- For unit testing, an MIS provides this specification
- No absolute certainty from pure (nonexhaustive) dynamic testing
- Should also use other verification techniques
- Random test cases are inappropriate in many situations
Challenges Continued
- Testing should help locate errors, not just detect them
- Design for testing
- Testing should be repeatable; repeatability is complicated by:
  - Uninitialized variables
  - The operating environment
  - Concurrent systems
- The theory of testing leads to several undecidable problems
- We need some practical criteria for the selection of test cases
Challenges Continued
- A test criterion attempts to group elements of the input domain into classes so that we can choose a single test case as representative of each class
- The goal is to satisfy the complete coverage principle
- To do module testing it may be necessary to build test scaffolding, including test drivers and stubs
- Consider a module M:
  - A driver is test code written to call access routines provided by M
  - A stub is test code that serves as a substitute for an access routine called by M
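A minimal sketch of this scaffolding follows; every name in it (`smooth`, `read_sensor`, `make_sensor_stub`, `driver`) is hypothetical and invented for illustration.

```python
# Hypothetical module M: `smooth` depends on an access routine
# `read_sensor` provided by another module that is absent from the
# test environment, so the scaffolding supplies a stub for it and
# a driver that exercises M.

def smooth(read_sensor):
    """Module M: mean of three consecutive sensor readings."""
    return (read_sensor() + read_sensor() + read_sensor()) / 3

def make_sensor_stub(values):
    """Stub: substitutes canned readings for the real access routine."""
    it = iter(values)
    return lambda: next(it)

def driver():
    """Driver: test code that calls M's access routine via the stub
    and returns the result for checking."""
    return smooth(make_sensor_stub([2.0, 4.0, 6.0]))

print(driver())  # 4.0
```

The stub lets the module be tested in isolation, before the real sensor module exists; the driver plays the role of M's eventual callers.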
Testing in the Small
- Black box testing:
  - No knowledge of how the unit is designed or coded
  - Results are evaluated against the specification
  - Tests what the program is supposed to do
  - Also called functional (unit) testing
- White box testing:
  - Uses information about the internal structure
  - Tests what the program does
  - Also called structural (unit) testing
Unit Testing: White Box Testing
White Box Testing
- Statement coverage criterion:
  - Strive to execute all statements
  - Necessary because only the program text reveals the detailed decisions of the programmer
  - Unfortunately, executing a statement once and observing proper behaviour is not a guarantee of correctness
  - For block-structured languages it is not clear what is meant by a statement
Control Flow Graph
- For each I/O, assignment, or procedure call statement, a graph of two nodes connected by one edge is constructed. The edge represents the statement, and the nodes represent entry to and exit from the statement.
- The sequence of two statements S1 and S2, where the statements are represented by graphs G1 and G2 respectively, is shown by an edge between the graphs G1 and G2
- If-then and if-then-else conditionals are shown as branches in the graph
- While loops are shown as loops (cycles) in the graph
White Box Testing Continued
- Edge coverage criterion:
  - Describe the program structure by a graphical representation of the program control flow
  - The goal is to traverse each edge of the control flow graph at least once
- Condition coverage criterion:
  - As for edge coverage, traverse each edge of the control flow graph at least once
  - In addition, all possible values of the constituents of compound conditions are exercised at least once
White Box Testing Continued
- Path coverage criterion:
  - Select a test set so that all paths leading from the initial to the final node of the control flow graph are traversed
  - Loops cause a problem
  - It is generally undecidable whether a path is feasible
- An empirical guideline for testing loops is to look for conditions that execute loops:
  - Zero times
  - A maximum number of times
  - An average number of times
- Enforcing 100% statement coverage is impossible if there are unreachable statements
- White box testing will not uncover errors of "omission"
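The loop guideline above can be exercised with a sketch like this; the unit `count_positive` is hypothetical, chosen only because its control flow graph contains a loop.

```python
def count_positive(xs):
    """Hypothetical unit under test whose control flow graph
    contains a loop over the input sequence."""
    n = 0
    for x in xs:
        if x > 0:
            n += 1
    return n

# The loop heuristic: execute the loop zero times, an average
# number of times, and a large (stand-in for maximum) number of times.
print(count_positive([]))            # zero iterations   -> 0
print(count_positive([3, -1, 7]))    # typical case      -> 2
print(count_positive([1] * 100000))  # many iterations   -> 100000
```

The zero-iteration case is the one most often missed in practice, since it exercises the path that skips the loop body entirely.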
Example of Test Coverage
void method(int x, int y, int z, int w) {
    y = y + 1;
    if (x == y && z > w) {
        x = x + 1;
    }
}

- Statement coverage test set:
  - x = 2, y = 1, z = 4, w = 3
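The fragment above can be transcribed into a runnable sketch to check the coverage claims; this Python version returns x (an addition for testability, not part of the original) so a driver can observe the branch's effect.

```python
def method(x, y, z, w):
    """Python transcription of the slide's fragment, returning x so
    a test driver can observe whether the branch executed."""
    y = y + 1
    if x == y and z > w:
        x = x + 1
    return x

# Statement coverage: with x=2, y=1, z=4, w=3 the condition is true
# (2 == 1+1 and 4 > 3), so every statement executes.
print(method(2, 1, 4, 3))  # 3
# Edge coverage also requires the false branch, e.g. x=2, y=2:
print(method(2, 2, 4, 3))  # 2  (2 == 3 is false)
# Condition coverage further requires z > w to be false on its own:
print(method(2, 1, 3, 3))  # 2  (x == y is true, z > w is false)
```

Note how a single statement-coverage input says nothing about the untaken branch; the extra cases are what edge and condition coverage buy.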
Unit Testing: Black Box Testing
Black Box Testing
- Testing is driven by the logic specification
- Use your MIS
- Testing boundary conditions (also used for white box testing)
- Use the interval rule (heuristic) for integer parameters:
  - For an interval [L, U]:
    - Normal cases: L, U, an interior point
    - Exception cases: extreme negative, L-1, U+1, extreme positive
    - For [1, 100] one could test {-1000, 0, 1, 50, 100, 101, 1000}
- Suggest testing every value of an enumerated type (if there is a small number of values)
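The interval rule can be sketched as a small unit test; `in_range` is a hypothetical unit under test whose specification is simply "accept integers in [1, 100]".

```python
def in_range(v, low=1, high=100):
    """Hypothetical unit under test: accepts integers in [low, high]."""
    return low <= v <= high

# Interval-rule test set for [L, U] = [1, 100], as on the slide.
normal = [1, 50, 100]               # L, an interior point, U
exceptions = [-1000, 0, 101, 1000]  # extreme negative, L-1, U+1, extreme positive

assert all(in_range(v) for v in normal)
assert not any(in_range(v) for v in exceptions)
print("interval-rule cases behave as expected")
```

The off-by-one cases L-1 and U+1 are the payoff of the heuristic: they catch the common error of writing < where <= was specified.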
Black Box Testing Continued
- Test oracles judge the correctness of output
- If there is a formal specification, it might be used as an oracle or a test case generator
- Model checking may be an option
Unit Testing: Other Analyses
Analysis of Units
- Dynamic testing of units is not the only option
- Static testing (analysis) includes the following:
  - Informal inspection
  - Systematic inspection
  - Code walkthroughs, data flow analysis
  - Correctness proofs (for instance, using pre- and postconditions)
  - Complexity measures
Reference Information
- Carlo Ghezzi, Mehdi Jazayeri and Dino Mandrioli, Fundamentals of Software Engineering, 2nd Edition, Prentice Hall, 2003.
- Perry, William E., Effective Methods for Software Testing, Second Edition, John Wiley and Sons, Inc., 2000.