Transcript
Page 1: Implementation and Integration

Ch15 Copyright © 2001 by Eugean Hacopians

Implementation and Integration

• Implementation and integration are two different phases performed in sequence
• Problems with this approach
  – To test a module independently, development and SQA members need (see the sketch below)
    • Stubs for the modules it calls
    • A driver to stand in for the modules that call it
  – Isolating faults is more difficult during integration
• To solve these problems, these phases must be performed in parallel
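
A minimal C++ sketch of the stub and driver idea (the module names processOrder and lookupPrice are made up for illustration): the stub stands in for a module that processOrder calls, and the driver stands in for the module that would eventually call processOrder.

    #include <cassert>

    // Stub: a canned stand-in for the not-yet-integrated module that
    // processOrder() calls.
    double lookupPrice(int /*itemId*/) {
        return 10.0;                 // just enough behaviour to test the caller
    }

    // Module under test.
    double processOrder(int itemId, int quantity) {
        return lookupPrice(itemId) * quantity;
    }

    // Driver: a throwaway caller that exercises the module in isolation.
    int main() {
        assert(processOrder(42, 3) == 30.0);
        return 0;
    }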

Page 2: Implementation and Integration

Top-Down Implementation and Integration

• In this approach we develop modules in this order:
  – The root module first
    • Use stubs in place of the modules it calls
  – Then expand each stub into the actual module it replaces (see the sketch below)
    • This reduces stub waste
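
A sketch of that expansion, reusing the hypothetical lookupPrice() stub from the previous sketch: the canned body is replaced by the real module, while the already-tested caller is left untouched, so the work put into the stub is not thrown away (the price table contents are invented).

    #include <map>

    // The stub's canned return value is expanded into the real module;
    // processOrder(), already tested against the stub, does not change.
    double lookupPrice(int itemId) {
        static const std::map<int, double> priceTable = {{42, 10.0}, {43, 12.5}};
        const auto it = priceTable.find(itemId);
        return it != priceTable.end() ? it->second : 0.0;
    }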

Page 3: Implementation and Integration

Top-Down Implementation and Integration

• Advantages
  – Fault isolation
    • Integration testing is done as each module is completed
  – Stubs are not wasted
  – Major design faults show up early
    • By dividing the modules into two classes
      – Logic modules
        » Decision-making modules
      – Operational modules
        » Perform the actual operations
    • It is important to code and test logic modules before operational modules

Page 4: Implementation and Integration

Top-Down Implementation and Integration

• Disadvantages
  – Lower-level modules are not tested frequently
    • If a program has 200 modules
      – The root is tested 200 times (the most of any module)
      – The lowest-level modules are tested the fewest times
    • Operational modules are tested the least
  – Reusable modules are not properly tested
    • Logic modules are less likely to be reused
      – They are problem specific
    • The modules most likely to be reused are the lower-level ones
      – The operational modules
    • Top-down integration therefore makes reuse risky, because of this poor testing

Page 5: Implementation and Integration

Top-Down Implementation and Integration

• Defensive programming is when programmers test an input value before using it
  – Example:

        if (x >= 0)
            y = computeSquareRoot(x, errorFlag);

  – The subroutine is never tested with x < 0
    • This shields any fault caused by x < 0 (see the sketch below)
  – Reuse implications
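
A minimal sketch of that shielding effect; computeSquareRoot and errorFlag are the slide's names, but the body shown here is an assumption. The fault on the x < 0 path can never be reached through the defensively programmed call, so it survives until the routine is reused by a less careful caller.

    #include <cmath>

    double computeSquareRoot(double x, bool& errorFlag) {
        if (x < 0) {
            errorFlag = true;   // fault: forgets to return here, so the
        }                       // flag is overwritten just below
        errorFlag = false;
        return std::sqrt(x);    // NaN for x < 0
    }

    void defensiveCaller(double x, double& y) {
        bool errorFlag = false;
        if (x >= 0)                               // input tested before use,
            y = computeSquareRoot(x, errorFlag);  // so x < 0 is never exercised
    }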

Page 6: Implementation and Integration

Bottom-Up Implementation and Integration

• In this approach we develop modules in this order:
  – Operational (lower-level) modules first
    • Use drivers to test them
  – Then the logic (higher-level) modules

Page 7: Implementation and Integration

Bottom-Up Implementation and Integration

• Advantages
  – Operational modules are thoroughly tested
    • Drivers exercise these modules directly
    • This bypasses the fault shielding of defensively programmed calling modules (see the sketch below)
  – Fault isolation
• Disadvantages
  – Major design faults are detected late
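
A sketch of a bottom-up driver for the computeSquareRoot() routine from the defensive-programming example (same assumed body, linked together with it): by calling the routine directly with x < 0, the driver exercises the path that the defensive caller shields, and the assertion exposes the hidden fault.

    #include <cassert>

    double computeSquareRoot(double x, bool& errorFlag);  // module under test

    int main() {
        bool errorFlag = false;
        double y = computeSquareRoot(-1.0, errorFlag);
        assert(errorFlag);   // fails against the faulty body: the driver
        (void)y;             // reaches the path the caller's guard hid
        return 0;
    }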

Page 8: Implementation and Integration

Sandwich Implementation and Integration

• A combination of the top-down and bottom-up strategies
  – Makes use of their strengths and minimizes their weaknesses
• Development order
  – Logic modules are implemented and integrated top-down
  – Operational modules are implemented and integrated bottom-up
  – Finally, the interfaces between the two groups are tested

Page 9: Implementation and Integration

Sandwich Implementation and Integration

• Advantages
  – Major design faults are caught early
  – Operational modules are thoroughly tested
    • They may be reused with confidence
  – There is fault isolation at all times

Page 10: Implementation and Integration

Implementation and Integration Summary

Page 11: Implementation and Integration

Implementation and Integration of Object-Oriented Products

• Almost always uses sandwich implementation and integration (see the sketch below)
• Objects are integrated bottom-up
• Other modules are integrated top-down
  – Logic modules, such as Main
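
A minimal sketch of that split, with invented names: the Account class (an object, i.e. an operational module) is integrated bottom-up and exercised by a driver, while main(), a logic module, is integrated top-down against a stubbed report() operation.

    #include <cassert>

    // Operational module: integrated bottom-up.
    class Account {
    public:
        void deposit(double amount) { balance_ += amount; }
        double balance() const      { return balance_; }
    private:
        double balance_ = 0.0;
    };

    // Stub for an operation main() needs but that is not written yet.
    void report(const Account&) { /* real reporting comes later */ }

    // Logic module: integrated top-down; here it also acts as the driver
    // for Account.
    int main() {
        Account a;
        a.deposit(100.0);
        assert(a.balance() == 100.0);
        report(a);
        return 0;
    }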

Page 12: Implementation and Integration

Management Issues during the Implementation and Integration

• Determining module interfaces informally does not work (see the sketch below)
  – M1 calls M2 with 4 parameters
  – M2 is defined with 3 parameters
• Documents must be controlled
• SQA managers must have the responsibility of deciding which modules are developed top-down and which bottom-up
• Testing must be performed by the SQA team
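
A C++ sketch of the M1/M2 mismatch (the parameter types are assumptions): when both modules are compiled against one controlled interface description, here a single declaration of M2, the extra fourth argument is rejected immediately instead of surfacing during integration.

    // Controlled interface for M2: exactly three parameters.
    void M2(int account, int amount, int code) { (void)account; (void)amount; (void)code; }

    void M1() {
        // Written from an out-of-date interface description:
        // M2(7, 100, 3, 99);   // four arguments -- rejected by the compiler,
                                // but only because M1 and M2 are compiled
                                // against the same controlled declaration
        M2(7, 100, 3);          // call matching the controlled interface
    }

    int main() { M1(); return 0; }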

Page 13: Implementation and Integration

Testing during the Implementation and Integration

• Module testing
  – Test each new module thoroughly
  – Integrate it with the previously integrated modules
  – Test that their behavior did not change (see the sketch below)
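
A minimal regression-style sketch with invented modules: the new module gets its own thorough tests, and the test cases for the previously integrated modules are rerun unchanged to confirm that their behaviour has not changed.

    #include <cassert>

    // Previously integrated modules (hypothetical, repeated here so the
    // sketch stands alone).
    double lookupPrice(int)              { return 10.0; }
    double processOrder(int id, int qty) { return lookupPrice(id) * qty; }

    // Newly written module being added to the product.
    double applyDiscount(double total)   { return total * 0.5; }

    int main() {
        // 1. Test the new module thoroughly on its own.
        assert(applyDiscount(100.0) == 50.0);
        assert(applyDiscount(0.0)   == 0.0);

        // 2. Rerun the old test cases after integration: the behaviour of
        //    the previously integrated modules must not have changed.
        assert(processOrder(42, 3) == 30.0);
        return 0;
    }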

Page 14: Implementation and Integration

Testing during the Implementation and Integration

• Graphical User Interface
  – Test cases
    • Mouse clicks
    • Menus
    • Input fields
  – A GUI is not easy to test
  – Testing it is mostly a manual process
  – A CASE tool is needed to make it easier, for example
    • QAPartner
    • XRunner

Page 15: Implementation and Integration

Testing during the Implementation and Integration

• Product testing
  – Performed by the SQA team to ensure that the product will not fail its acceptance test
  – Failing the acceptance test is a disaster for management
    • The client may conclude that the developers are incompetent
    • Worse, the client may believe that the developers tried to cheat
  – The SQA team must ensure that the product passes its acceptance test
  – Black-box test cases for the product as a whole
  – Robustness of the product as a whole (see the sketch below)
    • Stress testing (under peak load)
    • Volume testing (e.g., can it handle large input files?)
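
A minimal sketch of a volume-style robustness check, reusing the hypothetical processOrder() module; the one-million-order figure is invented. The point is simply to push the code far beyond normal input sizes and confirm that it still behaves correctly.

    #include <cassert>

    double lookupPrice(int)              { return 10.0; }
    double processOrder(int id, int qty) { return lookupPrice(id) * qty; }

    int main() {
        double total = 0.0;
        for (int i = 0; i < 1000000; ++i)   // far beyond a typical daily load
            total += processOrder(i, 1);
        assert(total == 10000000.0);        // still correct at this volume
        return 0;
    }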

Page 16: Implementation and Integration

Testing during the Implementation and Integration

• Product testing
  – Check all constraints
    • Timing constraints
    • Storage constraints
    • Security constraints
    • Compatibility
  – Review all documentation to be handed over to the client
  – Verify the documentation against the product

Page 17: Implementation and Integration

Testing during the Implementation and Integration

• Acceptance testing
  – The product (software plus documentation) is now handed over to the client organization
  – The client determines whether the product satisfies its specifications
  – Performed by either
    • The client organization
    • The SQA team in the presence of client representatives
    • An independent SQA team hired by the client

Page 18: Implementation and Integration

Testing during the Implementation and Integration

• Acceptance testing
  – Performed with actual data provided by the client
  – Four major components of acceptance testing
    • Correctness
    • Robustness
    • Performance
    • Documentation
  – Precisely what was done by the developer during product testing

Page 19: Implementation and Integration

CASE Tools for Implementation and Integration Phase

• Version control tools
• Build tools
• Configuration management tools

Page 20: Implementation and Integration

CASE Tools for Complete Software Process

• CASE tool
  – A single tool
  – Performs one task
  – A small organization can usually manage with just CASE tools
• Workbench
  – Supports a few activities
  – A medium-sized organization can probably manage with a workbench
• Environment
  – A complete solution for most of the process
  – A large organization needs an environment

Page 21: Implementation and Integration

Integrated Environment

• Integrated CASE tools are tools that share a common user interface
  – A similar “look and feel”
• Tool integration means that all tools communicate via the same data format
  – For example, ASCII files

Page 22: Implementation and Integration

Integrated Environment

• Process integration refers to an environment that supports a specific software process
• A technique-based or method-based environment
  – Supports a specific technique
    • For example, structured systems analysis
  – Usually comprises
    • Graphical support for the specification and design phases
    • A data dictionary
    • Some consistency checking
    • Some management support

Page 23: Implementation and Integration

Integrated Environment

• Technique-based or method-based environments
  – Support and formalization of otherwise manual processes
    • Examples: Analyst/Designer, Rose, Rhapsody (for Statecharts)
  – Advantage
    • Users are forced to use a specific method, and to use it correctly
  – Disadvantages
    • Users are forced to use a specific method
    • Inadequate support for programming-in-the-many

Page 24: Implementation and Integration

Environment for Business Applications

• An environment for building business-oriented products
  – The emphasis is on ease of use
  – Includes a number of standard GUI screens
    • These can be easily modified
  – Standard forms for input and output
  – Provides a capability to generate code
    • Input is the detailed design (the lowest level of abstraction)
    • Output is source code
    • A rise in productivity is expected

Page 25: Implementation and Integration

Potential Problems with Environments

• No one environment is ideal for all organizations
• Each has its strengths and weaknesses
  – Choosing the wrong environment can be worse than having no environment
  – Enforcing the wrong technique is counterproductive
• CASE environments should be avoided until the organization is at CMM level 3
  – We cannot automate a nonexistent process
  – A CASE tool or CASE workbench is fine

Page 26: Implementation and Integration

Metrics for the Implementation and Integration Phase

• The five basic metrics mentioned before
• Complexity metrics
• Number of test cases, and the percentage that resulted in failure (see the sketch below)
• Total number of faults, by type
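
A small sketch of the test-case metric, with invented counts: the number of test cases run and the percentage of them that resulted in failure.

    #include <cstdio>

    int main() {
        const int testCases = 1200;   // hypothetical counts
        const int failures  = 84;
        std::printf("test cases: %d, failure rate: %.1f%%\n",
                    testCases, 100.0 * failures / testCases);  // prints 7.0%
        return 0;
    }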

Page 27: Implementation and Integration

Challenges of the Implementation and Integration Phase

• Management issues include
  – Choosing appropriate CASE tools
  – Test case planning
  – Communicating changes to all personnel
  – Deciding when to stop testing