
Basics of Performance Testing



Page 1: Basics of Performance Testing

Topics to be discussed

Introduction

Performance Factors

Methodology

Test Process

Tools

Conclusion

Basics of Performance Testing

Abu Bakr Siddiq

Page 2: Basics of Performance Testing

Introduction

Why Performance Test?

What is Performance Test?

How Performance Test?

When to Performance Test?

Performance Vocabulary

Page 3: Basics of Performance Testing

Why, What, How & When to Performance Test

Why performance test the application?

To improve the speed, scalability, and stability of the system, and to build confidence that it performs as desired under different loads.

Ex: US holiday shopping in December; Indian Railway reservations.

What is performance testing?

Testing the performance factors of a system against an acceptable or suggested configuration.

How to performance test the application?

By using a tool that follows a performance methodology and a performance process.

When to performance test the application?

When performance requirements are available, a stable build exists, resources are available, a performance plan and methodology are defined, and entry and exit criteria are set.

Page 4: Basics of Performance Testing

Performance Testing Vocabulary

The different types of performance testing:

Capacity Testing: determining the server's failure point.

Component Testing: testing an architectural component of the application, such as servers, databases, networks, firewalls, and storage devices.

Endurance Testing: testing performance characteristics with the workload models anticipated during production, sustained over an extended period.

Load Testing: subjecting the system to a statistically representative load.

Smoke Testing: an initial run to observe system performance under normal load.

Spike Testing: testing performance characteristics with workloads that repeatedly spike beyond the levels anticipated during production.

Stress Testing: evaluating the application beyond peak load conditions.

Validation Testing: testing against the expectations that have been set for the system.

Volume Testing: subjecting the system to varying amounts of data and measuring performance.

[Figure: Throughput vs. user load. Throughput rises with user load up to the saturation point; load testing covers the region below saturation, stress testing the region beyond it.]
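The load-testing idea above can be sketched in a few lines of Python. This is a minimal illustration, not a real tool: `handle_request` is a hypothetical stand-in for an actual transaction (in practice it would be an HTTP call to the system under test), and throughput is measured as completed requests per second while the simulated user count steps upward.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request():
    """Hypothetical stand-in for a real transaction; replace with an HTTP call."""
    time.sleep(0.001)  # simulate roughly 1 ms of server work
    return True

def measure_throughput(users, requests_per_user=50):
    """Run `users` concurrent workers and return completed requests per second."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(lambda _: handle_request(),
                                range(users * requests_per_user)))
    elapsed = time.perf_counter() - start
    return len(results) / elapsed

# Step the user load upward; past the saturation point throughput stops growing.
for users in (1, 5, 10):
    print(f"{users:>2} users -> {measure_throughput(users):,.0f} req/s")
```

Plotting these numbers against the user count reproduces the curve in the figure: the knee where throughput flattens is the saturation point.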

Page 5: Basics of Performance Testing

Performance Factors

Throughput

Response Time

Latency

Tuning

Benchmarking

Capacity Planning


Page 6: Basics of Performance Testing

Throughput, Response Time & Latency

[Diagram: a client connects over the Internet to a web server and a DB server. N1–N4 are the network hop times between the tiers, A1–A3 are the processing times within each tier, and O1 is the client-side overhead.]

Network Latency = N1 + N2 + N3 + N4

Product Latency = A1 + A2 + A3

Actual Response Time = Network Latency + Product Latency

Latency = Actual Response Time + O1

Throughput: the number of requests/business transactions processed by the system in a specified time duration.
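The latency formulas above are plain sums, which a short worked example makes concrete. The hop and processing times here are hypothetical sample values, chosen only to illustrate the arithmetic:

```python
# Hop times from the slide's diagram (hypothetical sample values, in ms).
network_hops = {"N1": 5, "N2": 8, "N3": 8, "N4": 5}   # client <-> web <-> DB hops
app_times    = {"A1": 20, "A2": 35, "A3": 15}          # processing time per tier
o1 = 10                                                # client-side overhead

network_latency = sum(network_hops.values())               # N1+N2+N3+N4 = 26 ms
product_latency = sum(app_times.values())                  # A1+A2+A3   = 70 ms
actual_response_time = network_latency + product_latency   # 96 ms
latency = actual_response_time + o1                        # 106 ms

print(f"Network latency: {network_latency} ms")
print(f"Product latency: {product_latency} ms")
print(f"Actual response time: {actual_response_time} ms")
print(f"Latency: {latency} ms")
```

Separating the network and product terms this way shows where tuning effort should go: a large product latency points at the application tiers, a large network latency at the infrastructure.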

Page 7: Basics of Performance Testing

Tuning, Benchmarking & Capacity Planning

Tuning: the procedure by which system performance is enhanced by setting different values for the parameters (variables) of the product, the OS, and other components.

Ex: tuning a "Search" operation.

Benchmarking: comparing the throughput and response time of the system with those of competitive products.

Ex: comparing OpenOffice with MS Office.

Capacity Planning: the exercise of finding out what resources and configurations are needed.

Ex: suggesting the ideal software, hardware, and other components for the system to the customer.

Page 8: Basics of Performance Testing

Methodology

Collecting Requirements

Writing Test Cases

Automating Test Cases

Executing Test Cases

Analyzing Results

Tuning

Benchmarking

Capacity Planning


Page 9: Basics of Performance Testing

Collecting Performance Requirements

Performance requirement types: Generic, Specific

Performance requirement characteristics: Testable, Clear, Specific

Sources for gathering requirements:

Performance compared to the previous release of the same product

Performance compared to competitive product(s)

Performance compared to absolute numbers derived from actual need

Performance numbers derived from architecture and design

Graceful performance degradation

Page 10: Basics of Performance Testing

Authoring Test Cases

Performance test cases should include the following details:

List of operations or business transactions to be tested

Steps for executing those operations/transactions

List of product and OS parameters that impact performance testing, and their values

Loading pattern

Resources and their configuration (network, hardware, and software configurations)

The expected results

The product versions/competitive products to be compared with, and related information such as their corresponding fields

Page 11: Basics of Performance Testing

Automating Performance Test Cases

Performance testing naturally lends itself to automation, for several reasons:

Performance testing is repetitive

Performance test cases cannot be effective without automation

Results of performance testing need to be accurate; calculating them manually may introduce inaccuracy

Performance testing involves too many permutations and combinations to manage manually

Extensive analysis of performance results and failures is needed, which is very difficult to do manually
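An automated performance case, at its simplest, runs an operation repeatedly, computes a statistic, and checks it against an expectation. The sketch below assumes a hypothetical `search` operation and an illustrative 50 ms mean-response-time budget; both would come from the test case documentation in practice:

```python
import time
import statistics

def search(term):
    """Hypothetical operation under test; replace with the real transaction."""
    time.sleep(0.002)  # simulate roughly 2 ms of work
    return f"results for {term}"

def run_perf_case(operation, arg, iterations=100, max_mean_ms=50.0):
    """Automated case: execute repeatedly, compute the mean, check the budget."""
    samples = []
    for _ in range(iterations):
        t0 = time.perf_counter()
        operation(arg)
        samples.append((time.perf_counter() - t0) * 1000.0)
    mean_ms = statistics.mean(samples)
    return mean_ms, mean_ms <= max_mean_ms

mean_ms, passed = run_perf_case(search, "laptop")
print(f"mean={mean_ms:.2f} ms, passed={passed}")
```

Because the whole run is scripted, it can be repeated identically across builds, which is exactly the repeatability the list above calls for.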

Page 12: Basics of Performance Testing

Executing Performance Test Cases

The following aspects need to be considered while executing performance tests:

Start and end time of execution

Log and trace/audit files of the product and OS (for future debugging and repeatability)

Utilization of resources (CPU, memory, disk, network, and so on) on a periodic basis

Configuration of all environmental factors (hardware, software, and other components)

The performance factors listed in the test case documentation, recorded at regular intervals
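Recording start/end times and resource usage for each run can be wrapped in a small harness. This is a minimal sketch using only the standard library: it captures wall-clock start time, elapsed time, and peak Python memory via `tracemalloc`; a real harness would also sample CPU, disk, and network utilization with platform tools.

```python
import time
import tracemalloc

def run_and_record(operation):
    """Execute one test and record start time, elapsed time, and peak memory."""
    tracemalloc.start()
    start_wall = time.strftime("%Y-%m-%d %H:%M:%S")
    t0 = time.perf_counter()
    operation()
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()  # (current, peak) in bytes
    tracemalloc.stop()
    return {"start": start_wall, "elapsed_s": elapsed, "peak_bytes": peak}

# Illustrative workload: build a list of 100,000 squares.
record = run_and_record(lambda: [i * i for i in range(100_000)])
print(record)
```

Persisting these records per run gives the audit trail needed for the debugging and repeatability point above.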

Page 13: Basics of Performance Testing

Analyzing Performance Test Results

Performance test results are used to conclude:

Whether performance of the product is consistent when tests are executed multiple times

What performance can be expected for what type of configuration

Which parameters impact performance, and how they can be set for better performance

What the effect is of scenarios involving a mix of performance factors

What the effect is of product technologies such as caching

What load and performance numbers are acceptable

What the optimum throughput and response time of the product are for a given set of factors

Which performance requirements are met, and how the performance compares to the previous version or to expectations
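Consistency across runs is usually judged with summary statistics rather than raw numbers. The sketch below uses hypothetical response-time samples and the standard library's `statistics` module to compute the mean, the 95th percentile, and the standard deviation; a p95 far above the mean flags outlier runs worth tracing.

```python
import statistics

# Hypothetical response-time samples (ms) from repeated executions of one test.
samples = [42, 45, 44, 43, 120, 46, 44, 47, 43, 45,
           44, 46, 48, 44, 43, 45, 44, 46, 95, 44]

mean = statistics.mean(samples)
# quantiles(n=20) returns 19 cut points; index 18 is the 95th percentile.
p95 = statistics.quantiles(samples, n=20)[18]
stdev = statistics.stdev(samples)

print(f"mean={mean:.1f} ms  p95={p95:.1f} ms  stdev={stdev:.1f} ms")
```

Here two slow outliers (95 ms and 120 ms) barely move the mean but dominate the p95, which is why percentile figures, not averages, are the usual basis for response-time requirements.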

Page 14: Basics of Performance Testing

Performance Tuning

Two ways to get optimum mileage:

Tuning the product parameters

Tuning the operating system parameters

Product parameters:

Repeat the performance tests for different values of each parameter that impacts performance

If a particular parameter's value is changed, it may require changes to other parameters

Repeat the tests with default values for all parameters (factory-settings tests)

Repeat the performance tests for low and high values of each parameter, and for combinations

Operating system parameters:

File system parameters

Disk management parameters

Memory management parameters

Processor management parameters

Network management parameters
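The "repeat the tests for low and high values of each parameter, and for combinations" step is a grid sweep, which `itertools.product` expresses directly. The parameter names and the scoring function below are purely illustrative; in a real sweep, `run_test` would execute an actual performance test with that configuration applied.

```python
import itertools

# Hypothetical tunable parameters with low/high values (names are illustrative).
param_grid = {
    "db_pool_size": [10, 50],
    "cache_mb": [64, 256],
    "worker_threads": [4, 16],
}

def run_test(config):
    """Stand-in for one performance run; returns a made-up throughput score."""
    return (config["db_pool_size"] * 0.5
            + config["cache_mb"] * 0.1
            + config["worker_threads"] * 2.0)

best = None
# Repeat the test for every combination of parameter values.
for values in itertools.product(*param_grid.values()):
    config = dict(zip(param_grid.keys(), values))
    score = run_test(config)
    if best is None or score > best[0]:
        best = (score, config)

print(f"best throughput {best[0]:.1f} with {best[1]}")
```

With three parameters at two levels each, this is 8 runs; the combinatorial growth with more parameters is exactly why the automation slide warns that manual execution does not scale.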

Page 15: Basics of Performance Testing

Performance Benchmarking & Capacity Planning

The steps involved in performance benchmarking are as follows:

Identify the transactions/scenarios and the test configuration

Compare the performance of the different products

Tune the parameters of the products being compared fairly, so that each delivers its best performance

Publish the results of performance benchmarking

Capacity planning is identifying the right configuration, which is of three types:

Minimum required configuration

Typical configuration

Special configuration
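The essence of benchmarking, running the identical workload against competing implementations and comparing the timings, can be shown in miniature with the standard `timeit` module. As a stand-in for two competing products, this sketch compares two implementations of the same membership transaction (a list scan versus a set lookup):

```python
import timeit

data_list = list(range(10_000))
data_set = set(data_list)

# The same transaction ("is 9999 present?") run against both implementations.
t_list = timeit.timeit(lambda: 9_999 in data_list, number=2_000)
t_set = timeit.timeit(lambda: 9_999 in data_set, number=2_000)

print(f"list scan : {t_list * 1000:.2f} ms total")
print(f"set lookup: {t_set * 1000:.2f} ms total")
```

Note the fairness requirement from the steps above: both sides get the same data, the same query, and the same iteration count, just as competing products must each be tuned and exercised under identical conditions before results are published.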

Page 16: Basics of Performance Testing

Test Process

Resource Requirements

Test Lab Setup

Responsibilities

Setting up Traces & Audits

Entry & Exit Criteria


Page 17: Basics of Performance Testing

Performance Test Process

Obtain measurable, testable requirements

Create a performance test plan

Design test cases

Automate test cases

Evaluate entry criteria

Perform and analyze performance test cases

Evaluate exit criteria

Page 18: Basics of Performance Testing

Resource Requirements: All the resources are specifically needed, hence they shall be planned for and obtained in advance. Resources are to be dedicated exclusively to the current system, without interchanging roles and responsibilities often.

Test Lab Setup: A test lab with all required equipment is to be set up prior to execution. The test lab has to be configured cautiously, as a single mistake can mean running the tests again.

Responsibilities: Performance defects may cause changes to the architecture, design, and code. The customer-facing team communicates the performance requirements. Multiple teams are involved in performance testing the system, hence a matrix describing the responsibilities of each team is part of the test plan.

Page 19: Basics of Performance Testing

Setting up Product Traces and Audits: Performance test results need to be associated with traces and audit trails in order to analyze the results. Audits and traces are to be planned in advance, or else they may start impacting the performance results.

Entry and Exit Criteria: Performance tests require a stable product because of the complexity and accuracy involved. Changes to the product mean the tests must be repeated. Hence performance testing starts only after the product meets a set of criteria.

Page 20: Basics of Performance Testing

Tools for Performance Testing

Commercial tools: LoadRunner (HP), QA Partner (Compuware), SilkTest (Segue), MS Stress Tool (Microsoft)

Open source tools: WebLOAD, JMeter, OpenSTA

Challenges

Conclusion


Page 21: Basics of Performance Testing

Thank you