BT5 Concurrent Session, 11/12/15, 11:30 a.m.
“Performance Testing Cloud-Based Systems”
Presented by:
Edwin Chan
Deloitte Inc.
Brought to you by:
TechWell, 340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 · 904-278-0524 · [email protected] · www.techwell.com
Edwin Chan, Deloitte Inc.
Edwin is a Technology practitioner in Deloitte's Quality Assurance Community of Practice. He has extensive experience in all phases of the SDLC, with PMI PMP and QAI CSTE credentials. He has about ten years of consulting experience in the delivery of complex implementation projects in the Financial Services Industry, including major banks and financial institutions in Canada and the United States. With his deep knowledge of QA and Testing, he has held a variety of Test Lead roles across different types of client engagements. Edwin is knowledgeable about the latest trends in performance testing and automation of cloud solutions, with an emphasis on Agile/Lean Project Management methodology.
Better Software Conference: Performance Testing Cloud-Based Systems
November 12, 2015
Edwin Chan
Speaker
Edwin Chan
© Deloitte LLP and affiliated entities.
Taxonomy and Scope of Discussion
Cloud computing is a broad term that encompasses multiple Deployment Models and Types of Services. Cloud computing in this discussion refers to the SaaS model, focusing on Hybrid Cloud services.

Deployment Models: Private, Public, Hybrid
Types of Services: SaaS (Software as a Service), PaaS (Platform as a Service), IaaS (Infrastructure as a Service)

SaaS introduces challenges, including:
• integration with identity systems for single sign-on
• data integration with on-premises systems or other SaaS applications
• variable networking performance

From the deployment channel perspective, co-existing with the Hybrid Cloud, there is often integration of the cloud application with one or more on-premises applications in the enterprise IT landscape.
Performance testing determines how a system performs under particular workloads. This discussion refers to the following types of performance tests.
• Load/Volume Test: Can the system handle a normal load?
• Stress Test: Can the system handle the load of a peak season (e.g., sale transactions on Black Friday or Boxing Day)?
• Endurance/Longevity Test: Will it stay up, and how long will it stay up?
• Break Test: How long will it stay up before breaking apart, or before performance degrades to an unacceptable level?
• Scalability Test: Can the system handle more users/transactions and grow with the enterprise?
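The distinction between these test types comes down to the load shape and duration driven against the system. Below is a minimal, runnable sketch of the load/volume case; the call to the system under test is stubbed out with a simulated delay so the example is self-contained (a real test would issue HTTP requests, typically through a tool such as JMeter):

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def call_system() -> float:
    """Stand-in for one request to the system under test.
    A real load test would issue an HTTP call here; a ~10 ms
    sleep simulates server-side processing so the sketch runs."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated service time
    return time.perf_counter() - start

def load_test(concurrent_users: int, requests_per_user: int) -> dict:
    """Drive the stubbed system with parallel virtual users and
    report request count, average, and 95th-percentile timings."""
    def user_session():
        return [call_system() for _ in range(requests_per_user)]

    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        sessions = list(pool.map(lambda _: user_session(),
                                 range(concurrent_users)))
    timings = sorted(t for s in sessions for t in s)
    return {
        "requests": len(timings),
        "avg_s": statistics.mean(timings),
        "p95_s": timings[int(0.95 * len(timings)) - 1],
    }

report = load_test(concurrent_users=5, requests_per_user=10)
print(report)
```

Raising `concurrent_users` toward the expected peak turns the same harness into a stress test; running it for hours turns it into an endurance test.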
Cloud-based Performance Testing Challenges
Cloud computing is growing to be of strategic importance in the enterprise. Inevitably, part of the delivered solution is no longer on-premises, which adds a layer of complexity and new challenges to performance testing of cloud-based solutions.
Networks are no longer solely within the control of an organization's domain, and variances in end-to-end performance may be due to cloud network latency. The slowest network segment could be the performance bottleneck of an application.
Leverage sophisticated application monitoring tools with network monitoring capability to identify network performance bottlenecks.
Because cloud-based systems rely on Web 2.0 technologies that operate on a variety of devices, and on frameworks that perform differently across those devices, browser-related performance issues are amplified.
Have a clear understanding of the non-functional requirements to define a testing strategy early on, addressing the combined device-browser-framework performance issues.
[Diagram: an on-premises application connected to private and public cloud segments; the 250 Mbps links are fast, while a 10 Mbps segment is the bottleneck.]
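One simple way to locate the slowest segment is to time the TCP connect to each hop along the path. The sketch below times connections to a local listener so it is runnable as-is; in practice the `segments` map would list the real endpoints (load balancer, cloud gateway, on-premises integration point), and all names here are placeholders:

```python
import socket
import time

def connect_time(host: str, port: int, attempts: int = 3) -> float:
    """Average TCP connect time to one network segment endpoint."""
    total = 0.0
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            total += time.perf_counter() - start
    return total / attempts

# Local listener so the demo runs without real infrastructure;
# replace with actual segment endpoints in a real investigation.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen()
host, port = listener.getsockname()

segments = {"local-segment": (host, port)}  # placeholder endpoints
timings = {name: connect_time(h, p) for name, (h, p) in segments.items()}
bottleneck = max(timings, key=timings.get)
print(f"slowest segment: {bottleneck} ({timings[bottleneck]:.6f}s)")
listener.close()
```

This only measures connection setup; a full picture still requires the APM and network monitoring tooling mentioned above.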
Cloud-based vs. Traditional Performance Testing
Is performance testing of cloud solutions fundamentally different from that of on-premises applications?
What are the best practices that work for performance testing of both cloud and on-premises solutions?
Is performance testing of cloud solutions immune to the key challenges typical of any on-premises solution?

Challenge 1
Typical Issue:
• Late initiation of the performance test strategy. Performance testing is typically executed toward the end of the project, just before go-live; hence, project leadership often initiates the formulation of the performance test strategy late in the game.
Impact / Risk to Project Delivery:
• One of the key risks performance testing aims to uncover early in the project is an architectural or solution design flaw, so that such issues can be remediated before it is too late.
Challenge 2
Typical Issue:
• Poorly defined performance requirements. Typical quality issues in performance requirements definition include statistically unquantifiable, unrealistic, vague, and inaccurate definitions.
Impact / Risk to Project Delivery:
• Well-defined performance requirements are a key success factor in assuring that we are building the right solution. Without them, we introduce the risk of not building the right solution.
Challenge 3
Typical Issue:
• Incomplete performance requirements. Three independent categories of performance requirements constitute their completeness:
  • Response/processing times: how fast requests are processed, in interactive online transactions or batch jobs. Thirty minutes may be excellent for a big batch job, but unacceptable for loading a web page.
  • Throughput: the rate at which incoming requests are completed. It defines the load on the system and is measured in operations per unit of time (e.g., transactions/sec or adjudicated claims/hr).
  • Concurrency: the number of users or threads working simultaneously.
Very often, one or more of these categories is missing.
Impact / Risk to Project Delivery:
• Missing requirements are the root cause of missing test scenarios, and missing performance scenarios lead to a negative user experience from the performance perspective.
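Although the three categories are specified independently, their steady-state values are related by Little's Law (average concurrency = throughput × average response time), which gives a quick consistency check on a requirements set. A small sketch with illustrative numbers, not taken from any real project:

```python
def implied_concurrency(throughput_per_s: float, avg_response_s: float) -> float:
    """Little's Law: the average number of requests in the system
    equals the arrival rate times the average time each request
    spends in the system."""
    return throughput_per_s * avg_response_s

# Illustrative requirement set: 200 tx/sec at an average 0.5 s response.
users_needed = implied_concurrency(200, 0.5)
print(users_needed)  # 100.0 concurrent requests

# If the stated concurrency requirement is far from this figure,
# at least one of the three numbers is inconsistent or missing.
```

Running this check early surfaces incomplete or contradictory requirements before test scenarios are built on top of them.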
Challenge 4
Typical Issue:
• Lack of a sound data seeding strategy within the overall performance test strategy. Due to increased reliance on someone else providing the right volume of data to execute performance tests, little consideration is given to developing a sound data seeding strategy as part of the performance test. For projects involving data migration from a legacy application to a cloud-based application, we often wait for converted data to become available before conducting the performance test, which is often too late in the game.
Impact / Risk to Project Delivery:
• Without the right mix of data (types and distribution) and an appropriate volume, performance test results are neither reliable nor trustworthy.
© Deloitte LLP and affiliated entities.
10
Challenge 5
Typical Issue:
• Lack of investment in Application Performance Management (APM) tools to monitor and perform proactive diagnostics during performance tests. APM tools for monitoring and diagnosis are often not available for use in performance testing.
Impact / Risk to Project Delivery:
• Without proper APM tools to collect technical metrics across the tiers of the application architecture, development teams spend significant effort troubleshooting, diagnosing, and pinpointing the root causes of performance defects, causing lengthy delays in the project timeline.
Is performance testing of cloud solutions immune to the key challenges typical of any on-premises solution?

The Answer
No, definitely not. Performance testing of cloud solutions is NOT immune to the typical challenges of any solution traditionally hosted on-premises.

Key success factors for performance testing of cloud solutions include:
• Early initiation of the performance test strategy
• Well-defined and complete performance requirements
• A sound data seeding strategy within the overall performance test strategy
• Investment in APM tools
Best Practices: The Big Picture
With the agility and flexibility in the development and deployment of cloud applications, it is a natural fit to apply an agile development methodology in the SDLC.

The traditional three-step testing process (Planning, Preparation, Execution) of the waterfall model takes the following shape under an Agile development methodology/framework, whereby early feedback is built into the process. Hence, early performance testing is a natural fit for cloud solutions.

Waterfall: Planning → Preparation → Execution
Agile: Release Planning → Sprint 1 (Planning → Preparation → Execution) → Sprint 2 (Planning → Preparation → Execution) → … → Sprint N (Planning → Preparation → Execution)
Best Practices
1. Conduct Early Performance Testing
A. Start developing the performance testing strategy early in the project.
B. Conduct early performance testing iteratively, or in short sprints, to gather early feedback.
C. Introduce performance testing at the service layer, with special attention to change control in the service interface.
D. Ensure you have access to a service-layer testing tool that meets your project needs.
E. Engagement and collaboration with the architecture team and the performance testing team is a key success factor; start early so as to strategize with the teams and gain their support.
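On the change-control point, one lightweight way to keep service-interface drift visible to the performance team is to fingerprint the service contract and flag changes. A sketch under the assumption that the contract is available as a JSON document; the `/claims` fragment and field names are hypothetical:

```python
import hashlib
import json

def contract_fingerprint(contract: dict) -> str:
    """Stable hash of a service contract; a changed fingerprint
    signals that service-layer performance scripts need review."""
    canonical = json.dumps(contract, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

# Hypothetical contract fragments for a service under test.
v1 = {"path": "/claims", "method": "POST", "fields": ["id", "amount"]}
v2 = {"path": "/claims", "method": "POST",
      "fields": ["id", "amount", "currency"]}

baseline = contract_fingerprint(v1)
print(contract_fingerprint(v2) != baseline)  # True: interface changed
```

Checked into the test repository, the baseline fingerprint turns silent interface changes into an explicit, reviewable event in each sprint.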
Best Practices
2. Outsource Performance Testing
A. Partially outsource performance testing to the cloud solution provider whenever practical and/or feasible, and ensure that the end-to-end response time is measured.
• The three components of performance testing, namely the server, the network, and the GUI, need to be measured separately.
• Most cloud vendors provide server performance testing services; very few provide GUI and network performance metrics, as these items are considered out of scope. Vendor-provided measurements therefore cover only part of the relevant performance metrics.
• There is a growing need to measure the end-to-end user experience due to the variety of devices, operating systems, browsers, and network technologies.
Best Practices
3. Tool Selection
A. Validate new tools with proofs of concept (POCs) during the formulation stage of the strategy.
• Incorporate tool selection with POCs into the performance test strategy; make sure the tool works for you, as there is no one-size-fits-all solution.

Testing Activity: Description
• Quality/Test management: tools for managing testing strategy, plans, test cases/scripts, testing processes, exploratory testing, defect management, status reporting, and executive dashboards.
• Test automation (functional and regression): frameworks or tools for automating functional tests (GUI and API tests*).
• Service virtualization: frameworks or tools for simulating components not available at the time of testing.
• Load/Performance testing: tools for testing load and performance.

Integration with other tools: Performance testing, Test automation, and Service virtualization each integrate with the other two.

* API Test examples: Web Services (SOAP and REST), Databases, FTP, Message Queue.
Best Practices
4. Considerations for Selecting the Right Tools
A. The level of programming effort required in the tool to create the performance test scripts; some are easier than others.
B. Validate whether service virtualization can eliminate or alleviate the bottleneck of integrating critical interfaces required in the performance scenario.
C. The ability to support the variety of protocols and technologies needed in your projects, ranging from mainframe to middleware to intelligent Web 2.0 technologies.
D. Budget, support, and learning curve.

Examples:
• Cloud: JMeter (open source); HP Performance Center, BlazeMeter (commercial)
• On-premises: JMeter (open source); HP LoadRunner, Microsoft Visual Studio (commercial)
Best Practices
5. Automate Injection of Test Data
A. Ensure the feasibility of automating the injection of large volumes of test data.
B. Automate the data seeding required in the performance test. This means the performance testing tool and the data seeding tool are related: the interoperability of the data seeding automation tool and the performance testing tool should be taken into consideration.
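A data seeding script typically generates a controlled mix of record types at volume and bulk-loads it before the test run, so that the data's types and distribution match the scenario. A minimal sketch using SQLite as a stand-in target; the schema and the 70/20/10 type distribution are illustrative assumptions, not from the talk:

```python
import random
import sqlite3

def seed_accounts(conn: sqlite3.Connection, volume: int, seed: int = 42) -> None:
    """Seed test data with a controlled mix of record types.
    In practice the distribution comes from profiling production
    data; here a 70/20/10 split stands in as an example."""
    random.seed(seed)  # reproducible seeding, comparable test runs
    conn.execute(
        "CREATE TABLE IF NOT EXISTS accounts "
        "(id INTEGER PRIMARY KEY, acct_type TEXT, balance REAL)"
    )
    types = ["retail"] * 70 + ["business"] * 20 + ["premium"] * 10
    rows = [
        (random.choice(types), round(random.uniform(0, 100_000), 2))
        for _ in range(volume)
    ]
    conn.executemany(
        "INSERT INTO accounts (acct_type, balance) VALUES (?, ?)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
seed_accounts(conn, volume=10_000)
count = conn.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
print(count)  # 10000
```

Fixing the random seed is the interoperability hook: the load-test scripts can be written against a data set they can regenerate identically before every run.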
Best Practices
6. Integrate the Full Performance Testing Lifecycle
• Test planning: Define business requirements for application performance and assess the impact of architecture, design, and security on performance.
• Test preparation: Translate user requirements into load testing objectives; create virtual user scripts; define and configure user behavior; understand the network impact within the application.
• Test execution: Start performance testing early as part of an agile process; execute end-to-end testing in complex environments; coordinate testing with software changes, configuration management, and version control.
• Test analysis: Iteratively employ static/dynamic analysis for software quality analysis, measurement, and security analysis.
• Monitoring in production: Monitor applications and the end-user experience, and perform root cause analysis.
Continuous testing feeds the results of each stage back into planning.
Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee, and its network of member firms, each of which is a legally separate and independent entity. Please see www.deloitte.com/about for a detailed description of the legal structure of Deloitte Touche Tohmatsu Limited and its member firms.
Deloitte, one of Canada's leading professional services firms, provides audit, tax, consulting, and financial advisory services. Deloitte LLP, an Ontario limited liability partnership, is the Canadian member firm of Deloitte Touche Tohmatsu Limited.
This communication contains general information only, and none of Deloitte Touche Tohmatsu Limited, its member firms, or their related entities (collectively, the “Deloitte Network”) is, by means of this communication, rendering professional advice or services. No entity in the Deloitte network shall be responsible for any loss whatsoever sustained by any person who relies on this communication.
© 2015. For information, contact Deloitte Touche Tohmatsu Limited.