Invited Talk, Chicago Quality Assurance Association, Chicago, June 26, 2007. Overview of performance testing strategy for handheld devices and multi-tier systems.
mVerify: A Million Users in a Box®
Performance Testing Mobile and Multi-Tier Applications
Chicago Quality Assurance Association
June 26, 2007
Robert V. Binder
mVerify Corporation
[email protected]
312 881-7337 x1001
www.mverify.com
Goals of Performance Testing
Validate time requirements/expectations
Validate utilization requirements/expectations
Validate capacity requirements/expectations
Reveal load-related bugs
Prove compliance: SLAs, contracts, competitive rankings
Fire-drill for recovery
Assess robustness to shocks
Business Impact/ROI
In 2002, slow e-commerce downloads led to an estimated $25 billion of abandoned transactions
In 2005, Google’s 15-minute outage was estimated to have cost at least $150,000 in lost ad revenue
Recent study: nearly two-thirds of mobile employees rank poor response time as a “significant” inhibitor to working remotely over a VPN
Business Impact/ROI
[Chart: avoidable cost versus availability level, from one 9 to six 9s]
Avoidable costs: lost revenue, lost user productivity, lost IT productivity, overtime payments, wasted goods, fines
Risk mitigation: enterprise demise, lawsuits, negative publicity, personnel morale
Solution cost: software tools, hardware, staffing, services, training
Source: IBM, “Maximizing Web Site Availability,” February 2002
Basic Objectives: RASP
Reliability: probability of a failure occurring within a given period of time
Availability: percent achieved up-time, not including scheduled downtime
Scalability: the range of load over which an incremental input consumes the same resources
Performance: the rate at which work is done
Reliability
Reliability: probability of non-failure
  Over total operational hours or transactions
  Across the entire user population
Can be estimated during testing, if tests are sufficiently realistic
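For instance, under the common constant-failure-rate assumption, the failure intensity observed in a realistic test run gives a reliability estimate directly. A minimal Python sketch; the test duration and failure count are illustrative assumptions, not figures from this talk:

```python
import math

# Assumed inputs (illustrative numbers only)
test_hours = 2.0   # duration of a realistic load-test run
failures = 3       # failures observed during the run

# Failure intensity: failures per operational hour
lam = failures / test_hours

def reliability(t_hours: float) -> float:
    # Probability of surviving a mission of t hours,
    # assuming a constant failure rate (exponential model)
    return math.exp(-lam * t_hours)

print(f"lambda = {lam:.3f} failures/hour")
print(f"R(1 hour)   = {reliability(1.0):.4f}")
print(f"R(24 hours) = {reliability(24.0):.4f}")
```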
Availability – the “nines”
Annual unscheduled downtime:
Six nines     99.9999%   32 seconds
Five nines    99.999%    5 minutes
Four nines    99.99%     53 minutes
Three nines   99.9%      8.8 hours
Two nines     99%        87 hours (3.6 days)
One nine      90%        876 hours (36 days)
Availability = percent up-time
MTTR: mean time to recover, repair, restart …
Availability = 1 / (1 + MTTR × failure intensity), equivalently MTBF / (MTBF + MTTR)
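The downtime figures in the table follow directly from the availability percentages, and the formula above converts a failure rate and MTTR into availability. A small Python sketch of the arithmetic; the example failure rate is an assumption chosen for illustration:

```python
# Annual unscheduled downtime implied by each availability level
HOURS_PER_YEAR = 8766  # 365.25 days

for nines, avail in [(2, 0.99), (3, 0.999), (4, 0.9999),
                     (5, 0.99999), (6, 0.999999)]:
    downtime_min = (1 - avail) * HOURS_PER_YEAR * 60
    print(f"{nines} nines: {downtime_min:,.1f} minutes/year")

# Availability from failure intensity and MTTR:
# A = 1 / (1 + MTTR * lambda) = MTBF / (MTBF + MTTR)
mttr_hours = 0.1            # 6 minutes
lam = 1_000 / 1_000_000     # e.g., 1,000 failures per million hours
availability = 1 / (1 + mttr_hours * lam)
print(f"A = {availability:.9f}")
```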
Some Data Points
System                          Reliability (failures/million hours)   Availability (6 min MTTR)
NT 4.0 Desktop                  82,000                                  0.999000000
Windows 2K Server               36,013                                  0.999640000
Common Light Bulb               1,000                                   0.999990000
Stepstone OO Framework          5                                       0.999999500
Tellabs Digital Cross-Connect   3                                       0.999999842
Performance Metrics
Response time: round-trip time
Throughput: aggregate transaction processing rate
Utilization: average % busy
Failure intensity
Recovery time
[Chart: throughput (trans/sec) and average response time (sec) versus utilization, 0% to 100%]
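All of these metrics fall out of raw timing data. A minimal Python sketch, assuming a list of per-transaction start/end timestamps; the sample data and the single-server utilization approximation are illustrative assumptions:

```python
# (start_time, end_time) pairs in seconds, observed in a test window
samples = [(0.0, 0.12), (0.3, 0.41), (0.6, 0.78),
           (0.9, 1.07), (1.2, 1.31), (1.5, 1.69)]

window = max(end for _, end in samples) - min(start for start, _ in samples)
latencies = sorted(end - start for start, end in samples)

throughput = len(samples) / window                 # transactions per second
avg_response = sum(latencies) / len(latencies)     # mean round-trip time
p90 = latencies[int(0.9 * (len(latencies) - 1))]   # crude 90th percentile
busy_time = sum(latencies)                         # single-server approximation
utilization = min(busy_time / window, 1.0)         # average % busy

print(f"throughput   = {throughput:.2f} trans/sec")
print(f"avg response = {avg_response:.3f} sec")
print(f"p90 response = {p90:.3f} sec")
print(f"utilization  = {utilization:.0%}")
```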
Strategies
Performance testing
  Assess compliance with performance goals
  Assess compliance with resource-utilization goals
  Provides data to estimate reliability and availability
Stress testing, load testing
  Assess response to overload scenarios
  Assess recovery from failure modes
Strategies
Benchmarks
  Assess throughput on an open-standard test suite
Scalability
  Assess performance linearity (see the sketch after this list)
Profiling
  Identify utilization bottlenecks by component
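A simple way to assess linearity is to compare throughput efficiency at increasing offered loads. A Python sketch, assuming the same workload can be replayed at several load levels; the measurements shown are illustrative:

```python
measurements = [
    # (offered load, throughput achieved) -- illustrative data
    (100, 99), (200, 197), (400, 390), (800, 640),
]

base_load, base_tput = measurements[0]
for load, tput in measurements:
    # Scalability is linear while efficiency stays near 1.0;
    # a falling ratio marks the knee of the curve.
    efficiency = (tput / load) / (base_tput / base_load)
    print(f"load {load:4d}: efficiency {efficiency:.2f}")
```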
Typical Server Side Setup
[Diagram: multiple emulated clients on an internal LAN driving the server(s) under test]
Issues
Client emulation machines
  Synchronization, overall test execution
  Capacity
  Multi-homed, test control subnet
Test vs. production systems
  Separate server farm?
  Network contention
  Isolation versus scale/scope
Version/configuration control
  System under test
  Test environment
  Database set/reset
Issues
Actual end-user/customer experience?
  Network latency, QoS …
Thin clients?
  Browser and client software versions
  Client OS?
Edge Monitoring
[Diagram: emulated clients on the internal LAN, plus monitored clients connecting over the Internet, all driving the server(s) under test]
Issues
Client monitoring machines
  Synchronization: achieving the desired test input at the desired time
  Capacity
  Data collection
  Availability
  Security (beta test agreement?)
Network configuration
  DMZ
  Equipment, setup, security considerations
Connectivity – a Wild Card
[Diagram: emulated clients on the internal LAN and monitored clients across the Internet driving the server(s) under test]
The Internet path introduces random latency, jitter, lost packets, re-ordered packets, re-routed packets, duplicate packets, bandwidth restrictions, bit errors, background load, QoS effects, and operational events.
Controlled Connectivity
[Diagram: a network emulator inserted between the monitored clients and the server(s) under test, alongside emulated clients on the internal LAN]
The emulator applies controlled latency, jitter, lost packets, re-ordered packets, re-routed packets, duplicate packets, bandwidth restrictions, bit errors, background load, QoS effects, and operational events; one way to script this is sketched below.
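On Linux, tc/netem is one widely used way to build such an impairment emulator. A minimal Python wrapper sketch; the interface name and impairment values are assumptions, and it must run with root privileges:

```python
import subprocess

IFACE = "eth0"  # assumed interface facing the monitored clients

def impair(delay_ms=100, jitter_ms=20, loss_pct=1.0,
           duplicate_pct=0.5, reorder_pct=2.0):
    # Apply controlled latency, jitter, loss, duplication, and
    # re-ordering on the chosen interface via tc/netem.
    subprocess.run(
        ["tc", "qdisc", "replace", "dev", IFACE, "root", "netem",
         "delay", f"{delay_ms}ms", f"{jitter_ms}ms",
         "loss", f"{loss_pct}%",
         "duplicate", f"{duplicate_pct}%",
         "reorder", f"{reorder_pct}%"],
        check=True)

def clear():
    # Remove the impairment and restore normal connectivity.
    subprocess.run(["tc", "qdisc", "del", "dev", IFACE, "root"],
                   check=True)

if __name__ == "__main__":
    impair()
    # ... run the test scenario against the impaired link ...
    clear()
```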
Issues
Complexity
  Impairment modeling
  Impairment emulator programming
  Coordination with emulated clients
  Coordination with monitored clients
Specialized skills
  Wireshark (Ethereal)
  TCP log analysis
How to Maximize Reliability
Combine realistic functional and load testing
  Representative variation in load and usage
  Supports reliability/availability estimation
  Saves time: more test goals supported with fewer tests
  Typically effective in finding “weird” bugs
Security?
  Add abuse cases to the usage profile
  Interleave with normal traffic (sketched below)
“You play like you practice”
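One way to interleave abuse cases with normal traffic is to fold them into the same weighted usage profile that drives test generation. A minimal Python sketch; the operations and weights are illustrative assumptions, not from this talk:

```python
import random

# Usage profile: operation -> probability weight.
# Abuse cases sit in the same profile as normal traffic.
profile = {
    "browse_catalog":          0.55,
    "search":                  0.25,
    "checkout":                0.12,
    "update_account":          0.05,
    "abuse_sql_injection":     0.02,
    "abuse_oversized_upload":  0.01,
}

ops, weights = zip(*profile.items())

def next_operation() -> str:
    # Draw the next test event according to the profile weights,
    # so abuse arrives mixed with normal load.
    return random.choices(ops, weights=weights, k=1)[0]

for _ in range(10):
    print(next_operation())
```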
Use Dynamic Loading
The real world isn’t flat
Vary the behavior rate for each actor/actor group:
  Arc
  Flat
  Internet fractal
  Negative ramp
  Positive ramp
  Random
  Spikes
  Square wave
  Waves
[Chart: actual “Waves” loading, events per second versus time in seconds]
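Each shape can be expressed as a target event rate over time and fed to a load driver. A Python sketch of a few of the shapes listed above; all parameters are illustrative assumptions:

```python
import math
import random

# Target rates in events/second for an actor group, one function per shape.
def flat(t, base=1000):
    return base

def positive_ramp(t, base=100, slope=0.05):
    return base + slope * t

def negative_ramp(t, base=1000, slope=0.05):
    return max(base - slope * t, 0)

def waves(t, base=1000, amp=800, period=3600):
    return base + amp * math.sin(2 * math.pi * t / period)

def spikes(t, base=200, spike=2500, every=1800, width=60):
    return spike if (t % every) < width else base

def square_wave(t, low=200, high=1200, period=1200):
    return high if (t % period) < period / 2 else low

def random_load(t, base=1000, jitter=300):
    return max(base + random.uniform(-jitter, jitter), 0)

# Sample the "waves" shape across a run, e.g. to feed a load driver.
for t in range(0, 7200, 1800):
    print(t, round(waves(t)))
```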
Case Study
[Architecture diagram: Event Simulator, DB Script Writer, Test Oracle, Comparator, Test Object Serializer, Java Driver, TX Formatter, MainFrame 3270 Driver, and SilkTest driving the system under test (Java GUI, Java API, Java Servers) and producing Test Run Reports; the legend distinguishes custom test components, 3rd-party products, and the system under test]
Case Study
Every test run unique and realistic
  Simulated user behavior to generate transactions
  Automatically submitted in real time
  ~100,000 test cases per hour
  ~200 complete daily cycles
  Evaluated functionality and performance
Controlled distributed heterogeneous test agents (Java, 4Test, Perl, SQL, Prolog) driving a Java/CORBA GUI/API
Five-person team, huge productivity increase
Achieved proven high reliability
  Last pre-release test run: ~500,000 events in two hours, no failures detected
  No production failures
Notes
Capture/replay scripts
  Static think time can distort load and response time (see the sketch below)
Performance analysis
  Neil Gunther – books and web site
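Sampling think time from a distribution, instead of hard-coding it, keeps emulated users from firing in lockstep. A minimal Python sketch; the mean think time is an assumption:

```python
import random

MEAN_THINK_SEC = 8.0  # assumed average user think time

def think_time() -> float:
    # Exponentially distributed think times model independent users
    # better than a fixed delay, which synchronizes the load and
    # distorts both throughput and measured response time.
    return random.expovariate(1.0 / MEAN_THINK_SEC)

print([round(think_time(), 1) for _ in range(5)])
```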
Tools
Open source
  OpenSTA
  PushToTest
  The Grinder
  http://opensourcetesting.org/performance.php
Scripting systems: Tcl, Perl, Ruby, Python
Built-in
  Windows Perfmon
  *nix: SNMP, others
mVerify Testing System
End-to-end
Edge to core
Integrated functional and performance testing
  Test objects
  XML performance measurements
Adapters for Windows Mobile, Web Services, ODBC, *nix command line
Forthcoming
  Profile-based test generation
  Adapters and plug-ins for many other platforms
MTS/RPM
[Diagram: the MTS Console on a console host coordinates MTS Test Agents on agent hosts and produces test run reports; MTS Remote Agents with RPM plug-ins run on client and server hosts under test, driving the client or server under test. A host under test may be a cell phone, PDA, desktop, server, embedded processor, network equipment, access point, or base station.]
MTS/RPM
Integrates functional and performance testing
MTS 1.5: RPM plug-in for Windows Mobile
Plug-ins for Win32 and *nix coming soon