Mark Nesson
June, 2008
Fine Tuning WebFOCUS for the IBM Mainframe (zSeries, System z9)
Copyright 2007, Information Builders. Slide 2
Why WebFOCUS for z
Runs natively on MVS and Linux
IBM has brand new specialty engines you can take advantage of
Ability to create partitions on z to centralize business intelligence on a single server – where the databases and applications reside
Copyright 2007, Information Builders. Slide 3
Information Builders products used in benchmark
WebFOCUS
iWay Software – iWay Service Manager is a unique and powerful Enterprise Service Bus (ESB), invoked as Web services to provide event-driven integration and B2B interaction management.
Copyright 2007, Information Builders. Slide 4
There’s an easier way!
[Architecture diagram: HTTP clients → Web Server → App Server/Servlet Container → WebFOCUS Reporting Server → RDBMS and DB servers, with the ReportCaster (RC) Repository and RC Distribution Server alongside. Directories shown include ibi_html, ibi_bid, approot, ibi_apps, rcaster, basedir, worp, adapters, focexecs, synonyms, data, and reports. Connections are labeled HTTP/HTTPS, HTTP/HTTPS/proprietary, TCP, TCP via client driver or JDBC, and TCP/JDBC.]
Copyright 2007, Information Builders. Slide 5
Benchmark Objectives
Test WebFOCUS on the proven strengths of System z hardware, running the Linux open-source OS with UDB and z/OS with DB2.
Evaluate the scalability and performance of WebFOCUS and iWay Service Manager in each operating system environment on the IBM z server, along with the benefit of using the various specialty engines available on it.
All test configurations accurately and faithfully replicated prior benchmarks run on other UNIX and Windows platforms.
Test results therefore represent the true performance of the WebFOCUS workload on this vendor's hardware.
Testing was done at IBM Gaithersburg (Washington Systems Center) in November 2006 by a combined team from IBM and Information Builders.
Copyright 2007, Information Builders. Slide 6
UDB and DB2 database size
The Linux system IBILIN03 (under z/VM) and the z/OS system IBI1 host the test databases in the various benchmark configurations.
Two databases are defined on IBILIN03 and IBI1, with multiple tables defined in each database, holding 2 million rows of data and 7 million rows of data.
Each row is 256 bytes long.
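(At 256 bytes per row, that works out to roughly 0.5 GB of raw data for the 2-million-row case and roughly 1.8 GB for the 7-million-row case, before indexes and database overhead.)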
Copyright 2007, Information Builders. Slide 7
Benchmark Test Workload used
Workload-small: query retrieving 61 rows of data
Workload-large: query retrieving 3,000 rows of data
Workload-complex: CPU-intensive query involving a 4-table join, retrieving 5,118 rows
(The exact SQL for each workload is listed in Appendix A.)
Copyright 2007, Information Builders. Slide 8
How IBI tests were measured
For each test configuration, Information Builders used the same parameter settings (e.g., interval time and keep-alive time) to run the small, large, and complex workloads while varying the number of concurrent active users, and then measured the end-to-end user response time. A minimal sketch of such a driver appears below.
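The deck does not show the test driver itself; the following is only an illustrative sketch of this kind of measurement, assuming the report request is exposed as a plain HTTP URL. The URL, user count, and per-user request count are placeholders, not the settings used in the benchmark.

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

// Minimal sketch of a concurrent-user driver: N simulated users each issue M
// report requests; the harness reports the average end-to-end response time.
public class LoadDriver {
    public static void main(String[] args) throws Exception {
        final String reportUrl = "http://ibilin01:8080/ibi_apps/WFServlet"; // placeholder URL
        final int users = 50;             // concurrent active users (varied per run)
        final int requestsPerUser = 20;   // requests issued by each simulated user

        ExecutorService pool = Executors.newFixedThreadPool(users);
        final AtomicLong totalNanos = new AtomicLong();
        final AtomicLong completed = new AtomicLong();

        for (int u = 0; u < users; u++) {
            pool.submit(() -> {
                for (int r = 0; r < requestsPerUser; r++) {
                    long start = System.nanoTime();
                    try {
                        HttpURLConnection conn =
                                (HttpURLConnection) new URL(reportUrl).openConnection();
                        try (InputStream in = conn.getInputStream()) {
                            byte[] buf = new byte[8192];
                            while (in.read(buf) != -1) { /* drain the report output */ }
                        }
                        totalNanos.addAndGet(System.nanoTime() - start);
                        completed.incrementAndGet();
                    } catch (Exception e) {
                        // failed requests are simply not counted in this sketch
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);

        System.out.printf("users=%d  average response time=%.3f s%n",
                users, totalNanos.get() / 1e9 / Math.max(1L, completed.get()));
    }
}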
Copyright 2007, Information Builders. Slide 9
Test Environment – 1 (all on Linux)
[Diagram: IBI WebFOCUS network and data traffic flow on z9, using three SuSE SLES 9 Linux guests under z/VM 5.2 (2 CPUs assigned plus 14 optional, 8 GB of memory and xx GB Xstore), dated 11/14/2006.
IBILIN01: Linux, 2/4 CP, 2/4 GB, SuSE SLES 9 64-bit, WAS 6.0.1.13+, WebFOCUS Client 7.6
IBILIN02: Linux, 2/4/8 CP, 2 GB, SuSE SLES 9, WebFOCUS Reporting Server 7.6, DB2 Connect 8.2, iWay Service Manager 5.5
IBILIN03: Linux, 2/4 CP, 2 GB, SuSE SLES 9, UDB 8.2 FixPak 13
Workstations: two 3.0 GHz 8-way machines, connected over Gigabit Ethernet through an OSA subchannel.]
Copyright 2007, Information Builders. Slide 10
Test Environment – 1(Linux) Scenarios
Scenario 1: IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS 7.6 Reporting Server, DB2 Connect; IBILIN03: 2/4 CP, 2 GB, UDB 8.2
Scenario 2: IBILIN01: 2/4 CP, 2/4 GB, WAS 6.0.2.17, WebFOCUS Client 7.6; IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS 7.6 Reporting Server, DB2 Connect; IBILIN03: 2/4 CP, 2 GB, UDB 8.2
Scenario 3: IBILIN02: 2/4/8 CP, 2 GB, iWay Service Manager, DB2 JDBC Type 4 (see the connection sketch after this list); IBILIN03: 2/4 CP, 2 GB, UDB 8.2
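For reference, the DB2 JDBC Type 4 path used in Scenario 3 goes straight from the Java tier to the database over TCP, with no DB2 Connect gateway in between. The sketch below shows what such a connection looks like; the driver class and URL form are standard for the IBM DB2 universal JDBC driver, but the port, database name, and credentials are placeholders, and the query is the small-workload SQL from Appendix A.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch of a DB2 Type 4 (pure-Java, direct TCP) connection to the UDB database
// on IBILIN03. Port 50000, database TESTDB, and the credentials are assumptions
// for illustration only.
public class Type4Example {
    public static void main(String[] args) throws Exception {
        Class.forName("com.ibm.db2.jcc.DB2Driver");               // IBM universal driver
        String url = "jdbc:db2://ibilin03:50000/TESTDB";          // Type 4 URL form
        try (Connection con = DriverManager.getConnection(url, "dbuser", "dbpass");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(
                     // Workload-small query from Appendix A (returns 61 rows)
                     "SELECT T1.\"F3SSN\", T1.\"F3ALPHA10\", T1.\"F3INTEGER5\", "
                   + "T1.\"F3FLOAT9X8\", T1.\"F3DBL15X2\" FROM TST2MLN T1 "
                   + "WHERE (T1.\"F3SSN\" <= '000000061') FOR FETCH ONLY")) {
            int rows = 0;
            while (rs.next()) {
                rows++;                                           // just count the result rows
            }
            System.out.println("Rows fetched: " + rows);
        }
    }
}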
Copyright 2007, Information Builders. Slide 11
Test Env. – 1 (Linux), Scenario – 1(WebFOCUS Reporting Server, DB2 Connect, Workload-small)
Response Time in seconds
# user   2 CP     4 CP     8 CP     Workload Type
50       0.616    0.407    -----    Small
100      1.205    0.435    0.223    Small
200      2.388    0.793    -----    Small
500      3.726    1.946    1.321    Small
Copyright 2007, Information Builders. Slide 12
Filters and Specs:
Operating System: SUSE SLES 9 on z/VM 5.2
Memory: 2 GB
RDBMS: DB2
Rows Returned: 61
Number of CPUs: 2, 4, 8
Protocol: TCP
Access Method: CLI
Concurrent Users: 10, 25, 50, 75, 100, 200, 500
Keep Alive: 60; Interval: .05
WebFOCUS 7.6 Performance Statistics for z-Linux Average Request Processing Time (in seconds)
Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 13
Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-small)
[Chart: Test 1, Scenario 1, Small – average response time (secs) versus concurrent users for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 14
Test Env.– 1 (all on Linux), Scenario – 1(WebFOCUS Reporting Server, DB2 Connect, WL-large)
Response Time in seconds
# user   2 CP     4 CP     8 CP       Workload Type
50       1.829    0.941    -----      Large
100      3.529    1.867    1.013      Large
200      6.095    3.323    -----      Large
500      17.032   5.487    6.883**    Large
** Linux system swap occurred
Copyright 2007, Information Builders. Slide 15
Filters and Specs:
Operating System: SUSE SLES 9 on z/VM 5.2
Memory: 2 GB
RDBMS: DB2
Rows Returned: 3000
Number of CPUs: 2, 4, 8
Protocol: TCP
Access Method: CLI
Concurrent Users: 10, 25, 50, 75, 100, 200, 500
Keep Alive: 60; Interval: .05
WebFOCUS 7.6 Performance Statistics for z-Linux Average Request Processing Time (in seconds)
Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 16
Test Env.– 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-Large)
[Chart: Test 1, Scenario 1, Large – average response time (secs) versus concurrent users (50–500) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 17
Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, WL-Complex)
Response Time in seconds
# user   2 CP     4 CP     8 CP     Workload Type
10       4.676    3.133    -----    Complex
25       11.295   5.813    -----    Complex
50       24.443   11.333   8.963    Complex
100      40.467   25.175   14.92    Complex
200      80.928   60.784   -----    Complex
Copyright 2007, Information Builders. Slide 18
Filters and Specs:
Operating System: SUSE SLES 9 on z/VM 5.2
Memory: 2 GB
CPU Speed: 550 MIPS
Rows Returned: 5118
Number of CPUs: 2, 4, 8
Protocol: TCP
Access Method: CLI
Concurrent Users: 10, 25, 50, 75, 100, 200, 500
Keep Alive: 60; Interval: .05
WebFOCUS 7.6 Performance Statistics for z-Linux Average Request Processing Time (in seconds)
Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 19
Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, WL-Complex)
[Chart: Test 1, Scenario 1, Complex – average response time (secs) versus concurrent users (10–200) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 20
Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-small)
Response Time in seconds
# user   2 CP    4 CP       8 CP     Workload Type
50       0.799   -----      -----    Small
100      1.378   0.798      0.624    Small
200      2.594   -----      -----    Small
500      5.417   3.659**    2.065    Small
** Switched to use JLINK
Copyright 2007, Information Builders. Slide 21
Filters and Specs:
Operating System: SUSE SLES 9 on z/VM 5.2
Memory: 2 GB
Rows Returned: 61
Number of CPUs: 2, 4, 8
Protocol: SERVLET
Access Method: CLI
Concurrent Users: 10, 25, 50, 75, 100, 200, 500
Keep Alive: 60; Interval: .05
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 22
Test Env. – 1 (all on Linux), Scenario – 2 (WAS, Web FOCUS Reporting Server, DB2 Connect, WL-small)
[Chart: Test 1, Scenario 2, Small – average response time (secs) versus concurrent users (50–500) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 23
Test Env. – 1 (all on Linux), Scenario – 2(WAS, WebFOCUS Reporting Server, DB2 Connect, WL-large)
Response Time in seconds
# user   2 CP     4 CP        8 CP     Workload Type
50       1.913    -----       -----    Large
100      3.702    2.45        -----    Large
200      7.279    -----       -----    Large
500      14.966   11.857**    -----    Large
** switched to use JLINK
Copyright 2007, Information Builders. Slide 24
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 25
Test Env. – 1 (all on Linux), Scenario – 2(WAS, WebFOCUS Reporting Server, DB2 Connect, WL-large)
[Chart: Test 1, Scenario 2, Large – average response time (secs) versus concurrent users (50–500) for 2 and 4 CPs.]
Copyright 2007, Information Builders. Slide 26
Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-Complex)
Response Time in seconds
# user   2 CP     4 CP     8 CP     Workload Type
10       11.578   2.151    -----    Complex
25       28.438   5.047    -----    Complex
50       -----    10.826   7.121    Complex
75       -----    15.312   -----    Complex
100      -----    19.791   14.056   Complex
Copyright 2007, Information Builders. Slide 27
Average Request Processing Time (in seconds)
Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 28
Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-Complex)
[Chart: Test 1, Scenario 2, Complex – average response time (secs) versus concurrent users (10–100) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 29
Test Env. – 1 (all on Linux), Scenario – 3(iWay Service Manager, DB2 JDBC Type 4, WL-Small)
Response Time in seconds
# user   2 CP     4 CP     8 CP     Workload Type
25       1.731    0.95*    0.734*   Small
50       3.707    1.796*   1.221*   Small
100      7.59     3.451*   2.62*    Small
200      11.799   6.255*   4.71*    Small
500      18.333   9.807*   7.34*    Small
* JVM size 1024 MB
Copyright 2007, Information Builders. Slide 30
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 31
Test Env. – 1 (all on Linux), Scenario – 3(iWay Service Manager, DB2 JDBC type 4, WL-Small)
[Chart: Test 1, Scenario 3, Small – average response time (secs) versus concurrent users (25–500) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 32
Test Env. – 1 (all on Linux), Scenario – 3(iWay Service Manager, DB2 JDBC type 4, WL-Large)
Response Time in seconds
# user   2 CP       4 CP       8 CP       Workload Type
25       81.522     44.842*    32.34*     Large
50       188.558    90.319*    74.42*     Large
100      369.617*   235.208*   182.265*   Large
200      -----      -----      -----      Large
500      -----      -----      -----      Large
* JVM size 1024 MB.
Copyright 2007, Information Builders. Slide 33
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 34
Test Env. – 1 (all on Linux), Scenario – 3(iWay Service Manager, DB2 JDBC type 4, WL-Large)
[Chart: Test 1, Scenario 3, Large – average response time (secs) versus concurrent users (25–100) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 35
Benchmark Test Environment – 2(App and driver on Linux, DB on z/OS)
[Diagram: IBI WebFOCUS network and data traffic flow on z9, using a Linux guest under z/VM and a z/OS system, dated 11/16/2006.
LPAR 1 (hostname IBI_VM): IBILIN02, Linux SuSE SLES 9, 2/4/8 CP, 8 GB, WAS 6.0.1.13, WebFOCUS Reporting Server 7.6, iWay Service Manager 5.5
LPAR 2 (hostname IBI1): z/OS 1.7, 8 CP, 8 GB, DB2 V8.2
Workstations: two 3.0 GHz 8-way machines, connected over Gigabit Ethernet through OSA subchannels.]
Copyright 2007, Information Builders. Slide 36
Benchmark Test Environment – 2 Scenarios
Scenario 1: IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS Reporting Server 7.6, DB2 Connect, Native Data Driver (CLI) (z/OS); IBI1: 8 CP, 8 GB, DB2, 2 tables
Copyright 2007, Information Builders. Slide 37
Test Env. – 2, Scenario – 1(App on Linux, DB on z/OS)
(WebFOCUS Reporting Server, DB2, WL=Small)
Response Time in seconds
# user   2 CP    4 CP    8 CP    Workload Type
50       0.896   0.444   0.214   Small
100      1.78    0.875   0.435   Small
200      3.276   1.885   0.721   Small
500      5.166   2.807   1.50    Small
Copyright 2007, Information Builders. Slide 38
Average Request Processing Time (in seconds)
Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 39
Test Env. – 2, Scenario – 1(App on Linux, DB on z/OS)
(WebFOCUS Reporting Server, DB2, WL=Small)
[Chart: Test 2, Scenario 1, Small – average response time (secs) versus concurrent users (50–500) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 40
Test Env. – 2, Scenario – 1(App on Linux, DB on z/OS)
(WebFOCUS Reporting Server, DB2, WL=Large)
Response Time in seconds
# user   2 CP     4 CP     8 CP     Workload Type
50       18.417   9.812    4.998    Large
100      33.183   17.145   10.278   Large
200      49.23    34.74    18.369   Large
500      -----    -----    -----    Large
Copyright 2007, Information Builders. Slide 41
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 42
Test Env. – 2, Scenario – 1(App on Linux, DB on z/OS)
(WebFOCUS Reporting Server, DB2, WL=Large)
[Chart: Test 2, Scenario 1, Large – average response time (secs) versus concurrent users (50–200) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 43
Benchmark Test Environment – 3 (WAS, WebFOCUS Reporting Server, iWay Service Manager on IBI2; DB on z/OS, IBI1)
[Diagram: IBI WebFOCUS network and data traffic flow on z9, using two separate z/OS systems, dated 11/16/2006.
LPAR 1 (hostname IBI2): z/OS 1.7, 2/4/8 CP, 8 GB, WAS 6.1, WebFOCUS Client 7.6, iWay Service Manager 5.5, JDBC Type 4 driver
LPAR 2 (hostname IBI1): z/OS 1.7, 2/4/8 CP, 8 GB, DB2 V8.2
Workstations: two 3.0 GHz 8-way machines, connected over Gigabit Ethernet through OSA subchannels.]
Copyright 2007, Information Builders. Slide 44
Benchmark Test Environment – 3 Scenarios
Scenario 1: IBI2: 2/4/8 CP, 8 GB, ISM 5.5, JDBC Type 4 driver. IBI1: 2/4/8 CP, 8 GB, DB2, 2 tables
Scenario 2: IBI2: 2/4/8 CP, 8 GB, WAS 6.1, WebFOCUS Reporting Server 7.6, WF Client, CLI. IBI1: 2/4/8 CP, 8 GB, DB2, 6 tables. IBI2 communicates with IBI1 via HiperSockets; 1 zIIP engine.
Scenario 3: IBI2: 2/4/8 CP, 8 GB, WebFOCUS Reporting Server 7.6, CLI. IBI1: 2/4/8 CP, 8 GB, DB2, 6 tables. IBI2 communicates with IBI1 via HiperSockets; 1 zIIP engine.
Copyright 2007, Information Builders. Slide 45
Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type4, WL=Small, 1 zIIP)
Response Time in seconds
# user   2 CP    4 CP    8 CP    Workload Type
25       2.032   0.716   0.56    Small
50       3.22    1.329   1.11    Small
100      6.782   2.715   2.4     Small
200      15.34   5.864   5.2     Small
500      22.54   7.54    6.391   Small
Copyright 2007, Information Builders. Slide 46
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 47
Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type4, WL=Small, 1 zIIP)
[Chart: Test 3, Scenario 1, Small – average response time (secs) versus concurrent users (25–500) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 48
Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type4, WL=Large, 1 zIIP)
Response Time in seconds
# user   2 CP      4 CP      8 CP     Workload Type
25       58.95     39.179    35.0     Large
50       121.98    80.675    74.17    Large
100      292.22    210.811   181.3    Large
200      -----     -----     -----    Large
500      -----     -----     -----    Large
Copyright 2007, Information Builders. Slide 49
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 50
Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type4, WL=Large, 1 zIIP)
[Chart: Test 3, Scenario 1, Large – average response time (secs) versus concurrent users (25–100) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 51
Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
Response Time in seconds
# user   2 CP    4 CP    8 CP    Workload Type
25       -----   -----   -----   Small
50       3.639   -----   -----   Small
100      4.616   -----   -----   Small
200      9.224   -----   -----   Small
500      21.20   -----   -----   Small
* 4 Clustered WAS Application Servers
Copyright 2007, Information Builders. Slide 52
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 53
Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
[Chart: Test 3, Scenario 2, Small – average response time (secs) versus concurrent users (0–500) for 2 CPs.]
Copyright 2007, Information Builders. Slide 54
Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
Response Time in seconds
# user   2 CP     4 CP    8 CP    Workload Type
25       -----    -----   -----   Large
50       6.353    -----   -----   Large
100      11.756   -----   -----   Large
200      22.226   -----   -----   Large
500      35.249   -----   -----   Large
Copyright 2007, Information Builders. Slide 55
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 56
Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
[Chart: Test 3, Scenario 2, Large – average response time (secs) versus concurrent users (0–500) for 2 CPs.]
Copyright 2007, Information Builders. Slide 57
Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
Response Time in seconds
# user   2 CP    4 CP    8 CP    Workload Type
25       -----   -----   -----   Small
50       0.566   0.313   0.14    Small
100      1.118   0.65    0.278   Small
200      2.207   1.377   0.749   Small
500      7.786   5.413   1.847   Small
Copyright 2007, Information Builders. Slide 58
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 59
Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
[Chart: Test 3, Scenario 3, Small – average response time (secs) versus concurrent users (50–500) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 60
Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
Response Time in seconds
# user   2 CP     4 CP     8 CP    Workload Type
25       -----    -----    -----   Large
50       5.575    3.001    1.755   Large
100      9.615    6.756    3.674   Large
200      18.874   12.159   4.824   Large
500      33.525   27.501   7.582   Large
Copyright 2007, Information Builders. Slide 61
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 62
Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
[Chart: Test 3, Scenario 3, Large – average response time (secs) versus concurrent users (50–500) for 2, 4, and 8 CPs.]
Copyright 2007, Information Builders. Slide 63
Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Complex, 1 zIIP)
Response Time in seconds
# user   2 CP    4 CP    8 CP       Workload Type
25       -----   -----   -----      Complex
50       -----   -----   21.084     Complex
100      -----   -----   42.295     Complex
200      -----   -----   82.053*    Complex
500      -----   -----   167.605*   Complex
* RMF report indicated that 1 zIIP engine utilization was at 100%
Copyright 2007, Information Builders. Slide 64
Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 65
Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Complex, 1 zIIP)
[Chart: Test 3, Scenario 3, Complex – average response time (secs) versus concurrent users (0–500) for 8 CPs.]
Copyright 2007, Information Builders. Slide 66
Proven Value of zIIP Specialty Engine
06/11/17 01:04  Thread Summary (ROW 85 TO 107 OF 148)
SDSF DA IBI1  IBI1  PAG 0  CPU/L 100/100  LINE 1-5 (5)  PREFIX=DSN*  DEST=(ALL)  OWNER=*
JOBNAME   SP  ASIDX  Real  Paging  SPAG  SCPU%  ECPU-Time  ECPU%  SzIIP%  Racf Id
DSN8MSTR  1   004D   3683  0.00    0     100    18.30      0.00   100     DB2USER
DSN8DIST  1   004F   11T   0.00    0     100    3590.05    95.80  100     DB2USER
DSN8IRLM  1   0050   1266  0.00    0     100    5.51       0.08   100     DB2USER
DSN8DBM1  1   0052   263T  0.00    0     100    349.40     1.66   100     DB2USER
DSN8SPAS  1   0053   1184  0.00    0     100    0.15       0.00   100     DB2USER

06/11/17 01:14  Thread Summary (ROW 1 TO 23 OF 201)
SDSF DA IBI1  IBI1  PAG 0  CPU/L 100/100  LINE 1-5 (5)  PREFIX=DSN*  DEST=(ALL)  OWNER=*
JOBNAME   SP  ASIDX  Real  Paging  SPAG  SCPU%  ECPU-Time  ECPU%  SzIIP%  Racf Id
DSN8MSTR  1   004D   3683  0.00    0     100    18.42      0.01   100     DB2USER
DSN8DIST  1   004F   12T   0.00    0     100    6366.16    95.70  100     DB2USER
DSN8IRLM  1   0050   1271  0.00    0     100    6.75       0.03   100     DB2USER
DSN8DBM1  1   0052   416T  0.00    0     100    401.31     3.36   100     DB2USER
DSN8SPAS  1   0053   1184  0.00    0     100    0.15       0.00   100     DB2USER
Copyright 2007, Information Builders. Slide 67
zIIP Actual Versus Projected
CP configuration: IBI1 – 8 CPs, 1 zIIP; IBI2 – 8 CPs, 0 zIIP

WebFOCUS Large
Users   Time    IBI1 CPU%   IBI1 zIIP%   IBI1 IIPCP   IBI1 I/O Rate   IBI2 CPU%   IBI2 I/O Rate
50      00:45   3.2         14.04        0.94         11.1            71.3        2548
100     00:49   3.1         14.03        0.78         10.2            74.5        3651
200     00:53   3.6         16.86        1.21         10.8            85.3        2506
500     00:58   3.3         14.64        0.87         10.3            76.4        3437

Complex Query
Users   Time    IBI1 CPU%   IBI1 zIIP%   IBI1 IIPCP   IBI1 I/O Rate   IBI2 CPU%   IBI2 I/O Rate
50      01:05   84.3        90.22        290.43       1159            53.8        180.2
100     01:10   97.8        95.33        336.71       1941            51.0        200.0
200     01:16   92.0        93.20        305.94       2697            63.1        247.5
500     01:23   100.0       95.51        337.83       3117            48.5        111.1
Copyright 2007, Information Builders. Slide 68
Test Env – 3 (2 separate z/OS), Scenario – 3b (WebFOCUS Reporting Server, CLI, WL=Complex, 8 CP, vary zIIPs)
Response Time in seconds (8 CP)
# user   NO zIIP   1 zIIP   3 zIIP   6 zIIP   Workload Type
25       -----     -----    -----    -----    Complex
50       22.105    21.084   18.814   17.413   Complex
100      44.095    42.295   37.786   33.351   Complex
200      94.362    82.053   79.838   68.262   Complex
500      -----     -----    -----    -----    Complex
Copyright 2007, Information Builders. Slide 69
Test Env – 3 (2 separate z/OS), Scenario – 3b (WebFOCUS Reporting Server, CLI, WL=Complex, 8 CP, vary zIIPs)
[Chart: average response time (sec) versus concurrent users (0–250) for the NO zIIP, 1 zIIP, 3 zIIP, and 6 zIIP configurations.]
Copyright 2007, Information Builders. Slide 70
WSC Benchmark Team
Mary Hu, John Bishop, Richard Lewis, Joe Consorti, Jennie Liang, John Goodyear, Kenneth Hain, Glenn Materia, Dennis McDonald
Copyright 2007, Information Builders. Slide 71
Questions ?
Copyright 2007, Information Builders. Slide 72
Appendix A
Workload-Small Query
SELECT T1."F3SSN",T1."F3ALPHA10",T1."F3INTEGER5",
T1."F3FLOAT9X8",T1."F3DBL15X2"
FROM TST2MLN T1
WHERE (T1."F3SSN" <= '000000061') FOR FETCH ONLY;
Copyright 2007, Information Builders. Slide 73
Appendix A
Workload-Large Query
SELECT T1."F3SSN",T1."F3ALPHA10",T1."F3INTEGER5",
T1."F3FLOAT9X8",T1."F3DBL15X2"
FROM TST2MLN T1
WHERE (T1."F3SSN" <= '000003000') FOR FETCH ONLY;
Copyright 2007, Information Builders. Slide 74
Appendix A
Workload-Complex Query
SELECT T1."M_MKTING_NBR",T1."M_TRIP",T1."M_STAT", T1."M_VISIT_POINTER",T4."M_C_USE",T4."M_C_CONC_CD", T4."EFFECT_DATE",T4."EXPIRE_DATE", MAX(T1."TRIP_CODE"), MAX(T3."TRIP_NAME"), MAX(T1."M_DELVRY_DATE"), MAX(T1."DELVRY_YEAR"), SUM(T4."CUSTOMER_COUNT") FROM ( ( ( MKTING_GLOBAL T1 INNER JOIN TRIP_GLOBAL T2 ON T2."VYD_TRIP" = T1."M_TRIP" ) LEFT OUTER JOIN TRANSPORT_NAME_GLOBAL T3 ON T3."TRANSPORT_CODE" = T1."TRANSPORT_CODE" ) INNER JOIN MKTING_CUSTMR_SURVEY T4 ON T4."M_MKTING_NBR" = T1."M_MKTING_NBR" AND T4."M_TRIP" =
T1."M_TRIP" ) WHERE (T1."BUSS_CODE" = 'A') AND (T2."VYD_TRIP_STAT" IN('A', 'H')) AND (T2."REPORT_YEAR" ='2006') AND (T4."EXPIRE_DATE" > '2007-02-17') AND (T4."EFFECT_DATE" <= '2006-10-17') AND (T4."M_C_CUSTMR_STAT" = 'C') AND (((T1."M_GROUP_TYPE" ='H') AND (T4."M_C_CUSTMR_ID" = '2')) OR ((T1."M_GROUP_TYPE" <> 'H') OR T1."M_GROUP_TYPE" IS NULL)) GROUP BY T1."M_MKTING_NBR",T1."M_TRIP",T1."M_STAT",T1."M_VISIT_POINTER",T4."M_C_USE", T4."M_C_CONC_CD",T4."EFFECT_DATE",T4."EXPIRE_DATE" ORDER BY T1."M_MKTING_NBR",T1."M_TRIP",T1."M_STAT", T1."M_VISIT_POINTER",T4."M_C_USE",T4."M_C_CONC_CD", T4."EFFECT_DATE",T4."EXPIRE_DATE" FOR FETCH ONLY;