
CD160

Automating Performance Analysis of Custom Coding
Decrease TCO, Improve End-User Experience

Performance and Scalability, SAP AG

Disclaimer

This presentation outlines our general product direction and should not be relied on in making a purchase decision. This presentation is not subject to your license agreement or any other agreement with SAP. SAP has no obligation to pursue any functionality mentioned in this presentation. This presentation and SAP's strategy and possible future developments are subject to change and may be changed by SAP at any time for any reason without notice. This document is provided without a warranty of any kind, either express or implied, including but not limited to, the implied warranties of merchantability, fitness for a particular purpose, or non-infringement. SAP assumes no responsibility for errors or omissions in this document, except if such damages were caused by SAP intentionally or through gross negligence.



After This Session You Will Have Learned

How to set up and manage a strategy for automated performance tests

What the scope and coverage of automated performance tests are

How you can check your application coding for non-regression, that is, for the absence of negative performance changes

When best to include automated performance testing in your quality assurance initiative

How to set meaningful KPIs for performance testing

What testing infrastructure you should have in place to ensure a good quality of your custom development

How you can use SAP NetWeaver tools to automate performance tests and analyze the coding itself

Agenda

1. Introduction to Automated Performance Testing
2. Automated Regression Testing
3. Automated Code Analysis
4. Helpful Tips on Test Organization
5. Conclusion


Once Created – What Happens to Custom Code?

“Custom development in SAP business applications handicaps upgrades. But often companies do not know what is happening in their ERP systems. Many programs are forgotten, and it is never analyzed whether they are still needed.”

“[...] custom-developed processes strain the ERP environment. Many of these modules show long response times. In addition, background jobs impair online users. SAP standard transactions need far fewer resources in comparison.”

One (possible) issue is that customers perform technical upgrades only and do not re-evaluate the business processes.

Non-regression is critical for well-performing programs.

Source: “SAP users can jettison a lot of waste”, Computerwoche online, November 2007

Introduction to Performance – What is Performance?

Performance is often used as a blanket term meaning response time; however, multiple requirements exist:

Execution time of a batch process (e.g. 3 million payroll periods in 90 minutes)

Execution time of a batch chain (e.g. full business completion of several hundred thousand objects, from meter reading to invoicing, in four hours)

E2E response time (e.g. up to 0.5 seconds in a call center application)

Parallelization of data upload (e.g. 90 parallel jobs)

...

Therefore, the definition of "good" performance and performance KPIs must be broken down to the software layers.

The basis for good performance is scalability.
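As a worked illustration of linear scalability (the numbers are illustrative, not from the slides): resource consumption should grow proportionally with the processed volume, so the cost per object stays constant:

    R(N) ≈ c · N,  i.e.  R(N) / N ≈ c (constant)

where R(N) is the resource consumption (e.g. CPU seconds) for N processed objects. If processing 1,000 orders consumes 50 CPU seconds, linear scaling predicts about 500 CPU seconds for 10,000 orders; a clearly super-linear jump signals a scalability defect, such as a nested loop over the order items.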


Dimensions and Effects of Scalability

[Figure: dimensions of internal and external scalability – size of data, number of processed objects, concurrent users, number of servers / cores / threads, number of objects in the DB, number of parallel jobs – and their effects when moving from a small test system (one user, small data source) to production scale (thousands of users, terabytes of data, high throughput, large objects, globally distributed users, many servers, hardware adaptivity).]

How to Test Performance / Scalability – Single User and Load Test Approach (1/2)

Single User Performance Testing
– Determine how fast some aspect of a system performs, such as:
  – Single user tests
  – A single transaction test
– Understand what parts of the system or workload cause the system to perform badly (diagnosis case)
– Resource consumption of a single transaction as basis for a sizing prediction:
  – CPU on application server
  – Memory on application server
  – Database accesses
  – Network transfer
– Linear scaling of resource consumption according to size of data processed
– Regression testing (better if automated)

Volume Testing or Load Testing
– Linear scalability of basis applications / engines / infrastructure components
– Basis applications and services are essential for the scalability of the Web Application Server upon which business applications are built
– Locking behavior in application and on database
– Determine maximum throughput / maximum number of concurrent users
– Run load tests to ensure parallel processing capability and to verify that throughput requirements can be achieved
– Robustness, data consistency under high load
– Sizing verification:
  – CPU on application server and DB
  – Memory on application server and DB
  – Network load
  – Disk I/O
– Regression testing
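To make the sizing use of single-user measurements concrete, a small worked example (illustrative numbers, not from the slides):

    busy cores = (steps per hour × CPU seconds per step) / 3,600
    required cores ≈ busy cores / target utilization

If a single-user test shows 0.2 CPU seconds per dialog step and the expected business volume is 9,000 steps per hour, the application tier must sustain 9,000 × 0.2 / 3,600 = 0.5 busy cores; at a 65% target utilization this corresponds to roughly 0.5 / 0.65 ≈ 0.8 cores for this scenario alone.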


How to Test Performance / Scalability – Single User and Load Test Approach (2/2)

Single User Test
– Small test system (QA, development), one user
– Quality and implications of accesses to the persistence layer
– Linear resource consumption
– Memory usage and memory leaks
– Disk requirements
– Front-end network load
– Can be automated or manual
– Analyze & measure scalable behavior / non-regression
– Performance predictions / sizing

Volume Test
– Equivalent to multi-user test, stress test, load test, benchmark
– Generally automated
– Parallel processing mechanisms, load balancing
– Verify assumptions for the high-volume environment
– ...

Coverage, Comprehensiveness, Scope and Effort of Different Activities

[Figure: spider chart (scale 0–5) comparing manual single-user measurements & analysis, automated single-user measurements, and automated load tests along five axes: easy to set up and run, low tool effort, in-built data storage / results re-use, exploitability for scalability forecast, and easy to use for regression.]


Costs for Automation vs. Gains by Automation

Costs
– Scenarios under test must be functionally stable
– Additional tool knowledge
– Create reproducible test cases
– Establish tool environment and result database
– ...

Gains
– Storage of test results
– Reproducibility of test results
– Much easier comparison / regression of measurements
– ...

So carefully weigh the pros and cons of performance test automation.

At SAP, automated regression tests are performed in single-user and load profile mode.

Tools and Meaningful Performance Baselines

DETERMINE THE KPIs BEFORE DECIDING ON THE TOOL

For example: many tools deliver the end-to-end response time of a user interaction via the user interface.

Problem: at high workload, the response time does not show the constant resource consumption, in terms of CPU, of an individual transaction (→ scalability under load).

[Figure: response time and CPU time (y-axis, 0–60) plotted against the number of requests (x-axis, 1 to 500): as load increases, the end-to-end response time climbs steeply while the CPU time per request remains constant.]
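A standard queueing-theory illustration of this effect (not from the slides): in a simple single-server model, the response time R grows with utilization ρ even though the service (CPU) time S per request is constant:

    R = S / (1 − ρ)

With S = 0.1 s, the response time is 0.2 s at 50% utilization but 1.0 s at 90% utilization, while the CPU time per request stays 0.1 s throughout. A KPI based only on end-to-end response time would blame the transaction for what is a load problem; a CPU-time KPI correctly shows that the code itself still scales.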


Tools and Meaningful Performance Baselines

DETERMINE THE KPIs BEFORE DECIDING ON THE TOOL

Several factors more or less constitute end-to-end (E2E) response times:

[Figure: request path from browser via web server(s) and application server(s) to the database, with partial times t[1] (server response time) through t[6] and the frontend time; E2E Response Time = Σ t[i].]

Define meaningful KPIs:
– CPU: average CPU time per scenario and per step; linear resource increase with load increase; stable individual times
– Response time / execution time: end-to-end in user interaction
– Memory: peak memory during the tested scenario, per user
– Database size on disk: inserts, updates, amount of data transferred
– Front end: number of synchronous roundtrips from client to application server per interaction step; number of KB transferred per interaction step
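For ad-hoc measurements of a single code section outside the ST30 tooling, ABAP's GET RUN TIME statement can serve as a building block (a minimal sketch, not part of the slides):

    DATA: lv_t0 TYPE i,
          lv_t1 TYPE i,
          lv_us TYPE i.

    GET RUN TIME FIELD lv_t0.
    * ... code section under measurement ...
    GET RUN TIME FIELD lv_t1.
    lv_us = lv_t1 - lv_t0.   " elapsed runtime in microseconds

Repeating the measurement several times and discarding the first run (cold buffers) gives numbers comparable to the pre-run / main-run approach used by ST30.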

Agenda

1. Introduction to Automated Performance Testing
2. Automated Regression Testing
3. Automated Code Analysis
4. Helpful Tips on Test Organization
5. Conclusion


The Basis for Meaningful Performance Test Results: Test Cases and Test Systems (1/2)

Spend a sufficient amount of time ensuring that the test cases are ready for performance testing:

80/20 rule
– Represent most heavily used functions
  – Transactions used by many users
– Represent most critical functions
  – Batch chains
– Represent functions generating the most resource consumption
  – CPU-intensive vs. I/O-intensive vs. memory-intensive
– Repeatable, stable results
  – (In-)dependency on previous / preparation steps
  – Restartability – is clean-up needed between runs?
– Stable over releases
– Allow (systematic) variations of relevant parameters that have an impact on resource consumption
– Compare identical business functionality only; this includes identical customizing and master data
– (Non-)locality of used data

The Basis for Meaningful Performance Test Results: Test Cases and Test Systems (2/2)

Spend a sufficient amount of time ensuring that the system under test is usable for performance tests:

General
– Keep CPU utilization low (single user tests)
– For comparison, keep all technical components identical (server, DB platform, parameters)
– Virtualization off (CPU)
– CPU settings like turbo boost and hyperthreading off (CPU)

User settings
– Session Manager off (memory)
– Single Sign-On off (CPU)
– Web Dynpro trace off (CPU)
– External breakpoints for current user off (CPU)

System management
– Authorization trace off (database accesses, disk, CPU)
– Kernel: OPTIMIZED or DEBUG (CPU)
– Developer traces: level 1 (CPU)
– Coverage Analyzer off (CPU)


Automated Regression Testing

Regression tests ensure comparable performance within and across releases.

Global Performance Analysis
– Transaction code ST30
– Incorporated into the eCATT environment
– Provides different performance counters for different aspects:
  – Database: number of DB accesses and amount of transferred data
  – Application server: CPU and memory usage
  – Front-end communication: round trips and transferred data

Global Performance Analysis: The Center of Automated Performance Testing

[Figure: architecture of automated performance testing. A central test system runs ST30 (Global Performance Analysis) with eCATT, central monitoring and a central results database; the system under test runs the tested application together with STAD single transaction records and the performance trace. Steps: 1. Start; 2. Execute GUI script; 3. Read performance data; 4. Save & aggregate results; 5. Read results (via ST33).]



DEMO

Global Performance Analysis: Execution of an eCATT Test

Main entry fields for ST30:

• Log ID: Identifies a set of measurements
• …: Identifies an individual measurement
• Test Configuration: Connection to the eCATT script
• No. of eCATT Preprocs: Number of executions before the measurement, to load buffers
• No. of eCATT Main Runs: Number of repeated executions for the measurement
• With Run for SQL Trace: Execute a separate run to create an SQL trace


Execution of the Global Performance Analysis eCATT Test (1/3)

Standard start options are OK in most cases. Just continue with …

Execution of the Global Performance Analysis eCATT Test (2/3)

The eCATT script will be executed automatically (pre-runs + main runs).

Don’t interrupt the eCATT!


Execution of the Global Performance Analysis eCATT Test (3/3)

Global Performance Analysis: Display Performance Data (1/2)


Global Performance Analysis: Display Performance Data (2/2)

Global Performance Analysis: Display Aggregated Performance Data


Global Performance Analysis: Display Statistical Data

Global Performance Analysis: Display SQL Trace Data (Statements)


Global Performance Analysis: Display Statistical Data

Global Performance Analysis: Display Statistical Data


Live Demo

EXERCISE

Summary for Automated Regression Testing Using Global Performance Analysis

Pros
– Global Performance Analysis is tightly integrated with eCATT as well as the Code Inspector, so that functional tests can be re-used (provided the functional test cases fulfill the requirements for performance testing)
– Performance test results are stored in a central test database and can be re-used for regression testing
– Nearly infinite repeatability – fast and automated feedback cycle during the development and correction phase
– If repeated frequently, the detailed analysis allows you to pinpoint performance culprits, so that small corrections or transported programs causing performance degradation do not go unnoticed

Cons
– General automation costs: a large number of parameters may influence performance KPIs, so a very careful setup and design of test cases and test system is necessary
– Maintenance of stable test cases and test systems can be time-consuming (adapting to kernel changes / architecture changes / functional changes / etc.)


Agenda

1. Introduction to Automated Performance Testing
2. Automated Regression Testing
3. Automated Code Analysis
4. Helpful Tips on Test Organization
5. Conclusion

Automated Code Analysis

Need for static analysis of coding and other objects, for example:
– Performance-relevant statements
– Naming conventions
– Use of obsolete language elements
– Extended syntax check for many objects
– Security checks
– Technical settings of tables
– ...

Code Inspector
– Transaction SCI
– Performance-related checks, e.g.:
  – Complete WHERE clauses
  – Bypassing of buffers
  – Copying of large data objects
  – ...
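To illustrate the kind of findings these checks produce, a minimal ABAP sketch (the SD table VBAK is used purely for illustration; the snippet is not from the slides):

    REPORT z_ci_performance_demo.

    DATA lt_vbak TYPE STANDARD TABLE OF vbak.
    FIELD-SYMBOLS <fs_vbak> TYPE vbak.

    * Finding "complete WHERE clause": all rows are read from the
    * database and filtered on the application server afterwards.
    SELECT * FROM vbak INTO TABLE lt_vbak.
    DELETE lt_vbak WHERE vkorg <> '1000'.

    * Better: restrict rows (WHERE) and columns (field list) in the database.
    SELECT vbeln erdat vkorg
           FROM vbak
           INTO CORRESPONDING FIELDS OF TABLE lt_vbak
           WHERE vkorg = '1000'.

    * Finding "copy large data objects": LOOP ... INTO copies each row
    * into a work area; ASSIGNING avoids the copy for wide structures.
    LOOP AT lt_vbak ASSIGNING <fs_vbak>.
    * ... process <fs_vbak> without copying the row ...
    ENDLOOP.

The Code Inspector flags the first SELECT and the row copies statically; whether they matter in practice depends on data volumes, which is why the slides combine the static checks with the dynamic ST30 analysis.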


Code Inspector: Inspection, Object Set and Check Variant

Inspection = { Object Set, Check Variant }

Object Set: programs, function groups, classes, interfaces, DDIC objects, other TADIR objects, ...

Check Variant: syntax, performance (bypassing buffer, nested loops, WHERE clause), table attributes, ..., Category_X

Run → Result

DEMO


Global Performance Analysis in Conjunction with Code Inspector

Object Set: Defines the development objects that will be checked

Check Variant: Defines the types of checks to be executed

Inspection: Combines Object Set and Check Variant

Code Inspector: Check Variant and Object Set


Global Performance Analysis in Conjunction with Code Inspector (1/3)

Global Performance Analysis in Conjunction with Code Inspector (2/3)

SQL Trace Analysis: Looks for performance issues in the traced SQL

Static Program Analysis: Does a static analysis of all programs found in the SQL trace

Checklist: Checks for performance guidelines


Global Performance Analysis in Conjunction with Code Inspector (3/3)

Live Demo

EXERCISE


Automated Code Analysis: Pros and Cons

Pros
– Broad coverage
– Low effort, fairly easy to set up
– Integrated with Global Performance Analysis
– In this context, it can also analyze dynamic SQL statements

Cons
– Static code analysis does not necessarily tell you which code is relevant
– If the Code Inspector is used in a static context:
  – It may return many false positives, which need to be put into context
  – It does not return all potential performance problems, because those may lie in the dynamic part
– For important scenarios, also use:
  – Global Performance Analysis (ST30) for problem detection
  – Runtime analysis (SE30/SAT) and Performance Trace (ST05) for detailed analysis

Agenda

1. Introduction to Automated Performance Testing
2. Automated Regression Testing
3. Automated Code Analysis
4. Helpful Tips on Test Organization and Conclusion


When Should Performance Testing Take Place?

Planning → Development → Test → Integration → Production

Depends on the software development strategy:

Agile approach: First test anything testable, then test the integrated product
– Short development cycles, short test cycles
– Immediate feedback to development – whistle-blower effect
– Many transports to test systems / test systems may be unstable

Classic approach: First functional testing, then performance (and other quality) testing
– Consolidated feedback

Dedicated performance test systems are always preferable because they are more controllable.

Regression testing of core processes during development (these must be stable during development and are tested on a weekly basis), e.g. saving financial documents, because unit testing will depend on it.

Creating Performance Test Cases – Consequences if Requirements Are Not Fulfilled

Represent most commonly used functions → Performance problems in heavily used, and thus highly visible, functions are not recognized and will lead to high maintenance costs

Represent most critical functions → Performance problems with critical functions are not recognized and may lead to a business disruption

Represent functions generating most resource consumption → For capacity planning, the most important information is missing; the sizing rule will be inaccurate, because the main contributors to resource consumption are missing

Repeatable → Each execution yields different measurement results; averages do not make sense, and comparisons between releases do not make sense

(Non-)locality of used data → In load tests, if data are too "local", the load generated in the database may be smaller than in real life – distorted results

Stable over releases → Meaningful comparisons between releases are not possible; changes in resource consumption cannot be determined; failing automation (eCATTs)


Creating Performance Test Cases – Common Problems and Trade-Offs

Repeatability vs. representativeness
– Some functions are not repeatable in production usage, so reaching repeatability may lead to test cases (or load) that are not representative
– Keep in mind: repeatability is absolutely necessary for testing

Locality vs. effort in building up data
– To avoid unrealistic locality in data usage, enough data must be created in the database, e.g. master data
– This may be considerable effort, because it is often not enough just to copy data (e.g. if the data have references to other records, all copied data may then refer to the same record – still a locality problem)
– Data have to be created in a way that tools can easily use them (e.g. if a material number has to be provided as an input field)

Restrictions imposed by tools
– Especially for load tests, tools are necessary to run test cases in parallel and to create the load
– These tools may have some restrictions, e.g. if they have to react dynamically to the output of a previous step in process chains (are they able to react to error messages?)
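A minimal ABAP sketch of the data build-up idea (all names are illustrative; the creation API is only indicated in a comment):

    REPORT z_create_test_data.

    * Spread test data over many distinct keys so that a load test does
    * not always hit the same database records (avoids artificial locality).
    DATA: lv_matnr  TYPE matnr,
          lv_suffix TYPE n LENGTH 10.

    DO 100000 TIMES.
      lv_suffix = sy-index.
      CONCATENATE 'ZPERF' lv_suffix INTO lv_matnr.
    * ... create the material via a standard API (e.g. BAPI_MATERIAL_SAVEDATA)
    * and COMMIT WORK in packages to limit memory usage and lock duration ...
    ENDDO.

Generating keys from a predictable pattern like this also makes it easy for the test tools to supply the material number as an input field later on.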

Conclusion

Performance test automation helps you to prove whether your custom code shows any regression.

To ensure good results, it is important to have a dedicated and well-maintained test system – for example, avoid artifacts caused by virtualization techniques.

Before you test, think first about which KPIs you need; don't start from the test tool's capabilities.

SAP offers a variety of tools to support you in performance testing and test automation, so you can choose the strategy that fits you.



Performance Analysis and Optimization in Print

How to improve performance of ABAP applications
– Development best practices
– Performance aspects of the different layers
– Tools (SQL Trace, ABAP Trace)
– Do-it-yourself exercises

Author: From the central performance team

More information: http://www.dpunkt.de/buecher/3096.html
ISBN 978-3-89864-615-4 (German only)

Further Information

SAP Public Web:
– SAP Developer Network (SDN): www.sdn.sap.com
– Business Process Expert (BPX) Community: www.bpx.sap.com
– SAP BusinessObjects Community (BOC): www.boc.sap.com

Related Workshops/Lectures at SAP TechEd 2009:
– ALM269 – SAP's End-User Experience Monitoring
– ALM270 – SAP's End-to-End Monitoring and Alerting


Virtual SAP TechEd: Extend Your SAP TechEd Year Round

Best of SAP TechEd at your fingertips
– View sessions that you missed
– Replay and review sessions that you attended
– Learn at your own pace
– Gain access to sessions recorded in 2006, 2007, 2008 and 2009* (*available December 2009)
– 24/7 access online/offline

Quality SAP TechEd Training
– Best practices
– Product roadmaps
– Flexible course syllabus

Volume Licensing
– Special pricing for multiple subscribers

http://www.sdn.sap.com/irj/scn/virtualteched-allsessions

 Thank You!


Contact / Feedback

Please complete your session evaluation.

Be courteous — deposit your trash, and do not take the handouts for the following session.

Copyright 2009 SAP AG. All Rights Reserved.

No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice.

Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors.

Microsoft, Windows, Excel, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation.

IBM, DB2, DB2 Universal Database, System i, System i5, System p, System p5, System x, System z, System z10, System z9, z10, z9, iSeries, pSeries, xSeries, zSeries, eServer, z/VM, z/OS, i5/OS, S/390, OS/390, OS/400, AS/400, S/390 Parallel Enterprise Server, PowerVM, Power Architecture, POWER6+, POWER6, POWER5+, POWER5, POWER, OpenPower, PowerPC, BatchPipes, BladeCenter, System Storage, GPFS, HACMP, RETAIN, DB2 Connect, RACF, Redbooks, OS/2, Parallel Sysplex, MVS/ESA, AIX, Intelligent Miner, WebSphere, Netfinity, Tivoli and Informix are trademarks or registered trademarks of IBM Corporation.

Linux is the registered trademark of Linus Torvalds in the U.S. and other countries.

Adobe, the Adobe logo, Acrobat, PostScript, and Reader are either trademarks or registered trademarks of Adobe Systems Incorporated in the United States and/or other countries.

Oracle is a registered trademark of Oracle Corporation.

UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group.

Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame, and MultiWin are trademarks or registered trademarks of Citrix Systems, Inc.

HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C®, World Wide Web Consortium, Massachusetts Institute of Technology.

Java is a registered trademark of Sun Microsystems, Inc.

JavaScript is a registered trademark of Sun Microsystems, Inc., used under license for technology invented and implemented by Netscape.

SAP, R/3, SAP NetWeaver, Duet, PartnerEdge, ByDesign, SAP Business ByDesign, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and other countries.

Business Objects and the Business Objects logo, BusinessObjects, Crystal Reports, Crystal Decisions, Web Intelligence, Xcelsius, and other Business Objects products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of Business Objects S.A. in the United States and in other countries. Business Objects is an SAP company.

All other product and service names mentioned are the trademarks of their respective companies. Data contained in this document serves informational purposes only. National product specifications may vary.

These materials are subject to change without notice. These materials are provided by SAP AG and its affiliated companies ("SAP Group") for informational purposes only, without representation or warranty of any kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP Group products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.


Creation of a SAPGUI eCATT Script (2/6)

Creation of a SAPGUI eCATT Script (3/6)


Creation of a SAPGUI eCATT Script (4/6)

Creation of a SAPGUI eCATT Script (5/6)


Creation of a SAPGUI eCATT Script (6/6)