CSI 5112 Project
Evaluation / Comparison of
VISUAL STUDIO 2010 AND
IBM RATIONAL FUNCTIONAL TESTER
by Alper Yörükçü and Kenzie MacNeil
Submitted to Professor Daniel Amyot
for the course CSI 5112 - Software Engineering
Winter 2013
PRESENTATION OVERVIEW
1. Review of testing and test automation.
2. Business context of this evaluation.
3. Evaluation methodology.
4. Evaluation criteria.
5. Evaluation of Microsoft Visual Studio 2010
Ultimate package’s testing functionality.
6. Evaluation of IBM Rational Functional Tester.
7. Summary of the evaluation.
8. Recommendations.
9. Questions and comments.
Winter 2013 | MS Visual Studio 2010 & IBM Rational Functional Tester
SOFTWARE TESTING
Testing, verification and validation are important aspects of any software development process. Together these tasks help ensure the quality of the final product.
Different scopes/levels of software testing:
Unit (or Component) Testing: Individual and isolated testing of source code components.
Integration Testing: Testing of the integration of two or more tested components. Often used as a generic term.
System Testing: Testing of the completely integrated system. Often used as part of requirements verification.
Regression Testing: Rerunning existing tests to verify functionality after changes to the source code or environment.
User Interface (UI) Testing: Testing with a focus on the graphical user interface (GUI) components.
TEST AUTOMATION
Software test automation is a generic term for the process of using code, scripts, stubs and/or external software tools to execute a series of steps with predicted outcomes.
These tests are designed to operate without user interaction.
Test automation can be applied to the various levels of software testing.
Advantages:
Speed, reliability, reusability and cost (tests can be executed during off hours to save time).
Disadvantages:
Cost (creating and maintaining test cases).
BUSINESS CONTEXT
Client:
The company provides a variety of custom modeling, simulation and training solutions to customers.
Their solutions range from small consulting contracts to large system engineering solutions.
Their customers are chiefly domestic and international governments and militaries.
Context:
The company is currently transitioning its project development process to a Scrum-based model.
As part of this transition, the company is looking to add automated testing to current and future projects.
BUSINESS CONTEXT
The company is hoping to implement frequent automated system and user interface testing for some of their large system engineering solutions.
These automated tests will be designed to provide regression and requirements testing throughout the lifecycle of certain projects.
The product must be:
Able to provide UI testing;
Compatible with Visual Studio and .NET languages.
The ideal testing product would be:
Easy to learn;
Maintainable;
Low in cost;
Portable;
Able to support other IDEs and languages.
BUSINESS CONTEXT
We chose the system and UI testing functionality of the Visual Studio 2010 Ultimate package as our baseline.
This was to be compared against third-party software to determine whether our employer should upgrade their Visual Studio package or buy a separate testing product.
IBM's Rational Functional Tester was selected as our second product because the company already uses other IBM technologies, such as IBM Rational DOORS.
CRITERIA AND METHODOLOGY
EVALUATION METHODOLOGY
DESCRIPTION
We designed our own evaluation methodology.
We defined a set of criteria given the generic requirements and goals of our employer.
Each criterion was given a weight of 10, 5, 3 or 1.
During the evaluation, each product was given a score on a scale of 0 to 4 for each criterion, identifying how well the product satisfied that specific criterion.
The products are judged based on the sum of their weighted scores.
The product with the higher score is the more likely to be recommended to our employer.
EVALUATION METHODOLOGY
IMPACT LEVELS
Each criterion is given an impact level, which determines its weight in the final weighted sum.
A tool will be immediately dismissed from the evaluation as a potential solution if it receives a score of 0 for any mandatory criterion.
Criteria Impact Relevance   Numerical Weight
Mandatory                   10
High                        5
Medium                      3
Low                         1
EVALUATION METHODOLOGY
SCORE LEVELS
During the evaluation, each of the product's criteria will be given a score based on the scale below.
This score is based on the previously defined constraints of that particular criterion.
NOTE: Some criteria were limited to a subset of these scores.
For example, the ability to support the Windows OS was limited to Not Available and Satisfactory.
Evaluation Score   Numerical Value
Not Available      0
Unsatisfactory     1
Satisfactory       2
Good               3
Excellent          4
EVALUATION METHODOLOGY
CALCULATING FINAL SCORE
Each product’s final score will be calculated by
taking the sum of the weighted evaluation scores.
The weighted evaluation score is the numerical
impact level multiplied by the numerical score level.
A “Satisfaction Percentage” will also be
calculated by dividing the product’s final score by
the maximum achievable evaluation score.
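The calculation described above can be sketched in a few lines. This is an illustrative reconstruction of the scoring scheme (the impact weights, the 0-4 score scale, the mandatory-dismissal rule and the satisfaction percentage); all function and variable names are ours, not from the original evaluation.

```python
# Illustrative sketch of the evaluation scoring scheme (names are ours).
IMPACT_WEIGHTS = {"Mandatory": 10, "High": 5, "Medium": 3, "Low": 1}
SCORE_VALUES = {"Not Available": 0, "Unsatisfactory": 1,
                "Satisfactory": 2, "Good": 3, "Excellent": 4}
MAX_SCORE = 4  # numerical value of "Excellent"

def evaluate(criteria):
    """criteria: list of (impact_level, score_level) pairs.

    Returns (final_score, satisfaction_percentage), or None if the
    tool is dismissed (score of 0 on a mandatory criterion)."""
    total = maximum = 0
    for impact, score in criteria:
        weight = IMPACT_WEIGHTS[impact]
        value = SCORE_VALUES[score]
        if impact == "Mandatory" and value == 0:
            return None  # immediate dismissal
        total += weight * value        # weighted evaluation score
        maximum += weight * MAX_SCORE  # maximum achievable score
    return total, 100.0 * total / maximum
```

For example, a tool scoring Good on one Mandatory criterion and Satisfactory on one High criterion earns 10*3 + 5*2 = 40 of a possible 60 points, a satisfaction percentage of about 66.7%.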
EVALUATION CRITERIA
OVERVIEW
Our evaluation criteria were broken down into the following generic categories:
Functionality;
Usability;
Deployment;
Scalability;
Maintainability; and
Miscellaneous.
EVALUATION CRITERIA
FUNCTIONALITY
Functionality Criteria                        Impact Level
Automated Regression Testing                  10 - Mandatory
Automated System Testing                      10 - Mandatory
Automated User Interface Testing              10 - Mandatory
Automated Integration & Unit Testing Support  5 - High
Crash & Bug Handling                          5 - High
Crash Report                                  5 - High
Performance Testing Support                   5 - High
Bug Report                                    3 - Medium
Supports Distributed Systems                  3 - Medium
Test Results & Metrics                        3 - Medium
Interoperability - Bug and Crash Report       1 - Low
Interoperability - Test Results & Metrics     1 - Low
EVALUATION CRITERIA
USABILITY
Usability Criteria Impact Level
Community 3 - Medium
Product Documentation 3 - Medium
Required technical knowledge 3 - Medium
User Efficiency 3 - Medium
Support 1 - Low
EVALUATION CRITERIA
DEPLOYMENT
Deployment Criteria                                             Impact Level
Supports the .NET languages                                     10 - Mandatory
Supports Visual Studio IDE                                      10 - Mandatory
Supports Windows Operating System                               10 - Mandatory
Interoperability with Software Engineering Tools                5 - High
Additional supported IDEs                                       3 - Medium
Additional supported programming languages                      3 - Medium
Additional supported System Environments and Operating Systems  1 - Low
EVALUATION CRITERIA
SCALABILITY
Scalability Criteria      Impact Level
Test Case Scalability     3 - Medium

MAINTAINABILITY
Maintainability Criteria    Impact Level
Maintainability of Metrics  3 - Medium
Maintainability of Tests    3 - Medium
EVALUATION CRITERIA
MISCELLANEOUS
Miscellaneous Criteria    Impact Level
Product Cost              5 - High
Product Stability         5 - High
Product Performance       3 - Medium
Required Hardware Costs   1 - Low
VISUAL STUDIO 2010 ULTIMATE
Product of Microsoft
VISUAL STUDIO 2010 ULTIMATE
DESCRIPTION
Prominent IDE produced by Microsoft.
We evaluated the Visual Studio 2010 Ultimate package’s built-in automated system testing and user interface testing features.
Visual Studio also offers a variety of unit tests, load tests and web performance tests, which were not examined in detail for this particular evaluation.
Focused on the “Coded UI Test” functionality.
Coded UI tests are used to perform functional system testing and GUI testing.
These tests can be used to perform regression testing.
This feature is integrated into the Visual Studio 2010 Ultimate IDE. It is also available in Visual Studio 2010 Premium, 2012 Premium and 2012 Ultimate.
VISUAL STUDIO 2010 ULTIMATE
DESCRIPTION
Creating an automated Coded UI test case:
Create a "Coded UI Test" in a Test Project.
Create a test by either recording actions or reusing an existing test case.
Launch the UIMap application to build the test:
Records the user's actions based on peripheral input.
Record/Pause button to start/stop the recording.
Assertion button to create test assertions (i.e. expected results).
Have the Builder generate code at appropriate times during the recording.
Generates a UI item map and a test case in either Visual Basic or C#.
Executing automated tests:
Select a test or list(s) of tests and select Run.
Allow the test(s) to run without user interaction.
Produces a series of XML-based test result files.
VISUAL STUDIO 2010 ULTIMATE
CODED UI TEST EXAMPLE
Screenshot of Recording Coded UI Test Builder
VISUAL STUDIO 2010 ULTIMATE
CODED UI TEST EXAMPLE
Screenshot of Coded UI Test Builder’s Add Assertion functionality
VISUAL STUDIO 2010 ULTIMATE
CODED UI TEST EXAMPLE
Screenshot of generated Coded UI test case
VISUAL STUDIO 2010 ULTIMATE
CODED UI TEST EXAMPLE
Screenshot of Test Run Results
VISUAL STUDIO 2010 ULTIMATE
EVALUATION PROS AND CONS
Pros:
Allows users to edit the list of actions while recording.
Integrates well with other Microsoft products.
Records and executes UI tests well.
Easily reuses existing test cases when creating new ones.
Excellent documentation and community.
Cons:
Recording requires repetitive actions:
Record → Pause → Generate Code → Add Assertions → Generate Code → Record.
Bug and crash reports contain too much unhelpful data.
Does not handle image processing very well.
Requires technical knowledge (C# or VB).
VISUAL STUDIO 2010 ULTIMATE
EVALUATION SCORES
The following is our set of weighted evaluation
scores for Visual Studio 2010 Ultimate’s Coded
UI Test functionality.
The tool did not score 0 on any Mandatory criteria.
Criteria Weighted Evaluation Score
Functionality 127 out of 244
Usability 36 out of 52
Deployment 70 out of 128
Scalability 6 out of 12
Maintainability 12 out of 24
Miscellaneous 34 out of 56
VISUAL STUDIO 2010 ULTIMATE
RELATIVE EVALUATION SCORES
[Radar chart: relative evaluation scores (0-100%) for Functionality, Usability, Deployment, Scalability, Maintainability and Miscellaneous]
VISUAL STUDIO 2010 ULTIMATE
RESULTS
The automated UI and system testing
functionality was satisfactory.
Not suited for testing highly graphical UIs.
Offers a variety of other testing functionalities:
Unit test, load test, database tests, etc.
Final Score: 285 of 516
Satisfaction Percentage: 55.2 %
VISUAL STUDIO 2010 ULTIMATE
UI TESTING DOCUMENTATION AND TUTORIALS
Documentation:
Testing the User Interface with Automated UI Tests
http://msdn.microsoft.com/en-ca/library/dd286726%28v=vs.100%29.aspx
Video Tutorials:
Automated UI testing with Visual Studio 2010
https://www.youtube.com/watch?v=KYyj5Dfp8iE
How Do I: Get Started with Coded UI tests?
http://msdn.microsoft.com/en-us/vstudio/ee957688.aspx
VSTS 2010 : How to do Automation testing using coded UI test ?
https://www.youtube.com/watch?v=0SzoqOC4JC8
RATIONAL FUNCTIONAL TESTER
Product of IBM
RATIONAL FUNCTIONAL TESTER
DESCRIPTION
Testing tool produced by IBM as part of their Rational quality management and testing software set.
Provides automated system and regression testing for functional and GUI-based test cases.
Can act as a standalone application or be integrated into Eclipse, Visual Studio 2005, Visual Studio 2008 or Visual Studio 2010.
We evaluated this product by integrating it with Visual Studio 2010.
Used version 8.3.
RATIONAL FUNCTIONAL TESTER
DESCRIPTION
Creating a new automated test case:
Create a new test within a Functional Test project.
Launch the Recorder application to build the test:
Records the user's actions based on peripheral input.
Use the Record/Pause button to start/stop the recording.
Use the Verification Points and Action Wizard to create verification points (i.e. expected results).
Generates a map of UI elements and a test case in either Visual Basic or Java.
Executing automated tests:
A test script can be executed at any point.
Tests must complete without user interaction.
Produces an HTML-based results file which is viewable in an external web browser.
RATIONAL FUNCTIONAL TESTER
TEST CASE EXAMPLE
Screenshot of Functional Tester’s Recorder and Verification Point Wizard
RATIONAL FUNCTIONAL TESTER
TEST CASE EXAMPLE
Screenshot of generated VB test case
RATIONAL FUNCTIONAL TESTER
TEST CASE EXAMPLE
Screenshot of test run log
RATIONAL FUNCTIONAL TESTER
EVALUATION PROS AND CONS
Pros:
Relatively good image comparison features.
Easy to create test cases.
Integrates with other systems and environments: integrates into Eclipse and can run in Linux.
Documentation, community and customer support are relatively good.
Cons:
Expensive for a single product.
Was unstable in our Visual Studio 2010 test environments.
Requires technical knowledge (VB or Java).
Cannot edit tests while recording.
Easy to interfere with the automated tests.
Reports do not contain a high level of detail.
Does not offer unit, integration or performance testing (available in separate IBM Rational products).
RATIONAL FUNCTIONAL TESTER
EVALUATION SCORES
The following is our set of weighted evaluation
scores for IBM Rational Functional Tester.
The tool did not score 0 on any Mandatory criteria.
Criteria Weighted Evaluation Score
Functionality 86 out of 244
Usability 36 out of 52
Deployment 102 out of 128
Scalability 3 out of 12
Maintainability 9 out of 24
Miscellaneous 19 out of 56
RATIONAL FUNCTIONAL TESTER
RELATIVE EVALUATION SCORES
[Radar chart: relative evaluation scores (0-100%) for Functionality, Usability, Deployment, Scalability, Maintainability and Miscellaneous]
RATIONAL FUNCTIONAL TESTER
RESULTS
The automated UI and system testing
functionality was satisfactory.
Does not offer other types of automated testing.
Strongly satisfied our deployment criteria.
Very unstable in our test/evaluation
environments.
Final Score: 255 of 516
Satisfaction Percentage: 49.4 %
RATIONAL FUNCTIONAL TESTER
DOCUMENTATION AND TUTORIALS
Documentation:
Introduction to IBM Rational Functional Tester
http://publib.boulder.ibm.com/infocenter/rfthelp/v8r2/index.jsp?topic=%2Fcom.ibm.rational.test.ft.readme.doc%2Ftopics%2Freadme.html
Video Tutorials:
Getting Started with IBM Rational Functional
Tester v8.3
http://www.youtube.com/watch?v=NadS0T0sJgc
IBM Rational Functional Tester
http://www.youtube.com/watch?v=venklcLj0RY
EVALUATION RESULTS
EVALUATION RESULTS
EVALUATION SCORES COMPARISON
Criteria              Visual Studio 2010 Ultimate   IBM Rational Functional Tester
Functionality (244)   127                           86
Usability (52)        36                            36
Deployment (128)      70                            102
Scalability (12)      6                             3
Maintainability (24)  12                            9
Miscellaneous (56)    34                            19
TOTAL (516)           285                           255
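The totals above can be checked directly: each product's final score is the sum of its six category scores, and the satisfaction percentage reported on the individual results slides divides that sum by the 516-point maximum. A quick arithmetic check, with the values transcribed from the table:

```python
# Values transcribed from the comparison table above.
category_max = [244, 52, 128, 12, 24, 56]
vs2010 = [127, 36, 70, 6, 12, 34]   # Visual Studio 2010 Ultimate
rft = [86, 36, 102, 3, 9, 19]       # IBM Rational Functional Tester

max_total = sum(category_max)                    # 516
vs_total, rft_total = sum(vs2010), sum(rft)      # 285 and 255
vs_pct = round(100 * vs_total / max_total, 1)    # 55.2
rft_pct = round(100 * rft_total / max_total, 1)  # 49.4
```

These match the final scores (285 and 255) and the satisfaction percentages (55.2% and 49.4%) reported earlier for each product.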
EVALUATION RESULTS
RELATIVE EVALUATION SCORES COMPARISON
[Side-by-side radar charts of relative category scores (0-100%): Visual Studio 2010 Ultimate vs. Rational Functional Tester]
EVALUATION RESULTS
PRODUCT RECOMMENDATION
Visual Studio 2010 Ultimate and IBM Rational Functional Tester both satisfied our immediate requirements.
Neither product truly impressed us.
If pressed, we would recommend upgrading to Visual Studio 2010 Ultimate over buying IBM Rational Functional Tester.
This is mainly because of Visual Studio's other accompanying features.
We do recommend further investigation and evaluation of the available automated system and UI testing tools.
EVALUATION RESULTS
AUTOMATED TESTING PRODUCTS
The following is a list of other automated testing tools that could be considered as alternative products for this evaluation:
Borland Silk Test set (i.e. TestPartner for functional tests). http://www.borland.com/products/
SmartBear TestComplete. http://smartbear.com/products/qa-tools/automated-testing-tools
HP Unified Functional Testing Software. http://www8.hp.com/ca/en/software-solutions/software.html?compURI=1172957#.UVxT3TdXplE
TOSCA Testsuite. http://www.tricentis.com/en/
Testing Anywhere. http://www.automationanywhere.com/Testing/products/automated-testing-anywhere.htm
IBM Rational Quality Management and Testing products. http://www-142.ibm.com/software/products/us/en/category/SW730
BullseyeCoverage (i.e. Unit testing). http://www.bullseye.com/
Parasoft’s Test Suite (i.e. Unit, integration and regression testing). http://www.parasoft.com/jsp/products.jsp
QUESTIONS OR COMMENTS?
…Criticism?
OBLIGATORY DILBERT COMICS