Confidential PA1 2013-12-13

Testing Mobile Applications: A Model for Mobile Testing




Introduction

▪ This presentation outlines a model for mobile application testing

▪ Focus on what could be included in a mobile application test scope

▪ The model is context-independent, and should therefore be adapted when applied to a specific context

▪ It has four different dimensions: application coverage items, software quality attributes, configurations, and environments


Mobile Application Test Scope

▪ The model outlines what could be part of a mobile application test scope

▪ Which tests are actually selected, created, and executed depends on different priorities and risks:

▪ Business priorities/risks?

▪ Customer priorities/risks?

▪ User priorities/risks?

▪ Technical priorities/risks?


Model Overview

Application Coverage Items

Configurations

Software Quality Attributes

Environments


Application Coverage Items

▪ Divide the application into different functional areas

▪ Each functional area is called a Coverage Item

▪ The size of a coverage item depends on the granularity needed for planning and reporting


Example: Alarm & Clock Application

The Alarm & Clock application can be divided into four coverage items:

▪ Alarm Clock

▪ World Time

▪ Stop Watch

▪ Timer


Software Quality Attributes

▪ ISO/IEC 25010 [1] defines eight software quality characteristics:

▪ Functional Suitability

▪ Performance Efficiency

▪ Compatibility

▪ Usability

▪ Reliability

▪ Security

▪ Maintainability

▪ Portability


Example: Functional Suitability

▪ Functional Completeness

▪ Are the application features there?

▪ Functional Correctness

▪ Are the application features working?

▪ Installability

▪ Is the install/uninstall process for the application working?

▪ Back-end functionality


Example: Performance Efficiency

▪ Application start up time

▪ UI transitions

▪ Memory usage

▪ Loading times

▪ Save time

▪ Back-end capabilities

▪ Performance and capacity testing

▪ Stress tests
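Start-up and loading times lend themselves to simple automated budget checks. A minimal sketch (the `simulated_startup` function and the 1-second budget are hypothetical stand-ins, not part of the original model):

```python
import time

def measure(operation, runs=5):
    """Time an operation several times and return the worst-case duration in seconds."""
    worst = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        worst = max(worst, time.perf_counter() - start)
    return worst

def simulated_startup():
    time.sleep(0.01)  # stand-in for real application start-up work

# Assert that the (simulated) start-up stays within a hypothetical 1-second budget.
assert measure(simulated_startup) < 1.0
```

Measuring the worst case rather than the average is a deliberate choice: users notice the slow runs, not the mean.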


Example: Reliability

▪ Availability

▪ Service up-time

▪ Recoverability

▪ What happens when the application crashes?

▪ How does the application handle data during crash?

▪ Service back-end recovery


Example: Reliability

▪ Random stability tests

▪ Monkey testing for Android applications

▪ Random walkthrough of the UI until the application crashes or a time limit is reached

▪ Aging & Duration

▪ What happens when the application is used over a long period of time, or runs continuously?

▪ What happens to the back-end over time?
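On Android, random stability testing is typically driven by the `adb shell monkey` tool. The core idea can be sketched in a few lines; `ToyApp` and its actions are hypothetical stand-ins for a real driver around the application under test:

```python
import random

class ToyApp:
    """Minimal stand-in for an application under test (hypothetical)."""
    def __init__(self):
        self.screen = "home"

    def available_actions(self):
        return ["tap", "swipe", "back", "rotate"]

    def perform(self, action):
        # A real driver would dispatch the event to the device; a buggy
        # application would raise an exception (crash) somewhere in here.
        self.screen = "home"

def monkey_test(app, steps=500, seed=42):
    """Fire random UI actions until the app crashes or the step limit is reached."""
    rng = random.Random(seed)  # fixed seed makes a failing run reproducible
    for step in range(steps):
        try:
            app.perform(rng.choice(app.available_actions()))
        except Exception as exc:
            return ("crashed", step, repr(exc))
    return ("survived", steps, None)

result = monkey_test(ToyApp())
```

Seeding the random generator is the important detail: a crash found by a random walk is only useful if the same walk can be replayed.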


Example: Usability

▪ Different user personas

▪ Test the application through different user personas to cover the most important use cases

▪ Learnability / Understandability

▪ Is it easy to learn how to use the application?

▪ User Interface Aesthetics

▪ Accessibility

▪ Is the application usable for people with disabilities?

▪ User Error Protection

▪ Is the application mistake-proof?


Example: Compatibility

▪ Interoperability testing

▪ Support for different accessories

▪ Headsets, watches, wristbands, external storage, external displays

▪ Support for communication with third-party devices

▪ TV, PlayStation, Computer

▪ Co-existence

▪ Sharing resources with other applications

▪ Interruptions from other applications or systems

▪ Sharing resources with other users

▪ Back-end access

▪ Multiplayer usage with other mobile devices


Example: Maintainability

▪ Application upgrade

▪ What happens when you upgrade to a new version of the application, especially to user data?

▪ Testability

▪ Has the application been created to enable efficient testing?

▪ Replacing the back-end

▪ Backwards Compatibility


Example: Security

▪ Fuzz testing

▪ Fuzz testing or fuzzing is a software testing technique, often automated or semi-automated, that involves providing invalid, unexpected, or random data to the inputs of a computer program. The program is then monitored for exceptions such as crashes, failing built-in code assertions, or potential memory leaks. Fuzzing is commonly used to test for security problems in software or computer systems. [2]

▪ There are of course many ways of improving application security [3]
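A minimal fuzzing loop can be written in a few lines. In this sketch, `parse_record` is a hypothetical parser with a deliberately planted defect (it mishandles NUL bytes), so the fuzzer has something to find; everything here is illustrative, not part of the original model:

```python
import random

def parse_record(data: bytes) -> str:
    """Hypothetical parser under test, with a planted defect on NUL bytes."""
    if b"\x00" in data:
        raise RuntimeError("unhandled NUL byte")  # the planted defect
    name, age = data.split(b":", 1)               # ValueError if no separator
    return f"{name.decode()} is {int(age)}"       # ValueError / UnicodeDecodeError possible

def fuzz(target, iterations=1000, max_len=32, seed=0):
    """Feed random byte strings to `target` and collect unexpected exceptions."""
    rng = random.Random(seed)
    failures = []
    for _ in range(iterations):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(max_len)))
        try:
            target(data)
        except (ValueError, UnicodeDecodeError):
            pass  # expected, graceful rejection of malformed input
        except Exception as exc:
            failures.append((data, repr(exc)))  # unexpected: a potential bug
    return failures

failures = fuzz(parse_record)
```

The key design point is the distinction between expected rejections (the parser saying "no" cleanly) and unexpected exceptions, which are the actual findings.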


Example: Portability

▪ Portability testing refers to the process of testing the ease with which a computer software component can be moved from one environment to another, e.g. moving between two different Android versions, or between Android and Firefox OS. This is typically measured in terms of the maximum amount of effort permitted. [4]

▪ An activity performed by a Software Developer (in Test)?


Configurations

▪ Different Operating Systems

▪ Different Chipsets & HW

▪ Tablets vs. Smart Phones

▪ Free vs. Premium

▪ Targeted customers


Different Operating Systems

▪ Is the application working for all intended operating systems?

▪ Android

▪ Firefox OS

▪ Windows

▪ iOS

▪ Different versions of operating systems

▪ New Android versions

▪ Old versions?

▪ Different software releases

▪ Different OS configurations and customizations


Different Chipsets & HW

▪ Mobile Devices have different chipsets with different capabilities

▪ Qualcomm

▪ Nvidia

▪ Samsung

▪ Texas Instruments

▪ Devices have different HW support for different features

▪ Devices could have different capabilities, even though they have the same chipset, such as RAM, internal memory, etc.


Tablet vs. Smartphone

▪ Default screen orientation differs between smartphones and tablets

▪ There are many different screen sizes for the applications to support

▪ Some tablets only support WiFi

▪ Speakers / sound may be different


Free vs. Premium

▪ If your application has both a free and a premium version, you need to consider what testing can be re-used, and what must be re-tested for both versions

▪ Depends on the implementation of the different versions

▪ With a good implementation, the additional testing needed for the free version should be minimal


Targeted Customers

▪ If an application has several customized versions going to different customers, it is necessary to consider what testing can be re-used between versions, and what is customer specific


Environments

▪ Mobile Networks

▪ WiFi and Bluetooth

▪ Content

▪ Servers & Services


Mobile Networks

▪ Different mobile networks and operators have different configurations and different capabilities

▪ Does the application have access to the correct servers?

▪ Is the back-end working in a live environment?

▪ Can the application handle loss of connection or low bandwidth?
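One common way for an application to handle loss of connection is to retry with exponential backoff. A minimal sketch of both the mechanism and a test for it; the `ConnectionLost` exception and `flaky_fetch` back-end are hypothetical stand-ins:

```python
import time

class ConnectionLost(Exception):
    """Hypothetical error raised when the network drops mid-request."""

def with_retries(fetch, attempts=3, backoff=0.0):
    """Call `fetch`, retrying on connection loss; re-raise after the last attempt."""
    for attempt in range(attempts):
        try:
            return fetch()
        except ConnectionLost:
            if attempt == attempts - 1:
                raise
            time.sleep(backoff * (2 ** attempt))  # exponential backoff between tries

# Hypothetical flaky back-end: fails twice, then succeeds on the third call.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionLost()
    return "payload"

assert with_retries(flaky_fetch) == "payload"
```

A test like this deliberately injects the failure rather than waiting for a real bad network, which makes the connection-loss behaviour reproducible in CI.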


WiFi / BT

▪ What happens if there is a loss of WiFi network, or BT connection during application usage?

▪ What happens if reception is bad or fluctuating?


Content

▪ If the application handles different kinds of content, it is relevant to try out live customer content to verify compatibility

▪ Different file types

▪ Some content could have DRM protection, and also different DRM solutions

▪ NFC tags can have different data types and different amounts of data


Servers & Services

▪ How does the application work with different content and streaming servers?

▪ Different email solutions

▪ Other services


Testing the Complete Ecosystem


Creating Tests based on the Model

▪ Each test case / test session / test mission / test should be marked as covering:

▪ One or multiple Coverage Items

▪ One or multiple Software Quality Attributes

▪ Low level tests usually only cover one coverage item and one quality attribute, while high level system tests usually cover multiple

▪ Which tests are created should be based on risk and priorities
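The tagging described above can be as simple as attaching two sets of labels to each test. A sketch using the Alarm & Clock example; the test names and the `select` helper are illustrative, not prescribed by the model:

```python
# Each test is tagged with the coverage items and quality attributes it covers.
tests = [
    {"name": "alarm_rings_at_set_time",
     "coverage_items": {"Alarm Clock"},
     "quality_attributes": {"Functional Suitability"}},
    {"name": "timer_survives_24h_run",
     "coverage_items": {"Timer"},
     "quality_attributes": {"Reliability"}},
    # A high-level system test covering multiple items and attributes:
    {"name": "full_app_monkey_run",
     "coverage_items": {"Alarm Clock", "World Time", "Stop Watch", "Timer"},
     "quality_attributes": {"Reliability", "Performance Efficiency"}},
]

def select(tests, attribute):
    """Risk-based selection: pick every test covering a given quality attribute."""
    return [t["name"] for t in tests if attribute in t["quality_attributes"]]

selected = select(tests, "Reliability")
```

With tags in place, "which tests do we run when reliability risk is high?" becomes a one-line query instead of a manual review.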


Test Runs / Executions / Activity

▪ Each test run / execution / activity should be marked as covering:

▪ One configuration

▪ One environment

▪ You can run the same tests on multiple configurations and environments

▪ Which tests are executed should be based on risks and priorities


Test Reporting

▪ One (sub-)report for each configuration and environment

▪ Coverage items and software quality attributes can be used in the reports to facilitate analysis and understanding

▪ Group results based on different coverage items and/or quality attributes

▪ Coverage Item / Software Quality Attribute Matrix


Test Reporting Example

One matrix per configuration and environment, e.g. Configuration X / Environment Y:

Coverage Item | Functional | Performance | Compatibility | Usability | Reliability | Security | Maintainability | Portability
Alarm Clock   |            |             |               |           |             |          |                 |
World Time    |            |             |               |           |             |          |                 |
Stop Watch    |            |             |               |           |             |          |                 |
Timer         |            |             |               |           |             |          |                 |

Each cell is marked with one of: Test OK / Test Not OK / Not Sufficient Coverage / No Test Needed
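Generating such a matrix from recorded results is straightforward. A sketch, where the sample results and the status labels are hypothetical; any cell without a recorded result defaults to "not sufficient coverage":

```python
# Hypothetical per-cell results for one configuration/environment report.
results = {
    ("Alarm Clock", "Functional"): "OK",
    ("Alarm Clock", "Reliability"): "NOT OK",
    ("Timer", "Functional"): "OK",
}

def matrix(results, items, attributes, default="INSUFFICIENT"):
    """Build the coverage-item x quality-attribute matrix as nested dicts."""
    return {item: {attr: results.get((item, attr), default) for attr in attributes}
            for item in items}

report = matrix(results,
                ["Alarm Clock", "World Time", "Stop Watch", "Timer"],
                ["Functional", "Performance", "Reliability"])
```

Defaulting empty cells to "insufficient coverage" (rather than leaving them blank) makes gaps in the test scope impossible to overlook in the report.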


Conclusion

▪ The model described in this presentation can be applied to most contexts

▪ To use the scope created with this model effectively, it should be combined with established testing practices such as:

▪ Risk-based testing

▪ 10-minute test plans

▪ Session-based testing

▪ Etc.

▪ The model alone will not solve all your mobile application testing needs, but it can be a good starting point for setting a relevant test scope


References

[1] ISO/IEC 25010: http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=35733

[2] Fuzz Testing: http://en.wikipedia.org/wiki/Fuzz_testing

[3] Application Security: http://en.wikipedia.org/wiki/Application_security

[4] Portability Testing: http://en.wikipedia.org/wiki/Portability_testing