IMPLEMENTING AUTOMATED TESTING – WORKING SMARTER NOT HARDER


DESCRIPTION

Blayn Parkinson, University of York

Automated testing frameworks and their use were once the preserve of software developers and specialist companies. They are now becoming accessible to those of us without such resources at our disposal. Automated testing tools which record clicks and play back tests are just a plug-in away. These tools allow ‘front end’ testers to record a test once and repeat it in a variety of browsers, ultimately speeding up testing and automating repetitive tasks.

This presentation documents our journey in developing an experimental approach to upgrade testing (Blackboard Learn 9.1 SP12) and task automation (exporting data from Campus Pack 4) via web browser automation developed at the University of York using Selenium (http://www.seleniumhq.org/). A formal set of pre-upgrade tests, conducted by hand in a variety of supported browsers, has traditionally been used at York to validate service pack upgrades. 50% of these tests have been automated and can now be run on demand to support change management processes. This has freed up the time of our e-learning support staff to appraise new service pack features and tools rather than get bogged down in testing. An added benefit is that the lead times needed to test new service packs have been cut, allowing us to deploy the latest service pack offering.

We will outline the evolution of our approach to managing VLE upgrades, charting the transition from manual to automated functional testing and examining the challenges of developing and implementing automated testing: the problems encountered at the time, along with our assessment of the sustainability of this approach for future upgrade testing. We will also describe how automated testing has been deployed to support the transition away from a cloud-based blog/wiki service. The development of data-driven automated scripts to mass export content has saved time and money and provided a neat solution to reclaiming data hosted off campus.


Automated Testing: Saving Time & Money
Blayn Parkinson, E-Learning & Training Support Assistant

University of York (UK)

ABOUT ME

Blayn Parkinson
E-Learning & Training Support Assistant
University of York (UK)
blayn.parkinson@york.ac.uk

I have used Blackboard for 3 years.

ABOUT MY INSTITUTION

University of York

15,330 Students

Using Blackboard since 2005.

Central e-Learning team of 6 members (some departments have their own provision)

WHAT WE ARE GOING TO LEARN TODAY

• The benefits of using automated testing strategies.

• Limits to automated testing

• Suggestions on future collaboration in this area

OUR CHALLENGE

Thorough testing cycle (no software deployed without testing).

3 months: April – June (one window a year to achieve this).

• Lack of agility in responding to new releases, due to the testing time that we need to devote to a release.

• Needed to lock in to a release at a particular time.

• Spread the load across the team in terms of manual testing.


Functional testing:
• is necessary
• is time consuming
• can be ambiguous


Automated testing is:
• consistent
• easily repeatable
• reusable


ORGANISING TESTS

RECORDING SCRIPTS

Selenium IDE records tests as a sequence of steps, each defined by a Command, a Target (locator) and a Value; playback speed can be adjusted.

EXAMPLE SCRIPT

2_LoginTestStudentAccount

open https://vle.york.ac.uk/webapps/login/?action=relogin

type id=user_id studtesta

type id=password ********

clickAndWait css=input[name="login"]
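
For context, the recorded Selenese steps above map directly onto Selenium WebDriver calls. Here is a minimal sketch in Python, reusing the URL, locators and test account from the script above; the choice of Firefox and the translation itself are illustrative assumptions, not the setup we used:

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    # Start a browser; other drivers (e.g. Chrome) would cover the other supported browsers.
    driver = webdriver.Firefox()
    try:
        # 'open': load the VLE login page.
        driver.get("https://vle.york.ac.uk/webapps/login/?action=relogin")

        # 'type': enter the test student credentials using the same locators.
        driver.find_element(By.ID, "user_id").send_keys("studtesta")
        driver.find_element(By.ID, "password").send_keys("********")

        # 'clickAndWait': submit the login form; WebDriver waits for the page load.
        driver.find_element(By.CSS_SELECTOR, 'input[name="login"]').click()
    finally:
        driver.quit()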

TEST MANAGER

CONSTRUCTING GOOD TESTS

A good test:
– mirrors user behaviour
– is independent of other tests
– reports accurately

Verification options: is the expected HTML present? Is the expected text present? Take a screenshot for human inspection.

SOURCE CONTROL

TEST SCRIPT (A1.3) ADD BLOG POST (ROLE STUDENT)

BLOCK 1 Login as test student

BLOCK 2 Navigate to the test course site

BLOCK 3 Navigate the left hand menu to access the blog tool

NEW SCRIPT Enter the blog and click the Add New Entry button

BLOCK 4 Add content to the content editor

BLOCK 5 Click the submit button

BLOCK 6 Take a screenshot for human inspection/verification

NEW SCRIPT Verify text on the page is present

BLOCK 7 Log out of the VLE (see the sketch below)
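
As a rough illustration of blocks 6 and 7 plus the text-verification step, here is a minimal WebDriver sketch in Python; the screenshot file name, the confirmation text and the logout URL are all assumptions for illustration:

    from selenium import webdriver

    driver = webdriver.Firefox()
    try:
        # Blocks 1-5 would run here: login, navigation, adding the blog entry.

        # Block 6: take a screenshot for human inspection/verification.
        driver.save_screenshot("a1_3_add_blog_post.png")

        # New script step: verify that expected text is present on the page.
        assert "Entry submitted" in driver.page_source  # assumed confirmation text

        # Block 7: log out of the VLE (assumed logout URL).
        driver.get("https://vle.york.ac.uk/webapps/login/?action=logout")
    finally:
        driver.quit()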

RUNNING THE PROCESS

Each test run is recorded against one of three outcomes: Pass, Pass pending check, or Fail.

Flow: Test Suite → Execute (by Test Script ID) → Test Results Web Service → Test Manager Results
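
To make the flow concrete, here is a sketch of how one result might be posted to the results web service, in Python with only the standard library; the endpoint URL and payload fields are hypothetical, not our actual service:

    import json
    import urllib.request

    # Hypothetical endpoint standing in for the Test Results Web Service.
    RESULTS_URL = "https://example.york.ac.uk/test-results"

    def post_result(script_id, case, outcome):
        # outcome is one of "Pass", "Pass pending check" or "Fail".
        payload = json.dumps({
            "script_id": script_id,  # e.g. "A1.3"
            "case": case,            # browser/platform combination
            "outcome": outcome,
        }).encode("utf-8")
        request = urllib.request.Request(
            RESULTS_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as response:
            return response.status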

TEST CASES & RUNS

Case = Browser/Platform combination

Run = Results against a particular case
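
One way to drive the same test across several cases and collect a run per case, sketched in Python with Selenium WebDriver; the browser list and the pass/fail logic are simplified assumptions:

    from selenium import webdriver

    # Each case is a browser/platform combination we want to cover.
    CASES = {
        "Firefox": webdriver.Firefox,
        "Chrome": webdriver.Chrome,
    }

    def run_case(name, make_driver):
        # A run is the result of the test against one particular case.
        driver = make_driver()
        try:
            driver.get("https://vle.york.ac.uk/webapps/login/?action=relogin")
            # ... replay the recorded test steps for this browser ...
            return name, "Pass"
        except Exception:
            return name, "Fail"
        finally:
            driver.quit()

    for case_name, factory in CASES.items():
        print(run_case(case_name, factory))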

TEST RESULTS

PROS AND CONS

Pros: saved time; reusable tests

Cons: maintenance burden; human intervention still required

TAKING THIS FORWARD

• Automate more
• Maintenance
• Continuous testing
• Load testing
• Blackboard client assisted testing?

OUR RESULTS

• 50% of tests automated (from 2013 testing cycle)

• 2014 upgrade: locking in to SP16

• How much maintenance is needed? Blackboard's switch from iframes to HTML5 will impact this.

• Can we extend the current percentage of automated tests?

• A full review of the complete process this summer

DO THIS NEXT

Think about how the adoption of automated testing processes might benefit your institution.

Could we jointly (with Blackboard) come up with a solution to better support the testing processes within our own institutions?

THANK YOU!

Blayn Parkinson
E-Learning & Training Support Assistant
University of York (UK)
blayn.parkinson@york.ac.uk

Link to the paper: http://goo.gl/Cp17CL