
NZTester The Quarterly Magazine for the New Zealand Software Testing Community and Supporters

COMPLIMENTARY ISSUE 8 NOV 2014 - JAN 2015

In this issue: Interview with John Lockhart, WebTest

Testing@Kiwiplan

Dealing with Professional Manipulation

The Testing Maturity Model

Mobile Testing: Ten Best Practices

People Management in Testing

Ever Had That Sinking Feeling?

StarWest, Let’s Test Oz Conference reviews + Coffee with Viswa, Testing Events & more….


NZTester Magazine

Editor: Geoff Horne
[email protected]
[email protected]
ph. 021 634 900
P O Box 48-018, Blockhouse Bay, Auckland 0600, New Zealand
www.nztester.co.nz

Disclaimer:

Articles and advertisements contained in NZTester Magazine are published in good faith and although provided by people who are experts in

their fields, NZTester make no guarantees or representations of any kind concerning the accuracy or suitability of the information contained

within or the suitability of products and services advertised for any and all specific applications and uses. All such information is provided “as

is” and with specific disclaimer of any warranties of merchantability, fitness for purpose, title and/or non-infringement. The opinions and

writings of all authors and contributors to NZTester are merely an expression of the author’s own thoughts, knowledge or information that they

have gathered for publication. NZTester does not endorse such authors, necessarily agree with opinions and views expressed nor represents

that writings are accurate or suitable for any purpose whatsoever. As a reader of this magazine you disclaim and hold NZTester, its

employees and agents and Geoff Horne, its owner, editor and publisher, harmless of all content contained in this magazine as to its warranty

of merchantability, fitness for purpose, title and/or non-infringement.

No part of this magazine may be reproduced in whole or in part without the express written permission of the publisher.

© Copyright 2014 - NZ Tester Magazine, all rights reserved.

Advertising Enquiries: [email protected]


The Journal For New Zealand Test Professionals

I’m told that while first anniversaries are a huge

ra-ra affair, seconds are far more ho-hum. Excuse us

then if we do do a little ra-ra on our front cover this

issue as it has been two years since the first issue of

NZTester Magazine was published.

It might sound a little ‘Kunta Kinte’-ish (if you

remember the Roots mini-series from the

’seventies) that our beginnings are carved into my

professional folklore - ’whilst recovering from

….yadda yadda yadda….NZTester was born!’

however we hope you’ll forgive us for carrying just

a little pride in that we made it to a second

anniversary issue. I’m told that very few

professional publications actually make it this far!

Anyway, I do have to apologise for the delay in

getting this issue out. It's been a busy period: what with starting a new assignment plus another short but unexpected stint in hospital, I ended up way behind in everything that I needed to get together. We've made this issue a little larger accordingly; hope it's worth the wait!

It also comes hot on the heels of the first NZTester

Magazine Conference held in Auckland in August

and we’ve included a few pics from this.

We also have articles from first-time NZTester

contributors Richard Sims, Peter Bink and Sid

Holmes. Regular contributor Mike Talks reviews

Let’s Test Oz and our staff writer does the honours

for StarWest 2014.

We also reveal just a little of what we’re planning

for 2015 on our Test Events page.

Lastly, this will be the final NZTester issue for 2014

and so it’s fitting that we wish everyone a Merry

Christmas and a Happy New Year. By the time this

issue is received, we’ll all be well into the annual

silly season and most of us will probably be thinking

of camping, fishing or whatever else we like to do in

the holidays.

Keep well and in touch.

IN THIS ISSUE…

Click on title...

5 NZTester Magazine Conference

8 Interview with John Lockhart Director, WebTest

10 Recognising and Dealing with Professional Manipulation Geoff Horne, Editor, NZTester Magazine

13 Testing @ Kiwiplan

15 Ever Had That Sinking Feeling? NZTester Magazine Staff Writer

20 The Testing Maturity Model Richard Sims, Planit Software Testing

26 People Management in Testing Sid Holmes, AMP

28 Mobile Testing: Challenges and Ten Best Practices

Peter Bink, Capgemini

31 Review: Let’s Test Oz Mike Talks, Datacom

38 Review: StarWest 2014 NZTester Magazine Staff Writer

REGULAR FEATURES

22 Coffee with Viswa

35 Testing Events

37 NZTester and OZTester Magazine

Back Issues



Welcome to the NZTester Magazine Conference

Well, we did it! I'd like to be able to say without a hitch but that would be fibbing! Despite the challenges and the repeated attempts of Murphy to derail us, we went ahead and, based on feedback, it appears that it was a resounding success.

Eighty-five test professionals gathered in Auckland on 13 August to attend our inaugural conference, adding to the thirty or so who participated in the Tutorials. With the day kicking off with Matt Mansell and the Five Whys of Testing and wrapping with a popular Panel Discussion, a healthy

energy was maintained throughout and I have to confess slipping into bed

that evening was amongst the most satisfying of exploits! Thanks to all!


The Conference at a quick glance:

Matt Mansell holds court!

The Software Education stand was well visited….

....as was the lunch queue

The panel session at the

end of the day proved to

be very popular. Richard

Leeke (Equinox), Vishav

Preet (Air NZ), Joshua

Raine (StatisticsNZ), Pete

Couper (IntegrationQA)

and Matt Mansell (Dept of

Internal Affairs) were

our expert panelists.


This issue's interview is:

John Lockhart
Test Automation Specialist, Project Manager, Director
WebTest

Our interview this issue is with John Lockhart of

WebTest. I’ve known John for a few years now and

have found his approach to testing to be professional,

pragmatic and highly sought after.

NZTester: Can you please describe WebTest?

We are a small and flexible company run by three

experienced testers to bring great staff and those

who need them together. We focus on building

ethical and trusting relationships, and on our testers exhibiting the best qualities of the Kiwi IT industry -

flexibility, smarts, and a willingness to do whatever

is required to support their teams.

NZTester: What products and services does

WebTest offer?

The full range, but particularly we provide agile approaches to engagement, testing or test automation, focussing more on the test analyst through to test lead roles, where our employment, management and engagement models let us provide great value to clients, enabling them to often get the benefits of outsourcing with a cost structure equivalent to internal charge-out rates.

NZTester: What do you believe makes WebTest

different?

We have a zero-overhead model: by not having physical office, marketing and other overheads we can offer superior value and flexibility to clients and, being full-time testers/test managers ourselves, supportive management to our staff. In some ways we offer a full business service with a cost structure more like a testers' co-operative.

NZTester: What do you think makes a Test

Manager or Analyst come to work for

WebTest?

Those who like to work for great clients in longer

term contract types of roles but in a permanent

employment arrangement with a supportive

employer who wants to develop their careers.

NZTester: Where do you believe NZ’s approach

to testing is going well?

We've seen a big move to the use and recognition of professional testers; to testers spending time adding value by improving quality rather than writing unnecessary documentation; and to high-trust engagement models that maximise efficiency. Also, driven partly by agile and fast time-to-market models, a growth in the use of automation to remove repetitive checking and a move beyond simplistic business case-driven, big-bang, last-century approaches to a more strategic commitment to better automation models - all in the seven or eight

years of WebTest’s existence. We try to play a small

role in this evolution where we can.

NZTester: Where do you believe NZ’s approach

could improve?

I personally believe that we are missing a lot of value

because we see testing documentation

(both automated and manual) and requirement/

specification documentation as different things. In

particular I believe the Specification by Example


Page 9: NZTester · The full range but particularly we provide agile approaches to engagement, testing or test ... approach taught by people such as Gojko Adzic and pioneered partly by local

9

(SBE) approach - taught by people such as Gojko Adzic, pioneered partly by local Rick Mugridge and these days helped along by others like Darren Rowley and Nigel Charman (Assurity) - is a significant paradigm shift. It potentially greatly improves the

return on investment on time spent not just by

testers but other team members, and integrates the

role of testers and the rest of the agile team in a way

that has been lacking. I’ve seen it provide similar

benefits in traditional development models as well.

I think teams still make the mistake of thinking

quality is the responsibility of testers or can be tested

into software. I see testing more as the final polishing

process of software development, and no amount of

polishing can turn a piece of crap into a Rembrandt.

Quality on the other hand should be a core driver of

all our development processes. I think people have

still not learnt the lesson of Deming that in most

circumstances higher quality, when you take into

account the indirect costs of poor quality as well as

the direct ones, always pays for itself and can’t be

over emphasised. That doesn’t mean wasting money

or being inefficient about achieving it though, in fact

the most efficient approach is often the most effective

as it is the most simple.

NZTester: Do you believe that overall the

standard of testing in NZ is improving?

Yes, certainly.

NZTester: Where do you believe the next

initiatives in testing lie? What's coming next?

In NZ? Internationally?

As above, SBE - even though it is 15 years now since I first came across it when seeing FitNesse - I see as a shift missed by much mainstream agile development.

Newer models for automation which have caught on

largely as a result of Agile philosophies, have not in

my view run their course yet. I still see them as

having some stages of evolution (or perhaps

revolution!) to go through, and I think this is

reflected in the deep divisions in the testing

community and the broad spectrum of views from

the Rex Black/ISTQB to the James Bach and context-

driven people.

In parallel to this, an understanding that good testers

are valuable not because they perform repetitive

drudge-work that others don’t want to do, but

because they bring to a team a unique perspective

and intellectual approach that both supports the

team’s efforts towards quality, and provides the final

polishing and detail to both the understanding and

the delivery of the product. This changes it from one

of many so-so systems on the market to an

outstanding one that provides value and

even aesthetic enjoyment to customers and users.

NZTester: Do you have a testing horror story to

share?

More like a recurring nightmare: Rooms full of

people mindlessly executing inaccurate and

outdated manual test scripts. Managers asking

test managers how long it will be until the test team

has got the defects down to acceptable levels, when

every bug that is fixed introduces more than one new bug! A

move to outsourcing without allowing for the

indirect costs including but not limited to an

understanding of the mythical man month principles

now almost as old as I am!

I can’t be more specific due to confidentiality etc, but

most testers who’ve been around for a while will

have seen all of this.

Projects such as we’ve seen in the news regarding

teachers' pay systems, that repeat the same mistakes I

thought we’d learnt from a decade earlier with the

police system. These often seem to involve

management teams who don’t have a development

background, and fail to include someone who does,

compensating for that by outsourcing, trusting in

CMM/ISO/ISTQB or other acronyms. It also often seems to involve purchasing “vapour-ware” or

software requiring so much modification that it is

effectively “vapour-ware” and then not applying agile

principles, particularly the key one of getting

production quality deliverables ("thin vertical

slices”) quickly. If that was done the problems would

be revealed much more quickly and could probably

be resolved.

Editor’s comments: Thanks for making time

to write for us John. It’s certainly an evolving

world and sometimes the path moving forward is

fraught with hanging on to old adages that simply

do not make sense anymore. However that said,

I’ve always been a great proponent of ensuring

that changes are made for the better, whether it

be better quality, more timely delivery, lower costs,

rather than merely for the sake of change - Ed.


Recognising and Dealing with Professional Manipulation

by Geoff Horne, Editor, NZTester Magazine

Have you ever had a sense that maybe you have been played, set up or

manipulated? It’s not a nice feeling once you finally realise that this is

what might be happening to you. If you’re wondering why I’m writing

on this subject it’s because, like most of us, I’ve had my share of

allowing myself to be coerced, cajoled, persuaded, guided or whatever

this behaviour is disguised as, into thinking or making decisions that

I really didn’t want to make. I’ve probably also been guilty of putting

others in the same position, albeit unwittingly.

Manipulative people endeavour to make you side with them or do

something they want you to do that is to their benefit, without directly

asking. Mild forms of manipulation can be found in every ‘how to win

friends and influence people’ text ever written. Sales training courses

outline similar approaches and teach persuasion tactics and ‘closing’

strategies to get people to make buying decisions. Not that there is

anything illegal, immoral or otherwise about laying out information

for others to make decisions upon; however, true manipulation has a

far more sinister edge.

Manipulation uses guilt, shame, threats and other nasty tricks to

achieve its aims and most of us have experienced some form of this

emotional ‘blackmail’ at some stage on our journeys. We usually learn

to spot it and deploy defensive mechanisms for our own personal

safety and protection.

However when it’s done in a professional context, there are far more

subtle tricks out there that are not so easy to spot and we sometimes

fall for these unknowingly. Such mechanisms are often presented to us

in the guise of helping to improve our prospects and use professional

exclusivity, career limiting, opportunity disappearances and other

possible implications as a result of not following a particular train of

thought, method or practice. In many instances the people concerned

are not even aware that what they are doing is classed as manipulation,

genuinely believing themselves to be acting in the best of interests.

But then there are others….!

So where does it all start? We occasionally encounter people who have

adopted rigid positions, whether around religion, politics, academia or

any other particular thought disposition, where everything is seen

through the filter of their own convictions. Not surprisingly, they see

what they’re looking for and the more they use their theories for

making sense of things, the more things seem to fit those theories.

The more that seems to fit, the more sure of themselves they become

to a point where they cannot understand why others do not see things

the way they do¹.

¹ Hugh MacKay, Australian psychologist, sociologist, social researcher, writer and

former teacher.



Consequently, there often comes a sense of

entitlement to impose their perspectives upon

everyone else. Some of these people though are

smart and when doing so, they won’t be in the least

bit obvious. Instead they will use novel anecdotes,

new twists on old adages or intellectual mind games

to promote their world view. These come down to

the old illusionist trick of getting you to concentrate

on the primary object of focus and therefore not

notice anything outside of that frame. Coupled with

a dose of old-fashioned charm and the well-repeated

message that it’s improving everything for everyone,

the trap is set. I remember once being collared by a

lady selling cleaning products who demonstrated a

cleaner by polishing up a very old, well-circulated

NZ 2c piece (before inflation consigned that

particular denomination to the past) – a canned

presentation designed to validate her pitch. Maybe

if I was polishing up NZ 2c pieces for a living, I might

have bought it!

We often wonder how some people can be fooled

into joining religious cults et al. However at the root

of every cult is a charismatic and charming

manipulator who is able to bend the wills of those

who are i) so thirsty for knowledge and betterment

and ii) so dissatisfied with their status quo, that they

will allow their boundaries of mind, body and spirit

to be contravened, almost as if by hypnosis.

In every profession, we find rigid-thought

propagators who believe that they have developed

the next level of practice, the new paradigm, a higher

plane of ethics or professionalism et al. Out of a sense

of misguided entitlement, they will see the need to

press down on what they perceive as the lower

planes in order to promote their message, all in the

name of betterment.

They and those whom they are able to convince and

gather around themselves may form a distinct sub-

community and often a culture will develop that

promotes the agenda, lending further credibility to

the propagator’s message. The sub-community may

give itself a cool-sounding name or title which

obviously aids in its promotion and its spread only

serves to further confirm the propagator’s supposed

correctness. In more extreme examples, members of

the sub-community may allow it to mould both their

professional and personal identities – eg I am a this,

I am a that. In professional circles, more than in

cultic, if careful investigation is undertaken it often

transpires that the propagator is significantly

benefitting financially from the spread of the

‘movement’.

The difficulty is that there are nearly always

elements of ‘bonafidity’ present. However they’re

ultimately enveloped by an infrastructure which has

no real bearing on the validity of the practice and end

up becoming secondary to the culture and the

promotion of the sub-community that adheres to it.

Finally, exclusivity emerges and this is where things

get interesting. Unfortunately in extreme cases, it

has resulted in lawsuits, lost jobs and lost livelihoods.

Parallels can be drawn with masonic lodges, ‘secret

handshakes’ and general professional discrimination

against people outside of the ‘fellowship’. Examples

of these are aplenty in the health profession and

education systems. Multi-level marketing is riddled

them and we also see presence in social programmes

such as parenting, lifestyle enhancement and certain

addiction exit initiatives.

So what to do? Learn to recognise such behaviours

and structures then deploy strategies to establish

reasonable boundaries.

Common traits of manipulative people:

· Flattery and charm – this will be poured on

thick when the proponent believes it will help

to ingratiate themselves with a person. If that

person has been subject to unaddressed

abuse, denigration, victimisation etc that

flattery will be lapped up.

· Threats and consequences – these are

usually well-veiled: comments along the lines

of “well if you’re happy to stay in that place

then…..” or “of course, our people enjoy a

much higher level of …..”; words to the effect

that you will lose out or be significantly

disadvantaged by not following the line.

· High IQ and Intelligence – many

manipulators are of well-above average

intelligence. They often try to reduce their

world-views to models then define the

criteria upon which to base their discussions

and persuasions. They ‘win’ every argument

because they have defined the models from

within which they operate.

· Intellectualising and reasoning –

manipulators can make any argument sound


fair and reasonable especially if the listener

has been disaffected in some manner. By

getting you to focus on a specific train of

reasoning and convincing you that it’s the

only thing that matters, the manipulator can

successfully execute the illusion.

· Guilt and shame – a manipulator has a knack

of being able to turn the tables back on you

when challenged, playing the guilt card:

“well if you hadn’t been so….”, “maybe that

could have been averted if you….”, “no, that’s

OK it’s my fault that…..” etc. You may even

find yourself apologising to them!

· Fear and loathing – manipulators can resort

to demeaning, accusations, personal attacks,

credibility slating and behind-the-back

tactics. Swearing can be common as can

shouting and ranting with the intent of

creating fear and soliciting ultimate deference

to the manipulator.

· Self-righteousness and self-focus – a

manipulator has great difficulty in admitting

when they have made mistakes and in

accepting and articulating situations that

might expose faults. In short, they are never

wrong! If you want to break the flow of a

manipulator, simply tell them they are wrong and

see what happens!

These behaviours are tantamount to professional

bullying and justified in the name of betterment in

the same manner – I smack you around the head but

it’s OK because my way is right and yours is wrong.

The worst part about it is that followers become

accordingly convinced that it’s alright to behave in

this manner and start to inflict the same on others.

What to do when you realise you have been

hoodwinked in this manner:

· Extract and distance yourself from the

manipulator, the sub-community culture and

its followers. Do not attempt to fight or

change perspectives, you will only frustrate

yourself no end (this is the voice of

experience speaking). The only thing you can

change is your reaction to the behaviour and

focus.

· Understand that you have fallen for an

illusion and that the manipulation does not

represent a full reality and possibly only a

very small part of it. Know that plenty of

others have been taken in before you and

there will be plenty more after.

· Stand firm on what you know based on your

experiences and first-hand learnings. If need

be, seek out trusted counsel. Do not allow a

manipulator to take away your ability to

make your own decisions, form your own

opinions and follow your own path. Watch

out for the illusionist syndrome and above all,

maintain your professional confidences.

· Knuckle down into work with a renewed

vigour, knowing now that you have an

understanding of how and why these

situations happen and how you will be better

placed to handle them in the future.

Work should be enjoyable and you should be

surrounded by people with healthy mutual respect

for each other regardless of alignment. Those who

treat others with disdain because they differ

professionally have little respect for anyone apart

from themselves. If this was not true then there

would be a reasonable co-existence, a vive la différence, and not the continued opposition and

exclusivity that we see in some circles.

There is always a bigger picture. In professional

circles where the practices can be varied there will

always be some whose quest for betterment, no

matter how honourable the intent, will fall into the

trappings of manipulation and exclusivity. Once

caught up, it can be difficult to retreat to a more inclusive perspective; however, recognising the hallmarks for what they are, as opposed to how we may feel about them, may assist in withdrawing from these toxic dispositions and re-establishing the confidence to operate as effectively and as professionally as possible.


Testing @ Kiwiplan by NZTester Magazine Staff Writer

Just in case you might have been wondering, yes we

did run a Conference this year and yes it was a great

success. NZ companies were good enough to support

us by sending along their best and brightest;

however one company, Kiwiplan, sent along its

whole testing contingent of 21 testers!

I’d met numerous folks over the years who had been

with Kiwiplan during various stages of their careers

and from past experience, I knew of the innovation

that Kiwiplan had brought to its particular sector of

the packaging industry.

The company began life as part of the Kiwi

Packaging Company which manufactures cardboard

packaging and in particular the corrugated variety.

One of its earliest innovations was algorithms for

determining the best usage of raw material when

cutting shapes from sheets of cardboard to minimise

waste. Originally developed to run on Data General

minicomputer systems in the late seventies, these

became part-and-parcel with wider software

applications to deliver a complete software solution

for scheduling and planning of corrugated cardboard

carton manufacturing. Further developments

included continuous self-optimisation based on

information feedback loops and it is this intelligence

that sets Kiwiplan apart from other suppliers in

their field.

The software development entity was spun off from

the parent in the eighties and became the foremost

developer and supplier of its niche systems to

manufacturers around the world and more than

thirty-five years on, maintains its leadership

position in this field.

I paid a visit to Kiwiplan's offices in East Tamaki and was hosted by QA Manager Chris Burgess, who outlined

for me the way the company operates around

quality assurance and testing. In the New Zealand

office there are approximately 100 personnel, 90 of whom are either developers or testers. A further 70-odd non-development staff are located overseas, in the best proximity to the 500+ customer sites

throughout Europe, USA, South/Central America

and Australia.

Kiwiplan operates a rapid-waterfall type of

approach to software development, focusing on

smallish sprints of 2-3 weeks and a right-first-time

modus operandi. Consequently initial time and

energy is invested in specifying requirements and

developing functional and technical specifications.

That said, Chris was quick to point out that smaller

releases are folded into a continuous integration

environment that is powered by a home-grown test

automation framework unsurprisingly entitled

Droid. Via Droid some 5,000 tests are executed on a

cyclic basis every weekend. As releases are made to

the CI environment, regression tests are

automatically executed and when major six-monthly

releases are made to the customer-base,

confidence is therefore high that consequential

issues will be minimal.

As early test tool proponents, Kiwiplan found that

there were precious few tools available at the time

hence its propensity to develop these in-house, as per Droid. Consequently, test management, requirements tracking, defect management and other systems have all been home-grown to Kiwiplan's

exact requirements and refined over the years as

technologies have advanced.

It’s certainly an interesting proposition in the case

of code management. A lot of the early code was

developed using Fortran, which as a mathematically-

based language was ideal at the time for the complex

algorithms required. However, Fortran has long since fallen out of mainstream commercial use and, rather than reinvent

the wheel by recoding, Kiwiplan commissioned a

bespoke Fortran-to-C converter and found this to be

an extremely economic method of both future-




proofing its intellectual property and enabling

deployment to Unix and Linux platforms. All source

code from this era is still held and managed as

Fortran code.

Of course, as technologies have advanced into the

internet and mobile ages, Kiwiplan has made best

use of these for latter-day development initiatives

and the product set now includes web, iOS and

Android platforms as well as the more traditional client/server-based desktops. New development

initiatives include solutions for mass data

migration, data warehousing, machine integration

and middleware.

As mentioned above, the Kiwiplan testing contingent

is some 21-strong and growing all the time. While

external testing training is encouraged, the company

also fosters an internal programme for tester

education through mentoring and self-paced learning

online. Kiwiplan does enjoy a very low staff turnover

with many of the test team having been with the

company over 10 years with the average tenure

being around 7 years.

Kiwiplan’s tester hiring programme is very

thorough and it is expected that potential employees

demonstrate their abilities first-hand onsite. As the

company is well aware of the intellectual property

that can disappear through the back door when

experienced staff depart, all testing personnel are

permanent company employees.

Given that the main products are mature

applications, testing is approached more on a

traditional basis. However given that many of the

test team have been involved with the company for

many years, test cases and scripts as such tend to be

developed pragmatically with an emphasis on

accuracy and focus as opposed to verbiage; quality

over quantity.

There is also a strong collaboration ethos between

developers and testers with teams co-located by

product area and it was evident as Chris showed me

around the premises that many hives of activity

were in full flight.

In the Testing@ series we’ve covered a number of

NZ software companies including ikeGPS, TradeMe,

Orion Health and Fiserv (previously M-Com).

Kiwiplan has sat at the forefront of NZ software for

many years and shows no signs of relinquishing its

title as one of NZ's longest-established software

success stories. It appears to have resisted the

temptation to adopt the latest-and-greatest

methodologies in deference to pragmatic and

focused approaches that work best for its products,

environments, customers and personnel.

Kiwiplan is always looking out for talented,

motivated test professionals. If you like the sound of

light, focussed, hands-on teams, delivering world-

beating products every single day then let Chris

know; [email protected]

Wanna Get Published?

Our formula for selecting articles for publishing:

Good + Relevant = We'll Print It (well, digitally-speaking anyway)

Good = one or more of: thought-provoking, well-articulated, challenging, experience-based, technical skill-based, different perspective to mainstream, unique….

Relevant = one or more of: emerging trends, new technology/methodology, controversial (within reason), beyond the basics (eg. testing is good, defects are bad)….


Ever Had That Sinking Feeling?

by NZTester Magazine Staff Writer

Ever had that sinking feeling? You know the one, when you walk into

the office on a Monday morning and your colleagues are all grim-faced.

You figure that their weekend for some reason didn’t go so well then

you remember that during it, some of them were in working, installing

that big patch release that you were testing last week. Your heart jumps

into your mouth; what’s happened, what did I miss, did I cover

everything etc. etc. and very soon, you find out that…oops!

The Business Analyst demonstrates the issue and you know you had

covered this-this-and-this, this-this-and-that, this-that-and-this, that-

this-and-this, that-that-and-this, this-that-and-that, that-that-and-that

BUT did you cover that-this-and-that? And of course the user logged

onto the new release, entered that-this-and-that and…kaboom!

The Test Manager reminds you that you had two days to test this

feature and two days was (probably) sufficient. And even if it wasn’t,

the risk was understood in order to get the release out to the user, who

of course needed it yesterday. Plus everyone knows that curly, complex

issues do arise however as you are again reminded, this error was a

simple problem that any tester worth his salt should have found within

the first hour or so of testing. You feel inadequate as the Product

Manager walks past, eyes averted and seemingly emotionless

although if his thoughts were any more obvious, he’d be in counselling

and so would you! No-one else says anything yet you know you’ve let

the team down or at least that’s the way it feels. I mean, two days really

was enough wasn’t it and the problem was so simple after all and….

Does this ring a bell or twenty? Ever been here? It’s not nice. However

before you take a stroll down Harakiri Lane, there’s a few things you

might like to be aware of…

Everyone sees even the most simple of things, at least slightly,

differently. It isn’t possible to look through someone else’s eyes as much

as it would be useful sometimes. We can really only imagine how others

see and think, and the better we can imagine then the more we can

empathise. Take a coffee mug that is red on one side, blue on the other

and place it on a table between two people then ask what colour the

mug is; one will swear that it's red, the other blue. Add just a dash of

emotion and ego in there and hey presto, instant argument and

empathy strangely apparent by its absence.

Someone else’s perspectives and interpretations can tell a different

story yet we mere humans seem to love developing and following

procedures and processes that so often result in subjectivity being

ironed out. Creativity, innovation, experimentation et al seem to get

flattened in the course of process institutionalisation. In the example

above, there were two days of testing performed; however the reality

is that even if two weeks of testing was done, there’s no guarantee that

any or all of the defects would have been found. Remember that a test



result is just a snapshot in time; at that specific time,

in that environment, with that data, on that version,

with that configuration, in that manner…'this' is

what happened (or was it ‘that’?).

Different perspectives and interpretations can also

lead to quite different outcomes. Again in the

example above, someone else may have found the

error that was missed and still someone else may

not have come across the errors that were found.

That’s why the more we test, the more defects we’ll

find and if we test for two days, that’s what we get:

two days-worth of testing and defects. If anyone asks

if testing is complete/finished, you might want to ask

for a definition of “complete/finished” (tip: you might

need to excuse yourself for being a smart alec first).

If that means have we covered all the requirements

specified, it’s relatively easy to measure as long as a

traceability matrix of some sort is maintained. Does

this mean all the defects have been found? No. Does it mean that we've tested everything that needs to be tested? No. It just means that in this time, we covered

this functionality and found these defects.

Different approaches will yield different results. Let’s

have a look at an example: a while back, I had a

smallish enhancement to test for a web-based

application (after having been asked to take time out

from my usual test automation responsibility). Due to

a slight miscommunication, one of my colleagues,

Murali, also ended up assigned to the task. However

given that we did literally have two days to test, we

decided that four eyes were better than two (and no,

it wasn’t merely a case of me putting on glasses!).

Murali went to the specification and spent the first

day developing manual test scripts to cover each

requirement. He then spent the second day executing

those test scripts. I, on the other hand, decided to go

exploratory and spent the first day experimenting,

investigating and generally playing around with the

web pages. On the second day, I developed a bunch

of complex-type scenarios (note: not manually

scripted, just defined). I recorded these using an

automation harness then, so I could define my this-

this-and-that’s, I developed a .txt data file that was

read by the harness and the details spat into the

application. It all worked a treat and Murali and

I both found some “good” defects however,

guess what…

Only about 20% of our defects were duplicates.

Murali’s scripted approach and my investigative

method had both been successful in that we found

errors. Now, if Murali had been asked if he had

finished testing, he could justifiably respond that yes,

he had covered off the requirements as specified and

the defects he found had been fixed and retested.

However my defects would have still been there.

I could also respond to the same question by saying

yes, I had been through each page and field in the

application and applied the usual bunch of tricks eg.

field boundaries, maximum lengths, positive/

negative conditions, if-I-do-this-what-happens etc.

I could also say that I had combined a whole testing

magazine-full of complex scenarios executing

together and based on running those yes, I had

finished my testing. However Murali’s defects would

have still been there.
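For illustration only, here is a minimal sketch of the kind of data-driven harness described above. The scenarios.txt file format, the field names and the submit_form helper are hypothetical stand-ins, not the actual tool or application referred to in the story; the point is simply that one recorded scenario can be replayed with many combinations read from a plain text data file.

# data_driven_harness.py - hypothetical sketch of a data-driven replay harness.
# Each line of scenarios.txt holds one "this-this-and-that" combination as
# comma-separated field values.
import csv
import os

SAMPLE = "ORDER-TYPE-A,250,overnight\nORDER-TYPE-B,0,standard\n"


def submit_form(order_type: str, quantity: str, shipping: str) -> str:
    """Stand-in for the recorded automation step that drives the web page.

    In a real harness this would call the record/playback tool's API;
    here it just validates the inputs the way the application might.
    """
    if not quantity.isdigit() or int(quantity) <= 0:
        return "ERROR: quantity must be a positive number"
    return f"OK: {order_type} x{quantity} via {shipping}"


def run_scenarios(path: str = "scenarios.txt") -> None:
    """Read each combination from the data file, replay it and log the outcome."""
    with open(path, newline="") as handle:
        for line_no, row in enumerate(csv.reader(handle), start=1):
            if len(row) != 3:
                continue  # skip blank or malformed lines in the data file
            result = submit_form(*row)
            status = "FAIL" if result.startswith("ERROR") else "PASS"
            print(f"scenario {line_no}: {status} - {result}")


if __name__ == "__main__":
    if not os.path.exists("scenarios.txt"):  # seed a tiny sample data file
        with open("scenarios.txt", "w") as handle:
            handle.write(SAMPLE)
    run_scenarios()

The value of this style in the story above was breadth: once the harness existed, trying another combination was a one-line change to the data file rather than another hand-executed script.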

The real answer is that we finished two lots of two

days-worth of testing using our respective

approaches. And as fate would have it, the release

went to the client who promptly found a this-that-

and-this! Upon investigation it was found that the

failed feature had been covered earlier on in the two

day cycle by both Murali and myself. So either we had

both missed the defect or something further down

the line had changed, most likely another fix but,

who can say? Had we performed more testing, we’d

(probably) have found it but again, who can say?

However the perception was that we’d finished

testing and bang, client finds a defect straight away!

Doesn’t exactly spell “professional” in the eyes of

those peering in from the outside, does it?

So, there are a few lessons here:

· The more we test, the more defects we will

find. We’ll probably never find them all

however if we can track their frequency of

discovery (by severity or some other

appropriate categorisation) then we can gain

a rough indication of when it might be OK to

wrap testing eg. when continued testing

yields no further high severity defects for a

specific period of time; the defect frequency

reduces to zip and the lower severities to a

trickle (a rough sketch of this kind of tracking appears after this list). This approach also helps to avoid the

‘last minute fix’ scenario where final, rushed

fixes break all and sundry, effectively winding

the quality clock backwards.

· Different testers will find different issues.

I know from experience that I’m reasonably

adept at finding the curly issue that no-one

Page 17: NZTester · The full range but particularly we provide agile approaches to engagement, testing or test ... approach taught by people such as Gojko Adzic and pioneered partly by local

17

else will find yet I can sometimes miss the

obvious. For others it’s vice versa. Both

are needed.

· Completing testing doesn’t mean that

everything has been tested or that all defects

have been found and fixed. It may mean that

we’ve done what we set out to do or

otherwise, but that’s all.

· Varying the approaches to testing can pay

dividends; the experiences above indicate

that multi-pronged methods can be useful.

Both the experimental and the predestined

approaches work depending on your context; however doing both - now there's a novel

approach!

· Pair testing can pay dividends; another set of

eyes may see what yours do not. Peer testing

can pay dividends; a fresh set of eyes may see

what tired eyes do not. Commercial realities

can sometimes preclude however this is not

simply a case of two doing the work of one.

Think crowd-sourcing here!

· It’s also a cruel fact of life that things beyond

our control can change, yes it’s true. I have had

many situations where a release has been

passed through for implementation, someone

tweaks a parameter or updates a utility or

something, the release goes in and…..bang!

What can you do? Testers should not be held

responsible for those elements beyond their

control. It’s an unfortunate misconception that

because testing is so often considered the last

link in the chain that it’s the catch-all.

· The outcome of testing is always….information

(I really don’t know how many times I’ve said

that over the years); information about the

state of the product or system-under-test.

However it’s a fact of life that we testers will

always be asked whether we believe a product

is ready to go or not, so we have to be succinct

eg. “we specifically tested for these situations”

and “we did not specifically test for those

situations” (tip: have a good reason as to

why); avoid the “we don’t know”, “we think

so” etc. And your stress levels will go down as

a result…promise! Resisting the temptation to

merely say or even imply what others want to

hear, will ultimately be to everyone’s

advantage even if it doesn’t appear so at first.
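As promised above, here is a rough sketch of the defect-frequency idea from the first lesson. The severity labels, the time window and the "trickle" threshold are illustrative assumptions, not a standard; the point is only that a simple trend over recent discoveries can inform (never decide) when testing might be wrapped up.

# defect_trend.py - illustrative sketch of tracking defect discovery frequency.
from datetime import date, timedelta

# (discovery_date, severity) pairs as they might come from a defect tracker;
# the data below is made up purely for illustration.
DEFECTS = [
    (date(2014, 10, 1), "high"),
    (date(2014, 10, 2), "high"),
    (date(2014, 10, 3), "medium"),
    (date(2014, 10, 6), "low"),
    (date(2014, 10, 7), "low"),
]


def ready_to_wrap(defects, today, quiet_days=3, low_trickle=2):
    """Return True when no high-severity defects have been found for
    `quiet_days` days and lower severities have slowed to a trickle."""
    window_start = today - timedelta(days=quiet_days)
    recent = [(found, sev) for found, sev in defects if found >= window_start]
    recent_high = [found for found, sev in recent if sev == "high"]
    return not recent_high and len(recent) <= low_trickle


if __name__ == "__main__":
    print(ready_to_wrap(DEFECTS, today=date(2014, 10, 9)))  # True: highs have gone quiet

Treat the output as one more piece of information about the product, in keeping with the point below that the outcome of testing is always information.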

In summary, this testing profession of ours comes

with all sorts of lessons to be learned. Every journey

we undertake within it will always reveal more and

adding these to our arsenal of sneaky testing tricks

only serves to improve the overall positive impact we

can have on software development, and IT in general.

If we stop learning then we might as well retire and as

our experiences from these lessons grow, the more

valuable our “tester’s nose” becomes. Test on, people!


New Software Testing Training Programme Available in Auckland

SO Testing & Training has recently completed development of a software testing training course where

ground-up training is offered to students wishing to become professional software test analysts. The

programme teaches students the basics of structured software testing over a 12-week schedule and, being public and carrying NZQA (New Zealand Qualifications Authority) Level 7 Accreditation, is understood to be unique in New Zealand - no other offering of this nature is available here.

The programme has been developed in conjunction with SO Testing & Training business partner ITTI

which has a long association with the NZQA. The Accreditation process is a lengthy and formal one; however it does allow for students to apply for a student loan in order to undertake the programme. A 12-week programme allows students to become familiar with the pace of work in the IT industry, although it can be somewhat daunting for those who may need to put aside full-time employment to attend.

The course consists of three modules: theory, practice and communication skills.

· Theory - Software Development Life Cycle; ISTQB Foundation

· Practice - Test Documentation; Test Execution; Test Organization; Test Tools

· Professional Communication Skills - Written and verbal communication

The mix of these modules ensures that graduates are fully equipped for work as professional

test analysts.

During the course, students work in groups on both Waterfall and Agile projects in order to understand the

differences in development methods and specific tester roles and responsibilities within each. To start

with, test execution exercises are performed manually although later in the course, execution for both

functional and non-functional testing is undertaken using test tools. Current testing considerations eg.

mobile testing, security testing, cloud testing and exploratory testing are all covered within the curriculum.

In August 2014, a pilot kicked off with three students and by the conclusion in November, all had

successfully completed the programme, receiving both NZQA and ISTQB Foundation certification. The next

intake commences in January 2015 with registrations now open.

With this programme SO Testing & Training is hoping to raise the education levels of test analysts starting

work in the New Zealand IT industry. It also hopes to work in the future with other testing service

providers to deliver the programme to an even wider range of potential markets.

Interested students can contact Ed Ouwens at SO Testing & Training for further details by clicking here.


Tutor Ed Ouwens with students


The Testing Maturity Model by Richard Sims, Planit Software Testing

About ten years ago I was working on a testing programme back in the UK where the organisation was undergoing CMMi certification. During this I found out that there was a specialist certification for testing, which I came to know as TMM. Skip a few years and this is now the TMMi, so I did some investigation and uncovered arguably one of the best certification and measurement models that I have come across for defining the maturity of testing in an organisation. This is a summary of who and what the TMMi Foundation and TMMi are.

What is the TMMi Foundation?

The TMMi Foundation has developed and owns

the TMMi model. It is a non-profit organisation

which focuses its activities on maintaining the

only independent test maturity assessment model

and accreditation for individuals and organisations

to deliver professional TMMi assessment – see

TMMi.org.

What is TMMi?

The TMM (Testing Maturity Model) structure is

based on the Capability Maturity Model (CMM).

The concept was originally posited by the Illinois

Institute of Technology. In 2004 a group of

individual practitioners got together and generated

the TMMi model. This group later became the

original TMMi Foundation members.

TMMi is a test process improvement and

accreditation model which can be used to complete

formal and informal assessments, both of which use

the same maturity requirements (as per the levels

below); however the informal assessment does not

have a certification award associated with it.

The assessment consists of five levels; it is these levels that show the testing maturity of an organisation. Each maturity level is split into Process Areas for an Assessor to review. Within each process area there are Specific Goals (SG) and Generic Goals (GG) that must be achieved by an organisation.

The test maturity levels start at Level 1 – Initial and go through to Level 5 – Optimisation. The levels, and the process areas attributed to each, are outlined below.

The TMMi model can be applied across many testing domains world-wide. The main difference between TMMi and other test improvement models is its independence and its adherence to the most common international testing standards, which has made it the standard for test improvement and assessment.

Level 1 – Initial Definition

At TMMi Level 1, testing has no defined processes

and is mainly completed by developers. Tests are

developed in an ad-hoc way after coding is

completed. The objective of testing at this level is

to show that the software has no major failures; however, a side effect of this is that a software application may not meet the needs of the customer or may be unstable.

Level 2 – Managed Definition

At TMMi Level 2, testing is more of a managed

process consisting of a test team or test resource to

complete formal testing. One of the main objectives

of maturity Level 2 is that there is the ability for

repeatable testing. Test plans are also developed and, as part of this approach, should



include details of when testing will take place, how

testing will be completed and the testing resources.

Included at this level is also a certain amount of

reporting for management which is to ensure that

testing is going to plan and is tracked. There are still

defects and project related issues at this level as

testing is still later in the development lifecycle.

The process areas at TMMi Level 2 are:

· 2.1 Test Policy and Strategy

· 2.2 Test Planning

· 2.3 Test Monitoring and Control

· 2.4 Test Design and Execution

· 2.5 Test Environment

Level 3 – Defined Definition

At TMMi Level 3, testing is part of the SDLC and has

associated milestones. Test planning is still done in

this stage of the SDLC, although it is completed earlier.

The development of a master test plan builds on the

test planning skills and commitments acquired at

TMMi Level 2. The organisation now has a set of

standard test processes and a specific test training

program exists.

Organisations at level 3 now understand the

importance of reviews in quality control and a formal

review process has been implemented. At this level

the organisation has its own set of standard

processes and is able to modify these to each project

without impacting the testing processes and

procedures of the organisation.

The process areas at TMMi Level 3 are:

· 3.1 Test Organization

· 3.2 Test Training Program

· 3.3 Test Lifecycle and Integration

· 3.4 Non-functional Testing

· 3.5 Peer Reviews

Level 4 – Measured Definition

At TMMi Level 4, testing builds upon that

which has been completed in level 2 and 3. The view

of an organisation is that testing is now part of life in

a programme or project and is self-sustaining and

evolving where possible. It can also be measured

through its processes and accomplishments.

Measurements are also stored in an organisation’s

testing repository to support the decision making

that is required. It will also be possible to support

some predictions relating to test performance and

the cost of testing in programme and projects.

Reviews of artefacts are part of the test process and it

will be possible to measure quality earlier in the

lifecycle.

The process areas at TMMi Level 4 are:

· 4.1 Test Measurement

· 4.2 Product Quality Evaluation

· 4.3 Advanced Peer Reviews

Level 5 – Optimisation

At TMMi Level 5, the organisation has achieved all the previous levels of maturity and is capable of self-improvement based on its processes and procedures. The testing

methods and testing techniques are optimised and

there is a continuous focus on fine tuning and process

improvement.

The process areas at TMMi Level 5 are:

· 5.1 Defect Prevention

· 5.2 Quality Control

· 5.3 Test Process Optimization

As per TMMi, an optimised test process is one that is:

· managed, defined, measured, efficient and

effective

· statistically controlled and predictable

· focused on defect prevention

· supported by automation as much as is deemed

an effective use of resources

· able to support technology transfer from the

industry to the organisation

· able to support re-use of test assets

· focused on process change to achieve

continuous improvement.
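To make the structure concrete, here is a minimal sketch of how the level-to-process-area mapping described above might be captured for an informal self-assessment. The Level 2 process areas are those listed in this article; the checklist-style scoring is purely an illustrative assumption, not the TMMi Foundation's formal assessment method.

# tmmi_sketch.py - illustrative (not official) representation of TMMi levels.
# Only Level 2 is filled in, using the process areas named in the article.
TMMI_LEVELS = {
    2: [
        "Test Policy and Strategy",
        "Test Planning",
        "Test Monitoring and Control",
        "Test Design and Execution",
        "Test Environment",
    ],
    # Levels 3-5 would list their own process areas in the same way.
}


def level_achieved(level: int, satisfied: set) -> bool:
    """A level is only reached when every one of its process areas is satisfied
    (in a real assessment, via its Specific and Generic Goals)."""
    return all(area in satisfied for area in TMMI_LEVELS[level])


if __name__ == "__main__":
    evidence = {"Test Policy and Strategy", "Test Planning", "Test Environment"}
    print(level_achieved(2, evidence))  # False: two Level 2 process areas still open

Even this toy version captures the point that maturity is judged process area by process area rather than as a single score.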



In summary, an accredited TMMi assessment is one

of the most comprehensive assessments for test

process improvement covering all aspects of an

organisation's testing capability. For all organisations, even if they went through an informal assessment, this would give a detailed examination of

the testing capability and it would be possible to

identify the areas for improvement far more easily

than a TPI assessment can.

If you are in an organisation and can see the testing

team struggling and can’t put your finger on the

exact issue then it may be a good idea to get an

assessment done, it would be far more cost effective

in the long run.


TestAnalytics = TestIntelligence

Hidden in the depths of your test repositories, and not even accessible by your management tools, is testing gold!

What gold, you may well ask? Well, for example, your project might only want to know that all the test cases have passed and all the defects are fixed; in other words, whether the completion criteria have been met. However, what if on the Friday before your Monday go-live the team found and fixed 10 Severity 1 defects? The completion criteria would still be met, but what would your confidence be like and how would you communicate it? Better still, how could you pre-empt the situation?

The gold is information about your product-under-test beyond the garden-variety and vanilla-flavoured. And it's almost certainly sitting in your test repository or database, yet your management tools won't know it's there or even what it is, because they only count things.

Want to get it out and use it? We can help.

We’ve developed a set of tools and services to get this gold out from its hidden

recesses, onto paper and into the hands and before the eyes of those needing it.

Click below to find out more!

NZTesterMagazine

TestAnalytics

Find the gold and convert to cash!


Software Education Group

Press Release

5 November 2014

For immediate release

Disqover becomes a part of the Software Education Training Group

The Software Education Group has acquired Melbourne-based training company Disqover to provide a combined Software Testing curriculum for 2015.

Software Education is internationally recognised for providing comprehensive courses in software development training. The Australasian-based company today announced that Disqover would become part of the Software Education Group. Disqover is a top Australian specialist Software Testing training company with an industry-leading pass rate in International Software Testing Qualifications Board (ISTQB) certified training. The purchase of Disqover further increases Software Education's share of the software testing market through the expansion of training capabilities.

Toby Thompson, Founder and Managing Director of Disqover, will become the Software Testing Practice Lead for the Software Education Group and had this to say about the acquisition:

"Our focus will remain on delivering the highest quality software testing courses. In the current market, Software Testers and software project team members alike require multidisciplinary skills to work effectively in cross-functional teams. The combination of offerings will allow our customers the opportunity to leverage a suite of courses with improved depth of coverage in 2015."

Disqover will continue to operate under the same name as a specialised brand of the Software Education Group. A combined curriculum of Software Testing courses will be available in early January 2015.

Managing Director of the Software Education Group, Martyn Jones, is excited about the possibilities that a partnership with Disqover will provide for the future, saying:

"We are thrilled about having Toby and Disqover on board with the Software Education Group, increasing our software testing practice to include more publicly scheduled courses across Australasia as well as new course offerings, providing more options for our customers."

Ends.


About Software Education

Established in 1990, Software Education is an independent company offering training and consultancy services across all sectors of the software development life cycle. SoftEd provides world-class training for software development teams in Australia, NZ, USA, India, Saudi Arabia, Canada and Singapore. SoftEd's mission is to provide customers access to leading-edge content and connect clients with an unrivalled network of international software development experts.

Website: www.softed.com

About Disqover

Melbourne-based company Disqover is a highly regarded, specialist software testing training company offering a range of ISTQB certified courses. Disqover is fully accredited by the Australia and New Zealand Testing Board (ANZTB) and all trainers have been approved by the ANZTB to deliver International Software Testing Qualifications Board (ISTQB) certified courses. The company's wide range of training solutions addresses the issue of software quality in large public and private sector software development groups. Disqover was founded by Toby Thompson and has a strong reputation in the Australian marketplace for providing high quality courseware and excellent customer service. Disqover offers its courses publicly throughout Australia as well as in-house.

Website: www.disqover.com.au

For more information or further comment please don’t hesitate to contact

Martyn Jones

Managing Director

The Software Education Group

+64 21 590 254

[email protected]


I’m really fortunate that I have one of the best jobs

anywhere. I love my job and all the experiences I

have currently, and have had performing it!! I’m a

people manager; I manage and lead a team with job

titles that involve the word ‘test’. Test Manager, Test

Leads, senior and junior Test Analysts, Test

Engineers, to name a few. What a great bunch they

are; intelligent, intuitive, inquisitive, problem solving

and all with a raft of hidden talents waiting to be

identified and developed. Who wouldn’t appreciate

this environment to work in and with? Personally

I’m passionate about it and did I say I love my job???

IT is a people business after all and a computer still

can’t turn itself on at the wall socket, well not yet

anyway, My passion for my role is heightened when

I read and establish that nearly all companies

stipulate that their most valuable asset is their

people. I believe that this is a true mantra expressed

and desired by the senior management and tasked

to their people leaders to bring this about. But alas

experience and past team members indicate that

sometimes putting this into practice is

something else.

As a people manager, I believe that I have been given

a lot of responsibility for the development,

engagement, well-being and success of my team.

The reason I know this to be a fact is that it is

written into my job description and is one of my key performance indicators. So it confounds me that

many of my peers accept this responsibility and do

not seem to follow through, for whatever reason, and

we end up with disgruntled, disengaged staff who

in turn tell other staff members. This denigrates the

company, the team and the people manager role in

my opinion, as well as making sure the engagement

surveys we complete look like last year's and the year before's, you know the one… senior management not

up to scratch, team great, no training and no

development. The sad thing is we can make a change

here if we can actively carry out the roles we have

been entrusted with.

I often hear that people management is difficult, yet that's why the job promises so much: the difficult or disillusioned person being encouraged to consider new and interesting jobs and roles, new leaders being found and grown, younger staff being encouraged and their confidence built, unexplored skills and

interests grown and encouraged, and all staff enabled

and empowered to be the best they can be in what

they seek from a career in IT.

We are all different and do things in different ways

but in my view there are some things that need to be givens in order to establish, grow and maintain the relationship. They need to be fundamental items in the people manager's toolbox, without which you are unlikely to succeed in this role.

Respect is first cab off the rank. If you don’t respect

your team and each individual in it, you will be found

out and found wanting. This is an earned item, not

attained just because you are the manager. It is also

important to remember that you can lose this just

as fast as you gain it so it needs to be worked on

constantly. Take nothing for granted. It is easy to

assume and tell ourselves that we respect our staff

but you may be surprised by how this is perceived by the team or individual; take this lightly at your peril.

Honesty is next and this is about being upfront and

honest with all staff, particularly around difficult

times or when making difficult decisions. Most

people want to know that they are doing well and

when they are not. Waiting until a performance

review period every six months doesn’t really cut it.

A staff member said to me once “I don’t like your

message but I like the fact you are telling me”. This

has resonated with me for a long time now. People

can deal with the message if they know what it is that

is being conveyed and that it is done with honesty

and respect.

Acknowledgement follows. We all like to feel we are

making a contribution, yet many people go years without ever having this communicated. Being given

the “you are doing a good job” once a year at

performance review doesn’t generally cut it without

a specific item to hang your hat on. Just as bad is



saying to staff “you have done well” every day of the

week without any sincerity or honesty because you

feel you have to. What was it that they did to make the contribution? Hardy-annual lines like "you did well on the project" don't generally tell the team member anything. Be specific and be proud. If they

have done a good job, acknowledge it and mean

it…..all good work deserves to be acknowledged.

Finally, the last item is the joint one of

communication and interest. Do you know the names

of staff members' partners, the sports they play, children, interests they participate in, strengths, thoughts about the workplace, ideas for betterment or their development, work experiences that they have had? These are people here, folks, with the same needs as you and me, and taking an interest in them is in my view a given. You are charged with shaping careers, setting standards and expectations, and growing the people, yet often we don't take the time to find out about

them. Sounds easy, but doesn't happen often. This is

really important and I know this because many of

my team have told me so.

Respect, honesty, communication, interest and

acknowledgement are the predominant items that

are put forward by staff I have worked with when

I have asked them to identify their expectations of

their managers. Ironically it is not lost on me that

this is what I would also expect of my managers.

I believe that the people management role is a

challenging one and requires a variety of toolbox

skill sets to make a great fist of the role. But the

aforementioned items will help you go a long way

to enjoying your role and the team enjoying your

management. Yes I really do love my job!

Editor's comments: do you get the impression Sid

enjoys his job?? It’s refreshing to hear a Test Manager

relate in this manner; how often do we sit there and

bemoan too many defects, not enough time, such poor

software quality et al. Thanks Sid!

Sid Holmes is a Test Delivery Manager with AMP

and based in Wellington. His experience includes

stints with ANZ National Bank, EDS and Unisys

among others. He can be contacted at

[email protected]


Mobile Testing: Challenges and Ten Best Practices for Developing Your Own Capability
by Peter Bink, Capgemini

The testing world will change!

Mobile has become a game-changer across all

industries. Organisations around the world now

drive significant value through providing continuous

access to services, anytime and anywhere - to

millions of customers and thousands of employees

over an astonishing array of devices.

Research suggests that the use of mobile devices will

grow from 10 billion now to more than 30 billion by

2020. [Reference]

It is predicted that the global mobile banking

industry will grow to 1.1 billion customers by 2015.

It is estimated that 39% of all new tablet users will

use them for banking. Gartner forecasts that the

worldwide mobile payment market will have over

450 million users and a transaction value of more

than USD $721 billion in 2017. This represents

compound annual growth rates of 18% and 35%

respectively for the period 2012 to 2017 [Gartner,

Forecast Mobile payment]. We are also seeing growth in the mobile portfolios of utilities, government and retail companies.
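As a quick sanity check on those figures, the standard compound annual growth rate formula can be used to back out the implied 2012 baselines; the numbers derived below are illustrative arithmetic only, not figures quoted in the report.

```python
# Illustrative arithmetic only: the 2012 baselines are derived from the quoted
# 2017 forecasts and CAGRs, not taken from the Gartner report itself.

def implied_baseline(end_value: float, cagr: float, years: int = 5) -> float:
    """Back out the starting value implied by an end value and a compound annual growth rate."""
    return end_value / (1 + cagr) ** years

users_2012 = implied_baseline(450e6, 0.18)   # ~197 million mobile payment users implied for 2012
value_2012 = implied_baseline(721e9, 0.35)   # ~USD 161 billion transaction value implied for 2012

print(f"Implied 2012 users: {users_2012 / 1e6:.0f} million")
print(f"Implied 2012 transaction value: USD {value_2012 / 1e9:.0f} billion")
```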

There is an increasing awareness among senior

business and IT executives of the significance of

mobile application testing. In today’s world, the

quality of IT solutions has a much stronger and

more immediate impact on business results. A mobile

application failure often translates to a business

process failure and will be obvious to the end user,

and can damage the corporate reputation.

Mobile will have a big impact on the testing

discipline: next to ‘normal’ functional testing there

needs to be additional focus on security and the

complete user experience, such as performance

and ease of use, and this needs to be tested across a

multitude of platforms and devices, often in a fast-paced or agile environment. The skills for mobile

testers tend towards the need for experience with

agile frameworks, test automation, continuous

integration, performance and security testing.

This is another example of the trend towards more

technical testing over the traditional functional

testing role.

Mobile Testing Challenges

Capgemini sponsors the annual development and

publication of the World Quality Report. This report

is developed by Capgemini, Sogeti and HP Software.

Information is taken from the results of 1,543

interviews with CIOs, IT directors/managers, VP

of applications and quality assurance directors/

managers across 25 countries. This report shows

that many companies are ill-equipped to handle the

complexity and scope involved with testing for and

with multiple mobile platforms, devices and services.

Even those organisations with a solid testing

foundation encounter challenges with mobile testing.

Summary of the challenges:

· Not enough time to test

· Lack of methods, solutions and experts specific to mobile testing

· Lack of devices and operating systems for testing

· Lack of the right mobile testing tools

· Rapidly changing technology landscape, such as device types and operating systems, leading to a continuous need for updates

· Lack of re-usable assets and frameworks

· Performance, usability and security require specific attention, which requires specialised skills not often part of the core testing competency



Success Factor #1: User Experience

User experience is a main criterion for success.

Testing user experience is an art that consists of

both objective and subjective measurements. Key

elements include:

· Adherence to platform rules and guidelines, as dictated by app store review guidelines

· User interaction – level of intuitiveness and efficiency

· Navigation – adherence to platform rules and guidelines

· Signup and login – clarity and ease of use

· Layout and user interface design – look and feel, aesthetics, and precision in layout

· Exception handling – user-friendly messages and graceful exception handling

Success Factor #2: Test Automation

The ability to automate tests using scripting or

recording makes mobile testing significantly more

efficient. Smart test automation utilises

parameterised scripts for both device types and user

interaction, and is enabled to run on devices in-house

and in the cloud. Some sophisticated user

interactions involve touch, gestures, and sensors

which cannot be fully automated, and, as a result,

will require manual testing.
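To make the parameterised-script idea concrete, here is a minimal sketch assuming the Appium Python client and pytest; the device profiles, app path and element identifier are hypothetical placeholders rather than a recommended framework.

```python
# Illustrative sketch only: assumes an Appium server at localhost:4723 and the
# Appium Python client; device profiles, app path and element IDs are hypothetical.
import pytest
from appium import webdriver

DEVICE_PROFILES = [
    {"platformName": "Android", "platformVersion": "4.4", "deviceName": "Nexus 5"},
    {"platformName": "Android", "platformVersion": "5.0", "deviceName": "Galaxy S5"},
]

@pytest.fixture(params=DEVICE_PROFILES, ids=lambda caps: caps["deviceName"])
def driver(request):
    # The same script is parameterised by device profile rather than duplicated per device.
    caps = dict(request.param, app="/path/to/app-under-test.apk")
    session = webdriver.Remote("http://localhost:4723/wd/hub", desired_capabilities=caps)
    yield session
    session.quit()

def test_login_screen_loads(driver):
    # One scripted interaction, executed unchanged against every device profile.
    assert driver.find_element_by_accessibility_id("login").is_displayed()
```

The same parameterisation can point at devices in the cloud simply by swapping the local server URL for a device-grid endpoint.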

Success Factor #3: Performance

Mobile solutions can overload servers, through the

rapid growth in the number of users and an increase

in the average life span of transactions due to

variations in bandwidth and latency. This can have

a performance impact on all users. Also, there are

significant performance differences across different

types of devices. An app or responsive web may run

well on a high-end device but may not be acceptable

on a low-end device.

Successful mobile testing focuses on network

capabilities, system integration and back end layers,

as well as the app itself.
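As a very small illustration of looking beyond the app itself, the sketch below fires concurrent requests at a back-end endpoint and reports response-time percentiles; the URL and load figures are placeholders, and a dedicated load-testing tool would normally be used for serious performance work.

```python
# Minimal concurrency sketch; the endpoint and load levels are placeholders.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

ENDPOINT = "https://example.com/api/accounts"   # hypothetical mobile back-end API

def timed_request(_):
    start = time.monotonic()
    response = requests.get(ENDPOINT, timeout=30)
    return time.monotonic() - start, response.status_code

# Simulate 50 concurrent "mobile" clients making 500 calls in total.
with ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(timed_request, range(500)))

durations = sorted(duration for duration, _ in results)
print("median response time:", durations[len(durations) // 2])
print("95th percentile:", durations[int(len(durations) * 0.95)])
print("non-200 responses:", sum(1 for _, status in results if status != 200))
```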

Success Factor #4: Security

Mobile solutions are involved in an increasing number

of scenarios where sensitive systems are accessed

and private data is in transit or at rest, that is, stored

in mobile devices. And given the physical nature of

mobile devices, they are more easily forgotten, lost,

or stolen. Mature mobile testing aims to:

· Secure confidentiality and integrity of data

· Validate whether authentication and authorisation are secure

· Verify that systems are keeping records of events (non-repudiation)

Success Factor #5: End-to-End Integration

Testing

As mobile solutions and apps become increasingly business-relevant, the scope and depth of

transactional features expand and so does the need

for well-designed and validated system integration.

Integration testing makes sure all the components

are working together properly and that the

interaction between core enterprise systems, like

CRM and ERP, and external interfaces are seamless.

Mobile Device Management (MDM) is a system that is

often part of a full end-to-end integration process.

When specific systems are needed to remotely

manage a fleet of handheld mobile devices, the

mobile solution should be validated and tested

against the main functionalities of the MDM systems:

· Device technologies supported

· Range of embedded applications supported

· IT policy control

· Device security enforcement

· Management of connected devices

· Third-party applications control

Success Factor #6: Connectivity-Related

Testing

Most mobile solutions depend on some kind of

network connectivity. Solution design and testing are

needed to address variable bandwidth, offline and

flight mode scenarios, and validate user sessions

moving between different network conditions. Both

automated network simulations and manual testing

in real network conditions are required to ensure

consistent behaviour.
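By way of illustration, and assuming the Appium Python client's Android network-connection support, one such scenario might be sketched as below; the start_sync and assert_synced helpers are hypothetical, app-specific stand-ins.

```python
# Illustrative only: assumes an existing Appium session ("driver") against an
# Android emulator where the network-connection API is supported; the two
# helpers passed in are hypothetical, app-specific functions.
from appium.webdriver.connectiontype import ConnectionType

def sync_survives_connection_change(driver, start_sync, assert_synced):
    """Start a sync online, drop connectivity mid-flight, then restore it and
    check that the app recovers rather than losing data."""
    driver.set_network_connection(ConnectionType.ALL_NETWORK_ON)
    start_sync(driver)
    driver.set_network_connection(ConnectionType.AIRPLANE_MODE)   # simulate flight mode
    driver.set_network_connection(ConnectionType.WIFI_ONLY)       # come back on a different network
    assert_synced(driver)
```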


Success Factor #7: Understanding Physical

Characteristics

The physical characteristics in mobile solutions differ

significantly from non-mobile solutions. Mobile

testing needs to take several of these characteristics

into consideration:

· Screen size

· Touch and gesture capabilities

· Orientation (vertical or horizontal) and movements in three dimensions

· Camera

· GPS

Some testing scenarios related to these physical

characteristics can be automated, for example,

orientation changes; but some scenarios do require

manual testing, for example, synchronisation of

gestures and sounds.
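For instance, an orientation check of the kind that can be automated might look like the hedged sketch below, assuming an existing Appium session; the element identifier is hypothetical.

```python
# Illustrative only: assumes an existing Appium session ("driver") on a device
# that can be rotated; the "checkout" element identifier is hypothetical.
def layout_survives_rotation(driver):
    """Rotate between landscape and portrait and check a key control stays visible."""
    for orientation in ("LANDSCAPE", "PORTRAIT"):
        driver.orientation = orientation
        assert driver.find_element_by_accessibility_id("checkout").is_displayed()
```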

Success Factor #8: Location Simulation

An ever increasing number of solutions utilise

location data and GPS integration. In these solutions,

features are designed to depend on location or

distance to other locations. Successful mobile testing

ensures quality across different types of GPS

implementations and needs to utilise efficient

location simulation.
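A minimal sketch of location simulation, assuming an Appium session against a device or emulator that allows mock locations, might look like this; the coordinates and element identifier are placeholders.

```python
# Illustrative only: assumes an Appium session ("driver") that permits mock
# locations; coordinates and the element identifier are placeholders.
def nearest_branch_updates_with_location(driver):
    """Simulate two different positions and check that a location-dependent result changes."""
    driver.set_location(-36.8485, 174.7633, 10)    # roughly Auckland
    auckland_result = driver.find_element_by_accessibility_id("nearest-branch").text
    driver.set_location(-41.2866, 174.7756, 10)    # roughly Wellington
    wellington_result = driver.find_element_by_accessibility_id("nearest-branch").text
    assert auckland_result != wellington_result
```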

Success Factor #9: Dealing with

Fragmentation

The market fragmentation for both operating

systems and device types continues to challenge

solution design and testing. Most solutions, both

internal and external, need to support hundreds of

device types and several versions of operating

systems. A relevant mobile testing tool box includes:

· Physical access to the major device type and operating system combinations

· Ability to run manual and automated tests across both physical devices and emulators

· Access to cloud-based platforms to maximise the number of devices that can be tested (a small illustrative sketch of such a device/OS matrix follows)
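One illustrative way to keep such a matrix manageable in code is to generate the device and operating-system combinations once and feed them to whichever runner executes the suite, locally or against a cloud device grid; every device name, version and URL below is a placeholder.

```python
# Illustrative only: device names, OS versions and the grid URL are placeholders.
DEVICES = ["Nexus 5", "Galaxy S5", "iPhone 5s"]
OS_VERSIONS = {"Nexus 5": ["4.4", "5.0"], "Galaxy S5": ["4.4"], "iPhone 5s": ["7.1", "8.1"]}
CLOUD_GRID = "https://grid.example.com/wd/hub"   # hypothetical cloud device grid endpoint

def capability_matrix():
    """Yield one capability set per supported device/OS combination."""
    for device in DEVICES:
        for version in OS_VERSIONS[device]:
            platform = "iOS" if device.startswith("iPhone") else "Android"
            yield {"deviceName": device, "platformVersion": version, "platformName": platform}

if __name__ == "__main__":
    for caps in capability_matrix():
        # In practice each capability set would be passed to the automated suite,
        # either against local devices/emulators or via CLOUD_GRID.
        print(caps)
```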

Success Factor #10: Optimising Third Party

Review

Most apps are distributed through open and public

app stores, each with its own set of guidelines. Apps

that fail to adhere to guidelines may be rejected. And,

given the third party review process, there is a bug fix

latency inherent in updates. Also, when operating

systems are updated, the new releases can break

existing apps. This means that successful mobile

testing must consider:

· Using common testing checklists based on the most recent app store rules and guidelines

· Using rapid testing cycles for updates in order to minimize the impact of bug fix latencies

· Testing existing apps on beta versions of operating systems

Conclusion

Mobile solutions are increasing exponentially across

all domains, not just within the financial sector.

Capgemini New Zealand have seen this demand

across government, health and financial sectors.

Testing for mobile applications has a different set of

challenges and success factors than traditional

functional testing. Any organisation that is required

to undertake mobile testing should also consider

some investment in people, process and tools to

enable efficient testing solutions. This may require

seeking specialised services to support the setup of a

mobile testing capability, to undertake specialised testing such as security, and to provide mobile tooling experience. This is an exciting business to be in but is not for the faint-hearted.

Peter Bink is a Senior Manager within Capgemini

New Zealand with more than 19 years' experience in a broad range of roles across the whole SDLC in Europe, Australia and New Zealand for many different

industries. He can be contacted on

[email protected]


Review: Let's Test Oz
by Mike Talks, Datacom

The Personal Journey

There were a lot of nerves on my part leading up to

the Let's Test Oz conference this week. I’d never

been to Australia and somehow quoting Crocodile

Dundee doesn’t seem the best way to win friends and

influence people.

What I'm going to do is pick up on a few of the best

moments and revelations from the conference, to

condense my experience ...

The Train Ride

I had a two hour train ride in the morning with tester

Kim Engel, fresh from our conversation about flat

earths. It was a great journey and a great

conversation to get me in the mood - I almost talked

myself hoarse. It reminded me that one of my

favourite pastimes with my son is travelling, because

whether hiking or driving, it gives you opportunity to

talk and explore. With my son it's about exploring

history and ideas around it. With Kim it was

exploring aspects of testing and mental health that

we've both had personal journeys with.

With such conversations it's actually often a

disappointment when you reach your destination,

because you've enjoyed the journey too much.

Yup - I was actually slightly sad to arrive (but not

for too long).

Coaching Testers Workshop

I've spoken previously about James Bach and Anne-

Marie's workshop on coaching testers. It had some

great ideas circulating and James had found some

examples from movies of people coaching others.

I have to say I feel slightly ashamed that I have not

yet watched the Magnificent Seven despite liking

Westerns - I need to remedy this at some point!

Great focus was given to understanding yourself

as a coach, who you are and what your behaviour

is. How do you interact with people? What's

important to you in others? Then to look at the

person who is looking for coaching and asking

what they need in a coach. Sometimes that's not

you. James and Anne-Marie talked frankly about

their coaching and that occasionally they will

recommend an individual goes to the other

for coaching.

As always with a good workshop this included

reams of hands-on. We logged in anonymously to

Skype and we got to work with other people in

the room, taking turns to be the coach and to be

the student.

I'll be absolutely blunt, I thought the person I was

coaching knew me and was playing a game. I kept

getting incredibly frustrated but trying to be calm

with them. When it came to debrief it turned out

that the person I paired with had never used Skype



before and was a slow typist. I learned a valuable

lesson there that especially online, you need to get

some form of check-in about the student, how they

feel and what their backstory is, rather than "leap

into" the coaching.

It's a lesson I really should know but it's amazing

how the session helped to reinforce that and instead

I made my own assumptions that it was someone

trolling me.

So you wanna be a boxer?

SoftEd ran a couple of boxing training sessions which

were absolutely superb. I occasionally do something

similar in Wellington (I've talked about Josette, one of

our instructors here).

Boxing training is a very intimate kind of

training. You pair up with someone and take turns

doing exercises, with one person using gloves to hit

and the other holding the pads being hit (you hit the

pads, not the person!). You have to both be mentally

in a similar zone and develop a kind of rhythm with

each other.

That makes it oddly quite a social activity. The best

kind of pairing is when you're both supporting the

other with "try moving your stance", "we're half way,

don't give up", "c'mon, keep going, nice" - it was some

of Bach and Charrett's coaching tips in miniature.

Half the success of any conference like this is being

able to mingle with people who you don't know. Meet

new people, make new allies. The boxing sessions

proved to be a great "meet and greet" event, with

conversations with other boxers strung out through

the rest of the conference.

Which brings me to this tweet …

The boxing was one method (I'll talk about the other

in a moment), but there were the experienced

keynote speakers touring the conference and there

were other speakers such as myself. But mingling

and listening I realised something important,

"everyone has an experience report and a story

inside them, just keep your ears open".

As I said in that tweet, in some ways listening to the

raw stories from others was a great opportunity to

really spread my net over the conference.

Put yourself around people with passion

The other method of putting myself around a bit

came at lunchtime. It all felt a bit like being at high school with "who shall I sit with?". Occasionally I just

really needed to eat and dash, and I'd just sit

alone. But I tried to use it as an opportunity to sit

with new people and introduce myself.

I did, however, earn myself the "special little snowflake" achievement on Monday lunchtime - finding a

table of people I didn't know and asking if it was okay

to sit with them. They were from a completely

different conference!

But this actually led to a bit of a revelation. They

were a conference of ultrasound operators and were

curious about who we all were. So I got to tell them a

little about software testing as well as ask them a bit

about their conference.

My interest was heightened a bit. Recently my

brother and his wife had a daughter. One thing that

surprised me was the ultrasound. When my son was


"under development", we had a couple of ultrasound

pictures of him and they were a bit like one of those

3D puzzles. If you stared really hard, you could perhaps make out a skull.

But for Thea, her pictures were strikingly clear, the

technology had really come on in leaps and bounds. So surely being an ultrasound operator was

a lot easier now? Wrong.

Turns out they do more and more checking with

ultrasound as it's such a non-invasive

procedure. They were spending a conference looking

at example images of different ailments such as a

damaged appendix. In their normal routine they

might not see some of these examples so it was all

about improving their ability to look at and

recognise issues. They were using the conference to

broaden their experience from other operators so

when they went back to work on the Thursday they

were just that bit sharper.

The improvements in technology made some things a

lot easier. But at the end of the day it required a

human eye and human judgement. And damn it -

wasn't the same true about testing's relationship to

technology over the same timeframe? Some things

had got easier but at the end of the day, it's about the

human eye and human decision making.

This led me to an important understanding - I learn a

lot about testing but not always from testers. I'm

quite a talker but I'm a good listener too. In fact,

when we go touring around New Zealand, my wife

despairs of me as I really enjoy going into quiet shops

and talking at length to the owners about where we come from, and finding out some of their history.

I am attracted to spending time with people with a

passion and an ability to animatedly talk about that

passion. It doesn't have to be anything I'm interested

in - in fact often it helps if it's not. Just this year I've

spun pieces off from conversations with Josette my

boxing instructor and Lotz my musician friend.

Testing has a good few parallels and if you listen out,

there is knowledge out there from people going

through similar experiences which you can shanghai

and add to yours!

An interesting chat with Erik Peterson leads to

some self-reflection ...

I had an interesting lunchtime chat with Erik

Peterson where we talked about heuristic models for

testing. I came to the realisation that I'm heavily

dependent on an "experiential model" (although I do

use others) - basically "when I used to program, I

once saw this happen" or "I've seen a bug like this

in a similar system".

That's of course okay, as long as you realise it's

fallible. And its greatest fallibility is you aren't

looking for a bug you've never experienced before -

you have a blind spot to anything you've never seen

or heard. It also made me look at some of my writing

- overall my writing heavily leans towards a "series of

experience reports", occasionally postulating a model

from this experience.

It's an interesting look in the mirror at my way of

thinking - also tying into my previous post about

trying and failing. It's like I'm drawn to having a pool

of experiences to base judgements on.


My talk

Yup - I'm not being egotistical here but not only did I

enjoy giving my talk on "deprogramming the cargo

cult of testing" but to my shock, I walked out with an

expanded take on it. Some of the questions asked

allowed me to think and explore the subject in ways

I'd not expected.

The topic was really talking about the system of

testing we've put into place over the last 12 months,

and I talked about it back in my piece on exploratory

testing. We put together a new way of testing when

we moved to being agile but we engaged with our

customer to talk to them about what they felt they

got out of our old methodology? What did they feel

they get from a test plan, a test script or a test report?

The point of this was to form a matrix of values from our customer - this meant that whatever approach we took for testing, it needed to address these values

in some manner. If it didn't then we weren't done

with our approach, it wasn't hitting the needs and we

needed to rethink. But not only that, we had to make

sure we were making "how our testing worked"

visible to the customer.

An example of this would be how the customer saw

test scripts both as "proof of testing" and "training

material". We ended up using qTrace to record our

sessions as "proof of testing" and, for "training material", sharing an internal testing handbook we

already had and making sure we kept it up to date

sprint on sprint.

Someone noted the piece tied in a bit with Keith

Klain's keynote where he talked about avoiding being

overly whiney or self-centred about testing's

problems, but understanding that the person you report to "has problems and needs" that you don't know of, and trying to go to them not with more problems but with help and aid.

The bottom line to this approach was that we made

sure we had an evangelical fervour to delivering real

value to our customers in a testing approach that we

felt accurately addressed their needs. In this I really

was pleased we seemed to carry on the spirit

of Alessandra Moreira's talk about engaging and

influencing people. In fact the conclusion from our

talk was it would be a mistake to wait for a major

shift from waterfall to agile before engaging with a

client to ask if the testing you're performing is really

"ticking the boxes" from both the client and the test

team point of view.

Final thoughts ...

These are the things that really stuck with me - a very

interesting conference with a lot to take home. I

sadly missed the Fiona Charles keynote at the end of

the Wednesday which I was looking forward to.

There was a lot to take in but also fun to be had along

the way - one of the funniest activities was Joanne Perold and Carsten Feilberg's workshop where we replicated problems in communication by using a Lego building exercise to mirror the

software building process. This was an exercise I

would love to try again with different rules to see if

it causes some of the outcomes I expect. Likewise

the boxing and the coaching activities were nicely

hands on.

The team behind Let's Test Oz really did an excellent

job in making this happen - the venue and food were

amazing, everything ran well, and everyone seemed

to come to the conference ready to really share and

engage. I made sure before writing this that I sent an

email to the key players, asking them to circulate to

all who needed to read it.

Great work guys!

Mike Talks is a Test Manager with Datacom in

Wellington and a regular contributor to NZTester

Magazine. He can be contacted at

[email protected]


Testing Events

If you have an event you’d

like to promote on this page,

email [email protected]

Coming from NZTester Magazine in 2015:

NZTester Magazine Testing Leadership Summit

- Wellington, March 2015

NZTester Magazine Conference

- Wellington, August 2015

The Great Big NZTester Magazine Road Trip - August/

September 2015 + regular meetups in Auckland, Hamilton, Wellington & Christchurch,

others as requested


85% of all software tests are still being performed manually. Is there a way to significantly improve manual testing and be able to work 2x faster? This free webinar is designed to change the way you perform manual testing. "Manual Testing 2x Faster in Word and Excel" focuses on new ways to reduce your manual test effort. Join this webinar to learn how to:

· Build, execute and analyse software tests directly in Word and Excel

· Make UAT, Exploratory, SAP and CRM manual testing much easier

· Test websites and applications with greater flexibility

· Improve the efficiency of manual testing and test 2x faster

· Integrate with popular test management systems

Ask questions and take back valuable tips and tricks to use in your everyday manual testing.

Space is limited so sign up today. For further details: http://www.autom8.co.nz/webinars/

See you soon.

- Aaron Athfield, Founder and Chief Manual Tester Guy nz.linkedin.com/pub/aaron-athfield/75/811/626 www.autom8.co.nz


It seems like I always begin these reviews along the

lines of "it doesn't seem like n months since the last…"

so this time I won’t. Needless to say that another

StarWest Conference has come and gone and at a

rough guess, I’d say this was the biggest one for

many years, 1,200+ attendees I understand.

I was down for two tutorials this time; my traditional

Testing the Data Warehouse and the Programme-

Level Test Management session first introduced at

our own conference back in August. Having done

four tutorials back-to-back then, doing two at

StarWest didn’t seem like such a big deal. My sessions

were well attended with Data Warehouse Testing

selling out and the discussions lively and healthy.

I do find it refreshing in some ways to find that no

matter where on the planet we ply our testing trade,

the challenges are always similar.

Wednesday 15 October saw the start of the StarWest

conference proceedings. Unfortunately the day of tutorials on the Tuesday, coupled with a bout of unexpected jet lag, meant I missed the first few sessions – which annoyed me no end as I really wanted to catch Julie Gardiner's keynote on testing web services, libraries and frameworks. The previous evening I'd caught up with Julie and Dawn Haynes for a sneak preview of Julie's presentation so I was particularly keen to catch it.

Wednesday afternoon I attended Mary Thorn’s

session on the Test Manager’s Role in Agile, which I

found interesting although not too much new for me.

I then jumped over to the World Quality Report 2014-15: Emerging Testing Trends presentation. At last

year’s StarWest I had attended the same session with

a great amount of interest as this report provides a

huge amount of depth and meat around the current

state of testing and general software quality

assurance. As expected, the results revealed not

only a greater investment in the testing of mobile

applications, which has been growing steadily for the

past couple of years now, but also a continued focus

on getting smarter, faster and more functionally rich

software applications to market in much quicker time

right across the board.

This trend has led to some marked changes in our industry which I can only see continuing. I can

liken it perhaps to the recording industry where I

have a modicum of experience. In days past, to get a

record or CD onto the streets, an artist usually had to

be signed by a record company and have had many

hundreds of thousands of dollars invested in both

recording process and in artist development, a

future investment. Then with the recording company

marketing machine kicking in behind, an artist could

sell kazillions of CDs and make megabucks for both

the record company and themselves, usually in that

order. Getting a top-flight CD out usually required

an input of at least $500,000; however, the returns for

top artists were in the thousands of percentage

points so no-one really worried too much. However

in the last decade or so, technology has provided

ways for artists to record themselves and produce

professional sounding, studio quality recordings for

as little as a couple of thousand dollars - for a PC and

some clever software. On the surface, this sounded

good as it meant that no longer did artists have to be

signed to properly record and thousands more CDs

et al became available from artists who otherwise

would not have had the opportunity. In addition,

[Photo: My test management class in full swing]


with the advent of iTunes and other media, record

company marketing is no longer an essential

requirement either, so a whole new industry

paradigm has been created and that can only be

good, right?

Possibly, however think about this: with thousands

more artists ‘in the market’, the range of product is

so much greater and the pressure so much more

present to be able to compete and get ahead of the

competition for the same sized customer dollar.

With the record companies no longer investing in

the manner they did in the past because the returns

are no longer there, there is little or no professional

artist development so the quality of artist and thence

the resulting product is driven down also. If we’re

not careful, we’ll end up with a general ‘dumbing

down’ of the market and quality, which in my opinion

is not good for anyone in the longer term, even if it

means that there’s more available, for less money

and in greater quantities than ever before.

Could the same be about to happen with software,

especially in the mobile space? To check on the

results of my beloved West Ham United Football

Club, I can choose from at least a couple of dozen

apps on the Apple AppStore. I’ve been through at

least five of them in the past two years with the

main reason for changing being that I get sick and

tired of trying to get them all to work properly and

consistently over time. The quality has been ‘dumbed

down' and no matter; for free, or at most another couple of bucks, I can grab the next one off the rank.

Problem is, in a commercial sense, every time I change I have to learn how to use it, understand the features, learn to get the best out of it, etc. Then no sooner have I done this and got it working for me than a new version comes along full of holes and the cycle

starts again. Not exactly the best use of my time.

Now while this scenario is not necessarily present

yet in the case of non-mobile, commercial software

applications, the continued upwards pressure from

this sector can only impinge upon this space

eventually. Will we then see the software investors,

like the record companies, pull away completely,

seeking better returns from their dollars elsewhere?

We’ve already been here once with the dot.com crash

of ten or so years ago as part of the advent of the internet revolution. Is it looming again?

Anyway, maybe enough pontificating. Later

Wednesday afternoon saw the ever-popular

Lightning Strikes the Keynotes session, where

presenters volunteer for a five-minute lightning

talk on anything around testing. I did my perennial

‘Which Came First; the Bug or the Test?’ which

seemed to go down well as always although I really

must come up with another lightning talk someday.

Thursday was another busy day; I sat in on Julie’s

‘Rainmaking for Test Managers’ session although the

title engendered in me a sense of dread. As a test

manager I’m often accused of kicking up too much

of a storm and here’s Julie propagating the message?

I needn’t have been worried, as always Julie put

forward a number of thought-provoking concepts,

in particular one around taking a make-it-happen

approach when everything else seems to merely

roll along.

After a lively discussion around test metrics with

Pablo Garcia and Michael Bolton, I then trotted off to

Pablo’s presentation around the same subject. I do

wonder whether we get too hung up on metrics, as to whether they're valid or not. I've always seen

metrics more as indicators, not as harbingers of

absolute truth. And we have to be cognizant of the

fact that they change on a day-to-day basis so is a

daily snapshot anything more than just that? Is it

not a trend over time that counts for more and

provides more information around the state of our

product, system or project than mere numbers?

Hmm, anyway I’d better not get on my hobby horse

here; however, suffice to say that Pablo was

promoting a similar message in his presentation.

Fast forward to the 4:15pm session and Pablo Hope’s

keynote around Security Testing. Appearing on the

podium resembling Gandalf out of the Lord of the Rings

trilogy, Pablo proceeded to peel back the layers of

[Photo: Pablo Hope's Gandalf rendition!]


mystery around this subject point-by-point while at

the same time shedding a layer of his wizard outfit

accordingly. Certainly an entertaining approach to

a subject that is still sometimes considered one of

the testing 'dark arts'! I learned a few things too, e.g.

that security testing is a ‘must’ for any product or

system that is exposed via the internet (as if I didn’t

know that already but did I really know it?).

Thursday evening was a late one; caught up with

Pablo Garcia, Rob Sabourin and Scott Barber along

with a few others for enjoyable drinkies, dinner and

healthy conversations. I think we all agree that the

testing landscape is changing and that the whole ‘shift

left’ momentum is gathering around bringing the

tester’s nose closer to the core of software

development. I can't help but wonder though whether, now more than ever, the differences between

testing a software product versus testing

a systems implementation are taking on quite unique

perspectives and that good ole approaches from way

back need tempering accordingly. While the whole

exploratory approach is riding the popularity wave

right now, I still feel that we cannot lose sight of the

tried, true and proven within the arenas where that

approach is still valid. To say that one approach is the

best one, the right one, the only one is in my humble

opinion short-sighted and perhaps revealing that the

propagator really does not yet have the breadth and

depth of testing and management experience to be

able to fully appreciate the wider perspectives and

annals of testing. Anyway, here endeth the rant!

Again as always, StarWest was a blast and well-

executed by Lee and the SQE team. May they ever

continue to be so!


Click on title

Assurity Consulting 7

Catch Software / EnterpriseTester 18

NZTester Magazine TestAnalytics 23

WorX / Autom8 36

And now it’s your turn…

If you would like to be involved with and/or

contribute to future NZTester issues, you’re

formally invited to submit your proposals to me at

[email protected]

Articles should be a minimum of ½ A4 page at

Cambria 11pt font and a maximum of 2 A4 pages

for the real enthusiasts. If you wish to use names

of people and/or organisations outside of your

own, you will need to ensure that you have

permission to do so.

Articles may be product reviews, success stories,

testing how-to’s, conference papers or merely

some thought-provoking ideas that you might

wish to put out there. You don’t have to be a great

writer as we have our own staff writer who is

always available to assist.

Please remember to provide your email address

which will be published with your article along

with any photos you might like to include (a

headshot photo of yourself should be provided

with each article selected for publishing).

As NZTester is a free magazine, there will be no

financial compensation for any submission and the

editor reserves the sole right to select what is

published and what is not.

Please also be aware that your article will be proof-

read and amendments possibly made for

readability. And while we all believe in free speech

I’m sure, it goes without saying that any

defamatory or inflammatory comments directed

towards an organisation or individual are not

acceptable and will either be deleted from the

article or the whole submission rejected for

publication.

Feedback

NZTester is open to suggestions of any type, indeed

feedback is encouraged. If you feel so inclined to

tell us how much you enjoyed (or otherwise) this

issue, we will publish both praise and criticism, as

long as the latter is constructive. Email me on

[email protected] and please advise in your email

if you specifically do not want your comments

published in the next issue otherwise we will

assume that you’re OK with this.

Sign Up to Receive NZTester

Finally, if you would like to receive your own copy

of NZTester completely free, even though we’re still

real low tech right now, there are two easy ways: 1) go to www.nztester.co.nz, or 2) simply click here -

Ed.