Claudia Badell
INCORPORANDO EL TESTING (Incorporating Testing)
ABOUT ME

WORK EXPERIENCE
• Senior Quality Engineer, Indigo Studio Team, Infragistics, 2009
• TestingUy (www.testing.uy)

STUDIES
• Computer Engineer
• Association for Software Testing courses (Foundations & Bug Advocacy)
• Scrum Master
• ISTQB Foundation

PAST WORK EXPERIENCE
• Test Manager, Tester and Business Analyst
• Teacher, Computing Science Department, School of Engineering, Universidad de la República
AGENDA
1. INTRODUCTION
2. WHAT WE DID
3. CONCLUSIONS
Why do we test?
Testing is always a search for information
BBST: Foundations course by the Association for Software Testing
Identifying what to test
SCRIPTED TESTING
• supported by test case design techniques
• test design and test execution activities are separated in time
• each activity can be performed by different people

EXPLORATORY TESTING
• test design and test execution activities are performed simultaneously
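The separation between design and execution in scripted testing can be sketched as a data-driven test: the case table is authored ahead of time (possibly by a different person, using techniques such as boundary-value analysis), and execution is a later, mechanical step. The `validate_username` function below is a hypothetical system under test, not something from the talk.

```python
# Hypothetical system under test: a username validator.
def validate_username(name: str) -> bool:
    return 3 <= len(name) <= 20 and name.isalnum()

# Test DESIGN: cases written ahead of time, e.g. via boundary-value analysis.
CASES = [
    ("ab", False),         # below the lower length boundary
    ("abc", True),         # exactly at the lower boundary
    ("a" * 20, True),      # exactly at the upper boundary
    ("a" * 21, False),     # above the upper boundary
    ("user name", False),  # invalid character (space)
]

# Test EXECUTION: a separate step that anyone on the team can run.
def run_scripted_tests():
    return [(inp, validate_username(inp) == expected) for inp, expected in CASES]

if __name__ == "__main__":
    for inp, passed in run_scripted_tests():
        print(f"{inp!r}: {'PASS' if passed else 'FAIL'}")
```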
Heuristics provide ideas to test
TESTING HEURISTICS
CEM KANER
Consistent with:
• the product
• history
• comparable products
• our image
• regulations
• purpose
http://testingeducation.org/BBST/foundations/
USABILITY HEURISTICS
JAKOB NIELSEN
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
www.useit.com/papers/heuristic/heuristic_list.html
At what level?
HOW DO WE IDENTIFY THE SCENARIOS TO COVER?
• Test case design techniques
• Exploratory testing
• Heuristics
• Bug history
• Business domain
• Behavior, visual design & interaction design for similar features
• The source code of the feature
• …
CONSIDERING
• Time
• Priority
• Risk from the business perspective
• Risk from the implementation perspective
• Feature complexity
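One way to combine these factors when deciding what to cover first is a simple weighted score per scenario. This is a minimal sketch under assumed weights (here business risk counts double), not a method from the talk; the scenario names and the one-time-unit-per-scenario budget are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    business_risk: int        # 1 (low) .. 5 (high)
    implementation_risk: int  # 1 .. 5
    complexity: int           # 1 .. 5
    priority: int             # 1 .. 5

def score(s: Scenario) -> int:
    # Assumed weighting: business risk counts double.
    return 2 * s.business_risk + s.implementation_risk + s.complexity + s.priority

def order_for_time_budget(scenarios: list[Scenario], budget: int) -> list[Scenario]:
    """Pick the highest-scoring scenarios that fit the available time,
    simplistically assuming one time unit per scenario."""
    return sorted(scenarios, key=score, reverse=True)[:budget]

scenarios = [
    Scenario("export to CSV", business_risk=5, implementation_risk=4, complexity=3, priority=5),
    Scenario("tooltip color", business_risk=1, implementation_risk=1, complexity=1, priority=2),
    Scenario("undo/redo", business_risk=4, implementation_risk=5, complexity=5, priority=4),
]
picked = order_for_time_budget(scenarios, budget=2)
print([s.name for s in picked])
```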
How do we share a common understanding of previous concepts in a cross-functional team?
And also, how do we apply them?
WHAT WE DID
THE PRODUCT
• In the market since 2012
• Eight major releases, several intermediate updates and silent updates

THE TEAM
• Developers (7)
• Visual designers (1)
• Interaction designers (1)
• Technical writer (1)
• Testers (1)
TESTING IN THE TEAM
• Testing is mostly performed by the whole team
THINGS THAT WE DID
• UX is an important aspect to consider when testing
• Tester as evangelist
• Testing strategies are defined and applied together
• Testing strategies for the mid/long term
• Big domain with many scenarios to cover
TEAM TESTING STRATEGIES
• Internal trainings
• Kick-off testing meetings
• Follow-up testing meetings
• Early testing
• Testing variables checklists
• Testbeds
• Mind maps to guide exploratory testing
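A testing-variables checklist lists, per variable, the values worth covering; the full cross product gives the scenario space the team then prunes by risk and priority. A minimal sketch (the "export" feature and its variables are hypothetical examples, not from the talk):

```python
from itertools import product

# Hypothetical testing-variables checklist for an "export" feature.
VARIABLES = {
    "format": ["CSV", "PDF", "XLSX"],
    "locale": ["en-US", "es-UY"],
    "dataset": ["empty", "single row", "10k rows"],
}

def all_combinations(variables: dict[str, list[str]]) -> list[dict[str, str]]:
    """Enumerate every combination of checklist values; in practice the
    team prunes this list using risk and priority."""
    names = list(variables)
    return [dict(zip(names, values)) for values in product(*variables.values())]

combos = all_combinations(VARIABLES)
print(len(combos))  # 3 formats * 2 locales * 3 datasets = 18
```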
WHAT WE'VE LEARNED AS A TEAM
• All team members improved their knowledge of the features beyond their specific activities.
• We've learned to be flexible enough to adapt and wear other hats according to the needs of the product and the team.
• We incorporated testing terminology into our team culture.
• We adopted and unified the domain terminology of the software under test.
• Because the whole team tests and reports bugs, the bug tracker workflow has improved.
• We improved bug triage.
• We gained an understanding of the testing effort a release requires, and of the impact a fix has from that perspective.
• Allocate time and resources; otherwise, it's just wishful thinking.
• Automation is not an individual effort; it is a team effort.
Automated testing complements manual testing, but it does not replace it.
CONCLUSIONS
• The role of the tester in the team has become that of a facilitator.
• When the product under test has a long life, it is important to define testing strategies for the mid/long term.
• Testing is a team responsibility.