LCE13: Test and Validation Summit: Evolution of Testing in Linaro (II)

Resource: LCE13
Name: Test and Validation Summit: Evolution of Testing in Linaro (II)
Date: 09-07-2013
Speaker:
Video: http://youtu.be/59i6Zblr6cg

Linaro Test and Validation Summit

Linaro Engineering Teams

LCE13 - Dublin, July 2013

How Do We Better Test our Engineering

PART 1: Linaro Platform

Overview

LAVA: Citius, Altius, Fortius ("Faster, Higher, Stronger")

Builds & CI: Build Your Code When You Are Not Looking

QA Services: Cover All Bases

PART 2: Linaro Engineering

Kernel Developers & Maintainers

Landing Teams, Linaro Groups (LEG / LNG)

● What/how do they develop?
● How do they validate/verify their output? (Do they use CI? Manual or automated testing?)

Agenda (Tuesday, 9am-1pm)

Time Topic Speaker

9:00 Introduction (Dev, Test, Loop) Alan Bennett

9:15 Overview of the CI loop (25 min) Fathi Boudra

9:40 QA services (20 min) Milosz Wasilewski

10:00 Recent LAVA updates (45 min) Antonio Terceiro

10:45 BREAK

11:00 LNG Mike Holmes

11:30 Landing Teams Scott Bambrough

12:00 KWG PM Kevin Hilman

12:30 LEG Grant Likely

[Agenda sidebar labels: "Platform Updates" / "technical details"]

Preface

Why is the Quality of Linaro Engineering so important?


● applying continuous quality control

● frequent integration of small pieces of software

● rapid feedback

● Extreme programming (XP)
○ minimize integration problems

● Shared code repositories
● daily commits
● automated build systems
● extensive unit tests
● testing in cloned production environments

Highlights of Continuous Integration

Preface

Continuous Integration

CI Loop

[Diagram: Development → Source Control System → Automated Build → Testing → Test Report / Feedback → changes made, feeding back into Development]

Linaro Test and Validation Summit

Fathi Boudra
Builds and Baselines

LCE13 - Dublin, July 2013

How Do We Better Test our Engineering

● CI Present
○ Anatomy of CI loop

● CI Future
○ What is on the CI roadmap

Overview

● Get the source
○ Source code is under SCM
■ Git (git.linaro.org)
■ Bazaar (bazaar.launchpad.net)

● Build the code
○ Use a build system
■ Jenkins (ci.linaro.org and android-build.linaro.org)
■ LAVA (yes, LAVA can be used!!!)

● Publish the build results
○ Build artifacts are available (snapshots.linaro.org)

Anatomy of CI loop
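To make those three steps concrete, a build job in this style reduces to a short shell script. The sketch below is a placeholder rather than an actual Linaro job: the repository path, build command, and publishing destination are all invented for illustration.

  #!/bin/sh
  # Hypothetical sketch of a CI build step: fetch, build, publish.
  set -e

  # 1. Get the source (kept under SCM, here Git).
  git clone git://git.linaro.org/example/project.git
  cd project

  # 2. Build the code with whatever build system the project uses.
  make

  # 3. Publish the artifacts; real jobs end up on snapshots.linaro.org,
  #    scp to a placeholder host merely stands in for that mechanism.
  scp project.tar.gz builder@publishing-host:/srv/snapshots/incoming/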

● Submit the results for testing
○ LAVA (validation.linaro.org)

● Get the test results
○ E-mail notifications with filters (validation.linaro.org/lava-server/dashboard/filters)
○ LAVA dashboard (validation.linaro.org/lava-server/dashboard)

Anatomy of CI loop
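Concretely, submitting a build to LAVA means writing a JSON job file and handing it to lava-tool. The sketch below follows the dispatcher's JSON job format of roughly this era as best recalled; the device type, image URL, test repository, and result stream are invented placeholders, and the action names should be treated as an approximation.

  $ cat > job.json <<'EOF'
  {
    "job_name": "example-kernel-ci",
    "device_type": "panda",
    "timeout": 18000,
    "actions": [
      { "command": "deploy_linaro_image",
        "parameters": { "image": "http://snapshots.linaro.org/example/image.img.gz" } },
      { "command": "lava_test_shell",
        "parameters": { "testdef_repos": [ { "git-repo": "git://git.example.org/tests.git",
                                             "testdef": "smoke.yaml" } ] } },
      { "command": "submit_results",
        "parameters": { "server": "http://validation.linaro.org/RPC2/",
                        "stream": "/anonymous/example/" } }
    ]
  }
  EOF
  $ lava-tool auth-add http://user@validation.linaro.org/RPC2/
  $ lava-tool submit-job http://user@validation.linaro.org/RPC2/ job.json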

● Different types of jobs
○ Kernel CI
○ Engineering builds
○ Components

● Build triggers
○ manual, periodically, URL trigger, post-commit

● Do the build
○ shell script(s)
■ can be maintained under SCM (linux-preempt-rt)
○ Groovy script(s)

● Publish
○ to snapshots.linaro.org
○ to package repositories (PPA, other)

Build jobs in depth
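Of those triggers, the URL trigger is the easiest to illustrate: Jenkins can expose a remote build-trigger URL guarded by a token, so a post-commit hook on the Git server only needs to request it. Job name and token below are placeholders, not a real Linaro job.

  #!/bin/sh
  # Hypothetical post-receive hook: kick the CI build on every push.
  curl -s "https://ci.linaro.org/job/example-build/build?token=SECRET" > /dev/null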

● LAVA CI Runtime
○ LAVA as a build system

● LAVA Publishing API
○ LAVA's ability to publish artifacts on a remote host

● Build time optimization
○ persistent slaves
○ mirrors and caching

● Better documentation

CI Future

Any Questions?

Q&A

Linaro Test and Validation Summit

Milosz Wasilewski
QA Services

LCE13 - Dublin, July 2013

How Do We Better Test our Engineering

QA Services

Tasks:

● manual testing

● dashboard monitoring

● reporting

● porting tests to LAVA

Manual Testing

Current approach:

● test results are not very detailed

● no connection between test case description and result sheet

● results stored in a Google spreadsheet

● bug linking done manually (makes it hard to extract the list of 'known issues')

Future:

● store test cases in some better suited place than wiki

● preserve test case change history

● store manual test results alongside automatic ones (in LAVA)

● have the ability to link bugs from various tracking systems to failed cases (in LAVA)

● generate reports easily (known issues, fixed problems, etc.)
○ might be done using LAVA if there is an easy way to extract testing results (for example a REST API)

Manual Testing
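No such extraction API existed at the time of this session, so purely as a sketch of what "generate reports easily" might look like, assuming a hypothetical REST endpoint for test results:

  # Entirely hypothetical endpoint and query parameters -- the slide only
  # wishes for "an easy way to extract testing results".
  $ curl "https://validation.linaro.org/api/results?stream=/public/team/qa/&status=fail" > known-issues.json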

● Monitoring dashboard
○ adding bugs
○ debugging failed runs

● Creating custom dashboards
○ Dashboard from filter
○ No need to edit python code to create/edit a dashboard
○ Private/public dashboards
○ Dashboard email notification (falls under the filter-as-dashboard approach)

Dashboards

● Use only binaries that were already automatically tested

● Don't repeat automated tests in a manual run (we have to be confident that automated results are reliable)

Release workflow

LAVA: Faster, Higher, Stronger (& easier to use)

Antonio Terceiro
LAVA

LCE13 - Dublin, July 2013

Test and Validation Summit

● Improvements
● New testing capabilities
● Engineering Progress Overview
● What are we missing?
○ Open Discussion
○ We want to hear from you

Overview

● ~90 ARM devices
● ~300 ARM CPUs
● ~150 jobs submitted per day
● ~99% reliability

Context (0): the size of LAVA, today

● LAVA started as an in-house solution
● Open source since day 1
● Other organizations (incl. Linaro members) interested in running their own LAVA lab

We need to go from an in-house service to a solid product

Context (1)

● No bootloader testing
● Tests only involve single devices

We need to provide features to support new demands in test and validation

Context (2)

Improvements

● Queue size monitored with Munin
● Nagios monitoring all sorts of things (e.g. temperature on Calxeda highbank nodes)
● Health check failures

Monitoring

Easing LAVA installation

● Effort on proper upstream packaging so that packages for any (reasonable) OS can be easily made

● WIP on Debian and Fedora packaging

$ apt-get install lava-server
$ yum install lava-server

Packaging enhancements

Easing LAVA learning

● Documentation is
○ scattered
○ outdated
○ confusing

Documentation overhaul is on the LAVA roadmap.

Documentation overhaul

Easing LAVA usage

At the moment, a lava-test-shell job requires:
● 1 JSON job file
● 1 YAML test definition file
● + the test code itself

$ sudo apt-get install lava-tool
$ lava script submit mytestscript.sh
$ lava job list

LAVA test suite helper tool
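Of those three inputs, the YAML test definition is the smallest. A minimal one, following the lava-test-shell format as best recalled (the names here are examples, not a shipped test), looks roughly like this:

  $ cat > smoke.yaml <<'EOF'
  metadata:
    name: example-smoke
    format: "Lava-Test Test Definition 1.0"
    description: "Minimal smoke test: record the kernel version."
  run:
    steps:
      # lava-test-case records a named pass/fail result for its command.
      - lava-test-case uname --shell uname -a
  EOF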

Getting more out of LAVA data

More information out of LAVA data

● Improvements in test results visualization in the LAVA dashboard

LAVA is too hard to develop

● Too many separate components
○ Also a mess for bug/project management

● Requires almost a full deployment for development

● Consolidated client components (3 to 1)
● Will consolidate server components (3+ to 1)

Developer-friendliness

New testing capabilities

● LAVA Multi-purpose Probe
● 1 base design, 5 boards now
● USB serial connection(s) to the host
● management of other connections to/from devices under test

LMP

● prototype sets manufactured and under test
● Use cases: ethernet hotplug, SATA hotplug, HDMI hotplug and EDID faking, USB OTG testing, USB mux (sort of), lsgpio, audio hotplug, SD-Mux for bootloader testing

LMP (2)

LMP (3) - how it works (e.g. SD-MUX)

[Diagram: the LMP sits between the DUT's SD card slot (SDC1) and the host, connected to the host via USB serial and USB MSD]

Multi-node testing (1)

● Schedule jobs across multiple target devices
○ Client-server, peer-to-peer and other scenarios

● Combine multiple results into a single result

● LAVA will provide a generic interface; test writers can program any tests they need
○ (special hardware setups possible but need to be handled case-by-case)

Other sessions:
● LAVA multi-node testing on Thursday
● LNG multi-node use-cases on Friday

Multi-node testing (2)
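As a purely hypothetical sketch of what that generic interface might enable (the lava-sync barrier helper named below is an assumption; the actual API was still being designed at the time of this session), a client-server test could coordinate like this:

  # Server role: bring the service up, then rendezvous with the clients.
  ./start-server &          # hypothetical test payload
  lava-sync server-up       # assumed barrier: all nodes wait here

  # Client role: wait until the server side reaches the barrier, then test.
  lava-sync server-up
  ./run-client-test         # hypothetical test payload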

● Logistics challenge!
● We might end up needing 20 of every device type in the lab
● Need to manage the needed growth in the lab in a sensible way

Other projects

● Lightweight interface for kernel developers
● Boot from test UEFI on all Versatile Express boards
● Support for new member boards

Overview of Engineering Progress

In Progress

● LAVA LMP
● Multi-node testing
● Helper tool
● Test result visualization improvements
● Lightweight interface for kernel devs
● UEFI on V. Express
● Support for new member boards

In Progress vs. Planned

Planned (for soon)

● Server components consolidation

● QA improvements
● Doc overhaul

Open Discussion

● What is your experience getting started with LAVA?

● What would have made your experience easier?

● Any suggestions to the LAVA team? Let us know!

● Feedback about the image reports revamp?

Seed Questions