Getting Testy With Perl
Steven Lembark
Workhorse Computing
lembark@wrkhors.com
Cultured Perl
Perl's “test culture” is part of the language.
Test::* modules easily added to Perl code.
Smoke testing provided as part of CPAN.
Reporter accumulates results from installers.
Language structure, modules work together.
Where there's smoke, there's Perl.
CPAN::Reporter sends back success/fail.
CPAN testers: variety of platforms, daily reports.
cpan-testers & cpan-testers-discuss mailing lists.
http://cpantesters.org/
Community effort at testing is unique to Perl.
Excellent Reference
Perl Testing: A Developer's Notebook,
O'Reilly Media
Wonderful example of a good how-to book.
Includes modules, ways to use them with variations.
Good cover-to-cover read.
Perl is easy to test
Test::* modules do most of the work.
Use the wheels, don't re-invent:
Test::Simple, Test::More, Test::Deep,
Test::Builder, Devel::Cover, Object::Exercise
Perl adds introspection.
One-stop shopping
Perl as a great glue language.
Use perl to test other programs.
Have other programs output TAP.
Combined with Inline you can test almost anything!
Test your tests!
Devel::Cover: have you tested all of the code?
Test::Builder: roll your own tests.
Build your own re-usable test components:
Test::Harness::Straps
Test::Builder::Tester
“Test” code can do more
Pre-load fixed data into a database.
Validate the working environment.
Execute daily cleanups in cron.
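A sketch of the setup-as-test idea (the directory and fixture contents here are invented for the example, not from the talk): a .t file can validate and populate its own working environment before the real tests run.

```perl
#!/usr/bin/env perl
# Hypothetical setup-as-test sketch: validate the environment,
# then pre-load fixture data for later tests to read.
use strict;
use warnings;
use Test::More;
use File::Temp qw( tempdir );

my $work = tempdir( CLEANUP => 1 );

ok -d $work, "Work dir exists: '$work'";
ok -w $work, "Work dir writeable: '$work'";

# Pre-load fixed data; a failure here makes later tests pointless.
my $fixture = "$work/fixture.txt";
open my $fh, '>', $fixture or BAIL_OUT "Cannot write '$fixture': $!";
print {$fh} "alpha\nbeta\n";
close $fh or BAIL_OUT "Cannot close '$fixture': $!";

ok -s $fixture, "Fixture populated: '$fixture'";

done_testing;
```

The same file doubles as setup and as a smoke test of the environment: if it fails, `prove` reports it like any other test.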
Testing simply
Count hardwired:
use Test::Simple tests => 6;
Difference in count will be an error.
It's all "ok"
use Test::Simple tests => 2;
# ok( boolean, message );
ok 1 == 1, "Count: '1' (1)";
ok 2 == 1, "Count: '2' (1)";
Output like:
ok 1 - Count: '1' (1)
not ok 2 - Count: '2' (1)
It's all "ok"
use Test::Simple tests => 2;
# ok( boolean, message );
ok 1, 'It passed!';
Output like:
1..2
ok 1 - It passed!
# Looks like you planned 2 tests but ran 1.
Seen these before?
"not ok 10 Count"
"not ok 11 Format"
"not ok 5 File or directory not found."
OK, now what?
Designing tests:
Test what matters.
Report something useful.
Isolate the failure
What to fix:
ok -e $path, "Existing: '$path'";
ok -d _    , "Directory: '$path'";
ok -r _    , "Readable: '$path'";
ok -w _    , "Writeable: '$path'";
Q: Why all of the quotes?
ok -e $path, "Existing dir: $path";
not ok 1 - Existing dir: /var/tmp
Looks fine without the quotes. With them:
ok -e $path, "Existing dir: '$path'";
not ok 1 - Existing dir: '/var/tmp '
not ok 1 - Existing dir: '/var/tmp
'
A: The quotes expose the trailing space or newline hiding in $path.
Good general format
Include what you found, what you expect.
You'll need them both with "not ok":
ok $got eq $want, "Name: '$got' ($want)";
not ok 99 - Name: '' (Jow Bloe)
not ok 99 - Name: '1 Main St.' (Jow Bloe)
not ok 99 - Name: 'Jow Bloe' ()
not ok 99 - Name: 'Jow Bloe' (User::Name=0x1A...
Test::Simple may be enough
Generate a plan.
“ok” generates TAP output.
Test count is fixed.
This works for many day-to-day tests.
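As a minimal sketch, a complete Test::Simple file is just a hardwired plan and a row of ok calls; the titlecase function is invented for the example.

```perl
use strict;
use warnings;
use Test::Simple tests => 3;

# Invented function under test.
sub titlecase { ucfirst lc shift }

# Each ok: boolean, then a message showing found + expected.
ok titlecase('PERL') eq 'Perl', "titlecase: 'Perl' (PERL)";
ok titlecase('perl') eq 'Perl', "titlecase: 'Perl' (perl)";
ok titlecase('pErL') eq 'Perl', "titlecase: 'Perl' (pErL)";
```

Run it with `prove` or plain `perl` and it emits TAP: a `1..3` plan line plus one ok line per test.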
But Wait! There's More!
Test::More
plan, done_testing: test counts
is, isnt, like, unlike: stringy "ok"
pass, fail: skip the boolean
use_ok, require_ok: module tests
note, diag, explain: messages
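A quick tour of those calls as a self-contained sketch; the test string is invented.

```perl
use strict;
use warnings;
use Test::More;

plan tests => 6;

my $got = 'very new cheese';

is     $got, 'very new cheese', 'is: string equality';
isnt   $got, 'milk',            'isnt: string inequality';
like   $got, qr/cheese/,        'like: pattern match';
unlike $got, qr/milk/,          'unlike: pattern mismatch';
pass 'pass: unconditional ok';
require_ok 'List::Util';        # core module, require-able

note 'note: shown only under prove -v';
```

The stringy forms also build better failure messages for free: a failing `is` reports both the found and expected values without any hand-rolled quoting.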
Flexible count
Set count with “plan”.
Compute from test input:
plan tests => 42;
plan tests => 3 * keys %$config;
plan tests => 2 * values %$query_output;
Test count varies?
Skip plan.
When you're done testing
Just say so:
done_testing;
Generates final report.
Plan count is checked here.
No plan, no count check.
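A sketch of the no-plan style: the count depends on the (invented) config data, so done_testing checks the total at the end instead of a fixed plan.

```perl
use strict;
use warnings;
use Test::More;

# Invented config: two tests per entry, so no hardwired plan.
my %config =
(
    tmp_dir  => '/tmp',
    log_name => 'test.log',
);

for my $key ( sort keys %config )
{
    my $value = $config{ $key };
    ok defined $value, "Defined: '$key'";
    ok length  $value, "Non-empty: '$key' ('$value')";
}

done_testing;   # final count reported and checked here
</test>
```

Add an entry to %config and the test count grows with it; nothing else needs editing.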
Notes show up with "prove -v"
note "read_config from '$config_path'";
my $path = read_config 'tmp_dir';
ok -e $path, "Exists: '$path' (tmp_dir)";
$ prove -v;
# read_config from './etc/pass1.conf'
not ok 1 - Exists: '/var/tmp/foobar ' (tmp_dir)
Diagnostics show why tests failed
Show up without "-v".
ok ... or diag "Oopsie...";
ok grep( /blah/, @test_input ), 'Found blah'
    or diag "Missing 'blah':", @test_input;
ok $dbh, "Connected"
    or diag "Failed connect: '$dsn'";
Explain shows exactly what failed
“explain” shows nested structure.
Use with “note” to show setup details.
With “diag” shows extra info on failure.
my $dbh = DBI->connect( @argz );
ok $dbh, 'Database connected'
or diag 'Connect args:', explain \@argz;
Stringy “ok”
like, unlike use regexen.
Saves "=~" syntax in the tests:
like $stuff, qr/milk/, "Got: 'milk'?"
    or diag "Have: '$stuff' instead";
not ok 1 - Got: 'milk'?
# Have: 'very new cheese' instead
Variable number of tests
If-logic generates variable number of tests.
Skip “plan”, use “done_testing”.
Taking it pass/fail
if( $obj = eval { $class->constructify } )
{
    # validate object contents
}
else
{
    # nothing more to test
    fail "Failed constructify: $@";
}
Test expected errors
eval
{
    $obj->new( $junk );
    fail "No exception: new( $junk )?";
    1
}
or do
{
    # validate error handling...
};
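Made concrete with an invented parse_age function that dies on bad input: reaching the line after the call means no exception fired, which is itself a failure.

```perl
use strict;
use warnings;
use Test::More;

# Invented function: throws on non-numeric input.
sub parse_age
{
    my $age = shift;
    $age =~ m{^ \d+ $}x or die "Invalid age: '$age'\n";
    return 0 + $age;
}

my $junk = 'twelve';

eval
{
    parse_age $junk;
    fail "No exception: parse_age( $junk )?";
    1
}
or do
{
    # error path: validate the exception text
    like $@, qr/Invalid age/, "Expected error for '$junk'";
};

done_testing;
```

The trailing `1` matters: without it the eval block's value would be whatever fail returned, not a reliable true value.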
Controlling test cycle
Abort if everything will fail.
Avoid expensive, specialized, unnecessary tests.
Saves extraneous code in all of the tests.
BAIL_OUT: Knowing when to give up
Aborts all testing.
Unsafe or guaranteed failure.
Useful in early test for environment, arg checks.
BAIL_OUT
BAIL_OUT 'Do not run tests as su!' unless $>;
BAIL_OUT 'Missing $DB' unless $ENV{DB};
-e $test_dir or mkdir $test_dir, 0777
    or BAIL_OUT "Failed mkdir: '$test_dir', $!";
-d $test_dir or BAIL_OUT "Missing: '$test_dir'";
-r _ or BAIL_OUT "Un-readable: '$test_dir'";
-w _ or BAIL_OUT "Un-writeable: '$test_dir'";
“SKIP” blocks
Skip a block with test count and message.
Adjust the plan test count.
Expensive or specialized tests.
Skip external tests
SKIP:
{
    skip "No database (TEST_DB)", 8 if ! $ENV{ TEST_DB };
    # or ... no network available...
    # or ... no server handle...
    ...
}
Skip expensive tests
SKIP:
{
    skip "Used for internal development only", 12
        unless $ENV{ EXPENSIVE_TESTS };
    # test plan count reduced by 12
    ...
}
Skip dangerous tests
SKIP:
{
    skip "Unsafe as superuser", 22 unless $>;
    # at this point the tests are not running as root.
    ...
}
You'll usually use Test::More
note & diag nearly always worth using.
plan & done_testing make life easier.
Still have “ok” for the majority of work.
Testing Structures
"like" syntax with nested structures:
use Test::Deep;
cmp_deeply $found, $expect, $message;
Great for testing parser or grammar outcome.
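Test::Deep's cmp_deeply is the richer tool (it takes comparators like re() and ignore()); core Test::More's is_deeply shows the same shape for plain structures. A minimal sketch with an invented mini-parser:

```perl
use strict;
use warnings;
use Test::More;

# Invented mini-parser: "a=1 b=2" -> nested hash structure.
sub parse_pairs
{
    my %pairz = map { split /=/ } split ' ', shift;
    return { pairs => \%pairz, count => scalar keys %pairz };
}

my $found  = parse_pairs 'a=1 b=2';
my $expect =
{
    pairs => { a => 1, b => 2 },
    count => 2,
};

# Test::Deep's cmp_deeply reads the same way, with richer
# comparators (re(), ignore(), bag()...).
is_deeply $found, $expect, "Parsed: 'a=1 b=2'";

done_testing;
```

One call checks the whole nested structure and reports the first mismatching path, which is exactly what you want from a parser test.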
Devel::Cover: what didn't you check?
All of the else blocks?
All of the “||=” assignments?
All of the “... or die ...” branches?
Devel::Cover keeps track of which code actually ran.
Reports what you didn't test.
Tells you what test to write next.
Similar to NYTProf:
Run your program & summarize the results:
$ cover -test
or
$ perl -MDevel::Cover ./t/foo.t
$ cover
Running "cover" generates the report.
Introspection simplifies testing
Want to test database connect failure.
Can't assume DBD::SQLite.
Flat file databases are messy.
Guarantee that something fails?
You could write an operation that should fail.
Then again, it might not...
Making sure you fail
Override class methods.
Override Perl functions.
Result:
Mock objects,
Mock methods,
Mock perl.
Mock Objects
Test your wrapper handling failure?
Override DBI::connect with sub { die … }.
No guess: You know it's going to fail.
Mock Modules
Your test:
*DBI::connect = sub { die '...' };
my $status = eval { $obj->make_connection };
my $err = $@;
# test $status, $err, $obj...
Force an exception
use Symbol qw( qualify_to_ref );
# e.g., force_exception 'Invalid username', 'connect', 'DBI';
# ( name, package ) or ( "package::name" )
sub force_exception
{
    chomp( my $message = shift );
    my $ref = qualify_to_ref @_;
    undef &{ *$ref };
    *{ $ref } = sub { die "$message\n" };
    return
}
Testing multiple failures
for( @testz )
{
my( $msg, $expect ) = @$_;
force_exception $msg, 'DBI::connect';
my $status = eval { $wrapper->connect };
my $err = $@;
    # your tests here
}
Avoid caller cleanup
Override with "local":
sub force_exception
{
    my ( $proto, $method, $pkg, $name, $msg ) = splice @_, 0, 5;
    my $ref = qualify_to_ref $name, $pkg;
    local *{ $ref } = sub { die "$msg" };
    # exit from block cleans up the local override.
    # caller checks return, $@, $proto.
    eval { $proto->$method( @_ ) }
}
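A runnable sketch of the localized-override idea: My::Backend and wrapped_connect are invented stand-ins for DBI and its wrapper, and a symbolic glob name replaces qualify_to_ref to keep the example in one file.

```perl
use strict;
use warnings;
use Test::More;

# Invented stand-ins for DBI and its wrapper.
{
    package My::Backend;
    sub connect { 'real connection' }
}

sub wrapped_connect
{
    return eval { My::Backend::connect() } // "error: $@";
}

for my $msg ( "timeout\n", "bad password\n" )
{
    no strict 'refs';
    no warnings 'redefine';

    # local: the override evaporates at the end of the block.
    local *{'My::Backend::connect'} = sub { die $msg };

    like wrapped_connect(), qr/\Qerror: $msg\E/,
        "Forced failure: '$msg'";
}

is wrapped_connect(), 'real connection', 'Override cleaned up';

done_testing;
```

No guesswork: each iteration is guaranteed to see exactly the failure it asked for, and the real method comes back by itself when the block exits.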
Mock Perl
Override perl builtins: *CORE::GLOBAL::<name>
Is "exit" called?
my $exits = 0;
*CORE::GLOBAL::exit = sub { $exits = 1 };
eval { frobnicate $arg };
ok $exits, "Frobnicate exits ($exits)";
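A runnable sketch of the exit override: builtin overrides only affect calls compiled after them, hence the BEGIN block, and the mock dies so the caller's eval can unwind. frobnicate is invented for the example.

```perl
use strict;
use warnings;
use Test::More;

my $exits = 0;

# Builtin overrides only apply to code compiled after them,
# so install the mock at compile time.
BEGIN
{
    no warnings 'once';
    *CORE::GLOBAL::exit = sub { $exits = 1; die "caught exit\n" };
}

# Invented function that bails out via exit on bad input.
sub frobnicate
{
    my $arg = shift;
    exit 1 unless defined $arg;
    return "ok: $arg";
}

eval { frobnicate undef };
ok $exits, "frobnicate exits ($exits)";
is frobnicate('x'), 'ok: x', 'Normal path unaffected';

done_testing;
```

Dying inside the mock matters: if the fake exit simply returned, frobnicate would keep running past the "exit" as if nothing happened.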
Devel::Cover & Mocking
The biggest reason for mock anything:
Force an outcome to test a branch.
Iterate:
Test with Devel::Cover.
See what still needs testing.
Mock object/method/function forces the branch.
Automated testing
Laziness is a virtue.
Avoid cut+paste.
Let Perl do the work!
Example: Sanity check modules
Use symlinks.
Validate modules compile.
Check package argument.
require_ok $package; # ok if require-able.
use_ok $package; # ok if use-able.
Make the links
Path below ./lib.
Replace slashes with dashes.
Add a leading "01-".
Symlink them all to a generic baseline test.
Symlinks look like:
01-WCurve-Util.t → 01-generic-t
use FindBin::libs;
use Test::More;
use File::Basename qw( basename );
my $madness = basename $0, '.t';    # 01-WCurve-Util
$madness =~ s{^ 01- }{}x;           # WCurve-Util
$madness =~ s{ \W+ }{::}gx;         # WCurve::Util
if( use_ok $madness )
{
    # check for correct package argument.
    ok $madness->can( 'VERSION' ), "$madness has a VERSION method";
    ok $a = $madness->VERSION,     "$madness is '$a'";
}
done_testing;
./t/01-generic-t
./t/make-generic-links
#!/bin/bash
cd $(dirname $0);
rm 01-*.t;
find ../lib/ -name '*.pm' |
perl -n \
    -e 'chomp;' \
    -e 's{^ .+ /lib/ }{}x;' \
    -e 's{ [.]pm $}{.t}x;' \
    -e 'tr{/}{-};' \
    -e 'symlink "01-generic-t" => "01-$_";'
exit 0;
Similar checks for config files
Add "00-" tests to read config files.
Combine “ok” with Config::Any.
Now make-generic-links tests config & code.
“prove” runs them all in one pass.
The Validation Two-Step
Use with git for sanity check:
git pull &&
./t/make-generic-links &&
prove --jobs=2 --state=save &&
git tag -a "prove/$date" &&
git push &&
git push --tags ;
Exercise for healthy objects
Data-driven testing for a single object.
Data includes method, data, expected return.
Module iterates the tests, reports results.
Especially handy when tests or expected values are generated.
use Object::Exercise;
my @testz =
(
    [
        [ qw( name ) ],             # method + args
        [ q{Jow Bloe} ],            # expected return
    ],
    [
        [ qw( street ) ],
        [ q{1 Nuwere St} ],
    ],
    [
        [ qw( street number ) ],    # $obj->street( 'number' )
        [ qw( 1 ) ],
        'Check street number',      # hardwired message
    ],
);
Person->new( 'Jow Bloe' => '1 Nuwere St' )->$exercise( @testz );
Load fixed data
Flat file -> arrayrefs.
"insert" as method, [1] as return.
Load the data with:
$sth->$exercise( @data );
Get an "ok" message for each data record.
Roll your own: Test::Builder
Mirrors Test::More with a singleton object.
my $test = Test::Builder->new;
$test->ok( $boolean, $success_message )
or $test->diag( $more_info );
Spread single test across multiple modules.
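A sketch of such a component (the path_ok check itself is invented): because Test::Builder->new returns a singleton, results from a helper like this land in the caller's TAP stream no matter which module defines it.

```perl
use strict;
use warnings;
use Test::More;
use Test::Builder;

# Reusable check: could live in its own module and be called
# from many test files.
sub path_ok
{
    my $path = shift;
    my $test = Test::Builder->new;   # same singleton everywhere

    $test->ok( -e $path, "Exists: '$path'" )
        or $test->diag( "Missing: '$path'" );
}

path_ok $0;     # the running script itself
path_ok '.';    # the current directory

done_testing;
```

Package the helper once and every test file gets the same check with the same diagnostics, counted in that file's own plan.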
Testing Testing
Test::Builder::Tester wraps your tests.
Forces test to return not-ok in order to test it.
Ignores the output of the test being tested.
Validate the test plan.
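The synopsis-style pattern, as a sketch: test_out declares the TAP line we expect, test_fail accounts for the standard failure diagnostic on the next line, and test_test compares what the wrapped test actually emitted.

```perl
use strict;
use warnings;
use Test::Builder::Tester tests => 1;
use Test::More;

# Declare the TAP output the wrapped test should produce.
test_out "not ok 1 - this should fail";
test_fail +1;                  # failure diag points at the next line
ok 0, 'this should fail';      # the test being tested
test_test "failing ok() produces the expected TAP";
```

The deliberate failure never reaches the harness: only test_test's verdict counts against this file's plan.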
Getting the most out of prove
Save state: re-run failed tests:
$ prove --state=save;
# fix something...
$ prove --state=save,failed;
Parallel execution with “--jobs”.
“perldoc” is your friend!
Great, but my shop uses <pick one>
Test multiple languages with Inline.
Include C, C++, Python...
Multiple Perl versions.
Mix languages to test interfaces.
One-stop testing for your API libs
Need to test multi-language support?
Use Perl to move data between them.
Unit-test C talking to Java.
Example: Testing your C code
use Inline C;
use Test::Simple tests => 1;
my $a = 12 + 34;
my $b = add( 12, 34 );
ok $a == $b, "Add: '$b' ($a)";
__END__
__C__
int add(int x, int y){ return x + y; }
Example: Testing your C code
$ prove -v add.t
add.t ..
1..1
ok 1 - Add: '46' (46)
ok
All tests successful.
Files=1, Tests=1, 0 wallclock secs ( 0.05 usr 0.00 sys + 0.09 cusr 0.00 csys = ... )
Result: PASS
Example: Testing your C code
You can call into a library. Write a whole program.
Inline builds the interface.
Test Anything Protocol
Inline Language Support Modules add languages.
Inline supports:
C, C++, Java, Python, Ruby, Tcl, Assembler, Basic, Guile, Befunge, Octave, Awk, BC, TT (Template Toolkit), WebChat, and (of course) Perl.
Really: Any Thing
Say you had a quad-rotor UAV...
You'd write a Perl API for it, wouldn't you?
UAV::Pilot::ARDrone, for example.
But then you'd have to test it...
First: Define the test actions
my @actionz =
(
    [ takeoff   => 10   ],
    [ wave      => 8000 ],
    [ flip_left => 5000 ],
    [ land      => 5000, send => $expect ],
);
plan tests => 2 + @actionz;
Basic set of actions: takeoff, wave, flip, and land.
Final "send" validates end-of-channel.
Control interfaces
BAIL_OUT avoids running without controls.
my $driver
= UAV::Pilot::ARDrone::Driver->new( { host => $HOST } )
or BAIL_OUT "Failed construct";
eval
{
    $driver->connect;
    pass "Connected to '$HOST'";
}
or BAIL_OUT "Failed connect: $@";
Error handler for actions
Attempt the last action (land) on errors.
sub oops
{
    eval { execute $actionz[ -1 ] }
    or BAIL_OUT "Literal crash expected";
}
sub execute
{
    state $a = $event;
    my ( $ctrl_op, $time, $cv_op, $cv_val ) = @$_;
    $a = $a->add_timer
    (
        {
            duration       => $time,
            duration_units => $event->UNITS_MILLISECOND,
            cb             => sub
            {
                $control->$ctrl_op or die "Failed: $ctrl_op";
                $cv->$cv_op( $cv_val ) if $cv_op;
                pass "Execute: '$ctrl_op' ($cv_op)";
            },
        }
    )
}
Execute the actions
“note” here describes what to expect.
for( @actionz )
{
    note "Prepare:\n", explain $_;
    eval { execute $_ } or oops;
}
$event->init_event_loop;
Check the end-of-channel
my $found = $cv->recv;
if( $found eq $expect )
{
    pass "Recv completed '$found' ($expect)";
}
else
{
    fail "Recv incomplete ($found), send emergency land";
    oops;
}
done_testing;
Execute the tests
$ perl t/01-test-flight.t
1..6
# Connect to: '192.168.1.1' (UAV::Pilot::ARDrone::Driver)
ok 1 - Connected to '192.168.1.1'
# U::P::EasyEvent has a socket
# Prepare:
# [
#   'takeoff',
#   10
# ]
# Prepare:
# [
#   'wave'
"pass" shows what happens
# Prepare:
# [
#   'land',
#   5000,
#   'send',
#   '123'
# ]
ok 2 - Execute: 'takeoff' ()
ok 3 - Execute: 'wave' ()
ok 4 - Execute: 'flip_left' ()
ok 5 - Execute: 'land' (send)
ok 6 - Recv completed '123' (123)
Cultured Perls
The Developer's Notebook is a great resource.
POD for Test::More == wheels you don't re-invent.
POD for prove == make life faster, easier.
cpantesters.org: test results for modules by platform.
Stay cultured, be testy: use perl.