
1

Experiences in Automating the Testing of SS7 Signaling Transfer Points

ISSTA 2002 July 22-24, 2002 Via di Ripetta, Rome - Italy

Tim Moors, Malathi Veeraraghavan, Zhifeng Tao, Xuan Zheng, Ramesh Badri

Polytechnic University

Brooklyn, NY, USA

[email protected]

http://kunene.poly.edu/~mv

This project is co-sponsored by Verizon, NYSTAR and CATT.

2

Outline

• Background
• Problem statement
• Solution approach
• Implementation — Automated SS7 Test Results Analyzer (ASTRA)
• Summary

3

Telephony & SS7 network

[Figure: A simple telephony network with two switches and two telephones, connected by voice trunks.]

[Figure: A regular phone call is placed to 1-718-260-0000. An SS7 Setup message travels from the calling switch through the STPs to the called switch. The STP (Signaling Transfer Point) is a datagram router – the packet switch of the SS7 (Signaling System No. 7) network. Voice trunks carry the call; SS7 links carry the signaling.]

[Figure: An 800 call is placed to 1-800-888-8888. A database (e.g., the 800 database) is consulted to translate the dialed number into the routing number 1-718-260-8888, which the SS7 Setup message then carries. The SS7 network is also used for mobility management. The four interconnected STPs form a "Quad".]

4

Outline

• Background

• Problem statement

• Solution approach

• Implementation — Automated SS7 Test Results Analyzer (ASTRA)

• Summary

5

Problem statement

• Automate the analysis of test results generated by interoperability tests of STPs
  – Test result files are too large for error-free manual analysis
  – Tests need to be rerun for every upgrade (multiple vendors supply STPs to a service provider)

6

What does an STP implement?

• An STP is the packet switch of an SS7 network
  – Network layer
    • Datagram forwarding
    • Routing protocol
      – Handles failures (topology)
      – Handles congestion (loading)
  – Data-link layer
    • Complex support needed because of the use of link-by-link error control
    • Referred to as signaling traffic management

7

STP Quad interoperability testing

• Complex system – many interacting EFSMs
• Distributed – many Points of Control and Observation (PCOs) – synchronization problem
• "Embedded testing" or "Testing in Context" – Quad
• Testing real-time properties – timed events
• Non-determinism – many output sequences possible

[Figure: The System under Test (SUT) is a quad of four STPs – STP_A, STP_B, STP_C, STP_D – interconnected by C-links (C1-0,1; C2-0,1) and D-links (D1-0,1 through D4-0,1). Emulated systems – SSPs (SSP_E1…SSP_E6, SSP_W31…SSP_W36) and SCPs (SCP_E1, SCP_E2, SCP_W31, SCP_W32) – attach over A-links (A1, A2, A11…A16, A31, A32, A41…A46). Signaling network management messages visible in the figure include changeover order/acknowledgement (COO/COA), transfer restricted/prohibited (TFR/TFP), and route set test messages (RSR/RSP).]

8

Complexity of STP testing

Types of tests
• Failure
  – Sequential failures
  – Simultaneous failures
  – STP pair isolation
• Congestion

Number of tests
• High-speed links
  – 22 test cases
• Low-speed links
  – 12 test cases
  – 22 test parts
  – 146 steps

Monitoring
• High speed
  – 3 types of monitors
  – 252 monitored locations
• Low speed
  – 3 types of monitors
  – 36 monitored locations

30,000 test events in test results files

Messages
• Routing updates
• Congestion notifications
• Traffic retrieval

Traffic load
• SS7 network engineered for high reliability
• Links carry half-load under normal conditions

9

Complete problem

Test Environment Setup → Test Execution → Test Result Retrieval and Collection → Test Results Analysis (ASTRA)

ASTRA: Automated SS7 Test Results Analyzer

Automate all steps of STP interoperability testing

10

Outline

• Background

• Problem statement

• Solution approach

• Implementation — Automated SS7 Test Results Analyzer (ASTRA)

• Summary

11

Generic methodology

• The set of actual events (test results) captured by monitor m in step s of test t is A_{t,s,m}
• We create the set of Expected Behavior (EB) X_{t,s,m} at monitor m in step s of test t
• Compare the actual captured events A_{t,s,m} with the created EB X_{t,s,m} to verify:
  – messages and associated parameters
  – traffic load
  – timer values
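As a rough illustration of the load and timer checks, a minimal Python sketch (not ASTRA's code; the field names, rate units, and tolerances are assumptions for illustration):

def check_load(events, link, t1, t2, expected_rate, margin=0.2):
    """Compare the observed message rate on a link over [t1,t2] against the engineered load."""
    n = sum(1 for e in events if e["link"] == link and t1 <= e["time"] <= t2)
    observed = n / (t2 - t1)
    return abs(observed - expected_rate) <= margin * expected_rate

def check_timer(action_time, msg_time, spec_sec, tol_sec=0.5):
    """Verify the interval between a test action and the resulting message against a specified timer value."""
    return abs((msg_time - action_time) - spec_sec) <= tol_sec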

12

Expected behavior at multiple PCOs (monitors)

Expected behavior is represented as a “program” instead of as a database because we:

– Cannot predict the exact sequence of events

• Use a tree structure

– Cannot predict the exact times of occurrences of events

• Use times of test actions for searches of events

– Cannot predict the exact number of occurrences of some events

• Use a while loop

13

Expected Behavior Expression Language (EBEL)

Basic language components

• findmsg() function – indicates the monitor
• assert_load() function
• foreach(), while() loops
• waitfor() pseudo-event
• Test actions
  – fail_link(), restore_link()
  – set_load
  – pause

14

EB program example

1.  foreach $s (@SSPS_E) {
2.    $ta_fail{$s}: fail_link(A(STP_A)($s)) causes 4,...
3.    ...
4.    waitfor(T11,STP_A) causes 5,9,...
5.    foreach $t (@set_1) {
6.      findmsg(A(STP_A)($t),$t,STP_A,$t,TFR,$s)
7.    }
8.    ...
9.    findmsg(D(STP_A)(STP_D),STP_D,STP_A,STP_D,TFR,$s) causes 10
10.   while ($time < $ta_restore{$s}) {
11.     waitfor(T10,STP_D) causes 12
12.     findmsg(D(STP_A)(STP_D),STP_A,STP_D,STP_A,RSR,$s)
13.   }
14.   ...
15. foreach $s (@SSPS_E) {
16.   $ta_restore{$s}: restore_link(A(STP_A)($s)) causes ...

Callouts: waitfor() (lines 4, 11) is a timed event; the first argument of findmsg() specifies the PCO (monitor).

15

EB as a tree

Abstract expected events form a tree of causation:

         2
         |
         4
        / \
       5   9
     / | \  \
   6a 6b 6c  10

Representing the abstract event tree above as a series of nested if statements:

if (2) {
  if (4) {
    foreach $t (@set_1) { /* 5 */
      6$t
    }
  }
  if (9) {
    10
  }
}
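A minimal Python sketch of evaluating such a causation tree depth-first: a child expectation is searched for only if its parent event was found, mirroring the nested-if form. The EBNode class and evaluate() function are hypothetical illustrations, not ASTRA's code:

class EBNode:
    def __init__(self, label, check, children=None):
        self.label = label              # e.g., "2", "4", "6a"
        self.check = check              # callable: True if the event was found
        self.children = children or []  # the "causes" edges

def evaluate(node, found, missing):
    # Depth-first walk, mirroring the nested-if form of the tree
    if node.check():
        found.append(node.label)
        for child in node.children:
            evaluate(child, found, missing)
    else:
        missing.append(node.label)      # children are not searched for

# Tree from this slide: 2 causes 4; 4 causes 5 and 9; 5 causes 6a,6b,6c; 9 causes 10
leaf = lambda lbl: EBNode(lbl, check=lambda: True)
tree = EBNode("2", lambda: True, [EBNode("4", lambda: True,
        [EBNode("5", lambda: True, [leaf("6a"), leaf("6b"), leaf("6c")]),
         EBNode("9", lambda: True, [leaf("10")])])])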

16

Outline

• Background

• Problem statement

• Solution approach

• Implementation — Automated SS7 Test Results Analyzer (ASTRA)

• Summary

17

Architecture of ASTRA

[Figure: dataflow] Test Data, Network Configuration Files, and Timing Files feed the Data Formatting stage, which produces the Actual Event Record and the Test Action Record (TAR). The Expected Behavior (EB) program passes through the Translator to the Analyzer, which also reads the Actual Event Record and produces the Parameter Observation List, the Matched Event Record, and the Event Lists: Hidden, Missing, Unexpected. (Legend distinguishes data from program components.)

Translator:
• Translate EBEL code into a Perl program
• Add time values (consult the TAR)
• Convert to depth-first search

Data Formatting:
• Adjust variations in format from different test data
• Synchronize all events
• Filter the database to remove irrelevant information
• Sort events in the database into chronological order
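A minimal Python sketch of the Data Formatting stage under stated assumptions (per-monitor capture files, a clock offset per monitor for synchronization, dictionary-shaped records; all names are illustrative, not ASTRA's):

def format_test_data(captures, clock_offsets, relevant_types):
    # captures: {monitor_name: [{"msg_type": ..., "time": ...}, ...]}
    events = []
    for monitor, records in captures.items():
        offset = clock_offsets.get(monitor, 0.0)    # synchronize all events
        for rec in records:
            if rec["msg_type"] not in relevant_types:
                continue                             # filter irrelevant information
            events.append(dict(rec, time=rec["time"] + offset, monitor=monitor))
    events.sort(key=lambda r: r["time"])             # chronological order
    return events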

18

Analyzer - matching operation

For each expected event (link, parameters, time window [t1,t2]) taken from the EB:

1. Was the link monitored?
   – NO: write the event to the Hidden Event List.
   – YES: search the Actual Event Record for the event within [t1,t2].
2. Event found?
   – NO: write the event to the Missing Event List.
   – YES: set its Match Flag to 1 and write it to the Matched Event Record.

Finally, find all messages with Match Flag = 0 and write them to the Unexpected Event List.
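The same matching loop as a minimal Python sketch (dictionary-shaped events and a monitored-links set are assumptions; an illustration, not ASTRA's implementation):

def analyze(expected, actual, monitored_links):
    hidden, missing, matched = [], [], []
    for a in actual:
        a["match_flag"] = 0
    for exp in expected:
        if exp["link"] not in monitored_links:
            hidden.append(exp)              # no monitor could have captured it
            continue
        hit = next((a for a in actual
                    if a["match_flag"] == 0
                    and a["link"] == exp["link"]
                    and a["params"] == exp["params"]
                    and exp["t1"] <= a["time"] <= exp["t2"]), None)
        if hit is None:
            missing.append(exp)
        else:
            hit["match_flag"] = 1           # consumed; cannot match twice
            matched.append(hit)
    unexpected = [a for a in actual if a["match_flag"] == 0]
    return hidden, missing, matched, unexpected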

20

Architecture of ASTRA

[Figure: dataflow] As before, Test Data, Network Configuration Files, and Timing Files feed Data Formatting (producing the Test Action Record and Actual Event Record); the Expected Behavior passes through the Translator; and the Analyzer produces the Parameter Observation List, the Matched Event Record, and the Hidden/Missing/Unexpected Event Lists. These outputs are collected in the Test Results Database, which drives Statistical Analysis & Report Generation. (Legend: Data, Program.)

Statistical Analysis & Report Generation:
• Calculate the numbers of expected, missing, hidden and unexpected events
• Calculate timer observations
• Calculate load measurements
• Declare failure/pass/inconclusiveness of the tests
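A minimal Python sketch of the tally-and-verdict step; the verdict rule shown (fail on any missing or unexpected event, inconclusive if only hidden events remain) is an assumption for illustration, not ASTRA's documented rule:

def summarize(matched, missing, hidden, unexpected):
    counts = {"matched": len(matched), "missing": len(missing),
              "hidden": len(hidden), "unexpected": len(unexpected)}
    if counts["missing"] or counts["unexpected"]:
        verdict = "fail"            # expected behavior violated
    elif counts["hidden"]:
        verdict = "inconclusive"    # some expected events were unobservable
    else:
        verdict = "pass"
    return counts, verdict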

21

Outline

• Background

• Problem statement

• Solution approach

• Implementation — Automated SS7 Test Results Analyzer (ASTRA)

• Summary

22

ASTRA

Productivity of ASTRA
• Size of the EB code: 5,909 lines
• Number of events in the collected test results: 29,196

EB code:            ~6,000 lines
Test data:          ~30,000 events
Produced database:  ~3,000 events

23

Summary

• Challenging problem
• Non-determinism (e.g., simultaneous link restoral)
  – If A restores before B, a different type of message will be sent than if B restores before A; if both options are not listed with an "OR" in the EB, the result is either an "unexpected" or a "missing" event
  – Solutions (?) (a sketch of the "OR" idea follows this list):
    • Cover all possible scenarios by automatically creating the EB directly from an SDL specification – state-explosion problem
    • Allow the EB program to consult the test results to make further predictions with an "IF" statement; this approach trusts the test results. We also found the need for a "NOT" statement.
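A minimal Python sketch of such an "OR" over alternative expected sequences (purely illustrative; EBEL as presented here does not include this construct). match_fn stands in for a matcher such as the analyze() sketch shown earlier:

def match_any(alternatives, actual, match_fn):
    # Try each alternative expected sequence; accept the first with no
    # missing events, avoiding spurious "missing"/"unexpected" reports
    # under non-determinism (e.g., simultaneous link restoral).
    for alt in alternatives:
        hidden, missing, matched, unexpected = match_fn(alt, actual)
        if not missing:
            return alt, matched
    return None, []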

• Asides:
  – Pay attention to monitors: the more the better – and they do miss messages!
  – Need automation of all steps of testing, not just results analysis
  – Found 3 implementation bugs in the STPs tested!