
INDUSTRIAL TRAINING REPORT

Automated SSI Testing Tool

undertaken at

ARICENT

Submitted by

Paritosh Agrawal

UE115066

Under the Guidance of

Ashutosh Aggarwal

Senior Technical Leader

ARICENT

Department of Electronics & Communication Engineering

UNIVERSITY INSTITUTE OF ENGINEERING & TECHNOLOGY

PANJAB UNIVERSITY, CHANDIGARH

May 2015


DECLARATION

I hereby declare that the project work entitled ‘Automated SSI Testing Tool’ is an authentic

record of my own work carried out at ARICENT as part of the requirements of Industrial Training for

the award of the degree of B.E. at University Institute of Engineering & Technology, Panjab

University, Chandigarh under the guidance of Ashutosh Aggarwal from 12th January, 2015 to 30th June, 2015.

Paritosh Agrawal

UE115066

Date: 20th May, 2015

Certified that the above statement made by the student is correct to the best of our knowledge

and belief.

Ashutosh Aggarwal

Senior Technical Leader


Copy of Provisional Industrial Training Certificate


Attendance Record

Attendance Record: Industrial Training (B.E.-E.C.E. 8th Semester)-May 2015

1. Name of the student: PARITOSH AGRAWAL

2. Name of the organization/company: ARICENT

3. Date of commencement of Training: 12th JANUARY, 2015

Month wise Attendance Record:

Month       Total working days    Number of presents    Percentage of    Signature of the
            in a month            in a month            Attendance       Industrial Guide

January     14                    14                    100%

February    20                    10                    50%

March       21                    21                    100%

April       22                    19                    86%

May         14                    14                    100%

Signature (Industrial Guide)

Name of the Industrial Guide ............... Ashutosh Aggarwal ...................

DESIGNATION ......................Senior Technical Leader ......................

Note: If the industry/organization has a provision for maintaining an attendance record, then a copy of the attendance record should be attached along with this form.

If the industrial guide does not have his/her own stamp, then the attendance record can be printed on the company's or organization's letterhead and should bear the stamp of the HR concerned.


Acknowledgement

I, Paritosh Agrawal, student of "University Institute of Engineering & Technology

(UIET), Panjab University, Chandigarh", am highly thankful to ARICENT for the

confidence bestowed upon me.

At this juncture, I feel deeply honored in expressing my sincere thanks to my Senior

Technical Leader Ashutosh Aggarwal, Technical Leader Pani Raj K.S. and

Engineering Project Manager Sachindra Kumar Shukla for making all the resources

available at right time and providing valuable insights leading to the successful

ramping up process on my project.

I would like to express my gratitude to my Senior Engineering Project Manager Pooja

Kapoor, for providing me a great opportunity to work on this project. She gave me a

fabulous platform to start with.

I also want to show my deep sense of thanks to my Principal System Engineer

Jayant Kumar Bhardwaj, for his constant technical guidance and encouragement. It

has been of great help in carrying out my present work.

I would also like to extend my warm gratitude to my mentor and Software Engineer

Ashish Vijay Gulati, for encouraging me and providing me a healthy and competitive

environment to work and for his critical and valuable advice without which it would

not have been possible.

Paritosh Agrawal

UE115066

Electronics & Communications Engineering


ABSTRACT

In today's fast-moving world, it is a challenge for any company to

continuously maintain and improve the quality and efficiency of software systems

development. In many software projects, testing is neglected because of time or cost

constraints. This leads to a lack of product quality, followed by customer

dissatisfaction and ultimately to increased overall quality costs.

Test automation can improve the development process of a software

product in many cases. The automation of tests is initially associated with increased

effort, but the related benefits will quickly pay off.

Automated tests can run fast and frequently, which is cost-effective for

software products with a long maintenance life. When testing in an agile

environment, the ability to quickly react to ever-changing software systems and

requirements is necessary. New test cases are generated continuously and can be

added to existing automation in parallel to the development of the software itself.

The main goal in software development processes is a

timely release. Automated tests run fast and frequently, due to reused modules within

different tests. Automated regression tests, which ensure continued system stability and functionality after changes to the software are made, lead to shorter development cycles combined with better-quality software; thus the benefits of automated testing quickly outgain the initial costs.


Table of Contents

Chapter 1: COMPANY PROFILE ..................................................................................................... 4

Company Highlights ...................................................................................................................... 4

History .......................................................................................................................................... 4

Chapter 2: INTRODUCTION ............................................................................................................ 8

VALGRIND .................................................................................................................................. 8

TCL/TK ...................................................................................................................................... 11

EXPECT ..................................................................................................................................... 12

INTRODUCTION TO SOFTWARE AUTOMATION ............................................................................ 13

SPIRENT TEST CENTER ............................................................................................................ 15

Chapter 3: Journey So Far ............................................................................................................... 18

Automation of Spirent ................................................................................................................. 18

Project Undertaken ...................................................................................................................... 20

Activities in January 2015 ......................................................................................... 20

Activities in February 2015 ....................................................................................................... 22

Activities in February-End 2015 ................................................................................................ 24

Activities In March 2015 .......................................................................................................... 25

Activities in April 2015 ............................................................................................................. 28

Activities in End-April 2015 ...................................................................................................... 34

Chapter 4: The Actual Project .......................................................................................................... 39

Activities in April-End to Present ................................................................................................ 40

Tool So far: ................................................................................................................................. 41

Runfile ..................................................................................................................................... 43

Configuration.txt...................................................................................................................... 44

Testcases.txt ............................................................................................................................ 45

<test_case_NAME>.txt............................................................................................................. 45

Spirent_<#>.tcl ........................................................................................................................ 46

Results_On_<timestamp>.csv .................................................................................................. 49

<timestamp>/logfile ................................................................................................................ 50

Chapter 5: Outcomes from the Training ........................................................................................... 51

Chapter 6: References ...................................................................................................................... 53


LIST OF FIGURES

FIGURE 1: ARICENT .............................................................................................................................................. 5

FIGURE 2: COMPREHENSIVE TELECOM DOMAIN E 1................................................................................................ 6

FIGURE 3: MILESTONES AND INDUSTRY FIRSTS ......................................................................................................... 7

FIGURE 4: CONNECTION TO CHASSIS AND PORT RESERVATION ............................................................................... 16

FIGURE 5: EDITING OF A STREAM BLOCK ................................................................................................................ 17

FIGURE 6: TRANSFERRING A PACKET FROM THE TX PORT ..................................................................................... 17

FIGURE 7: SPIRENT TESTCENTER AUTOMATION AND THE SPIRENT TESTCENTER ENVIRONMENT .......................... 19

FIGURE 8: EXAMPLE PROGRAM A.C, WITH MEMORY ERROR AND A MEMORY LEAK .................................................. 22

FIGURE 9: CONNECTION TO CHASSIS AND PORT RESERVATION ............................................................................. 26

FIGURE 10: EDITING OF A STREAM BLOCK ............................................................................................................ 27

FIGURE 11: TRANSFERRING A PACKET FROM THE TX PORT ................................................................................... 28

FIGURE 12: VIEW OF THE SPIRENT SCRIPT EDITOR ................................................................................................ 31

FIGURE 13: VIEW OF THE WIRESHARK .................................................................................................................. 33

FIGURE 14: BASIC ETHERNET PACKET STRUCTURE AND ASSOCIATED ETHERTYPES ............................................. 34

FIGURE 15: A SCHEMATIC OF MPLS NETWORK .................................................................................................... 36

FIGURE 16: A SCHEMATIC DIAGRAM OF A VPLS NETWORK .................................................................................... 37

FIGURE 17: IP ROUTING ........................................................................................................................................ 38

FIGURE 18: AN EXAMPLE SNAPSHOT TAKEN FROM THE MAIN CONSOLE TO SHOW IP ROUTING AND ETHERNET SWITCHING ................................................................................................... 39

FIGURE 19: SCREENSHOT OF THE EXECUTION OF THE EXPECT SCRIPT RUNFILE .................................................... 41

FIGURE 20: DIRECTORY STRUCTURE ON LINUX MACHINE BEFORE THE TEST TOOL RUN ...................................... 42

FIGURE 21: DIRECTORY STRUCTURE ON WINDOWS MACHINE AFTER THE TEST TOOL RUN .................................... 42

FIGURE 22: VIEW OF THE MAIN EXPECT SCRIPT FILE RUNFILE ................................................................................ 43

FIGURE 23: VIEW OF THE CONFIGURATION.TXT FILE ................................................................................................ 44

FIGURE 24: VIEW OF THE TESTCASES.TXT FILE ......................................................................................................... 45

FIGURE 25: VIEW OF THE TEST_CASE_AC_VLAN_UNICAST.TXT FILE ......................................................................... 46

FIGURE 26: VIEW OF THE SPIRENT_1.TCL FILE ON WINDOWS PLATFORM ................................................................ 47

FIGURE 27: THE SNAPSHOT OF THE AUTOMATED TESTING TOOL RUN .................................................................... 48

FIGURE 28: DIRECTORY STRUCTURE ON LINUX PLATFORM AFTER THE TESTSUITE RUN ......................................... 48

FIGURE 29: THE VIEW OF THE RESULT FILE RESULTS_ON_<TIMESTAMP>.CSV ........................................................ 49

FIGURE 30: VIEW OF THE LOGFILE ........................................................................................................................... 50

FIGURE 31: VIEW OF THE VARIOUS MACHINES OVER THE REMOTE DESKTOP CONNECTION. ................................... 51


LIST OF ACRONYMS

QoS Quality of Service

IP Internet Protocol

SSI Sub-system Integration

MPLS Multiprotocol Label Switching

VPLS Virtual Private LAN Service

PE Provider Edge

GUI Graphical User Interface

SSH Secure Shell

TCL Tool Command Language

SCP Secure Copy


Chapter 1: COMPANY PROFILE

Aricent is a global innovation, technology and services company focused exclusively

on communications. It is a strategic supplier to the world’s leading application, infrastructure

and service providers, with operations in 19 countries worldwide.

Company Highlights

One of the largest privately-held companies in Silicon Valley;

550+ customers;

8,000+ employees, 33 offices worldwide;

Product portfolio of more than 125 licensable products;

Investors include KKR, Sequoia Capital, The Family Office and the Canada Pension Plan Investment Board.

History

In 2006, the founders of Aricent – and investors including KKR and Sequoia Capital

– recognized the fundamental transformation happening in the communications landscape,

including the convergence of innovation and technology. In response, nearly $1B was

invested to purchase select assets representing market leadership in communications

technology and innovation to form Aricent.

Aricent was purposely constructed to work closely with CSPs and equipment &

device manufacturers to solve the emerging and complex innovation, SI and R&D challenges

emerging in communications. The Company provides highly unique engagement models –

such as Experience Engineering for CSPs – that are highly complementary to traditional

outsourcing and consulting services.

Aricent Today

The communications industry is experiencing unprecedented change. For consumers,

user experience now trumps technology and price as the key driver of purchase and adoption

of new products and services. Communications Service Providers (CSPs) are facing

aggressive new competition from device makers and powerful Internet companies all looking

to capitalize on the mobile Internet. And for equipment manufacturers, cost pressures and the

need to invest in new technologies are driving the need for new ways to improve efficiency

and scale, while achieving a significantly lower cost structure.


These market forces are creating a fundamental new set of innovation and system

integration (SI) challenges for CSPs. These include creating a subscriber device competitive

with the iPhone, delivering rich converged services, connecting a wide range of new

electronics such as digital picture frames and navigation devices, and building network APIs

and SDKs to best monetize cloud-based services.

Equipment makers now must strike a delicate research and development (R&D)

balance between investments in existing products and those required to ensure competitive

readiness in new growth areas like metro Ethernet and Long Term Evolution (LTE).

Aricent is the first company purposely constructed to work closely with CSPs and

equipment manufacturers to solve the emerging and complex innovation, SI and R&D

challenges these organizations now face, in what some analysts estimate to be a $9B market.

Complementary to traditional outsourcing and consulting, Aricent is a fundamentally new

breed of strategic supplier architected to co-create, together with our customers, the world’s

most innovative communications products and services.

Figure 1: ARICENT

Aricent offers Telecom Equipment Manufacturers (TEMs) turnkey solutions for the

entire product lifecycle, from ideation to commercialization to sustenance. These solutions,

enabled by feature-rich and field-proven protocol stacks and software frameworks, are

supported by innovative business models that help TEMs effectively address the technical

and business challenges being faced by them today.

The changing dynamics of the telecommunications industry have network equipment


manufacturers facing new challenges such as:

Consolidation among service providers has resulted in fewer customers with

significantly higher bargaining power;

The emergence of low-priced products from Asian countries has resulted in

significant pricing pressures and erosion in profit margins;

Increased focus on emerging telecommunication domains such as LTE, WiMAX,

Femtocells, etc., has led TEMs down the merger route to gain market traction in these

technologies;

The high cost of new product development has been further exacerbated by increasing

complexity in technology;

Significant investment is required to sustain existing products and the associated

revenue streams.

Figure 2: Comprehensive Telecom Domain E 1

Aricent helps equipment manufacturers address these challenges with innovative

solutions that help to balance investments between core and non-core products, surgically

time new product R&D investments with customer spend cycles, augment in-house R&D

teams with external domain expertise, and de-risk new product introduction while ensuring a

timely market entry. Aricent’s unique offering for TEMs includes comprehensive product

lifecycle services complemented by industry leading protocol stacks and software


frameworks. Aricent leverages its global innovation arm, frog design (one of the world’s foremost creative consulting firms for over 40 years), together with its global engineering

teams to create innovative and highly differentiated subscriber experiences across multiple

domains of the telecommunications ecosystem.

Aricent’s technical expertise spans across all domains in the communications

industry. With 8000+ domain experts in technologies such as LTE, WiMAX,

Routing/Switching, 2G/3G, etc., Aricent can help TEMs address all their technology needs.

Aricent specializes in multi-domain solutions such as fixed-mobile convergence, IP-based

mobile backhaul, triple and quad play services.

Aricent is a recognized world leader in innovation and Telecom R&D services with

over 20 years and thousands of man hours of highly successful customer engagements.

Conducting pioneering work on emerging technologies, Aricent has helped deliver some of

the first commercial solutions in advanced domains such as Femtocells, WiMAX, LTE and

in-flight broadband. Additionally, as a strategic partner to most Tier 1 equipment

manufacturers, Aricent has contributed extensively to a wide range of deployed products

across multiple generations of wireless and fixed technologies, on both access and core

networks.

Figure 3: Milestones and Industry Firsts


Chapter 2: INTRODUCTION

VALGRIND

Valgrind is an instrumentation framework for building dynamic analysis tools.

There are Valgrind tools that can automatically detect many memory management and

threading bugs, and profile your programs in detail. You can

also use Valgrind to build new tools.

The Valgrind distribution currently includes six

production quality tools: a memory error detector, two thread

error detectors, a cache and branch prediction profiler, a

callgraph generating cache and branch prediction profiler, and

a heap profiler. It also includes three experimental tools: a

stack/global array overrun detector, a second heap profiler that examines how heap blocks are

used, and a SimPoint basic block vector generator. It runs on the following platforms:

X86/Linux, AMD64/Linux, ARM/Linux, ARM64/Linux, PPC32/Linux, PPC64/Linux,

PPC64BE/Linux, S390X/Linux, MIPS32/Linux, MIPS64/Linux, ARM/Android (2.3.x and

later), X86/Android (4.0 and later), MIPS32/Android, X86/Darwin and AMD64/Darwin

(Mac OS X 10.9, with limited support for 10.8).

Valgrind is Open Source / Free Software, and is freely available under the GNU

General Public License, version 2.

The Valgrind tool suite provides a number of debugging and profiling tools that

help you make your programs faster and more correct. The most popular of these tools is

called Memcheck. It can detect many memory related errors that are common in C and C++

programs and that can lead to crashes and unpredictable behaviour.

Running your program under Memcheck:

If you normally run your program like this (myprog and its arguments stand for any program, as in the Valgrind Quick Start):

    myprog arg1 arg2

use this command line instead:

    valgrind --leak-check=yes myprog arg1 arg2


Memcheck is the default tool. The --leak-check option turns on the detailed

memory leak detector.

Your program will run much slower (eg. 20 to 30 times) than normal, and use a lot

more memory. Memcheck will issue messages about memory errors and leaks that it detects.

Interpreting Memcheck's output:

Here's an example C program, in a file called a.c, with a memory error and a

memory leak.
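A minimal sketch of such a program, modelled on the example in the Valgrind Quick Start guide (the line numbers referenced in the notes below assume this layout):

    #include <stdlib.h>

    void f(void)
    {
        int *x = malloc(10 * sizeof(int));
        x[10] = 0;                 /* problem 1: heap block overrun        */
    }                              /* problem 2: memory leak, x not freed  */

    int main(void)
    {
        f();
        return 0;
    }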

Most error messages look like the following, which describes problem 1, the heap block

overrun:
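A representative message of this kind (the process ID 19182, the code addresses and the malloc() location are the values the notes below refer to; exact values vary from run to run) looks roughly like:

    ==19182== Invalid write of size 4
    ==19182==    at 0x804838F: f (example.c:6)
    ==19182==    by 0x80483AB: main (example.c:11)
    ==19182==  Address 0x1BA45050 is 0 bytes after a block of size 40 alloc'd
    ==19182==    at 0x1B8FF5CD: malloc (vg_replace_malloc.c:130)
    ==19182==    by 0x8048385: f (example.c:5)
    ==19182==    by 0x80483AB: main (example.c:11)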

Things to notice:

There is a lot of information in each error message; read it carefully.

The 19182 is the process ID; it's usually unimportant.

The first line ("Invalid write...") tells you what kind of error it is. Here, the program

wrote to some memory it should not have due to a heap block overrun.


Below the first line is a stack trace telling you where the problem occurred. Stack

traces can get quite large, and be confusing, especially if you are using the C++ STL.

Reading them from the bottom up can help. If the stack trace is not big enough, use

the --num-callers option to make it bigger.

The code addresses (eg. 0x804838F) are usually unimportant, but occasionally crucial

for tracking down weirder bugs.

Some error messages have a second component which describes the memory address

involved. This one shows that the written memory is just past the end of a block

allocated with malloc() on line 5 of example.c.

It's worth fixing errors in the order they are reported, as later errors can be caused

by earlier errors. Failing to do this is a common cause of difficulty with Memcheck.

Memory leak messages look like this:
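For the same example program, a leak report (again with illustrative values) looks roughly like:

    ==19182== 40 bytes in 1 blocks are definitely lost in loss record 1 of 1
    ==19182==    at 0x1B8FF5CD: malloc (vg_replace_malloc.c:130)
    ==19182==    by 0x8048385: f (a.c:5)
    ==19182==    by 0x80483AB: main (a.c:11)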

The stack trace tells you where the leaked memory was allocated. Memcheck

cannot tell you why the memory leaked, unfortunately. (Ignore the "vg_replace_malloc.c",

that's an implementation detail.)

There are several kinds of leaks; the two most important categories are:

"definitely lost": your program is leaking memory fix it!

"probably lost": your program is leaking memory, unless you're doing funny things with

pointers (such as moving them to point to the middle of a heap block).

Memcheck also reports uses of uninitialised values, most commonly with the

message "Conditional jump or move depends on uninitialised value(s)". It can be difficult to

determine the root cause of these errors. Try using the --track-origins=yes option to get extra

information. This makes Memcheck run slower, but the extra information you get often saves

a lot of time figuring out where the uninitialised values are coming from.

If you don't understand an error message, please consult Explanation of error

messages from Memcheck in the Valgrind User Manual which has examples of all the error

messages Memcheck produces.


TCL/TK

tclsh — Simple shell containing Tcl interpreter

SYNOPSIS

tclsh ?encoding name? ?fileName arg arg ...?

DESCRIPTION

Tclsh is a shell-like application that reads Tcl commands from its standard input or

from a file and evaluates them. If invoked with no arguments then it runs interactively,

reading Tcl commands from standard input and printing command results and error messages

to standard output. It runs until the exit command is invoked or until it reaches end-of-file on

its standard input. If there exists a file .tclshrc (or tclshrc.tcl on the Windows platforms) in the

home directory of the user, interactive tclsh evaluates the file as a Tcl script just before

reading the first command from standard input.

SCRIPT FILES

If tclsh is invoked with arguments then the first few arguments specify the name

of a script file, and, optionally, the encoding of the text data stored in that script file. Any

additional arguments are made available to the script as variables (see below). Instead of

reading commands from standard input tclsh will read Tcl commands from the named file;

tclsh will exit when it reaches the end of the file. The end of the file may be marked either by

the physical end of the medium, or by the character, “\032” (“\u001a”, control-Z).

If this character is present in the file, the tclsh application will read text up to but

not including the character. An application that requires this character in the file may safely

encode it as “\032”, “\x1a”, or “\u001a”; or may generate it by use of commands such as

format or binary. There is no automatic evaluation of .tclshrc when the name of a script file is

presented on the tclsh command line, but the script file can always source it if desired.

If you create a Tcl script in a file whose first line is

    #!/usr/local/bin/tclsh

then you can invoke the script file directly from your shell if you mark the file as

executable. This assumes that tclsh has been installed in the default location in /usr/local/bin;

if it is installed somewhere else then you will have to modify the above line to match. Many


UNIX systems do not allow the #! line to exceed about 30 characters in length, so be sure that

the tclsh executable can be accessed with a short file name.

An even better approach is to start your script files with the following three lines (as given in the tclsh manual page):

    #!/bin/sh
    # the next line restarts using tclsh \
    exec tclsh "$0" ${1+"$@"}

This approach has three advantages over the approach in the previous paragraph.

First, the location of the tclsh binary does not have to be hardwired into the script: it can be

anywhere in your shell search path. Second, it gets around the 30-character file name limit in

the previous approach. Third, this approach will work even if tclsh is itself a shell script (this

is done on some systems in order to handle multiple architectures or operating systems: the

tclsh script selects one of several binaries to run). These three lines cause both sh and tclsh to process the script, but the exec is only executed by sh. sh processes the script first; it treats the second line as a comment and executes the third line. The exec statement causes the shell to stop processing and instead to start up tclsh to reprocess the entire script. When tclsh starts

up, it treats all three lines as comments, since the backslash at the end of the second line

causes the third line to be treated as part of the comment on the second line.

You should note that it is also common practice to install tclsh with its version

number as part of the name. This has the advantage of allowing multiple versions of Tcl to

exist on the same system at once, but also the disadvantage of making it harder to write

scripts that start up uniformly across different versions of Tcl.

EXPECT

SYNOPSIS

expect [ -dDinN ] [ -c cmds ] [ [ -[f|b] ] cmdfile ] [ args ]

INTRODUCTION

Expect is a program that "talks" to other interactive programs according to a

script. Following the script, Expect knows what can be expected from a program and what

the correct response should be. An interpreted language provides branching and high-level

control structures to direct the dialogue. In addition, the user can take control and interact

directly when desired, afterward returning control to the script.


Expectk is a mixture of Expect and Tk. It behaves just like Expect and Tk's wish.

Expect can also be used directly in C or C++ (that is, without Tcl). See libexpect(3).

The name "Expect" comes from the idea of send/expect sequences popularized by

uucp, kermit and other modem control programs. However unlike uucp, Expect is generalized

so that it can be run as a user-level command with any program and task in mind. Expect can

actually talk to several programs at the same time.

For example, here are some things Expect can do:

Cause your computer to dial you back, so that you can login without paying for the call.

Start a game (e.g., rogue) and if the optimal configuration doesn't appear, restart it (again

and again) until it does, then hand over control to you.

Run fsck, and in response to its questions, answer "yes", "no" or give control back to you,

based on predetermined criteria.

Connect to another network or BBS (e.g., MCI Mail, CompuServe) and automatically

retrieve your mail so that it appears as if it was originally sent to your local system.

Carry environment variables, current directory, or any kind of information across rlogin,

telnet, tip, su, chgrp, etc.

There are a variety of reasons why the shell cannot perform these tasks. (Try,

you'll see.) All are possible with Expect.

In general, Expect is useful for running any program which requires interaction

between the program and the user. All that is necessary is that the interaction can be

characterized programmatically. Expect can also give the user back control (without halting

the program being controlled) if desired. Similarly, the user can return control to the script at

any time.
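As a concrete illustration of such a send/expect dialogue, here is a minimal sketch only (the host name, prompt strings and credentials are placeholders, not taken from the project):

    #!/usr/bin/expect -f
    # Minimal sketch: log in to a device over telnet and run one command.
    # "lab-device", the prompt strings and the credentials are placeholders.
    spawn telnet lab-device
    expect "login:"
    send "admin\r"
    expect "Password:"
    send "admin123\r"
    expect "#"
    send "show version\r"
    expect "#"
    send "exit\r"
    expect eof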

INTRODUCTION TO SOFTWARE AUTOMATION

Why automate Testing?

In today's fast-moving world, it is a challenge for any company to continuously

maintain and improve the quality and efficiency of software systems development. In

many software projects, testing is neglected because of time or cost constraints. This leads to

a lack of product quality, followed by customer dissatisfaction and ultimately to increased

overall quality costs.


The main reasons for these added costs are primarily:

poor test strategy

underestimated effort of test case generation

delay in testing

subsequent test maintenance

Test automation can improve the development process of a software product in

many cases. The automation of tests is initially associated with increased effort, but the

related benefits will quickly pay off.

Automated tests can run fast and frequently, which is cost-effective for software

products with a long maintenance life. When testing in an agile environment, the ability to

quickly react to ever-changing software systems and requirements is necessary. New test

cases are generated continuously and can be added to existing automation in parallel to the

development of the software itself.

In both manual and automated testing environments test cases need to be modified

for extended periods of time as the software project progresses. It is important to be aware

that complete coverage of all tests using test automation is unrealistic. When deciding what

tests to automate first, their value vs. the effort to create them needs to be considered. Test

cases with high value and low effort should be automated first. Subsequently, test cases with frequent use, changes, and past errors, as well as test cases with low to moderate effort in setting up the test environment and developing the automation project, are best suited for

automation.

Optimization of Speed, Efficiency, Quality and the Decrease of Costs

The main goal in software development processes is a timely release. Automated

tests run fast and frequently, due to reused modules within different tests. Automated

regression tests, which ensure continued system stability and functionality after changes to the software are made, lead to shorter development cycles combined with better-quality software; thus the benefits of automated testing quickly outgain the initial costs.

Advance a Tester's Motivation and Efficiency

Manual testing can be mundane, error-prone and therefore become

exasperating. Test automation alleviates testers' frustrations and allows the test execution


without user interaction while guaranteeing repeatability and accuracy. Instead testers can

now concentrate on more difficult test scenarios.

Increase of Test Coverage

Sufficient test coverage of software projects is often achieved only with great

effort. Frequent repetition of the same or similar test cases is laborious and time consuming to

perform manually. Some examples are:

Regression test after debugging or further development of software

Testing of software on different platforms or with different configurations

Data-driven testing (creation of tests using the same actions but with many different

inputs)

Test automation allows performing different types of testing efficiently

and effectively.

SPIRENT TEST CENTER

Convergence is creating a new generation of integrated network devices and

services that are much more complex than ever before. This increased complexity, combined

with a scarcity of testing skills and architectural shortcomings in current test systems, is

hurting the ability of manufacturers to ship products on time at acceptable quality levels.

And it is slowing service providers’ ability to deploy networks that get Quality of Experience

(QoE) right the first time. Spirent TestCenter is an end-to-end testing solution that delivers

high performance with deterministic results. Service providers, NEMs and enterprises can use

it to test, measure and assure their networks and deploy services with confidence.

Applications of the software:

Evaluate the stability of switches, routers and edge devices under static or dynamic load

conditions for minutes, hours and days

Characterize and troubleshoot functional behavior (including negative testing) of new

network functionality in the development lab or before deployment into the operational

network


Figure 4: Connection to chassis and Port Reservation

Evaluate key performance parameters such as per-flow QoS, fail-over time or Access Control List (ACL) filtering performance

Perform comparative analysis of devices or services with deterministic traffic during

product development cycles or vendor comparisons

When used in conjunction with any of Spirent TestCenter’s additional protocol packages

the system can emulate complex network topologies and traffic conditions


Figure 5: Editing of a Stream Block

Figure 6: Transferring a packet from the Tx Port


Chapter 3: Journey So Far

Automation of Spirent

In the automation of software, we make use of a scripting language. In Spirent automation, I am using Tcl (Tool Command Language). The name Tcl is derived from "Tool Command Language" and is pronounced "tickle". Tcl is a radically simple open-source

interpreted programming language that provides common facilities such as variables,

procedures, and control structures as well as many useful features that are not found in any

other major language. Tcl runs on almost all modern operating systems such as Unix,

Macintosh, and Windows (including Windows Mobile).

While Tcl is flexible enough to be used in almost any application imaginable, it does

excel in a few key areas, including: automated interaction with external programs, embedding

as a library into application programs, language design, and general scripting.

In this project, we have to make configurations on ISS, and according to those

configurations, we make the use of Spirent. So, on whole we have to configure ISS and

Spirent with respect to each other. Now, to configure, I am using the Expect language, which is an extension of Tcl and is a program to automate interactions with programs that expose a text

terminal interface. It is used to automate control of interactive applications such

as telnet, ftp, passwd, fsck, rlogin, tip, ssh, and others. Expect uses pseudo terminals (Unix)

or emulates a console (Windows), starts the target program, and then communicates with it,

just as a human would. After configuring ISS, we use Tcl to configure the Spirent. There are a lot of configurations, so we make generic test cases so that users can use them easily.

Spirent TestCenter Automation is an automated software system for network

performance analysis. You use Spirent TestCenter software together with a network hardware

configuration that includes Spirent TestCenter hardware and your network device(s). The

Spirent TestCenter software/hardware combination generates test traffic to measure the

performance of your network device.

Spirent TestCenter Automation provides several programming languages (APIs) that

you use to create and run tests.


Spirent Communications distributes Spirent TestCenter Automation software as part

of the Spirent TestCenter software product. The Spirent TestCenter Automation capabilities

are available in the set of Spirent TestCenter software packages that support network

protocols and RFC test methodologies. There are two classes of Spirent TestCenter packages:

• The Base packages provide software for testing network protocols.

• The Test packages provide software for testing based on well-defined test

methodologies that are either RFC-based standards or developed by Spirent in working with

its customers.

Figure 7: Spirent TestCenter Automation and the Spirent TestCenter Environment

To use the Spirent TestCenter packages, you must obtain the appropriate license(s).

For information about licenses, refer to the Getting Started with Spirent TestCenter

document. The Spirent TestCenter software includes a graphical user interface (GUI) for the

test system. The Spirent TestCenter GUI is separate from the Spirent TestCenter Automation


software. The GUI allows you to generate tests, run the tests, and review the results without

writing any scripts or programs. These GUI interfaces are both easy to use and powerful, but

many users require the ability to develop unique tests or tests that will run unattended for

hours or days. Spirent TestCenter Automation also supports easy customization of tests. In

general, the automation interface and the GUI provide the same capabilities.

Project Undertaken

As described earlier, we have a project on SSI automation of Spirent. The training

has been divided into various phases based on the project duration. Each phase encompasses

a number of activities scheduled within it and is described in detail. This section also includes

the detailed description about the implementation of the proposed activity in the project.

Activities in January 2015

The internship started with a dedicated training phase related to the topics which are a

prerequisite to the project at hand. The training involved studying, researching and

strengthening of below mentioned concepts:

Overview of UNIX and Networking

I studied various aspects of the UNIX operating system such as shell scripting, shell variables, and various other basic commands. Additionally, I accumulated knowledge about basic concepts of computer networking such as networks, the working of various protocols, etc.

What Is "The Shell"?

Simply put, the shell is a program that takes commands from the keyboard and

gives them to the operating system to perform. In the old days, it was the only user interface

available on a Unix-like system such as Linux. Nowadays, we have graphical user interfaces

(GUIs) in addition to command line interfaces (CLIs) such as the shell.

On most Linux systems a program called bash (which stands for Bourne Again

SHell, an enhanced version of the original Unix shell program, sh, written by Steve Bourne)

acts as the shell program. Besides bash, there are other shell programs that can be installed in

a Linux system. These include: ksh, tcsh and zsh.


What's A "Terminal?"

It's a program called a terminal emulator. This is a program that opens a window

and lets you interact with the shell. There are a bunch of different terminal emulators you can

use. Most Linux distributions supply several, such as: gnome-terminal, konsole, xterm, rxvt,

kvt, nxterm, and eterm.

Programming with Tcl and Expect

I studied the basic concepts of Tcl/Tk and Expect, which included data types, lists, control statements, input and output, string handling, and file handling.

I also took an overview of the testing tools such as TcpDump, tshark, iperf, ipftop, top, iptraf, grep, awk, ssh, and the wireshark module of Python, which was a very important part of

creating automation scripts.
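A small, purely illustrative Tcl sketch of the concepts mentioned above (variables, lists, a procedure, a loop and file handling; not project code):

    # sum a list of numbers with a procedure and a loop
    proc sum {numbers} {
        set total 0
        foreach n $numbers {
            incr total $n
        }
        return $total
    }

    set values {10 20 30}
    puts "sum = [sum $values]"

    # simple file handling: write the result to a file
    set fh [open "out.txt" w]
    puts $fh "sum = [sum $values]"
    close $fh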

Tcl (Tool Command Language) is a radically simple open-source interpreted

programming language that provides common facilities such as variables, procedures, and

control structures as well as many useful features that are not found in any other major

language. Tcl runs on almost all modern operating systems such as Unix, Macintosh, and

Windows (including Windows Mobile). The Expect language is an extension of Tcl and is a

program to automate interactions with programs that expose a text terminal interface. It is

used to automate control of interactive applications such

as telnet, ftp, passwd, fsck, rlogin, tip, ssh, and others. Expect uses pseudo terminals (Unix)

or emulates a console (Windows), starts the target program, and then communicates with it,

just as a human would.

Hands-on task Assignment

I got familiar with the Aricent coding guidelines and implemented various programs

that included UNIX basic commands, data structure programs, file handling and various other

concepts of Tcl.

Windows Command Line:

The command line lets you communicate directly with your computer and instruct

it to perform various tasks. For this you have to use specific commands. The commands are

not necessarily intuitive, so they have to be learned, just like words in a language.


Fortunately, there are graphical user interfaces (GUIs) replacing most procedures that

formerly required using the command line.

However, sometimes using the command line is quicker or even the only way to

access certain information. Thus knowing how to use the command line can be extremely

valuable. And that’s where this Windows command guide comes in.

Activities in February 2015

VALGRIND

Valgrind is Open Source / Free Software, and is freely available under the GNU

General Public License, version 2.

The Valgrind tool suite provides a number of debugging and profiling tools that

help you make your programs faster and more correct. The most popular of these tools is

called Memcheck. It can detect many memory related errors that are common in C and C++

programs and that can lead to crashes and unpredictable behaviour.

Figure 8: Example program a.c, with memory error and a memory leak

Above is an example C program, in a file called a.c, with a memory error and a

memory leak.

Most error messages look like the following, which describes problem 1, the heap

block overrun:


Things to notice:

There is a lot of information in each error message; read it carefully.

The 19182 is the process ID; it’s usually unimportant.

The first line ("Invalid write...") tells you what kind of error it is. Here, the program

wrote to some memory it should not have due to a heap block overrun.

Below the first line is a stack trace telling you where the problem occurred. Stack

traces can get quite large, and be confusing, especially if you are using the C++ STL.

Reading them from the bottom up can help. If the stack trace is not big enough, use

the --num-callers option to make it bigger.

The code addresses (eg. 0x804838F) are usually unimportant, but occasionally crucial

for tracking down weirder bugs.

Some error messages have a second component which describes the memory address

involved. This one shows that the written memory is just past the end of a block

allocated with malloc() on line 5 of example.c.

It's worth fixing errors in the order they are reported, as later errors can be caused

by earlier errors. Failing to do this is a common cause of difficulty with Memcheck.

Memory leak messages look like this:

The stack trace tells you where the leaked memory was allocated. Memcheck

cannot tell you why the memory leaked, unfortunately. (Ignore the "vg_replace_malloc.c",

that's an implementation detail.)

There are several kinds of leaks; the two most important categories are:

"definitely lost": your program is leaking memory fix it!


"probably lost": your program is leaking memory, unless you're doing funny things with

pointers (such as moving them to point to the middle of a heap block).

Memcheck also reports uses of uninitialised values, most commonly with the

message "Conditional jump or move depends on uninitialised value(s)". It can be difficult to

determine the root cause of these errors. Try using the --track-origins=yes option to get extra

information. This makes Memcheck run slower, but the extra information you get often saves

a lot of time figuring out where the uninitialised values are coming from.

If you don't understand an error message, please consult Explanation of error

messages from Memcheck in the Valgrind User Manual which has examples of all the error

messages Memcheck produces.

Activities in February-End 2015

TCL/TK

tclsh — Simple shell containing Tcl interpreter

SYNOPSIS

tclsh ?encoding name? ?fileName arg arg ...?

DESCRIPTION

Tclsh is a shell-like application that reads Tcl commands from its standard input or

from a file and evaluates them. If invoked with no arguments then it runs interactively,

reading Tcl commands from standard input and printing command results and error messages

to standard output. It runs until the exit command is invoked or until it reaches end-of-file on

its standard input. If there exists a file .tclshrc (or tclshrc.tcl on the Windows platforms) in the

home directory of the user, interactive tclsh evaluates the file as a Tcl script just before

reading the first command from standard input.

SCRIPT FILES

If tclsh is invoked with arguments then the first few arguments specify the name

of a script file, and, optionally, the encoding of the text data stored in that script file. Any

additional arguments are made available to the script as variables (see below). Instead of

reading commands from standard input tclsh will read Tcl commands from the named file;

tclsh will exit when it reaches the end of the file. The end of the file may be marked either by

the physical end of the medium, or by the character, “\032” (“\u001a”, control-Z).


If this character is present in the file, the tclsh application will read text up to but not

including the character. An application that requires this character in the file may safely

encode it as “\032”, “\x1a”, or “\u001a”; or may generate it by use of commands such as

format or binary. There is no automatic evaluation of .tclshrc when the name of a script file is

presented on the tclsh command line, but the script file can always source it if desired.

Activities In March 2015

EXPECT

SYNOPSIS

expect [ -dDinN ] [ -c cmds ] [ [ -[f|b] ] cmdfile ] [ args ]

INTRODUCTION

Expect is a program that "talks" to other interactive programs according to a

script. Following the script, Expect knows what can be expected from a program and what

the correct response should be. An interpreted language provides branching and high-level

control structures to direct the dialogue. In addition, the user can take control and interact

directly when desired, afterward returning control to the script.

Expectk is a mixture of Expect and Tk. It behaves just like Expect and Tk's wish.

Expect can also be used directly in C or C++ (that is, without Tcl). See libexpect(3).

The name "Expect" comes from the idea of send/expect sequences popularized by

uucp, kermit and other modem control programs. However unlike uucp, Expect is generalized

so that it can be run as a user-level command with any program and task in mind. Expect can

actually talk to several programs at the same time.

For example, here are some things Expect can do:

Cause your computer to dial you back, so that you can login without paying for the call.

Start a game (e.g., rogue) and if the optimal configuration doesn't appear, restart it (again

and again) until it does, then hand over control to you.

Run fsck, and in response to its questions, answer "yes", "no" or give control back to you,

based on predetermined criteria.

Connect to another network or BBS (e.g., MCI Mail, CompuServe) and automatically

retrieve your mail so that it appears as if it was originally sent to your local system.


Carry environment variables, current directory, or any kind of information across rlogin,

telnet, tip, su, chgrp, etc.

Understanding Spirent GUI and its various functionalities:

Figure 9: Connection to chassis and Port Reservation

I gathered knowledge about the following tools that we must know about for creating

automation scripts for SSI test cases.

Following are the steps required to start up with Spirent:

• Install the setup.

• Start the Spirent TestCenter Application.

• Click on chassis, connect it and reserve a slot and its port.

Here you can see that the reserved ports are Port//1/21, Port//1/21, Port//1/22, Port//1/23 and Port//1/24. They are green in color, which means the ports and the links related to the ports are up.

• Click on any of the ports and there you can see the option of Traffic Generator.


• In this column we generate a stream block. A stream block is a packet having different types of PDUs. We have the basic L2 packet having an Ethernet frame. Similarly, we can have IPv4, TCP, UDP, ICMP, etc. frames.

Figure 10: Editing of a Stream Block

• The port also has the options of Traffic Analyzer, Devices, and Capture.

• The Traffic Analyzer basically sets a filter on the packets; for example, if we have to set a filter on the source MAC, we can set it here, and on the result sheet the packets matching that source MAC will be received.

• A port can be set as a router/host/switch, etc., by selecting the type of device in Devices. If we have to analyze how many packets have reached the Rx port, and analyze them in depth, we use the Capture. For capturing the packets, Wireshark is built into Spirent TestCenter.

• Now in the result sheet of Spirent, we can see the Tx port results and corresponding

Rx port results.


Figure 11: Transferring a packet from the Tx Port

Activities in April 2015

Automated testing is the process of running part or whole of the software testing activity by using automation tools. The tools fall into either the freeware or the licensed category. The tools are used in defect tracking, load and performance testing, static analysis,

test coverage, test implementation and test case management.

With automation testing, the use of automation frameworks came into the scene. A test

automation framework is a set of assumptions, concepts, and practices that provide support

for automated software testing.

So, it is a framework or methodology built to successfully carry out test automation.


Steps to automate test cases are:

• Identifying areas within a software for automation.

• Selection of appropriate tool for Test automation.

• Writing Test scripts.

• Development of Test suites.

• Execution of scripts.

• Create result reports.

• Identify any potential bug or performance issue.

Automation of Spirent Configurations:

When you use Spirent TestCenter Automation to run a test, you write a Tcl script that

uses the following elements:

• Test Configuration Objects

• Relations

• Commands and Command Objects.

Test Configuration Objects

Test configuration objects describe the components of your test configuration. These

objects provide the data that Spirent TestCenter needs to create and run a test. Examples of

test configuration objects are the Port, StreamBlock, Generator, and Analyzer objects.

Relations

Relations define the connections between the objects in your test configuration. Every

object is connected to at least one other object by a ParentChild relation. Spirent TestCenter

creates ParentChild relations automatically when you create objects. In other cases, you must

create relations to support specific test operations. For example, the ExpectedRxPort relation

connects a StreamBlock object to a Port object, to identify the port that will receive traffic.

Spirent TestCenter Automation uses command objects to define test actions such as

StartProtocol, CaptureStart, and L2LearningStart. The command parameters are defined as

attributes for the associated command objects.


Commands and Command Objects

There are two methods of invoking commands:

• You can use the perform function to invoke a command. For example:

stc::perform generatorStart -generatorList $generator

When you use the perform function, you specify the command name, along with

name-value pairs for attributes that are defined for the corresponding command object. When

you use perform, you must call the function each time you want to execute an action.

• You can create command objects and use the sequencer to execute a set of commands.

When you use the sequencer, you add the set of command objects to the sequencer,

and then call the perform function to execute the SequencerStart command. (For

information about using the sequencer, see the Spirent TestCenter Automation

Programmer’s Reference.)

Test Execution (Traffic Generation and Analysis)

The basic operations of test execution are traffic generation and analysis. Spirent

TestCenter Automation defines Generator and Analyzer objects to support these

operations. (Spirent TestCenter creates these objects automatically.)

Once you have created the objects and relations for your test configuration, use the

following commands to control generation and analysis:

• AnalyzerStart

• AnalyzerStop

• GeneratorStart

• GeneratorStop

See the Tcl script that is described beginning on page 36 for an example of using the analyzer

and generator components.
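That script is not reproduced here, so the following is only a rough sketch of the generator/analyzer flow, written against the public Spirent TestCenter Tcl API (the chassis address and port location are placeholders, and exact command and attribute names may differ between releases):

    # Rough sketch only -- chassis IP and port location are placeholders.
    package require SpirentTestCenter

    set project [stc::create Project]
    set port1   [stc::create Port -under $project -location //10.0.0.1/1/1]

    # Attach to the chassis, reserve the port and apply the configuration.
    stc::perform AttachPorts -portList $port1 -autoConnect TRUE
    stc::apply

    # Start analysis and generation, wait, then stop both.
    set analyzer  [stc::get $port1 -children-Analyzer]
    set generator [stc::get $port1 -children-Generator]
    stc::perform AnalyzerStart  -analyzerList  $analyzer
    stc::perform GeneratorStart -generatorList $generator
    stc::sleep 10
    stc::perform GeneratorStop  -generatorList $generator
    stc::perform AnalyzerStop   -analyzerList  $analyzer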

API Functions

The Spirent TestCenter API provides a set of functions that you use to create and run

tests.


These functions allow you to do the following:

• Create objects to build the object hierarchy for your test configuration.

• Set the value of object attributes to define the characteristics of your test.

• Get the value of object attributes, including result attributes.

• Run the test after it has been created.

• Connect to your Spirent TestCenter chassis.
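A bare-bones sequence exercising these functions might look like the sketch below; the chassis address and port location are placeholders, and the exact command and attribute names (AttachPorts, Location, and so on) should be checked against the API reference.

package require SpirentTestCenter               ;# load the Tcl API
stc::connect 10.10.10.10                        ;# connect to the chassis (placeholder address)
set project [stc::create Project]               ;# build the object hierarchy
set port    [stc::create Port -under $project -Location //10.10.10.10/1/1]
stc::config $port -Name "TxPort"                ;# set an attribute value
puts [stc::get $port -Name]                     ;# read an attribute value back
stc::perform AttachPorts -PortList $port -AutoConnect true
stc::apply                                      ;# push the configuration to the hardware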

Figure 12: View of the Spirent Script Editor

tshark

TShark is a network protocol analyzer. It lets you capture packet data from a live

network, or read packets from a previously saved capture file, either printing a decoded form

of those packets to the standard output or writing the packets to a file. TShark's native capture

file format is .pcap format, which is also the format used by tcpdump and various other tools.


Without any options set, TShark will work much like tcpdump. It will use the pcap library to capture traffic from the first available network interface and display a summary line on stdout for each received packet.

TShark is able to detect, read and write the same capture files that are supported by Wireshark. The input file doesn't need a specific filename extension; the file format and an optional gzip compression will be automatically detected. Near the beginning of the DESCRIPTION section of wireshark(1) or https://www.wireshark.org/docs/man-pages/wireshark.html is a detailed description of the way Wireshark handles this, which is the same way TShark handles this.

Compressed file support uses (and therefore requires) the zlib library. If the zlib

library is not present, TShark will compile, but will be unable to read compressed files.

If the -w option is not specified, TShark writes to the standard output the text of a

decoded form of the packets it captures or reads. If the -w option is specified, TShark writes

to the file specified by that option the raw data of the packets, along with the packets' time

stamps.

When writing a decoded form of packets, TShark writes, by default, a summary line

containing the fields specified by the preferences file (which are also the fields displayed in

the packet list pane in Wireshark), although if it's writing packets as it captures them, rather

than writing packets from a saved capture file, it won't show the "frame number" field. If

the -V option is specified, it writes instead a view of the details of the packet, showing all the

fields of all protocols in the packet. If the -O option is specified, it will only show the full

protocols specified. Use the output of "tshark -G protocols" to find the abbreviations of the

protocols you can specify.

If you want to write the decoded form of packets to a file, run TShark without the -w option and redirect its standard output to the file (do not use the -w option).
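Assuming TShark is driven from the same Tcl environment as the rest of the tool (an assumption, not something stated here), the output options above could be exercised as follows; the interface and file names are placeholders.

# Write raw packets (pcap) with -w, stopping after 50 packets (-c)
exec tshark -i eth0 -c 50 -w raw.pcap
# Decode a saved capture: summary lines by default, full detail with -V
exec tshark -r raw.pcap -V > decoded.txt
# Show full protocol detail only for selected protocols with -O
exec tshark -r raw.pcap -O icmp > icmp_detail.txt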


When writing packets to a file, TShark, by default, writes the file in pcap format, and

writes all of the packets it sees to the output file. The -F option can be used to specify the

format in which to write the file. The list of available file formats is displayed by the -F flag without a value. However, you can't specify a file format for a live capture.

Figure 13: View of Wireshark

Read filters in TShark, which allow you to select which packets are to be decoded or

written to a file, are very powerful; more fields are filterable in TShark than in other protocol

analyzers, and the syntax you can use to create your filters is richer. As TShark progresses,

expect more and more protocol fields to be allowed in read filters.

Packet capturing is performed with the pcap library. The capture filter syntax follows

the rules of the pcap library. This syntax is different from the read filter syntax. A read filter

can also be specified when capturing, and only packets that pass the read filter will be

displayed or saved to the output file; note, however, that capture filters are much more

efficient than read filters, and it may be more difficult for TShark to keep up with a busy

network if a read filter is specified for a live capture.


A capture or read filter can either be specified with the -f or -R option, respectively, in

which case the entire filter expression must be specified as a single argument (which means

that if it contains spaces, it must be quoted), or can be specified with command-line

arguments after the option arguments, in which case all the arguments after the filter

arguments are treated as a filter expression. Capture filters are supported only when doing a

live capture; read filters are supported when doing a live capture and when reading a capture

file, but require TShark to do more work when filtering, so you might be more likely to lose

packets under heavy load if you're using a read filter. If the filter is specified with command-

line arguments after the option arguments, it's a capture filter if a capture is being done (i.e.,

if no -r option was specified) and a read filter if a capture file is being read (i.e., if a -r option

was specified).

The -G option is a special mode that simply causes TShark to dump one of several types of internal glossaries and then exit.
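Continuing the same hedged Tcl-driven sketch, capture filters (-f), read filters (-R) and the -G glossary mode could be invoked like this; depending on the TShark version, -R may additionally require the -2 two-pass option.

# Capture filter (-f): pcap syntax, live capture only
exec tshark -i eth0 -c 100 -f "tcp port 22" -w ssh_only.pcap
# Read filter (-R): Wireshark field syntax, applied while reading a saved file
exec tshark -r raw.pcap -R "ip.addr == 192.168.1.10" > filtered.txt
# Dump an internal glossary and exit (-G)
exec tshark -G protocols > protocols.txt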

Activities in End-April 2015

Started working on implementing the topics learnt and initiated the development of the automation tool.

Ethernet Basics and IP Routing

Figure 14: Basic Ethernet Packet Structure and associated EtherTypes


IEEE 802.1Q

Tag protocol identifier (TPID): a 16-bit field set to a value of 0x8100

Tag control information (TCI)

• Priority code point (PCP): a 3-bit field which refers to the IEEE 802.1p class

of service and maps to the frame priority level.

• Drop eligible indicator (DEI): a 1-bit field.

• VLAN identifier (VID): a 12-bit field specifying the VLAN. (VLAN 1 is the default VLAN ID.)
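To make the tag layout concrete, the following small Tcl example (values chosen purely for illustration) packs a TCI word from the three fields:

# Compose the 16-bit TCI for PCP = 5, DEI = 0, VID = 100
set pcp 5
set dei 0
set vid 100
set tci [expr {($pcp << 13) | ($dei << 12) | $vid}]
puts [format "TPID = 0x8100, TCI = 0x%04X" $tci]   ;# prints TCI = 0xA064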


Why MPLS?

Figure 15: A schematic of MPLS Network

MPLS Label

Label (20 bits) | EXP: Experimental bits (3 bits) | S: Bottom-of-Stack (1 bit) | TTL: Time-to-Live (8 bits)
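Similarly, one 32-bit label stack entry can be composed from these fields; the values below are purely illustrative.

# Pack Label = 100, EXP = 0, S = 1 (bottom of stack), TTL = 64
set label 100
set exp   0
set s     1
set ttl   64
set entry [expr {($label << 12) | ($exp << 9) | ($s << 8) | $ttl}]
puts [format "Label stack entry = 0x%08X" $entry]   ;# prints 0x00064140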

Node roles

LER [label edge router]

o Ingress

o Egress

LSR [label switch router]

PHP [penultimate hop popping] (implicit-null label 3)

FEC – forwarding equivalence class


LSP – label switched path

NHLFE – Next hop label forwarding entry

FTN – FEC-to-NHLFE map

ILM – Incoming label map

VPLS

Figure 16: A schematic diagram of a VPLS network


IP Routing

Figure 17: IP Routing


Figure 18: An example Snapshot taken from the Main console to show IP routing and Ethernet Switching

Chapter 4: The Actual Project

AUTOMATED SSI TESTING TOOL

The objectives of the project are to:

1) Automate the existing test cases.

2) Proceed with a plan to automate future test cases (if any).

3) Automate the Spirent TestCenter using Tcl scripting:

a. Reserve ports for traffic generation and reception.

b. Generate traffic (packets).

c. Receive the packets.

d. Verify and analyze the packets received at the reception port.

4) Create an automation test suite with multiple test cases.

5) Create log files for each test case run.


Activities from April-End to Present

At the end of April, my mentor Mr. Ashish and I were working on the design of the testing tool, since there are multiple machines on different platforms and the devices and machines must be initiated in a specific order. There were therefore numerous roadblocks which had to be eliminated or solved to make the design realizable.

With the above objectives in mind, the following progress has been achieved:

1) In order to automate different client applications and terminals on different machines (Windows and Linux), created a process flow that operates from a trigger point on the Windows machine. However, as the work progressed, a Linux-to-Windows trigger became necessary, and I have been working to achieve it over an SSH connection.

2) Solved various compatibility issues associated with different applications on different platforms (Windows and Linux).

3) Created a .tcl script to automate the Spirent TestCenter:

a. The script reserves, connects to, and attaches the preconfigured ports.

b. Generates traffic with the required attributes (custom packets).

c. Receives the packets.

However, packet verification and analysis still require further R&D.

4) Using Expect scripting, achieved automatic execution of binaries on the switch and automatic calling of the required stubs.

a. The script automatically sets up an SSH connection to the switch to deploy operations and then call the stubs in the required order.

b. The script creates a log file for each step in a separate folder.

The creation of a separate test suite facilitates the addition of new test cases in the future. I have also been working to make the test run generic by taking values from the command line.

5) Using Expect scripting, I have been able to interact with and send commands automatically to a remote machine via an SSH connection. The scripting has been extended to copy the log files from the remote machine to the local machine over SSH using the SCP protocol, as sketched below.
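A simplified Expect fragment for that kind of SCP copy is sketched below; the host address is reused from the machine list later in this report, while the remote path, user name and password are placeholders only.

#!/usr/bin/expect -f
# Copy a remote log file to the local machine over SCP (illustrative values)
spawn scp user@172.16.135.33:/home/user/TestSuite/logfile ./logs/
expect {
    "(yes/no)?" { send "yes\r"; exp_continue }
    "password:" { send "secret\r"; exp_continue }
    eof         {}
}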


With the current status of the project, I have been able to substantiate the design and have started to implement it for different SSI test cases.

Here is the machine topology used for the project:

• Linux machine: 172.16.135.33

• Windows machine: 172.16.135.91 (with admin access)

• Spirent TestCenter server machine

• Corporate machine (Windows platform, 10.203.1.9) to handle all the terminals and windows

• Hardware on which the test cases are run

The Tool So Far:

The Linux machine works as the trigger point; on it there is a folder named TestSuite. This TestSuite folder contains the directory structure required for a test suite run. The test suite is executed with the following command:

Figure 19: Screenshot of the Execution of the expect Script Runfile

The directory structure on the Linux platform before the test suite is run looks like this:


Figure 20: Directory Structure on Linux Machine before the Test tool Run

And the directory structure on the Windows machine associated with the tool is as follows:

Figure 21: Directory Structure on Windows Machine after the Test Tool Run

The files mentioned on both platforms should be available before the test tool is run in order to make the run successful. The following are the details of the files on the Linux and Windows platforms:

Runfile

This is an Expect script and the base script of the testing tool. The script does the following:

• Takes configurations from the Configuration.txt file. It processes the .txt file to receive the appropriate inputs and the addresses of the different associated hardware.

• Runs the hardware and attaches the other machines in a specifically defined order to communicate between them and send control instructions.

• Parses the test cases listed in the Testcases.txt file to execute only the test cases contained in the suite.

Figure 22: View of the Main Expect Script file Runfile


• Parses the framework of the associated <test_case_NAME>.txt file to receive the Spirent configurations, functions and check values attributed to a specific test case.

• Creates and transfers a var.tcl file from the Linux to the Windows machine through the Operations_On_Windows Expect script, to communicate the Spirent configurations between the machines.

• Creates the Results_On_<timestamp>.csv file to output the final results of the test tool run.

• Creates a <timestamp>-named folder to contain the associated logfile.

• Executes the Copy_Logs Expect script to copy the hardware-related log files into the same folder. The log files may later be used for debugging.

• Copies the parsed files into the test-case-named folders under the TestSuite_Run_<timestamp> folder.
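The fragment below is only a schematic illustration of the kind of Expect logic such a script uses; the Configuration.txt line format, the login details and the command sent to the switch are assumptions, not the actual Runfile contents.

#!/usr/bin/expect -f
# Read the switch address from Configuration.txt
# (assumes lines of the form "Switch IP > 10.0.0.5")
set fh [open "Configuration.txt" r]
while {[gets $fh line] >= 0} {
    if {[regexp {Switch IP\s*>\s*(\S+)} $line -> switch_ip]} { break }
}
close $fh

# Open an SSH session to the switch and run one illustrative command
spawn ssh admin@$switch_ip
expect {
    "password:" { send "secret\r" }
    timeout     { puts "SSH connection timed out"; exit 1 }
}
expect "#"
send "./binary_under_test\r"   ;# placeholder for the real binary and stub calls
expect "#"
send "exit\r"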

Configuration.txt

• Contains the addresses of the hardware associated with the test environment.

• The user who runs the tool needs to fill in the values after the '>' sign.

Figure 23: View of the Configuration.txt file


Testcases.txt

• Contains the names of the test cases to be run on the hardware with the associated configurations.

• Requires the corresponding <name>.txt files to be present in the same folder as the Runfile script.

Figure 24: View of the Testcases.txt file

<test_case_NAME>.txt

• Contains three important properties:

o Spirent configuration

o Functions/stubs to be called while running the binary in gdb mode

o Check values associated with each test case

• The file contents must be filled in carefully, as they are case-sensitive.

• The Spirent configuration must match the Spirent_<#> file on the Windows platform.

• The functions called in the test cases must be listed in a specific order and with the appropriate prefixes before the stub name, for example:


o hal_ for any configuration-related stubs.

o Execute_Spirent for calling the Windows machine and hence the Spirent client.

Figure 25: View of the test_case_AC_Vlan_Unicast.txt file

Spirent_<#>.tcl

• A Tcl/Tk script to automate the Spirent TestCenter.

• The script contains the configuration of the packet to be created for forward as well as backward transmission.

• The script also captures the packets at the reception port for forward and backward transmission.

• The configuration/initialization of the variables used in the script is received from the file var.tcl.

• The var.tcl file is created by the Runfile on the Linux platform and is then sent over the SSH connection to the Windows machine.

• The value of '#' in the file name Spirent_<#>.tcl is unique to a single test case or a group of test cases that require a similar packet configuration.

• The value of '#' is taken from the <test_case_NAME>.txt file, where the user marks the number alongside the Execute_Spirent command or stub call.
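A heavily simplified sketch of the kind of configuration such a script can contain is shown below; the port handles, MAC/IP values and the CaptureStart parameter name are illustrative assumptions rather than the project's actual settings.

# Build a custom Ethernet/IPv4 stream block on the transmit port
set streamBlock [stc::create StreamBlock -under $txPort -FixedFrameLength 128]
stc::create ethernet:EthernetII -under $streamBlock \
    -srcMac 00:10:94:00:00:01 -dstMac 00:10:94:00:00:02
stc::create ipv4:IPv4 -under $streamBlock \
    -sourceAddr 10.0.0.1 -destAddr 10.0.0.2
stc::apply

# Capture at the reception port while traffic is generated
set capture [stc::get $rxPort -children-Capture]
stc::perform CaptureStart -CaptureProxyId $capture
stc::perform GeneratorStart -GeneratorList [stc::get $txPort -children-Generator]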


Figure 26: View of the Spirent_1.tcl file on Windows platform

Once run, the Expect script Runfile creates four new entries in the directory:

• the Results_On_<timestamp>.csv file

• the var.tcl file

• a <timestamp>-named folder to contain the logfiles

• a TestSuite_On_<timestamp> folder containing the test-case-named folders, which in turn contain the parsed .txt files as the record of the test suite run.

It is, however, mandatory for the user to follow the directory structure on the Linux and Windows platforms. Runfile, Operations_On_Windows and Copy_Logs are all Expect scripts which need to be present in the same folder.

The Runfile script calls the other two at specific points to control the workflow and the required actions.

Once the test tool is run, the directory structure on the Linux platform looks like this:


Figure 27: The Snapshot of the Automated Testing Tool Run

Figure 28: Directory Structure on linux Platform after the TestSuite Run


The results of the test suite run are written to a .csv file. A new file named Results_On_<timestamp>.csv is created.

Results_On_<timestamp>.csv

• The file is created in the same directory that contains the Runfile.

• The file contains four columns: the test case name along with the result for forward and for backward transmission.

• The last column contains the overall result of the test case run, in the form PASSED/FAILED.

• Each entry is written to the file as a line of comma-separated values, as sketched below.
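In Tcl, writing one such row is a single puts call, roughly as sketched here; the file handle and the row values are placeholders.

# Append one result row: test case name, forward result, backward result, overall result
set fh [open $resultFile a]
puts $fh "test_case_AC_Vlan_Unicast,PASSED,PASSED,PASSED"
close $fh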

Figure 29: The View of the Result file Results_On_<timestamp>.csv

Along with this, an associated logfile is generated in the folder named according to the timestamp. This logfile records the execution of the test suite commands.


<timestamp>/logfile

• The logfile contains the log of the flow of the test suite as it sends its commands.

• If one or more test cases fail, debugging is done via this logfile to check at which point the error occurred.

Figure 30: View of the logfile

The main computer, through which the different machines (the Linux machine and the Windows machine) can be seen simultaneously, helps in following the test suite flow during the development stages.

The Linux machine, the Windows machine, the Spirent and the hardware must be connected to the same subnet. This ensures there are no permission errors while accessing files and folders over the network.

It is, however, not necessary for the master machine, i.e. the desk Windows machine, to be connected to the same subnet. It does require permission to remotely access the other computers and take control of their GUIs.


Figure 31: View of the various machines over the Remote Desktop Connection.

Chapter 5: Outcomes from the Training

The training at ARICENT helped me a lot in learning the basic concepts of Ethernet packet switching and IP routing. It helped me understand various networking protocols.

The software and tools learned during the training period add to my skill set. By the end of the training, I was well acquainted with the Spirent GUI and the different operations that can be performed using this tool.

The detailed study of the Linux and Windows terminals and their associated commands opens up various paths towards software development and hardware debugging. It is also very useful for network-related operations and testing tools.


The scripting languages Tcl/Tk and Expect were the two greatest takeaways from this period. A scripting language is a must in industry, where most operations are performed via software on terminals. Expect, though an extension of Tcl/Tk with a few added features, is a powerful tool I learnt during this period. With the help of this highly interactive scripting language it was much easier to automate the manual steps taken to complete the process.

Some learning about processes, multiple threads, active threads and interprocess communication, in addition to the Expect scripting language, is very beneficial. After gaining a detailed and thorough understanding of its commands and usage, I was able to automate the different steps the team took to fulfil repeated sets of commands. This eased the task of the engineer and increased throughput without compromising the quality of the work.

As for the impediments faced during the training, there were many, each teaching newer methods and techniques. Learning about the SSH connection over port 22 and establishing its client-server network across different platforms was one of the serious issues I faced during the work. Another issue was the compatibility of the different software and applications across platforms. However, with the help of the team and further learning, I was able to set up a working SSH server on the Windows platform, which helped in implementing the framework design.

By the end of April, I was stuck on a major issue: developing a parser to process the received packets and thereby analyze the traffic. In the automation of the testing tool, generating packets was not enough; the packets also had to be processed accordingly. I was assigned a side project of developing a parser to process the .pcap file. Solving the issue was a great challenge. However, with what I learnt I was able to build the parser design into the framework of the Automated SSI Testing Tool itself and realize the project in the real sense.

With the current process flow and design, I am adding new test cases to the test suite. The test suite is well suited for unicast packet transfers; implementing the design for multicast packets, however, is still a challenge. With the latest developments I hope to upgrade the design appropriately.


Chapter 6: References

https://valgrind.org

http://www.pcstats.com/articleview.cfm?articleid=1723

http://www.tcl.tk/man/tcl/UserCmd/tclsh.htm

http://www.tcl.tk/man/expect5.31/expect.1.html

http://www.aspid.pt/files/PDF/SpirentTestCenter.pdf

https://www.wireshark.org/docs/man-pages/wireshark.html

http://en.wikipedia.org/wiki/EtherType