
Service Oriented Architecture for Agile Automated Testing Environment

Michael Weir, Ross Kulak Agile Test Incorporation

Boca Raton, FL, United States [email protected], [email protected]

Ankur Agarwal Florida Atlantic University

Boca Raton, FL, United States [email protected]

Abstract— System test development, automation, and execution are key stages of overall product development in both the New Product Introduction (NPI) and Production Release processes. For NPI, companies must create test systems to support product validation and verification. For manufacturing companies, ongoing process metrics are used to ensure the product meets quality specifications and can be sold to customers. This entire test process is time consuming and resource intensive and therefore negatively impacts overall product net revenue, both in terms of time to market and in terms of development resources. Large and successful companies invest hundreds of thousands of dollars in automated test systems to support product development. Such infrastructures provide a competitive advantage by enabling a systematic methodology to generate test plans and then automatically have each test plan flow through the test software and hardware development, test and data collection, and results analysis phases. The Automatic Test Equipment (ATE) industry has pushed to develop a framework that supports the sharing of test information, data, and analysis results across various enterprise platforms. An IEEE standard known as Automatic Test Markup Language (ATML), comprising an XML schema, was proposed and developed to allow interoperability of test cases, data, equipment information, and results. Our methodology is a Service-Oriented Architecture that provides an interoperable solution. Users can begin with a test plan, deploy a scalable data monitoring and analysis capability, and follow the process from NPI through production. Additional capabilities such as advanced analysis, customer data sharing resources, test software generation and deployment, closed- and open-source software library access, test station monitoring and equipment tracking, and automated reporting schedules can be added to the overall process. The proposed architecture is entirely scalable, can be deployed in single-site or global applications, and may be installed behind corporate firewalls or in the cloud.

Keywords— Agile Test, Service Oriented Architecture, Data Analysis, System Testing.

I. INTRODUCTION

Generating, collecting, analyzing, and consuming test data are key to corporate business intelligence. Managing the creation of test plans, test automation, test data, and the utilization of very expensive assets becomes a discipline and process of its own. Automation of these capabilities is a key component of the modern-day test process. The tests are often too complex to perform manually and far too time consuming to be executed by hand in a manufacturing environment. The framework of database and test tools wrapped around this process is referred to as a test automation framework. This framework is the key to enhancing the productivity of the overall team and gives companies a competitive advantage by allowing them to systematically define test plans and then quickly move through the process of automating the test plan and generating product-critical data.

The Automatic Test Equipment (ATE) industry has pushed to develop a framework that supports the sharing of test information, data, and results across various enterprise platforms. Usually the test engineer develops a test plan and defines the equipment and software requirements. Based on the test plan, the test equipment is assembled and the testing team orchestrates the automation, then collects, analyzes, and reports the data upon test completion. The test process for any given company can encompass the following phases: (1) test plan development; (2) equipment and interface selection; (3) automation plan creation and automation software development; (4) data collection and storage; (5) data analysis; (6) test report/data sheet assembly and publication; (7) supply chain test monitoring for contract manufacturing; (8) test station utilization, reservation, and scheduling; (9) test asset calibration and tracking; (10) Manufacturing Execution System (MES) integration.

The overall testing process is a long, resource-consuming process that starts with test plan development and leads to the development of testing software libraries, data sheet assembly, reservation of test stations, development of test data, test station calibration, test execution, data collection, and data analysis and reporting. There is clearly wide scope for process automation in system testing. Companies typically follow an evolutionary path to address this process. Test plans begin with engineers converting text to spreadsheet plans with varied individual formats and storage locations. These then translate into individual automation plans and automation solutions with moderate amounts of reusability. Once automation is developed and in place, the production, storage, analysis, and publication of results ranges from local text files to haphazardly organized file shares. This creates confusion in the organization of the results as well as communication, publication, and record-keeping challenges. The formats are varied, and consumption and analysis of the results depends on an individual's understanding of formatting and location. This provides little opportunity to analyze and share test data within an organization, and it is even



more cumbersome to share externally. To address these challenges in the test process, the traditional path is internal investment in proprietary systems that individually address the test phases stated previously. Seldom is there an overall understanding of the entire process in place before the individual test systems are developed.

A more open and interoperable enterprise solution is needed. There should be a shared, reusable test-case library that is equipment and platform independent. Users can then draw on existing test cases instead of developing design-specific test cases from scratch, adding efficiency to the overall process. There should also be a framework in which testing results can be shared and deciphered across multiple platforms. For such an effort to succeed, a standardized method of storing data is required. The IEEE ATML standard is rapidly emerging as one acceptable possibility, providing an XML-based standard for ATE and test information exchange. This standard has already been widely adopted by government agencies and industry; the Naval Air Systems Command, in collaboration with industry leaders, has defined the XML schemas that correctly represent test information and allow interoperability. Without such a solution, companies seeking a competitive advantage are forced to invest significant resources in areas that are neither their core competency nor their core business.

II. METHODOLOGY

Figure 1 shows the overall system architecture of the Agile Test platform. The client side of the overall system is referred to as the Agent. The Agent executes on a local test station that is connected to the Device Under Test. It maintains constant communication with the backend system and overall application, and is responsible for data collection, assembly, and transport to the backend server system. The backend system is made up of database servers, application servers, configuration and user management servers, reporting system servers, and other hardware components.
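To make the Agent's role concrete, the following Java sketch shows the core behavior described above: watch a directory on the test station for new ATML result files and forward each one to the backend over HTTP. The endpoint URL, directory name, and absence of authentication are illustrative assumptions rather than the production Agent's actual interfaces.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.*;

public class AgentSketch {
    // Hypothetical backend endpoint; the real Agent's URL and auth scheme are not shown here.
    private static final URI UPLOAD_URI = URI.create("https://backend.example.com/api/results");

    public static void main(String[] args) throws Exception {
        Path watched = Paths.get(args.length > 0 ? args[0] : "atml-results");
        HttpClient client = HttpClient.newHttpClient();

        try (WatchService watcher = FileSystems.getDefault().newWatchService()) {
            watched.register(watcher, StandardWatchEventKinds.ENTRY_CREATE);
            while (true) {
                WatchKey key = watcher.take();          // block until the station writes a file
                for (WatchEvent<?> event : key.pollEvents()) {
                    if (event.kind() == StandardWatchEventKinds.OVERFLOW) continue;
                    Path file = watched.resolve((Path) event.context());
                    HttpRequest request = HttpRequest.newBuilder(UPLOAD_URI)
                            .header("Content-Type", "application/xml")
                            .POST(HttpRequest.BodyPublishers.ofFile(file))
                            .build();
                    HttpResponse<String> response =
                            client.send(request, HttpResponse.BodyHandlers.ofString());
                    System.out.println(file + " -> HTTP " + response.statusCode());
                }
                key.reset();
            }
        }
    }
}
```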

Data from the data receiver is passed to the data parser, parsed according to the ATML standard, and entered into a distributed ATML-schema data server. This backend server system stores data from all connected test stations. The organization of data going into the database depends on the specific test plan. Data retrieved from the database is similarly organized, but filters are applied to simplify the retrieval process and allow the data to be reorganized in support of reporting requirements. The application supports single-site or global instantiations and can exist in the cloud or behind a corporate firewall.
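As a minimal illustration of the parsing step, the sketch below walks an ATML-style XML file and extracts per-test measurements ready for database insertion. The element and attribute names are simplified stand-ins; the real ATML schemas are considerably richer.

```java
import java.util.*;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.*;

public class AtmlParseSketch {
    /** One parsed measurement, ready for insertion into the results database. */
    record Measurement(String testName, double value, String unit) {}

    // Element and attribute names here are illustrative stand-ins, not the exact ATML schema.
    static List<Measurement> parse(java.io.File atmlFile) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(atmlFile);
        List<Measurement> out = new ArrayList<>();
        NodeList tests = doc.getElementsByTagName("TestResult");
        for (int i = 0; i < tests.getLength(); i++) {
            Element t = (Element) tests.item(i);
            out.add(new Measurement(
                    t.getAttribute("name"),
                    Double.parseDouble(t.getAttribute("value")),
                    t.getAttribute("unit")));
        }
        return out;
    }
}
```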

The data consumer is able to mine the data in order to perform engineering analysis and overall report generation. An engineering analysis workflow allows a user to select any subset of the data, assign mathematical analysis functions, and create reports for device performance evaluation, reliability analysis, and characterization, ultimately following the product life cycle through to manufacturing test and process tracking.

Figure 1: Agile Test Platform Showing Front and Back-End Modules


As the product moves into the manufacturing and commercialization phase, automated tests continue to be run on each product as it moves through the manufacturing workflow. Often, the manufacturing test plan is a subset of the characterization and reliability test plan. Using a similar tool for both areas of the product development lifecycle minimizes additional development time and simplifies data analysis and comparison.

The system further enables users to create and manage reports of test data in a consistent and simplified manner. The application is architected so that test results can be shared and deciphered across multiple platforms and users. Reporting of test data is standardized, and the system provides an integrated solution for selective report distribution and sharing. Customizable reports may be developed using the reporting tools within the system, then published, stored, and distributed from within the system itself. Data filters and report templates can be saved and reused, allowing similar information to be reported over time or across locations using new data. This capability allows a user to monitor not only in-house processes but also contract manufacturing or distant sites. Reports can be shared with other users within a group or entity, or outside the entity, with various levels of access control.

Figure 2: Agile Test Platform Layered Applications

Figure 2 shows a simplified view of the layered application. The interfaces, services, applications, hardware, and front and back end are shown to promote an understanding of the system.

The system incorporates industry-standard encryption algorithms for data transmission and stores user data on secured servers to provide a high level of data security and integrity. The backend is simply a group of applications that sits on a server; as such, state-of-the-art technology easily allows for greater than 99% uptime, storage on redundant systems, geographic diversity, and all of the other features of modern server farms.

III. CURRENT APPLICATION

The current application is demonstrated over two stations, numerous units, several test conditions, and a number of runs per measurement point. National Instruments TestStand is used to run the tests because it natively supports the ATML standard. Figure 3 shows the TestStand sequence used to generate the data files.

Figure 3: TestStand Application to Generate ATML Data

The program is a simple sequence that loops through three temperatures and two voltages. For each loop, and for each device, current, gain, and noise are each measured several times. This entire process is repeated over a number of devices and two test stations. As soon as each test completes, TestStand saves the results in an ATML-formatted file, and the Agent, shown in Figure 4, forwards the file to the back-end application.
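The sequence logic can be summarized by the following schematic Java equivalent. The specific temperature and voltage values and the repeat count are assumed for illustration, and the measure() stub stands in for real instrument calls made by the TestStand sequence.

```java
import java.util.Random;

public class SequenceSketch {
    static final Random RNG = new Random();          // stands in for real instruments

    // Stub measurement; a real station would drive instruments here.
    static double measure(String what, double volts) { return RNG.nextGaussian(); }

    public static void main(String[] args) {
        double[] temperaturesC = {-40, 25, 85};      // assumed values; not listed in the text
        double[] voltages = {3.0, 5.0};              // two voltages, per the text
        int repeats = 5;                             // "measured several times"

        for (double t : temperaturesC)
            for (double v : voltages)
                for (int run = 0; run < repeats; run++)
                    for (String test : new String[] {"Current", "Gain", "Noise"})
                        System.out.printf("T=%.0fC V=%.1fV %s=%.3f%n",
                                t, v, test, measure(test, v));
    }
}
```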

Figure 4: Agile Test Agent Window


An account on the Agile Test platform, shown in Figure 5, is created to support the demonstrations in this discussion.

Figure 5: Agile Test Application - User Dashboard

Figure 6 shows that the current account is set up to support two users and two stations. Users and stations may be added as required, and each user may be added as either an administrator or a non-administrator. The administrator role can see all users' information and data and can make changes to the account. A standard user is only able to view the data that he or she created.

Figure 6: Agile Test Platform - User/Station Management

Data from both stations has been uploaded to the back-end servers. Selecting the DATA tab on the user dashboard brings the user to the data workflow page. As shown in Figure 7, there is a well-defined workflow for processing data within the Agile Test platform.

Figure 7: Agile Test Platform - Filter and Search

The left-hand box allows the user to define which criteria will be used to select data. These criteria are defined in the ATML specification. The selections can be reordered as required and enabled or disabled. Any changes to the filter and select criteria are mirrored by changes in the data selection area. Because the type and format of data that a typical user processes does not change often, users can save and recall workspaces, which allows them to customize their portals to their individual needs.
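Conceptually, an ordered, enable/disable filter list of this kind maps naturally onto a query. The sketch below, using hypothetical table and column names, shows one way a saved workspace of filters could be turned into a WHERE clause; a production implementation would use parameterized queries rather than string concatenation.

```java
import java.util.*;

public class FilterSketch {
    /** One ordered filter line: an ATML field, whether it is enabled, and selected values. */
    record Filter(String field, boolean enabled, List<String> values) {}

    // Builds a WHERE clause from the enabled filters, in their display order.
    // Table and column names are hypothetical; the platform's schema is not published.
    static String whereClause(List<Filter> filters) {
        StringJoiner where = new StringJoiner(" AND ");
        for (Filter f : filters) {
            if (!f.enabled() || f.values().isEmpty()) continue;
            StringJoiner in = new StringJoiner("','", "('", "')");
            f.values().forEach(in::add);
            where.add(f.field() + " IN " + in);
        }
        return where.length() == 0 ? "" : "WHERE " + where;
    }

    public static void main(String[] args) {
        List<Filter> workspace = List.of(                        // a saved "workspace"
                new Filter("test_name", true, List.of("Gain")),
                new Filter("voltage",   true, List.of("3")),
                new Filter("station",   false, List.of("ST-01"))); // disabled: ignored
        System.out.println("SELECT * FROM results " + whereClause(workspace));
    }
}
```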

The second area of this page, and of all of the Data pages, is the workflow diagram. This area allows the user to follow a defined workflow:

• Filter and Select Data

• Analyze Results

• Reporting

• Preview Report

At any time, the user may move back and forth through the workflow.

The third area of the filter and select screen is the Data Selection area. In this area, there is a scrollable selection box for each of the enabled filters. A user narrows down a selection of data by choosing the values of interest. Figure 8 shows an example in which the user wants to preview gain data, taken at 3 volts, from all stations and devices. The user highlights the parameter and clicks the filter button to narrow the search visibility.

Figure 8: Agile Test Platform - Data Selection

Data is selected as previously described, but it is returned according to the selections in the Return Field Options shown in Figure 9. The operation of this field is identical to that of the Filter and Search Options field. Various fields may be enabled or disabled, and the order of the fields is the order in which the data will be returned. Ordering is controlled by drag and drop. Again, this information can be saved to a workspace so the user can reuse the work.


Figure 9: Agile Test Platform - Return Field Options

Once the data and the return fields are selected, the user can proceed in a number of ways using the menu structure. The user can simply preview the data using the QuickView feature shown in Figure 10. QuickView is intended to let a user verify the data before starting the more resource-intensive task of processing it; it answers the question of whether the numbers make sense before the user spends additional time. Alternatively, the user can download the data and process it locally; data is returned as a CSV file in the same format as the QuickView data. The third option is to continue through the workflow, process the data online, and build and publish reports. A web service API is also available to enable third-party applications to connect directly.
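As an example of the download path, the following sketch retrieves a filtered data set as CSV over HTTP. The endpoint and query parameters are hypothetical, since the web service API is not documented here.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

public class ApiDownloadSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical query URL: a web service API exists, but its endpoints
        // and parameters are not documented in this discussion.
        URI query = URI.create(
                "https://backend.example.com/api/data?test_name=Gain&voltage=3&format=csv");

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(query)
                .header("Accept", "text/csv")
                .GET()
                .build();

        // Stream the CSV straight to disk for local processing.
        HttpResponse<Path> response = client.send(request,
                HttpResponse.BodyHandlers.ofFile(Path.of("gain_3v.csv")));
        System.out.println("HTTP " + response.statusCode()
                + ", saved to " + response.body());
    }
}
```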

Figure 10: Agile Test Platform - QuickView Data

The second step of the four-step workflow is to analyze the data. The Agile Test platform provides basic mean, minimum, maximum, standard deviation, and CpK analysis types. These analysis types may seem minimal, but they are the fundamental analyses that nearly everyone uses. The strength of the Agile Test platform is the ability to easily and repeatably select the data to which these analysis types are applied. Before beginning results analysis, data must be selected for analysis. Figure 11 shows a specific selection of data; in this example, the 'gain' of all of the units tested at a 'voltage' of 3 volts will be analyzed.
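For reference, the five analysis types reduce to straightforward computations. The sketch below computes them for a single output variable; the gain values and specification limits are invented for illustration, and CpK is taken as the distance from the mean to the nearer specification limit, divided by three standard deviations.

```java
import java.util.Arrays;

public class AnalysisSketch {
    // Computes the platform's five analysis types for one output variable.
    // The 'gain' values and spec limits below are made up for illustration.
    public static void main(String[] args) {
        double[] gain = {19.8, 20.1, 20.3, 19.9, 20.0, 20.2};
        double lsl = 19.0, usl = 21.0;                 // assumed spec limits

        double mean = Arrays.stream(gain).average().orElse(Double.NaN);
        double min  = Arrays.stream(gain).min().orElse(Double.NaN);
        double max  = Arrays.stream(gain).max().orElse(Double.NaN);

        // Sample standard deviation (n - 1 in the denominator).
        double ss = Arrays.stream(gain).map(x -> (x - mean) * (x - mean)).sum();
        double sd = Math.sqrt(ss / (gain.length - 1));

        // CpK: distance from the mean to the nearer spec limit, in units of 3 sigma.
        double cpk = Math.min(usl - mean, mean - lsl) / (3 * sd);

        System.out.printf("mean=%.3f min=%.3f max=%.3f sd=%.4f CpK=%.2f%n",
                mean, min, max, sd, cpk);
    }
}
```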

Figure 11: Agile Test Platform - Data Selection for Analysis

Selecting the Analyze Results section of the Agile Test platform brings the user to the page shown in Figure 12.

Figure 12: Agile Test Platform - Defining Analysis

The Agile Test platform performs analysis on an 'output variable'. This variable allows users to assign any number of test names to it, and thereby to perform an analysis over more than one test name if required. An output variable is assigned by selecting the desired test name(s) and then selecting the Assign Output Variable button. A window pops up asking the user to name the variable; the default name is the first selected test name. In this case the test name 'Gain' was selected. Once the output variable is assigned, it goes into the Output Variables list.

To assign an analysis, the user selects the desired output variable, selects the desired analysis type(s), and moves the selection to the Define Analysis column.

Any number of output variables can be defined, and any number of analysis types can be assigned to any output variable. All of this information can be saved in a user workspace and recalled at any time. The user only needs to define this process once and can use it many times.


Once the analysis is defined, the next step is to generate a report; in the Agile Test platform, this is the Reporting step of the process flow. The reporting page is shown in Figure 13.

Figure 13: Agile Test Platform - Reporting Page

Reporting in the Agile Test platform begins with defining the report. Either the default template or a previously defined template can be used. Templates can be saved, opened, or deleted, and permissions can be set to allow only the user to see the report or to allow other groups to see it.

The report header allows graphics to be uploaded and provides two text lines to support report titles and other information. A standard one-line page header and one-line page footer are available in the template.

The report body allows the user to enter and format text and insert graphic objects. Analysis tables are also displayed in the report body when requested.

To build the report body, the user adds descriptive text and graphics according to the needs of the report. To add analysis results to the report, the user selects a position within the report, selects one or more of the defined analysis values, and selects the Assign Defined Analysis button.

Figure 14: Agile Test Platform - Example Report


The Agile Test platform will then place a table of the selected results within the report. An example of a report generated from the Agile Test platform is provided in Figure 14.

The final stage of the analysis and reporting process is to preview and distribute the report. The Agile Test platform generates a PDF of the report, which can be downloaded for review and distribution. Furthermore, the report can be saved within the user's database and pulled up later for additional review, distribution, or comparison.

IV. ADDITIONAL CAPABILITY

The Agile Test platform currently provides users with a coherent methodology and process for getting test data into and out of a database. Once the Agent is installed on a test station, users no longer have to consider data locations or formats: the data ends up in the database, from which it can be retrieved. The database is safe and can be made redundant, secure, and globally available. There are considerable strategic opportunities to extend the application.

Consider advanced processing requirements. Instead of downloading huge amounts of data for processing in third-party tools such as MATLAB or SAS, it is more efficient to process the data, using the same tool, on the cloud server(s) and download only the answer.

Test equipment tracking is supported by the ATML standard. If the test stations are configured to report which test equipment is online, that information is entered into the database. Furthermore, calibration houses almost universally maintain a database of the equipment they calibrate for customers and typically provide a method for customers to check the calibration status and records of any of their equipment. A function that queries both databases could provide a report of the test station calibration status.
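A sketch of such a cross-check appears below. Both lookups are hypothetical stubs: one standing in for the platform's record of equipment reported online by a station, the other for a calibration house's customer-facing records.

```java
import java.time.LocalDate;
import java.util.List;

public class CalibrationSketch {
    record Instrument(String serial, String model) {}

    // Stub: would query the platform's database of equipment reported by the station.
    static List<Instrument> equipmentOnStation(String station) {
        return List.of(new Instrument("SN1234", "DMM-6500"),
                       new Instrument("SN5678", "PSU-3631"));
    }

    // Stub: would query the calibration house's customer-facing records.
    static LocalDate calibrationDue(String serial) {
        return serial.equals("SN1234") ? LocalDate.of(2013, 1, 15)
                                       : LocalDate.of(2014, 6, 30);
    }

    public static void main(String[] args) {
        LocalDate today = LocalDate.now();
        for (Instrument i : equipmentOnStation("ST-01")) {
            LocalDate due = calibrationDue(i.serial());
            String status = due.isBefore(today) ? "OUT OF CAL" : "OK until " + due;
            System.out.println(i.model() + " " + i.serial() + ": " + status);
        }
    }
}
```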

Before a set of tests is put together, a test plan must be developed. Although test plans come in all shapes and sizes, they all include similar information: test conditions, test types or methods, the number of units to test (which must include software variations), and expected results. Putting this capability into the front end of the Agile Test platform enables a number of end-to-end processes:

It is entirely possible that a test software structure could be auto-generated from a test plan. There needs to be a method to associate test types with test stations, a method to define looping structures supporting test conditions, and a way to link specific test modules to a test. Once these methods are defined, structured software could be generated automatically from the defined test modules.
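A toy version of this generation step is sketched below. The plan representation, the generated language, and the module-binding convention are all assumptions, since this capability is proposed rather than specified.

```java
import java.util.List;

public class CodegenSketch {
    // A tiny stand-in for a structured test plan: condition variables to sweep
    // and test modules to invoke. Real plans carry far more detail.
    record Plan(List<String> sweepVars, List<String> testModules) {}

    // Emits a nested-loop skeleton from the plan. The output language and the
    // mechanism for binding test modules to stations are assumptions.
    static String generate(Plan plan) {
        StringBuilder src = new StringBuilder();
        int depth = 0;
        for (String v : plan.sweepVars())
            src.append("    ".repeat(depth++))
               .append("for (double ").append(v).append(" : ").append(v).append("Values) {\n");
        for (String module : plan.testModules())
            src.append("    ".repeat(depth)).append(module).append("(unit);\n");
        while (depth-- > 0)
            src.append("    ".repeat(depth)).append("}\n");
        return src.toString();
    }

    public static void main(String[] args) {
        System.out.print(generate(new Plan(
                List.of("temperature", "voltage"),
                List.of("measureCurrent", "measureGain", "measureNoise"))));
    }
}
```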

Once a test planning capability is available, a flow exists to programmatically produce a report based on the test plan, since the test report inherently answers the test plan. Work would be required to associate the returned test result names with the test plan specifications, and to indicate the analysis types associated with each specification. If the test software structure was automatically generated from the test plan, these connections are already available.

Software libraries could be associated with the Agile Test platform to allow users to select pre-written test modules. These test modules could be open or closed source and they could have read and write rights assigned to control the visibility of these modules. Such a capability would ease the task of associating specific test modules with specific tests.

A program development and tracking system could be wrapped around the Agile Test platform. Such a capability would allow users to instantiate sign-offs for test plans, code reviews, report completions, and anything else supported by the platform. It would allow users to track the status of the various sub-flows associated with the back end of the product development process.

Semi-real-time test tracking capability could also be made available. Given that the Agile Test platform understands the test plan and can associate stored results with it, the completion status of the test plan can be reported at any given time.
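The underlying computation is simple, as the sketch below shows: completion is the fraction of planned test points for which at least one result exists in the database. The point-naming scheme is invented for illustration.

```java
import java.util.Map;
import java.util.Set;

public class CompletionSketch {
    // Completion = fraction of planned (test, condition) points with at least
    // one stored result. Both inputs would come from the platform's database.
    static double completion(Set<String> plannedPoints, Map<String, Integer> resultCounts) {
        long done = plannedPoints.stream()
                .filter(p -> resultCounts.getOrDefault(p, 0) > 0)
                .count();
        return 100.0 * done / plannedPoints.size();
    }

    public static void main(String[] args) {
        Set<String> plan = Set.of("Gain@3V", "Gain@5V", "Noise@3V", "Noise@5V");
        Map<String, Integer> stored = Map.of("Gain@3V", 12, "Noise@3V", 12);
        System.out.printf("Test plan %.0f%% complete%n", completion(plan, stored));
    }
}
```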

Production-level process tracking could be supported. Data coming from a production floor could be continuously compared to earlier data to enable process tracking. Alerts could be set to provide users with immediate notifications of process deviations.
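One simple form of such an alert compares each incoming production measurement against control limits derived from earlier data. The sketch below uses assumed baseline statistics and a plus-or-minus three-sigma rule.

```java
public class ProcessAlertSketch {
    // Baseline statistics would be computed from earlier (e.g. NPI) data in the
    // database; the numbers here are made up.
    static final double BASELINE_MEAN = 20.0, BASELINE_SD = 0.15;

    // Flags any new measurement outside the +/- 3-sigma control limits.
    static boolean deviates(double value) {
        return Math.abs(value - BASELINE_MEAN) > 3 * BASELINE_SD;
    }

    public static void main(String[] args) {
        double[] incoming = {20.1, 19.9, 20.6, 20.0};   // simulated production stream
        for (double v : incoming)
            if (deviates(v))
                System.out.println("ALERT: gain " + v + " outside control limits"
                        + " -> notify subscribed users");
    }
}
```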

The possibilities of additional capability within the Agile Test platform are many. Priorities need to be defined.

V. CONCLUSIONS

The Agile Test platform is targeted at companies that currently have no well-defined data management capability, and it is meant to provide these companies with a turn-key solution. Deploying the Agile Test platform is as simple as registering an account, users, and stations, then downloading and unzipping a small Java application. The user is up and running.

The Agile Test platform provides the user with end-to-end data management. Data is uploaded directly from the test environment and stored in the platform database. Users generate reports from within the platform and download them for review and distribution.

The Agile Test platform is not constrained to any particular installation. It can be a cloud-based platform and take advantage of cloud-specific opportunities such as outsourced servers, scalable systems, and geographic redundancy. Alternatively, the entire platform can be installed behind a corporate firewall for companies that are not yet comfortable outsourcing servers.

The Agile Test platform is a simple and powerful tool that allows companies to minimize the resource requirements of managing test data. It reduces the dollars spent developing in-house solutions and the dollars lost to ineffective data management, and it increases product income through time-to-market reductions by providing users with a stable, intuitive, powerful data management capability.


REFERENCES

[1] IEEE Standard for Automatic Test Markup Language (ATML) for Exchanging Automatic Test Equipment and Test Information via XML, IEEE, 2011.

[2] Xiaoying Bai, Wei-Tek Tsai, Raymond A. Paul, Techeng Shen, and Bing Li, "Distributed end-to-end testing management," Proceedings of EDOC 2001, pp. 140-151, September 2001.

[3] Vukota Peković, Nikola Teslić, Ivan Resetar, and Tarkan Tekcan, "Test management and test execution system for automated verification of digital television systems," 2010 IEEE 14th International Symposium on Consumer Electronics (ISCE), 2010.

[4] Stanislav Fomin, "Test management with Testopia — missing link?," 5th Central and Eastern European Software Engineering Conference in Russia (CEE-SECR), pp. 253-258, 2009.