[IEEE 2009 IEEE AUTOTESTCON - Anaheim, CA, USA (2009.09.14-2009.09.17)] 2009 IEEE AUTOTESTCON - Synthetic Instrumentation: The road ahead

978-1-4244-4981-1/09/$25.00 ©2009 IEEE


Synthetic Instrumentation: The Road Ahead

Michael N. Granieri, Phase Matrix, Inc.

San Jose, CA [email protected]

Robert Wade Lowdermilk, BAE Systems

San Diego, CA [email protected]

Abstract — This paper provides the reader with an overview of current Synthetic Instrumentation technology and a discussion of the probable future direction of this interesting and disruptive technology. The paper opens with a brief introduction and technical overview of Synthetic Instrumentation and then presents a discourse on the primary attributes of Synthetic Instrumentation. Having reviewed the basics of Synthetic Instrumentation technology, the paper then goes on to discuss the fundamental classes/types of Synthetic Instruments (SI), their associated attributes, and typical types of users associated with these SI classes. The authors then discuss how Synthetic Instrumentation fits in the current Test and Measurement (T&M) application continuum. The paper concludes with a discussion on the possible disruptive impact of this emerging technology on the Test and Measurement (T&M) marketplace and the key technologies that will have a future impact on the continuing evolution of Synthetic Instrumentation.

Keywords – Synthetic Instrumentation, Synthetic Instrument

I. INTRODUCTION

A Government Accountability Office (GAO) study conducted during the previous decade revealed that the Department of Defense (DOD) employs more than 400 unique types of test systems to test and diagnose anomalies in various DOD avionics and weapon systems [1]. The DOD spent more than $50 billion on the acquisition and support of Automatic Test Equipment (ATE) from 1980 through 1992. These procurements often resulted in a proliferation of special-purpose testers designed to support a specific weapon system or a group of Weapon Replaceable or Shop Replaceable Assemblies.

With both Congress and the taxpayers eager to reduce spending, and with the DOD seeking to improve avionics and weapon systems readiness, both Government and industry have, over the past decade, searched for new T&M paradigms that would mitigate the cost and inefficiency associated with "ATE point solutions" and thereby improve overall ATE capability and cost effectiveness throughout the DOD. One potential technology that may help improve the efficiency of the DOD's ATE procurement and deployment capability has emerged over the last several years: "Synthetic Instrumentation" [2]. The concept of Synthetic Instrumentation was born out of the DOD's NxTest initiative. The primary goals of NxTest were to reduce the total cost of ownership of DOD Automatic Test Systems (ATS) and to provide greater flexibility to the warfighter through Joint Services interoperable ATS [3].

II. SYNTHETIC INSTRUMENTATION OVERVIEW

The genesis of modern-day instantiations of Synthetic Instruments (SI) finds its roots in the communications revolution of the past decade and the emergence of a concept called Software Defined Radios (SDRs). SDRs use the computational capability and flexibility of DSP hardware to perform many of the functional tasks of a modern communication system. SI has been following a parallel development path and is predicated on the concept that most stimulus and measurement functions can be implemented in software employing core SI hardware and software components, many of which may be Commercial-off-the-Shelf (COTS) components. These core components may be supplemented, as the user's envelope of test requirements dictates, with additional COTS or modified-COTS hardware and software [4]. A high-level notional block diagram (Figure 1) of an SI-based test and measurement capability appears very similar to that of an SDR.

Figure 1. Synthetic Instrument (SI) based ATS: Notional Block Diagram

Important observations regarding Figure 1 are as follows. First, the Down Converter functional block is perhaps the most critical component in the measurement path [5]. The Down Converter must provide the frequency translation/filtering function and, via a combination of mixing and filtering, faithfully reproduce at baseband the signal that was modulated onto the selected microwave carrier. Second, the input bandwidth and sampling speed of the ADC are often the primary factors that limit the accuracy of the measurement to be performed. A crucial design parameter in the SI paradigm is the ability of the ADC functional block to transmit digital data to the DSP software, residing on an embedded or external host PC, for analysis in a timely manner. This data transmission must occur at a sufficiently high data rate to ensure that the DSP software does not miss needed data points and can perform analysis on streaming digitized data in real time, or in near real time for embedded applications. Finally, an SI's overall conversion/data transfer rate is a function of the Down Converter's IF frequency, the ADC's sampling rate and sample size, and the SI implementation environment or data transport/bus mechanism employed.
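To make the data-rate constraint concrete, the sustained rate the ADC-to-host transport must carry can be estimated as sampling rate × sample size × channel count. The sketch below uses hypothetical figures (a 500 MS/s, 16-bit digitizer and an illustrative bus budget); none of these numbers come from the paper.

```python
def required_data_rate_bps(sample_rate_sps: float, bits_per_sample: int,
                           channels: int = 1) -> float:
    """Sustained bit rate the ADC-to-DSP transport must carry."""
    return sample_rate_sps * bits_per_sample * channels

# Hypothetical SI digitizer: 500 MS/s at 16 bits, single channel.
adc_rate = required_data_rate_bps(500e6, 16)     # bits per second, sustained
bus_budget_bps = 16e9                            # illustrative transport budget
streams_supported = int(bus_budget_bps // adc_rate)
print(f"{adc_rate / 1e9:.0f} Gb/s per channel; "
      f"{streams_supported} such stream(s) fit the assumed bus budget")
```

In practice, protocol overhead reduces the usable fraction of any bus's raw bandwidth, so a real design would derate this budget before committing to an architecture.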

III. ATTRIBUTES OF SYNTHETIC INSTRUMENTATION

By virtue of its flexibility to synthesize a stimulus or measurement function from a limited set of generic hardware and software components, coupled with the high performance achievable with modern COTS technologies, SI can often replace racks of conventional test instruments and reduce overall ATS size/footprint, hardware logistics costs (i.e., spares), and associated instrument/ATS calibration costs [6]. Also, since an SI-based ATS is primarily composed of a limited set of generic, standards-based COTS components, the obsolescence risk associated with single-source components and limited lifecycles is substantially reduced, along with the expense of the "lifetime buys" that afflict traditional ATS.

Another important attribute of an SI-based ATS is its intrinsic ability to accommodate upgrades to address new requirements; these are facilitated via software, firmware, or hardware technology insertion, effected by upgrading key components, as opposed to a lengthy hardware acquisition process. SI-based ATS also yields a significant reduction in non-recurring engineering costs for platform and self-test development, recurring maintenance, and training costs, since a single SI-based platform can provide numerous discrete instrumentation functions, versus a conventional rack-and-stack instrumentation architecture that employs a number of unique test subsystems.

It can also be argued that utilizing SI reduces ATS test software development time and increases measurement speed and testing efficiency because SI is primarily a signal-based stimulus and measurement paradigm. By virtue of being signal based, measurements which use the same data set can be effected in a shorter period of time than classical measurements performed with traditional instrumentation, which repetitively acquires, analyzes, and presents results for each unique measurement (i.e., rise time, fall time, pulse width, frequency, etc.).
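The signal-based approach can be illustrated with a sketch that derives several classical measurements from one captured record. The waveform, thresholds, and helper names below are invented for illustration; real SI software would use calibrated, vendor-specific routines.

```python
import numpy as np

def measure_pulse(t, v, lo=0.1, hi=0.9):
    """Derive several time-domain measurements from a single captured record."""
    vmin, vmax = float(v.min()), float(v.max())
    amp = vmax - vmin

    def crossing(level, rising=True):
        # First time the waveform crosses `level` in the requested direction,
        # refined by linear interpolation between adjacent samples.
        above = v >= level
        for i in np.flatnonzero(above[1:] != above[:-1]):
            if (v[i + 1] > v[i]) == rising:
                frac = (level - v[i]) / (v[i + 1] - v[i])
                return t[i] + frac * (t[i + 1] - t[i])
        raise ValueError("no such crossing in record")

    mid = vmin + 0.5 * amp
    return {
        "rise_time": crossing(vmin + hi * amp) - crossing(vmin + lo * amp),
        "pulse_width": crossing(mid, rising=False) - crossing(mid, rising=True),
    }

# One acquisition of a synthetic trapezoidal pulse...
t = np.linspace(0.0, 10e-6, 1001)
v = np.interp(t, [0, 1e-6, 2e-6, 5e-6, 6e-6, 10e-6], [0, 0, 1, 1, 0, 0])
# ...yields multiple measurements with no re-acquisition between them.
m = measure_pulse(t, v)
```

Adding a new measurement (fall time, overshoot, etc.) operates on the same record, which is the source of the speed advantage claimed above.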

IV. CLASSES/TYPES OF SYNTHETIC INSTRUMENTATION

There currently exist three classes or types of Synthetic Instruments in the marketplace [7]. For the sake of simplicity, the three classes are defined as follows: Class A, a Modular/Loosely Coupled Open Architecture SI; Class B, an Integrated SI System; and Class C, an Application-Specific SI. Class A pertains to those SI systems which are synthesized by applying modular system standards such as PXI and VXI. For users who do not want to "spin" their own SI, preferring to buy a turnkey yet still COTS solution, Class B is perhaps the best approach to reap the benefits of the SI paradigm. Users who select this route are somewhat dependent on the "canned" software capabilities, as well as the robustness, flexibility, and ease of use of the software system provided by the integrated SI system supplier. For those applications that require adaptability, a Class C or Application-Specific SI (ASSI) solution may yield the most benefits. This approach is especially useful in complex T&M applications such as those typically experienced by DOD users who desire the benefits/attributes of SI but need to preserve their investment in legacy Test Program Set (TPS) hardware/software.
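The three classes can be condensed into a small lookup structure for reference. The field names are invented, and the "typical user" entries paraphrase the descriptions above.

```python
# Condensed from the Class A/B/C descriptions above; field names are invented.
SI_CLASSES = {
    "A": {
        "name": "Modular/Loosely Coupled Open Architecture SI",
        "built_from": "modular system standards such as PXI and VXI",
        "typical_user": "integrators willing to 'spin' their own SI",
    },
    "B": {
        "name": "Integrated SI System",
        "built_from": "turnkey COTS system with supplier-provided software",
        "typical_user": "users wanting COTS benefits without doing integration",
    },
    "C": {
        "name": "Application-Specific SI (ASSI)",
        "built_from": "SI tailored to a complex application",
        "typical_user": "e.g., DOD users preserving legacy TPS investments",
    },
}

def describe(si_class: str) -> str:
    """One-line summary of a class, for quick comparison."""
    info = SI_CLASSES[si_class]
    return f"Class {si_class}: {info['name']} ({info['typical_user']})"
```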

V. WHERE DOES SI FIT IN THE T&M CONTINUUM?

This is a question often asked by users who are considering use of SI technology and the three classes/types of SI previously discussed. If users are facing a "new start" application environment that is rapidly changing and/or will need to support a broad cross-section of UUTs over a number of years, SI may be the right choice. Adoption of the SI paradigm will likely reduce the need for different/unique types of test stations and/or instruments, as well as the training and other associated logistics costs discussed previously. Also, if the test requirements are ill defined or emerging, SI provides the testing flexibility to define "just in time" test scripts which are software or FPGA based, not dependent on the inflexible embedded firmware of a discrete instrument. Typically, an SI-based system is provided with a baseline/generic spectrum analysis, modulation analysis, and time-domain analysis capability to jump-start an ATS TPS development activity. As unique testing requirements are identified, the SI supplier may enhance/augment the existing baseline test capability or supply Application Programming Interfaces (APIs) and test sequences and/or routines to accommodate the customer's unique testing requirements.

Conversely, SI may not be a good fit if a testing application need only support a limited number of UUTs, provided that the test requirements are well defined and can be supported by traditional COTS instrumentation and/or software, that the longevity of the testing solution spans only one or two years (as opposed to five years or more), and that the testing solution satisfies the size/footprint requirements of its intended environment. This is predicated on the fact that, given the limited amount of traditional test equipment required, the inherent flexibility of SI may carry a somewhat higher initial hardware and software cost than traditional approaches in order to provide a long-term solution that may be re-used across many ATS applications. In some instances, a hybrid solution encompassing both SI and classical instruments may be the answer when a user is trying to optimize near-term ATS cost, test flexibility, and size. This approach fits very well in support of large DOD test systems or commercial depots commissioned to service a broad cross-section of UUTs. In this scenario, SI-based instrumentation may be employed to provide the following functional test capabilities: spectrum analysis, time-domain analysis, RF/μW power measurement and analysis, vector signal and modulation analysis, scalar or vector network analysis, complex baseband waveform generation/analysis, RF/μW complex signal generation, and digital bus emulation.
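The selection guidance above can be condensed into a screening sketch. Only the one-to-two-year longevity threshold comes from the text; the function name and the reduction of the remaining criteria to booleans are simplifications for illustration.

```python
def recommend_approach(broad_uut_mix: bool, requirements_emerging: bool,
                       lifespan_years: float) -> str:
    """Coarse paraphrase of the Section V decision logic."""
    if broad_uut_mix and requirements_emerging:
        return "SI"                                  # rapidly changing, many UUTs
    if not broad_uut_mix and not requirements_emerging and lifespan_years <= 2:
        return "traditional COTS instrumentation"    # short-lived, well defined
    return "hybrid (SI plus classical instruments)"  # balance cost/flexibility/size

choice = recommend_approach(broad_uut_mix=True, requirements_emerging=True,
                            lifespan_years=10)
```

A real trade study would of course weigh many more factors (footprint, calibration burden, legacy TPS investment) than this two-branch sketch.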

VI. SI – A DISRUPTIVE TECHNOLOGY

The concept of Synthetic Instrumentation represents a "sea change," or fundamental paradigm shift in thinking, regarding the implementation of T&M solutions to support the needs of the ATS marketplace. Transforming the T&M industry business model from discrete RF/µW hardware instrument types/classes to open-architecture solutions that combine PC-based software with a limited set of COTS, standards-based functional hardware building blocks, providing equivalent or higher performance at lower non-recurring and recurring costs, has the potential to disrupt the T&M industry much as the microprocessor, the PC, and standardized operating systems and applications disrupted the computing industry in the 1970s and 80s.

It is commonly observed that commercial technologies often progress faster in terms of feature sets and performance than customers demand. Clayton Christensen, in his watershed book The Innovator's Dilemma, was among the first to articulate this phenomenon [8]. As he observed, in their quest to provide better products than their competitors and thereby earn higher profits and margins, suppliers often "overshoot" their market by supplying customers with more functionality or performance than they need, or for which they are ultimately willing to pay. Innovators often realize that a new, lower-cost or otherwise more efficient technology may lend itself to a given application and proceed to adapt that technology to the existing market, thereby disrupting the status quo if successful. Disruptive technologies introduced to supplant established technologies often initially underperform relative to users' current expectations of the traditional technologies. However, disruptive technologies occasionally become fully performance-compliant, or even superior, as the companies that develop them better understand customer needs and incorporate improvements into the products based on the disruptive technologies. Technology-intensive industries are replete with examples where a disruptive technology was introduced, experienced a reasonable maturation period, and then became the dominant technology within the industry, often at the expense of the traditional technologies that originally prevailed. Figure 2 depicts a notional representation of Christensen's premise in a Synthetic Instrumentation context. Opponents of Synthetic Instrumentation have argued (see Figure 2, SI Progress vs. Time, Model 1 curve) that discrete instruments will always out-perform SI because the discrete instrument has been optimized for a specific measurement, whereas SIs are designed to perform a myriad of measurement functions.

However, because of the underlying COTS technologies that provide the foundation of SI, and the rapid evolution those COTS technologies exhibit, it may be counter-argued that SI's performance-vs.-time curve will exhibit a steeper slope than the corresponding discrete instrument's curve, and/or that SI will exhibit non-linear improvements over time, resulting in accelerating SI performance that perhaps ultimately surpasses the performance of discrete mainstream instrumentation.

Figure 2. Synthetic Instrumentation, a Disruptive Technology

The basis for the counter-argument stems from the fact that SI leverages advances in several other supporting technologies, many of which are themselves disruptive in nature, including FPGAs, multi-core processors, new software methodologies, new bus standards, etc., as well as some which are not (e.g., ADCs).

Success in making a disruptive technology such as SI a viable mainstream technology lies in continually evolving the solution, investing in supporting technologies via continuous improvement in support of a targeted application or niche, until the customer's minimum but essential technical and operational needs have been satisfied at a comparable or lower cost. However, the case for SI as a disruptive technology is more powerful than that portrayed in the simple two-dimensional Figure 2. SI has the potential to displace not just one test instrument technology but a wide spectrum of application-specific test technologies and their associated functions, much as the PC displaced multiple, discrete computing platforms.

As mentioned previously, test instrument manufacturers have continuously evolved their T&M instruments with features and test capabilities which often exceed the testing needs of their users. A case in point is "high performance" spectrum analyzers, which contain literally hundreds of functions, many of which are never employed by the user. Over time the customer may be lulled into a false sense of security, knowing that he does not have to rigorously define his test requirement because his favorite T&M vendor will provide an excess of capability from which he can draw to meet his testing needs. The customer in essence becomes technology driven as opposed to requirements driven. This approach only works if the user has the deep pockets and the rack space to accommodate the latest and greatest instrument-specific technologies, as well as the logistics support infrastructure to maintain and calibrate the myriad of test instruments and train personnel on instrument-specific operations and applications. SI provides the opportunity to satisfy a customer's application-specific test requirements via test scripting of needed functions (e.g., FFT spur search, modulation analysis, etc.) and the employment of Graphical User Interface (GUI) software.
SI essentially enables cost-effective tailoring of ATS equipment to meet customer needs, since customers pay only for the hardware and software needed to support their test applications. With SI, incremental/just-in-time test software functional capability may be added by the user based on real testing needs as opposed to perceived testing needs, just as PC users can add software to a PC to meet new application requirements. And with SI's use of modular technologies such as PXI and VXI, upgrades to extend the frequency range or improve the accuracy of the ATS in response to new requirements may be accomplished by swapping out COTS modules (e.g., Up Converters and/or Down Converters, ADCs/DACs, etc.). This pragmatic approach is easily effected using modular technologies, especially in PXI/VXI hybrid systems, where the best technologies from both of these mainstream T&M architectures can be integrated to effect small, high-speed, and cost-effective SI-based test systems.
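The "pay only for what you need, add capability later" model resembles a plug-in registry, sketched below. The registry, decorator, and measurement names are all invented for illustration; no SI product's actual API is implied.

```python
# Hypothetical measurement registry: new test capability is added by
# registering a routine, not by procuring a new instrument.
MEASUREMENTS = {}

def measurement(name):
    """Decorator that registers a measurement routine with the SI runtime."""
    def register(fn):
        MEASUREMENTS[name] = fn
        return fn
    return register

@measurement("peak_to_peak")
def peak_to_peak(samples):
    return max(samples) - min(samples)

# Later, a user adds a new capability without touching the platform:
@measurement("mean_level")
def mean_level(samples):
    return sum(samples) / len(samples)

vpp = MEASUREMENTS["peak_to_peak"]([0.0, 1.5, -0.5])  # dispatch by name
```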

VII. CURRENT/FUTURE TRENDS

Currently, early adopters of the Synthetic Instrumentation paradigm are primarily in the DOD market sector. This is understandable, given that SI-based test systems were born primarily out of the DOD's NxTest program and the need within the DOD to test such a wide variety of systems spanning numerous technologies and vintages. Today, SI capabilities known by the authors to be in development or deployment include a number of programs, such as the U.S. Navy's multi-service Agile Rapid Global Combat Support (ARGCS) Advanced Technology Demonstration (ATD) program, as well as fielded ATE for the U.S. Air Force and the U.S. Marine Corps.

The Synthetic Instrumentation landscape in the commercial community is quite different. Although SI-based systems and components are commercially available, the commercial T&M market has to date adopted a wait-and-see attitude toward this emerging technology. Adoption of SI requires a long-term strategy of continuously evolving and improving T&M capabilities via incremental hardware technology upgrades and functional test capabilities (i.e., software) based on current and future needs. In the long term, SI may prove to be both a cheaper and a more tractable support strategy than that currently employed in industry; but SI requires more of a systems engineering and strategic approach to product test and support than that employed by the majority of manufacturers today. To be successful, employment of an SI paradigm requires a commitment to a seamless continuum of evolving product improvement (both hardware and software) to keep up with the accelerated technology lifecycle of the emerging products to be supported in the future.

As one observes the current state of the marketplace and research and development (R&D) activity, one observation becomes increasingly clear: if the SI marketplace grows and evolves as some proponents predict, the T&M industry may, over the next decade, transform or morph into a subset of the broader information technology marketplace. How does one draw such a conclusion? For one thing, SI systems are becoming more dependent on DSP, FPGA, and similar advanced processing techniques, as well as on high-speed ADCs and DACs. Customers are becoming energized about the possibility of up-converting baseband/IF signals to the RF/μW spectrum, down-converting RF/μW signals to baseband/IF, digitizing the IF signals, and writing measurement and analysis algorithms to operate on and manipulate this data. The SI paradigm is moving more and more of the RF/μW environment into the digital domain, with the attendant benefits that accrue from the employment of digital hardware components and associated baseband data collection and measurement/analysis algorithms. In this regard, RF/μW signal generation and analysis is becoming a more tractable endeavor when implemented within an SI context. Based on the above observations, the authors believe a number of emerging trends will have a profound impact on the Synthetic Instrumentation marketplace on "the road ahead" in the coming years. A brief summary of these hardware and software technologies is discussed in the following paragraphs.

A. Field Programmable Gate Arrays (FPGAs)

FPGA-based processing has become an increasingly important resource in the implementation of SI-based test systems. High-speed programmable logic technology now offers significant advantages for implementing SI functions such as real-time spectrum analysis and time-domain analysis. Over the past decade, the functions associated with T&M have shifted from custom Application-Specific Integrated Circuits (ASICs) to T&M Intellectual Property (IP) implemented in FPGAs. FPGA modules are employed in SI-based systems where real-time or near real-time performance is required to effect a measurement, or where signal "pre-processing" needs to be performed before a measurement is made in the ATS. By pre-processing the digitized IF signal (for example, frequency translating the digitized signal from IF to baseband, converting the signal to a complex I&Q representation, filtering, and re-sampling), the SI can improve the signal quality, and thereby the measurements made upon it. In addition, filtering and re-sampling the signal prior to making the measurement (i.e., reducing the number of samples required) minimizes the computational and memory resources needed, which reduces cost.

FPGAs can be employed as an add-on to the ADC module or as a separate board in a modular test system. For SI, this implementation shift brings advantages that include design flexibility, faster measurement speed, higher-precision processing, lower power, and lower cost per measurement channel. With each newer and higher-performance FPGA family entering the marketplace, these benefits have leveled the playing field between ASIC-based and FPGA-based solutions. For an increasing number of medium-volume applications such as SI, FPGA technology has become the preferred solution.
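The pre-processing chain described above (translate to baseband, form I&Q, filter, re-sample) can be sketched in a few lines. A real FPGA design would use an NCO feeding CIC/FIR decimation stages; the boxcar filter and all numeric values here are illustrative stand-ins.

```python
import numpy as np

def digital_down_convert(x, fs, f_if, decim):
    """Mix a real IF record to complex baseband (I&Q), low-pass, re-sample."""
    n = np.arange(len(x))
    lo = np.exp(-2j * np.pi * f_if / fs * n)   # complex local oscillator
    bb = x * lo                                # I&Q, signal now centered near DC
    taps = np.ones(decim) / decim              # crude boxcar low-pass filter
    filtered = np.convolve(bb, taps, mode="same")
    return filtered[::decim]                   # decimate: fewer samples to process

fs, f_if, decim = 100e6, 20e6, 8               # illustrative rates
n = np.arange(4096)
x = np.cos(2 * np.pi * f_if / fs * n)          # captured 20 MHz IF tone
iq = digital_down_convert(x, fs, f_if, decim)  # 512 complex baseband samples
```

The decimated record carries the information of interest in one-eighth the samples, which is exactly the computational and memory saving the text describes.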

The design tools emerging to support new FPGA families, combined with the ubiquity of third-party IP cores that support those tools, now enable literally thousands of application-specific algorithms that can be applied to implement a wide assortment of traditional and emerging measurement functions, in many cases programmed via a user-friendly GUI-based tool set. These parameterizable IP cores are ready to drop into the FPGA design process to help meet time-to-market challenges and minimize risk. The bottom line is that FPGAs and their associated tools [9] will play a leading role in SI implementations in the coming years, owing to their performance, flexibility, and increasing functionality in a market space that requires flexible and adaptable instrumentation: an ever-changing T&M landscape where new products (UUTs) and technology introductions will bring test-requirement uncertainty and testing challenges to both the product developer and the maintainer in the field.

B. Multicore Processors/Parallel Programming

Since the advent of the microprocessor, processor manufacturers have improved the performance of Central Processing Units (CPUs) through architectural advances and by increasing clock speed via reductions in the minimum feature size on the die. To circumvent the speed and power issues of classical processor development, manufacturers (Intel, IBM, and others) have begun implementing multiple "cores" within a processor, arranged in a parallel, closely coupled architecture, to increase effective processor throughput on multi-threaded tasks or on tasks that can utilize Single Instruction, Multiple Data (SIMD) programming paradigms that enable algorithms to be spread across multiple cores. When running on a multi-core system, a multitasking Operating System (OS) such as Microsoft™ Windows™ can execute multiple tasks concurrently: the multiple computing engines work independently on different tasks (e.g., SI spectrum analysis on input port A and SI time-domain analysis on input port B).
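The port-A/port-B example can be sketched with Python's standard concurrency tools. Threads are used here only to show the structure; a production multichannel SI would pin independent channel pipelines to separate cores or processes. All signal parameters are invented.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def spectrum_peak_hz(samples, fs):
    """'Port A' task: frequency of the strongest spectral line."""
    spectrum = np.abs(np.fft.rfft(samples))
    peak_bin = int(np.argmax(spectrum[1:])) + 1      # skip the DC bin
    return float(np.fft.rfftfreq(len(samples), 1 / fs)[peak_bin])

def peak_to_peak(samples):
    """'Port B' task: a simple time-domain measurement."""
    return float(np.max(samples) - np.min(samples))

fs = 1e6
t = np.arange(4096) / fs
port_a = np.sin(2 * np.pi * 50e3 * t)        # 50 kHz tone on port A
port_b = 0.5 * np.sin(2 * np.pi * 3e3 * t)   # 1 Vpp signal on port B

with ThreadPoolExecutor(max_workers=2) as pool:  # two independent analyses
    fut_a = pool.submit(spectrum_peak_hz, port_a, fs)
    fut_b = pool.submit(peak_to_peak, port_b)
    peak_hz, vpp = fut_a.result(), fut_b.result()
```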

It is envisioned that in future applications of SI, multi-core computing engines will work independently, controlling different tasks on independent multi-channel SI. Emerging trends such as parallel programming, and the associated tools which assist the user in visualizing and designing parallel code, will be of great benefit in applying multi-core processors to multichannel SI-based systems, where each channel concurrently runs a different application program in support of multiple UUTs, or of different I/O ports/pins on the same UUT.

C. Low Phase Noise/Fast Switching Microwave Synthesizers

Frequency synthesizers are fundamental to the operation and performance of SI (see Figure 1). Historically, two critical (and opposing) parameters have influenced synthesizer selection and performance: phase noise and frequency switching speed. Traditionally, frequency synthesis has been performed by a number of mainstream direct and indirect techniques, such as Direct Analog Frequency Synthesis (DAFS), Direct Digital Synthesis (DDS), and indirect or Phase Lock Loop (PLL) techniques. DAFS often provides excellent phase noise and switching-speed performance but is typically more complex, more costly, larger, and more power-hungry than the other techniques. DDS technology's primary attribute is frequency switching speed; however, it is often encumbered by limited frequency coverage and spurious performance. DDS-based synthesizers have tremendous potential for future growth and application to SI as a result of rapid developments in GaAs, Si, and SiGe devices; extension of the usable DDS bandwidth, together with spurious reduction, is the key improvement required for this technology to become an emerging trend in modular SI-based systems. Frequency synthesizers employing PLL technology are used throughout the electronics industry and are extremely popular due to their simplicity, availability, and relatively low cost.
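DDS's switching speed follows directly from its tuning mechanism: the output frequency is set by a digital frequency tuning word (FTW) in a phase accumulator, f_out = FTW × f_clk / 2^N, so retuning is just a register write. The clock rate and accumulator width below are hypothetical.

```python
def dds_output_hz(ftw: int, f_clk_hz: float, acc_bits: int) -> float:
    """Standard DDS tuning equation: f_out = FTW * f_clk / 2**N."""
    return ftw * f_clk_hz / 2 ** acc_bits

def dds_ftw(f_out_hz: float, f_clk_hz: float, acc_bits: int) -> int:
    """Nearest tuning word for a requested output frequency."""
    return round(f_out_hz * 2 ** acc_bits / f_clk_hz)

f_clk, acc_bits = 1e9, 32                  # hypothetical 1 GHz clock, 32-bit accumulator
resolution_hz = f_clk / 2 ** acc_bits      # sub-hertz tuning resolution
ftw = dds_ftw(10.7e6, f_clk, acc_bits)     # word for a 10.7 MHz output
actual_hz = dds_output_hz(ftw, f_clk, acc_bits)
```

Note that usable output is limited to well under f_clk/2, and images/spurs of the DAC output must still be filtered, which is the coverage/spurious limitation the text describes.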

Historically, T&M synthesizer developers have relied on Yttrium Iron Garnet (YIG) tuned oscillators, which feature broadband operation and excellent phase noise performance. From a performance perspective, the main disadvantage of YIG technology is its slow switching speed; this drawback is the primary cause of the recent shift away from YIG-based designs toward indirect techniques employing Voltage Controlled Oscillator (VCO) based technology [10]. This emerging synthesizer technology will be fueled by the ability to provide fast frequency switching while reducing the residual noise floor, allowing PLL bandwidths to be extended to much higher frequencies where solid-state VCO noise becomes competitive with YIG devices. A near-term trend in SI is therefore the emergence of fast switching/low phase noise indirect synthesis employing VCO technology as a replacement for traditional YIG-based synthesizers [11] where signal quality and speed are of paramount importance, especially in military flight line and other forward-level maintenance applications.

D. Data Converters

SI-based systems effect measurements (and generate signals) by employing numerical processing techniques, a process that is inherently digital. However, the ATE interface to the UUT is typically analog, and the ADC/DAC forms the interface between the analog and digital domains. Therefore, as with frequency translation, ADCs and DACs are essential (and critical) components of the SI architecture, and advances in data converter technology have a major driving effect on SI performance because the signals of interest must be accurately reproduced before a measurement is made or a stimulus signal is generated. The challenge for SI data converters is to simultaneously maximize accuracy and capture bandwidth; because accuracy and bandwidth are inversely proportional, the SI is usually forced to employ multiple data converters [12]. However, empirical data show that capture bandwidth (for a given accuracy) is increasing by approximately a factor of two every three years. From an SI perspective, we therefore predict that SI-based test systems will eventually converge to single data converter architectures.
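The paper's empirical trend, capture bandwidth doubling roughly every three years at fixed accuracy, is easy to extrapolate. A toy projection follows; the 500 MHz starting bandwidth is an illustrative assumption, not a figure from the paper:

```python
def projected_bandwidth(bw_start_hz, years, doubling_period_years=3.0):
    # Extrapolate the "factor of two every three years" empirical trend.
    return bw_start_hz * 2 ** (years / doubling_period_years)

# Three doublings over nine years: an assumed 500 MHz grows to 4 GHz.
print(projected_bandwidth(500e6, 9))
```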

From an ADC performance perspective, recent literature indicates that current and projected advances in sample rates are on track with market demands; however, projected advances in ADC bandwidth performance fall short of what is required [13]. The authors anticipate that future SI data converter requirements will focus on increased capture bandwidth, improved noise performance, and improved intermodulation and spurious performance in support of spectrum analysis related applications.

New and emerging SI systems will be required to sample or synthesize higher frequencies, not only for higher IFs but also for direct RF sampling and synthesis. In the future, direct RF sampling may well be the key to greatly simplified systems and lower SI costs, with the caveat that performance must be comparable to, and cost competitive with, SI systems that employ classical RF/μW up converter/down converter frequency translation techniques. Should this occur, it will be a defining moment in SI-based systems development. The authors do not foresee digital technology supplanting classical RF/μW frequency translation techniques in the foreseeable future. Much of the innovation in high speed converter development over the last decade has focused on improving converter operation at higher analog RF frequencies. However, while input bandwidth has improved significantly over the past 10 years, SFDR has not. A system with a large input bandwidth is less prone to slew rate limitations, allowing the ADC to better track the input signal; that is, for optimal spurious performance, a wide input bandwidth is desirable. However, the small sampling capacitance that yields this increased input bandwidth also allows more noise to enter the ADC front end and be spread over the Nyquist spectrum, lowering the ADC SNR. This creates an interesting tradeoff between spurious performance and noise. Along the ADC performance specification continuum, the authors anticipate that input bandwidth and sample rates will continue to improve incrementally in the foreseeable future. However, the other two primary data converter parameters of interest, SFDR and SNR, are not orthogonal and must always be traded off against one another. Also, since the SFDR-bandwidth product is constrained by the physical size of the sampling capacitor and the manufacturing processes employed in its implementation, the authors foresee that an improvement in SFDR performance will most likely require a new or enhanced technology to displace the standard CMOS processes currently in use [14].
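The SNR pressure that accompanies wider input bandwidth can be quantified with standard textbook ADC relationships (background formulas, not derived in this paper): the ideal quantization-limited SNR of an N-bit converter, and the aperture-jitter-limited SNR, which falls as the input frequency rises:

```python
import math

def snr_quantization_db(bits):
    # Ideal N-bit quantizer, full-scale sine input: 6.02*N + 1.76 dB.
    return 6.02 * bits + 1.76

def snr_jitter_db(f_in_hz, jitter_s):
    # Aperture-jitter-limited SNR: -20*log10(2*pi*f_in*t_j).
    return -20.0 * math.log10(2.0 * math.pi * f_in_hz * jitter_s)

print(snr_quantization_db(14))       # ideal 14-bit: ~86 dB
print(snr_jitter_db(1e9, 100e-15))   # 1 GHz input, 100 fs jitter: ~64 dB
```

The illustrative numbers show the squeeze: at a 1 GHz input even 100 fs of jitter limits SNR well below what a 14-bit converter could otherwise deliver.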

E. Multi-Bus Architectures

Over the past five decades, the T&M industry has made tremendous progress in the development of instrumentation data buses that facilitate the automation of the T&M process. Examples of mainstream buses are the General Purpose Interface Bus (GPIB), Universal Serial Bus (USB), VME Extensions for Instrumentation (VXI), PCI Extensions for Instrumentation (PXI), LAN eXtensions for Instrumentation (LXI), and the emerging hybrid parallel/serial bus architectures (PCI Express). Each of these buses addresses particular user requirements such as system latency, bandwidth (speed and number of data I/O lanes), ease of installation and use, modularity, and data transmission range. The customer/user must choose the platform types that are right for a particular SI application or range of applications. For example, the PXI bus and its PXI Express (PXIe) extension satisfy SI needs for high speed data streaming, up to 1 GB/sec, from SI digitizers to a companion FPGA board for processing. Similarly, LXI and its associated LAN interface excels in remote control of SI applications, data transmission range, and widespread use. The authors foresee that a future SI design trend will be the employment of multi-bus architectures to synthesize tailored SI end user solutions.

An important aspect of the synthesis process with respect to legacy systems may be how to cost effectively re-engineer classical rack-and-stack systems into SI systems by employing a subset of available assets, thereby creating more flexible ATS better tailored to the end user's needs. The authors foresee the increased use of hybrid SI architectures in the out years as users employ "best of class" SI components, regardless of form factor or bus I/O implementation, to implement their SI solutions. For example, a system could be configured employing GPIB, VXI, or LXI components to synthesize the SHE path, owing to the widespread availability of up converters/synthesizers and AWGs employing these instrumentation buses and/or formats. Similarly, a user could employ PXI controllers, down converters, FPGA boards, and ADCs in support of MHE functionality because of their size, speed, and low cost attributes. Customization and flexibility in support of a customer's unique requirements will be the mantra of future SI solution architects, and multi-bus systems will play a key role in this regard.
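A hybrid system such as the one just described could be captured in a simple machine-readable description; the structure and field names below are hypothetical, mirroring only the bus assignments given in the text:

```python
# Hypothetical multi-bus SI system description (illustrative only).
hybrid_si_system = {
    "SHE": {  # stimulus path
        "buses": ["GPIB", "VXI", "LXI"],
        "components": ["up converter/synthesizer", "AWG"],
    },
    "MHE": {  # measurement path
        "buses": ["PXI"],
        "components": ["controller", "down converter", "FPGA board", "ADC"],
    },
}

def buses_used(system):
    # Flatten the distinct instrumentation buses a configuration spans.
    return sorted({bus for path in system.values() for bus in path["buses"]})

print(buses_used(hybrid_si_system))
```

A description of this kind would let tooling verify that a proposed configuration's buses are all supported before assets are procured.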

VIII. SUMMARY & CONCLUSIONS

Synthetic Instrumentation is an emerging technology that will transform the T&M industry in support of an increased customer focus on flexible, tailored, and cost-effective multi-purpose instrumentation. Transformation of the Synthetic Instrumentation paradigm from the early adopter phase to a global, mainstream solution will require acceptance and development of core SI supporting technologies, along with support from a supply chain of leading T&M hardware and component manufacturers as well as test software and facilitating software tool suppliers. This transformation will be necessary to satisfy the customer's need for faster, smaller, more cost effective, and more information-intensive ATS that have long service lives and are compatible with the ubiquitous Personal Computer paradigm and its associated visual/parallel programming and data presentation tools.

REFERENCES

[1] "Military Readiness: DOD Needs to Better Manage Automatic Test Equipment Modernization." GAO-03-451 (March 31, 2003).
[2] Ames, Ben. "Test and Measurement tools struggle to keep up with shrinking chips." Military and Aerospace Electronics (June 2004).
[3] Ross, William. "The Impact of Next Generation Test Technology on Aviation Maintenance." Autotestcon 2003 Proceedings.
[4] Granieri, Mike. "Synthetic Instrumentation: An Emerging Technology (Part I)." RF Design (Feb. 2004): 16-25.
[5] Brown, Scott. "Real World Applications of Synthetic Instrumentation." Autotestcon 2004 Proceedings: 434-439.
[6] "Using Synthetic Instruments in Your Test System." Agilent Technologies Application Note 1465-24.
[7] Pragastis, Peter; Sihra, Iqbal; Granieri, Michael. "Synthetic Instrumentation: Contemporary Architectures & Applications (Part II)." RF Design/Defense Electronics Supplement (Nov. 2004): 12-19.
[8] Christensen, Clayton M. The Innovator's Dilemma (Boston: Harvard Business School Press, 1997): xiii-xxiv.
[9] "C interface for NI FPGA-based Hardware." Data Week (April 29, 2009).
[10] Chenakin, Alexander. "Frequency Synthesis: Current Solutions and New Trends." Microwave Journal (May 2007).
[11] Chenakin, Alexander. "Novel Approach Yields Fast, Clean Synthesizers." Microwaves & RF (October 2008).
[12] Lowdermilk, Robert; Harris, Fredric. "Signal Conditioning and Data Collection in Synthetic Instruments." Autotestcon 2003 Proceedings.
[13] Brannon, Brad. "Converter Performance approaches software-defined radio requirements." www.rfdesign (April 2007).
[14] Brannon, Brad. "Understanding State of the Art in ADCs." www.converter-radio.com (Nov. 2007).