Problem Solvers - The Next Decade of the Software-Defined Instrument
When you take a step back and look at how RF instruments have evolved over the past twenty years, you can see how each new wireless standard has pushed RF front ends to new levels of analog performance. For example, one might remember the ORFS (Output RF Spectrum) measurement from GSM – long regarded as one of the most challenging RF measurements an engineer could make with a signal analyzer. Fast forward a few years, and the ACLR (Adjacent Channel Leakage Ratio) measurement from WCDMA drove innovations such as “noise compensation” in the RF signal analyzer world – a technique that lets an engineer measure with better dynamic range. Looking forward, there is no question that wireless standards will continue to drive the analog performance of RF instruments – a trend already visible in LTE-Advanced and 802.11ac.
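Noise compensation is conceptually simple even if its implementation inside a real analyzer is not: characterize the analyzer's own noise power in the adjacent channel ahead of time, then subtract it from the adjacent-channel power reading before forming the ratio. A minimal, illustrative sketch in Python follows; the function names and parameters are my own assumptions, not any instrument's API.

```python
import numpy as np

def channel_power(x, fs, f_center, bw):
    """Sum the spectral power of x over the band [f_center - bw/2, f_center + bw/2]."""
    n = len(x)
    spec = np.fft.fftshift(np.fft.fft(x) / n)          # amplitude-normalized spectrum
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / fs))
    mask = np.abs(freqs - f_center) <= bw / 2
    return np.sum(np.abs(spec[mask]) ** 2)

def aclr_db(x, fs, bw, offset, analyzer_noise=0.0):
    """ACLR in dB. 'Noise compensation' subtracts the analyzer's own
    (separately characterized) noise power from the adjacent-channel reading,
    extending dynamic range when the leakage sits near the noise floor."""
    p_main = channel_power(x, fs, 0.0, bw)
    p_adj = max(channel_power(x, fs, offset, bw) - analyzer_noise, 1e-30)
    return 10.0 * np.log10(p_main / p_adj)
```

With a weak adjacent-channel leakage buried in analyzer noise, the compensated reading recovers several dB of dynamic range that the raw ratio gives away.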
Yet, unlike a decade ago, the continued evolution of wireless standards no longer has test vendors scrambling to push the analog performance of their instruments the way it once did. Over the past decade, the microwave industry has seen increasing availability of off-the-shelf, instrumentation-grade components. From mixers to amplifiers to attenuators to analog-to-digital converters, the guts of an instrument are no longer entirely custom, in-house designs; best-in-class instrumentation can now be built from off-the-shelf technology. As a result, the greatest innovations in instrumentation over the next decade will come not from hardware performance but from the software abstraction and signal processing capabilities of the software-defined instrument.
Overview of the Software-Defined Instrument
From a technical viewpoint, the idea of a “software-defined instrument” is not a new one. Starting in the early 2000s, three critical factors drove the transition of the modern signal analyzer from a predominantly analog instrument to a digital one. First, the increasing integration of multiple radios into a single product (such as the smartphone) required RF signal analyzers to test every wireless standard imaginable. Second, the continued evolution of existing standards – such as the transition from GSM to GPRS to EDGE – required RF signal analyzers to be easily upgradable. Finally, continued innovation in the embedded processing world introduced a new generation of CPUs and FPGAs capable of handling the signal processing demands of RF instruments.
In the software-defined RF instrument, the core measurement functions are performed in software on a CPU rather than in dedicated analog hardware. For example, rather than sweeping a local oscillator so that the signal passes through an analog resolution-bandwidth filter, the RF signal is mixed down to a wideband intermediate frequency (IF) and then digitized. Once digitized, the spectrum is computed digitally with an FFT. One clear benefit of this architecture is that software-defined instruments can support composite measurements, where demodulation and spectrum measurements are performed in parallel on the same acquisition.
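To make the architecture concrete, here is a minimal sketch of the FFT-based spectrum computation described above, operating on an already-digitized complex-baseband record. The function name, the Hann-window choice, and the 50 Ω reference are illustrative assumptions, not a vendor implementation.

```python
import numpy as np

def fft_spectrum_dbm(iq, fs, ref_impedance=50.0):
    """Power spectrum (in dBm) of a digitized complex-baseband record,
    as a software-defined analyzer computes it: window, FFT, then convert
    each bin's power to dBm. Assumes iq is in volts across ref_impedance."""
    n = len(iq)
    win = np.hanning(n)
    cg = win.sum() / n                                   # window coherent gain
    spec = np.fft.fftshift(np.fft.fft(iq * win) / (n * cg))
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / fs))
    p_watts = np.abs(spec) ** 2 / ref_impedance
    return freqs, 10.0 * np.log10(np.maximum(p_watts, 1e-30) * 1e3)
```

Because the whole IF bandwidth is captured at once, the same `iq` record can also be handed to a demodulator – which is exactly what makes the composite (parallel spectrum plus demodulation) measurements mentioned above possible.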
Even today, the software-defined instrument has many benefits over the traditional approach. First, because measurements are performed in software, the instrument can easily be upgraded to demodulate new modulation types or signal formats. Perhaps the most underappreciated benefit, however, is measurement speed. When testing wireless devices, measurement speed is a signal processing problem, not an analog one: the more processing horsepower one can throw at the measurement, the faster the result.
The Future of Instrumentation
As the complexity of wireless standards has increased over the past decade, measurement speed has become an ever larger challenge. Compare a simple single-carrier, narrowband signal such as GSM with a complex OFDM standard such as LTE, and you'll find that the LTE signal requires orders of magnitude more signal processing to demodulate. As a result, even with the highest-performance PXI RF signal analyzers in the industry, an LTE EVM measurement can take up to 100 ms, while an EDGE EVM measurement can be performed in under 5 ms. Going forward, we can safely assume that the signal processing required to measure next-generation standards will only continue to grow. Carrier aggregation, for example, multiplies the required signal processing roughly by the number of carriers used. For this reason, one key element of the software-defined instrument is the user's ability to upgrade the processor over time to keep pace with next-generation standards. Given this need, modular instrumentation platforms such as PXI will become increasingly critical to solving the next generation of measurement needs.
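For context on what an EVM measurement actually computes: once the symbols have been recovered – which for LTE means synchronization, an FFT per OFDM symbol, and equalization, the expensive part – the EVM itself reduces to a short error-vector calculation. A minimal sketch of that final step (illustrative only, not any analyzer's implementation):

```python
import numpy as np

def evm_percent(measured, reference):
    """RMS EVM in percent: RMS of the error vector (measured minus ideal
    constellation points) relative to the RMS of the reference symbols."""
    err = measured - reference
    return 100.0 * np.sqrt(np.mean(np.abs(err) ** 2) /
                           np.mean(np.abs(reference) ** 2))
```

The per-symbol arithmetic is trivial; the orders-of-magnitude difference between EDGE and LTE measurement times comes from everything upstream of this function, multiplied again by each aggregated carrier.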
While the industry-wide transition from the completely analog to the software-defined RF instrument over the past decade is clear, it is only just beginning. Over the next decade, we will likely see the software-defined RF signal analyzer continue to evolve from a single-port device into a multiport, mixed-signal instrument. The primary drivers of this trend are the requirement to perform baseband and RF measurements in parallel and the growing need to control the device under test through a digital interface. Consider, for example, the typical instrument configuration for one of the traditionally “simple” parts: an RF power amplifier designed for a wireless handset. Testing a modern power amplifier requires substantially more instrumentation than just a signal generator, a signal analyzer, and a power supply.
Given the increasing complexity and integration of tomorrow’s RF instruments, the second critical requirement of the software-defined instrument is the ability to abstract measurement complexity with productive software. In the future, engineers will rely less on an instrument’s front-panel display to produce a measurement result (although the display remains important) and will instead engineer “measurement systems” that produce meaningful results. Thus, I believe that graphical system design tools such as NI LabVIEW software will become a necessity for abstracting the complexity of next-generation measurement systems.
As we look to the future of software-defined instrumentation, two critical trends will continue to drive innovation in measurement science: the need for 1) greater signal processing capability and 2) productive measurement abstraction. Even though today’s RF engineers already use instruments that are “software defined,” the role of software in the measurement system will continue to expand. Over the next decade, the term “software-defined instrument” will take on new meaning as software becomes the primary tool engineers use to configure their instrumentation.
Posted by Janine E. Mooney, Editor
November 28, 2011