Adapting Next-Generation Applications to Universal Radio Environments
Mon, 08/25/2008 - 7:56am
As the variety of user devices and applications increases, the need to predict end-user experience becomes even more critical.
Sudeep Gupta, Azimuth Systems
Wireless access communications have changed dramatically over the past five years. As users demand more from their mobile devices, the interaction between the devices and the radio environment has become more complex. 4G wireless, including the 3GPP Long Term Evolution (LTE) standard and WiMAX based on the expected 802.16m standard, promises to deliver the wide range of services that users demand, but it requires complex RF technologies to achieve its goals. For example, to reach peak throughput speeds of 100 Mbps or greater, 4G uses smart antenna technologies, such as multiple input/multiple output (MIMO), to provide the necessary spectral efficiency. Below are several critical points to consider when delivering next-generation mobile applications to various radio environments.
MIMO Drives 4G Testing Needs
MIMO takes advantage of a rich multipath environment, combining uncorrelated signal paths between two, three, or four transmit/receive paths to provide a signal-processing gain while using the same radio resources. For example, using MIMO spatial multiplexing, the RF channel may deliver up to double the throughput, so in a 5 MHz channel the user may see the same throughput they would see in a 10 MHz channel; LTE's high peak throughput depends in large part on this processing gain. Because existing testing methods are based on a traditional single-input/single-output (SISO) environment, which ignores multipath, the radio channel must be explicitly modeled to properly test an application or device in a MIMO environment.
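As a rough illustration of why spatial multiplexing matters, the Shannon capacity of an idealized link scales with the number of independent spatial streams. The sketch below is ours, not from the article: the function name and the assumption that each stream is perfectly uncorrelated and sees the full SNR are simplifications, but it shows how a 2x2 link in 5 MHz can approach a SISO link in 10 MHz.

```python
import math

def capacity_mbps(bandwidth_hz, snr_db, streams=1):
    """Idealized Shannon capacity in Mbps, assuming `streams`
    perfectly uncorrelated spatial paths, each seeing the full SNR.
    (A simplifying assumption -- real MIMO gains depend on channel
    correlation and are smaller.)"""
    snr_linear = 10 ** (snr_db / 10)
    return streams * bandwidth_hz * math.log2(1 + snr_linear) / 1e6

siso_5mhz = capacity_mbps(5e6, snr_db=20, streams=1)
mimo_5mhz = capacity_mbps(5e6, snr_db=20, streams=2)    # 2x2 spatial multiplexing
siso_10mhz = capacity_mbps(10e6, snr_db=20, streams=1)  # same throughput, double spectrum
```

Under these idealized assumptions the 2x2 link in 5 MHz exactly matches the SISO link in 10 MHz, which is the "same throughput as a 10 MHz channel" effect described above.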
How the User Experience Drives Product Development
The user experience is the single most important factor in the growth of wireless services today. Users have clearly rejected the “walled garden” approach where service providers delivered a simplified, limited version of the Internet to subscribers. Today, users expect to get the same experience in their portable wireless devices as they get from their wireline products. At the same time, users recognize that the Internet experience must be appropriately scaled for their device.
The most successful portable Internet devices have not been general-purpose “Swiss Army knife” type devices, such as smartphones and PDAs. Rather, successful devices have been optimized for a particular purpose, such as the Apple iPod or Sony PSP, where the user experience has been optimized for a particular application. As a result, 4G will be characterized by a diverse device ecosystem composed of many special-purpose devices. The shift towards Open Access, where wireless devices are not tied to a particular service provider, will further expand the ecosystem.
Service providers have found that the increased popularity of special-purpose devices drives network usage. For example, the Apple iPhone provides a full-featured web browser and a large, high-resolution touch screen that is well suited for browsing the Web. T-Mobile reported that the average Internet usage for an iPhone customer was 30 times the normal usage for other customers [Unstrung, Jan 2008]. Because the value of the electronic device is increasingly determined by network connectivity, network performance is critical. It’s up to application developers, device manufacturers, service providers and infrastructure vendors to ensure that applications will meet users’ expectations under all RF conditions.
Mobile devices are moving beyond the traditional applications seen in cellular communications, such as simple voice or basic data applications. New applications, such as video streaming, require greater bandwidth and better management of quality of service (QoS). For example, in Voice over IP applications, the throughput, latency and jitter define how the IP packets must be managed. Both the uplink and downlink must maintain QoS by keeping a relatively tight window for latency (within 160 milliseconds), jitter (within 50 milliseconds) and throughput (at least 32 kilobits per second, depending on the codec). An example of the convergence window for Voice over IP is shown in Figure 1.
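The convergence window above can be expressed as a simple acceptance check. The numeric thresholds come straight from the text; the function name and its defaults are our own sketch, not a standard API.

```python
def voip_qos_ok(latency_ms, jitter_ms, throughput_kbps,
                max_latency_ms=160, max_jitter_ms=50, min_throughput_kbps=32):
    """Return True if measured link metrics fall inside the VoIP
    convergence window described in the text: latency within 160 ms,
    jitter within 50 ms, throughput at least 32 kbps (codec-dependent)."""
    return (latency_ms <= max_latency_ms
            and jitter_ms <= max_jitter_ms
            and throughput_kbps >= min_throughput_kbps)

# A lightly loaded link passes; a high-jitter link fails the window.
print(voip_qos_ok(80, 20, 64))   # True
print(voip_qos_ok(80, 90, 64))   # False
```

In a real test harness these metrics would be measured per-call over the faded channel, and both uplink and downlink would be checked independently, as the text notes.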
The radio environment impairs the wireless physical layer far more severely than anything seen in wired networks, and the resulting fading dramatically alters the quality of the channel.
Impact of Fading on the Environment
WiMAX and LTE are both designed to cope with RF channel impairment: they balance throughput against link quality by switching among modulation and coding schemes (MCSs). If the link quality is good, a high-capacity MCS, such as 64 quadrature amplitude modulation (64QAM), can be used. If the link quality degrades, the link must be switched to a more robust but lower-capacity MCS, such as 16QAM or quadrature phase shift keying (QPSK). Moreover, the high throughput of LTE comes from MIMO. As a result, any testing conducted under excellent RF conditions that ignores the effects of channel fading risks masking deficiencies in the product, which could lead to an unsatisfactory user experience and wasted development resources.
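The link adaptation just described can be sketched as a lookup from measured SNR to the highest-capacity MCS the link supports. The SNR thresholds below are illustrative placeholders of ours (real systems use standardized channel-quality tables), but the selection logic mirrors the switching behavior described above.

```python
# (minimum SNR in dB, MCS name), ordered from most to least
# spectrally efficient. Thresholds are illustrative assumptions,
# not values from the LTE or WiMAX specifications.
MCS_LADDER = [
    (25.0, "64QAM"),
    (15.0, "16QAM"),
    (0.0,  "QPSK"),
]

def select_mcs(snr_db):
    """Pick the highest-capacity MCS whose SNR requirement is met,
    or None if the link cannot be maintained at all."""
    for min_snr, mcs in MCS_LADDER:
        if snr_db >= min_snr:
            return mcs
    return None

print(select_mcs(45.0))  # 64QAM
print(select_mcs(20.0))  # 16QAM
```

A channel emulator sweeping SNR through fades effectively walks the device up and down this ladder, which is why testing only at the top rung hides so much.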
Let’s consider an example. 64QAM requires a signal-to-noise ratio (SNR) of at least 25 dB (see Figure 2). If we consider an OFDM signal of -45 dBm and a noise floor of about -90 dBm, the resulting SNR is 45 dB, sufficient for 64QAM in an ideal, non-mobile environment. However, when the mobile device or objects in the environment move, channel fading occurs. If the RF environment creates deep fades of around 25 dB, the resulting SNR would be only 20 dB, which is not sufficient for 64QAM, forcing the system to drop to a more robust but lower-capacity MCS.
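The arithmetic here is worth making explicit: SNR in dB is simply the difference between the signal level and the noise floor in dBm, and a fade subtracts directly. The variable names below are ours; the numbers are from the example.

```python
QAM64_MIN_SNR_DB = 25.0   # 64QAM threshold used in the example

signal_dbm = -45.0        # received OFDM signal level
noise_floor_dbm = -90.0   # noise floor
fade_depth_db = 25.0      # deep fade caused by mobility

snr_db = signal_dbm - noise_floor_dbm   # 45 dB: plenty for 64QAM when static
faded_snr_db = snr_db - fade_depth_db   # 20 dB: below the 64QAM threshold

print(snr_db, faded_snr_db, faded_snr_db >= QAM64_MIN_SNR_DB)
# 45.0 20.0 False
```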
The rapidly changing RF environment must be a strong consideration during application development. An application such as video streaming or mapping should be designed to operate satisfactorily in both optimal and marginal RF conditions and should meet quality expectations across all of them. If a user starts an application in a rich MIMO environment with conditions that support 64QAM and then continues using it while moving toward the cell edge, the mobile will drop down to 16QAM or QPSK. The application must gracefully adjust the experience so that it continues to meet the user’s expectations. In other words, the RF environment determines the MCS, which determines the device throughput, which in turn drives the user experience for the application. A 4G-capable channel emulator that simulates the RF environment and the resulting channel fading can help developers see how applications behave under different fading conditions. Including channel fading in regression testing gives developers a clear understanding of how applications and devices will behave under real-world conditions.
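The chain from RF environment to user experience can be sketched at the application layer as well: map the current MCS to an approximate sustainable throughput, then pick the richest content profile that fits. Every number and name below is a hypothetical assumption for illustration, not a measured or standardized value.

```python
# Hypothetical sustained throughput (Mbps) per MCS in a given channel.
MCS_THROUGHPUT_MBPS = {"64QAM": 12.0, "16QAM": 8.0, "QPSK": 4.0}

# Hypothetical video profiles: (required Mbps, label), best first.
VIDEO_PROFILES = [(10.0, "HD"), (5.0, "SD"), (1.0, "low-res")]

def pick_video_profile(mcs):
    """Choose the best video profile the current MCS can sustain,
    degrading gracefully rather than stalling playback."""
    budget = MCS_THROUGHPUT_MBPS.get(mcs, 0.0)
    for required_mbps, label in VIDEO_PROFILES:
        if budget >= required_mbps:
            return label
    return "audio-only"   # fall back rather than fail outright

print(pick_video_profile("64QAM"))  # HD
print(pick_video_profile("QPSK"))   # low-res
```

Running this kind of adaptation logic against a channel emulator's fading profiles, as the text suggests, exposes exactly where the user experience degrades and whether the degradation is graceful.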
Developing with the RF Environment in Mind
Through the use of MIMO, 4G will offer more advanced applications to meet user needs. However, 4G will also have a much more complex environment than previous wireless access technologies. Because the fading environment can dramatically change, it is critical to develop applications that take RF environment into consideration to maximize performance and the user experience.
Sudeep Gupta is vice president of business development at Azimuth Systems, a leading provider of wireless broadband test equipment and channel emulators for WiMAX, 4G and Wi-Fi. Mr. Gupta has held various business development, technical marketing, and engineering roles at Alcatel-Lucent, Ericsson, and a smart antenna startup called Metawave Communications. He holds an M.B.A. from Southern Methodist University in Dallas and a B.S.E.E. from The University of Texas at Austin.