The marketing battle between WiMAX and LTE rages on, with vendors and service providers racing to stake claims to the first or most extensive network delivering wireline speeds. The true challenge, however, still lurks in the background and has yet to raise its head: which technology will work better, provide more throughput, offer better range and deliver more flexible mobility? The ability to put each technology through its paces and to comprehensively test and benchmark network performance is pivotal to the success of either technology's rollout.
While each standard has advantages and disadvantages relative to the other, the significant technology step that both take will be the biggest hurdle to winning, or even being part of, the race. Delivering on the promise of broadband wireless depends heavily on these advanced wireless capabilities. This article delves into those technologies and what it will take to ensure they work when deployed widely.
What are Mobile Broadband and 4G?
The key factors that define the major step to “4G” in wireless technology are the technological advances that enable much higher throughput: the move to OFDM-based modulation and MIMO advanced antenna capabilities. Both WiMAX and LTE differ fundamentally from the previous standards used for public wireless services, adding significant robustness and performance-enhancing capabilities that enable the massive data rates touted in the marketing materials.
Like most wireless solutions, performance will vary: being close to a base station in a densely covered area will enable much higher performance. That throughput cannot be expected in areas of low coverage, but range and performance when the user is far from the base station will still be much better than in the previous generation of wireless service. Delivering enhanced service translates into more complex test requirements to guarantee that service under the variety of conditions typical in the field.
What are the Differences between WiMAX and LTE?
WiMAX and LTE are data-centric networks built on more advanced wireless technology, demanding more focus on testing performance (throughput) and more robust, comprehensive testing of that performance under different RF conditions.
There are some differences worth highlighting that will affect each technology's success and what needs to be tested as they are deployed. These include:
WiMAX and LTE are both MIMO technologies. WiMAX is defined as 2x2 MIMO, but vendors are already delivering 8x4 base stations with advanced MIMO features like beamforming. LTE is more flexible, with many deployments starting with a single antenna or Rx diversity only, but the technology will surely catch up with larger-scale MIMO. Battery-life constraints will preserve the difference between many-antenna base stations and single- or dual-antenna mobile devices.
Figure 1. Diagram showing the different flavors of MIMO.
The 802.16 standard initially focused on a fixed broadband solution, and as such the technology was designed for nomadic rather than mobile operation. Mobile WiMAX, based on the 802.16e standard, addresses this, but significant testing needs to be done to evaluate mobility support, both in terms of speed and in terms of network mobility (roaming). LTE has focused on fully mobile applications from the outset, defining support for much higher speeds, such as those of high-speed trains, and more diverse mobility requirements, including handoff to legacy networks.
Cellular wireless has traditionally used FDD transmission, which offers better spectral efficiency in narrow bandwidths. However, the availability of broadband spectrum in many countries has made TDD a viable and popular option. WiMAX initially focused on TDD profiles with a promise of FDD in the future, while LTE, initially FDD-focused, quickly added TDD to accommodate Chinese operators and other countries with available TDD spectrum. This is significant for performance, as the use of TDD enables beamforming. Since TDD uses the same spectrum upstream and downstream, the test requirements become more complex: reciprocal, bi-directional testing is needed.
Despite the technical differences between WiMAX and LTE, the underlying difference may be in the use case and the expectations, and a critical part of this is the ability to step beyond previous levels of testing and move to a new level of validation and quality assurance.
In the less complex world of cell phones and 3G communications, testing focuses on the ability to conform and interoperate, in conjunction with some ability to deliver data applications like email, SMS, web browsing and mapping. Much of the testing effort for these types of networks is concentrated on passing the industry-standard certification tests, usually followed by a series of service-provider qualification tests. Once these tests are passed, the devices are approved for use on the network and assumed to be acceptable.
The standards bodies for WiMAX and LTE, the WiMAX Forum and 3GPP, include comprehensive certification programs, but the bar set for “acceptable performance” is low and is set under ideal testing conditions. To prove that WiMAX and LTE operate in the bands and environments in which they will be deployed, strenuous performance testing needs to be defined and run on a network system to ensure the devices are up to expectations.
Because MIMO algorithms depend on the environment they operate in to deliver on the promise of high performance, a more robust testing methodology is demanded, one focused primarily on data throughput and secondarily on adaptation to changing physical conditions.
Figure 2. Example of an LTE MIMO throughput test under different propagation conditions.
The key field conditions that drive this adaptive behavior include:
•Range – the most basic factor affecting performance is the distance from the transmitter. This changes as a user moves, and it drives the MIMO transmission mode as well as mobility decisions like handoff.
•The amount and type of multipath – multipath, duplicate signals received at different times after reflection, is typical of any wireless transmission. Short-delay reflections are typical in a city center, mid-range delays in suburbs and long-range delays in rural areas.
•The mobility of the user – PCs using 4G broadband wireless are nomadic (they move about but are usually operated in a static position), but mobile phones, game consoles, eReaders and other devices are mobile, moving at anywhere from pedestrian pace to high-speed-train pace. The difference in propagation conditions between walking at 3 km/hr and passing base stations at 120 km/hr is significant.
•Changing conditions – devices may experience changes in conditions at different rates due to the terrain. Examples include an obstacle (such as a skyscraper or mountain) suddenly obscuring the line of sight to the base station, varying interference generated by other transmitting devices, and the number of other base stations in range at any one moment.
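The effect of user mobility on the radio channel can be quantified by the maximum Doppler shift, f_d = v·f_c/c. The short sketch below (not from the article; it assumes a 2.6 GHz carrier, a common LTE band, since the article does not name a frequency) illustrates why the pedestrian and high-speed cases above stress a receiver so differently:

```python
# Illustrative sketch: maximum Doppler shift f_d = v * f_c / c.
# The 2.6 GHz carrier frequency is an assumption for illustration.
C = 3.0e8  # speed of light, m/s


def max_doppler_hz(speed_kmh: float, carrier_hz: float) -> float:
    """Maximum Doppler shift seen by a mobile moving at speed_kmh."""
    v = speed_kmh * 1000.0 / 3600.0  # km/h -> m/s
    return v * carrier_hz / C


for kmh in (3, 120):  # pedestrian pace vs. highway/train pace
    print(f"{kmh:>3} km/h -> {max_doppler_hz(kmh, 2.6e9):.1f} Hz")
```

At 3 km/hr the fade rate is a few hertz, while at 120 km/hr it is in the hundreds of hertz, so the channel changes orders of magnitude faster and the receiver's tracking loops must keep up.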
Why Does this Make a Difference?
The beauty of combining MIMO and OFDM is that the robust modulation scheme, together with the different flavors of MIMO transmission and reception, is versatile enough to adapt and operate optimally in almost any scenario. Based on signal level, spatial conditions, noise level and other characteristics, WiMAX and LTE systems are able to configure themselves into the best possible mode of operation to deliver the promised speed and reliability.
Some examples of MIMO modes of operation are:
•Spatial multiplexing – in ideal multi-antenna conditions, a system transmits multiple streams of data simultaneously and decodes them separately, effectively multiplying throughput by the number of streams.
•Tx diversity – in less optimal conditions, a device transmits the same data on two or more antennas using different coding schemes, allowing the received copies to be compared and error-corrected and increasing the reliability of the data.
•Rx diversity – multiple receive antennas receiving the same transmission can combine their received signals into a stronger, more error-resistant signal, increasing transmission reliability.
•Beamforming – advanced antenna techniques allow two or more antennas to intelligently coordinate the phase and amplitude of the transmitted signal, effectively directing the most power in a specific direction. Beamforming is useful not only for tracking a specific user and increasing range, but also for avoiding interference.
These advanced capabilities are common to WiMAX and LTE, as both are multi-antenna solutions. The key to delivering either technology is therefore the ability of each vendor – whether a service provider, a base station supplier or a maker of the many pieces of equipment and devices connecting to the infrastructure and network – to ensure that each of these techniques works efficiently in its product, that each technique is selected dynamically as channel conditions change quickly, and that its devices do all this in conjunction with other vendors' devices while delivering acceptable performance.
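Why a system switches between these modes can be seen from textbook Shannon-capacity estimates. The sketch below is a back-of-the-envelope illustration, not anything from the article or a real link-adaptation algorithm; it assumes idealized channels, perfect channel knowledge and an equal power split across streams:

```python
# Idealized Shannon-capacity comparison of MIMO modes (illustrative only;
# assumes perfect channel knowledge and ideal, uncorrelated antenna paths).
from math import log2


def siso(snr: float) -> float:
    """Single antenna baseline, bits/s/Hz."""
    return log2(1 + snr)


def rx_diversity(snr: float, n_rx: int = 2) -> float:
    """Maximal-ratio combining: SNRs from the receive antennas add."""
    return log2(1 + n_rx * snr)


def spatial_mux(snr: float, n_streams: int = 2) -> float:
    """Ideal parallel streams, transmit power split across the streams."""
    return n_streams * log2(1 + snr / n_streams)


for snr_db in (0, 10, 20):
    snr = 10 ** (snr_db / 10)
    print(f"{snr_db:>2} dB: SISO {siso(snr):.2f}  "
          f"RxDiv {rx_diversity(snr):.2f}  "
          f"2x2 SM {spatial_mux(snr):.2f} bit/s/Hz")
```

Under these assumptions, diversity wins at low SNR (cell edge) while spatial multiplexing wins at high SNR (near the base station), which is exactly the adaptive behavior a performance test must exercise.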
Testing WiMAX and LTE
The focus of testing for both WiMAX and LTE is very different from that of previous cellular technologies:
a) The test process needs to be much broader than certification and must focus on data throughput.
b) The testing must involve extensive mobility, from simple ranging through basic handoff up to testing under different velocity conditions or dynamic recorded field conditions.
c) Testing must be done under varying propagation conditions to validate the system in different and changing channel settings and to verify that performance is robust across the different scenarios.
These requirements are met by channel emulation testing. Channel emulators are state-of-the-art tools that recreate over-the-air propagation effects in the lab for testing purposes. Once referred to simply as fading, channel emulation today includes much more sophisticated capabilities, enabling the enhanced features of MIMO and 4G technologies to be tested fully.
Channel emulation focuses on the channel – the multiple-antenna-to-multiple-antenna connection – and provides Doppler fading on each antenna path, complex signal correlation between the paths, shadow (slow) fading or drive-test data playback on the channel, interference simulation to accurately control signal-to-noise ratio during testing, and signal-strength control on all paths so that mobility effects like range changes and handoff can be recreated simply. Figure 2 shows the effect of a dynamic channel model on the throughput of a device at different SNR settings.
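The core operation on any one antenna path can be sketched in a few lines. This is a minimal illustration of the principle, not Azimuth's implementation: real emulators apply correlated, multi-tap MIMO channel models with controlled Doppler spectra, whereas this sketch applies only a flat Rayleigh fade plus white noise at a target SNR:

```python
# Minimal sketch of per-path channel emulation (illustrative only):
# flat Rayleigh fading plus AWGN at a target SNR on unit-power symbols.
import math
import random


def emulate_path(symbols, snr_db, rng=None):
    """Apply an independent Rayleigh fade and AWGN to each symbol."""
    rng = rng or random.Random(0)  # fixed seed for repeatable lab runs
    snr = 10 ** (snr_db / 10)
    noise_std = math.sqrt(1 / (2 * snr))  # std dev per real/imag part
    faded = []
    for s in symbols:
        # Rayleigh fading: channel gain h is zero-mean complex Gaussian.
        h = complex(rng.gauss(0, math.sqrt(0.5)),
                    rng.gauss(0, math.sqrt(0.5)))
        n = complex(rng.gauss(0, noise_std), rng.gauss(0, noise_std))
        faded.append(h * s + n)
    return faded


# Example: QPSK-like unit-power symbols through a path at 10 dB SNR.
tx = [complex(1, 1) / math.sqrt(2)] * 8
rx = emulate_path(tx, snr_db=10)
```

A MIMO emulator does this on every transmit-to-receive antenna pair, with the fades on different pairs statistically correlated to match the modeled environment.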
MIMO channel emulators are fully bi-directional, enabling the most advanced testing, including beamforming and full system testing of the downlink and the uplink individually as well as simultaneously. This is very significant because both the data-centric nature and the RF characteristics of WiMAX and LTE make performance dependent on both forward and reverse path behavior – applying channel conditions in only one direction will not truly recreate the conditions the system will find in the field.
Although WiMAX and LTE have various specification differences, their physical-layer testing requirements are very similar. Testing WiMAX and LTE with channel emulation combines the transmission conditions of the real world with performance tests and ensures that the MIMO algorithms and dynamic behavior of WiMAX and LTE will operate well when deployed. Doing this in the lab makes the testing more cost effective, and because this real-world testing takes place much earlier in the test cycle than traditional field testing, the cost savings are large and time to market is significantly reduced.
Graham Celine is senior director of marketing for Azimuth Systems, www.azimuthsystems.com