The constant mobility of the WLAN user coupled with the inherent instability of the unwired medium - air - make the 802.11 protocol an order of magnitude more complex than equivalent wired protocols.

By Graham Celine

Glossary of acronyms
AES — Advanced Encryption Standard
AP — Access Point
BSS — Basic Service Set
CCMP — Counter-Mode/CBC-MAC Protocol (IEEE 802.11i encryption algorithm)
EAP-TLS — Extensible Authentication Protocol-Transport Layer Security
EAP-TTLS — Extensible Authentication Protocol-Tunneled Transport Layer Security
IEEE — Institute of Electrical and Electronics Engineers
IETF — Internet Engineering Task Force
LEAP — Lightweight Extensible Authentication Protocol
PC — Personal Computer
QoS — Quality of Service
RFM — RF Port Module
TKIP — Temporal Key Integrity Protocol (formerly WEP2)
VoWLAN — Voice over WLAN
WEP — Wired Equivalent Privacy (802.11 encryption protocol)
Wi-Fi — Wireless Fidelity
WLA — Wireless LAN Analyzer
WLAN — Wireless Local Area Network
WMM — Wi-Fi Multimedia
WMM-SA — Wi-Fi Multimedia Scheduled Access

Testing techniques developed for wired devices and networks fall short when applied to the WLAN market. The inherent instability of the unwired medium — air — in which the wireless world operates and the constant mobility of the WLAN user make the 802.11 protocol an order of magnitude more complex than equivalent wired protocols. As a result, the metrics used to benchmark wired protocols are only a starting point for the WLAN industry.

Differences between wired and wireless networks require metrics and methodology for performance benchmarking that address the intricacies of the 802.11 protocol. This article focuses on parameters critical to WLAN performance and describes methodology to measure them. It addresses wireless-specific functions central to business-critical applications on a Wi-Fi infrastructure and includes typical results and interpretations.

WLAN Performance Metrics
Wi-Fi protocols address differences between wired and wireless networks, and the implementation of the more advanced wireless protocol demands performance validation. Algorithms used in a network's clients and APs (and the capacity of these devices to process the algorithms) limit the network's performance. The objective of the validation process and test metrics is to identify critical test parameters and find the correct method of testing them. A listing of wireless performance metrics shows that they outnumber traditional Ethernet metrics by a ratio of roughly five to one (see Table 1).

Testing Ethernet network performance is essentially a measure of packet forwarding rate. In addition to packet forwarding measurements, WLANs must undergo tests related to the unstable physical layer and end-user mobility, including automatic data rate adaptation, roaming, verification of security, QoS and overlapping BSSs, as well as behavioral tests that measure performance under abnormal network conditions. The primary focus of the testing effort should be parameters that eventually affect network efficiency and operation.

Data Rate Adaptation: Wired LANs support fixed data rates: 10/100/1000 Mb/s. Wireless networks support multiple data rates: 11/5.5/2/1 Mb/s for 802.11b; 54/48/36/24/18/12/9/6 Mb/s for 802.11a and 802.11g. The critical difference is that WLANs support dynamic rate adaptation and can operate at multiple data rates automatically determined by the end point (AP or client), based on the condition of the physical layer between it and the client device.

Figure 2. Chassis, station test module, RF module and mini test head: Chassis-based test platform with cabled RF architecture offers layer 1 stabilization while the automation allows efficient testing of performance factors such as roaming and throughput.
In addition, because the 802.11 standard does not specify exact criteria for data rate adaptation, the algorithms can vary from device to device. The rate adaptation algorithm should be based on optimizing throughput; that is, when the number of errors at a specific data rate increases to the point where throughput is severely affected, the device should drop to a lower data rate to recover the best throughput at that distance from the AP.
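The throughput-optimizing rule described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual algorithm (the standard does not specify one): given an observed packet error rate at each supported data rate, the device selects the rate with the highest expected goodput.

```python
# Hypothetical sketch of a throughput-optimizing rate adaptation rule.
# 802.11 does not mandate an algorithm; this simply picks the data rate
# (e.g. from the 802.11g set 54/48/36/24/18/12/9/6 Mb/s) whose expected
# goodput -- rate times delivery probability -- is highest.

def best_rate(per_by_rate):
    """per_by_rate maps data rate (Mb/s) -> observed packet error rate (0..1).

    Returns the rate with the highest expected goodput."""
    return max(per_by_rate, key=lambda r: r * (1.0 - per_by_rate[r]))

# Example: at this range 54 Mb/s suffers heavy loss, so the device
# should drop to 24 Mb/s to recover the best throughput.
per = {54: 0.9, 48: 0.7, 36: 0.4, 24: 0.05, 18: 0.02}
chosen = best_rate(per)  # -> 24
```

In practice a device would smooth the error-rate estimate over time to avoid oscillating between rates, which is exactly the behavior a controlled-attenuation test setup can characterize.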

The challenge is how to measure this repeatably, while creating a metric that can be set as the golden standard. To measure throughput at fixed points, many vendors use an interference-free environment with a long, direct line-of-sight area where they can simulate data rate adaptation by wheeling a client up and down on a cart. This method, however, yields imprecise rate adaptation data and is less efficient than newer test systems that offer controlled RF environments and accurate signal attenuation through test-setup automation. While characterizing range vs. data rate, the test should simultaneously characterize range vs. throughput and range vs. packet error rate.

Roaming: As a client moves out of range of one AP, it disassociates from that AP and must associate and authenticate with another. If the client predicts the roam by noticing the drop in signal and searching for an alternate AP before it is actually disassociated from the first, it can minimize the roam time and the network disruption the roam causes.

The client device makes the decision to roam based on its position relative to different APs and their signal strengths. The client might periodically analyze signal strength of the APs that surround it and decide which one to associate with if it needs to roam. Load-sharing protocols used by some WLAN network vendors depart from the traditional client-based decision process, orchestrating client devices to associate with specific APs and spreading the load evenly among APs and optimizing the entire network throughput. In addition, the IEEE is advancing its work on roaming through better RF measurement (802.11k) and fast roaming processes (802.11r). As the roaming process increases in complexity, it is critical to have standard roaming metrics for testing WLAN networks and equipment.

Roaming is critical because it takes time, which can cause data loss that can ultimately disrupt a communication session. Data loss is particularly important for time-sensitive enterprise applications, such as VoWLAN, that are especially susceptible to packet delay caused by roaming. Roaming metrics include roaming time, packet loss and session continuity. Roaming time can be broken down into the following stages: scanning, associating to the new AP, authentication with the new AP and data flow. Analyzing the time of each phase of this process will help ensure the most efficient roam time.
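The stage-by-stage breakdown described above can be computed directly from timestamped events captured by an analyzer. The helper below is a minimal sketch; the event names and millisecond timestamps are illustrative, not a real capture format.

```python
# Minimal sketch: break a captured roam into per-stage durations.
# `events` is an ordered list of (timestamp_ms, stage) markers, one per
# roaming phase boundary: scanning, association, authentication, data flow.

def roam_breakdown(events):
    """Return ({stage: duration_ms}, total_roam_time_ms).

    Each stage's duration runs from its marker to the next marker."""
    durations = {}
    for (t0, stage), (t1, _) in zip(events, events[1:]):
        durations[stage] = t1 - t0
    total = events[-1][0] - events[0][0]
    return durations, total

# Illustrative timestamps (ms) for one roam, not measured data.
events = [(0.0, "scan"), (35.2, "associate"), (38.9, "authenticate"),
          (52.1, "data")]
stages, total = roam_breakdown(events)
```

Tabulating these durations over many automated roam cycles is what lets a test script pinpoint which phase (scanning is often the dominant one) is inflating total roam time.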


Figure 3. Roaming test configuration.
Behavior should be verified in a number of different roaming scenarios that will emulate the cause and speed of the roam. A roam caused by an AP failure is significantly different from a roam caused by a person walking down a hallway with a wireless device. Typical results show huge differences between a fail-over roam time — in which the client is not expecting to roam — and a motion roam — where the client is expecting the roam.

Packet Forwarding: Forwarding rate is a function of a device: in wired networks, the Ethernet switch; in WLANs, the AP. Packet forwarding rate testing is always done at the highest signal strength and at the highest data rate because this puts the most demand on the device and measures its packet processing power in the most extreme case.

Like wired throughput tests, a wireless packet forwarding test varies the packet size to ensure the ability of the device to work with diverse traffic. But unlike wired devices, there are other factors to consider. The most critical is security, because wireless network devices must encrypt each packet; this overhead must be included to evaluate its effect on the packet forwarding rate. Another important factor is client capacity. Running the test with many clients, each sending a portion of the bandwidth, stresses the AP's ability to manage a large user population and reveals how it behaves under such load.
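A forwarding-rate sweep of the kind described can be sketched as a binary search per frame size, in the spirit of RFC 2544-style wired benchmarking. The `send_burst` hook below is an assumed test-rig interface, not a real API.

```python
# Sketch of a forwarding-rate sweep across frame sizes. For each size,
# binary-search the highest offered load (as a fraction of line rate)
# that the AP forwards without loss. `send_burst` is an assumed hook
# into the test rig: send_burst(size, rate) -> True if no frames lost.

FRAME_SIZES = [64, 128, 256, 512, 1024, 1518]  # bytes, typical sweep points

def max_lossless_rate(send_burst, size, lo=0.0, hi=1.0, steps=12):
    """Return the highest lossless offered load found for one frame size."""
    for _ in range(steps):
        mid = (lo + hi) / 2.0
        if send_burst(size, mid):
            lo = mid  # no loss: push the offered load higher
        else:
            hi = mid  # loss seen: back off
    return lo

# Usage with a stand-in device model that drops frames above 70% load:
fake_device = lambda size, rate: rate <= 0.7
result = max_lossless_rate(fake_device, 64)
```

Repeating the sweep with encryption enabled and with many emulated clients attached is how the security and client-capacity effects described above are quantified.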


Figure 4. The test script measures and tabulates the duration of each phase of the roaming process.
Security: Security is a critical consideration for enterprise networks. Because they are susceptible to intruders, wireless networks have more stringent security requirements than their wired counterparts. Wireless security protocols (802.11i) rely heavily on authentication and encryption, which depend on the processing power of the AP and client, and cryptography accelerators for data encryption. The efficiency with which the devices handle key management and encryption will have an effect on performance measurements, such as forwarding rate and roaming.

When a client initially accesses the network or roams between APs, authentication occurs using protocols such as EAP-TLS, EAP-TTLS and LEAP. Complex key derivation algorithms can overload APs if multiple simultaneous authentication requests are made. Authentication of wireless networks is tested by measuring how efficiently and quickly an AP manages simultaneous authentication requests.

Encryption protocols used in Wi-Fi, such as WEP, TKIP and AES/CCMP, can also impact throughput performance. The security metric is obtained through a series of comparative throughput measurements using the different encryption methods.

QoS: Because 802.11 is a shared-media protocol with no inherent QoS, WLANs cannot natively prioritize real-time applications such as voice and video over data applications. QoS protocols for WLANs must account for jitter, delay and packet loss, which must stay within strict limits for real-time applications such as VoIP and multimedia streaming. Jitter, the variation in inter-packet delay, is particularly critical in packetized voice.


Figure 5. Test results of the fastest roaming client reported by an automated test script. The test script performed seven roaming cycles between two APs.
802.11e defines MAC layer QoS protocols for WLANs, and the Wi-Fi Alliance has developed test metrics for QoS protocols that are subsets of 802.11e — WMM, a contention-based method that manages relative priorities, and WMM-SA, a polling-based protocol that supports bandwidth reservation for data streams. Metrics for QoS protocols require accurate measurement of end-to-end characteristics of the network.

Two components are required to measure the capability of devices or a network to deliver QoS. The first is traffic sources that can accurately generate traffic of different QoS levels: high-priority traffic plus background best-effort traffic that saturates the air. In addition, the same devices need to timestamp this traffic and make accurate measurements of its delay and jitter. Controlled testing of QoS allows users to characterize network capacity; for example, the number of VoWLAN users that can simultaneously operate on an AP and still receive toll-quality voice.
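The timestamping measurement described above reduces to simple arithmetic once matched transmit and receive timestamps are available. This sketch uses the mean absolute difference of consecutive delays as the jitter statistic; real test gear may use other estimators (e.g. the RTP smoothed-jitter formula).

```python
# Sketch: compute per-packet one-way delay and a simple jitter statistic
# from matched transmit/receive timestamps (same clock, e.g. from a
# test platform that timestamps both sides). Units are milliseconds.

def delay_and_jitter(tx_ts, rx_ts):
    """Return (per-packet delays, mean absolute inter-packet jitter)."""
    delays = [rx - tx for tx, rx in zip(tx_ts, rx_ts)]
    diffs = [abs(b - a) for a, b in zip(delays, delays[1:])]
    jitter = sum(diffs) / len(diffs) if diffs else 0.0
    return delays, jitter

# Illustrative timestamps for four voice packets sent at 20 ms intervals.
tx = [0.0, 20.0, 40.0, 60.0]
rx = [5.0, 26.0, 45.0, 68.0]
delays, jitter = delay_and_jitter(tx, rx)  # delays 5, 6, 5, 8 ms
```

Running this against high-priority and best-effort streams simultaneously, under saturating load, is what distinguishes a QoS-capable AP from one that merely forwards fast.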

Testing Methodology for Wireless Networks

Traditionally, wireless system designers have had a variety of testing options. Most are home-grown or custom-built, and include isolated screen rooms for RF control, large open spaces for testing mobility and expensive off-the-shelf meters that focus on point-to-point tests of the physical layer. A newer approach, integrated chassis-based Wi-Fi testing, is emerging. The older methods are costly, and in many cases do not provide the system-level configuration needed to accurately and usefully report on the metrics discussed above.

Isolated screen rooms for controlling RF interference — WLAN system developers reduce the effects of RF interference by conducting tests in a large screen room that isolates devices under test from extraneous RF interference. It's the wireless equivalent of a "clean room." Although screen rooms can eliminate interference effects, they cannot test real-world network conditions such as mobility and roaming. Screen rooms can be expensive to erect and maintain and are not portable, which limits their use and effectiveness.

Open spaces for testing mobility — Wi-Fi test performance metrics and methodologies need conditions that can replicate mobility and roaming. Wireless design engineers often try to simulate these conditions by renting or buying empty office buildings, homes or outdoor open spaces such as football fields. They move client devices manually around the test area.


A summary of wired and wireless metrics. The significantly higher number of wireless metrics speaks to the relative complexity of wireless protocols. Note the enormous effect that an unstable physical layer would have on practically every measurement.
In addition to the expense of such solutions, scalability is limited by the number of clients and test engineers available to operate them. Interference isolation is increasingly problematic because of the growing prevalence of Wi-Fi networks.

Off-the-shelf testing solutions — A variety of off-the-shelf test equipment that focuses on point-to-point tests at the physical layer is available. For example, signal analyzers and signal generators that emit and analyze signals resembling real 802.11 signals to evaluate Layer 1 conformance; these are generally used in RF-isolated conditions and are built to operate in the 802.11 frequency range. But this method focuses on device testing, not network or system tests, and WLANs should not be evaluated as components but as a complete solution. Effective system-level test solutions must scale beyond single devices, such as clients and access points, to the network itself, providing the ability to test the WLAN as a system and to precisely analyze Layer 2 performance under various mobility and load conditions.

New industry test solutions — New 802.11 test platforms place devices in an isolated environment, which allows mobility to be emulated through RF signal attenuation and capacity to be varied through client/AP emulation using programmable emulation engines. System test devices of this kind fully automate complicated testing, enabling efficient and repeatable measurements of performance while subjecting the devices under test to controlled emulated motion and traffic load.

Test Setup and Sample Results
The roaming test is an example that shows the efficiency that can be achieved using a chassis-based test platform. Using two test modules — a WLA and an RFM — a phone or PC client is connected to two APs through two 80 dB programmable attenuators. The attenuators are programmed to force the client to roam in a controlled way from one AP to the other. Data collection is performed on the source and destination channel simultaneously by the integrated dual-channel WLA module. The roaming test is fully automated and can be configured to repeat the measurements for a set period.

One attenuator is initially set to minimum and the other to maximum so that the client receives a strong signal from AP1 and associates with it while AP2 is out of the client's range. The attenuator between AP1 and the client is then gradually increased, eventually making AP1 invisible to the client while the attenuator between the client and AP2 is gradually decreased, "moving" AP2 within range and forcing a roam. The ranges and the rate of change of attenuators are configurable within the test script. At the end of the roaming test, the test script tabulates the details of each roam.
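The complementary attenuator ramp just described can be expressed as a simple schedule. This is an illustrative sketch of the ramp shape only; real test scripts would also control step timing and dwell, which are configurable parameters.

```python
# Sketch of the complementary attenuation ramp that "moves" the client
# between two APs: AP1's path fades from minimum to maximum attenuation
# while AP2's path fades in, forcing a controlled roam. 80 dB matches
# the programmable attenuators in the test setup described above.

def attenuation_schedule(steps, max_db=80.0):
    """Return a list of (ap1_atten_db, ap2_atten_db) pairs, one per step."""
    ramp = []
    for i in range(steps + 1):
        a1 = max_db * i / steps            # AP1: strong signal -> invisible
        a2 = max_db * (steps - i) / steps  # AP2: out of range -> strong
        ramp.append((round(a1, 1), round(a2, 1)))
    return ramp

sched = attenuation_schedule(8)
# starts at (0.0, 80.0): client hears only AP1; ends at (80.0, 0.0)
```

Varying the number of steps and the time per step emulates different walking speeds, which is how the "motion roam" scenarios are parameterized without any physical movement.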

This automated roaming test implementation provides accurate time measurements and identifies specific time intervals in a way that emulates real-life roaming to the clients and access points: gradual signal strength decrease and increase. The entire test can be repeated as many times as is needed, and multiple roams can be performed in a short period without human involvement.

IEEE 802.11t
In addition to the testing challenges created by differences between wired and wireless networks, the industry has also been confronted with the lack of standard metrics. Not even ad-hoc standard metrics have been defined. In 2004, the IEEE formed the 802.11t task group to define industry-standard test metrics. The group was created to standardize metrics, measurement technologies and test conditions so that vendors, service providers and enterprise users can compare performance of network components to standard benchmarks.

The 802.11t task group is modeling its work after similar specifications developed by the IETF more than a decade ago, which aligned the networking industry around a common set of metrics and testing practices for Ethernet protocols. Since then, the IETF's benchmarking methodology for Ethernet networks has been gradually accepted and adopted by the industry, and a variety of network and equipment testing tools that provide accurate and repeatable results have been brought to market. Like the IETF's Ethernet testing methodology, 802.11t will standardize performance validation across vendors, guarantee product and network interoperability and create a predictable end-user experience.

Until the 802.11t task group has completed its charter, the industry must rely heavily on benchmarks such as those defined by members of the task group. Representatives of more than 16 companies participate in the 802.11t task group, including Azimuth Systems, Broadcom, Intel, Microsoft, Symbol, Texas Instruments and the University of New Hampshire Interoperability Lab. Charles Wright, chief scientist of Azimuth Systems, chairs the group.

The wireless environment is inherently erratic, due primarily to the mobility of the end-user and the instability of the physical layer. Bits of data simply don't travel as reliably over the air as they do through copper or fiber.

A survey of traditional Ethernet and wireless performance metrics suggests that wireless metrics outnumber their wired counterparts by roughly five to one. The complexity of the protocol requires additional performance measurements for packet forwarding, security, QoS, rate adaptation, roaming and network behavior.

As WLANs migrate from small businesses to the enterprise, it has become increasingly important for wireless vendors to develop equipment and networks that have been verified according to standardized performance metrics and methodology that go beyond Ethernet networks and address the intricacies of the 802.11 protocol.

Tools and methodologies are emerging that will help move the WLAN testing process from the realm of "voodoo magic" to a repeatable science, providing the end-users with the certainty that the equipment they install will stand up to the stringent network needs.

About the Author
Graham Celine has 15 years of high-tech experience, having joined Azimuth after a 13-year career focused on data networking. Starting in product support with LANNET, a switching startup later acquired by Lucent, Celine led the international support and services group based in Tel Aviv, Israel. After moving to the United States, he was director of technical marketing, where he managed senior-level technical experts to deliver strategic sales support to the worldwide sales team. Subsequent to Avaya's spin off from Lucent, he became vice president of solutions management, responsible for the products of Avaya's MultiService Networking group. Celine was responsible for the definition and development of Avaya's data solutions strategy for enterprises' converged IP networks. Celine received a B.Sc. in electrical engineering from University of the Witwatersrand in South Africa.