The key to maximizing system capacity while keeping system interference under control.

By Christopher Rix

Glossary of acronyms
CDMA — Code-Division Multiple Access
FCH — Fundamental Channel
FER — Frame Error Rate
FWD — Forward
OEM — Original Equipment Manufacturer
PCS — Personal Communications System
QoS — Quality of Service
RVS — Reverse
RX — Receive
TCH — Traffic Channel
TX — Transmit
CDMA has proven to be an efficient and effective technology for wireless operators both in terms of subscriber capacity and revenue generation. The technology is a true 3G system, offering high-speed data transfer rates and traditional voice services.
However, the system presents new problems for testing the minimum performance of the various mobile devices. The CDMA mobile terminal's power control loops are powerful and efficient; they are the secret to maximizing system capacity while keeping system interference under tight control. This, coupled with good receiver sensitivity (measured as FER), provides a system with excellent coverage and QoS.

Power Control
Why is power control so critical to CDMA performance? To understand why, consider the nature of the CDMA signal structure. CDMA allows each channel resource to carry multiple mobiles on the same RF frequency, as many as 30 in commonly deployed configurations. This means that the multiple signals arriving at the basestation appear as random noise in the spectral domain. The receiver is able to differentiate these signals from this noise by knowing the unique Walsh code that each mobile used to encode its data. This coding scheme and the multiple mobile configuration provide the capacity advantage in CDMA systems.

However, to enable the receiver to differentiate and then decode the multiple mobiles' signals successfully, signals arriving at the input to the basestation receiver must be at comparable levels and must share the available signal processing resource equally. A mobile whose power level is not controlled correctly would either:
a) Be swamped by the other mobiles' signals (low case).
b) Interfere and swamp other mobiles' signals (high case).
This is known as the near/far effect: mobiles operating at a distance would be swamped by close-in mobiles if power were not controlled efficiently.

For maximum operating efficiency, the CDMA standard sets the nominal received power from all mobiles' signals arriving at the basestation to be –73 dBm (cellular) and –76 dBm (PCS).

Every mobile on a CDMA system is a potential interference signal to all other mobiles on that system. Open and closed loop power control circuits in the mobile phone maintain power levels and are updated at 800 times per second. In this manner, the system can dynamically control the effects caused by multipath reflections and signal fading conditions. In the controlled environment of the RF shield enclosure, multipath and fading characteristics are static. Known signal level parameters from an enclosed environment can be a great advantage when conducting minimum performance testing of RF power and receiver sensitivity.
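The closed-loop behavior described above can be sketched as a toy simulation. The –73 dBm cellular target, the 800 updates-per-second rate, and the 1 dB step are figures from this article; the function names, the fixed path loss, and the starting power are purely illustrative:

```python
# Illustrative sketch of closed-loop power control: the basestation
# compares each mobile's received level against a target and commands
# a 1 dB power step up or down, 800 times per second. This is a toy
# model, not an air-interface implementation.

TARGET_DBM = -73.0    # nominal cellular receive level from the article
STEP_DB = 1.0         # assumed control step size
UPDATES_PER_SEC = 800 # command rate quoted in the article

def power_control_step(received_dbm, tx_power_dbm):
    """Return the mobile's new TX power after one control command."""
    if received_dbm > TARGET_DBM:
        return tx_power_dbm - STEP_DB   # "down" command
    return tx_power_dbm + STEP_DB       # "up" command

# Simulate a mobile starting 6 dB too hot over a fixed 90 dB path loss.
path_loss_db = 90.0
tx = 23.0  # dBm
for _ in range(10):
    rx = tx - path_loss_db
    tx = power_control_step(rx, tx)
# tx has stepped down and now dithers around the 17-18 dBm that puts
# the received level at the -73 dBm target.
```

In a live system the path loss changes continuously with fading and multipath, which is why the loop must run so fast; in the static environment of an RF shield enclosure the commanded power settles and stays put.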

At first it would appear that providing higher levels of RF power would improve the mobile's coverage performance, but it comes at a high cost to the subscriber. Higher levels of RF power affect battery life dramatically. For this reason, most mobiles are set at 1 to 1.5 dB above published minimum levels. Therefore, although the standard allows CDMA mobile phone power to be as high as 30 dBm, in practice the levels are set lower, closer to the lower limit of 23 dBm.

Testing Mobile Devices
Before testing RF power and receiver sensitivity of any radio system, many calibration aspects must be taken into consideration. In CDMA, these are not just important, they are critical: they ensure the mobile operates efficiently, that it does not become a source of interference when connected to a real network, and that it still provides adequate coverage performance.

Test equipment variances — It is physically impossible for two units of the same model of test equipment to have 100% identical performance. Strictly controlled calibration techniques can bring the performance close, but never identical. With RF power measurement, it is possible to achieve <1 dB variance, typically 0.5 dB, on modern instrumentation.

Temperature variance — Such close performance remains valid only when the instrument is operated in a controlled temperature environment at the same temperature as the calibration laboratory. As a rule, temperature variance should be controlled at ±2º C from the calibration environment.

RF cables and connectors — Every RF connection cable or adapter exhibits a loss characteristic. This loss is also frequency-dependent, such that loss at a low frequency is less than that at a high frequency (tilt). A quality, double-screened, three foot coaxial cable exhibits loss of around 0.5 dB in the cellular 800/900 MHz band, with 0.2 dB tilt. That same cable at PCS 1900/2000 MHz has a loss of 1.0 dB and the same 0.2 dB tilt. These cables may be documented for loss factors by sweeping them with a scalar network analyzer or spectrum analyzer with a tracking generator.
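Swept loss data of this kind is naturally stored as a small frequency-to-loss table with interpolation between the swept points. The figures below are the article's typical values for a quality three-foot cable (0.5 dB mid-band at cellular, 1.0 dB at PCS, 0.2 dB tilt in each band); the table layout and function are a sketch, and a real table would come from your own network-analyzer sweep:

```python
# Minimal cable-loss lookup built from swept data points.
# Frequencies in MHz, losses in dB; values are illustrative.
CABLE_LOSS_DB = {
    800: 0.4, 900: 0.6,     # ~0.5 dB mid-band, 0.2 dB tilt (cellular)
    1900: 0.9, 2000: 1.1,   # ~1.0 dB mid-band, 0.2 dB tilt (PCS)
}

def loss_at(freq_mhz):
    """Linearly interpolate loss between the two nearest swept points."""
    pts = sorted(CABLE_LOSS_DB)
    if freq_mhz <= pts[0]:
        return CABLE_LOSS_DB[pts[0]]
    if freq_mhz >= pts[-1]:
        return CABLE_LOSS_DB[pts[-1]]
    for lo, hi in zip(pts, pts[1:]):
        if lo <= freq_mhz <= hi:
            frac = (freq_mhz - lo) / (hi - lo)
            return CABLE_LOSS_DB[lo] + frac * (CABLE_LOSS_DB[hi] - CABLE_LOSS_DB[lo])
```

With more sweep points per band, the same lookup captures the tilt more finely across each channel.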

Mobile device RF cables — OEMs of mobile devices normally provide a test cable to make that last link to the device's RF test port. Some provide loss data, others do not, and this can be the greatest unknown factor in the RF power measurement equation. Typically these cables have 0.8 dB loss at cellular and 1.5 dB at PCS frequencies. Because of the special nature of the RF connector, cable sweeps are difficult to conduct on these proprietary connections.

RF shield enclosures and antenna couplers (radiated test) — When making measurements with no direct RF connection, a whole new set of parameters for free space loss come into play. For CDMA mobiles, it is imperative that the mobile is isolated from local operational signals so interference won't affect the measurements. Besides taking test equipment and cable loss variances into account, also consider the variances between individual RF shield boxes and antenna couplers.

The major difference when using a radiated test method is that the FWD link path (RX) and RVS link path (TX) have different values. This phenomenon is the consequence of unequal antenna coupler factors in the duplex frequency transmit/receive paths and multipath reflections inside the RF shield enclosure. This difference demands a more complex loss entry table to allow for separate values for each radio channel under test duplex frequencies.
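That loss entry table can be sketched as a per-band structure with separate forward (RX) and reverse (TX) values. The numbers below are the typical 4910 Antenna Coupler figures quoted later in this article; the table shape and helper function are illustrative:

```python
# Sketch of the more complex offset table a radiated setup needs:
# separate reverse-link (TX) and forward-link (RX) losses per band,
# since coupler factors differ across the duplex frequency split.
PATH_LOSS_DB = {
    "cell": {"tx": 11.0, "rx": 12.0},
    "pcs":  {"tx": 20.0, "rx": 23.0},
}

def corrected_power(measured_dbm, band, direction):
    """Add the path loss back onto a raw reading to get the true level."""
    return measured_dbm + PATH_LOSS_DB[band][direction]

# Example: a raw reverse-link reading of +12 dBm through the cellular
# coupler corresponds to +23 dBm at the mobile's antenna port.
```

A production system would extend the table with one entry per radio channel under test rather than one per band.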

Connector wear vs. time — In operation, the connectors of the entire manual test system configuration will show changes in loss factors over time. Depending on the number of connections made and the working environment, measurements will gradually deteriorate to a point where the tests will fail the set lower limit. This results from wear on the mating pins and surfaces because of repeated connections of the device. It is good working practice to sweep this cable periodically to ensure that it remains within certain acceptable specifications.

In radiated test environments, the configuration can essentially be fixed so that none of the connections are disturbed; it therefore remains reasonably stable unless there is cause to move the setup. This is yet another benefit of radiated test configurations.

Cleanliness — To maintain a calibrated environment, connectors must be cleaned at regular intervals. Dirty connectors can, and will at some point, affect measurement results. Failure to observe a cleanliness regime accelerates connector wear and increases uncertainties in the system.

Many factors can have a dramatic effect on measurement result uncertainty, repeatability and accuracy. It is possible to take precautions and measures to “calibrate them out” in general terms. But precautions can never produce a 100% certainty when measuring the same device on two or more seemingly identical test configurations. Factors contributing to uncertainty, repeatability and accuracy include:
Temperature effects — Corrective action: Strive to control your test environment at 23° C ± 2°.
Cable losses — Precautionary measures: Use only high-quality cables that are either supplied with frequency vs. loss data, or regularly sweep the cables to establish that data. Observe a connector cleanliness regime. Wherever possible, make the installation permanent; disturb it only when necessary.
Radiated test — Calibration: Characterize the test system's RF losses to emulate the performance of test results achieved from direct cable tests. Where possible, make the installation permanent. Disturb it only when necessary.
Test equipment variance — Be aware that no two seemingly identical test systems will produce identical results. Wherever possible, calibrate each system separately. When using generic values and calibrations across multiple systems, set pass and fail limits appropriate to the potential variances of those systems.
Sum of all unknowns

Temperature limits: ±0.5 dB to 2 dB at extremes
Cable loss differences of two identical test cables: ±0.1 dB to 0.2 dB typical
Variances between two identical test instruments: ±0.1 dB to 1 dB typical
Variances of two identical shield/couplers for radiated test: ±0.3 dB to 1 dB typical
Worst-case radiated test: 2 dB (temp.) + 1 dB (instrument) + 0.2 dB (cable) + 1 dB (shield/coupler) = 4.2 dB
Best-case radiated test: 0.5 dB (temp.) + 0.1 dB (cable) + 0.1 dB (instrument) + 0.3 dB (shield/coupler) = 1.0 dB
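These budgets are straight arithmetic sums of the individual contributions; a quick sanity check of the figures above (the helper name is illustrative):

```python
# Sum the individual uncertainty contributions into one budget figure.
def budget(temp_db, instrument_db, cable_db, coupler_db):
    return temp_db + instrument_db + cable_db + coupler_db

worst = budget(2.0, 1.0, 0.2, 1.0)  # worst-case radiated test: 4.2 dB
best = budget(0.5, 0.1, 0.1, 0.3)   # best-case radiated test: 1.0 dB
```

A linear sum like this is the conservative worst-case view; a statistical (root-sum-square) combination would give a smaller expected uncertainty.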

Calibration Procedure Using Direct Connect

For the following, the exact procedure varies depending on equipment, but procedures are similar.
1. Assemble the cables and connectors to be used to connect to the test equipment (basestation emulator).
2. Calculate the sum of RF losses associated with these cables. Use known data or sweep the cables.
3. Enter the direct cable losses into the test equipment's RF offset table. Ensure that power control is set to alternating.
4. Switch on the mobile and wait for registration.
5. After registration, place the mobile into a call mode on the traffic channel to be used for reference measurements (typically 384 Cell or 600 PCS).
6. Set RF level to –104 dBm and select Max Power measurement function. Note and record reading (e.g., +23.5 dBm).
7. Select FER receiver sensitivity measurement function.
Note: Ensure that TCH/FCH traffic level is set to –12.4 dBm (RC1–RC2) or –15.3 dBm (RC3–RC4 and above) and that test parameters are set to 0.5% FER at 95% confidence and 1000 frames.
8. Decrement the RF output level in fine steps until the FER parameter starts to show errors. Continue slowly until the FER reading is 0.4%. Document the RF level (e.g., –108.5 dBm). Check the accuracy of the cable offset values by setting the RF output level to –65 dBm and observing the RF power level reading; it should read –8 dBm (Cell) or –11 dBm (PCS) (±2 dB).
9. Repeat steps 1–8 for the PCS channel if the mobile is PCS or dual-band.
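The offset sanity check in step 8 can be captured in a small helper. The expected readings and ±2 dB tolerance are the article's figures; the function and table names are illustrative:

```python
# With the RF output set to -65 dBm, a correctly offset system should
# report -8 dBm (Cell) or -11 dBm (PCS), within +/-2 dB.
EXPECTED_DBM = {"cell": -8.0, "pcs": -11.0}

def offsets_ok(reading_dbm, band, tolerance_db=2.0):
    """True if the observed reading is within tolerance of the expected value."""
    return abs(reading_dbm - EXPECTED_DBM[band]) <= tolerance_db

# Example: a cellular reading of -8.5 dBm passes; a PCS reading of
# -13.5 dBm (2.5 dB adrift) indicates a path loss entry needs revisiting.
```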

Calibration Using Radiated Test
The goals when calibrating a radiated test configuration are to:
1. Achieve identical test results to those documented under direct RF connect.
2. Document the system loss factors for that model of mobile phone.
3. Inject those factors into the test system's RF Offset memory for recall based on the mobile phone's OEM/model number.

Mobile Configuration
Modern mobiles come in different shapes and sizes, with various features. It is important to configure a mobile the same way every time for testing in radiated test mode. Retractable antennas are unreliable in terms of coupling with the antenna coupler. Always test with the antenna retracted. Flips open or closed change the multipath reflection characteristics inside the RF shield enclosure. Always test with the flip in the same position.

Calibration Method
1. Initially, use an educated guess for the loss factor to enter into the test system's RF offset memory. Typical losses for the 4910 Antenna Coupler are 11 dB Cell (TX) and 12 dB Cell (RX). For PCS, the values are much higher: 20 dB (TX) and 23 dB (RX). More precise devices can have a closer spread (as close as 14 dB Cell and 17 dB PCS).
2. Enter the estimated antenna coupler loss factors into the test system's TX/RX path loss tables.
3. Set RF level output to –65 dBm, turn on the mobile and wait for registration. If registration fails, increase path loss values until registration is successful.
4. Place mobile onto a call on traffic channel of interest.
5. Set RF Level to –100 dBm; select Max Power function and observe the mobile power reading. Adjust the loss offset by the difference between this reading and the Max Power value recorded under direct connect.
6. When you are satisfied that this path is close to the final value, select FER at –104 dBm and adjust the RF level until 0.4% FER is observed. Adjust the path loss factor by the difference between this level and the FER level recorded under direct connect.
7. When you are satisfied that both paths are as close as you can get, set RF level to –65 dBm and observe RF Power reading. The results should be close to –8 dBm (Cell) and/or –11 dBm (PCS) if the path loss values are correct.
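The differential adjustment in steps 5 and 6 amounts to folding the radiated-vs-direct difference back into the loss table entry. A hedged sketch, with illustrative names and example values taken from the direct-connect procedure above:

```python
# Correct a radiated path-loss entry by however far the radiated
# reading sits from the value recorded under direct connect.
def adjust_path_loss(current_loss_db, radiated_reading_dbm, direct_reference_dbm):
    """Fold the radiated-vs-direct differential into the loss table entry."""
    differential = direct_reference_dbm - radiated_reading_dbm
    return current_loss_db + differential

# Example: direct connect recorded +23.5 dBm Max Power; the radiated
# setup reads +22.0 dBm, so an 11.0 dB coupler entry becomes 12.5 dB.
```

In practice steps 5-7 iterate this adjustment until both the TX and RX paths agree with the direct-connect references within the system's uncertainty budget.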

Characterizing CDMA mobile devices becomes critical when designing components and subsystems to optimize power and performance. Knowing the metrics of characterization saves the team time and resources, shortens design time, and reduces errors. It further ensures that systems are optimally loaded and that power, both at the basestation and at the mobile, is used at maximum efficiency.

About the Author
Christopher W. Rix is Willtek Communications' 3G Product Marketing & Development Manager. He was educated in the United Kingdom's civilian and military electronics colleges and academies, and holds qualifications equivalent to a U.S. bachelor's degree in electronics. His background includes 23 years in the UK Royal Air Force. He has worked in the cellular and PCS test equipment industry since 1989 for manufacturers such as Marconi Instruments, IFR, Wavetek/WWG, Acterna and, currently, Willtek Communications. At Willtek, he is responsible for the product definition and marketing aspects of test equipment covering 2G and 3G systems.