Transmission networks, including 3G (UMTS) cellular networks, will increasingly carry ATM circuits. Because ATM was designed assuming fiber as the bearer medium, it employs limited error detection and correction. However, microwave radio is also a suitable medium for ATM transmission, and current radio planning rules can still be used.

By Stratex Networks
Microwave radio link design involves specifying the equipment configuration and the antenna types and sizes needed to meet a particular quality objective. In the past, radio links had to meet a quality objective strict enough to carry high-quality voice circuits. The minimum quality standard for any transmission medium carrying voice circuits was assumed to be a Bit Error Ratio (BER) of 10⁻³. If the link quality of a radio system fell below this, the link was assumed to be unusable, and the duration of this condition was counted as outage time.
As data transmission increased on transmission networks, new standards emerged to quantify the quality requirements of transmission systems. Outages longer than 10 seconds were covered under ITU availability standards, and new standards such as G.821 (and later G.826) covered the short breaks in transmission that affected link quality. These standards also tried to introduce a metric stricter than 10⁻³, recognising that data transmission requires better than 10⁻⁶ for acceptable throughput; the Degraded Minute concept, with its threshold of 10⁻⁶, was introduced to address this.
Unfortunately, the added complexity of these standards, together with the practical difficulties of applying them, led the industry to adopt a simpler and more pragmatic approach. This approach assumes that all outages, from both rain and multipath fading, should result in an overall hop availability between 99.99% and 99.999% (the so-called four-nines and five-nines standards). The minimum threshold value assumed is 10⁻⁶, rather than the 10⁻³ that was more suitable for voice-based traffic.
In summary, radio links are planned on the assumption that, during periods of availability, the quality level equals the background error rate (usually 10⁻¹³), and that the unavailable time (including accumulated multipath errors) is less than 0.01% (or 0.001%), i.e. availability better than 99.99% (or 99.999%). This is calculated from the fade margin, which is the difference between the nominal receive level (a function of transmitter output power and antenna size) and the 10⁻⁶ receiver threshold level.
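As an illustration, the fade margin can be sketched as a simple link-budget calculation. All figures below (frequency, distance, powers, gains, receiver threshold) are hypothetical, and a real design would also subtract feeder and atmospheric losses:

```python
import math

def free_space_loss_db(freq_ghz: float, dist_km: float) -> float:
    """Free-space path loss in dB for a line-of-sight hop."""
    return 92.45 + 20 * math.log10(freq_ghz) + 20 * math.log10(dist_km)

def fade_margin_db(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                   freq_ghz, dist_km, threshold_dbm):
    """Fade margin = nominal receive level minus the 10^-6 receiver threshold."""
    nominal_rsl = (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                   - free_space_loss_db(freq_ghz, dist_km))
    return nominal_rsl - threshold_dbm

# Hypothetical 23 GHz, 10 km hop: +20 dBm transmitter, 38 dBi antennas,
# -85 dBm receiver threshold at BER 10^-6.
fm = fade_margin_db(20, 38, 38, 23, 10, -85)
print(f"Fade margin: {fm:.1f} dB")  # ~41.3 dB
```

The planner then feeds this margin into a fade-prediction model (for example, the methods of ITU-R P.530) to check that the target availability is achieved.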
With the growth of ATM networks, and particularly with 3G networks being based on ATM, there is a concern that 10⁻⁶ is not good enough as the receiver threshold value, because ATM was designed with fiber in mind as the transmission medium. Confusion appears to exist between the background bit error rate (BBER) and the outage threshold level. With the advent of more data traffic and of protocols such as ATM that assume a high-quality transmission medium, new standards such as ITU-T G.828 and ITU-T I.356/I.357 have been written. These standards attempt to address the minimum design standards for the transmission medium; however, no radio standards yet exist to guide the Radio Planner on the exact approach to be taken. It is unlikely that any specific usable standards will be available in the short term because:
Existing standards such as G.821/G.826 are seldom rigorously applied, due to their complexity. The block-based measures used in G.826/G.828 are complicated to apply because it is not entirely clear what the ratio is between a block error rate and a bit error rate. In the case of SDH, it is generally assumed that a BER of 10⁻³ equates to a block error rate of 10⁻⁴ or 10⁻⁵, depending on the error distribution and burst length.
ATM error measurements are made using the Cell Loss Ratio (CLR), but only the header is checked, so errors are detectable in just 5 bytes out of 53 (approximately 10% of the cell). The actual error rate can thus only be measured at the end points of the virtual circuit, and any BER assumption for the transmission link cannot easily be related to the CLR. A general assumption based on the above could be that any BER requirement should be one order of magnitude stricter for ATM traffic.
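The header-only coverage can be illustrated numerically. The sketch below assumes independent bit errors at an illustrative BER of 10⁻⁶; this is a simplification (the ATM HEC actually protects the 5-byte header as a block), but it shows why only around 10% of errored cells are visible to a header check:

```python
def bit_error_hit_prob(ber: float, nbits: int) -> float:
    """Probability that at least one of nbits is errored (independent errors)."""
    return 1 - (1 - ber) ** nbits

ber = 1e-6                                  # illustrative link BER (assumption)
p_cell = bit_error_hit_prob(ber, 53 * 8)    # any bit of the 53-byte cell errored
p_header = bit_error_hit_prob(ber, 5 * 8)   # error falls in the checked header

print(f"P(cell errored)     = {p_cell:.3e}")
print(f"P(header errored)   = {p_header:.3e}")
print(f"Detectable fraction = {p_header / p_cell:.1%}")  # ~9.4%, i.e. 5/53
```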
It is important to know the type and size of the original traffic mapped into the ATM cells, and whether re-transmission or error detection/correction is performed at a higher protocol layer, as this will affect the performance. For example, where a PDH/SDH system would be expected to measure a BER of 1 × 10⁻⁶, the corresponding IP system carrying Ethernet traffic would show an equivalent Frame Error Ratio (FER) of about 5 × 10⁻⁴. An Ethernet frame is considered errored if at least one bit in the frame is errored. Assuming independent bit errors (a binomial distribution), the probability of there being exactly one errored bit in a 64-octet frame is:
P = (64×8) × p × (1−p)^(64×8−1)

where p is the BER.
The probability that a frame contains exactly two errors is:
P = C(64×8, 2) × p² × (1−p)^(64×8−2), where C(n, 2) = n(n−1)/2 is the binomial coefficient.
The probability of there being at least one bit errored in the 64 octet frame is:
P = 1 − (1−p)^(64×8)
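These three probabilities can be checked numerically. The minimal sketch below assumes independent bit errors and uses the 64-octet frame and BER of 1 × 10⁻⁶ from the example above, confirming the FER of roughly 5 × 10⁻⁴:

```python
import math

def p_exactly_k(ber: float, nbits: int, k: int) -> float:
    """Binomial probability of exactly k errored bits out of nbits."""
    return math.comb(nbits, k) * ber**k * (1 - ber) ** (nbits - k)

def p_at_least_one(ber: float, nbits: int) -> float:
    """Probability that the frame contains one or more errored bits."""
    return 1 - (1 - ber) ** nbits

n = 64 * 8   # bits in a 64-octet frame
p = 1e-6     # the BER from the example above

print(f"P(exactly 1 errored bit)  = {p_exactly_k(p, n, 1):.3e}")
print(f"P(exactly 2 errored bits) = {p_exactly_k(p, n, 2):.3e}")
print(f"FER = P(>= 1 errored bit) = {p_at_least_one(p, n):.3e}")  # ~5.12e-4
```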
It is Stratex Networks' view that, while taking cognizance of the new standards, a pragmatic approach is once again required.
In practice, most of this theoretically derived information has little significance and is almost impossible to apply in a real network. Let's look at the reality of a typical network:
• It is impossible to offer an error-free service that never fails. In practice, the Operator is required to provide a good-quality connection for 'most of the time'; this is usually quantified by the Operator and often included in Service Level Agreements (SLAs). To achieve this, it is important that the transmission medium has a good residual background error rate. ATM may have issues running over a transmission medium with a very poor background bit error rate (BBER), for the reasons discussed above. In the case of microwave radio and fiber optics there are no concerns, as both offer a BBER of around 10⁻¹³.
• For a small percentage of time, irrespective of the medium used, errors will occur, leading to periods of poor quality and unavailable service. One needs to quantify what 'small' is: most operators choose an availability value between 99.99% and 99.999% and assume a MINIMUM quality level of 10⁻⁶. That is, if the quality of the link falls below 10⁻⁶, the link is assumed to be unusable, and this time counts as outage time.
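To get a feel for what these availability targets mean, a short sketch converting an availability percentage into the allowed outage time per average year:

```python
def annual_outage_minutes(availability_pct: float) -> float:
    """Allowed outage per average year (minutes) for an availability target."""
    unavailability = 1 - availability_pct / 100
    return unavailability * 365.25 * 24 * 60  # average year incl. leap days

for target in (99.99, 99.999):
    print(f"{target}% availability -> "
          f"{annual_outage_minutes(target):.1f} min/year outage")
# four nines allows roughly 53 minutes a year; five nines roughly 5 minutes
```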
Herein lies the rub! Officially, from an ITU perspective, outage begins at 10⁻³, but operators have tended to specify 10⁻⁶ because, they argue, data throughput slows to an unacceptable level below 10⁻⁶, so from the Operator's viewpoint the link is unavailable. Despite being contrary to the ITU approach, choosing 10⁻⁶ as the minimum level has become accepted as an industry standard, and it is the level to which microwave radio systems are designed.
For ATM, the debate is whether 10⁻⁶ is good enough (some have suggested 10⁻¹⁰ as a minimum level). There is no right or wrong answer, because we are already operating outside the letter of the law as far as standards are concerned. One could argue that, since ATM requires one order of magnitude better than other protocols to operate and to be fully ITU-compliant, a threshold level of 10⁻⁴ should be specified for PDH links (where bit error ratio standards are specified), while the threshold would be set at 10⁻⁶ for ATM over SDH (where block error ratios are specified, a block error ratio of 10⁻⁵ being roughly equivalent to a BER of 10⁻³).
Alternatively, one may argue that even though 10⁻⁶ is not specified by the ITU as the minimum standard, it is the accepted industry standard, and therefore 10⁻⁷ is the appropriate threshold value. Stratex Networks' approach is to say that while all this is technically fascinating, it makes virtually no difference from a practical point of view: even under the strictest assumption of 10⁻¹⁰, there is less than 2 dB difference in the fade margin. Certainly, most leased-line circuits won't guarantee anything near 10⁻¹⁰, yet ATM runs successfully over them. The 10⁻⁶ receiver threshold value for link design is therefore still valid in practice, and the design method for ATM-based transmission is no different from the industry approach currently used.
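The size of this difference can be sanity-checked against a textbook modulation curve. The sketch below uses uncoded coherent QPSK (BER = ½·erfc(√(Eb/N0))) purely as an illustrative worst case: it yields roughly 2.5 dB between the 10⁻⁶ and 10⁻¹⁰ thresholds, and practical radios with FEC have much steeper BER curves, so the real difference is smaller:

```python
import math

def qpsk_ber(ebn0_db: float) -> float:
    """Uncoded coherent QPSK bit error ratio for a given Eb/N0 in dB."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

def ebn0_for_ber(target_ber: float) -> float:
    """Bisect for the Eb/N0 (dB) at which the BER curve crosses target_ber."""
    lo, hi = 0.0, 20.0  # BER decreases monotonically with Eb/N0
    for _ in range(60):
        mid = (lo + hi) / 2
        if qpsk_ber(mid) > target_ber:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

delta_db = ebn0_for_ber(1e-10) - ebn0_for_ber(1e-6)
print(f"Extra margin needed, 1e-6 -> 1e-10: {delta_db:.1f} dB")  # ~2.5 dB
```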
ATM traffic requires a very high-quality transmission medium with a good background error rate. Microwave radio and fiber optics are ideal from this perspective, as both offer background error rates in the order of 10⁻¹³. Microwave radio is subject to fading for a small proportion of time, but with proper design this is limited to a specified period (typically better than 99.99% availability). The threshold used to calculate this outage has traditionally been set stricter than that specified by the ITU standards (10⁻⁶ instead of 10⁻³), which means that the stricter ATM requirements are still met. If one were rigorously applying the ITU standards, it might be appropriate to use 10⁻⁴ as the threshold for PDH links and 10⁻⁶ for SDH links. However, concerns over actual performance in practice may lead planners to demand a threshold of anything up to 10⁻¹⁰. Considering that the industry standard is already 10⁻⁶, and that there is typically less than 2 dB difference between the 10⁻⁶ and 10⁻¹⁰ thresholds, there is no requirement to change the planning approach.
Mr. Manning can be reached at firstname.lastname@example.org.