Cable VoIP Troubleshooting Poster


When it comes to testing digital QAM signals, testing experts suggest that cable network operators use their digital video analyzers to test both Modulation Error Ratio (MER) and Bit Error Rate (BER). That's because MER and BER measurements detect different types of impairments.

MER is the measurement, in dB, of the RMS error magnitude over the average symbol magnitude. The higher the error magnitude, the poorer the MER. MER essentially assigns a value to the fuzziness of the symbol cluster (see Figure 1). So, the larger or fuzzier the cluster becomes, the poorer the MER. Likewise, the farther the dots move from their ideal locations, the poorer the MER.

For example, the diagram shown here on the left is a constellation with a good MER of 34 dB (Figure 2), while the diagram on the right (Figure 3) shows a constellation with a poor MER of 21 dB.
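The MER calculation described above (average ideal-symbol power over average error power, in dB) can be sketched in a few lines. This is only an illustrative calculation; the function name and the sample I/Q values are hypothetical, not any analyzer's implementation.

```python
import math

def mer_db(received, ideal):
    """MER in dB: average ideal-symbol power over average error power,
    for paired (I, Q) samples matched by index."""
    sig = sum(i * i + q * q for i, q in ideal)
    err = sum((ri - ii) ** 2 + (rq - iq) ** 2
              for (ri, rq), (ii, iq) in zip(received, ideal))
    return 10 * math.log10(sig / err)

# A tight cluster around the ideal point scores a high MER;
# a fuzzier cluster scores a lower one.
ideal = [(1.0, 1.0)] * 4
tight = [(1.01, 0.99), (0.99, 1.0), (1.0, 1.02), (0.98, 1.01)]
fuzzy = [(1.2, 0.8), (0.7, 1.1), (1.1, 1.3), (0.8, 0.9)]
print(mer_db(tight, ideal) > mer_db(fuzzy, ideal))  # True
```

As the text notes, the fuzzier cluster lands at a lower MER even though every sample may still fall inside its decision boundary.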

Each symbol, or dot, on the constellation is framed by decision boundaries (see Figure 4). When the carrier falls inside the boundaries, the information is transmitted without errors. In this example, BER testing is not an effective measurement because the BER is perfect. But the good news could be hiding problems.

Using MER instead, it is clear that while each of the following constellations has a perfect BER, the constellation in Figure 7 has a much better MER, with less noise (see Figures 5, 6 and 7).

So, why measure BER? Because MER is a poor indicator of fast, intermittent transients. Examples of these types of impairments include laser clipping (the most common cause), loose or corroded connections, sweep system interference and microphonics. So, if you have high MER but errors are present, they are probably being caused by intermittent interference. This shows up on a constellation diagram as a lone dot that is away from the main cluster.


Figure 1

Figure 3. Constellation with poor MER

Figure 4. Decision boundaries: correct locations fall within the decision boundaries; locations in error fall outside them.

Figures 5, 6 and 7. Constellations with identical, perfect BER but good, poor and best MER, respectively.

How good should it be? Targeted performance goals

The charts shown at right outline performance goals for a typical network. Specific system requirements may require tighter or less critical performance.

Scientific notation

BER (bit error rate) measurements are expressed in terms of errors divided by the total number of bits transmitted or received. Since the number of errors is very small compared to the number of bits transmitted, the measurement is typically expressed in scientific notation. For example, one error out of one million bits would be expressed as 1/1,000,000 or 1.0 E-6.

Confusion often arises when a second measurement is compared. Is 7.0 E-7 better or worse? 7.0 E-7 means seven errors out of 10 million bits, which is actually a little better than 1 in one million. The chart at top right may be helpful in interpreting scientific notation.

One important note: Many instruments will read 0 (zero) or 0.0E-0 when no errors have been detected. E0 or E-0 is equal to 1, but the leading 0 makes the measurement equal to 0.
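The scientific-notation convention above is easy to reproduce; this short sketch (the `ber` helper is hypothetical, not an instrument's firmware) formats a raw error count the way instruments report it, including the 0.0E-0 reading for zero errors.

```python
def ber(errors, total_bits):
    """Bit error rate, formatted in scientific notation as instruments report it."""
    if errors == 0:
        return "0.0E-0"  # no errors detected: reads as zero, not as 1
    return f"{errors / total_bits:.1E}"

print(ber(1, 1_000_000))    # 1.0E-06
print(ber(7, 10_000_000))   # 7.0E-07
# 7.0E-7 is numerically smaller (better) than 1.0E-6:
print(7 / 10_000_000 < 1 / 1_000_000)  # True
```

Comparing the underlying numbers, rather than the printed strings, avoids the "is 7.0 E-7 better or worse?" confusion described above.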

SCIENTIFIC NOTATION

1.00E+00  1/1
1.00E-01  1/10
1.00E-02  1/100
1.00E-03  1/1,000
1.00E-04  1/10,000
1.00E-05  1/100,000
1.00E-06  1/1,000,000
1.00E-07  1/10,000,000
1.00E-08  1/100,000,000
1.00E-09  1/1,000,000,000
1.00E-10  1/10,000,000,000
1.00E-11  1/100,000,000,000
1.00E-12  1/1,000,000,000,000
0.00E-00  0 x 1

Expected MER: digital data (64 QAM)

Location   Excellent   Acceptable   Marginal
Headend    35 dB       34 dB        32 dB
Node       35 dB       33 dB        30 dB
Amp        33 dB       31 dB        28 dB
Tap        33 dB       29 dB        25 dB
Modem      32 dB       28 dB        25 dB

Expected MER: digital video (64 QAM)

Location   Excellent   Acceptable   Marginal
Headend    35 dB       33 dB        30 dB
Node       34 dB       31 dB        28 dB
Amp        33 dB       30 dB        25 dB
Tap        32 dB       28 dB        24 dB
Set-top    32 dB       27 dB        23 dB
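A measured MER can be graded against targets like the ones in these charts with a simple threshold check. The function and the sample thresholds below are illustrative (hypothetical names and one assumed set of excellent/acceptable/marginal targets), not part of any test instrument.

```python
def grade_mer(mer_db, excellent, acceptable, marginal):
    """Grade a measured MER (dB) against excellent/acceptable/marginal targets."""
    if mer_db >= excellent:
        return "excellent"
    if mer_db >= acceptable:
        return "acceptable"
    if mer_db >= marginal:
        return "marginal"
    return "failing"

# Hypothetical check of a 34 dB reading against one set of chart targets.
print(grade_mer(34, excellent=35, acceptable=33, marginal=30))  # acceptable
```

The same helper works at any point in the plant; only the three target values change from row to row of the charts.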

When it comes to the deployment of IP-based services, it's still so early that there appears to be a lack of consensus regarding which tests need to be run, and how often they should be performed. Some say that if a cable operator builds a strong platform, then the network will run well and services will be pristine. But just what are the key components of a strong platform? According to the experts, it comes down to controlling the worst of the impairments, namely latency, jitter and packet loss.

    Latency

Performing all the functions that are required to process and packetize voice signals, and then transport them from the origination point to the receive point in any IP architecture, including PacketCable, takes time. Each particular function requires tiny fractions of a second, but the total amount of time varies based on the architecture of the device as well as the amount of traffic that has to be processed. This time delay is known as latency.

Most network latency occurs after the packets leave the endpoint, or gateway. Every time a packet encounters a network router, a few milliseconds or more of additional latency is introduced. Therefore, unless the signal is kept within a carefully managed intranet or similar type of network, there is no control over the number of router-to-router hops a packet takes. Monitoring the total latency a packet is experiencing is necessary to maintaining a high-quality signal transmission.

According to International Telecommunication Union (ITU) guidelines, delays below 150 milliseconds are considered acceptable for most communications. Delays ranging between 150 and 400 ms could also be acceptable, depending on the voice quality desired, but over 400 ms is deemed unacceptable. Delays on VoIP sessions are measured in two categories: fixed and variable.

    Fixed delays can include the following:

Propagation delay: The time it takes for the packet to be transmitted over the physical link. This delay is usually bound by physical characteristics of the transmission media (e.g., when using a fiber optic circuit, it would be bound by the speed of light).

Serialization delay: The time it takes to place the bits from the transmission buffer onto the transmission media. The higher the speed, the less serialization delay.

Processing delay: Includes the time it takes to code, compress, decompress and decode the voice signal, and the time it takes to collect enough voice samples to be placed on the payload of a data packet. This varies, depending on the algorithm used.

An example of variable delay is queuing delay: the time a packet has to wait in a router before it can be serviced. This delay will occur at every router in the path of a VoIP session.
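A one-way delay budget is just the sum of these fixed and variable components, checked against the ITU thresholds cited above. The millisecond figures in this sketch are assumptions for illustration, not measurements from any network.

```python
# Illustrative one-way delay budget; all ms figures are assumptions.
fixed = {"propagation": 20.0, "serialization": 5.0, "processing": 30.0}
variable_queuing = [2.0, 5.0, 3.0, 8.0]  # one entry per router hop

total_ms = sum(fixed.values()) + sum(variable_queuing)

# Thresholds per the ITU guidance described in the text.
if total_ms < 150:
    verdict = "acceptable for most communications"
elif total_ms <= 400:
    verdict = "possibly acceptable, depending on the voice quality desired"
else:
    verdict = "unacceptable"

print(f"{total_ms:.0f} ms: {verdict}")  # 73 ms: acceptable for most communications
```

Note that the queuing term grows with every additional router hop, which is why an unmanaged path can push an otherwise healthy budget past 150 ms.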

    Jitter

In addition to being sent over an unpredictable number of router hops, packets are also routed from one router to another using different assigned routes, each of which has a different amount of traffic it has to handle. So, packets from the same voice conversation will experience differing amounts of latency as they head toward their destination. These variable delays produce jitter: a phenomenon that comes from different packets arriving at the destination at different points in time.

Gateways use buffers to collect and hold the packets and put them back in the proper order. But even this process has to be optimized, so as not to introduce its own unacceptable latency. Again, jitter must be effectively monitored to be sure it's being properly dealt with.
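The buffering-and-reordering step that gateways perform can be sketched with a small priority queue keyed on sequence number. This is a minimal illustration of the idea (class and method names are hypothetical), not a real gateway's jitter buffer, which must also bound how long it holds packets.

```python
import heapq

class JitterBuffer:
    """Minimal sketch: hold packets briefly and release them in
    sequence order, smoothing out variable arrival times."""
    def __init__(self):
        self._heap = []      # min-heap of (sequence number, payload)
        self._next_seq = 0   # next sequence number owed to the decoder

    def push(self, seq, payload):
        heapq.heappush(self._heap, (seq, payload))

    def pop_ready(self):
        """Release every payload that is next in sequence."""
        out = []
        while self._heap and self._heap[0][0] == self._next_seq:
            out.append(heapq.heappop(self._heap)[1])
            self._next_seq += 1
        return out

buf = JitterBuffer()
for seq, payload in [(1, "b"), (0, "a"), (2, "c")]:  # arrives out of order
    buf.push(seq, payload)
print(buf.pop_ready())  # ['a', 'b', 'c']
```

The trade-off the text describes is visible here: a deeper buffer reorders more reliably but holds packets longer, adding exactly the latency the budget is trying to avoid.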

    Dropped packets

When traffic levels rise to a level that overloads a router, the device may intentionally drop packets to relieve the congestion. Error-checking has been built into the protocols and is used to maintain data integrity. But this procedure requires additional overhead, and isn't really optimized for voice signals. A certain number of dropped packets (less than 3 percent, typically) can be tolerated by the human ear before signal degradation is perceived, but beyond that amount, call quality can degrade to unacceptable levels.
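The roughly 3 percent tolerance noted above translates into a one-line check on sent versus received packet counts. The helper name and sample counts below are hypothetical.

```python
def loss_acceptable(sent, received, threshold_pct=3.0):
    """True if the dropped-packet rate stays under the rough
    ~3 percent tolerance of the human ear noted in the text."""
    lost_pct = 100.0 * (sent - received) / sent
    return lost_pct < threshold_pct

print(loss_acceptable(1000, 985))  # True  (1.5% loss)
print(loss_acceptable(1000, 940))  # False (6.0% loss)
```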

PacketCable architecture (diagram): Standalone MTAs and embedded MTA/cable modem units connect through the HFC access network (DOCSIS) and cable modems to the CMTS. A managed IP backbone with QoS features (headend, local, regional) links the CMTS to media servers, a management server, the OSS back office (billing, provisioning, problem resolution), DHCP servers, DNS (domain name service), TFTP servers and the PSTN. The diagram shows both the network-based call signaling architecture and the line control signaling architecture.

For years, cable operators have been diligently testing their networks to ensure signal quality and to comply with national signal leakage guidelines. But as the industry adds new services to its repertoire, testing the network simultaneously becomes more difficult and more important.

The addition of digital video, DOCSIS data channels and, now, voice-over-IP service has ushered in a new list of parameters that have to be monitored, analyzed and even adjusted to ensure that customers are getting what they pay for.

    The goal of this chart is to explain several new, emerging testing concepts that relate to voice-over-IP, DOCSIS data and digital video.

    TESTING FOR IP-BASED SERVICES

MEAN OPINION SCORES

Speech quality is usually evaluated on a five-point scale, known as the mean opinion score (MOS) scale: in speech quality testing, an average over a large number of speech samples, speakers and listeners. The five points of quality, from one to five, are: bad, poor, fair, good and excellent. Quality scores of 3.5 or higher generally imply high levels of intelligibility, speaker recognition and naturalness.

MOS is a global method used to evaluate the user's acceptance of a transmission channel or speech output system. It reflects the total auditory impression of speech by a listener. For quality ratings, normal test sentences or a free conversation are used to obtain the listener's impression. The listener is asked to rate his impression on subjective scales such as intelligibility, quality, acceptability, naturalness, etc. The MOS shows wide variation among listener scores and does not give an absolute measure, since the scales used by the listeners are not calibrated.

Using this method, a score from 4 to 5 is considered toll quality; 3 to 4, communication quality; and less than 3, synthetic quality. But this method is both time consuming and expensive. Objective models that predict human quality judgments have also been developed. These perceptual models transmit an audio file through the network, comparing the received and transmitted files to assess distortions. While perceptual models are useful in laboratory settings, these models are unsuitable for the continuous monitoring of VoIP networks.
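The quality bands just described map directly to a score classifier. This sketch (the function name is hypothetical) encodes only the toll/communication/synthetic boundaries given in the text.

```python
def mos_category(score):
    """Map a mean opinion score (1-5) to the quality bands in the text."""
    if not 1 <= score <= 5:
        raise ValueError("MOS is defined on a 1-to-5 scale")
    if score >= 4:
        return "toll quality"
    if score >= 3:
        return "communication quality"
    return "synthetic quality"

print(mos_category(4.2))  # toll quality
print(mos_category(3.5))  # communication quality
print(mos_category(2.8))  # synthetic quality
```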


MER AND BER


    Figure 2. Constellation with good MER

P.O. Box 266007, Highlands Ranch, CO 80163-6007
CED magazine, August 2004  www.cedmagazine.com
Tel.: 303-470-4800  Fax: 303-470-4890

The publisher gratefully acknowledges Trilithic Inc., Sunrise Telecom, Acterna and others for contributing content to this chart.

    VoIP and Digital TV Testing

    PacketCable architecture

    Figure 4
