9.2 Understanding the Results Table

The Results table not only provides a way to view, sort, and group the data you received from a VoIP Quality assessment, but also shows data that is not included in VoIP Readiness Assessment reports. Each tabbed view in the table contains a summary of the major statistics—those that figure directly in the MOS calculation—as well as a listing of the measurements used to derive the major statistics.

For example, when you click the Lost Data Results tab, you can find out whether data loss was in a random or a bursty pattern by looking at the “Maximum Consecutive Datagrams Lost” column. Or you can check whether a significant number of datagrams were received out of order.

The Results table lets you easily see which factors contributed to a particular call-quality problem and also helps you test possible remedies. Adding quality of service to network traffic is one example of such a remedy. If you add QoS settings to some of the call groups in the VoIP Quality assessment, you can later access Analysis Console to sort the results by QoS. Just click on the “QoS” column heading and drag it to the gray area just above the Results table to sort the results by QoS setting. Similarly, you can sort the results by codec (call script) by clicking and dragging the “Script” column heading.
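If you want to explore the same groupings outside Analysis Console, the following sketch shows an equivalent grouping in Python, assuming you have the results in a CSV file or similar tabular form. It is purely illustrative: the file name and the column names "QoS", "Script", and "Call Quality" are assumptions, not guaranteed export labels.

```python
# Purely illustrative: grouping results by QoS setting or by call script
# (codec). The CSV file name and the column names "QoS", "Script", and
# "Call Quality" are hypothetical, not guaranteed export labels.
import pandas as pd

results = pd.read_csv("voip_results.csv")

# Average MOS for each QoS setting, lowest first.
print(results.groupby("QoS")["Call Quality"].mean().sort_values())

# Average MOS for each call script (codec), lowest first.
print(results.groupby("Script")["Call Quality"].mean().sort_values())
```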

When you view the values in the Results table as Average Values, you might notice some apparent discrepancies in the totals. For example, assume you ran a VoIP Quality assessment containing four call groups. Each call group was created by a VoIP connector that defined four simultaneous calls. Analysis Console lets you view averages per call if you drill down into details for a call group. Select the call group in the Call Group Details list above the Results table.

The Results table presents data for each of the simultaneous “calls” whose results were collected over the course of the entire assessment. Averages are shown per call and per call direction.

Click to view the results as VoIP Quality indicators. Position the mouse over an indicator to see whether any calls were “Unavailable.” If you see any Unavailable results, the per-call values will not be exact averages of the per-direction values. To be included in the per-call averages, a call must be complete: it must have generated results in both directions, and results from either direction may be unavailable. The per-direction averages include all results received for each direction, but unavailable results in one direction mean that the available results for the other direction are excluded from the per-call averages.
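The following sketch uses made-up MOS values to illustrate why per-call averages can differ from per-direction averages when some results are Unavailable.

```python
# Minimal sketch of the per-call vs. per-direction averaging described above.
# The numbers are made up; None marks an "Unavailable" result.
calls = [
    # (MOS for direction E1->E2, MOS for direction E2->E1)
    (4.2, 4.1),
    (3.9, None),   # one direction unavailable: excluded from per-call averages
    (4.0, 4.3),
]

# Per-direction averages use every available result for that direction.
dir1 = [a for a, b in calls if a is not None]
dir2 = [b for a, b in calls if b is not None]
print(sum(dir1) / len(dir1))            # includes the 3.9 result
print(sum(dir2) / len(dir2))

# Per-call averages use only calls with results in both directions.
complete = [(a + b) / 2 for a, b in calls if a is not None and b is not None]
print(sum(complete) / len(complete))    # the 3.9 result is excluded
```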

The following topics provide more information about the metrics and other information you see in the Results table:

9.2.1 VoIP Results Tab

The VoIP Results tab summarizes the results from the VoIP Quality assessment. By default, results are sorted by lowest MOS to highest. More information about each of the VoIP performance metrics shown in the tab (delay, jitter, and lost data) is available in the following topics.

The following information is shown on the VoIP Results tab:

Column Name

Description

Endpoint 1/Endpoint 2

The endpoints in a call group. Endpoints are computers that initiated and received bi-directional, simulated VoIP calls during the assessment.

Script

The call script used to generate the simulated VoIP traffic between the endpoints. Corresponds to a popular VoIP codec type.

QoS

Quality of service setting used by the simulated VoIP traffic, if any. For more information, see Section 7.11.7, Reviewing Quality of Service.

# Calls

The number of simulated calls that ran simultaneously between the endpoints in each call group. The Call Multiplier in the VoIP connector definition determines the number of calls.

Call Quality

The average Mean Opinion Score (MOS) estimate that a call group received over the course of the assessment. A score of 5.0 is the highest possible and is only theoretically achievable. By default, Vivinet Assessor assigns a VoIP readiness rating of "Good" to call groups with a MOS in the range of 4.03 to 5.0.

For more information, see Section 8.5.1, Mean Opinion Score.

End-to-End Delay (ms)

The average delay, or latency, measured between the endpoints in a call group from one end of the network to the other. Calculated by adding the network delay, packetization delay, jitter buffer delay, and additional fixed delay (if any was configured).

Jitter Buffer Loss (%)

Average data loss due to the jitter buffer configured in the call script, shown as a percentage of all data sent between the endpoints in a call group.

Lost Data (%)

Lost datagrams, expressed as a percentage of all data sent between the endpoints in a particular call group over the course of the entire assessment.

R-value

The output of an E-Model calculation. The E-Model, defined in ITU-T Recommendation G.107, is an algorithm used to evaluate the quality of a voice transmission. The R-value is derived from voice-quality impairment factors and mapped to an estimated Mean Opinion Score (see the sketch after this table).

Throughput (kbps)

The average data rate achieved by the call traffic sent between the endpoints in a particular call group over the course of the VoIP Quality assessment.

Comment

Identifies the VoIP connector. By default, the comment consists of the call script name, the Endpoint 1 and Endpoint 2 names, and the call multiplier.
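As a rough illustration of two calculations described in the table above, the sketch below sums the delay components listed under End-to-End Delay and applies the standard ITU-T G.107 mapping from an R-value to an estimated MOS. Treat the mapping as an assumption: the exact formula Vivinet Assessor applies internally is not documented here.

```python
# Sketch of two calculations described in the table above. The delay sum
# follows the End-to-End Delay description directly; the R-to-MOS mapping is
# the standard ITU-T G.107 formula, which the product is assumed to follow.
def end_to_end_delay(network_ms, packetization_ms, jitter_buffer_ms, fixed_ms=0.0):
    """End-to-end delay is the sum of its component delays."""
    return network_ms + packetization_ms + jitter_buffer_ms + fixed_ms

def r_to_mos(r):
    """Map an E-model R-value to an estimated Mean Opinion Score."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + 7.0e-6 * r * (r - 60.0) * (100.0 - r)

print(end_to_end_delay(80.0, 20.0, 40.0))   # 140.0 ms
print(round(r_to_mos(80.0), 2))             # about 4.02 with this formula
```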

9.2.2 Delay Results Tab

The Delay Results tab summarizes the delay results from the VoIP Quality assessment. By default, results are sorted by lowest MOS to highest. For more information, see Section 8.5.2, Delay.

See Section 9.2.1, VoIP Results Tab for definitions of the “Endpoint,” “Script,” “QoS,” “Call Quality,” and “Comment” columns on the Delay Results tab. The following measurements are also shown:

Column Name

Description

End-to-End Delay (ms)

The average delay, or latency, measured between the endpoints in a call group from one end of the network to the other. Calculated by adding the network delay, packetization delay, jitter buffer delay, and additional fixed delay (if any was configured).

Network Delay (ms)

Also referred to as "one-way delay." Calculated by subtracting a datagram's RTP timestamp (the time it was sent by the sending endpoint) from the time it was received by the receiving endpoint. Includes both of the following:

  • propagation delay—time spent on the actual network

  • transport delay—time spent getting through intermediate network devices.

The endpoints must synchronize their high-precision clocks to calculate network delay. For more information, see How Endpoints Calculate Delay.

Estimated Clock Error (ms)

An estimate of the maximum discrepancy between the synchronized high-precision clocks on the endpoint computers used to measure network (one-way) delay. Adding the estimated clock error to the network delay measurement and subtracting it from that measurement yields an upper and lower bound for the actual network delay (illustrated in the sketch after the note below).

NOTE: The estimated clock error can be unexpectedly large for any of the following reasons:

  • The endpoints just synchronized their clocks for the first time (as in a quick quality check with a single set of simulated calls). Therefore, insufficient information has been gathered to calculate an accurate estimated clock error. A conservatively large estimated clock error is shown instead.

  • During clock synchronization, network conditions changed such that data flowed between the endpoints significantly more rapidly in one direction than in the other. Accurate clock synchronization requires a symmetric connection between the endpoints, where data is sent with equal speed in both directions.

  • One or both of the endpoints was busy with other processing during clock synchronization.
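The following sketch restates the network-delay and clock-error arithmetic described above, using hypothetical timestamp values.

```python
# Sketch of the network (one-way) delay arithmetic described above, using
# hypothetical values. Timestamps are in milliseconds.
rtp_timestamp_ms = 1_000.0      # when the sending endpoint sent the datagram
receive_time_ms = 1_085.0       # when the receiving endpoint received it

network_delay_ms = receive_time_ms - rtp_timestamp_ms        # 85.0 ms

# The estimated clock error gives upper and lower bounds on the true delay.
estimated_clock_error_ms = 5.0
lower_bound = network_delay_ms - estimated_clock_error_ms    # 80.0 ms
upper_bound = network_delay_ms + estimated_clock_error_ms    # 90.0 ms
print(network_delay_ms, lower_bound, upper_bound)
```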

9.2.3 Jitter Results Tab

The Jitter Results tab summarizes the jitter results from the VoIP Quality assessment. By default, results are sorted by lowest MOS to highest. For more information, see Section 8.5.3, Jitter and Section 8.5.4, Jitter Buffers and Datagram Loss.

See Section 9.2.1, VoIP Results Tab for definitions of the “Endpoint,” “Script,” “QoS,” “Call Quality,” and “Comment” columns on the Jitter Results tab. The following measurements are also shown:

Column Name

Description

Jitter Buffer Loss (%)

Average data loss due to the jitter buffer configured in the call script, shown as a percentage of all data sent between the endpoints in a call group. Includes the following totals:

  • jitter buffer overruns—datagrams whose delay variation (jitter) exceeded the jitter buffer size or that were delayed too long.

  • jitter buffer underruns—datagrams that arrived too quickly while the jitter buffer was still full.

Jitter (ms)

The average jitter, or delay variation, measured between the endpoints over the course of the VoIP Quality assessment. Shows the differences in arrival times among all datagrams sent between these endpoints (a simple sketch follows this table).

Max. Jitter (ms)

The maximum jitter, or delay variation, measured between the endpoints during the VoIP Quality assessment.
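As a simple illustration, the sketch below derives average and maximum delay variation from a list of hypothetical per-datagram delays. The product's exact jitter formula may differ (for example, it may use a smoothed estimator), so treat this as a conceptual example only.

```python
# Minimal sketch of delay-variation (jitter) statistics from per-datagram
# one-way delays, in milliseconds. The simple consecutive-difference approach
# is an illustration; the product's exact jitter formula may differ.
delays_ms = [82.0, 85.0, 81.0, 90.0, 84.0]

variations = [abs(b - a) for a, b in zip(delays_ms, delays_ms[1:])]
average_jitter = sum(variations) / len(variations)
max_jitter = max(variations)
print(round(average_jitter, 1), max_jitter)   # 5.5 and 9.0
```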

9.2.4 Lost Data Results Tab

The Lost Data Results tab summarizes the lost data results from the VoIP Quality assessment. By default, results are sorted by lowest MOS to highest. For more information, see Section 8.5.5, Lost Data.

See Section 9.2.1, VoIP Results Tab for definitions of the “Endpoint,” “Script,” “QoS,” “Call Quality,” and “Comment” columns on the Lost Data Results tab. The following measurements are also shown:

Column Name

Description

Lost Data (%)

The average amount of data lost between the endpoints during a set of simulated calls. Expressed as a percentage of all datagrams sent.

Max. Cons. Datagrams Lost

The maximum number of datagrams lost consecutively. The RTP header includes sequencing information to help the receiver reconstruct the transmission. The endpoints can use this information to measure consecutive datagram loss (see the sketch after this table).

Datagrams Lost

The number of datagrams lost between the endpoints during a set of simulated calls.

Datagrams Out of Order

The number of datagrams received out of order during each set of simulated calls.
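The sketch below shows one way RTP sequence numbers can be used to derive the loss counts in this table. The received sequence is hypothetical and sequence-number wraparound is ignored; this is not necessarily the exact algorithm the endpoints use.

```python
# Sketch of how RTP sequence numbers can be used to count lost, consecutively
# lost, and out-of-order datagrams. The received sequence is hypothetical and
# sequence-number wraparound is ignored for simplicity.
sent = 10                                  # sequence numbers 0..9 were sent
received = [0, 1, 2, 5, 4, 6, 9]           # 3, 7, and 8 never arrived

lost = sorted(set(range(sent)) - set(received))
datagrams_lost = len(lost)                 # 3

# Longest run of consecutive sequence numbers that were lost (7 and 8 here).
max_consecutive = run = 0
for prev, cur in zip([None] + lost, lost):
    run = run + 1 if prev is not None and cur == prev + 1 else 1
    max_consecutive = max(max_consecutive, run)

# A datagram is out of order if it arrives with a lower sequence number than
# one already received (4 arrives after 5 here).
out_of_order = sum(1 for prev, cur in zip(received, received[1:]) if cur < prev)
print(datagrams_lost, max_consecutive, out_of_order)   # 3 2 1
```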

9.2.5 Endpoint Configuration Tab

The Endpoint Configuration tab identifies each endpoint used in the VoIP Quality assessment and provides the following information:

Column Name

Description

Name

The domain name assigned to the endpoint computer.

Version

The version of the Performance Endpoint software.

Build Level

The build number of the Performance Endpoint software.

Operating System

The operating system of the endpoint computer.

OS Version Major

The major version number of the operating system.

OS Version Minor

The revision of the operating system.

CSD Level

Service-pack information for the operating system installed on the endpoint computer.

Memory

Total amount of random-access memory (RAM), in kilobytes, on the endpoint computer (not the available RAM).

For more information, see Section 1.4, NetIQ Performance Endpoints.

9.2.6 Background Traffic Tab

The Background Traffic tab summarizes the results from any background traffic that ran during the VoIP Quality assessment. For more information, see Section 7.11.1, Understanding Background Traffic.

See Section 9.2.1, VoIP Results Tab for definitions of the “Endpoint” and “Comment” columns on the Background Traffic tab. The following measurements are also shown:

Column Name

Description

Configured Data Rate

The data rate selected when the Background Traffic connector was created. Default is 28.8 kbps (modem).

Actual Data Rate (kbps)

The throughput measured for the background TCP traffic during the assessment.
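For reference, the following sketch shows the arithmetic behind a data rate expressed in kbps, using hypothetical numbers. Whether the measured rate counts payload only or includes protocol headers is not specified here.

```python
# Illustration only: converting measured traffic into a data rate in kbps,
# using hypothetical numbers.
bytes_sent = 1_800_000            # bytes transferred during the measurement
elapsed_seconds = 500.0

data_rate_kbps = (bytes_sent * 8) / 1000 / elapsed_seconds
print(round(data_rate_kbps, 1))   # 28.8 kbps, the default configured rate
```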