
Ericsson White Paper 284 23-3179 Uen | September 2012

How to verify an LTE deployment

Mobile-network operators face the significant challenge of maintaining a profitable business while meeting market demands for faster data speeds and increased data volumes, and addressing the needs of millions of smartphone subscribers.

To overcome these challenges and add capacity, many operators have chosen to deploy 4G LTE networks – a strategy that supports the rollout of new services and enhances the user experience.

Capacity and competition challenges are compressing the deadlines for bringing LTE networks into service. Add to this the fact that networks are becoming increasingly complex – not just in terms of design and optimization, but also in relation to testing and verification. The result is a focus shift and a structured approach to verification that supports simplicity and transparency.

Transparent network-performance verification for LTE rollouts


TODAY'S VERIFICATION CHALLENGES

Rapid changes in the mobile industry are creating difficulties for network operators. Ensuring optimal network performance during rollout and understanding how performance ultimately affects user experience are just two examples of the type of issues operators have to face.

Industry complexity

The number of 4G networks being deployed is rising constantly. These new networks usually require integration with 2G and 3G technologies and can include greenfield rollouts, overlays on existing networks, transformation and multi-vendor solutions.

At the same time, rapid consumer adoption of smartphones has significantly increased the amount of signaling in networks, created massive volumes of data and added to the number of network events. Network-performance measurements – and their interpretation – need to be harmonized and adapted to the complex nature of today’s networks to ensure that network-performance targets set by the operator are met.

More services

In addition to these trends, new functionality and services are constantly being introduced, implying that test methods and metrics also need to be kept up to date to ensure that user experience continues to be measured accurately. Additional subscriber services, such as voice over LTE (VoLTE) and high-definition video on demand (VoD), will become commonplace on high-bandwidth LTE networks, and many operators will use QoS and QoE indicators to ensure that these services are delivered with the high level of quality expected by users.

Automation

The increasing reliance on self-organizing networks (SON) to perform optimization tasks introduces a different set of challenges. To realize the full operational cost savings that can be achieved when SON functions are implemented, performance metrics and their capacity to reflect user experience accurately must be completely reliable.

KPI complexity

Traditionally, a range of key performance indicators (KPIs) has been used to ensure that quality and performance targets are met. Usually, such KPIs are chosen from the bottom up, tend to be uncoordinated, and do not always accurately reflect user experience. Furthermore, significant effort can be invested in trying to improve one set of KPIs at the expense of others. Poorly selected KPIs can be complicated to measure and difficult to work with, and reaching their set targets may have little impact on overall system efficiency, performance and user experience.

Operators need to be assured that a given KPI will reflect the actual user experience accurately after rollout. For example, an LTE radio bearer can be dropped when an always-on LTE device is not transmitting data. Measuring such a KPI does not offer any real understanding of user experience, as sessions are quickly reestablished when needed and any delay goes unnoticed by the user.

The way smartphones behave adds to the complexity of measuring performance – traffic generated by these devices tends to be chatty and differs significantly from the traffic that is generated by, for example, an LTE USB dongle. A new device model can change how devices behave in the network, software upgrades can alter the way an application functions, and a constant flow of new apps enters the market – all of which affect the way KPI measurements are interpreted.

Managing KPI complexity is difficult enough in a single-technology environment. In an ecosystem that encompasses all technologies (2G, 3G and 4G) the level of complexity increases dramatically and confidence in KPIs as a true reflection of actual user experience needs to be assured.



A NEW FRAMEWORK FOR VERIFICATION

There are as many ways of measuring network performance as there are equipment vendors. Just as the metric system was introduced so that weights and distances could be compared on similar scales and across international boundaries, the mobile industry today needs a shared framework for network-performance verification.

This paper proposes a framework that simplifies the process of network-performance verification and provides a concise view of performance that is aligned with user experience.

The key characteristics of this framework are that it:
> Complies with ITU (International Telecommunication Union) and ETSI (European Telecommunications Standards Institute) recommendations for performance reporting.
> Uses commercially available terminals with standard network-parameter settings.
> Involves the selection of a manageable number of top-level KPIs, built from many lower-level performance indicators (PIs).
> Includes only KPIs that are relevant to user experience.
> Specifies how and where each KPI should be measured.
> Focuses on verification that is in line with operator priorities and is applicable in strategically selected areas.

Compliance with recommendations

For mobile networks, the ITU Telecommunication Standardization Sector (ITU-T) has worked with ETSI to describe a general model for QoS from the user's perspective[1][2]. The 3GPP-defined QoS categories are service accessibility, retainability, integrity and mobility[3][4][5].

In accordance with these recommendations, Figure 1 illustrates a cost-efficient KPI methodology that includes just a few QoS-aligned KPIs based on measurements for subscriber experience and network quality.


Figure 1: The QoS component of the ITU-T KPI hierarchy. At the top of the hierarchy, QoS comprises the contracted KPIs – service accessibility, service retainability and service integrity – underpinned by network-monitoring measures: availability performance, resources and facilities, reliability performance, transmission performance, maintainability performance and maintenance support performance.



Commercial devices and standard settings

The capabilities of LTE are often demonstrated in trial scenarios with ideal and highly optimized network conditions. Trials can be carried out using high-performance user equipment (UE) simulators, well-tuned laptop settings and colocated core and radio-access network (RAN) equipment with parameter settings optimized for performance rather than capacity. Such trial environments do not reflect real network environments accurately. In operational networks, there are tradeoffs between performance and capacity that cannot always be taken into account in trials. The architecture of the core and transport networks may have an impact on performance, and some KPIs may need to be adjusted accordingly.

To create reliable KPI measurements that reflect user experience accurately, testing should be conducted using:
> Commercial terminals.
> Standard live-network parameter settings.
> Transparent and fully documented KPI formulas, including the network events that trigger counter stepping.

Minimizing KPIs

Figure 2 illustrates a structured approach where many PI metrics are consolidated into just a few KPIs that reflect user experience accurately.

By adopting this approach, the KPIs that best reflect user experience are assigned the greatest importance. In this way, priorities and performance targets can be determined and the right level of effort can be allocated to solving the issues that are relevant – thereby fast-tracking network-performance improvements.

Relevant KPIs

The relevance of a given metric to a KPI is directly related to its impact on user experience or overall system performance. A metric that affects neither should be classified as a PI. For example, most users are completely unaware of the Time to Attach metric, which is a measure of the amount of time it takes for a device to connect to a network. This is a background operation that occurs infrequently, and as such has little impact on user experience and overall system performance.

How and where?

Complex and time-consuming verification procedures should be limited to laboratory or field environments. Such tests can be advantageous when it comes to validating the final set of KPIs and suitable target-value ranges.

For cluster tuning and acceptance, a small number of KPIs should be selected to allow for efficient drive testing and to ensure a sufficient degree of accuracy when measuring user experience.

Target values for KPIs should be determined to reflect different test environments. Once a network is in operation, counter-based KPIs can be utilized to monitor performance as traffic increases. These KPIs, based on commercial traffic and network counters, are likely to differ from those selected for cluster tuning, which are based on drive-test measurements. For example, measuring user-perceived latency is straightforward with a drive test, but impractical with network counters.
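Once counters are available, a counter-based KPI of this kind reduces to simple arithmetic over periodically collected counter values. The sketch below computes a Session Abnormal Release Rate; the counter names and values are illustrative, not taken from any vendor's actual counter set.

```python
# Sketch: deriving a counter-based retainability KPI from network counters.
# Counter names and values are illustrative only.

def session_abnormal_release_rate(counters: dict) -> float:
    """Percentage of session releases that were abnormal ('dropped')."""
    abnormal = counters["erab_release_abnormal"]
    total = abnormal + counters["erab_release_normal"]
    return 100.0 * abnormal / total if total else 0.0

# One reporting period's worth of (illustrative) counter values.
sample = {"erab_release_abnormal": 12, "erab_release_normal": 2388}
print(f"Session Abnormal Release Rate: {session_abnormal_release_rate(sample):.2f}%")
```

The same pattern – a transparent, fully documented formula over named counters – is what makes a counter-based KPI auditable across vendors.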

Focused verification

Tuning and verification activities are designed to offer maximum benefit to the operator. Focused verification can be based on factors such as the location of key users, areas of dense traffic, difficult environments and known trouble spots.

Figure 2: A structured approach to KPIs – network events feed a primary level of performance indicators (PIs), which are consolidated into a small number of top-level KPIs.



Cluster and network performance

These KPIs and the way they are defined tend to be agreed on by the operator and the vendor responsible for design, build and integration before testing starts. Ideally, agreement should be reached before contract finalization so that both parties have a mutual understanding of the requirements and scope of verification testing. Table 1 summarizes some KPIs commonly used for cluster verification.

Table 1: List of recommended first-order KPI categories for cluster verification

> Accessibility – how easy it is for the user to obtain a service within specified tolerances and other given conditions. Session Setup Success Rate is a common KPI in this category.
> Retainability – the capability of a service, once obtained, to continue to be provided under given conditions for a requested period. Examples of KPIs in this category include Session Abnormal Release Rate (dropped calls) and Minutes Per Abnormal Session Release.
> Integrity – the degree to which a service, once obtained, is provided without excessive impairments. Examples include downlink (DL) and uplink (UL) throughput, latency and packet loss.
> Mobility – performance of all handover types. Examples include LTE Handover Success Rate and Inter-Radio Access Technology (IRAT) Handover Success Rate.

The KPIs recommended here are built on several lower-order or network-monitoring PIs. Figure 3 illustrates how a single top-level KPI – Session Setup Success Rate – is created from several lower-order PIs as the best way of reflecting user experience when it comes to establishing a session. Users are unaware of the network events required to establish a session, and failures can occur at any point, such as Radio Resource Control (RRC) establishment, S1 link establishment or radio-access bearer (RAB) establishment. By reporting only the top-level KPI, a better estimate of user experience is obtained. Performance statistics for all PIs are recorded for investigative and troubleshooting activities.

Figure 3: KPI simplification for Session Setup Success Rate. The top-level KPI is built from lower-order PIs – RRC Establishment Success Rate, S1 Link Establishment Success Rate and E-RAB Establishment Success Rate – which are in turn derived from system events and counters.
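The consolidation shown in Figure 3 can be sketched in code. Assuming a session is established only when RRC, S1 and E-RAB establishment all succeed, the per-step PI success rates multiply into the top-level KPI; the event counts below are illustrative.

```python
# Sketch: building the top-level Session Setup Success Rate KPI from
# lower-order PIs (Figure 3). Event counts are illustrative.

events = {
    "rrc_att": 1000, "rrc_succ": 995,    # RRC establishment
    "s1_att": 995,   "s1_succ": 993,     # S1 link establishment
    "erab_att": 993, "erab_succ": 990,   # E-RAB establishment
}

# Per-step PI success rates.
rrc_sr = events["rrc_succ"] / events["rrc_att"]
s1_sr = events["s1_succ"] / events["s1_att"]
erab_sr = events["erab_succ"] / events["erab_att"]

# A session succeeds only if every step succeeds, so the rates multiply.
session_setup_success_rate = 100.0 * rrc_sr * s1_sr * erab_sr
print(f"Session Setup Success Rate: {session_setup_success_rate:.2f}%")
```

Because each step's attempts are the previous step's successes, the product telescopes to end-to-end successes over initial attempts (990/1000 here), which is the figure a user actually experiences.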



KPI framework – an example

Table 2 shows a typical KPI framework for an LTE network, highlighting where each KPI should be tested. In this example, VoLTE is tested in a controlled environment and verified in the field using a reference site or golden cluster. Having successfully completed VoLTE tests in this way, the default-bearer KPIs can be used with confidence for general cluster-acceptance drive tests.

This table is operator-specific, developed to suit individual operator targets and requirements, and supports the objective of reducing the total number of KPIs.

As throughput performance indicators vary from cluster to cluster and are influenced by site location and user behavior, Uplink and Downlink User Throughput are marked as monitored PIs in the example. These indicators are used to set tuning objectives in cluster verification and can be utilized to monitor load and capacity in operational verification – and are consequently not KPIs.
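Monitoring a throughput PI of this kind amounts to summarizing sample distributions rather than checking a pass/fail target. A minimal sketch with illustrative drive-test samples, using the median and an approximate 10th percentile (a common cell-edge indicator) as tuning reference points:

```python
# Sketch: summarizing downlink user-throughput samples (Mbps) from a
# drive test into distribution statistics for a monitored PI.
# Sample values are illustrative.
import statistics

dl_samples_mbps = [42.1, 38.7, 55.3, 12.4, 47.9, 33.0, 28.6, 51.2, 9.8, 44.5]

median = statistics.median(dl_samples_mbps)
p10 = statistics.quantiles(dl_samples_mbps, n=10)[0]  # ~10th percentile

print(f"Median DL throughput: {median:.1f} Mbps")
print(f"Approx. 10th percentile (cell edge): {p10:.1f} Mbps")
```

Tracking these statistics per cluster over time is what turns the indicator into a load- and capacity-monitoring tool once the network is in service.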

Table 2: KPI framework for LTE network verification (the same KPI may have different target values and test methods at different stages)

> Availability – Cell Availability: in-service operation.
> Accessibility – Session Setup Success Rate: controlled environment, golden cluster, cluster, in-service operation.
> Retainability – Session Abnormal Release Rate: controlled environment, golden cluster, cluster, in-service operation.
> Integrity – RTT Latency: controlled environment, site acceptance, golden cluster, cluster.
> Integrity – RTT Packet Loss (ping): controlled environment, site acceptance, golden cluster, cluster.
> Integrity – UL Packet Loss (PDCP): golden cluster, cluster, in-service operation.
> Integrity – DL Packet Loss (PDCP): golden cluster, cluster, in-service operation.
> Integrity – Downlink Peak User Throughput: controlled environment, site acceptance.
> Integrity – Uplink Peak User Throughput: controlled environment, site acceptance.
> Integrity – Downlink User Throughput: controlled environment (KPI); golden cluster, cluster and in-service operation (monitored PI).
> Integrity – Uplink User Throughput: controlled environment (KPI); golden cluster, cluster and in-service operation (monitored PI).
> Mobility – Handover Success Rate: controlled environment, golden cluster, cluster, in-service operation.
> Mobility – Handover Interruption Time (Control Plane): controlled environment, golden cluster, cluster.
> Mobility – Handover Session Continuity Interruption Time: controlled environment, golden cluster.
> VoLTE – Voice Session Setup Success Rate: controlled environment, golden cluster, in-service operation.
> VoLTE – Voice Session Abnormal Release Rate: controlled environment, golden cluster, in-service operation.
> VoLTE – Voice Session Setup Time: controlled environment, golden cluster, in-service operation.
> VoLTE – Voice Speech Delay: controlled environment, golden cluster.
> VoLTE – Voice Packet Loss Rate: controlled environment, golden cluster.
> VoLTE – Voice Jitter: controlled environment, golden cluster.
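Of the VoLTE metrics above, Voice Jitter is commonly measured with the RTP interarrival jitter estimator defined in RFC 3550. A minimal sketch of that estimator, with illustrative send and receive timestamps in milliseconds:

```python
# Sketch: RFC 3550 interarrival jitter estimator, as commonly used for a
# VoLTE Voice Jitter measurement. Timestamps (ms) are illustrative.

def interarrival_jitter(send_times, recv_times):
    """Smoothed jitter: J(i) = J(i-1) + (|D(i-1, i)| - J(i-1)) / 16."""
    jitter = 0.0
    prev_transit = None
    for s, r in zip(send_times, recv_times):
        transit = r - s                      # one-way transit (clock offset cancels in D)
        if prev_transit is not None:
            d = abs(transit - prev_transit)  # transit-time variation between packets
            jitter += (d - jitter) / 16.0    # exponential smoothing per RFC 3550
        prev_transit = transit
    return jitter

send = [0, 20, 40, 60, 80]      # 20 ms packetization interval
recv = [50, 71, 90, 112, 130]   # arrival times at the receiver
print(f"Jitter estimate: {interarrival_jitter(send, recv):.2f} ms")
```

The gain of 1/16 smooths out single outliers, which is why the estimator is a better target for a stage-specific KPI threshold than raw per-packet delay variation.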


TYPICAL NETWORK VERIFICATION

A typical network-deployment project includes several stages, and different verification processes are applicable at each of these. Figure 4 illustrates the traditional stages of a network rollout and the applicable verification activities.

Product verification, certification and conformity

In the first stage of a network rollout, checks need to be carried out on the network equipment supplied to determine whether it is fit for purpose, and consequently whether the LTE solution can be integrated into the existing network with the given interfaces. This verification can also be used to validate KPIs that are strongly linked to product performance and are independent of radio tuning. Examples include the ability of the system to achieve DL and UL peak data-throughput rates close to the theoretical maximum.

Product-conformance verification is typically carried out on a golden site or golden cluster, or in the operator’s lab environment. The outcome of the verification procedure is specified in an agreed set of configurations and parameters for the overall solution that have a certain level of confidence associated with them. This outcome can be used to benchmark performance once the network is deployed.

Some examples of product-conformance verification are:
> Certificate of conformance – typically, this verification method is applied when product functionalities or attributes are complex or require specialized equipment to take accurate measurements. In such cases, product vendors supply a certificate to verify product conformance to the relevant industry standard.
> Supply of First Office Application (FOA) reports – all new system features and software releases undergo an extensive FOA period, supporting the first operation and demonstration of new products in the operational system prior to commercialization. FOA reports can be used to verify system performance.
> Demonstration of features – to check interfaces to legacy equipment or to confirm configurations, operators may require certain new system features to be demonstrated directly in the operational network. This type of verification testing is typically performed on a golden site or in a golden cluster.
> Performance measurement – appropriately selected performance aspects (KPIs and PIs) are validated to ensure that the contractually agreed requirements are met; optimization may lead to an adjustment of parameters.

Figure 4: Verification stages of a typical network rollout. During the build phase, product verification, certification and conformity (laboratory or controlled field tests, in-depth testing, verification of performance and features) are followed by site-build verification (site health checks and site-build integrity). Cluster and network verification takes place during the tuning phase, using drive-test KPIs for accessibility, retainability, integrity and mobility. Once in service, the same KPI categories are monitored in the operate phase using network-counter KPIs.



Build

The purpose of site-build verification is to detect problems that arise during the installation and integration processes, which may affect user performance.

In this phase, typical tests – sometimes referred to as shake-down tests – include basic functionality checks of accessibility, retainability and integrity. Tests performed on an LTE site build may include access attempts on network cells, DL and UL throughput tests, and round-trip time tests for measuring end-to-end latency.
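The round-trip tests mentioned above can be summarized directly into the RTT Latency and RTT Packet Loss figures used for site verification. In the sketch below, the ping results are illustrative; None marks a probe that received no reply.

```python
# Sketch: deriving RTT Latency and RTT Packet Loss from shake-down ping
# results. Sample RTTs (ms) are illustrative; None = no reply received.

rtt_ms = [21.4, 19.8, None, 22.7, 20.3, 19.9, 21.1, None, 20.8, 22.0]

replies = [r for r in rtt_ms if r is not None]
packet_loss_pct = 100.0 * (len(rtt_ms) - len(replies)) / len(rtt_ms)
avg_rtt = sum(replies) / len(replies)

print(f"RTT Latency: {avg_rtt:.1f} ms")
print(f"RTT Packet Loss: {packet_loss_pct:.1f}%")
```

Recording both figures per site gives a simple pass/fail basis for site acceptance before the cluster enters tuning.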

Tune

The processes of cluster and network verification are usually performed following network tuning. Tuning – or optimization – of the radio network ensures that network performance is in line with the targets set prior to commercial launch. Tuning helps to ensure good quality for users, which in turn facilitates successful network introduction for the operator.

The tuning phase is performed on base-station clusters and uses data from a number of sources, including drive tests and network measurements. Improvements can be identified and implemented to increase network performance. The final part of the tuning process typically involves cluster verification, in which measured network performance is compared with preset performance goals – typically the KPIs shown in Tables 1 and 2.

Cluster verification traditionally utilizes performance-evaluation data from network-drive tests. The load levels of pre-commercial and recently launched networks tend to be low, and user traffic tends to be network-friendly. When possible, statistical counters and event logs are also used to analyze user behavior, but this is only possible to a certain degree owing to the small number of active terminals in the network.

LTE brings a number of SON features that support the performance-tuning process. The availability, scope and effectiveness of SON features such as Automatic Neighbor Relations (ANR) are expected to improve over time. By applying SON techniques, the process of performance tuning should become faster and significantly simpler, eventually reducing or even eliminating the need for network-drive testing. 3GPP has started a feasibility study[6] to investigate the minimization of drive tests in next-generation networks, with a view to reducing network operational costs.

If the introduction of SON functionality succeeds in reducing or eliminating network-drive testing, it will become increasingly important for network-performance reports and measures to reflect user performance accurately and transparently.

Operate

Once the network is in service, network performance can be assured through monitoring of counters. Key aspects of operational performance monitoring are described in another Ericsson white paper on service assurance[7].


CASE STUDIES

Case 1 – KPIs with low relevance

This case relates to an LTE rollout project targeting more than 10 million subscribers for LTE mobile broadband data services. The performance criteria were largely based on a number of traditional 3G (WCDMA) voice KPIs, including Voice Call Setup Time, Voice Call Drop Rate, RRC Failure Rate, Block Error Rate and IRAT handover performance. These criteria and their target values were inherited from the original Request for Proposal (RFP), which was written prior to the full standardization of the LTE technology.

The relevance level of many of the chosen KPIs to user experience was low, and as a result considerable time and effort was required to establish test methods and tools to measure and try to improve them. Consequently, key technical resources were unavailable for other network-improvement and optimization activities that would have had a much larger impact on user experience.

After several months of testing and discussion, it was agreed that some of the original KPIs did not provide an accurate measure of user experience; these were removed, or reclassified as PIs and used for monitoring.

Case 2 – Targeting selected KPIs

In this case, an LTE operator defined 15 performance KPIs for the rollout project. This list was reduced to eight KPIs for validation of cluster tuning and acceptance, which directly reduced the amount of time and effort required for data collection and analysis, freeing up resources for tuning and the resolution of performance issues.

The operator carried out a number of special tests and measurements in one golden cluster. The tests were linked to product functions and parameter settings, had low correlation with network tuning, and included the seven KPIs that were excluded from the cluster tuning. Performing these tests early in the process and on a limited area allowed the project engineers to define and implement performance improvements for the whole network as it was rolled out and tuned.

The operator’s network was tuned and put into service quickly with more than 90 percent of clusters being accepted after just one round of tuning. The rollout engineers were able to shift their focus completely to capacity management, operational processes and troubleshooting.

Today, this operator is a leading provider of LTE mobile broadband, and has a stable, high-performing network.

Case 3 – A minimalist approach

An operator with a low-cost model wanted to enter the LTE market quickly but without compromising service quality. The network rollout project was a continuous build, with low traffic from the start. Costs were a concern, so the deployment strategy focused on fewer KPIs and limited drive testing. The drive tests that were carried out were targeted at prioritized areas and performance verification. Many parts of the network underwent only a single site test before launch.

The network was successfully launched, and performance was monitored with network counter statistics. SON features were found to have worked effectively. As post-launch user traffic was low, the few performance issues that arose were quickly identified and rectified before a wider group of users was affected. Deployment, verification and a smooth launch were all achieved within very tight deadlines.



MEASURE THE BENEFITS

The Next Generation Mobile Networks (NGMN) Alliance has recognized the problems associated with previous approaches to network verification and has recommended a structured top-down approach using a set of standard service-level KPIs[8]. These KPIs are grouped into five main categories – accessibility, retainability, integrity, availability and mobility – where measurements use common formulas and methods.

Network-level testing and acceptance should focus on these KPIs, which are based on numerous resource-level PIs that provide system behavior and performance information at a more granular level.

The telecom industry is pushing for the increased use of SON functions as a way to change how vendors and operators tune and manage networks. For example, the ANR feature influences how networks are tuned because it reduces the amount of effort required to plan and manage neighbor relations. To achieve the full capex and opex benefits of SON, reported KPIs need to be trustworthy.

Benefits for operators
> Faster time to market for new networks, services and coverage.
> Confidence that network-centric measurements accurately reflect the user experience.
> Transparent and easy benchmarking in multiple-vendor environments.
> Capacity to focus resources and efforts on areas that will result in the greatest benefit for users.
> Ability to achieve targeted network efficiency.

Benefits for vendors
> Possibility to focus resources and efforts on tuning networks, maximizing network efficiency and improving user experience.
> Capacity to meet aggressive timelines.
> Faster network deployment.
> Creation of strong operator partnerships.
> Opportunity to deliver high-quality networks that can be adapted to changing technology and market conditions.



CONCLUSION

LTE networks provide the higher capacity and data speeds required to support ever-increasing numbers of mobile-broadband subscriptions, smartphones and other connected devices. The timely deployment of LTE networks is crucial for operators to launch new services, support faster data speeds, provide greater capacity, and reduce investment costs.

The network complexity created by new services, new types of devices, multiple technologies, and multiple-vendor deployments requires an efficient approach to network-performance verification from initial testing through to in-service operation. Such an approach allows operators to focus on bringing LTE networks into service quickly with the required level of network performance for successful commercial launch and continued operation.


REFERENCES

1. ITU-T, Recommendation E.800, 2008-09, Series E: Overall Network Operation, Telephone Service, Service Operation and Human Factors, available at: http://www.itu.int/rec/T-REC-E.800-200809-I/en

2. ETSI, Technical Specification 132 450, 2009-04, Key Performance Indicators (KPIs) for Evolved Universal Terrestrial Radio Access Network (E-UTRAN): Definitions (3GPP TS 32.450 version 8.0.0 Release 8), available at: http://www.etsi.org/deliver/etsi_ts/132400_132499/132450/08.00.00_60/ts_132450v080000p.pdf

3. 3GPP, Technical Specification 32.450, 2009, Telecommunication Management; Key Performance Indicators (KPIs) for Evolved Universal Terrestrial Radio Access Network (E-UTRAN): Definitions, available at: http://www.3gpp.org/ftp/Specs/html-info/32450.htm

4. 3GPP, Technical Specification 32.451, 2009, Telecommunication Management; Key Performance Indicators (KPIs) for Evolved Universal Terrestrial Radio Access Network (E-UTRAN); Requirements, available at: http://www.3gpp.org/ftp/Specs/html-info/32451.htm

5. 3GPP, Technical Specification 32.425, 2009, Telecommunication Management; Performance Management (PM); Performance Measurements Evolved Universal Terrestrial Radio Access Network (E-UTRAN), available at: http://www.3gpp.org/ftp/Specs/html-info/32425.htm

6. 3GPP, Technical Report 36.805, 2009, Study on Minimization of Drive-Tests in Next Generation Networks, available at: http://www.3gpp.org/ftp/specs/html-info/36805.htm

7. Ericsson, White Paper, 2011, Keeping the Customer Service Experience Promise – How to meet the service assurance challenge, available at: http://www.ericsson.com/news/110121 wp service assurance 244188811 c

8. NGMN Alliance, White Paper, 2006, Next-Generation Mobile Networks Beyond HSPA & EVDO, available at: http://www.ngmn.org/uploads/media/Next Generation Mobilenetworks Beyond Hspa evDo web.pdf



GLOSSARY

2G  2nd-generation wireless telephone technology
3G  3rd-generation wireless telephone technology
3GPP  3rd Generation Partnership Project
4G  4th-generation mobile wireless standards
ANR  Automatic Neighbor Relations
capex  capital expenditure
DL  downlink
E-RAB  E-UTRAN Radio Access Bearer
ETSI  European Telecommunications Standards Institute
E-UTRAN  Evolved Universal Terrestrial Radio Access Network
EVDO  Evolution-Data Optimized/Evolution, Data Only
FOA  First Office Application
HSPA  High-Speed Packet Access
IRAT  Inter-Radio Access Technology (Inter-RAT)
ITU  International Telecommunication Union
ITU-T  ITU Telecommunication Standardization Sector
KPI  key performance indicator
LTE  Long Term Evolution
NGMN  Next Generation Mobile Networks
opex  operational expenditure
PDCP  Packet Data Convergence Protocol
PI  performance indicator
QoE  quality of experience
QoS  quality of service
RAB  radio-access bearer
RAN  radio-access network
RFP  Request for Proposal
RRC  Radio Resource Control
RTT  round-trip time
SON  self-organizing networks
UE  user equipment
UL  uplink
USB  Universal Serial Bus
VoD  video on demand
VoLTE  voice over LTE
WCDMA  Wideband Code Division Multiple Access