Typical Key Performance Indicator Reports for Performance Intelligence Centers

An Oracle White Paper, November 2013



Disclaimer

The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle’s products remains at the sole discretion of Oracle.


Contents

Executive Overview
Introduction
1.0 Service Overview
    1.1 Available Services
    1.2 Presentation of Reports
2.0 Reports Descriptions
    2.1 Call QoS Reports
    2.2 Roaming QoS Reports
    2.3 Network Interconnection Reports
    2.4 2G Core Network Reports
    2.5 Mobile Number Portability Reports
    2.6 Network Security and Fraud Detection Reports
    2.7 Accounting Reports
    2.8 LTE Reports
Conclusion


Executive Overview

This document helps performance intelligence center (PIC) customers derive more value from their current system by giving them access to Oracle expertise in implementing a broad set of key performance indicator (KPI) reports.

Introduction

The PIC system enables customers to configure fully customized KPI reports, to generate alarms, and to display real-time curves. These features can be combined to provide effective tools for managing operations. From this framework, it is possible to configure a cost-effective system tailored to individual needs.

Customers without the resources to set up reports themselves may obtain additional support to configure their KPI reports. In those cases, Oracle provides KPI report configuration services.


1.0 Service Overview

1.1 Available Services

The set of services provided by Oracle usually comprises the following elements:

• Detailed Scope of Work definition

• Configuration of:

  • Enrichments

  • KPI records

  • Alarms

  • Dashboards

  • Datafeeds

• Documentation, acceptance test, and handoff

When needed, the following services may be included:

• System audit

• System reengineering

• System expansion

The configured KPI reports can be used as a model to cover other needs autonomously, and Oracle Communications products can also handle the entire KPI lifecycle if needed.

1.2 Presentation of Reports

This KPI catalog defines KPI records that can be configured in the context of KPI services. It is the basis for writing a Scope of Work for such services. Usually the most difficult task is to define the dimensions. This catalog provides examples of useful dimensions, but Oracle service professionals can also perform pre-studies to refine the dimensions in the actual context of a network and its traffic. While each project is unique at the detailed level, Oracle service professionals bring the knowledge of multiple engagements to each project.

This catalog covers use cases that are identified as providing business benefit when applied in the right context. The catalog will grow over time with new versions published regularly.


2.0 Reports Descriptions

2.1 Call QoS Reports

2.1.1 ISUP QOS Report

The business driver for this report is to manage the Quality of Service (QoS) of calls carried by ISUP.

The record dimensions are directions or point codes. The results can be used to avoid losing calls and to use Network Elements better. This report can be used in different contexts:

• Network wide per network element

• International interconnection

• Long distance interconnection

• Specific traffic only

As a prerequisite, the business context shall be agreed with the customer. Several contexts would require several KPI reports. At least one of the following builders is required:

• SS7 ISUP ETSI

• SS7 ISUP ANSI

The service package would require a review if both ANSI and ETSI need to be managed.

The monitoring coverage shall match the agreed business context.


ISUP QoS MEASURES

• Call Attempts: Simple Data Counter, Total Call Attempts

• ASR: # Answered / # Attempts

• NER: Net Success / # Attempts

• Net Success = Call is Answered or Cause Family = NA, SSB, or HE

• Unsuccessful calls due to customer (causes 17, 18, and 19)

• Unsuccessful calls due to customer and/or network (causes 1, 3, 4, 21, 22, 27, 28, 29, 31)

• Unsuccessful calls due to resource unavailable (causes 34 to 47)

• Unsuccessful calls due to service or option not available (causes 50 to 63)

• Unsuccessful calls due to service or option not implemented (causes 65 to 79)

• Unsuccessful calls due to invalid message (causes 87 to 95)

• Unsuccessful calls due to protocol error (causes 102 to 111)

• Unsuccessful calls due to interworking (cause 127)

• Percentage of unsuccessful calls due to customer (causes 17, 18, and 19)

• Percentage of unsuccessful calls due to customer and/or network (causes 1, 3, 4, 21, 22, 27, 28, 29, 31)

• Percentage of unsuccessful calls due to resource unavailable (causes 34 to 47)

• Percentage of unsuccessful calls due to service or option not available (causes 50 to 63)

• Percentage of unsuccessful calls due to service or option not implemented (causes 65 to 79)

• Percentage of unsuccessful calls due to invalid message (causes 87 to 95)

• Percentage of unsuccessful calls due to protocol error (causes 102 to 111)

• Percentage of unsuccessful calls due to interworking (cause 127)

• Average setup time

• Average conversation time

• Average length of call
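The ASR/NER definitions and cause buckets above can be expressed compactly in code. The following Python sketch is illustrative only, not PIC configuration: the record fields `answered` and `cause` are assumed names, and the NA/SSB/HE cause families counted as net success are approximated here by the customer-cause bucket (causes 17 to 19).

```python
# Illustrative sketch: classify Q.850 release causes into the failure
# buckets above and compute ASR/NER. Field names are assumptions.

CAUSE_BUCKETS = {
    "customer": {17, 18, 19},
    "customer_or_network": {1, 3, 4, 21, 22, 27, 28, 29, 31},
    "resource_unavailable": set(range(34, 48)),
    "service_not_available": set(range(50, 64)),
    "service_not_implemented": set(range(65, 80)),
    "invalid_message": set(range(87, 96)),
    "protocol_error": set(range(102, 112)),
    "interworking": {127},
}

def isup_qos(calls):
    """calls: iterable of dicts with 'answered' (bool) and 'cause' (int or None)."""
    attempts = answered = net_success = 0
    buckets = {name: 0 for name in CAUSE_BUCKETS}
    for call in calls:
        attempts += 1
        if call["answered"]:
            answered += 1
            net_success += 1
            continue
        cause = call["cause"]
        # Net success also counts the NA/SSB/HE cause families; here we
        # approximate them with the "customer" bucket (causes 17-19).
        if cause in CAUSE_BUCKETS["customer"]:
            net_success += 1
        for name, causes in CAUSE_BUCKETS.items():
            if cause in causes:
                buckets[name] += 1
                break
    asr = answered / attempts if attempts else 0.0
    ner = net_success / attempts if attempts else 0.0
    return {"attempts": attempts, "ASR": asr, "NER": ner, **buckets}
```

Running such a function over each 15-minute bucket of call records would yield the per-period KPI values described above.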

DIMENSIONS

The customer shall determine which of the following dimensions shall be used:

• Operator (reference data to be provided by the customer)

• Point codes

• Country

• Other directions

The customer shall provide the reference data.

Note 1: The dimensions shall focus on the most important ones from a business perspective. A pre-study may be considered to identify them.

Note 2: For directions, it is important to identify specific problems due to the national numbering plan, carrier prefixes, or other factors.

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD

Global ASR, NER, and number of calls versus time

ALARMS

Per NER:

• Minor if < 80%

• Major if < 70%

Standard values can be adjusted either globally or per dimension.
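The alarm rule above (minor below 80%, major below 70%, adjustable globally or per dimension) can be sketched as a small threshold check. This is an illustrative sketch; the override argument shown is an assumption for illustration, not a description of the PIC alarm configuration.

```python
# Illustrative sketch of the NER alarm rule: minor below 80%, major
# below 70%, with optional per-dimension threshold overrides (assumed).

DEFAULT_THRESHOLDS = {"minor": 0.80, "major": 0.70}

def ner_alarm(ner, overrides=None):
    """Return 'major', 'minor', or None for a NER value in [0, 1]."""
    t = {**DEFAULT_THRESHOLDS, **(overrides or {})}
    if ner < t["major"]:
        return "major"
    if ner < t["minor"]:
        return "minor"
    return None
```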


2.1.2 SIP/SIP-T QOS Report

The business driver for this report is to manage the QoS of calls managed by SIP/SIP-T.

The record dimensions are directions or IP addresses. The results can be used to avoid losing calls and to use Network Elements better. This report can be used in different contexts:

• Network wide per network element

• International interconnection

• Long distance interconnection

• Specific traffic only

As a prerequisite, the business context shall be agreed with the customer. Several contexts would require several KPI reports. At least one of the following builders is required:

• SIP

• SIP-T

The monitoring coverage shall match the agreed business context.


SIP/SIP-T QoS MEASURES

• Attempts (simple counter)

• Successful (filtered counter on “Status Code” set to “OK”)

• ASR (ratio Successful/Attempts)

• Net Successful (filtered counter on “Status Code” set to OK, Busy Here, Invite Aborted, or Request Terminated; to be refined based on the services provided by the customer)

• NER (ratio Net Successful/Attempts)

• Switch Problem (filtered counter on “Status Code” set to “5??”)

• Total Conversation Time (cumulative counter on the “Conversation Time” field)

• Total Transaction Duration (cumulative counter on the “Transaction Duration” field)

• Average Conversation Time (ratio Total Conversation Time / Invites)

• Average Transaction Duration (ratio Total Transaction Duration / Attempts)

Note: Measures are made from a SIP perspective. For SIP-T, the protocol also provides the “Cause Value” according to ETSI Q.850. Oracle is ready to use this information as well to refine the ASR/NER measures.
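As an illustration of these measures, the sketch below derives ASR, NER, and the switch-problem counter from SIP status codes. The net-success code set used here (200 OK, 486 Busy Here, 487 Request Terminated) is an assumption that, as the note says, would be refined with the customer; the record field names are also assumptions.

```python
# Illustrative sketch of the SIP/SIP-T QoS measures. The net-success
# status codes and record fields are assumptions, not PIC configuration.

NET_SUCCESS_CODES = {200, 486, 487}  # OK, Busy Here, Request Terminated

def sip_qos(transactions):
    """transactions: dicts with 'status' (int) and 'conversation_time' (seconds)."""
    attempts = len(transactions)
    successful = sum(1 for t in transactions if t["status"] == 200)
    net_successful = sum(1 for t in transactions if t["status"] in NET_SUCCESS_CODES)
    switch_problems = sum(1 for t in transactions if 500 <= t["status"] <= 599)
    total_conv = sum(t["conversation_time"] for t in transactions if t["status"] == 200)
    return {
        "attempts": attempts,
        "ASR": successful / attempts if attempts else 0.0,
        "NER": net_successful / attempts if attempts else 0.0,
        "switch_problem": switch_problems,
        "avg_conversation": total_conv / successful if successful else 0.0,
    }
```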

DIMENSIONS

It shall be determined with the customer which of the following dimensions shall be used:

• Operator (reference data to be provided by the customer)

• IP addresses

• Country

• Other directions

The customer shall provide the reference data.

Note 1: The dimensions shall focus on the most important ones from a business perspective. A pre-study may be considered to identify them.

Note 2: For directions, it is important to identify specific problems due to the national numbering plan, carrier prefixes, or other factors.

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD

Global ASR, NER, and number of calls versus time

ALARMS

Per NER:

• Minor if < 80%

• Major if < 70%

Those standard values can be adjusted afterward, either globally or per dimension.


2.1.3 SIP Registration Report

The business driver for this report is the need to ensure registration. In SIP, registration is required to access services. This report can be used to detect if and where registration is failing.

As a prerequisite, the SIP builder is required.

SIP Registration MEASURES

• Total Registrations (REGISTER)

• Total Successful Registrations

• Total Unsuccessful Registrations

• Total Forbidden

• Total Unauthorized

• Percentage of Successful Registrations – Total Successful Registrations/Total Registrations

• Percentage of Unsuccessful Registrations – Total Unsuccessful Registrations/Total Registrations

• Percentage Forbidden

• Percentage Unauthorized

• Total transaction time

• Average transaction time

DIMENSIONS

It shall be determined with the customer which of the following dimensions shall be used:

• IP source

• IP destination

• Top 20 IP source not OK

• Top 20 URI (field “To”) not OK

• Addresses may be enriched to get operator or other business information

The customer shall provide the reference data for the enrichment.

Note: The dimensions shall focus on the most important ones from a business perspective. A pre-study may be considered to identify them.

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD

Global registration success rate versus time

ALARMS

Threshold to be determined on the registration success rate per operator


2.1.4 SIP Method Report

The business driver for this report is SIP method error surveillance, which can expose possible interworking issues between network elements.

As a prerequisite, the SIP builder is required.

SIP Method MEASURES

• Total sessions

• Total sessions torn down due to “Bad Request” (400), “Method Not Allowed” (405), or “Not Acceptable” (406)

• Total sessions torn down due to “Bad Request” (400)

• Total sessions torn down due to “Method Not Allowed” (405)

• Total sessions torn down due to “Not Acceptable” (406)

• Percentage of sessions torn down due to “Bad Request”: total sessions torn down due to Bad Request / total sessions

• Percentage of sessions torn down due to “Method Not Allowed”: total sessions torn down due to Method Not Allowed / total sessions

• Percentage of sessions torn down due to “Not Acceptable”: total sessions torn down due to Not Acceptable / total sessions

• Percentage of sessions torn down due to “Bad Request” (400), “Method Not Allowed” (405), or “Not Acceptable” (406)

DIMENSIONS

It shall be determined with the customer which of the following dimensions shall be used:

• IP source

• IP destination

• Top 20 IP source with 400, 405, or 406

• Top 20 URI (field “To”) with 400, 405, or 406

• Addresses may be enriched to get operator or other business information

The customer shall provide the reference data for the enrichment.

Note 1: The dimensions shall focus on the most important ones from a business perspective. A pre-study may be considered to identify them.

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD

Global curves with 400, 405, or 406 over time

ALARMS

Threshold to be determined on the percentage of sessions torn down due to “Bad Request” (400), “Method Not Allowed” (405), or “Not Acceptable” (406)


2.1.5 BICC QoS Report

The business driver for this report is to manage the QoS of calls carried by BICC.

The record dimensions are directions or point codes. The results can be used to avoid losing calls and to use Network Elements better. This report can be used in the following contexts:

• Network wide per network element

• International interconnection

• Long distance interconnection

• Specific traffic only

As a prerequisite, the BICC ANSI or BICC ETSI XDR builder is required.

BICC QoS MEASURES

• Call Attempts: Simple Data Counter, Total Call Attempts

• ASR: # Answered / # Attempts

• NER: Net Success / # Attempts

• Net Success = Call is Answered or Cause Family = NA or SSB or HE

• Unsuccessful calls due to customer (causes 17, 18, and 19)

• Unsuccessful calls due to customer and/or network (causes 1, 3, 4, 21, 22, 27, 28, 29, 31)

• Unsuccessful calls due to resource unavailable (causes 34 to 47)

• Unsuccessful calls due to service or option not available (causes 50 to 63)

• Unsuccessful calls due to service or option not implemented (causes 65 to 79)

• Unsuccessful calls due to invalid message (causes 87 to 95)

• Unsuccessful calls due to protocol error (causes 102 to 111)

• Unsuccessful calls due to interworking (cause 127)

• Percentage of unsuccessful calls due to customer (causes 17, 18, and 19)

• Percentage of unsuccessful calls due to customer and/or network (causes 1, 3, 4, 21, 22, 27, 28, 29, 31)

• Percentage of unsuccessful calls due to resource unavailable (causes 34 to 47)

• Percentage of unsuccessful calls due to service or option not available (causes 50 to 63)

• Percentage of unsuccessful calls due to service or option not implemented (causes 65 to 79)

• Percentage of unsuccessful calls due to invalid message (causes 87 to 95)

• Percentage of unsuccessful calls due to protocol error (causes 102 to 111)

• Percentage of unsuccessful calls due to interworking (cause 127)

• Average setup time

• Average conversation time

• Average length of call


BICC QoS (continued)

DIMENSIONS

It shall be determined with the customer which of the following dimensions shall be used:

• Operator (reference data to be provided by the customer)

• Point codes

• Country

• Other directions

The customer shall provide the reference data.

Note 1: The dimensions shall focus on the most important ones from a business perspective. A pre-study may be considered to identify them.

Note 2: For directions, it is important to identify specific problems due to the national numbering plan, carrier prefixes, or other factors.

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD

Global ASR, NER, and number of calls versus time

ALARMS

Per NER:

• Minor if < 80%

• Major if < 70%

Those standard values can be adjusted afterward, either globally or per dimension.

2.2 Roaming QoS Reports

2.2.1 Roaming Access Reports

2.2.1.1 Roaming In – Visitor Network QoS – Update Location Report

The business driver for this report is to manage QoS of inbound roaming traffic.

For inbound roamers, this report checks that services can be accessed correctly from the GSM core network perspective, based on the Update Location operation. Update Location is a critical MAP Operation which needs to work properly to access voice and data services. In the context of roaming failures, it may reveal configuration issues between roaming partners.

As a prerequisite, the record dimensions are country/operators. This report must only be used on MAP roaming traffic at the interconnection. The MAP ETSI builder is required.


Roaming In – Visitor Network QoS – Update Location MEASURES

• Total number of Update Locations (simple counter)

• Number of Update Location OK

• Number of Update Location NOK

• Ratio of number of Update Locations OK/Total number of Update Locations

• Number of Update Location failed due to “Data Missing”

• Number of Update Location failed due to “Roaming Not Allowed”

• Number of Update Location failed due to “System Failure”

• Number of Update Location failed due to “Unexpected Data Value”

• Average access time

DIMENSIONS

• Country/Operator based on Roamer IMSI analysis

Both dimensions are mixed together and the list shall be provided by the customer.

Note 1: The dimensions shall be selected from a business perspective. A prestudy might be required to identify the relevant dimensions.

Note 2: As a variant, country and operator might be separated.
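The Country/Operator dimension derived from roamer IMSI analysis amounts to matching the IMSI's leading MCC/MNC digits against a reference table. A minimal sketch follows, with a made-up table excerpt standing in for the customer-provided reference data:

```python
# Illustrative sketch: map a roamer's IMSI to Country/Operator via its
# MCC/MNC prefix. The table below is a made-up excerpt; the real list
# comes from the customer's reference data.

MCC_MNC_TABLE = {
    "20801": ("France", "Orange"),
    "26201": ("Germany", "Telekom"),
    "310260": ("USA", "T-Mobile"),
}

def imsi_dimension(imsi):
    """Longest-prefix match of the IMSI against the MCC/MNC table."""
    for length in (6, 5):  # MNC may be 2 or 3 digits after the 3-digit MCC
        entry = MCC_MNC_TABLE.get(imsi[:length])
        if entry:
            return entry
    return ("Unknown", "Unknown")
```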

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD

Global ER (Efficiency Ratio) and number of attempts versus time

ALARMS

Per ER (Efficiency Ratio):

• Minor if < 80%

• Major if < 70%

Those standard values can be adjusted afterward, either globally or per dimension.

2.2.1.2 Roaming In – Visitor Network QoS – Top Ten – Update Location Report

The business driver for this report is to manage Quality of Service of inbound roaming traffic.

For inbound roamers, it checks that services can be accessed correctly from the GSM core network perspective based on the Update Location operation. Update Location is a critical MAP Operation which needs to work properly to access voice and data services. In the context of roaming failures, it may reveal configuration issues between roaming partners.

As a prerequisite, the record dimensions are country/operators. This report must be only used on MAP roaming traffic at the interconnection. The MAP ETSI builder is required.


Roaming In – Visitor Network QoS – Top 10 – Update Location MEASURES

• Total number of Update Locations (simple counter)

• Number of Update Location OK

• Number of Update Location NOK

• Ratio of number of Update Locations OK/Total number of Update Locations

• Number of Update Location failed due to “Roaming Not Allowed”

• Number of Update Location failed due to “System Failure”

• Number of Update Location failed due to “Data Missing”

• Number of Update Location failed due to “Unexpected Data Value”

• Average access time

DIMENSIONS

• Top 10 Country/Operator based on Roamer IMSI analysis

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD

No dashboard (results are used in ProTrace)

ALARMS

None


2.2.1.3 Roaming In – Visitor Network QoS – IMSI Report

The business driver for this report is to manage Quality of Service of inbound roaming traffic.

For inbound roamers, it checks that services can be accessed correctly from the GSM core network perspective based on the Update Location operation. Update Location is a critical MAP Operation which needs to work properly to access voice and data services. In the context of roaming failures, it may reveal configuration issues between roaming partners.

As a prerequisite, this report must be only used on MAP roaming traffic at the interconnection. The MAP ETSI builder is required.

Roaming In – Visitor Network QoS – IMSI MEASURES

• Total number of Update Locations (simple counter)

• Number of Update Location OK

• Number of Update Location NOK

• Ratio of Number of Update Locations OK/Total number of Update Locations

• Number of Update Location failed due to “Roaming Not Allowed”

• Number of Update Location failed due to “System Failure”

• Number of Update Location failed due to “Data Missing”

• Number of Update Location failed due to “Unexpected Data Value”

• Average access time

DIMENSIONS

• Top N IMSI

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD

No dashboard (results are used in ProTrace)

ALARMS

None


2.2.1.4 Roaming In – Steering of Roaming (SoR) Report

The business driver for this report is to identify service providers using SoR.

This report identifies service providers that are probably using SoR. SoR KPIs apply to roaming-in traffic. The report helps to detect any new steering and, if necessary, renegotiate roaming agreements. Mobile operators may prefer certain roaming partners in a given country, in particular when they are part of larger groups. In such cases, SoR is often used. SoR consists of aborting Update Locations several times before allowing them to succeed. SoR is governed by the GSMA, and only some predefined failures are allowed. These are analyzed by this report.

As a prerequisite, the MAP ETSI builder is required.

Roaming In – Steering of Roaming (SoR) MEASURES

• Total number of Update Locations (simple counter)

• Number of Update Location OK

• Number of Update Location NOK

• Ratio of number of Update Locations OK/Total number of Update Locations

• Number of Update Location failed due to “Roaming Not Allowed” or “System Failure” or “Unexpected Data Value”

• Steering of roaming ratio: number of Update Location failed due to “Roaming Not Allowed”, “System Failure”, or “Unexpected Data Value” / Total number of Update Locations

DIMENSIONS

• Top 20 among 500 Country/Operator having the highest steering of roaming ratio

• Country/Operator is based on enrichment

The two dimensions are mixed together and the list shall be provided by the customer.

AGGREGATION PERIOD: 1 hour

RETENTION PERIOD: 1 month

DASHBOARD

None

ALARMS

None


2.2.1.5 Roaming Out – Visitor Network QoS – Update Location Report

The business driver for this report is to manage QoS of outbound roaming traffic.

This report ensures that services for outbound roamers can be accessed from a core network perspective, based on the Update Location operation. Update Location is a critical MAP Operation that needs to work properly to access voice and data services. In the context of roaming failures, it may reveal configuration issues between roaming partners.

As a prerequisite, the record dimensions are country/operators. This report must only be used on MAP roaming traffic at the interconnection. The required builder is MAP ETSI.

Roaming Out – Visitor Network QoS – Update Location MEASURES

• Total number of Update Locations (simple counter)

• Number of Update Location OK

• Number of Update Location NOK

• Ratio of number of Update Locations OK/Total number of Update Locations

• Number of Update Location failed due to “Roaming Not Allowed”

• Number of Update Location failed due to “System Failure”

• Number of Update Location failed due to “Data Missing”

• Number of Update Location failed due to “Unexpected Data Value”

• Average access time

DIMENSIONS

• Country/Operator based on GTT analysis

Both dimensions are mixed together and the list shall be provided by the customer.

Note 1: The dimensions shall be selected from a business perspective. A prestudy might be required to identify the relevant dimensions.

Note 2: As a variant, country and operator might be separated.

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD

Global ER (Efficiency Ratio) and number of attempts versus time

ALARMS

Per ER (Efficiency Ratio):

• Minor if < 80%

• Major if < 70%

Those standard values can be adjusted afterward, either globally or per dimension.


2.2.1.6 Roaming Out – Visitor Network QoS – Top Ten – Update Location Report

The business driver for this report is to manage QoS of outbound roaming traffic.

This report ensures that services for outbound roamers can be accessed from a core network perspective, based on the Update Location operation. Update Location is a critical MAP Operation that needs to work properly to access voice and data services. In the context of roaming failures, it may reveal configuration issues between roaming partners.

As a prerequisite, the record dimensions are country/operators. This report must only be used on MAP roaming traffic at the interconnection. The required builder is MAP ETSI.

Roaming Out – Visitor Network QoS – Top Ten – Update Location MEASURES

• Total number of Update Locations (simple counter)

• Number of Update Location OK

• Number of Update Location NOK

• Ratio of number of Update Locations OK/Total number of Update Locations

• Number of Update Location failed due to “Roaming Not Allowed”

• Number of Update Location failed due to “System Failure”

• Number of Update Location failed due to “Data Missing”

• Number of Update Location failed due to “Unexpected Data Value”

• Average access time

DIMENSIONS

Top 10 Country/Operator based on GTT analysis

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD

Global ER (Efficiency Ratio) and number of attempts versus time

ALARMS

Per ER (Efficiency Ratio):

• Minor if < 80%

• Major if < 70%

Those standard values can be adjusted afterward, either globally or per dimension.


2.2.1.7 Roaming Out – Visitor Network QoS – IMSI Report

The business driver for this report is to manage QoS of outbound roaming traffic.

This report ensures that services for outbound roamers can be accessed from a core network perspective, based on the Update Location operation. Update Location is a critical MAP Operation that needs to work properly to access voice and data services. In the context of roaming failures, it may reveal configuration issues between roaming partners.

As a prerequisite, the report must only be used on MAP roaming traffic at the interconnection. The required builder is MAP ETSI.

Roaming Out – Visitor Network QoS – IMSI MEASURES

• Total number of Update Locations (simple counter)

• Number of Update Location OK

• Number of Update Location NOK

• Ratio of number of Update Locations OK/Total number of Update Locations

• Number of Update Location failed due to “Roaming Not Allowed”

• Number of Update Location failed due to “System Failure”

• Number of Update Location failed due to “Data Missing”

• Number of Update Location failed due to “Unexpected Data Value”

• Average access time

DIMENSIONS

• Top N IMSI

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD

Global ER (Efficiency Ratio) and number of attempts versus time

ALARMS

Per ER (Efficiency Ratio):

• Minor if < 80%

• Major if < 70%

Those standard values can be adjusted afterward, either globally or per dimension.


2.2.1.8 Roaming Out – Anti-steering of Roaming Report

The business driver for this report is to identify service providers using SoR.

This report identifies service providers that are probably using anti-steering, which is not allowed. These KPIs apply to roaming-out traffic. When steering is applied, the Update Location success rate should drop very low. When it is low but remains above a certain percentage (depending on the steering strategy), this is abnormal and requires analysis. SoR consists of aborting Update Locations several times before allowing them to succeed.

As a prerequisite, the required builder is MAP ETSI.

Roaming Out – Antisteering of Roaming MEASURES

• Total number of Update Locations (simple counter)

• Number of Update Location OK

• Number of Update Location NOK

• Ratio of number of Update Locations OK/Total number of Update Locations

• Number of Update Location failed due to “Roaming Not Allowed” or “System Failure” or “Unexpected Data Value”

• Steering of roaming ratio: number of Update Location failed due to “Roaming Not Allowed”, “System Failure”, or “Unexpected Data Value” / Total number of Update Locations

DIMENSIONS

• Top 20 among 500 Country/Operator having the highest steering of roaming ratio

• Country/Operator is based on enrichment

The two dimensions are mixed together and the list shall be provided by the customer.

AGGREGATION PERIOD: 1 hour

RETENTION PERIOD: 1 month

DASHBOARD

None

ALARMS

None

2.2.2 Roaming SMS Reports

2.2.2.1 Roaming In – MAP MO SMS Transaction Efficiency and Delay Report

The business driver for this report is to manage QoS of SMS MO traffic.

SMS is a very popular service and it is critical for the service provider to control it. This is particularly important for identified subscribers and for roaming services.

Three major steps to consider are:

1. Submitting the SMS to the SMSC (also corresponding to the MO / Mobile Originated SMS)

2. Retrieving the information to route the SMS to the final destination

3. Delivering the SMS to the final destination (also corresponding to the MT / Mobile Terminated SMS)


Several contexts (or use cases) shall be considered as well, by changing the following dimensions:

• Internal SMS traffic

• National SMS traffic

• International SMS traffic

This report checks QoS for step 1, SMS MO.

As a prerequisite, the required builder is MAP ETSI.

Roaming In – MAP MO SMS Transaction Efficiency and Delay MEASURES

• Total SMS MO (simple counter)

• Successful SMS MO (No SMS, SCCP, TCAP, User and Provider error)

• SMS Success ratio (Successful SMS MO/ Total SMS MO)

• Total transaction time (Cumulative counter for the transaction time field)

• Average transaction time (Total transaction time / Total SMS MO)

• Maximum transaction time (Maximum value counter on transaction time field)

Note: The SMS operations are identified with OpCodes = (Forward SM and TPMTI = SMS SUBMIT) or MO Forward SM

DIMENSIONS

One of these dimensions shall be selected

• Country/Operator based on GTT analysis, which shall be selected for the roaming case. For roaming out, the SMSC belongs to the home network and the visited MSC belongs to a foreign network. For roaming in, this is the opposite.

• Home MSC (identified by Point Code or GTT depending on routing options)

• Home SMSC (identified by Point Code or GTT depending on routing options)

AGGREGATION PERIOD: 1 hour
RETENTION PERIOD: 1 month

DASHBOARD: Global SMS MO Success Rate and number
ALARMS: Minor and major alarms on the SMS MO Success Rate per selected dimensions
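A minimal sketch of how the efficiency and delay measures above could be computed over one aggregation period, assuming each SMS MO transaction record carries a success flag and a transaction time (the record layout is an assumption for illustration):

```python
def sms_mo_kpis(records):
    """records: list of (success, transaction_time) pairs for one period."""
    total = len(records)
    successful = sum(1 for ok, _ in records if ok)  # no SMS/SCCP/TCAP/User/Provider error
    total_time = sum(t for _, t in records)         # cumulative counter on transaction time
    return {
        "total_sms_mo": total,
        "successful_sms_mo": successful,
        "success_ratio": successful / total if total else 0.0,
        "avg_transaction_time": total_time / total if total else 0.0,
        "max_transaction_time": max((t for _, t in records), default=0.0),
    }
```

The same shape applies to the MT, roaming-out, and SRI for SM variants of this report, with the success condition and dimensions changed accordingly.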


2.2.2.2 Roaming In – MAP MT SMS Transaction Efficiency and Delay Report

The business driver for this report is to manage QoS of SMS MT traffic.

SMS is a very popular service and it is critical for the service provider to control it. This is particularly important for identified subscribers and for roaming services.

Three major steps to consider are

1. Submitting the SMS to the SMSC (also corresponding to the MO / Mobile Originated SMS)

2. Retrieving the information to route the SMS to the final destination

3. Delivering the SMS to the final destination (also corresponding to the MT / Mobile Terminated SMS)

Several contexts (or use cases) shall be considered as well, by changing the following dimensions:

• Internal SMS traffic

• National SMS traffic

• International SMS traffic

This report checks QoS for step 3, SMS MT.

As a prerequisite, the required builder is MAP ETSI.

Roaming In – MAP MT SMS Transaction Efficiency and Delay MEASURES

• Total SMS MT (simple counter)

• Successful SMS MT (No SMS, SCCP, TCAP, User and Provider error)

• SMS Success ratio (Successful SMS MT / Total SMS MT)

• Total transaction time (Cumulative counter for the transaction time field)

• Average transaction time (Total transaction time / Total SMS MT)

• Maximum transaction time (Maximum value counter on transaction time field)

Note: The SMS operations are identified with OpCodes = (Forward SM and TPMTI = SMS DELIVER) or MT Forward SM

DIMENSIONS

One of these dimensions shall be selected

• Country/Operator based on GTT analysis, which shall be selected for the roaming case. For roaming out, the SMSC belongs to the home network and the visited MSC belongs to a foreign network. For roaming in, this is the opposite.

• Home MSC (identified by Point Code or GTT depending on routing options)

• Home SMSC (identified by Point Code or GTT depending on routing options)

• Top N IMSI

AGGREGATION PERIOD: 1 hour
RETENTION PERIOD: 1 month

DASHBOARD: Global SMS MT Success Rate and number
ALARMS: Minor and major alarms on the SMS MT Success Rate per selected dimensions


2.2.2.3 Roaming Out – MAP MO SMS Transaction Efficiency and Delay Report

The business driver for this report is to manage QoS of SMS MO traffic.

SMS is a very popular service and it is critical for the service provider to control it. This is particularly important for identified subscribers and for roaming services.

Three major steps to consider are

1. Submitting the SMS to the SMSC (also corresponding to the MO / Mobile Originated SMS)

2. Retrieving the information to route the SMS to the final destination

3. Delivering the SMS to the final destination (also corresponding to the MT / Mobile Terminated SMS)

Several contexts (or use cases) shall be considered as well, by changing the following dimensions:

• Internal SMS traffic

• National SMS traffic

• International SMS traffic

This report checks QoS for step 1, SMS MO.

As a prerequisite, the required builder is MAP ETSI.

Roaming Out – MAP MO SMS Transaction Efficiency and Delay MEASURES

• Total SMS MO (simple counter)

• Successful SMS MO (No SMS, SCCP, TCAP, User and Provider error)

• SMS Success ratio (Successful SMS MO/ Total SMS MO)

• Total transaction time (Cumulative counter for the transaction time field)

• Average transaction time (Total transaction time / Total SMS MO)

• Maximum transaction time (Maximum value counter on transaction time field)

Note: The SMS operations are identified with OpCodes = (Forward SM and TPMTI = SMS SUBMIT) or MO Forward SM

DIMENSIONS

One of these dimensions shall be selected

• Country/Operator based on GTT analysis, which shall be selected for the roaming case. For roaming out, the SMSC belongs to the home network and the visited MSC belongs to a foreign network. For roaming in, this is the opposite.

• Home MSC (identified by Point Code or GTT depending on routing options)

• Home SMSC (identified by Point Code or GTT depending on routing options)

AGGREGATION PERIOD: 1 hour
RETENTION PERIOD: 1 month

DASHBOARD: Global SMS MO Success Rate and number
ALARMS: Minor and major alarms on the SMS MO Success Rate per selected dimensions


2.2.2.4 Roaming Out – MAP MT SMS Transaction Efficiency and Delay Report

The business driver for this report is to manage QoS of SMS MT traffic.

SMS is a very popular service and it is critical for the service provider to control it. This is particularly important for identified subscribers and for roaming services.

Three major steps to consider are

1. Submitting the SMS to the SMSC (also corresponding to the MO / Mobile Originated SMS)

2. Retrieving the information to route the SMS to the final destination

3. Delivering the SMS to the final destination (also corresponding to the MT / Mobile Terminated SMS)

Several contexts (or use cases) shall be considered as well, by changing the following dimensions:

• Internal SMS traffic

• National SMS traffic

• International SMS traffic

This report checks QoS for step 3, SMS MT.

As a prerequisite, the required builder is MAP ETSI.

Roaming Out – MAP MT SMS Transaction Efficiency and Delay MEASURES

• Total SMS MT (simple counter)

• Successful SMS MT (No SMS, SCCP, TCAP, User and Provider error)

• SMS Success ratio (Successful SMS MT / Total SMS MT)

• Total transaction time (Cumulative counter for the transaction time field)

• Average transaction time (Total transaction time / Total SMS MT)

• Maximum transaction time (Maximum value counter on transaction time field)

Note: The SMS operations are identified with OpCodes = (Forward SM and TPMTI = SMS DELIVER) or MT Forward SM

DIMENSIONS

One of these dimensions shall be selected

• Country/Operator based on GTT analysis, which shall be selected for the roaming case. For roaming out, the SMSC belongs to the home network and the visited MSC belongs to a foreign network. For roaming in, this is the opposite.

• Home MSC (identified by Point Code or GTT depending on routing options)

• Home SMSC (identified by Point Code or GTT depending on routing options)

• Top N IMSI

AGGREGATION PERIOD: 1 hour
RETENTION PERIOD: 1 month

DASHBOARD: Global SMS MT Success Rate and number
ALARMS: Minor and major alarms on the SMS MT Success Rate per selected dimensions


2.2.2.5 MAP SRI for SM Report

The business driver for this report is to manage QoS of SMS SRI for SM traffic.

SMS is a very popular service and it is critical for the service provider to control it. This is particularly important for identified subscribers and for roaming services.

Three major steps to consider are

1. Submitting the SMS to the SMSC (also corresponding to the MO / Mobile Originated SMS)

2. Retrieving the information to route the SMS to the final destination

3. Delivering the SMS to the final destination (also corresponding to the MT / Mobile Terminated SMS)

Several contexts (or use cases) shall be considered as well, by changing the following dimensions:

• Internal SMS traffic

• National SMS traffic

• International SMS traffic

This report checks QoS for step 2, SRI for SM.

As a prerequisite, the required builder is MAP ETSI.

MAP SRI for SM MEASURES

• Total SRI for SM (simple counter)

• Successful SRI for SM (No SMS, SCCP, TCAP, User and Provider error)

• SRI for SM Success ratio (Successful SRI for SM / Total SRI for SM)

• Total transaction time (Cumulative counter for the transaction time field)

• Average transaction time (Total transaction time / Total SRI for SM)

• Maximum transaction time (Maximum value counter on transaction time field)

DIMENSIONS

One of these dimensions shall be selected

• Home MSC (identified by Point Code or GTT depending on routing options)

• Home SMSC (identified by Point Code or GTT depending on routing options)

AGGREGATION PERIOD: 1 hour
RETENTION PERIOD: 1 month

DASHBOARD: Global SRI for SM Success Rate and number
ALARMS: Minor and major alarms on the SRI for SM Success Rate per selected dimensions


2.2.3 Roaming Data Reports

2.2.3.1 Roaming In – GTP Context Creation Report

The business driver for this report is to monitor PDP context creation during the setup of a data service.

This procedure creates a GTP tunnel that is used later for the service. From a roaming perspective, this is where interaction between networks starts for data services. Thus, it is important to verify that PDP context creation is working well across both networks and countries.

As a prerequisite, the required builder is Gp TDR.

Roaming In – GTP Context Creation MEASURES

• Total Create PDP (filter on transaction type = create PDP context)

• Successful Create PDP (filtered counter on success field set to Yes and transaction type = create PDP context)

• PDP context creation success ratio (Successful Create PDP / Total Create PDP)

• Total PDP context creation duration (cumulative counter on PDP context creation duration)

• AVG PDP context creation duration (Total PDP context creation duration/Total create PDP context)

DIMENSIONS

The following dimensions can be used:

• Per country (enrichment needed)

• Per operator (enrichment needed)

• Per SGSN

Note 1: The dimensions shall focus on the most important ones from a business perspective; a pre-study can be considered to identify them.

Note 2: Reference data to be provided by the customer.

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Global PDP context success rate versus time
ALARMS: Alarm on PDP context success rate per major roaming partner (list to be determined)
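The measures above, with the success ratio computed as Successful / Total, can be sketched as follows. The TDR field names used here are assumptions for illustration, not the product's schema:

```python
def pdp_creation_kpis(tdrs):
    """tdrs: list of dicts with 'transaction_type', 'success', 'duration' (assumed field names)."""
    creates = [t for t in tdrs if t["transaction_type"] == "create PDP context"]
    successes = [t for t in creates if t["success"]]
    total = len(creates)
    total_duration = sum(t["duration"] for t in creates)  # cumulative counter
    return {
        "total_create_pdp": total,
        "successful_create_pdp": len(successes),
        "success_ratio": len(successes) / total if total else 0.0,
        "avg_creation_duration": total_duration / total if total else 0.0,
    }
```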


2.2.3.2 Roaming In – GTP Traffic Report

The business driver for this report is to measure the Gp traffic volumes. This can be used to manage load sharing or capacity extensions.

As a prerequisite, the required builder is Gp TDR.

Roaming In – GTP Traffic MEASURES

• Volume DL (cumulative counter on data length and filter on way=DL and transaction type=T-PDU)

• Volume UL (cumulative counter on data length and filter on way=UL and transaction type=T-PDU)

DIMENSIONS

The following dimensions can be used:

• Per country (enrichment needed)

• Per operator (enrichment needed)

• Per SGSN

Note 1: The dimensions shall focus on the most important ones from a business perspective; a pre-study can be considered to identify them.

Note 2: Reference data to be provided by the customer.

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Global PDP context success rate versus time
ALARMS: Alarm on PDP context success rate per major roaming partner (list to be determined)

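The two volume counters above amount to summing T-PDU payload lengths per direction and per dimension. A sketch under assumed field names ('transaction_type', 'operator', 'way', 'data_length' are illustrative, not the product schema):

```python
from collections import defaultdict

def gtp_volumes(tdrs):
    """Sum T-PDU data lengths per (dimension, way); 'way' is 'UL' or 'DL'."""
    volumes = defaultdict(int)
    for t in tdrs:
        if t["transaction_type"] == "T-PDU":
            volumes[(t["operator"], t["way"])] += t["data_length"]
    return dict(volumes)
```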


2.2.3.3 Roaming Out – GTP Context Creation Report

The business driver for this report is to monitor PDP context creation during the setup of a data service.

This procedure creates a GTP tunnel that is used later for the service. From a roaming perspective, this is where interaction between networks starts for data services. Thus, it is important to verify that PDP context creation is working well across both networks and countries.

As a prerequisite, the required builder is Gp TDR.

Roaming Out – GTP Context Creation MEASURES

• Total Create PDP (filter on transaction type = create PDP context)

• Successful Create PDP (filtered counter on success field set to Yes and transaction type = create PDP context)

• PDP context creation success ratio (Successful Create PDP / Total Create PDP)

• Total PDP context creation duration (cumulative counter on PDP context creation duration)

• AVG PDP context creation duration (Total PDP context creation duration/Total create PDP context)

DIMENSIONS

The following dimensions can be used:

• Per country (enrichment needed)

• Per operator (enrichment needed)

• Per SGSN

Note 1: The dimensions shall focus on the most important ones from a business perspective; a pre-study can be considered to identify them.

Note 2: Reference data to be provided by the customer.

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Global PDP context success rate versus time
ALARMS: Alarm on PDP context success rate per major roaming partner (list to be determined)


2.2.3.4 Roaming Out – GTP Traffic Report

The business driver for this report is to measure the Gp traffic volumes. This can be used to manage load sharing or capacity extensions.

As a prerequisite, the required builder is Gp TDR.

Roaming Out – GTP Traffic MEASURES

• Volume DL (cumulative counter on data length and filter on way=DL and transaction type=T-PDU)

• Volume UL (cumulative counter on data length and filter on way=UL and transaction type=T-PDU)

DIMENSIONS

The following dimensions can be used:

• Per country (enrichment needed)

• Per operator (enrichment needed)

• Per SGSN

Note 1: The dimensions shall focus on the most important ones from a business perspective; a pre-study can be considered to identify them.

Note 2: Reference data to be provided by the customer.

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Global PDP context success rate versus time
ALARMS: Alarm on PDP context success rate per major roaming partner (list to be determined)

2.2.4 Roaming Voice Reports

2.2.4.1 Roaming In – MO Calls at ISUP Interco Report

The business driver for this report is to manage the QoS of calls managed by the ISUP.

The report dimensions are directions or point codes. The results can be used to avoid losing calls and to make better use of Network Elements. This report shall be applied at the roaming interco, on outgoing traffic from roaming in subscribers.

As a prerequisite, the business context shall be agreed with the customer. Several contexts would require several KPI reports.

At least one of the following builders is required:

• SS7 ISUP ETSI

• SS7 ISUP ANSI

The service packages would require a review if both ANSI and ETSI need to be managed. The monitoring coverage shall match the agreed business context.


Roaming In – MO Calls at ISUP Interco MEASURES

• Call Attempts: Simple Data Counter, Total Call Attempts

• ASR: # Answered / # Attempts

• NER: Net Success / # Attempts

• Net Success = Call is Answered or Cause Family = NA, SSB, or HE

• Unsuccessful calls due to customer (causes 17, 18, and 19)

• Unsuccessful calls due to customer and/or network (causes 1, 3, 4, 21, 22, 27, 28, 29, 31)

• Unsuccessful calls due to resource unavailable (causes 34 to 47)

• Unsuccessful calls due to service or option not available (causes 50 to 63)

• Unsuccessful calls due to service or option not implemented (causes 65 to 79)

• Unsuccessful calls due to invalid message (causes 87 to 95)

• Unsuccessful calls due to protocol error (causes 102 to 111)

• Unsuccessful calls due to interworking (cause 127)

• Percentage of unsuccessful calls due to customer (causes 17, 18, and 19)

• Percentage of unsuccessful calls due to customer and/or network (causes 1, 3, 4, 21, 22, 27, 28, 29, 31)

• Percentage of unsuccessful calls due to resource unavailable (causes 34 to 47)

• Percentage of unsuccessful calls due to service or option not available (causes 50 to 63)

• Percentage of unsuccessful calls due to service or option not implemented (causes 65 to 79)

• Percentage of unsuccessful calls due to invalid message (causes 87 to 95)

• Percentage of unsuccessful calls due to protocol error (causes 102 to 111)

• Percentage of unsuccessful calls due to interworking (cause 127)

• Average setup time

• Average conversation time

• Average length of call

DIMENSIONS

It shall be determined with the customer which of the following dimensions shall be used:

• Operator (reference data to be provided by the customer)

• Point codes

• Country

• Other directions

The customer shall provide the reference data.

Note 1: The dimensions shall focus on the most important ones from a business perspective; a pre-study can be considered to identify them.

Note 2: For directions it is important to identify specific problems due to national numbering plan, carrier prefixes, or others.

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Global ASR, NER, and number of calls versus time
ALARMS: Per NER:
• Minor if < 80%
• Major if < 70%
Those standard values can be adjusted afterward either globally or per dimension.
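The cause-value buckets used by the measures above can be captured in a small classifier. The ranges are taken from the list itself (with the protocol-error range read as causes 102 to 111); anything outside the listed families falls into "other":

```python
def classify_release_cause(cause):
    """Map an ISUP release cause value to the failure families used in the report."""
    if cause in (17, 18, 19):
        return "customer"
    if cause in (1, 3, 4, 21, 22, 27, 28, 29, 31):
        return "customer and/or network"
    if 34 <= cause <= 47:
        return "resource unavailable"
    if 50 <= cause <= 63:
        return "service or option not available"
    if 65 <= cause <= 79:
        return "service or option not implemented"
    if 87 <= cause <= 95:
        return "invalid message"
    if 102 <= cause <= 111:
        return "protocol error"
    if cause == 127:
        return "interworking"
    return "other"
```

Counting calls per returned family, then dividing by total attempts, yields the count and percentage measures listed above.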


2.2.4.2 Roaming In – MO Calls at the RAN Report

The business driver for this report is to measure voice quality for visitors from the RAN interface.

As a prerequisite, the required builder is RAN CC.

Roaming In – MO Calls at the RAN MEASURES

• Total nb of call

• Total nb of Successful (Successful=yes)

• ASR (Total nb of successful / total nb of call)

• Total nb of network failure (cause value = Normal call clearing, normal unspecified, user busy, no user responding, no answer from user, or subscriber absent)

• NER (Total nb of network failure / Total nb of call)

DIMENSIONS

It shall be determined with the customer which of the following dimensions shall be used:

• Operator (reference data to be provided by the customer)

• Country

The customer shall provide the reference data.

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Global ASR, NER, and number of calls versus time
ALARMS: Per NER:
• Minor if < 80%
• Major if < 70%
Those standard values can be adjusted afterward either globally or per dimension.


2.2.4.3 Roaming In – MT Calls at the RAN Report

The business driver for this report is to measure voice quality for visitors from the RAN interface on Mobile Terminated calls. For the MT Calls, it is difficult to monitor calls at the interco since the subscriber identifier (MSISDN or IMSI) is not usually provided.

As a prerequisite, the required builder is RAN CC.

Roaming In – MT Calls at the RAN MEASURES

• Total nb of call

• Total nb of Successful (Successful=yes)

• ASR (Total nb of successful / total nb of call)

• Total nb of network failure (cause value = Normal call clearing, normal unspecified, user busy, no user responding, no answer from user, or subscriber absent)

• NER (Total nb of network failure / Total nb of call)

DIMENSIONS

It shall be determined with the customer which of the following dimensions shall be used:

• Operator (reference data to be provided by the customer)

• Country

The customer shall provide the reference data.

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Global ASR, NER, and number of calls versus time
ALARMS: Per NER:
• Minor if < 80%
• Major if < 70%
Those standard values can be adjusted afterward either globally or per dimension.

2.2.5 CAMEL QoS and Traffic Reports

The business driver for this report is to manage the QoS of prepaid traffic.

This report controls QoS for CAMEL prepaid services for outbound roamers. Any issue occurring with CAMEL may prevent subscribers from using services while roaming. For service providers, it is critical to check CAMEL QoS details in order to solve problems quickly and avoid losing revenue and degrading QoS (typically with roaming subscribers).

As a prerequisite, the required builder is INAP ETSI.


CAMEL QoS and Traffic MEASURES

• Total Call attempts

• Total Answered calls and ASR

• Time measurements, cumulated and average for:

• Accept Time, connection time, transaction time

• Total Charged time and related average value

• Number of connections, number of resource connections

• Normal release cause (and ratio) (list of normal causes: normal call clearing, User busy, No User responding, No answer from user, Subscriber absent, Normal unspecified)

• Abnormal release cause (and ratio)

• SCCP error (and ratio)

• TCAP error (and ratio)

• User (CAMEL) error (and ratio)

• Provider error (and ratio)

• Total MSUs (received + transmitted)

• Total bytes (received + transmitted)

DIMENSIONS

• Overall total except prepaid

• Overall total prepaid

• Per SCP prepaid only

Note: A filter shall be applied to restrict the CAMEL traffic to outbound roamers; the calling GTT shall be different from the home network.

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Global ASR and number of attempts versus time
ALARMS: Per SCP ASR (efficiency ratio):
• Minor if < 65%
• Major if < 45%
Those standard values can be adjusted afterward either globally or per dimension.


2.3 Network Interconnection Reports

2.3.1 MAP QoS per GMSC Report

The business driver for this report is to manage the QoS of MAP interconnect traffic. This report is focused on MAP QoS at the network interconnection points. In particular, it helps detect abnormal errors in the MAP traffic at the GMSC.

As a prerequisite, the required builder is MAP ETSI.

MAP QoS per GMSC MEASURES

• Total TDRs (simple counter)

• Total Send routing info (Filtered counter on Operation = Send routing info)

• Total Update Location (Filtered counter on Operation code = Update location)

• Total transaction time (Cumulative counter on Transaction time)

• Average transaction time (Total transaction time / Total TDRs)

• Total successful transactions (No SMS, SCCP, TCAP, User and Provider error)

• Successful ratio (Total Successful / Total TDRs)

DIMENSIONS

• Per GMSC based on point code

• Report can be specific to incoming and outgoing traffic based on the way field

• A filter shall be used to ensure that the report addresses interconnect traffic. The filter shall be based on GTT (either the calling or the called GTT shall be different from the home network).

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Total success ratio
ALARMS: Per GMSC success ratio:
• Minor if < 60%
• Major if < 50%
Those standard values can be adjusted afterward either globally or per dimension.
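The alarm logic above (minor below 60%, major below 50%, both adjustable) is a simple threshold check; a minimal sketch:

```python
def success_ratio_alarm(ratio, minor=0.60, major=0.50):
    """Return 'major', 'minor', or None for a per-GMSC success ratio.

    Default thresholds match the report; both are meant to be adjustable
    either globally or per dimension.
    """
    if ratio < major:
        return "major"
    if ratio < minor:
        return "minor"
    return None
```

The same pattern, with different defaults, applies to the NER, ASR, and per-NE success-ratio alarms elsewhere in this paper.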


2.4 2G Core Network Reports

2.4.1 MAP QoS per NE Report

The business driver for this report is to manage the QoS of Core Network MAP traffic. This report is focused on MAP QoS at the Core Network. It identifies and shows any abnormal MAP errors that impair traffic at the Core Network elements level.

As a prerequisite, the required builder is MAP ETSI.

MAP QoS per NE MEASURES

• Total TDRs (simple counter)

• Total transaction time (Cumulative counter on Transaction time)

• Average transaction time (Total transaction time / Total TDRs)

• Total successful transactions (No SMS, SCCP, TCAP, User and Provider error)

• Successful ratio (Total Successful / Total TDRs)

DIMENSIONS

• Per Point Code

• The report can be separated on incoming and outgoing traffic based on the way field

• A filter shall be used to ensure the report does not address interco traffic. The filter can be based on GTT (either calling or called GTT shall be equal to the home network).

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Total success ratio
ALARMS: Per GMSC success ratio:
• Minor if < 60%
• Major if < 50%
Those standard values can be adjusted afterward either globally or per dimension.


2.4.2 CAMEL QoS for Core Network Report

The business driver for this report is to manage the QoS of home network prepaid traffic and identify QoS for CAMEL prepaid services. This report is similar to the roaming prepaid report, though the context is focused on home core network.

As a prerequisite, the required builder is INAP ETSI.

CAMEL QoS for Core Network MEASURES

• Total Call attempts

• Total Answered calls and ASR

• Time measurements, cumulated and average for:

• Accept Time, connection time, transaction time

• Total Charged time and related average value

• Number of connections, number of resource connections

• Normal release cause (and ratio) (list of normal causes: normal call clearing, User busy, No User responding, No answer from user, Subscriber absent, Normal unspecified)

• Abnormal release cause (and ratio)

• SCCP error (and ratio)

• TCAP error (and ratio)

• User (CAMEL) error (and ratio)

• Provider error (and ratio)

• Total MSUs (received + transmitted)

• Total bytes (received + transmitted)

DIMENSIONS

• Overall total except prepaid

• Overall total prepaid

• Per SCP prepaid only

Note: A filter shall be applied to exclude any CAMEL interconnect traffic.

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Global ASR and number of attempts versus time
ALARMS: Per SCP ASR (efficiency ratio):
• Minor if < 65%
• Major if < 45%
Those standard values can be adjusted afterward either globally or per dimension.


2.5 Mobile Number Portability Reports

2.5.1 ISUP MNP Report

The business driver for this report is to manage the QoS of MNP related ISUP traffic.

In the case of MNP, several issues may occur with ISUP calls. This report identifies and manages the following such cases:

• Errors in the ported-number database. For instance, calls are routed to the monitored network before the database is updated, causing errors such as unknown subscriber. This type of error is especially important to control for incoming calls, particularly when provisioning is completed and the portability service is ready to go live.

• Setup time increase

• Transit traffic increase, causing problems

As a prerequisite, one of the following builders is required:

• ISUP ETSI

• ISUP ANSI

ISUP MNP MEASURES

• Total calls

• Answered calls

• Failed calls

• Failed due to network resources

• Failed due to subscriber unknown

• Network success calls

• ASR, NER

• Average Setup time

• Average conversation time

DIMENSIONS

• Per DPC and Routing Number Prefix

• Or Per TOP N B-Number (for this report B-Number shall be included in the column definition)

Note: Only incoming ported calls shall be considered. A filter shall be applied to focus the report on this traffic.

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Global ASR and number of attempts versus time
ALARMS: Alarm shall be raised on the count of failures due to subscriber unknown:
• Minor if > 50
• Major if > 100
Those standard values can be adjusted afterward either globally or per dimension.
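Unlike the ratio-based alarms elsewhere in this paper, the MNP alarms trigger on an absolute count of unknown-subscriber failures per aggregation period; a sketch with the report's default thresholds:

```python
def unknown_subscriber_alarm(failure_count, minor=50, major=100):
    """Return 'major', 'minor', or None based on the count of calls that
    failed due to subscriber unknown in one aggregation period. Thresholds
    are the report defaults and can be adjusted globally or per dimension.
    """
    if failure_count > major:
        return "major"
    if failure_count > minor:
        return "minor"
    return None
```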


2.5.2 MAP MNP Report

The business driver for this report is to manage the QoS of MNP related MAP traffic.

The MAP MNP report identifies QoS in the portability context when the GMSC requests the routing number from the HLR. Compared to the ISUP MNP report, this is another leg of the same call management. Together, the two reports provide the service provider with a comprehensive view of failures occurring at the various legs of call management.

As a prerequisite, the required builder is MAP TDR.

MAP MNP MEASURES

• Total SRI

• Successful SRI and Ratio

• Failed SRI

• Failed due to network

• Failed due to subscriber unknown

DIMENSIONS

• Per DPC and Routing Number Prefix

• Or Per TOP N MSISDN (for this report MSISDN shall be included in the column definition)

Note: Only incoming ported calls shall be considered. A filter shall be applied to focus the report on this traffic.

AGGREGATION PERIOD: 15 minutes
RETENTION PERIOD: 1 month

DASHBOARD: Global SRI success ratio and number of attempts versus time
ALARMS: Alarm shall be raised on the count of failures due to subscriber unknown:
• Minor if > 50
• Major if > 100
Those standard values can be adjusted afterward either globally or per dimension.


2.6 Network Security and Fraud Detection Reports

2.6.1 Spamming (Unbalanced SMS Traffic) Report

The business driver for this report is to identify SMS spamming.

Traffic between networks is typically balanced: roughly as much traffic is received as transmitted. Significantly unbalanced traffic may therefore point to unexpected patterns such as spamming. To identify such behavior, it is useful to check the details of the traffic routes, knowing that spam traffic often uses fake GTT addresses. This report helps identify the exact origin of such traffic. Once the inappropriate GTT addresses are detected, the Eagle5 Network Security features can be used to screen the traffic accordingly.
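As an illustration of the to/from balance check described above, the sketch below flags operators whose outgoing-to-incoming SMS ratio is unusually high. The record shape, operator names, and the 3:1 imbalance threshold are illustrative assumptions, not part of the product:

```python
# Sketch: flag foreign operators with unbalanced SMS traffic.
# counts maps each operator to (SMS delivered to it, SMS received
# from it), as could be derived from MAP TDR aggregates.

def unbalanced_operators(counts, threshold=3.0):
    """Return {operator: ratio} for operators whose to/from ratio
    exceeds the threshold (candidate spam sources)."""
    flagged = {}
    for operator, (to_count, from_count) in counts.items():
        ratio = to_count / max(from_count, 1)  # avoid division by zero
        if ratio > threshold:
            flagged[operator] = ratio
    return flagged

traffic = {"OperatorA": (1200, 1100), "OperatorB": (9000, 40)}
print(unbalanced_operators(traffic))  # OperatorB looks like a spam source
```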

As a prerequisite, the required builder is MAP ETSI.

Spamming (Unbalanced SMS Traffic) MEASURES

• Total MAP (from Foreign SMSC)

• MAP operations with XDR type frame alone

• SRI for SM Error ratio (SRI for SM Error count / Total SRI for SM)

• Ratio to/from (Total SMS delivered to Foreign SMSC / Total SMS delivered from Foreign SMSC)

• Total number of SRI with error cause “unidentified subscriber”

• Total number of SRI with error cause “illegal subscriber”

• Total number of SRI with error cause “illegal equipment”

• Total number of SRI with error cause “unknown equipment”

DIMENSIONS

• TOP Country/Operator based on enrichment

• Enrichment applies on both the Calling and Called GTT

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD: None

ALARMS: Per Country/Operator SRI for SM error ratio:

• Minor if < 90%

• Major if < 80%

Those standard values can be adjusted afterward, either globally or per dimension.


2.6.2 SRI for SM Error Ratio Report

The business driver for this report is to identify SMS spamming due to fake SMSC.

This report addresses spamming done with a fake SMSC sending SRI for SM. In such cases, many MAP errors usually appear because the subscriber numbers used are not accurate. When an alarm is raised, the GTT at the origin of the traffic, as well as the route being used, needs to be checked. Once the fake SMSC is detected, the traffic can easily be screened using the Eagle5 Network Security features.

As a prerequisite, the required builder is MAP ETSI.

SRI for SM Error Ratio MEASURES

• Total MAP (from Foreign SMSC)

• MAP operations with XDR type frame alone

• SRI for SM Error ratio (SRI for SM Error count / Total SRI for SM)

• Ratio to/from (Total SMS delivered to Foreign SMSC / Total SMS delivered from Foreign SMSC)

• Total number of SRI with error cause “unidentified subscriber”

• Total number of SRI with error cause “illegal subscriber”

• Total number of SRI with error cause “illegal equipment”

• Total number of SRI with error cause “unknown equipment”

DIMENSIONS

• TOP Country/Operator based on enrichment

• Enrichment applies on both the Calling and Called GTT

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD: None

ALARMS: Per Country/Operator SRI for SM error ratio:

• Minor if < 90%

• Major if < 80%

Those standard values can be adjusted afterward, either globally or per dimension.


2.6.3 TCAP Error Report

The business driver for this report is to identify mobile service providers that send spam traffic, and to avoid being billed for fake SMS traffic.

When a service provider fakes SCCP addresses, responses are sent to equipment that is not actually at the origin of the messages, and that equipment receives isolated TCAP End messages. If the spam is not detected, the spoofed operator will be asked to pay for SMS traffic it never sent – payment that is not actually due. Detecting these isolated TCAP End messages helps proactively manage the problem. Missing links or routes can generate the same kind of errors, so these reports also help manage traffic QoS.
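A minimal sketch of the frame-alone ratio and its alarm evaluation follows; the function name and tuple return are our own convention, and the 10%/20% limits follow this report's alarm thresholds:

```python
# Sketch: ratio of isolated "frame alone" TCAP XDRs to all MAP
# operations in one aggregation window, with minor/major levels.

def frame_alone_alarm(total_map_operations, frame_alone_count,
                      minor=0.10, major=0.20):
    """Return (ratio, alarm level or None) for one window."""
    ratio = (frame_alone_count / total_map_operations
             if total_map_operations else 0.0)
    if ratio > major:
        return ratio, "major"
    if ratio > minor:
        return ratio, "minor"
    return ratio, None

print(frame_alone_alarm(1000, 250))  # 25% frame-alone -> major alarm
```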

As a prerequisite, the required builder is MAP ETSI.

TCAP Error MEASURES

• Total MAP Operation

• Total XDR type frame alone

• Frame alone ratio (Total XDR type frame alone / Total MAP Operation)

DIMENSIONS

• TOP Country/Operator based on enrichment

• Enrichment applies on Calling GTT

• A filter shall focus the traffic on SMS deliver

AGGREGATION PERIOD RETENTION PERIOD

15 minutes 1 month

DASHBOARD: None

ALARMS: Per Country/Operator frame alone ratio:

• Minor if > 10%

• Major if > 20%

Those standard values can be adjusted afterward, either globally or per dimension.

2.6.4 Bypass Traffic Report

The business driver for this report is to identify fraudulent international bypass traffic.

This fraud case corresponds to international traffic bypassing the normal route in order to avoid paying the international rate. Detection is based on A-Number analysis, since in some cases the original A-Number is kept in the messages routed within the destination network. The fraud pattern may instead remove the A-Number or transform it into either a fixed or a randomly changing number. In such cases, depending on the regulations applicable in the destination network and on how the A-Number is transformed, the accuracy of bypass fraud detection may be reduced.
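The A-Number analysis could be sketched as follows; the national-prefix convention and the call-record fields are assumptions for illustration only, and real deployments would use the enrichment data mentioned below:

```python
# Sketch: flag possible bypass fraud, i.e., calls arriving over
# routes that should only carry national traffic but whose
# A-Number is missing or carries a foreign/international format.

NATIONAL_PREFIXES = ("0",)  # assumption: national numbers start with 0

def looks_like_bypass(a_number, route_is_national):
    """Return True when a call on a national route has a suspicious
    A-Number (absent, or not matching the national numbering plan)."""
    if not route_is_national:
        return False
    if not a_number:            # A-Number removed by the fraudster
        return True
    return not a_number.startswith(NATIONAL_PREFIXES)

print(looks_like_bypass("+4915112345678", True))  # foreign A-Number -> True
```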

As a prerequisite, at least one of the following builders is required:

• ISUP ETSI

• ISUP ANSI


Bypass Traffic MEASURES

• Total international calls

• Total international Answered calls

• Minutes of Usage

Reports shall be engineered to focus on traffic that should not be international.

DIMENSIONS

• Per OPC

Note: Enrichment may be required to filter the traffic correctly, that is, to select only calls that are supposed to be national.

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD: Global number of attempts versus time

ALARMS: An alarm shall be raised on the call count:

• Minor if > 50

• Major if > 100

Those standard values can be adjusted afterward, either globally or per dimension.

2.7 Accounting Reports

2.7.1 SS7 Accounting Reports

2.7.1.1 ISUP MTP Accounting per OPC/DPC/Linkset Report

The business driver for this report is MTP accounting: it counts ISUP messages per OPC, DPC, and linkset.

As a prerequisite, the UM SUDR builder is required.

ISUP MTP Accounting per OPC/DPC/Linkset MEASURES

• MSU Count – per MSU type

• MSU kiloBytes

• Total MSU Counts

• Total MSU kiloBytes

DIMENSIONS

• OPC

• DPC

• Linkset
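As a sketch of how MSU counts and kilobytes could be aggregated along these dimensions, the code below uses a hypothetical record layout (not the actual UM SUDR format) and omits the per-MSU-type breakdown for brevity:

```python
# Sketch: aggregate total MSU count and kilobytes per
# (OPC, DPC, linkset) over one 15-minute window.
from collections import defaultdict

def aggregate_msus(records):
    """records: iterable of dicts with opc, dpc, linkset, octets keys."""
    totals = defaultdict(lambda: {"msu_count": 0, "kilobytes": 0.0})
    for rec in records:
        key = (rec["opc"], rec["dpc"], rec["linkset"])
        totals[key]["msu_count"] += 1
        totals[key]["kilobytes"] += rec["octets"] / 1024
    return dict(totals)

msus = [{"opc": "1-1-1", "dpc": "2-2-2", "linkset": "ls1", "octets": 512},
        {"opc": "1-1-1", "dpc": "2-2-2", "linkset": "ls1", "octets": 512}]
print(aggregate_msus(msus))
```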

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD: None

ALARMS: None


2.7.1.2 SCCP Accounting per SCCP Calling/Called Operator/Country Report

The business driver for this report is accounting of SCCP messages that are routed through carriers so that operators can reach international destinations. This service is billed and must be verified by the operators.

As a prerequisite, the UM SUDR builder is required.

SCCP Accounting per SCCP Calling/Called Operator/Country MEASURES

• MSU Count – per MSU type

• MSU kiloBytes

• Total MSU Counts

• Total MSU kiloBytes

DIMENSIONS

• Country, Operators based on calling/called GT (need enrichment)

Reference data must be provided by the customer for enrichment.

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD: None

ALARMS: None

2.7.2 Diameter Accounting Report

The business driver for this report is to support accounting of Diameter interconnect traffic. Diameter messages are routed through carriers for operators to reach international targets. This service is billed and must be checked by the operators.

As a prerequisite, the Diameter Accounting SUDR builder is required.

Diameter Accounting MEASURES

• MSU Count – per MSU type

• MSU kiloBytes

• Total MSU Counts

• Total MSU kiloBytes

DIMENSIONS

• Country, Operators based on MCC/MNC

• IP addresses can also be used to create other dimensions based on enrichment

• Protocol based on Application Id

Reference data must be provided by the customer for enrichment.

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD: None

ALARMS: None


2.8 LTE Reports

2.8.1 Diameter S6 Update Location QoS Report

The business driver for this report is to manage the quality of service of Diameter Update Location. Update Location is one of the first procedures performed to access LTE services, so monitoring it is as important as monitoring its MAP equivalent.
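A minimal sketch of the Update Location success-rate computation from Diameter Result-Code values follows; 2001 is the standard DIAMETER_SUCCESS code, while the flat list of result codes is an assumed input shape:

```python
# Sketch: success rate of S6a Update Location, computed from the
# Result-Code values carried in Update Location Answer messages.

DIAMETER_SUCCESS = 2001  # standard Diameter success Result-Code

def update_location_success_rate(result_codes):
    """result_codes: iterable of Result-Code values from ULA messages."""
    codes = list(result_codes)
    if not codes:
        return 0.0
    ok = sum(1 for c in codes if c == DIAMETER_SUCCESS)
    return ok / len(codes)

print(update_location_success_rate([2001, 2001, 2001, 5001]))
```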

As a prerequisite, the S6 builder is required.

Diameter S6 Update Location QoS MEASURES

• Update Location count

• Update Location successes (result code = DIAMETER_SUCCESS)

• Update Location success rate

DIMENSIONS

• Country, Operators based on IMSI or visited network (need enrichment)

Reference data must be provided by the customer for enrichment.

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD: Global Update Location success rate over time

ALARMS: Alarms per defined threshold on the Update Location success rate for the main LTE roaming partners

2.8.2 Diameter S8 Session Creation QoS Report

The business driver for this report is to manage the quality of service of Session Creation on the S8 interface. Session Creation is one of the first procedures performed to access LTE services; it is the evolution of the session creation previously done for 2.5G/3G.

As a prerequisite, the GTP V2 builder is required.

Diameter S8 Session Creation QoS MEASURES

• Session creation count

• Session creation successes (response cause value = request accepted)

• Session creation success rate

DIMENSIONS

• Country, Operators based on MCC, MNC (need enrichment for roaming in)

Reference data must be provided by the customer for enrichment.
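A sketch of the per-roaming-partner success rate, keyed on MCC/MNC as in the dimensions above: GTPv2-C cause value 16 is "Request accepted", while the record field names are illustrative, not the builder's actual output:

```python
# Sketch: Create Session success rate per roaming partner,
# computed from the Cause values in Create Session responses.
from collections import defaultdict

def success_rate_per_partner(responses):
    """responses: iterable of dicts with mcc_mnc and cause keys."""
    totals = defaultdict(lambda: [0, 0])  # partner -> [attempts, accepted]
    for resp in responses:
        stats = totals[resp["mcc_mnc"]]
        stats[0] += 1
        if resp["cause"] == 16:           # GTPv2-C "Request accepted"
            stats[1] += 1
    return {p: ok / n for p, (n, ok) in totals.items()}

resp = [{"mcc_mnc": "20801", "cause": 16},
        {"mcc_mnc": "20801", "cause": 16},
        {"mcc_mnc": "20801", "cause": 94},
        {"mcc_mnc": "26201", "cause": 16}]
print(success_rate_per_partner(resp))
```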

AGGREGATION PERIOD: 15 minutes

RETENTION PERIOD: 1 month

DASHBOARD: Global session creation success rate over time

ALARMS: Alarms per defined threshold on the creation success rate for the main LTE roaming partners


Conclusion

Oracle has extensive expertise in implementing a broad set of key performance indicator (KPI) reports. By giving Performance Intelligence Center (PIC) customers access to these KPI reports, Oracle enables customers to derive more value from their current system. PIC customers can configure customized KPI reports, generate alarms, and view real-time curves. For customers without the resources to configure reports on their own, Oracle provides KPI report configuration services.


Typical Key Performance Indicator Reports for Performance Intelligence Centers November 2013

Oracle Corporation World Headquarters 500 Oracle Parkway Redwood Shores, CA 94065 U.S.A.

Worldwide Inquiries: Phone: +1.650.506.7000 Fax: +1.650.506.7200

oracle.com

Copyright © 2013, Oracle and/or its affiliates. All rights reserved.

This document is provided for information purposes only, and the contents hereof are subject to change without notice. This document is not warranted to be error-free, nor subject to any other warranties or conditions, whether expressed orally or implied in law, including implied warranties and conditions of merchantability or fitness for a particular purpose. We specifically disclaim any liability with respect to this document, and no contractual obligations are formed either directly or indirectly by this document. This document may not be reproduced or transmitted in any form or by any means, electronic or mechanical, for any purpose, without our prior written permission.

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Intel and Intel Xeon are trademarks or registered trademarks of Intel Corporation. All SPARC trademarks are used under license and are trademarks or registered trademarks of SPARC International, Inc. AMD, Opteron, the AMD logo, and the AMD Opteron logo are trademarks or registered trademarks of Advanced Micro Devices. UNIX is a registered trademark of The Open Group. 1113