Benchmarking IOTA Good Practice Guide for Tax Administrations




Benchmarking

IOTA Good Practice Guide for Tax Administrations


IOTA Good Practice Guide for Tax Administrations – Benchmarking Methodologies

BENCHMARKING METHODOLOGIES

IOTA Good Practice Guide for Tax Administrations

Intra-European Organisation of Tax Administrations (IOTA)

Budapest 2011


EXECUTIVE SUMMARY

This document has been developed by the IOTA Area Group “Strategic Management: Benchmarking” and is intended to facilitate benchmarking initiatives of IOTA Members. It focuses on benchmarking used as a strategic tool, which should have constant senior management support. The purpose of benchmarking is to identify ideas and practices that can potentially lead to improvement. This can be achieved by comparing performance, processes and functions, either internally within an organisation or with external organisations.

The most valuable output of benchmarking comparisons is an understanding of what lies behind the differences in the indicators. Is a difference caused by strategic or operational approaches, or by external factors? What are the relations between specific practices and performance? The answers to such questions can lead to the adoption of new approaches to seek improvement.

The history of benchmarking has shown the need for agreement on the conduct of the benchmarking partners. This Guide therefore includes a Code of Conduct that provides basic behavioural principles covering legality, preparation, contact, exchange, confidentiality, use, completion, and understanding and agreement.

The Guide encompasses four stages of the benchmarking process – preparation, collection, analysis and reporting. It also provides suggestions on how organisations should implement changes themselves as a result of the ideas and best practice that have come out of the benchmarking exercise.

The preparation stage defines the objectives and the focus areas to benchmark, and analyses the organisation’s own systems and processes. It also envisages the setting up of a team with sufficient skills and capacity to deliver the benchmarking project.

In order to carry out a benchmarking study the tax organisation needs to select benchmarking partners. To ease this process the IOTA Area Group “Strategic Management - Benchmarking” developed a questionnaire that was used to collect data from the IOTA Members and will form the basis of a database to be used for identifying potential benchmarking partners.

After the selection of partners, the parties involved need to agree how to benchmark and what to benchmark (scope, quantity and quality indicators, metrics, measures, etc.). It is suggested that a written agreement between the partners is adopted covering the purpose, scope, time period, cost sharing, publicity principles, roles and responsibilities, and collaboration plan of the benchmarking project. Sources of information and data should also be identified at this stage.

The Guide provides information on the different types of performance and process indicators which can be used to make benchmark comparisons,


including: input (cost and resources) and output (productivity and cost-efficiency; service and quality) indicators.

Data collection tools and techniques are needed in order to obtain the necessary data. These might include data collection templates, working sessions, field visits, scored interviews and questionnaires. Guidance is provided on acquiring this data and on ensuring standardisation, so as to enable comparison of similar products or services across all of the participating benchmarking administrations.

The final step in the benchmarking process is reporting. A final report may contain information on the project objectives, working methods, a description of the best practices that were identified, the data collected, etc. In order to facilitate dissemination of best practice, partners in benchmarking projects are urged to share the final report where appropriate and when agreed by all benchmarking participants. Consideration should be given to using IOTA to share outputs.


TABLE OF CONTENTS

1. Introduction
   1.1. Purpose of the Guide
   1.2. Purpose of benchmarking
   1.3. Overview of different approaches to benchmarking

2. IOTA Benchmarking Code of Conduct
   2.1. Introduction
   2.2. Legality
   2.3. Preparation
   2.4. Contact
   2.5. Exchange
   2.6. Confidentiality
   2.7. Use
   2.8. Completion
   2.9. Understanding and agreement

3. Benchmarking Guidance
   3.1. Preparing to conduct benchmarking
        3.1.1. Defining objectives
        3.1.2. System analysis
        3.1.3. Team
        3.1.4. Benchmarking preparatory plan
   3.2. Selecting benchmarking partners
   3.3. Agreeing with partners
        3.3.1. How to benchmark
        3.3.2. What to benchmark
        3.3.3. Agreement to conduct benchmarking exercise
        3.3.4. Benchmarking collaboration plan
   3.4. Identifying data for benchmarks
        3.4.1. Sources of information and data
        3.4.2. Categories of performance and process indicators
               3.4.2.1. Input – costs and resources
               3.4.2.2. Output – productivity & cost-efficiency; service & quality
               3.4.2.3. Effect
   3.5. Data collection
        3.5.1. Developing data collection tools
   3.6. Coordination of data collection
   3.7. Structuring and analysis of the data
   3.8. Interpreting the comparative responses
        3.8.1. The importance of understanding “why”
        3.8.2. How to fully interpret benchmark comparisons
   3.9. Reporting and dissemination – standardised approach
        3.9.1. Final report
        3.9.2. Country activity plan for improvements
        3.9.3. Country implementation report
        3.9.4. Wider dissemination of best practice

Disclaimer

Appendix I - Glossary


1. INTRODUCTION

1.1. Purpose of the Guide

1) This Benchmarking Guide has been developed by the IOTA Area Group “Strategic Management: Benchmarking”. The Guide is intended to simplify any benchmarking project and to provide a common understanding for benchmarking partners among the IOTA Membership. It should be noted that no benchmarking exercise should be undertaken without prior commitment from senior or top-level management. This Guide should not be considered a step-by-step tool, but rather a framework that provides general guidelines to ease the conduct of any benchmarking exercise.

2) In this Guide you will find an explanation of what benchmarking is intended to provide and how it can help your tax authority to achieve its strategic objectives and improve performance. You will also find a short overview of the different approaches to, and types of, benchmarking.

3) An important part of the guide is the IOTA Benchmarking Code of Conduct. This Code is intended to manage participants’ expectations and ensure a mutual understanding of the correct behaviour during a benchmarking exercise. The Code of Conduct and the remainder of the Guide are not legally binding documents.

4) Furthermore, the most important steps of benchmarking are described. The Guide will take you through the preparation for benchmarking, the selection of partners and the undertaking of formal agreements, scoping the benchmarks and acquiring the necessary data. It will help you develop the necessary templates and forms and will show you methods of data collection and analysis.

5) Finally, it will support you in analysing the data and will provide the necessary information on how to conduct the final reporting.

1.2. Purpose of benchmarking

6) Tax administrations are among the most important state institutions in every country because they deal with the collection of the taxes that fund public services.

7) Benchmarking is an approach which facilitates the sharing of ideas and best practice which in turn can be used to identify opportunities to improve performance within all participating tax administrations.

8) Just about anything that can be observed or measured can be benchmarked. That is why identifying data is the central part of benchmarking. In the past, the practice of organisational comparison was somewhat limited to structural or service-related areas, i.e. things that could be readily observed. However, experience with benchmarking has greatly expanded the potential areas for investigation.

9) Some of the main purposes to which benchmarking could be put include:


• Identifying the scale of tax compliance, tax avoidance or evasion;
• Disclosing the weakest points of a tax system;
• Identifying what activities are performed within an administration, and how they are performed;
• Indicating how much a situation has changed compared with the previous period;
• Comparing certain aspects and figures with other tax administrations;
• Reinforcing competition within the organisation and its departments;
• Reviewing the future strategies of a tax administration;
• Assisting in planning to improve a situation;
• Identifying and promoting best practice; and
• Reviewing future strategic targets.

10) The following areas can provide a clue as to where best to look to benchmark:
• Where operational bottlenecks occur;
• Where frequent complaints arise;
• Where backlogs are most prevalent;
• Which functions contribute most to the favourable or unfavourable image of an administration;
• What qualities of performance are most valued by customers or stakeholders;
• What will have the most impact on achieving strategic goals and objectives;
• Where the greatest opportunities for substantial gain are likely to reside; and
• Which functions consume the greatest portion of the organisation’s resources.

1.3. Overview of different approaches to benchmarking1

11) There are a number of different types of benchmarking, which are driven by different motivating factors and specific needs of administrations and thus involve different comparisons.

12) The major types of benchmarking are dependent upon the level at which performance is measured:

a. Strategic: used to improve overall performance by examining the long-term strategies and approaches that have enabled great performers to succeed. This form of benchmarking looks at what strategies organisations are using to make them successful, and involves top management. Its main purpose is to re-align business strategies that have become inappropriate.

1 http://www.bqf.org.uk; http://www.apqc.org; William M. Lankford, “Benchmarking: Understanding the Basics”.


b. Performance or Competitive: used when organisations consider their position in relation to the performance characteristics of key products and services. Its main purpose is to assess the relative level of performance in key areas or activities in comparison with others in the same sector, and to find ways of closing any gaps in performance.

c. Process: focuses on improving critical processes and operations through comparison with best-practice organisations performing similar work. Process benchmarking invariably involves producing process maps to facilitate comparison and analysis, and often results in short-term benefits. Its main purpose is to achieve improvements in key processes and obtain quick benefits.

d. Functional: used when organisations look to benchmark with partners from different business areas or activities in order to find ways of improving similar functions or work processes. This sort of benchmarking can lead to innovation and dramatic improvement. Its main purpose is to improve activities or services for which counterparts do not exist.

13) The types of benchmarking depend on the choice of partner:

a. Internal: involves benchmarking operations within the same organisation. The main advantages of internal benchmarking are that access to sensitive data and information is easier, standardised data is often readily available, and usually less time and fewer resources are needed. However, real innovation may be lacking, and “best in class” performance is more likely to be found through external benchmarking.

b. External: involves analysing outside organisations that are known to be best in class. External benchmarking provides opportunities for learning from those at the “leading edge”.

c. International benchmarking: It seeks partners from other countries. Globalisation and advances in information technology are increasing opportunities for international projects.

14) Very often, a “hybrid” approach may be adopted which incorporates a number of the above mentioned benchmarking types.


2. IOTA BENCHMARKING CODE OF CONDUCT2

2.1. Introduction

15) This Code of Conduct provides principles to be considered in a benchmarking exercise.

Definition of benchmarking: a process used in strategic management, benchmarking aims to target opportunities to improve performance and processes. It is a systematic approach to identifying good practice amongst participating organisations. It involves applying a variety of methods and techniques to gather and compare appropriate information at a given point in time, and has the potential to facilitate continuous reviews.

16) The IOTA Area Group “Strategic Management: Benchmarking” has produced this IOTA Benchmarking Code of Conduct to guide the participants in benchmarking exercises and to advance the professionalism and effectiveness of benchmarking in tax administrations. It is based upon the Code of Conduct used by American Productivity & Quality Center (APQC) and has been modified to reflect the needs of the IOTA Members. Adherence to this Code will contribute to efficient, effective and ethical benchmarking.

2.2. Legality

• If there is any potential question on the legality of an activity, then seek legal advice.

• Avoid discussions or actions that could lead to restricting taxpayers’ fundamental rights or that could in any way conflict with the principles set by the World Trade Organisation (WTO)3.

• Under no circumstances should you exchange disclosive information on taxpayers for benchmarking purposes as, according to tax regulations, information regarding taxpayers can only be exchanged through mutual assistance agreements.

2.3. Preparation

• Where appropriate, obtain senior management commitment and support for benchmarking, prior to taking part in any benchmarking exercise.

• Demonstrate commitment to the efficiency and effectiveness of benchmarking by being prepared prior to making an initial benchmarking request or expressing willingness to participate in a benchmarking exercise.

2 Important notice: This IOTA Code of Conduct is based upon the APQC Benchmarking Code of Conduct and has been modified in parts to meet the specific requirements of IOTA and its Member tax administrations. Copyright APQC. APQC would like to see the APQC Benchmarking Code of Conduct receive wide distribution, discussion, and use. Therefore, it grants permission for copying the Benchmarking Code of Conduct, as long as APQC is notified and acknowledged. For more information please visit www.apqc.org.
3 The World Trade Organisation sets out a number of simple, fundamental principles that form the foundation of the multilateral trading system: www.wto.org/english/thewto_e/whatis_e/what_stand_for_e.htm

• Before any benchmarking exercise, make checks to ensure compliance with relevant legislation.

• Before any benchmarking exercise, discuss the likely resource requirements and seek necessary agreement that adequate resources will be committed to the exercise.

• Make the most of your benchmarking partners’ time by being fully prepared for each exchange.

• Help your benchmarking partners prepare by providing them with full and adequate information on the agreed forms/templates.

• Conduct an assessment of the risks associated with the conduct of the benchmarking exercise.

2.4. Contact

• Respect the corporate culture of partner organisations and work within mutually agreed procedures and those established by IOTA.

• Use the benchmarking contact persons listed in the IOTA Country Profile Database for making initial contact with tax administrations on benchmarking issues.

• Obtain mutual agreement with the designated benchmarking contact on any hand-off of communication or responsibility to other parties.

• Obtain relevant authorisation from the management of your organisation before accepting a request to participate in a benchmarking exercise.

• Avoid communicating a contact’s name in an open forum without the contact’s prior permission.

2.5. Exchange

• Be willing to provide the same type and level of information that you request from your benchmarking partners.

• Fully communicate early in the relationship to clarify expectations, avoid misunderstanding, and establish mutual interest in the benchmarking exchange. Provide feedback on any issues or problems during the exchange.

• Be honest and complete with the information submitted.
• Provide information in a timely manner, as outlined by the stated benchmarking schedule.

2.6. Confidentiality

• At the start of any benchmarking exercise, agree how benchmark data and findings may be shared. Agree whether all participants will have access to all other participants’ data (open, transparent sharing), or whether data will be held confidentially (‘black box’) and, if held confidentially, by whom. Agree whether findings will openly identify participants (‘open book’ comparisons) or maintain the anonymity of participants (‘closed book’ comparisons).

• Treat benchmarking interchange as confidential between the partners involved. Information must not be communicated outside the partner organisations without the prior consent of the benchmarking partner(s) who shared the information.

• An organisation’s participation in a study is confidential and should not be communicated externally without its prior permission.

2.7. Use

• Use information obtained through benchmarking only for purposes agreed with the benchmarking partners and within the framework set by IOTA’s Benchmarking Guide.

• The use or communication of a benchmarking partner’s name with the data obtained or the practices observed requires the prior permission of the benchmarking partner.

• Contact lists or other contact information provided by benchmarking networks in any form may not be used for purposes other than benchmarking and networking, unless otherwise agreed by the Members of IOTA listed as contact points.

2.8. Completion

• Follow through with each commitment made to your benchmarking partners in a timely manner.

• Complete a benchmarking effort to the satisfaction of all benchmarking partners as mutually agreed.

2.9. Understanding and agreement

• Understand how your benchmarking partners would like to be treated.
• Treat your benchmarking partner in the way that your benchmarking partner would want to be treated.
• Understand how your benchmarking partner would like to have the information he or she provides handled and used. Handle and use it in that manner.


3. BENCHMARKING GUIDANCE

17) There are no strict rules regarding the approach to benchmarking. The general benchmarking process consists of four phases:

a. Preparation: this stage is critical to producing a successful benchmarking study. The specific focus area, key measures and definitions of the study are established and clearly documented; the data collection tools are refined and finalised; and benchmarking partners are identified and selected (see Sections 3.1 to 3.4).

b. Collection: participants use an interactive approach within their national organisation, with appropriate tools, to ensure that data is available and of good quality (see Sections 3.5 and 3.6).

c. Analysis: this phase includes analysing performance and best practice, identifying gaps in performance and identifying the enablers that lead to best practices (see Sections 3.7 and 3.8).

d. Reporting: this step ensures that data captured and lessons learned in the benchmarking process are well documented and available for others to use as reference material. Participants may directly share the case studies’ findings or best practices in a workshop format (see Section 3.9).

18) Finally, the decision to adapt or improve processes or practices can institutionalise the performance gains. The learning from benchmarking partners is adapted so that it can be applied within the national organisation. Best practices must be analysed to see how they fit within the culture and structure of the adopting organisation.

3.1. Preparing to conduct benchmarking

19) Before embarking on a benchmarking exercise it is important to establish who will take the lead within the organisation. Appropriate preparation for benchmarking is of key importance during the first stage of the benchmarking project implementation. This Chapter focuses on the practical aspects which a participating tax administration must consider.

3.1.1. Defining objectives

20) The first step within this stage provides for identifying the subject matter of benchmarking, defining the objectives of the project and the objects to be examined. In order to define the objectives in an appropriate manner it is essential to focus on the strategic objectives of the state, the tax administration or institutions, and to adjust the objectives of the benchmarking to suit them. The focus may be on the following areas:
• Organisation-wide;
• Operating practices;
• Human resources;
• Revenues;
• Expenses;
• Internal/external services; and
• Others showing best practices.

21) The processes of identifying and formulating the objectives should be carried out with the help of various techniques (such as workshops or surveys) and also discussed with a wide group of administration employees who will participate in the project in the future as well as any interested parties.

22) The identification process should be finalised with the drawing up of a list of desired objectives. Identifying the overriding objective and drawing up a hierarchy of the other objectives forms the basis of effective benchmarking implementation.

3.1.2. System analysis

23) In order to prepare better for the project, it is of utmost importance to analyse thoroughly the processes selected for benchmarking. It is essential to familiarise oneself with the areas to be examined in detail. If your own procedures are not known, it will not be feasible to identify differences between your processes and procedures and those of your benchmarking partners, and as a consequence the benchmarking exercise will not be effective. Moreover, in order to compare systems it is essential to check whether data for individual areas is available and, where it is not, whether it is possible to collect such data and how much that would cost.

3.1.3. Team

24) During the preparation stage it is important to set up a team and define the distribution of responsibilities amongst the team members. The team should comprise individuals with open personalities and a command of foreign languages, who are at ease in an international environment. It is also important to ensure that there are adequate numbers of staff in the team to undertake all of the benchmarking tasks.

25) Active involvement of senior management, and support from superiors for the activities undertaken, is of decisive importance for any effective benchmarking implementation. It is also important to involve the business area experts.

26) Furthermore, in order to enhance the effectiveness of benchmarking, it is necessary to inform units at various levels within the organisation of the activities planned. This makes it possible to promote the project to a wider group, collect opinions from co-workers and check whether individual areas have already been subject to benchmarking.

3.1.4. Benchmarking preparatory plan

27) As in the case of any project, in order to ensure its success it is essential to draw up a good action plan covering all the stages of project implementation. It is advised to prepare a framework plan including all the major project stages and to specify the commencement dates and deadlines of the individual stages.

28) Moreover, a general plan should specify:
• Risk management during project implementation;
• Procedures for controlling the project implementation; and
• Expectations with regard to the people involved in the project implementation.

29) It is also necessary to draw up a specific plan for each part of the project. Any such detailed plan should include the duration of individual activities, the deadlines of such activities and also any other costs associated with the activities.

30) While drawing up the plan it is necessary to remember that arrangements concerning deadlines of individual stages may be modified as a result of talks and negotiations carried out with the partners invited to participate in the benchmarking process.

3.2. Selecting benchmarking partners

31) As part of the Area Group activities a Task Team was requested to develop a database to provide IOTA Members with relevant and structured information to assist in the identification of suitable benchmarking projects and partners based on comparative information.

32) Designed for IOTA Members only, this database focuses on high-level tax administration information within the selected categories.

33) The database is currently hosted by a third party called Confirmit and contains the answers to 64 questions divided into the following nine sections:
1. Countries overview;
2. Return filing;
3. Tax audits;
4. Tax collection;
5. Guidance and information;
6. Appeals;
7. Human resource management;
8. Strategic topics; and
9. International benchmarking.

34) The information available on all questions comes from the data submitted by 40 IOTA Members in response to a questionnaire issued by the Task Team in early 2011, each tax administration being responsible for the accuracy of their own data.

35) The benchmarking country profiles database can be accessed by registered users via the IOTA website: having first logged in, use the following link and click on “Pilot Country Profiles Database”. If users would prefer to collect all the submitted data and make their own statistical analysis, the data for all questions can be downloaded from an associated Excel file.

36) The information from the database can be used by contributing IOTA Members to carry out both simple and complex comparisons between different administrations over the whole range of topics covered by the questionnaire. This will allow them to find those administrations that most closely mirror their own in terms of size and structure and that would be the most suitable partners for any future benchmarking exercise.

37) A range of pre-prepared reports on each topic has been created and is available on the system for those who do not wish to carry out any further analysis. However, by using the downloadable Excel spreadsheet IOTA Members have the opportunity to carry out more detailed data comparisons to help them find the ideal benchmarking partners.

3.3. Agreeing with partners

3.3.1. How to benchmark

38) All partners should initially agree on the areas (e.g., compliance costs) they are interested in. Then they need to find potential benchmarks and specify their possible metrics, the area in which the administrations intend to carry out the benchmarking process and the potential tax product (e.g., VAT). Each potential benchmark should be rated according to its importance and ease, in order to find the best ones for each partner. Before making a decision it is recommended to record what the partners expect to learn from each of the specific benchmarks chosen and any relevant information that could be useful before launching the project.
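The rating step can be sketched as a simple scoring exercise. The candidate benchmarks, the 1-5 scale and the equal weighting of importance and ease below are all assumptions for illustration, not part of the Guide.

```python
# Rank candidate benchmarks by importance and ease (1 = low, 5 = high).
# The candidates and scores are illustrative only; each partner would
# substitute its own list and agree the scoring scale in advance.
candidates = {
    "VAT compliance cost per trader": {"importance": 5, "ease": 2},
    "Cost of processing one VAT return": {"importance": 4, "ease": 4},
    "Telephone enquiry handling time": {"importance": 3, "ease": 5},
}

def score(ratings, w_importance=0.5, w_ease=0.5):
    """Weighted score; equal weights are an assumption, adjust as agreed."""
    return w_importance * ratings["importance"] + w_ease * ratings["ease"]

# Highest-scoring benchmarks first.
ranked = sorted(candidates, key=lambda name: score(candidates[name]), reverse=True)
for name in ranked:
    print(f"{score(candidates[name]):.1f}  {name}")
```

In practice the weights themselves would be negotiated between the partners, since a benchmark that is easy for one administration may be costly for another.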

39) For more on the criteria for agreeing benchmarks please see the next chapter.

40) Partners should pay attention to the principles of legality set out in the Code of Conduct. It is recommended to create a legal document, which can be a benchmarking agreement/standard agreement/contract or memorandum of understanding. In addition to a contract, an agreement should include a detailed benchmarking plan.

41) All partners should be prepared to share the information in return, as the process is based on information exchange.

3.3.2. What to benchmark

42) To make the benchmarking project feasible it is necessary to arrive at a consensus among the partners on reference issues: the benchmarks and their scope.

43) Every participant in the benchmarking exercise has to be directly involved in scoping and defining the benchmarks.

44) The criteria for selecting benchmarks may include:
• Relevance – It is valuable for tax organisations to understand their relative cost performance in the area to be benchmarked (is it a significant contributor to cost and how does it contribute to the achievement of corporate strategic objectives?);
• Simplicity – Simple to define across tax administrations; simple to apportion costs; a logical grouping of like activities;

• Consistency – Consistently defined across tax organisations; and
• Data availability – Data relating to cost and performance is readily available now, or similar data is available, minimising or eliminating the need for labour-intensive primary data collection exercises.

45) Depending on the benchmarking orientation (organisation, processes, performance, products, etc.) every benchmark has to be a valuable indicator against the benchmarking project objectives of situation, performance or results. Participants can choose benchmarks of three categories according to their usual timing in relation to the process cycle:
• Leading, which change before the situation does;
• Coincident, which provide information about the current state; and
• Lagging, which change after the situation does.

46) Benchmarks should be selected so that they meet the information needs exactly, without creating unnecessary administrative burden. Benchmarks also have to be selected whilst avoiding issues related to information any partner would not wish to reveal. It is important to define in advance the terminology and the methods to be used for performing any calculations that may be required. If this is not done, it may prove difficult to make valid performance comparisons.

47) For every benchmark it is necessary to establish, test and document in advance:
• The scope: description of the benchmark and its utility;
• Metrics: benchmark data may be collected from different sources such as publications, archival information, websites, surveys and/or questionnaires;
• Measurement approach: information about how to obtain or calculate the metrics (e.g., web survey, tick box, etc.); and
• A proposal of how the results will be presented.

48) All the information about the benchmark should be documented on data sheets that ease the data collection process.
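One possible shape for such a data sheet, sketched here as a plain Python dictionary: every field name and the example benchmark are hypothetical illustrations, not an IOTA template.

```python
# A minimal benchmark data sheet capturing the four elements listed in
# paragraph 47. Field names and example content are hypothetical.
def make_data_sheet(scope, metrics, measurement_approach, presentation):
    return {
        "scope": scope,                                # description of the benchmark and its utility
        "metrics": metrics,                            # where the data will come from
        "measurement_approach": measurement_approach,  # how to obtain/calculate the metrics
        "presentation": presentation,                  # how the results will be presented
        "collected_values": [],                        # filled in during data collection
    }

sheet = make_data_sheet(
    scope="Unit cost of processing a VAT return",
    metrics=["accounting system extract", "return volume statistics"],
    measurement_approach="total processing cost / number of returns, per year",
    presentation="bar chart per administration with explanatory notes",
)
sheet["collected_values"].append({"partner": "Administration A", "value": 3.2})
```

Keeping one sheet per benchmark, agreed before collection starts, makes it easy to check later that every partner measured the same thing.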

3.3.3. Agreement to conduct benchmarking exercise

49) A framework should be agreed between senior leaders of the participating benchmarking partners. This could include:
a. The purpose of the agreement:
Why is it necessary to carry out the benchmarking exercise? What are the desired outcomes and the targets accepted by the parties?
b. The scope of work:
If there is a benchmarking plan attached it is enough to refer to its content; otherwise detailed information on the project (e.g., topic, contact persons, applied methodology) should be given.
c. Period of performance:
Whether the agreement runs until the completion of the final report or until an exact date (note that benchmarking exercises can take anything from a few weeks to a year or longer depending on the scale of activities).
d. Transparency (publicity):
How the results will be used (e.g., publishing examples of “best practice”), publication of the final documents, disclosure of the contract or the applied benchmarking methodology, and any compulsory conditions of disclosure.

e. Cost sharing agreements: Calculation and monitoring of costs, including details of converting currencies in the calculations. In case of a participating consultant, the payment mechanism and billing procedures. Any price variations should be negotiated and agreed by all parties (e.g., travel costs, translation and interpretation, external consultancy, IT costs, printing, internal staff time).

f. Roles and responsibilities: Assigning the roles and responsibilities (usually the contact persons) of each participating benchmarking partner.

g. General terms and conditions (including amongst others):
- Access to data;
- Transfer of data;
- Confidentiality/privacy protection clause: rules on how the information will be treated (e.g., whether confidentiality is required, how information should be stored and disposed of);
- Conflict of interest;
- Copyright provisions;
- Termination (possibly including an early termination clause);
- The dispute resolution mechanism;
- Provisions allowing another administration to join the agreement at a later stage, and conditions related to this;
- Rules for amendments;
- Declaration of the parties that all activities are in accordance with the applicable laws, rules and regulations; and
- Agreement that no other understandings or representations regarding the subject matter of the contract shall be deemed to exist or to bind any of the parties.

50) It is recommended that any further details are incorporated into the benchmarking collaboration plan.
a. After signing the agreement it is recommended that IOTA is notified about the decision to carry out the benchmarking project and its general content. This will be published on the IOTA website for the purpose of disseminating information among other IOTA Members.

3.3.4. Benchmarking collaboration plan

51) The benchmarking collaboration plan is a document containing further detailed information building on what has been agreed in the agreement, in particular:
• Process timetable and milestones;
• Roles and responsibilities;
• Means of contact and contact details;
• Risk management;
• Ways of working;
• Cost sharing arrangements;
• Technical issues;
• Templates;
• Language;
• Translation; and
• Currency equivalents.

3.4. Identifying data for benchmarks

52) Identifying potential sources of data and considering what data should be used for comparison between benchmarking partners requires careful consideration. This Section will outline some of these considerations, including:
• Sources of information and data (Section 3.4.1); and
• Categories of performance and process indicators (Section 3.4.2), including inputs (Section 3.4.2.1), outputs (Section 3.4.2.2) and effect (Section 3.4.2.3).

3.4.1. Sources of information and data

53) There are a number of different sources of information and data which can be used for benchmarking. For any given benchmarking study it is likely that more than one source will be used. Sources include:
a. Existing published sources:
A lot of information, such as promotional materials, media information, financial statements and annual reports, is available on the Internet. The advantage is that these are readily available and may include interesting comparisons. However, much of the information in the public domain is either dated or is public relations information rather than substantive data about performance or processes. More useful information can be obtained from different statistical agencies, for example, academic institutions, the Organisation for Economic Co-operation and Development (OECD) or the Intra-European Organisation of Tax Administrations (IOTA).

b. Information gathered from senior leaders, process experts and data experts in tax administrations:
When collecting benchmarking information it is important to fully understand the strategic context and the processes within each tax administration. Certain individuals within each administration can provide highly relevant information. Senior leaders such as the commissioner or directors will provide an insight into strategic priorities, contextual factors, historical changes and other high-level information. Process and data experts will be able to provide much more detailed information about tax processes and the data that is available in the tax administration. This information can be collected in various ways, including face-to-face interviews, process mapping sessions, telephone interviews, etc. Collecting this information will be necessary throughout the benchmarking process, from deciding what to benchmark, defining benchmarks, collecting data and interpreting and understanding the results. Therefore, continual engagement with key individuals will significantly enhance a benchmarking exercise.

c. Performance and process indicators – including those already available and new indicators which require data collection and/or new methods of calculation: Within each participating tax administration there will be many sources of data and information. It is this data which can form the basis for comparisons between tax administrations. Section 3.4.2 examines the different types of performance and process indicators in much greater detail.

54) In order to improve the quality of the benchmark comparisons it is desirable to validate data by using multiple sources of information. For example, multiple interviews per location, cross-checking of data, multiple measures of information over time from the same source. Nevertheless, most benchmarking projects do rely on the report of a single individual to reflect the working of the entire organisation.

3.4.2. Categories of performance and process indicators

55) Most performance and process indicators and data used by different organisations can be categorised to cover economy, cost efficiency or effectiveness. Economy is about keeping the cost low, cost efficiency (or productivity) is about getting the most or best output from available resources and effectiveness is about achieving the stipulated aims or objectives.

56) Benchmarking data and indicators used by tax administrations based on this model can be grouped into three corresponding categories:
• Input (costs and resources);
• Output (productivity & cost-efficiency; service & quality); and
• Effect (tax gap).

57) It should be noted that some indicators could be placed under more than one category as they may refer to an activity that has more than one purpose. For example, activities aiming to improve quality and service, such as improving the provision of advice and guidance, will also have an effect on compliance and on the tax gap.

58) The traditional measurements of tax administrations are based on the government’s financial requirements. The problem with these measures is that they are based on derived information and, as presented, they have no clear relationship or link to operational data. On the other hand, operational units develop their own measures, unrelated to financial results, to meet the needs of operational management. This division contributes to conflict in the evaluation of performance. These separate internal focuses miss the drivers and the critical external perspective that put measures into context.

59) Measuring performance requires the adoption of a critical eye and an understanding of the relationship of the various parts of the organisation to the desired level of performance of the whole. While quantitative measurements are a vital part of the benchmarking process, they are not the only factors to consider. The qualitative characteristics of organisations, such as the management structure, the level of available services and the operational environment, also have to be evaluated.

3.4.2.1. Input – costs and resources

60) The input, in the form of funding and resources used to realise the activities of the tax administration, varies between different countries and reflects the different institutional set-ups, ranges of activities and performance measurement systems of the tax administrations. There are often differences between tax administrations in the way that costs are calculated and attributed to the aggregate administration costs. One problem is that there is no universally accepted definition for the measurement of administrative or overhead costs.
a. Cost data:
In each tax administration the central source of cost information is the administration’s accounting system. This will be set up to monitor costs and the cost data will be structured into different categories. Understanding and accurately using financial data is not a simple task. It requires time, patience and more than a casual understanding of the categorisations of cost data. In international benchmarking projects it is important to agree on the currency into which all costs are converted, in order to make a meaningful and accurate comparison possible. Statistical price indexes are also used to correct for the different price levels between countries.
b. Staff cost data:
The largest cost category for tax administrations is staff costs and it is desirable, therefore, to understand staff cost data as accurately as possible. Staff cost data can be based either on personnel administrative records or on a work time tracking system, which enables different processes to be monitored and analysed more closely. Staff data may be divided into direct and indirect work, with direct work usually being associated with the frontline interface with external customers and indirect work representing internal functions such as IT, estates, procurement and human resources. Staff data can be measured in different ways, for example, by the number of employees, full-time equivalents (FTE), the amount of working hours or effective working hours (which exclude holidays and sick leave).
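As an illustration of the staff measures just mentioned, the sketch below computes FTEs and effective working hours. The 1,720 annual contract hours and the staff records are assumed figures for illustration, not IOTA or OECD definitions.

```python
# Two staff measures mentioned above: full-time equivalents (FTE) and
# effective working hours (excluding holidays and sick leave).
# The contract hours and staff records below are illustrative only.
CONTRACT_HOURS_PER_YEAR = 1720  # assumed annual hours of one full-time post

staff = [
    {"name": "A", "paid_hours": 1720, "holiday_hours": 200, "sick_hours": 40},
    {"name": "B", "paid_hours": 860,  "holiday_hours": 100, "sick_hours": 0},  # part-time
]

def fte(records):
    """Total paid hours expressed as a number of full-time posts."""
    return sum(r["paid_hours"] for r in records) / CONTRACT_HOURS_PER_YEAR

def effective_hours(records):
    """Hours actually available for work, net of holidays and sick leave."""
    return sum(r["paid_hours"] - r["holiday_hours"] - r["sick_hours"] for r in records)

print(fte(staff))              # 1.5 full-time equivalents
print(effective_hours(staff))  # 2240 effective hours
```

Which convention the partners use matters less than using the same one: FTE and effective hours can differ markedly for the same workforce, so the chosen definition should be fixed on the benchmark data sheet.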

3.4.2.2. Output – productivity & cost-efficiency; service & quality

61) When identifying output data and measures, one of the first considerations is to seek to ensure that the input and output data used in benchmarking relate to the same activities, so as to ensure a true picture. Further consideration must be given to the different types of output measures. The following sections will focus on (a) productivity and cost efficiency and (b) service and quality.
a. Productivity and efficiency:
Tax administrations use a wide range of different key performance indicators to track their own performance. The simplest way to benchmark entities is by comparing simple key indicators, the central measures being productivity and cost efficiency.

- Productivity is calculated by dividing output (for example, the number of audits) by input (resource use). The rate of productivity will thereby reveal something about the volume of output obtained by each entity. Overall productivity means more than getting more output per direct labour input. A euro saved in the back office is as important as one saved on the customer service: both reduce society’s cost to obtain the services provided by the tax administration.

- Inversely, cost efficiency and unit costs may be calculated by dividing input by output.

These kinds of simple key indicators may be an excellent starting point for a benchmarking exercise. It is relatively easy to compare, for instance, the productivity of the entities and make an assessment of best practices. The most productive ones are those with the highest output/input ratio. A high rate of productivity will allow production of more output at a given input of resources. However, care needs to be taken when calculating productivity and efficiency measures.

- Relative staffing levels can vary substantially, in part due to factors unrelated to cost efficiency and productivity such as the scope of the taxes and non-tax related activities.

- Continual reorganisation can blur the lines between direct and indirect labour and staff, making direct labour-to-output measures complicated or invalid. (Higher level measures of total productivity (total output per total FTE) or total cost efficiency (EUR per output) may provide a more systematic view of performance that remain valid as ongoing improvements are made throughout the organisation).

- If a variety of input types are used, especially if several types of output are produced, there may be a need to attribute proportions or attach different weights to the results, which will make calculations more complex. (The great challenge is how to do it in a way that will produce a well-defined measure of productivity and cost efficiency. If such corrections are not carried through, it may be difficult to make comparisons. One approach to weighting several output values together, for instance, is by applying the resources used in obtaining each output. However, the problem is often that these weights are unknown).
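The ratios described above can be illustrated with a minimal sketch. The audit counts, FTE figures and resource-based weights are hypothetical, and the weighting approach is simply the option mentioned in the last bullet, under the assumption that relative resource use per output type is known.

```python
# Productivity = output / input; unit cost = input / output.
# All figures below are illustrative, not real administration data.
def productivity(output, fte_input):
    return output / fte_input

def unit_cost(cost_input, output):
    return cost_input / output

# Entity A completes 1,200 audits with 40 FTE; entity B, 900 audits with 36 FTE.
# A's output/input ratio is higher, so A is the more productive entity here.
assert productivity(1200, 40) > productivity(900, 36)

# With several output types, one option noted above is to weight each output
# by the resources used to obtain it, then compare total weighted output per FTE.
outputs = {"desk_audits": 1000, "field_audits": 200}
weights = {"desk_audits": 1.0, "field_audits": 4.0}  # assumed: a field audit takes ~4x the resources
weighted_output = sum(outputs[k] * weights[k] for k in outputs)
print(weighted_output / 40)  # weighted output per FTE
```

The caveat in the text applies directly: if the weights are unknown or disputed, the weighted comparison is no more reliable than the assumptions behind it.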

b. Service and quality:
Quality is an output, and tax administrations have increasingly developed indicators to measure overall quality and the quality of service provided to taxpayers. Qualitative benchmarks are not easily measured, but they are critical factors affecting the performance of the organisation. One aspect of quality is the focus on reducing the burden placed upon compliant taxpayers and businesses, for example, through shorter forms and better guidance, as the quality and service of the tax administrations have an indirect effect on compliance. An indicator in this category could be, for example, the handling of telephone calls or processing times. Given that we try to reduce the resource use for a particular task, we are working both to reduce the time it takes to perform a function (traditional productivity) and to reduce unnecessary demand (reducing task scope). (One effect of this approach is that the time per unit may increase as we reach our productivity benchmark, as only more complex, resource-intensive cases remain.) There are also interesting examples of more sophisticated models calculating the costs relating to customers:

- The Standard Cost Model (SCM) is designed to measure the administrative consequences for businesses and it is the most widely employed method to do so.

- Another example is the Doing Business study, carried out by the World Bank, which includes measures of the ease of complying with tax obligations, including the number of payments, the total tax rate and the time to complete returns.
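The Standard Cost Model is commonly summarised as Cost = Price x Quantity, where Price is the wage tariff times the time an obligation takes, and Quantity is the number of businesses affected times the yearly frequency. The sketch below applies that relation to purely illustrative figures.

```python
# Standard Cost Model core formula: administrative burden = P x Q, where
#   P = tariff (wage rate) x time per obligation, and
#   Q = number of businesses x frequency per year.
# All figures below are illustrative, not real measurements.
def scm_burden(tariff_per_hour, hours_per_obligation, businesses, frequency_per_year):
    price = tariff_per_hour * hours_per_obligation
    quantity = businesses * frequency_per_year
    return price * quantity

# e.g. a monthly VAT return: EUR 25/h, 2 h per return,
# 100,000 traders, 12 returns a year.
print(scm_burden(25, 2, 100_000, 12))  # total yearly burden in EUR
```

The strength of the P x Q form is that a simplification (say, shortening the form to save half an hour) translates directly into an estimated burden reduction.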

c. Although tax administrations can try and reduce the administrative burden, there are broad categories of obligation that are likely to exist for almost all taxpayers. There are two broad categories of compliance indicators: voluntary compliance by the taxpayer and intervention by the tax administration (e.g., desk or field audits). The two categories are interdependent as the taxpayers’ voluntary compliance can affect the tax administrations’ intervention and vice versa. The OECD defines voluntary compliance as “registration in the system, timely filing of requisite taxation information, reporting accurate information and payment of taxation obligations on time”. If taxpayers fail to meet any of the above obligations then they may be considered to be non-compliant. However, there are clearly different degrees of non-compliance which require different responses.


3.4.2.3. Effect

62) Overall efficiency is concerned with the ability to achieve the greatest possible effect of the input of resources, meaning the entire process from resource input to final outcome. It is not so easy, however, to measure effects. At the same time there will not be any one-to-one correlation between the input of resources and the produced outcome.

63) Theoretically the tax gap would be one of the key measures of a tax administration’s effectiveness. The term generally denotes the difference between the amount of tax that it is possible to charge according to fiscal law (theoretical tax) and the amount that is actually debited. In line with international trends, the strategic focus in many tax administrations has shifted from traditional input/output targets to tax compliance. However, there is no consensus on how to calculate the tax gap, or even on whether it is possible to calculate a meaningful tax gap estimate. Countries participating in a benchmarking project need to consider which effectiveness indicators are most appropriate.

64) A substitute measurement, for example, can be the proportion of taxes payable that are actually collected. It is a measure for which all countries have data available. Other effect-measuring indicators can be the results of random audits or the results of behaviour surveys, etc.
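The substitute measure just described is a simple ratio; a minimal sketch, with purely illustrative amounts:

```python
# Proportion of taxes payable that are actually collected.
# The amounts below are illustrative, not real revenue figures.
def collection_rate(collected, payable):
    """Share of payable tax actually collected."""
    if payable <= 0:
        raise ValueError("payable must be positive")
    return collected / payable

# e.g. EUR 94bn collected out of EUR 100bn payable.
rate = collection_rate(94_000_000_000, 100_000_000_000)
print(f"{rate:.1%}")  # 94.0%
```

The uncollected share (here 6%) is sometimes read as a rough lower-bound proxy for part of the tax gap, but it only covers debited amounts, not undeclared tax.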

65) Good sources of information on tax revenue include tax information databases of international organisations.

3.5. Data collection

66) A benchmarking project should not be launched at any cost: if the data material is too poor, it may be better not to proceed. However, it may often be a good idea to carry out benchmarking with the available data, as waiting for the “perfect” data might result in substantial delays.

67) Here, it is worth noting that the purpose of a benchmarking project is not to make accurate measurements of complex dimensions, but rather to focus on major, significant differences in performance which can provide a good opportunity to rethink and consider completely new ways to do things.

68) Therefore, one possible way to solve data problems initially could be to adjust the existing data material to give an approximate answer for the special circumstances. Thus, the use of existing data can function as a catalyst for the future development and establishment of better information, and the chances for better performance measurement increase.

3.5.1. Developing data collection tools

69) For successful benchmarking, consideration should be given to the balance between collecting quantitative and qualitative data. Quantitative data typically includes cost data, staff data, performance data, elapsed time, customers, etc., whereas qualitative data may include contextual information, cultural differences, legal differences, process design, differences in strategic priorities, etc. The templates and forms designed need to capture both of these types of data.

70) In practice both types are important. Templates for quantitative data have to be filled in and completed during the data collection process and before conducting interviews or planning visits to the benchmarking partner.

71) The quality of the parameters and measures used to measure performance is important not only for ensuring that comparisons are precise but also for ensuring that the results of any benchmarking project are acceptable to the organisation.

72) Examples of tools used for data collection include:
• Data collection templates;
• Working sessions;
• Field visits;
• Scored interviews; and
• Questionnaires.

73) The data collection tools should be agreed by the benchmarking partners or, if available, IOTA templates can be used so that all partners sign up to clear definitions to make sure they are measuring and, therefore, comparing ‘like with like’.

74) The production of these types of document is an activity that requires care and attention. The aim should be to produce documents which:
• Produce a general profile of the service to be examined, e.g., number of users, type of facilities provided and a breakdown of costs; and
• Delve more deeply into specific areas and provide sufficient detail for comparability purposes, e.g., standards, recurring problems and recent initiatives.

75) The production of a questionnaire for comparison purposes, for example, is an activity that requires consideration of a number of issues, including the following:
• Ensure that the software to be used to create the data collection tools is available to all the parties involved;
• Collect the data in the format and structure in which it will be analysed;
• Try to build up the structure of the questionnaire so that it has an internal logic and follows a sequence;
• Try to build a questionnaire that relates not only to the core process but also to the supporting processes and to the context;
• Try to ensure that questions which are related to each other follow on in the same section of the questionnaire;
• Be quite clear about what the question is seeking to obtain by way of an answer and that this information is what is needed;
• Avoid ambiguity; ensure that the reader clearly understands the question and all responses required;
• Consider the use of guidance notes if the question remains unclear or there may be uncertainty about the response to be made;

• Include clear definitions of what is included in and excluded from the data being requested, to make sure that ‘like for like’ is being measured; and
• Identify what the data source for the question might be.

76) When and if the benchmarking process proceeds to a face-to-face site visit, the following behaviours are encouraged:
• Provide a meeting agenda in advance;
• Be professional, honest, courteous and prompt;
• Introduce all attendees and explain why they are present;
• Adhere to the agenda;
• Use language that is agreed in the IOTA basic documents;
• Be sure that neither party is sharing proprietary or confidential information unless prior approval has been obtained by both parties from the proper authority;
• Share information about your own process and, if asked, consider sharing study results;
• Offer to facilitate a future reciprocal visit;
• Conclude meetings and visits on schedule;
• Thank your benchmarking partner for sharing his/her process; and
• Offer to send your partner a written summary of the information discussed during the meeting.

77) Consideration could also be given to conducting interviews and workshops in order to collect additional qualitative data.

3.6. Coordination of data collection

78) This phase of the benchmarking process is the one which demands the most work from the team members involved, because it is necessary to read, sort and present a great amount of numbers and reports very carefully so that the collected data can be used and revised.

79) The data being collected and used should be standardised to enable comparison of similar products or services across all of the participating benchmarking administrations. Where the products or services of the benchmarking partners are not aligned, it is necessary to sort and reorganise the collected data to enable like for like comparisons. Success or failure of the benchmarking process depends on how successfully the data is transformed.

80) Graphics and diagrams are often more suitable than tables for presenting the data. It is of great importance to add descriptive comments to these graphics and diagrams to help others understand their content.

81) One of the most important components of benchmarking is the knowledge that the team obtains during the benchmarking project, especially during visits to benchmarking partners. The team must decide what actions should be taken as a result of any benchmarking process. It often happens that the actions eventually taken differ from the initial demands of the users of the benchmarking project.


3.7. Structuring and analysis of the data

“Focus should be on what is possible rather than what is impossible.”

82) An essential part of ensuring acquisition of the best possible data is to check the collected data material in order to find any data failures. This should be done in cooperation with people who have good knowledge of the conditions in the different areas that feature in the analysis. A similar review can be carried out once an analysis of the collected data is available.

83) Moreover, it is important to agree early in the process which method, e.g., PPP (Purchasing Power Parity), will be used to explain price differences between the countries conducting the benchmarking, if this is considered necessary.
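As a sketch of such a PPP adjustment (with entirely hypothetical cost figures and PPP factors), nominal unit costs can be divided by a purchasing-power-parity factor so that cost levels become comparable across countries:

```python
# Illustrative PPP adjustment: nominal unit costs divided by a
# purchasing-power-parity factor. All figures are hypothetical.

nominal_cost_per_return = {"Country 1": 12.0, "Country 2": 9.0}  # local currency units
ppp_factor = {"Country 1": 1.0, "Country 2": 0.5}  # local units per 'international' unit

ppp_adjusted = {
    country: cost / ppp_factor[country]
    for country, cost in nominal_cost_per_return.items()
}

# After adjustment, Country 2's apparently lower nominal cost (9.0) is 18.0
# in comparable purchasing-power terms, i.e. higher than Country 1's 12.0.
```

The point of the sketch is that the ranking of the partners can reverse once price-level differences are taken into account, which is why the adjustment method should be agreed before the comparison is made.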

84) With reference to Chapter 3.3 of this Guide, an agreement has been made about what the benchmark must contain and how the benchmark will be made. On this basis the collected data material will be divided into two main types of benchmark:

• Performance benchmarking: Performance benchmarking typically concerns quantitative analysis of performance and efficiency. In its most simple form it consists of a comparison of central key figures across a number of units.

• Process benchmarking: Process benchmarking concerns detailed analyses and comparisons of, for example, work and production processes in the different units.

85) It must be emphasised that performance benchmarking and process benchmarking in no way exclude each other; they are two different approaches based on two different instruments.

86) As already mentioned, performance benchmarking consists of, amongst other things, comparing central key figures from a number of units. Such a comparison is the simplest form of performance benchmarking, and it can be extended by more advanced methods. The most frequently used methods in this case are Data Envelopment Analysis (DEA) and statistical/economic methods (e.g., regression analysis).
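In its simplest form, such a key-figure comparison can be sketched as below. With a single input and a single output, each unit's ratio expressed relative to the best performer also illustrates the basic idea behind DEA-style relative efficiency; the office names and figures are hypothetical.

```python
# Minimal sketch of performance benchmarking: comparing a key efficiency
# figure (output per unit of input) across units. All figures hypothetical.

units = {
    "Office A": {"staff": 100, "returns_processed": 50_000},
    "Office B": {"staff": 80,  "returns_processed": 48_000},
    "Office C": {"staff": 120, "returns_processed": 54_000},
}

# Key figure: returns processed per staff member.
ratios = {name: u["returns_processed"] / u["staff"] for name, u in units.items()}
best = max(ratios.values())

# DEA-style relative efficiency: 1.0 for the frontier unit, below 1.0 otherwise.
efficiency = {name: r / best for name, r in ratios.items()}
```

A full DEA with multiple inputs and outputs requires solving a linear programme per unit and would normally be done with specialised tooling; the single-ratio case above is only the intuition behind it.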

87) Examples of tables and graphs which can be used in the benchmark:

Table 1: Population and taxpayers

                                      Country 1   Country 2   Country 3   Country 4
Population
Taxpayers
Taxpayers in percent of population
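The derived row of Table 1 (“Taxpayers in percent of population”) is computed directly from the two raw rows. A sketch with hypothetical figures:

```python
# Computing the derived row of Table 1 from the two raw rows.
# Population and taxpayer counts are hypothetical.

population = {"Country 1": 10_000_000, "Country 2": 5_000_000}
taxpayers  = {"Country 1": 4_500_000,  "Country 2": 2_600_000}

# Taxpayers in percent of population, per country.
taxpayer_share = {c: 100 * taxpayers[c] / population[c] for c in population}
# Country 1: 45.0 %, Country 2: 52.0 %
```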


Table 2: Number of businesses, companies and businesses registered for VAT

                                      Country 1   Country 2   Country 3   Country 4
Businesses registered for VAT
Companies registered for VAT
All businesses and companies

3.8. Interpreting the comparative responses

3.8.1. The importance of understanding “why”

88) Fully interpreting benchmark comparisons is key to maximising the usefulness and value of the benchmarking exercise for making strategic management decisions. The key to interpreting benchmarking results is to be able to explain why one tax administration’s benchmark results are different from another’s. This requires the development of a thorough, in-depth understanding of the data being compared, the tax process being compared, the strategic and operational context and the external factors involved. It is only once these factors are fully understood that the question of why tax administration A differs from tax administration B can be answered. Once it is understood why differences exist, it is then possible to understand the relationship between certain practices and performance, and to identify how performance improvements can be made in a tax administration.

89) Re-examine the tables in Section 3.7 which show a number of metrics for four countries. Tables of this kind can be used to represent any data where numerical comparison is made between countries’ tax administrations. Other examples of tables may include a comparison of the cost to collect one Euro of personal income tax. The cost may be 6 cents in Country 1 and 14 cents in Country 2. Whilst interesting in itself, this information is of little use or value for guiding strategic management decisions in Country 2 which seek to reduce the cost of collecting personal income tax. The benchmark comparison does not suggest why Country 1 is more efficient, or how it achieves its greater efficiency.
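The metric in this example can be computed as follows; the underlying cost and revenue figures are hypothetical, chosen only to reproduce the 6 and 14 cent values from the text:

```python
# Cost-to-collect metric: cents of administrative cost per Euro of personal
# income tax collected. Underlying figures are hypothetical.

admin_cost  = {"Country 1": 60_000_000,    "Country 2": 140_000_000}     # Euro
pit_revenue = {"Country 1": 1_000_000_000, "Country 2": 1_000_000_000}   # Euro

cents_per_euro = {c: 100 * admin_cost[c] / pit_revenue[c] for c in admin_cost}
# Country 1: 6.0 cents, Country 2: 14.0 cents. The metric alone says
# nothing about *why* the costs differ.
```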

90) What is needed, then, is a fuller comparison between the two countries, not just of two metrics, but of the processes involved, the strategic and operational context and external factors. Much of this information is descriptive and qualitative in nature. It involves developing a full understanding not only of the metric being compared but of the key elements of the process, of its strategic and operational context and of any external factors such as local legislation, cultural norms and the economic and labour market context. With a much more detailed understanding of these wider concerns it is then possible to begin to understand why Country 1 may perform better than Country 2. With an understanding of why Country 1 performs well, Country 2 is able to identify how it may be able to improve its performance, and the strategic management decisions that need to be taken.

3.8.2. How to fully interpret benchmark comparisons

91) Fully interpreting benchmark comparisons, therefore, requires the development of a detailed understanding of tax administrations’ respective processes, strategic and operational context and the impact of external factors. In order to achieve this, detailed and in-depth conversations are necessary with a number of key individuals in each tax administration:

• Senior leaders and directors are required to provide an understanding of the wider strategic and operational context;

• Process experts are required to provide detailed understanding of the tax processes being benchmarked;

• Data experts are required to provide a detailed understanding of what the tax administrations’ data includes or does not include.

92) This input is required at all stages of the benchmarking process, from deciding what to benchmark through to scoping the benchmarks and collecting the data. But it is especially important after the analysis has been conducted and benchmark comparisons have been made. Before benchmark comparisons and analysis are finalised, they should be provided to the partners for validation and checking. Then, detailed discussions should be held in order to fully interpret the results.

93) The most valuable output from a benchmarking comparison is the explanation of why any apparent differences exist. These may be due to the strategic and operational context and external factors, or to differences in specific operating practices. It is then possible to begin to understand the relationship between certain practices and performance, and to identify best practice. The real value of benchmarking is to identify best practice which the benchmarking participants can consider adopting in their own tax administrations. Additionally, it should be noted that best practice may be learnt not just from leading tax administrations but also from others that have similar contexts.

3.9. Reporting and dissemination – standardised approach

94) At the end of the benchmarking project/exercise the participating tax administrations should prepare the final report, the country activity plans for improvements to be made and the implementation report.

3.9.1. Final report

95) The final (reporting) meeting should be convened before drafting of the final report in order to review the information collected during the course of the benchmarking exercise, identify the best practices and agree upon the content of the final report.


96) The final report should be produced in collaboration with all the participating tax administrations within an agreed period of time. The authority that initiated the specific benchmarking project, or a benchmarking project coordinator agreed among the participating tax administrations, will coordinate the drafting process of the final report. The final report must be a concise document providing findings, solutions and recommendations for the tax administrations of the participating countries; it may, however, also be used by the tax administrations of other countries that did not participate in the benchmarking project/exercise.

97) It is recommended that the final report contains the following elements:

• Executive summary;
• Table of contents;
• Project objectives;
• Methodology section - work methods (what was done during the benchmarking project), important facts and conclusions;
• Description of the practices in each tax administration;
• Description of the best practices identified;
• Comparison of each participating tax administration’s practices, and identification of leading practices;
• Identification of opportunities for improvement;
• Annexes (collected data, statistics, detailed data processing and analysis, etc.).

98) Drafting of the final report may be started at the beginning of the benchmarking project by creating the structural layout, which can then be completed throughout the course of the project.

99) The final report can be used as a basis for:

• Future changes to be made;
• Informing top management of the processes carried out;
• An indicator for future benchmarking projects;
• Information for benchmarking partners;
• Information for other interested parties.

3.9.2. Country activity plan for improvements

100) The participating tax administrations should, within one month after the final report has been agreed, draw up individual activity plans for improvements, defining the activities, determining the time frames, and appointing the persons responsible for the implementation of best practices. As not all the best practices described in the final report may be applicable to all the participating administrations, each tax administration will need to select those practices that it considers worthy of introduction in its organisation.

3.9.3. Country implementation report

101) Each participating tax administration needs to produce an implementation report within a reasonable time period after completion/presentation of the final report.


102) The implementation report provides a review of the results of implementing the recommended improvements, i.e., of the activities carried out to introduce the best practices, the impact of the implementation and any difficulties faced during the course of implementation. The implementation report should also contain follow-up plans if the improvement programme has not been fully implemented.

103) After identifying the best solutions, the benchmarking team must direct its activities towards developing an implementation plan. Some activities may also include developing fiscal blueprints. Improving business processes on the basis of benchmarking should be a continuous process within the institution. A successful implementation plan should include:

• Participation of all management and employees in the process of improvement;
• Understanding of the goals of the administration;
• A final schedule of plans and actions, which must be synchronised;
• The existence of interest in and support for the project by management; and
• A good database and adequate documentation that can be used by everyone.

3.9.4. Wider dissemination of best practice

104) In order to facilitate dissemination of best practice, partners in benchmarking projects should consider sharing the final report, when appropriate and when agreed by all benchmarking participants. Consideration should be given to using IOTA to share the outputs. In such cases any sensitive or confidential information should be removed from the report before it is published. It is highly recommended that at least the methodology section is submitted to IOTA for the purpose of future amendments to the benchmarking framework.

105) National benchmarking coordinators/contact persons or any other persons authorised by the tax administrations are responsible for the dissemination of information on the benchmarking projects that have been or will be carried out within their administrations to the managers/heads of business processes and to the top management.

106) The availability of benchmarking reports to all IOTA Members would help to avoid the initiation/conduct of overlapping projects, and would give tax administrations not involved in a benchmarking project the opportunity to use the best practices identified in the course of their own project activities. It would also allow the use of different research methods in project work. The availability of information would provide assurance to all IOTA Members that they will get the best benefit from the benchmarking initiative.

107) It is of great relevance that best practices identified in the course of project activities are available at all decision making levels of the tax administration, including the top management, who are responsible for the development and implementation of the strategies and policies.


Knowing the experiences and best practices of other tax administrations will help to notably reduce the administrative burden, thus accelerating and intensifying the development of new working methods within administrations.


DISCLAIMER

This Code of Conduct is not a legally binding document. Though all due care has been taken in its preparation, the authors will not be held responsible for any legal or other action resulting directly or indirectly from adherence to this Code of Conduct. It is for guidance only and does not imply protection or immunity from the law.


APPENDIX I - GLOSSARY

Benchmark: A measure that shows the level of achievement.

Benchmarking Agreement/Contract/Memorandum of Understanding: A legal agreement between the benchmarking partners for conducting a benchmarking exercise.

Benchmarking Code of Conduct: A document laying out the basic principles of behaviour in a benchmarking exercise.

Benchmarking Collaboration Plan: A document setting out the expected input from all benchmarking partners to the exercise.

Benchmarking Exercise: A bilateral or multilateral common exercise for comparing levels of achievement.

Benchmarking Preparatory Plan: A means of identifying, implementing and managing a benchmarking study programme.

Benchmarking Project: A time- and scope-limited process for comparing levels of achievement and adopting best-in-class practices.

Benchmarking Study: A study aiming at finding the best-in-class practices and achievements in specific area(s).

Best Practices: The best way of performing an activity to reach the established goals.

Participant Country: Any country involved in the project.

Template/Form: A document/file in which the project data is gathered.

Third parties: Private or public companies or institutions involved in a benchmarking project as non-participants.
