

APLAC WORKSHOP ON ISO/IEC 17011
22-24 April 2005 – Narita, Japan

The training course was held in two parts. On day 1 APLAC lead evaluators attended for training on evaluation techniques. On days 2 and 3 they were joined by representatives from APLAC full members that do not have a lead evaluator on their staff.

Attendance list – see appendix 1
Workshop agenda – see appendix 2

    1. Welcome and Introduction

The Chair of the APLAC MRA Council, Terence Chan, welcomed attendees to the workshop and thanked IAJapan for all the arrangements for the workshop. Attendees then introduced themselves.

    Terence Chan summarised his experience as an evaluator and lead evaluator.

Evaluations
- give strong support to the APLAC MRA
- need to be shown to end-users to be rigorous and reliable
- need to be planned well ahead
- need to cover all areas of an AB’s operations and scope of activities

    Before the evaluation visit a team leader must

- assign team members to their tasks
- prepare the framework for the evaluation report
- ensure team members have briefed themselves on how the AB operates
- hold a team meeting the day before the evaluation
- encourage the team to focus on the key issues
- select the particular issues that may need clarification

    After the evaluation visit the team leader must

- prepare the full report and send it to the AB for review and response
- check the AB’s response against the findings, trying to anticipate the questions that may be raised in the MRA Council meeting
- prepare the recommendation to the MRA Council
- present the report to the MRA Council, which needs time to prepare

Terence Chan said that during the workshop the participants should examine ISO/IEC 17011 from all possible angles, taking care not to interpret the clauses narrowly. The aim is to have a harmonised interpretation of 17011 within APLAC so that there is a standardised approach to all ABs. In particular, an outcome of the workshop is to identify key issues in 17011 that are different from those in Guide 58 and TR 17010. He added that, hopefully, there will not be different interpretations among the different regions.

He reminded participants that evaluations done in 2005 are done against 17011, and that all MRA signatories that are not evaluated in 2005 are required to do a self-evaluation against 17011 and report on the outcome to the MRA Council.

Finally, he thanked the workshop facilitators, Peter Unger, Barry Ashcroft and Panadda Silva, and the rapporteur.

    2. List of Relevant Documents for an Evaluation

    Peter Unger introduced this topic. The PPTs are given in appendix 3.

He emphasised that, while the key documents for an APLAC evaluation are 17011 and APLAC MR 001, an evaluator needs to be aware of all the relevant A series, P series and other referenced documents. A lead evaluator may also be asked to lead the evaluation of another region or of an unaffiliated body.

    3. Preparing for an Evaluation

Terence Chan introduced this topic and emphasised it is critical to the success of the evaluation that the team leader prepares well, collecting all the necessary information. The team leader is responsible for determining that the AB is ready for the evaluation, and may need to consult with the Chair of the MRA Council if there are any concerns. In preparing, the team leader needs to

• review the documentation provided by the AB (set A and set B as defined in MR 001, section 8)
• make sure the organisation is an AB and not a certification/registration body
• establish whether or not a pre-evaluation visit is to be done
• determine how many evaluators are needed to cover the range of accreditation activities covered by the proposed (or current) scope of recognition; for an extension only to a current scope a smaller team may be used
• determine the dates for the visit
• draw up the timetable for the visit in consultation with the AB

In putting together the evaluation team, the team leader needs to consider the following

• scope of activities of the AB: at least 1 evaluator each for calibration and inspection, depending upon the range of activities in those areas, and on the structure of the AB; 1 evaluator for ISO 15189
• potential conflicts of interest
• balancing cost considerations wherever possible
• a mix of experienced and less experienced evaluators on the team, including, if possible, a provisional evaluator
• inclusion of an evaluator from the previous evaluation, if applicable, to give some continuity


• language skills of team members, and knowledge of native language of AB
• need for an interpreter (from AB’s native language into English): a member of AB staff is preferable to a “commercial” interpreter because of that person’s knowledge of the specialist accreditation “language”

In drawing up the timetable and allocating tasks the team leader should ensure that he or she retains some flexibility to remain at the AB's offices to follow up on issues, should they arise, rather than witnessing assessments.

    4. Reporting on the Evaluation

    Pete Unger introduced this topic. The PPTs are given in appendix 4.

    In the discussion that followed the presentation the following points were made.

- the names of the laboratories whose assessments were witnessed should be deleted from the scopes of accreditation appended to the report on the evaluation
- for long multi-page scopes of accreditation, a shorter summary of the scope is acceptable
- the AB’s self-evaluation against KPIs (A3) can either
  • be attached to the report as an appendix, or
  • be used as a “first draft” for the body of the report and edited or annotated by the team based on its findings
- details of PT performance by laboratories accredited by the AB should be included in an appendix to the report, but the description of how the AB and its laboratories meet PT requirements should be included in the body of the report under the heading for KPI No. 10, Proficiency Testing
- NCs and other findings can be written in the body of the report under the relevant KPI heading, but the finding must be tied to a clause of 17011 or MR 001, not to the KPIs; the current edition of A3 cross-references to G58 and TR 17010, not to 17011
- the full report needs to present the performance of the AB’s overall system, both the positives and the negatives
- the decision makers, i.e. the signatories to the MRA, need to have the findings of the team presented in context so that they can correctly judge their impact
- if an AB accredits laboratories, inspection bodies and certification bodies, the way in which the evaluation report is structured will depend on how the AB organises the administration of the programs, i.e. there could be a combined report or separate report sections for each program
- rather than individual reports on each assessment witnessed, there should be a single summary report that draws together the conclusions of the team based on all the assessments witnessed; if there is a major problem at one assessment, however, it may be necessary to write a separate report on that particular assessment
- the issues related to MRA obligations are those covered by the MRA text and the requirement in A2 (but not MR 001 as yet) to have a program to promote the MRA to key stakeholders
- some team leaders are not adhering to the report requirements in MR 001, e.g. not labelling the report as “confidential” or not including the signed confidentiality statement in the report sent to the secretariat for filing


    5. Classification of Findings

    Pete Unger introduced this topic. The PPTs are given in appendix 5.

    In the discussion that followed the presentation the following points were made.

- in general, for any “concerns” found, the AB’s response should include details of preventive action it is taking to stop the “concern” becoming an “NC” in the future
- comments are a valuable part of a report; they provide “value adding to the AB” and may assist the AB in its development

    6. Group Exercise – Planning for an Evaluation and Preparing an Agenda

The participants were divided into 5 groups and given a description of the structure, scope of activities, etc. of an accreditation body: 2 groups were asked to develop a list of actions needed to be done in advance of the on-site visit; 3 groups were asked to prepare a detailed agenda based on 3 different scenarios – a pre-evaluation visit; all witnessing of assessments done in the week of the evaluation visit; some witnessing of assessments done prior to the week of the evaluation visit. In the discussion following the group presentations the following points were made.

- a pre-evaluation visit need not necessarily include witnessing assessments
- the team for a pre-evaluation should not be doing a quasi-evaluation
- 4 days is probably too long for a pre-evaluation for the reasons given above

    7. Review of ISO/IEC 17011, Section 4

Barry Ashcroft presented this topic. The PPTs are given in appendix 6. He stated that neither he nor anyone else has all the answers to how various clauses can be interpreted. The Standard needs to have been in place for a couple of years before we should consider the need for any interpretative document.

A group exercise on “impartiality” was included as part of this presentation, with 3 different scenarios, all of which were “real life” scenarios.

    In summarising the exercise Barry Ashcroft made the following points.

• the evaluation team should ask the AB for a self-evaluation of potential conflicts of interest
• an evaluation team may meet complex situations and may not be able to make a decision, in which case it is particularly important to draw out all the issues so that the situation can be presented to the MRA Council

    8. Review of ISO/IEC 17011, Section 5

Pete Unger presented this topic. The PPTs are given in appendix 7. He reminded the group that preventive action is a means of increasing the robustness of the system and/or an opportunity for improvement.

    9. Review of ISO/IEC 17011, Section 6

Barry Ashcroft presented this topic. The PPTs are given in appendix 8. One of the ways in which an evaluation team can judge the adequacy of staffing levels is to look at factors such as overdue assessments.


It was agreed that 17011 requirements apply to sub-contracted organisations that do assessments on behalf of the AB. It was noted that there is no policy requiring an evaluation team to visit sub-contractors, but the practice to date has been to do so for an AB that makes extensive use of sub-contractors.

Barry emphasised that an evaluation team must have an open mind when evaluating an AB’s compliance with 17011. The team needs to look at the outcome of an AB’s process when evaluating compliance. The concept that “how we do it” is the only or best way has no place in an evaluation.

    10. Group Exercise – Information Collection and Rewriting Findings

Two groups dealt with information collection for sections 4, 5, 6 of 17011; three groups dealt with rewriting findings for sections 4, 5, 6 of 17011.

The exercise on information gathering was summed up by noting that, for some clauses of 17011, it is possible to evaluate fully and conclude on compliance or otherwise before the on-site visit but, for other clauses, inputs from observations by all team members are necessary before any conclusions can be drawn.

A question was raised about how to evaluate if there is undue pressure on staff, as this may be intangible. Formal and informal interviews with staff may reveal some information but staff may not always be entirely honest in their answers. As with all other clauses, if there is no objective evidence there can be no finding.

Each group reporting on rewriting findings stated the assumptions they had made when rewriting a finding and classifying it as an NC, concern or comment. There was discussion on some of the findings and the main points arising from the discussion are summarised below.

1. The definition of a “legal entity” may be different in different economies.
2. There needs to be evidence that any perceived financial problems are having an adverse impact on the AB’s accreditation activities or there is no NC.
3. It was felt that critiquing a quality manual may be construed as “consultancy”.
4. The key point is whether or not an AB allows opportunity for input from all interested parties. If some parties choose not to avail themselves of the opportunity for input, the AB cannot compel their input (clause 4.3.2 of 17011).
5. Any documents cross-referenced in ISO/IEC 17011 also need to be included in the document control system.

    11. Review of ISO/IEC 17011, Section 7

Panadda Silva presented this topic. The PPTs are not attached as they consisted solely of the words of the Standard. During the presentation several clauses were discussed by the participants. Where this occurred the discussion is summarised below against the relevant clause number.

7.1.2 Much of this information may be made available via the AB’s web site so a member of the evaluation team needs to check the web site to see what information is available and whether it is current.

7.4 Sub-contracting is a fundamental issue for some ABs. Some ABs choose as a matter of policy not to sub-contract any assessments. An AB cannot sub-contract all assessments. An AB may sub-contract for geographical reasons and/or because it does not have the technical expertise. In the latter case, though, it needs to be ensured that the AB has the expertise to make the accreditation decision.

7.5.3(a) This clause does not prohibit the use of assessors that may have consulted to the CAB. This may be unavoidable at times, e.g. for some specialist areas of testing or in economies with limited resources.

7.5.6 “Sampling” often applies to CABs with large scopes of accreditation. Thought needs to be given to how/what to sample, e.g. biased towards the more technically demanding tests; ensuring all “families” or groups of types of tests are covered.

7.5.7 It was felt that, for initial assessments at least, all sites need to be visited but it was also stated that, in general, for inspection bodies, it is not possible to visit each site at which inspections are done. For CABs that set up temporary or mobile laboratories the most important point is to assess the capacity of the CAB to set up for contract-specific activities.

7.5.10 The evaluation team should check that each assessor for an assessment has the same set of documents and briefing information, i.e. that the AB is consistent in the package of information it gives to each assessor.

7.6.2 This clause is new as it explicitly allows an AB to choose not to do an assessment.

    7.7.3 This clause is also new.

7.8.1 There may be instances when there is not consensus amongst an assessment team, in which case there needs to be a mechanism for this to be resolved by the AB.

7.8.5 While a “plan of action” may be acceptable for some NCs identified at a surveillance visit or re-assessment, all NCs must be completely signed off for an initial assessment before accreditation could be granted.

7.8.6 This clause specifically requires the decision makers to be provided with certain information, rather than just have access to it. This may present logistical difficulties when the assessment has been done by a sub-contractor.

7.9.3 This situation is similar to that in which a sub-contractor is used to do the assessment. The AB does, however, need to assure itself of the impact of any changes that have happened at the CAB since the assessment by the other AB was done. There may also be problems of translation when the other AB’s report is in a language other than that used by the AB granting the accreditation.

7.9.4 The date of the accreditation Standard (e.g. ISO/IEC 17025:2005) must be on either the accreditation certificate or the accompanying scope of accreditation.

7.11.4 The reference to on-site surveillance is different to the definition of “surveillance” in 3.18 that includes activities that can be done off-site.


7.15 An evaluation team needs to look at 4 things in relation to proficiency testing (PT):
• the AB’s policy on PT
• PT programs offered by the AB itself and any other programs mandated and/or used by the AB
• results from PT programs and how follow-up is done on poor performance
• how the AB deals with areas where PT is not practicable.

    12. Review of ISO/IEC 17011, Section 8

Barry Ashcroft presented this topic. The PPTs are given in appendix 9. It was noted that clause 8.3.2 does not mention “approved signatories”. It was also noted that the revision of ILAC G14 on the use of accreditation logos (to be issued as ILAC P8) will not cover inspection.

    13. Group Exercise – Rewriting Findings

    This was a continuation of the exercise for topic 10 above.

14. Major Differences in ISO/IEC 17011 Compared to ISO/IEC Guide 58 and ISO/IEC TR 17010

The workshop participants identified the following as being the major differences in ISO/IEC 17011 compared to ISO/IEC Guide 58 and ISO/IEC TR 17010.

Section 4
- “related body” impartiality
- extending scope of activities (previously covered by KPI)

Section 5
- more specific requirements in relation to document control, corrective and preventive action, management review, audits, complaints

Section 6
- records for personnel, especially decision-makers
- monitoring performance of decision makers

Section 7
- public availability of complaints and appeals procedures
- sub-contracting
- sampling
- appeals
- more specificity in surveillance requirements
- use of PT in assessment process; policy on frequency of PT

Helen Liddy
APLAC Secretary


    Appendices

1. Attendance list
2. Workshop agenda
3. PPTs for relevant documents for an evaluation
4. PPTs for reporting on the evaluation
5. PPTs for classification of findings
6. PPTs for ISO/IEC 17011, section 4
7. PPTs for ISO/IEC 17011, section 5
8. PPTs for ISO/IEC 17011, section 6
9. PPTs for ISO/IEC 17011, section 8


    Appendix 1

    APLAC Workshop on ISO/IEC 17011

    Attendance List

Helen Liddy* APLAC [email protected]
Oke* NATA, Australia [email protected]
Robertson* NATA, Australia [email protected]
Russell* NATA, Australia [email protected]
Wilson* NATA, Australia [email protected]
Soares INMETRO, Brazil [email protected]
Gravel* CAEAL, Canada [email protected]
Dulmage SCC, Canada [email protected]
Mingxia CNAL, People’s Republic of China [email protected]
Terence Chan* HKAS, Hong Kong China [email protected]
Wah Wong* HKAS, Hong Kong China [email protected]
K Rana NABL, India [email protected]
S Achmad KAN, Indonesia [email protected]
Hosaka JAB, Japan [email protected]
Murata* IAJapan, Japan [email protected]
Seta* IAJapan, Japan [email protected]
Uematsu* IAJapan, Japan [email protected]
Takata JCLA, Japan [email protected]
Kawashima VLAC, Japan [email protected]
Jang KOLAS, Republic of Korea [email protected]
Sadri Alwi DSM, Malaysia [email protected]
Fernandez ema, Mexico [email protected]
MASM, Mongolia [email protected]
Barry Ashcroft* IANZ, New Zealand [email protected]
Richards IANZ, New Zealand [email protected]
Shahid Rasool PNAC, Pakistan [email protected]
Paita NISIT, Papua New Guinea [email protected]
Baje BPSLAS, Philippines [email protected]
Kwei Fern* SAC, Singapore [email protected]
Poh Yin* SAC, Singapore [email protected]
Tan* SAC, Singapore [email protected]
Jou* TAF, Chinese Taipei [email protected]
Lin* TAF, Chinese Taipei [email protected]
Panadda Silva* DMSc, Thailand [email protected]
Chaitheerapapkul DSS, Thailand [email protected]
Soongswang TLAS, Thailand [email protected]
McInturff* A2LA, USA [email protected]
Peter Unger* A2LA, USA [email protected]
Hirt ACLASS, USA [email protected]
Pat McCullen* IAS, USA [email protected]
Horlick* NVLAP, USA [email protected]
Xuan Thuy BoA, Vietnam [email protected]

    * APLAC lead evaluators

Appendix 2

11 April 2005

    APLAC Lead Evaluator/17011 Workshop

Three-Day Agenda
22-24 April 2005

    Narita, Japan

    First Day (22 April 2005)

    8:30 – 9:00 am Registration

    9:00 – 9:15 Welcome and Introductions – Terence Chan

    9:15 – 9:30 List of Relevant Documents for an Evaluation – Peter Unger

    9:30 – 10:30 Preparing for an Evaluation, Writing a Report – Terence Chan/ Peter Unger/others

    10:30 – 11:00 Break

    11:00 – 11:30 Classification of Findings (NCs, concerns & comments) – Peter Unger

    11:30 – 12:30 Group Exercises (two groups do a planning list: three groups do agendas)

    12:30 – 1:30 Lunch

    1:30 – 3:00 Group Exercises (continued)

    3:00 – 3:15 Break

    3:15 – 5:30 Reports of Groups (projected)

    Second Day (23 April 2005)

    8:30 – 9:00 Registration of Others who do not attend the First Day

    9:00 – 10:45 Review of ISO/IEC 17011:2004: Section 4 – Barry Ashcroft

    10:45 – 11:00 Break

11:00 – 12:30 Review of ISO/IEC 17011:2004: Section 5 – Pete Unger

12:30 – 1:30 Lunch

    1:30 – 3:00 Review of ISO/IEC 17011:2004: Section 6 – Barry Ashcroft

    3:00 – 3:15 Break

3:15 – 5:30 Group Exercises (two groups deal with information collection on sections 4, 5 and 6; three groups deal with rewriting findings related to sections 4, 5 and 6)

Third Day (24 April 2005)

    9:00 – 10:30 Reports of the Five Groups (projected)

    10:30 – 10:45 Break

    10:45 – 12:00 Review of ISO/IEC 17011:2004: Section 7 – Panadda Silva

    12:00 – 12:30 Review of ISO/IEC 17011:2004: Section 8 – Barry Ashcroft

    12:30 – 1:30 Lunch

2:15 – 3:00 Group Exercises (two groups do sections 7 & 8; three groups deal with rewriting findings related to sections 7 and 8)

    3:00 – 3:15 Break

    3:15 – 5:15 Report of Groups (projected)

    5:15 – 5:30 Wrap-Up

    LIST OF DOCUMENTS PROVIDED IN ADVANCE OF WORKSHOP

    Agenda

    Biosketches of Moderators

    PPT Slides

Document comparing Guide 58 against ISO/IEC 17011

    Other documents comparing against ISO/IEC 17011

    APLAC MR 001: Peer Evaluation Requirements and Procedures

    APLAC MR 002 rev 1: APLAC MRA Text

    DOCUMENTS TO BE BROUGHT BY EACH PARTICIPANT

ISO/IEC 17011; ISO/IEC Guide 58; ISO/IEC TR 17010

    ISO/IEC 17020; ISO/IEC 17025

ILAC/IAF A2 and A3; ILAC G14 (all available from the ILAC web site)

Appendix 3

    Relevant Documents for Peer Evaluations

Peter Unger
A2LA President


    IAF/ILAC A-series

    A1: Requirements for Evaluation of a Regional Arrangement Group

    A2: Requirements for Evaluation of a Single Accreditation Body

    A3: Key Performance Indicators

    A4: ISO/IEC 17020 Guidance


    IAF/ILAC A2

    Requirements for Evaluation of a Single Accreditation Body


    Outline of A2

    • Introduction

    • Requirements

    • Flowchart with:

    – 8 Annexes


    Supplementary Requirements

    • Enough experience (4 for test, 4 for cal)

    • PT requirements (see ILAC P9)

    • Abide by MRA requirements & obligations

    • Program to promote to stakeholders

    • Contribute its fair share of resources for peer evaluation at global level


    The Eight Annexes

    1 Application

    2 Check report

    3 Evaluation team

    4 Program

    5 Reporting

    6 Evaluation Summary Report

    7 Decision-making

    8 Re-evaluation


    Annex 3 - Evaluation Team

    • Appointment and duties of team leader

    • Composition of evaluation team

    • Requirements for qualifications of team members


Annex 4 – Typical Evaluation Program

    • Duration: within 7 days

    • Witnessing/observing assessments

    • Managing evaluation:

    – preparation

    – on-site

    – activities after on-site

    • Typical example timetables


Annex 5 – Steps in Evaluation Reporting

    • Preparation of summary report

    • Formal report of on-site visit

    • Formal response of AB

    • Formal reaction of the team

    • Steps 3 and 4 are iterative

    • Preparation of a final report


    Distinguishing among Reports

    Summary Report with all findings: Team to AB

    Full Report of on-site visit: Team to AB

    Final Report with recommendation & CA resolution: Team to MRA Council


    Overview of ILAC P Series

    • ILAC P Series documents address all matters related to the Peer Evaluation process including:

    – Requirements

    – Policies and procedures

    – Arrangement text

    – Guidelines, e.g., KPIs


List of ILAC P-series

P-1, Requirements for peer evaluation procedures

    P-2, Procedures for evaluating a region

    P-3, Procedures for evaluating unaffiliated bodies

    P-4, Arrangement policy statement

    P-5, Text of the mutual recognition arrangement

    P-6, Application for peer evaluation


    List of ILAC P-series

    P-7, Key Performance Indicators (now A3)

    P-8, Referencing Accredited Status (draft rev. G14)

    P-9, Minimum PT Requirements

    P-10, Policy on traceability of measurement results

    P-11, Monitoring Performance of ILAC Evaluators

P-12, Harmonization of Work with Regions

    A-series versus P-series

    IAF/ILAC A1 equivalent to ILAC P2

    IAF/ILAC A2 comparable to ILAC P1 & P3

    IAF/ILAC A3 equivalent to ILAC P7


    APLAC MR-00x series

    MR-001 Procedures for Maintaining the MRA

    MR-002 MRA Text

    MR-003 Application for MRA Signatory Status

    MR-004 Evaluator Performance

    MR-005 Training of APLAC MRA Evaluators

    MR-006 Conduct of Joint Evaluations with Other Regions

Appendix 4

Reports of Peer Evaluations

Peter S. Unger
A2LA President

Summary Report [left with AB after exit briefing]

    • 1 to 2 Page Summary with Recommendation on Next Step(s)

    • NCs, Concerns and Comments in a Word Table

    • Declaration of Confidentiality and Impartiality

Full Report [provided in draft shortly after visit]

• Cover page
• Contents
• Summary report
• Introduction
• Background of AB
• Performance of the system (per KPIs)
• MRA Obligations
• Annexes

    Full Report Annexes

• NCs, Concerns and Comments
• List of documents supplied before evaluation
• Agenda for evaluation
• Organization chart of AB
• Accreditation scopes of organizations visited
• Declaration of confidentiality and impartiality
• Miscellaneous

Final Report [provided to MRA Council through Secretariat]

    • Cover Memo with Final Recommendation

• Table Attached on the Final Resolution of NCs and Concerns

    • Full Report

Appendix 5

Classification of Findings

Peter S. Unger
A2LA President

    Three Types of Findings

    • Nonconformities

    • Concerns

    • Comments

    Nonconformities

• Non-fulfillment of a requirement:
  – Guide 58 and/or 17010 (17011 in future),
  – AB’s own management system/rules;
  – Arrangement’s requirements
  – Supported by objective evidence identified by evaluation team

    • Evidence of successful implementation of corrective action is expected

    Concerns

    • Finding where AB practice may develop into an NC or the team is not fully satisfied, but not enough objective evidence of a nonconformity.

    • Response from accreditation body is expected, either an appropriate action plan or clarification.

    Comments

    • Finding about documents or practices with a potential for improvement, but still fulfilling the requirements

• Response from the accreditation body would not be expected, but it may respond if it wishes.

    Table of Findings

Type of Finding | Statement of Finding       | AB response | Team reaction
NC              | Evidence with ID of clause | Necessary   | Ensure closure
Concern         | Description                | Necessary   | OK or not
Comment         | Description                | Optional    | Not necessary