
Form CA10 – Guidelines for Assessors

(DRAFT) GUIDELINES FOR ASSESSORS

HOW ASSESSMENT PANELS MUST PROCESS COMPETENCE ASSESSMENTS

Last saved at 11:21:00 a.m. on Tuesday, 29 July 2014

Saved to Q:\Competence Assessment\Policies and procedures\Forms Register\Developing\Forms to revise for 2014\CA10_Guidelines_for_Assessors July 2014.docx


CONTENTS

1. CHANGE MANAGEMENT RECORD .................................................................................. 4

2. INTRODUCTION ................................................................................................................. 4

3. COMPETENCE CONCEPTS .............................................................................................. 5

4. DECISION-MAKING ............................................................................................................ 5

5. TRIGGERS FOR ASSESSORS .......................................................................................... 6

6. PURPOSE OF PANEL ........................................................................................................ 7

7. ROLE OF ASSESSORS ...................................................................................................... 8

8. YOUR APPROACH TO CANDIDATES ............................................................................. 10

9. GIVING ADVICE TO CANDIDATES .................................................................................. 10

10. BEST PRACTICE FOR MANAGING ASSESSMENTS .................................................. 10

11. ASSESSOR PORTAL – AN INTRODUCTION ............................................................... 13

12. ACCESSING ASSESSOR PORTAL .............................................................................. 13

13. ASSESSOR PORTAL FAQS ......................................................................................... 14

14. ASSESSORS LOG ........................................................................................................ 16

15. COMPLETING AN ASSESSMENT ................................................................................ 17

16. DETERMINE RELEVANT COMPETENCE STANDARD ................................................ 18

17. USE OF ASSESSMENT TOOLS ................................................................................... 20

18. PRELIMINARY EVALUATION ....................................................................................... 20

19. KNOWLEDGE ASSESSMENT ...................................................................................... 21

20. INTERACTIVE ASSESSMENT ...................................................................................... 21

21. PROFESSIONAL JUDGEMENT .................................................................................... 27

22. WRITTEN ASSIGNMENT .............................................................................................. 29

23. REFEREES AND REPORTS ......................................................................................... 34

24. USING INFORMATION FROM REFEREES .................................................................. 35

25. EVALUATION OF EVIDENCE ....................................................................................... 36

26. HOLISTIC ASSESSMENT ............................................................................................. 45

27. VALIDATE PRACTICE AREA ........................................................................................ 50

28. VALIDATE PRACTICE FIELD INFORMATION .................................................. 53

29. PREPARE RECOMMENDATIONS ................................................................................ 54

30. RECOGNISED ENGINEER AND DESIGN VERIFIER ASSESSMENTS........................ 56

31. CANDIDATE ACCESS TO CA07 REPORTS ................................................................. 57

32. PRE-SIGN OFF CHECKLIST ........................................................................................ 57

33. CA07 REPORT SIGN OFF PROCESS .......................................................................... 59

34. ASSESSING ‘RECOGNISED ENGINEER’ OR ‘DESIGN VERIFIER’ ............................. 60

35. COMPETENCE ASSESSMENT BOARD DECISIONS .................................................. 60

36. NATURAL JUSTICE ........................................................................................................ 61


37. APPEALS AND PROCEDURAL REVIEWS ................................................................... 62

38. ASSESSMENTS UNDER TRANS-TASMAN MUTUAL RECOGNITION ACT (TTMRA) .. 63

39. APPLICATION OF TTMRA PRINCIPLES TO CREDIT SCHEDULE .............................. 63

40. ADMINISTRATIVE INFORMATION ............................................................................... 64

41. CPENG RULES AND IPENZ REGULATIONS COVERING ASSESSMENTS ................ 64

42. POLICY ON TERM TO NEXT ASSESSMENT ................................................................ 66

43. INDEX ........................................................................................................................... 70


1. CHANGE MANAGEMENT RECORD

Version 3.3 (Tuesday, 17 June 2014): Added CAB advice on practice fields, checklists and various other changes.

Version 3.2 (8 February 2013): Updated to reflect changes to CPEng Rules and IPENZ Regulations.

Version 3.1 (30 March 2010): Final refinements to major revision.

Version 3.0 (16 February 2010): Major revision of the document to include information on use of the assessor portal.

Version 2.1 (15 May 2009):
– Competence concepts (new)
– References to Recognised Engineer and Design Verifier (new) – see index
– Advice to assessment panels regarding use of controlled written assignment (new)
– Information on referee eligibility (further information)
– CAB advice when assessing relevant registers (new)
– CAB advice to panels on feedback to candidates (new)
– Assessing element 8 – generic application of ethical codes (further information)
– Justification for reduced term to next assessment (further information)
– Assessing for IntPE as part of continued registration assessment for CPEng (further information)
– Assessing CPD for continued registration assessments (further information)
– Inclusion of an index (new)

2. INTRODUCTION

These guidelines have been developed to give assessors the information and support required to undertake an interactive assessment for CPEng candidates. These are guidelines only and are intended to clarify the individual and collective responsibilities of assessors and assist in complying with the rules and regulations when conducting assessments. Assessors must use these in conjunction with their professional judgement based on their assessor training and technical expertise. Further information is provided on this in the Decision-making section.


3. COMPETENCE CONCEPTS

Competence is ‘artful doing’ – the ability to acquire and apply high levels of knowledge. In conducting a competence-based assessment, an assessment panel must consider the relevant standard of competence for the quality marks the candidate has applied for, and make a judgement on whether the evidence presented shows that the candidate is able to perform the things required of the standard. The fundamental philosophy is that the candidate must have actually done the things required, and the evidence presented demonstrates this.

3.1 CHARACTERISTICS OF ‘COMPETENCE’

While there is considerable debate on the definition of competence, there is general agreement that it encompasses the following 5 elements:

the ability to perform individual tasks to a specified standard repeatedly.

the ability to manage a number of different tasks within the job repeatedly.

the ability to respond to irregularities and breakdowns in routine.

the ability to deal with the responsibilities and expectations of the work environment, including working with others.

the ability to continue to learn in rapidly changing work environments.

Competence is not based on:

a. Specific qualifications – there is no requirement to hold a specific qualification. A candidate must demonstrate he/she has acquired and can apply knowledge to a specified standard (such as that of a Washington Accord, Sydney Accord or Dublin Accord qualification). If in doubt, the assessment panel can use appropriate assessment tools to assist in making this determination (a knowledge assessment and an examination are examples of ways to test the candidate’s knowledge).

b. Level of experience – there is no minimum time requirement for performing engineering work (or study) to demonstrate competence. Time performing work is an input, but its weight as evidence is limited by the ‘quality’ of that work. Someone doing challenging work, constantly tackling new problems, is likely to provide strong evidence sooner than someone performing work of a repetitious nature with few engineering challenges. Similarly, age is not relevant to the concepts of competence.

Assessors must not consider these non-competence factors when conducting a competence assessment.

4. DECISION-MAKING

Evidence of competence must meet the criteria of sound assessment, i.e. it must be:

valid (relevant to the competencies)

sufficient (there should be enough evidence of sufficient quality for the assessor to judge that good engineering practices have been followed. The evidence must also demonstrate repeatability of performance)

authentic (the evidence is that of the candidate).

REMEMBER: Candidates must be assessed on the whole of their evidence – including inputs from referee contacts and interactive assessments.

4.1 ARRIVING AT A DECISION

The following process may be useful in coming to a panel decision:

Put aside your first impressions

Have the evidence requirements and competencies been fully (or only partly) met?

Was the evidence valid, authentic and sufficient?

Were there any inconsistencies in the evidence (including the referees’ reports)?

Were there any significant triggers that concerned you (see below)?

Is there any further evidence you need - including further referee reports (see below)?

Are you confident that the candidate is safe to operate as an independent practitioner?

Make and record your decision

You must be confident in the panel’s decision at the end of the process.

5. TRIGGERS FOR ASSESSORS

The following list of triggers should help assessors make a judgement. The more positive triggers there are, the more likely it is that the candidate will meet the requirements of the competencies. Conversely, the more negative triggers there are, the more work the panel will have to do – further evidence will be required and assessment tools used to demonstrate the requirements are met – and the more likely it is that the competence standard will not be met. A simple illustration of this balancing follows the table below.


Positive triggers for assessors (paired with the corresponding negative triggers):

Positive: Application completed correctly and all material provided.
Negative: Application not completed correctly and/or materials missing.

Positive: Clarity of portfolio and documentation, e.g. structure, order, process, line weights, scale.
Negative: Lack of clarity in portfolio and documentation, e.g. structure, order, process, line weights, scale.

Positive: Sufficient detailing.
Negative: Insufficient detailing.

Positive: Good interpersonal and communication skills.
Negative: Limited interpersonal and communication skills.

Positive: Sound knowledge of the regulatory environment.
Negative: Limited knowledge of the regulatory environment.

Positive: Compliant with legislation, regulations and codes of practice.
Negative: Not compliant with legislation, regulations and codes of practice.

Positive: Information presented tells a ‘story’ of consistent competence.
Negative: Gaps in the ‘story’ of consistent competence.

Positive: Demonstrates relevant knowledge in relation to projects described.
Negative: Lack of applied knowledge.

Positive: Drawings and information are well organised.
Negative: Drawings and information are chaotic.

Positive: Ability to answer questions correctly and in sufficient detail.
Negative: Unable to answer questions correctly or in sufficient detail.

Positive: Referee reports and verbal feedback consistent with information provided by the candidate.
Negative: Referee reports and verbal feedback inconsistent with information provided by the candidate.

Positive: Ability of the candidate to reflect on his or her own practice, including clear perception of limitations.
Negative: Lack of ability to reflect on his or her own practice and unclear perception of limitations.

Positive: Evidence of regular upskilling.
Negative: No evidence of upskilling.
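For panel members who like to keep structured notes, the trigger list above can double as a simple worksheet. The sketch below is purely hypothetical – no such tool forms part of the assessment process, and the trigger names are paraphrased from the table above – but it illustrates the balancing described at the start of this section: tally the positives and negatives, and treat each negative as a prompt for further evidence or assessment tools, never as an automatic fail.

```python
# Hypothetical worksheet only: the trigger names are paraphrased from the
# table above, and nothing here substitutes for the panel's professional
# judgement.
TRIGGERS = [
    "Application completed correctly and all material provided",
    "Clarity of portfolio and documentation",
    "Sufficient detailing",
    "Good interpersonal and communication skills",
    "Sound knowledge of the regulatory environment",
    "Compliant with legislation, regulations and codes of practice",
    "Information tells a 'story' of consistent competence",
    "Demonstrates relevant knowledge in relation to projects described",
    "Drawings and information are well organised",
    "Answers questions correctly and in sufficient detail",
    "Referee feedback consistent with candidate's information",
    "Reflects on own practice, including clear perception of limitations",
    "Evidence of regular upskilling",
]


def summarise(observations: dict) -> None:
    """Tally positive/negative triggers and flag negatives for follow-up.

    `observations` maps each trigger to True (positive) or False (negative).
    """
    negatives = [trigger for trigger, ok in observations.items() if not ok]
    print(f"Positive triggers: {len(observations) - len(negatives)}")
    print(f"Negative triggers: {len(negatives)}")
    for trigger in negatives:
        print(f"  Follow up (further evidence / assessment tools): {trigger}")


# Example: a panel member notes two areas of concern at preliminary evaluation.
notes = {trigger: True for trigger in TRIGGERS}
notes["Sufficient detailing"] = False
notes["Referee feedback consistent with candidate's information"] = False
summarise(notes)
```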

6. PURPOSE OF PANEL

An assessment panel has only two decisions to make. The Rules and Regulations require an assessment panel to:

a. assess whether a candidate meets the standard for registration (for assessment for admission to register) or continued registration (for assessments for continued registration), and if so,

b. set the term to the next assessment.

The CPEng Act requires all decisions made by IPENZ (as the Registration Authority) to be supported by reasons. Therefore panels must document the reasons for their recommendations, and the CA07 form is designed to guide assessment panels in recording the critical evidence used as the basis for their decisions. The comments must be based on the relevance of the evidence presented to the respective elements of the competence standard.

6.1 TASK OF PANEL

The task of the panel is to:

o Review and validate evidence (as outlined on page 5);

o Determine if the required level of complexity of engineering work has been demonstrated;

o Record the panel’s conclusions (with supporting reasons) in its recommendations to the Competency Assessment Board.

6.2 TERMINOLOGY – AFAS AND CRAS

An AFA is an “assessment for admission” – for entry to either a register of current competence or an IPENZ membership class.

Candidates for an AFA who have already attained recognition at the same level of competence as that for which they are applying (such as an MIPENZ applying for CPEng) are referred to as “previously assessed” candidates. Candidates who have not previously demonstrated competence to the same level are referred to as “not previously assessed” candidates.

A CRA is a “continued registration assessment”, or an assessment to continue to be registered on a current-competence register.

Assessment panels follow similar steps for each type of assessment, with some variations depending on whether it is an AFA or a CRA.

7. ROLE OF ASSESSORS

The assessment panel consists of at least two assessors. Each assessor has a specific role:

Staff Assessor – responsible for managing the process – ensuring timeliness of assessments, compliance with rules and regulations, completion of documentation; or

Knowledge Assessor – primarily responsible for completing the assessment of the candidate’s level of knowledge (element 1) against the relevant standard; or

Practice Area Assessor – responsible as the technical adviser to the panel, appointed because of his/her personal knowledge and experience in the candidate’s area of engineering practice.

While each assessor has a specific role on the panel, the Rules and Regulations make no distinction between the types of assessors – all have equal responsibility and each member of a panel is responsible for its outcome. The panel has considerable flexibility in how it carries out an assessment. For example, the knowledge assessment may be carried out by one person (the Knowledge Assessor) or by some or all panel members. You have been appointed as an assessor because of your technical knowledge and experience and your interpersonal skills.

Your role is to determine whether a candidate meets the requirements of the relevant competence standard and to make a recommendation to the Competency Assessment Board as to whether or not a candidate should be registered.


Be ready to ask for further evidence if you require it. If you have any doubts or reservations, you should express them and the panel should do more work (use all assessment tools available) to remove any doubts or reservations.

You also need to be aware of the following issues.

7.1 CONFLICTS OF INTEREST

If you have been appointed to an assessment panel and you consider there may be a conflict of interest – or others may perceive a possible conflict of interest – you should report it to National Office in the first instance, preferably in written form (email is acceptable) so that you can retain a written record.

Further, if at any stage you are subject to a complaint, you are advised to voluntarily disclose this to National Office staff and make yourself unavailable until the matter has been resolved.

National Office staff do not normally appoint engineers who are employed by the candidate’s employer as assessors. A conflict could arise if, after the panel has been appointed, a candidate changes employer and is employed by the employer of one of the assessors.

Generally speaking, merely having an acquaintance with a candidate is not considered to be a conflict. However, if an assessor considers that the relationship is more than an acquaintance (the candidate may have previously reported directly to the assessor) then the assessor is encouraged to declare the situation. Some examples of past conflicts of interest include:

The candidate’s employer and the assessor’s employer were taking each other to court in legal action (although neither the assessor nor the candidate was involved in the litigation);

An assessor had previously dismissed a candidate from an earlier employment relationship;

Candidate and assessor were neighbours;

A candidate and an assessor had previously had a strong disagreement on a professional matter.

7.2 CONFIDENTIALITY

You must not discuss any aspects of the candidate you have assessed (including the outcomes of that assessment) with anyone other than IPENZ personnel, your co-assessor(s), CAB members and the Registrar.

You also need to take care to protect the confidentiality of information provided to you by referees.

7.3 TIMELINESS

IPENZ has set the expectation that assessments are normally completed within 12 weeks of receipt of the candidate's completed portfolio of evidence. While the Staff Assessor is contracted to manage the process to meet the contracted deadlines, all assessors share responsibility for contributing to the assessment in a timely manner. If there is likely to be any delay, the candidate should be informed as soon as possible of the delay and the expected completion time. Surveys of candidates show that nothing is more frustrating than hearing nothing about the progress of an assessment because assessors have not provided feedback on delays.


8. YOUR APPROACH TO CANDIDATES

You must remain courteous and professional at all times during the process, even in difficult circumstances. You need to approach all applications objectively, giving the candidate every opportunity to demonstrate he/she meets the standard. However, you do not have to tolerate rude or aggressive candidates. In this situation, you should do the following:

Make it plain to the candidate that the behaviour is unacceptable and ask them to stop

If it continues, terminate the conversation

Make full notes of what happened immediately, noting the date and time

Report the situation to IPENZ

You must be extremely careful not to say anything that will bring the assessment process or quality marks into disrepute.

9. GIVING ADVICE TO CANDIDATES

Your role is as an assessor. It’s not appropriate to use the CA07 report to give ‘helpful advice’ to the candidate. However, you can give sensible advice as part of the feedback process. For example, if you see a glaring gap in the evidence, it’s acceptable to suggest ways to remedy the gap or omission – but not in the CA07 report to the CAB.

10. BEST PRACTICE FOR MANAGING ASSESSMENTS

The following checklist for Staff Assessors has been developed by senior assessors, who found the practice it describes to be very efficient and effective in providing the Competency Assessment Board with full documentation – especially when an interactive assessment is involved. These are guidelines and don’t have to be followed slavishly, but the checklist is an approach others have found helpful in consistently getting good reports to the CAB on time with few ‘refer backs’.


Item 1
a. On receipt of the portfolio of evidence, email/phone the candidate to acknowledge receipt of the documents and introduce yourself
b. Update the assessor log

Item 2 – Email co-assessor(s) to:
a. ask if he/she has received the papers
b. brief the co-assessor(s) on your expectations of them – are they happy with your timelines and target dates?
c. identify if more information is required (refer to ‘triggers’ on page 6)
d. set times, prior to the interactive, for a pre-interactive meeting or discussion
e. update the assessor log

Item 3 – Ring the candidate to:
a. introduce yourself and explain the process
b. confirm dates of the interactive assessment etc.
c. advise the candidate of any elements where the panel wants additional information or work samples (see information on page 31)
d. invite the candidate to prepare a brief summary (10 minutes) of the key evidence that shows he/she meets the standard (i.e. in preparation for speaking to the portfolio of evidence)
e. advise the candidate how the interactive assessment will be run, so that the candidate is clear on what is expected of him or her
f. update the assessor log

Item 4
a. Prepare a draft CA07 report based on initial evaluation of the portfolio of evidence, listing issues to be resolved at the interactive assessment
b. Advise co-assessor(s) that the draft report is ready to be reviewed
c. Update the assessor log

Item 5 – Meet with (or contact) the Practice Area Assessor(s), say two weeks prior to the interactive, to:
a. review the draft report and identified issues
b. discuss issues raised and decide how to probe further at the interactive
c. draft questions based on elements of concern (refer to the ‘Question Bank’ on page 25 for guidance)
d. decide if a written assignment is required; if so, develop an appropriate test for the elements of interest (refer to CAB advice on ‘Written Assignments’ on page 29)
e. update the assessor log

Item 6 – Conduct the interactive assessment:
a. Use the pre-prepared questions (see Item 5) as a basis for probing and clarification
b. Confirm the candidate’s role in the various projects submitted
c. Check the candidate's understanding and application of the code of ethics
d. Agree any changes to the practice area description (see page 50 for further information)

Item 7 – After the interactive:
a. If a written assignment is required, brief the candidate on the purpose of the assignment and the panel’s expected outcomes (refer to CAB advice on ‘Written Assignments’ on page 29)
b. Decide what issues need to be addressed with referees
c. Decide who will contact referees and the approach to be taken to avoid unproductive discussion or feedback from the panel (refer to page 34)
d. Set a timeline for the process to complete the assessment report
e. Inform the candidate of next steps: if the panel still has issues with any element, advise the candidate and seek his or her help in obtaining further evidence to complete the assessment
f. Update the assessor log

Item 8
a. Give the candidate feedback – indicate what steps the panel plans to take, what it is likely to recommend to the Competence Assessment Board and a timeline through to the CAB meeting (refer to page 28)
b. Update the assessor log

Item 9
a. If referee contacts and/or additional information from the candidate have not resolved earlier problems (or have raised new issues), consult with co-assessor(s) to determine next steps
b. Update the assessor log

Item 10
a. Complete the CA07, adding ‘post-interactive’ observations (as appropriate) and inputs from referees. Retain any pre-interactive assessment notes and questions, and record any post-interactive assessment notes under appropriate headings (such as ‘Prior to interactive assessment’ and ‘After interactive assessment’) to show the CAB the issues the panel needed to clarify with the candidate at the interactive assessment and how doing so helped resolve them
b. Submit to the CAB via the assessor portal sign-off process (refer to page 57 for details)
c. Update the assessor log

11. ASSESSOR PORTAL – AN INTRODUCTION

These guidelines follow the steps through an assessment and comment on how the panel's findings might be recorded in its report to the Competency Assessment Board.

PLEASE NOTE: SCREEN IMAGES IN THIS SECTION MAY NOT ACCURATELY REPRESENT ACTUAL CURRENT SCREEN IMAGES.

12. ACCESSING ASSESSOR PORTAL

Access to the assessor portal is via the link http://www.ipenz.org.nz/ipenz/members/assessors/ or via the ‘Members’ Area’ of the IPENZ website.

Once you enter the IPENZ Members’ only area (you must be registered to access it), click on the link to the ‘Assessors area’.

On entry to the ‘Assessors area’, you will see a set of folders down the left-hand side of the screen – one for each candidate you are currently assessing – and a table in the centre of the screen listing the candidates, assessors (centre column) and the status of the CA07 report (right-most column).


13. ASSESSOR PORTAL FAQS

13.1 WHEN CAN ASSESSORS ACCESS ASSESSOR PORTAL?

Assessors can access candidate information and the CA07 report as soon as National Office appoints them to a panel. If the candidate uses the IPENZ on-line system for tracking CPD, assessors will also have immediate and automatic access to his/her CPD records during their time on the panel. Similarly, if the candidate used the on-line referee request system, the referee reports will be immediately available to the assessors.

National Office staff also pre-load other relevant information such as the last assessment report, work samples and other documents relevant to the assessment.

13.2 CAN ASSESSORS USE ‘TRACK CHANGES’ ON ASSESSOR PORTAL?

No. Assessors can complete the CA07 form on-line on a page-by-page basis, with each page being updated as the user moves from one page to the next. As the user moves from a page, the information (with or without change) is written back to the assessor portal. This mode of operation means it is not possible to ‘track changes’ in a way familiar to Word users.

Assessors are therefore encouraged to draft the body of their report off-line in Word or another word-processing package, making full use of the word processor’s facilities (spell checking etc.), and, when the report is ready, go on-line and paste the text into the assessor portal.

Any member of the panel can access and edit the CA07 report. Once the panel is happy with the report, it must use the “sign-off” process (the electronic equivalent of physically signing the report) to submit the report to National Office.


13.3 CAN ASSESSORS ACCESS REFERENCE MATERIAL VIA ASSESSOR PORTAL?

Assessors can access the guidelines and other reference material easily from the assessor portal by clicking on the link called ‘Reference Material’.

13.4 CAN ASSESSORS ARCHIVE CANDIDATE’S DOCUMENTS ON ASSESSOR PORTAL?

No. Once the CA07 form has been submitted to National Office and staff prepare the report for the Competency Assessment Board, all of the candidate’s information will disappear from the assessors’ view. It will re-appear only if the CAB refers the assessment back to the panel for some reason.

13.5 WHAT CANDIDATE INFORMATION CAN ASSESSORS ACCESS?

Assessors can access a candidate’s documents and CA07 report by clicking on the candidate’s name in the list on the left side of the screen on entry to the assessor portal; links to the other material will then be displayed.

The following documents will be displayed.

a. CPD – the candidate’s CPD records, if the candidate has used the IPENZ on-line system for tracking CPD

b. Referees – if the candidate used the on-line ‘Referee Request’ system, the referee reports are accessed via this link. If the link does not exist, either the candidate did not use the on-line request system or the referees have not yet completed their reports.

c. Uploaded documents – the report for the candidate’s last assessment, which is normally uploaded by National Office staff. If referees emailed or sent in hard-copy reports, these are also accessed via this link. Other relevant uploaded documents may also be found here.


d. Assessor log – access to the assessor working log and the ‘quick log’ (see page 16 for details) for this candidate

e. CA07 report – the CA07 (in writeable form if it has not yet been signed off). Once signed off, this link will only allow access to the ‘print version’ of the CA07.

f. CA07 Print Version - access to the CA07 form in a format suitable for copying into a Word document. This is the same format that is used for presenting CA07 reports to the Competency Assessment Board.

g. Work samples – uploaded unless they are extremely large files.

The screen also displays the candidate’s application history. Clicking on the “View Details” link will display the history of this candidate’s application or assessment.

14. ASSESSORS LOG

The Assessors’ log consists of two parts:

a. The Assessors’ Working Log; and

b. The ‘Quick Log’.

Note that the Assessors’ Working Log is a 'rich form' and requires the Flash browser plug-in, which you can download from http://get.adobe.com/flashplayer/

14.1 ASSESSORS’ WORKING LOG

The Assessors’ Working Log allows assessors to track their actions as the assessment progresses. The candidate cannot access this log as it is for recording all contact with the candidate or other parties involved in the process (National Office, referees, CAB members etc) and actions taken during the assessment process. These records are not sent to the Competency Assessment Board but remain on file as a record of the assessment and would be vital evidence if there is any scrutiny of the assessment process later (such as during an appeal or procedural review).


There are two fields for input – the first notes an action (or a request, such as for further information) and the second records the result (if any) of that action, with scope for giving different dates for the action and the result – but both must be entered at the same time.

14.2 THE QUICK LOG

The ‘Quick Log’ has been added so that the candidate can track the progress of his or her assessment. The Quick Log uses a simple drop-down box with a pre-determined range of options that the panel can select to reflect progress. The purpose of this facility is to provide another option for candidates to monitor progress – and assessors are asked to keep it current. While it is no substitute for an email or telephone call to the candidate, it is another means of communication with the candidate.
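For those who find a data sketch helpful, the two logs can be pictured as simple records. The sketch below is illustrative only – the class names, field names and status values are assumptions, not the portal’s actual data model – but it mirrors the structure described above: a working-log entry pairs an action with an optional result, each carrying its own date but entered together, while the Quick Log is a single status chosen from a pre-determined list.

```python
# Illustrative sketch only: names and status values are assumed, not the
# assessor portal's actual data model.
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional


class QuickLogStatus(Enum):
    """Hypothetical pre-determined options for the candidate-visible Quick Log."""
    PORTFOLIO_RECEIVED = "Portfolio received"
    INTERACTIVE_SCHEDULED = "Interactive assessment scheduled"
    AWAITING_REFEREES = "Awaiting referee reports"
    REPORT_WITH_CAB = "Report submitted to CAB"


@dataclass
class WorkingLogEntry:
    """One row of the assessors-only working log.

    The action and its result may carry different dates, but both fields
    are entered in the same sitting, as described above.
    """
    action: str                   # e.g. a request for further information
    action_date: date
    result: Optional[str] = None  # outcome of the action, if any yet
    result_date: Optional[date] = None


# Example: record a request and its later outcome in a single entry, then
# update the candidate-visible status.
entry = WorkingLogEntry(
    action="Requested further work samples for element 2",
    action_date=date(2014, 6, 2),
    result="Samples received and circulated to panel",
    result_date=date(2014, 6, 16),
)
status = QuickLogStatus.AWAITING_REFEREES
print(entry, status.value, sep="\n")
```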

15. COMPLETING AN ASSESSMENT

The Rules prescribe the assessment process the panel must follow in reaching a conclusion, and the panel’s conclusion is conveyed to the CAB as recommendations (with reasons) using the CA07 report.

Preparation of the CA07 report is the outcome of a process with the following steps.

Step 1 • Determine relevant competence standard
Step 2 • Use assessment tools
Step 3 • Evaluate and validate evidence
Step 4 • Validate practice area and practice field
Step 5 • Prepare recommendations
Step 6 • Report sign-off

The guidelines will provide more detail on each step of the process.

16. DETERMINE RELEVANT COMPETENCE STANDARD

Step 1 • Determine relevant competence standard

The portfolio of evidence for assessment is similar for both AFA and CRA assessments, although the volume of evidence for an AFA is expected to be greater than for a CRA.


NOTE 1 – SET COMPETENCE STANDARD(S) TO BE USED

This option is only available for AFAs. Assessors can specify if the assessment is for a professional engineer, an engineering technologist or an engineering technician. If only one level is selected, then the elements of competence for only one standard will be shown and panels can only make an assessment against that standard.

The panel should initially select registers at the level the candidate has applied for. Then, if the initial assessment finds the candidate competent at a different level of complexity to that applied for (for example, an ETPract candidate practising at either CPEng level or CertETn level), it would tick an additional box reflecting the standard for that level as a ‘relevant register’ (refer to the detailed description of relevant registers on page 54). Thus, in the element-by-element analysis, two standards will be displayed and the panel will be asked to consider them both.

NOTE 2 – CANDIDATE’S PRACTICE AREA DESCRIPTION

The practice area description provided by the candidate is listed here – more about practice area description later (see page 50).

NOTE 3 – TYPE OF ASSESSMENT

The assessor portal ‘knows’ the type of assessment being carried out and makes certain assumptions on that basis. If the assessment is an AFA, the panel will be required to complete an element-by-element assessment (Section A), whereas for CRAs, completing the element-by-element analysis is optional (subject to the responses to the ‘three CRA questions’ – see Note 10 on page 36).

With CRAs, the CA07 form also includes the term to the current assessment, as set by the assessment panel from the previous assessment.

NOTE 4 – CONVICTIONS

Assessment panels should always check the candidate’s submission for convictions. If any candidate has a conviction, National Office staff will obtain advice from the Competence Assessment Board before referring the documents to the assessment panel. If there is no accompanying guidance from the CAB, please contact the Registrar.


NOTE 5 – BENCHMARKING QUALIFICATIONS

National Office will benchmark qualifications against one of the three international Accord agreements, and the result is usually displayed on the CA07 form after the qualification. This evaluation is particularly important for the international registers – IntPE and IntET.

Assessment panels should contact National Office if any qualification is listed without a record of how it has been benchmarked (other than non-engineering degrees, which cannot be benchmarked).

17. USE OF ASSESSMENT TOOLS

Step 2 • Use assessment tools

For an AFA, the Rules/Regulations require use of the following assessment tools (refer Rule 11/Regulation 12):

a. A preliminary evaluation (to determine if additional information is required to complete the assessment)

b. An interactive assessment (unless the panel thinks it unnecessary – although good reasons would be required to convince the CAB that no interactive assessment was required)

c. A written assignment (unless the panel thinks it unnecessary)

d. An assessment of the candidate’s knowledge (unless the panel thinks it unnecessary)

e. Referee inputs – candidates must supply contact details of two independent referees, but the rules/regulations are silent on how the panel is to use the information.

For a CRA, the Rules/Regulations require use of the following assessment tools (refer Rule 25/Regulation 26):

a. A preliminary evaluation (to determine if additional information is required to complete the assessment)

b. An interactive assessment

c. A written assignment (if the panel thinks it necessary)

d. Referee inputs – candidates must supply contact details of two independent referees, but the rules/regulations are silent on how the panel is to use the information.

There is provision with both AFAs and CRAs for the assessment panel to repeat these steps or combine them or take additional steps to carry out the assessment.
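As a summary of the two lists above, the required tools and their let-outs can be laid out side by side. The sketch below is an illustrative restatement of the requirements as described above (refer Rule 11/Regulation 12 and Rule 25/Regulation 26), not an authoritative encoding of the Rules/Regulations; the structure and wording are assumptions made for illustration.

```python
# Illustrative restatement of the assessment-tool requirements described
# above. Not an authoritative encoding of the Rules/Regulations.
ASSESSMENT_TOOLS = {
    "AFA": {  # refer Rule 11 / Regulation 12
        "preliminary evaluation": "required",
        "interactive assessment": "required unless the panel thinks it "
                                  "unnecessary (good reasons needed for CAB)",
        "written assignment": "required unless the panel thinks it unnecessary",
        "knowledge assessment": "required unless the panel thinks it unnecessary",
        "referee inputs": "two independent referees; use of the information "
                          "is at the panel's discretion",
    },
    "CRA": {  # refer Rule 25 / Regulation 26
        "preliminary evaluation": "required",
        "interactive assessment": "required",
        "written assignment": "only if the panel thinks it necessary",
        "knowledge assessment": None,  # not listed among the CRA tools
        "referee inputs": "two independent referees; use of the information "
                          "is at the panel's discretion",
    },
}

# Steps may be repeated, combined or supplemented for either assessment type.
for tool, requirement in ASSESSMENT_TOOLS["CRA"].items():
    print(f"{tool}: {requirement or 'not prescribed'}")
```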

18. PRELIMINARY EVALUATION

The preliminary evaluation involves the assessors reading the documentation submitted by the candidate and referees to decide whether more evidence is required and which assessment tools will be used to complete the assessment.



In reviewing this material assessors need to consider the following:

a. What information (or lack of it) sets the scene for the interactive assessment?

b. What issues/competencies/elements does the documentation raise that need to be explored further? (Refer to the section on ‘triggers for assessors’ on page 6.)

c. Does the candidate need to submit additional information (refer to Note 7 on page 31) or undertake a written assignment (refer to page 29) to address these matters?

All panel members need to participate in making decisions on these matters, as this must be a panel decision.

The panel may use assessment tools in any order. If, for example, it considers the evidence of analytical skills is weak, it may require a written assignment (to test analytical skills) first, then explore further with the candidate at the interactive. After the interactive, it may require the candidate to undertake a further written assignment (or submit further information or additional referees).

19. KNOWLEDGE ASSESSMENT

The knowledge assessment is intended primarily for assessing the knowledge element (primarily element 1) of candidates applying for professional engineer quality marks (CPEng, MIPENZ or IntPE(NZ)) who do not have a formal Washington Accord qualification (or recognised equivalent); it assesses whether or not they have knowledge equivalent to the level required for a Washington Accord degree. Successful completion of a knowledge assessment is not a qualification and does not meet the qualification entry requirements for the international registers.

The IPENZ knowledge assessment involves appointment of a specialist assessor to the assessment panel. The panel may determine how it wishes to conduct the knowledge assessment: whether all or some panel members participate, whether it is done at the start of the process or partway through, and whether it takes the form of an examination or an interactive assessment.

Currently, the practice is to require all non-Washington Accord degree qualified persons to prepare and pay for a knowledge assessment on application. The Rules/Regulations require a knowledge assessment (unless the assessment panel considers it unnecessary), so National Office staff will appoint the Knowledge Assessor and forward copies of the candidate’s portfolio of evidence.

Normally the Knowledge Assessor will conduct the knowledge assessment separately as a first step of the process so that the knowledge assessment report is available for the interactive assessment. The knowledge assessment usually involves some form of interactive assessment – usually done by teleconference or videoconference for logistical reasons.

The Knowledge Assessor is an assessor and must be party to the panel’s deliberations and recommendations, even if not directly involved in all steps of the assessment process. The Staff Assessor is responsible for co-ordinating knowledge assessment inputs and ensuring the full panel considers all such information in making its decision.

20. INTERACTIVE ASSESSMENT

The interactive assessment should be more like a ‘professional conversation’ which is ‘candidate led’ – where the candidate is given the opportunity to illustrate how his/her portfolio of evidence demonstrates he/she meets the competence standard. Some candidates are comfortable in this situation and have little difficulty in presenting themselves well to the assessors. However, others, even with strong evidence, find it stressful. The assessors’ role is to:

1. create a relaxed environment so that candidates are able to focus on the evidence that demonstrates their competence; and

2. assess the evidence – seeking clarification and probing as required, but remember the interactive is not an interrogation. The panel should record critical evidence derived from the interactive assessment – be it confirmatory in nature or otherwise.

3. operate on a ‘no surprises’ basis. If the assessment panel identifies an issue of concern during the preliminary review of the candidate’s documents, the candidate should be informed of this (referencing the relevant elements) in sufficient time to prepare before the interactive assessment.

20.1 USE OF INTERACTIVE ASSESSMENT

The value of the interactive assessment was tested in the first appeal against a decision not to register a candidate. The case involved a mature practitioner who had previously demonstrated competence successfully as a Professional Member; however, due to poor evidence in some key areas, the panel required an interactive assessment. The Appeals Panel wanted evidence to show that the interactive assessment provided useful information for the assessment and that it was not an unfair imposition on the candidate, who incurred significant costs in travelling to it. It was important that the panel had documented how the interactive assessment had yielded evidence that was used in making the panel’s decision.

When documenting its findings, an assessment panel needs to record how the interactive assessment contributed to its final decision.

20.2 INITIAL CONTACT

The first verbal contact with the candidate should be a phone call from the Staff Assessor to introduce him or herself and explain the process. Staff Assessors will already have received and reviewed the application documentation in conjunction with the Practice Area Assessor. This is an opportunity to:

Advise the candidate of any elements of the standard about which the panel has reservations and requires further evidence. Providing further evidence may involve the use of a written assignment or further referees;

Give the candidate the opportunity during this telephone conversation to ask questions about the process or anything else he or she wants to know;

Ensure the candidate is aware of and prepared to give a summary of why he or she thinks his/her portfolio of evidence demonstrates the required level of competence. Make sure the candidate is aware of the panel’s expectations.

20.3 SETTING UP VENUE PRIOR TO INTERACTIVE ASSESSMENT

Before you start the interactive assessment, ensure the room layout is conducive to a friendly and non-threatening interaction - for example the seating layout. With two assessors you might consider it more relaxing to arrange the seating in a triangle or around a round table so the candidate feels more included. Avoid the simple and obvious symbols of ‘status’ or ‘power’ – such as having assessors seated on one side of a table opposite the candidate or having the assessors with a window as a source of bright light behind them.


20.4 AT INTERACTIVE ASSESSMENT

After introducing yourself and your co-assessor(s) and briefly outlining your roles, spend some time ‘breaking the ice’ before launching into the assessment proper.

How much time is spent doing this depends on the person and the situation.

Some people are very relaxed and confident and are ready more quickly than others, who may be anxious, unsure or apprehensive and take more time to ‘settle down’. It is common for people to feel this way, so assessors need strategies to make the candidate feel at ease.

Some candidates are happy to launch into a well-prepared presentation. However, sometimes they may be unsure of how or where to start – use one of the following strategies to assist the candidate in getting started:

Get the candidate to talk through his or her work history, including his or her background and experience in the industry

Ask the candidate to select some material from his or her portfolio and talk you through it.

It’s critical not to turn the assessment into a question and answer session as, if the assessor takes control from the candidate, the momentum of the discussion is lost.

Your role as an assessor is to facilitate the conversation so that the candidate produces the information you require to make a recommendation regarding his or her application. Three of the most important techniques for doing this are

o building rapport,

o listening and

o questioning.

Refer to the following guidelines on these techniques.

To ensure that you run to time, avoid conducting the interactive assessment on an element-by-element basis. Experience has shown that a more productive way is to have the candidate summarise his/her evidence first – candidates commonly cover two or three elements in describing one piece of work. If the candidate leaves gaps after the presentation, then focus on the areas where evidence is weak.

Remember: as a general rule - the more the candidate talks, the more the panel is likely to obtain the information it needs.

20.5 BUILDING RAPPORT

People get on with people who are like them. It is therefore important to build rapport with your candidate in order to communicate effectively with them, and for them to give you the information that you need. Below are some tips to help you do this:

Right from initial contact try to find something in common – you’ve had their projects and other information – there must be something – school, university, CPD course, travel, type of work – anything will do

Match body language, vocabulary, metaphor, pace, tone, breathing

Validate something they have said; you don’t have to sell your soul.

Match the level of emotional connection. Are they polite, sharing facts and information, discussing emotions, or being intimate?


Most people don’t listen - they just take turns to speak. We speak at about 75-100 words per minute, but can listen at about 600-700 words per minute. A good listener is able to pick up more information more readily – to assist you in information gathering:

Listen for senses candidates prefer: audio, visual or kinaesthetic.

Listen for their emotions: what is driving them at this moment – it’ll usually be fear or anxiety

Listen for pace, vocal tone, body language, facial expressions and values/beliefs.

Stay positive.

It is important that you remain positive and professional, as the candidate will feel more comfortable and be more likely to share the information you need. Therefore try the following:

Use positive language, such as “yes, that can be a challenge” rather than “that shouldn’t happen”;

Find some merit in the candidate’s evidence or background – there’s always something you can praise;

Be aware of your own emotions and prejudices. Are you tired, stressed or just plain irritated? Are you finding it difficult to understand someone who does not speak English well? Remember this is not about you and the candidate has a huge amount invested in this process.

Acknowledge status

Status is about our standing compared with others. Status anxiety is alive and well and it doesn’t look like abating. Remember this is not about your power and control – be aware that people may have equal status to you in a wide range of areas. You can acknowledge status by matching levels of formality and forms of address and acknowledging status in:

education

specific technical and business skills

thinking or strategic skills

connections

moral standing

social skills

life experience

emotional insight

The interaction is more like a dance than a game of tennis or a boxing match (with you as the heavyweight). Use your interpersonal skills to guide the candidate through the areas of interest to the panel – but let the candidate feel that he or she is ‘in control’. For example:

Smile. Just because you’re doing an assessment doesn’t mean you have to remain po-faced for the whole time.

A sense of humour, used appropriately, goes a long way in building relationships and putting people at ease.

20.6 QUESTIONING TECHNIQUES

In the process of gathering evidence you will need to ask questions.


You must ensure your questions are relevant and do not confuse the candidate or cause undue stress;

Your questions should help the candidate in the assessment process and help to keep the interactive assessment on track.

There are two main types of question you can use:

Open questions give the candidate the opportunity to ‘open up’ and respond in more detail. Open questions usually begin with words like: what, why, when, where, who, how, tell me.

Closed questions usually get short, specific answers. Closed questions usually begin with words like: have you, are you, will you, did you?

Open – Exploring
Example: ‘What made you decide to do it that way?’
Use: Encourages the candidate to explain and explore in more detail

Open – Hypothetical
Example: ‘What would you do if…?’
Use: Allows the candidate to show the ability to transfer competence across a range of situations

Closed – Yes/No
Example: ‘Did you use council guidelines?’
Use: For quickly checking facts

Closed – Specific
Example: ‘Tell me about the contractors involved in this project.’
Use: For specific information

20.7 WHAT MAKES AN EFFECTIVE QUESTION?

An effective question:

is short

has one idea

is relevant

creates interest

emphasises a key point or checks understanding

20.8 INTERACTIVE ASSESSMENT QUESTION BANK

The following questions will assist assessors with the interactive assessment of candidates. These questions are not mandatory or exclusive – regard them as prompts only. Assessors should use their professional judgement to decide which questions they will ask and when.

Why did you choose this project?


Have you ever had to say ‘no’ to a client? If so, can you describe how you prepared for it and what the outcome was?

What factors contributed to the complexity of this project?

What were the major challenges for you?

How do you know the project was successful?

What sorts of quality assurance processes did you use in this project?

What risk management approach did you apply? What were the risks associated with this project?

What sorts of resources did you require to complete the project?

How did you organise and manage these resources?

What sorts of environmental and social impacts did you have to consider on this project?

Who were the specialist consultants and contractors on this project?

How did you go about working with these people?

How did you build and maintain relationships with the client on this project?

How did you know the client’s requirements were met?

What sorts of conflicts did you come across during the project and how did you deal with these?

What sort of administration work was involved in this project?

What contract documentation were you personally responsible for?

20.9 ACTIVE LISTENING TECHNIQUES

Leave your ego at the door – this is not about you

Acknowledge your biases to yourself

Give undivided attention and show interest

Let the candidate set the pace

Watch for non-verbal cues. Use your eyes as well as your ears for listening

Use words and actions that affirm the candidate without necessarily expressing agreement

Use attentive posture; comfortable eye contact; and gestures, expressions, and intensity that match the candidate's

Feel comfortable about taking notes, but don’t lose eye contact with the candidate

Listen with the intent to understand, not to reply

Respond appropriately

20.10 CLOSING INTERACTIVE ASSESSMENT

It is important to close the assessment with a discussion on what will happen next.

Go through all the competencies and review whether you have obtained sufficient evidence to meet their requirements. You may also want to use the evidence checklists for this. It’s useful to do this aloud so the candidate is aware of your thinking.

If, after this review, there are any further questions or information you require, ask for them now.


If the candidate has produced all the information required, you should advise him or her that the recommendation being made to the Competency Assessment Board will be to approve the candidate’s application. You can indicate this by saying something like:

We are feeling very positive about the evidence you’ve produced. At this point we need to contact your referees and if no issues are raised there, we will be recommending that CAB approve your application. We will update the assessor log so that you can monitor progress via the Member’s area of the IPENZ website. Once the CAB has made its decision, you will be contacted by the IPENZ National Office with formal notification of the outcome.

If the candidate still needs to produce information or there is something else you need to check before you can make your recommendation, you should give clear feedback about what is required and discuss timeframes.

We’ve identified that there are some gaps in your evidence and would appreciate your assistance in providing us with more information in relation to element(s) {specify the elements of concern}. Can you provide us with this information by {date – say one or two weeks out}? I’ll follow up this request by email so that we all have a record of this request. We’ll also need to contact your referees. We plan to get a recommendation to the CAB by {date}, so we would appreciate your co-operation in meeting this target. We will update the assessor log so that you can monitor progress via the Member’s area of the IPENZ website. You’ll be contacted by the IPENZ National Office staff once CAB has made its decision on your application.

Other closing comments:

We have now got all the information we need to make a recommendation to the CAB regarding approval of your application. We will update the assessor log so that you can monitor progress via the Member’s area of the IPENZ website

Is there anything else you would like to add at this stage in support of your application?

Do you have any questions for us about the assessment or what is going to happen from here?

How did you find the assessment process?

Thank you very much for your time.

21. PROFESSIONAL JUDGEMENT

While these guidelines provide information that informs the decision-making, there will be occasions when the application doesn’t entirely fit the guidelines. This is where you need to exercise professional judgement. Some situations where this may happen include:

the jobs don’t meet currency requirements but the candidate has worked in the industry continuously in progressive roles and/or owns/manages a company and is intimately involved in the day to day management of jobs;

the answers to the questions are not entirely conventional but meet the intent of the questions;

the candidate is a tutor/knowledge leader who has practised in the past but is now teaching.

If you make a decision that varies from the requirements of the guidelines you must do the following:


make full notes that include an explanation of why you made this decision;

submit your decision for peer review/moderation by IPENZ.

21.1 GIVING FEEDBACK

Feedback is a critical part of the assessment process. Even when you are not in a position to divulge your recommendations, you still need to give feedback to the candidate.

Skilled people make feedback a positive experience, leaving everyone feeling valued, even if the feedback itself is difficult or negative. If feedback is delivered badly, or not at all, the impact can be demoralising and long-lasting.

Whether the message you intend to give is positive or negative, the skill you use to give it will still affect the impact you have. A message you intend to be positive can demoralise someone, if they walk away feeling confused. A tough message about poor performance can leave a person feeling supported and motivated if you deliver it with skill.

Provide an indication of timing. Tell the candidate the steps the panel proposes to take to complete its assessment (contact referees, receive additional information etc) and when it expects to submit its report to the CAB. This will help manage candidate expectations.

Top Tips For Giving Effective Feedback

Prepare! Be clear about what you want to say - and why - before you start. Feel free to take a few minutes to gather your thoughts after the interactive assessment.

Think of specific examples - what someone is getting right as well as what is wrong. Make sure your examples are detailed, recent, accurate and relevant.

Be clear and specific.

Never make jokes or remarks that could offend.

Make feedback a two-way conversation, not a speech.

Be ready for resistance - they may not agree. Avoid arguing or giving up - use more examples to illustrate what you mean.

If someone doesn't like your feedback, don’t take it personally - this is an evidence issue, not about you as a person.

Watch your own airtime. Use pauses. Give the candidate time and space to think.

Find a way to end on a positive. Sum up any agreements you’ve made, or thank the candidate for his or her time and attention.

Feedback is information - not an instruction list. Try not to get into a debate - it is better to leave than argue.

21.2 AFTER COMPLETION OF ASSESSMENT

The panel’s goal in an interactive assessment is to be able to make a decision on the candidate’s application. However, how should the candidate feel when he or she leaves the room? Here are some suggestions. The candidate should:

Leave with a positive feeling about the assessment process – irrespective of the outcome. If the panel’s finding is that the candidate did not achieve the level of competence applied for, the candidate should feel he or she was treated with respect and was given a fair hearing.


Feel the process has been a beneficial learning experience – combining self-reflection with objective input from neutral observers (i.e. assessors) on areas of strength and weakness.

Be able to set positive and constructive goals for the future – either for areas to be developed for a successful application next time (if competence was not achieved) or, if competence has been achieved, on-going development for the next assessment for continued registration.

Also refer to the notes about closing an interactive assessment on page 26.

22. WRITTEN ASSIGNMENT

The written assignment is not (and should not be) limited to an essay. The controlled written assignment can be used at any stage through the assessment – for example, it could be used by assessment panels prior to an interactive to provide information for the interactive assessment. Or it may be used after the interactive as a means of addressing shortcomings identified in the interactive.

Types of controlled written assignment that could be used are:

a. Case studies – to test analytical skills, decision-making and judgement, risk management;

b. Work-simulations – to test analytical skills, selection of options;

c. A work-place based task – where the candidate is given a topic and asked to complete it in a period of time (possibly even days) to test analytical skills, communication skills and/or judgement. If this option is used, the candidate should be asked to provide a statement of the persons the candidate has contacted in the course of performing the work;

d. An essay – to test communication skills; or

e. An examination – to test knowledge; or

f. A combination of the above.

This is not an exhaustive list but a summary of the types of controlled written assignment available to a panel. The panel needs to take care that it selects a form of controlled written assignment that will provide a valid test of evidence for the competence element(s) in question.

The Practice Area Assessor is most likely the panel member who will set the topic or task for the written assignment but the Staff Assessor should ensure that the written assignment is:

a. appropriate for and relevant to the candidate’s practice area; and

b. relevant to the elements of the competence standard being tested.

22.1 USE OF WRITTEN ASSIGNMENT

CAB has noted panels do not always make best use of the written assignment to gather and/or validate evidence of the candidate’s competence. Traditionally the essay (a form of written assignment) was carried out in a ‘closed book’ environment in the afternoon after the interactive assessment.

In today’s competence-based process, the written assignment is most useful if it precedes the interactive assessment (or a further interactive assessment). The written assignment can be a source of evidence to address any gaps identified in the portfolio of evidence and the subsequent interactive assessment can be used to validate the evidence.


The CAB offers the following advice to panels when requiring a candidate to undertake a written assignment. As a matter of good practice, the candidate should be given clear instructions on what the panel expects from the written assignment.

1. The candidate should be told what the panel expects to get from the written assignment – for example, is it a test of written skills? Or a test of analytical or other skills? – referencing the appropriate elements of the competence standard. The panel’s expectation must be aligned to one or more elements of the competence standard (e.g., element 2 if related to jurisdictional good practice, or element 10 if related to ability to communicate clearly).

2. The panel should be explicit about the conditions under which the written assignment is undertaken. If it is a ‘closed book’ exercise, the candidate must be told which resources can or cannot be used – such as existing data on the candidate’s computer, internet access, emails or text information – and what material is acceptable. Will the written assignment be undertaken under supervision? If so, how will this be done?

3. If the written assignment is an ‘open book’ challenge, the candidate should be given directions on how to acknowledge the source of information he/she has used – for example by referencing the sources, declaring resources used and persons consulted with during the controlled written assignment.

4. The candidate should be told how long he/she has to undertake the written assignment, and how the written assignment should be submitted to the panel for review at the end of the assignment.

5. CAB recommends the use of a template which contains full instructions so the candidate can access the information at any time during the written assignment. National Office will prepare a suitable template for such use in the near future.


NOTE 6 – ASSESSMENT TOOLS

Panels must record the assessment tools used (as required by the Rules/regulations) on the front page of the CA07 form.

As the Rules/Regulations require the use of all the assessment tools for an AFA, no reason is required if a panel uses a specific tool, but a reason must be given if a panel decides not to use an assessment tool. The most likely reason for not using a specific assessment tool is that the assessment panel is satisfied that the candidate has already provided adequate evidence to demonstrate competence by other means.

With each particular assessment tool used, the panel is asked to specify which elements it wishes to explore by using that assessment tool (the default is ‘all elements’ if none are specified).

NOTE 7 – REQUESTING ADDITIONAL INFORMATION

When should a panel request further information? If there are obvious gaps in the evidence provided, such as identified by the triggers (see page 6), the panel should ask the candidate for further information.

The Rules (and Regulations) make provision for assessment panels to request additional information and/or the contact details of two additional referees, and require the assessment panel to specify the date by which the candidate is to supply the information. In specifying the date, give the candidate a reasonable time – the test is one of ‘reasonableness’.

The process the assessment panel should follow is:

a. Contact the candidate (preferably by telephone) and outline what information is required, why (i.e. identify the elements that the panel is concerned about), and specify the date the information is due (a reasonable time is typically one to two weeks - unless the panel considers the candidate has good reasons for requiring more time);

b. Follow up the verbal request with an email (or fax) putting the request in writing with a copy to National Office (so that records can be maintained);

c. If nothing is received, follow up with a final notice immediately after the deadline date, advising that the requested information has not been received and that the assessment panel will make a recommendation with the limited information it has unless the requested material is received within 24 hours. If this advice is given verbally, confirm it by email.

d. After the 24 hours has elapsed, complete the assessment (with or without the requested information) and make a recommendation to CAB in accordance with these undertakings. The panel may have no option but to present a negative recommendation if it has not been given sufficient evidence to satisfy itself that the candidate is able to ‘consistently demonstrate competence’ for the elements of concern.


NOTE 8 – CONTACTING REFEREES

The purpose of contacting referees is to validate the candidate’s evidence and/or clarify information you have read or heard during the interactive assessment. The panel should not be influenced by the referee’s views on the candidate’s ability to meet the relevant competence standard – that is a panel decision.

22.2 INITIAL REFEREE CONTACT

Here are some ground rules to assist panels in managing contacts with referees.

a. Prepare your questions before contacting a referee. You need to be very clear what information you’re seeking (ie in relation to which elements of the standard);

b. Only approach referees nominated by the candidate. If you wish to contact someone other than a nominated referee, you must first obtain the candidate’s permission.

c. Always introduce yourself, explain why you are calling;

d. Explain that the evidence provided will not be discussed with the candidate but will be discoverable in the event of an appeal;

e. Advise the referee that you are taking notes including the reasons why and what the information will be used for;

f. Spend some time conversing in general to ‘break the ice’ until the referee is comfortable talking with you.

22.3 OPENING

o Good morning, I’m {your name} and I’m an assessor for {candidate name}’s application for assessment for {quality mark name}. I have your referee’s report here and I’d like to discuss some details of it with you if that’s OK. Is now a suitable time?

22.4 QUESTIONING

In general you will be contacting referees because you want information to confirm the candidate’s competence.

If there is anything you have read or heard that concerns you in respect of the candidate’s competence, this is an opportunity to ask specific questions to clarify the information you have been provided with;

You will need to listen carefully to what the referee says and identify triggers which will prompt you to ask further clarifying questions;

Triggers may include:

o vague responses which do not tell you anything substantial

o comments about work not being up to standard

Your questions need to be relevant to the information you require

If the panel needs input on one (or some) elements, the best approach is to ask the referee to think of a situation where the candidate undertook an activity relevant to the element(s) in question. Ask the referee what the candidate did in that situation. The referee will provide information the panel can use to make a judgement on the candidate’s competence. Avoid asking the referee if he or she thinks the candidate is competent in any specific element, as the referee is not a trained assessor and may apply a judgement that is not consistent with the competence standard. Further, it is the panel’s responsibility to make decisions on the candidate’s competence.


If, for example, the panel was concerned about the candidate’s ability to analyse and investigate complex engineering problems (Element 3), you could ask the referee if he or she could recall a situation where the candidate was given a piece of engineering work requiring the gathering and analysis of data to develop various options. Once an example has been selected, question the referee about how the candidate approached the problem.

a. Did the candidate explore negotiable constraints with the client?

b. How did the candidate scope the work?

c. How did he or she go about collecting data about the problem – did he/she arrange for measurements to be carried out?

d. Did he/she speak to other engineers with specialist engineering knowledge or skills in the area to explore solutions to similar problems?

e. Did he/she search the literature for new knowledge relevant to the problem?

f. How did the candidate test his or her analysis for correctness? Experiments? Prototypes? Computer simulations?

g. Were there any aspects of the problem that contributed to its complexity? What were the consequences of the problem? Were there many stakeholders involved? Were there any conflicting issues?

Based on these inputs, the panel could then make judgements on whether these were things a competent engineer would do.

The referee could be asked the reasons for his or her comments in the referee report – especially if the referee's view was at odds with the view the panel had developed.

Some sample questions

o Please can you describe a project you worked on with the candidate? (Element 6)

o What parts of the project did you work on with the candidate?

o What do you think are the candidate’s key technical skills?

o How effectively did the candidate:

o communicate with you in relation to this project? (Element 10)

o analyse and investigate engineering problems? (Elements 3 and 4)

o manage engineering risk associated with the work? (Element 7)

o In relation to the part of the project you worked on with the candidate, did it go according to plan? (Element 6)

o How did the candidate influence the outcome? (Element 6)

o Would you be happy to work with the candidate on another project?

Please note these are just a guide, as the questions will ‘fall out’ of the assessment and you will develop questions that suit each candidate. Remember – one size doesn’t fit all!

If you have referees who give evidence leading to conflicting views of competence, you will need to do further follow up and you may need to ask for additional referees.

22.5 CLOSING THE CONVERSATION

Thank the referee for his or her time.

Never discuss your thoughts about the candidate;

Never give any indication of the panel’s likely recommendations to the referee.


23. REFEREES AND REPORTS

Panels for AFAs are not normally appointed until all documents have been received – including two referee reports. However, panels for CRAs may be appointed without all referee reports having been received, because CRAs must be completed within a specific calendar year – and the evidence provided may be sufficiently strong that a decision can be made without the referee inputs.

23.1 WHERE ARE REFEREE REPORTS ON ASSESSOR PORTAL?

Referee reports are accessible on the assessor portal but the means by which they are submitted may mean they are accessed differently - refer to page 15 for details on how to access referee reports.

23.2 REFEREE ELIGIBILITY

The Rules or Regulations require candidates to submit contact details of two independent referees who meet the eligibility criteria – that is, they have current competence at a level equivalent to that being applied for.

For CPEng candidates, referees must either be CPEng registered or have CPEng equivalence. ‘CPEng-equivalence‘ is not defined in the Rules, but for the purposes of being a CPEng referee the Registrar applies the following interpretation:

‘CPEng equivalence‘, for the purposes of being a referee, means a qualification or title requiring the same level of competence as that required of a Chartered Professional Engineer. It requires:

a. attainment of competence to the CPEng standard; and

b. reasonable evidence that the competence is current.

As the rule is currently interpreted, to demonstrate CPEng equivalence a person must:

o have undergone a competence assessment to the same standard as CPEng, as evidenced by either:

o registration on the International Professional Engineers Register in any jurisdiction; or

o Professional Membership of IPENZ or an equivalent professional body, or registration which requires a competence assessment meeting the standard implied by the Engineers Mobility Forum and APEC Engineers agreement; and

o provide evidence of currency in the form of proof of:

o having undergone a competence assessment as described above within the last five years; or

o membership of a professional body, or registration, which requires compliance with a code of ethical conduct that includes active participation in CPD to maintain competence and proscribes practising beyond one‘s current competence; and

o being actively engaged in professional engineering activities.

The acceptance of CPEng equivalence will be determined by the Registrar on a case-by-case basis.

23.3 WHEN IS A REFEREE ‘INDEPENDENT’?

The term ‘independent’ is not defined in the Rules but is interpreted as having no vested (financial) interest in the outcome of the assessment. For example, this test can be applied when the two referees come from the same organisation – if a candidate works within a small practice where the two referees were principal shareholders, the panel may consider them not to be ‘independent’. However, if the two referees were managers (i.e. employees) from within the same (larger) organisation, the referees will be unlikely to have any financial interest in the assessment outcome.

23.4 REFEREE INPUTS

CAB requires assessment panels to make contact with referees in the following situations:

a. For all AFAs – at least one referee needs to be contacted;

b. If an assessment panel proposes to decline, suspend or remove registration; and

c. All cases in which a candidate is found to have ‘marginal’ competence in some areas.

CAB requires panels to contact referees in these situations to ensure that all available sources of evidence are used in completing the assessment. It is good practice for assessment panels to make contact with each referee to discuss the information provided, although assessment panels may accept the written referee input when all the other evidence is ‘strong’.

The assessors are entitled to make use of information disclosed by a referee, even if it is not included in the referee’s written evaluation report. Referees are ethically obligated to disclose information affecting the candidate’s competence assessment. Thus, if the referee input is critical evidence influencing the panel’s decision, panels should either request the input in written (signed) form or send the referee a summary note (e.g., by email) seeking the referee’s confirmation of the information before making a recommendation to the Competency Assessment Board.

24. USING INFORMATION FROM REFEREES

The name of the assessor who made contact with the referee(s) needs to be recorded on the CA07 form, along with the specific elements explored. If a panel tried to make contact with a referee but the referee could not be reached, this information should be recorded in the CA07 – particularly if competence was either marginal or not demonstrated.

It is unclear to what extent the law protects referee inputs. However, because IPENZ gives referees assurances that their input is treated as confidential, panels must:

Avoid naming referees;

Avoid quoting referee input verbatim in the CA07 report; and

Avoid making any comment that will identify the referee or his/her input in the CA07.

If a panel gains key information from a referee, it needs to record this in a way that protects the confidentiality of the referee – for example, “based on inputs from referees, the panel considers that {the candidate has not always acted ethically}”. The safest way is to state that it is the panel’s conclusion or decision – and the referee's input has just been part of the input to that decision.

NOTE 9 – ADVICE FROM CAB MEMBER

A panel can seek advice from a Member of the CAB for the purposes of moderation. If it does so, the panel needs to record in the CA07 the name of the CAB member contacted and a brief note on the nature of the advice. Enter ‘Nil’ if no specific advice was given by a CAB member. If a CAB member gives the panel advice, summarise the advice in a single sentence – do not quote the feedback verbatim.


NOTE 10 – ‘THREE CRA QUESTIONS’

On the front page of the CA07 form are 3 important questions only applicable to CRAs. They determine the way the panel can proceed with the assessment – whether to do an element-by-element analysis first (as with an AFA) or proceed directly to the holistic assessment, as required by the Rules (Rule 20) and Regulations (Regulation 21).

Question 1 asks if the panel considers the candidate’s practice area has changed materially. If it has, the assessment panel must proceed with the element-by-element analysis (as if it were doing an AFA) before completing the holistic assessment. The panel may ask the candidate for additional examples (as is required in the competence self-review for an AFA, as in the CA03 form).

Question 2 asks if the panel considers the evidence shows the candidate has taken reasonable steps to maintain the currency of his/her engineering knowledge and skills. This is a critical requirement of the standard for continued registration. If the assessment panel is not certain that this requirement has been met, it must undertake the element-by-element analysis.

Question 3 asks if the assessment panel is convinced that the candidate has provided sufficient evidence to show that he/she continues to consistently demonstrate competence. If convinced, the panel can proceed directly to the holistic assessment and document its findings in Section B. If the panel has any doubts about the candidate’s ability to demonstrate competence, then it must undertake an element-by-element assessment.

If the panel answers ‘No’, ‘Yes’ and ‘Yes’ respectively to these questions, it can proceed directly to Section B (the holistic assessment).
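For panels that find the three-question gate easier to follow as decision logic, the following minimal sketch captures its effect (Python, purely illustrative – the function and argument names are hypothetical and not part of the assessor portal):

    def cra_path(practice_area_changed: bool,
                 currency_maintained: bool,
                 competence_demonstrated: bool) -> str:
        # Illustrative only: which part of the CA07 the panel completes first.
        # 'No', 'Yes', 'Yes' is the only combination that allows the panel
        # to proceed directly to Section B (the holistic assessment).
        if (not practice_area_changed
                and currency_maintained
                and competence_demonstrated):
            return "Section B: holistic assessment"
        # Any other combination requires the element-by-element analysis
        # (Section A) before the holistic assessment, as for an AFA.
        return "Section A: element-by-element analysis, then holistic assessment"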

If a CPEng registrant not currently on the IntPE register seeks IntPE, the panel need only note its view on the candidate’s competence and National Office will complete the assessment by validating the candidate’s qualification requirements. The panel is referred to “Prepare Recommendations” on page 54 for advice on documenting its decisions in relation to the holistic assessment.

25. EVALUATION OF EVIDENCE

25.1 ASSESSING CRAS

Following the Rule changes that took effect in 2012, there has been a greater emphasis placed on the holistic nature of the CRA. The Rule changes made interactive assessments mandatory for CRAs and, as registrants will most likely have developed higher levels of competence since their last assessment, the majority of candidates should be able to readily demonstrate current competence in an initial interactive assessment. For a minority of candidates, a more comprehensive assessment will be required, and this will require a more comprehensive portfolio of evidence (possibly including a competence self-review) and a further interactive assessment.

If the assessment is a CRA then, depending on the answers to the ‘3 CRA questions’, the panel can bypass the element-by-element analysis by selecting the ‘Continue’ button (as per the screen image below) to progress directly to the holistic assessment (go to page 45).

[Screen image: Step 3 – Evaluation of evidence]


Otherwise, for CRAs requiring a more comprehensive portfolio of evidence, the panel would proceed in a similar manner to an AFA.

25.1.1 CRAs Involving Overseas Registrants

There are some 200 CPEng registrants currently practising overseas, and assessment panels are likely to be required to perform CRAs for such candidates. The competence standard has a New Zealand-specific requirement – that the candidate is ‘still able to’ comprehend and apply knowledge of good professional engineering practice that is specific to New Zealand.

This does not require the candidate to reside or practise in New Zealand. However, it does require the candidate to provide evidence of his/her competence to perform good New Zealand-specific engineering practice in his or her practice area.

If the candidate’s practice area involves limited New Zealand-specific engineering knowledge (such as for a software engineer), then minimal evidence will be required. If, however, the candidate’s practice area involves a high level of New Zealand-specific engineering good practice (such as seismic engineering), then evidence will be required to show that he/she is ‘still able to practice competently’ in the New Zealand context, through activities such as:

a. Performing engineering work for New Zealand-based clients;

b. Working in an international company which has New Zealand offices, and he/she participates in New Zealand-based activities;

c. Working in an environment where design codes, standards etc are based on New Zealand standards and codes. In such a situation, the candidate would need to show how he/she is aware of and applies these in the context of the New Zealand regulatory environment (e.g. application of the New Zealand Building Code);

d. Undertaking CPD that includes development and/or training on New Zealand-specific practice. This CPD would normally be evidence in addition to one or more of (a) through (c) above. Note that passive CPD (e.g., private reading) alone is unlikely to provide sufficient evidence to satisfy the requirements for continued registration.

25.2 ASSESSING AFAS AND SELECTED CRAS

For AFAs and CRAs where the panel has decided more extensive evidence is required, the following screen will appear.


NOTE 11 – SECTION A: ELEMENT-BY-ELEMENT ANALYSIS

For all AFAs, and for CRAs where the panel requires it, assessment panels must complete the element-by-element analysis contained in Section A.

The assessor portal requires assessment panels to first make a judgement on competence displayed for each element (i.e., one of ‘consistently demonstrates competence’, ‘demonstrates marginal competence’ etc) for each of the ‘relevant registers’ before seeking justification for those decisions. The competence standard for each element will depend on the “relevant registers” (for AFAs – see “Note 1” on page 19) or the ‘registers under review’ (for CRAs).

NOTE 12 – OPTIONS FOR ELEMENT-BY-ELEMENT ANALYSIS

Assessment panels doing an AFA are required to consider the extent to which competence is demonstrated for each element. The competence standard for the selected competence levels is listed and assessment panels are asked to make their assessment by selecting the appropriate option from the drop-down box. Further details on the justification for each decision follow.

The panels have the following options when assessing each element of the competence standard:

‘Consistently demonstrates competence’ – means that the candidate meets the standard of competence required;

‘Demonstrates marginal competence’ - means the evidence of competence is weak and the assessment panel considers the standard has not been consistently met. There are three situations where this may occur:

o The candidate shows competence but there is some negative evidence – such as occasionally making significant errors through carelessness or inconsistent performance; or

o The candidate is performing competently but the nature of the engineering work is at the threshold of the required level of complexity; or

o The candidate is performing at the appropriate level of complexity but is on the threshold of competence.


‘Not yet demonstrating competence but developing’. The evidence presented shows that the candidate does not yet meet the standard, but the assessment panel considers he/she has the ability to perform to the standard. Typically, BE graduates who have not yet been exposed to the full range of professional engineering activities may not yet have acquired adequate evidence to demonstrate they satisfy the requirements of the element. An assessment panel may decide that, based on the candidate's ability to investigate and analyse engineering problems and his or her commitment to CPD, the candidate will develop this level of competence as he/she encounters appropriate work in the future.

‘Does not demonstrate competence’. The candidate has not provided evidence that clearly shows he or she is performing at the level required, or there may be ‘negative’ evidence which shows he/she does not meet the requirements of the element.

NOTE 13 – JUSTIFICATION OF DECISIONS FOR EACH ELEMENT

Assessment panels must document the reasons for making assessment decisions. This means citing specific examples of evidence that show the candidate meets (or otherwise) the level of competence required by the standard.

It is good practice to make explicit reference to the performance indicators when stating the critical evidence used by the panel. For example, in element 3:

“The candidate developed a prototype model and conducted tests under a range of conditions to test her analysis for correctness.”

Or for Element 7:

“The candidate developed a risk matrix which listed a comprehensive range of factors contributing to risk and safety in construction and operation associated with his design. For each risk item, a strategy was developed to mitigate the effects of the risk. The candidate managed design risks that impacted on people and property.”


Panels should state the factors contributing to the complexity (with respect to the relevant standard) of the engineering work undertaken (or state why the work does not meet the complexity requirements).

25.3 REFERENCING PERFORMANCE INDICATORS

Assessment panels are encouraged to refer to the performance indicators when evaluating critical evidence in the element-by-element analysis. For example, the standard and the performance indicators for element 11 are:

11 Maintain the currency of his or her professional engineering knowledge and skills

• Demonstrates a commitment to extending and developing knowledge and skills

• Participates in education, training, mentoring or other programmes contributing to his/her professional development

• Adapts and updates knowledge base in the course of professional practice

• Demonstrates collaborative involvement with professional engineers (NZ engineers for CPEng assessments)

The following is an example of a panel assessing a candidate with poor CPD in a practice that was both professionally and geographically isolated. The panel’s assessment was that the candidate did not 'consistently demonstrate competence', and it gave the following reason as justification of its decision on element 11 (note the references to the performance indicators):

Element 11: The candidate is aware of his lack of diligence in this area. He advises of his attendance at one seminar (AS/NZS 1170) and claims he looks out for seminars pertinent to his practice area but finds few that are really suitable. His evidence shows poor participation in education, training, mentoring and other programmes contributing to his professional development. He also cites his involvement with other professionals, such as surveyors (re subdivisions) and architects (re building projects), through which the currency of his engineering knowledge was maintained and enhanced. While this interaction has benefits in terms of the linkages with those professions in his daily engineering activities, there is little evidence of a collaborative involvement with professional engineers. His portfolio of evidence shows his commitment to extending and developing his engineering knowledge and skills falls short of that expected of a reasonable professional engineer. While competence can currently be demonstrated, the panel is concerned that, given the isolation of his practice, he may be unaware of any regression in competence – or of his competence dropping below an acceptable standard.

A similar reference to the performance indicators can be used when documenting critical evidence in support of a decision that the candidate ‘Consistently demonstrates competence’.

25.4 IMPLIED EVIDENCE

Assessment panels must not rely on 'implied evidence' - they can only use evidence which clearly shows the candidate has actually done the things required by the competence element. For example, it is not acceptable to record (for element 10) “The candidate is a senior executive engineer and must be able to communicate clearly to perform his job”. A more appropriate example would be “The candidate is a senior executive and is responsible for chairing client meetings, briefing staff on management decisions, managing contractors and reporting to senior management on the performance of his division.”


25.5 COMPLEXITY OF ENGINEERING

Panels can ensure their CA07 reports are robust by specifying the attributes that contribute to the complexity of candidates’ engineering problems and activities. The definitions are given as part of the relevant competence standard and are summarised on the CA03 competence self-review forms. Similarly, in cases where a negative recommendation is proposed, reference to the attributes that are lacking (if the engineering was not at the required level of complexity) will give better support for the panel’s recommendation. Applying the definitions of ‘complex engineering’ in the CPEng standard, a panel might record, for example:

“The design of foundations for the multi-storey hotel, which was located in a geothermal area, involved complex problems because there was very limited information available for designs in such environments, it was outside problems encompassed by standards and codes of practice for professional engineering, had no obvious solution and required researching the literature and application of originality in analysis.”

25.5.1 Negative Recommendations

If the panel proposes a negative recommendation (either to ‘decline’, ‘suspend’ or ‘remove’ registration), a ‘paper trail’ record of the efforts made by the assessment panel to obtain evidence of the candidate’s competence needs to be kept and summarised in the CA07. All assessment tools (including the interactive assessment) must be offered to the candidate. The panel must also make contact with referees before completing its recommendation to the CAB. The CAB (and potential reviewers or Appeal Panels) can then be assured that the assessment panel gave the candidate every opportunity to demonstrate competence.

Panels should also state the reasons why the work undertaken by the candidate did not meet the required levels of complexity, having regard to the definitions for the relevant competence standard.

CAB has decided all reports with negative recommendations will be moderated by a CAB member. The normal process is for National Office to refer the draft report to a CAB member before it is presented to the CAB. In this way, the CA07 report can be reviewed for inconsistencies (see “Consistency within CA07 Reports” on page 49) and the panel provided with advice for the purposes of moderation between assessments.


25.6 ASSESSING ELEMENT 2

25.6.1 New Zealand-based candidates

‘Good practice’ is linked to the engineer's practice area and panels should be guided by the relevant practice guidelines, IPENZ ‘Practice Notes’, advisories from regulators, codes of practice, standards etc within the candidate’s practice area.

25.6.2 Assessing Overseas AFA candidates for CPEng

An AFA applicant for CPEng does not have to live or practise in New Zealand to meet element 2 requirements. The applicant must demonstrate that he/she ‘is able to’ comprehend and apply knowledge of ‘good professional engineering practice specific to New Zealand’. While New Zealand work experience is good evidence, assessors need to be aware that overseas candidates may be able to demonstrate competence through:

working for a New Zealand client;

working within a New Zealand-based company while posted overseas; or

working in an environment where the local jurisdiction applies the same good practice as New Zealand.

The IPENZ ‘credit schedule’ now applies the principles of the Trans-Tasman Mutual Recognition Arrangement (TTMRA) when considering candidates with overseas registrations for CPEng registration. This includes the concept of assessing for ‘occupational equivalence’, which is explained more fully in the section on page 63.

Assessors should also be familiar with the information provided in the ‘credit schedule’ for overseas candidates who have previously undertaken competence assessments outside New Zealand and now seek ‘credit’ under the IPENZ ‘credit schedule’ – full information is available from the IPENZ web site at http://www.ipenz.org.nz/IPENZ/Forms/pdfs/Credit_for_Registrants_from_other_Jurisdictions.pdf

25.6.3 Assessing Overseas Candidates for IPENZ Membership

Assessment for IPENZ membership requires assessment of the applicant’s knowledge of ‘good practice’ in the jurisdiction within which he/she currently practises. New Zealand-based assessment panels would normally only be able to assess overseas candidates for practice within the New Zealand jurisdiction, so the expectation is that element 2 will be based on New Zealand-specific practice unless assessors have knowledge of good practice in the applicant’s jurisdiction.

The exceptions when assessing overseas candidates for IPENZ membership are:

When candidates are assessed within the UK (where local assessors are used to assess for practice within the UK);

When candidates are accepted via the credit schedule (meaning that the engineer has already been assessed as being competent by an assessment panel in that jurisdiction); or

When, in rare circumstances, a New Zealand-based assessment panel has the expertise to make an assessment in the candidate’s jurisdiction.

25.6.4 CAB Advice if Candidate Overseas at Time of Assessment

If a candidate has been or is practising outside New Zealand at the time of assessment, CAB asks that the assessment panel record how long the candidate has been out of New Zealand, and specify the efforts the candidate has made to maintain competence in New Zealand-specific skills in his or her practice area. CAB wants clear evidence that the panel has considered this issue and that it has not been overlooked.

25.6.5 Case Study – CRA For Overseas Registrant

A candidate was educated in New Zealand and completed postgraduate studies before departing from New Zealand, and is currently overseas. The candidate gained CPEng whilst overseas and was given a 4-year term due to the limited evidence of application of good professional engineering practice specific to New Zealand. The engineer is now a highly regarded practitioner performing work in a number of European countries, and has submitted a portfolio of evidence for continued registration assessment. After reviewing the portfolio of evidence, the assessment panel was in no doubt that the engineer was practising as an “expert” – but there was no evidence of New Zealand-specific practice. How should the assessment panel prepare a recommendation for the Competence Assessment Board?

The issue for the panel is the lack of evidence demonstrating competence in the New Zealand jurisdiction. The standard for initial registration – Rule 6(2) – requires that ‘the extent to which the person is able to … comprehend and apply’ good practice specific to New Zealand must be taken into account (for a continued registration assessment, read ’is still able to’ …). If there is no specific New Zealand-based evidence provided, is there any other evidence that the panel can use to make a judgement on the candidate’s ability to practise competently in New Zealand? What are the particular requirements that distinguish practice in this practice area in New Zealand from the European countries in which the engineer practises? Are these differences significant?

If the panel considers that the standard for continued registration is met, then it must determine the term to the next assessment. In making this decision, the panel may set a term of 4 years (in accordance with the CAB policy on term to next assessment – see page 55) as the evidence provided is weak for demonstrating competence in New Zealand-specific good practice.

The panel must comply with the CPEng Rules, and link the issues under discussion with relevant Rules. It cannot take into account irrelevant information in making its decision.

25.7 ASSESSING ELEMENTS 3 AND 4 – ENGINEERING MANAGERS

The requirements for these elements are that the candidate has provided evidence of his or her ability to analyse, investigate and develop solutions to (or designs for) complex engineering problems. Panels commonly encounter candidates in an engineering management role who provide evidence of competently analysing, investigating and developing solutions to complex engineering (management) problems but with little or no engineering design. This is not a reason for an automatic ‘decline’, but will require the assessment panel to consider the evidence in terms of the practice area description – and may require some amendment to the practice area description if appropriate.

The following summarises some of the things that an engineering manager might submit as evidence for a competence assessment:

Elements 1 to 6 (knowledge, analyse + solve + manage): hand calcs, investigation reports, specs, design concepts, standards, decisions, schedules, project plans, plans, policies, strategies.

Elements 7 to 10 (risk, ethics, impact, communication): risk matrices, QA, peer review, ethical issues, stakeholder interaction, workshops, consultation, environmental awareness.

Element 11 (Continuing Professional Development): interpersonal skills, team management, efficiency, project management, product management, financial management, governance.

Element 12 (Engineering Judgement): best examples, product success, design success, positive feedback, recognition by peers and clients, longevity, robustness.

25.8 ASSESSING ELEMENT 8 (ETHICS)

The Code of Ethical Conduct is prescribed in Rules 43 to 53 inclusive for CPEng, and in IPENZ Regulations 44 to 54 inclusive for the other registers.

Assessment of element 8 differs from the other elements in that:

1. if an assessment panel decides that there is sufficient negative evidence to conclude that the standard has not been met, the panel must specify which part (or parts) of the ethical code has been breached. For example, if a person has performed recklessly, putting lives at risk, the assessment panel must reference Rule 43 or Regulation 44 in justification of its decision.

2. If a candidate is declined because the panel identified negative evidence of ethical conduct, the evidence has to be of sufficient gravity that it could be grounds for initiating disciplinary action. This was the outcome of an appeal where the Appeals Panel ruled that breaches of a minor nature were not sufficient grounds for declining a candidate.

Element 8 is unique in that it applies the same standard of competence for all types of engineer – technicians, technologists and professional engineers – who must behave to the same ethical standard. Thus when considering a candidate for relevant registers there is no need for an assessment panel to consider Element 8 differently for each of the various quality marks.

25.9 ASSESSING ELEMENT 11 (MAINTAINING COMPETENCE)

CAB advice to assessment panels when assessing element 11 is that formal classroom-type training is not the only acceptable form of CPD. CAB has noticed many panels place emphasis on formal training, so panels need to consider learning outcomes from a range of learning activities that enable engineers to maintain currency in their practice area. The question assessment panels need to ask when considering a CPD activity is “How does this activity contribute to the engineer’s competence in his or her practice area?”. Valid CPD activities may include mentoring, conducting assessments, contributions to developing ‘good professional engineering practice’ or standards, papers in publications or presentations at ‘learned society’ events (conferences, seminars etc), and IPENZ ‘Practice Notes’.

NOTE 14 – USING WORD TO COMPLETE THE CA07 FORM

Assessment panels are encouraged to use Word when documenting panel findings in the on-line CA07 form. This involves ‘cutting and pasting’ the entire text from the on-line system into a Word document, then compiling the report and copying the text back into the on-line CA07 report.

26. HOLISTIC ASSESSMENT

NOTE 15 - HOLISTIC ASSESSMENTS

Mark the relevant box summarising the panel’s holistic decision on the candidate’s competence.


NOTE 16 – JUSTIFICATION OF HOLISTIC ASSESSMENTS

26.1 HOLISTIC ASSESSMENTS FOR AFAS

The key points panels must consider in drafting a holistic assessment are:

Ensuring consistency between the holistic assessment and each of the elements (see “Consistency within CA07 Reports” on page 49);

The grounds for a reduced term being recommended – follow the CAB policy (see page 55) and provide good reasons for any deviation from this policy.

If any elements are assessed as other than ‘Consistently demonstrates competence’ and a positive recommendation is being made, the holistic assessment needs to explain why the ‘weak’ element(s) did not detract from assessment of the candidate’s overall competence.

The reasons (critical evidence) for any recommendation to ‘decline’ (or ‘suspend’ or ‘remove’) registration.

For an AFA, the assessment panel is required to consider the candidate’s competence for each of the relevant registers, taking into account the extent to which he/she has demonstrated competence for each of the elements at that level, as assessed in the element-by-element analysis. Then it must make a holistic assessment of the candidate’s competence.

Consider the candidate’s competence holistically in his or her practice area – is he/she practising as a reasonable professional engineer? If the candidate is considered marginal in some elements, how critical are these elements to the candidate’s practice area? For example, a project manager might be weak in design or developing solutions to engineering problems (element 4) – but is this activity important in his/her role? The assessment panel may decide that even if the candidate is weak in this element, holistically he/she meets the competence standard because this element is not so critical to his/her practice area.

If the assessment panel considers the candidate does not meet the required level of complexity for some elements (either engineering problems as in elements 3 or 4, or engineering activities in elements 5 and 6), it may consider the candidate for a different quality mark (‘relevant register’) that more closely matches his or her level of engineering complexity.

The assessment panel is asked to make a holistic assessment for each of the relevant registers by marking one of the 5 columns:

1. ‘Consistently demonstrating competence’, where the candidate has provided evidence of competence over time and in a range of different contexts;

2. ‘Demonstrates competence, but early review recommended’ is used when a candidate is assessed as other than ‘consistently demonstrating competence’ for one or some elements of the standard. The panel is satisfied that the candidate meets the minimum standard of competence, but considers the term to next assessment should be less than 5 years as the engineer needs to address some matters that may adversely impact on his/her competence in the future.

3. CAB’s advice to assessors on use of a reduced term to the next assessment is: the first decision for any panel to consider is “is the engineer competent now?”. If not, the decision should be to decline registration (or suspend for CRA). Assessment of the term to next assessment is a separate issue and is covered in a later section (see page 55) of these guidelines.

4. ‘Not yet demonstrating competence but developing’, where the panel has evidence of the engineer’s abilities or potential but lacks evidence of actual performance at the level required.


5. ‘Does not demonstrate competence’, where the candidate has provided plenty of evidence but the evidence shows the engineer is not performing at the level of competence required.

26.2 CRA HOLISTIC ASSESSMENTS

Where the panel has found the CRA candidate meets the standard for continued registration assessment based solely on a holistic assessment, the panel should report its findings in response to the following questions:

Did the candidate demonstrate he or she had taken reasonable steps to maintain his or her current competence?

o Had the candidate listed the changes that had occurred in his or her practice area?

o What steps had the candidate taken to gain new knowledge to keep abreast of these changes?

Did the work samples show the candidate is ‘still able to’ practice competently at the level of complexity required for continued registration on the registers under review?

o What were the factors contributing to the appropriate level of complexity?

o What evidence was there of the candidate having applied the new knowledge to keep current in his or her practice area?

26.3 GOOD HOLISTIC ASSESSMENTS FOR CRA

When the panel bypasses the element by element section of the report for a CRA, the holistic statement must cite evidence that satisfied the panel that:

a. The candidate has identified areas within his or her practice area where changes have occurred and

b. He or she has taken reasonable steps to maintain current knowledge and skills in these areas of change; and

c. Current competence has been demonstrated by the application of new knowledge in the current practice area.

This does not automatically require an element-by-element assessment. If there is any indication of competence not being demonstrated, a more extensive assessment will be required. In such a situation, the panel may require the candidate to provide a more comprehensive portfolio of evidence with supporting evidence for each element (as in the case of an AFA candidate).

26.4 POOR HOLISTIC ASSESSMENTS FOR CRAS

NOTE 17 – EXTRA REQUIREMENTS FOR INTERNATIONAL REGISTERS

Section C of the CA07 provides the facility to record a panel’s judgement on whether the candidate meets the additional non-competence requirements for the international registers. These include holding the relevant exemplar qualification (Washington Accord for IntPE, or Sydney Accord for IntET), having had 7 years post-graduation experience (i.e. work experience at the appropriate level of complexity since graduating with the Washington Accord or Sydney Accord degree), 2 years responsible-in-charge experience, and citizenship (if working overseas).


Assessors are not expected to make a judgement on whether or not a qualification is benchmarked to the relevant Accord – that information is provided by National Office. However, the agreements underpinning the international registers require candidates to actually hold an accredited Accord qualification (or recognised equivalent) – a knowledge assessment does not satisfy this requirement. A candidate who does not hold an exemplar qualification but satisfies element 1 through the IPENZ knowledge assessment process does not satisfy the qualifications requirement for registration on the international registers.

26.5 ASSESSING OVERSEAS CANDIDATES FOR INTERNATIONAL REGISTERS

IPENZ assesses overseas candidates for the international registers in accordance with the principles outlined in the respective agreements. Deciding which jurisdictional practices to assess against for element 2 will depend on (i) the jurisdiction within which the candidate practises, (ii) the candidate’s citizenship if he/she practises outside New Zealand, (iii) whether the jurisdiction within which he/she practises is an IntPE/IntET signatory, and finally (iv) the appropriate section of the register on which he/she should be registered – as illustrated in the chart below:

Jurisdiction in which applicant is currently practising | Applicant is a citizen of which jurisdiction | Register the applicant is eligible for | Assessment based on practice within which jurisdiction
New Zealand | Any | IntPE(NZ) or IntET(NZ) – assessed by IPENZ | New Zealand
Other IntPE/IntET signatory | New Zealand | Either IntPE(NZ) / IntET(NZ), if assessed in New Zealand by IPENZ | New Zealand
Other IntPE/IntET signatory | New Zealand | Or the IntPE/IntET register of the other jurisdiction, if assessed by the signatory body of that jurisdiction | Other IntPE/IntET jurisdiction
Other IntPE/IntET signatory | Non-New Zealand | IPENZ cannot assess – must be assessed by the local signatory for the local IntPE/IntET register | Other IntPE/IntET jurisdiction
Non-IntPE signatory | New Zealand | Can be assessed for IntPE(NZ) / IntET(NZ) by IPENZ | New Zealand
Non-IntPE signatory | Non-New Zealand | Cannot be assessed by IPENZ for IntPE(NZ) or IntET(NZ) | Not relevant

26.6 IN SUMMARY

1. If the candidate is practising in New Zealand, he/she must be assessed by IPENZ against New Zealand-specific practices for element 2 to gain entry into the New Zealand section of the international registers, irrespective of his/her citizenship.

2. If the candidate is a New Zealand citizen practising in an IntPE/IntET signatory jurisdiction, he/she has a choice between (a) being assessed by IPENZ using New Zealand jurisdictional practice for the New Zealand section of the international registers, and (b) being assessed by the signatory body of the other jurisdiction (using its jurisdictional practice) for its section of the international registers.

3. If the candidate is a New Zealand citizen practising in a non-signatory jurisdiction, he/she can be assessed by IPENZ for the New Zealand section of the international registers using the New Zealand-specific jurisdictional requirements for assessment.

4. Candidates who are practising outside New Zealand who are not New Zealand citizens cannot be assessed by IPENZ for the New Zealand section of the international registers.

5. For advice to overseas candidates with previous competence assessments outside New Zealand who are seeking ‘credit’, assessors are referred to the IPENZ ‘credit schedule’ – see http://www.ipenz.org.nz/IPENZ/Forms/pdfs/Credit_for_Registrants_from_other_Jurisdictions.pdf
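For assessors who find decision logic easier to verify in code form, rules 1 to 4 above can be condensed into a short sketch. This is illustrative only – the function and names below are hypothetical and are not part of any IPENZ system:

```python
from enum import Enum, auto

class Route(Enum):
    IPENZ_NZ_SECTION = auto()         # IntPE(NZ) / IntET(NZ), assessed by IPENZ
    LOCAL_SIGNATORY = auto()          # other jurisdiction's section, assessed locally
    NOT_ASSESSABLE_BY_IPENZ = auto()

def assessment_routes(practising_in_nz: bool,
                      nz_citizen: bool,
                      in_signatory_jurisdiction: bool) -> set[Route]:
    """Return the assessment route(s) open to a candidate under rules 1-4 above."""
    if practising_in_nz:
        # Rule 1: anyone practising in New Zealand is assessed by IPENZ against
        # New Zealand-specific practices, irrespective of citizenship.
        return {Route.IPENZ_NZ_SECTION}
    if nz_citizen and in_signatory_jurisdiction:
        # Rule 2: a choice between IPENZ (New Zealand section) and the
        # signatory body of the other jurisdiction (its own section).
        return {Route.IPENZ_NZ_SECTION, Route.LOCAL_SIGNATORY}
    if nz_citizen:
        # Rule 3: New Zealand citizen in a non-signatory jurisdiction.
        return {Route.IPENZ_NZ_SECTION}
    # Rule 4: a non-citizen practising outside New Zealand.
    return {Route.NOT_ASSESSABLE_BY_IPENZ}
```

For example, a New Zealand citizen practising in another signatory jurisdiction (rule 2) returns both routes, reflecting the choice described above.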

26.7 CONSISTENCY WITHIN CA07 REPORTS

Before an assessment panel submits its completed CA07 form to the CAB, it needs to ensure that the document is congruent – that the text for the individual elements is consistent with the holistic assessment summary and final recommendation. Incongruence is one of the main reasons the Competence Assessment Board returns reports to panels for clarification. The incongruences CAB commonly encounters fall into six categories:

a. Inconsistency between the holistic assessment and the element by element analysis. For example, element 11 (maintaining competence) may be assessed as meeting the standard with the critical evidence documentation for that element showing the assessment panel had no reservations that competence has been demonstrated – yet in its recommendations, the panel recommends a reduced term to next assessment because of “poor CPD”.

b. Inconsistency between elements. For example, a candidate with a dated exemplar qualification (or who does not hold an exemplar qualification for the specific quality mark being applied for) would be expected to provide evidence of a background of strong CPD to meet the standard. An inconsistency could then occur if an assessment panel assessed such a person for element 1 (engineering knowledge) as meeting the standard and element 11 (maintaining competence) as ‘marginal’ or ‘not meeting the standard’ because of poor CPD.

c. Inconsistency between practice area description and critical evidence. For example, the practice area description may include “structural design of buildings and bridges”, yet the critical evidence cited in the report may be based on managing construction of (say) solid waste landfills and waste water treatment systems. In such a situation, the assessment panel needs to either review and amend the practice area description or be more specific in recording the critical evidence.

d. Use of irrelevant information. For example, the practice area descriptor might include ‘specialist investigations and analysis of corrosion in concrete pipes’. Yet the panel finds that the engineer does not meet elements 5 and 6 because “there is no evidence of the candidate having experience in tendering and contract management”. In such a case, the assessment panel needs to determine whether the engineer has responsibility for and makes engineering decisions on engineering activities that match the level of complexity required, and whether he or she competently manages such engineering activities. If the practice area description does not include tendering and contract management, and the engineer is not involved in tendering or contract management, then this information is irrelevant.

e. Over-reliance on referees. CAB has frequently seen reports where the only critical evidence cited is the referee input – such as “both referees state that the candidate is competent at (risk management, say) and recommend her for CPEng. The panel agrees.” While the referee input may be compelling, the panel is responsible for making the assessment, and it needs to show that, having considered all the evidence, it is satisfied that the candidate meets the standard. A better way of recording its view might be: “The candidate provided strong evidence of competence in risk management (cite specific examples) and this was confirmed by reports from the referees.”

f. Inconsistencies with the concepts of competence. Assessment panels must ensure that they do not take into account matters that are inconsistent with the concepts of competence (such as lack of specific qualifications, insufficient experience or being too young). Refer to section “Competence concepts” on page 5.
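To make the first two categories concrete, the sketch below shows how such congruence checks could be expressed as simple rules. It is a minimal illustration only – the rating labels and function are hypothetical and do not form part of the CA07 workflow:

```python
def congruence_flags(element_ratings: dict[int, str],
                     recommended_term_years: int,
                     holistic_meets_standard: bool) -> list[str]:
    """Flag two of the incongruence categories above (illustrative only)."""
    flags = []
    # Category (a): every element meets the standard, yet the panel
    # recommends a reduced term to next assessment.
    if all(r == "meets" for r in element_ratings.values()) and recommended_term_years < 4:
        flags.append("Reduced term recommended although all elements meet the standard")
    # Related check: a holistic 'meets standard' finding should not coexist
    # with an element rated 'not met'.
    if holistic_meets_standard and "not met" in element_ratings.values():
        flags.append("Holistic summary conflicts with an element rated 'not met'")
    return flags
```

For instance, calling congruence_flags({7: "meets", 11: "meets"}, recommended_term_years=2, holistic_meets_standard=True) raises the first flag, mirroring the element 11 example in category (a).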

Step 4 • Validate Practice Area

27. VALIDATE PRACTICE AREA

Validation of the practice area description involves matching the practice area description with supporting evidence (as evaluated through the assessment process) and removing activities that have no engineering content – such as team leadership, budget planning and financial management. First, panels need to understand what ‘practice area’ means.

27.1 DEFINITION OF PRACTICE AREA

All assessments must be made in the engineer's practice area, as defined in Rule 3 and Regulation 2:

practice area means an engineer’s area of practice, as determined by—

the area within which he or she has engineering knowledge and skills; and

the nature of his or her professional engineering activities.

NOTE 18 – REVIEW CANDIDATE’S PRACTICE AREA DESCRIPTION

Assessment panels must review the practice area description (as provided by the candidate) taking into account the evidence presented by the candidate. Everything in the practice area description must be supported by evidence. If some aspect of the description has no supporting evidence (or the evidence does not demonstrate the appropriate level of complexity), that aspect must be removed.

The practice area description consists of two essential parts:

a. knowledge and skills and

b. nature of the engineering activities.

The nature of the activities can be described with words such as:

Designing

Construction monitoring

Production management

Asset management

Project management

Policy development

Forensic investigations and failure analysis

Research and development

Technical due diligence

Computer modelling

Environmental impact assessment

Engineering knowledge can be quite diverse, but could include things such as:


High power multi-band transmitter antenna

Reinforced concrete and steel bridges

Transportation networks and infrastructure

HV Electrical power reticulation networks

Petrochemical exploration and production

Biomedical instrumentation and software

Water, waste water and storm water reticulation

By combining these essential components, a practice area description can be readily developed. For example:

a. Design and construction monitoring of reinforced concrete and steel bridges

b. Policy development for transportation networks and infrastructure

c. Project management in the design and production management of biomedical instrumentation and software.

An engineer’s practice area is regarded as being unique to the engineer, and candidates are required to provide a description using typically 15 to 25 words. Assessors should not make any reference to ‘practice fields’ in their CA07 reports as they are not relevant to the assessment process. (Practice fields are only used by National Office staff when appointing assessors to an assessment panel).

Candidates wishing to be assessed for Recognised Engineer or Design Verifier should have terms such as ‘dam safety engineering’ or ‘design verification’ included in their practice area descriptions. Any candidate may have these terms in their practice area descriptions – but if they do, IPENZ staff will assume that they are seeking to be assessed as Design Verifier or Recognised Engineer.

When a candidate provides evidence of undertaking dam safety engineering but has not applied to be a Category A Recognised Engineer, CAB requests that panels indicate his or her competence in dam safety engineering by making specific reference to it in the candidate’s practice area description. This information is useful as confirmation of the engineer’s ability to self-declare as a Category B Recognised Engineer. In the Building (Dam Safety) Regulations a category B recognised engineer is defined as an engineer who “…has general civil engineering ability and experience”.

NOTE 19 – AMEND INACCURATE OR POOR PRACTICE AREA DESCRIPTIONS

CAB requires assessment panels to review candidates’ practice area descriptions in the light of the evidence presented – assessment panels must amend practice area descriptions if they are inaccurate or inappropriate but should avoid excessively worded descriptions. The ideal time to do this is at the interactive assessment (if there is one) in conjunction with the candidate.

Terms such as ‘mentoring’ and ‘project management’ are generic and not specific to engineering. Engineers doing project management must qualify the project management work with some engineering-related function – for example, ‘project management of design and production of precision electronic test equipment’. Assessment panels are also asked to make a judgement on the complexity of engineering involved in what the candidate actually does, as described in his or her practice area description. Panels should restrict comment to the level of complexity involved in the work samples submitted – while breadth is one factor of complexity, there are others. Ability to solve engineering problems and perform engineering activities of sufficient complexity may be demonstrated in a relatively specialised or focussed area. It is the nature of the candidate’s work and not the nature of the candidate’s employment status or employer that is the relevant factor in making this judgement.

28. VALIDATE PRACTICE FIELD INFORMATION

Assessment panels are required to validate candidates’ practice fields, as practice field information is now listed in the on-line registers of current competence.

The purpose of adding practice field information is to provide the public with meaningful information on the general nature of the engineering work an engineer is engaged in, to help them select a suitable engineer from a list of registrants.

Publication of practice field information is not intended to assist engineers in marketing their services, nor is it intended to indicate that an engineer is competent to perform a specific task.

CAB offers the following advice to assessment panels when considering practice fields:

1. Alignment of the practice area. An engineer’s practice area, or substantive aspects of it, should lie within the most relevant practice field. However, there is no requirement to demonstrate competence across the whole practice field for it to be listed.

2. Multiple practice fields. Panels should list the minimum number of fields that ‘contain’ the candidate’s specialist skills (practice area). For example, an electrical engineer who is engaged in the building services area should not have ‘Electrical’ and ‘Building Services’ listed unless he or she is also doing electrical engineering outside of building services (such as generation or reticulation); similarly a ‘Structural’ engineer should only have ‘Civil’ listed if he or she is also doing non-structural civil engineering (such as potable or waste water treatment and reticulation).

3. Engineering Management. The Engineering Management practice field is intended to cover engineers who manage engineering activities that are so multi-disciplinary that it is difficult to readily link their engineering practice with any other specific practice field. For example, an engineer who manages mechanical engineering projects should use either the ‘Mechanical’ practice field or the ‘Engineering Management’ field, but not both. The ‘Engineering Management’ field is not intended to cover general business management. Project managers, asset managers and engineers working in policy development are considered to be working in areas that are generally multi-disciplinary in nature and are therefore most likely to use the ‘Engineering Management’ field.

4. ‘Practice field’ and ‘practice area’. CAB advises panels to focus on practice area descriptions rather than practice fields during the assessment. If a candidate’s practice area is well-defined, panels are unlikely to struggle to identify the relevant practice field(s). Panels should therefore focus their efforts on getting clarity on the practice area and how it is supported by the evidence presented. The alignment of practice area with practice field should then be straightforward, and panels should not need to spend a lot of time validating practice field information.

Step 5 • Prepare recommendations


29. PREPARE RECOMMENDATIONS

Before completing this section, it is worth reviewing the decision-making process (as outlined in the ‘Decision-making’ section on page 5). Recheck the validity, sufficiency and authenticity of the evidence. Does the panel have any reservations about any aspect of these? If so, the panel needs to do more work.

NOTE 20 – RECOMMENDATIONS FOR ‘RELEVANT REGISTERS’ (REGULATION 11)

The IPENZ Regulations require assessment panels to consider not only the registers a candidate applied for, but also other registers for which the candidate may qualify (referred to as the ‘relevant registers’) when undertaking AFAs. For example, a candidate for CPEng should also be considered for IntPE (unless he/she clearly does not meet the standard – for example, does not hold a Washington Accord degree); similarly, an ETPract candidate who is found to be performing complex engineering may also be assessed for CPEng (or vice versa).

For AFAs, the panel can only recommend one of two options – that the application either be approved or declined. If the recommendation is to ‘approve’ the application, then the panel must specify the period to the next assessment for quality marks of current competence (the term to next assessment is not relevant for IPENZ membership).

29.1 CAB ADVICE: ‘RELEVANT REGISTERS’

CAB does not want relevant registers to be (or be seen to be) offered as a ‘consolation prize’ when a panel recommends declining an application. Assessment panels must only offer a relevant register when the evidence shows the candidate is competently performing work at the appropriate level of complexity – offering a ‘relevant register’ is not appropriate when, for example, a CPEng candidate is performing complex engineering work incompetently. If a CPEng candidate is competently performing ‘broadly defined’ engineering work (rather than complex engineering work), then recommending ETPract would be appropriate.

CAB advises assessment panels to follow the process below in such a situation:

1. Assess and document findings for the quality mark(s) applied for.


2. If the candidate is found to not meet the competence required for those quality mark(s), then identify the ‘relevant registers’ and assess the candidate’s competence at the required level of complexity on an element by element basis.

3. If the candidate is found to be performing work competently at the level required for the relevant register, then make a recommendation for registration on the relevant register(s).

If assessment panels need further guidance on the level of complexity for ‘relevant registers’, CAB recommends panels refer to ‘engineering edge’ (located on the IPENZ website at http://www.ipenz.org.nz/ipenz/forms/pdfs/engineering_edge.pdf).

NOTE 21 – TERM TO NEXT ASSESSMENT

CAB’s policy provides detailed advice on setting the term to next assessment. The policy is printed in full on page 66.

29.2 CAB ADVICE – TERM TO NEXT ASSESSMENT POLICY FOR AFA CANDIDATES

The term ‘significantly’ was used in the policy to describe the extent to which demonstrated competence needed to be above the minimum standard to justify a six-year term for AFA candidates. CAB experience had shown inconsistency by panels in the application of this provision; accordingly, the CAB developed advice to assist panels in achieving a more consistent approach.

The only way to measure a candidate’s competence is by the evidence presented (by the candidate, referees etc) to an assessment panel.

Accordingly, CAB offers the following advice to panels in the application of the term to next assessment policy:

a. Once a panel has decided a candidate meets the standard for initial registration (i.e., in an AFA), the starting point in deciding the term to next assessment should be four years;

b. If there is evidence supporting competence demonstrably above the minimum standard (meaning there is evidence that the engineer is a ‘low risk’ in terms of his or her ability to practise ‘safely’ as an independent practitioner), a six-year term can be recommended. Panels must document evidence to support this recommendation.

c. Evidence supporting a six-year term would normally include several of the following:

Postgraduate qualifications building on a Washington Accord undergraduate degree. Note: the need to complete a Knowledge Assessment introduces an element of risk to the assessment, which would generally point towards a four-year term;

Evidence of a track record of successful assessments in other engineering jurisdictions (such as overseas registration, Heavy Vehicle Specialist Certifier);

Evidence that the candidate is operating at a career stage requiring technical competence (independent practice – National, or International scope) and/or leadership competencies (Team Leader or Technical Manager) beyond the minimum standard for independent practice;

Evidence that the candidate is continuing to develop their engineering knowledge and engineering practice through a structured and proactive approach to professional development;

Evidence includes demonstration over a sustained period (i.e. typically four years).

Evidence of “safe” practice across a broad range of engineering projects and activities in their practice area that is supported by good risk management (element 7), professional development (element 11) and engineering judgement (element 12).

d. Similarly, a two-year term should be recommended where evidence in elements critical to the candidate’s practice area is marginal or inconsistent. This evidence may be exhibited in poor judgement, risk management or continuing professional development; limited complexity in engineering problems and/or activities; limited knowledge (or application of it); or poor professional behaviour.

e. If a panel is satisfied that an AFA candidate meets the minimum standard, no specific justification is required if a four-year term is proposed.

NOTE 22 – REGISTERS UNDER REVIEW

When an assessment panel is processing a CRA, it will only be able to assess for ‘registers under review’ – meaning only the registers that the candidate is registered on. For example, if a candidate is on the CPEng register but not IntPE(NZ), then the panel will not be asked to assess for IntPE(NZ).

For CRAs, assessment panels can recommend one of three options – ‘continue’, ‘suspend’ or ‘remove’.

a. If the registrant has satisfactorily demonstrated competence for continued registration, then the panel’s recommendation should be to ‘continue’ registration. If the case is marginal, the panel may reduce the term to next assessment.

b. If the registrant’s evidence did not adequately demonstrate competence for continued registration but he/she is likely to be able to ‘bridge the gap’ relatively easily, CAB advice is that the panel should recommend ‘suspend’. The candidate’s registration will then be suspended for a period of up to 12 months, and if during this period he/she provides further evidence to demonstrate he/she meets the standard, registration will automatically be revived. If after 12 months of suspension the candidate has not demonstrated competence, his/her registration will be removed.

c. If the registrant provides no evidence for a continued registration assessment, or if the evidence provided shows that the engineer is clearly no longer competent, the panel should recommend ‘remove’.

If a registrant is found to be competently performing work but not currently at the level of complexity required for that register, the assessment panel may recommend registration on the appropriate register. However, this would have to be recorded as a text recommendation in the holistic assessment for the registers under review.
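The three CRA options amount to a small decision ladder. The sketch below is a minimal illustration of options (a) to (c) above – the function and flag names are assumptions for illustration, not IPENZ terminology:

```python
from enum import Enum

class CraRecommendation(Enum):
    CONTINUE = "continue"
    SUSPEND = "suspend"   # up to 12 months to bridge the gap
    REMOVE = "remove"

def cra_recommendation(evidence_provided: bool,
                       competence_demonstrated: bool,
                       gap_easily_bridged: bool) -> CraRecommendation:
    """Decision ladder for options (a)-(c) above (illustrative sketch)."""
    if evidence_provided and competence_demonstrated:
        return CraRecommendation.CONTINUE   # option (a); term may still be reduced
    if evidence_provided and gap_easily_bridged:
        return CraRecommendation.SUSPEND    # option (b); revived if standard later met
    # Option (c): no evidence, or the engineer is clearly no longer competent.
    return CraRecommendation.REMOVE
```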

29.3 ASSESSING FOR INTPE(NZ) AS PART OF CPENG CRA

If a CPEng registrant undertaking a continued registration assessment is not currently on the IntPE(NZ) register but wishes to be assessed for IntPE(NZ), the panel is not required to make that judgement. National Office staff will ensure that the registrant’s qualifications meet the Washington Accord requirement and will include the IntPE(NZ) recommendation, only asking the assessment panel to check the 7-year/2-year requirements if necessary.

30. RECOGNISED ENGINEER AND DESIGN VERIFIER ASSESSMENTS

CAB has considered how assessment panels should report the specific skills required to demonstrate competence as a Category A Recognised Engineer and Design Verifier. Rather than make amendments to the CA07 form specifically for these specialist assessments, panels are asked to specify the critical evidence required to demonstrate the specific attributes required for a Design Verifier and Recognised Engineer as part of its element by element analysis. For example, in assessing a Recognised Engineer, an assessment panel must assess the extent to which the candidate has experience and skills in 10 additional attributes – listed as (a) to (j) in the guidelines – and the panel is expected to make specific reference to these as part of its element by element analysis and/or holistic assessment.

If the candidate for either Design Verifier or Category A Recognised Engineer is currently a CPEng, the assessment will be deemed to be a CRA and the assessment panel has the choice of doing an element by element analysis or making reference to the specialist skills as part of its holistic assessment.

31. CANDIDATE ACCESS TO CA07 REPORTS

Panels should be aware that the CA07 report (as presented to the CAB) will be accessible by candidates after the CAB has made its final decision on the assessment. Panels are reminded to avoid inappropriate remarks, verbatim quotes from referee reports and other unprofessional comments (such as references to age, lack of work experience, etc.) in their reports.

32. PRE-SIGN OFF CHECKLIST

Before the panel signs off a report, check to see that it has provided CAB with adequate information to make a decision.

Check that the report does the following (tick each item when completed):

Identifies the critical and validated evidence that convinced the panel of the candidate’s competence (or otherwise)?

Summarises attributes that contribute to complexity of the candidate’s projects or activities? (see page 41)

Provides adequate supporting evidence for negative recommendations? (see page 41)

Is free of incongruence and inconsistencies? (see page 49)

Does not contain incriminating or inappropriate information and is suitable for the candidate to read? (see page 57)

Records changes to practice area description? (see page 50)

Provides reasons for panel’s recommendations on meeting the relevant competence standard? (see page 7)

Provides reasons for the panel’s recommendations on term to next assessment? (see page 55)

Step 6 • Report sign-off


CAB relies on the CA07 report to make its final decision on recommendations from assessment panels. CAB must comply with the CPEng Rules in making its decision on a candidate’s assessment. To allow CAB to be fully informed and comply with its obligations under the CPEng Rules, CAB asks that assessment panels format CA07 reports for CRAs using the 5 CRA questions as headings. A holistic closing statement summarising the panel’s view may be included, particularly if it provides clarity to the panel’s findings.

The CA40 checklist (below) has been developed specifically for CRA reports to help panels prepare reports efficiently, to a professional standard, prior to submitting them to National Office.

CA40 CRA CA07 CHECKLIST

Question 1 - ‘What are the technological, regulatory and good practice changes that have occurred in the candidate's practice area (since the previous assessment or over the past 6 years)?’

Does the report:-

Summarise changes identified by candidate?

Note other key changes not identified by candidate?

Note nature of changes? (Regulatory, technological, good practice)

Link relevance of changes to the candidate's practice area description?

State panel's views on the adequacy of candidate's awareness of changes?

Convey the above information within the optimal length of approximately 1–2 paragraphs?

Question 2 - ‘What are the actions the candidate has taken to keep abreast of these changes?’

Does the report:-

Summarise actions taken by candidate to address changes (noted in Q1)?

State panel’s view on relevance of CPD activities to practice area description?

Draw on specific examples of such CPD?

State panel's view on the adequacy of the candidate's CPD activities to address changes (noted in Q1)?

Convey the above information within the optimal length of approximately 1–2 paragraphs?

Question 3 - ‘How do the candidate's work samples demonstrate the application of this new knowledge?’

Does the report:-

State how the work samples show new knowledge has been applied?

Show the new knowledge being applied is in response to the changes (as noted in Q1)?

State the candidate's role and responsibilities in work sample provided?

State panel's view that (i) the work samples align with the candidate's practice area description; and (ii) they are an adequate demonstration of the application of new knowledge in response to changes (as noted in Q1)?

Convey the above information within the optimal length of approximately 1–2 paragraphs?

Question 4 - ‘How do the candidate's work samples demonstrate the factors that contribute to the complexity of the candidate's engineering work?’

Does the report:-

Make specific reference to definitions of complexity and factors contributing to complexity of work samples?

State the candidate's role and responsibilities in work sample provided?

State panel's view on the adequacy of the work samples to demonstrate complexity?

Convey the above information within the optimal length of approximately 1–2 paragraphs?

Question 5 - ‘How do the work samples demonstrate that the candidate is still able to practise competently as an engineer?’


Does the report:-

Summarise the adequacy of the work samples to show the candidate is still able to practise competently in his or her practice area, having regard to the required level of complexity and the candidate's roles and responsibilities?

State panel's view on the candidate's competence in a holistic way?

Convey the above information within the optimal length of approximately 1–2 paragraphs?

GENERIC CA07 CHECKLIST

Have I:-

Completed a spell check?

Checked for cut and paste errors?

Checked for sense?

Removed inappropriate reference to referees?

Removed all superlatives?

Removed redundant words and text?

Checked there is no jargon and there are no unexplained acronyms?

Applied the ABC of report writing – Accuracy, Brevity, Clarity?

Removed irrelevant or inappropriate information (i.e. referred only to evidence, standards, definitions or the panel's views)?

Clearly stated reasons and referenced CAB guidance if a non-standard outcome is recommended?

Checked for contradictions?

Does the text align with the assessment ratings?

Does the evidence and text align with the practice area description?

Is the holistic summary consistent with text and ratings?

33. CA07 REPORT SIGN OFF PROCESS

The Staff Assessor normally prepares the first draft of the CA07 report and identifies areas where the Practice Area Assessor is asked to provide input or comment on specific aspects of the report. The Practice Area Assessor then reviews the report, adding comments as appropriate – and, if satisfied with the document, can ‘sign off’ the report having accessed it only once. If the Staff Assessor is satisfied with the Practice Area Assessor’s comments or additions, the Staff Assessor can then submit the report to National Office for preparation for the Competence Assessment Board.

A video has been prepared to show how the sign-off process works. It is located at https://www.ipenz.org.nz/ipenz/members/Assessors/Assessor%20Training%20-%20CA07%20Sign%20Off.mp4, under the ‘Training Material’ part of the assessors’ area.

33.1 ARCHIVING COMPLETED CA07 REPORTS

If an assessor wishes to keep an archive record, the print version can be selected, copied and pasted into a Word document, preserving the report in full for future reference.


34. ASSESSING ‘RECOGNISED ENGINEER’ OR ‘DESIGN VERIFIER’

Recognised Engineer (or more correctly ‘Category A Recognised Engineer’) and Design Verifier are quality marks that require candidates to be CPEng registered. Assessment panels established to conduct these assessments will be required to decide whether the candidate not only meets the standard for CPEng but also has the additional skills and knowledge to satisfy these quality marks. Assessment panels appointed to conduct assessments for Recognised Engineer or Design Verifier will have at least one assessor with a relevant engineering background – being either a Recognised Engineer or a Design Verifier.

Panels should be aware of the guidelines available on the IPENZ website for:

Guidelines for assessing Recognised Engineer Category A at http://www.ipenz.org.nz/IPENZ/Forms/pdfs/Guidelines_for_assessment_of_recognised_engineer.pdf

and

Guidelines for assessing Design Verifier at http://www.ipenz.org.nz/IPENZ/Forms/pdfs/Design_Verifier_Guidelines.doc

35. COMPETENCE ASSESSMENT BOARD DECISIONS

When a recommendation from an assessment panel is presented to the CAB, it can either accept, vary or reject the recommendation – but it can only vary or reject the recommendation after it has referred the matter back to the assessment panel with reasons for its decision to refer it back.


Once the CAB has made a decision, the CA07 report is made available to the candidate no matter what the outcome was.

35.1 CAB ADVICE: REPORTS REFERRED BACK TO PANEL

As noted above, the CAB can either reject or vary a recommendation from an assessment panel. The most common reason a recommendation is rejected is that the report contains inconsistencies (see page 49 for more detail). CAB will normally advise the panel that its recommendation is not consistent with some part of the report and that further work is required by the panel to achieve clarity. As the CAB does not have access to the full portfolio of evidence, it is not in a position to direct the panel to a specific outcome – the panel is left to make a recommendation taking into account the CAB advice.

35.2 CAB ADVICE: ASSESSOR COMMUNICATIONS WITH CANDIDATES

Based on the ‘no surprises’ approach, assessment panels are obliged to keep the candidate informed of the panel’s progress with the assessment and to provide an indication of its likely recommendations to the Competence Assessment Board.

When making recommendations to the CAB on the CA07 form, panels should avoid acting as an advisor by suggesting things the candidate should do in response to an assessment outcome. For example, an assessment panel should not make a recommendation such as “the candidate should work for a larger company and develop skills managing more complex engineering activities”.

There is a natural tendency to want to be helpful and offer advice (especially if the panel is about to make a negative recommendation), but such advice should not be (or be seen to be) part of the assessment process and should not be included in the CA07 report.

If members of a panel wish to offer advice to a candidate, they should consider alternative means of communication outside the assessment process. For example, they could give their advice directly (verbally or by email) to the candidate outside the CA07 reporting process.

36. NATURAL JUSTICE

Once an assessment panel recommends registration be declined (AFA), or suspended or removed (CRA), the CAB is required to apply the natural justice provisions of the CPEng Rules. This process requires notifying the candidate of the panel’s decision and the reasons for it, and giving the candidate the opportunity to make a submission for the CAB to consider when it makes its decision on the panel’s recommendation.

The rules of natural justice require that:

the decision-making process is fair;

the person(s) making the decision are unbiased;

the candidate is afforded the opportunity to understand and comment on prejudicial material.

As the CAB is not in a position to identify any new evidence in the submission – it does not view the candidate's original portfolio of evidence – it will rely on the assessment panel to give advice on whether the candidate’s submission contained new evidence. Accordingly, the CAB will often refer the submission to the assessment panel and ask it to:

1. review the submission for any new evidence; and


2. re-consider its original recommendation if it finds there is new evidence. If there is new evidence, the assessment panel may need to do more work – such as conduct a further interactive assessment to validate the new evidence or do further follow up with referees.

The assessment panel must then report its findings back to the CAB.

As the CAB needs to be sure that the assessment has been carried out in accordance with the principles of natural justice, it offers the following advice to assessment panels when considering and reporting findings on natural justice submissions:

1. Review the natural justice submission for new evidence. If new evidence has been presented, does it cause the panel to amend its original recommendation? If so, it is important to have regard to (and record in findings) steps taken or assessment tools used to validate information (use of written assignment, interactive assessment or further referee input) and the factors contributing to the complexity of engineering involved.

2. If there is no new evidence in the natural justice submission (or there is new evidence but it is not of sufficient complexity to meet the required standard and change the panel’s original recommendation), note the facts relevant to the new evidence to show that the panel has considered the evidence objectively and with ‘an open mind’.

3. Do not react to emotional or provocative allegations that might be made in the submission – those are matters for the CAB to address. However, panels should record relevant facts. For example, in a recent natural justice submission a candidate claimed an assessor was inconsistent because a work sample included work done for a building consent the assessor had been involved in approving – yet the work was not considered to demonstrate competence for the assessment. The assessor noted that the work satisfied the requirements for the consent under the Building Code but did not meet the complexity requirements for the assessment.

37. APPEALS AND PROCEDURAL REVIEWS

Candidates have two remedies in the event that a decision is not to their satisfaction – they can apply for either:

o a procedural review (where the candidate believes the assessment was not conducted in accordance with the required procedure) or

o an appeal (where the candidate believes the decision is manifestly at odds with the evidence provided).

CPEng candidates are unique in that they have the ability to seek both a procedural review and an appeal.

A procedural review involves IPENZ appointing an independent reviewer to determine if the assessment was in accordance with the required process, and if any departure from process disadvantaged the candidate.

An appeal under the CPEng regime involves the candidate making application to a separate statutory body (the Chartered Professional Engineers Council) which conducts the appeal as a re-hearing. Under the IPENZ Regulations for Competence Registers appeals are conducted by an independent reviewer.

If a candidate pursues either a procedural review or an appeal, assessors will be required to submit all records and notes made during the assessment process to National Office. This information will be compiled into a chronological record which, along with relevant information from the candidate’s file, will be submitted to either the reviewer or the Appeals Panel. Assessors are not expected to be involved in the process, although National Office staff may request further information from panel members if clarification is required or information is missing.

38. ASSESSMENTS UNDER THE TRANS-TASMAN MUTUAL RECOGNITION ACT (TTMRA)

The Trans-Tasman Mutual Recognition Act is the legislative output of the agreement between the Australian and New Zealand Governments. The principle of the agreement is that a registered occupation in one jurisdiction should be recognised as registration within the other jurisdiction if there is ‘occupational equivalence’.

CPEng is deemed to be a registered occupation in New Zealand, and thus anyone on a statutory-based register of professional engineers in Australia is eligible for CPEng in New Zealand, subject to the occupational equivalence provisions of the agreement. Queensland is currently the only Australian state with a statutorily backed register for engineers, so any engineer registered as a Registered Professional Engineer of Queensland (RPEQ) is eligible for CPEng under the TTMRA. While this approach was supported by IPENZ in principle, there was concern within the New Zealand structural and geotechnical sector that RPEQ engineers with no demonstrated competence in seismic engineering could gain CPEng through the TTMRA.

The test is for ‘occupational equivalence’. As the CPEng Rules require candidates to be assessed in their practice area, the only way to establish occupational equivalence is to assess the TTMRA candidate in his or her practice area and compare that with a CPEng practising in the identical practice area in New Zealand. Thus, for an RPEQ to gain CPEng, he or she would have to be able to do the same things as a New Zealand engineer practising in the identical practice area. For the purposes of establishing occupational equivalence, all elements except element 2 (and element 11 to the extent to which it impacts on element 2) are deemed to have been satisfied through the candidate’s RPEQ registration. Hence, when establishing occupational equivalence for a TTMRA candidate, assessors will only be asked to make a judgement on element 2 (and element 11 as it relates to element 2) – comparing competence in the candidate’s practice area to that of a New Zealand engineer with the same practice area. The Trans-Tasman Mutual Recognition Act allows only 30 days for a decision to be made, so this assessment must be completed promptly to meet the statutory deadline.

If the TTMRA candidate was assessed for his or her RPEQ more than 5 years ago, he or she would be required to undertake an immediate continued registration assessment and would be given 3 months’ notice to submit a portfolio of evidence. This assessment is a full CRA and differs from the test to establish occupational equivalence in that all elements of the competence standard are assessed. While candidates may re-submit material from their earlier TTMRA application, assessors must assess all elements.
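The scope of a TTMRA assessment can be summarised in a short sketch. This is illustrative only – the names are hypothetical, and it assumes the twelve elements of the competence standard referred to elsewhere in these guidelines:

```python
from dataclasses import dataclass

@dataclass
class TtmraScope:
    elements_to_assess: list  # competence standard elements to be assessed
    days_allowed: int         # time frame for the step, in days

def ttmra_scope(years_since_rpeq_assessment: float) -> TtmraScope:
    """What is assessed, and in what time frame, per the TTMRA rules above."""
    if years_since_rpeq_assessment > 5:
        # Full CRA: all elements of the competence standard (elements 1-12);
        # the candidate is given 3 months' notice to submit a portfolio.
        return TtmraScope(elements_to_assess=list(range(1, 13)), days_allowed=90)
    # Occupational equivalence test: element 2 only, plus element 11 to the
    # extent it relates to element 2; the Act allows 30 days for a decision.
    return TtmraScope(elements_to_assess=[2, 11], days_allowed=30)
```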

39. APPLICATION OF TTMRA PRINCIPLES TO CREDIT SCHEDULE

IPENZ has decided to apply the TTMRA principles more broadly. For example, candidates with IntPE from other jurisdictions, registration on the Australian National Professional Engineers Register (NPER) or CPEng(Australia) who apply for CPEng in New Zealand will have the ‘occupational equivalence’ test applied to gain admission as CPEng. National Office may approach a Staff Assessor or set up an assessment panel specifically to establish occupational equivalence, in which case only elements 2 and 11 would need to be considered.

For advice to overseas candidates with previous competence assessments outside New Zealand who are seeking ‘credit’, assessors are referred to the IPENZ ‘credit schedule’ – see http://www.ipenz.org.nz/IPENZ/Forms/pdfs/Credit_for_Registrants_from_other_Jurisdictions.pdf


40. ADMINISTRATIVE INFORMATION

IPENZ will make payment to Staff Assessors on receipt of an invoice for an assessment. Invoices (with assessor logs) can be sent as soon as the assessment panel’s reports (completed CA07 forms) are sent to National Office ready for presentation to the Competence Assessment Board.

Fees for Staff Assessor (GST exclusive, if registered for GST) are:

Assessments involving an AFA assessment $625.

CRAs involving an initial interactive assessment $330.

CRAs involving an additional interactive assessment $XXXX

Please note: The current fees review will result in increases in payments after 1 January 2015.

Travel costs and toll calls – actual and reasonable costs will be reimbursed in accordance with the IPENZ Staff policy on reimbursement of expenses (see expense claim form at http://www.ipenz.org.nz/ipenz/forms/pdfs/IPENZ%20Expense%20Claim.xls?86314).

41. CPENG RULES AND IPENZ REGULATIONS COVERING ASSESSMENTS

Assessors should always refer to the relevant Rules or Regulations when making an assessment.

CPEng Rules are located at http://www.ipenz.org.nz/ipenz/forms/pdfs/CPEngRules.pdf.

The IPENZ Regulations covering the other current competence-based registers are at http://www.ipenz.org.nz/ipenz/forms/pdfs/IPENZ_Competence_register_regulations_Final_April_2007.pdf.

41.1 MAPPING IPENZ REGULATIONS FOR COMPETENCE REGISTERS AND CPENG RULES

IPENZ Regulations | CPEng Rules
2 | 3
3 | –
4 | –
5 | –
6 | –
7 | –
8 | 8
9 | 9
10 | –
11 | 10
12 | 11
13 | 12
14 | 13
15 | 14
16 | 15
21 | 20
22 | 21
23 | 22
24 | 23
25 | 24
26 | 25
27 | –
28 | 27
29 | 28
30 | 29
31 | 30
32 | 31
33 | 32
34 | 33
43 | 42
44 | 43
45 | 44
46 | 45
47 | 46
48 | 47
49 | 48
50 | 49
51 | 50
52 | 51
53 | 52
54 | 53
88 | –
89 | 72
90 | 73
91 | 74
92 | 75
93 | 76

(A dash indicates no corresponding provision.)


42. POLICY ON TERM TO NEXT ASSESSMENT

Objectives

To achieve a consistent application of the term to next assessment based on competence-related risk factors and to protect the credibility of the quality mark of the current competence registers.

The two proxies considered relevant to this risk and risk trajectory are:

1. The level of an engineer’s competence when assessed against the standard (either as marginally meeting the standard, satisfactorily meeting the standard, or demonstrating competence well above the standard); and

2. An engineer’s assessment history. Past evidence of engineers having maintained competence over a period of time is a good indicator of their being able to maintain competence into the future. Thus, engineers who have previously demonstrated an equivalent level of competence are likely to be a lower risk than those who have not previously been assessed. Similarly, those who marginally met the standard in the past can be considered a higher risk.

A candidate is considered to meet the standard marginally when his or her competence is marginal in elements covering critical aspects of his or her practice area.

42.1 POLICY APPLICATION

An assessment panel should apply this policy once it has decided to recommend that a candidate meets the relevant standard of competence for registration and is deciding its recommendation for the candidate’s term to next assessment. If a panel deviates from the policy in its recommendation, it must document its reasons.

42.2 POLICY FOR TERM TO NEXT ASSESSMENT

The table below summarises the policy, which uses a risk-based approach in setting the term to next assessment.

Two years

AFA: The applicant meets the standard for registration but only marginally (i.e. one or more of the elements regarded as critical to the applicant’s practice area were assessed at lower than ‘consistently demonstrates competence’).

CRA: The candidate demonstrates that he/she is still able to practise competently, but only marginally (e.g. competence was not demonstrated in the initial holistic assessment so an element by element analysis was undertaken; and/or the candidate’s competence was assessed as marginal in elements covering critical aspects of his or her practice area).

Four years

AFA: The applicant meets the standard for registration.

CRA: The candidate demonstrates (through work samples) that he/she is still able to practise competently, AND EITHER there is evidence of reasonable steps being taken to maintain the currency of his/her knowledge but the steps being taken to maintain competence are barely adequate, OR he/she has taken reasonable steps to maintain the currency of his/her knowledge and skills but the last term to re-assessment was less than four years.

Six years

AFA: The applicant meets the standard for registration, AND EITHER the assessment panel has identified evidence of the applicant demonstrating competence at a significantly higher level than the minimum standard for registration, OR the applicant was successful in an assessment to an equivalent standard of competence within the last six years, OR the applicant is currently registered on a register recognised as requiring an equivalent level of competence.

CRA: The candidate satisfactorily demonstrates (through work samples) that he/she is still able to practise competently, AND he/she has taken reasonable steps to maintain the currency of his/her knowledge and skills, AND the last term to re-assessment was not less than four years.
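As an aid to applying the table consistently, the two/four/six-year logic can be sketched as follows. This is a minimal illustration with hypothetical names – the documented-evidence requirements in the table cannot be reduced to simple flags and still apply in full:

```python
from typing import Optional

def term_to_next_assessment(is_afa: bool,
                            meets_standard: bool,
                            marginal: bool,
                            well_above_standard: bool = False,
                            maintenance_adequate: bool = True,
                            last_term_at_least_four_years: bool = True) -> Optional[int]:
    """Sketch of the two/four/six-year policy table above. Returns None when
    the candidate does not meet the standard (a term is then not applicable)."""
    if not meets_standard:
        return None
    if marginal:
        # Critical elements assessed below 'consistently demonstrates competence'.
        return 2
    if is_afa:
        # Six years requires documented evidence of competence significantly
        # above the minimum (or an equivalent assessment/registration
        # within the last six years).
        return 6 if well_above_standard else 4
    # CRA: six years requires reasonable maintenance steps AND a previous
    # term of not less than four years; otherwise four years.
    return 6 if (maintenance_adequate and last_term_at_least_four_years) else 4
```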

COMPETENCE ASSESSMENT BOARD ADVICE: GUIDANCE ON APPLICATION OF POLICY ON TERM TO NEXT ASSESSMENT

Purpose: To provide advice to panels on application of the policy.

1. BACKGROUND


During 2012 the CAB developed a risk-based policy to guide panels when considering the term to next assessment. It suggested a four-year term to next assessment as a starting point in an assessment for admission (AFA), with provision for panels to increase this term (to six years) or reduce it (to two years) if there is supporting evidence to justify such a decision.

The term ‘significantly’ was used to describe the extent to which demonstrated competence needed to be above the minimum standard to justify a six-year term (for AFAs). Recent experience has shown inconsistency by panels in the application of this provision.

Accordingly, the CAB developed advice to assist panels in achieving a more consistent approach.

2. CAB ADVICE

The only way to measure a candidate’s competence is by the evidence presented (by the candidate, referees etc) to an assessment panel.

Accordingly, CAB offers the following advice to panels:

Once a panel has decided a candidate meets the standard for initial registration (i.e., in an AFA), the starting point in deciding the term to next assessment should be four years;

If there is evidence supporting competence demonstrably above the minimum standard (meaning there is evidence that the engineer is a ‘low risk’ in terms of his or her ability to practise ‘safely’ as an independent practitioner), a six-year term can be recommended. Panels must document evidence to support this recommendation.

Evidence supporting a six-year term would normally include several of the following:

Postgraduate qualifications building on a Washington Accord undergraduate degree. Note: the need to complete a Knowledge Assessment introduces an element of risk to the assessment, which would generally point towards a four-year term

Evidence of a track record of successful assessments in other engineering jurisdictions (such as overseas registration, Heavy Vehicle Specialist Certifier)

Evidence that the candidate is operating at a career stage requiring technical competence (independent practice – National, or International scope) and/or leadership competencies (Team Leader or Technical Manager) beyond the minimum standard for independent practice

Evidence that the candidate is continuing to develop their engineering knowledge and engineering practice through a structured and proactive approach to professional development

Evidence includes demonstration over a sustained period (i.e. typically four years).

Evidence of “safe” practice across a broad range of engineering projects and activities in their practice area that is supported by good risk management (element 7), professional development (element 11) and engineering judgement (element 12).

Similarly, a two-year term should be recommended where evidence in elements critical to the candidate’s practice area is marginal or inconsistent. This evidence may be exhibited in poor judgement, risk management or continuing professional development; limited complexity in engineering problems and/or activities; limited knowledge (or application of it); or poor professional behaviour.

If a panel is satisfied that a candidate meets the minimum standard, no specific justification is required if a four-year term is proposed.

5 December 2012

Further advice

CAB added further advice in February 2013 that the CA07 report should show evidence that the panel had given consideration to the term to next assessment in making its recommendation. For example, after showing an AFA candidate had demonstrated he/she met the competence standard, report: “The applicant met the required standard of competence and accordingly the panel recommends a 4-year term to next assessment.” Or: “The applicant demonstrated a significantly higher level of competence above the standard through {state the relevant factors as listed in the text on page 68}. Accordingly the panel recommends a 6-year term to next assessment.”


43. INDEX

A

Active listening – See Interactive assessment ... 26

Assessment panel
  Purpose of panel ... 7

Assessment tools
  CPEng equivalence ... 35
  Overview ... 31
  Referee being independent ... 35
  Referee contact ... 32
  Referee eligibility ... 35
  Referee questioning ... 33
  Requesting further information ... 32

Assessor Log
  Quick Log ... 17
  Working Log ... 16

Assessor portal
  Overview ... 13

C

CA07 forms
  Advice to candidates ... 10

CA07 Report
  Assessing Element 2 ... 43
  Benchmarking qualifications ... 20
  Complexity of Engineering ... 42
  CRAs for overseas candidates ... 38
  Implied Evidence ... 41
  Negative recommendations ... 42
  Previous term to assessment ... 19
  Selecting competence levels ... 19

CAB Advice
  Advice from CAB Member ... 36
  Application of policy on term ... 56
  Candidates from overseas ... 43
  Candidates with convictions ... 19
  Checklist for CRA reports ... 59
  Communicating with candidates ... 62
  Consistency within CA07 reports ... 50
  Dam safety engineering ... 53
  Natural justice submissions ... 62
  Relevant registers ... 55
  Reports referred back to panels ... 62
  Review practice area descriptions ... 53
  Use of written assignment ... 30
  Using referee input ... 36
  Validate practice field information ... 54
  When to contact referees ... 35

Competence concepts
  Decision making ... 5
  Introduction ... 5
  Validation of evidence ... 6

Complaint against you – See Conflicts of interest ... 9

Confidentiality
  Respecting privacy ... 9

Conflicts of Interest
  Disclosure ... 9

CPEng Equivalence – See Assessment tools ... 35

CPEng Rules
  Mapping with IPENZ Regulations ... 65

CRA
  Material change to practice area ... 37
  Reasonable CPD ... 37

E

Exercising Judgement
  Assessment decisions ... 28
  Consistency within CA07 reports ... 50
  Holistic decisions ... 47
  Practice area decisions ... 52
  Term to next assessment ... 56

G

Giving feedback – See Interactive assessment ... 28

Good practice
  Assessing Element 2 ... 43

I

Interactive assessment
  Acknowledging status ... 24
  Active Listening ... 26
  After interactive ... 29
  Building rapport ... 23
  Closing ... 27
  During ... 23
  Giving feedback ... 28
  Initial contact ... 22
  Overview ... 22
  Question bank ... 26
  Questioning techniques ... 25
  Use as assessment tool ... 22
  Venue set-up ... 23

P

Policy
  Term to next assessment ... 67

Practice area
  Validation of ... 51

Q

Question bank – See Interactive assessment ... 26

R

Referee being Independent – See Assessment tools ... 35

Requesting further information – See Assessment tools ... 32

T

Triggers for assessors
  Positive and Negative ... 6

TTMRA
  Assessments ... 64

W

Written assignment
  CAB advice ... 30
  Types ... 29