Designing Assessment Tasks: Rita Kizito, March 2014 – Workshop for Earth Sciences Department, April 2014

Faculty assessment presentation, April 2014


Page 1

Designing Assessment Tasks

Rita Kizito, March 2014

Workshop for Earth Sciences Department, April 2014

Page 2

In this workshop we intend to…

• Link assessment to module learning outcomes

• Learn how to set assessment activities at the appropriate NQF level

• Briefly interrogate assessment (purpose, types, features)

• Refine/develop assessment tasks

hand-out #1 – UWC Assessment policy

Page 3

Integrate outcomes and assessment

Knowledge (reproduction/creation): recall, comprehension, application, analysis, synthesis, evaluation

Attitudes: inquiry-focused; ethically, environmentally and socially aware; motivated

Skills: skilled communicators; autonomous and collaborative; problem solving

Page 4

Are your outcomes well designed?

Specific – Provides details of the aspect expected

Meaningful – Written in understandable language

Appropriate – Suits the learner’s abilities and experiences

Realistic – Achievable within given time constraints

Testable – Offers some measure of progress/achievement

(Butcher et al., 2006, p. 41)

Page 5

Bloom’s taxonomy for generating outcomes and assessment tasks

Page 6

Biggs’ (2003) SOLO taxonomy, from lower-level to higher-level outcomes: single point → multiple unrelated points → logically related answer → unanticipated extension

Page 7

Biggs’ (2003) SOLO taxonomy:

hand-out #2 – Learning taxonomy

SOLO Taxonomy: Structure of the Observed Learning Outcome

Prestructural → Unistructural → Multistructural → Relational → Extended abstract

(Quantitative phase → Qualitative phase)

Page 8

Level Descriptors - SA Qualifications Framework

Level descriptors ensure coherence in learning achievement and facilitate assessment at the appropriate levels.

NQF Level   Qualification              Education Band
10          Doctorate                  Higher Education
9           Masters                    Higher Education
8           PG Diploma/Cert, Honours   Higher Education
7           Bachelor (ord.) degree     Higher Education
6           Diploma                    Higher Education
5           Certificate                Higher Education
4           Matric                     Further Education
3           –                          Further Education
2           –                          Further Education
1           –                          General Education

Page 9

Level Descriptors

KNOWLEDGE (Year 1): list, name, identify, show, define, recognize, recall, state

COMPREHENSION (Years 1, 2): summarize, explain, put into your own words, interpret, describe, compare, paraphrase, differentiate, demonstrate, visualize, find more information about, restate

APPLICATION (Years 2, 3): solve, illustrate, calculate, use, interpret, relate, manipulate, apply, classify, modify, put into practice

ANALYSIS (Year 3): analyze, organize, deduce, choose, contrast, compare, distinguish

SYNTHESIS (Years 3, 4): design, hypothesize, support, schematize, write, report, discuss, plan, devise, compare, create, construct

EVALUATION (Year 4): evaluate, choose, estimate, judge, defend, criticize, justify

Page 10

Level Descriptors - SA Qualifications Framework

• What should students know about the learning area/subject?

• What types of problems should the student be able to solve? In what contexts? With what methods/procedures?

• How should students obtain and process information?

• How should students communicate?

• How independent should students be?

• Which values should students uphold?

hand-out #3 – level descriptors

Page 11

Curriculum alignment

“We first have to be clear about what we want students to learn, and then teach and assess accordingly in an aligned system of instruction” (Biggs, 1996).

Page 12

Unaligned course

Teacher’s intentions: e.g. explain, relate, prove, apply
Student’s activity: e.g. memorize, describe
Exam’s assessment: e.g. memorize, describe

Page 13

Aligned course

Teacher’s intentions: e.g. explain, relate, prove, apply
Student’s activity: e.g. explain, relate, prove, apply
Exam’s assessment: e.g. explain, relate, prove, apply

Page 14

Why assess? Two key purposes

• Summative assessment: making judgements about student learning for certification/grading

• Formative assessment: helping to prompt and promote further learning, and monitoring learning progress

Page 15

What does assessment do?

• Defines what students will concentrate on when learning

• Affects how they learn

• Specifies what counts as learning

• Provides information about shortfalls between performance and specification

• Stimulates conversation about, and reflection on, improvement


(Knight, 2001)

Page 16

Types of assessment

• Laboratory work

• Class presentations

• Essays / research

• Online tests

• Problem-based learning

• Case studies

• Projects

• Reflective journal

• Multiple choice questions (MCQs)

• Portfolio

• Group work

Page 17

Assessment features

Reliability refers to the degree to which an assessment tool produces stable and consistent results.

• Test-retest reliability: correlating tests given twice, over a period of time, to the same group of individuals.

• Parallel forms reliability: correlating different versions of an assessment tool (probing the same construct, skill, knowledge base, etc.) given to the same group of individuals.

• Inter-rater reliability: a measure of the degree to which different judges or raters agree in their assessment decisions.

• Internal consistency reliability: an evaluation of the degree to which different test items that probe the same construct produce similar results.

Page 18

Assessment features

Validity refers to how well a test measures what it is purported to measure.

• Face validity: ascertains that the measure appears to be assessing the intended construct under study.

• Construct validity: ensures that the measure is actually measuring what it is intended to measure (i.e. the construct), and not other variables.

Ways to improve validity:

• Make sure your goals and outcomes are clearly defined and operationalized.

• Match your assessment measures to your learning outcomes; use outside reviewers.

• If possible, compare your measure with other measures, or data that may be available.

Page 19

Rubrics/ Assessment criteria

• A rubric is a scoring guide (a set of expectations) used to judge student performance. It shows students how well they have performed on an assessment task.

• It uses assessment criteria and levels of performance to break a task into parts, explaining what is required.

• It can be used for a large number of tasks (essays, research projects, oral presentations, portfolios, etc.) and is especially useful for assessing complex, subjective work.

Page 20

Rubrics/ Assessment criteria

Examples of rubrics.

hand-outs # 4 & 5 – Rubrics

Page 21

Task 1

1. Identify one outcome from your module outline and develop an assessment task (you can develop more than one task).

2. Create a rubric for one of the tasks, in which you develop criteria (2 or more) and levels of expected achievement for that task.

3. Allow a colleague to review the tasks and make comments.

Page 22

Task 2

Compare the two past question papers and answer the following questions:

1. Are they appropriate for the grade levels they have been prepared for?

2. Which one would you give to your students? Why?

3. How would you modify each one to fit your own context?

hand-outs # 6 & 7 – Past papers

Page 23

Hints for writing exam papers

1. Don’t do it on your own! Get one or two colleagues to attempt your questions.

2. Have your intended learning outcomes in front of you as you draft your questions.

3. Keep your sentences short.

4. Work out what you’re really testing.

5. Don’t measure the same things again and again.

6. Include data or information in questions to reduce the emphasis on memory.

Page 24

Hints for writing exam papers

7. Make the question layout easy to follow.

8. Write out an answer to your own question.

9. Decide what the assessment criteria will be.

10. Work out a tight marking scheme.

11. Use the question itself to show how marks are to be allocated.

12. Try your questions out.

13. Proof-read your exam questions carefully.

hand-out #8 – A moderation checklist for exam papers
hand-out #9 – Moderation checklist (RGU)

Page 25

References

Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32(3), 347-364.

Butcher, K. R. (2006). Learning from text with diagrams: Promoting mental model development and inference generation. Journal of Educational Psychology, 98(1), 182.

Knight, P. (2001). A briefing on key concepts: Formative and summative, criterion and norm-referenced assessment. Learning and Teaching Support Network.

SAQA (2000). The South African Qualifications Authority level descriptors for the South African National Qualifications Framework. http://www.saqa.org.za/docs/misc/level_descriptors.pdf

Page 26

Thank you