
This may be the author’s version of a work that was submitted/accepted for publication in the following source:

Cathcart, Abby, Greer, Dominique, & Neale, Larry (2014) Learner-focused evaluation cycles: facilitating learning using feedforward, concurrent and feedback evaluation. Assessment and Evaluation in Higher Education, 39(7), pp. 790-802.

This file was downloaded from: https://eprints.qut.edu.au/65471/

© Consult author(s) regarding copyright matters

This work is covered by copyright. Unless the document is being made available under a Creative Commons Licence, you must assume that re-use is limited to personal use and that permission from the copyright owner must be obtained for all other uses. If the document is available under a Creative Commons Licence (or other specified licence) then refer to the Licence for details of permitted re-use. It is a condition of access that users recognise and abide by the legal requirements associated with these rights. If you believe that this work infringes copyright please provide details by email to [email protected]

Notice: Please note that this document may not be the Version of Record (i.e. published version) of the work. Author manuscript versions (as Submitted for peer review or as Accepted for publication after peer review) can be identified by an absence of publisher branding and/or typeset appearance. If there is any doubt, please refer to the published source.

https://doi.org/10.1080/02602938.2013.870969


Assessment and Evaluation in Higher Education, Published Online First, December 2013


Learner-focused Evaluation Cycles: Facilitating Learning Using Feedforward,

Concurrent and Feedback Evaluation

Abby Cathcart*

Dominique A. Greer

Larry Neale

QUT Business School, Queensland University of Technology, Brisbane, Australia

*Dr Abby Cathcart, School of Management, QUT Business School, Queensland University

of Technology

Mail: GPO Box 2434, Brisbane Qld 4001, AUSTRALIA; Phone: 07 3138 1010;

Email: [email protected]

Brief Biographical Notes

Abby Cathcart is currently a Senior Lecturer in Management at Queensland University of

Technology in Brisbane, Australia. Abby holds a PhD from Leicester University in the

United Kingdom and a Postgraduate Certificate in Academic Practice from Northumbria

University. She was awarded an Australian Office of Learning and Teaching Excellence

Award in 2011. Her research includes doctoral students’ teaching development, employee

voice, and customised working practices.

Dominique A. Greer is currently Lecturer of Marketing at Queensland University of

Technology in Brisbane, Australia. Dominique holds a PhD from Queensland University of

Technology. Her research interests include academic professional development, service

deviance, and co-creation in professional services.

Larry Neale is currently Associate Professor of Marketing at Queensland University of

Technology in Brisbane, Australia. Larry earned a PhD from the University of Western

Australia and an MBA from the University of Illinois in the United States. Larry’s current

research interests include consumer ethics, electronic marketing, sports marketing, and

marketing education. Larry's research activities frequently take him to Asia and the United

States as an instructor, presenter and facilitator.

* Corresponding Author


Learner-focused Evaluation Cycles: Facilitating Learning Using Feedforward,

Concurrent and Feedback Evaluation

Abstract

There is a growing trend to offer students learning opportunities that are flexible,

innovative, and engaging. As educators embrace student-centred, agile teaching and

learning methodologies, which require continuous reflection and adaptation, the need to

evaluate students’ learning in a timely manner has become more pressing. Conventional

evaluation surveys currently dominate the evaluation landscape internationally, despite

recognition that they are insufficient to effectively evaluate curriculum and teaching

quality. Surveys often (1) fail to address the issues for which educators need feedback,

(2) constrain student voice, (3) have low response rates, and (4) occur too late to benefit

current students. Consequently, this paper explores principles of effective feedback to

propose a framework for learner-focused evaluation. We apply a three-stage control

model, involving feedforward, concurrent and feedback evaluation, to investigate the

intersection of assessment and evaluation in agile learning environments. We conclude

that learner-focused evaluation cycles can be used to guide action so that evaluation is

not undertaken simply for the benefit of future offerings, but rather to benefit current

students by allowing ‘real-time’ learning activities to be adapted in the moment. As a

result, students become co-producers of learning and evaluation becomes a meaningful,

responsive dialogue between students and their instructors.


Keywords: evaluation; assessment; feedforward; feedback; agile teaching and learning;

concurrent evaluation; technology

Introduction

In recent years, a growing trend has challenged conventional learning and teaching structures

in order to offer students flexible, innovative and engaging learning experiences. Various

concepts have emerged to represent this transformation in university teaching, ranging from

the flipped or inverted classroom (Goodwin and Miller 2013; Steed 2012) to agile teaching

and learning (McAvoy and Sammon 2005). Agile teaching and learning methodologies are

adapted from the principles of software development outlined in the Agile Manifesto (Beck et

al. 2001). This concept represents the view that educators create agile learning environments

when they adapt the curriculum and delivery to match the unique needs, knowledge and

preferences of the student cohort (Chun 2004). Educators who use agile teaching and learning

strategies tailor their materials and modify course delivery (often in ‘real-time’) in response

to current student needs. This approach relies on a reorientation of practice from managerial

approaches that are based on centralisation and control to those based on decentralisation and

flexibility (Masson and Udas 2009). Agile teaching methods are especially useful for classes

where there is diversity of incoming knowledge levels, or where knowledge levels can only be ascertained once a teaching session has started.

While there is evidence of a trend towards agile student-centred approaches to

learning and teaching, conventional methods of evaluating practice continue to govern

decision-making in the sector. For example, student surveys have been the predominant tool

used to evaluate teaching in Australia, the United Kingdom and the United States for many

years (Freeman and Dobbins 2013; Otto, Sanford, and Ross 2008). However, evaluating

teaching and learning via student surveys alone is problematic. First, there is a growing


recognition that effective evaluation of higher education teaching and curriculum needs to

draw on evidence from a number of sources, rather than relying purely on student survey data

(Alderman, Towers, and Bannah 2012; Berk 2005; Trigwell, Rodriguez, and Han 2012).

Academics are questioning the impact of feedback via questionnaires on the quality of

teaching and learning, and there is little published evidence of the systematic use of student

evaluations for improving practice (Freeman and Dobbins 2013; Smith 2008a). All too often,

gathering student feedback is seen by academic staff as an exercise in compliance and

external auditing rather than a way of developing their practice (Stein et al. 2013). Second,

surveys often conflate teaching and subject evaluation (Timpson and Desley 1997) and for

many staff, the data from student surveys are too little and come too late. Third, rather than

providing evidence for reflection, evaluation can simply lead to confusion. As Berk (2011)

has noted, for every useful piece of evidence there is at least one which is ‘junk’. Finally,

research highlights that student ratings are influenced by a number of biases, including

expected grade, ethnic background, gender, age (Worthington 2002) and instructor

characteristics such as sexiness (Felton, Mitchell, and Stinson 2004) or the ‘seductive style’

of the lecturer (Ware and Williams 1975, 154).

In short, relying only on post-experience student survey data to evaluate practice and

inform student learning is dangerous. Alternative evaluative methods should be employed to

fully understand the student learning experience and reflect on how best to improve it. For

evaluation to be effective, it needs to move beyond external reporting compliance that

informs future practice to instead identify gaps in student learning in order to benefit current

and future students. By focusing on the learner, evaluation should be a mechanism not just

for gathering student voice, but using that voice to inform practice and enhance learning. Any

student views that are collected should be used as part of a culture of improvement (Harvey

2003; Josefson, Pobiega, and Stråhlman 2011).


Consequently, we propose a model of learner-focused evaluation, which adapts a

systems approach to control and extends work on principles of effective feedback in order to

facilitate student-centred approaches to learning. The first section of the paper explores

principles of effective feedback and emphasises the importance of dialogue between students

and educators. The second section of the paper describes the types of control that underpin a

system view of organisation and adapts this work to propose a model of learner-focused

evaluation cycles. The third section of the paper illustrates the implementation of the cycle

using an Instant Response System (IRS) as an evaluation tool. The final section of the paper

presents a discussion and a conclusion.

Principles of effective feedback and evaluation

It is widely accepted in higher education that assessment drives learning (Brown 2010; Race

2010; Stobart 2008) and that effective feedback is strongly related to improved achievement

(Nicol and Macfarlane-Dick 2004). However, both British and Australian students report

more dissatisfaction with the assessment and feedback processes in higher education than any

other aspect of their student experience (Australian Council for Educational Research

[ACER] 2009; Higher Education Funding Council for England and Wales [HEFCE] 2012;

Price et al. 2010). Students complain that feedback on their learning is narrow, vague,

confusing and arrives too late to be useful (Race 2010; Weaver 2006).

In order for feedback to be useful, it should be constructive and help students to

develop skills to evaluate their own performance, as well as provide them with opportunities

to close the gap between current and desired performance (Nicol and Macfarlane-Dick 2004).

This requires a shift away from transmission models of feedback where the student is seen as

a passive receiver of feedback knowledge. Instead, there are calls to develop models of active


learning in which students construct feedback so that the act of feedback production is just as

valuable for learning as for modifying future deliveries (Nicol, Thomson, and Breslin 2013).

One fundamental principle of good feedback is that it should feed forward so that it

can be used to inform future work (Orsmond et al. 2011). Stobart (2008) argues that

assessment is a social activity that shapes both learner identity and learning. He positions

assessment for learning, rather than assessment of learning, emphasising the formative nature

of feedback in order to feedforward and support future learning. Others (see for example:

Nicol and Macfarlane-Dick 2004) argue that formative assessment should be a core part of

teaching and learning, and that feedback and feedforward should be systematically embedded

in curriculum practices.

Recently, Freeman and Dobbins (2013) argued that the principles of effective

feedback to students should be applied to the evaluation of learning and teaching. Many of

the problems associated with current evaluation practices in the sector are similar to the

problems identified with feedback and assessment. Thus, recent developments in assessment

and feedback might also apply to current concerns about evaluation practices. For example, a

key problem with the student survey approach to evaluating learning and teaching is that it

assumes that ‘one size fits all’: a standard survey will work for every type of course and

student cohort. Clearly this is not the case, and there has been a growing recognition that

different evaluation tools will work best in different circumstances depending on the purpose

of the evaluation itself.

Focusing on the purpose for undertaking evaluation is a crucial step in determining

the most appropriate tools to generate the most appropriate outcomes from an activity (Smith

2008b). The purpose of evaluation can vary from formative (i.e. providing diagnostic

feedback to educators) to summative (i.e. measuring teacher effectiveness for appointment or

promotion, or quality assurance purposes). Most of the emphasis on the use of student survey


data has been for personnel decisions rather than enhancing teaching effectiveness (Marsh

2007); consequently, conventional forms of evaluation are of questionable relevance for new

student-centred approaches to learning (Abrami, d'Apollonia, and Rosenfield 2007). In

particular, the needs of stakeholder groups should be a key consideration (Spiel, Schober, and

Reimann 2006) so that the evaluation focuses not simply on telling the educator what he or

she needs to know, or informing an external compliance requirement, but on identifying gaps

in student learning and enabling current students to benefit from that knowledge. In short, any

student views that are collected need to be used as part of a culture of improvement (Harvey

2003; Josefson, Pobiega, and Stråhlman 2011).

By focusing on the learner, evaluation should be seen as a mechanism for using

student voice to inform practice and enhance learning. This is not just a future-focused

activity; it also has immediate benefits for current learners. Taking a formative stance,

Freeman and Dobbins (2013) draw on outcomes of the Student Enhanced Learning through

Effective Feedback (SENLEF) project to establish a conceptual model and principles of good

feedback practice (Juwah et al. 2004). Their approach emphasises the need for dialogue

between students and educators and consequently a process of evaluation that seeks to benefit

both parties. Evaluation is positioned as diagnostic in purpose and an opportunity to close the

gap between current and desired performance. Consequently, feedback benefits both the

educator and the learners and enables them to collaborate to monitor learning and reflect on

changes to enhance it (Freeman and Dobbins 2013).

This paper adapts and extends Freeman and Dobbins’ (2013) work by applying core

principles of effective feedback to develop a model of learner-focused evaluation in order to

facilitate student-centred approaches to teaching. The next section of the paper describes how

the model is underpinned by a systems view of organisational control, which draws on three

types of control to maximise performance and adapt to changing needs.


Control and learner-focused evaluation cycles

Although an agile learning environment is flexible by definition, providing structured agility

requires a scaffold of control so that neither educators nor learners lose sight of the intended

learning outcomes. Student evaluations can be seen as a form of control; providing data

which can be used to reflect upon the success of learning activities, and prompting action if

appropriate. Drawing on our disciplinary and practitioner expertise, and experience of quality

control frameworks, we searched the management literature to find theoretical models that

had been developed to scaffold control while allowing for flexibility and innovation. One

such theoretical perspective is known as a systems view of organisation. According to this

view, the term control refers to both monitoring activities and taking corrective action in

order to ensure that goals are achieved. The systems view of organisation proposes three

types of control, each of which represents a different stage of the productive cycle:

feedforward, concurrent and feedback (Bartol et al. 2008, 342). Systems Theory is used

extensively in management practice for applications as diverse as designing manufacturing

plants, formulating business strategy and structuring high-performing teams (Martin and

Fellenz 2010). Furthermore, feedforward, concurrent and feedback controls are increasingly

featuring in research outside of business for purposes as diverse as planning interventions to

support students struggling with algebra (Cooper 2011), teaching aircraft pilots how to land

planes (Huet et al. 2009) and treating brain damage (Botzer and Karniel 2013).

In this paper, the systems view of organisational control is adopted and adapted to

provide a scaffold of control in agile learning environments. First, feedforward control is

defined as regulating inputs to the learning process to ensure they meet the standards

necessary for the planned learning to occur. This refers to activities such as learning about

participants’ backgrounds and experiences in order to establish baselines for new learning, or


selecting appropriate learning environments based on identified needs. This stage focuses on

planning learning activities that are tailored to learner characteristics and needs.

Next, concurrent control is defined as regulating and adapting ongoing learning

activities to ensure that they conform to standards and learners are on track to meet planned

goals. This refers to activities such as evaluating learning frequently and formatively, as well

as taking action to adjust curriculum design when necessary. This stage focuses on assuring

that the planned learning is happening, and if it is not, taking action immediately to address

learner needs.

Finally, feedback control is defined as regulation exercised after the learning has been

assessed and involves checking that the output meets the goal. This refers to activities such as

analysing assessment grades or conducting post-assessment surveys to determine whether

learners have met the intended outcomes. This stage focuses on revising curriculum or

learning outcomes to better meet the needs of future learners.

All three types of control can contribute to effective learning by alerting both the

learner and the instructor to gaps in understanding; however, the success of each form

depends very much on the timing and context. If a single form of control is used in isolation

it will not necessarily be effective in promoting learning. Consequently, models of control

that use all three dimensions—feedforward, concurrent and feedback—are used widely in

business and thus would be most effective to evaluate learning experiences.

Our proposed model of Learner-Focused Evaluation Cycles (see Figure 1) enables

instructors to evaluate learning in a number of different ways at a number of different times

in order to inform the reflective process and provide feedback to both the educator and the

students. The cyclical model is intended to guide action so that evaluation is not something

undertaken simply for the benefit of future offerings of the unit or course, but rather an


integrated system that is intended to benefit current students by allowing instructors to reflect

on and adapt ‘real-time’ learning activities.

[please insert Figure 1 about here]

The model of Learner-Focused Evaluation Cycles (LFEC) extends existing work on effective

feedback in two key ways. First, it incorporates a third form of control, concurrent evaluation,

into the feedback/feedforward approach to assessment in order to regulate and adapt learning

activities in agile learning environments. Second, it emphasises the relationship between

assessment and evaluation by applying the principles of assessment for learning to enhance

course evaluations. As noted earlier, recent research recognises that effective evaluation of

higher education teaching and curriculum needs to draw on evidence from a number of

sources, instead of relying purely on student survey data (Alderman, Towers, and Bannah

2012; Berk 2005). Learner-focused evaluation cycles enable educators to draw on a range of

tools to gather multiple forms of evidence in order to plan, assure and revise during all stages

of learning.

The model has been tested using three established evaluation tools—Think-Pair-Share

(TIPS), One Minute Papers (OMPs), and Instant Response Systems (IRS)—that were

implemented at all points of evaluation to assess their ability to capture data that would build

structured agility into learning experiences and environments. The techniques were tested

with two different groups of students. The first group comprised 40 doctoral students

undertaking a program to prepare them for university teaching roles as part of an academic

teaching development program at an Australian university. The students attended six three-hour workshops over a four-month period. The doctoral students were drawn from across the

university, and were from myriad disciplines and backgrounds. The second group were 70


students registered in a postgraduate marketing unit as part of a master’s degree in Marketing.

The students were predominantly from countries where English is a second language, and

attended 12 two-hour seminars over the semester. All three evaluation tools were successfully

used as mechanisms of feedforward, concurrent and feedback evaluation and data generation

with both cohorts.

With the doctoral student group, feedforward, concurrent and feedback evaluation data

were gathered in each of the six workshops. This was a deliberate strategy to model the

innovative approach to evaluation and to expose these current and future university teachers

to a diverse range of tools. With the Marketing student group, feedforward, concurrent and

feedback evaluation data were gathered across the semester, with feedforward concentrated in

sessions one and three, concurrent occurring in sessions two, four, six, eight, and ten, and

feedback occurring in session 12. Again, this was a deliberate strategy to minimise additional

workload for staff and students, and ensure that students were able to realise a tangible

benefit from participating in the evaluations by seeing clear responses to their evaluation.

To illustrate how these tools could be used to facilitate learning, the next section of

the paper explains how one evaluation tool, Instant Response Systems, was used to complete

a full cycle of learner-focused evaluation. IRS has been selected for illustrative purposes

because of the growing use of IRS technologies in teaching, and its ability to collect and

visualise high volumes of responses rapidly. IRS can also be implemented with both large

and small cohorts.

Instant Response Systems

Instant Response Systems (IRS) have been used in higher education for many years and have

a demonstrated impact on engagement and learning (Cathcart and Neale 2012; Kift 2006;

Williamson Sprague and Dahl 2010). The benefits of using IRS to enhance learning include


improved academic performance, student satisfaction, engagement, attendance, participation

and perceived feedback (Bates and Galloway 2012; Keough 2012). IRS can involve web-based interfaces (facilitated by smartphones, laptops or tablets) or physical devices (e.g. clickers) linked to an infra-red receiver. At the university in question, two main systems are

in operation: clickers linked to TurningPoint software, and the web-based interface

GoSoapBox, which students access using their own web-enabled device. On this occasion,

clickers were assigned to each student and the instructor created a series of polls using

TurningPoint software. Questions were embedded in presentation software and students

selected a response by pressing a button on their clicker to vote anonymously for predetermined choices or by entering text to respond to open questions. Once the instructor closed

the poll, a graph would appear on the screen showing the responses in chart form or a list of

open-text comments. Each of the three stages of the learner-focused evaluation cycle is

outlined below.

Instant Response Systems for feedforward evaluation

Feedforward evaluation refers to regulating inputs to the learning activity to ensure they meet

the standards necessary for the planned learning to occur. Consequently, polls were designed

to generate data that could feedforward to subsequent teaching sessions. Examples of the

questions used include:

Which of the following topics would you like us to focus on next week?

Rank the following issues in terms of your confidence in dealing with them.

Which of the following approaches would help you learn about topic XXX (e.g. a case study, problem-based learning, quiz, peer instruction, class discussion, etc.)?

What questions would you like answered about topic XXX? (Suitable for open-text IRS)


Once the poll was closed, the results were shown to the class. The instructor then

summarised the prevailing position and explained how that information would be used to

feedforward to the next workshop. At this point, a dialogue about the results and the way in

which they might frame subsequent learning experiences was opened so that students could

collaborate in the evaluation of planned activities. A fundamental part of this approach was

the idea of joint ownership of course enhancement processes based on a vision of the learning

process as a co-operative enterprise (McCulloch 2009). It enabled the instructor to reframe

planned learning outcomes in order to explicitly address students’ responses. That is not to

say that intended outcomes are abandoned and replaced with the preferred outcomes

identified by students, but rather that a dialogue has been opened about both content and

process, which enabled a reflective discussion to feedforward into future learning. The tool

was used for evaluation and as a scaffold of control to ensure that learners did not drift too far

from intended learning outcomes.

Instant Response Systems for concurrent evaluation

Concurrent evaluation involves regulating and adapting ongoing activities to ensure they

conform to standards and are on track to meet the planned learning outcomes. Using IRS,

polls were designed to generate data that could provide a mechanism for concurrent control

by providing information about the students’ learning and experience in order to evaluate and

adjust activities to better meet their needs. Polls were conducted throughout the session so

that activities could be adapted in real-time to address the diverse needs of the student cohort.

Examples of the questions used included:

- Which aspect of the current session do you feel you need more help with?
- Demographic questions, or questions linked to the cohort’s mood, energy levels, and background experience


- Questions designed to evaluate the effectiveness of the session by testing understanding and the ability to apply the content to novel situations

Once students had been polled, the instructor either revealed the responses and opened a

discussion or, before revealing the result, asked the students to find someone who answered

differently and persuade them to change their answer. This process of peer instruction using

conceptual questions helps students explore their understanding and deepen their learning

(Mazur 1997). Once the discussions had taken place, the instructor asked students to vote

again, and then revealed the chart and summarised the issues. Gaps in understanding were

noted and the instructor explained how the data would be used concurrently to inform the rest

of the workshop. This meant reallocating time to problematic topics, revisiting the core

concepts, and/or providing opportunities for students to apply the challenging content to

different scenarios.

Instant Response Systems for feedback evaluation

Feedback evaluation involves regulation exercised after the learning opportunity is complete

and involves checking that the learning outcomes have been achieved. Using IRS, polls were

designed to generate data that could provide a mechanism for feedback to inform future

practice and offerings of the unit or course. Examples of feedback questions asked at the end

of the workshop series include:

- Which learning activities did you find most/least helpful in this workshop?
- How effectively did the learning activities prepare you for the assessment?
- What else would you have liked us to cover? (suitable for open text IRS)

The anonymous answers were revealed and used to facilitate a brief discussion aimed at

exploring and clarifying the responses. The instructor summarised the responses and

explained how the feedback would be used to reflect upon the teaching and curriculum. After


the class, responses were collated and used to reflect on the workshop for future iterations.

This stage of the learner-focused evaluation cycle was most similar to evaluating teaching

practice using student surveys; however, a key difference between these approaches was the

immediacy of the results and the opportunity for the instructor to engage in a ‘real-time’

discussion with students about the outcomes.

Discussion

Our model of learner-focused evaluation cycles encourages academics to customise

evaluation practices and use them to guide action in order to benefit current and future

students. Recognising that many of the problems associated with current evaluation practices

are similar to those problems identified with assessment and feedback, our model draws on

the principles of assessment for learning by using evaluation to open a dialogue with learners

and make them co-producers of their learning. Furthermore, evaluation is not a post-learning

exercise (equivalent to feedback on summative assessment), but, drawing on systems theory,

is conceived of as a series of cycles designed to plan, assure and revise curriculum and

learning events through feedforward, concurrent and feedback evaluation in agile learning

environments. Evaluation is thus not simply used for reporting purposes, or for the benefit of

future cohorts, but in a way that enhances real-time learning and maximises the opportunities

for students to influence and take ownership of their learning in order to best meet their

particular needs.

By making explicit the three stages of evaluative control, the educator enables

learners to take ownership of their learning through the creation of multiple opportunities for

student voices to be heard and for learners to reflect on their journey to achieving stated

outcomes. Rather than evaluation benefiting only future cohorts of learners, students directly


benefit by participating in the evaluation cycle, which feeds directly into reframing activities

to promote their learning. The immediacy of the data generated through the three stages of

the model facilitates agile approaches to learning and teaching in line with the principle of

seeking to adapt practice to meet the unique needs and preferences of each cohort of students.

In the learner-focused evaluation cycle, learners are placed at the very centre of the

evaluation experience.

However, there are a number of limitations to the model. First, implementing learner-

focused evaluation cycles is likely to increase the workload of academic instructors in a

number of ways. In the examples outlined above, instructors needed to (1) learn how to use

IRS technology, (2) ensure that students had access to a clicker or web-based device, and (3)

put thought into developing meaningful questions that would open up (rather than close

down) dialogue with students about their learning. Furthermore, time needed to be built into

each workshop or lecture to conduct polls, discuss results with the students and revise content

if needed. After workshops and lectures had concluded, instructors needed to reflect on the

outcomes of the evaluation and use that reflection to feedback or feedforward to future

learning events. Although we argue that implementing this cycle improves learning, the

process clearly generates greater work for the instructors than static transmission-based

teaching environments or post-experience student survey collection. In the case of the

Marketing unit, this additional workload was spread across the semester by only undertaking

evaluations every few sessions, rather than each session. Even though the increased effort and time spent by the instructor are mitigated as their technical expertise improves and they

compile a list of evaluation questions, workload implications remain.

A second challenge of implementing learner-focused evaluation cycles is that it

requires students to actively participate in evaluation and therefore takes time away from

what previously might have been additional content delivery. The challenge for the instructor


is to persuade students that the evaluation activity is for their benefit, and to use the data that

is generated to better support their learning. In our experience, this is initially met with

scepticism by students, most of whom have previously only experienced ‘post-activity’

surveys, the outcome of which is rarely known (Alderman, Towers, and Bannah 2012).

However, by ensuring that the feedback loop is closed (Venning and Buisman-Pijlman 2012),

students quickly see how their evaluation is used to reframe the content of the learning

environment and tailor the learning activities to their interests and needs. This shift in the

perception of evaluations, from something that is required for institutional reporting purposes

to something that is designed to improve the learning experience of the students doing the

evaluation, is a significant one. As with any reframing activity at university, some students will need

longer than others to be convinced.

A third challenge is that learner-focused evaluation cycles require a specific set of skills and competencies from the teaching team if they are to be implemented effectively. Previous research has found that inviting feedback and critique

from students can make instructing staff anxious and insecure (Dall’Alba 2005; Gourlay

2011). For some academics, the process of evaluation (of both teaching and curriculum) has

been a negative one (Otto, Sanford, and Ross 2008) and is experienced as something done to

rather than by or for them. This particularly applies to centralised student survey evaluations

where academics may have no control over the questions or timing of the survey, or

ownership of the results. Learner-focused evaluation cycles facilitate more control of

evaluation for the instructor. At the same time, they require instructors to engage in dialogue

with their students about curriculum, learning outcomes and processes. The very act of asking

questions and publicly revealing responses potentially exposes the instructor to criticism and

also requires them to adopt a flexible approach to their teaching in order to fully engage with

learners’ feedback. In short, it requires both sensitivity and a thick skin so that feedback from


learners can be reflected upon as an opportunity to better meet their learning needs, rather

than a personal attack on the instructor’s practice or values. Not all academic instructors

would feel comfortable in this situation.

Conclusion

There is a growing trend to draw upon student-centred agile teaching and learning

methodologies in higher education in order to better meet the needs of diverse cohorts of

students. Despite this, evaluation of learning and teaching has failed to keep pace with the

need for flexible, immediate and diagnostic feedback that can be used to benefit current

learners as well as future intakes. By drawing on principles of effective feedback for learning

and adapting a model of organisational control, learner-focused evaluation cycles provide a

framework to take a more holistic approach to evaluation. By developing a dialogue about

learning through feedforward, concurrent and feedback evaluation, instructors are able to

make real-time adjustments to their teaching and respond flexibly and quickly to challenging

student needs. In order for educators to implement learner-focused evaluation cycles, they

need to develop confidence in gathering and responding to feedback, flexibility in their

approach to curriculum design, openness in their discussions with learners, and belief in

education as a co-operative enterprise.


References

Abrami, P., S. d'Apollonia, and S. Rosenfield. 2007. "The Dimensionality of Student Ratings

of Instruction: What We Know and What We Do Not." In The Scholarship of

Teaching and Learning in Higher Education: An Evidence-based Perspective, edited

by Raymond Perry and John Smart, 385–456. Dordrecht: Springer.

ACER, Australian Council for Educational Research. 2009. Australasian Survey of Student

Engagement: Australasian Student Engagement Report. Victoria: Australian Council

for Educational Research.

Alderman, L., S. Towers, and S. Bannah. 2012. "Student Feedback Systems in Higher

Education: A Focused Literature Review and Environmental Scan." Quality in Higher

Education 18 (3): 261–280. doi: 10.1080/13538322.2012.730714.

Bartol, K., M. Tein, G. Matthews, B. Sharma, P. Ritson, and B. Scott-Ladd. 2008.

Management Foundations: A Pacific Rim Focus. 2nd ed. NSW: McGraw-Hill.

Bates, S., and R. Galloway. 2012. "The Inverted Classroom in a Large Enrolment

Introductory Physics Course: A Case Study." In HE Academy, STEM Papers. York:

The Higher Education Academy: STEM.

Beck, K., M. Beedle, A. van Bennekum, A. Cockburn, W. Cunningham, M. Fowler, J.

Grenning, J. Highsmith, A. Hunt, and R. Jeffries. 2001. "The Agile Manifesto."

Accessed February 7 2013. http://www.agilemanifesto.org/principles.html

Berk, R. 2005. "Survey of 12 Strategies to Measure Teacher Effectiveness." International

Journal of Teaching and Learning in Higher Education 17 (1): 48–62.

Berk, R. 2011. "Evidence-based Versus Junk-based Evaluation Research: Some Lessons from

35 Years of the Evaluation Review." Evaluation Review 35 (3): 191–203.


Botzer, L., and A. Karniel. 2013. "Feedback and Feedforward Adaptation to Visuomotor

Delay During Reaching and Slicing Movements." European Journal of Neuroscience

38 (1): 2108–2123. doi: 10.1111/ejn.12211.

Brown, S. 2010. "Assessing for Learning and Smarter Feedback." In Teaching and

Educational Development Institute (TEDI) Seminar Series. University of Queensland,

Brisbane.

Cathcart, A., and L. Neale. 2012. "Using Technology to Facilitate Grading Consistency in

Large Classes." Marketing Education Review 22 (1): 9–12.

Chun, A. 2004. "The Agile Teaching/Learning Methodology and Its E-learning Platform."

Paper presented at Advances in Web-based Learning, at the third international

conference of ICWL, August 8-11, Beijing, China.

Cooper, C. I. 2011. "A Concurrent Support Course for Intermediate Algebra." Research &

Teaching in Developmental Education 28 (1): 16–29.

Dall’Alba, G. 2005. "Improving Teaching: Enhancing Ways of Being University Teachers."

Higher Education Research & Development 24 (4): 361–372. doi:

10.1080/07294360500284771.

Felton, J., J. Mitchell, and M. Stinson. 2004. "Web-based Student Evaluations of Professors:

The Relations between Perceived Quality, Easiness and Sexiness." Assessment &

Evaluation in Higher Education 29 (1): 91–108.

Freeman, R., and K. Dobbins. 2013. "Are We Serious about Enhancing Courses? Using the

Principles of Assessment for Learning to Enhance Course Evaluation." Assessment &

Evaluation in Higher Education 38 (2): 142–151. doi: 10.1080/02602938.2011.611589.

Goodwin, B., and K. Miller. 2013. "Evidence on Flipped Classrooms Is Still Coming In."

Educational Leadership 70 (6): 78–80.


Gourlay, L. 2011. "New Lecturers and the Myth of 'Communities of Practice'." Studies in

Continuing Education 33 (1): 67–77. doi: 10.1080/0158037X.2010.515570.

Harvey, L. 2003. "Student Feedback." Quality in Higher Education 9 (1): 3–20. doi:

10.1080/13538320308164.

HEFCE. 2012. "Highest Ever Satisfaction Rates in 2012 Student Survey." Higher Education

Funding Council for England and Wales. Accessed February 11 2013.

http://www.hefce.ac.uk/news/newsarchive/2012/name,75522,en.html.

Huet, M., D. M. Jacobs, C. Camachon, C. Goulon, and G. Montagne. 2009. "Self-Controlled

Concurrent Feedback Facilitates the Learning of the Final Approach Phase in a Fixed-

Base Flight Simulator." Human Factors: The Journal of the Human Factors and

Ergonomics Society 51 (6): 858–871. doi: 10.1177/0018720809357343.

Josefson, K., J. Pobiega, and C. Stråhlman. 2011. "Student Participation in Developing

Student Feedback." Quality in Higher Education 17 (2): 257–262. doi:

10.1080/13538322.2011.582799.

Juwah, C., D. Macfarlane-Dick, B. Matthew, D. Nicol, D. Ross, and B. Smith. 2004.

"Enhancing Student Learning Through Effective Feedback." The Higher Education

Academy Generic Centre. www.ltsn.ac.uk/genericcentre.

Keough, S. 2012. "Clickers in the Classroom: A Review and a Replication." Journal of

Management Education 36 (6): 822–847.

Kift, S. 2006. "Using an Audience Response System to Enhance Student Engagement in

Large Group Orientation: A Law Faculty Case Study." In Audience Response

Systems in Higher Education: Applications and Cases, edited by D. A. Banks, 140.

London: Information Science Publishing.

Marsh, H. 2007. "Students' Evaluations of University Teaching: Dimensionality, Reliability,

Validity, Potential Biases and Usefulness." In The Scholarship of Teaching and


Learning in Higher Education: An Evidence-based Perspective, edited by Raymond

Perry and John Smart, 319–383. Dordrecht: Springer.

Martin, J., and M. Fellenz. 2010. Organizational Behaviour and Management. 4th ed.

Andover: Cengage Learning.

Masson, P., and K. Udas. 2009. "An Agile Approach to Managing Open Educational

Resources." On the Horizon 17 (3): 256–266.

Mazur, E. 1997. Peer Instruction: A User's Manual. New Jersey: Prentice Hall.

McAvoy, J., and D. Sammon. 2005. "Agile Methodology Adoption Decisions: An Innovative

Approach to Teaching and Learning." Journal of Information Systems Education 16

(4): 409–420.

McCulloch, A. 2009. "The Student as Co-producer: Learning from Public Administration

about the Student–University Relationship." Studies in Higher Education 34 (2): 171–

183. doi: 10.1080/03075070802562857.

Nicol, D., and D. Macfarlane-Dick. 2004. "Rethinking Formative Assessment in HE: A

Theoretical Model and Seven Principles of Good Feedback Practice."

http://www.heacademy.ac.uk/resources/detail/ourwork/tla/web0015_rethinking_formative_assessment_in_he.

Nicol, D., A. Thomson, and C. Breslin. 2013. "Rethinking Feedback Practices in Higher

Education: A Peer Review Perspective." Assessment & Evaluation in Higher

Education 38 (6): 1–20. doi: 10.1080/02602938.2013.795518.

Orsmond, P., S. J. Maw, J. R. Park, S. Gomez, and A. C. Crook. 2011. "Moving Feedback

Forward: Theory to Practice." Assessment & Evaluation in Higher Education 38 (2):

240–252. doi: 10.1080/02602938.2011.625472.


Otto, J., D. A. Sanford, and D. N. Ross. 2008. "Does Ratemyprofessor.com Really Rate My

Professor?" Assessment & Evaluation in Higher Education 33 (4): 355–368. doi:

10.1080/02602930701293405.

Price, M., J. Carroll, B. O’Donovan, and C. Rust. 2010. "If I Was Going There I Wouldn’t

Start from Here: A Critical Commentary on Current Assessment Practice."

Assessment & Evaluation in Higher Education 36 (4): 479–492. doi:

10.1080/02602930903512883.

Race, P. 2010. Making Learning Happen. 2nd ed. London: Sage.

Smith, C. 2008a. "Building Effectiveness in Teaching through Targeted Evaluation and

Response: Connecting Evaluation to Teaching Improvement in Higher Education."

Assessment & Evaluation in Higher Education 33 (5): 517–533.

Smith, C. 2008b. "Design-focused Evaluation." Assessment and Evaluation in Higher

Education 33 (6): 631–645.

Spiel, C., B. Schober, and R. Reimann. 2006. "Evaluation of Curricula in Higher Education:

Challenges for Evaluators." Evaluation Review 30 (4): 430–450.

Steed, A. 2012. "The Flipped Classroom." Teaching Business & Economics 16 (3): 9–11.

Stein, S. J., D. Spiller, S. Terry, T. Harris, L. Deaker, and J. Kennedy. 2013. "Tertiary

Teachers and Student Evaluations: Never the Twain Shall Meet?" Assessment &

Evaluation in Higher Education 38 (7): 892–904. doi:

10.1080/02602938.2013.767876.

Stobart, G. 2008. Testing Times: The Uses and Abuses of Assessment. Oxon: Routledge.

Timpson, W., and A. Desley. 1997. "Rethinking Student Evaluations and the Improvement of

Teaching: Instruments for Change at the University of Queensland." Studies in Higher

Education 22 (1): 55–66.


Trigwell, K., K. Caballero Rodriguez, and F. Han. 2012. "Assessing the Impact of a

University Teaching Development Program." Assessment & Evaluation in Higher

Education 37 (4): 499–511.

Venning, J., and F. Buisman-Pijlman. 2012. "Integrating Assessment Matrices in Feedback

Loops to Promote Research Skill Development in Postgraduate Research Projects."

Assessment & Evaluation in Higher Education 38 (5): 567–579. doi:

10.1080/02602938.2012.661842.

Ware, J., and R. Williams. 1975. "The Dr Fox Effect: A Study of Lecturer Effectiveness and

Ratings of Instruction." Journal of Medical Education 50 (2): 149–156.

Weaver, M. R. 2006. "Do Students Value Feedback? Student Perceptions of Tutors’ Written

Responses." Assessment & Evaluation in Higher Education 31 (3): 379–394. doi:

10.1080/02602930500353061.

Williamson Sprague, E., and D. W. Dahl. 2010. "Learning to Click: An Evaluation of the

Personal Response System Clicker Technology in Introductory Marketing Courses."

Journal of Marketing Education 32 (1): 93–103.

Worthington, A. 2002. "The Impact of Student Perceptions and Characteristics on Teaching

Evaluations: A Case Study in Finance Education." Assessment & Evaluation in

Higher Education 27 (1): 49–64.


Figure 1. Learner-focused evaluation cycles.