
Assessment Matters The Assessment Newsletter

November 2015 Vol. 5, No. 4 Published by the University Assessment Council and Office of Institutional Research & Assessment

Effective Use of Rubrics for International Students By: Jacqueline Olson, EdD, Assessment Coordinator, College of Management and Technology Thomas Kohntopp, Ph.D., Core & Lead Faculty, College of Management and Technology

In the College of Management and Technology, we are focusing our efforts to address the needs of our international students in multiple ways: ensuring that rubrics are assignment-specific and consistent from course to course within a program, that they provide the structure for how an assignment will be assessed, and that they allow for instructor feedback that enables students to continuously improve their performance. In courses that have large numbers of international students (e.g., dual-degree programs with our partner institution, Universidad del Valle de Mexico), we have simplified the language of the rubrics and broken down the rubric elements that align to the assignment into manageable chunks while still maintaining the integrity of the assessment tool and course content.

Students come to class with various experiences and expectations as to how instructors evaluate assignments. Some may even have encountered the professor who embraces the "I know an 'A' paper when I see one" philosophy. Questions and comments from international students suggest that their previous grading experiences may have lacked any degree of objectivity and consistency, with all-knowing instructors who pass judgment but offer little specific feedback to help students improve. In addition to grading standards that may be ill defined, international students often face language challenges. Well-composed rubrics that directly connect to assignment criteria provide international students, and all students, with defined grading expectations written with understandable clarity. Using rubrics with a set format and structure also helps in that it tends to reduce the time and effort a student would otherwise invest searching for relevant information week to week in rubrics with a different appearance or format. This is especially true for international students.

From the instructor's perspective, expertly crafted rubrics make grading and student feedback less arduous. For classes that include international students, who may lack extensive English comprehension, clearly written criteria that serve as the foundation of feedback ease a grading burden that some instructors may face. As a best practice, grading rubrics aid instructor-student communication, which is essential for all students. Rubrics offer all students an opportunity to better understand what is assessed on assignments, the weighting of components, and the organization or structure of an assignment. This is particularly useful for international students who may be unfamiliar with American expectations for how to organize an academic paper or who are struggling to understand what is needed on an assignment.

In this Issue:
Effective Use of Rubrics for International Students
School of Counseling CACREP Accreditation Update
Bloom's Digital Taxonomy: Moving Assessment Into the 21st Century
New graduate course from the Academic Skills Center
Riley College Transition from NCATE to CAEP: Update
Examining the Impact of Educational Roundtables on Faculty Engagement
Value and Necessity of Program Progression Matrices


Assessment Matters Newsletter | November 2015 page 2

School of Counseling CACREP Accreditation Update By: Kristin Cannon, PhD, LPC, NCC, Director of Assessment, School of Counseling

The School of Counseling (SoC) recently submitted a self-study to the Council for Accreditation of Counseling and Related Educational Programs (CACREP) for re-accreditation/accreditation of all five of our counseling programs. This effort reflects a multi-purpose initiative to reaccredit our oldest counseling program (MS in Clinical Mental Health Counseling), seek accreditation for two of our newer counseling programs (MS in Addiction Counseling and MS in School Counseling), and allow all of our programs to be placed on the same accreditation cycle, including those accredited by CACREP in 2014 (MS in Marriage, Couple, and Family Counseling and PhD in Counselor Education and Supervision).

As with any accreditation process, this effort has required a substantial response from leadership and faculty within the SoC as well as from additional university partners and stakeholders. A particular challenge was that the CACREP standards were recently revised, with the final published set of standards made available only in March 2015. Despite this, members of the SoC have been working toward this effort for the past year under the guidance and direction of our CACREP Project Manager, Dr. Stacee Reicherzer, and our Senior Director of Accreditation and Academic Operations, Dr. Kelly Coker.

The changes to the CACREP standards brought a significantly increased focus on assessment practices, including how programs assess students' skill development as well as student dispositions. Additionally, greater emphasis was placed on how programs evaluate overall program functioning. To meet these requirements, the SoC assessment team developed a comprehensive assessment plan, vetted through leadership and the SoC CAP, to be included in the self-study. This plan now includes a formal annual review cycle of key program data as well as an ongoing and enhanced review of individual student performance across students' programs of study.

Other key projects accomplished through the self-study process include curriculum mapping of all courses in each program of study, the development of new program and student learning outcomes, the creation of new comprehensive syllabi for all courses, the development of new assignment rubrics, changes to curriculum, and course overhauls, where necessary. Significant efforts were also made to enhance the field experience curriculum and to further refine the assessment practices conducted through our data management platform, Meditrek.

The self-study was submitted to the CACREP board in early November with the goal of review and approval for a site visit in the summer of 2016. Many thanks are due to our amazing SoC faculty, coordinators, directors, associate dean, vice president, product managers, business operations, and partners in OIRA, PSID, and Business Intelligence for their outstanding efforts and work on this undertaking. We look forward to sharing an update on progress in upcoming issues of Assessment Matters.


Bloom's Digital Taxonomy: Moving Assessment into the 21st Century By: Laura Schindler, PhD, Director, Quality Assurance
Sarah Puls-Elvidge, MPA, Director, Quality Assurance

Computers themselves, and software yet to be developed, will revolutionize the way we learn. -Steve Jobs

Recent surveys reveal that approximately two thirds of college students use mobile devices (e.g., smartphones, tablets) for learning and believe that technology helps them achieve academic outcomes and prepares them for the workforce (Chen, Seilhamer, Bennett, & Bauer, 2015; Dahlstrom, 2012). Given consistent trends related to the use and perceived importance of technology for learning, there is an excellent opportunity to reexamine traditional educational delivery and assessment methods to better serve the needs of students. In this article, we examine how the latest iteration of Bloom's Taxonomy can be used to breathe new life into traditional educational assessments. We begin by providing a brief evolution of Bloom's Taxonomy and how it is used in current Academic Quality and Accreditation Quality Assurance (QA) reviews. Then we use Bloom's Digital Taxonomy to revise traditional writing-intensive assignments to purposefully incorporate technology.

In 1956, Dr. Benjamin Bloom published the Taxonomy of Educational Objectives (commonly known as Bloom's Taxonomy) to classify learning and encourage higher-level thinking. There are three domains of learning within the taxonomy: cognitive (knowledge), affective (emotions/values), and psychomotor (skills). For the purpose of this article, we will focus on the cognitive domain, which includes six categories of cognition ordered from simple to complex (knowledge, comprehension, application, analysis, synthesis, and evaluation; Bloom, Englehart, Furst, Hill, & Krathwohl, 1956).

In 2001, the original version of Bloom's Taxonomy was revised and the cognitive categories were changed from nouns to verbs (e.g., application to applying) to make them actionable. In addition, a remembering category was added, and the original synthesis category was renamed creating and placed at the highest level of the taxonomy (Anderson & Krathwohl, 2001). The latest version of Bloom's Taxonomy incorporates 21st-century learning by taking technology and digital learning into consideration. Bloom's Digital Taxonomy uses the sequence of verbs from Bloom's Revised Taxonomy (remembering, understanding, applying, analyzing, evaluating, and creating) and includes digital techniques that can be used to assess learning. For example, bullet pointing, highlighting, and bookmarking are associated with the remembering category, while programming, filming, and podcasting are associated with the creating category (Churches, 2008).

Academic Quality and Accreditation's QA team conducts comprehensive QA reviews of Walden academic programs during the Academic Program Review (APR) process. During the reviews, the QA team uses Bloom's Taxonomy to recommend improvements in clarity, measurability, and alignment among weekly objectives, course goals, and program outcomes. Recently, the team has begun to examine how Bloom's Digital Taxonomy can be used to recommend improvements related to the integration of technology into assignments.

Table 1 shows how traditional, writing-intensive assignments can be transformed using Bloom's Digital Taxonomy. The alternative assignments require students to use both written and technical skills to demonstrate the learning specified in the objective. For example, students can use an educational app, such as Simple Mind+, to create a conceptual map that shows the relationships between and among the research aims, theoretical framework, and independent and dependent variables for their dissertation research. The benefits of this alternative assignment extend well beyond achievement of the desired learning. For example, one secondary benefit is the development of technological skills that may be useful in other learning environments and in the workplace. Another benefit is that the alternative assignment allows for more convenient and portable learning. Specifically, an educational app can be accessed across various devices (e.g., laptop, smartphone, and tablet) in a variety of locations, such as on the train when commuting to and from work.

Table 1.
Learning Objectives, Traditional Assignments, and Proposed Alternative Assignments Using Bloom's Digital Taxonomy

Learning Objective: Apply Maslow's Hierarchy of Needs to the explanation of motivation
Traditional Assignment: Describe a scenario from your life that illustrates two levels of Maslow's Hierarchy of Needs and explain how it does so. Then, explain how Maslow's theory may not capture all of the motivational factors involved in your behavior.
Alternative Assignment Using Bloom's Digital Taxonomy: Using free digital storytelling software, such as Adobe Voice or ShowMe Interactive Whiteboard, create a presentation that depicts how two or more levels of Maslow's theory apply to a scenario in your life. Also, be sure to address why other levels of Maslow's theory may not capture all of the motivational factors involved in your behavior. To submit your video, click on the Assignment-Week 1 link and then Write Submission. Next, click on the HTML button and paste the embed code for your video, then click Submit.

Learning Objective: Analyze relationships among the research aims, theoretical framework, and independent and dependent variables for dissertation research
Traditional Assignment: Describe the research aims, theoretical framework, and independent and dependent variables for your dissertation research study. Then explain how the research aims and theoretical framework support the proposed relationships between your independent and dependent variables and the expected outcomes of the study.
Alternative Assignment Using Bloom's Digital Taxonomy: Using a mind mapping tool, such as the Mind Meister or Simple Mind+ apps, create a conceptual framework of your dissertation research study. The conceptual framework should visually depict the relationships among your research aims, theoretical framework, and independent and dependent variables. Be sure to show how your research aims and theoretical framework support the proposed relationships between your independent and dependent variables and the expected outcomes of the study. Export your conceptual framework as a PDF file and submit it to the Week 4 drop box.

Learning Objective: Compose arguments about political controversies
Traditional Assignment: Describe the political controversy you selected. Then, provide an argument about the political controversy. Be sure to include at least three points advancing your stance on the controversy.
Alternative Assignment Using Bloom's Digital Taxonomy: Create a blog posting about the political controversy you selected. In your blog, use text, embedded images and videos, and web links to convey your argument about the political controversy. Be sure to include at least three points advancing your stance on the controversy.

The increasing desire among students to integrate technology into learning experiences provides significant support for Bloom's Digital Taxonomy. The goal in using the digital taxonomy is not to eliminate traditional, writing-intensive assignments entirely, but rather to purposefully introduce technology into assignments where it is reasonable and appropriate. There is still significant value in requiring students to write traditional papers, particularly in graduate programs where students need to prepare for completing a thesis or dissertation. However, there is also value in integrating technological tools into assignments where, for example, students are required to explain how a theory applies to a scenario (storytelling software), to show relationships among concepts (mind mapping apps), or to construct a compelling argument (blogs). Fortunately, there is no shortage of free educational software, and the demand for educational apps, in particular, has increased worldwide, with nearly a quarter of students in South Africa, India, and the United States indicating they have purchased educational apps (Mobile Ecosystem Forum, 2014). Therefore, in the realm of digitizing assessments, the saying "there's an app for that" certainly rings true.

References:

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.

Bloom, B. S. (Ed.), Englehart, M., Furst, E., Hill, W., & Krathwohl, D. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York: Longman.

Chen, B., Seilhamer, R., Bennett, L., & Bauer, S. (2015, June). Students' mobile learning practices in higher education: A multi-year study. EDUCAUSE Review. Retrieved from http://er.educause.edu/articles/2015/6/students-mobile-learning-practices-in-higher-education-a-multiyear-study

Churches, A. (2008). Bloom's digital taxonomy. Retrieved from http://burtonslifelearning.pbworks.com/f/BloomDigitalTaxonomy2001.pdf

Dahlstrom, E. (2012). ECAR study of undergraduate students and information technology, 2012 (Research Report). Retrieved from EDUCAUSE website: http://net.educause.edu/ir/library/pdf/ERS1208/ERS1208.pdf

Mobile Ecosystem Forum. (2014, April 25). Report shows growth markets driving uptake of education apps. Retrieved from http://www.mobileecosystemforum.com/2014/04/25/report-shows-growth-markets-driving-uptake-of-education-apps-infographic/

New graduate course from the Academic Skills Center By: Emily Dahlen, PhD, Associate Director for Student Learning, Academic Skills Center

On October 12, the Academic Skills Center launched a revised version of our four-week APA course, Basic APA Style: Citations and References, now WCSS 6200 (quarters; 1.5 credits) and WCSS 6201 (semesters; 1 credit). This course is designed to instill the foundational APA concepts of proper citations and references. All graduate students may register for the class via their academic advisor. The schedule is located on the Academic Skills Center's website. The course costs $195, not including any applicable taxes or fees.

Additionally, on January 4, 2016 (semesters) and February 29, 2016 (quarters), the Academic Skills Center will launch new versions of our graduate writing courses: Graduate Writing I: Basic Composition Skills (WCSS 6050 - quarters; WCSS 6051 - semesters) and Graduate Writing II: Intermediate Composition Skills (WCSS 6060 - quarters; WCSS 6061 - semesters). Graduate Writing I focuses on critical reading and effective summary; Graduate Writing II emphasizes paraphrasing, synthesizing, and evaluating main ideas. Each of our graduate writing courses is eight weeks long. The semester versions of the classes are 1 credit, and the quarter versions are 1.5 credits. Any graduate student may register for the classes via their academic advisor. The schedule is located on the Academic Skills Center's website. Each course costs $195, not including any applicable taxes or fees.


Riley College Transition from NCATE to CAEP: Update By: Kate Steffens, PhD, Dean, Richard W. Riley College of Education and Leadership

Martha Larkin, PhD, Assessment Director, Richard W. Riley College of Education and Leadership, Teaching, Learning, and Professional Licensure Division

Suzanne Wesson, EdD, Assessment Director, Richard W. Riley College of Education and Leadership, Teaching, Learning and Professional Licensure Division

Robert Marshall, EdD, Assessment Director, Richard W. Riley College of Education and Leadership, Higher Education and Adult Learning, Administration and Leadership Division

Diane Penland, PhD, Assessment Director, Richard W. Riley College of Education and Leadership, Teaching, Learning and Professional Licensure Division

Debbie Bechtold, PhD, Assessment Director, Richard W. Riley College of Education and Leadership

As the Riley College prepares for a re-accreditation visit under the new Council for the Accreditation of Educator Preparation (CAEP, formerly the National Council for Accreditation of Teacher Education, NCATE) standards in Fall 2018, we have undertaken several initiatives to reinforce our infrastructure for collecting, analyzing, and reporting data, with the ultimate goal of producing competent, caring, effective educators who have a positive impact in their learning communities. With the new standards still somewhat in flux as CAEP weighs the accessibility of the evidence required by the standards as well as the capacity of Educator Preparation Providers (EPPs) and CAEP's own capacity to collect and process this evidence, developing strategies to address each standard has been a little like trying to build a house on a moving foundation. Nevertheless, Riley College program leadership and faculty are meeting the challenges with enthusiasm, a lot of hard work, and a focus on program improvement.

One of the first tasks undertaken after the release of the new standards was the creation of study groups to review the standards in detail, consider data currently collected that could serve as evidence for meeting each standard, identify gaps in evidence, and determine action steps to address these gaps. Action steps resulting from the work of the study groups prompted a re-examination of our expectations with respect to student professional dispositions, diversity proficiencies, and technology proficiencies. Task force groups consisting of the Dean, faculty representing programs across the College, and College Assessment Directors reviewed and updated the dispositions and proficiency expectations to better reflect the professional skills required by the standards and better prepare students for their roles as educators.
Another important development resulting from recommendations of the study groups is the creation of the College Recruitment and Selection Committee. The role of the committee is to review, revise (if needed), and monitor marketing, recruitment, selection, and retention efforts to ensure the quality and diversity of our students while still meeting the high standards set by CAEP.

CAEP recognizes that quality of evidence and the efficacy of data are essential to continuous improvement. In an effort to build the capacity of EPPs to collect data that provides relevant, actionable information contributing to program improvement, CAEP has offered to review assessments and rubrics submitted by EPPs prior to accreditation visits to help EPPs strengthen their measures of student knowledge and skills. In late Fall 2015, the Riley College will submit assessments from the Master of Arts in Teaching (MAT) in Special Education program for review by CAEP, along with supporting information describing the context of the assessments (e.g., what they measure, when they are administered) and the processes used to establish their validity and reliability. CAEP also has developed many resources to assist EPPs in creating valid and reliable assessments. We have used these resources in several webinars conducted for program directors and faculty addressing best practices in rubric development and in establishing the validity and reliability of measurements. In addition, we have implemented many of the best practices CAEP recommends in an internal review of assessments and rubrics currently being developed for new courses and programs.

The interest and involvement of the Riley College faculty in these initiatives have been overwhelmingly positive. The study groups, task force groups, committees, and webinars have been well attended and have provided opportunities for faculty to share experiences, ideas, and recommendations for improving our programs and enhancing the student experience. Their dedication, hard work, and willingness to address the challenges of change are the driving force behind all of these initiatives.

Examining the Impact of Educational Roundtables on Faculty Engagement By: Gilbert Singletary, PhD, JD, MBA, MSW, Program Coordinator, School of Social Work & Human Services

Debora Rice, PhD, MSW, Core Faculty, School of Social Work & Human Services

Sara Plummer, PhD, MSW, Core Faculty & Assessment Coordinator, School of Social Work & Human Services

Creating an engaging classroom for our students at Walden is a priority and is addressed through multiple tools and skills. Faculty members at Walden instill a sense of community through a consistent presence in the classroom, and engagement and reciprocity are achieved through various avenues. A strong learning community "enables participants to quickly establish working relationships, share ideas openly and honestly, and benefit from the insights of the collective" (Hill, 2002, p. 69). Perry and Edwards (2005) suggested that an instructor can create and maintain a dynamic and supportive learning environment by creating a Community of Inquiry (CoI) through the use of three interlocking elements: a social, a cognitive, and a teaching presence. A CoI is focused on "the creation of communities of learners actively and collaboratively engaged in exploring, creating meaning and confirming understanding" (Garrison, 2009, p. 352). This is accomplished by establishing a positive, affirming, and respectful environment while still challenging students to meet high educational standards (Perry & Edwards, 2005). A "physical" and social presence is achieved through the use of pictures, videos, and the creation of a class café or "student lounge." Weekly announcements, "check-ins," and encouraging emails are sent regularly to build a sense of collaboration among faculty and students.

The use of weekly discussion threads encourages consistent interaction between students and faculty. Through discussion threads, a cognitive presence is created to help students learn how to critically evaluate complex material, synthesize complex text, and form hypotheses based on the information presented. In addition, discussion threads help students evaluate and practice ethical behavior. Faculty members provide a teaching presence by offering feedback that not only evaluates the writing and content of the post or assignment but also uses the opportunity to teach, affirm, challenge, and influence student learning outcomes.

Many of our faculty members are educated on the best tools to engage students (in large part due to their attendance at the faculty orientation) and have successfully demonstrated their commitment to their students' engagement in the classroom. However, an informal survey of contributing faculty members in the social work program revealed that many contributing faculty members are less effective in these areas. This past year, Walden sent out a survey to faculty and staff in order to obtain feedback on the areas of engagement and promoting a healthy organization. In an effort to address the aforementioned goals established by the university, core faculty members in the social work program worked together by engaging both core and contributing faculty members in educational "roundtables." The intent of the roundtables was threefold:

To improve the engagement of the contributing faculty in the social work program by bringing them together to encourage a dialogue on how they each engage with their students

To improve the Key Performance Indicators (KPIs) of faculty by reacquainting the faculty with the University's expectations around their presence in the classroom, grading, etc.

To improve the student experience by positively impacting the faculty's use of engagement skills

Three of the core faculty co-facilitated eight 60-minute roundtables over two weeks, using a PowerPoint presentation that shared information on the KPIs and the skills faculty members can incorporate to best engage their students in the classroom. The roundtables were viewed as both an educational and a social opportunity: to remind faculty of their responsibilities in the classroom, to share additional best practices for creating an engaging classroom, and to build stronger relationships and connections with contributing faculty. In all, 53 faculty members (both core and contributing) attended these roundtables.

Using the data obtained from the KPIs and other indicators, the goal moving forward

will be to assess the impact of the “intervention” by using a General Linear Model (e.g.,

comparing pre and post test scores) to assess the effectiveness of the roundtables.

Faculty engagement is being measured in terms of overall participation in the online classroom, using KPIs and other classroom/Blackboard metrics: whether grades are posted within the 7-day requirement, whether the instructor responds to the Contact the Instructor box within 48 hours, whether the instructor responds to two-thirds of the original posts, and whether the instructor provides qualitative feedback along with quantitative grades.
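As a sketch of the planned pre/post comparison, a paired t-test (the simplest General Linear Model for a pre/post design) could be run on per-faculty KPI scores before and after the roundtables. The scores below are hypothetical, invented purely for illustration; no data have been collected yet.

```python
import math
import statistics

# Hypothetical per-faculty KPI scores before and after the roundtables
# (invented values for illustration only).
pre = [72, 65, 80, 70, 68, 75, 62, 77]
post = [78, 70, 83, 74, 73, 79, 66, 80]

# Paired t-test: tests whether the mean per-faculty change differs from zero.
diffs = [b - a for a, b in zip(pre, post)]
mean_change = statistics.mean(diffs)
t_stat = mean_change / (statistics.stdev(diffs) / math.sqrt(len(diffs)))

print(f"mean change = {mean_change:.2f}, t = {t_stat:.2f}")
# prints: mean change = 4.25, t = 11.61
```

With real data, a statistics package would also supply the p-value (here on n - 1 = 7 degrees of freedom); the sketch shows only the shape of the comparison.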

Although we have not yet begun gathering data, we believe the roundtables were quite successful: the discussion was active, and the faculty routinely thanked us for the information. Several


participants shared that the information would be very helpful to their ability to fulfill the KPI requirements and better engage their students. The roundtables themselves also appeared to build stronger relationships and connections with contributing faculty. For example, one email we received after a roundtable read: "Good morning, I just wanted to thank you for a very helpful and very productive Round Table meeting. I feel more connected and supported and better prepared."


Value and Necessity of Program Progression Matrices
By: Jacqueline Olson, EdD, Coordinator of Assessment, College of Management and Technology

Ron Senterfitt, MBA, Coordinator of Institutional Effectiveness and Accreditation, College of Management and Technology

The College of Management and Technology (CMT) is using the alignment of learning outcomes in each syllabus to create a Program Progression Matrix (PPM) for each program, providing a comprehensive view of where the program learning outcomes are addressed in the assignments for the courses that make up the program. For Program Directors, the value of a PPM is that they can more easily identify which assignments to use for their Learning Outcomes Report (LORBook) assessments, which assignments they used in previous LORBook cycles, and how well their assignments address their program learning outcomes.

The work of constructing a PPM is a collaborative effort among the Coordinator for Assessment, the Coordinator for Institutional Effectiveness and Accreditation, and the Program Director. Not only does the PPM provide a comprehensive view for the Program Director, it also allows accreditation reviewers to quickly see how the learning outcomes are addressed in the courses and how the program aligns with accreditation expectations.

How Accreditation Reviewers Use PPMs

CMT has earned specialized accreditation from the Accreditation Council for Business Schools and Programs (ACBSP), the Accreditation Board for Engineering and Technology (ABET), and the Project Management Institute Global Accreditation Center. These organizations share many common focuses, such as student learning and continuous improvement. Our specialized accreditors want to see that we have created program learning outcomes; they want to see the path a student will take to achieve those outcomes; and they want to see that we are assessing ourselves in order to continuously improve. The PPM is a simple overview that helps show a reviewer the path taken to achieve the program learning outcomes, from individual assignments up to each program learning outcome. It documents what is taught and when; it can help improve program coherence; and it can help reveal gaps in the curriculum.
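The gap-revealing role of a PPM can be sketched as a simple data structure. The course numbers, outcome labels, and the `curriculum_gaps` helper below are all hypothetical, invented for illustration; an actual PPM is maintained as a document, not code.

```python
# Each cell records how a course's assignments treat a program learning
# outcome (PLO): "I" = Introduced, "P" = Practiced, "A" = Assessed.
# Course and outcome names below are invented for illustration.
ppm = {
    "MGMT 3001": {"PLO1": "I", "PLO2": "I"},
    "MGMT 3105": {"PLO1": "P", "PLO3": "I"},
    "MGMT 4101": {"PLO1": "A", "PLO3": "P"},
}

def curriculum_gaps(ppm, outcomes):
    """Return program learning outcomes that are never Assessed anywhere."""
    assessed = {plo for course in ppm.values()
                for plo, level in course.items() if level == "A"}
    return sorted(set(outcomes) - assessed)

print(curriculum_gaps(ppm, ["PLO1", "PLO2", "PLO3"]))
# prints: ['PLO2', 'PLO3']
```

Here PLO2 is introduced but never practiced or assessed, and PLO3 never reaches the Assessed level; both are curriculum gaps a reviewer of the matrix would flag.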

It is not enough to say our students are achieving the outcomes; we have to assess, improve where needed, and document it all. The PPM shows the students' path, and the LORBooks prove through assessment and documentation that faculty members are guiding students along the right path. The LORBook cycle is an integral and necessary part of accreditation.

CMT’s Step-by-step Process for Constructing PPMs

After a new course is constructed by PSID, the Coordinator for Assessment uses the syllabus to complete an alignment between the assignments and the program's learning outcomes. This provides the basic framework for the PPM. Depending on the accreditation needs of each program, the Coordinator for Institutional Effectiveness and Accreditation works with the Program Director to identify any additional accreditation outcomes and to label each assignment as Introduced, Practiced, or Assessed. Typically, the Program Director works with Lead Faculty to complete this labeling, and it is this activity that assists in recognizing gaps and redundancies in the assignments that align to the program's learning outcomes. The Coordinator for Institutional Effectiveness and Accreditation also works with the Program Director to ensure that positive social change is aligned with the program's learning outcomes identified on the PPM, both because positive social change is part of Walden's mission and because accreditors are stressing social responsibility and ethics more than ever. The Program Director is responsible for reviewing the accuracy of the PPM annually during the program's LORBook cycle and for informing the Coordinator for Assessment of any changes to assignments and/or program learning outcomes that will necessitate changes to the basic framework of the PPM.

Conclusion

While CMT has informally utilized PPMs over the past couple of years, there has not been a formalized process that provides consistency in format and expectations of use and that fully addresses each program's accreditation expectations. Currently, the Coordinator for Assessment and the Coordinator for Institutional Effectiveness and Accreditation are in the early stages of working with each Program Director on updating their PPMs, and they will share example PPMs in future Assessment Matters newsletters.

Assessment Council Members

Name College/Center/Dept Email

Shari Jorissen OIRA [email protected]

Michelle Burcin CHS-SHS [email protected]

Leslie Hussey CHS-SoN [email protected]

Sandra Bever (AC) CHS [email protected]

Yvonne Doll CMT-SoM [email protected]

Ron Senterfitt (AC) CMT [email protected]

Jackie Olson (AC) CMT [email protected]

John Borton CMT-SoIT [email protected]

Kristi Cannon (AC) CSBS-SoC [email protected]

Esther Benoit CSBS-SoC [email protected]

Sara Plummer (AC) CSBS-SoSWHS [email protected]

OPEN CSBS-SoSWHS

George Larkin CSBS-SoPPA [email protected]

Lori LaCivita CSBS-SoP [email protected]

Sandra Harris (AC) CSBS [email protected]

Gary Carson CUGS [email protected]

Jon Paulson (AC) CUGS [email protected]

Robert Marshall (AC) RWRCoEL [email protected]

Suzanne Wesson (AC) RWRCoEL [email protected]

Martha Larkin (AC) RWRCoEL [email protected]

Debra Chester RWRCoEL [email protected]

Michael Burke RWRCoEL [email protected]

Darragh Callahan RWRCoEL [email protected]

Deborah Bechtold (AC) RWRCoEL [email protected]

Lyda Downs CFE [email protected]

Deborah Inman CRQ [email protected]

Emily Dahlen CSS [email protected]

Monica Hill AQA Quality Assurance [email protected]

Stephanie Hossbach Laureate IT [email protected]

Susan Subocz PSID [email protected]

Fun facts:

86.9% of employers of Walden alumni are Very Satisfied or Satisfied with their Walden alumni employee.

97.3% of employers of Walden alumni would hire another Walden graduate.

2015 Employer Survey Results