
Assessment in Action

The Napkin Board: Rutgers Dining Services’ Low-Tech Assessment Tool

Winter 2013 Vol. 1 Issue 2

A Publication of the Rutgers University Division of Student Affairs


Happy New Year and welcome to the second issue of Assessment in Action.

This publication is but one piece of an emergent divisional program of assessment. In addition to AiA, we are working on a modular assessment training program that can be accessed by anyone or any unit in the division. However, the most significant addition to our assessment efforts is happening this month when Marylee Demeter joins us as a full-time Coordinator of Student Affairs Assessment and Research. This marks the first time we will have a full-time expert to assist with assessment activities throughout the division.

Marylee has an M.E. in Educational Psychology and an M.Ed. in Measurement, Statistics, and Program Evaluation from the Graduate School of Education here at Rutgers. She has worked for the Center for Applied Psychology, Middlesex County College, The Graduate School of Education, and the Educational Testing Service. We look forward to her starting her work with us and you will be hearing from her directly as we move forward with coordinating and supporting divisional assessment efforts. Marylee’s email address is [email protected].

We welcome all feedback about this publication and also invite anyone interested in writing for it to contact us. Best wishes for a successful start to the spring semester.

Patrick Love
Associate Vice President for Student Affairs

Welcome to Assessment in Action

02page Assessment Action in

Editorial Staff
Scott Beesley*
Chris Brittain
Andrew Campbell
Charles Kuski*
Patrick Love
Patricia Rivas*
Cheyenne Seonia

Design
Scott Beesley

Photography
Scott Beesley
Abbas Moosavi

Special Contributors
Lisa Endersby, Student Experience Advisor, University of Ontario Institute of Technology
Gavin Henning, Higher Ed. Assessment Expert and Professor at New England College and Salem State University
Michele Yurecko, Assistant Dean of Academic Assessment, The College of St. Elizabeth

Questions? Please Contact:
Student Affairs Marketing, Media, & Communications
115 College Ave
New Brunswick, NJ 08901
Phone: (732) 932-7949
Email: [email protected]

* Students in the Rutgers College Student Affairs Master’s Program


03page

Winter 2013

In This Issue

MAAP Under the Microscope, by Cheyenne Seonia (page 04)
Net Promoter Scale, by Patrick Love (page 06)
Why I Hate Mission Statements, by Michele Yurecko (page 08)
Synergistic Assessment Methods, by Gavin Henning (page 13)
Interview with John Schuh, by Charles Kuski (page 15)
Telling Stories: A Data Sharing Pilot, by Lisa Endersby (page 18)

Cover Story (page 10)
The Napkin Board: Rutgers Dining Services’ Low-Tech Assessment Tool
Joe Charette (pictured above) believes that assessment doesn’t have to be messy. In this article, Joe discusses how suggestions pinned to a cork board help Rutgers Dining Services assess what they are doing.
By Patty Rivas


MAAP Under the Microscope
By Cheyenne Seonia

“MAAP is a tool; like a microscope, it allows you to zoom in and focus on the small stuff, and zoom out to the big picture.” - Ji Lee

When I first met Ji Lee, Director of the Asian American Cultural Center (AACC), it was amidst preparations for their annual open house. The center buzzed as students decorated the walls and completed last-minute tasks to help make the new academic year bright and prosperous. Despite the sense of urgency throughout the center, Ji was able to find a few moments to sit down with me and talk “assessment.” The AACC had just completed an assessment using the new Mission, Alignment, Assessment and Planning (MAAP) program, a visual assessment matrix developed at Rutgers University. It is designed to help departments throughout Rutgers clarify departmental contributions and enhance communication and collaboration, all while aligning department goals with university goals. Essentially, it enables individual units of Rutgers University to work together and become unified.

Assessment continues to grow as a significant part of the culture on campus, but the term is often met with a groan and a sigh of exhaustion by professionals. For Ji, however, assessment does not carry the same anxiety as a midterm or final. Assessment in her case signifies the ability to take a snapshot of her department’s functions and locate places where growth is possible. Ji explained that when the AACC was first developed, the department was exceptionally small. With just a few staff members, assessment did not seem pertinent to their mission. However, as the department grew she recognized the need to start thinking about assessment and worked to get it on everyone’s radar. “Assessment is absolutely critical,” said Ji as she


turned to MAAP to help her department flourish and function more efficiently. She explained that MAAP had allowed the AACC to adjust their “assessment lens” and realign their values, while strengthening their sense of community at Rutgers. Ji noted that MAAP encouraged self-reflection for the AACC. The ability to look closely inside the department, as well as at how the department functions on the larger university scale, gave Ji and fellow AACC staff members a greater understanding of their work and achievements. “It allowed our department to not only step into a micro level but also to step back on the macro level and see that we are doing some good.”

On the micro level, Lee’s department looked closely at programs and events to understand how they were affecting students. “What resonated strongly with the MAAP matrix was that it gave us a chance to go back and allocate our resources a bit differently; maybe more of this type of program versus that type of program,” explained Ji. The ability to reflect on their department enabled clearer communication and more meaningful involvement with students. On the macro level, MAAP was a way for the AACC to take pride in all the work they were doing. Said Ji: “There’s a lot of pride in what we do here at the cultural center, but then to find out the work that you’re doing is being justified because it fits in with this larger mission, that’s a big win.”

The AACC began using the MAAP matrix like a microscope: zooming in to explore individual goals within their department, and zooming out to broaden their view and ultimately connect with the university as a whole. It is this macro scope that creates a stronger sense of community within the AACC as well as within Rutgers University.

Keeping with the science theme, Ji further explained how MAAP provided clear identification of external and internal assessment needs by thinking of Rutgers as the human body. The human body has multiple systems functioning as parts of a whole, analogous to the many departments and units within the University. “It’s important for all systems to be in place in order to have a smooth operating body,” explained Ji. Assessment, and specifically MAAP, allows for the evaluation of these systems and their functions in relation to a larger mission. In this sense, it is important for each part of the body to work on an individual level as well as on a collective level. “When the body needs to heal, just slapping a band-aid on the wound won’t cut it. There needs to be a more holistic approach in order for some institutional change to really take place. On the exterior, the wound may appear to be healed, but in reality it is because all systems on the interior level are working together properly. MAAP allows us to look at the work we do in a more holistic way,” she continued.

Just like focusing a microscope, Ji and her fellow department members were able to place the work they do within a larger frame, achieving greater successes and making more meaningful strides. “Folks can have a greater appreciation for the work that they do. MAAP definitely increases morale and essentially gets us to do our work better.”

As I walked back out onto the main floor of the AACC after our interview, I noticed everyone’s passion and drive as they prepared to welcome new students. The energy flooded the space, and I could tell the AACC’s experience with MAAP truly gave them a deeper sense of pride and community. Ji’s scientific metaphors for understanding assessment ultimately transformed the idea of assessment into a crucial practice that is fun and motivating rather than dreary and exhausting. It’s the reward of positive growth that makes the necessity for assessment clear. Just walk into the AACC and see for yourself.

Cheyenne Seonia is an undergraduate intern with the Rutgers Student Affairs Marketing, Media and Communications department. She will be graduating in the Spring of 2013 with a degree in English.

Ji Lee, Director of the Asian American Cultural Center


Net Promoter Scale
By Patrick Love

It is always my intention to provide a program or conference feedback on my experience. But then I get the survey: 4 pages, 5 pages, 6 pages, or more, with questions about many topics I just don’t care about. Sometimes I don’t finish those surveys. There’s got to be a better way, and I believe I’ve found it.

It is a form of assessment that has been adopted by Apple, Amazon, Allianz, P&G, Costco, Vonage, Intuit, Google, American Express, Verizon, General Electric, Philips, Overstock.com, T-Mobile, eBay, JetBlue Airways, Symantec, TD Bank, and the Vanguard Group. And it is only TWO questions. All of these companies have adopted the Net Promoter Scale (NPS) methodology to assess customer satisfaction and loyalty and to drive organizational improvement.

The NPS was developed by Frederick F. Reichheld and Satmetrix Systems, Inc. It measures customer loyalty and satisfaction, and can be directly linked to organizational growth and profitability. The appeal of the NPS is its simplicity. Derived from a single Likert item, it is easy to understand, administer, evaluate, and explain. NPS assessment programs administer a two-item survey consisting of the Net Promoter Likert item and a second, open-ended question:

1) How likely are you to recommend [company/product/experience] to a colleague or friend? (0 - 10)
2) Why did you give us this score?

Respondents who respond with a 9 or 10 to the first item are classified as Promoters, and those who respond with 7 or 8 are classified as Neutrals. Respondents who answer 6 or lower are classified as Detractors. The Net Promoter Score is calculated by subtracting the percentage of Detractors from the percentage of Promoters. The absolute range is from -100 to +100. What constitutes a good score varies depending on the industry; for example, good hotels expect to be in the 70 range, while 15-20 is considered good for airlines.

Typically, corporations calculate the NPS using repeated, random samples in order to monitor consumer satisfaction over time and gauge the impact of policies and programs. However, one-time administrations are also done frequently. Responses to the open-ended question reflect what is most important to the individual answering the question, not what is important to the company administering the survey. Also, the qualitative data can be analyzed and linked to the score pattern of Promoters, Detractors, and Neutrals in order to create a deeper understanding of participant satisfaction (and dissatisfaction). These analyses can generate actionable items and concrete programmatic changes grounded in robust participant response data. The survey is designed to be neither anonymous nor confidential, because organizations can follow up with both Promoters (e.g., to recruit further participation) and Detractors (e.g., to see if problems are correctable).

To date, there are no published sources documenting use of the NPS in higher education. However, while at Pace University,

[Figure: the 0-10 rating scale, labeled Detractors (0-6), Neutral (7-8), and Promoters (9-10).]

The Net Promoter Scale uses two questions: 1) How likely are you to recommend [company/product/experience] to a colleague or friend? (0 - 10) 2) Why did you give us this score?


Net Promoter Score = % Promoters - % Detractors

Patrick Love is the Associate Vice President for Student Affairs at Rutgers University. His areas of scholarship include assessment, organizational culture, leadership and management issues in student affairs, applying theory to practice, spiritual development, and GLBT issues.

I successfully implemented a Net Promoter program in order to monitor the effectiveness of the Office for Student Success. The results helped to shape our services to students. I also used it to evaluate the 2011 ACPA Convention. Imagine evaluating an entire convention in two questions. The process gave ACPA actionable data to improve future conventions. Try it and let us know your experience!
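The scoring rules described above (Promoters answer 9-10, Neutrals 7-8, Detractors 0-6; the score is the percentage of Promoters minus the percentage of Detractors) can be sketched in a few lines of code. This is an illustrative example only, not part of the original article, and the function name is my own:

```python
def net_promoter_score(ratings):
    """Compute a Net Promoter Score from a list of 0-10 ratings.

    Promoters rate 9 or 10, Detractors rate 6 or lower; the score is
    the percentage of Promoters minus the percentage of Detractors,
    so it always falls between -100 and +100.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# 5 Promoters, 3 Neutrals, 2 Detractors out of 10 responses:
# (5 - 2) / 10 gives a score of +30.
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 8, 5, 3]))
```

Note that Neutrals drop out of the arithmetic entirely; they lower the score only by diluting the share of Promoters and Detractors.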

• Online webinars on assessment
• Online RA training and assessment
• Online professional development workshops
• Hundreds of catalogued websites and listservs
• The internet resource for student affairs professionals


Why I HATE Mission Statements
By Michele Yurecko

A mission statement is defined as “a long awkward sentence that demonstrates management’s inability to think clearly.” All good companies have one. - Scott Adams, creator of Dilbert

I have a confession to make. I hate Mission Statements. There. I said it, and it’s too late to take it back. Given that I am an assessment professional, my aversion to Mission Statements may surprise you, particularly those of you in the administration and evaluation crowd. In the workplace, most of us accept assessment models that place the Mission Statement at the core of institutional or program evaluation. Mission Statements are supposed to define our institutional identity and determine success, but are they really up to the job?

A Mission Statement is a guiding statement of purpose. It is the big idea that characterizes the essential nature, values, and work of an organization. Although the Mission Statement serves the important function of declaring an organization’s purpose and core values, it is important to recognize that Mission Statements come with limitations; limitations that, unfortunately, are rarely discussed. The Mission Statement is viewed as the heart and soul of an organization. We praise “mission driven companies” and “mission focused leadership.” However, the Mission Statement is merely one component in the complex process of assessment, and too much focus on that one component can narrow our vision and draw our attention from other salient factors that have


enormous impact on organizational success.

The first problem I have with Mission Statements is that they are too darn aspirational. They describe what an organization intends to be, without necessarily taking into account what the organization actually is. If you want to know what an organization wants to be when it grows up, read the Mission Statement. If you want to know what an organization actually is, you’re going to have to dig a bit deeper than that. The “boots on the ground” realities, which deeply impact the day-to-day conduct of an organization, can have tremendous influence on the identity of an organization and the success of its efforts. However, the lofty Mission Statement frequently glosses over, or even ignores, these powerful factors.

In their aspirational smugness, Mission Statements are always looking forward; they never look backward. This belies an interesting contradiction with evidence-based practice. We frequently see Mission Statements touted and revered by the same individuals who call for evidence-based decision-making and evaluation. Yet whereas the Mission Statement continually looks ahead, in an evidence-based culture we frequently look back, using lessons from the past to shape practice in the future. Mission Statements articulate the endpoint without capturing the dynamic, developmental nature of organizations.

In addition to being aspirational, Mission Statements are also overwhelmingly affirmational. They are like tiny little cheerleaders, root-root-rooting your organization to success with positive statements. Mission Statements describe what an organization intends to achieve, what legacy it will leave, how it is going to shine. Mission Statements rarely discuss challenges, limitations, or anything that might cast the organization in a bad light. In effect, Mission Statements eliminate an organization’s challenges and limitations from the declaration of its identity. Yet, in reality, those limitations and challenges are key features that reveal hard-won successes and heroic efforts. With its positive and perky language, the Mission Statement can be reduced from a means of institutional renewal to a tool for self-promotion. Rather than foster genuine reflection, evaluation, and change, the Mission Statement can become a means to promote and market an organization with hand-picked evidence and assessment results. While this practice may enhance public perception, it denies an opportunity for real renewal. As pointed out by Robert Stake and his colleagues, “It is difficult to fix weaknesses in an atmosphere of self-promotion.”

Ethical and effective assessment practice must occupy the real estate between what an institution intends to be and what it actually is; between what the Mission states and the story told by the “boots on the ground.” Assessment cannot be limited by the language of the Mission Statement, but must look beyond the aspiration and affirmation to construct a fuller picture of organizational identity. Effective assessment must take into account an organization’s mundane challenges and limitations, as well as its lofty Mission. In constructing this more comprehensive approach, the assessment professional can better serve the honorable goal of organizational improvement and renewal, and avoid the temptation of self-promotion.

Would we all be better off ditching these aspirational, affirmational, self-promoting Mission Statements? Should we just abandon the window dressing and get back to work? Maybe. In spite of the fact that Mission Statements drive me crazy, I’m still not ready to vote them off the island entirely, but they must be placed in “time-out.” When attempting to evaluate organizational effectiveness, it’s always good to practice skepticism and deconstructionism, particularly with regard to the Mission Statement.

I’ll let you in on a little secret. When I started my most recent position as Assistant Dean of Academic Assessment, I didn’t write a mission statement for my office. This wasn’t an oversight or even an act of rebellion, but more of an experiment. I shelved the Mission Statement and instead composed a statement of Assessment Philosophy, one that I hope is powerful and flexible enough to support changing goals and an expanding role on campus. Do I feel a little rudderless without a Mission? No. I feel free and excited about where the “boots on the ground” will take me.

Michele Yurecko is the Assistant Dean of Academic Assessment, The College of St. Elizabeth.

A Mission Statement is a dense slab of words that a large organization produces when it needs to establish that its workers are not just sitting around downloading Internet porn. - Dave Barry, Humorist


The Napkin Board: Rutgers Dining Services’ Low-Tech Assessment Tool
By Patty Rivas

Assessment: as·sess·ment /əˈsesmənt/ to review napkins pinned onto a board by students. Wait, that doesn’t sound right. Actually, it’s exactly right. The “napkin board” has been utilized by Rutgers Dining Services for over 20 years.


I had the pleasure of interviewing Joe Charette, Executive Director of Dining Services. We discussed the various assessment tools Dining Services utilizes, but the napkin board is what stood out to me. The napkin board began when students started writing feedback on napkins and taping them to the wall at dining halls. They would address each dining hall with its own moniker. For example, when writing a note in Brower Commons, they would begin with, “Dear Captain Commons.” Little did these students know, this napkin board would become a major catalyst for a lot of change within Dining Services.

Students are able to leave notes at each dining hall. When they check back, there is an index card with an answer waiting on the student’s napkin. Each dining hall’s unit manager answers the napkins. Unit managers see questions such as “Why do you have Cap’n Crunch but not Cap’n Crunch with berries?” or “You have Skippy peanut butter, but I’m more of a Jif guy.” Joe says they are able to accommodate requests such as these if “it doesn’t cost us any extra.” However, major changes have been implemented as well. One of those is continuous service. Dining halls used to close in between meals in order to set up for the next shift. Many students wrote that if they had class at odd hours, they were never able to eat in the dining hall. Now, each dining hall stays open throughout the day. Other major changes have included take-out services, the sushi line at Brower Commons, and making Greek yogurt available. “It is the best feedback system that we have,” states Joe.

Dining Services has decided to take the napkin board concept and go digital. They now have a Facebook page for each dining hall, a Twitter account, an iPhone app, and a computer kiosk at each dining hall. All encourage students to give their instant feedback. They plan on expanding to other social media outlets, such as Foursquare, in order to have more places where students can give their opinions. “We are able to get information out there. Students aren’t reading flyers, ads in the Targum, or sitting down at desktop computers anymore. They get their information from their phones. We want to relate to students in the way that they want to get their information and also hear what they say to each other,” says Joe.

Every year, thousands of new students come to campus, and every year, there is something different about them. Lately, Joe has noticed that students want more local and sustainable foods, hormone-free foods, and organic food options. Dining Services has tried to accommodate this by being sensitive to allergies and providing various foods for special populations. Because every class may have different opinions, “the challenge is finding out what’s the most popular thing today,” says Joe. Dining Services is constantly assessing the student population, whether it’s through the napkin board or online services.

Of course, Dining Services utilizes traditional assessment methods as well. They use an outside company, Food Insights, to conduct biannual qualitative and quantitative surveys. These surveys are given to faculty and staff, alumni, and students. Various changes have been, or will be, made due to these surveys, such as the addition of a diner and a Starbucks on Livingston campus (so you can thank Dining Services for those!). They also use a system called “Satis Track” to assign surveys to specific groups of students. The students are sent two surveys throughout the semester, with the incentive of a $25 RU Express credit. Both of these assessment tools, as well as social media, will be used to garner feedback once several new dining areas open on Livingston campus, such as those mentioned above and a “green grocer.” Surveys will ask students things like “Are the hours meeting your needs?” or “Is there something you need to leave campus to buy that you can’t find in the green grocer?” Each new unit will also have its

During Hurricane Sandy, many students left thank-you notes for dining hall staff members who were serving food during the storm. The one from Sasha reads, “On behalf of the families in Marvin, Russell, and Johnon apartments. Thank you. You have no idea.”


own Facebook page and napkin board. When asked if he thinks the use of social media will mean less student interaction for him and his dining crew, Joe responds, “Not at all. Social media gives people who don’t feel comfortable or don’t have the time for a face-to-face interaction a way to be heard. People who want to speak with you face-to-face will seek you out.”

Dining Services is using traditional assessment methods, but also has various innovative concepts that create a well-rounded strategy. Joe believes other units are capable of doing the same. He advises units to make it known that they are seeking feedback from students. “Students think, ‘This place is huge; who’s going to listen to me?’” says Joe. “Make it known that what they feel is something you want to hear. You have to encourage that.” His tip is to start by pulling together a random sampling of people you know and asking them to recommend students who are interested and willing to speak with you. “Make students understand that they are the purpose of our work, not an interruption to our work,” he says.

Lastly, Joe emphasizes the importance of having fun. Dining Services holds an Iron Chef competition where students are invited to make dishes out of any of the 5,000 items in the dining hall. The dishes are brought to a panel of judges who evaluate them for plate presentation, taste, and creativity. This event keeps students from getting bored with the food and shows them the many options they have at each dining hall. Also, other students eating at the dining halls see them making these dishes, and then start getting information and feedback from their peers. “This is a more fun way of getting this information out there than a newsletter. It also gives us feedback as to what students would create if they could make the menu,” says Joe. It’s certainly not traditional assessment, but those fun and creative initiatives are what have helped Dining Services make it known that they care about what students want.

Patty Rivas is a second year graduate student in the Rutgers Ed.M. College Student Affairs Program. She is currently an intern at Rutgers Office for Violence Prevention and Victim Assistance.

We make students understand that they are the purpose of our work, not an interruption to our work.

- Joe Charette


Synergistic Assessment Methods
By Gavin Henning

When is assessment not just assessment? When assessment is also educational. The notion that assessment can also be education is confusing to many folks, especially if they view assessment as an activity that is completed at the end of an interaction such as a program, service, or meeting. Synergistic assessment is assessment that provides data to help understand whether goals were achieved and where improvements can be made, and that helps foster learning.

Thinking in this synergistic way requires a shift in our mindset. Student affairs educators need to be actively learner-centered. If educators are learner-centered, then assessments will be focused on what students are receiving from an experience, not on what they are doing as educators, programmers, or administrators. The result is that intentions will be focused on fostering learning in multiple ways. Some methods are more adept at implementing this type of synergistic assessment than others.

In 1993, Thomas Angelo and Patricia Cross published an informative book entitled Classroom Assessment Techniques. This book was meant to be an educational assessment guide for college teachers, but there are benefits for student affairs educators. Most of the 50 techniques in this book can be used in student affairs, and virtually all of the assessment methods identified in the book foster learning. Let me highlight just a few of these instructive methods so you can see the synergy of partnering assessment with learning.

A useful classroom assessment technique is the 1-minute paper. There are many ways to adapt this technique for a variety of learning experiences; the 1-minute paper really is the Swiss Army knife of assessment methods. After an interaction, which could be a program, a meeting with a student, a student organization meeting, or even a conduct hearing, the educator takes a few moments to ask the student to respond to a question on an index card. The question is usually some form of “What was your major take-away from this activity?” In addition to allowing the educator to learn whether the student took away from the activity what she was hoping, this method also creates what can be called a “reflection trap.” The 1-minute paper creates a space for the student to reflect on what he or she learned: a contemplative oasis in a desert of overstimulation. It prompts the student to reflect on their learning when there are few other opportunities to do so. Another benefit of the synergistic approach of the 1-minute paper is its ability to assess learning at all levels of Bloom’s revised cognitive taxonomy (below).

What are five ways identified in the readings that can help you reduce stress? (Remembering)

Based on the floor meeting, what do you believe are at least three reasons we don’t allow alcohol in the residence halls? (Understanding)

How can you use what you learned in these activities in your student organization? (Application)

After participating in the ropes course, describe the keys to success for the group. (Analyzing)

As you reflect on this past year as I have served as your organization’s advisor, in what areas have I been most effective and in what areas can I improve? (Evaluating)

Based on our conversation regarding potential careers, what would be four steps in your action plan? (Creating)


The “muddiest point” is a classroom assessment technique that is the opposite side of the 1-minute paper coin. Where the 1-minute paper helps both student and educator understand what the student learned, the muddiest point helps both understand what the student doesn’t know or is struggling to learn. The muddiest point can be used in conjunction with the 1-minute paper exercise by having students use the back side of the index card. This method also serves as a reflection trap, providing students an opportunity to reflect on what they are confused about from the learning activity or what they would like to know more about. Questions are usually a variation of “What are you still confused about?” or “What would you like to know more about?” This helps the educator know what he or she needs to explain better next time. If there is an opportunity to follow up with the student or the group, the educator can also resolve the confusion.

Directed paraphrasing is another classroom assessment technique that fosters student learning. This method challenges learners to paraphrase a technical statement or passage into language that is more colloquial. This is a fabulous technique for RAs. One of the most challenging tasks for RAs is describing the alcohol policy to students on their floor during the first meeting in the fall, an often-scary experience. Directed paraphrasing can be a great teaching tool here. During a 1-on-1 meeting between hall director and resident assistant before the first floor meeting, the hall director can ask the RA to paraphrase how he or she would describe the alcohol policy. This method allows the hall director to know whether the RA can accurately summarize the policy in student language. It also provides the RA the opportunity to consider how she would like to paraphrase the policy and to practice this statement in a supportive environment. This recitation can help ease the discomfort of the RA, allowing for effective communication between RA and student.

There are many more assessment techniques that foster learning in addition to helping one understand whether a learning activity or interaction achieved its espoused goals, or to identify ways to improve implementation of the activity. Synergistic assessment begins with a learner-centered individual who sees assessment as more than just an activity and who continuously seeks new and innovative ways to help students learn. I challenge you to be that synergistic assessor. In doing so, you will become a better educator.

Gavin Henning has nearly 20 years of experience in student affairs and over 10 years’ experience in student affairs assessment and planning. He teaches student affairs assessment and research in the master’s programs at New England College and Salem State University. Gavin’s scholarship has been published in professional journals and he has been an invited speaker at regional and national conferences.

Write for AiA! We’re Looking for Contributors From RU and Beyond

It’s not about publish or perish. It’s about making the culture of assessment visible on campus. It’s about making assessment part of every program. It’s about breaking the stigma of assessment as busy work. Would you like to learn more about assessment by writing about it? If you would like to find out more about the opportunity to write for AiA, please send us an email: [email protected]


Interview with Assessment Expert: John Schuh
By Charles Kuski

Dr. John Schuh was gracious enough to sit down and talk with me about how the subject of assessment has changed and evolved since he started writing the books that are now used as standards in the field. While Dr. Schuh recently “retired” as a Distinguished Professor of Educational Leadership and Policy Studies at Iowa State University, he still serves as an Editorial Consultant for Jossey-Bass, identifying and advising authors in the higher education and student affairs areas. He admits that he looks at his work schedule and questions whether or not he’s really retired. Dr. Schuh comes from an impressive Residence Life background and served as the Associate Vice President for Student Affairs at Wichita State University. He also held administrative positions at the University of Kansas, Indiana State University, Indiana University Bloomington, and Arizona State University.

Even though Dr. Schuh has authored and coauthored multiple books on assessment in student affairs, he still describes his introduction to assessment as “falling into it.” Having never taken a formal assessment course while getting his Master’s in Counseling at Arizona State University, Dr. Schuh received his first lesson in assessment in the form of a 1976 Western Interstate Commission for Higher Education (WICHE) seminar by Dr. Ursula Delworth while he was working for the Residence Life Office at ASU. WICHE offered a simple promise: they would teach a team of “student personnel workers” (this was 1976, after all) an ecosystem assessment model in exchange for their agreement to actively use the model at their institution.

Dr. Schuh contrasts this story with the emphasis on effective assessment initiatives that we see in student affairs today. Dr. Schuh stated that when he attended that workshop, assessment with the sophistication we know today just wasn’t conducted. In addition, there was no professional development for assessment, no convention or conference workshops on the topic, and certainly no CAS standards. Even in 1996, when Dr. Schuh and Dr. Lee Upcraft published Assessment in Student Affairs, assessment was broadly defined as “keeping track of services,

Assessment ought to be rewarded but I think in our contemporary environment, this ought to be thought of as a routine aspect of student affairs work. Programs that aren’t engaged in routine assessment are simply incomplete. - John Schuh


programs, and facilities, and whether or not such offerings have the desired impact.” Dr. Schuh contends that the now-sixteen-year-old definition really captures only an accountability dimension. In practice today, that accountability dimension certainly still exists, but the focus now is on assessment and evaluation of what students have learned, how they have learned, and how they have changed and developed.

Dr. Schuh believes this could be due to a paradigm shift in student affairs as well as a changing external environment. Today, student affairs professionals measure departmental and divisional effectiveness more in terms of student learning and development than simply student satisfaction. Looking beyond the field, Dr. Schuh highlights a change in environment, especially with regard to governmental interests in higher education. There is more pressure on practitioners today to be able to demonstrate effectiveness to the stakeholders of a college. Not engaging in routine assessment can leave an institution vulnerable to scrutiny from that external environment. Dr. Schuh asks, “How do you know if you’re doing a good job or not? It’s got to be more than, ‘we know what we’re doing, take our word for it.’” With regard to the changing environment, Dr. Schuh recounts simply, “that isn’t the way it’s always been.” As a broad statement, the luxury of labeling student affairs as a “faith-based profession” may be well behind us.

Dr. Schuh sees both pros and cons in the shift to an assessment focus. While assessment adds a dimension of pressure and stress to creating and improving programs and initiatives, it also suggests that there exists a mindset at the birth of a

There’s a mindset created when you take on a project that you need to completely close the loop and have a means for measuring the extent to which a specific initiative, project, or pursuit has been successful. And that’s not bad—not by a long shot. - John Schuh


project that we should completely close the loop. That is, we should have a means for measuring the extent to which a specific project has been successful. Dr. Schuh very clearly asserts, “it ought to be thought of as a routine aspect of student affairs work. Programs that aren’t engaged in routine assessment are simply incomplete.”

For a practitioner, engaging in routine assessment seems easier said than done. When Dr. Schuh started publishing books on the subject, much of the resistance to assessment came from those who believed assessment was overwhelmingly about statistics: someone with no background in statistics couldn’t possibly assess effectively. Dr. Schuh advocates keeping assessment as simple as possible. As student affairs moves from a focus solely on assessment for accountability to a focus on the assessment of student development and learning, the not-so-statistically-literate become more fluent.

As a result of successful assessment initiatives, Dr. Schuh explained, universities are able to assert their effectiveness and influence more boldly than they could thirty years ago. For example, at the University of South Carolina, there is an initiative to see whether students in a certain first-year transition program are more likely to be retained and graduate on time. At the Ohio State University, there is a push to conceptualize and improve the sophomore-year experience, including the redevelopment of a sophomore residence area. Dr. Schuh described these and other initiatives with excitement that universities around the country are engaging in assessment to create positive, sweeping changes to their students’ experience.

As Dr. Schuh gently reminded me at the beginning of the interview, by the time I was born in 1989, he had already been working full-time in the field for 19 years. Of course, he didn’t say that to be rude, but rather to underscore the vast change that has occurred in our field over time. During that time, he’s seen the field of student affairs go from one ecosystem model to complex assessment and evaluation models spanning years and touching upon everything from the minute details of a student’s experience to the entirety of a student’s developmental process in college. It remains clear that assessing and evaluating is simply about how best to serve our students. I’m thankful that Dr. Schuh accidentally fell into assessment and brought it to the forefront of our profession. Be sure to look for Dr. Schuh’s monograph on this topic in the New Directions monograph series, published by Jossey-Bass, due to arrive in the spring of 2013.

Assessment used to be just keeping track of how many students participated in a particular program or how many patient visits there were to the student health service. I think we’re past that and getting much more into what students have learned and how experiences are making a difference in their lives. - John Schuh

Charles Kuski is a second-year graduate student in the Rutgers Ed.M. College Student Affairs Program. He is currently a Hall Director at Rutgers University.

Telling Stories: A Data Sharing Pilot
By Lisa Endersby

Time and time again, I heard the phrase “Data rich, but information poor.” Say the word assessment in almost any student affairs office, and the conversation immediately turns to the data: piles of paper surveys, percentage scores, and student testimonials for marketing brochures. The data is everywhere and anywhere we look, but what happens to it after the report is written? To add to this poverty of information, much of our data collection is done in silos. We hunt and gather data much like an animal might prepare for winter hibernation. We shore up our data resources, our sights set on the ‘one day’ it will be useful. Particularly in larger departments, these stores get divided between different areas, each holding onto data that could inform and celebrate our work.

In an attempt to share our rich data resources across the department, I recently piloted a data sharing initiative at our staff retreat. Knowing that assessment and data are still tough words for some in our office, I titled the project ‘Storytelling at the Student Experience Center Retreat’. With a charge to tell their piece of our students’ story, my colleagues worked to create posters that would serve as a visual representation of the data they had collected. This retreat and initiative was timely; we had just come out of a busy summer of orientation and transition programs, along with the launch or re-launch of new and improved programming. With all of this activity came a flurry of assessment and report writing. Reports and presentations were shared with directors, higher-level committees, and some students, but never, surprisingly, with each other.

Thanks in no small part to a talented and patient Multimedia Developer, almost every colleague and unit created a poster to share numbers and testimonials. From infographics to pie charts, to Wordles and event photos, each poster told a story. I kept reinforcing the need to tell our students’ stories: we each have a piece of the puzzle, a chapter of the larger novel, but we haven’t been able to put those pieces or pages together. After a brief introduction, I handed over the finished products to their creators so they could hang them around the room. Over lunch, I couldn’t help but swell with pride watching my colleagues examine these works of art.

Using a one-minute paper, I asked colleagues to reflect on one new thing that they learned after viewing the posters. Many reflections spoke of surprise and awe over our facts and figures, noting “how highly regarded our programs [are] by survey respondents.” They discovered “how amazing our assessment results and feedback [have] been” and “how a poster can tell a story.” Colleagues who had no or very little involvement with our orientation programs saw “how these events really connect students to campus.” Perhaps most importantly, my colleagues saw “that [our] students are engaged and hopeful about the future.”

This is only the beginning of their story. In creating these posters, we see now that the students are truly the authors of their own experience; we are the agents of change, the publishers and book binders who will help navigate the plot twists and help bring their stories to life.

Lisa Endersby has taken on the title of ‘Advocate for Awesome.’ Her work in higher education spans defining and chasing student success in leadership development, career services, community engagement, and, her most recent love, assessment. Lisa has presented and facilitated at numerous local and national conferences. Lisa is an avid user of social media. Follow her on Twitter (@lmendersby) to keep the conversation going.

Rutgers University does not discriminate on the basis of race, color, national origin, sex, sexual orientation, gender identity or expression, disability, age, or any other category covered by law in its programs, activities, or employment matters. The following people have been designated to handle inquiries regarding the non-discrimination policies:

Judy Ryan, Title IX Coordinator for Students & ADA/Section 504 Compliance Officer
Office of the Vice President for Student Affairs
83 Somerset Street, Suite 101, CAC
p. 848-932-8576
[email protected]

Jayne M. Grandes, Director
Office of Employment Equity, University Human Resources
57 US Highway 1, ASB II, Cook Campus
[email protected]

For further information on notice of non-discrimination, visit http://wdcrobcolp01.ed.gov/CFAPPS/OCR/contactus.cfm for the address and phone number of the Office for Civil Rights that serves your area, or call 1-800-421-3481.

Save the Date!

2013 Rutgers Student Affairs Awards for Excellence
May 22nd, 3-5 p.m.