
Assessment Review Committee Report for 2012-2013 Student Affairs


Page 1: Assessment Review Committee Report for 2012-2013 Student Affairs

Institutional Planning, Assessment & Research


Assessment Review Committee Report for 2012-2013

Student Affairs

Kathleen Hill

Director for Assessment, Research & Retention

Prepared for Presentation on March 4, 2014


Page 2: Assessment Review Committee Report for 2012-2013 Student Affairs

Mentoring/Review Process

• Decision made by the SA Executive Council that the ARC should be composed primarily of individuals NOT currently serving on the Student Affairs Assessment Team but who would be effective contributors to the review process

• Assessment Review Committee finalized in August 2013:
  – Gretchen Brockmann – Campus Living
  – Kathy Hill – SAARR
  – Susan Morrissey – IPAR (OIA Rep)
  – Bernie Schulz – SA Administration
  – Meagan Smith – Student Involvement & Leadership
  – Laura Sweet – Dean of Students

Broad representation across the Division of Student Affairs, assessment knowledge and experience, and a critical eye for constructive comments


Page 3: Assessment Review Committee Report for 2012-2013 Student Affairs

Mentoring/Review Process

• Committee training held on September 10

• Committee group review session held on September 20:
  – Looked at a 2011-2012 review exemplar, attending to feedback language
  – Completed two 2012-2013 reviews as a group
  – Discussed feedback strategies and inter-rater reliability considerations

• 32 unit reports: each reviewer was assigned 5-6 unit reports and reviewed them individually; reviewers did not review any units with which they had a direct affiliation

• Initial messaging regarding process and timeline shared with SAEC, SALT, and SAAT in September; regular updates provided at respective leadership meetings

• Face-to-face feedback: Kathy Hill met with “clusters” of units to review feedback, discuss steps for addressing Developing ratings, and integrate improvements with the 2013-2014 assessment plans/processes in progress


Page 4: Assessment Review Committee Report for 2012-2013 Student Affairs

2012-13 Component Data

Component                     Developing   Acceptable   Proficient
Outcome                           20           48           27
Means of Assessment               46           46            3
Criteria for Success              47           24           19
Results                           43           30           22
Actions Taken                     51           24           20
Follow-Up to Actions Taken        51           37            7
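For reference, here is a minimal Python sketch (not part of the original presentation) showing how the per-component percentage breakdown visualized on the next slide, and a division-wide aggregate, could be computed from the counts above. The aggregate percentages reported on the comparison slide were compiled from the full review data, so a recomputation from this table alone may differ slightly.

# Rating counts per component: (Developing, Acceptable, Proficient)
counts = {
    "Outcome":                    (20, 48, 27),
    "Means of Assessment":        (46, 46, 3),
    "Criteria for Success":       (47, 24, 19),
    "Results":                    (43, 30, 22),
    "Actions Taken":              (51, 24, 20),
    "Follow-Up to Actions Taken": (51, 37, 7),
}
labels = ("Developing", "Acceptable", "Proficient")

# Per-component percentage breakdown (the stacked-bar view on the next slide)
for component, row in counts.items():
    total = sum(row)
    shares = {label: round(100 * n / total) for label, n in zip(labels, row)}
    print(f"{component}: {shares}")

# Division-wide aggregate across all six components
totals = [sum(col) for col in zip(*counts.values())]
grand_total = sum(totals)
aggregate = {label: round(100 * n / grand_total) for label, n in zip(labels, totals)}
print("Aggregate:", aggregate)
print("Acceptable or Proficient:", aggregate["Acceptable"] + aggregate["Proficient"], "%")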


Page 5: Assessment Review Committee Report for 2012-2013 Student Affairs

Data Visualization

[Figure: 100% stacked bar chart showing the share of Developing, Acceptable, and Proficient ratings for each component (Outcome, Means of Assessment, Criteria for Success, Results, Actions Taken, Follow-Up)]


Page 6: Assessment Review Committee Report for 2012-2013 Student Affairs

Aggregate Comparison: 2011-2012 and 2012-2013

Division of Student Affairs Aggregated 2012-2013 Assessment Report Review Summary (D = Developing, A = Acceptable, P = Proficient):

• D = 48%
• A = 36%
• P = 16%
• A + P = 52%

Division of Student Affairs Aggregated 2011-2012 Assessment Report Review Summary (D = Developing, A = Acceptable, E = Exemplary):

• D = 51%
• A = 42%
• E = 7%
• A + E = 49%


Page 7: Assessment Review Committee Report for 2012-2013 Student Affairs

2012-13 Best Practices – “Closing the Loop”

• Marketing: “Effectiveness” operational outcome assessment related to the project ticketing system; the unit has been able to resolve issues around timeliness, process, and product dissatisfaction. The SA ticketing system has been adopted by institutional marketing.

• Volunteer and Service-Learning Center: Outcomes refined, multiple measures with appropriate criteria for success, documented results, and actions generated from results; evidence that assessment is fully embedded in the practice of the VSLC.

• Development and Parent Services: New leadership led to a new assessment plan, integrating strategic planning and the annual assessment cycle.

• CRW Wellness: The Wellness Passport outcome is an exemplar of an “authentic” embedded assessment process with consistent “closing the loop,” producing ongoing and immediate adjustments to program planning and implementation.


Page 8: Assessment Review Committee Report for 2012-2013 Student Affairs

Substantive Changes: Divisional Perspective

• Focus on internalizing responsibility for the assessment cycle among SA leadership and staff at large: division alignment with institutional practices, regular briefings regarding progress and needed improvements, and accountability support

• Annual Division Assessment Retreat: Scheduled in May in support of August reporting

• Rollout of New Reporting Framework: Results and planned actions reported first, followed by an implementation report; positive reception!

• Cycle Emphasis: “learning” outcomes, multiple measures, and results-to-actions reporting “logic” and documentation

• Strategic Plan to Outcomes Assessment Integration: Stronger alignment with the current planning process is in progress


Page 9: Assessment Review Committee Report for 2012-2013 Student Affairs

Rubric and Review Process Feedback

Worked well:

• IPAR training followed by the Committee “group review” session: working through examples and completing several reviews together put reviewers “on the same page”

• Consultation with the ARC chair throughout the process, addressing questions, etc.

• Not having committee members review unit reports for units with which they have an affiliation, preventing both bias and the perception of bias

Difficulties:

• Two members were new to the university and were still learning ECU systems and terminology

• Technology: reviewers were timed out of the system and had to re-enter reviews

• Incomplete unit reports delayed the review process


Page 10: Assessment Review Committee Report for 2012-2013 Student Affairs

Rubric and Review Process Feedback

Suggestions for Improvement

• Integration emphasis: the unit assessment cycle, review feedback, and improvements applied both to the report reviewed and to the current year’s assessment cycle

• Continue to engage SA leadership and staff in “improvement” feedback and implementation to build ownership and accountability

• Focus on inter-rater reliability

• Formalize a “sample” unit report for practicing unit review (independent of group review)

• Committee members should review the same unit reports for the 2013-2014 review cycle, assessing unit progress and ensuring consistency in feedback

• “Refresher” training prior to launching the 2013-2014 review process
