

Rockingham Community College

Planning Process Guide for Institutional Effectiveness

Approved December 6, 2017

Updated December 21, 2017

Page 2: Planning Process Guide for Institutional Effectiveness · 6 Formatted: Right is a formal process that allows the institution to document continuous improvements and be accountable

2

Formatted: Right

Table of Contents

Section I: Introduction
Section II: Institutional Effectiveness
Section III: Terms and Definitions
Section IV: Academic Planning Unit Plans
Section V: Administrative Support Unit Plans
Section VI: Assessment/Data Timeline
Section VII: Internal Audit
Section VIII: General Education Assessment
Section IX: 5-Year Program Review
Section X: CEWD Accountability and Integrity Plan
Section XI: RCC Strategic Plan
Section XII: Appendices


I. Introduction

Mission

The mission of Rockingham Community College is to enhance individual and community success in Rockingham County through education as well as full development of human potential, employment assistance, service to business and industry, and contributions to cultural and social development.

Purpose

Rockingham Community College was chartered in 1963 as a comprehensive, public community college with an open door admissions policy. Our purpose is to provide individuals with high quality, economical, and convenient educational opportunities consistent with student and community needs. To fulfill its mission and achieve its purpose, Rockingham Community College provides:

• support services to assist individuals in problem-solving and in their personal, career, and academic planning;

• courses for students who wish to complete the HSE (high school equivalency) or develop basic skills and competencies;

• training for employment in industrial and technical fields;

• courses to develop the skills of underprepared students;

• freshman- and sophomore-level courses transferable to other colleges and universities;

• courses to develop and upgrade students' vocational, technological, occupational, and academic skills and competencies;

• in-service and start-up training for area business and industry; and

• opportunities for continuing personal growth and cultural and academic enrichment for students and the community.

Values

Commitment to Students

We believe that each person is important. We appreciate the diversity in the students we serve. We treat our students with respect and fairness. We are committed to giving students individual assistance and support. We provide an educational environment that encourages students to progress to their maximum potential.

Commitment to Access

We believe that the programs and services of the college should be equally available and accessible to all.

Commitment to Excellence

We believe that each individual should strive for excellence, and we value a job well done. We aim for the highest level of professionalism, competence, and productivity as standards for our college. We aim for responsible participation and high achievement as standards for our students and our community. As role models, our faculty and staff strive to improve the chance of success for each student.


Contribution to the Community

We are committed to enhancing the quality of life, increasing the value of education, and promoting unity within the community to achieve social and economic success.

Quality Work Environment

We recognize the importance of faculty and staff through open and honest communication and appropriate involvement in planning and decision-making. We encourage responsible and creative risk-taking, recognize and reward exceptional performance, and provide for professional development.

Motto

Learning, Service, Leadership

Rockingham Community College Strategic Plan 2016-2019

RCC's Strategic Plan is reviewed and presented to the Board of Trustees biannually, in March and September. The strategic plan is maintained in Compliance Assist and posted on the Employee Portal under the Office of Institutional Effectiveness.


II. Introduction to Institutional Effectiveness

Beginning in 2017, RCC modified its planning, assessment, and improvement efforts to align with the Southern Association of Colleges and Schools Commission on Colleges' new principles. The institution's efforts included:

1) forming a XX Strategic Planning Management Committee charged with guiding the college to compliance;
2) creating the Office of Technology and Institutional Effectiveness to officially acknowledge the permanent role and structure of institutional effectiveness at RCC;
3) developing the position of Associate Vice President for Technology and Institutional Effectiveness to lead and facilitate all planning, assessment, and improvement activities;
4) establishing a college-wide committee structure and outcomes assessment process to provide opportunities for all employees to participate actively in planning, assessment, and improvement efforts;
5) crafting and distributing an institutional effectiveness plan and handbook outlining RCC's outcomes assessment and strategic planning processes; and
6) providing opportunities for, and requiring attendance at, professional development trainings focused specifically on assessment, documenting assessment results, and communicating use of results for continuous improvement.

The 2017-2018 Planning Process Guide for Institutional Effectiveness captures the guiding principles underlying RCC's current planning, assessment, and improvement processes.

What is Institutional Effectiveness?

Institutional effectiveness is the measure of how well an institution is achieving its mission and goals through the alignment of its planning, resource allocation, and assessment processes. The purpose of the institutional effectiveness process at RCC is to demonstrate continuous improvement in student learning, academic programs, and administrative and student support services. Institutional effectiveness consists of strategic planning, program review, academic and administrative support unit assessment, and general education assessment.

What is the Purpose of Assessment?

Assessment is simply the tool used to determine the degree of institutional effectiveness. You must assess in order to demonstrate how effective you are. Assessment attempts to answer the following question: Are your efforts bringing forth the desired results? Effective program and general education assessment can be used to improve student learning, facilitate academic and institutional improvements, and validate institutional effectiveness. Assessment is a tool to be used for institutional improvement and improvement in student learning. In addition, assessment


is a formal process that allows the institution to document continuous improvements and be accountable to constituents of the College and to programmatic and institutional accrediting bodies.

A Requirement for Accreditation

The demonstration of institutional effectiveness is an important component of the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) accreditation process. The elements of institutional effectiveness are outlined in Section 7, Institutional Planning and Effectiveness, and Section 8 of the Principles of Accreditation.

Section 7. Institutional Planning and Effectiveness

Effective institutions demonstrate a commitment to the principles of continuous improvement. These principles are based on a systematic and documented process of assessing institutional performance with respect to mission in all aspects of the institution. An institutional planning and effectiveness process involves all programs, services, and constituencies; is linked to the decision-making process at all levels; and provides a sound basis for budgetary decisions and resource allocations. The Quality Enhancement Plan (QEP) is an integral component of the reaffirmation of accreditation process and is derived from an institution's ongoing comprehensive planning and evaluation processes. It reflects and affirms a commitment to enhance overall institutional quality and effectiveness by focusing on an issue the institution considers important to improving student learning outcomes and/or student success.

1. The institution engages in ongoing, comprehensive, and integrated research-based planning and evaluation processes that (a) focus on institutional quality and effectiveness and (b) incorporate a systematic review of institutional goals and outcomes consistent with its mission. (Institutional Planning) (Core Requirement)

2. The institution has a Quality Enhancement Plan that (a) has a topic identified through its ongoing, comprehensive planning and evaluation processes; (b) has broad-based support of institutional constituencies; (c) focuses on improving specific student learning outcomes and/or student success; (d) commits resources to initiate, implement, and complete the QEP; and (e) includes a plan to assess achievement. (Quality Enhancement Plan)

3. The institution identifies expected outcomes of its administrative support services and demonstrates the extent to which the outcomes are achieved. (Administrative Effectiveness)

Section 8. Student Achievement

Student learning and student success are at the core of the mission of all institutions of higher learning. Effective institutions focus on the design and improvement of educational experiences to enhance student learning and support student learning outcomes for their educational programs. To meet the goals of educational programs, an institution provides appropriate academic and student services to support student success.


1. The institution identifies, evaluates, and publishes goals and outcomes for student achievement appropriate to the institution's mission, the nature of the students it serves, and the kinds of programs offered. The institution uses multiple measures to document student success. (Student achievement) (Core Requirement)

2. The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of seeking improvement based on analysis of the results in the areas below:

a. student learning outcomes for each of its educational programs (Student outcomes: educational programs)

b. student learning outcomes for collegiate-level general education competencies of its undergraduate degree programs (Student outcomes: general education)

c. academic and student services that support student success (Student outcomes: academic and student services)

In an effort to carry out the institutional mission and support the strategic plan, RCC's academic and administrative support planning units develop measurable outcomes, analyze data, and use results for continuous improvement. Systematic assessment of outcomes is important because it benefits students, faculty, administrators, staff, and external stakeholders.

For students, outcomes assessment will:

• Communicate clear expectations about what is important in a course or program
• Inform them that they will be evaluated in a consistent and transparent way
• Reassure them that there is common core content across all sections of a course
• Allow them to make better decisions about programs based on outcomes results

For faculty, participating in outcomes assessment will:

• Determine what is and what is not working in courses or programs
• Facilitate valuable interdisciplinary and intercampus discussions
• Provide evidence to justify needed resources to maintain or improve programs
• Communicate success to individuals outside their area (e.g., administrators, legislators, employers, prospective students, transfer institutions)
• Provide assurances that all faculty teaching the same course agree to address certain core content

For administrators, implementing college-wide outcomes assessment will:

• Demonstrate an institutional commitment to continually improving the academic programs and services offered by the College

• Provide valuable data to support requests for funds from state and local government and private donors

• Demonstrate accountability to funding sources
• Provide valuable data for academic planning and decision-making


• Enable communication to elected officials, legislators, local business and industry, and potential donors about the college's impact on students and the community

For staff, the implementation of outcomes assessment will:

• Provide evidence for the effectiveness of the units in which they work
• Provide information to assist in making improvements in service delivery
• Demonstrate the support provided for student learning
• Provide staff with data to assist in departmental decision making

For community partners, outcomes assessment will:

• Provide transparent measures of success and effectiveness of the institution
• Allow the institution to serve as a resource when recruiting for business and industry
• Provide a trained and competent workforce

The Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) requires institutions to outline expected outcomes and identify how these outcomes are measured and, most importantly, how the results are used to make improvements in programs, services, and operations. There is no one right way to conduct outcomes assessment, and there are many proven approaches. Rockingham Community College's assessment of outcomes falls into four main categories:

• Institutional Strategic Planning
• Administrative Support Unit Planning
• Academic Program Planning
• General Education Assessment

Outcomes assessment is the responsibility of everyone employed at the College. Outcomes assessment is a function of all academic and administrative support units. The main objective is to demonstrate that the College's overall planning process is supported through outcomes assessment. The goal is to be able to understand the quality of student learning, to enhance teaching, and to improve institutional quality. The Institutional Effectiveness Guide is designed to provide all members of the campus community an overview of the planning process and outcomes assessment procedures at Rockingham Community College. The document provides the following:

• definitions of important terms relative to the College's assessment process
• directions for developing outcomes assessment plans, including examples of various types of outcome measures and assessment statements
• the procedure for completion of internal audits of assessment plans
• a timeline for the assessment cycle at the College
• the forms that are used to outline administrative outcomes, program outcomes, and student learning outcomes for academic units and support units


III. Terms and Definitions

Outcomes Assessment: Assessment is a systematic and ongoing method of gathering, analyzing, and using information from measured outcomes to improve student learning. Assessment is a process in which expectations are made explicit and published; criteria are set for measuring quality; evidence is gathered, analyzed, and interpreted systematically; and results are used to document, explain, and improve performance. Outcomes are written in present or future tense and are measurable.

Institutional Strategic Planning Outcomes (SPO): Institutional Strategic Planning Outcomes are reflective of the values of the entire institution. They are tied directly to the mission and purpose of the College. What does the institution want for students, employees, and its community? Sources of data to develop the strategic plan might come from surveys, forums, focus groups with representatives of all constituents, and SWOT analysis.

Administrative Support Unit Outcomes (ASUO): Administrative outcomes are not direct measures of student learning but are related to student learning and do provide a unit with useful information regarding its efficiency and effectiveness. These outcomes are primarily used by support units that do not directly interact with student learning or the completion of programs.

For example, "the Business Office will offer effective training to faculty and/or staff on using the new insurance registration web page. (XX)" However, programs that do deal directly with student learning might also be interested in assessing administrative outcomes. An example would be, "the Electronics Engineering Technology program will increase the number of students in the program by 15%." A notation should be made of each administrative outcome indicating support of the College's strategic plan.

General Education Student Learning Outcomes (GESLO): General Education Student Learning Outcomes are specific statements that describe the general intellectual skills all students will acquire regardless of their chosen educational program. General education student learning outcomes are assessed by the college using direct, authentic student artifacts and scoring them with common rubrics. Examples of other means of assessment for GESLOs are a standardized, direct measure and a standardized, indirect measure (e.g., the Noel-Levitz Student Satisfaction Survey) to assess student learning.

Academic Unit Administrative Outcome (AUAO): Academic unit administrative outcomes are about the academic program itself or the faculty who teach in that program. AUAOs provide an academic program with knowledge regarding other indicators of program effectiveness and relevance. An example would be, "the Electronics Engineering Technology program will increase the number of students in the program by 15%."


Program Outcomes (PO): Program outcomes describe what service providers expect students to achieve once they complete a program or receive services. Is the program effective and efficient? In some cases, program outcomes will be supported by evidence gathered while the student is still progressing through the program, but more often after the student has graduated. These outcomes measure the gains to students (graduates) after completing the program. Data may be gathered by the college or by an outside agency. Two examples are the average GPA of transfer students provided by the University of North Carolina General Administration (UNC-GA) and employer surveys (College initiated) to address the job preparedness of former students. Program outcomes could also include employment rates, licensure pass rates, and student memberships in professional organizations. A notation should be made of each program outcome indicating support of the College's strategic plan.

Program Student Learning Outcomes (PSLO): Program student learning outcomes are specific statements that describe what students will know (cognitive), think (attitudinal), or do (behavioral) as they progress through and complete a program or receive services. (Note: do not confuse this with a Program Outcome (PO), above; this is a student learning outcome.) Program student learning outcomes are assessed via direct and indirect, standardized and authentic, and/or formative and summative measures. For example, faculty may create surveys which question students about their perception of their learning (authentic, indirect measure) or use item analysis of answers to a national standardized licensure exam such as the National Council Licensure Examination (NCLEX) (standardized, direct measure). An example to consider for the Academic Resource Center (ARC) would be scores on the Noel-Levitz Satisfaction Survey indicating student satisfaction with services offered (standardized, indirect measure).

Course Student Learning Outcome (CSLO): Course student learning outcomes are specific statements that describe what students will know (cognitive), think (attitudinal), or do (behavioral) as they progress through a course and/or upon completion of a course. Course student learning outcomes are assessed using faculty-created common assessments to measure student acquisition of course content. The CSLOs will be common across courses and listed on course syllabi.

Assessment: Assessment is the process of gathering and discussing information from multiple sources in order to determine what students know, understand, and can do as a result of their academic experiences. Assessment results are used to improve student learning and program performance.

Means of Assessment

Embedded Assessment: Embedded assessments are built directly into a course or program and are reflected on course syllabi or other program requirement documentation. When assessment is embedded in existing course assignments or program requirements, it is less obtrusive and more valued by faculty and students. This is not a guarantee students will perform their best, but it is more likely they will perform better than if the assignment has no value in the course. Requiring work of students only to satisfy an assessment request will be detrimental to the learning process.

Types of Assessment:

• Direct Assessment – Direct assessment measures student performance of identified learning outcomes based on well-defined criteria. Direct assessments can be authentic (faculty-created assessment tools) or standardized (nationally normed tests). Examples of direct assessments include pre-tests and post-tests; common embedded questions; writing assignments; projects; lab practical exams/check-offs; laboratory reports; standardized exams (e.g., NC DAP); portfolio evaluation; and video or audio recordings of performance. Direct assessment, whether standardized or authentic, may be formative or summative in nature. Formative measures assess student performance of identified learning outcomes based on defined standards of performance prior to the completion of a task, for the purpose of improvement. Summative measures assess student performance at the completion of a task, and results are used for improvements. An example of a formative, direct, authentic assessment measure would be item analysis of a pop quiz: the most-missed items could be used to identify concepts that need additional reinforcement in class prior to the test. An example of a summative, direct, standardized assessment measure would be the pass rate for a national licensure exam such as the Certification in Surgical Technology (CST).

• Indirect Assessment – Indirect measures assess opinions or thoughts about student knowledge, skills, attitudes, learning experiences, and perceptions of services. Indirect measures of student learning can also be authentic, standardized, formative, and/or summative. Examples of authentic, indirect assessments are focus groups, graduate surveys, and employer surveys. An example of a standardized, indirect measure is the Noel-Levitz Student Satisfaction Survey.

Outcome Indicator/Assessment Method – Description of the means of assessment and plan for data collection and analysis. Means of assessment are the markers that indicate that the expected outcomes from a course or program have been met. Note: final course grades should not be used as a means of assessment. A final course grade represents an aggregate assessment of one student's entire body of work for the course, which may include participation, course attendance, or extra credit; a commonly used term in education to describe this is "grade inflation." Because we cannot assure that faculty teaching a particular course assign final course grades in exactly the same way, including assignments, quizzes, exams, etc., final grades reveal little about student learning. A review of grade distribution is one way to review student success as defined by the North Carolina Community College System (NCCCS); however, it does not provide information about what students have learned about concepts deemed of value to faculty.

Assessment Scores: Assessment scores are collected in a consistent manner across sections of a course or courses, or at the conclusion of a service, for the purpose of assessing a population or a sample of students' performance in relation to a specific outcome. Assessment is about many students rather than just one student.

Target Benchmark: The target benchmark is the criterion for success. Benchmarks should be aggressive enough to move the program in the direction of its vision, yet realistic.
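As an illustration of how a target benchmark of the form "X% of students score X or higher on the assessment method" might be checked against collected scores, consider the following sketch. The scores, passing threshold, and target percentage below are hypothetical and not taken from any RCC assessment plan:

```python
# Illustrative sketch only: checking a target benchmark such as
# "70% of students score 80 or higher on the common final exam".
# All values here are invented for the example.

scores = [85, 92, 78, 81, 66, 90, 74, 88]  # one score per participating student
passing_score = 80      # minimum score counted toward the benchmark
target_percent = 70.0   # criterion for success from the assessment plan

met_count = sum(1 for s in scores if s >= passing_score)
percent_met = 100.0 * met_count / len(scores)

print(f"{met_count} of {len(scores)} students ({percent_met:.1f}%) scored {passing_score} or higher")
print("Benchmark MET" if percent_met >= target_percent else "Benchmark NOT MET")
```

Note that, consistent with the Results guidance in Section IV, only students who actually participated in the assessment are included in the denominator.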

Rubrics: A rubric is a coherent set of criteria for students' work that includes descriptions of levels of performance quality on the criteria. Rubrics are tools used to interpret and assess students' work against criteria and standards. A rubric outlines a range of assessment criteria and expected performance standards. Assessors evaluate a student's performance against all of these, rather than assigning a single subjective score. Program areas should develop rubrics specifically for the outcome indicator and not rely on the "grading rubric" in a particular class. A "grade" is between the student and the faculty member, and course-specific grading rubrics may not adequately assess the outcome.
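Because assessors score each artifact against every rubric criterion, results can be segmented criterion by criterion rather than reduced to a single score. A minimal sketch of that aggregation follows; the criterion names, scoring scale, and scores are invented for illustration:

```python
# Hypothetical sketch: averaging rubric scores per criterion so results can be
# reported criterion by criterion. Names, scale, and scores are invented.

rubric_scores = [  # one dict of criterion scores per scored student artifact
    {"thesis": 3, "evidence": 2, "organization": 4},
    {"thesis": 4, "evidence": 3, "organization": 3},
    {"thesis": 2, "evidence": 2, "organization": 3},
]

# Average score per criterion across all scored artifacts
averages = {
    criterion: sum(artifact[criterion] for artifact in rubric_scores) / len(rubric_scores)
    for criterion in rubric_scores[0]
}

for criterion, average in averages.items():
    print(f"{criterion}: average {average:.2f} across {len(rubric_scores)} artifacts")
```

Reporting each criterion separately shows where students struggle (for example, a low "evidence" average), which a single overall score would hide.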


Mid-Year Report (MYR): The academic program units must provide a brief, written narrative about how the strategies for improvement are working. This narrative will be referred to as the Mid-Year Report (MYR) (Appendix 7 – Implementation Summary Form Example). It will identify the outcome, the year it was assessed, the use of results from that year, and a brief description of the implementation. The MYR gives program areas a mid-year check-up: a time to reflect on what is working or not working with new strategies and how this might or might not lead to program improvements. The MYR will be completed after strategies for improvement have been implemented in the fall semester and uploaded into Compliance Assist by January 15th. The MYR will demonstrate the use of results from assessment to continuously make improvements in the program and student learning. The APU, MYR, and Budget Request Description documents in Compliance Assist will provide administrators the necessary data to make policy, procedure, and budget-related decisions that will impact student learning and institutional effectiveness. It is the responsibility of each department chair or program director, and the faculty, to develop appropriate program outcomes and student learning outcomes, select assessment measures, analyze assessment results, and identify appropriate actions to make program improvements.



IV. Academic Planning Unit Plans

Developing an Academic Planning Unit (APU) Plan for an Academic Program

For academic units, the purpose of outcomes assessment is to improve the educational programs offered and strengthen students' abilities within their chosen fields of study. Faculty are best suited to determine the intended educational outcomes of the courses taught, the academic program requirements, and the specialized skills students will need to advance. Faculty will determine how to assess these outcomes and how to use the results for program development and improvement. In turn, the information collected will be used to develop and improve academic programs, demonstrating that assessment efforts are leading to improvements in programs and the overall effectiveness of the institution. RCC uses a standardized process for implementing and reporting assessment activities that provides consistency and uniformity among the diverse educational programs. Educational programs use a template for assessment plans that includes the eight areas listed below:

1) Outcome Number – Use Arabic numeral(s) beginning with the number 1 and use the abbreviations AUAO, PO, or PSLO. For example, AUAO 1, AUAO 2, PSLO 1, PSLO 2

2) Outcome Title – Use key words or phrases describing the outcome

3) Outcome Description – Statement of the outcome: [PO] Graduates will..., [PSLO] Students will..., or [AUAO] Program faculty/department will.... Also provide the Strategic Plan number supported by this outcome.

4) Outcome Indicator/Assessment Method – Description of the means of assessment and the plan for data collection and analysis. A measure is a tool (or tools) used to determine whether you have met your expected outcome. To increase the likelihood of valid results, strive to use more than one measure for each outcome/objective when possible; two measures are required for PSLOs. If you are struggling to identify a measure, ask the following questions about your outcome/objective:

o How will we know if this is being accomplished?
o What will provide us this information?

5) Target Benchmark – Identify the criteria for success. Lead with: X% of students score X or higher on the (insert assessment method).

6) Results – Summary of the assessment data collected and its analysis:

o Determine if the Benchmark was Met or Not Met
o Identify the percentage and include the number of students who participated in the assessment method [don’t include scores from students who did not participate in the assessment method]
o Disaggregate by class type [traditional, hybrid or online]



o Where applicable, segment the data to document how well students performed on specific tasks. For example, if using a rubric, show the scores for each criterion
o Upload evidence to verify results. Examples of evidence include spreadsheets that summarize/tally student scores, certification exam scores, and any other documentation that supports the results
o Conduct item analysis
o Support files for the Results section should be named using the following naming architecture: ENG_111_0001_R1_FA17, MAT_143_0040_R2_FA17, etc.

7) Use of Results – Identify improvements in programs and services that resulted from data collection and analysis. The narrative should specifically describe how results will be used for improvement, such as:

• What strategy will be implemented?
• What change will take place as a result of the data analysis?
• How will you incorporate the previous year's results for trending analysis?
• Did you change something last year and have new results for the current year? If so, discuss the results for the two years.

Use the past tense when writing the use of results. For example: program faculty discussed, the rubric was revised, or a new assessment measure was developed. If benchmarks were met, conduct item analysis to “drill down” on a specific concept and identify a specific strategy for improvement. If all of the results meet or exceed expectations, it is time to consider a new assessment method, review the target benchmark, or develop a new outcome.

8) Budget Request Description – Request of specific funds using the results of assessment to determine the need. Examples include, but are not limited to, requests for personnel, equipment, professional development and facilities space allocation or renovation. Note: Budget requests must be based on data analysis of assessed outcomes.
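To make the Results reporting concrete, here is a minimal sketch of the arithmetic a program’s data manager might run (or mirror in an Excel workbook). The student scores, rubric criteria, passing score of 12/16, and 70% benchmark are all hypothetical; Compliance Assist itself is not scriptable here, so this only illustrates the Met/Not Met determination, the class-type disaggregation, and the per-criterion segmentation described above.

```python
# Hypothetical student scores for one PSLO, assessed with a 4-criterion rubric.
# Criterion names, passing total, and benchmark are invented for illustration.
scores = [
    # (class_type, {criterion: points out of 4})
    ("traditional", {"thesis": 3, "evidence": 2, "organization": 4, "mechanics": 3}),
    ("traditional", {"thesis": 4, "evidence": 3, "organization": 3, "mechanics": 4}),
    ("hybrid",      {"thesis": 2, "evidence": 2, "organization": 3, "mechanics": 2}),
    ("online",      {"thesis": 3, "evidence": 1, "organization": 2, "mechanics": 3}),
    ("online",      {"thesis": 4, "evidence": 3, "organization": 4, "mechanics": 4}),
]

PASSING_TOTAL = 12   # a student "meets" the outcome at 12 of 16 rubric points
BENCHMARK_PCT = 70   # hypothetical target: 70% of students score 12 or higher

def summarize(rows):
    """Return (percent of students meeting the outcome, n participating)."""
    met = sum(1 for _, rubric in rows if sum(rubric.values()) >= PASSING_TOTAL)
    return round(100 * met / len(rows), 1), len(rows)

overall_pct, n = summarize(scores)
status = "Met" if overall_pct >= BENCHMARK_PCT else "Not Met"
print(f"Overall: {overall_pct}% of {n} students ({status})")

# Disaggregate by class type [traditional, hybrid or online]
for class_type in ("traditional", "hybrid", "online"):
    rows = [r for r in scores if r[0] == class_type]
    if rows:
        pct, count = summarize(rows)
        print(f"  {class_type}: {pct}% of {count}")

# Segment by rubric criterion (item analysis): mean points per criterion
for criterion in scores[0][1]:
    mean = sum(rubric[criterion] for _, rubric in scores) / len(scores)
    print(f"  criterion {criterion}: mean {mean:.2f}/4")
```

In practice the same tallies would come out of the program’s Excel workbook; the point is that the Results narrative should report the Met/Not Met call, the number of participating students, the class-type breakdown, and the per-criterion pattern that guides the Use of Results.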

The APU is entered electronically in Compliance Assist. Compliance Assist is the College’s official database of all planning and accreditation activities. Reports can be generated from Compliance Assist in Word, PDF, or spreadsheet format if printed copies are needed. The Academic Planning Unit (APU) is the academic program level planning model and part of the RCC Five-Year Program Review process.


The following section outlines seven specific steps to build an Academic Planning Unit.

Step 1 - Write or Modify the Program’s Purpose Statement (Academic Planning Unit)

The purpose statement describes why the program exists. When creating or modifying a program purpose statement, it is important to refer to current publications. Program faculty should refer to the North Carolina Community College System (NCCCS) curriculum standards (http://www.nccommunitycolleges.edu) and the College catalog (http://rockinghamcc.edu) for this information. The purpose statement should be no more than two to four sentences. In composing the purpose statement, consider the following questions.

• Whom do you serve?
• What services and/or programs do you provide?
• What are the primary functions of the educational program or department?
• What are your core activities?

Step 2 - Write or Modify the Expected Outcomes (Academic Planning Unit)

The number of outcomes required for an Academic Planning Unit depends on the degree earned [Associate, Diploma or Certificate].

A. AUAO (Administrative Outcomes for an Academic Unit)
Academic unit administrative outcomes are about the academic program itself or the faculty who teach in that program. AUAOs will provide an academic program with knowledge regarding other indicators of program effectiveness and relevance. Academic units are encouraged to create administrative outcomes but to limit the number to just one or two each academic year, and each AUAO should be measured by a minimum of one means of assessment. This recommendation is motivated by the desire to keep the volume of assessment for academic programs manageable, meaningful, and focused on student learning.

B. PO (Program Outcomes for an Academic Unit)
POs describe what service providers expect students (graduates) to achieve once they complete a program or receive services. What will graduates be able to do after completing the program? Each program should identify one to three program outcomes to assess each academic year, and each PO should be measured by a minimum of one means of assessment. Be sure that the outcomes are appropriate for the specific area.

C. PSLO (Program Student Learning Outcomes for an Academic Unit)
Program student learning outcomes represent the specific skills and knowledge necessary for a student to be successful as they progress through the program. Student learning is about what actions students will take based on the experiences provided to them in the classroom.


It would be difficult and counterproductive to prescribe the number of program student learning outcomes a diverse group of programs should measure. For assessment to be meaningful, it must be valued by and useful to the faculty making decisions about the curricula of the program. It is important to note that programs can suffer from “analysis paralysis”: a lot of data is gathered, but there is no time to reflect on or use the results effectively. For consistency's sake, all educational programs will assess one to four program student learning outcomes each academic year, and each PSLO should be measured by a minimum of two means of assessment.

These outcomes [AUAO, PO and PSLO] will be entered into Compliance Assist and referred to as the Academic Planning Unit (Appendix 1 – APU Examples). The completion of the APU is time sensitive. The APU should be completed, with input from faculty, by May of the year the plan is intended to assess, and faculty participation in the development of the current APU should be documented in supporting meeting minutes that will also be loaded into Compliance Assist. It is critical that the APU is completed in time for faculty to include any selection for outcomes assessment on course syllabi. Faculty should review the APUs in August, upon their return for the next academic year, to review the strategies for improvement and begin the discussion for the next academic year’s APU. APUs are not intended to be static: any portion of an APU can change from academic year to academic year, and change based on evidence is expected to occur. The Academic Planning Unit (APU) entered into Compliance Assist should be used to record all the outcomes developed or modified for assessment for a given academic year, along with the Assessment Methods and Target Benchmarks, and must be finalized in Compliance Assist by August 30th of the assessment year.

To summarize, an APU should consist of the following:

Associate degree programs should consist of 6 to 9 outcomes

• 1 to 2 Administrative Outcomes with a minimum of 1 Outcome Indicator/Assessment Method per outcome
• 2 to 3 Program Outcomes with a minimum of 1 Outcome Indicator/Assessment Method per outcome; and
• 3 to 4 Program Student Learning Outcomes with a minimum of 2 Outcome Indicators/Assessment Methods per outcome

Diploma programs should consist of 4 to 6 outcomes

• 1 Administrative Outcome with a minimum of 1 Outcome Indicator/Assessment Method per outcome
• 1 to 2 Program Outcomes with a minimum of 1 Outcome Indicator/Assessment Method per outcome; and
• 2 to 3 Program Student Learning Outcomes with a minimum of 2 Outcome Indicators/Assessment Methods per outcome


Certificate programs should consist of a minimum of 3 outcomes

• 1 Administrative Outcome with a minimum of 1 Outcome Indicator/Assessment Method per outcome
• 1 Program Outcome with a minimum of 1 Outcome Indicator/Assessment Method per outcome
• 1 Program Student Learning Outcome with a minimum of 2 Outcome Indicators/Assessment Methods per outcome
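The outcome counts above amount to a small rule table. As a sketch only (the dictionary layout and check function are our own illustration, not part of the RCC process), an APU’s composition could be verified like this:

```python
# Required APU outcome counts per credential, taken from the guide above.
# (lo, hi) per outcome type; hi=None where the guide states only a minimum
# (certificate programs "consist of a minimum of 3 outcomes").
APU_RULES = {
    "associate":   {"AUAO": (1, 2),    "PO": (2, 3),    "PSLO": (3, 4)},
    "diploma":     {"AUAO": (1, 1),    "PO": (1, 2),    "PSLO": (2, 3)},
    "certificate": {"AUAO": (1, None), "PO": (1, None), "PSLO": (1, None)},
}
MIN_MEASURES = {"AUAO": 1, "PO": 1, "PSLO": 2}  # assessment methods per outcome

def check_apu(credential, outcomes):
    """outcomes: list of (outcome_type, number_of_assessment_methods).
    Returns a list of problems; an empty list means the composition fits."""
    problems = []
    for otype, (lo, hi) in APU_RULES[credential].items():
        n = sum(1 for t, _ in outcomes if t == otype)
        if n < lo or (hi is not None and n > hi):
            hi_txt = str(hi) if hi is not None else "no max"
            problems.append(f"{otype}: have {n}, need {lo} to {hi_txt}")
    for otype, measures in outcomes:
        if measures < MIN_MEASURES[otype]:
            problems.append(f"{otype} outcome needs at least "
                            f"{MIN_MEASURES[otype]} measure(s)")
    return problems

# An associate-degree APU: 1 AUAO, 2 POs, 3 PSLOs (each PSLO with 2 measures)
apu = [("AUAO", 1), ("PO", 1), ("PO", 1), ("PSLO", 2), ("PSLO", 2), ("PSLO", 2)]
print(check_apu("associate", apu))  # empty list: the composition fits the rules
```

The annual APU audit effectively performs this same check by hand when reviewing each plan against the counts above.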

Step 3 - Develop the Assessment Method and Target Benchmark (Academic Planning Unit)

The Assessment Method and Target Benchmark is a statement that identifies the means and the criteria for success used to measure outcomes. For each outcome, identify the assessment tool and a benchmark to measure achievement. Assessment tools include, but are not limited to, surveys, embedded exam questions, common exams, national exams, tracking forms, electronic tracking systems, and rubrics. Benchmarks, or criteria for success, are quantitative: counts, ratios, proportions, frequencies, scores, etc. Consider prior results of assessment and justify benchmarks according to past performance and expectations of skill development. As a rule of thumb, benchmarks should be rigorous yet realistic; when achievement levels are set too low or unrealistically high, it is very difficult to use assessment data to improve programs. It is important to drill down into the assessment tool, look for patterns of strengths and weaknesses, and identify opportunities for improvement. This practice can provide useful information on which areas need improvement even when benchmarks are met. Item analysis of a rubric or a series of questions can accomplish this easily (Appendix 2 – Item Analysis Example). The purpose of assessment is to make improvements in programs even when the target benchmarks are met.

Step 4 - Implement the Plan and Conduct Assessment Activities (Academic Planning Unit)

Once the assessment plan has been formulated and entered into Compliance Assist, it must be put into action, and data must be collected to measure the identified outcomes for the academic year. Each program director or department chair should designate someone to collect the data; this person will be the program’s data manager. For purposes of continuity, each program area should report the results of outcome assessment using Excel workbooks when appropriate. Some examples have been provided in the appendix (Appendix 3 – Excel Workbook Example Academic Unit); the format may need to be altered to accommodate the scores being recorded. It is important to include the raw data, in addition to summarizations and analysis, inside Compliance Assist, but remove students’ names from all files. Compliance Assist is the official record of the college’s planning activities, and there must be evidence of assessment. How often and how much assessment data is collected is at the discretion of the program, as long as the program adheres to the institutional planning timeline. Data collection needs to help the program tell its story,


and the timelines for collection need to be meaningful and valued by the program. The key to effective assessment is being able to act and make decisions based on results aimed at planning unit improvements. In addition to the collection of data, units should demonstrate faculty involvement in the planning process. An outcomes committee should be formed within each department. The purpose of the outcomes committee is to assure that faculty are involved in the assessment process and that the process is ongoing. The committee members will be the opinion leaders for the departments or programs, and from within the program they will facilitate discussions with faculty, identifying what the faculty value regarding the benefits of the program and the expectations of students. Academic programs with small numbers of faculty may recruit faculty from other departments to serve on outcomes committees. For example, the Surgical Technology program may recruit biology or social/behavioral science faculty as members, or Public Service Technology programs like Criminal Justice Technology and Cosmetology may elect to serve on each other’s outcomes committees. The minutes of committee meetings (Appendix 4 – Outcomes Committee Meeting Minutes Example) will follow the minutes template format on the Committees page of the RCC Employee Portal. Minutes will outline membership and data collection responsibilities, and these minutes will be filed electronically in Compliance Assist, providing evidence that faculty are involved in the planning process.

Step 5 - Document and Analyze Results: Summarize Assessment Data Collected (Academic Planning Unit)

Once the data manager for the program area has compiled the results, the department or program faculty should convene to review the results and draw any inferences the data might reveal about how to move forward with improvements in the program. Remember, assessment is about aggregate results, not individual student results, and steps must be taken to hide the identity of students. The scores from specific faculty sections may be noted when necessary for program improvement and utilized as a tool to improve program effectiveness. The aggregate data recording forms for each outcome assessed will be uploaded into Compliance Assist as the official college record, along with the departmental meeting minutes (Appendix 5 – Departmental Meeting Minutes Example) demonstrating robust faculty input into how those results will be used for program improvements. Results, intended Use of Results, and departmental minutes should be recorded and/or filed in Compliance Assist prior to Spring commencement of the assessment year. For programs that have a summer component, the documentation should be in Compliance Assist by August 1 of the assessment year. For health science programs that are awaiting certification exam results, all other outcome data should be recorded by August 1. The documentation should be organized in a way that anyone can review the files and formulate an idea of how assessment was planned, implemented, and reviewed in the program area. The ability to review files on a regular basis will be important for future peer review, program accreditation, and departmental or administrative review to ascertain program progress over a period of time.

Step 6 - Identifying Needed Improvements (Academic Planning Unit)


Collecting and evaluating viable data will show whether program outcomes are being met, whether new measures are needed to address any unmet outcomes, and how to make data-based improvements in curriculum, instruction, and methodology. In Step 5, program faculty convened and discussed specific strategies for improvement as a result of data analysis. The strategies identified must be clearly stated in the “Use of Results” section of the APU. It is wise to think and plan ahead; this is the time to be purposeful about which interventions would create program improvements. A concise and clear account of how the current results will be implemented in the future is entered as the Use of Results. It is important to keep changes based upon assessment realistic. Do not create a long list of vague strategies like “improve rubric” or “improve writing assignment”; pick one or two meaningful strategies and develop them. If faculty are going to convene to improve assignments, rubrics, instructional approaches, and/or classroom policies, then it really needs to happen! If identified strategies require monetary resources, then that must be reflected in Compliance Assist. The Budget Request field in Compliance Assist allows for a brief description of the request, the amount, and the type of request being made by the program. In addition, a Budget Justification Form (BJF) must be completed and uploaded as supporting documentation for the budget request (Appendix 6 – Budget Justification Form). Completing budget requests and justification forms will assure that the College is allocating funds based upon program-level planning. The budget request documentation will be uploaded into the electronic planning platform by August 1, and the prioritization of equipment funds will depend upon the completion of the request and justification form.

Step 7 - Closing the Loop and Modifying the Outcomes Assessment Plan (Academic Planning Unit)

The Academic Planning Unit has been completed, but the unit must be thinking ahead to the next academic year. Outcomes and measures of assessment for the next assessment cycle must be entered into Compliance Assist by September 1.

Timeline for Academic Planning Units

September 01 – All planning units for the new planning cycle are complete and in Compliance Assist. There are two things to consider: What strategies identified in the previous APU must be reflected in the current APU? Are there identified improvements that will be implemented but that will not be recorded on the current APU?

January 15 – A Mid-Year Implementation Summary will be submitted and uploaded into Compliance Assist summarizing fall implementation strategies.


May 15 – All planning units’ Results and Use of Results will be completed and in Compliance Assist.

July 30 – Academic Planning Unit audits are completed.

August 30 – All prior year Academic Planning Units are completed and in Compliance Assist.

Mid-year, the academic program units must provide a brief written narrative about how the strategies for improvement are working. This narrative will be referred to as the Implementation Summary (IS) (Appendix 7 – Implementation Summary Form Example). It will identify the outcome, the year it was assessed, the use of results from that year, and a brief description of the implementation. The IS will give program areas a mid-year check-up: a time to reflect on what is or is not working with new strategies and how this might or might not lead to program improvements. The IS will be completed after strategies for improvement have been implemented in the fall semester and uploaded into Compliance Assist by January 15th. The IS will demonstrate the use of results from assessment to continuously make improvements in the program and student learning. The APU, IS, and BJF documents will provide administrators the necessary data to make policy, procedure, and budget related decisions that will impact student learning and institutional effectiveness. It is the responsibility of each department chair or program director, and the faculty, to develop appropriate program outcomes and student learning outcomes, select assessment measures, analyze assessment results, and identify appropriate actions to make program improvements.


Support/Administrative Unit Plans


V. Support/Administrative Unit Plans

Developing an Administrative/Support Planning Unit (ASPU) for an Administrative/Support Program

The assessment of outcomes must be embraced by all administrative/support units, and all support units should be engaged in outcomes assessment. The purpose of outcomes assessment is to improve the services offered and students’ abilities. Input from all of the unit’s staff is critical to the development of a successful outcomes assessment plan. The staff members, working as a team, are best suited to determine the intended outcomes of the services they provide to students. Therefore, the staff will determine how to assess these outcomes and how to use the results for program or departmental development and improvement. The results of outcomes assessment must be used to develop and improve services and the delivery of those services, demonstrating that assessment efforts are leading to improvements. Administrators must provide leadership and accountability to the process. Outcomes are future oriented and measurable. Units need to ask, “What are the benefits to students, staff, faculty, and the institution if the support unit fulfills its function successfully?” It is important that the unit focuses on key areas that are meaningful and valued by the staff working in that unit. RCC uses a standardized process for implementing and reporting assessment activities that provides consistency and uniformity among the administrative/support units, which can all be quite different from one another. Each support unit will create a planning unit using a template for assessment plans that includes the eight areas listed below:

1) Outcome Number – Use Arabic numeral(s) beginning with the number 1 and use the abbreviation AO (Administrative Outcome)

2) Outcome Title – Use key words or phrases describing the outcome

3) Outcome Description – Statement of the outcome: [AO] The Department will.... Also provide the Strategic Plan number supported by this outcome.

4) Outcome Indicator/Assessment Method – Description of the means of assessment and the plan for data collection and analysis. A measure is a tool (or tools) used to determine whether you have met your expected outcome. To increase the likelihood of valid results, strive to use more than one measure for each outcome/objective when possible. If you are struggling to identify a measure, ask the following questions about your outcome/objective:

o How will we know if this is being accomplished?
o What will provide us this information?

5) Target Benchmark – Identify the criteria for success. Lead with: X% of students score X or higher on the (insert assessment method).


6) Results – Summary of the assessment data collected and its analysis:

o Determine if the Benchmark was Met or Not Met
o Identify the percentage and include the number of students who participated in the assessment method [don’t include scores from students who did not participate in the assessment method]
o Disaggregate by class type [traditional, hybrid or online]
o Where applicable, segment the data to document how well students performed on specific tasks. For example, if using a rubric, show the scores for each criterion
o Upload evidence to verify results. Examples of evidence include spreadsheets that summarize/tally student scores, certification exam scores, and any other documentation that supports the results
o Conduct item analysis

7) Use of Results – Identify improvements in programs and services that resulted from data collection and analysis. The narrative should specifically describe how results will be used for improvement, such as:

• What strategy will be implemented?
• What change will take place as a result of the data analysis?
• How will you incorporate the previous year's results for trending analysis?
• Did you change something last year and have new results for the current year? If so, discuss the results for the two years.

Use the past tense when writing the use of results. For example: program faculty discussed, the rubric was revised, or a new assessment measure was developed. If benchmarks were met, conduct item analysis to “drill down” on a specific concept and identify a specific strategy for improvement. If all of the results meet or exceed expectations, it is time to consider a new assessment method, review the target benchmark, or develop a new outcome.

8) Budget Request Description – Request of specific funds using the results of assessment to determine the need. Examples include, but are not limited to, requests for personnel, equipment, professional development, and facilities space allocation or renovation. Note: Budget requests must be based on data analysis of assessed outcomes.

The ASPU is entered electronically in Compliance Assist. Compliance Assist is the College’s official database of all planning and accreditation activities. Reports can be generated from Compliance Assist in Word, PDF, or spreadsheet format if printed copies are needed. The Administrative/Support Planning Unit (ASPU) is the administrative unit level planning model and part of the program review process.

Support/Administrative Unit Plans Planning Process Timeline

July 30 – Planning Unit audits are completed


August 30 – All Planning Units are completed and in Compliance Assist

September 01 – All planning units for the new planning cycle are complete and in Compliance Assist

January 15 – Mid-year Implementation Summary is completed by each unit and uploaded into Compliance Assist, denoting progress of the changes that took place because of the previous year’s assessment results

May 15 – All planning units’ Results and Use of Results completed and in Compliance Assist

July 30 – Planning Unit audits are completed

August 30 – All Planning Units are completed and in Compliance Assist

**********************************************************************************************

August 30 – Needs identified through assessment, data, and internal audit summary are reported to the Executive Staff

September 15 – All units have Outcomes and Measures of Assessment/Criteria of Success in CA for the current year

September 30 – Internal Audit Planning Process Progress Report is submitted

September 30 – Budgets are released to units, and the needs approved for purchase by the Executive Staff are communicated to units

October 31 – Institutional Data Snapshots

January 15 – Implementation Summary is completed by each unit and uploaded into CA, denoting progress of the changes that took place because of the previous year’s assessment results

January 30 – Internal Audit Planning Process Progress Report is submitted

March/April – Community College Survey of Student Engagement (CCSSE, CCFSSE). Administered in odd years.

March/April – Collegiate Assessment of Academic Proficiency (CAAP) administrations. Administered in even years. (Adoption is still pending)

March/April – General Education Outcomes Assessment Report is completed

March 31 – Institutional Data Snapshots

May 31 – All academic and support units should have Results and Use of Results in Compliance Assist (CA). That year’s Outcome Assessment Plan (OAP) should be complete. Budget Request Form is completed and filed in CA

June 5 – Internal Audit Planning Process Progress Report is submitted

June 5 – OAP audits begin

July 31 – Institutional Data Snapshots

August – CCSSE, CCFSSE, CAAP, and other survey/standardized institutional collection results are reported to the campus community


Appendix 1 Academic Planning Unit Examples

Example 1: Automotive Technology

Purpose: The Automotive Technology program prepares students to apply technical knowledge and skills to repair, service, and maintain all types of automobiles. The curriculum includes instruction in brake systems, electrical systems, engine performance, engine repair, suspension and steering, automatic and manual transmissions and drive trains, and heating and air conditioning systems.

AO 1: Current Practice
Program faculty will maintain currency and relevancy in the field.
Outcome Indicator/Assessment Method: Faculty will participate in local return-to-industry opportunities.
Target/Benchmark: All full-time faculty will document a minimum of twenty hours spent in local industries directly related to their discipline.
Results: Not met. One faculty member did not meet the benchmark due to scheduling difficulties.
Use of Results: The Technology Department Chair will facilitate scheduling to ensure class coverage so that all faculty can participate in return-to-industry initiatives.
Budget Request Description: Adjunct salary for one week to cover classes.

PO 1: Job Performance
Employers of Automotive Technology employment-ready students and graduates will be satisfied with the students' job performance according to entry-level industry standards. **This outcome is required annually by NATEF (National Automotive Technicians Education Foundation) for program accreditation.**
Outcome Indicator/Assessment Method: A web-based survey using Survey Monkey will be conducted using questions approved by the Automotive Advisory Board. Local employers will take the survey within the first six months after the student's completion of the program or hire date.
Target/Benchmark: 90% of responding employers will report on a survey that they are satisfied with the entry-level job performance of the automotive student(s) according to industry standards.
Results: Not met. 80% (12/15) of employers were satisfied.
Use of Results: Survey results indicate that while employers are satisfied with job skills, graduates need improvement in soft skills such as dependability and interpersonal communication. WBL 101 will be recommended to the Curriculum Committee for inclusion in the AAS degree.
Budget Request Description:


PO 2: Job Skills Certification
Employment-ready students and graduates of the Automotive Program will obtain official ASE Student Recognition certification through ASE/NATEF for one or more of the eight core module areas of study: Engine Repair, Automatic Transmission & Transaxle, Manual Drivetrain & Axles, Suspension & Steering, Brakes, Electrical & Electronics, Heating & Air Conditioning, and Engine Performance.
Outcome Indicator/Assessment Method: The ASE-administered, web-based program exit exam will be administered in the RCC Testing Center.
Target/Benchmark: 75% of the students who take the ASE/NATEF "End of Program" exam will achieve the ASE Student Recognition Certification for each module.
Results: Met. 75% (9/12) achieved the certification for each module.
Use of Results: While the benchmark was met, section analysis indicated that students were weakest in the areas of heating and air conditioning and electrical/electronics. Faculty will conduct two field trips to area automotive businesses and invite guest speakers from other businesses whose specialties are in these two areas.
Budget Request Description:

PSLO 1: Brakes
Students will demonstrate proper disassembly and reassembly of the brake system.
Outcome Indicator/Assessment Method:
AM1: Students in AUT 151 will be assessed using ten embedded test questions related to assembly and disassembly of brake systems. Answering eight or more questions correctly is the benchmark.
AM2: Students in AUT 251 will demonstrate disassembly and reassembly of brake system components, scoring 3 or higher on each section of a 4-point rubric.
Target/Benchmark: 85% of students will meet the benchmark.
Results: Met. Overall, 85% (23/27) met the benchmark. Specifically, 80% (12/15) met the benchmark in AUT 151; 92% (11/12) met the benchmark in AUT 251.
Use of Results: Automotive faculty discussed the results and determined that students needed more opportunities to observe such demonstrations in AUT 151. Item analysis revealed that students struggle with assembly to a greater extent. A project involving assembly will be assigned. In addition, faculty concluded that a video students could watch repeatedly would benefit them. Faculty will create a video to be uploaded into Moodle. The same video will also be available in AUT 251.
Budget Request Description: A video camera, estimated cost $500, will be needed to facilitate creating this and other videos for the automotive program.

PSLO 2: Transmissions
Students will identify the components of an automatic transmission.
Outcome Indicator/Assessment Method:
AM1: Common embedded test questions will be administered to AUT 141 students.
AM2: A practical exam will be administered in AUT 241.
Target/Benchmark: 80% of AUT 141 students will identify all components correctly on a written exam. 90% of AUT 241 students will identify each component correctly on a practical exam.
Results: Not met. 53% (8/15) correctly identified all components in AUT 141.


Use of Results: Automotive faculty discussed the results and determined that matching questions did not fully reveal student knowledge. Upcoming tests will provide more challenge by using fill-in-the-blank questions instead.
Budget Request Description:


Example 2: Radiography

Purpose: The purpose of the Radiography Program is to prepare students to be radiographers. The program provides the professional knowledge and technical skills needed to deliver excellent patient care in a diverse community.

Outcome Number: AO1
Outcome Title: Recruitment
Outcome Description: Program representatives will attend career fairs at area schools and conduct program information sessions at the College to aid in recruitment efforts.
Outcome Indicator/Assessment Method: Sign-in sheets from career fairs (or other external recruiting activities) and information sessions will document programmatic participation; student satisfaction/interest surveys will determine satisfaction with the information sessions and the level of interest in the program.
Target Benchmark:
A. A minimum of four career fair sign-in sheets indicate that faculty and/or students engaged in recruitment activity.
B. A minimum of four Program Information sign-in sheets indicate that program faculty conducted four information sessions.
C. 75% of Radiography Information Session attendees indicate satisfaction (3 or higher on a Likert scale) with knowledge gained about the application process.
Results:
A. Measure Not Met. Radiography Program faculty participated in two off-campus recruiting activities; sign-in sheets for the Annie Penn Hospital and Reidsville High School career fairs uploaded.
B. Measure Met. Four Radiography Program information sessions were conducted; sign-in sheets for two information sessions in the fall and spring uploaded.
C. Measure Met. 100% of students (68 of 68) indicated satisfaction at 3 or higher on a Likert scale with knowledge gained and confidence in the application process for Radiography.

Upload Attachments in Compliance Assist

Use of Results:
A. The Radiography Clinical Coordinator participated in two recruitment activities: an invitation from the Annie Penn Career Development Director to discuss health careers with non-clinical staff, and the career fair at Reidsville High School.
B. The Clinical Coordinator learned that all RCC programs could attend the Annie Penn Career Development Day. The premise of the event is to make all staff aware of continuing education opportunities available in the county. The Clinical Coordinator met the CTE Coordinator at Reidsville High School and will plan class visits with students enrolled in Health Careers I and II. The Clinical Coordinator will contact the CTE Coordinator at each high school and offer class visits to discuss Radiography and other careers available in the health care environment.
C. Participation in a program information session is a requirement for all health science programs effective 2017-2018. This administrative outcome will no longer be assessed.
Budget Request Description: None

Outcome Number: PO 1
Outcome Title: Knowledge
Outcome Description: Graduates [operationally defined as students enrolled in the 5th semester of the program] will possess the knowledge to become Registered Radiologic Technologists.
Outcome Indicator/Assessment Method: Corectec testing, a mock radiography national registry exam, is administered in the last semester [RAD 271] of the program; scores indicate readiness for the ARRT National Registry Exam.
Target Benchmark:
A. 75% of graduates score 75 or higher on the Corectec Mock Registry exam on their first attempt.
B. 75% of graduates score 75 or higher on the ARRT National Registry exam on their first attempt.
Results:
A. Measure Not Met. 67% (6 of 9 graduates) scored 75 or higher on the Corectec Mock Registry exam on the first attempt.
B. Measure Not Met. 67% (6 of 9 graduates) scored 75 or higher on the ARRT National Registry exam on their first attempt.

Upload Attachments in Compliance Assist

ARRT__Passing_Score_Report__sample
PO1_MOA1 MOCK EXAM SP 2016
PO1_MOA2 ARRT EXAM SP 2015

Use of Results:
A. The three students who did not pass the Corectec Mock Registry exam did not pass the ARRT National Registry exam. For the next cohort, the mock registry exam will be administered two weeks before the end of the fifth semester. Students who are unsuccessful will be required to spend 10 hours per week in the computer lab reviewing missed questions and will retake the Mock Registry exam on the last day of the semester.
B. Upon review of the six students who passed the ARRT National Registry exam on the first attempt, program faculty learned through item analysis that the sections on Equipment Operation and Image Acquisition had the most missed questions. For the next academic year, the instructional modules and assessments for those modules in RAD 271 will be revised to more accurately reflect the registry exam.
Budget Request Description: $500 to purchase additional test attempts for the Corectec Mock Registry exam.


Outcome Number: PO 2
Outcome Title: Career Preparation
Outcome Description: Graduates [operationally defined as students enrolled in the 5th semester of the program] will exhibit professional behaviors [interview and cover letter/resume] in preparation for employment as a Registered Radiographer.
Outcome Indicator/Assessment Method: JRCERT employer survey for professional behaviors; rubric analysis to assess professional behaviors [professional attire and responses to interview questions] during mock interviews in RAD 271; departmental rubric for cover letter and resume creation to assess the quality of students' cover letters and resumes.
Target Benchmark:
A. 100% of students indicate 70% or higher [7 of 10 responses], equivalent to Professional or Highly Professional, on the professional behavior rubric analysis.
B. 100% of students indicate 70% or higher [7 of 10 responses], equivalent to Professional or Highly Professional, on professional presentation of a cover letter and resume during mock interviews.
Results:
A. Measure Not Met. 78% (7 of 9 graduates) scored 70 or higher on professional behaviors.
B. Measure Not Met. 78% (7 of 9 graduates) scored 70 or higher on professional behaviors.

Upload Attachments in Compliance Assist

PO1_MOA1_SP 2016
PO1_MOA2_SP 2016
RAD 271 Professional Skills Rating Chart

Use of Results: Program faculty discussed the need for students to mock interview with individuals unfamiliar to them. Program faculty will request that Medical Imaging Managers at one or more program clinical sites conduct interviews, complete the rubric analysis for professional behaviors, and assess the written documentation required for an interview. This will provide a more realistic interview experience. Also, program faculty will organize a meeting with the institution's Human Resource Development [HRD] Director to conduct the Working Smart modules before the mock interview(s) with the Medical Imaging Managers.
Budget Request Description: $500 to pay for the HRD class

Outcome Number: PSLO1
Outcome Title: Clinical Competence
Outcome Description: Students will be clinically competent by utilizing effective positioning skills, setting appropriate technical factors, and practicing radiation protection and safety.
Outcome Indicator/Assessment Method:
A. Clinical competency forms in RAD 161 for "Positioning Skills" (Item #5) and "Technique Manipulation" (Item #8)


B. Student evaluation forms in RAD 161, Item #8 (Student attempted to collimate on exams) and Item #9 (Student demonstrated proper radiation protection techniques through the use of lead shielding devices)
Target Benchmark:
A. 100% of student scores indicate 1.80 or higher on a 2-point scale for the following exams: chest routine, KUB, hand, knee, and foot.
B. 100% of student scores indicate 3.3 or greater on a 4-point scale for Items 8 and 9.
Results:
A. Measure Not Met. 86% of students (6 of 7) met the benchmark for each of the criteria, Positioning Skills and Technique Manipulation.
B. Measure Not Met. 71% of students (5 of 7) met this benchmark.

Upload Attachments in Compliance Assist

PSLO1_MOA1_Positioning_RAD161_SP16
PSLO1_MOA1_Technique_RAD161_SP16
PSLO1_MOA2_RAD 161 Spring 2016

Use of Results:
A. Item analysis of the competency check-off sheet for the selected exams indicated inconsistent assessment by clinical preceptors. Program faculty will provide professional development for the clinical preceptors and will conduct at least one observation of each clinical preceptor using the competency check-off sheet for selected exams.
B. Program faculty believe students understand the importance of providing radiation protection to all patients; however, students do not always transfer good habits to the clinical environment because of inconsistent practice by staff technologists. Program faculty will share this outcome data, along with student focus group comments that outline the inconsistent practice in the clinical facility, at the next Radiography PAC meeting.
Budget Request Description: None

Outcome Number: PSLO2
Outcome Title: Critical Thinking
Outcome Description: Students will use critical thinking to draw conclusions through radiographic critique, and use problem-solving skills to generate possibilities through quantitative reasoning.
Outcome Indicator/Assessment Method:
A. Radiographic formula exam in RAD 121 [15% Rule, calculation of mAs, Density Maintenance formula, Inverse Square Law, grid conversion, and screen speed]
B. Radiographic critique of 5 images (chest, KUB, hand, knee, odontoid) in RAD 245
Target Benchmark:
A. 100% of students score 90 or higher on the formula exam.
B. 100% of students score 80 or higher on the radiographic critique.
Results:
A. Measure Not Met.


43% of students (3 of 7) met the benchmark.
B. Measure Not Met. 78% of students (7 of 9) met the benchmark.

Upload Attachments in Compliance Assist

PSLO2_MOA1_RAD121_SP2016
PSLO2_MOA2_SP16_RAD245

Use of Results:
A. Program faculty discussed the need to integrate math conversions in all RAD courses with a laboratory component. Laboratory analysis documentation will be revised to include radiographic formula calculations and interpretations.
B. Program faculty elected to assess students orally and in written form. Students were able to verbally critique 5 images (chest, abdomen, skull, knee, odontoid) for accuracy in positioning, centering, technical factors, collimation, marker placement, etc. However, when asked to write a critique for each image, program faculty learned that some students had difficulty reporting their critiques of the images in written form. When image critique is introduced during the first year, program faculty will require both oral and written assessment.
Budget Request Description: None

Outcome Number: PSLO3
Outcome Title: Communication
Outcome Description: Students will employ effective oral and written communication skills.
Outcome Indicator/Assessment Method:
A. Final exam, Patient Assessment/AIDET section, in RAD 112
B. Case study research paper/oral presentation in RAD 211
Target Benchmark:
A. 100% of students score 8 or higher on a 10-point scale for 10 items specific to AIDET.
B. 100% of students score 16 or higher on a 20-point scale.
Results:
A. Measure Met. 100% of students (7 of 7) scored 8 or higher.
B. Measure Met. 100% of students (9 of 9) scored 16 or higher.

Upload Attachments in Compliance Assist

PSLO3_MOA1_SP2016
PSLO3_MOA2 RAD 211 ORAL PRESENTATION Fall 2015


PSLO3_MOA2 RAD 211 Written Research Fall 2015

Use of Results:
A. Benchmarks were met; however, item analysis of the 10 criteria revealed that 50% of the class failed to estimate the duration of the exam for the patient (the "D" in AIDET). Program faculty will conduct this assessment four times, in two courses each semester. Program faculty believe the repetitive practice of AIDET will benefit students in the clinical environment.
B. Faculty did not use a rubric to score the oral and written assessment. Program faculty will collaborate with the Department Chair of English to develop a rubric.
Budget Request Description: None


Appendix 2 Item Analysis Example

[Figure: Bar chart, "Number Missed for Each of the 30 PSY 150 Common Exam Questions (N = 96 Exam Takers)."]


Appendix 3 Excel Workbook Example: Academic Unit ENG 112 Assessment Worksheets

Appendix 4 – Outcomes Committee Meeting Minutes Example

Appendix 5 – Outcomes Committee Meeting Minutes Example


Appendix 6 – Budget Justification Form


Appendix 7 – Implementation Summary

