ABHES PROGRAM EFFECTIVENESS PLAN (PEP) MANUAL



Updated January 2012

TABLE OF CONTENTS

The Purpose of the Program Effectiveness Plan (PEP)
Developing, Implementing, and Monitoring the Program Effectiveness Plan
Format and Content Guidelines

Subsection 1 – Program Effectiveness Plan Content
  Title/Cover Page
  a. Student Population
  b. Program Objectives
  c. Program Retention Rates
  d. Program Job Placement Rates
  e. Credentialing Examination Participation Rates
  f. Credentialing Examination Pass Rates
  g. Program Assessment
  h. Student, Clinical Extern Affiliate, Graduate and Employer Satisfaction Surveys
  i. Faculty Professional Growth and In-Service Activities

Subsection 2 – Outcomes Assessment
  i. Historical Outcomes
  ii. Types and Uses of Assessment Data
  iii. Initial Baseline Rates and Measurement of Results
  iv. Summary and Analysis of Data Collected
  v. How Data Is Used to Improve the Educational Process
  vi. Goal Adjustment
  vii. Activities Undertaken to Meet Future Goals

Format Examples
Conclusion


PURPOSE OF THE PROGRAM EFFECTIVENESS PLAN

The Program Effectiveness Plan (PEP) is an internal quality assessment tool that evaluates each program within an educational institution by establishing and documenting specific goals, gathering outcome data relevant to these goals, analyzing outcomes in relation to benchmarks and the program's short- and long-term objectives, and designing strategies to improve program performance.

The program effectiveness assessment is expected to result in the achievement and maintenance of outcomes. For each of the outcomes identified by a program, the program establishes the level of performance that serves as a benchmark for acceptable program performance. These benchmarks meet or exceed requirements established by any applicable state or federal authority and by ABHES policies and/or standards.

Program success is based on student achievement in relation to its mission, including but not limited to consideration of the following:

Retention rates
Participation in and results of licensing and certification examinations
Graduation rates
Job placement rates
Program assessment
Survey responses from students, clinical externship sites, graduates, and employers
Faculty professional growth and in-service activities

Developing and using the Program Effectiveness Plan (PEP) should fulfill several purposes, including:

1. Assisting the institution in achieving internal effectiveness through establishing goals for short- and long-term successes. Further, criteria for measuring the accomplishment of these goals can be defined, allowing the institution to focus its plans and activities on the critical processes needed for effectiveness. Once defined, these goals and criteria should then be used to unify administrative and educational activities, which can achieve a high degree of commitment and common direction among all employees.

2. Assessing progress and the need for change and continuously reviewing the process to help the institution make timely changes based upon valid information to achieve even greater effectiveness.

3. Communicating key information about the institution's goals, its degree of effectiveness, and how it plans to enhance overall quality to external publics such as graduates, employers, and community leaders. Information that depicts the most important elements of the institution's operation communicates clearly and accurately to external publics how well the institution is meeting the needs of students and providing quality learning experiences.


4. Measuring how the PEP meets the expectations and requirements of approving or accrediting organizations, including state boards and ABHES, to demonstrate regulatory compliance. A document which defines institutional goals and educational processes is a primary focus of most accrediting agencies as they measure overall effectiveness and the quality of programs and services provided.

All goals and activities are key indicators of program effectiveness and should relate to the institution’s mission to demonstrate mission achievement and continuous improvement, as the institution’s mission is the impetus and barometer of the program’s effectiveness. The PEP requires an institution to look at its past, present, future, and strategies and to continuously ask:

Where have we been? This data becomes the baseline for gauging and demonstrating improvements.

Where are we now? Current data demonstrates how you will measure change from the baseline data, using the comparison to identify changes needed.

Where do we want to go? A look toward the future for goals to improve or enhance processes and/or programs.

How do we get there? Processes used to achieve the new direction based upon the input of all relevant constituents.


DEVELOPING, IMPLEMENTING, AND MONITORING THE PROGRAM EFFECTIVENESS PLAN
(Standards and Examples)

The standards addressing the Program Effectiveness Plan may be found in the ACCREDITATION MANUAL, 17th Edition, Chapter V, Section I, pages 72-76, as published by the ACCREDITING BUREAU OF HEALTH EDUCATION SCHOOLS (ABHES). The standards outline the ABHES requirements in relation to the development, implementation, and maintenance of the PEP, including the outcomes assessment requirements (Section I, Subsection 2), and give a detailed description and explanation of the meaning and implications of the required components of the PEP. This manual provides suggestions and examples for addressing each PEP standard.

Developing a PEP requires that each program collect, maintain, and use information reflecting the areas outlined in Chapter V, Section I of the ABHES Manual. The data should be analyzed for a specific 12-month period of time as defined by the institution and be used as the foundation for making comparisons across future time periods. Many institutions perform their analysis in conjunction with their fiscal/calendar year or in conjunction with the ABHES annual reporting period (July 1 – June 30), since the majority of the required PEP information is also required on the ABHES Annual Report. Regardless of the selected timeframe, the data is to be updated at least annually.

The PEP is unique to the program and institution and the institution must evidence its efforts made to ensure continuous improvement. The process requires that the institution: (1) systematically collect data and information on each of the educational outcomes areas and achievement of its occupational objectives at least annually; (2) complete an analysis of the data and information including, but not limited to, performing a comparison with previous findings; and (3) identify what changes in educational operations or activities it will make based on the analysis.

Steps in preparing and managing the PEP are similar to those suggested for preparing an institution’s self-study. Structured organization is essential. Although the exact organizational procedures will vary from institution to institution, the following suggestions may be helpful:

The program faculty (full time and part time) assisted by the president/director, director of education, and a representative from admissions and placement are the key individuals acting as a team to initiate, guide, and direct the development and implementation of the PEP. It is their commitment to the PEP and empowerment of the team to oversee these activities that will ensure continuous improvement and the ultimate success of the planning process.

The process is a collective effort that should involve all faculty, administrators, staff, and advisory board members. Consideration should also be given to actively recruiting student, graduate, and employer representatives in the process. It is important that all members of the administration, faculty, governing board, and student body understand and appreciate the importance of the PEP and its value to the institution.

Establish subcommittees to prepare specific PEP sections. These subcommittees should be effectively utilized to complete the various tasks in all facets of the PEP, including development, implementation, and evaluation. Subcommittee membership should be based on each member's responsibilities. Include the names of those responsible for implementing and monitoring the PEP.

Establish baseline rates developed through analyzing the results of past annual retention and placement rates, which will be used in the analysis process. The data collected each year on the ABHES Annual Report includes retention and placement percentages; therefore, it is a valuable part of the PEP. Each program should maintain these annual reports, with supporting documentation, for at least three years so as to provide historical data from which goals may be set. Be specific in the data to be collected, and collect data that will clearly evidence the level of educational outcomes and satisfaction experienced by current students, graduates, and employers.

The PEP may include any other elements determined to be important measures of program effectiveness such as a review of default rates in the student loan programs under Title IV of the Higher Education Act, based on the most recent data provided by the Secretary of Education. These findings may be coupled with student retention and placement rates to determine what correlation, if any, can be determined. Any correlation identified should be reviewed for correction.

Because the PEP focuses on overall program improvement, it is a work in progress, as there are many potential elements of the institution's daily operations that are relevant and important to improving effectiveness. Each program is encouraged to collect a variety of statistical data that will assist it in improving educational outcomes.

The PEP team and subcommittees should adopt and implement a realistic and enforceable periodic schedule throughout the year to review the PEP and document progress through minutes of all meetings where the PEP is discussed. The meeting minutes should show the progress to date, a short summary of the data analyzed, changes anticipated, and continuation or new direction the institution is taking to improve the educational processes. Minor revisions to goals may be made during the monitoring of the PEP; however, substantial revisions should only be made at the annual review unless there is a major change in the institution’s leadership and/or mission. These periodic meetings will ensure that the PEP is utilized and evaluated on a continuing basis.


PROGRAM EFFECTIVENESS PLAN
FORMAT AND CONTENT GUIDELINES

While each program must address each element required of the Program Effectiveness Plan (PEP), the plan may be a comprehensive one which collectively represents all programs within the institution, or may be individual plans for each distinct program. The following section is to serve as a guide as it contains elements that should be incorporated into the PEP. Each standard is given, followed by examples of how an institution might demonstrate established goals and compliance.

Title/Cover Page to include the following:
ABHES I.D. Code
Name of Institution
Address
City
Name of Program
Program Director
Credential awarded
Portion of program offered via distance learning
Length of program (clock hours, semester/quarter credits, weeks, etc.)
12-month period covered by the plan (e.g., July 1, 20?? through June 30, 20??)

V.I.1. A program has an established documented plan for assessing its effectiveness as defined by specific outcomes.

The Program Effectiveness Plan includes (all of the following standards) clearly stated:

STANDARD
a. student population
A description of the characteristics of the student population is included in the Plan.

Student population demographics such as gender ratios, median age, race/ethnicity, marital status, and socioeconomic descriptions should also be included and identified by program if they differ from the overall institutional demographics.

EXAMPLES: There are many ways an institution may identify such information. The following examples include narrative, listing, and chart formats:

Narrative Format
The institution's student population has doubled over the last three years and is represented by a diversity of demographic characteristics. Approximately 80% of the population is independent with an average annual income below $22,000, and 20% are dependent with an average annual family/household income of $40,000. The male to female ratio is 39% to 61%, respectively, and student ages range from 18 to 63. Recent business closings have resulted in an increase in the student population of dislocated workers seeking retraining. The majority of students require some form of financial assistance. The race/ethnicity composition is African American/Black 13%, American Indian/Alaskan Native 0.3%, Asian/Pacific Islander 0.7%, Hispanic 1.5%, Mexican American 0.5%, Caucasian/White 78.8%, and Undisclosed race 5.1%.

Listing Format
In the 2010-2011 Annual Report year, the student body consisted of approximately:

61% female, 39% male
60% attend day classes; 40% attend evening classes
83% earned an average or above grade in high school English
75% earned an average or above grade in high school math
9% English as a second language
58% high school diploma; 12% GED
30% had prior postsecondary education
71% were first in family to receive postsecondary education
61% were employed
80% were independent with a household income of $22,000 or less
91% attended full-time classes and 9% part-time
36% were married
29% under age 25, 34% age 25-34, 26% age 35-44, 7% age 45-54, 4% age 55+
African American/Black 13%, American Indian/Alaskan Native 0.3%, Asian/Pacific Islander 0.7%, Hispanic 1.5%, Mexican American/Chicano 0.7%, Caucasian/White 78.8%, Undisclosed race 5%

Chart Format

PROGRAM                                Gender ratios (M / F)   Median age   Race/Ethnicity (W / NW / U)   Marital status   Socio-Economics (Independent & <$22,000)
Nursing Assistant
Dental Hygienist
Massage Therapist
Medical Assistant
Medical Billing & Coding Specialist
Patient Care Specialist
Pharmacy Technician
Phlebotomy Technician
INSTITUTION TOTALS                     39   69   13   82   5

STANDARD
b. program objectives
Program objectives are consistent with the field of study and the credential offered and include as an objective the comprehensive preparation of program graduates for work in the career field.


Program characteristics of each currently offered program should include:
Degree level
Program description
Program objectives
Description of student outcomes, specifying the competencies students should possess upon conclusion of the program

If an institution offers Medical Assistant, Nursing Assistant, and Surgical Technology programs, then its PEP might present the following overview:

EXAMPLE: The Medical Assistant academic associate's degree program prepares the student to become a multi-skilled allied health professional with diverse duties in medical offices, clinics, and health centers. The program includes a balance of classroom, laboratory, and clinical experiences.

Objectives of the program are to:

Prepare a knowledgeable entry-level employee with technical skills and work habits necessary to perform effectively in various health-care related fields including medical transcriptionist, medical billing specialist, medical office manager, and medical assistant.

Provide clinical activities that include assisting the physician in patient care responsibilities by recording medical histories, taking vital signs, preparing the patient for examination, assisting physician during patient examination and surgical procedures, collecting and performing various laboratory tests, administering medications, performing diagnostic procedures such as EKGs and dressings, and providing patient education.

Teach courses in anatomy, physiology, pharmacology, computer applications, clinical procedures, interpersonal skills, confidentiality, medical ethics, professional behavior, and patient interface, as well as basic office procedures to ensure competency.

At the completion of the program, the student will be able to:

Assume a wide range of responsibilities in a medical office or ambulatory care center.

Communicate with patients to schedule appointments, receive and process payments.

Sit for the credentialing examination

The Nursing Assistant diploma program prepares the student to function under the supervision of a physician and/or a registered nurse and to participate as a member of a healthcare team in providing nursing care. The program includes classroom, laboratory, and clinical patient care experiences.

Objectives of the program are to:
Prepare a competent nurse assistant to function effectively in acute, long-term care, and ambulatory settings.
Provide a collaborative learning environment in which the student will develop and apply principles of systematic reasoning through critical thinking.


Guide the learner in the continuing process of personal and professional growth.

At the completion of the program, the student will be able to:
Function in the delivery of care to clients.
Communicate with clients, client families, and members of the healthcare team.
Perform nursing skills applying critical thinking.
Integrate ethical, professional, legal responsibility, and accountability into actions and decisions.
Assume responsibility for personal and professional growth.
Sit for the State certification board exam.

The Surgical Technology certificate program prepares the graduate to function as an intraoperative team member under the direct supervision of a surgeon or registered nurse. The graduate is prepared for this role through didactic, laboratory, and external clinical experiences.

Objectives of the program are to:
Prepare the graduate for a professional career.
Prepare a competent surgical technologist to perform intraoperative first scrub duties.
Guide the learner in the processes for certification and professional development.

At the completion of the program, the graduate will be able to:
Effectively perform pre-, intra-, and post-operative duties
Practice aseptic and sterile technique
Practice all patient safety measures and act in an ethical manner
Assume responsibility for personal and professional growth
Sit for a national certification exam

Continue same format for all programs offered at the institution.

STANDARD
c. program retention rate
At a minimum, an institution maintains the names of all enrollees by program, start date, and graduation date. The method of calculation, using the reporting period July 1 through June 30, is as follows:

(EE + G) / (BE + NS + RE) = R%

EE = Ending Enrollment (as of June 30)
G = Graduates
BE = Beginning Enrollment (as of July 1)
NS = New Starts
RE = Re-Entries
R% = Retention Percentage

Include the retention results for the last three annual reporting years as the baseline, if available, along with goals for the upcoming year. If an institution has developed long-term goals for retention, this information should also be included with status updates.
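As an illustration only (not part of the ABHES standard), the retention calculation above can be expressed as a short Python function; the enrollment counts shown are hypothetical:

    def retention_rate(ee, g, be, ns, re_entries):
        """Retention % = (EE + G) / (BE + NS + RE) x 100."""
        return 100.0 * (ee + g) / (be + ns + re_entries)

    # Hypothetical counts for one program and one reporting year (July 1 - June 30)
    print(round(retention_rate(ee=48, g=22, be=50, ns=45, re_entries=5), 1))  # 70.0

The same function can be run for each program and compared against the ABHES benchmark and the program's own goal.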


EXAMPLE: Retention rates for the past three years, taken from the institution’s Annual Report:

                      2008-2009   2009-2010   2010-2011
Medical Assistant        67%         69%         70%
Nursing Assistant        64%         65%         67%
Surgical Technology      80%         81%         85%

To establish the goals for the next reporting period, 2011-2012, an institution may choose to average the three previous years for each program. However, in this example the goal would be below the 70% benchmark in the medical assistant and nursing assistant programs; therefore, this would not be a practical way to determine the next year's program goal. A program may instead elect to establish its goal by an increase of a given percentage each year, such as five percent, or by determining the percent increase from year to year over the three previous years. Note in the example that the Surgical Technology program increased retention 1% between 2008-2009 and 2009-2010 and then increased 4% between 2009-2010 and 2010-2011, so the average increase across those years is 2.5%. Using the averaged percent method, a realistic 2012 goal might be 87.5%. The program may also establish intermittent goals of a percentage increase from month to month, or an increase in relation to the same month or other predetermined time period in the previous year (e.g., a 1% increase from month to month or a 2% increase in April 2012 over the April 2011 rate). Intermittent goals are advantageous as they keep everyone on target throughout the year.

The chart below shows a comparison of the medical assisting, nurse assisting, and surgical technology programs to overall retention. The next step is to develop improvement plans and strategies that will assist each of the three programs in reaching its projected 2012 retention rate. Surgical Technology is doing well, so its goal may be either to increase by 2.5% as stated previously or to maintain the 85% retention rate. The Medical Assisting program is just at 70%, but did show a 1% increase between 2009-2010 and 2010-2011; therefore, it would be realistic to challenge it with perhaps another 1-2% increase for 2012. For the Nursing Assistant program to reach 70% would require an increase of 3% in one year. By establishing program goals, the institution is assured that all programs are working toward overall retention improvement.
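To make the goal-setting arithmetic above concrete, here is a minimal Python sketch that applies both methods (a simple three-year average and the most recent rate plus the average year-to-year increase) to the example retention rates; it is an illustration, not a required ABHES calculation:

    # Retention rates from the example above (2008-09, 2009-10, 2010-11)
    rates = {
        "Medical Assistant":   [67, 69, 70],
        "Nursing Assistant":   [64, 65, 67],
        "Surgical Technology": [80, 81, 85],
    }

    def goal_simple_average(history):
        """Goal = simple average of the prior years' rates."""
        return sum(history) / len(history)

    def goal_average_increase(history):
        """Goal = most recent rate plus the average year-to-year increase."""
        increases = [later - earlier for earlier, later in zip(history, history[1:])]
        return history[-1] + sum(increases) / len(increases)

    for program, history in rates.items():
        print(program, round(goal_simple_average(history), 1), round(goal_average_increase(history), 1))
    # Surgical Technology: average increase (1 + 4) / 2 = 2.5, so the 2011-12 goal is 87.5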

Other areas that might be considered to address retention include:
an average daily attendance goal, for example 90%;
a maximum withdrawal rate per quarter, for example 10%;
a quarterly retention goal;
quarterly grade distribution goals in the percentage of As and Bs for selected courses/classes, for example anatomy and physiology.

Quarter    As %   Bs %
Dec 09      45     22
Mar 10      62     35
Jun 10      62     35
Sept 10     61     32
Dec 10      50     26
Mar 11      65     14
Jun 11      49     34
Sept 11     27      8
Dec 11      54     31
Mar 12      72     25
Jun 12      83     11
Sept 12     51     32
Total      681    305
Mean        57%    25%
Total Average As & Bs: 82%

Based on this distribution, the institution might elect to develop strategies to maintain the 82% rate or raise the goal to 85%. Each quarter an intervention plan might be developed for those struggling students not making As and Bs. Such an intervention plan might enhance retention.
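In the same illustrative spirit, the quarterly A/B analysis can be automated; the 85% goal used here is hypothetical, and the quarterly percentages are taken from the example table above:

    # Quarterly percentages of As and Bs from the example table (Dec 09 through Sept 12)
    quarters = [(45, 22), (62, 35), (62, 35), (61, 32), (50, 26), (65, 14),
                (49, 34), (27, 8), (54, 31), (72, 25), (83, 11), (51, 32)]

    goal_ab = 85  # hypothetical raised goal for combined As and Bs

    mean_a = sum(a for a, _ in quarters) / len(quarters)
    mean_b = sum(b for _, b in quarters) / len(quarters)
    print(round(mean_a), round(mean_b), round(mean_a + mean_b))  # 57 25 82

    # Flag quarters falling short of the goal so an intervention plan can be considered
    for i, (a, b) in enumerate(quarters, start=1):
        if a + b < goal_ab:
            print(f"Quarter {i}: As + Bs = {a + b}% is below the {goal_ab}% goal")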

Similarly, quarterly grade distribution goals could be set for overall enrollment performance.

Average Quarterly Grade Distribution for March 2010 - June 2011

Quarter Ending   Total EOQ Students   FTEs    As%   Bs%   Cs%   Ds%   Fs%   Ws%
Mar 10                 571            1606     35    33    16    11     5     0
June 10                354            1118     32    29    17     9     3    10
Sept 10                391            1180     32    28    16    13     2     9
Dec 10                 417            1295     36    34    15    12     2     1
Etc.
Total                                15134    376   378   209   103    38    95
Mean                                  1261     31%   32%   17%    9%    3%    8%

In the analysis, trends of grade distribution are noted. Goals could then be set to raise the percentage of As and Bs while reducing the percentage of Ds, Fs, and Ws, accompanied by departmental strategies.

STANDARD
d. job placement rate in the field
An institution has a system in place to assist with the successful initial employment of its graduates and is required to verify employment following the initial employment date. At a minimum, an institution maintains the names of graduates, place of employment, job title, employer telephone numbers, employment date, and verification dates. For any graduates identified as self-employed, an institution maintains evidence of employment. Documentation in the form of employer or graduate verification forms or other evidence of employment is retained.

The method of calculation, using the reporting period July 1 through June 30, is as follows:

(F + R) / (G - U) = P%

F = Graduates placed in their field of training
R* = Graduates placed in a related field of training
G = Total graduates
U** = Graduates unavailable for placement
P% = Placement percentage

*Related field refers to a position wherein the graduate’s job functions are related to the skills and knowledge acquired through successful completion of the training program.

**Unavailable is defined only as documented: health-related issues, military obligations, incarceration, continuing education status, or death.
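A minimal Python sketch of the placement calculation above; the formula is the ABHES calculation, but the graduate counts used here are hypothetical:

    def placement_rate(f, r, g, u):
        """Placement % = (F + R) / (G - U) x 100."""
        return 100.0 * (f + r) / (g - u)

    # Hypothetical counts for one program and one reporting year
    print(round(placement_rate(f=20, r=3, g=28, u=3), 1))  # 92.0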

Important Note: Graduates pending the credentialing/licensure required to work in the field in a regulated profession and, thus, not employed or not working in a related field as defined above, should be reported through the back-up information required in the Annual Report. This fact will then be taken into consideration if the program placement rate falls below expectations and an Action Plan is required by ABHES.

EXAMPLES:
Placement results for the same annual reporting years identified above for retention are used as the baseline data, if available, along with goals for the upcoming year. In addition, if an institution has developed long-term goals for placement, this information should also be included with status updates.

Placement rates for the past three years, beginning with 2008-2009, taken from the institution’s ABHES Annual Reports are:

                      2008-2009   2009-2010   2010-2011
Medical Assistant        88%         90%         94%
Nursing Assistant        85%         88%         91%
Surgical Technology      86.5%       88%         91.3%

These rates indicate a steady annual increase, and all rates exceed the 70 percent ABHES benchmark. The chart below shows a comparison of the three programs to the overall placement.

[Chart: placement rates for MA, NA, ST, and Overall, 2008-09 through 2010-11]

Since these are good placement rates, the institution may elect to hold at these rates for 2012 and develop strategies to maintain them, or the institution may elect to increase by a given percentage for 2012, such as one percent, or use an average of the increases.

STANDARD
e. credentialing examination participation rate
Participation of program graduates in credentialing or licensure examinations required for employment in the field in the geographic area(s) where graduates are likely to seek employment.

The method of calculation, using ABHES' reporting period July 1 through June 30, is as follows:

Examination participation rate = G / T

T = Total graduates eligible to sit for examination
G = Total graduates taking examination

EXAMPLE:
While credentialing may not be required for employment, concerted efforts should be made by the institution to promote and encourage participation in licensure exams by setting participation and pass rate goals and establishing strategies for achieving those goals. Include results of periodic reviews conducted throughout the reporting year of certification exam results by program, along with goals for the upcoming year. If results are not easily accessible without student consent, the institution should consider incentive procedures or devise alternate methods to obtain results, which can be documented to assess program effectiveness. Again, include the three most recent years of data collection. Data may be analyzed by class or just by program; data collected and analyzed by class provides more detail.

EXAMPLE: Looking at the nursing assistant graduate percentage of those who took the test versus those that have taken the test for the last three years would be lower than the 2011 pass rate. Therefore, it would be more advantageous to calculate the percentage increase between ‘09/’10 and ‘10/’11 to get an average to establish the percent increase for ‘11/’12 [2% (percent increase between 2009 & 2010) + 11(percent increase between 2010 & 2011) ÷ 2 = 6.5]. So the goal for the number taking the nursing assistant exam in 2012 would be 94.5%.
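The participation-rate formula and the averaged-increase goal described above can be sketched as follows; the figures come from the nursing assistant example table that follows, and the approach is illustrative only:

    def participation_rate(g_took, t_eligible):
        """Examination participation rate = G / T, expressed here as a percent."""
        return 100.0 * g_took / t_eligible

    # Nursing assistant participation percentages for 2009, 2010, and 2011
    history = [75, 77, 88]
    increases = [later - earlier for earlier, later in zip(history, history[1:])]  # [2, 11]
    goal_2012 = history[-1] + sum(increases) / len(increases)                      # 88 + 6.5
    print(goal_2012)  # 94.5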

If students are admitted and graduate on a quarterly basis, the institution might find data collected quarterly to be more beneficial such as this example:

Example by program

PROGRAM               GRADS              NUMBER TOOK EXAM     % GRADS TOOK EXAM
                      '09   '10   '11    '09   '10   '11      '09   '10   '11
Nursing Assistant      20    26    24     15    20    21       75    77    88
Medical Assistant      24    26    30     20    21    25       83    81    83
Surgical Technology    23    19    17     15    11    13       65    58    76


Example by program by class (useful for tracking if the program employs adjunct faculty who change each term)

PROGRAM               GRADS              NUMBER TOOK EXAM     PERCENT GRADS TOOK EXAM
                      '09   '10   '11    '09   '10   '11      '09   '10   '11
Nursing Assistant      20    26    24     15    20    21       75    77    88
  Winter                2     4     7      2     3     5
  Spring                7     6     5      5     6     5
  Summer                6     7     6      3     3     3
  Fall                  5     9     6      4     7     6
Medical Assistant      24    26    30
  Winter                7     4     9
  Spring                5     6    10
  Summer                6     7     5
  Fall                  6     9     6

Other data to demonstrate student-learning outcomes may include entrance assessments, pre- and post-tests, course grades, GPA, CGPA, standardized tests, and portfolios.

STANDARD
f. credentialing examination pass rate
An ongoing review of graduate success on credentialing and/or licensing examinations required for employment in the field in the geographic area(s) where graduates are likely to seek employment is performed to identify curricular areas in need of improvement. A program maintains documentation of such review and any pertinent curricular changes made as a result.

The method of calculation, using ABHES' reporting period July 1 through June 30, is as follows:

F / G = L%

F = Graduates passing examination (any attempt)
G = Total graduates taking examination
L% = Percentage of students passing examination

At a minimum, the names of all graduates by program, actual graduation date, and the credentialing or licensure exam for which they are required to sit for employment are maintained.
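A short illustrative Python sketch of the pass-rate calculation, reproducing the nursing assistant figures from the example table below:

    def pass_rate(f_passed, g_took):
        """Credentialing examination pass rate L% = F / G x 100."""
        return 100.0 * f_passed / g_took

    # Nursing Assistant: graduates passing / graduates taking the exam, 2009-2011
    for f, g in [(12, 15), (15, 20), (16, 21)]:
        print(round(pass_rate(f, g)))  # 80, 75, 76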

Example by program

PROGRAM               GRADS              NUMBER (G) TOOK EXAM   NUMBER (F) PASSED   PERCENT (L) PASSED
                      '09   '10   '11    '09   '10   '11        '09   '10   '11     '09   '10   '11
Nursing Assistant      20    26    24     15    20    21          12    15    16      80    75    76
Medical Assistant      24    26    30     20    21    25          15    17    19      75    81    76
Surgical Technology    23    19    17     15    11    13          14     9    10      93    82    77

From this data, establish goals for the percentage of graduates passing the exam using the same methods described above for graduates taking the exam. Since passing rates have not steadily climbed, a reasonably achievable passing goal could be established by merely averaging the three most recent passing rates [(80 + 75 + 76) ÷ 3 = 77], which would give a goal of 77% passing for the nursing assistant program.

Again, if students are admitted and graduate on a quarterly basis, the institution might find data collected quarterly to be more beneficial such as this example:

Example by program by class (useful for tracking if the program employs adjunct faculty who change each term)

PROGRAM               GRADS              NUMBER TOOK EXAM     NUMBER PASSED       PERCENT PASSED
                      '09   '10   '11    '09   '10   '11      '09   '10   '11     '09   '10   '11
Nursing Assistant      20    26    24     15    20    21        12    15    16      80    75    76
  Winter                2     4     7      2     3     5
  Spring                7     6     5      5     6     5
  Summer                6     7     6      3     3     3
  Fall                  5     9     6      4     7     6
Medical Assistant      24    26    30
  Winter                7     4     9
  Spring                5     6    10
  Summer                6     7     5
  Fall                  6     9     6

STANDARD
g. program assessment
The program assesses students prior to graduation as an indicator of the program's quality. The assessment tool is designed to assess curricular quality and to measure overall achievement in the program, as a class, not as a measurement of an individual student's achievement or progress toward accomplishing the program's objectives and competencies (e.g., an exit tool for graduation). Results of the assessment are not required to be reported to ABHES, but are considered in curriculum revision by such parties as the program supervisor, faculty, and the advisory board and are included in the Program Effectiveness Plan.

EXAMPLES FOR USE AS PROGRAM ASSESSMENT:
Comprehensive Final Exam
Scenarios
National Practice Exams
Practical demonstrations using a comprehensive checklist before students go on externship

Comprehensive Final Exam
An important measure of program effectiveness is how well it prepares students with the entry-level competencies for the field of work. A comprehensive examination designed to measure the individual student's preparation in the required competencies identified by the program objectives is administered to every student prior to completion, and the collective results are used to assess the program's performance in preparing students, as a group, for employment in the field. The comprehensive examination may include written questions, practical demonstrations, or a combination of methods so long as the necessary clinical competencies are validly and reliably assessed.


Such an exam should be designed to incorporate all major elements of the curriculum for assessment of quality. A well-designed exam will point directly to the segment of the curriculum that needs remedy. For example, if scores are consistently low in the anatomy and physiology segment of the exam, as indicated by a three-year trend, then an action plan may include new textbooks, an instructor change, instructor professional development or in-service, or teaching the course as a prerequisite instead of a core element. These cohort exam scores are then closely monitored for upward trends that indicate the plan is working. A program may find it beneficial to score the exam with ranges rather than pass/fail; this communicates to the student that it is being used as an overall quality improvement tool rather than a personal test.

Each student is provided instructions regarding the scenario. The students then role-play as they complete the scenarios under the direction of the faculty.

MA SCENARIO EXAMPLE (Thanks to Ross Medical Education Center for sharing these examples)

Program Assessment Evaluation Method: Medical Assistant Day In The Office Scenario

MA Program                                                            08-09   09-10   10-11
Total number of seniors who completed day in the office scenarios      N/A     N/A    100%
Total number of seniors scoring "Proficient" or "Acceptable"           N/A     N/A    100%
Overall proficiency % for all combined day in the office scenarios     N/A     N/A    100%

Day in the Office Scenario (Tasks)                              2010-11 Class Proficiency Percentage
Prepare and maintain electronic medical records                 100
Manual Filing with Alphabetic System                            100
Take and Record Height and Weight                               100
Take a complete set of Vital Signs                              100
Perform Visual Acuity with Snellen Chart                        100
Perform a physical and chemical U/A                             100
Administer an ID injection                                      100
Administer a SQ injection                                       100
Administer an IM injection                                      100
Administer a Z-track injection                                  100
Perform a Standard 12-Lead EKG                                  100
Perform a Spirometry Test                                       100
Measure Infant Height/Weight, Head and Chest Circumference      100
Perform a Urine Pregnancy Test                                  100
Perform Multi-draw Venipuncture                                 100
Perform Venipuncture with Butterfly                             100
Perform a Capillary Puncture and MicroHematocrit                100
Perform a Capillary Puncture and Hemoglobin                     100
Perform a Capillary Puncture and a CLIA Waived Mono Test        100
Measure Blood Glucose using a Handheld Monitor                  100
Wrapping Instruments/Operate an Autoclave                       100
Apply Sterile Gloves/Set Up Sterile Tray/Remove Sutures         100
Code Assignment/Posting Patient Charges Electronically          100


Rationale for Data Collection: The program assessment evaluation tool is used to assess curricular quality and measure overall achievement in the program by class. The purpose of the assessment tool is to determine if classes are performing clinical skills at a high level and progressing toward accomplishing the program's objectives through demonstrated proficiencies.

Procedures: Program assessment rubrics are used to assess senior students' clinical skills through end-of-term "Day in the Office" scenarios. The Program Assessments are compiled twice a year, in July for January 1 to June 30 and in January for July 1 to December 31. To ensure the data are significant, the data are compiled by class and by program and then reviewed/analyzed/assessed based on the results from the prior 6-month period. Program assessment of students' clinical skills is based on 4 categories: Proficient, Acceptable, Limited, and No Opportunity. If the score for any competency falls below the 90% proficiency level (by totaling the Proficient and Acceptable categories), the concern is brought to the Director of Education in order to perform a comparative analysis of proficiency among all Ross campuses.

Goals for 2011-12:
80% participation rate of all classes during the reporting period.
75% of proficiencies listed on each rubric are completed for each evaluation period.

Responsible Party: Director, Instructors
Review Dates: January 2012 and July 2012
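A minimal sketch of the 90% proficiency check described above; the rubric tallies are hypothetical, and treating only students who had an opportunity to be assessed as the denominator is an assumption, not an ABHES or Ross requirement:

    # Hypothetical rubric tallies for one competency across a graduating class
    tallies = {"Proficient": 14, "Acceptable": 3, "Limited": 2, "No Opportunity": 1}

    # Assumption: students with "No Opportunity" are excluded from the denominator
    assessed = sum(tallies.values()) - tallies["No Opportunity"]
    proficiency = 100.0 * (tallies["Proficient"] + tallies["Acceptable"]) / assessed

    if proficiency < 90:
        print(f"Proficiency {proficiency:.1f}% is below the 90% level; refer to the Director of Education")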

STANDARD
h. surveys of students (classroom and clinical experience), clinical extern affiliate, graduate and employer satisfaction with the program
A program must survey each of the constituents identified above. The purpose of the surveys is to collect data regarding student, clinical extern affiliate, graduate, and employer perceptions of a program's strengths and weaknesses.

At a minimum, an annual review of results of the surveys is conducted, and results are shared with administration, faculty and advisory boards. Decisions and action plans are based upon review of the surveys, and any changes made are documented (e.g., meeting minutes, memoranda).

The institution establishes: (i) a goal for the percent of surveys returned and (ii) benchmarks for the level of satisfaction desired. Accordingly, a program must document that at a minimum the survey data included in its effectiveness assessment include the following:

A representative sample must provide feedback to determine program effectiveness; therefore, two goals should be established for all surveys:

(1) a goal for the percent of surveys returned, as well as

Survey participation rate: SP / NS = TP

SP = Survey Participation (those who actually filled out the survey)
NS = Number Surveyed (total number of surveys sent out)
TP = Total Participation by program, by group; meaning the number of students/clinical extern affiliates/graduates/employers by program who were sent and completed the survey during the ABHES reporting period (July 1 – June 30)
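An illustrative Python sketch of the survey participation calculation above, using hypothetical counts:

    def survey_participation(sp, ns):
        """Survey participation TP = SP / NS, expressed here as a percent."""
        return 100.0 * sp / ns

    # Hypothetical example: 54 of 60 graduate surveys returned for one program
    print(round(survey_participation(sp=54, ns=60), 1))  # 90.0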


(2) satisfaction benchmarks

Programs must assess satisfaction by surveys for the currently enrolled student, the clinical extern affiliate, the recent graduate, and the graduate’s employer.

Student: Student evaluations are used as a composite of student views relating to course importance and satisfaction and overall class attitudes about the classroom and clinical environments.

EXAMPLE:
Student Satisfaction: The surveys conducted periodically throughout the reporting year should assess the students' satisfaction with the administration, faculty, and program training, including externship. The institution would also want to establish a percentage return goal, such as 90% of the students completing the survey.

Rationale for Data Collection: Secure feedback from students relating to importance and satisfaction regarding customer service and overall attitudes related to the institution's administration. Data used to reflect on what worked or didn't work.
Procedures: Student Satisfaction Surveys collected semiannually.
Goals: 90% student participation. Using Student Satisfaction Surveys (orientation through graduation), the baselines are: Tutoring 80%; Academic Advising 80%; Support From Admissions 75%; Financial Aid 75%; Career Services 75%; Library 80%; Spirited/Fun Environment 50%; Orientation Sessions 75%; Recognition 65%; Mission Statement 50%; Admin Accessibility 80%; Facility 70%; Social Activities 50%.
Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.
Improvement Strategies: The data is collected and benchmarks are set and analyzed for improvement strategies when measures fall below established baselines.

Rationale for Data Collection: End-of-term student evaluations are used as a composite of student views relating to course importance and satisfaction and overall class attitudes about the classroom environment. Faculty use the data to determine effective/ineffective activities and compare this information with other classes.
Procedures: Student Course Surveys collected each term.
Goals: Using the Student Course Surveys, the following baselines have been established: Presentation/Delivery Methods 80%; Course Pace 80%; Course Objectives 75%; Information Clarity 85%; Textbook 75%; Supplementary Materials 75%; Classroom Assignments 80%; Support From Faculty 75%; Encouragement/Motivation 80%.
Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.
Improvement Strategies: Failure to achieve a baseline goal will be addressed at faculty and in-service meetings.

STANDARD
Clinical extern affiliate: Externship site evaluations include a critique of students' knowledge and skills upon completion of their in-school training and reflect how well the students are trained to perform their required tasks. They include an assessment of the strengths and weaknesses, and proposed changes, in the instructional activities for currently enrolled students. The sites also evaluate the responsiveness and support provided by the designated school representative, who visited the site and remained in contact with the site throughout the duration of the students' externship.


EXAMPLE:
Externship sites are off-campus labs enabling students to apply acquired knowledge and skills. Students on externship should be given an opportunity to evaluate this experience just as they did in the classroom. Summarized results of the externship site evaluations of the students' knowledge and skills upon completion of their in-school training should reflect how well the students are trained to perform their required tasks, and include an assessment of the strengths and weaknesses, and proposed changes, if any, in the instructional activities for currently enrolled students. The sites should also evaluate the responsiveness and support provided by the designated school representative, who visited the site and remained in contact with the site throughout the duration of the students' externship.

Rationale for Data Collection: To maintain interaction with off-site labs to identify student skill level and provide follow-up instruction when deficiencies are identified.
Procedures: Externship site survey collected bi-weekly by the externship coordinator on Friday during the student externship assignment. Student clinical experience evaluation.
Goals: 100% student participation. Attendance 100%; Initiative/Appearance 80%; Communication 80%; Critical Thinking 80%; Information Use 90%; Quality of Work 90%; Multi-Tasking 80%; Technical Procedural Proficiency 90%; Professional Attitude 80%; Interaction & responsiveness of institution 95%. For all students to rate their overall clinical experience at or above the "cut score" of 3 on a 5-point Likert scale. Rating of School Representative: school representative's responsiveness 90%; quality of the designated school representative's visit to the site 90%; the representative's contact with the site throughout the duration of the student's externship 90%. For all clinical affiliates to rate their contact with the school experience at or above the "cut score" of 4 on a 5-point Likert scale.
Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.
Improvement Strategies: Failure to achieve a baseline goal will be addressed at faculty and curriculum meetings to design improvement.

STANDARD
Graduate: A program has a systematic plan for regularly surveying graduates. Graduate surveys are provided no sooner than 10 days following graduation. At a minimum, an annual review of the results is conducted and shared with administration, faculty, and advisory boards. Decisions and action plans are based upon the review of the surveys, and any changes made are documented (e.g., meeting minutes, memoranda).

EXAMPLE:
Graduate Satisfaction: Include information reflecting the level of satisfaction of graduates with the academic program, how well the educational and clinical experiences prepared the student for employment, and how the education relates to their current position. Such information could include a measurement of the quality of instruction and the relevance and currency of curricula. The major distinction between graduate and student satisfaction is that graduate feedback should be sought once graduates have had an opportunity to seek and secure employment in their fields and have been employed long enough to be able to evaluate their training in relation to the tasks performed on the job.

Because graduate and employer satisfaction surveys provide valuable information for timely program revision and development, they must be conducted on an ongoing basis and summarized at least annually, as the information from these surveys is vital to the PEP in evaluating educational outcomes and setting short- and long-term goals. The institution should also establish a percentage return goal, such as 60% of graduates returning the survey.

Rationale for Data Collection: To ensure the ability of graduates to secure employment related to their interests and training both upon graduation and throughout their lifetime.
Procedures: Data collected using the alumni survey a minimum of 30 days following student employment.
Goals: 60% of employed graduates during the reporting year will return the completed survey. The baseline is to have an 80% or higher evaluation in skill categories and 70% or higher in the leadership area of surveys rated at 3 or above on a 5-point Likert scale. Prepared for career in the ability to: communicate with employer/co-workers; think critically; use information; multi-task; apply knowledge; use learned skills; perform to employer satisfaction; secure desired position. Satisfied with instruction quality. Provided effective job search skills.
Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.
Improvement Strategies: The data is collected and benchmarks set and analyzed for improvement strategies when measures fall below established baselines.

Employer: A program has a systematic plan for regularly surveying employers. Employer surveys are provided to the employer no fewer than 30 days following employment. At a minimum, an annual review of the results is conducted and shared with administration, faculty, and advisory boards. Decisions and action plans are based upon review of the surveys, and any changes made are documented (e.g., meeting minutes, memoranda).

EXAMPLE:
Employer Satisfaction: Information about the degree of employer satisfaction regarding the competencies of graduates who have completed a program of study is a major part of determining program effectiveness. This information reflects how well employees (graduates) are trained (skill level) to perform their required tasks, and includes an assessment of the strengths and weaknesses, and proposed changes, if any, in the instructional activities for currently enrolled students. The example below is for the institution; however, to ensure that all program graduates are meeting this goal, data could be tallied by program. The institution should also establish a percentage return goal, such as 50% of employers returning the survey.


Rationale for Data Collection: To maintain interaction with the employing community, identify current workplace needs, and anticipate future job requirements that will shape careers and graduate opportunities.
Procedures: Employer Survey collected quarterly and tallied annually in November.
Goals: Using the Employer Survey, rate graduate job performance and use the data to update curriculum, program objectives, and program offerings. Employer Response Rate 50%. Data collected in the following: Technical Knowledge Proficiency 85%; Information Use 80%; Quality of Work 80%; Multi-Tasking 70%; Communication 85%; Critical Thinking 75%; Professional Attitude 85%. Other goal examples: for all employers to rate graduate overall knowledge base, clinical experience, and interpersonal communication skills at or above the "cut score" of 3 on a 5-point Likert scale.
Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.
Improvement Strategies: The data is collected and benchmarks are set and analyzed for improvement strategies when measures fall below established baselines.

STANDARD
i. faculty professional growth and in-service activities
A program maintains data that evidences faculty participation in professional growth activities and in-service sessions that promote continuous evaluation of the programs of study, instructional procedures, and training.

Include the schedule, attendance roster, and topics discussed at in-service training sessions conducted during the reporting year. The data should evidence that the sessions promote continuous evaluation of the program of study, training in instructional procedures, and review of other aspects of the educational programs. Outline procedures for monitoring all full- and part-time faculty participation in professional growth activities in an effort to remain current in their fields. Include the past two years' in-service training and professional activities outside the institution for each faculty member.

EXAMPLE:

Rationale for Data Collection: Invest in faculty development to ensure current expertise and ability. The professional development plan is tied to other assessments (student evaluations, faculty evaluations, etc.) to directly address identified areas of need.
Procedures: Professional development plans prepared annually based on instructor evaluation and reviewed quarterly.
Goals: Full-time and adjunct faculty professional development participation, as documented in professional development plans and professional development programs, will increase. Participation minimum: on-campus quarterly in-service, 3; instructor-initiated off-campus professional development directly related to teaching assignment, 2. Current licensure where required in the field.
Summary/Analysis: Feedback obtained from completed surveys tallied for each category. In-service evaluation. Standard feedback form for professional development activities.
Improvement Strategies: Data collected and benchmarks set and analyzed for improvement strategies when personnel fail to fulfill the plan. Future topics based on survey feedback and in-service evaluations.



Subsection 2 – Outcomes Assessment

V.I.2. A program has a process for assessing effectiveness.
The Program Effectiveness Plan specifies a process and a timetable for the annual assessment of program effectiveness in achieving the outcomes it has identified with its objectives and criteria. The plan must:

i. Document historical outcomes and show evidence of how these historical data are used to identify expected outcomes and to achieve expected goals (e.g., evaluations, advisory boards, credentialing). Outcomes are the result of students' successful completion of a program. Outcomes are generally defined in terms of, though not limited to, retention, placement, student competencies, and student, clinical, graduate, and employer satisfaction. Use at least three years' historical outcomes for each element. The last three PEPs and Annual Reports provide the necessary historical data. Data from other prior years may be used if it will better define the picture of progress or set more realistic goals. Describe the measurable standards used to judge the effectiveness of your institution.

ii. Identify and describe types of data that are used for assessment, how data were collected, rationale for use of each type of data, timetable for data collection, and parties responsible for data collection.

Institutions are expected to collect data that clearly evidences the level of educational outcomes of retention and placement and satisfaction experienced by current students, graduates, clinical sites and employers of graduates. In addition, institutions are to include information that is relevant to improving overall effectiveness, such as in-service training programs and professional growth opportunities for faculty.

The institution is encouraged to collect a variety of statistical data that will assist it in improving the educational outcomes. A few examples of possible surveys and studies include:

New or entering student surveys
Faculty evaluation studies
Student demographic studies
Program evaluations
Alumni surveys
Labor market surveys

Studies of student performance might include:
Admission assessments
Grades by course
Standardized tests
Quarterly grade distribution
Pre-test and post-test results
Portfolios
Graduate certification examination results
Average daily attendance

Consider other studies such as a review of surveys of professional and trade associations, Chamber of Commerce, U.S. Department of Labor, or economic development board studies.

EXAMPLE:

Data Collection: Employer Survey collected quarterly and tallied annually in November by the career services department.

Rationale for Use: Using the Employer Survey, rate the job performance of graduates and use the data to update curriculum, program objectives, and program offerings. Rating goals: Employer Response Rate 75%. Goals for data collected in the following areas: Communication 84%; Critical Thinking 75%; Information Use 80%; Quality of Work 80%; Multi-Tasking 70%; Technical Knowledge Proficiency 85%; Professional Attitude 85%.

iii. Review initial baseline rates and measurements of results after planned activities have occurred.

Data related to the PEP must be evaluated at least once per year, at a predetermined time. Many institutions evaluate data related to their PEP on a monthly or quarterly basis and then complete an annual comprehensive evaluation. As previously noted, it is suggested that an institution establish a schedule or range of evaluation dates for each year to ensure that timely monitoring takes place.

To maximize the integrity of the process and the opportunities for improving the educational programs, the individuals involved in the evaluation of the data must have the responsibility and authority for the development of the educational programs and educational processes.

An institution should develop an evaluation plan designed to meet its needs; no one model is prescribed, as each institution is unique. For example, an institution may evaluate the PEP by completing the following activities:

a. Measuring the degree to which institutional or educational goals have been achieved.
b. Conducting a comprehensive evaluation of the elements outlined in the Standards Manual.
c. Summarizing the institutional changes that have been developed and/or implemented based upon information gained from the evaluation process.
d. Documenting changes in institutional, academic, or administrative processes such as revised goals, planning documents, or program goals and activities.

At the end of the year, a review of the data collected will demonstrate how well the predetermined goals were met in all categories and will identify changes needed. For example, data may be reported in a table format for easy analysis.


Goals, Summary/Analysis, and Improvement Strategies

Goal: Retention – 75%
Summary/Analysis: The retention program focused on motivating students to stay in school; only 70% stayed.
Improvement Strategies:
Instructors contact absent students daily and document the discussion.
Refer students to individuals or services to help them overcome attendance obstacles.
Department chairs distribute an at-risk list to every employee each Tuesday.

Goal: Placement – 70% placed within 60 days of graduation
Improvement Strategies:
Placement staff will be on campus two days per week during the hours between day and evening classes to meet with students in their last term.
Placement personnel will join and participate in local business and civic organizations.
Increase employer presence on campus with mock interviews, speaking engagements, and a twice-a-year career fair.
Establish an on-line career bank.

Goal: Graduate job performance improved to 80% or above
Summary/Analysis: Employers rated the performance of graduates below 80% in the following areas: Critical Thinking 75%; Multi-Tasking 70%.
Improvement Strategies:
Add at least three critical thinking scenarios to the last three modules.
Add four multitasking practica to the last two modules.

iv. Provide a summary and analysis of data collected and state how continuous improvement is made to enhance expected outcomes.

Provide an overview of the data collected. Summarize the findings for all elements reviewed, indicating the institution's strong and weak areas and plans for improvement where applicable, and use the results as the basis for the next annual review, presenting new ideas for changes that will help the institution further improve its effectiveness.

One of the most important indicators of the effectiveness of the PEP is the improvement of the educational programs offered at the institution. Establish specific goals as benchmarks to measure improvement of the institution as a whole as well as each program. Goals can be set as an annual incremental increase or set as a static goal, such as 50 percent for employer survey returns.

Summary/Analysis:
Employer Response Rate 45%
Overall Job Performance Rating:
Communication 82%
Critical Thinking 75%
Information Use 88%
Quality of Work 85%
Multi-Tasking 67%
Technical Knowledge Proficiency 94%
Professional Attitude 85%
The employer response rate improved, but continued effort is needed to generate a better response. All established baseline goals were met or exceeded, with the exception of communication and multi-tasking.

Use of Data to Improve:
Career Services maintains bi-weekly contact with employers and graduates. Phone calls are made to unresponsive employers encouraging them to complete the survey.
The recommendation is that all programs place more emphasis on communications, critical thinking, and multi-tasking. More case studies, role-playing, and practical applications in these areas will be included in all third- and fourth-quarter courses.
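Read against the benchmark goals set in the Employer Survey example under item ii above, the basis for this analysis becomes explicit. The side-by-side pairing below is illustrative only; it simply matches the goal figures listed earlier with the results reported here:

Employer Response Rate: goal 75%, result 45% (below goal)
Communication: goal 84%, result 82% (below goal)
Critical Thinking: goal 75%, result 75% (goal met)
Information Use: goal 80%, result 88% (goal exceeded)
Quality of Work: goal 80%, result 85% (goal exceeded)
Multi-Tasking: goal 70%, result 67% (below goal)
Technical Knowledge Proficiency: goal 85%, result 94% (goal exceeded)
Professional Attitude: goal 85%, result 85% (goal met)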

v. Identify how data were used to improve the educational process. At least annually, monitor the activities conducted, which include systematically collecting data/information on each of the elements; analyzing the data/information and comparing it with previous findings; and, based on the findings, identifying changes to be made in educational activities.

An institution may offer an exemplary educational program in terms of curriculum, but for one reason or another, the educational processes are not allowing the content to be delivered effectively to the students. By analyzing the data in the PEP, such as employer, graduate, and student surveys and faculty professional development, an institution is able to change the process to enhance the program or to change the program entirely.

Summary/Analysis:
Professional development participation, as documented in professional development plans, increased and included all full-time and adjunct faculty. Professional development was tied to annual evaluations, licensing requirements, and student evaluations to directly address identified areas of improvement. Five of the nine faculty attended two field-related workshops, three renewed licenses, and all attended at least two of the four in-services.

Use of Data to Improve:
Faculty participation in professional development activities throughout the system has been widespread and generally successful. Full-time faculty members have usually followed through to complete their planned activities, thereby benefiting the College and the students. Part-time faculty members have attended a variety of training sessions organized by the College. For instance, quarterly Professional Development sessions on campuses have been beneficial to full-time and part-time instructors, as indicated by faculty evaluation results from the sessions. Individual faculty plans will include at least two field-related seminars with some financial support, and campus-wide training will continue to be implemented based on instructional needs, as determined by faculty members, the Dean's evaluation, mid-term and end-of-quarter student evaluations, and other assessment data.

vi. Adjust goals as a result of the evaluation of a PEP, based on an assessment of community and employer demand for graduates, which justifies the continued need for a program.

At this juncture, it is advantageous to identify those responsible and to establish periodic review dates to ensure that progress toward the new goals is on track or, if not, to determine why, develop new strategies, and/or adjust the goal.

Goal: Improve the employer survey return rate by hand delivering the surveys to respondents.
Who Responsible: Clinical coordinator
Review Dates: April 30, 2012, and November 30, 2012
Summary/Analysis: As of 4-30-12, employer survey return rates have improved by 30%.
Strategy Adjustment: Continue through 11-30-12; modify if a decline in percent returns occurs.

vii. Identify the activities that will be undertaken to meet the goals set for the next year.

Problems/Deficiencies and Specific Activities

Problem/Deficiency: Inconsistent externship recordkeeping
Specific Activities: The externship coordinator will contact the externship supervisor two days prior to the report due date. Three days following the due date, if the externship report has not been received, the externship coordinator will contact the externship supervisor.

Problem/Deficiency: ATB counseling not frequent enough and not meeting ABHES standards
Specific Activities: Institute required mentoring and tutoring for 1st- and 2nd-term students. Develop a standardized reporting form and procedures for ATB counseling by 7/31/-- and distribute to all faculty. The DOE is to monitor to ensure that counseling is provided weekly and filed.

Problem/Deficiency: Low Employer Survey return
Specific Activities: Within 10 days of surveys failing to be returned, the career services department will make a follow-up phone call to those delinquent.

Problem/Deficiency: Inadequate Graduate Survey return
Specific Activities: Within 10 days of surveys not being returned, the career services department will make a follow-up phone call to those delinquent.

Problem/Deficiency: Faculty files incomplete
Specific Activities: The dean of education will review faculty files quarterly and contact faculty who have not submitted professional development documentation. A quarterly memo will be sent to all personnel reminding them that all credentials received, CEUs, etc., are to be submitted to the DOE promptly after receipt.

Problem/Deficiency: Professional development
Specific Activities: Increase internal monitoring of professional development for all instructors to quarterly. Request credentialing information every 90 days. Require instructors to obtain two CEUs per year.

FORMAT EXAMPLES*


All PEPs must contain the following:

Title Page to include all of the following:

ABHES I.D. Code
Name of Institution
Address
City, State, and Zip Code
Name of Program
Name of Program Director
Credential Awarded
Portion of the program, if any, offered via distance learning
Length of program (clock hours, semester/quarter credits, weeks, etc.)

Introduction
Schedule of Review
Institutional Mission and Objectives
Program Description and Objectives
Student Population
Overall Student Population
Program Student Population

FORMAT EXAMPLE 1 – Program Retention

Retention statistics are extracted from reports in CampusVue. The annual period used to measure retention for the purposes of accreditation is July 1 through June 30. The following program retention rates were determined using the ABHES formula as follows:

(EE + G) / (BE + NS + RE) = R%

EE = Ending enrollment (as of June 30 of the reporting period)
G = Graduates
BE = Beginning enrollment (as of July 1 of the new reporting period)
NS = New starts
RE = Re-entries
R% = Retention percentage
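For illustration only, a hypothetical application of the formula (the figures below are invented and are not drawn from any institution's report): a program with a beginning enrollment (BE) of 100, 30 new starts (NS), 5 re-entries (RE), 40 graduates (G), and an ending enrollment (EE) of 85 would calculate its retention rate as

R% = (85 + 40) / (100 + 30 + 5) = 125 / 135 = 92.6%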

Retention Percentage

Program                      FY 2008    FY 2009    FY 2010
Medical Assistant, Diploma   57.46%     69.7%      75.0%

Summary/Analysis: The Medical Assistant Diploma Program started December 2004. Retention rates increased in Fiscal Year 2010 due to the implementation of several factors: biweekly meetings with the program chair, weekly attrition meetings, more accountability for instructors, and campus-wide efforts to retain students.
Rationale: To have the most current picture of who is in the program in order to monitor participation and progress, and to build retention plans based on student needs.
Retention Goal: The retention goal for FY 2012 is 79%.
Responsible Parties: Academic Dean, Program Chair, Faculty, and other staff as applicable.
Review Dates: Retention of the program will be monitored weekly, and in more depth in monthly meetings.
Types of Data/Methods of Assessment: Daily attendance reports, weekly retention meetings, monthly retention reports, faculty and student surveys, and quarterly retention reports. Reports are reviewed with team members: Registrar, DOE, and Program Chairs.
Continuous Improvement Strategies:

• Daily monitoring of student attendance by the Instructor, Program Director, Academic Dean, and the Career Services Representative to proactively identify students at risk of withdrawing or being withdrawn. Regular meetings will be held to discuss at-risk students, and action plans will be developed both individually (advising) and collectively to address the challenges facing students at risk of dropping.

• The Academic Dean will hold daily meetings with the Program Directors to identify each student absent four or more days.

• All absent students will be called daily by their Instructors, Program Directors, and/or Academic Dean, preferably before their regularly scheduled class is over.

• Faculty will be coached on holding students accountable for attendance and for developing engaging classrooms. If specific classes are identified as having lower than average attendance and/or retention, coaching and developmental activities will be implemented for the faculty member(s). Outstanding attendance and recognition awards will be given to instructors who have the highest retention and attendance each quarter.

Job Placement (Repeat sections for each of the elements to be evaluated)

Placement Percentage –

Program   FY 2008   FY 2009   FY 2010

Summary/Analysis:
Rationale:
Goal:
Responsible Parties:
Review Dates:
Types of Data/Methods of Assessment:
Continuous Improvement Strategies:

Credentialing Examination Participation Rate –
Credentialing Examination Pass Rate –
Program Assessment Exam –

Surveys –
Student Surveys
Clinical Affiliate Surveys
Graduate Surveys
Employer Surveys

Faculty Professional Growth and In-Service Activities –


FORMAT EXAMPLE 2 – MA program started in FY 2011

Program: MA

Where are the program objectives published?
The program objective is stated on page 15 of the 2011-2012 Michigan Campus Catalog.

How does the program determine that graduates have achieved the objectives (e.g., surveys, credentialing exam)?
Graduation requirements are stated on page 9 of the 2011-2012 Michigan Campus Catalog. The student must have a grade average of 70% or higher, with no less than 60% in any individual course, and attend no less than 85% of scheduled classroom days. In addition, the student must successfully complete an externship.

Who reviews the data? What process is utilized (e.g., semiannually by advisory committee)?
Data are reviewed by campus and corporate leadership on a weekly basis through retention and placement reports, and are also reviewed by the campus advisory board twice annually.

What changes have resulted from data review?
Currently no changes are in place for the Canton campus, as this is the first year of operation.

Date of most recent program review
The most recent Advisory Board meeting was held in January 2011.

PROGRAM RETENTION RATE:

Program   2008-2009   2009-2010   2010-2011*
MA        n/a         n/a         82%

Goals for 2011-12: Maintain a retention rate of 82%
Responsible Party: Director, Faculty, and All Staff
Review Dates: Weekly

Summary/Analysis:
The Canton campus achieved a successful 2010-2011 year, with the first class opened October 4, 2010. All 11 students completed the program. All 2010-11 classes achieved a satisfactory 82% retention rate. Although some withdrawals were unavoidable due to serious medical conditions, it is to be noted that some students simply left the program due to a change in their vocational desires. Ensuring prospective students understand what the program fully entails, including academic and attendance requirements, will result in a more committed student population and a higher retention rate.

Improvement Strategies:
Admissions representatives are trained to uncover issues that may result in students having to withdraw from the program and to assess preventive strategies with the students so they are prepared to complete the course. Faculty members are encouraged to communicate concerns directly to the Director when a student is struggling with the program. When a student's attendance falls below 85%, the attendance card is brought to the front desk so the Director is aware of the student's attendance. The student is also counseled by the Director on attendance requirements.

Job Placement (Repeat format for each of the elements to be evaluated)

*Thanks to Everest College, McLean, Virginia, and Ross Medical Education Center, Canton, Michigan, for agreeing to share their PEP formats for this Manual.


OTHER EXAMPLES:

Examples of changes to a process that might enhance a program:

If a course requires outside lab or practice time and an analysis of the students' actual lab or practice time demonstrates that the students are not completing the required hours, formally scheduling those hours or adding additional laboratory time may dramatically increase the effectiveness of that course.

If an analysis of the data demonstrates that a large number of students are failing a specific course or are withdrawing in excessive numbers from a particular program, the institution may change the prerequisites for that course or offer extra lab hours or tutoring to see if the failure or withdrawal rates are positively affected.

Examples of changes to a program that might enhance a program:

If the analysis of the data indicates that large numbers of students are dropping or failing a course when taught by a particular instructor, the instructor may need additional training, or a different instructor may need to be assigned to teach that course.

If surveys from employers and graduates indicate that a particular software program should be taught to provide the students with up-to-date training according to industry standards, the institution could add instruction in the use of that software program, but only after the institution is assured that the instructor has been properly trained in its use.

CONCLUSION

The PEP is a working document used as a resource to continually identify and assess a program's goals, which have been established to meet its educational and occupational objectives. An effective PEP is regularly reviewed by key personnel and used in evaluating the effectiveness of each program and the overall operations of the institution. It is important for each institution to establish a plan that exhibits the institution's progress toward providing the highest quality education for its students, to be presented to ABHES evaluation teams, government regulatory groups, and the general public.

"Of all our human resources, the most precious is the desire to improve." – Unknown
