
1B.1: The institution demonstrates a sustained, substantive and collegial dialog about student outcomes, student equity, academic quality, institutional effectiveness, and continuous improvement of student learning and achievement. Evidence of meeting the standard:

The college engages in a continuous, substantive dialog about student outcomes, and structures its dialog through department meetings, division meetings, regularly scheduled meetings of the Student Learning Outcomes Assessment Committee and the Program Review Committee, and Professional Development Day activities. Dialog about student outcomes also takes place between faculty and students through green sheets, which must include the learning outcomes for each course. Student Learning Outcomes (SLOs) are assessed on a regular schedule. SLO assessment is done by full-time faculty lead instructors in TracDat, a software application that provides academic and non-academic departments with a mechanism for managing their assessment needs. One SLO per course is assessed each semester, one Program Learning Outcome is assessed per program each semester, and one Institutional Outcome is assessed at the end of each semester by the Student Learning Outcomes Assessment Committee. The results are evaluated by the appropriate faculty members in order to improve instruction and further student success. (IPRA, pg. 15) The assessment of SLOs should be coordinated with updates to the Course Outline and with the completion of the Comprehensive Program Review, both of which occur every four years. All program-level SLOs should be assessed at least every 4-6 years. (SJCC: Timelines and Reporting Forms)

The Comprehensive Program Review Form contains the following questions regarding learning outcomes:

• How do the Program Student Learning Outcomes (PSLOs) align with San Jose City College’s Institutional Student Learning Outcomes (ISLOs)?

• In what capacity have your program and course SLOs been mapped? Please provide three examples of how the course SLOs map to the Program SLOs.

• Indicate how program and course-level Student Learning Outcomes (SLOs) are assessed on a regular basis.

• How have Student Learning Outcome (SLO) assessments and program data been utilized to improve instruction in the program? Please share one or two success stories about the impacts of SLO assessment on student learning.

An example of results from this dialog is illustrated in an SLO success story from Gail McElroy, who teaches journalism: “When I started teaching at SJCC in 2013, SLOs were new to me. The first couple of semesters I used a survey and/or quiz to assess whether the SLO had been met. I quickly learned that these assessment tools did not really assist me in knowing if the students really learned the material or were just good at memorizing. It certainly did not change or improve my instruction methods. Consequently, with subsequent semesters, I began to really examine the SLOs before the semester started in order to build in assessments along the way that would best provide feedback to me on whether or not students were truly learning key concepts for the course, as well as how I could modify the course the next time around to better achieve learning outcomes for the students. In Spring 2015 I received the desired results

Comment [BJ1]: about what subjects has the college engaged in dialog?

Comment [JB2]: Question 1B1: How has the college structured its dialog?

Comment [BJ3]: How has the college engaged in dialog

Comment [BJ4]: When has the college engaged in dialog

Comment [BJ5]: How has the college engaged in dialog

Comment [JB6]: Question 1B1 What impact has the dialog had on student learning?


by truly being able to monitor my students’ progress and have an accurate reflection of how well students were able to apply the learned material into demonstrated skills. The course was JOURN 22 (newswriting and reporting). The assessment involved reviewing three different news stories written during the beginning, middle and end of the semester, with increasing levels of complexity. Although this took longer than usual to review, analyze and assess, it gave me true insight into whether the students truly grasped and could demonstrate success for the specified SLO. After the assessment was complete, I was able to modify the Fall 2015 course content to enhance areas that might improve student learning.”

Further dialog about the use of data to determine the success of student outcomes takes place during the learning outcomes assessment process, when faculty evaluate the results of the assessments. Recommendations made by instructors regarding the success or failure of the learning outcomes can then be incorporated into planning for improvements to the course or program via the Program Review process. For the past three years, the Office of Institutional Effectiveness and Student Success (formerly the Office of Research and Institutional Effectiveness) has contributed to campus professional development days by providing a presentation on how to find and use data. The purpose of these meetings is to ensure a collective understanding of the common definitions used in collecting data on student success. More advanced workshops are provided throughout the year by the campus-based researchers on an individual and small-group basis. At the annual Deans Academy, the leadership at both campuses gets together to discuss, among other things, cycles of assessment to improve student learning. In 2015, the Deans Academy included a discussion about program review and how to utilize the new tools in CROA (Colleague Reporting and Operating Analytics). In addition, the deans participated in a discussion about student educational planning. Each year, the IESS office participates in the Deans Academy with the goal of continuous dialogue about the meaning of data and research as it pertains to the evaluation of student learning. This dialog about student learning outcomes assessment leads to a collective understanding of the meaning of evidence and data used in the evaluation of student learning through evaluation of results in Course and Program SLOs. (from Tamela Hawley, DO IESS)

The dialog about student equity is based in the Student Equity and Success Committee, which reports directly to the College Advisory Council. The charge of the committee is as follows:

• The San Jose City College Student Success and Equity Committee will serve as an instrument for college-wide collaboration with regard to student access, achievement and engagement.

• The committee will provide leadership for the planning, implementation and evaluation of a comprehensive student success plan that includes, but is not limited to, components of the Student Success and Support Program plan, Student Equity plan and Basic Skills Initiative.

• In addition, the committee will monitor and make recommendations to the College Advisory Council and Academic Senate.

In January 2015, the Student Equity and Success Committee published the San Jose City College Student Equity Plan. This plan identifies the target groups (subpopulations)

Comment [JB7]: Question 1B1 Does the dialog lead to a collective understanding of the meaning of evidence, data, and research used in evaluation of student learning?

Comment [JB8]: Question 1B1 Does the dialog lead to a collective understanding of the meaning of evidence, data, and research used in evaluation of student learning?

Comment [BJ9]: about what subjects has the college engaged in dialog?


of at-risk students and discusses goals and plans to assist these groups. Using campus-based research data on access, course completion, ESL and Basic Skills completion, and degree/certificate completion, the committee put together a Goals and Activities rubric with 4 main goals that correspond to the data gathered, activities related to achieving those goals, the expected timeline for achieving the outcome, and the responsible party or parties. (SJCC Student Equity Plan, pp 20 – 34) The dialog about student equity leads to a collective understanding of the meaning of evidence and data through the disaggregation of relevant data to determine which student subpopulations are disproportionately at risk, and what activity could help mitigate that risk. An example of such an activity is the Summer Bridge Program, which is designed to help students successfully complete one basic skills course prior to their first semester at SJCC. Starting in Summer 2014, students who place into Basic Skills Math, English or Reading have the opportunity to complete one of the necessary Basic Skills courses. Summer Bridge participants also take a Guidance class and receive supplemental instruction and tutoring for the class(es) they are taking.
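The Student Equity Plan itself governs how disproportionate impact is actually measured; as a minimal illustration only, the sketch below applies the 80-percent index commonly used in California student equity work, flagging any subpopulation whose completion rate falls below 80 percent of the highest-performing group's rate. The group labels and rates are invented for the example.

# Illustrative sketch only; the Student Equity Plan's own methodology governs.
# Flags groups whose completion rate is below 80% of the highest group's rate.
completion_rates = {
    "Group A": 0.58,
    "Group B": 0.62,
    "Group C": 0.71,
    "Group D": 0.74,
}

reference_rate = max(completion_rates.values())  # highest-performing group

for group, rate in completion_rates.items():
    index = rate / reference_rate
    status = "potential disproportionate impact" if index < 0.80 else "proportionate"
    print(f"{group}: completion {rate:.0%}, index {index:.2f} -> {status}")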

The dialog about academic quality involves reviewing and assessing the mission statement, which is done through assessment of the Key Performance Indicators (KPIs). The KPIs identify college-wide measures of effectiveness that help the College assess progress in meeting the Strategic Goals and in fulfilling the Mission of the College. In May of 2014, the Strategic Planning Committee held an expanded retreat to align the KPIs with State-collected data. In the Fall 2014 semester, the campus community selected six KPIs as the most urgent to address and continued by proposing various strategies for that work. The mission is also measured through the accomplishment of the Institution-Set Standards, which are currently being reviewed and updated by a taskforce put together by the Academic Senate and the Vice President of Instruction. Data and evidence from Program Reviews and Student Learning Outcome reports also provide indicators for the fulfillment of the college’s mission. The Comprehensive Program Review forms for all programs contain a question about how the program Student Learning Outcomes or Service Area Outcomes align with and support San Jose City College’s mission and/or Institutional Student Learning Outcomes (ISLOs). As the body representing the faculty in the shared governance structure, the Academic Senate is very involved in the dialog about academic quality. The dialog about the quality of the academic curriculum is based in the Instructional Policies and Curriculum Committee (IPCC), which reports to the Academic Senate. The IPCC oversees and approves the college’s instructional policies and curriculum. The Academic Senate may question the process but not the curriculum decisions of the IPCC. Dialog takes place between faculty who are developing or revising curriculum and programs, their area deans, members of the IPCC, the Academic Senate and the Board of Trustees.

The Distance Education Committee is another Academic Senate committee; its charge is to develop policies and promote practices that contribute to the quality and growth of distance education at San Jose City College. The parties involved in this committee are faculty and administrators. The committee supports student success in Distance Education by making recommendations to the Academic Senate, College Advisory Council, and College President regarding:

Comment [BJ10]: Question 1B1 Does the dialog lead to a collective understanding of the meaning of evidence, data, and research used in evaluation of student learning?

Comment [BJ11]: What impact has the dialog had on student learning?

Comment [BJ12]: about what subjects has the college engaged in dialog?

Comment [BJ13]: When has the college engaged in dialog

Comment [BJ14]: about what subjects has the college engaged in dialog?

Comment [JB15]: Question 1B1 What parties are involved in the institution’s dialog about the continuous improvement of student learning through DE/CE mode and how it compares with student learning in traditional programs?


• Curriculum and instruction, evaluation and assessment, course design, accessibility, technology, infrastructure, and academic support services that affect all modes of distance education course delivery
• Accreditation compliance
• Ongoing faculty development and training in the areas of pedagogy and technology
• Online student support and training
• Make recommendations to the IPCC:
o Regarding the D.E. Supplement Form
o Upon request, about Distance Education as an appropriate mode of learning
• The consensus model shall be used for making recommendations.

The DE committee is actively engaged in writing the Distance Education plan. It has also recommended that the Academic Senate approve the adoption of the Online Education Initiative’s Standards for Online Education. The committee also recently developed a faculty evaluation form for online courses, which is slightly different from the form for traditional classes. The committee has discussed making changes to the student evaluation to add language specific to distance education, such as the frequency of faculty-student contact. Currently, there are not many differences between the DE program and the traditional mode. A few examples of differences are in areas such as curriculum development, where DE courses require a separate form in CurricUNet that includes fields for Distance Education Method(s) of Instruction, the manner in which lab content is presented for Lecture/Lab or Lab-only courses, Distance Education Contact Method(s), an explanation of how methods of evaluation are accomplished, a description of a sample assignment for the course, and a description of accommodations for students with disabilities.

Further dialog regarding the academic quality of Basic Skills courses is the purview of the Basic Skills Initiative (BSI), which also reports directly to the Academic Senate. The charge of the BSI is to assist Basic Skills and ESL students at SJCC to achieve levels of success by providing ongoing information and support to faculty, staff and administration based on best practices and research-driven data. To meet this goal, the Basic Skills Initiative Group shall meet on a regular basis to make recommendations to the Academic Senate related to the planning, implementation, assessment and funding of the projects and goals related to the Basic Skills Initiative on our campus. An example of an intervention designed by the BSI is the Math In-Class Tutors program. In Spring 2013, ten tutors were assigned to twenty sections of three different developmental math courses. Six of these tutors were also Supplemental Instruction leaders who held weekly out-of-classroom workshops designated for students enrolled in their assigned sections. A pre- and post-survey was administered in the courses; 315 students participated. The survey included two questions that directly assessed students' perceptions of the in-class tutoring:

• Question 12: Did working with the tutor help you succeed in the course?
• Question 13: Would you recommend that this in-class tutoring program continue next semester?

Comment [JB16]: Question 1B1 What impact has the dialog had on student learning?


In response to Question 12, 74 percent of the participants responded 'Yes' and 26 percent responded 'No'. In response to Question 13, 89 percent of the participants responded 'Yes' and 11 percent responded 'No'. Based on these results, the project was determined to be successful for the majority of students enrolled in these courses. It is interesting to note the discrepancy between the two questions: a larger percentage of students would recommend that the program be offered in the future than reported that the tutoring helped them succeed themselves.

The dialog about institutional effectiveness is truly a campus-wide dialog. In terms of planning for ongoing institutional effectiveness, the dialog takes place in the Strategic Planning Committee (SPC) and the College Advisory Council (CAC). The purpose of the SPC is to ensure that the college's strategic planning process is sound, collaborative, evidence-based, and sustainable; and that the Strategic Plan guides decision-making and activities that support improvement of institutional effectiveness and student learning and success. This purpose is consistent with the Sustainable Continuous Quality Improvement level in the Accrediting Commission's planning rubric, which includes the following criteria:

• The institution uses ongoing and systematic evaluation of Key Performance Indicators (KPIs) and planning to refine its key processes and improve student learning.
• Dialogue about institutional effectiveness that is ongoing, robust, and pervasive.
• Data and analyses are widely communicated and used throughout the institution.
• Document ongoing review and adaptation of evaluation and planning processes.
• Consistent and continuous commitment to improving student learning and success.
• Educational effectiveness is a demonstrable priority in all planning structures and processes.

The charge of the College Advisory Council is to counsel and to make recommendations to the President in matters that involve and affect the college as a whole, to include recommendations from standing college committees and recommendations from other campus groups or councils formed to study specific topics or issues affecting the campus at large. As the primary college council, the CAC serves as the umbrella group for the various shared governance committees on campus and will bring forward to the President its recommendations, as appropriate, for consideration and implementation. The objective of the CAC is to disseminate critically important information, to ensure open communication, to promote genuine involvement before and while decisions are made, and to provide for inclusive participation of all campus constituencies.

Standard 1B.2: The institution defines and assesses student learning outcomes for all instructional programs and student and learning support services. (ER 11)

Evidence of meeting the standard:


SJCC has several established policies and processes in place to ensure that all instructional programs and student learning support services define and evaluate student learning outcomes. Board Policy 4020 outlines the structure of curriculum development and the role of the Academic Senate. As a subcommittee of the Academic Senate, the IPCC has the responsibility to ensure that a comprehensive, coherent curriculum is offered by the College appropriate to its mission. It is charged with reviewing and approving curriculum within the parameters of Title 5 standards for course development, including the Course Outline of Record: title, description, content, SLOs, methods of instruction and evaluation, grading, assignments, textbooks, and articulation if appropriate.

The established institutional processes for course development are outlined in the document Program and Course Approval Handbook from the California Community College Chancellor’s Office.

As part of the curriculum development process, all courses must have defined student learning outcomes that are linked to program learning outcomes. The curriculum form in CurricuNet has a section for entering student learning outcomes and the methods by which they will be assessed. In TracDat, all course student learning outcomes are defined for every course and all program learning outcomes are defined for every program. The process for evaluating courses and programs through assessment of student learning outcomes is defined in the document SJCC: Timelines and Reporting Forms, put together by the Student Learning Outcomes Assessment Committee. The steps include identifying SLOs for the program and course, developing a grid to demonstrate alignment between Course, Program, Degree and Institutional SLOs, identifying at least one direct or indirect measure to be used for assessment of one outcome, and performing assessment by gathering data for one outcome. Once the assessments are complete, the next steps are to review the data and summarize the strengths and weaknesses of the program or courses based on these assessments, create assessment plans and timelines for program or course improvement, including the date of the recommended next assessment of the same SLO, and then implement the assessment plan.

The institution uses disaggregated data for analysis of student learning. As part of the program review process, all programs are supplied with disaggregated data from the college researcher or the Institutional Effectiveness and Student Success services at the District Office. Data is disaggregated by gender, race, ethnicity, and age. Further information is available through the Datamart feature of the California Community College Chancellor’s Office, though use of this data is not required.
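As a minimal sketch of the kind of disaggregation described above, the example below groups hypothetical course records by ethnicity, gender, and age band and reports success rates. The column names and sample records are invented; the actual reports are produced by the college researcher and the IESS office.

import pandas as pd

# Hypothetical course-level records; success = 1 means a grade of C or better.
records = pd.DataFrame({
    "ethnicity": ["Latino", "Latino", "White", "Asian", "White", "Asian"],
    "gender":    ["F", "M", "F", "M", "M", "F"],
    "age_band":  ["18-24", "25-39", "18-24", "18-24", "25-39", "40+"],
    "success":   [1, 0, 1, 1, 0, 1],
})

# Success rate and headcount disaggregated by each demographic factor.
for factor in ["ethnicity", "gender", "age_band"]:
    summary = records.groupby(factor)["success"].agg(["mean", "count"])
    summary = summary.rename(columns={"mean": "success_rate", "count": "n"})
    print(f"\nSuccess rate by {factor}:")
    print(summary)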

Student learning outcomes and assessments are established for all courses and programs (Evidence #7 - Active Courses with Dates of Last Revised/Board Approval, email from S. Hager 6-24-15, for a listing of all courses and approval status). In order to create or update a course, the course developer must enter SLOs and their methods of assessment. Appropriate methods of evaluation are used to assess student achievement of SLOs. (Evidence #8) The course revision cycle is every 5 – 6 years for all but Career and Technical Education courses, which are updated every 2 years.

Comment [JB17]: Question 1B2 What established policies and institutional processes guide the development and evaluation of courses, programs, certificates, and degrees?

Comment [JB18]: Question 1B2 Does the institution use disaggregated data for analysis of student learning?

Comment [JB19]: This is required by Title 5 section 55002 (c)(3), a "course is described in a course outline of record that shall be maintained in the official college files and made available to each instructor." The course outline of record shall specify, (among other items), "objectives".

Comment [JB20]: http://sjcc.curricunet.com (course outlines, including SLOs and corresponding assessments and course methods of evaluation)).

Comment [JB21]: according to State regulations regarding prerequisites, co-requisite, and advisory - Section 55003, "...These processes shall provide that at least once every six years all prerequisites and co-requisites established by the district shall be reviewed...These processes shall also provide for the periodic review of advisories."


Non-credit courses are being developed to serve as a bridge into college level coursework for remedial students; these courses will follow the same development processes as credit courses.

The courses taught each semester are evaluated through assessment of Course Student Learning Outcomes, under the Course Assessment Plan section of TracDat. The results of the evaluation are reported in TracDat under the Results tab, which provides a place to report whether the criteria for success were met and an analysis of the findings based on the results of the SLO assessment. Recommendations are optional, but encouraged. Program SLOs are also evaluated through TracDat. The Program Assessment Plan section functions similarly to the Course Assessment Plan section, in that the Program Student Learning Outcomes are assessed periodically and the courses are mapped to the PSLOs depending on the level of competency expected in each course. Programs are also evaluated through Program Review, which uses data from SLO assessments to plan improvements. Programs are evaluated on a four-year cycle, with the first three years in Annual Program Review and the fourth year in Comprehensive Program Review. Certificates and degrees are evaluated when a change is made.

The process for evaluating programs is defined by the Program Review Committee in its documents 2015-2016 Annual Program Review Form and Instructions - ACADEMIC AFFAIRS and 2015-2016 Comprehensive Program Review Form and Instructions - ACADEMIC AFFAIRS. These documents provide the purpose and function of Program Review, as well as the structures for completing the reviews. Program review is either annual or comprehensive depending upon where the department, program, or service falls on the program review cycle. Program Review allows for analyzing the College’s instructional, instructional support, student services, and administrative services areas to identify the following: strengths and weaknesses, solutions to weaknesses, how each has achieved or is aligned with college strategic goals, and the equipment, staff, and facilities needs for budget requests. In addition, the program review includes the status on the development and assessment of student and program learning outcomes. (IPRA pg 15) Certificates and degrees are evaluated when they are developed or revised. There currently is no established cycle of evaluation for certificates and degrees.

• [Need evidence of courses, programs, certificates and degrees improvements]

As an example of improvements made to a program as a result of program evaluation, the physics department determined the need to formally incorporate transferable skills as outcomes into its programs and has begun the process to accomplish this. Six outcomes have been identified that focus on science communication, teamwork, information research, problem solving, data analysis and instrumentation. A working group has been formed that is developing an implementation guideline for each of the 5 physics courses. Additionally, physics faculty are working with faculty in communications and the library to receive training on incorporating these skills into the curriculum (science communication/teamwork and information research, respectively).

The Board of Trustees for the SJECCD has adopted policy 5050.2 on the topic of student success and directed general fund budget resources to support efforts to improve success rates. SJCC has launched several academic and student support services initiatives to promote greater student success. These initiatives are in addition

Comment [JB22]: Question 1B2 What improvements to courses, programs, certificates, and degrees have occurred as a result of evaluation?

Comment [JB23]: Question 1B2 How does the institution provide for systematic and regular review of its student and learning support services? How are the results used?


to the support services listed in Chapter V.B., Scan of Conditions Internal to San Jose City College. Some of these efforts are the result of planning activities described in Chapter II, Context for the Educational Master Plan.

Starting in summer 2015, the College is initiating the 2021 Scholars Program as a pilot effort. Funded through a Hispanic-Serving Institutions (HSI) federal grant, the core of the program is the Caminos Project, which seeks to enroll 120 high-risk students and graduate 95% of them by 2021. The Caminos Project builds on previous successful strategies used at the College in the METAS program and on well-researched best practices described in Basic Skills as a Foundation for Success in the California Community Colleges (aka the “Poppy Copy” review of research literature on effective practices in basic skills programs). These students will experience a summer school basic skills bridge program of instruction that will lead into a cohort-based first-year experience during academic year 2015-16. Each participant will develop a student educational plan (SEP) and be afforded an opportunity to register simultaneously for classes in three future terms. To the extent possible, the College intends to schedule classes on days and at times that these students would prefer to attend. Successful interventions such as supplemental instruction, peer-led team learning (PLTL), and tutoring will be provided to the program participants. A number of student support services will be used to assist the students, such as counseling and career guidance, vouchers for textbooks, Board of Governors fee waivers, and assistance in securing financial aid. (EMP)

The College has been providing several interventions coordinated through its Basic Skills Initiative (BSI). BSI resources have funded some of the activities, while others have also used Student Equity (SE) Plan dollars and/or Student Support Services and Program (SSSP) Plan funding. The more recent interventions include in-class and out-of-class tutoring from the Reading and Writing Center and a math textbook loan service (the Avanzamos program). Tutoring has now been extended to the ESL laboratory. Training for all tutors has been upgraded through instruction and certification provided by the International Tutor Training Program offered through the College Reading and Learning Association. The new student orientation has been revamped, and ESL advisement has been revised to be co-conducted by counselors and ESL faculty. A workshop series of three required sessions was developed for academically disqualified students. Students on academic probation are strongly encouraged to attend the workshops. The English and Reading faculty are conducting a trial of an accelerated composition curriculum redesign effort. Supplemental instruction has been initiated and expanded in basic skills math courses. A direct-push college success strategy is now conducted in basic skills English composition courses. That strategy involves a one-hour, in-class workshop on getting organized, planning ahead, and test-taking techniques. During the summer of 2014 a writing sample pilot was conducted in which composition course placements were made based on both the scores earned on the American College Testing (ACT) COMPASS placement exam and an evaluation of a writing sample. Most placements were higher than would have been the case using only the COMPASS exam results. English faculty members have reviewed the basic skills migration behavior of students and are now pursuing the research question: why do successful students not persist in the curriculum sequence? Math faculty members have also explored how placement cut scores are assigned and used. Some who teach Elementary Algebra have


engaged in an experiment to trace the impact of traditional instruction versus selected best-practices pedagogies. The math department is engaged in developing a five-year strategic plan based on its research to locate evidence-driven ideas that have produced greater student success in the basic skills curriculum.

The Student Affairs professionals have joined their academic faculty counterparts in the efforts to improve student success as noted above. The Student Affairs faculty and staff have also undertaken several efforts on their own. Currently, they are aggressively engaging students to complete a Student Education Plan. They are piloting the use of the Datatel student educational plan and the degree audit software. In spring 2015 they launched a pilot of the early alert software program. For new students there is an effort to promote engagement in communities of career interests so that groups of students can bond with others around common career fields of study. Counseling faculty completed a three-day On-Course workshop organized around student success principles and a learner-centered approach to teaching. Subsequently, they revised all of the counseling course materials. Counselors are also beginning to plan for linked courses with academic faculty members. The linked courses would include the GUIDE 130 (College Success) course, a basic skills math or English course, and one course in the student’s career field. Enrollment Services personnel and counseling faculty continue to offer the “Super Saturday” strategy to efficiently and effectively reach out to high school seniors who are prospective students for the College. They also jointly host the Male Summit to encourage young men of color to consider attending the College. (Interventions to Improve Student Success pg 122 EMP)

The established policies and institutional processes that guide the development and evaluation of courses and programs offered in DE mode are largely the same as for traditional courses, in terms of course development through the IPCC and evaluation through SLO assessment and Program Review. State regulations require that the quality standards of the traditional classroom be applied to distance education; Section 55372 (Course quality standards) states, "...The same standards of course quality shall be applied to distance education as are applied to traditional courses in regard to the course quality judgments made pursuant to the requirement of Section 55002 of this Part, and in regard to any local course quality determination or review process." There are some differences, in that faculty must fill out a Distance Education Supplement form in CurricUNet for all DE courses, including hybrid courses. The form includes questions about DE methods of instruction, the manner in which lab content is presented, DE contact methods, methods of evaluation, and accommodations for students with disabilities.

At the present time, the Distance Education Committee is concentrating on developing College policy regarding appropriate professional development for faculty who are interested in teaching online, defining key concepts such as regular effective contact between an instructor and students, course review standards for potential online delivery, and instructor evaluation procedures when teaching an online class. The Committee is promoting a College review of the Canvas course management software and the larger California Online Education Initiative (OEI). The Senate and College Advisory Council review the DE Committee proposals and will eventually review the DE Plan to promote integration with other plans at the college. This Plan is integrated with other plans of the institution through the Plan review processes and through its

Comment [JB24]: Question 1B2 What established policies and institutional processes guide the development and evaluation of courses and programs offered in DE/CE mode? Are they different from the policies and institutional processes that guide the development and evaluation of courses offered in traditional mode?


activities. SJCC has had neither a long nor a strong history of offering distance education instruction through the Internet. Internet-based instruction was not offered prior to fall 2011. From fall 2011 to fall 2014, online instruction represented only 6.5% of the FTES generated, while non-distance education instructional methods accounted for 93.5% of the FTES. (EMP)

Discipline expertise or teaching knowledge in the field of DE is used for establishing quality in DE courses through the initiative of the individual instructors. In May 2015, the Academic Senate voted to approve a measure supporting the DE committee’s recommendations regarding Instructor Requirements for Distance Education. The recommendations were:

• Starting Spring 2016:
o Participate in one professional development activity per regular academic semester (Fall and Spring) related to Distance Education (an on-campus workshop on a specific feature of Moodle, an off-campus DE conference, a PDD session on online learning/features, a Webinar session, and so on) and demonstrate prior successful experience in teaching online course(s) at SJCC or another similar institution for a minimum of 4 semesters/sessions in the previous three years; OR
o Complete at least two courses in online teaching from the @One Teaching Certification Program or equivalent online teaching programs from regionally accredited institutions.
• Starting Spring 2018:
o Complete at least four courses in online teaching from the @One Teaching Certification Program or equivalent online teaching programs from regionally accredited institutions; AND
o Participate in at least one professional development activity per regular academic semester related to Distance Education (an on-campus workshop on a specific feature of Moodle, an off-campus DE conference, a PDD session on online learning/features, a Webinar session, and so on). (approved Minutes of the Academic Senate, May 19, 2015)

• The chair of the DE committee is requesting that the Academic Senate adopt the Online Education Initiative’s Standards for Quality Teaching Online, a set of 10 comprehensive standards. (@One Standards for Quality Teaching Online)

• Not all DE courses have been evaluated. There needs to be training for administrative assistants and deans in how to conduct the evaluations correctly, and there are rules in faculty contracts that must be followed in doing so. There is also a need to evaluate the DE program as a whole.

Analysis and evaluation:

Standard 1B.3: The institution establishes institution-set standards for student achievement, appropriate to its mission, assesses how well it is achieving them in pursuit of continuous improvement, and publishes this information. (ER 11)

Comment [JB25]: Question 1B2 What is the role of faculty and how is discipline expertise or teaching knowledge and expertise in the field of DE/CE used for establishing quality for these courses?

Comment [JB26]: Question 1B2 What improvements to DE/CE courses and programs have occurred as a result of evaluation?


Evidence of meeting the standard:

In response to U.S. Department of Education requirements and ACCJC expectations, the College has set a series of minimum student achievement performance standards for the institution as a whole and published them in the Educational Master Plan. For 2015 those performance standards are reflected in the table below.

Table: SJCC Institution-Set Standards, 2015, taken from the Annual Accreditation Report
• Successful Course Completion Standard: 73%
• Successful Course Completion, Fall 2014: 71%
• Completion of Degrees and Certificates Per Year Standard: 1041
• Completion of Degrees Per Year Standard: 582
• Completion of Certificates Per Year Standard: 455
• # of Students Who Received a Degree or Certificate in 2013-14: 1021
• # of Students Who Received a Degree in 2013-14: 571
• # of Students Who Received a Certificate in 2013-14: 446
• # of Students Who Transfer to a 4-Year Institution Per Year Standard: 142
• # of Students Who Transferred to a 4-Year School in 2013-14: 139
• Does the College Have Certificate Programs That Are Not CTE? Yes: CSU GE Breadth; IGETC
• # of Career-Technical Education Certificates and Degrees: 74
• # of Career-Technical Education Certificates and Degrees That Meet Employment Standards: 3
• # of Career-Technical Education Certificates and Degrees For Which the College Has a Standard for Licensure Passage Rates: 3
• # of Career-Technical Education Certificates and Degrees For Which the College Has a Standard for Graduate Employment: 0
• Other Standards Established by the College:
o % of Students That Re-enroll Fall to Fall: 67%
o % of Students Who Complete a Course With a Grade: 88%
o % of Students Who Complete a Course With a Grade of "C" or Better: 67%
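As a minimal illustration of how each reported figure can be checked against its institution-set standard, the sketch below compares the standard/actual pairs taken from the table above and flags metrics that fall short; the code itself is only an illustration, not part of the College's reporting process.

# Illustrative comparison of reported performance against institution-set standards.
# Pairs are (standard, actual) as reported in the table above.
standards = {
    "Successful Course Completion (%)": (73, 71),
    "Degrees and Certificates Per Year": (1041, 1021),
    "Degrees Per Year": (582, 571),
    "Certificates Per Year": (455, 446),
    "Transfers to a 4-Year Institution Per Year": (142, 139),
}

for metric, (standard, actual) in standards.items():
    status = "meets standard" if actual >= standard else "below standard"
    print(f"{metric}: standard {standard}, actual {actual} -> {status}")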

The U.S. Department of Education and ACCJC have communicated their expectations that colleges will also monitor the licensure examination pass rates and job placement rates of program graduates. As of 2015 those rates are captured in the tables below.

Comment [JB27]: Question 1B3 Is there broad-based understanding of the priorities and the processes to implement strategies to achieve the desired outcomes?

Comment [JB28]: Question 1B3 How does the college use accreditation annual report data to assess performance against the institution-set standards?


Licensure Examination Pass Rates (Program, CIP Code, Examination, Standard, Pass Rate):
• Cosmetology & Esthetics, 12.04, state, 80%, 82%
• Dental Assisting, 51.06, state, 85%, 86%
• Emergency Medical Services, 51.09, national, 80%, 82%

Job Placement Rates (Program, CIP Code, Standard, Job Placement Rate):
• Cosmetology & Esthetics, 12.04, 80%, 0%
• Dental Assisting, 51.06, 80%, 85%
• Emergency Medical Services, 51.09, 80%, 85%

The College has evaluated data about its performance with respect to goals it has established and the accountability framework used by the community college system. (EMP pg 106) The college is currently using an ad-hoc committee from the Academic Senate to determine its priorities and set minimum expectations for student achievement in relation to the college mission. This will inform the ACCJC annual report and will be included in Program Review. That data will be reviewed by the division deans with the departments to create action plans. The committee will review and revise the institution-set standards to be sure they include expectations for course completion, licensing examination passage rates, and job placement rates. The committee will also review and revise standards of student performance for other indicators pertinent to the institution’s mission, e.g., student persistence from term to term, degree and certificate completion, and transfer rates. While there are currently no specific goals (institution-set standards) and objectives for the effectiveness of DE activities, the committee will include them in its deliberations. We are in the process of collecting data in order to establish the standards and to raise awareness through Program Review and dean/department strategies.

SJCC has joined with other community colleges statewide in the CTE Employment Outcomes Survey (CTEOS), which will give us data about job placements for our students in CTE programs. We also use Labor Market Information data from the Centers of Excellence and EDD. Licensure reporting is done through each individual department’s program review and external accreditation.

The tables above show that we are not meeting our institution-set standards in all the areas reported. While the examination pass rates for CTE programs are above standard and two of the three reported job placement rates meet the standard, Successful Course Completion, Completion of Degrees and Certificates Per Year, Completion of Degrees Per Year, and the number of Students Who Transferred to a 4-Year School are all below the institution-set standards. The college has adopted a goals framework in response to recently enacted legislation (Education Code section 84754.6), which required the Board of Governors for the California community college system to adopt a goals framework that will encourage improvement in institutional effectiveness. The statute also required that, as a condition of receiving Student Success and Support Program funds, each college must develop, adopt and post a goals framework that addresses the following four areas: (1) student performance and outcomes, (2) accreditation status, (3) fiscal viability, and (4) programmatic compliance with state and federal guidelines. In accordance with this

Comment [JB29]: Question 1B3 What criteria and processes does the college use to determine its priorities and set minimum expectations (institution-set standards) for student achievement, including required expectations of performance for course completion, job placement rates, and licensure examination passage rates? (Federal Regulation)

Comment [JB30]: Question 1B3 To what extent does the college achieve its standards? (Federal Regulation)

Comment [JB31]: Question 1B3 If an institution does not meet its own standards, what plans are developed and implemented to enable it to reach these standards? (Federal Regulation)


mandate, San Jose City College has adopted the following required and optional goals (see pg. 122 EMP). For each indicator below, values are listed in the order: Long-term Goal, Short-term Goal, 2013-14, 2012-13, 2011-12, 2010-11, 2009-10.

Required Goal Indicators
• Student Performance and Outcomes
o Successful Course Completion: 73.3%, 71.8%, 71.3%, 70.1%, 67.7%, 67.9%, 68.9%
• Fiscal Viability & Programmatic Compliance with State and Federal Guidelines
o District Fund Balance: 7.0%, 7.0%, 16.4%, 14.6%, 11.9%, 11.1%, 6.3%
o District Audit Findings: modified, modified, modified
o Full-time Equivalent Students (2013-14 through 2009-10): 6,602, 6,907, 7,401, 8,258, 8,292
• Accreditation Status (Next Visit: Oct. 15, 2015): full accred. (long-term goal), full accred. (short-term goal), full accred. (Feb 2014), full accred. (July 2013), reaffirmed (Feb 2013), probation (July 2012), probation (Feb 2012)

Optional Goal Indicators
• Completion Rate
o College Prepared: 74.5%, 73.0%, 72.5%, 70.5%, 65.4%, 70.4%, 61.5%
o Unprepared for College: 32.6%, 31.1%, 30.6%, 34.3%, 33.3%, 32.3%, 31.6%
o Overall: 45.1%, 43.6%, 43.1%, 44.5%, 43.5%, 45.0%, 41.4%
• Remedial Rate
o Math: 27.0%, 26.5%, 25.0%, 27.3%, 25.6%, 25.2%, 22.2%
o English: 42.5%, 40.5%, 40.5%, 43.1%, 35.4%, 34.4%, 33.6%
o ESL: 22.8%, 21.3%, 20.8%, 18.9%, 16.2%, 14.2%, 18.6%
• Career-Technical Education Rate: 45.6%, 44.1%, 43.6%, 47.5%, 49.5%, 45.4%, 43.2%
• Completion of Degrees: 582, 574, 571, 564, 397, 384, 376
• Completion of Certificates: 455, 574, 446, 310, 220, 190, 251
• District Fiscal Viability & Programmatic Compliance with State and Federal Guidelines
o Salary & Benefits: 85.0%, 85.0%, 87.1%, 87.1%, 88.5%, 88.3%, 87.6%
o Annual Operating Deficiency: $0, $0, $2,579,902, $2,153,657, $384,006, $3,535,825, -$1,427,083
o Cash Balance: $30,000,000, $25,000,000, $27,051,663, $21,784,574, $14,874,245, $11,201,780, $5,702,447

[I am still waiting for answers to my questions about DE data]

Analysis and evaluation:

Standard 1B.4: The institution uses assessment data and organizes its institutional processes to support student learning and student achievement.

Evidence of meeting the standard:

Comment [JB32]: Question 1B3 Has the college defined specific goals (institution-set standards) and objectives for the effectiveness of its DE/CE activities? How are these goals and objectives defined and communicated? What data and/or evidence are used to communicate and analyze institution-set standards relevant to DE/CE?


In order to improve student learning and achievement, assessment data are incorporated into college planning through Program Review and Strategic Planning. Through the use of data disaggregated by age, race, ethnicity and gender, the campus-based researchers provide analyses of student outcomes on a regular basis. In partnership with the deans, the campus-based researchers perform analyses of student placement decisions based upon the COMPASS assessment and the correlation of student placement with subsequent success in relevant coursework. The District Strategic Plan includes indicators that track student progress on academic inputs (such as COMPASS assessments) and academic outcomes (DO Strategic Plan Goal I: Objective II, http://www.sjeccd.edu/RIE/Documents/Goal%20I%20Workplan.pdf).
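A minimal sketch of the placement-validation analysis described above, assuming hypothetical records: it groups students by placement level and reports the share who later completed the relevant course with a C or better. The column names and data are invented; the actual analyses are performed by the campus-based researchers.

import pandas as pd

# Hypothetical placement and outcome records for the relevant course sequence.
df = pd.DataFrame({
    "placement_level": ["transfer-level", "one level below", "two levels below",
                        "transfer-level", "one level below", "two levels below",
                        "transfer-level", "two levels below"],
    "completed_c_or_better": [1, 1, 0, 1, 0, 0, 1, 1],
})

# Subsequent course success rate and headcount by placement level.
by_level = df.groupby("placement_level")["completed_c_or_better"].agg(["mean", "count"])
by_level = by_level.rename(columns={"mean": "success_rate", "count": "students"})
print(by_level)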

In addition, the Strategic Plan specifically calls out Student Success as the first and most important strategic goal. Within this goal are three objectives, one of which is to Improve Student Academic Outcomes. Along with the Vice President for Academic Affairs and the deans, the district research officers are named as the persons responsible for continually assessing basic skills as well as General Education outcomes of students. Currently the campuses are supported through ad hoc analyses by the campus-based researchers. The timeline for implementation of a systematic plan for cyclical review of both basic skills and general education is spring 2016.

The College has evaluated data about its performance with respect to goals it has established and the accountability framework used by the community college system. The State first introduced an accountability system for the community colleges in the late 1990s. At that time the Partnership for Excellence (PFE) established system-wide goals for performance in exchange for enhanced funding. By 2004 legislative action replaced the PFE initiative with the Accountability Reporting for Community Colleges (ARCC), which created college-specific reporting in addition to system-wide reporting. The framework approached the outcomes measures based on cohort analysis of students whose behavior defined their intentions. Although colleges were encouraged to develop their own goals for improvement on the outcome measures, there were no financial incentives or penalties attached to performance. (Pg 106 EMP)

Analysis and evaluation:

Standard 1B.5: The institution assesses accomplishment of its mission through program review and evaluation of goals and objectives, student learning outcomes, and student achievement. Quantitative and qualitative data are disaggregated for analysis by program type and mode of delivery.

Evidence of meeting the standard:

The college has a cyclical program review process in place. Program review is either annual or comprehensive depending upon where the department, program, or service falls on the program review cycle. This cycle spans four years and is marked by Annual Program Reviews and a larger Comprehensive Program Review in the fourth year. As a key component of SJCC’s Integrated Planning Process, all instructional, student services, and administrative areas are required to go through the Comprehensive Program Review cycle. Program Review allows for analyzing the

Comment [JB33]: Question 1B4 How is assessment data incorporated into college planning to improve student learning and achievement? (Federal Regulation)

Comment [JB34]: Question 1B4 Are the data used for assessment and analysis disaggregated to reflect factors of difference among students?

Comment [JB35]: Question 1B5 Does the college have a program review process in place? Is it cyclical, i.e., does it incorporate systematic, ongoing evaluation of programs and services using data on student learning and achievement, improvement planning, implementation, and re-evaluation?


College’s instructional, instructional support, student services, and administrative services areas to identify the following: strengths and weaknesses, solutions to weaknesses, how each has achieved or is aligned with college strategic goals, and the equipment, staff, and facilities needs for budget requests. In addition, the program review includes the status on the development and assessment of student and program learning outcomes. (IPRA pg 15) As a key component of the integrated planning and resource allocation model, Program Reviews provide systematic, data-driven information that allows the College to examine the overall effectiveness of the institution. The Program Review process is designed to provide academic, student, and administrative areas the opportunity for review and assessment in relation to the College’s mission, vision, values and performance indicators.

Moreover, the purpose of Program Review is to ensure appropriate resources are being allocated to facilitate ongoing improvement in meeting the evolving learning needs of our students and community. (Program Review Committee charge, webpage) All departments, programs and services (cost centers) are expected to submit a program review each year. The program review will identify and justify the budgetary needs of the respective cost center. Program reviews must be validated and rated as proficient for budget requests to be approved. All program reviews and validation reports are available for the Finance Committee to review. (IPRA, pg 16)

The instructional program review prompts promote integrated planning by requiring a discussion of how well the unit has functioned and aligned with the College mission, how it has assessed student learning outcomes, and how it has conducted planning and program improvement. The final portion of the review provides an opportunity to discuss future needs and plans. The more recent instructional comprehensive program reviews were consulted in preparing the EMP. Particular attention in instructional unit reviews was given to the responses about curriculum, facilities, and future needs.

Institutional data and evidence are supplied to the area deans and the programs by the Office of Institutional Effectiveness and Student Success at the District Office and by the campus-based researcher. The data provided allow programs to answer questions in the program review form, such as:

• What were the enrollment trends for the previous four years (FTES, WSCH, # of Sections, Headcount, Seat count, Day/Evening, etc.)? Discuss how these trends impact your program.

• What were the student demographics of your program in the previous five years (student population served/demographics-age, gender, ethnicity, income, previous education, etc.)? Discuss how these demographics impact your program.

• What were the student retention, persistence and success rates for your program? Were there any significant differences by gender, age or ethnicity? Discuss these trends as they apply to your program. If applicable, offer a plan for improvement of success, retention and persistence.

The processes for planning, approval, evaluation, and review of courses offered in DE mode are similar to those for courses offered in the traditional face-to-face mode.

Comment [JB36]: Question 1B5 How does college budgeting of resources follow planning?

Comment [JB37]: Question 1B5 How is planning integrated?

Comment [JB38]: Question 1B5 To what extent are institutional data and evidence available and used for program review?

Comment [JB39]: Question 1B5 Does the college have separate processes for the planning, approval, evaluation, and review of courses offered in DE/CE mode, or are the processes similar to those for courses offered in traditional face-to-face mode? How are these processes integrated into the college’s overall planning process?


These processes are integrated into the college's overall planning process through the same mechanisms that are used for traditional courses.

Standard 1B.6: The institution disaggregates and analyzes learning outcomes and achievement for subpopulations of students. When the institution identifies performance gaps, it implements strategies, which may include allocation or reallocation of human, fiscal and other resources, to mitigate those gaps and evaluates the efficacy of those strategies.

Evidence of meeting the standard:

The institution identifies significant trends among subpopulations of students through the analysis of disaggregated data. Student Equity Planning is administered through the Student Success and Support Program (SSSP) unit at the Chancellor's Office. SSSP staff members are responsible for the implementation of the Board of Governors' Student Equity Policy and related regulations, including assessing district plans and reporting recommendations to the Board of Governors, providing districts with technical assistance in the development and improvement of plans, and assessing district progress towards the implementation of their plans over time. College student equity plans focus on increasing access, course completion, ESL and basic skills completion, degrees, certificates, and transfer for all students as measured by success indicators linked to the CCC Student Success Scorecard and other measures developed in consultation with local colleges. "Success indicators" are used to identify and measure areas in which disadvantaged populations may be impacted by issues of equal opportunity. Title 5 regulations specify that colleges must review and address the following populations when looking at disproportionate impact: American Indians or Alaskan Natives, Asians or Pacific Islanders, Blacks, Hispanics, Whites, men, women, and persons with disabilities. The institution does not currently have separately set performance expectations for the subpopulations identified. However, this is one of the areas that will be included in the work of the subcommittee on institution-set standards.

The Student Success and Equity Committee published the Student Equity Plan in early 2015, in which it discusses areas of disproportionate impact on certain subpopulations of students. For example, one such area is that of men of color; data from the 2011 Community College Survey of Student Engagement (CCSSE) during the Kresge Men of Color Institute demonstrate that this is an at-risk population. Several items in the action plan are specifically designed for this group, including the Male Summit.

Another area of disproportionate impact the committee discovered concerns students over 24 years of age. SJCC has an older student population, and many of these students are juggling jobs, family, and academic work. The equity plan addresses the needs of these students by developing a survey of students in night classes, where many of the older students attend, to ascertain their specific needs, and then acting upon the data collected. The plan also includes points about extending support services in the evenings at peak times of the semester in order to serve these students.

Some students who are at risk have not yet been tracked, so the college does not have hard data collected for them, but we know they are in need of support.

Comment [JB40]: Question 1B6 Does the institution identify significant trends among subpopulations of students and interpret their meaning?

Comment [JB41]: Question 1B6 Has the institution set performance expectations (key performance indicators) for the subpopulations?


One such group is veterans entering the community college. The community college system is seeing a larger influx of veterans than in previous years. SJCC has recently opened a Veteran's Resource Center (VRC) and is continuing to work on ways to support this population. Another at-risk population is that of LGBTQ students; many of these students face discrimination, overt or subtle, and there is a need for training and education for staff and faculty as well as students. There are several action items in this plan to address this reality, and the committee will be working on ways to collect data on this group in order to assist this population more effectively.

The SSE committee has identified the following goals, each with activities and timelines associated with them:

• Student Success Indicator for Access
• Student Success Indicator for Course Completion
• Student Success Indicator for ESL and Basic Skills Course Completion
• Student Success Indicator for Degree and Certificate Completion
• Student Success Indicator for Transfer

An example of an activity for achieving goal A is the African American and Latino Male Summit (AALMS). This event helps develop a college-going culture for these young men that includes academic information as well as motivation and perspectives from other men of color who are either current college students or who have graduated and are now in a position to be role models, as well as from male faculty and staff. One hundred fifty male students of color from both SJCC and local high schools will participate in AALMS and reaffirm a college-going culture within the group. These young men will also receive priority for the Summer Bridge program.

Each semester, in collaboration with the Office of Institutional Effectiveness and Student Success and the campus-based researcher, the Student Success and Equity Committee will monitor progress and gather data on activity outcomes. The campus-based researcher will analyze the data to determine whether disproportionate impact has increased or remained the same, and whether new groups have been impacted. The Student Success and Equity Committee will discuss the findings and share them with the Academic Senate, the College Advisory Council, and the College President, as well as welcome input and feedback from other campus members, in order to mitigate disparities and plan proactive and corrective activities to ensure equity for all San Jose City College students.

I don't have answers to these questions yet.

Standard 1B.7: The institution regularly evaluates its policies and practices across all areas of the institution, including instructional programs, student and learning support services, resource management, and governance processes to assure their effectiveness in supporting academic quality and accomplishment of mission.

Evidence of meeting the standard:

The Program Review Committee is responsible for the policies and practices associated with assessing instructional programs.

Comment [JB42]: Question 1B6: How does it judge its achievement of the target outcomes?

Comment [JB43]: Is the institution performance satisfactory? What changes have been made or are planned as a result of the analysis of the data?


As part of the review cycle, completed program reviews are read and validated by the PRvC using a rubric designating the submitted review as either In Progress or Proficient. Each year, when the evaluation process is complete, the PRvC evaluates its process and revises the Program Review Handbook. (PRvC minutes)

The Instructional Policies and Curriculum Committee has several policies and procedures in place to assure that College resources are used in support of programs that are capable of functioning effectively in terms of serving sufficient numbers of students and maintaining high instructional standards. One is the SJCC Program Viability Review Policy, developed collaboratively by the Academic Senate and the IPCC. The Program Viability Review (PVR) Procedure will be utilized when there is ample qualitative and quantitative evidence that a program may no longer be viable. The procedure provides a framework for the collection and analysis of appropriate data, the application of established criteria, and the assessment of impact on students, employees, and other programs. Ultimately, it will provide a recommendation to the Chancellor and the Board of Trustees as to whether the program should be continued, discontinued, revitalized, or suspended. (SJCC Program Viability Review Policy) This policy is coupled with the SJCC Program Viability Review Procedure. The ultimate purpose of the PVR Procedure is to provide a thorough and equitable process to assess weak or nonproductive programs and to determine an appropriate course of action. (SJCC Program Viability Review Procedure)

The Strategic Planning Committee evaluates other committees to ensure that the college's strategic planning process is sound, collaborative, evidence-based, and sustainable. A standard that must be adhered to when evaluating institutional effectiveness is Sustainable Continuous Quality Improvement. This standard was established by the Accrediting Commission and requires the following:

• The institution uses ongoing and systematic evaluation and planning to refine its key processes and improve student learning.

• There is dialogue about institutional effectiveness that is ongoing, robust, and pervasive.

• Data and analyses are widely distributed and used throughout the institution.

• There is ongoing review and adaptation of evaluation and planning processes.

• There is consistent and continuous commitment to improving student learning, and educational effectiveness is a demonstrated priority in all planning structures and processes.

As the first step, the Strategic Planning Committee (SPC) reviews and evaluates the college strategic goals. The goals are proposed and discussed by the campus community at Professional Development Days. Upon receipt of the college goals, the SPC attempts to identify and determine Key Performance Indicators (KPIs) to measure the advancement of each strategic goal. Once all data are identified, the SPC evaluates the KPIs. Annual reports, bond reports, HR documentation, and state reports are used to measure KPIs. As part of the comprehensive assessment of planning processes, the College has established a number of annual evaluations, including those of:

• Standing Committees associated with Planning and Resource Allocation
• Strategic Planning Committee
• Resource Allocation Processes
• Strategic Planning Processes

Comment [BJ44]: What processes does the institution use to assess the effectiveness of its cycle of evaluation, integrated planning, resource allocation, and re-evaluation?

Comment [JB45]: What processes does the institution use to assess the effectiveness of its cycle of evaluation, integrated planning, resource allocation, and re-evaluation?


The SPC developed a survey instrument that was administered to all standing committees identified in SJCC's Resource Allocation Model.

This instrument allowed each committee to assess its achievement of established goals and its alignment to the college strategic plan. In addition to the survey items listed above, the Committee evaluation included an assessment of goal completion and recommendations for improvements based on survey findings.

The evaluation measures the Committee's processes, interactions, and outcomes during the academic year. The evaluation is conducted during the annual retreat in late spring. Results are used to improve planning processes and committee performance. Under the leadership of the Finance Committee, the resource allocation process is reviewed and revised based on feedback from the campus community. Based upon the SPC feedback and the Finance Committee's self-evaluation, the resource allocation process is updated. (IPRA, pp. 12-13)

I don't have answers to the following questions yet:
How effective are the college planning processes for fostering improvement?
What mechanisms does the institution use to gather evidence about the effectiveness of DE/CE learning programs and related student and learning support services?

Analysis and evaluation:

Standard 1B.8: The institution broadly communicates the results of all of its assessment and evaluation activities so that the institution has a shared understanding of its strengths and weaknesses and sets appropriate priorities.

Evidence of meeting the standard:

SJCC has several modes of communicating the results of all its assessment and evaluation activities. The college web page serves as a primary conduit for communication with college constituency groups, the community and the world. There is a separate link for Accreditation, which contains important documents and timelines related to the accreditation process. Copies of Master Planning documents, Strategic Planning documents, Student Equity Plan, and other documents that contain assessment and evaluation activities are posted on the Accreditation page. Additionally, all standing committees have a web presence, with the charge of the committee, agendas and minutes, and relevant documents.

Another mode of communication is email. Information relevant to assessment and evaluation is often contained in the Roar, a weekly email newsletter sent from the President’s office. Results of surveys and other data are shared via email as well.

SJCC's Professional Development Days provide an opportunity for broad communication to administrators, faculty and staff about how the college is doing.


Reports from the President, Vice Presidents, Faculty Association President, CSEA President, Associated Student Government President, Academic Senate President, and others detail specific information about enrollment, budgetary issues, and other topics germane to the shared understanding of the institution's strengths and weaknesses. Division meetings and breakout sessions drill down further into these topics. Professional Development Days have also been used to collect information about priorities, such as surveying faculty and staff regarding key performance indicators.

The Strategic Planning Committee plays a key role in communicating the results of assessments and evaluations through its reports. The committee is a participatory group of individuals from the faculty, classified staff, student, and administrative ranks who have dedicated themselves to evaluating and recommending improvements to our integrated planning and resource allocation process. During the 2013-14 and 2014-15 academic years, the committee endeavored to refine the excellent work done by the previous committees by aligning the Key Performance Indicators to match data requirements examined by the State. As part of that alignment process, it engaged the full campus community to select the Key Performance Indicators requiring the most attention; those focused upon student success and campus safety were selected. The campus community was further engaged to create, review, and implement various strategies for achieving improvements in the Key Performance Indicators.

The goal of the Strategic Planning Committee is to ensure that the College’s strategic planning process is sound, collaborative, evidence-based, and sustainable, and that the Strategic Plan guides decision-making and activities that support improvement of institutional effectiveness and student learning in the long term.

The College Advisory Council (CAC) is another important part of the College's communication structure. Its objective is to disseminate critically important information, to ensure open communication, to promote genuine involvement before and while decisions are made, and to provide for inclusive participation of all campus constituencies. As the primary college council, the CAC serves as the umbrella group for the various shared governance committees on campus and brings forward to the President its recommendations, as appropriate, for consideration and implementation. The responsibilities of the CAC are to:

• Oversee all aspects of the college planning and accreditation processes;
• Review and update progress annually on the implementation of the college's Strategic Plan;
• Review and approve the college's Accreditation Self-Study;
• Oversee the development and integration of the college's Educational Master Plan, Facilities Master Plan, and Strategic Plan;
• Provide counsel to the President in matters of institutional operations, including but not limited to budgeting, planning, facilities, technology, diversity, and external priorities.

Analysis and evaluation:

Comment [JB46]: What mechanisms exist for participation in and communication about college planning and evaluation?


Standard 1B.9: The institution engages in continuous, broad based, systematic evaluation and planning. The institution integrates program review, planning, and resource allocation into a comprehensive process that leads to accomplishment of its mission and improvement of institutional effectiveness and academic quality. Institutional planning addresses short- and long-range needs for educational programs and services and for human, physical, technology, and financial resources. (ER 19)

Evidence of meeting the standard:

Beginning with an alignment to the College's mission and vision, the Strategic Planning Model identifies all the components associated with the integrated planning activities and the relationship of each function. The Model represents a continuous improvement system, with all activities keeping the College Mission, Vision, and Values at the center. The following provides an introduction to each part of the planning model:

Mission, Vision and Values: The College's mission statement articulates who we are; the College's vision focuses on who we want to become; and the values communicate what we believe in. The location of the Mission, Vision, and Values at the center of the Strategic Planning Model is significant in that it is a visual representation of how all activities at San Jose City College keep these concepts at their core.

Educational & Facilities Master Plans: The Educational and Facilities Master Plans chart the College’s long-term course. The Educational Master Plan is the foundation document for the Facilities Master Plan. Both focus on institutional change, analysis, and improvement of existing conditions; both anticipate changes in the community, growth of the College as a whole, and changes in programs and services, as well as include institutional strategic goals and opportunities for input from all College constituencies.

Program Review and Focused Instructional Analysis: Program review is either annual or comprehensive, depending upon where the department, program, or service falls on the program review cycle. Program Review allows for analyzing the College's instructional, instructional support, student services, and administrative services areas to identify the following: strengths and weaknesses, solutions to weaknesses, how each has achieved or is aligned with college strategic goals, and the equipment, staff, and facilities needs for budget requests. In addition, the program review includes the status of the development and assessment of student and program learning outcomes.

The Program Review Committee is responsible for providing guidance to the College in the use of program review materials and the process of program review; evaluating and providing feedback on the quality of program review documents submitted by the units undergoing a self-study; and validating completed program review documents and forwarding them to the appropriate offices and committees. The Program Review Committee breaks up into subgroups to work closely with the constituency units undergoing program studies.

Campus Budget Development and Resource Allocation: The Strategic Planning Committee has modified the previous Resource Allocation Model to highlight the steps associated with aligning campus resources to planning. These modifications are a result of analysis of the effectiveness of current activities, re-evaluation of those activities, and proposed improvements to the process.


Analysis and Process Evaluation:

Analysis: To ensure continuous improvement, each year the Strategic Planning Committee will evaluate the effectiveness of the College's planning process and will review performance indicator data to determine the extent to which the College has accomplished its goals.

Environmental Scan: The environmental scan will consist of a periodic review of external and internal data needed to assess the needs of the communities. This evaluation process allows the College to consider the factors that influence the direction of the College. The scan is conducted every three years.

Evaluation: Each fall semester, the groups overseeing various planning documents will perform a review of the goals and objectives of each plan and make recommendations for improvements. So too will the plans be reviewed to ensure alignment with Strategic Goals and Key Performance Indicators.

Integrated planning is promoted at the College through the shared governance process. The content of current plans has a common theme of addressing the major elements in the College Strategic Plan. The processes used at the College promote integration through the College Advisory Council as the primary coordinating and review body, supported by governance councils and standing committees. In addition to the College Advisory Council, SJCC has an Academic Senate. San Jose City College has organized these standing committees to support the work of shared governance. As discussed above, the planning work of the institution, except for unit and department comprehensive program reviews, is largely accomplished through these standing committees: Distance Education, Diversity Advisory, Facilities and Safety, Finance, Technology, Instructional Policies and Curriculum, Professional Development, Program Review, Strategic Planning, and Student Success. (not sure where this is from)

I don't have answers for the following questions:
How effectively do evaluation processes and results contribute to improvement in programs and services?
Are the assessment data collected for DE/CE different from data collected for traditional face-to-face education? What is the rationale?
What types of assessment data does the college collect on learning programs and support services offered in DE/CE format?

Analysis and evaluation:

Evidence that well-defined decision-making processes and authority facilitate planning and institutional effectiveness
Evidence of regular and systematic assessment of the effectiveness of all institutional services and processes (Program Review)
Evidence that the results of evaluations are disseminated to and understood by the college community (Ask Jessica)

Comment [JB47]: What mechanisms does the institution use to gather evidence about the effectiveness of programs and services?


Evidence that results of regular and systematic assessments are used for institutional improvement (Program Review)
Evidence of current, systematic program reviews and use of results (Program Review)
Evidence that program review processes are systematically evaluated (Program Review)
Evidence of institutional dialog about the continuous improvement of student learning in DE/CE mode
Evidence that clearly stated and measurable goals and objectives guide the college community in making decisions regarding its priorities related to DE/CE
Evidence of evaluation of progress on the achievement of goals and objectives related to DE/CE
List of all DE/CE courses/programs
Evidence of quantitative and qualitative data that support the analysis of achievement of goals and objectives for DE/CE
Evidence of mechanisms for allocation of resources to plans for DE/CE
Evidence of periodic and systematic assessment of the effectiveness of DE/CE
Evidence that the assessment data are effectively communicated to the appropriate constituencies
Evidence of current reviews of programs and support services, including library services related to DE/CE, and examples of improvements