Universidad Ana G. Méndez Gurabo Campus
School of Education
Annual Report 2017-2018
April 30, 2019
2019 EPP Annual Report Template
CAEP ID: 21508 AACTE SID:
Institution: Universidad Ana G. Méndez- Recinto de Gurabo
Unit: Escuela De Educación
Section 1. AIMS
Profile
After reviewing and/or updating the Educator Preparation Provider's (EPP's) profile in AIMS, check the box to indicate that the information available is accurate.
1.1 In AIMS, the following information is current and accurate...
Agree Disagree
1.1.1 Contact information
1.1.2 EPP information
1.1.3 Program options
Section 2. Program Completers
2.1 How many candidates completed programs that prepared them to work in preschool through grade 12 settings during Academic Year 2017-2018?
Enter a numeric value for each textbox.
2.1.1 Number of completers in programs leading to initial teacher certification or licensure1
2.1.2 Number of completers in advanced level programs or programs leading to a degree, endorsement, or some other credential that prepares the holder to serve in P-12 schools (Do not include those completers counted above.)2
Total number of program completers 0
1 For a description of the scope for Initial-Licensure Programs, see Policy 3.01 in the Accreditation Policy Manual.
2 For a description of the scope for Advanced-Level Programs, see Policy 3.02 in the Accreditation Policy Manual.
Section 3. Substantive Changes
Have any of the following substantive changes occurred at your educator preparation provider or institution/organization during the 2017-2018 academic year?
Explanations have a 600 character limit, including spaces.
3.1 Changes in the established mission or objectives of the institution/organization or the EPP
Change  No Change / Not Applicable
3.2 Any change in the legal status, form of control, or ownership of the EPP
Change  No Change / Not Applicable
3.3 The addition of programs of study at a degree or credential level different from those that were offered when most recently accredited
Change  No Change / Not Applicable
3.4 The addition of courses or programs that represent a significant departure, in terms of either content or delivery, from those that were offered when most recently accredited
Change  No Change / Not Applicable
3.5 A contract with other providers for direct instructional services, including any teach-out agreements
Change  No Change / Not Applicable
Any change that means the EPP no longer satisfies accreditation standards or requirements:
3.6 Change in regional accreditation status
Change  No Change / Not Applicable
3.7 Change in state program approval
Change  No Change / Not Applicable
Section 4. Display of Annual Reporting Measures
Annual Reporting Measures (CAEP Component 5.4 | A.5.4)
Impact Measures (CAEP Standard 4)
1. Impact on P-12 learning and development (Component 4.1)
2. Indicators of teaching effectiveness (Component 4.2)
3. Satisfaction of employers and employment milestones (Component 4.3 | A.4.1)
4. Satisfaction of completers (Component 4.4 | A.4.2)
Outcome Measures
5. Graduation Rates (initial & advanced levels)1
6. Ability of completers to meet licensing (certification) and any additional state requirements; Title II (initial & advanced levels)
7. Ability of completers to be hired in education positions for which they have prepared (initial & advanced levels)
8. Student loan default rates and other consumer information (initial & advanced levels)2
4.1 Provide a link or links demonstrating that data relevant to each of the Annual Reporting Measures are public-friendly and prominently displayed on the educator preparation provider's website.
See section 4 in AIMS.
4.2 Summarize data and trends from the data linked above, reflecting on the prompts below.
What has the provider learned from reviewing its Annual Reporting Measures over the past three years?
An annual report is a comprehensive account of activities over the preceding year. It gives shareholders and other interested parties information about trends in candidate activity over time, and it shows how the School of Education (SoED) has been performing and how it expects to grow in the future. After reviewing the Annual Reports from 2015-2018, we learned that there is a need to review all programs to make them more attractive to prospective students; to maintain and strengthen the undergraduate courses that lead candidates to pass the PRTCE test; and to expand the academic offering through different modalities: online, hybrid, videoconference, and telepresence. In addition, the SoED included in its work plan a provision to strengthen the Preschool Program and to market the Undergraduate and Graduate Programs on the Facebook platform. Another strategy the School of Education will implement is the design of online courses (Educational Administration, English as a Second Language, Bilingual Education and Autism Certification, Educational Leadership). Finally, the SoED has concluded that it is important to establish an Electronic Assessment System (EAS) across all undergraduate and graduate programs to maintain the effectiveness and quality of the school's programs.
Discuss any emerging, long-term, expected, or unexpected trends?
After reviewing the Annual Reports from 2015-2018, we have seen a decrease in the number of candidates enrolling in the School of Education. This decrease is the result of many external factors. Recovery after Hurricane Maria has been very difficult for some families in Puerto Rico; some are still struggling with basic services such as electricity, water, and housing. Aware of these circumstances, the institution has offered supporting services to the student population. In response, the School of Education launched an aggressive marketing campaign in June 2018 to promote all of its programs and attract new prospects, in addition to reviewing all programs and courses as part of the effort to attract students. Meanwhile, the Puerto Rico Department of Education (PRDE) is facing a crisis in the recruitment of specialized teachers and is planning to send its specialty faculty to study and complete courses in education. We have also seen an increase in the development of teachers in the area of Special Education; because of this, the School of Education (SoED) has started a plan to promote the sign language program to certify teachers in this area. These efforts are aligned with the PRDE, which recently communicated to participating university members the need for faculty in all subject areas: there are few teachers available to replace those who are about to retire this year.
Discuss any programmatic/provider-wide changes being planned as a result of these data?
After a careful review of all the data that had to be submitted in the SSR and AR, we found a need for an Electronic Assessment System (EAS) to structure data collection and assessment results. The SoED decided to develop an electronic data collection system as part of the School of Education Quality Assurance System (SoEDQAS). The data system will accumulate information from three or more cycles of administration and collection of all assessments, as required in Standards 1, 3, and 4. In addition, the data will be published in a dashboard on the SoED webpage (currently under revision) and will be available for all SoED faculty and staff to review on a continuous basis. The Blackboard platform will also be used to keep data and relevant information available to professors and other administrative personnel, helping them gather information about students through specific reports in the system. The University Student Data System (Banner) accumulates a great deal of information that is not public but is available to the SoED Dean and professors when needed; it includes registration, grades, alumni information, and students' personal information. The revision of programs, manuals, and syllabi is another change taking place now. A further planned change is the pursuit of SPA recognition for all of the school's programs, which will keep the programs high-quality and up to date.
Are benchmarks available for comparison?
The Puerto Rico Teacher Certification Test (PRTCE/PCMAS) and the Teacher Preparation Test (PPM/SIAAM) are tests used as benchmarks to determine program effectiveness. Their results are widely disseminated and are used to make decisions about the effectiveness of the programs. In Evidence M4.1, page 7, Table 1 indicates that the passing score for the general test is 89, with different scores for the specialties: 85 for Social Studies/History, 85 for Spanish, 80 for English, 80 for Math, and 80 for Science. Table 3 on page 11 of the Institutional Report of the Teacher Certification Tests (PRTCE/PCMAS) presents the averages, medians, standard deviation (SD), minimum score, maximum score, the percentage of candidates who passed the exam, the total number of candidates examined (N), as well as the internal consistency reliability index and the measurement error for the total sample of candidates examined. For the General Test, Cronbach's alpha was .88 with a measurement error of +/- 5.5 (Evidence M4.1). Evidence M4.2, page 5 of the Report of the Comprehensive Test (PPM/SIAAM), presents the passing score for the Fundamental Competencies and the Professional Competencies, which is 89, along with the approval rate for each UAGM institution; this serves as a point of comparison to determine how well our institution is doing. Our institution has an approval rate of 77% for Fundamental Competencies and 72% for Professional Competencies. Page 6 presents the averages, medians, standard deviation (SD), minimum score, maximum score, the percentage of candidates who passed the exam, the total number of candidates examined (N), and the internal consistency reliability index and measurement error for the total sample. For the November 2018 test, the reliability for Fundamental Knowledge is .63 (Kuder-Richardson Formula 20) and .63 (Cronbach's alpha coefficient), and for Professional Competencies it is .59 (Kuder-Richardson Formula 20) and .59 (Cronbach's alpha coefficient) (Evidence M4.2). On the usual interpretive scale, values between 0.60 and 0.69 indicate questionable reliability; this could be due to the small number of participants.
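For context, internal-consistency coefficients of the kind cited above can be computed directly from raw item-level scores. The sketch below is a generic illustration of Cronbach's alpha (the function name and sample data are hypothetical, not taken from the College Board reports):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a matrix of item scores.

    scores: 2-D array-like, rows = examinees, columns = test items.
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # sample variance per item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of examinees' total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Perfectly parallel items yield an alpha of 1.0 (approximately, in floating point)
demo = [[1, 1], [2, 2], [3, 3], [4, 4]]
print(cronbach_alpha(demo))
```

Small samples inflate the uncertainty of the estimate, which is consistent with the report's observation that the .59-.63 values may reflect the low number of participants.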
Are measures widely shared? How? With whom?
The SoED is currently updating the information on its webpage. The information that will be available there includes the Title II report, PPM results, PRTCE results, the SoED SSR, and the Annual Report. This information will be shared on the page for the PRDE, Title II, stakeholders, faculty, and students to review. In the meantime, the SoED keeps the faculty updated on these results through regularly held faculty meetings. More information about the Annual Reporting Measures (CAEP Component 5.4 | A.5.4) is available in the attached report.
Section 5. Areas for Improvement, Weaknesses, and/or Stipulations
Summarize EPP activities and the outcomes of those activities as they relate to correcting the areas cited in the last Accreditation Action/Decision Report.
TEAC: Weakness [Teacher Education] 0.1 Evidence of candidates' subject matter knowledge
Evidence of candidate achievement in subject matter knowledge is uneven across content areas. (**Important Note: Some of the evidence cited is in Spanish, but it includes comments in English to help with the discussion of the data.) The decrease in the number of candidates enrolled in the Teacher Preparation Programs may be the reason that evidence of candidate achievement in subject matter knowledge is uneven across content areas. This is the result of several external factors unrelated to the efforts made by the SoED and the institution to attract new candidates to the Teacher Preparation Programs. One external factor contributing to the decrease in candidates across all subject specialties, and in the teaching profession in general, is that we are still dealing with the aftermath of Hurricane Maria, which worsened the economic situation Puerto Rico has been experiencing for several years. As a result, families have migrated, and continue to migrate, to the United States seeking better employment opportunities after the closure of many businesses. This has affected enrollment at all the institutions in Puerto Rico that prepare teachers. Another external factor that could affect enrollment in Teacher Preparation Programs is candidates' study preferences: an article in Universia (2016) lists the 10 occupations with the highest demand and the best salary in Puerto Rico, and teaching is not one of them (http://noticias.universia.pr/educacion/noticia/2016/09/27/1143609/10-ocupaciones-mayor-demanda-mejor-salario-puerto-rico.html).
A 2017 report from the Department of Labor and Human Resources of Puerto Rico indicates that there are currently 15,260 teachers in the areas of Preschool, Kindergarten, and Elementary; 11,520 teachers for middle school, high school, and technical education; and more than 5,090 Special Education teachers for all the schools in Puerto Rico (Evidence S5.1, S5.2, S5.3). The trend in these numbers is a decrease in the number of teachers as they move into the specialties. Another factor is the change in PRDE working conditions. The traditional format, based on a Primary Level (K-3rd grade), Elementary Level (4th-6th grade), and Secondary Level (Intermediate 7th-9th and High School 10th-12th grade), changed to a new format based on two levels: Elementary (K-8th grade) and Secondary (9th-12th grade). With this change, the availability of teaching positions in the PRDE is decreasing, because schools are closing due to the drop in the student population, the restructuring of the PRDE, and the demographic changes brought on by the economic recession in Puerto Rico. All these factors have had an impact not only on the education programs but on all campus programs. Currently there is a need in the country for teachers in all specialized subjects, as expressed by the Secretary of Education of Puerto Rico; all specialty areas have become difficult-recruitment areas. For this reason, the Secretary of Education is taking measures through alternate routes. In response, the SoED is continually making efforts to increase the candidate population by reviewing each program's courses and rubrics and by pursuing SPA accreditation of all its programs, at both levels. It has also started an aggressive campaign through Facebook to attract candidates to the programs offered in the SoED. Despite the decrease in the number of candidates enrolled in the Teacher Preparation programs, the SoED maintains the quality of its programs, as evidenced by the results of the PRTCE (Evidence M4.1, page 11), and all programs have retained candidates. In conclusion, the SoED's action plan is to a) continue offering teacher preparation programs to serve the country's education, b) improve offerings based on new trends and changes in the country, and c) develop a plan to start offering online courses.
TEAC: Weakness [Educational Leadership] 2.1 Rationale for assessments
The faculty's evidence in support of its rationale for the validity of its assessments is not fully developed. After reviewing the data available to build the SSR, the SoED decided to start, in August 2018, a plan to collect evidence to support the rationale for the validity of its assessments. In the SSR we include the Phase-in Plan (Appendix A), submitted on March 4, 2019, which explains the timeline for collecting all the data necessary to demonstrate the validity and reliability of the instruments used to assess candidate performance. Appendix C, page 41, contains a timetable for the other activities needed to complete the assessment validity and reliability work. During the first semester of 2018-2019, a pilot project used past-semester rubrics to determine the validity and reliability of the instrument, as shown in Appendix B. We will continue to collect data in subsequent terms to determine the validity and reliability of the rubrics used to assess candidate performance; Section 6 provides more information about the validity and reliability pilot project. The assessment revision and statistical analysis are important for measuring candidate performance, and verifying that the rubrics used in the SoED are valid and reliable and meet the sufficient level or above on the CAEP Evaluation Framework for EPP-Created Assessments is a priority. We set a goal that at least 75% of EPP-created assessments used in the QAS be scored at the sufficient level or above on the CAEP Evaluation Framework for EPP-Created Assessments, with particular attention to content validity. The pilot project started in March 2019. In addition, the Phase-in Plan (Appendix A) specifies the revision of the assessments, including validity and reliability.
Section 6. Continuous Improvement
CAEP Standard 5: The provider maintains a quality assurance system comprised of valid data from multiple measures, including evidence of candidates' and completers' positive impact on P-12 student learning and development. The provider supports continuous improvement that is sustained and evidence-based, and that evaluates the effectiveness of its completers. The provider uses the results of inquiry and data collection to establish priorities, enhance program elements and capacity, and test innovations to improve completers' impact on P-12 student learning and development.
CAEP Standard 5, Component 5.3: The provider regularly and systematically assesses performance against its goals and relevant standards, tracks results over time, tests innovations and the effects of selection criteria on subsequent progress and completion, and uses results to improve program elements and processes.
6.1 Summarize any data-driven EPP-wide or programmatic modifications, innovations, or changes planned, worked on, or
completed in the last academic year. This is an opportunity to share targeted continuous improvement efforts your EPP is
proud of. Focus on one to three major efforts the EPP made and the relationship among data examined, changes, and
studying the results of those changes.
Describe how the EPP regularly and systematically assessed its performance against its goals or the CAEP standards. (CAEP 5.3)
The SoED Quality Assurance System (SoEDQAS) describes the School of Education's (SoED) capacity to reach its mission and goals through analysis of evidence; that same capacity provides access to evidence that informs all other standards and draws on valid data from multiple measures. In that way, it guarantees the quality of the undergraduate and graduate programs in our school. It also documents the measures used in initial-licensure programs and advanced-level programs, as well as other measures used to demonstrate candidates' knowledge, skills, and dispositions and their impact on P-12 student learning. The SoEDQAS allows SoED leaders to engage in sustained, evidence-based continuous improvement, helping to identify strengths and weaknesses in order to set priorities that enhance programs and pursue innovations to improve candidates' effectiveness on P-12 student development. The SoEDQAS uses a variety of assessments to monitor candidate progress and achievement and to gauge operational effectiveness. It uses programs such as SPSS and Excel to collect, store, and analyze data. These technologies serve as databases for recording the data, but they do not form a structured system; the SoED is planning to incorporate a data collection system to facilitate data collection and analysis. All retrieved data come from the institution's student information system (SAP) and the reports generated from it. The SoED uses various measures to determine its performance against its goals and the CAEP standards: the PPM results, PRTCE results, Feedback Survey Level 1: Initial-Beginner, Feedback Survey Level 2: Pre-Professional, Feedback Survey Level 3: Professional, and the Employer Satisfaction Surveys (Directors, Cooperative Teacher, Cooperative Director, and Superintendent). The information collected from these instruments helps the SoED determine the degree of performance and satisfaction with its programs.
On the PPM in April 2018, 81% of candidates passed the fundamental knowledge part and 72% passed the professional part of the test (Evidence 6.1.1, page 5). In November 2018, 77% of candidates passed the fundamental knowledge part and 72% passed the professional part (Evidence 6.1.2, page 6). This indicates how well the content has been assimilated so far. The information is used to determine whether a candidate is ready to continue to the Practicum and whether any of the skills necessary for the Practicum need to be reinforced.
The Puerto Rico Teacher Certification Exam (PRTCE) results provide evidence of the programs' effectiveness (Evidence M4.1, page 11). The 2014-2015 Annual Report (AR) shows on page 7 that 77 of 84 candidates passed the PRTCE (Evidence 6.1.2). The 2015-2016 AR shows on page 18 that 59 of 66 candidates passed the PR 21 (Elementary Level) and 11 of 11 candidates passed the PR 25 (Secondary Level) (Evidence 6.1.3). The 2016-2017 AR shows on page 7 that 40 of 45 candidates passed the PRTCE (Evidence 6.1.4). The College Board prepares this exam, and it was administered to all students authorized by the SoED to take the test. The PR10 exam is the component of the PRTCE that measures concepts related to the general education component of the program. Since March 2015, the PR10 has been part of the General PRTCE, which includes both the general education and the professional pedagogical components. The cut-off score of the test is 89 points on a scale from 40 to 160 points (theoretical mean of 100 points). The expected benchmark of 75% of students passing the test was achieved, and no statistically significant differences were identified between groups in an Independent Samples t-Test analysis. Evidence 6.1.2, page 5, shows that in 2014-2015, 91.67% of students passed the test; Evidence 6.1.3, page 18, shows that in 2015-2016, 91.18% passed; Evidence 6.1.4, page 15, shows that in 2016-2017, 88.89% passed; and Evidence 6.1.3, page 11, shows that in 2017-2018, 88.80% passed.
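The pass-rate percentages follow directly from the reported counts. The helper below is purely illustrative (not part of any SoED or College Board tooling) and reproduces two of the figures above:

```python
def pass_rate(passed, examined):
    """Percentage of examinees who passed, rounded to two decimal places."""
    return round(100 * passed / examined, 2)

# Counts reported in the Annual Reports cited above
print(pass_rate(77, 84))  # 2014-2015 PRTCE -> 91.67
print(pass_rate(40, 45))  # 2016-2017 PRTCE -> 88.89
```

The 77/84 and 40/45 counts match the 91.67% and 88.89% figures reported for those years.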
The Feedback Survey Level 1: Initial-Beginner (Evidence 6.1.5, page 1), Level 2: Pre-Professional (Evidence 6.1.5, page 3), and Level 3: Professional (Evidence 6.1.5, page 5) are designed to gather information about candidates' knowledge, skills, and dispositions/values and about whether the programs give candidates opportunities to demonstrate their knowledge. These three instruments also gather information about program effectiveness. Each includes one open question that asks candidates: in what areas should we strengthen ourselves? Because the question is open, candidates mentioned a range of areas, but, surprisingly, the topic raised most often was lesson planning. The supervisors take this into consideration.
The Employer Satisfaction Survey (Evidence 6.1.6) and the Cooperative Teacher Satisfaction Survey (Evidence 6.1.7, page 25) for initial programs provide the information and insights needed to keep completers and employers satisfied with the procedures and measures taken in the Teacher Preparation Program and to attract new students. This is the first time the SoED has administered instruments of this kind. Completers and employers are a great source of experiential information and often know what the SoED needs to improve in its programs and services. A focus on what employers and candidates need is imperative for achieving satisfaction, loyalty, and, ultimately, overall program improvement. While most institutions are aware of the need to act, continually improve, and become more competitive, they often miss important insights from employers and students that, when used to make decisions, would make a difference in the experience of both. That is where the employer satisfaction surveys come in, uncovering the hidden insights and data needed to improve SoED programs and compete more efficiently. The instrument consists of eight criteria covering aspects of the teacher's professional performance in the classroom and in the school as a whole. We started collecting this survey during the first semester of 2018-2019. Of the 12 participants who answered, 100% responded that the SoED meets the criterion "The teacher preparation programs of the School of Education of the Universidad del Turabo (UAGM) develop the necessary professional skills to be an effective teacher in the classroom", and 100% that "The School of Education of the Universidad del Turabo (UAGM) is the leader in developing high quality teachers".
100% answered that the SoED meets the criterion "The School of Education of the Universidad del Turabo (UAGM) is recognized for its excellence in the preparation of future educators who, through their knowledge, skills, and dispositions, direct them towards the transformation of education". This reflects that employers recognize that our programs prepare candidates with the knowledge, skills, and dispositions necessary to be successful in their work and to be hired in their schools. For the criterion "Plan instruction based on knowledge of the subject, incorporating a variety of teaching strategies that promote the development of critical thinking, problem solving and reflection in students aligned with the Expectations of the Department of Education", 41.6% rated completers' performance as Outstanding and 58.3% as Very Effective. For "Recognize individual needs and provide instructional experiences that address diversity (learning styles and multiple intelligences)", 33.3% rated completers as Outstanding, 58.3% as Very Effective, and 8.3% as Effective. For "Demonstrates mastery of classroom management and use of instructional time", 41.6% rated completers as Outstanding and 58.3% as Very Effective. For "Uses technology to facilitate and improve the teaching-learning process and improve student performance", 50% rated completers as Outstanding, 41.6% as Very Effective, and 8.3% as Effective. For "Properly uses formative and summative assessment strategies to determine student progress and direct the teaching-learning process", 41.6% rated completers as Outstanding and 58.3% as Very Effective. For "Model standards of appropriate behavior and use effective strategies to establish and maintain codes of conduct by responding in an appropriate and respectful manner to students", 58.3% rated completers as Outstanding and 41.6% as Very Effective. For "Reflect and evaluate the teaching-learning process, its performance and professional practices", 41.6% rated completers as Outstanding, 50% as Very Effective, and 8.3% as Effective. For "Knows, understands, collaborates and is committed to the school, the family and its community environment as a system of mutual influences for teaching and learning", 41.6% rated completers as Outstanding and 58.3% as Very Effective.
The survey was also administered to the Cooperative Teachers (Evidence 6.1.7); only 12 answered. For the criterion "Plan instruction based on knowledge of the subject, incorporating a variety of teaching strategies that promote the development of critical thinking, problem solving and reflection in students aligned with the Expectations of the Department of Education", 33% rated completers' performance as Outstanding, 42% as Very Effective, and 25% as Effective. For "Recognize individual needs and provide instructional experiences that address diversity (learning styles and multiple intelligences)", 42% rated completers as Outstanding, 42% as Very Effective, 8% as Effective, and 8% as Little Effective. For "Demonstrates mastery of classroom management and use of instructional time", 33% rated completers as Outstanding, 50% as Very Effective, and 17% as Effective. For "Uses technology to facilitate and improve the teaching-learning process and improve student performance", 50% rated completers as Outstanding and 50% as Very Effective. For "Properly uses formative and summative assessment strategies to determine student progress and direct the teaching-learning process", 42% rated completers as Outstanding, 42% as Very Effective, and 17% as Effective. For "Model standards of appropriate behavior and use effective strategies to establish and maintain codes of conduct by responding in an appropriate and respectful manner to students", 58% rated completers as Outstanding, 25% as Very Effective, and 17% as Effective. For "Reflect and evaluate the teaching-learning process, its performance and professional practices", 67% rated completers as Outstanding, 17% as Very Effective, and 17% as Effective.
For "Knows, understands, collaborates and is committed to the school, the family and its community environment as a system of mutual influences for teaching and learning", 67% rated completers as Outstanding, 25% as Very Effective, and 8% as Effective. This reflects that employers recognize that our program prepares candidates with the knowledge, skills, and dispositions necessary to be successful in their work and to be hired in their schools. 100% answered that the SoED meets the criterion "The teacher preparation programs of the School of Education of the Universidad del Turabo (UAGM) develop the necessary professional skills to be an effective teacher in the classroom", 100% that "The School of Education of the Universidad del Turabo (UAGM) is the leader in developing high quality teachers", and 100% that "The School of Education of the Universidad del Turabo (UAGM) is recognized for its excellence in the preparation of future educators who, through their knowledge, skills, and dispositions, direct them towards the transformation of education". We will continue collecting this information with this instrument; more data will be available at the accreditation visit.
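With only 12 respondents per survey, each percentage above corresponds to a whole number of respondents (e.g. 5 of 12 and 7 of 12). The sketch below is illustrative only, not part of the survey tooling, and shows how such counts map to the reported figures:

```python
def rating_percentages(counts, decimals=1):
    """Convert per-rating respondent counts into percentages of the total."""
    total = sum(counts.values())
    return {rating: round(100 * n / total, decimals)
            for rating, n in counts.items()}

# Hypothetical split of the 12 employer respondents for one criterion
print(rating_percentages({"Outstanding": 5, "Very Effective": 7}))
# 5/12 and 7/12 give 41.7 and 58.3 at one decimal; the report's "41.6"
# suggests its figures were truncated from 41.67 rather than rounded
```

With samples this small, a single respondent shifts any percentage by roughly 8 points, which is worth keeping in mind when comparing the employer and cooperative-teacher breakdowns.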
What innovations or changes did the EPP implement as a result of that review? (CAEP A.5.3)
Recognizing that there is a great deal of information to manage, the SoED has a phase-in plan to develop an electronic data collection system as part of the SoEDQAS. The data system will accumulate information from three or more cycles of administration and collection of all assessments (see the assessment data in Standards 1, 3, and 4). In addition, the data will be published in a dashboard on the SoED webpage and will be available for all SoED faculty and staff to review on a continuous basis. The Blackboard platform will be used to keep data and relevant information available to professors and other administrative personnel, helping them gather information about students through specific reports in the system. The University Student Data System (Banner) also accumulates a great deal of information that is not public but is available to the SoED Dean and professors when needed; it includes registration, grades, alumni information, and students' personal information. Another innovation added to the SoEDQAS is the implementation of the satisfaction surveys and the feedback surveys at all levels, which began last semester (August to December): the satisfaction survey was administered from November to December and the feedback survey in December. Because the instruments were new, the response rate for the satisfaction survey was low; the response for the feedback survey was complete, because that instrument is administered at all three levels to every student attending EDUC 106, EDUC 401, and the practicum. We plan to administer both surveys each semester.
How are progress and results tracked? How will the EPP know the degree to which changes are improvements? (CAEP A.3.1)
The collected information is actionable if it supplies the who, what, when, where, and why that allow one to determine how to change current practices to achieve the intended goal. The data collected using the SoEDQAS will be accessible on the SoED website; this page is under revision (http://ut.suagm.edu/es/educacion). Because of the revisions made during the Continuous Improvement Cycle (Evidence 5.1.1, page 1), decisions about strengths and areas for growth will be implemented within each program. Program faculty review the data and suggest changes that are incorporated into the programs. QEMEC and the other committees review data across programs to identify and suggest any necessary change. Using the instruments described in Standards 1 and 2, the faculty provide feedback to candidates (actionable) that is directly related to the preparation program and can be used for program improvement. Among the measures that confirm changes as improvements are the PRTCE test in the Initial Program and, in the Advanced Programs, the project in the Master's Degree Program and the dissertation in the Doctoral Program. The results of these measures indicate how effective the SoED is regarding the teacher preparation program and the director preparation program.
The Retention Office (RO) (Evidence 6.1.8) has developed a plan to follow up on student performance and help students at risk. The office established an institutional work plan to improve retention; it supervises and evaluates the development of the work plan, makes recommendations on the work plan and the strategic retention plan, and reviews policies and regulations that affect student retention processes. In addition, students are advised effectively in their short- and long-term academic planning, and the office ensures that each student complies with the prerequisites of the programs and their courses. The office also creates and keeps the student's record up to date through the existing physical and web mechanisms, and documents and records interventions through Banner and other mechanisms for the continuous monitoring of processes such as course attendance, faculty referrals, and the follow-up of populations at academic risk, among others. It gives support to, and integrates with, the processes designated by Retention Links, and evaluates students academically to confirm that they complete and comply with satisfactory academic progress and graduation requirements.
Aligned with the RO and with the roles and responsibilities of the teaching-learning experience, faculty will participate in the assessment processes of the SoED and will be responsible for the assessment processes of their courses. They will provide learning experiences in compliance with the course syllabus approved by the School or academic unit, and will give students the outline or course guide in digital or printed format at the beginning of each academic period. They will model tolerance, respect for discrepancy, differences of opinion, and acceptance of criticism as essential elements of the teaching and learning process, and will use the most innovative approaches, methods, strategies, and techniques. These responsibilities help keep students interested in their courses and lead them to complete their degrees, because they see the effort and dedication of the teacher (Evidence 6.1.9).
What quality assurance system data did the provider review? (Standard 5, A.5.1, A.5.2, A.3.)
The SoED uses various measures to determine its performance against its goals and the CAEP standards. These include the PPM results, PRTCE results, Feedback Survey Level 1: Initial-Beginner, Feedback Survey Level 2: Pre-Professional, Feedback Survey Level 3: Professional, and the Employer Satisfaction Surveys (Directors, Cooperative Teacher, Cooperative Director, and Superintendent), among others. The information collected from these instruments helps the SoED determine the degree of performance and satisfaction with its programs.
During this revision, we paid attention to the validity and reliability of the instruments used to evaluate candidates' performance. Appendix A explains the phase-in plan designed to collect more data to validate all the instruments over time; in this phase-in plan, we have set out the process for collecting these data. One of the instruments that needs to be validated is the SOEDAS Assessment of Competencies for the Initial, Pre-Professional, and Professional Levels. During the first semester of 2018-2019, the SoED ran a pilot project using a sample of this instrument from the first semester of 2016-2017 and another sample from the first semester of 2018-2019. The following metric scale was used to assess the quality of the intra-class correlation:
Less than .70 not acceptable reliability
.70 to .79 acceptable reliability
.80 to .89 good reliability
.90 and more excellent reliability
As seen in Appendix B, of the eight (8) rubrics presented, six (6) had values that represent excellent intra-class correlation, one (1) had good reliability, and one (1) had non-acceptable reliability. The rubric in question is the Student Evaluation Rubric, used in EDUC 515: Practicum in School Administration and Supervision during the candidate's clinical experience. One reason for the low intra-class correlation can be attributed to the very small sample of six (6) instruments, in which a single student's score can carry the weight of the evaluation. Another reason may be the evaluator's level of understanding; it is recommended to orient evaluators before administering the instrument. An item-by-item analysis of the rubric with non-acceptable intra-class correlation is presented in Appendix B. The rubric has 20 items, of which only six (6) could be analyzed; the other items were excluded because they had zero variance and were removed from the scale. There is no variability among the items selected, and the standard deviation is low (.408), meaning the results are very close to the mean of 4.83.
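As a minimal illustration of the statistic discussed above (this is a generic sketch, not the SoED's actual analysis, and the rating data are hypothetical), a one-way intra-class correlation, ICC(1,1), can be computed directly from rubric scores with only the Python standard library:

```python
def icc_oneway(scores):
    """One-way ICC(1,1). scores: one row per rated subject,
    each row containing the k ratings that subject received."""
    n = len(scores)            # number of subjects (e.g., evaluated candidates)
    k = len(scores[0])         # number of ratings per subject
    grand = sum(sum(row) for row in scores) / (n * k)
    means = [sum(row) / k for row in scores]
    # Between-subjects and within-subjects sums of squares
    ssb = k * sum((m - grand) ** 2 for m in means)
    ssw = sum((x - m) ** 2 for row, m in zip(scores, means) for x in row)
    msb = ssb / (n - 1)
    msw = ssw / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical ratings by two evaluators for three candidates
print(icc_oneway([[4, 4], [3, 3], [5, 5]]))  # perfect agreement -> 1.0
```

On the scale above, a value of .90 or more would be read as excellent reliability; disagreement between evaluators drives the coefficient down, and with very small samples (such as the six instruments mentioned) a single discrepant score moves it substantially.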
To analyze how consistently the rubrics measure the expected concepts across the completed evaluations, an internal consistency reliability test was performed on all 8 rubrics by calculating Cronbach's alpha. The results of the analysis are presented in Appendix B. The following metric scale was used to assess the quality of internal consistency for every rubric:
Cronbach's alpha:
.90 or greater excellent reliability
.80 to .89 good reliability
.70 to .79 acceptable reliability
.60 to .69 questionable reliability
.50 to .59 poor reliability
Less than .50 unacceptable reliability
Of the eight (8) rubrics studied, six (6) have excellent reliability, one (1) has good reliability, and one (1) has poor reliability. The rubric with poor reliability is the Portfolio Rubric, which is used to evaluate students' evidence of performance in the Clinical Experience (Practicum). We plan to review the rubric and continue gathering information to evaluate and correct it.
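For reference, Cronbach's alpha as used above can be computed from raw item scores. The sketch below is generic (standard-library Python only) and the rubric ratings are hypothetical, not SoED data:

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """scores: one row per respondent, one column per rubric item."""
    k = len(scores[0])                                    # number of items
    item_vars = [pvariance(col) for col in zip(*scores)]  # per-item variance
    total_var = pvariance([sum(row) for row in scores])   # variance of totals
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 4-item rubric scored by five respondents
ratings = [[4, 4, 5, 4], [3, 3, 3, 3], [5, 5, 4, 5], [2, 2, 2, 3], [4, 5, 4, 4]]
alpha = cronbach_alpha(ratings)   # about 0.95: "excellent" on the scale above
```

Items with zero variance (every respondent scoring identically), such as those excluded from the Student Evaluation Rubric analysis, contribute nothing to the item-variance term and are typically dropped before computing alpha.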
What patterns across preparation programs (both strengths and weaknesses) did the provider identify? (A.3.2)
To have a data-driven culture, we need to establish a systematic procedure that collects, stores, analyzes, and reviews data relevant to Standard A.3 on applicants, enrollees, and exiting candidates, including data that address CAEP's cross-cutting themes of diversity and applications of technology. Evidence-based practice must include acting on the findings of the data collected. Validity is defined as the extent to which a concept is accurately measured in a quantitative study (Heale & Twycross, 2015). To demonstrate validity, an instrument must have three characteristics: (1) homogeneity (the instrument measures one construct); (2) convergence (the instrument measures concepts similar to those of other instruments); and (3) theory evidence (the instrument measures what needs to be measured). In addition, instrument reliability is very important: the instrument must be consistent and homogeneous. The SoED has phase-in plans to improve the validity and reliability of its assessments and to collect the required data. One measure taken by the Dean is to review all the rubrics so that they provide the minimum data needed for decision making. To that end, Evidence 10, page 1, details a phase-in plan to improve the rubrics and other instruments and increase the validity and reliability of each one. During the first semester of 2018-2019, there was a pilot project using past-semester rubrics to determine the validity and reliability of the instruments, as shown in Appendix B. We need more data to demonstrate that the instruments sustain the rationale for validity. To meet this objective, the Dean approved the use of rubrics that had been developed but had fallen out of use over time. To demonstrate that these rubrics are suitable for gathering data about student performance, it is necessary to continue administering them.
To ensure content validity and to validate the interpretations made of the data, the faculty will help with the administration and the process. To determine validity, we will evaluate the internal consistency of the instruments. According to Kumar (2017), internal consistency reliability is a measure used to evaluate the degree to which different test items that probe the same construct produce similar results; it examines whether the items within a scale or measure are homogeneous. Faculty will review data after each semester, and individual results that raise concern will be addressed in a meeting with all faculty, who will provide support depending on the results for each rubric. Another immediate measure involves the Practicum Manual at all levels; it will guarantee that the data that need to be collected are analyzed and shared every semester with stakeholders and faculty, supporting the decision-making process for the programs. Another step taken by the SoED Dean is the creation and administration of the surveys and instruments needed to gather the data that will support each standard and the validity and reliability of the data collected.
As part of the plan, the SoED Dean decided to develop various workshops to help faculty gather the data needed. One important workshop to be developed is the rubrics workshop, which will cover the minimum information a rubric requires in order to enhance the evaluation of the student and to comply with the standards. More data and information regarding the validity and reliability of rubrics and surveys will be available for the accreditation visit.
How did the provider use data/evidence for continuous improvement? (A.5.3)
The SoEDQAS (Evidence 5.1.1, pages 1 & 2), as described above (sections 5.1 and 5.2), ensures that data are systematically collected, analyzed, monitored, and reported throughout the academic year. Program faculty and the different committee members review data during their meetings, and the QEMEC committee reviews program data annually. The committees use the SoEDQAS Internal Audit Rubric (Evidence 5.1.1, page 13) to establish the criteria that must be examined when evaluating programs; it is a flexible document that can be adapted as necessary to the program being evaluated. After using this document, the committee writes the SoEDQAS Internal Audit Report (Evidence 5.1.1, page 10), which gathers the changes suggested for the area under review together with a justification for each suggestion. All data collected within the continuous improvement process are tracked over time. Assessment data included in Standards 1 to 4 are shared annually with faculty and are posted on the SoED website to ensure monitoring and review of the data by all those interested. Evidence 5.1.2 is an example of the checklist used by all the institution's schools for new program creation; this checklist helps include all the information needed when creating a new program.
In addition to the efforts of each committee, the institution has an AP in place to guarantee program effectiveness and student learning. The purpose of the Institutional Effectiveness and Student Learning AP (IESLAP, Evidence 5.3.1) is to delineate the process of assessment for overall institutional effectiveness. Additionally, it aims to promote the integration of planning and assessment at the institutional level and in each activity or academic area. The plan outlines efforts at the institutional level and provides guidelines for the operational units of the institution to develop APs as part of their respective work plans. The IESLAP also drives a continuous improvement process that focuses on the critical areas of university performance. This is a comprehensive process focusing on seven activity areas: UAGM Gurabo Campus Schools/Programs, Student Services, Additional Locations/Branch Campuses, Research and External Sponsors, Information Resources, Internationalization, and Support Services. These areas are tied to all the accreditation standards identified by the Middle States Commission on Higher Education. The plan is a working tool for the UAGM Gurabo Campus schools and administrative offices; it indicates how to conduct assessment in a practical, cost-efficient, and effective way. For practical reasons, the document is divided into two parts: Assessment of Student Learning and Assessment of Institutional Effectiveness. UAGM Gurabo Campus also recognizes that assessment, planning, and fiscal matters are interrelated. Thus, assessment results yield recommendations and the implementation of improvement efforts, and they are a starting point for institutional, school, and unit planning and budgeting.
The purpose of Student Learning Assessment is to review the learning experiences of our students at UAGM Gurabo Campus; its focus is to guide academic programs in the development of student learning outcomes and to evaluate what students should learn. Therefore, assessment of student learning shall be primarily course-embedded and school/program-based. Academic assessment ensures that school reviews contribute in a fundamentally important way to the achievement of the institution's mission. The assessment of student learning must demonstrate that the institution's students have knowledge, skills, and competencies consistent with institutional goals and that candidates at graduation have achieved appropriate higher education goals [MSCHE]. The assessment of student learning at UAGM Gurabo Campus is a decentralized process by which faculty in each academic department or program, at both the undergraduate and graduate levels, identify key learning outcomes, determine how outcomes will be measured, carry out assessment activities, analyze results, and use those results in program planning to improve student learning.
A report for each selected course section is produced following the established format (AR-1, Course Level Assessment Report; Evidence 5.3.2, page 24) and sent to the Associate Vice-Chancellor of Evaluation and Evaluation of Teachers. This report must be completed and submitted by the professor to the leader of the academic program of the school. The course-level report (AR-1) serves to prepare the assessment report of the academic program (AR-2, Program Level Assessment Report; Evidence 5.3.1, page 26). The Program Level Assessment Report (AR-2) is generated by the program leader and submitted to the dean of the school annually; its purpose is to continue the process of closing the loop (Evidence 5.3.1, page 5). Then a report (AR-3) for each unit is produced following the format established for each activity area mentioned in the Continuous Improvement Process. The AR-3 (Evidence 5.3.1, page 28), the Assessment Report of schools and additional locations, is reported by the dean of each academic school and by the director of the additional locations. Evidence 5.3.3 shows a sample AR-1.
The SoED reviews APs during, and as part of, the cyclic reviews of assessment results. Therefore, the evaluation of an AP shall be incorporated into the assessment process itself and conducted on a regular basis. This review need not be complicated; it should lead to the refinement or improvement of the plans and eliminate ineffective assessment practices that are likely to promote frustration and a negative response to the assessment process. The AR-1 and AR-2 are collected as evidence about students' development and learning outcomes and are used to make decisions about resource allocation in planning for program effectiveness and overall institutional effectiveness. Furthermore, the AR-3 is used to improve academic programs, enhance the environment provided for teaching and learning, and measure overall student success. To track students' performance over time, the SoED uses the key courses at each level to ensure that the student is acquiring the necessary skills for the profession. For the Undergraduate Program, the key courses are EDUC 106, 401, 435, 436, and the Practicum. For the Master's Degree Program, the key courses are EDUC 503, 504, 506, 510, 519, 520, 702, 705, and the Practicum. For the Doctoral Degree Program, the key courses are EDUC 801, 802, 804, 805, 806, and 807. Tracking student grades, rubrics, surveys, and portfolios gives an indication of how the student is developing those necessary skills. These procedures are stated in Standards 1 and 2 (for syllabi see Evidence 3.4.1 and 3.4.4).
How did the provider test innovations? (A.5.3)
The SoED Quality Assurance System (SoEDQAS) describes the School of Education (SoED) capacity to reach its mission and goals through an analysis of the evidence, and that same capacity provides access to evidence that informs all other standards and yields valid data from multiple measures. In that way, it guarantees the quality of the undergraduate and graduate programs in our school. It also documents the measures used in initial-licensure programs and in programs at the advanced level, as well as other measures used to demonstrate the knowledge, skills, and dispositions of candidates and their impact on P-12 student learning. The SoEDQAS allows SoED leaders to engage in continuous improvement that is sustained and evidence-based, helping to identify strengths and weaknesses and to test innovations in order to set priorities that enhance programs and improve candidates' effectiveness on P-12 student development. The SoEDQAS uses a variety of assessments to monitor candidate progress and achievement and to gauge operational effectiveness. It uses programs such as SPSS and Excel to help collect, store, and analyze data; these technologies serve as databases to record the data, but they are not yet a structured system. All retrieved data come from the institution's student information system (SAP), and reports are generated from these data.
What specific examples show that changes and program modifications can be linked back to evidence/data?
The PRTCE results are an example of how changes and program modifications can be linked back to evidence/data. Years ago, the results of the PRTCE were not as satisfactory as those obtained recently. After analyzing the data collected in relation to this test, modifications were made in several courses that led to improved results. Part of the changes made at that time was to add PRTCE test reviews as part of a course, and to add clinical experiences to a course as a pre-practice. These changes helped candidates improve their performance in the skills necessary to be successful in the classroom.
How did the provider document explicit investigation of selection criteria used for Standard 3 in relation to candidate progress and
completion?
Candidates applying for admission to the Initial Programs at Universidad Ana G. Méndez, Gurabo Campus must meet the following requirements: graduation from a secondary school licensed by the Puerto Rico Council on Education or its equivalent, and completion of the University Admissions and Assessment Tests (College Board) of the College Entrance Examination Board (CEEB) or of the placement test in Spanish, English, and Math provided by the RO; some schools in the institution may have other specific program requirements. The Undergraduate Programs Catalog for 2017-2018 (http://ut.suagm.edu/es/academia/catalogo, pages 12-13) explains the General Admission Requirements. It states that high school students in their senior year can submit the admission application and provide evidence of their cumulative grade point average (GPA) (computed at the end of the first semester of the senior year) and their University Admissions and Assessment Tests (PEAU) of the College Entrance Examination Board (CEEB), SAT, or ACT results. Students in their junior year of high school can start an early process by filling in the admission application and providing evidence of their cumulative GPA computed at the end of the second semester of their junior year. Evidence of compliance with any additional requirements must be submitted to the SoED. Admission requirements vary between specific colleges and programs. Evidence 3.1.4, page 17, explains other academic regulations for the Undergraduate Programs.
Candidates admitted to the Advanced Programs at Universidad Ana G. Méndez, Gurabo Campus at the master's level must meet the following requirements: (1) hold a bachelor's degree or an equivalent degree from an accredited institution of higher education; (2) submit an official credit transcript with the application for admission; (3) complete an interview process with the director/coordinator of the graduate program or his/her representative; (4) if required, submit three letters of recommendation, according to the program; (5) submit an essay on a topic selected by the Committee, if required; (6) if required by the School or a particular program, take one of the tests of admission to graduate studies offered by the Educational Testing Service, such as the Graduate Studies Admission Test (EXADEP), the Graduate Record Examination (GRE), or the Graduate Management Admission Test (GMAT), noting that test results are valid for five years and that candidates for programs with additional admission requirements, such as additional tests, interviews, licenses, or certifications, must comply with those requirements; and (7) submit a $25.00 nonrefundable application fee. The candidate must also submit two letters of recommendation as part of the admission requirements.
On the other hand, the retention of students has taken on a priority and active role in our institution in recent years, due to the decrease in student enrollment. Diverse projects and initiatives have focused primarily on the first-year experience and the continuous improvement of services for all students. As a strategic, global, and integrated measure, the institution strengthened student services with a focus on academic performance. Additionally, the Retention Office (RO) has been restructured functionally and organizationally. The model to follow contemplates its strategy in three dimensions: academic, student, and administrative. These dimensions include the review and creation of retention committees in each school and university center, and the review of the referral process and the monitoring of students at all academic levels, among other activities. We trust that the support received from our Associate Vice-Rector for Retention, with the implementation of these strategies, will provide the projected results; these processes appear in the Student Follow-Up System Manual (Evidence 5.3.2). Aligned with the RO, the focus of the SoED is the development of high quality teacher candidates. The SoED is committed to developing reflective, collaborative, and highly effective educational leaders who will help transform education. The three cycles of data analyzed for the sufficiency determination of Standard 3 components are 2014-2015, 2015-2016, and 2016-2017. Nevertheless, many teacher preparation programs in Puerto Rico, including the SoED, have experienced a decline in enrollment in recent years.
How did the provider document that data-driven changes are ongoing and based on systematic assessment of performance, and/or
that innovations result in overall positive trends of improvement for EPPs, their candidates, and P-12 students?
The SoED Quality Assurance System (SoEDQAS) describes the School of Education (SoED) capacity to reach its mission and goals through an analysis of the evidence, and that same capacity provides access to evidence that informs all other standards and yields valid data from multiple measures. In that way, it guarantees the quality of the undergraduate and graduate programs in our school. It also documents the measures used in initial-licensure programs and in programs at the advanced level, as well as other measures used to demonstrate the knowledge, skills, and dispositions of candidates and their impact on P-12 student learning. The SoEDQAS allows SoED leaders to engage in continuous improvement that is sustained and evidence-based, helping to identify strengths and weaknesses in order to set priorities that enhance programs and to pursue innovations that improve candidates' effectiveness on P-12 student development. The SoEDQAS includes the Quality and Effectiveness Management Executive Committee (QEMEC). The QEMEC analyzes and evaluates the SoED work plan; evaluates, monitors, and audits the SoEDQAS; audits the accreditation processes and specialized accreditations of the SoED programs; and evaluates and analyzes academic procedures at the undergraduate and graduate levels. It evaluates and analyzes matters of academic importance and impact, and establishes and promotes a culture of evidence, accountability, and standardized processes. It helps maintain a culture of effective communication, ethics, professionalism, confidentiality, consensus, and teamwork, and establishes and promotes a culture of prevention, integrality, relevance, sustainability, communication, and effectiveness. The advantage of this committee is that it allows and safeguards the strategic direction of the school, maintains a global vision of risk and triggers plans for its correct management, and maintains a standard of internal control of quality and effectiveness of the processes.
How was stakeholders' feedback and input sought and incorporated into the evaluation, research, and decision-making activities?
The SoEDQAS uses a variety of assessments to monitor candidate progress and achievement and to gauge operational effectiveness. It uses programs such as SPSS and Excel to help collect, store, and analyze data; these technologies serve as databases to record the data, but they are not yet a structured system. The SoED is planning to incorporate a data collection system to facilitate data collection and analysis. All retrieved data come from the institution's student information system (SAP), and reports are generated from these data. In addition, the data analyzed for the annual report come from the different offices that generate the information. All the raw data collected are uploaded into the SPSS database; in this way we generate the statistics for the reports needed. The institution also has the Banner system, which collects extensive data about candidates. The SoEDQAS begins the collection of data from the appropriate faculty, Cooperative Teachers, University Faculty Supervisors, and professors during the academic year, using the assessment instruments listed in Evidence 5.1.1, page 3. At the end of each cycle, the data are downloaded and disaggregated for each certification area. The SoED AD and the CAEP liaison organize the data on spreadsheets for ease of review by the SoED Dean, QEMEC, program faculty, and other SoED faculty. These reports and the reports from PPM/SIAAM (Evidence 1.3.2, page 13) and the PRTCE start the continuous improvement and reporting cycle. When the new academic year begins, we share the data with program faculty, who review and analyze them; if necessary, QEMEC starts to make program decisions. Then QEMEC shares feedback with all the SoED committees and faculty, and this becomes another source of data for consideration. If decisions are needed, the different committees meet throughout the academic year (Evidence 5.1.1, page 7) to finalize decisions on curriculum and other program changes. The final step in the SoEDQAS is the QEMEC meeting with the SoED Dean, AD, and Coordinators to discuss data across all programs and to identify any challenges, and solutions to those challenges, that need to be addressed. This could include revising or developing assessments and making changes in field and clinical experiences (Evidence 5.1.1, page 2). Evidence-based decision making ensures thorough knowledge of information across the programs and that the interpretations of the data are valid and reliable. The multiple-measure data collection used to report on, modify, and evaluate the programs' operational effectiveness demonstrates how the SoED satisfies all CAEP standards. The SoEDQAS was established to provide a system for the collection, analysis, and sharing of data for CAEP Standards 1, 3, and 4.
6.2 Would the provider be willing to share highlights, new initiatives, assessments, research, scholarship, or service
activities during a CAEP Conference or in other CAEP Communications?
Yes No
6.3 Optional Comment
Character limit: 1,000 per response, left: 1,000
Section 7: Transition
In the transition from legacy standards and principles to the CAEP standards, CAEP wishes to support a successful transition to
CAEP Accreditation. The EPP Annual Report offers an opportunity for rigorous and thoughtful reflection regarding progress in
demonstrating evidence toward CAEP Accreditation. To this end, CAEP asks for the following information so that CAEP can
identify areas of priority in providing guidance to EPPs.
7.1 Assess and identify gaps (if any) in the EPP's evidence relating to the CAEP standards and the progress made on addressing
those gaps. This is an opportunity to share the EPP's assessment of its evidence. It may help to use the Readiness for
Accreditation Self-Assessment Checklist, the CAEP Accreditation Handbook (for initial level programs), or the CAEP Handbook:
Guidance on Self-Study Reports for Accreditation at the Advanced Level.
If there are no identified gaps, click the box next to "No identified gaps" and proceed to question 7.2.
No identified gaps

If there are identified gaps, please summarize the gaps and any steps planned or taken toward the gap(s) to be fully prepared by your CAEP site visit in the text box below and tag the standard or component to which the text applies. Character limit: 10,000 per response, left:

Tag the standard(s) or component(s) to which the text applies.

Not finished yet
7.2 I certify to the best of my knowledge that the EPP continues to meet legacy NCATE Standards or TEAC Quality Principles, as
applicable.
Yes No
7.3 If no, please describe any changes that mean that the EPP does not continue to meet legacy NCATE Standards or TEAC
Quality Principles, as applicable.
Section 8: Preparer's Authorization
Preparer's authorization. By checking the box below, I indicate that I am authorized by the EPP to complete the 2019 EPP Annual Report.
I am authorized to complete this report.
Report Preparer's Information
Name: Position:
Phone: E-mail:
I understand that all the information that is provided to CAEP from EPPs seeking initial accreditation, continuing accreditation or having completed the accreditation process is considered the property of CAEP and may be used for training, research and data review. CAEP reserves the right to compile and issue data derived from accreditation documents.
CAEP Accreditation Policy
Policy 6.01 Annual Report
An EPP must submit an Annual Report to maintain accreditation or accreditation-eligibility. The report is opened for data entry each year in January. EPPs are given 90 days from the date of system availability to complete the report.
CAEP is required to collect and apply the data from the Annual Report to:
1. Monitor whether the EPP continues to meet the CAEP Standards between site visits.
2. Review and analyze stipulations and any AFIs submitted with evidence that they were addressed.
3. Monitor reports of substantive changes.
4. Collect headcount completer data, including for distance learning programs.
5. Monitor how the EPP publicly reports candidate performance data and other consumer information on its website.
CAEP accreditation staff conduct annual analysis of AFIs and/or stipulations and the decisions of the Accreditation Council to assess consistency.
Failure to submit an Annual Report will result in referral to the Accreditation Council for review. Adverse action may result.
Policy 8.05 Misleading or Incorrect Statements
The EPP is responsible for the adequacy and accuracy of all information submitted by the EPP for accreditation purposes, including program reviews, self-study reports, formative feedback reports and addendums and site visit report responses, and information made available to prospective candidates and the public. In particular, information displayed by the EPP pertaining to its accreditation and Title II decision, term, consumer information, or candidate performance (e.g., standardized test results, job placement rates, and licensing examination rates) must be accurate and current.
When CAEP becomes aware that an accredited EPP has misrepresented any action taken by CAEP with respect to the EPP and/or its accreditation, or uses accreditation reports or materials in a false or misleading manner, the EPP will be contacted and directed to issue a corrective communication. Failure to correct misleading or inaccurate statements can lead to adverse action.
Annual Report 2018
THE FOUR MEASURES OF PROGRAM IMPACT
1. Impact on P-12 learning and development
IDEAL—Pre-service candidate impact on P-12 student learning is evaluated through recurring formative assessments and in some
standardized culminating assessment that includes explicit demonstration of P-12 student learning. EPP practices that integrate pre-
and post-instruction P-12 student learning into edTPA or the ETS pre-service portfolio are among examples.
In-service performance is assessed through state and/or local district teacher evaluations:
• that include information on P-12 student impact appropriately attributed to each teacher;
• the evaluation models are generally understood and accepted by technical, administrative and policy representatives;
• there are appropriate adjustments for prior P-12 student learning;
• there are appropriate adjustments for characteristics of the schools and/or students in which the teacher is employed; and
• there are additional measures such as classroom observation, teacher surveys and student surveys.
EPPs routinely make use of these data when they are provided by the state, or seek them out when they are available from school districts and can be reported and analyzed in similar ways. EPPs routinely supplement these data with special studies of teachers in grades or subjects not covered by accessible state or district teacher evaluations.
To demonstrate the program's impact in the school setting, SoED collects information about student performance when the
teacher candidate's clinical experience ends. This evidence can be obtained from the roll book, which gathers the grades of the students in
the group assigned to the candidate. To collect P-12 student learning data, we established a phase-in plan
(Appendix D) to gather information about student performance while a Teacher Candidate is in the clinical experience process. To
gather information on the candidate's performance and impact on student learning, Appendix E shows the form that collects the grades
students obtained when the Teacher Candidate evaluated them using different assessment techniques during the clinical
experience. The instrument uses the same grades that the Teacher Candidate records in the roll book. In addition, it has a column in
which the Teacher Candidate writes each student's grade before the clinical experience begins. With this information, we can compare
the students' grades before and after. We can then make decisions about the clinical experience process and the Teacher Candidate's
progress, and determine the strengths and weaknesses of Teacher Candidates. This form is new to the Teacher Candidates.
To demonstrate that program completers perceive their preparation as relevant to the responsibilities they confront on the job
and that the preparation was effective, we use the Study of Upcoming Students to Graduate, a survey that collects information
about student satisfaction at graduation. The survey shows that an average of 99.4% of candidates in 2014-2015, 97% in
2015-2016, 98% in 2016-2017, and 100% in 2017-2018 were Very Satisfied or Satisfied with the preparation they received in
the SoED programs. Comparing their satisfaction with that of candidates from other programs in the institution, the
percentages are similar, with an overall average of 98.9% across these years.
2. Indicators of teaching effectiveness
IDEAL—One or two classroom observation measures are commonly used by most EPPs. These are validated for teacher evaluation
purposes (e.g., through the MET study). Reviewers are trained and external to the program.
CAEP state partnership protocol arrangements examine the potential for such measures across the state.
The SOEDAS Assessment of Competencies Professional Level comprises 17 competencies divided into: 1-5 (Knowledge), 6-13 (Skills), and
14-17 (Values/Dispositions). This assessment tool is used when the student reaches his/her senior year. A Likert scale measures each
item: 1 (Non Acceptable), 2 (Beginner), 3 (Satisfactory), 4 (Competent), and 5 (Excellent). Behavior that depicts knowledge, skills, or
values/dispositions is operationally defined in the assessment tool to determine the level of competence on any given item. The Cronbach's
alpha reliability coefficient was 0.838, which suggests that the items have relatively high internal consistency. The 2016 Annual Report
indicates on page 9 that 100% (46/46) of completers performed above the beginner level in competence 3 (Knowledge of the organization
and preparation of the subject matter to be taught), competence 5 (Knowledge of principles and structure of subject matter), and
competence 6 (Ability to plan and implement instruction based on knowledge of subject matter, students' needs, and curricular goals).
The 2017 Annual Report indicates on page 9 that 100% (71/71) of students performed above the beginner level in those same three
competences, and the 2018 Annual Report indicates on page 9 that 100% (54/54) of students did as well.
3. Results of employer surveys, including retention and employment milestones
IDEAL—CAEP collaborates with states with the objective of creating common employer satisfaction surveys that explicitly link preparation satisfaction with various elements of preparation that are important in accreditation. As a result:
• One or two surveys are commonly administered for employers of new teachers in their first, second and third year of teaching and results are returned to the EPP;
• Questions address employer satisfaction with completers' preparation along particular dimensions of preparation (similar, perhaps, to those recently created for Ohio and Missouri);
• State Education Agencies (SEAs), State Higher Education Executive Officers (SHEEOs) and state employment agencies enter into agreements to provide employment and retention data on all first, second, or third year teachers—alternatively, employers provide these data to EPPs; and
• EPPs make comparisons with self-selected or state-selected peers and CAEP develops benchmark performances for national reporting.
During the August-December 2018 semester, we developed and administered the Employer Satisfaction Survey and the
Completers Satisfaction Survey. Because this survey is a new technique, Appendix F shows a phase-in plan to gather more information
about employer satisfaction. The Employer Satisfaction Survey and the Cooperative Teacher Satisfaction Survey for Initial Programs
provide the information and insights needed to maintain the quality of the procedures and measures taken in the Teacher Preparation
Program and to attract new students. This is the first time that this kind of instrument has been administered by the SoED.
Completers/graduates and employers are a great source of firsthand information and often know what needs to improve in the SoED
programs and services. A focus on what employers and candidates need is imperative in achieving satisfaction, loyalty, and
ultimately overall program improvement. While most institutions are aware of the need to act, continually improve, and become
more competitive, they often miss important insights from employers and students that, when used to make decisions, would make a
difference for the experience of both. That is where employer satisfaction surveys come in, uncovering the hidden insights
and data needed to improve SoED programs and compete more efficiently. The instrument comprises eight criteria that cover aspects
of the teacher's professional performance in the classroom and in the school as a whole. Of the 12 participants who answered the survey,
100% indicated that SoED complies with the criterion "The teacher preparation programs of the School of Education of the Universidad
del Turabo (UAGM) develop the necessary professional skills to be an effective teacher in the classroom"; 100% indicated that SoED
complies with the criterion "The School of Education of the Universidad del Turabo (UAGM) is the leader in developing high quality
teachers"; and 100% indicated that SoED complies with the criterion "The School of Education of the Universidad del Turabo (UAGM) is
recognized for its excellence in the preparation of future educators who, through their knowledge, skills, and dispositions, direct them
towards the transformation of education". This reflects that employers recognize that our programs prepare candidates with the
knowledge, skills, and dispositions necessary to be successful in their work area and to be hired in their schools. For the criterion "Plan
instruction based on knowledge of the subject, incorporating a variety of teaching strategies that promote the development of critical
thinking, problem solving and reflection in students aligned with the Expectations of the Department of Education", 41.6% evaluated
the performance of completers as Outstanding and 58.3% as Very Effective. For the criterion "Recognize individual needs and provide
instructional experiences that address diversity (learning styles and multiple intelligences)", 33.3% evaluated the performance of
completers as Outstanding, 58.3% as Very Effective, and 8.3% as Effective. For the criterion "Demonstrates mastery of classroom
management and use of instructional time", 41.6% evaluated the performance of completers as Outstanding and 58.3% as Very
Effective. For the criterion "Uses technology to facilitate and improve the teaching-learning process and improve student
performance", 50% evaluated the performance of completers as Outstanding, 41.6% as Very Effective, and 8.3% as Effective. For the
criterion "Properly uses formative and summative assessment strategies to determine student progress and direct the teaching-
learning process", 41.6% evaluated the performance of completers as Outstanding and 58.3% as Very Effective. For the criterion "Model
standards of appropriate behavior and use effective strategies to establish and maintain codes of conduct by responding in an
appropriate and respectful manner to students", 58.3% evaluated the performance of completers as Outstanding and 41.6% as Very
Effective. For the criterion "Reflect and evaluate the teaching-learning process, its performance and professional practices", 41.6%
evaluated the performance of completers as Outstanding, 50% as Very Effective, and 8.3% as Effective. For the criterion "Knows,
understands, collaborates and is committed to the school, the family and its community environment as a system of mutual influences
for teaching and learning", 41.6% evaluated the performance of completers as Outstanding and 58.3% as Very Effective.
This survey was also administered to the Cooperative Teachers; twelve (12) Cooperative Teachers answered it. For the criterion
"Plan instruction based on knowledge of the subject, incorporating a variety of teaching strategies that promote the development of
critical thinking, problem solving and reflection in students aligned with the Expectations of the Department of Education", 33%
evaluated the performance of completers as Outstanding, 42% as Very Effective, and 25% as Effective. For the criterion "Recognize
individual needs and provide instructional experiences that address diversity (learning styles and multiple intelligences)", 42%
evaluated the performance of completers as Outstanding, 42% as Very Effective, 8% as Effective, and 8% as Little Effective. For the criterion
"Demonstrates mastery of classroom management and use of instructional time", 33% evaluated the performance of
completers as Outstanding, 50% as Very Effective, and 17% as Effective. For the criterion "Uses technology to facilitate and improve the
teaching-learning process and improve student performance", 50% evaluated the performance of completers as Outstanding and 50% as
Very Effective. For the criterion "Properly uses formative and summative assessment strategies to determine student progress and
direct the teaching-learning process", 42% evaluated the performance of completers as Outstanding, 42% as Very Effective, and 17%
as Effective. For the criterion "Model standards of appropriate behavior and use effective strategies to establish and maintain codes of
conduct by responding in an appropriate and respectful manner to students", 58% evaluated the performance of completers as
Outstanding, 25% as Very Effective, and 17% as Effective. For the criterion "Reflect and evaluate the teaching-learning process, its
performance and professional practices", 67% evaluated the performance of completers as Outstanding, 17% as Very Effective, and 17%
as Effective. For the criterion "Knows, understands, collaborates and is committed to the school, the family and its community
environment as a system of mutual influences for teaching and learning", 67% evaluated the performance of completers as Outstanding,
25% as Very Effective, and 8% as Effective. This reflects that employers recognize that our program prepares candidates with
the knowledge, skills, and dispositions necessary to be successful in their work area and to be hired in their schools. 100% answered that
SoED complies with the criterion "The teacher preparation programs of the School of Education of the Universidad del Turabo (UAGM)
develop the necessary professional skills to be an effective teacher in the classroom"; 100% that SoED complies with the criterion "The
School of Education of the Universidad del Turabo (UAGM) is the leader in developing high quality teachers"; and 100% that the
SoED complies with the criterion "The School of Education of the Universidad del Turabo (UAGM) is recognized for its excellence in the
preparation of future educators who, through their knowledge, skills, and dispositions, direct them towards the transformation of
education". We will continue to collect this information using this instrument, and more data will be available at the accreditation visit.
4. Results of completer surveys
IDEAL—One or two surveys are commonly administered for new teachers in their first, second, and third year of teaching and results
are returned to the EPP. Questions address satisfaction of completers with particular aspects of preparation (similar, perhaps, to
those recently created for Ohio or Missouri). These data are tracked over time to indicate trends. EPPs make comparisons with self-
selected or state-selected peers and CAEP develops benchmark performances for national reporting.
To demonstrate that program completers perceive their preparation as relevant to the responsibilities they confront on the job
and that the preparation was effective, we use the Study of Upcoming Students to Graduate (Appendix G), a survey that collects
information about student satisfaction at graduation. The summary of the survey shows that an average of 99.4% of candidates in
2014-2015, 97% in 2015-2016, 98% in 2016-2017, and 100% in 2017-2018 were Very Satisfied or Satisfied with the preparation they
received in the SoED programs. Comparing their satisfaction with that of candidates from other programs in the institution, the
percentages are similar, with an overall average of 98.9% across these years.
Another survey used to determine completer satisfaction with particular aspects of preparation is the Feedback Survey,
administered at different levels. The Feedback Survey Level 1: Initial-Beginner, Level 2: Pre-Professional, and Level 3: Professional
(Evidence M4.3, pages 1, 3, & 5) gives us information about students' reactions to the program and can be used as a basis for
improvement. This survey gathers data about student knowledge, skills, and dispositions. The Feedback Survey Level 1:
Initial-Beginner (Evidence M4.3, page 1), answered by the student, includes a specific question (see the commentaries area,
question 2) about which course(s) prepared the candidate for the teaching practice experience. The candidate lists the
courses in the program that were of significance to him or her. A review of the data indicates that 100% of candidates at this level
selected EDUC 106 as the most significant course in preparing them for the clinical experiences. EDUC 106 is the first professional course in
the curriculum of the teacher preparation program (Evidence M4.3, page 25). It introduces concepts related to education while
students explore their individual commitment to teaching as a career, and their strengths and weaknesses. Special emphasis is
placed on observation and analysis of school scenarios, especially the teaching-learning process. The different roles a teacher must take
as part of his/her school functions are discussed. The student completes 15 hours of clinical experiences. This indicates that the
students understand the responsibilities they will confront on the job and that the preparation was effective. In the Feedback
Survey Level 2: Pre-Professional (Evidence M4.3, see the commentaries area, question 2, page 3), the students choose from a list of
courses those they think helped prepare them for their clinical experience. A review of the data indicates that 89% of candidates at this
level selected EDUC 401 as the most significant course in helping them understand the educational process. EDUC 401 is the second clinical
experience requirement in the School of Education's Teacher Preparation Programs. It includes fifteen hours of campus-based seminar
and 30 clinical experience hours of direct observation and active participation in at least 2 different school scenarios, as well as 15
lecture hours. In the Feedback Survey Level 3: Professional, a review of the data indicates that 81% of completers totally agree, and
12% agree, that the courses prepared them for the professional role. Ninety percent of completers totally agree with the criterion
asking the degree to which the Professional Level allowed them to reflect the dispositions of Leadership, Reflection, and Collaboration
during their studies in the School of Education. Eighty-four percent of completers totally agree with the criterion asking the degree to
which the Professional Level (teaching practice) allowed them to reflect the Knowledge, Skills, and Dispositions of their profession.
THE FOUR MEASURES OF PROGRAM OUTCOME AND CONSUMER INFORMATION
5. Graduation rates
IDEAL—EPP statistical records have capacity to follow individual students longitudinally from admission to completion and at least three years thereafter. For each entering cohort, statistics are derived on those who dropped out or were counseled out, those who completed the full program and certification, and those employed. From these data the completion rate is calculated as the number of completers divided by the number of admitted candidates in a cohort. Dropouts and counseled out candidate rates will be calculated similarly. EPPs make comparisons with self-selected or state-selected peers and CAEP develops benchmark performances for national reporting.
The requested information can be found at the following website:
http://ut.suagm.edu/es/asuntos-estudiantiles/divulgacion/informacion-estudiantil
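The completion-rate statistic CAEP describes above (number of completers divided by the number of admitted candidates in a cohort, with dropout and counseled-out rates computed the same way) can be sketched as follows; all cohort counts here are hypothetical, not the EPP's actual figures:

```python
def cohort_rates(admitted, completed, dropped_out, counseled_out):
    """Per-cohort rates as defined in the CAEP ideal: each count
    divided by the number of candidates admitted to the cohort."""
    return {
        "completion_rate": completed / admitted,
        "dropout_rate": dropped_out / admitted,
        "counseled_out_rate": counseled_out / admitted,
    }

# Hypothetical cohort: 80 admitted, 60 completed, 15 dropped out, 5 counseled out
rates = cohort_rates(admitted=80, completed=60, dropped_out=15, counseled_out=5)
print(rates["completion_rate"])  # 0.75
```

Following candidates longitudinally from admission onward is what makes the denominator (admitted, not completed) possible.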
6. Ability of completers to meet licensing (certification) and any additional state requirements
IDEAL—State licensure tests are closely aligned with InTASC, Common Core, college and career ready, and CAEP Standards. They have many common features that make them at least partially aligned, and they are scored so that comparisons can be made. CAEP would require an 80% pass rate on either the first or second administration for completing candidates. The statistic is defined as number of licenses earned by completers in a cohort divided by number of admitted candidates in the cohort. Trends are reported for three to five years. EPP statistical records have capacity to follow individual candidates longitudinally from admission to completion and at least three years thereafter. These records include data on licensure test taking and results. EPPs compare their results with self-selected or state-selected peers and CAEP publishes national data with benchmark performances for groups of EPPs.
The Puerto Rico Teacher Certification Exam (PRTCE) results provide evidence of the program's effectiveness. The 2016
Annual Report (AR) shows on page 7 that 77 of 84 candidates passed the PRTCE. The 2017 AR shows on page 18 that 59
of 66 candidates passed the PR 21 – Elementary Level and 11 of 11 candidates passed the PR 25 – Secondary Level. The
2018 AR establishes on page 7 that 40 of 45 candidates passed the PRTCE. The College Board prepares this exam, and it was
administered to all students authorized by SoED to take the test. The PR10 exam is the component of the PRTCE that measures
concepts related to the general education component of the program. Since March 2015, PR10 has been part of the General PRTCE,
which includes both the general education and the professional pedagogical components. The cut-off score is 89 points on a scale
from 40 to 160 points (theoretical mean of 100 points). The expected 75% of students passing the test was achieved. No statistically
significant differences were identified between groups after an Independent Samples t-Test analysis. In 2014-2015, 91.67% of students
passed the test; in 2015-2016, 91.18%; in 2016-2017, 88.89%; and in 2017-2018, 80.8%. This demonstrates that candidates are
sufficiently prepared to take the test after their practicum experience. The following table summarizes the number of students
passing the certification test by specialty (Evidence M4.1).
Percentage of Candidates Passing the Puerto Rico Teacher Certification Exam, 2017-2018
Exam N % Passing
General 94 85.1%
Social Studies/History 5 60.0%
Spanish 0 0%
English 17 82.3%
Mathematics 4 50%
Science 5 40%
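The Independent Samples t-Test mentioned above can be sketched in pure Python (here in Welch's unequal-variance form, which is one common choice; the report does not specify the variant). The two score lists are hypothetical illustrations on the PRTCE's 40-160 scale, not actual exam data:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's independent-samples t statistic (does not assume equal variances)."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / sqrt(va + vb)

# Hypothetical PRTCE scaled scores (40-160 scale) for two candidate groups;
# illustrative numbers, not the actual College Board results.
group_1 = [98, 105, 110, 95, 102, 108]
group_2 = [97, 104, 109, 96, 101, 107]

t = welch_t(group_1, group_2)
# |t| well below ~2 here, consistent with "no statistically significant
# difference" at the usual alpha = .05 level (a formal test would compare t
# against the t distribution with Welch-Satterthwaite degrees of freedom).
```

A small |t| statistic like this is what underlies the report's finding of no statistically significant difference between groups.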
7. Ability of completers to be hired in education positions for which they were prepared
IDEAL—EPPs report the completer employment status as of September 1 after preparation program termination, disaggregated by:
a. employed in position for which trained/ admitted cohort;
b. employed in any other education position/ admitted cohort;
c. enrolled in continuing education/ admitted cohort;
d. other employment/ admitted cohort; and
e. other or not employed/ admitted cohort.
The statistic would be defined as the number in each of the a-through-e categories divided by the number of completers used in item 5, graduation rates. CAEP would use these data to develop national benchmarks on groups of similar EPPs.
Of the 88 completers in 2018, we have contacted 55, a contact rate of 62.5%. Of these, 32% are working in public schools,
16% in private schools, 3.6% as school directors, 27.2% in other, non-related jobs, and 7.2% in education-related jobs in the
United States; 12.7% are unemployed.
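The contact rate and category percentages above reduce to simple headcount arithmetic. A sketch with hypothetical per-category counts (the report gives only percentages, so the counts below are illustrative, not the actual figures):

```python
def employment_breakdown(counts, contacted):
    """Percentage of contacted completers in each employment category."""
    return {cat: round(100 * n / contacted, 1) for cat, n in counts.items()}

# Hypothetical per-category headcounts for the 55 contacted completers
counts = {
    "public_school": 18,
    "private_school": 9,
    "director": 2,
    "non_related_job": 15,
    "unemployed": 7,
    "us_education_job": 4,
}
contact_rate = round(100 * 55 / 88, 1)          # 55 of 88 completers reached
breakdown = employment_breakdown(counts, contacted=55)
print(contact_rate)                 # 62.5
print(breakdown["director"])        # 3.6
```

Keeping the raw counts alongside the percentages makes the figures reproducible across annual reports.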
8. Student loan default rates and other consumer information
IDEAL–Student loan default rates are calculated from U.S. Department of Education data by extracting the EPP information from the institution-level report. This is one of several indicators of "consumer information" that include additional measures created by EPPs such as:
• Cost of attendance using some common methodology,
• Beginning salary of completers based on official employer records and trends over time, and
• Placement location patterns for completer cohorts, with trends over time.
Evidence M8.1 shows the official notification of your school's fiscal year (FY) 2015 official cohort default rate (CDR) data. According to the Higher Education Act of 1965 (HEA), as amended, the Higher Education Reconciliation Act of 2005 (HERA), Pub.L.109-71 and the Department's regulations, your school is not subject to any sanctions based on your school's FY 2015 CDR. Evidence M8.2 shows a summary of cost of attendance at the School of Education at UAGM Gurabo Campus. You can access this information at http://ut.suagm.edu/es/asuntos-estudiantiles/divulgacion/servicios-estudiantiles. Salaries for Middle School Teachers, Except Special Education and Professional / Technical Education in Puerto Rico are as follows:
Graph 1: Comparison of Salaries in Puerto Rico and USA for Middle School Teachers
Salaries for Kindergarten Teachers, Except Special Education in Puerto Rico are as follows:
Graph 2: Comparison of Salaries in Puerto Rico and USA for Kindergarten/Elementary Teachers
Salaries for Special Education Teachers of Secondary School in Puerto Rico are as follows:
Graph 3: Comparison of Salaries in Puerto Rico and USA for Special Education Teachers of Secondary School
These graphs show the differences in teacher salaries between Puerto Rico and the United States. More information can be found
at the following website: https://www.miproximopaso.org/profile/ext/salary/25-2021.00?s=PR&g=Contin%C3%BAe.
Appendix
APPENDIX A
Phase In Plan
Rubrics Inter Rater Reliability and Validity
EPP’s evidence to support the Reliability and Validity of created rubrics and other instruments
Relationship to standard or component
The phase-in plan is aligned with CAEP Standard 1, Components 1.1, 1.2, and 1.4; Standard 2, Component 2.3; Standard 3,
Components 3.3 and 3.4; and Standard 4, Components 4.2 and 4.4.
Description: The Feedback Survey Level 1: Initial-Beginner (Evidence 1.1.1, page 1), Level 2: Pre-Professional (Evidence 1.1.1,
page 3), and Level 3: Professional (Evidence 1.1.1, page 5) are designed to gather information about candidates' knowledge, skills,
and dispositions/values. Disposition here means that the teacher candidate shows the leadership and collaboration attributes
expected of an education professional. This aspect is measured at all three levels, at the end of the course corresponding to each
level. At each level (EDUC 106, EDUC 401, and Practicum), future teachers (candidates) are dedicated to the study of professional
courses and concentration courses that help them acquire the knowledge, skills, and dispositions required in the teaching
profession. The feedback data help to improve teaching and learning experiences for candidates and faculty, and help faculty
engage in a scholarly review of their teaching by reflecting on class design, delivery, candidate engagement, and assessment.
The data will also benchmark teaching and learning quality within SoED programs' courses of study, provide evidence for
teaching staff to use as indicators of current teaching performance and course difficulties, and provide evidence for faculty
promotion.
Description: The Satisfaction Surveys used in the Undergraduate and Graduate Programs provide the information and insights
needed to keep completers and employers pleased with the procedures and measures taken in the Teacher Preparation
Program and to attract new students. This is the first time that this kind of instrument has been administered by the SoED.
Completers/graduates and employers are a great source of firsthand information and often know what needs to improve
in the SoED programs and services. A focus on what employers and candidates need is imperative in achieving
satisfaction, loyalty, and ultimately overall program improvement. While most institutions are aware of the need to act,
continually improve, and become more competitive, they often miss important insights from employers and candidates
that, when used to make decisions, would make a difference for the experience of both.
The Assessment of Competencies, the Teaching Practicum Evaluation Instrument, and the Practicum Student Evaluation Rubric for
the Initial Level, Pre-Professional Level, and Practicum Level are used to evaluate candidates' knowledge, skills, and dispositions at
each level, as a measure of candidate progress collected once per semester in the assigned course. These rubrics evaluate the
candidate's performance during each level, covering areas such as the development and implementation of a lesson plan. Each
rubric is divided into knowledge, skills, and dispositions. Under these criteria, the candidate is evaluated according to his or her
level, on everything from communication skills to execution in the classroom.
Objective of the data/evidence collection
o The collected data evaluate candidates' performance to demonstrate that they have acquired the knowledge necessary to carry out planning and curricular development activities, and whether they understand the development of a lesson plan and the measurement and evaluation of learning. The data also show whether candidates use the acquired knowledge to design lessons according to the content, and whether they use a variety of teaching strategies and methods.
o For the Satisfaction Surveys, the collected data evaluate candidates' performance to demonstrate that they have acquired the knowledge necessary to analyze different scenarios, according to the information gathered through the eyes of employers and other personnel.
Timeline & Resources
Detailing of strategies, steps, and a schedule for collection through full implementation, and an indication of what is to be available by the time of the site visit;
Objective 1: Evaluate the 3-level Feedback Surveys. Baseline (2018-2019): December 2018 pilot data. Site Visit (December 2019): 2019/2020 pilot data available.
Objective 2: Evaluate the 3-level Assessment of Competencies Rubric, Teaching Practicum Evaluation Instrument, and Practicum Student Evaluation Rubric. Baseline: December 2018 pilot data. Site Visit: 2019/2020 pilot data available.
Objective 3: Evaluate the Employer Satisfaction Surveys in the Undergraduate and Graduate Programs. Baseline: December 2018 pilot data. Site Visit: 2019/2020 pilot data available.
Objective 4: Implement the Feedback Surveys. Baseline: August 2019 data. Site Visit: Fall 2019 data available.
Objective 5: Implement the Competencies Rubric, Teaching Practicum Evaluation Instrument, and Practicum Student Evaluation Rubric. Baseline: August 2019 data. Site Visit: Fall 2019 data available.
Objective 6: Implement the Employer Satisfaction Surveys. Baseline: August 2019 data. Site Visit: Fall 2019 data available.
Description of the personnel, technology and other resources available; institutional review board approvals, if appropriate; and EPP access to data compilation and analysis capability.
o The CAEP liaison will oversee planning, implementation, and assessment of the rubrics and surveys. The Associate Dean and the Practicum Coordinator will assist faculty in collecting the instruments. Faculty will administer the Feedback Surveys and rubrics in selected undergraduate and graduate classes each semester (fall/spring).
Data Quality
A copy of the collection instrument if it is available
o A copy of each instrument is available as part of the SSR evidence.
Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric
o The faculty will administer the instruments in the courses each semester (fall/spring).
o The faculty hand in the instruments to the CAEP Liaison to start the statistical analysis.
o The CAEP Liaison discusses the results with the Associate Dean and the Dean.
o The results are published on the SoED web page (under construction).
Steps that will be taken to attain a representative response, including: the actions to select and follow up a representative
sample (or, a purposeful sample if that is appropriate for the data collection) and actions to ensure a high response rate
o Candidate data from all courses are gathered throughout the phase-in plan. The instrument will be used during EDUC 401, EDUC 106, and the Practicum in undergraduate and graduate courses as a required component of each course.
Steps to ensure content validity and to validate the interpretations made of the data
o The rubrics and surveys are tools created by faculty. To determine their validity, we will evaluate the internal consistency of the instruments. According to Kumar (2017), internal consistency reliability is a measure of reliability used to evaluate the degree to which different test items that probe the same construct produce similar results; it examines whether the items within a scale or measure are homogeneous. Faculty will review the data after each semester, and individual results that raise concern will be addressed in a meeting of all faculty. The faculty will provide support depending on the results for each rubric.
o To ensure that the instruments gather valid and reliable data, the forms will be signed by the University Faculty Supervisor. After all the forms are gathered, the data will be analyzed using SPSS to evaluate internal consistency reliability. Faculty will review the data after each semester, and individual results that raise concern will be addressed in a meeting of all faculty.
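The internal-consistency statistic referred to above (Cronbach's alpha) can be cross-checked outside SPSS. The sketch below is a minimal illustration, not the SoED's actual procedure; the `ratings` matrix is hypothetical sample data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    x = np.asarray(scores, dtype=float)
    k = x.shape[1]                         # number of items in the scale
    item_vars = x.var(axis=0, ddof=1)      # sample variance of each item
    total_var = x.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical ratings: 4 respondents scoring 3 items on a 1-5 scale
ratings = [[5, 4, 5],
           [4, 4, 4],
           [2, 3, 2],
           [5, 5, 5]]
print(round(cronbach_alpha(ratings), 3))  # prints 0.947
```

Values near 1 indicate homogeneous items, consistent with the interpretation quoted from Kumar (2017).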
References:
Aycock-Cushman, C. A., & Kemp, A. (2012). The Effects of Clinical Experiences on the Understanding of Classroom Management Techniques. Journal of Inquiry & Action in Education, 4(3).
Mohajan, H. (2017). Two Criteria for Good Measurements in Research: Validity and Reliability. Retrieved from https://mpra.ub.uni-muenchen.de/83458/1/MPRA_paper_83458.pdf
Nesari, A. J., & Heidari, M. (2014). The Important Role of Lesson Plan on Educational Achievement of Iranian EFL Teachers' Attitudes. International Journal of Foreign Language Teaching & Research, 3(5).
APPENDIX B
Summary of Rubric Intra-Class Correlation

Course     Rubric                                          Intra-class Correlation   95% CI Lower   95% CI Upper
EDUC 44_   Clinical Experience Evaluation Instrument 2015  .985                      .973           .993
EDUC 44_   Clinical Experience Evaluation Instrument 2017  .951                      .911           .978
EDUC 515   Student Evaluation Rubric                       1.000                     1.000          1.000
EDUC 44_   Portfolio Rubric                                .550                      .019           .857
EDUC 106   Competencies Initial Level Rubric               .958                      .922           .981
EDUC 401   Competencies Pre-Professional Level Rubric      .916                      .851           .960
EDUC 44_   Clinical Experience Rubric                      .913                      .808           .974
           Satisfaction Survey Undergraduate Completers    .807                      .567           .939
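For readers who want to reproduce figures like the intra-class correlations above, the sketch below computes a consistency-type ICC (ICC(3,1) in the common two-way taxonomy) from a subjects-by-raters matrix via an ANOVA decomposition. The report does not state which ICC form was used, so this variant is an assumption, and the example matrices are hypothetical:

```python
import numpy as np

def icc_consistency(scores):
    """ICC(3,1): two-way, consistency, single rater, for a subjects-by-raters matrix."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    # Two-way ANOVA sums of squares
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between raters
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols  # residual
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Two hypothetical raters in perfect agreement on three candidates
print(icc_consistency([[1, 1], [2, 2], [3, 3]]))  # prints 1.0
```

Note that a consistency ICC ignores a constant offset between raters; an agreement-type ICC would penalize it.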
Summary of Rubric Cronbach Coefficient

Course     Rubric                                          Cronbach's Alpha   Cronbach's Alpha (Standardized Items)
EDUC 44_   Clinical Experience Evaluation Instrument 2015  .984               .987
EDUC 44_   Clinical Experience Evaluation Instrument 2017  .951               .950
EDUC 515   Student Evaluation Rubric                       1.000              1.000
EDUC 44_   Portfolio Rubric                                .550               .552
EDUC 106   Competencies Initial Level Rubric               .958               .961
EDUC 401   Competencies Pre-Professional Level Rubric      .916               .903
EDUC 44_   Clinical Experience Rubric                      .913               .917
EDUC 44_   Satisfaction Survey Undergraduate Completers    .807               .842
Summary of Item Analysis

Item Statistics
Item                                                              Mean   Std. Deviation   N
Knowledge of principles (Conocimientos principios)                4.83   .408             6
Negotiation skills (Destrezas Negociar)                           4.83   .408             6
Knowledge of theories and models (Conocimientos Teorias Modelos)  4.83   .408             6
Command of technologies (Dominio tecnologías)                     4.83   .408             6
Knowledge of resources (Conocimientos recursos)                   4.83   .408             6
Appreciation and sensitivity (Aprecio sensibilidad)               4.83   .408             6
APPENDIX C
AREAS OF IMPROVEMENT
Plan: New LMS platform as part of the SOEDQAS
Objective: Use a new platform to gather insight data that are accessible, relevant, and actionable; leverage data from all touch points and deliver insights at every level; provide the right information to the right person, at the right time, and identify at-risk candidates and candidate success.
Standard: 5.3 Continuous Improvement; Testing Innovations as Part of Standard 5, Continuous Improvement.
Timeline: Starting January 2019.
Data Quality: Rubrics: the courses will be placed in Blackboard with rubrics, available in the platform for the professor to use in each course; this keeps the data available at all times. Course grades: Blackboard uses the IMS Caliper Analytics standard to combine students' activity stream data from Blackboard Learn and VitalSource to see if it would improve predictions about student achievement.
Steps: 1. Review rubrics and surveys to assure they comply with the CAEP assessment level. 2. Design courses with rubrics in Blackboard. 3. The professors who teach the key courses in each program will have the Blackboard platform available for their courses. 4. Each professor will hand in a grades report and a rubric analysis using the AR1.
Available Evidence: Examples of rubrics with tasks; grades distribution; AR1; data collected in March 2019 and May 2019; Feedback Survey.

Plan: Survey Monkey
Objective: Use online, cloud-based survey software (SurveyMonkey) to collect and analyze information from a variety of sources, such as students, cooperating teachers, student supervisors, directors, and others.
Standard: 4.4 Required component: the provider demonstrates, using measures that result in valid and reliable data, that program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective.
Timeline: Starting January 2019.
Data Quality: Feedback surveys: at the end of each key course, the feedback survey will be administered.
Steps: 1. Review surveys to assure they comply with the CAEP assessment level. 2. Starting in October 2018, the professors who teach the key courses in each program will have the survey available for their courses through the SurveyMonkey platform.
Available Evidence: Data collected in December 2018, March 2019, and May 2019; Feedback Survey.

Plan: Data Sharing
Objective: Publish the data collected from the SOEDQAS and display it on the institution's web page to be shared with, and reviewed by, all system stakeholders, including program faculty, QEMEC, and all the SOED committees.
Standard: 5.4 Required component: measures of completer impact, including available outcome data on P-12 student growth, are summarized, externally benchmarked, analyzed, shared widely, and acted upon in decision making related to programs, resource allocation, and future direction.
Timeline: Starting January 2019.
Data Quality: PCMAS results; SIAAM results; Feedback Survey results.
Steps: 1. Review the surveys and data that are ready to be shared to assure they comply with the CAEP standards. 2. Starting in January 2019, the SOED web page will be updated every month with information regarding accreditation and student outcomes.
Available Evidence: All information regarding student outcomes, accreditation, enrollment, retention, and graduation data.

Plan: Assessments Revision and Statistical Analysis
Objective: Examine whether the rubrics used in the SOED have validity and reliability and comply with the sufficient level or above on the CAEP Evaluation Framework for EPP-Created Assessments.
Standard: At least 75% of EPP-created assessments used in the QAS are scored at the sufficient level or above on the CAEP Evaluation Framework for EPP-Created Assessments, with particular attention to content validity.
Timeline: Starting March 2019, pilot project.
Data Quality: Test-retest reliability; inter-rater reliability; face validity.
Steps: 1. Meeting with the Curriculum Committee to establish the work plan. 2. Meeting with the statistician to determine which tests we need to use. 3. Meeting with the statistician to discuss the results of the tests administered to the selected rubrics. 4. Sharing the results and establishing a new plan with the committee, if needed, to solve any problem with the instruments.
Available Evidence: See Evidence 5.2.1, summary of the assessments to be tested.

Plan: Innovation Teaching and Clinical Experiences Center (ITCEC)
Objective: Make decisions related to the students' experiences in public or private schools.
Standard: 5.5 The provider assures that appropriate stakeholders, including alumni, employers, practitioners, school and community partners, and others defined by the provider, are involved in program evaluation, improvement, and identification of models of excellence.
Timeline: August 2018.
Data Quality: Meeting decisions.
Steps: 1. Meeting with the Clinical Supervisors. 2. Selection of the committee members.
Available Evidence: Meeting agenda.

Plan: Candidate Feedback for the Advanced Program
Objective: Gather information about candidates' and completers' satisfaction with the program and their knowledge, skills, and dispositions.
Standards: 5.4 Measures of candidate and completer impact, including available outcome data on P-12 student growth, are summarized, externally benchmarked, analyzed, shared widely, and acted upon in decision making related to programs, resource allocation, and future direction. A.4.4 The provider demonstrates, using measures that result in valid and reliable data, that program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective.
Timeline: Starting March 2019.
Data Quality: Student Feedback Survey.
Steps: 1. Review the instrument to gather the necessary data. 2. Start the pilot study. 3. Gather the data. 4. Analyze the data and publish the results.
Available Evidence: Candidate Feedback Survey for the Advanced Program; meeting agenda; pilot results.

Plan: EDUC 515 Practicum Manual Revision
Objective: Adjust the manual to the new changes that have arisen in the PRDE.
Standard: A.2.2 The provider works with partners to design varied and developmental clinical settings that allow opportunities for candidates to practice applications of content knowledge and skills that the courses and other experiences of the advanced preparation emphasize. The opportunities lead to appropriate culminating experiences in which candidates demonstrate their proficiencies through problem-based tasks or research (e.g., qualitative, quantitative, mixed methods, action) that are characteristic of their professional specialization, as detailed in component A.1.1.
Timeline: Starting March 2019.
Data Quality: EDUC 515 Practicum Manual.
Steps: 1. Meeting with the Curriculum Committee to establish the work plan. 2. Meeting with the professors to establish the work plan. 3. Meeting with the professors to discuss the manual. 4. Put the manual to use.
Available Evidence: Practicum Manual.
After the review of all the programs and the requirements in the Initial and Advanced Accreditation Handbooks, there is a need to create a platform as part of the SOEDQAS process. To comply with all the data needed for all the reports, the SoED Dean is planning to build, with the Central Office of Informatics and Technology (OCIT), a new LMS platform to gather insight data that will be accessible, relevant, and actionable; leverage data from all the important points; and deliver insights at every level. The platform will provide the right information to the right person, at the right time, and identify at-risk candidates and candidate success. This process started in January with the first meeting to understand the data needed to comply with continuous improvement (CAEP 5.3) and the decision making of the SoED. Part of the plan is to upload the rubrics to the courses in Blackboard; they will be available in the platform for the professor to use in each course, which keeps the data available at all times. The SoED will also use Blackboard's IMS Caliper Analytics standard to combine students' activity stream data from Blackboard Learn with course grades, to see if it would improve predictions about student achievement.
To improve data gathering, it is important to use as many tools as possible. Online, cloud-based survey software (SurveyMonkey) will be used to collect and analyze information from a variety of sources, such as students, cooperating teachers, student supervisors, directors, and others. It will be used for the Feedback Surveys (initial and advanced) at the end of each key course as a required component (CAEP 4.4). This demonstrates that the SoED uses measures that result in valid and reliable data, that program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective. The data collection will take place in December and May 2019.
The data collected from the SOEDQAS will be published and displayed on the SoED web page to be shared with, and reviewed by, all system stakeholders, including program faculty, QEMEC, and all the SOED committees. This is necessary to comply with CAEP 5.4, in which measures of completer impact, including available outcome data on P-12 student growth, are summarized, externally benchmarked, analyzed, shared widely, and acted upon in decision making related to programs, resource allocation, and future direction. This process will start in March 2019.
The Assessments Revision and Statistical Analysis is important for measuring candidate performance. Examining whether the rubrics used in the SOED have validity and reliability and comply with the sufficient level or above on the CAEP Evaluation Framework for EPP-Created Assessments is a priority. We set a goal of having at least 75% of the EPP-created assessments used in the QAS scored at the sufficient level or above on the CAEP Evaluation Framework for EPP-Created Assessments, with particular attention to content validity. The pilot project will start in March 2019. In addition, a Phase-In Plan (Appendix A, page 1) specifies the revision of the assessments, including validity and reliability.
The Innovation Teaching and Clinical Experiences Center (ITCEC) will be created to take charge of all matters related to clinical experiences. This center will make decisions related to the students' experiences in public or private schools. Its creation is aligned with CAEP 5.5 and assures that appropriate stakeholders, including alumni, employers, practitioners, school and community partners, and others defined by the provider, are involved in program evaluation, improvement, and identification of models of excellence. Several meetings will need to take place before the center becomes a reality.
In order to gather the data necessary to comply with CAEP 5.4 and A.4.4, there is a need to create the Candidate Feedback for the Advanced Program to collect information about candidates' and completers' satisfaction with the program and their knowledge, skills, and dispositions. This Feedback Survey is different from the other satisfaction surveys used as evidence for these standards because its purpose is to understand how candidates perceive their own learning and whether they are acquiring the necessary knowledge, skills, and dispositions.
The clinical experience is the most important phase in a director's preparation. The main objective of the EDUC 515: Practicum in School Administration and Supervision Manual is to give direction to the practice phase and, in particular, to assist the student-director (candidate) in his or her performance as such. This document presents a broad conceptualization of what school administration is, in order to serve as a guide and stimulus in the development of good administrative practices. It also contains a summary of the monitoring process and the different modalities that allow it to be differentiated to respond to the needs and interests of those supervised. The manual needs to be adjusted to the new changes that have arisen in the PRDE.
APPENDIX D
Roll Book Data Collection
Relationship to standard or component
The phase-in plan is aligned with CAEP Standard 4, component 4.1.
o Description: Roll Book Data Collection: The roll book is a book in which teachers keep a record of the attendance and classwork of their pupils. It is important to maintain a record of student performance, and it is a requirement of the Clinical Experience of the teacher candidate (completer). Collecting information on students' performance during the teacher candidates' Clinical Experience can give an idea of how completer performance impacts student performance. According to Muleta & Gurmesa (2015), several factors influence students' academic performance; one of those factors is the educator, since teachers play a great role in fostering positive or negative attitudes toward student achievement.
Objective of the data/evidence collection
o The collected data evaluate completers' performance to demonstrate that they have acquired the knowledge necessary to carry out planning and curricular development activities, whether they understand the development of a lesson plan and the measurement and evaluation of learning, and the impact of their teaching on students' performance.
Timeline & Resources
Detailing of strategies, steps, and a schedule for collection through full implementation, and an indication of what is to be available by the time of the site visit;
Objective 1: Gather the data using the Impact of Completer-Teacher Candidate Instrument. Baseline (2018-2019): January-May pilot data. Site Visit (December 2019): 2019/2020 pilot data available.
Objective 2: Evaluate the data using the Impact of Completer-Teacher Candidate Instrument. Baseline: May 2019 pilot data. Site Visit: 2019/2020 pilot data available.
Objective 3: Share the results with the stakeholders and faculty. Baseline: August 2019 pilot data. Site Visit: 2019/2020 pilot data available.
Description of the personnel, technology and other resources available; institutional review board approvals, if
appropriate; and EPP access to data compilation and analysis capability.
o The CAEP liaison will oversee planning, implementation, and assessment of the instrument. The CAEP Liaison and the Associate Dean will assist faculty in data collection and analysis. After the review of the data, the procedure will be implemented in the Clinical Experiences Protocol and Manual.
Data Quality
A copy of the collection instrument if it is available
o A copy of the instrument will be available.
Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric
o To ensure that the instrument is appropriate for the data needed, the faculty will review the instrument and offer suggestions for improvement as necessary.
Steps that will be taken to attain a representative response, including: the actions to select and follow up a representative
sample (or, a purposeful sample if that is appropriate for the data collection) and actions to ensure a high response rate
o The Associate Dean guarantees the distribution of the instrument to the entire Clinical Experience faculty at the beginning of the semester and its collection at the end of the semester.
Steps to ensure content validity and to validate the interpretations made of the data
o To ensure that the instrument gathers valid and reliable data, the form will be signed by the University Faculty Supervisor. After all the forms are gathered, the data will be analyzed using SPSS. According to Kumar (2017), internal consistency reliability is a measure of reliability used to evaluate the degree to which different test items that probe the same construct produce similar results; it examines whether the items within a scale or measure are homogeneous. Faculty will review the data after each semester, and individual results that raise concern will be addressed in a meeting of all faculty.
References:
Aycock-Cushman, C. A., & Kemp, A. (2012). The Effects of Clinical Experiences on the Understanding of Classroom Management Techniques. Journal of Inquiry & Action in Education, 4(3).
Mohajan, H. (2017). Two Criteria for Good Measurements in Research: Validity and Reliability. Retrieved from https://mpra.ub.uni-muenchen.de/83458/1/MPRA_paper_83458.pdf
Muleta-Akessa, G., & Gurmesa-Dhufera, A. (2015). Factors that Influences Students Academic Performance: A Case of Rift Valley University, Jimma, Ethiopia. Journal of Education and Practice, 6(22).
Nesari, A. J., & Heidari, M. (2014). The Important Role of Lesson Plan on Educational Achievement of Iranian EFL Teachers' Attitudes. International Journal of Foreign Language Teaching & Research, 3(5).
APPENDIX E
Roll Book Form
Teacher candidate name Date
University Faculty Supervisor Name Clinical Experience Center
Subject Level
Instructions: Using your record book (roll book), transfer your students' grades into each column without entering the students' names. In the first column, place the student number corresponding to your record. In the second column, write the grade the student had before beginning your assignments; please enter the points and the letter (e.g., 98/A). In columns 1 to 10, write the grades for your students' assignments/assessments, using as many columns as you need. In the Total column, write the total points. In the last column, enter the student's average grade (e.g., 98/A).
Students Grades
Student | Grade at Start | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | Total | Average (%)
(Rows numbered 1 through 21, left blank for entry.)
21
Please provide the following information: in the second column (Amount of Students), write the total number of students who earned each letter grade (A, B, C, ...); in the third column, write the percent for each grade; and in the last column, write the total number of students in the class.
Grade | Amount of Students | Percent (%) | Total amount of students in the class
A
B
C
D
F
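The tallies requested on this form can be cross-checked mechanically. The sketch below is an illustrative aid only; it assumes the common 90/80/70/60 percent cutoffs for letter grades, which the form itself does not specify, and the class data are hypothetical:

```python
def letter_grade(average):
    """Map a percent average to a letter, assuming 90/80/70/60 cutoffs."""
    for cutoff, letter in ((90, "A"), (80, "B"), (70, "C"), (60, "D")):
        if average >= cutoff:
            return letter
    return "F"

def grade_distribution(averages):
    """Count of students and percent of the class for each letter grade."""
    total = len(averages)
    counts = {g: 0 for g in "ABCDF"}
    for avg in averages:
        counts[letter_grade(avg)] += 1
    return {g: (n, round(100 * n / total, 1)) for g, n in counts.items()}

# Hypothetical class of four students
print(grade_distribution([98, 85, 84, 59]))
# {'A': (1, 25.0), 'B': (2, 50.0), 'C': (0, 0.0), 'D': (0, 0.0), 'F': (1, 25.0)}
```

The counts and percents returned correspond to the second and third columns of the table above.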
Teacher Candidate Signature: _________________________________________________ Date:
_____________________________________
University Faculty Supervisor Signature: ___________________________________________ Date:
___________________________________
APPENDIX F
Employer Satisfaction Survey (Superintendents)
Relationship to standard or component
The phase-in plan is aligned with CAEP Standards A.4 and A.5.
o Description: The Employer Satisfaction Survey (Evidence A.4.2.1) has eight criteria that summarize what the program expects of an employed graduate in a Director/Leadership position in a public or private school. The survey asks whether the administrator is satisfied with the preparation of the alumni directors of the Universidad Ana G. Méndez Gurabo Campus (former Universidad del Turabo) regarding the execution of the responsibilities assigned to them at the school. Another question asks, based on the director's performance, how well prepared UAGM alumni principals/directors are to work in the schools. The instrument divides the criteria into three categories, knowledge, skills, and dispositions, and aligns them with the CAEP standards, the ELCC standards, and the Directors Profile. The criteria for Knowledge are: 1. Facilitates and directs, together with his or her team, the preparation of action plans based on the proposed goals and objectives, and considers the evaluation processes as a means to make decisions that contribute to improving learning (CAEP: 1.1, 1.4; ELCC: 1.1; ICAAE/Directors Profile: A.I); and 2. Interprets, together with the faculty, the results of the PPAA and other evaluation instruments, as well as the grade reports, to identify strengths and limitations of their students, with the purpose of incorporating into the plan of action activities aimed at the improvement of the teaching-learning process (CAEP: 1.4; ELCC: 1.2; ICAAE: A.I.2; B.I.3). The criteria for Skills are: 3. Fosters an adequate organizational climate, supporting the processes that sustain the improvement of educational quality, such as effective communication and relationships, safe and orderly learning environments, excellent academic services, and good relations with the community (CAEP: 1.1; ELCC: 4.3); 4. Establishes work practices and peaceful coexistence, which offer security and protection to all members of the school community, fostering a culture of learning favorable to students (CAEP: 1.5; ELCC: 4.4; ICAAE: B.III.4; B.III.5; B.III.6); and 5. Promotes effective relationships with the community the school serves (CAEP: 1.2; ELCC: 4.4; ICAAE: C.III.2). The criteria for Dispositions are: 6. Demonstrates knowledge of the public policy by which the educational systems of the country are governed (CAEP: 1.1; ELCC: 3.4, 3.5, 5.4; ICAEE: A.IV.1); 7. Demonstrates knowledge about effective time management and processes to organize work effectively, which facilitates administrative processes in the workplace (CAEP: 1.1; ELCC: 3.4, 3.5; ICAEE: B.IV.1); and 8. Knows and participates in the established processes so that the school has the human and physical resources required for the proper functioning of the school campus (CAEP: 1.1; ELCC: 3.1; ICAAE: C.IV.4).
Objective of the data/evidence collection
o The purpose of collecting the survey data is to evaluate the SoED program's impact and how effective completers are in the educational setting. Completers' performance will demonstrate that they have acquired the knowledge necessary to carry out their responsibilities, and whether they understand the development of processes and the measurement and evaluation in the administrative area. The survey also evaluates whether completers use adequate materials and technological resources for administrative decision making, according to the skills developed, and whether they use a variety of administrative strategies and methods.
Timeline & Resources
Detailing of strategies, steps, and a schedule for collection through full implementation, and an indication of what is to be available by the time of the site visit;
Plan: Employers Satisfaction Survey for the Initial and Advanced Programs
Objective: Establish a collaborative agreement with the DEPR to help the SOED gather information about the candidates and completers.
Standard: 5.4 Measures of completer impact, including available outcome data on P-12 student growth, are summarized, externally benchmarked, analyzed, shared widely, and acted upon in decision making related to programs, resource allocation, and future direction.
Timeline: Starting October 2018.
Data Quality: Satisfaction surveys of employers.
Steps: 1. Establish first contact with the Caguas School Region to select and incorporate schools into the agreement. 2. Develop the instrument to gather the necessary data. 3. Start the pilot study. 4. Gather the data. 5. Analyze the data and publish the results.
Available Evidence: Satisfaction Surveys; meeting agenda; pilot results.
Description of the personnel, technology and other resources available; institutional review board approvals, if
appropriate; and EPP access to data compilation and analysis capability.
o The CAEP liaison will oversee planning, implementation, and assessment of the survey. The Associate Dean and the Practicum Coordinator will assist faculty in data collection and analysis. Clinical Experiences University Supervisors will administer the survey each semester (fall/spring). The evaluated data will be uploaded to the SoED website.
Data Quality
A copy of the collection instrument if it is available
o A copy of the instrument will be available.
Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric
o The Clinical Experience University Supervisors will help administer the survey assessment each semester (fall/spring). They will also help in the disaggregation of the data and review the results to inform how the instrument's results can be used in reaching conclusions.
Steps that will be taken to attain a representative response, including: the actions to select and follow up a representative
sample (or, a purposeful sample if that is appropriate for the data collection) and actions to ensure a high response rate
o Survey data from all selected participants will be gathered throughout the phase-in plan. The instrument will be used during the clinical experience phase. The University Supervisor will help with the collection of the surveys. To ensure that all participants answer and hand in the surveys, a collection procedure will be established with the Superintendents.
Steps to ensure content validity and to validate the interpretations made of the data
o The instruments are tools created by the QEMEC Committee. To determine their validity, we will evaluate the internal consistency of the instruments. According to Kumar (2017), internal consistency reliability is a measure of reliability used to evaluate the degree to which different test items that probe the same construct produce similar results; it examines whether the items within a scale or measure are homogeneous. Faculty will review the data after each semester, and individual results that raise concern will be addressed in a meeting of all faculty. The faculty will provide support depending on the results for each instrument.
References:
Gwet, K. L. (2014). Handbook of Inter-Rater Reliability, 4th Edition: The Definitive Guide to Measuring
The Extent of Agreement Among Raters. Advanced Analytics, LLC.
Mohajan, H. (2017). Two Criteria for Good Measurements in Research: Validity and Reliability. Retrieved
from https://mpra.ub.uni-muenchen.de/83458/1/MPRA_paper_83458.pdf
APPENDIX G
ANA G. MÉNDEZ UNIVERSITY SYSTEM
ASSISTANT VICE PRESIDENT OF ANALYSIS AND INSTITUTIONAL STUDIES: STUDY OF UPCOMING STUDENTS TO GRADUATE
SCHOOL OF EDUCATION
Comparison of Students Upcoming to Graduate among the Ana G. Méndez University System Institutions

Item: Professional preparation received

Very satisfied / Satisfied
2014-2015: UT 332 (99.4%) | UMET 142 (99.4%) | UNE 37 (100%)
2015-2016: UT 152 (97%) | UMET 141 (97%) | UNE 40 (100%)
2016-2017: UT 140 (98%) | UMET 71 (100%) | UNE 45 (98%)
2017-2018: UT 38 (100%) | UMET 115 (99%) | UNE 41 (100%)

Dissatisfied / Very dissatisfied
2014-2015: UT 2 (0.6%) | UMET 2 (0.6%) | UNE 0 (0%)
2015-2016: UT 4 (3%) | UMET 4 (3%) | UNE 0 (0%)
2016-2017: UT 3 (2%) | UMET 0 (0%) | UNE 1 (2%)
2017-2018: UT 0 (0%) | UMET 1 (1%) | UNE 0 (0%)