Program Evaluation
Course Project: Program Evaluation
Hiba Diarbi
Walden University
Dr. Michael Burke, Instructor
Program Evaluation (EIDT - 6130 - 1)
Sunday 23, 2014
Global Horizon: Global Horizons FZC is a consulting practice specializing in the deployment and integration of software solutions for small and medium-sized enterprises, and it offers a variety of management consulting services suitable for all types of businesses (Global Horizon, company profile).
The program is Global Campus, a web-based school information system designed to meet the unique needs of medium and large urban K-12 schools. It is composed of a suite of services that address every activity in a school.
Out of the complete program I selected one module for the evaluation: Teachers' Lounge. Teachers use the system to enter marks, contribute to a bank of questions that can be used to create online tests, add lesson plans and share them with colleagues, and, last but not least, generate reports and statistics to follow up on students' progress and analyze results.
The program started seven years ago as the result of the hard work of a group of programmers and experts in school management and education. The Global Campus staff keep working on improving the services the system offers. Oral feedback and requests from customers are always taken into consideration, and the staff make sure they add and improve whatever benefits the customers and facilitates their work.
The stakeholders in this program are the following: Global Horizon and the Global Campus staff, who are concerned about customer satisfaction, quality assurance, product enhancement and development, and service and support.
The school (administrators, principal, owner, and coordinators) is also interested in following up on and ensuring the following: service quality, customer satisfaction, parents' involvement in their children's education, the children's education itself, and service enhancement.
Teachers appreciate any system that helps them do their work quickly and obtain results that would take a long time to produce manually. Teachers also benefit from the system because it gives them tools to analyze students' data and to track and compare students' progress; as a result, they are able to focus on the issues that support students' learning and improvement.
Students: they have the chance to enjoy a better learning environment, because their teachers have a lighter workload and can focus their efforts on students' learning and improvement. Students also benefit from the system because they can access the school's homework and classwork from home; all the materials they need are online for them to download at any time, and they can monitor their progress and compare it with previous results.
Parents: an important stakeholder group that the system serves. Parents use the system to communicate with their children's teachers and the school administration, and to stay aware of their children's progress and behavior at school.
However, evaluating the use of Global Campus will be affected by the following contextual factors:
1- Time: when should the evaluation and survey be performed?
2- Availability of the stakeholders to contribute to the evaluation
3- Availability of the system
4- Political and psychological factors: participants might be very positive or very
negative and not objective in their contributions to the evaluation.
5- Literacy of the stakeholders and how versed they are in using the technology
6- Internet availability at home
Some of the above factors are directly related to each other. Time and system availability are linked: we cannot conduct the evaluation unless we have time to meet with the teachers and the system is available. Likewise, the stakeholders' literacy and how versed they are in using the technology will affect how positive or negative their contributions are.
The Global Campus program, like many other programs implemented in schools, may face various ethical challenges while conducting an evaluation. "The ethical problems generally arise from pressures from stakeholders, typically the client, concerning the product of evaluation" (Fitzpatrick, Sanders, & Worthen, 2011). Among the ethical challenges Global Campus may face are those the text mentions (Fitzpatrick, Sanders, & Worthen, 2011):
Challenges in the contracting phase
Ethical concerns regarding confidentiality
Challenges in presenting findings
Select an Evaluation Model
Evaluation Model Advantages Disadvantages
EXPERTISE AND CONSUMER-ORIENTED APPROACHES
Advantages: The consumer-oriented approach has made evaluations of products and programs available to consumers who may not have the time or resources to carry out the evaluation process themselves. It increases consumers' knowledge about using criteria and standards to objectively and effectively evaluate educational and human-services products, and consumers have become more aware of market strategies. In the expertise-oriented approach, those who are well versed make the decisions, standards are set, and improvement is encouraged through self-study (Alex et al., 2010).
Disadvantages: The consumer-oriented approach increases product costs for the consumer; product testing involves time and money, typically passed on to the consumer. Stringent criteria and standards may curb creativity in product creation, and there is concern about a rising dependency on outside products and consumer services rather than local initiative development. For the expertise-oriented approach: whose standards apply (personal bias)? What are the experts' credentials? Can this approach be used with issues of classroom life, texts, and other evaluation objects, or only with the bigger institutional questions? (Alex et al., 2010)
PROGRAM-ORIENTED EVALUATION APPROACHES
Advantages: The program-oriented approach is easily understood, is easy to follow and implement, and produces information that program directors generally agree is relevant to their mission (Fitzpatrick, Sanders, & Worthen, 2011).
Disadvantages: The focus on objectives can cause evaluators to ignore other important outcomes of the program, both beneficial and detrimental, and if the evaluation draws final conclusions, the judgment of the program may be seriously incomplete. The approach also neglects program description and the need to gain an understanding of the context in which the program operates and the effects of that context on program success or failure. Evaluators using this approach may neglect their role in considering the value of the objectives themselves (Fitzpatrick, Sanders, & Worthen, 2011).
DECISION-ORIENTED EVALUATION APPROACHES
Advantages: These approaches provide information that helps people, typically managers or policymakers, make decisions (Fitzpatrick, Sanders, & Worthen, 2011).
Disadvantages: Because of their focus on decisions, these approaches tend to neglect stakeholders with less power, and social equity and equality are not values directly addressed by decision-oriented models (Fitzpatrick, Sanders, & Worthen, 2011).
PARTICIPANT-ORIENTED EVALUATION APPROACHES
Advantages: The participant-oriented approach emphasizes the human element, yields new insights and theories, offers flexibility and attention to contextual variables, encourages multiple data collection methods, provides rich and persuasive information, and establishes dialogue with and empowers quiet, powerless stakeholders (Alex et al., 2010).
Disadvantages: It can be too complex for practitioners (it is more for theorists); it has a political element; evaluations can be subjective and "loose"; it is labor intensive, which limits the number of cases studied; it is costly; and evaluators may lose objectivity (Alex et al., 2010). Further concerns are the feasibility, or manageability, of implementing a successful participative study, and the credibility of the results to those who do not participate in the process (Fitzpatrick, Sanders, & Worthen, 2011).
Explain your choice of model for your program evaluation:
The evaluation approach that may work well in the GC evaluation program is a combination of:
DECISION-ORIENTED EVALUATION APPROACH
And
PARTICIPANT-ORIENTED EVALUATION APPROACH
Then identify and explain why one approach or combination of approaches is most feasible, practical, and effective for your program evaluation, and describe how you intend to apply it. Why is this approach (or combination of approaches) superior to the others given the context of your program evaluation?
One of the evaluation goals is decision making: the GC company and the school board will refer to the evaluation results to make their decisions on:
School:
Replace the current system or keep it
Invest more in the current system to solve the problems
GC company:
Improve on the user interface, modify workflows, activate extra features
Change training methodology, support delivery, communication channels, and others...
In addition, both the school and the GC company are concerned with learning more about how the other stakeholders (teachers, parents, and students) benefit from the system. Their input is important; as the text states, "involving stakeholders in thinking about the meaning of various constructs or phenomena to be measured, conceptualizing them, and considering how to measure them or collect information on them can result in more valid data" (Fitzpatrick, Sanders, & Worthen, 2011).
"Evaluators' dialogue with [teachers, counselors, parents, and students] increases the evaluators' understanding of the concept and, ultimately, the selection or development of better, more valid measures for the evaluation" (Fitzpatrick, Sanders, & Worthen, 2011).
The evaluation can draw on both approaches to obtain reliable information. It would be valuable for the evaluators to sit with a group of stakeholders, for example the heads of departments (HODs), to learn about their main concerns regarding teachers' performance and work as they relate to the GC system: how does using the system affect teachers' performance and the way they handle their tasks, their students, and parents?
It is important to identify the major levels of decision making at the school and to focus the evaluation on what best serves their need to make those decisions.
Main purpose(s) of a program evaluation
The main purpose of the evaluation is to determine whether the Global Campus system is an appropriate system that answers all the needs of the stakeholders; whether the stakeholders have received the right training, with the right methodology, to become versed and at ease in using the system; whether the system is used to its maximum efficiency and in a way that helps them achieve their targets; and, last but not least, to devise a future plan that sustains maximum usage and improves on the projected outcomes.
Why should the evaluation be conducted?
The evaluation should be conducted to answer many questions raised by symptoms that the stakeholders and the provider noted but whose causes they were unable to identify in depth. Some of the symptoms are: the load the system experienced at peak times, caused by teachers leaving their work to the last moment; data analysis whose quality did not reflect the actual situation on the ground; complaints raised by some of the stakeholders; and complaints by the provider that many problems were erroneously blamed on the system when the provider believed, and sometimes proved, that the system was not at fault.
What do the various stakeholders want to achieve?
The stakeholders want to identify and understand the internal and external factors behind the noted symptoms, evaluate whether the system is the right choice, and understand the strengths and weaknesses of the system, all with the objective of maximizing efficiency, improving performance, and helping all the stakeholders achieve their targets and goals, and with the ultimate aim of providing better-quality information analysis and distribution and a better quality of education.
Is the purpose to measure outcomes, evaluate processes, assess needs and
challenges, improve efficiency, or something else?
The purpose of the evaluation is indeed to measure outcomes, evaluate processes, assess needs and challenges, and improve efficiency, but its ultimate purpose is to support strategic decisions about the choice of the system and the use of the system, and perhaps a revision of the internal policies based on the evaluation.
Propose 5 evaluation questions that determine specifically what your program
evaluation is going to answer.
1. Is the system answering all the needs of the stakeholders? If yes, how is it doing
so?
2. What are the reasons behind the symptoms? Are the stakeholders' proficiency levels and/or poor time management the cause of the noted symptoms? If not, what are the causes?
3. Could the internal policies be a factor? Should they be revised, and on what basis?
4. Should the system be replaced? If yes, what is the best time to phase it out, and what are the implications of such a decision? If no, what are the weaknesses of the current system, and how can it be improved to help reach the goals?
5. How do we solve the problems? What plan should we adopt? How do we measure
the progress and success of the selected plan? Where do we go from here?
Provide a rationale for this focus, and explain how these questions will impact the
program evaluation.
"Understanding a problem is half of the solution." No solution is possible if the problem is not properly identified. These questions cover most aspects of the hosting environment: the technical, management, and quality aspects.
Clarify what is not being evaluated and why it should not be.
The aspects that will not be evaluated are the financial aspects. At this stage, they are not relevant to the evaluation process because the school has already invested in the system, and the financials are not a concern at this level. This aspect might be evaluated later, based on the outcome and the decisions of the evaluation.
Identify what standards are reflected in the choice of evaluation questions.
Two evaluation approaches are reflected in the choice of the questions: decision-oriented, which aims to "provide information that helps people, typically managers or policymakers, make decisions" (Fitzpatrick, Sanders, & Worthen, 2011), and participant-oriented.
“Involving stakeholders in thinking about the meaning of various constructs or phenomena
to be measured, conceptualizing them, and considering how to measure them or collect
information on them can result in more valid data.” (Fitzpatrick, Sanders, & Worthen,
2011)
Identify which stakeholders should be involved in determining evaluation questions
and explain why you think so. Explain what the role of the stakeholder should be in
determining the evaluative criteria.
Stakeholders should be involved in determining the evaluation questions, but because there is a large number of stakeholders, a small group that represents everyone can be involved: for example, the heads of departments (HODs), the parents' committee, the students' council, and of course the school board (principal, school owner, etc.) and staff from the Global Campus company.
Develop a Data Collection Design and Sampling Strategy
Reporting Strategy
Stakeholder Reporting Strategy Implications Stakeholder Involvement
The school: admin, school principal, school owner, coordinators, and HODs
Reporting strategy: The evaluation report should be communicated to this group in several ways: (1) a written report, because they need to keep a record of all findings; (2) a presentation that shows the results in visual formats such as graphs; (3) audio or video, which suits qualitative evaluation findings such as interviews (Stetson, 2008), for example an interview with a parent that highlights important answers to the evaluation questions. "There are many options in evaluation communication and reporting, and often several techniques or formats are used or sequenced to promote greater dissemination of results. For example, evaluators may draft a written report with preliminary findings, and then hold a working meeting with key evaluation stakeholders to validate findings, followed by a radio program to disseminate the final results. Sequencing a series of communication formats in a skillful way can be very influential in communicating a written report's findings and recommendations (Torres et al. 2005)." (Stetson, 2008)
Implications: The main implication concerns the acceptance of the program (Global Campus).
Teachers
Reporting strategy: Working sessions that focus on the needs the evaluation highlights. Personal discussions can be used with specific teachers who need more support, focusing on the needs that the evaluation results highlighted.
Implications: Teachers will either accept the program or refuse to use it.
Stakeholder involvement: Attending the working sessions; helping to generate the reports and to highlight the main issues.
Parents
Reporting strategy: A newsletter posted on the website, and a presentation at school to inform the parents about the results of the evaluation in the areas of their concern and to highlight the issues found to be weak or in need of clarification.
Implications: Increased communication between the school and the parents.
Stakeholder involvement: Recorded interviews; using a parent account to check the website and follow everything happening on it by observing their child's profile.
Students
Reporting strategy: A video presentation to show the students the results that concern them.
Implications: The students benefit from the system because they can access the school's homework and classwork from home; all the materials they need are online to download at any time, and they can monitor their progress and compare it with previous results. Whatever the result of the evaluation program, the main concern is the students' benefit, and this benefit should increase.
Stakeholder involvement: A minor role that could also be considered a main one: all the reports generated from the system and used by the evaluators to check the teachers' work rely on students' marks, progress, homework, and so on.
The Global Campus staff
Reporting strategy: A written report, which is important for the company to keep, and personal discussions that help focus on specific needs. Since this team is not always available at the school, email can serve as one communication tool and teleconferences as another. "Teleconferences can be arranged through communication service providers. A single number is given to participants to call; speaker phones are used to accommodate many people. Teleconferences are especially useful for discussing and getting feedback on evaluation documents that are distributed and reviewed by participants prior to the call." (Stetson, 2008) "Debriefing meetings typically begin with a brief presentation, followed by discussion of key findings or other issues. Ongoing debriefing meetings may be held to communicate evaluation progress to program managers. A final debriefing meeting can be held with stakeholders to share and discuss key findings and recommendations from the final evaluation report." (Stetson, 2008)
Implications: Either satisfaction, or the challenge of making changes to meet the needs and requirements presented in the evaluation report.
Stakeholder involvement: A main role in preparing the written reports and in planning, preparing, and conducting the working sessions.
Values, Standards, and Criteria: There are four standards, as mentioned in the Program Evaluation Report (NDI, 2004):
Standard 1: Utility. Utility standards ensure that the information needs of evaluation users are satisfied. Seven utility standards address such items as identifying those who will be impacted by the evaluation, the amount and type of information collected, the values used in interpreting evaluation findings, and the clarity and timeliness of evaluation reports.
Standard 2: Feasibility. Feasibility standards ensure that the evaluation is viable and pragmatic. The three feasibility standards emphasize that the evaluation should employ practical, nondisruptive procedures; that the differing political interests of those involved should be anticipated and acknowledged; and that the use of resources in conducting the evaluation should be prudent and produce valuable findings.
Standard 3: Propriety. Propriety standards ensure that the evaluation is ethical (i.e., conducted with regard for the rights and interests of those involved and affected). Eight propriety standards address such items as developing protocols and other agreements for guiding the evaluation; protecting the welfare of human subjects; weighing and disclosing findings in a complete and balanced fashion; and addressing any conflicts of interest in an open and fair manner.
Standard 4: Accuracy. Accuracy standards ensure that the evaluation produces findings that are considered correct. Twelve accuracy standards include such items as describing the program and its context; articulating in detail the purpose and methods of the evaluation; employing systematic procedures to gather valid and reliable information; applying appropriate qualitative or quantitative methods during analysis and synthesis; and producing impartial reports containing conclusions that are justified.
Without attention to these standards, your work and your results will not be credible or useful and, ultimately, will not help you continually improve your program. The American Evaluation Association has created a set of Guiding Principles for evaluators (Wikipedia):
Systematic Inquiry: Evaluators conduct systematic, data-based inquiries about whatever is being evaluated. This requires quality data collection, including a defensible choice of indicators, which lends credibility to findings. Findings are credible when they are demonstrably evidence-based, reliable, and valid. This also pertains to the choice of methodology employed, such that it is consistent with the aims of the evaluation and provides dependable data. Furthermore, the utility of findings is critical, such that the information obtained by the evaluation is comprehensive and timely, and thus serves to provide maximal benefit and use to stakeholders.
Competence: Evaluators provide competent performance to stakeholders. This requires that evaluation teams comprise an appropriate combination of competencies, such that varied and appropriate expertise is available for the evaluation process, and that evaluators work within their scope of capability.
Integrity/Honesty: Evaluators ensure the honesty and integrity of the entire evaluation process. A key element of this principle is freedom from bias in evaluation, underscored by three principles: impartiality, independence, and transparency.
Respect for People: Evaluators respect the security, dignity, and self-worth of the respondents, program participants, clients, and other stakeholders with whom they interact. This is particularly pertinent with regard to those who will be affected by the evaluation findings. Protection of people includes ensuring informed consent from those involved in the evaluation, upholding confidentiality, and ensuring that the identity of those who may provide sensitive information for the program evaluation is protected. Evaluators are ethically required to respect the customs and beliefs of those affected by the evaluation or program activities. Where stakeholders wish to raise objections to evaluation findings, the process should be facilitated through the local office of the evaluation organization, and procedures for lodging complaints or queries should be accessible and clear.
Responsibilities for General and Public Welfare: Evaluators articulate and take into account the diversity of interests and values that may be related to the general and public welfare. Access to evaluation documents by the wider public should be facilitated so that discussion and feedback are enabled.
Potential ethical issues:
Protection of people:
The ethical challenges related to the protection of people can be subdivided into six major themes: avoiding personal duress, guaranteeing confidentiality, considering safety, setting realistic expectations, protecting the organization's credibility, and avoiding research-subject fatigue.
Quality data collection techniques: In addition to using best practice in data collection to ensure sound and credible inputs for analysis, evaluators also need to consider some ethical challenges that can affect data quality.
References:
Alex, Ann, Mark, Lisa, Nick, & Ethan (2010). Program Evaluation [presentation]. Retrieved from http://www.slideshare.net/spanishpvs/program-evaluation
Church, C., & Rogers, M. M. (2006). Designing for Results. Retrieved from http://www.sfcg.org/Documents/dmechapter11.pdf
Fitzpatrick, J., Sanders, J. R., & Worthen, B. R. (2011). Program Evaluation: Alternative Approaches and Practical Guidelines. Upper Saddle River, NJ: Pearson Education.
Global Horizon. Retrieved from http://www.global-horizon.net
Stetson, V. (2008). Communicating and Reporting on an Evaluation. Retrieved from http://www.crsprogramquality.org/storage/pubs/me/MEshortcut_communicating.pdf
Wikipedia. Evaluation. Retrieved from http://en.wikipedia.org/wiki/Evaluation#Standards