Dr. Lane, CCJ 6712 Evaluation Research, Spring 2016 Syllabus—page 1
COURSE OBJECTIVES AND GOALS

This course is designed to give you an introduction to study design in the context of evaluation research. Very few programs these days are funded without requiring an evaluation component, often conducted by an independent researcher such as a professor. For those of you interested in practitioner jobs, agency personnel also need to be familiar with evaluation procedures because they must participate in the process and be smart about what to ask of their evaluators. Policymakers who face limited resources want to know whether programs work and whether they should re-fund programs or fund additional ones in other communities.

Not just "any Ph.D." can conduct program evaluations, or at least conduct them well. Successful program evaluators have specific skills, some learned and some not. This class is designed to give you both the book knowledge and some practical experience in the details of study design and program evaluation. Consequently, it includes lecture, readings, and experience in the field with real programs. Specifically, you will work with one local agency to develop an evaluation plan for one of their programs.
BOOKS AND OTHER READINGS
There are two books and multiple readings assigned for this course. Additional readings and examples will be posted on the class Canvas site. The books are available for purchase at all local college bookstores (and on the web, of course). The required books are:
Rossi, Peter H., Mark W. Lipsey, & Howard E. Freeman. (2004). EVALUATION: A SYSTEMATIC APPROACH (7th ed.). Thousand Oaks: Sage.

Shadish, William R., Thomas D. Cook, & Donald T. Campbell. (2001). EXPERIMENTAL AND QUASI-EXPERIMENTAL DESIGNS FOR GENERALIZED CAUSAL INFERENCE. Cengage. ISBN 0395615569.
CCJ 6712—STUDY DESIGN AND EVALUATION RESEARCH
Graduate Seminar, Department of Sociology and Criminology & Law
Spring 2016
Thursdays, 10:40 am-1:40 pm in NPB 1200

Dr. Jodi Lane
3332 Turlington Hall
Office Hours: Tuesday and Thursday 9-10 am, by appt. or email
(352) 294-7179
Email: [email protected]
Course materials on Canvas
ASSIGNMENTS AND GRADING

(1) PROGRAM EVALUATION PLAN FOR LOCAL AGENCY (40%)

Your biggest assignment in this course is to work with a local agency to develop a program evaluation plan that specifically addresses their program, mission, objectives, needs, etc. This project requires consistent contact with the agency (at least weekly) and consistent work over the semester. At the end, you will give me and your agency a copy of this program evaluation plan. If your plan is a good one and well thought out, this document/project may even serve as a precursor to a solid research project for you with the agency. In some cases, graduate students who do these evaluation plans actually conduct the evaluation for their thesis or dissertation. I will give you guidance and examples throughout the course. **PLEASE TURN IN TWO COPIES OF THIS DOCUMENT** (one for me to keep and one for me to return with comments). I will also request confirmation from the agency that you have given them a copy. You may give them the original or revise it after my comments/grade. You will be working on this plan throughout the semester, so the end product is simply the write-up and compilation of the assignments you have done during the class.

(2) WEEKLY ASSIGNMENTS (25%)

Most weeks you will have a specific assignment to do for this class (see attached list). Each assignment helps you build rapport with your agency's staff and gather the information necessary to write your final project: the program evaluation plan for your agency. You will turn in each assignment the week it is due, and in most cases you will share the information with your agency's staff to make sure you have correctly represented their program. We will discuss the details of completing each assignment in class, and you will share your assignment and concerns with the rest of the class each week. Part of the class is designed to allow discussion, so you and the other students can think through concerns that you have in developing your evaluation plans.
(3) PRESENTATION OF YOUR EVALUATION PLAN—FOR CLASS AND INSTRUCTOR FEEDBACK (10%)

This assignment gives you the opportunity to talk through your evaluation plan with the class and with me before you write it up. You should present the details of your evaluation plan (e.g., in a PowerPoint presentation) and bring a written list of questions that you would like the class to give you feedback on (issues you are struggling with). In addition, you should bring copies of the presentation and any instruments you have drafted for me and for the rest of the students in the class, as well as the expected table of contents for the evaluation plan, so they can write up feedback for you. Presentation of the plan itself should take about fifteen minutes, but discussion will follow.

(4) CRITIQUE OF AN ACADEMIC JOURNAL ARTICLE ABOUT AN OUTCOME EVALUATION (10%)

For this assignment, you are to find an academic journal article reporting the results of an outcome evaluation in criminology or criminal justice (or your own field, if you are not a criminology student)
(not one that is assigned). You must attach a copy of the article itself to your paper. For this paper you will, IN YOUR OWN WORDS (not the author's):

1. Identify the program being evaluated.
2. Identify the funding agency and its evaluation requirements.
3. Identify the major research questions.
4. Identify the theoretical orientation of the article.
5. Explicitly identify the main hypothesis or hypotheses.
6. Identify the dependent variable(s) and indicate how it/they are operationalized.
7. Identify the independent variables and indicate how they are operationalized.
8. Indicate the method of data collection and sample size.
9. Indicate the primary findings.
10. Indicate the policy implications derived by the authors.
11. Indicate your reaction and critique the methods and results detailed above, based on what you've learned in this class.

Question 11 is the most critical part of this assignment. Once you've identified the information above, you should carefully evaluate the article based on course material. Answer questions such as:

a. Was the program evaluated appropriate based on what you learned in the course?
b. Did the evaluators meet the funding agency's requirements?
c. Were there general problems with the design?
d. Were the research questions appropriate and valid?
e. Did the theory match the program?
f. Was the operationalization of the variables appropriate? Do you see issues with how they measured aspects of the program?
g. Did the method of data collection match their research questions?
h. Was their analysis appropriate?
i. Do their findings make sense?
j. Do their conclusions and policy implications follow from their data and findings?

Make sure you elaborate on and explain your answers: why did you come to the conclusion you did? It will be helpful to refer to class material such as readings and lecture to answer these questions. You may answer most of the questions above by using the question as a heading and writing your answer below it. Some of these answers may be one sentence, but you must answer each question completely. **You will also share your article with the class, describe its content, and discuss your reaction to it.**
(5) ATTENDANCE AND VERBAL AND WRITTEN PARTICIPATION (15%)

In graduate courses, attendance is critical because the class interaction stimulates ideas. In addition, in a methods class, each class period builds on the ones before it. I expect you to attend every class meeting, including those we reschedule. You may miss one class without penalty, but only for legitimate reasons (e.g., conferences, medical issues). These policies are consistent with university policy for attendance in graduate classes. Please see p. 19 of the graduate student handbook at http://graduateschool.ufl.edu/media/graduate-school/pdf-files/handbook.pdf
I also expect you to be prepared for and to participate in discussion. We will discuss the readings in class each week. In addition, your feedback will help your fellow students with their final projects: you will serve as a "reviewer" of their plans. During class presentations of their projects, you should take notes and write up a short set of comments for each presenter, with specific questions and suggestions for improving their plan (based on class material). These comments should be given to the student by Monday of the following week and emailed to me as well. They will count as part of your participation grade.
GRADING SCALE FOR COURSE

Total Points   Grade
93-100         A
90-92          A-
87-89          B+
83-86          B
80-82          B-
77-79          C+
70-76          C
65-69          C-
63-64          D+
61-62          D
Below 60       E
Please see the UF grading policies here: https://catalog.ufl.edu/ugrad/current/regulations/info/grades.aspx

Students with Disabilities
Students requesting classroom accommodations for disabilities must first register with the Dean of Students Office. The Dean of Students Office will provide documentation to the student, who must then provide the documentation to Dr. Lane when requesting accommodation.

Course Evaluations
Students are expected to provide feedback on the quality of instruction in this course based on 10 criteria. These evaluations are conducted online at https://evaluations.ufl.edu. Evaluations are typically open during the last two or three weeks of the semester, but students will be given
specific times when they are open. Summary results of these assessments are available to students at https://evaluations.ufl.edu.
WEEKLY SCHEDULE

Week 1 (1/7)
Topics: Overview of class; overview of program evaluation; managing yourself in the field
Readings: Rossi: Chapter 1

Week 2 (1/14)
Topic: Tailoring Studies and Evaluations to Local Context
Readings: Rossi: Chapter 2; Shadish et al.: Chapter 1 (pp. 13-32) and Chapter 3; Trulson, Marquart, & Mullings (2004); Lane (Regina House Plan)

Week 3 (1/21)
Topics: Identifying Issues and Formulating Questions; Determining Mission, Goals & Objectives
Readings: Rossi: Chapter 3; Garcia (2003); Petersilia (1993)

Week 4 (1/28)
Topic: Assessing Need for a Program
Readings: Rossi: Chapter 4

Week 5 (2/4)
Topic: Assessing Program Theory
Readings: Rossi: Chapter 5; Karp, Lane & Turner (2002)

Week 6 (2/11)
Topic: Assessing Program Process
Readings: Rossi: Chapter 6; Shadish et al.: Chapter 10; Petersilia (1989); Lane & Lanza-Kaduce (2007); Lane & Turner (1999); Lane, Turner, & Flores (2004); Matthews et al. (2001)

Week 7 (2/18)
Topic: Measuring Program Outcomes
Readings: Rossi: Chapter 7; FBCDTI forms; RAND evaluation forms; Farrington (2003); Shepherd (2003)

Week 8 (2/25)
Topic: Assessing Program Impact: Randomized Designs
**GET ARTICLE APPROVED**
Readings: Rossi: Chapter 8; Shadish et al.: Chapter 8; Petersilia & Turner (1992); Petersilia & Turner (1993); Greenwood & Turner (1993); Lane et al. (2005); Gottfredson et al. (2005); Wexler et al. (1999)

Week 9 (3/3)
NO CLASS: SPRING BREAK

Week 10 (3/10)
Topics: Assessing Program Impact: Quasi-Experimental Designs; Detecting, Interpreting, and Analyzing Effects
Readings: Rossi: Chapters 9 & 10; Shadish et al.: Chapters 4-7; MacKenzie et al. (1995); MacKenzie et al. (1999); Braga et al. (2001); Johnson (2004); Andrews et al. (1990); Lipsey (1992)

Week 11 (3/17)
Topics: Measuring Efficiency; Social Context of Evaluation; Disseminating Results to Policymakers and Academics; discussion of individuals' articles
**ARTICLE EVALUATION DUE**
Readings: Rossi: Chapters 11 & 12; Cohen (2000); Sherman & Berk (1984)

Week 12 (3/24)
PRESENTATIONS OF PROGRAM PLANS (no readings)

Week 13 (3/31)
NO CLASS: ACJS CONFERENCE (budgets/timetables still due via email)

Week 14 (4/7)
CLASS DISCUSSION OF AND PROBLEM SOLVING RE: PROGRAM PLANS (no readings)

Week 15 (4/14)
INDIVIDUAL MEETINGS WITH DR. LANE (IF DESIRED)

Week 16 (4/19, Tues)
**FINAL PROGRAM PLAN DUE BY 2 PM IN MY MAILBOX/OFFICE** (2 copies)
EVALUATION RESEARCH, Spring 2016, Dr. Lane
WEEKLY ASSIGNMENTS (turn in 2 copies of each!)

Weeks 1 & 2 (due 1/14)
1. Select agency for program evaluation plan assignment (Dr. Lane must approve)
2. Meet with primary contact at agency
3. Describe the course and your assignment, including what you'll need from them
4. Set up weekly meeting time with this person (or other key person)
Turn in: Signed sheet

Week 3 (due 1/21)
1. Gather background information on agency (e.g., newspaper articles, pamphlets, yearly reports, invitations to events, meeting agendas and minutes)
Turn in: Copies of materials

Week 4 (due 1/28)
1. Determine agency mission
2. Determine goals & objectives of program
3. Determine agency's operationalization of these goals/objectives
Turn in: Mission, goals, operationalization table (see example). TWO COPIES!

Week 5 (due 2/4)
1. Independently gather and compile in a table background on social conditions related to, and client population/targets for, the program (e.g., county statistics such as arrest data, population numbers, maybe poverty, homelessness, etc.)
2. Determine agency's perceptions of their target population and the needs their program addresses
Turn in: Table of background info; worksheet on target population/needs

Week 6 (due 2/11)
1. Determine program's impact theory, service utilization plan, and process theory
2. If possible, begin observing agency activities
Turn in: Impact theory, utilization plan, and process theory. TWO COPIES!

Week 7 (due 2/18)
1. Observe agency activities and meetings
2. Take notes relevant to process and your evaluation plan
Turn in: Copies of observation notes

Week 8 (due 2/25)
1. Observe agency activities and meetings
2. Take notes relevant to process and your evaluation plan
Turn in: Copies of new observation notes; bring article for approval

Week 9 (3/3)
NONE: SPRING BREAK

Week 10 (due 3/10)
1. Continue observing agency
2. Develop evaluation instruments in conjunction with agency personnel (e.g., surveys, observation checklists/notes, interviews)
Turn in: Copies of new observation notes; draft instruments

Week 11 (due 3/17)
1. Continue observing agency
2. Develop evaluation instruments in conjunction with agency personnel
Turn in: Copies of new observation notes; draft instruments; article evaluation paper

Week 12 (due 3/24)
1. Continue observing agency
2. Develop evaluation instruments
Turn in (all students): Copies of new observation notes; presentation slides; draft of table of contents for evaluation plan

Week 13 (due 3/31)
1. Develop in-kind and actual budgets for evaluation
2. Develop implementation timetable
Turn in: Budgets; timetable

Week 14 (due 4/7)
1. Develop a detailed draft of your table of contents
Turn in: Your expected, detailed table of contents (revised as you work)

Week 15 (4/14)
Working on program plans. Nothing to turn in.

Week 16 (4/19, Tues)
PROGRAM PLANS DONE! 2 copies of bound program plan due Tuesday in my office.
CCJ 6712 EVALUATION RESEARCH: ADDITIONAL COURSE READINGS
(In assigned order)

Trulson, Chad R., James W. Marquart, & Janet L. Mullings. (2004). "Breaking In: Gaining Entry to Prisons and Other Hard-to-Access Criminal Justice Organizations." Journal of Criminal Justice Education, 15, 451-478.

Garcia, Crystal A. (2003). "Realistic Expectations: Constructing a Mission-Based Evaluation Model for Community Corrections Programs." Criminal Justice Policy Review, 14, 1-19.

Petersilia, Joan. (1993). "Measuring the Performance of Community Corrections." In Performance Measures for the Criminal Justice System (pp. 61-84). Washington, D.C.: Bureau of Justice Statistics.

Karp, David R., Jodi Lane, & Susan Turner. (2002). "Ventura County and the Theory of Community Justice." In David R. Karp & Todd R. Clear (Eds.), What is Community Justice? Case Studies of Restorative Justice and Community Supervision (pp. 3-33). Thousand Oaks: Sage.

Petersilia, Joan. (1989). "Implementing Randomized Experiments: Lessons from BJA's Intensive Supervision Project." Evaluation Review, 13, 435-458.

Lane, Jodi & Lonn Lanza-Kaduce. (2007). "Before You Open the Doors: Ten Lessons from Florida's Faith and Community-Based Delinquency Treatment Initiative." Evaluation Review, 31(2), 121-152.

Lane, Jodi & Susan Turner. (1999). "Interagency Collaboration in Juvenile Justice: Learning from Experience." Federal Probation, 63(2), 33-39.

Lane, Jodi, Susan Turner, & Carmen Flores. (2004). "Researcher-Practitioner Collaboration in Community Corrections: Overcoming Hurdles for Successful Partnerships." Criminal Justice Review, 29, 97-114.

Matthews, Betsy, Dana Jones Hubbard, & Edward Latessa. (2001). "Making the Next Step: Using Evaluability Assessment to Improve Correctional Programming." The Prison Journal, 81.

Farrington, David P. (2003). "A Short History of Randomized Experiments in Criminology." Evaluation Review, 27, 218-227.

Shepherd, Jonathan P. (2003). "Explaining Feast or Famine in Randomized Field Trials: Medical Science and Criminology Compared." Evaluation Review, 27, 290-315.

Petersilia, Joan & Susan Turner. (1992). "An Evaluation of Intensive Probation in California." Journal of Criminal Law & Criminology, 82, 610-658.

Petersilia, Joan & Susan Turner. (1993). "Intensive Probation and Parole." In Michael Tonry (Ed.), Crime and Justice: An Annual Review of Research (pp. 281-335). Chicago: University of Chicago Press.

Greenwood, Peter W. & Susan Turner. (1993). "Evaluation of the Paint Creek Youth Center: A Residential Program for Serious Delinquents." Criminology, 31, 263-279.

Lane, Jodi, Susan Turner, Terry Fain, & Amber Sehgal. (2005). "Evaluating an Experimental Intensive Juvenile Probation Program: Supervision and Official Outcomes." Crime & Delinquency, 51, 26-52. And accompanying instruments: risk assessment, contact forms, background instrument, 6-month and 18-month follow-up forms.

Gottfredson, Denise C., Brook W. Kearley, Stacy S. Najaka, & Carlos M. Rocha. (2005). "The Baltimore City Drug Treatment Court: 3-Year Self-Report Outcome Study." Evaluation Review, 29, 42-64.

Wexler, Harry K., Gerald Melnick, Lois Lowe, & Jean Peters. (1999). "Three-Year Reincarceration Outcomes for Amity In-Prison Therapeutic Community and Aftercare in California." The Prison Journal, 79(3), 321-336.

MacKenzie, Doris Layton, Robert Brame, David McDowall, & Claire Souryal. (1995). "Boot Camp Prisons and Recidivism in Eight States." Criminology, 33, 327-357.

MacKenzie, Doris Layton, Katharine Browning, Stacy B. Skroban, & Douglas A. Smith. (1999). "The Impact of Probation on the Criminal Activities of Offenders." Journal of Research in Crime and Delinquency, 36, 423-453.

Braga, Anthony A., David M. Kennedy, Elin J. Waring, & Anne Morrison Piehl. (2001). "Problem-Oriented Policing, Deterrence, and Youth Violence: An Evaluation of Boston's Operation Ceasefire." Journal of Research in Crime and Delinquency, 38, 195-225.

Johnson, Byron R. (2004). "Religious Programs and Recidivism Among Former Inmates in Prison Fellowship Programs: A Long-Term Follow-Up Study." Justice Quarterly, 21, 329-354.

Andrews, D. A., Ivan Zinger, Robert D. Hoge, James Bonta, Paul Gendreau, & Francis T. Cullen. (1990). "Does Correctional Treatment Work? A Clinically Relevant and Psychologically Informed Meta-Analysis." Criminology, 28(3), 369-404.

Lipsey, Mark W. (1992). "Juvenile Delinquency Treatment: A Meta-Analytic Inquiry into the Variability of Effects." In T. D. Cook, H. Cooper, D. S. Cordray, H. Hartmann, L. V. Hedges, R. J. Light, T. A. Louis, & F. Mosteller (Eds.), Meta-Analysis for Explanation: A Casebook (pp. 83-127). New York: Russell Sage.

Cohen, Mark A. (2000). "Measuring the Costs and Benefits of Crime and Justice." In David Duffee (Ed.), Measurement and Analysis of Criminal Justice (Crime and Justice 2000, Vol. 4, pp. 263-315). Washington, D.C.: National Institute of Justice.

Sherman, Lawrence W. & Richard A. Berk. (1984). "The Specific Deterrent Effects of Arrest for Domestic Assault." American Sociological Review, 49, 261-272.