
Our experience in monitoring and evaluating drug abuse prevention

Giovanna Campello, UNODC Prevention, Treatment & Rehabilitation Unit
CICAD VII Meeting of the Expert Group on Demand Reduction, 13-15 September 2005, Ottawa, Canada

UNODC has carried out two kinds of work with regard to monitoring and evaluation (“m & e”) of drug abuse prevention:

1 -- Assessing the progress of Member States (“MS”) in meeting the commitments they made in the Political Declaration of 1998 (including drug abuse prevention, treatment and rehabilitation)

2 -- Identifying and disseminating good practices in monitoring and evaluating drug abuse prevention activities and programmes implemented by youth- and community-based organisations.

Emily Holman
Write out in long form "wrt"

1 -- Assessing Member States’ Drug Abuse Prevention Programmes and Activities

UN Member States Report on Prevention Activities through Questionnaires (BRQs)

• With regard to prevention, the Questionnaire asks:

• Whether MS have implemented drug abuse prevention activities in different settings (yes/no)

• If yes, whether the coverage of the activities is low/ medium/ high

• Whether the activities are sensitive to gender (yes/no)

• Whether they have been evaluated (yes/no)

Limitations of Questionnaires

• Provide the perception of Member States

• Provide limited information

• Only on implementation, not on impact (the questionnaire asks only whether the activities have been evaluated; it does not ask about the results of the evaluation)

• Yes/no, low/medium/high kind of answers

Emily Holman
I think you can remove this last bullet. Perhaps you can condense the two about the BRQs onto one slide.
Emily Holman
This point on perception will be important in terms of the rest of your presentation -- and why it is worth their while - but should try to condense to one slide

The Questionnaires still provide some useful indications, for example about the evaluation of prevention

% of Member States reporting that their prevention activities have been evaluated

[Bar chart: y-axis 0-25%; x-axis reporting cycles 1998-2000, 2000-2002, 2002-2004; series: Americas, Global]

Emily Holman
This looks interesting. But we don't want them to think about how much prevention programming is carried out and get off track. Could you graph instead the extent to which they report having carried out evaluation in N and S America and the Caribbean (instead of the numerical score on overall intensity of prevention)?

How does the UNODC Questionnaire relate to other existing regional instruments for measuring the extent of prevention activities?

• Questionnaire to be reviewed in October/ November 2005 in Vienna

• CICAD and EMCDDA represented

• Also to see how the monitoring work can continue after 2008

Emily Holman
Again, would try and keep this Part 1 brief, maybe combine

2 -- Monitoring and evaluation of drug abuse prevention by youth- and community-based organisations

How we identify good practice

• Review of the (academic) literature identifies principles and issues

• Principles/ issues are discussed and enriched in meetings including youth/ prevention workers and youth from all regions

• Results are also circulated and discussed with focal points in national and international agencies

• Next publication: MONITORING & EVALUATION!

• Next piece of work: Prevention of Amphetamine-Type Stimulants

Our Publications

All available on our website! www.unodc.org/youthnet

Emily Holman
Yeah, I think you could keep this slide instead - it's a quickie - it will give you a chance to reference the network, and can show these pictures in the context of the soon-to-come-out publication on m & e

Monitoring & Evaluation Definitions

Note: These are the definitions we find useful; we are aware that there are grey areas and that the terminology is used differently elsewhere.

• Monitoring is about implementation of activities. It takes place during and feeds into implementation.

• Evaluation is about the impact of activities. It takes place ‘after’ implementation and assesses changes in the situation of the target group, including, but not limited to, what was done (implementation).

What (should be evaluated)?

• Preventing use? Assessing impact in terms of drug abuse prevention might be counterproductive

• The activities of most organisations are too limited in the number of risk/protective factors they address, in coverage, in intensity and in duration.

• To be valid, the kind of statistical analysis required is complex and/or requires too large a sample

• Change in protective factors? Assessing impact in terms of whether the risk/protective factor situation has changed (on the basis of evidence of a link to drug abuse) is usually more feasible

Example of a small youth group with the (long-term) goal of decreasing the number of youth starting to use substances in their community

• IDENTIFIED RISK FACTOR 1 -- Poor communication between parents and youth

• (IMMEDIATE) OBJECTIVE 1 -- By the end of our project, the communication between parents and youth of our community will have improved.

• INDICATORS OF ACHIEVEMENT OF OBJECTIVE 1 -- Number of meals taken together by families has increased -- Youth report better communication with their parents, including on drug abuse issues

• ACTIVITIES PLANNED IN ORDER TO ACHIEVE OBJECTIVE 1 -- Parenting skills session after school once a week for two months -- Free family meals once a week -- Family picnics once a month

Example (continued)

• IDENTIFIED RISK FACTOR 2 -- Youth have too much time on their hands with not much to do

• (IMMEDIATE) OBJECTIVE 2 -- By the end of our project, the youth of our community will be more involved in constructive activities in their free time

• INDICATORS OF ACHIEVEMENT OF OBJECTIVE 2 – No. of youth involved in a constructive activity at least twice a week in their free time increased – No. of youth spending their time chatting in the street diminished

• ACTIVITIES PLANNED IN ORDER TO ACHIEVE OBJECTIVE 2 -- Organise sports training including a health promotion component & participate in competitions -- Assist youth in organising or finding other activities including a health promotion component
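The risk factor → objective → indicators → activities chain in the example above can be kept in one place as a simple structure. A minimal illustrative sketch (not an official UNODC tool; the wording is paraphrased from the example):

```python
# Illustrative sketch only: the example's risk factor -> objective ->
# indicators -> activities chain as a small data structure, so a youth
# group's plan stays in one place. Wording paraphrased from the example.
plan = [
    {
        "risk_factor": "Poor communication between parents and youth",
        "objective": "Parent-youth communication improves by project end",
        "indicators": [
            "Number of meals taken together by families increases",
            "Youth report better communication with their parents",
        ],
        "activities": [
            "Parenting skills session after school, weekly for two months",
            "Free family meals once a week",
            "Family picnics once a month",
        ],
    },
    {
        "risk_factor": "Youth have too much free time with little to do",
        "objective": "Youth more involved in constructive free-time activities",
        "indicators": [
            "Youth in a constructive activity twice a week increases",
            "Youth spending free time chatting in the street decreases",
        ],
        "activities": [
            "Sports training with a health promotion component",
            "Assist youth in organising or finding other activities",
        ],
    },
]

# Sanity check: every objective needs at least one indicator (or it
# cannot be evaluated) and at least one activity (or nothing will be
# done to achieve it).
for entry in plan:
    assert entry["indicators"], f"no indicators for: {entry['objective']}"
    assert entry["activities"], f"no activities for: {entry['objective']}"
print(len(plan), "objectives, all with indicators and activities")
# prints "2 objectives, all with indicators and activities"
```

The point of the structure is simply that monitoring and evaluation follow directly from the plan: the indicators are decided at planning time, not after the fact.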

How? A couple of basic principles

• (At least) collect baseline data or collect data as time goes by to show how the situation changes.

• Use a variety of methods to collect your information to validate it (triangulation)

• To evaluate you also need good monitoring. How can you say that what you did is effective if you do not know what you did in the first place?
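The first two principles can be shown in a minimal sketch: compare an indicator's baseline value with its follow-up value, and triangulate the same indicator across methods. All numbers below are invented, purely for illustration:

```python
# Minimal sketch of the baseline principle: compare an indicator's
# value at baseline with its value at follow-up. Numbers are invented.

def percent_change(baseline: float, followup: float) -> float:
    """Relative change of an indicator between baseline and follow-up."""
    return 100.0 * (followup - baseline) / baseline

# Indicator: average family meals taken together per week (made-up data).
change = percent_change(2.0, 3.0)
print(f"meals together: {change:+.0f}%")  # prints "meals together: +50%"

# Triangulation: the same indicator estimated by two different methods
# (e.g. a survey and a focus group) should roughly agree; flag it for
# review if the two sources disagree by more than 25% (arbitrary threshold).
survey_estimate, focus_group_estimate = 3.0, 2.6
agrees = abs(survey_estimate - focus_group_estimate) / survey_estimate <= 0.25
print("sources agree" if agrees else "review needed")  # prints "sources agree"
```

Without the baseline value, the follow-up value alone would say nothing about change.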

How? The methods

• Surveys through (self-administered) questionnaires

• Not easy! Especially getting the sampling right and creating a simple but effective questionnaire

• Labour intensive! Testing the questionnaire, ensuring anonymity and confidentiality, analysing the replies.

• Provides numbers, which people (and donors) like so much!

Emily Holman
This slide is also exactly up the alley for this panel discussion. Feel free to expand here, add other examples, etc.

How? The methods

• Key informant interviews

• Provide a series of very specific points of view (‘biased’ information)

• Can give very useful insight, if the information is triangulated rigorously.

• Group discussions (including Focus Group Discussions; visual techniques, e.g. mapping; drama-based techniques, e.g. role playing)

• Quickly provide the point of view of a group of similar people. Extrapolation is not easy, but they still yield VERY useful insights.

• Need experienced facilitation and a setting that engenders trust (e.g. not in a place where adults can hear what the youth are saying)

Emily Holman
This slide is also exactly up the alley for this panel discussion. Feel free to expand here, add other examples, etc.

Who (should be involved)?

• Staff, (young) volunteers and youth participants

• To maximise the relevance of the evaluation to the organisation, they can and should be involved in the planning, the analysis and the reporting. However, they will need support and/or training.

• Important stakeholders (administrators in schools and in the community, health and social workers, religious leaders, donors, etc.)

• Not everyone needs to be involved in everything, but they should be kept informed at crucial points, so that they can facilitate the undertaking of the evaluation (permission to access information/youth/stakeholders; statistical advice; etc.)

• External evaluator

• Evaluators lend credibility to results, but are expensive and need follow-up. Hiring an evaluator should be a conscious ‘investment’ decision on the part of an organisation that wants to undertake a more complex evaluation (more for advocacy than for learning?)

Emily Holman
Yet again, this slide is exactly what they need to hear and think about. You could expand here as well (maybe give an example of who are some of the important stakeholders). One other thing is you could add a slide following this one (just a thought, I'm not sure what you think) relating which PEOPLE could help in administering which INSTRUMENTS - so that they realize what degree of person is needed for which level of instruments, and when you need multiple-levels of people to support the same one, etc.

Your decision will depend on why you are evaluating!

• Your donor told you?

• Many decisions will have been taken for you.

• To improve your programme?

• An organisation-wide reflection on which activities were implemented, the feedback of participants and some indication of impact in terms of risk and protective factors will be very useful.

• To advocate among donors and the community?

• Results of a self-evaluation (see above) including simple data, a few interviews and focus group discussions can go a longer way than you think!

• To show that your programme has a drug abuse prevention effect?

• Your programme might have run for long enough, with enough coverage and intensity, that you might think: yes, this is the time to invest time and money to show that we are preventing drug abuse! You will need a good external evaluator and possibly a control group.