
Evaluating the Impact of CLD on National and Local Outcomes

Education Scotland

Final Report

October 2011


Contents

1 Introduction

2 Analysis of Survey Returns

3 In-depth studies

4 Synthesis of Findings

5 Reflections and Recommendations

6 Appendices


1 Introduction

1.1 In early 2011, Education Scotland (previously Learning and Teaching Scotland: LTS), in conjunction with CLD Managers Scotland (CLDMS), commissioned Blake Stevenson Ltd to undertake research into the impact of CLD on national and local outcomes. The purpose of this research is to:

inform policy development and implementation with improved evidence on the impact of CLD;

create a baseline for further improvement of the evidence base for policy development and implementation in CLD; and

suggest models for future impact evaluation.

Context

1.2 The importance of being able to demonstrate the impact of CLD on national and local outcomes has been recognised for some time. The influential Working and Learning Together (Scottish Executive, 2004) stated that it was important to be “able to assess more thoroughly the contribution of CLD to achieving outcomes”. This document also established the three national priorities for CLD, subsequently confirmed by the Joint Scottish Government/COSLA statement in 2008, as being achieved through:

learning for adults;

learning for young people; and

building community capacity.

1.3 The Joint Statement on CLD from the Scottish Government/COSLA (November, 2008) identifies five outcomes within the National Performance Framework that CLD can be expected to make a particular impact on:

We are better educated, more skilled and more successful, renowned for our research and innovation (outcome 3)

Our young people are successful learners, confident individuals, effective contributors and responsible citizens (outcome 4)

Our children have the best start in life and are ready to succeed (outcome 5)

We have improved the life chances for children, young people and families at risk (outcome 8)


We have strong, resilient and supportive communities where people take responsibility for their own actions and how they affect others (outcome 11)

1.4 In addition to these outcomes the Joint Statement recognises that CLD can contribute to other outcomes such as health, community safety and the environment.

1.5 In 2008 Learning Connections produced Delivering Change - understanding the outcomes of community learning and development. This document began to tease out the different outcomes that CLD can support including core skills of being able to:

communicate with other people;

solve problems and make decisions;

work with others;

use ICT; and

use skills with numbers.

1.6 Delivering Change's CLD outcomes about personal development link to those of the Curriculum for Excellence agenda. CLD supports people to become:

confident individuals;

effective contributors;

responsible citizens; and

successful learners.

1.7 In addition Delivering Change sets outcomes for the building community capacity aspect of CLD’s work:

CLD supports people to be confident, skilled and active members of the community;

CLD supports communities to be active and have more influence;

CLD supports community organisations to get access to resources and to deliver services more effectively;

CLD helps organisations plan, manage and assess their work effectively;


CLD supports community organisations to include a wide range of people in their work; and

CLD supports productive networks and relationships.

1.8 However, Delivering Change also recognised the need to provide evidence that outcomes are being achieved, both so that policy makers can see its impact and so that individuals can reflect on the changes it has made in their own lives.

1.9 These outcomes link in with those set out in Quality Indicators 2.1 and 4.1 in How Good is our Community Learning and Development 2 (HGIOCLD2). The recent statement paper “Doing Things Differently” from CLDMS (November 2010) clearly echoes these sentiments. It argues the case for more, not less CLD to deliver change in individuals and communities.

1.10 In August 2010 CLDMS produced “Let's Prove It, a Guidance Framework on providing evidence on the local and national outcomes of CLD activities”. This document sets out the need not just to evidence intermediate outcomes but also CLD's contribution to shared end outcomes (within Single Outcome Agreements and through them to the National Outcomes). It offers a framework that suggests indicators at four levels: engagement, self-perception of impact, measurable outcomes and sustained impact, and stresses the importance of gathering both quantitative and qualitative evidence.
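Purely as an illustration of the kind of structure the Let's Prove It framework describes (the framework does not prescribe any data format), the sketch below shows one way evidence might be recorded against the four indicator levels; the field names and example values are hypothetical.

```python
from dataclasses import dataclass, field

# The four indicator levels suggested in "Let's Prove It".
LEVELS = (
    "engagement",
    "self-perception of impact",
    "measurable outcomes",
    "sustained impact",
)

@dataclass
class IndicatorEvidence:
    # Hypothetical record combining quantitative and qualitative evidence
    # for a single indicator at one of the four levels.
    level: str
    indicator: str
    quantitative: dict = field(default_factory=dict)
    qualitative: list = field(default_factory=list)

    def __post_init__(self):
        if self.level not in LEVELS:
            raise ValueError(f"unknown level: {self.level}")

# Example (invented figures): a measurable-outcome indicator for youthwork.
record = IndicatorEvidence(
    level="measurable outcomes",
    indicator="young people moving on to positive destinations",
    quantitative={"participants": 40, "positive_destinations": 28},
    qualitative=["Learner reported greater confidence at job interviews."],
)
print(record.level, record.quantitative["positive_destinations"])
```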

1.11 In undertaking this research we were aware that some local authority/community planning areas were already developing ways to evidence their CLD contribution to local outcomes. The aim of the research was to collate what is happening and analyse where the gaps are and what might be recommended to improve this in the future.

Methods

1.12 We used both quantitative and qualitative methods to undertake this study involving the following three elements:

an electronic survey with all local authorities to establish what evidence they gather and how it is used. This was sent to all 32 local authority areas and 26 areas submitted responses (25 for Youthwork; 24 for Adult Learning; 26 for Community Capacity Building);

in-depth analysis involving interviews with nine areas to explore the links between their Single Outcome Agreements (SOAs), CLD outcomes and evidence gathered in more detail (originally ten areas were proposed but, despite many efforts, it was not possible to gain the involvement of one of the proposed areas); and

analysis and report writing.


1.13 We met with an advisory group for the study on three occasions to discuss the progress on the work and the draft report. We also kept in weekly contact with the contract manager within LTS.

1.14 The report is set out as follows:

Chapter 2 provides an analysis of the survey returns, divided into the three CLD national priorities of youthwork, adult learning and community capacity building;

Chapter 3 describes the in-depth studies and sets out the main frameworks each area currently uses;

Chapter 4 pulls together the synthesis of findings from the survey and in-depth work;

Chapter 5 sets out reflections and recommendations from the study.


2 Analysis of Survey Returns

The survey process

2.1 The survey was provided electronically rather than on-line as we recognised that in many local authorities more than one person would have to complete it. The survey was divided into three main sections covering the three CLD priorities.

2.2 The survey was sent out in February 2011, and we sent reminder emails and undertook many follow-up telephone calls to encourage people to return it. The deadline for the work was extended to allow for the delay in responses.

2.3 Our sense from this process is that people working in CLD are very busy and we are grateful for the returns we did receive. Given that in several areas re-structuring is currently taking place we think that the final return of 26 is reasonable. Of the 26, one area only returned one aspect of the survey (community capacity building) and one area returned two of the sections but not the adult learning one.

2.4 Because of the level of pressure on people's time we suspect that some of the answers in some of the survey returns are less full than they might have been, and this may make some responses appear more negative than would otherwise have been the case. Where we think this is particularly the case we state this in the text.

What does CLD cover in your area?

2.5 The survey started with a preliminary question about what CLD covers in each area as we were aware that this can vary.

2.6 The majority of responses indicated that the three main CLD priorities are covered. The main deviations or additions to this appear to be around the community capacity building element and were as follows:

instead of “community capacity building” Dundee City Council cites “community regeneration and health”;

Highland Council lists what CLD covers as “youthwork, adult learning and outdoor education”;

East Dunbartonshire, as well as the three main strands, includes employability, children and young people's planning, gender based violence planning, local area co-ordination (learning disability) and health promotion;


two areas specifically mention the operation of community centres/facilities in addition to mentioning community capacity building (Edinburgh and South Ayrshire Council); and

a few areas have spelt out what adult learning encompasses in more detail including family learning, parenting support, literacies, ESOL, basic ICT and return to learning.

Youthwork

2.7 There were 25 respondents to this element of the survey. The first question asked respondents to describe their youthwork provision and who the main providers are.

Youthwork provision

2.8 The answers to this across all 25 are very similar, with the local authority generally being seen as the main provider and key partners cited as the voluntary sector, churches and uniformed organisations. The main difference to emerge (though this may be an instance where some respondents simply supplied more information, as referred to in para 2.4 above) is between those local authorities that mention that their youthwork takes place in schools as well as in the community and those that do not. Our sense is that this may be an area where there is a key difference in provision across local authority areas.

Youthwork outcomes and indicators

2.9 In response to whether specific outcomes for youthwork have been set, 24 said yes and one said no.

2.10 The next question asked whether there are indicators to measure achievement in relation to the outcomes for youthwork: again 24 said there were, and one area (not the same area as the one without outcomes) said it did not have any. The survey asked respondents to say what these indicators are. The responses were mostly detailed; a small number were vaguer, which might suggest that the indicators are not that clearly set out in these areas, or might be a result of people being too busy to complete the survey form in detail. One area stated that it is in the process of developing its indicators. A few other areas simply said they use HGIOCLD2 or the Curriculum for Excellence but did not provide detail.


2.11 We have analysed the indicators that were reported and these mainly fall into the following categories:

numbers of participants;

numbers of volunteers;

achievement of awards;

numbers moving on to positive destinations;

numbers engaged (for example in Youth Forums); and

number of young people in the MCMC (More Choices, More Chances) category.

2.12 Some areas provide indicators that demonstrate they are trying to examine the outcomes for participants. For example, in Inverclyde the following are included:

number of young people reporting an increase in self-esteem;

number of young people increasing their score on the Warwick-Edinburgh Mental Wellbeing Scale; and

number of young people reporting positive health behaviour change.

2.13 In South Lanarkshire the following indicators (among others about participation and achievement) are included (taken from the Data Definitions described in more detail in the in-depth study):

[participants] have made progress in developing social and emotional competence;

[participants] demonstrate positive change in any or all of the following in terms of behaviour, attitude, health related activity, and commitment;

[participants] have used information and guidance to make positive lifestyle choices; and

[participants] give freely of their time to work with others (volunteering).

2.14 These provide some examples of the more outcome-focused indicators that have been reported. Our sense is that in many areas the SOA itself requires only the more straightforward quantitative assessments of participation, achievement and levels of engagement even though some more qualitative assessments are being undertaken. The in-depth studies have explored this further. The next question asked precisely what quantitative and qualitative information is being gathered.

Quantitative and qualitative evidence gathered

2.15 We provide below a summary of the main types of quantitative and qualitative evidence listed overall. The level of information provided varied from a few authorities that, for example, just stated they gathered quantitative information on the number of participants, to others which listed details of a comprehensive range of information collected.

Type of method | Examples of information gathered

Quantitative | Numbers of: participants, awards achieved, programmes delivered, hits on website/portal, workers attending training, volunteers, street-based contacts, young people involved in service planning and delivery, young people moving to positive destinations. Decrease in substance misuse. Increase in skills, confidence, motivation, attendance at school.

Qualitative | Feedback from young people, peers, families, partner organisations and stakeholders. Trends in improving services measured against the HMIe quality indicators. Project self-evaluations, monitoring reports and Learning Community reviews. Case studies, story boards, portfolios, learner journeys and experiences, qualitative statements from ILPs, photos, film, podcasts, group evaluations.

2.16 In terms of who gathers this evidence, all 25 responses indicated that CLD staff are involved, although in some areas only managers are mentioned and in others only local staff are mentioned as gathering evidence. Ten of the responses mention partners as being involved in gathering evidence, with some making it clear that the CLD Team then collates all this information, while others either do not make this clear or state that external organisations gather and collate their own.

2.17 The table below shows where evidence is stored. Some areas have cited more than one system. Five areas reported that their Management Information System (MIS) was in development or had only recently been developed. The table shows that the most common form of storage is local authorities' own database/MIS systems.


Table 2.1: Where evidence is stored

Where stored | Numbers using this method

PIES (Performance and Information Evaluation System) | 3
AIMS (Active Impact Monitoring System) | 1
Covalent | 1
DEAMS/SEEMIS | 1
Cognisoft Outreach | 5
Customised/own database | 8
Hanlon/QPR | 1
Currently manual/no MIS (but in development) | 2
Not stated what system but electronic plus paper | 5
No system | 1

2.18 The most effective MIS appear from the in-depth studies to be those where an area has either adapted its existing MIS to meet the needs of CLD or has developed a new customised system for the purpose.

2.19 In terms of how the evidence is used the following provides a summary of the ways that have been reported:

for various Council Committees and to elected Members;

for various partnerships and partners including CLD, Youth and MCMC partnerships;

for Councils’ performance management, Corporate Plan/scrutiny;

for senior management and service planning;

for funders;

for HMIe inspections;

for CPP, including thematic groups, and SOA;

for reporting to the press; and

for feedback to parents.

2.20 One area, East Lothian, undertakes what we consider an example of particularly good practice:

“information is fed back to young people through an annual 'Living the Vision' event and at individual schools and projects”.

2.21 In response to the question “Is there an overall plan (at local authority/community planning partnership/CLD Partnership level) about how youthwork provision will contribute to the local Single Outcome Agreement?”, three areas stated no and two others stated that it was partial, linked to more general outcomes for young people/contained in a wider document. Twenty said there was such a plan in place.

2.22 There were mixed responses with regard to whether the evidence has been used directly in relation to the SOA. Six indicate that this is either not in place at all or very partial. Ten other areas make it clear there is a direct link and the remainder talk about some form of linking to the SOA either through local authority reporting mechanisms or through the community planning partnership’s thematic groups. Some areas have a limited set of indicators related to CLD within the SOA but a more detailed set that they use to monitor progress at CLD service level.

2.23 A good example of direct linkage as reported in the survey response comes from South Lanarkshire which states:

“A number of key targets have been set out within the SOA that are directly youthwork related. These reflect achievement of young people, literacy and numeracy, healthy lifestyle choice, transitional support into the world of work, and decision making within communities”.

Barriers and Difficulties

2.24 Five areas reported no barriers or difficulties. The following provides a summary of barriers and difficulties raised from the remaining 20 responses.

Barriers relating to the process: admin time to support the input of information to the MIS; getting staff to complete returns; databases not all compatible and a wide range of databases in use; lack of strategic support and direction over a number of years, although this is now being addressed.

Barriers relating to reporting/evidence gathering: tends to be output driven; difficulty of attribution; hard to provide evidence of the long term impacts; getting information from all the partners is difficult.

Barriers linked to the SOA process: hard to influence the design of the SOA in the first place; still dominated by the attainment model (education); youthwork is not mentioned in the SOA so showing contribution is difficult; all partners would benefit from closer ties to SOA outcomes and streamlining reporting methods.

Reduced budgets and resources: including reduction of time, making implementation of all of this particularly difficult.


2.25 In addition to the above there is some evidence of areas where barriers and problems have been or are being overcome:

“we have overcome significant resistance to embed the [outcome based] approach and we continue to work at maintaining that”.

“before the launch of Believing in Young People (BYP) there was a lack of understanding about the possible role and contribution of youthwork to the lives of young people - part of the justification for the production of BYP was to address that... We are now developing outcomes for youthwork to put into the Children and Families Service Plan and the Children and Young People's Strategic Partnership”.

“CLDS have baselined youthwork for the first time in 2010/11”.

The influence of the evidence

2.26 The final question asked whether the evidence gathered had been influential in any way either within the local authority or more widely.

2.27 In terms of influence within the local authority area, several areas reported that the evidence had been influential in a range of ways: youthwork has a higher profile with senior managers and/or with elected Members; it has helped to avoid cuts to the youthwork budget; it has helped in service planning.

2.28 The following provides some examples of the evidence influencing more widely:

positive HMIe inspections. For example, HMIe described East Lothian's electronic project files as “sector leading”;

external awards such as COSLA and Securing the Future awards;

has helped to win external funding for initiatives;

other areas adapting initiatives that have been developed: for example, Dumfries and Galloway's Inspector8 model has been adapted by a number of other areas; and

wider dissemination: for example Highland report that BBC Alba filmed its intergenerational technology project recently.

2.29 A few final comments were made. These centred around the problems posed by budget cuts and restructuring, but several areas also stated that they are reviewing what they are doing in relation to outcomes for youthwork, or recognise the need to do so. A couple of areas commented on the need to involve the wider partners, with one area, Scottish Borders, stating that this is underway.

2.30 Shetland provides a positive final comment in its response: “Youthwork is viewed very positively in Shetland, and is being regarded more as an equal partner in the Schools service”.

2.31 The final comment from Renfrewshire provides a useful endpoint summary for this section:

“The landscape and shape of Youth Services is changing on an almost weekly basis. We are expected to deliver more for less and as resources decline the demand on our services increases. There needs to be a national push to get youthwork recognised and valued by all governmental and political bodies”.

Adult Learning

2.32 There are 24 responses to this part of the survey.

Adult learning provision

2.33 The adult learning provision is described in a similar manner across all respondents, with most saying that provision is by the CLD service within the local authority, the FE College and the voluntary sector. The WEA is particularly mentioned as one of the key voluntary sector providers. In one area it is stated that there is less voluntary sector involvement. In a few areas the NHS is mentioned as a provider of health-improvement related groups. The provision appears similar and tends to include a mixture of literacies, ESOL, ICT, vocational classes, adult guidance, family learning, and arts and crafts classes.

Adult learning outcomes and indicators

2.34 All but one of the responses stated that outcomes for adult learning had been set. But as in the Youthwork section, one respondent (not the same as the one without outcomes set) also stated that indicators relating to the outcomes for adult learning had not been set.

2.35 The indicators reported fall under the following main headings:

numbers of participants involved in the range of activities (literacies, ESOL, CBAL learning opportunities, guidance etc) offered;

achievement in terms of numbers of qualifications and also achieving learning goals set;


evidence of progression, including in relation to employability and employment; and

evidence of increase in confidence/impact on family life/improved parenting skills.

2.36 As well as these main indicators some areas pointed out that individual projects may also have their own outcomes and indicators.

Quantitative and qualitative evidence gathered

2.37 The reported quantitative evidence often links to the indicators described above: mainly enrolment, attendance, achievement in terms of awards and qualifications, and progression in relation to ILPs.

2.38 On the qualitative side, in addition to the evidence of individual experience and progression (for example, personal perceptions of benefits of participation and progress made recorded in Individual Learning Plans and Group Learning Plans, participant impact statements, self-evaluation, case studies) there is some reporting of evidence being gathered across partnerships in relation to shared outcomes. For example, Aberdeenshire Literacies Partnership evaluates its progress towards shared outcomes and other areas describe Learning Community reviews with a range of partners involved.

2.39 One area reports that limited qualitative evidence is gathered at present (compared to what they gather for youthwork) and the level of response from a few other areas may suggest that other areas also do not gather much qualitative evidence at present (although it may of course also be a reflection of lack of time to fill the survey in fully).

2.40 In terms of who gathers the evidence there is, as with youthwork, a range of responses from “all staff” to “staff at local level” to senior managers/co-ordinators, and a handful of areas where partners are also included as being responsible for gathering evidence.

2.41 Not surprisingly the responses as to where evidence is stored are very similar to those provided for youthwork with a mixture of databases and management information systems in use plus paper-based files in some areas. Where Partners are mentioned (which is in only a few cases) it is usually to say that they have their own systems for storage of evidence. The exception to this is Scottish Borders Council which suggests that evidence is submitted by all agreed partners and returns fed into a central database managed by CLDS.

2.42 Four of the 24 returns for this strand report that they do not have an overall plan (at local authority/community planning partnership/CLD partnership level) about how adult learning will contribute to the local Single Outcome Agreement.


2.43 In response to the question “has the evidence been used to indicate contribution to the area's SOA and if so in what way?” 16 respondents indicated that it had. Four other respondents indicated that the evidence was used in some way towards the SOA, possibly indirectly to other committees or thematic groups and four either stated that it had not or it was not clear from the answer given whether this was the case or not.

2.44 An example of good practice in respect of how evidence is used came from East Lothian’s response which indicates that even where practice is good there are still issues to address:

“East Lothian Learning Partnership Adults Forum has a logic model Action Plan leading to SOA outcomes and reviews outcomes periodically. SOA targets are linked to overall impacts for the projects in the CLDS Business Plan but not formally sent to anyone to count”.

Barriers and Difficulties

2.45 Thirteen of the responses in relation to barriers and difficulties are exactly the same as given for youthwork, including three where there were no reported barriers/difficulties. For the remaining ten of these, the issues cover problems with the process (such as lack of admin time to input to the database, too many databases, getting people to submit returns), difficulties of attribution, keeping things proportionate and, from several areas, issues arising from the budgetary problems.

2.46 Those barriers and difficulties that differ from the youthwork ones tend to be around the issue of the outcomes and how these are being addressed. For example, the following comments were made by different areas:

“some difficulties have been experienced in linking CLDS outcomes with higher level SOA outcomes”;

“outcomes cover a variety of the SOA outcomes: not always sure who is collecting evidence for each SOA outcome and how it is collated - quite possible that we are not sharing information with everyone who should know”;

“being clear on what it is worth gathering and what it tells you”;

“there has been a structural change to CLD services and new adult learning outcomes require to be developed to contribute more to SOA targets”;

in relation to an area where the SOA target is “a 5% year on year increase in the number of adult literacy learners” the perceived difficulty reported was “we need to refine this indicator so that it remains meaningful, reflecting not just a cumulative increase but a more qualitative measure of outcomes achieved”;

“setting and evidencing outcomes that fully describe the achievements and outcomes of participants. Tracking progression is particularly difficult in terms of outcomes”.

2.47 Some difficulties reported were specific to particular circumstances. For example in one area the respondent stated:

“CLD staff have felt that there has been a lack of focus on Adult Learning as a result of the integration of the Children and Families Department and the development of more focus on vulnerable children and families. I think we have turned that corner, but our adult learning is more focused on skills development in relation to gaining employment and within literacies work looking at more focus on learners achieving a recognised capability. This is not without its difficulties”.

2.48 There were also some more positive comments in this section about how difficulties were being overcome:

“there have been less barriers here than in youthwork and the key players (Council, Colleges, WEA) constitute a strong lobby”;

“a reporting and monitoring system is now embedded in practice”.

2.49 One area that had reported difficulties within youthwork reported none in this section.

The influence of the evidence

2.50 As with youthwork, respondents report that the evidence gathered has been useful in influencing senior management and in some cases elected Members. It has been used in planning and shaping services and in helping to determine where resources will be spent.

2.51 Externally the evidence has been useful for HMIe inspections and in helping secure external resources. Some Councils have won awards at national level such as COSLA awards.

2.52 South Ayrshire reported some very practical results in relation to employability and learning progression:

Jobcentres now refer clients to CLD courses;


clear pathways have been developed from community based learning to college;

SCQF accredited courses now being delivered; and

literacy now embedded in learning and employability programmes.

2.53 Only six areas chose to make further comments at the end of the survey and these mostly focused on issues relating to budget constraints and re-structuring. Two of the six comments concerned resources and questioned whether the local authority will be able to continue to deliver the same level of service or for example “to deliver on the 2010 National Adult Literacies Strategy”. One authority stated:

“sourcing external funds for Adult Learning is a major aspect of the work. Sustainability is in question beyond March 2011 regardless of how positive the impacts are”.

2.54 One authority commented on the perceived lack of strategy at national level:

“adult learning is suffering from a lack of strategy from National level, the Skills for Scotland report would have benefitted from targeted funds being available at local level in much the same way the CLD Upskilling programme did”.

2.55 Two other comments reflected the recognition of the need to develop outcomes to demonstrate impact, one because of restructuring and the other recognising the need to capture outcomes “across a broad partnership”.

Community Capacity Building (CCB)

2.56 We received 26 completed surveys for the CCB element.

CCB Provision

2.57 The range of providers and the provision offered is broader than in the previous two sections of the survey. Providers can include various departments and staff within the local authority (regeneration, housing, libraries, greenspace/environmental, Ward managers) as well as a wide variety of partners outwith the local authority including the third sector interface, development trusts, health organisations and the NHS, and a range of voluntary sector organisations.

2.58 Some areas have commented on what it is they provide under this heading seeking to clarify what it involves. The range includes support with involvement in community planning, community engagement, support for social enterprises, support for new community groups, support for influencing groups/groups of community interest, tenants and residents associations, development trusts and capacity building around children and families' issues.

CCB Outcomes and Indicators

2.59 Twenty-three areas responded that they have specific outcomes for CCB, with three areas saying that they did not.

2.60 Twenty-two areas said they have indicators to measure their set outcomes, with one area (in addition to those which had stated they do not have outcomes at all) stating that it does not have indicators to measure its set outcomes.

2.61 The indicators that have been provided are analysed as follows:

numbers of volunteers, active community councils, credit union members;

in relation to social enterprises: number of social enterprises financially self-sustaining; turnover of the social economy;

in relation to community consultation: number of community consultations; number of groups which influence local and wider decision making; completion of community consultations using VOICE;

in relation to support provided: number of management committees supported; numbers of groups given funding advice and guidance; number of groups receiving support; number of community/third sector groups supported to secure funding or obtain assets.

2.62 In general it is clear that it is harder to arrive at qualitative outcome indicators for CCB. Several of the areas left this response blank even though they had said they do have indicators. Others said that they were in the process of developing them. A few indicators stood out as trying to get behind the numbers:

number of community groups developing and delivering services in response to priority need;

increase the capacity of individuals to become effective activists in their communities;

involvement in CLD activity supports participants' health and wellbeing;


local communities are better represented in decision making and partnership groups and effectively influence the decision making process;

the extent to which organisations identify themselves as having the skills, confidence and knowledge to identify and respond to community need;

groups systematically find out the needs of their communities and have effective information channels;

community perspectives are reflected in local plans, priorities and actions; and

community based voluntary organisations can better evidence positive impact.

CCB Quantitative and Qualitative Data

2.63 As might be expected from the indicators described above the quantitative data collected focuses on numbers of participants, volunteers, training courses, groups supported etc. Some areas indicate that they use aspects of returns from the Citizens Panel to gather evidence. Others spoke of using more generally available data such as health indicators and SIMD to see if there have been global improvements. One area said that it records the number of successful funding bids made by supported groups.

2.64 East Renfrewshire is beginning to tackle this whole area of evidencing CCB more systematically. At present it is piloting a baseline survey with community councillors to measure their confidence levels on key functions eg representing their community/influencing issues on behalf of their community. Another area spoke of the need to gather this kind of baseline information if any meaningful understanding of impact is to be forthcoming.

2.65 There is a strong sense that the evidencing of CCB’s impact is still in development. This is borne out by a comment made in response to the next question about who has responsibility for gathering the evidence where one area states:

“no-one has specific responsibility for gathering the data at present, so some of the indicators are more 'this would be a good way to measure this...' rather than 'this is how we are actually measuring it': some of the baseline data on social economy turnover has not been collected yet”.

2.66 The remaining responses to this question again cover a range of CLD staff (senior managers and local staff) with a few areas saying that all CLD partners have responsibility for gathering evidence.


2.67 In terms of where evidence is stored there is a similar range of management information systems quoted, including PIES, Covalent and Cognisoft. But it is clear that storage is less well developed than is the case for youthwork and adult learning, with some areas that use one of the above systems to store their adult learning evidence, for example, stating that they still hold the CCB evidence in paper folders rather than electronically.

2.68 In terms of reporting the majority of respondents mention local authority committees and various partnership groups, including the community planning partnership. The evidence is used for performance measurement, community planning and for internal scrutiny and external inspections.

2.69 Three areas report on the development of measurement in this area. One area states: “at the moment data collection and use is being piloted. In future it will be used as part of the Council's Outcome Delivery Plan”. Another states that “there is no CCB partnership group and little or no joined up reporting at a [area-wide] level”. The third states that “this is developing as the authority develops a new approach to performance management”. It is interesting to note that even where areas are still struggling with this they are at least aware of the need for change, which is encouraging.

2.70 Of the 26 responses seven said they do not have an overall plan (at local authority/community planning partnership/CLD partnership level) about how community capacity building provision will contribute to the local SOA and an eighth placed a question mark as its response. This means that 18 areas said they do have such a plan in place. This is a lower number than either of the other two strands indicated.

2.71 Sixteen areas gave answers to suggest that the evidence gathered has been used directly to indicate its contribution to the SOA. Two areas said this was in development and others either did not respond or gave an unclear response.

Barriers and Difficulties

2.72 Some responses were the same as in the previous two CLD strands and concerned process issues and lack of resources. The responses that differed were mainly around the problems of how to establish the evidence of impact:

“not having an accurate baseline figure and an agreed methodology for gathering and comparing data is a significant challenge”;

“more work is being done on evidencing impact from community capacity building both from groups and individuals involved”;


“establishing a baseline has been difficult as we are relying on responses from community members”.

2.73 One area spoke of a more fundamental problem:

“there is no current clear direction for CCB across the CPP. This is being raised as an issue within the current CLD review”.

2.74 One area, however (Perth and Kinross), has managed to develop CCB outcomes and commented:

“it has taken time to develop CCB outcomes but these are now in place and have guided the development of the new [Cognisoft] IO system which will also help ensure consistency of reporting and contribute to a fuller picture of the wide range of CLD activities in Perth and Kinross [but] this system is only used by local authority CLD staff”.

The influence of the evidence

2.75 Ten of the responding areas did not supply any information regarding the influence of the evidence gathered. Of the 16 areas that did respond, several referred to the influence of the evidence on service planning and budget decisions. Others, as with the other two CLD strands, described external influence in the form of awards and the use of evidence for external inspections.

2.76 Some practical results from the evidence were also noted. For example one area commented that evidence had been used in relation to asset transfer while another stated that having particular indicators in the SOA [referring to credit union membership] had helped secure funding for third sector partners in relation to supporting an increase in credit unions in certain areas.

2.77 One area, Dundee City, has undertaken a local community impact assessment to assess the impact on local communities of local community planning and believes it is the first area to do this.

2.78 One area reported that the evidence was not really influential as it was mostly quantitative and very limited. Individual projects had good evidence but this was not being linked back to a clear strategic plan around CCB (now being addressed).

2.79 A few areas made further final comments. Some of these referred again to overall CLD budget constraints and re-structuring:

“CLD [overall] has just suffered a significant level of budget reduction and will require to complete a radical restructure. Demonstrating evidence of impact will be crucial in the short and longer term”;

“CCB is probably the least resourced of the three WALT areas in terms of staff on the ground, most organisations in the CLDP are undergoing restructuring and staff resources are being depleted even further”; and

“Community and Customer Services of which CLD is a part is currently going through a major re-structuring - in doing so the lack of strategic support and direction will be addressed. A more co-ordinated approach will be developed along with robust quality assurance measures being put in place”.

2.80 Some areas wanted to point out positive aspects of their work in this area:

“Community Learning and Development Services has developed a local community capacity toolkit which has been rolled out to community planning partners; the framework aims to help build the capacity of voluntary and community organisations across East Ayrshire”;

“We have recognised the need to demonstrate impact and outcomes of CLD across a broad partnership. The CLD partnership steering group will be considering this”.

2.81 The next section of the report describes the methods used to conduct the in-depth studies and provides an overview of the existing frameworks that the nine areas involved have drawn on.


3 In-depth studies

Introduction

3.1 We conducted in-depth studies in nine areas. In each area we interviewed one to three staff involved in completing the survey pro forma. The purpose of these interviews was to gather more detail about the link between local outcomes, evidence gathered and national outcomes. We supplemented the interviews with further desk research as required.

3.2 The nine areas involved in this in-depth stage were:

Aberdeenshire;

Dundee;

East Lothian;

East Renfrewshire;

Inverclyde;

North Lanarkshire;

Perth and Kinross;

Scottish Borders; and

South Lanarkshire.

3.3 Each study included the following:

what CLD includes in that area;

links between local and national outcomes and fit with SOA;

planning processes in relation to outcome setting;

overview of local outcomes across each of the three CLD strands;

what evidence is currently gathered and how it is gathered;

how evidence is used to influence etc; and

ideas for the future.

3.4 We provide the synthesis of the findings from these in-depth studies in the next section of this report.


Use of existing frameworks

3.5 We asked each area to indicate which of the existing frameworks and guidance documents it uses. The table below summarises the responses across the nine areas. This shows that Delivering Change and HGIOCLD2 are well used. LEAP is used for planning purposes. Let's Prove It has a more patchy usage pattern.


Table 3.1: Use of existing frameworks (responses from the nine in-depth areas: Aberdeenshire, Dundee, East Lothian, East Renfrewshire, Inverclyde, North Lanarkshire, Perth and Kinross, Scottish Borders and South Lanarkshire)

LEAP | Used in all nine areas; one area uses it for planning and in one area it is individual practitioners who use it.

HGIOCLD2 | Used in all nine areas.

Delivering Change | Used in most areas; comments included that it influenced the early stages of one area's approach, that another uses the outcomes in it “a bit – as an influence”, and that one area uses it for youthwork.

Let's Prove It | Patchy: five areas use it and four do not (one commenting that it was not that useful).

CLD competences/standards | Used in all nine areas.

National standards for community engagement | Used in all nine areas.

Curriculum for Excellence's four outcomes (being actively used for this) | Used in all nine areas.

SROI | Used or being developed in only a few areas; one area also undertakes other social audits and has a staff member qualified to undertake social accounting.

Other | VOICE (mentioned by several areas, including for community engagement); local tools; areas' own strategies and business plans as local drivers; a good practice framework for literacy work; the Hanlon database; the Early Years Framework; the National Youthwork Strategy; an HMI CPD programme.


4 Synthesis of Findings

4.1 This section of the report brings together an overview of the findings from across the survey responses and in-depth studies. We group our analysis under the following headings:

provision;

structural issues;

planning processes;

management of information and evidence;

CLD outcomes and links to SOA and national outcomes;

partners;

interesting practice; and

ideas for the future.

Provision

4.2 While there is much similarity in provision across the geographical areas for the three CLD priorities, there are some interesting differences which then relate to what is measured.

4.3 Within youthwork there appears to be an interesting difference between areas where youthwork is firmly linked with education and schools (such as North Lanarkshire and Aberdeenshire) and other areas where this is less apparent. Some areas describe the role that CLD has to play in relation to the Curriculum for Excellence (for example this is stated clearly in Edinburgh's CLD Service Plan, Supporting Communities, where the role that CLD can provide in support of CfE is highlighted, and also in East Lothian's Community Learning and Development Services Plan 2010/11 which outlines how CLD will contribute to CfE).

4.4 Within adult learning, although there is again much similarity, there are differences of emphasis, with some areas placing more focus on parents and family learning than others.


4.5 The greatest differences, however, relate to the community capacity building (CCB) strand, where a more diverse range of interventions is described and a broad range of partners are involved in provision. The types of interventions include support to local communities of interest/local campaigning groups; support to enable communities to influence community planning and service planning generally; support to the social economy; and general infrastructure support for established and new groups (managing committees, team building, funding applications, HR etc).

4.6 Across the three CLD strands some areas make mention of health bodies being involved (for example, in Inverclyde the Health Inequalities Action Plan is a pilot to test how taking a CLD approach to health improvement work in deprived communities can add value to other agencies' work in those areas) and in other areas there is no mention at all. Our sense is that the links with health are important and need to be strengthened where this is not already the case.

Structural issues

4.7 CLD sits in different parts of local authority structures and where it sits can be revealing of how it is regarded. The recent re-structuring in Dundee, for example, where CLD was moved to sit in the Chief Executive's Department while other elements of the former service were placed in an arm's length trust, illustrates the importance of CLD in that area. In contrast, the CLD re-structuring ongoing in another area may see all staff dispersed in local area teams with no central management or co-ordination.

4.8 In some areas CLD is one team with one budget (for example Aberdeenshire) and the sense from that area is that this has made it a stronger force. In many other areas CLD is split between different departments with youthwork and adult learning often linked to education but CCB sitting in a variety of places including in South Lanarkshire within Enterprise (for historic reasons). In Highland there is no CCB team within the Council but instead it reports that there are Ward managers who work with the elected members and communities “to co-ordinate and develop local services”.

4.9 While it is not a structural issue, it is clear that where there is a strong senior manager in charge of CLD or where there is an elected Member with specific responsibility for CLD/an aspect of CLD, this only serves to strengthen the position of the service.

Planning processes


4.10 One of the striking aspects of the in-depth studies is just how many different plans there are in each area: Team Plans, Service Plans, Department Plans, Corporate Plans and the SOA, not to mention cross-cutting partnership plans. One of the challenges of this work has been to try and sift through all of this to get at how CLD outcomes are being measured and used in relation to all of these. Our sense is that even those working within some of the in-depth areas we have visited find this plethora of plans challenging – Inverclyde, for example, highlighted the urgent need for rationalisation across partnerships and plans. This may be unavoidable but as much clarity and simplicity as possible would be helpful.

4.11 Despite the above comment, some areas have managed to make clear the link between CLD and their area's local outcomes and the link from these to national outcomes within one document (others have managed it across several documents, one for each of the three (or sometimes only two) strands). Some areas have no plan of this nature at all, and clearly it is much harder to evidence impact if there has been no planning for it in the first place. However, all areas which supplied survey responses but have no plan or strategy at present are aware of the need to do so. The respondent in one area which has had no CLD strategy for the last four years felt it was clear that this has impacted on the profile of CLD in the area.

4.12 Our sense is that a clear, short CLD Strategy or Plan is essential to bring what is being attempted into one place and to help guide the impact evidencing process. It is also important in helping to raise the profile/visibility of CLD.

Management of information and evidence

4.13 It is clear that many areas have been giving attention in recent years to the development of management information systems, often custom built for the purpose. There are some excellent examples of MIS where both qualitative and quantitative evidence is stored and can be traced through to the individual learner. Dundee, East Lothian, East Renfrewshire and South Lanarkshire are among the good examples of this that we were shown, and they have had positive feedback about these systems through HMIe inspections.

4.14 From the in-depth studies it is apparent that critical to the successful implementation of the MIS is the intensive staff training and capacity building that needs to run alongside the development/introduction of the system itself. Several areas said that this was a multi-year process and not something that could be rushed or ignored. One area likened it to managing a culture change, which we think is apt for what is entailed.

4.15 In addition, invariably in those areas with very effective management information systems, these are not only used as databases for gathering information, but are also a key instrument in building a culture of self-evaluation amongst staff, and in some cases are even used as a means of quality controlling staff approaches to evaluation.
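The report does not prescribe any particular system or schema, but as a purely illustrative sketch (all table and column names here are invented), a minimal structure that can hold both quantitative and qualitative evidence and trace it back to an individual learner might look like this:

```python
import sqlite3

# Hypothetical minimal schema, for illustration only: one table of learners
# and one of evidence items, so that quantitative measures and qualitative
# statements can both be traced back to the individual learner.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE learner (
    learner_id INTEGER PRIMARY KEY,
    area       TEXT,
    strand     TEXT    -- youthwork, adult learning or CCB
);
CREATE TABLE evidence (
    evidence_id INTEGER PRIMARY KEY,
    learner_id  INTEGER REFERENCES learner(learner_id),
    kind        TEXT,   -- 'quantitative' or 'qualitative'
    indicator   TEXT,   -- e.g. award achieved, learner statement
    value       TEXT,
    recorded_on TEXT
);
""")

# Invented example entries.
conn.execute("INSERT INTO learner VALUES (1, 'Example Area', 'adult learning')")
conn.execute(
    "INSERT INTO evidence VALUES (1, 1, 'qualitative', 'learner statement', "
    "'More confident helping my children with homework', '2011-03-01')"
)

# Evidence for one learner can then be pulled together for self-evaluation
# or for reporting against local and national outcomes.
for row in conn.execute(
    "SELECT kind, indicator, value FROM evidence WHERE learner_id = 1"
):
    print(row)
```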

CLD outcomes and links to SOA and national outcomes

4.16 Some areas are clearly now undertaking outcome-focused planning and this represents a real shift in thinking over the past few years. All the in-depth areas are attempting outcome-focused planning while recognising that there are elements of this process that still need improvement.

4.17 Across all three CLD strands there are examples of “outcomes” in various plans that are not “outcomes” at all but are more commonly interventions or outputs. This suggests to us that there is still further work to be done to assist some staff to fully understand what outcomes are. It is important to be rigorous about this as otherwise, in trying to measure impact, there is a tendency to count aspects of interventions and outputs as if they were outcomes. For example, evidencing the provision of opportunities or the number of participants, while valuable evidence of activity, does not in itself indicate whether there have been positive outcomes for the participants involved.

4.18 That said, it is clear from the in-depth work that we did that areas are very good at putting in place individual action plans at project/local area level, and outcome-measurement at this level is good. It is when areas attempt to aggregate data and demonstrate evidence of impact at a higher level, or between/across partner organisations, that there is still some development work required.

4.19 The survey results demonstrate that it is harder to set outcomes for CCB than the other two strands and several of the areas involved in the in-depth work, including some areas that are demonstrating excellent practice in evidence gathering such as East Lothian and East Renfrewshire, stated that this is an area still being developed.

4.20 The table below, based on the survey responses, illustrates that most areas have been able to demonstrate that outcomes are set for the three CLD strands, but fewer are able to say that there is an overall plan to link these to the SOA and fewer still are able to say that evidence gathered is used directly to link back to the SOA. It is interesting to note that youthwork appears to have less evidencing directly back to the SOA than the other two strands, although it has a high number of “some linking”, which suggests that because it is part of many other services for children and young people, including education, it may be a smaller part of a bigger evidence picture being presented.

Table 4.1: Summary of outcomes linked to SOA from survey responses

Strand | Outcomes set? | Overall plan to link to SOA? | Evidence used directly re: SOA (Yes / Some / No or partial)

Youthwork (25 returns) | 24 (1 not) | 20 (5 not) | 10 / 9 / 6

Adult learning (24 returns) | 23 (1 not) | 20 (4 not) | 16 / 4 / 4

CCB (26 returns) | 23 (3 not) | 18 (8 not) | 16 / 2 / 8

4.21 Most of the evidence that is being used to feed directly into the SOAs is quantitative, because this is generally the kind of evidence that SOAs require. The qualitative evidence does exist in some areas at least, can be drawn on, and is used in annual reports and service reviews. There is still a question about how the outcomes for individuals in terms of the qualitative impact can best be highlighted and reported. One good example of this is through the HMIe inspections of Learning Communities. For example, the succinct report on the Learning Community Inspection around Dundee's Harris Academy provides an excellent summary of the ways in which CLD has an impact on people's lives.

Partners

4.22 While all areas note that many partners are involved in the delivery of CLD provision, and while the SOA is the product of a community planning process involving the partners, the involvement of partners in the direct gathering and collation of evidence to demonstrate the impact of CLD is patchy to say the least. Some areas involve partners with whom they have a funding arrangement in this process, and some, such as East Lothian, are considering taking this approach in future. Others state that partners gather their own evidence, but whether this is used to inform overall impact is less clear, and our sense is that probably it is not. Areas that have a strong history of partnership working, such as North Lanarkshire, seem to be making the most progress in involving partners in jointly evidencing impact.

4.23 Given that many areas now have a sophisticated MIS in place, it should be possible to strengthen the work of CLD Partnerships by involving all key providers in the gathering of evidence of impact. This will require capacity building and training with partner organisations’ staff, as it has with internal local authority staff, but it is an area that would be worth further investment. It links back to the health of CLD Partnerships themselves and would perhaps enable them to demonstrate the added value of working in partnership more effectively. An important part of this process will be to demonstrate to partners the benefits that they can gain from joint gathering of evidence.

Interesting practice

4.24 We have found many examples of interesting, and in some cases innovative, practice during the course of this research, including:

having one clear plan where the CLD outcomes are succinctly linked to area/city outcomes and national outcomes: for example Edinburgh’s CLD Service Plan, Supporting Communities 2011-2014, and North Lanarkshire’s Outcomes and Indicators 2010-12 document;

direct mentions of CLD within SOAs: for example, Dundee’s Single Outcome Agreement emphasises community engagement, and Stirling’s SOA has a clear aim of “continuous improvement in the quality of Community Learning and Development Services”;

having sophisticated MIS that staff are supported to use: several examples including East Lothian, Dundee and East Renfrewshire;

having clear outcomes for CCB: Perth and Kinross, Aberdeenshire;

trying out new ways to gather evidence: for example Dundee’s local impact assessment of community planning, and Perth and Kinross’ Measuring the Difference Outcomes Form;

providing clear guidance and support to staff as to what evidence is required: South Lanarkshire’s Data Definitions; Dundee’s website with guidance (“Focusing on Outcomes”) and forms easily available; East Lothian’s comprehensive staff training package;

several areas have raised the importance of building capacity with staff across services within the Council to enable them to undertake community capacity building directly themselves: this is an important development in times of diminishing resources; and

working with partners to evidence impact: Scottish Borders; and North Lanarkshire, where partners jointly record learner numbers on the MIS, and the Outcomes and Indicators 2010-12 document clearly highlights which partners contribute to which of the joint outcomes (by measuring particular indicators).

Ideas for the future

4.25 We asked each area in the in-depth studies to suggest ideas for the future, either for their own area or at national level. The following are some of the ideas put forward:

Locally: the need to improve the CLD partnership (make it more dynamic/make it more outcomes-driven/gather evidence from across partners);

the importance of putting in place baseline measurements, in particular for CCB;

making better use of qualitative evidence to illustrate the outcomes of CLD;

ensuring that the evidence-gathering and reporting is proportionate and timely so that it has maximum impact.

Nationally: assistance to break down the CCB strand into one or more manageable components so that clearer outcomes can be set. This could help areas with this particular issue, as it would allow them to see more easily what it is they are trying to achieve and assist them in knowing what evidence to gather to demonstrate the outcomes had been achieved;

clarifying/streamlining the various national bodies, as there was a sense that there are too many different bodies at national level;

updating WALT/HGIOCLD to include some of the latest thinking (and this current research could be drawn on for that purpose).


5 Reflections and Recommendations

5.1 The survey and in-depth studies have illustrated that there is a lot of activity in relation to evidencing the activity, outputs and outcomes of CLD in many areas across Scotland, and the need to evidence outcomes is much more to the fore than previously. Some of the existing frameworks, in particular Delivering Change and HGIOCLD2, have proved helpful to many areas in thinking about outcomes and evidencing impact. The move to gather more evidence is happening against a backdrop of reducing budgets and increasing pressure on staff, which adds significant challenges.

5.2 It is clear, however, that in some areas there is still some confusion over the difference between outcomes, outputs and activities, with room for further upskilling on this.

5.3 It is also clear that although there is some activity within most local authorities around evidence gathering, it does not often stretch to the wider partners involved in CLD in an area in a coherent way. This highlights the weakness of many of the formal CLD Partnerships in that they are not working as effective partners in terms of demonstrating what they achieve collectively.

5.4 There is little consistency across Scotland in relation to what is evidenced and how. Within the three strands there is more evidencing of the outcomes and impact of youthwork and adult learning than of community capacity building.

5.5 In some respects the evidencing of the impact of CLD, as a whole entity, is made more difficult by the fact that it has three distinct strands, and these strands are sometimes based in separate areas within local authorities with a range of partners who focus more on some elements than others. Having a succinct CLD Plan that sets out the outcomes across the three strands is helpful, but in many areas no such plan exists, although there may be separate youthwork, adult learning and sometimes CCB plans.

5.6 The way in which SOAs require evidence to be gathered does not tend to allow for qualitative evidence to be included, and it may be that in terms of assessing the overall impact of CLD the SOA route is not the best vehicle to use. It will be important to explore what other routes are available to help demonstrate the impact of CLD more clearly.

5.7 There are strategic choices to be made about this:

do we want to see greater consistency across Scotland in evidencing impact?

do we want to evidence the impact of CLD as a whole entity, or is it enough to evidence the three elements separately?

do we want impact to be evidenced across partners?

where and how can the qualitative impact of CLD best be captured: is it through the SOA or is it through some other vehicle?

how do areas which are further behind with evidencing outcomes and impact get supported?

5.8 These choices lead us to propose recommendations based on our own views on the answers to the above, but of course these will be subject to debate.

Recommendations

5.9 We propose the following recommendations:

a. In order to support greater consistency across Scotland (so that the impact at national level can be assessed) we suggest the following:

- the creation of a simple and clear CLD Action Plan template, showing sample outcomes for youthwork, adult learning and CCB;
- an event to share practice based on this report; and
- further upskilling inputs in relation to outcomes, outputs and interventions.

b. In order to maintain the impact of CLD as an entity we suggest that each area is encouraged to prepare a CLD Plan that shows clearly what the outcomes are, how they link to the area’s own outcomes and how they will be evidenced. This should include how each key partner will contribute to this evidencing process.

c. We believe it is essential to evidence impact across partners if CLD Partnerships are to demonstrate real added value. In areas where the local authority has established good MIS, this should be extended to partners with appropriate capacity building to help them use it. (Should there be any further Upskilling funding for CLD, this might be an area worth funding.) “Partners” should include internal Council partners such as education and housing.

d. Reporting on the SOA (in its current format) is only ever going to be able to capture a limited amount of mainly quantitative information. We think HMIe Learning Community reports, where these rate practice as Very Good/Excellent, should be used more widely to help senior managers, local politicians and national politicians understand what it is that CLD does. This requires careful communication and PR work.

e. There are clearly some local authority areas which are further behind than others in this process of evidencing outcomes. Education Scotland (LTS/HMIe) should consider how best these can be supported. One way is to encourage more sharing between neighbouring areas: this has already happened to some extent in the Upskilling programme with regard to the alliances formed around training and capacity development. How can this sharing be supported through central bodies?

f. Areas which do not have a good electronic system for gathering and storing evidence should be assisted to undertake the processes that lead to such a system being put in place.

5.10 It is heartening that so much effort is being put into gathering the evidence to demonstrate the impact of CLD. We hope that this report can assist in taking this process further.

Evaluating the Impact of CLD on National and Local Outcomes

Education Scotland

Final Report Appendices

October 2011


Contents

Section Page

Survey schedule .................................................................................... 37


Appendix 1

Survey schedule

Introduction

LTS in partnership with CLD Managers Scotland (CLDMS) has appointed Blake Stevenson Ltd, an independent research and consultancy company, to gather evidence about the impact of CLD on local and national outcomes in Scotland.

In order to do this we wish to explore how local authorities and their partners currently collect evidence about the impact of CLD, and any difficulties or barriers they face with this. We will explore the similarities and differences in approaches to evidence-gathering across all local authorities, and in ten areas examine more closely the evidence of the positive impact CLD is having on local and national outcomes. We are sending this survey to you as the main contact for your local authority.

We know that CLD is covered differently in each local authority area and for this reason this survey is divided into three strands to cover youthwork, adult learning and community capacity building. We recognise that in some authorities it may require more than one person to fill in the three main strands. Please could you help us by ensuring that all aspects of the survey are completed by the most appropriate person.

Many thanks for your help. Please return the survey either electronically or by post to the address shown at the end. (If you use Word to complete the survey the boxes will automatically extend to fit your response.)

Personal details

Name:

Position:

Local Authority:

Preliminary question:

What does CLD cover in your area?

1. Youthwork

a. Could you briefly describe the provision in your area for youthwork and who the main providers are (both local authority and third sector)?


b. Have specific outcomes for youthwork been set? (please mark with X)

Yes No

c. Are there set indicators to measure achievement in relation to the outcomes for youthwork? (please mark with X)

Yes No

If yes, what are these?

d. What evidence is gathered about the outcomes for youthwork?

Quantitative:

Qualitative:

e. Who is responsible for gathering the evidence?

f. How and where is evidence stored?

g. How is the evidence used for reporting purposes?

h. Is there an overall plan (at local authority/community planning partnership/CLD Partnership level) about how youthwork provision will contribute to the local Single Outcome Agreement? (please mark with X)

Yes No

i. Has the evidence been used to indicate contribution to the area’s SOA and if so in what way?

j. What barriers/difficulties have you encountered in any of the above?


k. Has your evidence been influential in any way, either in your own local authority area or more widely? Please give details.

l. Any other comments?

2. Adult Learning

a. Could you briefly describe the provision in your area for adult learning and who the main providers are (both local authority and third sector)?

b. Have specific outcomes for adult learning been set? (please mark with X)

Yes No

c. Are there set indicators to measure achievement in relation to the outcomes for adult learning? (please mark with X)

Yes No

If yes, what are these?

d. What evidence is gathered about the outcomes for adult learning?

Quantitative:

Qualitative:

e. Who is responsible for gathering the evidence?

f. How and where is evidence stored?


g. How is the evidence used for reporting purposes?

h. Is there an overall plan (at local authority/community planning partnership/CLD Partnership level) about how adult learning provision will contribute to the local Single Outcome Agreement? (please mark with X)

Yes No

i. Has the evidence been used to indicate contribution to the area’s SOA and if so in what way?

j. What barriers/difficulties have you encountered in any of the above?

k. Has your evidence been influential in any way, either in your own local authority area or more widely? Please provide details.

l. Any other comments?

3. Community Capacity Building

a. Could you briefly describe the provision in your area for community capacity building and who the main providers are (both local authority and third sector)?

b. Have specific outcomes for community capacity building been set? (please mark with X)

Yes No

c. Are there set indicators to measure achievement in relation to the outcomes for community capacity building? (please mark with X)

Yes No

If yes, what are these?

d. What evidence is gathered about the outcomes for community capacity building?

Quantitative:

Qualitative:

e. Who is responsible for gathering the evidence?

f. How and where is evidence stored?

g. How is the evidence used for reporting purposes?

h. Is there an overall plan (at local authority/community planning partnership/CLD Partnership level) about how community capacity building provision will contribute to the local Single Outcome Agreement? (please mark with X)

Yes No

i. Has the evidence been used to indicate contribution to the area’s SOA and if so in what way?

j. What barriers/difficulties have you encountered in any of the above?

k. Has your evidence been influential in any way, either in your own local authority area or more widely? Please provide details.


l. Any other comments?

If you would like to attach any evidence (protocols, procedures, frameworks, reports) you have gathered, this would be very helpful. Please send the completed survey electronically or by post to the address shown below.

Glenys Watt
Director
Blake Stevenson Ltd
1 Melville Park
Edinburgh EH28 8PJ

Tel: 0131-335-3700

Email: [email protected]