
Are Schools Equipped to Address Online Safety in the Curriculum and Beyond?



eLearning Papers • ISSN: 1887-1542 • www.elearningpapers.eu

n.º 28 • April 2012


In-depth

Tags: media education, online safety, school self-assessment

Authors:

Andy Phippen, School of Management, Plymouth University, UK (andy.phippen@plymouth.ac.uk)

David Wright, South West Grid for Learning Trust, UK (David.Wright@swgfl.org.uk)

Dr Tanya Ovenden-Hope, School of Social Science and Social Work, Plymouth University, UK (Tanya.ovenden-hope@plymouth.ac.uk)


This paper explores the data provided by over 1000 schools in the UK related to their online safety policy and practice. By comparing with data from the previous year, we consider the current state of practice among UK schools and analyse progress over a 12-month period.

What is clear from this analysis is that the aspects that use either technological intervention (e.g. filtering) or policy development are generally performing better than those that require long-term resource investment (such as training) or whole-school involvement (such as parental education or community understanding). Monitoring and reporting also perform badly. It is interesting to note that even with almost double the number of participating establishments, the strongest and weakest performing aspects remain almost constant across 2010 and 2011, with only slight improvement.

The analytical tool used to gather this data is now being used in pilot projects in the US and Australia. Once it is in full use in these regions, detailed analysis of international performance will be available for the first time. This presents some exciting opportunities to understand, at an international level, how schools engage with online safety and ensure protection of their pupils, staff and wider community.

1. Introduction

The issue of online safety is never out of the media and is a constant concern for schools, which have a duty of care to both staff and pupils, as well as a responsibility to ensure policy is in place to show due diligence related to different aspects of online safety. However, while the focus of much media coverage is on the sensational aspects of the issues (for example, predatory behaviour, cyberbullying), the reality of online safety in schools is far broader, ranging from technical countermeasures such as an effective password strategy and content filtering, to ensuring policy is in place to deal with incidents if they arise.

In the UK, a lack of national strategy on online safety has meant that many schools have adopted their own approaches, which the institutions themselves have identified as, in many cases, incomplete and inconsistent. A review of online issues by Tanya Byron (http://media.education.gov.uk/assets/files/pdf/s/safer%20children%20in%20a%20digital%20world%20the%202008%20byron%20review.pdf) proposed a holistic approach to online safety, comprising a broad range of issues from technical matters to wider parental and community education. It called for a “whole school” approach where all staff were involved and engaged in all aspects of online safety and provided with regular training to ensure their knowledge and practice stay up to date with this ever-changing field.

OFSTED’s Safe Use of New Technologies report (http://www.ofsted.gov.uk/resources/safe-use-of-new-technologies) built on the recommendations of the Byron review, concluding


that outstanding online safety had to have a whole school approach, including pupils, staff, governors, parents and the community in policy and practice, and did not use technology in a locked-down manner.

However, while these important policy documents were welcomed, they also presented schools with a challenge: how to transform this strategic vision of what online safety should be into operational terms.

360 degree safe [www.360safe.org.uk] was launched by South West Grid for Learning Trust [www.swgfl.org.uk] in November 2009 as a means to allow schools to self-evaluate their own online safety provision; benchmark that provision against others; identify and prioritise areas for improvement; and find advice and support to move forward. It provided a tool for schools to first understand the breadth of issues associated with online safety and then review their own performance and identify how to improve. It provided summary reports of progression, which helped all staff (not just those charged with the job of implementing an online safety policy) to understand the scope of online safety and what the school is doing about the issue.

In operationalising an online safety “vision”, the tool provided a prioritised action plan, suggesting not just what needs to be done, but also in what order it needs to be done. This is a vital bonus for teachers and managers who approach the issue of online safety for the first time, in a school which has no (or only a very rudimentary) policy.

Understanding Online Safety Policy and Practice with 360 Degree Safe Data

As well as providing a tool for schools to understand and develop their own online safety policy and practice, the tool also collects all submissions into a central database. In building a picture of practice across the UK, this resource is unique in holding data on every school that has engaged with the tool. In September 2010, the first analysis of the 360 degree safe database was published by the South West Grid for Learning (http://www.swgfl.org.uk/Staying-Safe/Content/News-Articles/Largest-ever-survey-of-E-Safety-in-schools-reveals) based upon data returned from 547 establishments across England. The tool has been

adopted by many more organisations since this publication, and the data presented here is based upon returns from 1055 educational establishments.

In this paper we present analysis “a year on” from this first report, comparing development over the 12-month period and, for the first time, allowing a comparison of progress to understand how institutions, and online safety policy and practice, have developed in the UK up to September 2011.

2. Methodology

An overview of the 360 structure, detailing the aspects covered, can be found at http://360safe.org.uk/Files/Documents/360-degree-safe-Structure-Map. In total 28 aspects are detailed by the tool, from technical measures such as filtering and technical security, through policy development, to training and community education. For each aspect a school can give itself a rating from 5 to 1 (5 being worst, 1 being best). For each level in each aspect, clear definitions are provided to help the self review process. Establishments carry out the self review via a web interface, and submitted data is stored in a relational database structure which holds the information in a collection of related “tables”, each table relating to a specific data element within the system. The three data tables which provide the core for analysis relate to establishments, 360 degree safe aspects, and individual ratings, which detail an entry that an establishment has made against a specific aspect.
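The three-table structure described above might be sketched as follows; the table and column names here are illustrative assumptions, not the tool’s actual schema:

```python
import sqlite3

# Illustrative sketch of the 360 degree safe data structure described in
# the text; table and column names are assumptions, not the real schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE establishments (
    id     INTEGER PRIMARY KEY,
    name   TEXT,
    region TEXT,   -- local authority / region
    phase  TEXT    -- primary, secondary, post-16, nursery
);
CREATE TABLE aspects (
    id   INTEGER PRIMARY KEY,
    name TEXT      -- e.g. Filtering, Staff training
);
CREATE TABLE ratings (
    establishment_id INTEGER REFERENCES establishments(id),
    aspect_id        INTEGER REFERENCES aspects(id),
    rating           INTEGER,  -- 5 (worst) to 1 (best)
    entered_at       TEXT      -- time/date of entry; allows repeat reviews
);
""")
```

Allowing multiple `ratings` rows per establishment/aspect pair, each time-stamped, is what lets the database record re-reviews and thus track improvement over time.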

Each establishment’s “profile” comprises a number of entries in the rating table, each related to a specific aspect. It is possible for an establishment to have more than one entry in the rating table associated with a specific aspect, which would reflect that they are using the tool for school improvement around online safety practice. An establishment’s profile will also reflect its current stage in the self review.

Figure 1: 360 data structure

Given the relational structure of the 360 degree safe data, the primary approach to analysis is through the querying of this data structure using SQL. In addition, summary data was loaded into Microsoft Excel for further statistical analysis and graphing.
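A minimal sketch of the kind of SQL querying described, using an in-memory SQLite database with invented ratings (the real database and its schema are not public, so the names and values here are assumptions):

```python
import sqlite3

# Hypothetical data illustrating the kind of SQL analysis described;
# the values are invented, not the paper's data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ratings (establishment TEXT, aspect TEXT, rating INTEGER)")
conn.executemany(
    "INSERT INTO ratings VALUES (?, ?, ?)",
    [("A", "Filtering", 2), ("B", "Filtering", 3),
     ("A", "Staff training", 4), ("B", "Staff training", 4)],
)
# Average rating per aspect; since 1 is the best rating, a lower
# average indicates stronger practice.
rows = conn.execute(
    "SELECT aspect, AVG(rating) FROM ratings GROUP BY aspect ORDER BY AVG(rating)"
).fetchall()
print(rows)  # [('Filtering', 2.5), ('Staff training', 4.0)]
```

Summary rows like these are what would then be exported to Excel for further statistical analysis and graphing.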

Analysis of the data focuses on establishments’ self review of their online safety policy and practice, exploring their ratings against the 28 aspects of 360 degree safe. Aspect exploration allows the measurement of degrees of progression and improvement in the self review, and identifies those aspects where, in general, policy and practice among UK educational establishments require support to deliver further progress.

It should be noted that the international data (US and Australia) has a slightly different, extended structure for the review aspects; this will be discussed in more detail later in this report.

It is acknowledged that the data being explored is self reviewed: the establishments give themselves ratings against the aspects and level definitions. However, self review is well established practice within the UK school system and level descriptors are very clearly defined. In addition, accreditation visits to date have demonstrated that in the instances of inspection that have occurred, self review ratings have been generally accurate. Indeed, many schools are generally conservative with their assessments. We also now have a sufficiently large database that “anomalous” returns are very apparent and can be followed up with the school or its local authority.

3. Details of the Establishments Analysed

The vast majority of the data is drawn from English schools, although there are a few from Wales. There are almost three times as many schools now registered to use the tool than there were last year. However, we should acknowledge that not all schools who have registered have embarked on their self review.

Based upon the local authority specified by each establishment, figure 2 details the proportion of establishments from different regions. As we can see, while there is a large proportion from the south west, over half are from other regions. The Midlands also has a strong representation, and there are also good spreads across other regions. SCE refers to Service Children’s Education, an agency of the UK’s Ministry of Defence¹, which provides education for MoD employees’ children overseas.

The “phase” of the establishment responses shows the breakdown between primary, secondary, post-16 and nursery. As can be seen from figure 3, the majority of those registered are from primary schools. It is encouraging, given last year’s analysis showing that primary schools consistently underperform against secondary schools in online safety policy and practice, that the largest area of growth is registrations from that phase of school. While the number of secondary schools has more than doubled, the number of primaries has more than trebled in 12 months.

1 http://www.sceschools.com/home.php

Figure 2: Establishment geography

Figure 3: Establishment “phase”


4. Analysis of Aspect Performance

Top level analysis of practice and policy performance explores responses to different aspects given by each establishment. The initial analysis explores the “best” rating any establishment has provided, given this reflects where establishments currently stand in their self review. However, given that 360 degree safe is intended for use to improve as well as evaluate practice, a feature of the 360 degree safe database is that it records any evaluation on a particular aspect made by an establishment, at the time and date of entry. This data can be used to explore which areas are showing improvement in schools.

It should also be noted that it is not necessary for an establishment to have completed the full self review to have its data logged in the tool. Therefore, different aspects have been rated by different numbers of establishments. In total, 559 establishments from our population have carried out the full self review, and 496 additional schools have reviewed at least one aspect. Of those establishments that have not completed a full review, figure 4 illustrates the variety of levels of completion to date. It details the number of establishments that have achieved each given number of aspects to show the range of completion.

Figure 4: The number of aspects completed by any establishment that has not completed the full review

This breakdown shows a spread of responses from those still in the early stages of self review to those nearing completion of the full set of aspects. It is interesting to note that, as with last year, there is a large concentration of establishments who have completed 15 aspects. We would observe that, if the tool were being used in a linear manner, the 16th aspect is Password Security, arguably the first of the aspects being reviewed that might require specialist technical input to make a judgement on the levels. We might hypothesise (but cannot test at the present time) that this may be a reason why some reviews seem to stall at this stage.

Figure 5: Aspect frequency

In further exploring which aspects are more “popular” with establishments, we can examine each aspect and the number of establishments who have completed a self review of that element. This is detailed in figure 5 and again supports our hypothesis that aspects requiring technical input (or those following aspects requiring technical input) are less “popular” than other aspects. We can see the two largest drops in aspect completions are around Password Security and Technical Security.

The aspects are ordered as they appear in the self review tool, and the pattern presented shows that most establishments undertake a linear approach to completing the self review. It should be noted that the tool can be used in a non-linear manner, but figure 5 suggests that this is not the case in the majority of establishments.

We will now move from the top level quantity-based returns to look in more detail at each aspect presented, in order to explore areas of strength and weakness across our establishments. We present this data as an approximate “state of the nation” report related to online safety policy and practice in the UK. However, we acknowledge that the respondents who have embarked on an online safety self review are likely to be more engaged “early adopters” than those who have not. Therefore, we might make the assumption that the data presented may be better than average if it were possible to analyse performance in all educational establishments in the country. However, in comparing the results from this year’s analysis with those from the previous year, we will highlight a fairly consistent pattern, even with the addition of a significant number of new establishments. Therefore, this year we can say with higher confidence than last that this does represent a national picture.

Each aspect can be rated by the self reviewing establishments on a progressive maturity scale from 5 (lowest rating) to 1 (highest). In all cases, analysis of the aspect ratings shows a cross-establishment maximum rating of 1 and minimum of 5. Therefore, in order to determine cross-establishment performance, average scores for each aspect are used to measure areas of strength and weakness in online safety policy and practice. Figure 6 illustrates overall averages across aspects:
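The averaging step can be illustrated as follows, with invented ratings; since 1 is the best rating on this scale, a lower mean indicates stronger practice:

```python
from statistics import mean

# Invented self-review ratings per aspect (5 worst ... 1 best).
ratings = {
    "Filtering":               [2, 3, 2, 3],
    "Policy Scope":            [3, 2, 3, 3],
    "Community understanding": [4, 4, 5, 4],
}
averages = {aspect: mean(vals) for aspect, vals in ratings.items()}
# Lower mean = stronger practice, so sort ascending for strongest first.
ranked = sorted(averages, key=averages.get)
print(ranked[0], ranked[-1])  # strongest aspect, weakest aspect
```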

Figure 6: Average ratings per aspect


The top 5 aspects across establishments are exactly the same as last year. In 2010 the strongest aspects were:

• Filtering (2.57)

• Acceptable Use Policies (2.78)

• Policy Scope (2.8)

• Digital and video images (2.93)

• Policy development (3.02)

In 2011 they are:

• Filtering (2.5)

• Policy Scope (2.65)

• Acceptable Use Policies (2.71)

• Digital and video images (2.83)

• Policy development (2.88)

There are two points to note in comparing the two sets of aspects. Firstly, there has been a slight change in that Policy Scope is now ahead of Acceptable Use Policies. More significantly, all of the aspects have improved on last year’s scores. While the increases are not huge, all aspects have improved by some degree. And while that is encouraging, as we remarked upon last year the strongest aspects are all either documentary (i.e. putting a policy in place, possibly derived from a local authority or regional broadband consortium) or technical in nature (which again is generally provided by an outside agency or off-the-shelf solution).

We see a similar established trend with the five lowest rated aspects. As we identified last year, these all focus on education and long-term resource commitment. In 2010 the weakest aspects were:

• Community understanding (4.03)

• Governor training (4.03)

• Monitoring the impact of policy and practice (3.96)

• E-Safety Committee (3.94)

• Staff training (3.84)

And this is the same in 2011:

• Community understanding (4)

• Governor training (3.93)

• Monitoring the impact of the e-safety policy and practice (3.9)

• E-Safety Committee (3.82)

• Staff training (3.76)

All of these aspects require long-term development and commitment of resources (for example, regular and up to date training or monitoring). As with the strongest aspects, all have improved to some degree, which is encouraging to see. It is interesting to note that even with more than double the population size, the strongest and weakest aspects have remained very similar. This, again, gives us confidence in the representative nature of our population data and the consistency of the self review process.

Standard deviation is also used to explore the “spread” of ratings in the self review process. This is a useful measure to consider whether an aspect is consistently strong or weak across all schools, or whether there is variance in the evaluation. A high standard deviation would mean that different establishments were using a broad range of scores for self review. Figure 7 shows the standard deviations across the aspects.

Figure 7: Standard deviation of aspects

As with last year, “Filtering” has a high average and low standard deviation, which shows that filtering is consistently highly rated across establishments. Also similarly to last year, other “strong” aspects have a broader standard deviation. For example, Digital images and video and Policy scope show greater variance across schools.

In considering the weakest aspects, we can see that both Staff Training and Monitoring and Reporting Incidents have comparatively narrow standard deviations, which would suggest that these aspects have consistently poor performance across schools. Another of the weaker areas of practice, Information Literacy, also has a low deviation, again reflecting consistently poor performance.
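The interpretation of standard deviation used here can be illustrated with two invented rating lists, one consistently scored and one widely spread:

```python
from statistics import pstdev

# Invented ratings: one aspect rated consistently across establishments,
# one with a broad range of self-review scores.
consistent = [2, 2, 3, 2, 3]   # e.g. Filtering: similar scores everywhere
varied     = [1, 5, 2, 4, 3]   # practice differs greatly between schools
print(round(pstdev(consistent), 2), round(pstdev(varied), 2))  # 0.49 1.41
```

A low deviation combined with a poor average is the signature of consistently weak practice, which is the pattern reported for Staff Training and Monitoring and Reporting Incidents.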

5. 2010/2011 comparison

While we have used some comparison to last year’s data to show that there is a consistency and robustness to our data set, we now consider the comparison in more detail. As has been discussed above, the 2011 data set included considerably more establishments than the previous year. Figure 8 shows the comparison between the two sets of averages, revealing a very similar pattern but an improvement across all aspects.

We can compare this and last year’s scores and see there is variation in the level of change. The “most improved” aspects are as follows:

• Governors (0.16)

• E-Safety Committee (0.14)

• Policy development (0.13)

• Policy Scope (0.13)

• The contribution of young people (0.12)
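A ranking like the one above can be reproduced from per-aspect averages. The 2010 values below are partly invented for illustration, and deltas computed from rounded averages may differ slightly from those reported, which derive from unrounded values:

```python
# Invented/illustrative average ratings; since ratings run 5 (worst) to
# 1 (best), a fall in the mean is an improvement, so delta = 2010 - 2011.
avg_2010 = {"Governors": 3.60, "Policy development": 3.02, "Filtering": 2.57}
avg_2011 = {"Governors": 3.44, "Policy development": 2.88, "Filtering": 2.50}

improvement = {a: round(avg_2010[a] - avg_2011[a], 2) for a in avg_2010}
most_improved = sorted(improvement, key=improvement.get, reverse=True)
print(most_improved[0])  # aspect with the largest year-on-year gain
```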

It is positive to see improvement in some areas outside of the “policy or technical” areas. In particular, the improvement in the role of Governors in the online safety context is encouraging, given their stewardship of school strategy and the potential for more aware governors to ask questions of senior management around these issues.

However, the least improved areas are:

• Information literacy (0.01)

• Parental education (0.01)

• Community understanding (0.03)

Figure 8: 2010/2011 average rating comparison


• M&R Incidents (0.04)

• Personal data (0.04)

The majority, again, are those that require long-term investment. Recent research around the abuse of professionals by students and other members of the school community (http://www.swgfl.org.uk/Staying-Safe/Content/News-Articles/The-Online-Abuse-of-Professionals) highlights how important strong community and parental engagement are in matters of online safety. However, our data would suggest these are still weak areas showing little sign of improvement. We would also observe that personal data is a key area of concern for those working with schools, where establishments might be opening themselves up to potential data protection prosecution. Our data shows that this is an area of weakness that is not improving.

Figure 9 shows a comparison between 2010 and 2011 standard deviations. Again we see a consistent shape to the spread of data, and this time greater variance in increases and decreases in scoring. A change in standard deviation does not mean something has become “better” or “worse”, but it can show whether practice has become more dispersed.

For example, we can see a slight increase in spread for filtering, personal data and information literacy, while observing a reduction for staff training and parental education, both areas of concern from the broad exploration of online safety. Community understanding, again highlighted as an area of concern, has also experienced a narrowing of standard deviation (therefore another area of consistently poor practice).

6. Primary Improvement, Secondary Stationary

As previously, the comparison of performance for primary and secondary establishments presents us with some very interesting comparisons. Figure 10 shows the difference between average ratings in primary and secondary populations in 2011.

We can see that, in general, primary establishments still report their performance as consistently weaker than their secondary counterparts. This is not surprising given the difference in resources available in a lot of primary settings.

However, one of the most interesting things to draw from this comparison is that primary schools are “catching up” in terms of their policy and practice.

Figure 9: 2010/2011 standard deviation comparison


Figure 10: Primary/secondary comparison 2011

Figure 11: Primary/secondary comparison 2010


If we consider the 2010 data, in some aspects the difference in average rating was more than a whole level:

• Whole School (1.5 difference)

• Community understanding (1.23)

• Mobile phones and personal hand held devices (0.96)

• Password security (0.93)

• Technical Security (0.81)

However, with the 2011 data these differences have greatly reduced:

• Mobile phones and personal hand held devices (0.78)

• Password security (0.64)

• Email, chat, social networking, instant messaging (0.54)

• E-safety education (0.46)

• Technical Security (0.42)

If we break the 2010/2011 comparison down between primary and secondary schools, as detailed in figures 12 and 13, we can see clearly that there is a far more dramatic increase in performance in primary schools.

In the strongest areas of improvement, primary schools have improved by almost a quarter of a level over the last year:

• Whole School (0.28)

• Technical Security (0.26)

• Professional standards (0.26)

• Governors (0.23)

• Password security (0.22)

In contrast, secondary schools, when the data is isolated, show little, if any, improvement. In some cases, there has been a reduction in performance:

• Technical Security (-0.13)

• Professional standards (-0.11)

• Governor training (-0.09)

• Password security (-0.07)

• Information literacy (-0.07)

• Community understanding (-0.07)
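The phase-level year-on-year deltas discussed in this section can be computed by grouping ratings by phase and year; the records below are invented illustrations:

```python
from collections import defaultdict
from statistics import mean

# Invented (phase, year, aspect, rating) records; ratings run 5 (worst)
# to 1 (best), so a falling mean is an improvement.
records = [
    ("primary",   2010, "Technical Security", 4.2),
    ("primary",   2011, "Technical Security", 3.9),
    ("secondary", 2010, "Technical Security", 3.0),
    ("secondary", 2011, "Technical Security", 3.1),
]
by_key = defaultdict(list)
for phase, year, aspect, rating in records:
    by_key[(phase, year)].append(rating)

for phase in ("primary", "secondary"):
    delta = mean(by_key[(phase, 2010)]) - mean(by_key[(phase, 2011)])
    print(phase, round(delta, 2))  # positive = improvement, negative = decline
```

With this sign convention, the paper’s pattern appears as positive deltas for primaries and near-zero or negative deltas for secondaries.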

Figure 12: Comparison of primary school averages 2010 - 2011


Figure 13: Comparison of secondary school averages 2010-2011

7. Conclusions

This paper has explored a number of aspects of the data provided by over 1000 schools in the UK related to their online safety policy and practice. By comparing with similar analysis from the previous year, we were able both to consider the current state of practice among UK schools and to analyse progress over a 12-month period.

What is clear from this analysis is that those aspects that use either technological intervention (i.e. filtering) or policy development generally perform better than those aspects that require long-term resource investment (such as training) or whole-school involvement (such as parental education or community understanding). Monitoring and reporting also perform badly. It is interesting to note that even with an almost doubling in the number of participating establishments, the strongest and weakest performing aspects remain almost constant across 2010 and 2011, with only slight improvement.

However, more in-depth analysis of the data shows a more interesting picture, presenting evidence that primary schools are clearly improving in their performance, while secondaries are remaining stationary or, in some cases, showing a slight degrading in performance.

This analysis does, however, only present a very high level overview of what is possible with this unique resource. Further analysis is possible at any level of comparison, from a national picture to regional analysis and even consideration of different institutions in the same area. Since this analysis was performed, significantly more schools have engaged with the tool, with almost 1,000 having now carried out a full profile. In addition, the tool is now being piloted in the US and Australia, through the Generation Safe project [http://generationsafe.ikeepsafe.org]. Once this tool is in full use in these regions, detailed analysis of international performance will be possible for the first time. This presents some exciting opportunities for understanding how schools internationally engage with online safety and ensure protection of their pupils, staff and wider community.

Copyrights

The texts published in this journal, unless otherwise indicated, are subject to a Creative Commons Attribution-Noncommercial-NoDerivativeWorks 3.0 Unported licence. They may be copied, distributed and broadcast provided that the author and the e-journal that publishes them, eLearning Papers, are cited. Commercial use and derivative works are not permitted. The full licence can be consulted on http://creativecommons.org/licenses/by-nc-nd/3.0/

Edition and production
Name of the publication: eLearning Papers
ISSN: 1887-1542
Publisher: elearningeuropa.info
Edited by: P.A.U. Education, S.L.
Postal address: c/Muntaner 262, 3r, 08021 Barcelona (Spain)
Phone: +34 933 670 400
Email: [email protected]
Internet: www.elearningpapers.eu