
Evaluation in OpenAIRE


Page 1: Evaluation in  OpenAIRE

8th euroCRIS Strategic Seminar, September 13th – 14th, 2010, Brussels | Mikael K. Elbæk, Technical University of Denmark

Evaluation in OpenAIRE

Page 2: Evaluation in  OpenAIRE

2

OpenAIRE – factsheet
– Open Access Infrastructure for Research in Europe
– Programme: FP7 – Research Infrastructures
– Starting date: December 1, 2009
– Duration: 36 months
– Budget: 4.1 million
– 38 partners covering all European member states

– To be reached at www.openaire.eu

Page 3: Evaluation in  OpenAIRE

FP7 OA Pilot
The European Commission launched the open access pilot in August 2008 in seven thematic research areas, and it will run until the end of FP7:

– Energy
– Environment (including Climate Change)
– Health
– Information and Communication Technologies (Cognitive Systems, Interaction, Robotics)
– Research Infrastructures (e-infrastructures)
– Science in society
– Socio-economic sciences and the humanities

Special clause 39:

“In addition to Article II.30.4, beneficiaries shall deposit an electronic copy of the published version or the final manuscript accepted for publication of a scientific publication relating to foreground published before or after the final report in an institutional or subject-based repository at the moment of publication.“

3

Page 4: Evaluation in  OpenAIRE

ERC OA guidelines
The ERC Scientific Council's Statement on Open Access of December 2006 stressed the fundamental importance of peer-review in ensuring the certification and dissemination of high-quality scientific research, as well as the importance of wide access and efficient dissemination of research results. In December 2007, the ERC Scientific Council followed this up with Guidelines for Open Access. These Guidelines state that:

– The ERC requires that all peer-reviewed publications from ERC-funded research projects be deposited on publication into an appropriate research repository where available, such as PubMed Central, ArXiv or an institutional repository, and subsequently made Open Access within 6 months of publication.

– The ERC considers it essential that primary data - which in the life sciences for example could comprise data such as nucleotide/protein sequences, macromolecular atomic coordinates and anonymized epidemiological data - are deposited to the relevant databases as soon as possible, preferably immediately after publication and in any case not later than 6 months after the date of publication.

4

Page 5: Evaluation in  OpenAIRE

European Helpdesk
– Promote the FP7 pilot and ERC OA guidelines
– National Open Access Desks (27 countries)
– Provide OA “toolkits” for
  – Researchers
  – Institutions
  – Repository managers
– Set up a 24/7 portal for deposit and search of OA publications
– Liaison with
  – Other European OA initiatives
  – Publishers
  – CRIS systems

5

Page 6: Evaluation in  OpenAIRE

Reaching out to EC members

Region 1 – North (DTU)
– Denmark (Technical University of Denmark)
– Finland (University of Helsinki)
– Sweden (National Library of Sweden)

Region 2 – South (UMINHO)
– Cyprus (University of Cyprus)
– Greece (National Documentation Center)
– Italy (CASPAR)
– Malta (Malta Council for Science & Technology)
– Portugal (University of Minho)
– Spain (FECYT)

Region 3 – East (eIFL)
– Bulgaria (Bulgarian Academy of Sciences)
– Czech Republic (Technical University of Ostrava)
– Estonia (University of Tartu)
– Hungary (HUNOR)
– Latvia (University of Latvia)
– Lithuania (Kaunas Technical University)
– Poland (ICM – University of Warsaw)
– Romania (Kosson)
– Slovakia (University Library of Bratislava)
– Slovenia (University of Ljubljana)

Region 4 – West (UGENT)
– France (Couperin)
– Germany (University of Konstanz)
– Ireland (Trinity College)
– Netherlands (Utrecht University)
– UK (SHERPA)
– Austria (University of Wien)
– Belgium (University of Gent)
– Norway (University of Tromsø)

National Open Access Desks evaluating and disseminating policies & best practices

Page 7: Evaluation in  OpenAIRE

OpenAIRE infrastructure basics
– Builds on the European OA repository infrastructure
– Researchers deposit once
  – Harvest from OpenAIRE-compliant repositories
  – OpenAIRE provides an “Orphan” repository
– Extends the DRIVER guidelines for repository managers
– Based on the DRIVER-developed D-NET software toolkit
– Access to scientific publications
  – Search, browse
  – Visualization tools
– Provides monitoring tools for
  – Document/depositing statistics
  – Usage statistics from the repository infrastructure
– Interoperation with other infrastructures
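Because researchers deposit only once, the evaluation services depend on harvesting metadata from the compliant repositories over OAI-PMH, the protocol the DRIVER guidelines already build on. A minimal harvesting sketch in Python; the endpoint URL and set name are placeholders, not actual OpenAIRE configuration:

```python
import urllib.request
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def harvest(base_url, metadata_prefix="oai_dc", oai_set=None):
    """Yield OAI-PMH records from a repository, following resumption tokens."""
    params = f"verb=ListRecords&metadataPrefix={metadata_prefix}"
    if oai_set:
        params += f"&set={oai_set}"
    while params:
        with urllib.request.urlopen(f"{base_url}?{params}") as response:
            root = ET.fromstring(response.read())
        for record in root.iter(f"{OAI}record"):
            yield record
        token = root.find(f".//{OAI}resumptionToken")
        params = (f"verb=ListRecords&resumptionToken={token.text}"
                  if token is not None and token.text else None)

# Example: print titles from a hypothetical repository's EC-funded set.
# for rec in harvest("https://repository.example.org/oai", oai_set="ec_fundedresources"):
#     title = rec.find(f".//{DC}title")
#     print(title.text if title is not None else "<no title>")
```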

7

Page 8: Evaluation in  OpenAIRE

OpenAIRE in a nutshell

OpenAIRE overall overview: functionalities and domains served (diagram)

8

Page 9: Evaluation in  OpenAIRE

CERIF Data Model

9

Page 10: Evaluation in  OpenAIRE

OpenAIRE Data Model

10

Page 11: Evaluation in  OpenAIRE

OpenAIRE Controlled Vocs

11

Diagram of controlled vocabularies and the entities/attributes they constrain:
– Nationalities → Authors (nationality)
– Countries → Organizations (country_of_origin)
– LicenseKinds → Articles (license_kind)
– Languages → Articles (language)
– DataSourceTypologies → DataSources (typology)
– FP7Subjects → Projects (via Projects_FP7subjects)

Page 12: Evaluation in  OpenAIRE

OpenAIRE Guidelines
DC DRIVER guidelines +:
– projectID – dc:relation
– accessRights – dc:rights
– embargoEndDate – dc:date
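A sketch of how these three additions could appear in a harvested record; the exact info:eu-repo value syntax should be checked against the published OpenAIRE guidelines, so the literals below are illustrative only:

```python
# Illustrative Dublin Core values for the three OpenAIRE additions to the
# DRIVER guidelines; check the published guidelines for the exact syntax.
record = {
    "dc:relation": "info:eu-repo/grantAgreement/EC/FP7/123456",   # projectID
    "dc:rights":   "info:eu-repo/semantics/embargoedAccess",      # accessRights
    "dc:date":     "info:eu-repo/date/embargoEnd/2011-03-01",     # embargoEndDate
}

def references_fp7_grant(rec):
    """Rough check: does the record carry an FP7 grant agreement reference?"""
    return rec.get("dc:relation", "").startswith("info:eu-repo/grantAgreement/EC/FP7/")

print(references_fp7_grant(record))  # True
```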

12

Page 13: Evaluation in  OpenAIRE

Measuring success of OA in FP7

Document statistics for OA deposition
– % rate of OA deposition

Usage statistics on FP7 OA publications
– Impact of OA deposited publications

One WP dedicated to the development of the usage statistics service

13

Page 14: Evaluation in  OpenAIRE

Funders requirements (EC)

Open Access evaluation: the following stats will be broken down by research area, programme, ERC, contract type (NoE, IP, etc.), country, and institution:

– total number of articles published in FP7 after August 2008
– total number of open access articles
– average number of articles per project
– total/average number of articles of SC39 projects
– number of articles still in embargo period (those in 6-month embargo, those in 12 months, those with no embargo, i.e., gold)

Project evaluation based on OA publication, broken down by research area, programme, funding (e.g., big vs. small projects), maturity of project (how long it has been running), etc.

– number of projects that have not published any articles
– number of projects in the pilot that have not published any articles

Overall compliance with the mandate (how many are, how many are not) – in relative numbers
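A sketch of how such a breakdown could be computed from harvested article metadata; the field names and categories below are assumptions for illustration, not the OpenAIRE data model:

```python
from collections import Counter
from dataclasses import dataclass
from typing import Optional

@dataclass
class Article:
    project_id: str
    research_area: str              # e.g. "Health", "Energy"
    open_access: bool
    embargo_months: Optional[int]   # None = no embargo ("gold")

def compliance_report(articles):
    """Aggregate a few of the funder-level indicators listed above (illustrative)."""
    total = len(articles)
    open_access = sum(a.open_access for a in articles)
    projects = {a.project_id for a in articles}
    return {
        "total_articles": total,
        "open_access_articles": open_access,
        "open_access_rate": open_access / total if total else 0.0,
        "articles_by_area": dict(Counter(a.research_area for a in articles)),
        "embargo_breakdown": dict(Counter(
            "gold" if a.embargo_months is None else f"{a.embargo_months} months"
            for a in articles)),
        "avg_articles_per_project": total / len(projects) if projects else 0.0,
    }
```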

14

Page 15: Evaluation in  OpenAIRE

Usage statistics

Photo: Davichi / David Oliva, http://creativecommons.org/licenses/by/2.0/

Page 16: Evaluation in  OpenAIRE

Strategy for recording usage

Based on the PIRUS model, slightly adapted to OpenAIRE

Basically, the normalization will be done centrally
– When
– What
– Who
– Which
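A sketch of what one centrally normalized usage event could carry along those four dimensions; the field names are illustrative, not the OpenAIRE exchange format:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class UsageEvent:
    """One normalized view/download event (illustrative fields)."""
    when: datetime   # time of the request
    what: str        # identifier of the publication, e.g. its OAI identifier
    who: str         # anonymized requester, e.g. a hashed IP + user agent
    which: str       # repository (data source) that reported the event

event = UsageEvent(
    when=datetime(2010, 9, 13, 10, 15),
    what="oai:repository.example.org:12345",
    who="a1b2c3d4",                    # hashed value, never the raw IP
    which="repository.example.org",
)
```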

16

Page 17: Evaluation in  OpenAIRE

Usage statistics service for FP7 publications

17

Page 18: Evaluation in  OpenAIRE

Usage statistics service components

Harvester/aggregator of usage data from OpenAIRE-compliant repositories
Service for processing usage data and producing meaningful statistics
Web and graphical tools for

– Managers
– Funders
– Researchers

The service can easily be adapted to non-FP7 publications (e.g. in DRIVER, …)
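Building on the illustrative UsageEvent above, the processing component might reduce a stream of normalized events to per-publication monthly counts, roughly like this:

```python
from collections import defaultdict

def monthly_counts(events):
    """Count events per (publication, month) from normalized usage events."""
    counts = defaultdict(int)
    for e in events:
        counts[(e.what, e.when.strftime("%Y-%m"))] += 1
    return dict(counts)

# monthly_counts([event]) -> {("oai:repository.example.org:12345", "2010-09"): 1}
```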

18

Page 19: Evaluation in  OpenAIRE

OpenAIRE compliant repositories

Definition of OpenAIRE-compliant repositories
– OpenAIRE guidelines for usage statistics

Align with current initiatives
Support major usage data formats
– Interoperability is important
Support multiple transfer protocols
OpenAIRE orphan repository (Invenio/CERN) to comply with the guidelines

19

Page 20: Evaluation in  OpenAIRE

OpenAIRE partners' expertise

UNIBI, UGOE, SURF
– National initiatives – OA-stat
– PEER project
– Knowledge Exchange

Univ. of Minho
– DSpace statistics plugin

University of Athens
– Processing of web server & application logs
– Usage statistics application in TELPlus
– Personalized applications in DRIVER & TELPlus

20

Page 21: Evaluation in  OpenAIRE

Services to repository managers & funders

21

Page 22: Evaluation in  OpenAIRE

Web graphical tool – statistics provided by publication

22

Page 23: Evaluation in  OpenAIRE

Web graphical tool – statistics provided by publication

23

Page 24: Evaluation in  OpenAIRE

Other article-level stats

Citations

Web usage

Expert ratings

Social bookmarking

Community rating

Media/blog coverage

Commenting activity

More sophisticated types of usage metrics

Tracking ‘conversations’ outside of the publisher

Reputation metrics for users

Tagging

and more…

Mark Patterson – PLoS.org - www.ape2010.eu/ppt_wednesday/15_Patterson.pdf

24

Page 25: Evaluation in  OpenAIRE

Thank you

Photo: ky_olsen, http://creativecommons.org/licenses/by/2.0/

Page 26: Evaluation in  OpenAIRE

Open Issues
Unique/persistent identifiers of publications
– Publications in different repositories
– Capturing different versions/formats of the same publication

Exclusion of robots

Repository platform-specific implementations
– Uniform approach? When?
– What happens in thematic repositories?

Volume of usage stats – OpenAIRE sets?
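Robot exclusion, for example, is usually handled by filtering out requests whose user agent matches a shared list of known crawlers before events are counted; a minimal sketch with an illustrative, non-exhaustive pattern list:

```python
import re

# Illustrative, non-exhaustive patterns; a production service would rely on a
# maintained, shared robot list so that all repositories filter consistently.
ROBOT_PATTERNS = [r"bot", r"crawler", r"spider", r"slurp", r"wget", r"curl"]
ROBOT_RE = re.compile("|".join(ROBOT_PATTERNS), re.IGNORECASE)

def is_robot(user_agent):
    """True if the request's user agent looks like an automated crawler."""
    return bool(ROBOT_RE.search(user_agent or ""))

print(is_robot("Googlebot/2.1 (+http://www.google.com/bot.html)"))     # True
print(is_robot("Mozilla/5.0 (Windows NT 6.1; rv:1.9.2) Firefox/3.6"))  # False
```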

Page 27: Evaluation in  OpenAIRE

OpenAIRE workplan & timeline

May–July 2010
– OpenAIRE internal specifications for usage data and transfer specs

November 2010
– Usage data exchange guidelines for OpenAIRE-compliant repositories
– Harvester, aggregator service

July 2011
– Usage statistics service
– Web usage statistics graphical tools

27