Juergen Wastl
Head of Research Information
Research Strategy Office, Cambridge University
Researchfish Interoperability Pilot: Institutional Perspective
Agenda
• Introduction
• Need for Research Information
• Need for interoperability from the HEI perspective
• Cambridge procedure and contribution (and the other HEIs' procedures)
• Challenges for systems interoperability, pre- and post-pilot
• Phase 1 lessons learned
• Phase 2 and the next submission window
Introduction – the Research Information environment
[Diagram] The Research Information team sits at the centre of the environment, connecting:
• Research Office
• Research Strategy Office
• Grant reporting (e.g. Researchfish)
• HESA (statistics return)
• REF return
• Source systems: HR, CRIS, Grants, DSpace
• Academics
• Administrators/co-ordinators
• University Library
• Senior Academic Committees
Need for Research Information
• Managing research portfolio
• Managing compliance & reporting
• Benchmarking
• Analysing research portfolio
Mind the Map: Cambridge Research Information landscape
https://figshare.com/articles/Mind_the_Research_Information_Map/1507607
Researchfish in our Research Information landscape
Need for Research Information
Previous (and ongoing) challenges and opportunities
• Quality of underpinning data
• Curation
Emerging challenges
• External reporting demands and timelines (Researchfish, REF) e.g. OA requirements
• Communication and academic buy-in
Need for Interoperability from a HEI perspective
• Streamline and make reporting as efficient as possible
• ‘enter data once, use multiple times’
HOWEVER
• Database models and structures are often incompatible (historic reasons, different focus)
‘Square of opposition’ – the same entities carry different names in each system:
HEI       Researchfish
PI        Academic
Award     Grant
‘Square of opposition’ – worked example
HEI: PI jw312, award RG12345
Researchfish: Academic, grant ES/N013891/1
The same person also appears under different IDs across systems, e.g. M Parker: 93353, 494 and S-436072.
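The identifier mismatch in the example above can be sketched as a toy crosswalk; the identifiers come from the slide, but all field names and system assignments below are assumptions for illustration:

```python
# Toy illustration of the 'square of opposition': the same person and
# award carry different identifiers in the HEI systems and Researchfish.
# Field names and which ID belongs to which system are assumptions.

# One PI, three system-specific person IDs (from the slide's example)
person_crosswalk = {
    "jw312": {                         # institutional user ID
        "hr_id": "93353",              # e.g. HR system
        "cris_id": "494",              # e.g. CRIS
        "researchfish_id": "S-436072",
    }
}

# One award, two reference schemes
award_crosswalk = {
    "RG12345": {                       # internal HEI award reference
        "funder_ref": "ES/N013891/1",  # funder-side grant reference
    }
}

def to_researchfish(user_id: str, award_ref: str) -> dict:
    """Translate HEI-side identifiers into their Researchfish-side equivalents."""
    return {
        "academic_id": person_crosswalk[user_id]["researchfish_id"],
        "grant_ref": award_crosswalk[award_ref]["funder_ref"],
    }

print(to_researchfish("jw312", "RG12345"))
# {'academic_id': 'S-436072', 'grant_ref': 'ES/N013891/1'}
```

Without a curated crosswalk like this, every link between systems has to be re-derived by hand.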
Need for Interoperability from a HEI perspective
How to link effectively?
• Common standard sets and definitions
• Curated data

Aim:
• reduce errors
• reduce frustration
• reduce duplication of effort

Problem:
• incoming/outgoing staff
• award-based vs user-based records
• user (PI): academic vs student
‘Square of opposition’ – squared
HEI       Researchfish
PI        Academic
Award     Grant
HEIs      Funders
‘Square of opposition’
HEI           Researchfish
PI            Academic
Award         Grant
HEIs          Funders
Publications  Outcomes
Phase 1 of the Interoperability Pilot
Criteria
Awards: RCUK, CRUK, NC3Rs, Academy of Medical Sciences (based on award reference number in Researchfish)
Publications:
• Journal article or conference proceedings
• DOI/PubMed ID
• Published 2014 onwards
Links:
• between Awards and Publications
• De-duplicated (csv/Excel format)
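As a hedged sketch, the publication criteria above could be expressed as a simple filter; the record field names ('type', 'doi', 'pmid', 'year') are assumptions for illustration, not the pilot's actual schema:

```python
# Hedged sketch: apply the phase 1 publication criteria to a list of
# records. Field names are assumptions, not the pilot's real schema.

ELIGIBLE_TYPES = {"journal-article", "conference-proceeding"}

def meets_phase1_criteria(pub: dict) -> bool:
    """Journal article or conference proceeding, with a DOI or PubMed ID,
    published 2014 onwards."""
    return bool(
        pub.get("type") in ELIGIBLE_TYPES
        and (pub.get("doi") or pub.get("pmid"))
        and pub.get("year", 0) >= 2014
    )

pubs = [
    {"type": "journal-article", "doi": "10.1000/example", "year": 2015},
    {"type": "book-chapter", "doi": "10.1000/chapter", "year": 2015},
    {"type": "journal-article", "doi": None, "pmid": None, "year": 2016},
    {"type": "conference-proceeding", "pmid": "12345678", "year": 2013},
]

print([meets_phase1_criteria(p) for p in pubs])  # [True, False, False, False]
```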
Cambridge procedure and contribution
Researchfish in Cambridge:
2014: 2,400 awards
2015: 4,800 awards
21 funders, 3,300 PIs…

Our approach for phase 1 – focus on:
1) a select Group/Department, not the entire institution
2) quality in this subset
Cambridge procedure and contribution
1) Curate Award references in Symplectic Elements
2) Backfill EXISTING links (publication-award) via existing sources (GtR, Dimensions, manual link in Symplectic Elements)
3) Create NEW links (publication-award) in Symplectic Elements
4) Verify and de-duplicate
5) Format for transfer to Researchfish
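Steps 4 and 5 above can be sketched as follows; the column names and exact exchange format are assumptions (the pilot's real csv/Excel layout may differ), and the second award reference is a dummy:

```python
# Hedged sketch of steps 4-5: de-duplicate (award, publication) link
# pairs gathered from multiple sources and write them out as CSV for
# transfer. Column names are assumptions; "XX/X000001/1" is a dummy ref.
import csv
import io

links = [
    ("ES/N013891/1", "10.1000/xyz123"),   # funder award ref, DOI
    ("ES/N013891/1", "10.1000/xyz123"),   # duplicate from a second source
    ("XX/X000001/1", "10.1000/abc456"),
]

# 4) Verify and de-duplicate, preserving first-seen order
unique_links = list(dict.fromkeys(links))

# 5) Format for transfer (csv/Excel-friendly)
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["award_reference", "publication_doi"])
writer.writerows(unique_links)
print(buf.getvalue())
```

De-duplicating before transfer matters because the same link is often backfilled from several sources (GtR, Dimensions, manual entry).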
Lessons learned BEFORE upload
• Heavy manual workload: centre (60%), departmental admin (30%) and PIs (approx. 15 people; 10%)
• Curation (Awards) time consuming and challenging
• Verifying Awards was also challenging (free text fields, supplements, transfers, large awards (e.g. STFC) split in Researchfish)
• Verifying PI names challenging (duplicates, name variants)
• Challenging timeline
Lessons learned AFTER upload
• Still duplicates due to errors in publication metadata
• Difficult to disentangle shifting datasets
• Nevertheless, quality data and links were delivered; the aim of the pilot was proven
• Communication is critical; needs to be concise and clear.
Challenges & opportunities for systems interoperability
Data curation is essential on all sides:
common sources of IDs for award reference numbers, DOIs, etc.
Linking publications to grants via OA workflow:
deposit on acceptance has academics’ attention and the ‘enter once, (re)use multiple times’ message could be showcased here
ORCID as facilitator
HEI           Researchfish
PI            Academic
Award         Grant
HEIs          Funders
Publications  Outcomes
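One way to read 'ORCID as facilitator' is that, where both sides store an ORCID iD, person records can be joined on it rather than on system-specific IDs. A minimal sketch, with hypothetical records and field names (the ORCID shown is ORCID's published example iD):

```python
# Hedged sketch: join HEI and Researchfish person records on a shared
# ORCID iD instead of system-specific identifiers. All records and
# field names here are hypothetical.

hei_people = [
    {"user_id": "jw312", "orcid": "0000-0002-1825-0097"},
]
researchfish_people = [
    {"rf_id": "S-436072", "orcid": "0000-0002-1825-0097"},
]

# Index one side by ORCID, then match the other side against it
rf_by_orcid = {p["orcid"]: p for p in researchfish_people}

matches = [
    (p["user_id"], rf_by_orcid[p["orcid"]]["rf_id"])
    for p in hei_people
    if p["orcid"] in rf_by_orcid
]
print(matches)  # [('jw312', 'S-436072')]
```

A single, researcher-owned identifier sidesteps the name-variant and duplicate problems noted in the lessons learned.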
Summary of phase 1 lessons learned
• Communication
  o Challenging in a fast-moving environment (e.g. the submission period with ORCID available in RF, change in response codes)
• Data cuts (moving targets)
• Researchers' view and fixed mindset
• Curation of data
Interoperability
• can reduce time spent on reporting
• can increase the efficiency of the process
• can improve overall reporting…
• … and with the new Je-S system could offer even more
Phase 2 and the next submission window
Challenge for Cambridge:
• Entire institution, not just a single unit
• Communication
• Timescale
The expectation is more efficient de-duplication and re-use
Thank you
Let’s Talk – Interoperability between University CRIS/IR and Researchfish: a case study from the UK
Clements A, Reddick G, Viney I, McCutcheon V, Toon J, Macandrew H, McArdle I, Collet S, Wastl J
https://www.repository.cam.ac.uk/handle/1810/257357
http://hdl.handle.net/11366/492
www.research-information.admin.cam.ac.uk