
The Many Faces of Embedded Librarianship: How do we Evaluate Effectiveness?



Page 1

The Many Faces of Embedded Librarianship: How do we Evaluate Effectiveness?

Lindsay Blake, MLIS, AHIP
Darra Ballance, MLIS, AHIP
Maryska Connolly-Brown, MLIS
Kathy Davies, MLS
Julie K. Gaines, MLIS, AHIP
Kim Mears, MLIS, AHIP
Peter Shipman, MLIS

Page 2

Objectives

• Project Overview
• Survey of Embedded Librarians
• Creating a tool for embedded program evaluation
• Where are we going now?

Page 3

Project Overview

• Phase One
  – Review the literature on embedded librarians in universities, focusing on health sciences
  – Survey librarians for information on embedded librarianship, including any current evaluations
• Phase Two
  – Create an assessment instrument to evaluate embedded programs
  – Validate and pilot the evaluation instrument

Page 4

Timeline

Date | Task | Responsible Party
August 2013 | Completion of literature review | Darra Ballance and Julie Gaines
September – November 2013 | Creation of the survey on embedded librarians | Lindsay, Kim, Kathy, Peter, Maryska, Darra, and Julie
November – December 2013 | Distribution of the survey to listservs | Lindsay Blake
January 2014 | Compilation and analysis of survey results | Lindsay, Kim, Kathy, Peter, Maryska, Darra, and Julie
February – May 2014 | Creation of evaluation instrument for embedded programs | Lindsay, Kim, Kathy, Peter, Maryska, Darra, and Julie
June – August 2014 | Validation of evaluation instrument | Lindsay, Kim, Kathy, Peter, Maryska, Darra, and Julie
August – September 2014 | Piloting of instrument | Lindsay, Kim, Kathy, Peter, Maryska, Darra, and Julie
October 2014 | Reporting of results | Lindsay, Kim, Kathy, Peter, Maryska, Darra, and Julie

Page 5

Survey of Embedded Librarians

• Based on your professional responsibilities, do you consider yourself to be an embedded librarian?

Embedded Librarian: "Embedded Librarianship is a distinctive innovation that moves the librarians out of the library...forming a strong working relationship between the librarian and a group or team of people who need the librarian's information expertise." (Shumaker, 2012)

Results: 72% Yes, 28% No

Page 6

Demographics

[Bar chart: number of librarians by years as a librarian and years embedded, in ranges 0-5, 6-10, 11-15, 16-20, and 21+]

Institution type (pie chart):
• Academic Medical Center: 29%
• Healthcare System: 11%
• Academic Medical Center and Healthcare System: 21%
• Specialized Hospital or Clinic: 3%
• University or College: 27%
• Other: 8%

Page 7

Where are you Embedded?

• Dentistry: 9
• Medicine: 70
• Nursing: 61
• Pharmacy: 16
• Hospital or Clinic: 52
• Allied Health Sciences: 32

Page 8

Patron groups

• Faculty: 139
• Student: 91
• Medical Student (1st or 2nd year): 27
• Medical Student (3rd or 4th year): 40
• Medical Resident: 59
• Researchers: 93
• Clinicians: 104
• Other: 43

Page 9

Where do you spend most of your time?

[Pie chart of time spent by patron group: Faculty, Student, Medical Student 1st or 2nd year, Medical Student 3rd or 4th year, Medical Resident, Researcher, Clinician, Other]

Page 10

Embedded Activities

• Teaching Basic Search Skills: 156
• Teaching Advanced Search Skills: 146
• Assisting Students: 101
• Performing Complex Searches: 162
• Partnering with Faculty on Grants/Articles: 94
• Co-Teaching with Faculty: 84
• Co-locating in a Dept: 30
• Attending Morning Report: 30
• Attending Morning Rounds: 28
• Curriculum Development: 84
• College Committees: 67
• Other: 43

Page 11

Evaluations of Embedded Programs

• Student or Professional Role
• Services or Information Products Used
• Quality of Library Service
• Changes in Information Seeking Behavior
• Impact of Library Service on Decision Making
• Impact of Library Service in Research Practice
• Impact for Future Work
• Desired Access to Future Services or Information Products
• Other

Page 12

Evaluation Methods

• Saved Email Correspondence
• Written Details of Experience
• Informal Collection of Experiences
• Hard Copy Tally
• Software Tracking of Contacts
• Student Performance
• Student Exam Questions
• Evaluation within CMS
• Pretest/Posttest (see the sketch below)
• Questionnaire or Survey
• Focus Groups
• Other
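As a hypothetical illustration of the pretest/posttest method listed above, the sketch below compares matched scores with a paired t-test. The numbers are invented for illustration only and are not data from this survey or from any embedded program.

```python
# A minimal sketch of a pretest/posttest comparison, one of the evaluation
# methods listed above. All scores here are hypothetical, not project data.
import numpy as np
from scipy import stats

pre = np.array([55, 60, 48, 70, 62, 58])   # hypothetical pretest scores
post = np.array([68, 72, 60, 75, 70, 66])  # hypothetical posttest scores

mean_gain = (post - pre).mean()               # average improvement per learner
t_stat, p_value = stats.ttest_rel(post, pre)  # paired t-test on matched scores

print(f"Mean gain: {mean_gain:.1f} points")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```

A questionnaire or exam-question analysis could follow the same pattern, swapping in whatever matched scores the program already collects.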

Page 13

Creating an Evaluation Tool

• Bi-monthly meetings
  – Began with discussions of what we wanted out of this instrument
  – Formed an overarching goal and objectives
• Areas of focus:
  – Survey instrument: ease of use, familiarity
  – Use information already collected in the library (low-hanging fruit)
  – Create an evaluation toolbox: student, faculty, and clinician surveys plus library data

Page 14

Lessons Learned

• Discussion is good, but you need to start with a goal and objectives.

• Look for low-hanging fruit: what are you already doing that can be used?

• Validation has many facets; use as many as you can. Internal consistency is one (see the sketch below).

• Have others look at your survey, evaluation instrument, test, etc.
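Internal consistency is commonly summarized with Cronbach's alpha. The sketch below is a minimal, hypothetical illustration of that calculation in Python; the response matrix and the cronbach_alpha helper are ours for illustration and are not part of the project's instrument.

```python
# A minimal sketch of an internal-consistency check (Cronbach's alpha).
# The response matrix below is hypothetical illustrative data, not results
# from this project's survey.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_var = scores.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Five hypothetical respondents answering four Likert-style items (1-5)
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```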

Page 15

Future Directions

• Finish creation of the evaluation tool in survey format
• Create focus groups to review the survey
  – Students, faculty, and clinicians across campus
• Send the survey to experts for validation purposes
  – Experts in evaluation, librarianship, and embedded librarianship
• Pilot the evaluation at Georgia Regents University
• Expand to a national evaluation

Page 16

Questions?

[Photo of the project team]