
CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)


Video of CrowdAsk here: https://www.youtube.com/watch?v=-kaNIPJ82yA


Page 1: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

A Crowdsourced Library Help System

Tao Zhang, PhD

Digital User Experience Specialist

Purdue University Libraries

[email protected]

Internet Librarian 2014

Funded by 2013 IMLS Sparks! Ignition Grant

https://github.com/crowdask/crowdask

Ilana Stonebraker

Business Information Specialist

Purdue University Libraries

[email protected]

Page 2: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Introduction

• Crowdsourced library help system (with gamification)
• crowdask.lib.purdue.edu
• Year-long IMLS-funded project
• GitHub: https://github.com/crowdask0/crowdask

Page 3: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Agenda

• Challenges/Solutions

• Project timeline

• Project results

– CrowdAsk System

– Classroom, Usability Testing and Archives beta-test

– Reference service rollout

• Discussion of Implications


Page 4: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

How we got to this

• Killing a FAQ

• How do we know which users want what kind of help?

Page 5: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Research Project Results: Characteristics / Design (how we got to this)

[Figure: correlations between user characteristics (library familiarity, perceived competence, work avoidance, learning-oriented task orientation, performance-oriented task orientation) and what users said help should be: conceptual, unobtrusive, relevant to the user's immediate situation, and easy to access. Coefficients range from -0.448 to 0.758.]

Page 6: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Research Project Results: Characteristics / Format (how we got to this)

[Figure: correlations between the same user characteristics and preferred help formats: index, screenshots, pictures/images, table of contents, and video tutorials. Coefficients range from -0.664 to 0.758; the sketch below shows what such a coefficient measures.]
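The numbers on these two slides appear to be correlation-style coefficients between surveyed user characteristics and help preferences. Assuming they are Pearson correlations, here is a minimal sketch of what such a coefficient measures; the variable names and sample ratings below are hypothetical, not the study's data.

```python
# Minimal Pearson's r sketch: correlation between a learner characteristic
# and a rated help preference. Sample values are invented for illustration.
from statistics import mean, stdev

def pearson_r(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical 5-point survey responses from five participants.
library_familiarity = [1, 2, 3, 4, 5]
prefers_conceptual_help = [2, 2, 3, 5, 5]

print(round(pearson_r(library_familiarity, prefers_conceptual_help), 3))
# A value near +1 (like the 0.758 on the slide) means the two ratings rise
# together; negative values mean they move in opposite directions.
```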

Page 7: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)


Page 8: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

[Diagram: outcomes of the librarian-centered help model.]

If the patron is satisfied:
• They learn
• They tell their friends
• They teach others

If the patron is not satisfied:
• They have to ask again
• They ask again later

If the patron is confused:
• They ask again later
• They ask someone else

Page 9: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Challenges

Questions are all treated alike:
• The majority of reference questions are lower level (Bishop & Bartlett, 2013)
• Questions are context-based
• Reference decontextualizes questions and then has to add the context back during the reference interview, which is inefficient
• Other information sources, such as graduate students and instructors, go underutilized

Page 10: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Challenges

• The reference service model is flawed

[Diagram: students seek help from several disconnected sources: librarians, other students, friends, and professors.]

Page 11: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Challenges


Page 12: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Solutions

• “Crowd”-sourced
– Content experts (such as graduate students) are better utilized
– Novices learn together
• Better reflects participatory culture and metaliteracy

Page 13: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Solutions

• Single channel
• Focus on librarians as community builders rather than information sources

[Diagram: one shared channel connecting librarians, students, and experts.]

Page 14: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

The Project Team


Page 15: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Project Timeline

• August 2013: Received grant, hired a graduate assistant (GA)
• December 2013: Prototype completed
• January-April 2014: Beta testing
• March-April 2014: Usability testing
• April-October 2014: Reporting, sharing of code

Page 16: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

How it works

• Demo (a simplified sketch of the vote-and-points mechanics follows below)
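The demo shows CrowdAsk's gamification: questions and answers are voted on, and users accumulate points and badges that unlock extra privileges (all discussed on later slides). Here is a minimal Python sketch of that kind of vote-and-points mechanic; the per-vote reward, badge name, and threshold are invented for illustration, and the actual implementation is the code in the GitHub repo.

```python
# Illustrative sketch of a vote/points/badge model like the one the demo
# shows. Point values and badge thresholds are hypothetical, not CrowdAsk's.
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    points: int = 0
    badges: set[str] = field(default_factory=set)

@dataclass
class Answer:
    author: User
    text: str
    votes: int = 0

def upvote(answer: Answer) -> None:
    """An upvote rewards the answerer and may unlock a badge."""
    answer.votes += 1
    answer.author.points += 10          # hypothetical reward per upvote
    if answer.author.points >= 100:     # hypothetical badge threshold
        answer.author.badges.add("Helper")

grad_student = User("content expert")
a = Answer(grad_student, "Use the 'Databases' tab, then filter by subject.")
for _ in range(10):
    upvote(a)
print(grad_student.points, grad_student.badges)  # 100 {'Helper'}
```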

Pages 17-24: [image-only slides from the CrowdAsk demo]
Page 25: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Three Types of Beta Tests

• Classroom beta test: Spring 2014
• Special Collections: Summer 2014
• Website rollout: Fall 2014

Page 26: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Metrics

• Qualitative: usability testing and interviews
• Quantitative: questions and answers, Google Analytics, return on investment

Page 27: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Beta Test- Students

• 370 undergraduate students at Purdue University
• MGMT 175 (Information Strategies for MGMT Students)
• English 106 (First-Year Composition)
• GS 175 (Information Strategies for Hospitality and Tourism Management Students)

Page 28: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Quantitative Assessment- Students

• Systems statistics

– 211 registered users

– 99 users posted questions, 106 voted

– 122 questions, 232 answers

– Most views on a question: 92

– Most answers to a question: 8

– Most votes on a question: 35


Page 29: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Quantitative Assessment- Students

• Google Analytics (January 5, 2014 to April 2, 2014)
– 1,150 visits from 474 unique visitors
– 14,715 page views, an average of 12.8 pages per visit (checked in the sketch below)
– Average visit duration: 6 minutes and 7 seconds
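The 12.8 pages-per-visit average follows directly from the raw counts, and the same counts from these two assessment slides support a few other derived figures. A quick arithmetic check; note that the visits-per-visitor and participation-share numbers are our own derivations, not values reported in the deck.

```python
# Raw counts taken from the two assessment slides above.
visits, visitors, pageviews = 1_150, 474, 14_715
registered, askers, voters = 211, 99, 106

print(f"pages per visit: {pageviews / visits:.1f}")            # 12.8, as reported
print(f"visits per visitor: {visits / visitors:.1f}")          # 2.4 (derived)
print(f"share of users who asked: {askers / registered:.0%}")  # 47% (derived)
print(f"share of users who voted: {voters / registered:.0%}")  # 50% (derived)
```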

Page 30: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Types of Questions

• Course-related: “Do you know how to retake the quiz on blackboard (for Management 175)?” and “How much will my writing improve throughout the semester?”
• CrowdAsk-related: “How do you earn points on CrowdAsk?”; “May I know the full list of badges and how to achieve them?”; and “Are we only allowed to ask academic related questions here?”
• Library services or resources: “Is there a way to search the libraries catalog just for movies?”; “How to reserve a study room at library”; and “How do you get the actual article to come up on Business Source Premier instead of just the abstract?”
• How-to: “What is a good website to use to do a voiceover on Prezi?” and “How to analyze the financial tables of a company?”
• Conceptual: “What is the best citation management software?” and “Could someone tell me what is the meaning of APA citation?”

Page 31: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Qualitative Assessment- Students

• Usability test with 4 students (2 novice, 2 expert)
• Average task duration: 2 minutes
• Unclear items:
– Value of assigning tags
– Meaning of badges
– Awareness of extra privileges
• Motivation for expert users: reciprocity, not points

Page 32: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Second Beta Test: Special Collections

• Usability testing and interviews: usability test with three Special Collections experts from outside the libraries
• A selection of Special Collections FAQs was integrated into the system

Page 33: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Second Beta Test: Special Collections

Page 34: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)


Page 35: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

User Testing- Special Collections

• Findings echoed many from the students’ usability testing
• Concern over expertise
• Badges and points were not much of a motivator

Page 36: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Launch on Purdue Libraries Website

• August 2014
• A secondary choice for users to get help
• PUL reference staff
– Enthusiastic meeting at the beginning of the semester
– Will ensure that CrowdAsk questions are answered within one day (see the monitoring sketch below)
• PUL Web Team
– Works with the CrowdAsk team to support the system
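The reference staff's one-day answer window suggests a simple operational check. Below is a hedged sketch of how such a commitment could be monitored; the data model, threshold, and escalation step are assumptions for illustration, not CrowdAsk's documented workflow.

```python
# Sketch of a one-day answer-window monitor. Timestamps and queue contents
# are hypothetical; a real check would query the CrowdAsk database.
from datetime import datetime, timedelta, timezone

SLA = timedelta(days=1)  # the one-day window reference staff committed to

def overdue(posted_at: datetime, answer_count: int, now: datetime) -> bool:
    """True if a question is unanswered and past the one-day window."""
    return answer_count == 0 and (now - posted_at) > SLA

# Hypothetical open-question queue: (posted timestamp, number of answers).
queue = [
    (datetime(2014, 8, 25, 9, 0, tzinfo=timezone.utc), 0),
    (datetime(2014, 8, 26, 15, 0, tzinfo=timezone.utc), 2),
]
now = datetime(2014, 8, 27, 9, 0, tzinfo=timezone.utc)
for posted, answers in queue:
    if overdue(posted, answers, now):
        print(f"Escalate to reference staff: question from {posted:%b %d}")
```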

Page 37: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)


Page 38: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)


Page 39: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Career Information: Purdue Career Wiki

Page 40: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Launch on Purdue Libraries Website

Goal: develop sustainable user engagement and community involvement as part of the Purdue University Libraries website.

Page 41: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Next Steps

• We want you! Looking for partners and test cases
• GNU General Public License Version 2 on GitHub

Page 42: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

How could you use CrowdAsk?

Where are the opportunities?

Where are the possible threats or weaknesses?

Where are the possible benefits?

Page 43: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Thank You

CrowdAsk code: https://github.com/crowdask0/crowdask

Short video on CrowdAsk: http://youtu.be/-kaNIPJ82yA

Ilana Stonebraker
Business Information Specialist
Purdue University Libraries
[email protected]

Tao Zhang, PhD
Digital User Experience Specialist
Purdue University Libraries
[email protected]

Page 44: CrowdAsk- A Crowdsourcing Reference System (Internet Librarian 2014)

Works cited and Image Credit

• Bishop, B. W., & Bartlett, J. A. (2013). Where Do We Go from Here? Informing Academic Library Staffing through Reference Transaction Analysis. College & Research Libraries, 74(5), 489-500.
• Image credit: http://static.guim.co.uk/sys-images/Money/Consumer/financialservicesbrochures/2014/2/28/1393599196845/Angry-man-about-to-throw--011.jpg