CrowdAsk: A Crowdsourcing Reference System (Internet Librarian 2014)


Video of CrowdAsk here: https://www.youtube.com/watch?v=-kaNIPJ82yA


A Crowdsourced Library Help System

Tao Zhang, PhD

Digital User Experience Specialist

Purdue University Libraries

zhan1022@purdue.edu

Internet Librarian 2014

Funded by 2013 IMLS Sparks! Ignition Grant

https://github.com/crowdask/crowdask

Ilana Stonebraker

Business Information Specialist

Purdue University Libraries

stonebraker@purdue.edu

Introduction

2

• Crowdsourced library help system (with gamification)

• crowdask.lib.purdue.edu

• Year-long IMLS-funded project

• GitHub: https://github.com/crowdask0/crowdask

Agenda

• Challenges/Solutions

• Project timeline

• Project results

– CrowdAsk System

– Classroom, usability testing, and Archives beta tests

– Reference service rollout

• Discussion of Implications

3

How we got to this

• Killing a FAQ

• How do we know which users want what in help?

4

Research Project Results: Characteristics/Design

5

[Chart: correlation coefficients between user characteristics (library familiarity, perceived competence, work avoidance, learning-oriented and performance-oriented task orientation) and what help should be: conceptual, unobtrusive, relevant to the user's immediate situation, and easy to access.]

How we got to this

Research Project Results: Characteristics/Format

6

[Chart: correlation coefficients between user characteristics (library familiarity, perceived competence, work avoidance, learning-oriented and performance-oriented task orientation) and preferred help formats: index, screenshots, pictures/images, table of contents, and video tutorials.]

How we got to this

7

8

Librarian

If the patron is satisfied:
• They learn
• They tell their friends
• They teach others

If the patron is not satisfied:
• They have to ask again
• They ask again later

If the patron is confused:
• They ask again later
• They ask someone else

Challenges

9

Questions are all treated alike

• The majority of reference questions are lower level (Bishop & Bartlett, 2013)

• Questions are context-based

• The reference process decontextualizes questions and then has to add context back during the reference interview, which is inefficient

• Other information sources, such as graduate students and instructors, are underutilized

Challenges

10

• The reference service model is flawed

[Diagram: help sources - librarians, other students, friends, professor]

Challenges

11

Solutions

12

• “Crowd”-sourced

– Content experts (such as graduate students) are better utilized

– Novices learn together

• Better reflects participatory culture and metaliteracy

Solutions

13

• Single channel

• Focus on librarians as community builders versus information sources

[Diagram: librarians, students, and experts in a single channel]

The Project Team

14

Project Timeline

• August 2013: Received grant, hired GA

• December 2013: Prototype completed

• January-April 2014: Beta testing

• March-April 2014: Usability testing

• April-October 2014: Reporting, sharing of code

How it works

• Demo

16
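To make the demo concrete for readers of these slides, here is a minimal, hypothetical sketch (in Python) of how a crowdsourced Q&A system with gamification can be modeled: community members post questions, anyone can answer, votes surface good answers, and points and badges reward contributors. The class names, point values, and badge threshold below are illustrative assumptions, not CrowdAsk's actual implementation.

# Hypothetical sketch of a crowdsourced Q&A model with gamification.
# Names, point values, and the badge threshold are illustrative
# assumptions, not the actual CrowdAsk schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class User:
    name: str
    points: int = 0
    badges: List[str] = field(default_factory=list)

@dataclass
class Answer:
    author: User
    text: str
    votes: int = 0

@dataclass
class Question:
    author: User
    title: str
    tags: List[str]
    answers: List[Answer] = field(default_factory=list)

def post_answer(question: Question, author: User, text: str) -> Answer:
    """Anyone in the community (student, grad student, librarian) can answer."""
    answer = Answer(author=author, text=text)
    question.answers.append(answer)
    author.points += 5  # assumed small reward for contributing an answer
    return answer

def upvote(answer: Answer) -> None:
    """Votes surface the best answers and reward their authors."""
    answer.votes += 1
    answer.author.points += 10  # assumed reward per upvote
    if answer.author.points >= 100 and "Helper" not in answer.author.badges:
        answer.author.badges.append("Helper")  # assumed badge threshold

In a model like this, the extra privileges mentioned later in the usability findings would be unlocked at point thresholds, much like the hypothetical "Helper" badge above.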

Three Types of Beta Tests

26

• Classroom beta test: Spring 2014

• Special Collections: Summer 2014

• Website rollout: Fall 2014

Metrics

27

• Qualitative: usability testing and interviews

• Quantitative: questions and answers, Google Analytics, return on investment

Beta Test - Students

• 370 undergraduate students at Purdue University

• MGMT 175 (Information Strategies for MGMT Students)

• English 106 (First-Year Composition)

• GS 175 (Information Strategies for Hospitality and Tourism Management Students)

28

Quantitative Assessment - Students

• System statistics

– 211 registered users

– 99 users posted questions, 106 voted

– 122 questions, 232 answers

– Most views on a question: 92

– Most answers to a question: 8

– Most votes on a question: 35

29

Quantitative Assessment - Students

• Google Analytics (January 5, 2014 to April 2, 2014)

– 1,150 visits from 474 unique visitors

– 14,715 page views

• average 12.8 pages per visit.

– 6 minutes and 7 seconds average visit duration.

30
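As a quick check, the per-visit average follows directly from the totals above; a small illustrative calculation (not part of the original slides):

# Derived engagement metrics from the reported Google Analytics totals
# (January 5 - April 2, 2014); the inputs are the figures on the slide above.
visits = 1150
unique_visitors = 474
page_views = 14715

pages_per_visit = page_views / visits            # 14715 / 1150 ≈ 12.8
visits_per_visitor = visits / unique_visitors    # ≈ 2.4 visits per unique visitor

print(f"{pages_per_visit:.1f} pages per visit")        # -> 12.8 pages per visit
print(f"{visits_per_visitor:.1f} visits per visitor")  # -> 2.4 visits per visitor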

Types of Questions

• Course-related: “Do you know how to retake the quiz on blackboard (for Management 175)?” and “How much will my writing improve throughout the semester?”

• CrowdAsk-related: “How do you earn points on CrowdAsk?”; “May I know the full list of badges and how to achieve them?”; and “Are we only allowed to ask academic related questions here?”

• Library services or resources: “Is there a way to search the libraries catalog just for movies?”; “How to reserve a study room at library”; and “How do you get the actual article to come up on Business Source Premier instead of just the abstract?”

• How-to: “What is a good website to use to do a voiceover on Prezi?” and “How to analyze the financial tables of a company?”

• Conceptual: “What is the best citation management software?” and “Could someone tell me what is the meaning of APA citation?”

31

Qualitative Assessment - Students

• Usability test of 4 students (2 novice and 2 expert)

• Average duration of task: 2 minutes

• Unclear items

– Value of assigning tags

– Meaning of badges

– Awareness of extra privileges

• Motivation for expert users: reciprocity, not points

32

Second Beta Test - Special Collections

• Usability testing and interviews: usability test of three Special Collections experts outside the Libraries

• Selection of Special Collections FAQs integrated into the system

33

Second Beta Test - Special Collections

34

35

User Testing - Special Collections

• Echoed many of the students’ usability testing findings

• Concern over expertise

• Badges and points were not much of a motivator

36

Launch on Purdue Libraries Website

• August 2014

• Secondary choice for users to get help

• PUL reference staff

– Enthusiastic meeting at the beginning of the semester

– Will ensure that CrowdAsk questions are answered within one day

• PUL Web Team

– Works with the CrowdAsk team to support the system

37

38

39

Career Information - Purdue Career Wiki

40

Launch on Purdue Libraries Website

Goal: develop sustainable user engagement and community involvement as part of the Purdue University Libraries website.

41

Next Steps

• We want you! Looking for partners and test cases

• GNU General Public License Version 2 on GitHub

43

How could you use CrowdAsk?

44

Where are the opportunities?

Where are the possible threats or weaknesses?

Where are the possible benefits?

Thank You

CrowdAsk code: https://github.com/crowdask0/crowdask

Short video on CrowdAsk: http://youtu.be/-kaNIPJ82yA

Ilana Stonebraker, Business Information Specialist, Purdue University Libraries, stonebraker@purdue.edu

Tao Zhang, PhD, Digital User Experience Specialist, Purdue University Libraries, zhan1022@purdue.edu

45

Works cited and Image Credit

• Bishop, B. W., & Bartlett, J. A. (2013). Where do we go from here? Informing academic library staffing through reference transaction analysis. College & Research Libraries, 74(5), 489-500.

• Image: http://static.guim.co.uk/sys-images/Money/Consumer/financialservicesbrochures/2014/2/28/1393599196845/Angry-man-about-to-throw--011.jpg

46