
This chapter describes the library use and information literacy behaviors included on standardized college student surveys of engagement and experiences, particularly information literacy–related experimental items developed by ACRL's Institute for Information Literacy's College Student Surveys Project Group on the 2006 NSSE College Student Report.

College Student Engagement Surveys: Implications for Information Literacy

Bonnie Gratch-Lindauer

Librarians have led the way for information literacy skills and abilities to be more integrated throughout the curriculum, and students are expected to demonstrate competency in finding, evaluating, and using information at many colleges and universities as part of the institution's curricular requirements and student learning outcomes. As part of higher education's emphasis on assessing student learning outcomes and increasing student success indicators (such as retention rate, graduation rate, and GPA), data from some standardized surveys of college student engagement are increasingly accepted as a proxy for student learning. This chapter summarizes the origins of these surveys and describes current surveys of student engagement, with particular focus on the NSSE College Student Report and the development of information literacy–related items administered on the 2006 NSSE survey. The author presents the benefits and applications of using data from these engagement surveys and makes suggestions for future use of the information literacy–related items.

Background and Origins of Student Engagement Surveys

According to George Kuh, National Survey of Student Engagement's (NSSE) founder and director from 1999 through mid-2007 and director of the Indiana University Center for Postsecondary Research, the conceptual roots of student engagement surveys go back more than two decades as part of the


NEW DIRECTIONS FOR TEACHING AND LEARNING, no. 114, Summer 2008 © Wiley Periodicals, Inc. Published online in Wiley InterScience (www.interscience.wiley.com) • DOI: 10.1002/tl.320


initiatives to document conditions that promote student learning (Kuh, 2001). He credits such researchers as Nevitt Sanford, Alexander Astin, Arthur Chickering, and Zelda Gamson, who with other leading scholars met in 1986 at a Wingspread retreat to distill the research findings on teaching and learning. One outcome of this group's efforts was the "Seven Principles for Good Practice in Undergraduate Education." This widely cited document was first published in a 1987 AAHE Bulletin article (Chickering and Gamson, 1987).

In response to concerns about college rankings, Russ Edgerton organized a group of educational leaders and scholars at the Pew Charitable Trusts in 1998, where agreement was reached that alternative measures of college quality were needed, particularly "an annual assessment of the extent to which institutions were using the kinds of good educational practices identified in the literature" (Kuh, 2001, p. 12). With support from Pew, Peter Ewell convened a group of leading scholars on college student development to design a survey instrument focused on the extent to which students engage in good educational practices. Field testing and two pilot administration cycles were completed in fall 1998 and spring 1999, and the first national administration of the National Survey of Student Engagement was launched in spring 2000.

NSSE is administered by the Center for Postsecondary Research at Indiana University. Its survey instrument, called the College Student Report, measures student behaviors that are highly correlated with many desirable learning and personal development outcomes of college. NSSE also administers the Beginning College Survey of Student Engagement (BCSSE) for entering students, given before classes begin or in the first two weeks of the fall term. Over the years, the College Student Report, hereafter referred to as the NSSE instrument, has been refined, and its use has grown from 75,000 students at 276 schools in spring 2000 to 260,000 students at 523 colleges and universities in spring 2006 (NSSE, 2006).

In addition to the NSSE instrument and its offspring BCSSE, this author has identified seven other standardized surveys, many of which were developed and administered before the NSSE:

1. Cooperative Institutional Research Program's Student Information Form (CIRP), currently called the CIRP Freshman Survey and administered by the Higher Education Research Institute (HERI) at UCLA, includes demographic items; opinions and expectations about college, degree, and career plans; and attitudes and values about a variety of issues.

2. Your First College Year (YFCY), designed as a follow-up to the CIRP Freshman Survey, has also been administered by HERI since 2000 and focuses on students' academic and personal development and adjustment during the freshman year.


3. College Senior Survey (CSS), formerly called the College Student Survey, also administered by CIRP, is designed in part as a postsurvey for items on the CIRP Freshman Survey and the YFCY, as well as focusing on academic achievement and engagement.

4. College Student Experience Questionnaire (CSEQ), administered since 1979 by the Center for Postsecondary Research (CPR) at Indiana University and now in its fourth edition, measures quality of effort in using campus resources and opportunities for learning and development, student opinions about the priorities and emphases of the campus environment, and student self-reported progress toward a variety of educational outcomes. It has the largest number of library and information literacy–related items, but it has not been revised since 1998, and far fewer institutions use it than use the NSSE and CCSSE instruments.

5. College Student Expectations Questionnaire (CSXQ), also administered by CPR since 1997, was adapted from the CSEQ and is now in its second edition; it measures new students' goals and expectations for the college experience, and when paired with the CSEQ it can measure the degree to which those expectations were met.

6. Community College Survey of Student Engagement (CCSSE), a partner to NSSE administered since 2001 by the Community College Leadership Program at the University of Texas at Austin, contains substantial and intentional overlap with items on the NSSE, but it also contains many unique items measuring how students spend their time; what they feel they have gained from their classes; how they assess their relationships and interactions with faculty, counselors, and peers; what kinds of work they are challenged to do; and how the college supports learning. It is a benchmarking instrument, establishing national norms on educational practice.

7. Community College Student Experience Questionnaire (CCSEQ), administered since 1994 by the Center for the Study of Higher Education at the University of Memphis, was developed from the concepts and roots of the CSEQ; the CCSEQ focuses on students' use of facilities and college learning opportunities, their impressions of the community college, and the progress they report making toward their goals.

All of these instruments allow inclusion of local items, and some of them have parallel instruments to measure faculty perceptions and reports of their teaching practices and expectations of their students. Exhibit 8.1 is based on a review of the latest editions of the instruments available at the institutional Web sites as of late September 2007.

Benchmarks are groups of conceptually related survey items. For NSSE's instrument they are based on forty-two key questions from the survey that capture many of the most important aspects of the student experience: level of academic challenge, active and collaborative learning, student-faculty interaction, enriching educational experiences, and supportive campus environment. For the CCSSE's instrument, thirty-six items compose the benchmarks for academic challenge, active and collaborative learning, student-faculty interaction, student effort, and support for learning. These two instruments share many common items and benchmarks and thus offer an important database of longitudinal data that will allow studies of student engagement at two-year and four-year campuses within a single state or university system and track the movement and performance of students between these two sectors.

Exhibit 8.1. Comparison of National Standardized Surveys of Student Engagement (columns: Instrument; Students Surveyed; Information Literacy and Library Use Items Included, with Selected Examples)

CIRP Freshman Survey (http://www.gseis.ucla.edu/heri/cirpoverview.php)
Students surveyed: Freshmen at beginning of first year
Items: "How often during the past year did you: ––Use the Internet to research for homework, read news sites, read blogs? ––Evaluate the quality or reliability of information you received? ––Look up scientific research articles and resources?"

Your First Year in College (YFCY; http://www.gseis.ucla.edu/heri/yfcy/)
Students surveyed: Freshmen near the end of the first year for the CIRP Survey
Items: "Rate your satisfaction with library facilities and services," and same items as above

College Senior Survey (CSS; http://www.gseis.ucla.edu/heri/css.html)
Students surveyed: Seniors
Items: "Rate your satisfaction with library facilities. Since entering college, how often have you: ––Used the Internet for research or homework? ––Used the library for research or homework? ––Compared with when you entered this college, how would you now describe your computer skills? your analytical and problem-solving skills? your ability to think critically?"

College Student Experiences Questionnaire (CSEQ; http://www.indiana.edu/~cseq/)
Students surveyed: Undergraduates
Items: Contains the most items relating to library and information use: two sections with 17 items about frequency of use during the current school year of the library and its resources and use of information technology, such as "Asked a librarian or staff member for help in finding information on some topic" or "Searched the WWW or Internet for information related to a course"

College Student Expectations Questionnaire (CSXQ; http://www.indiana.edu/~cseq/csxq_generalinfo.htm)
Students surveyed: Precollege students and first-year students at beginning of term
Items: Contains a combined section Library and Information Technology with nine items, such as "During the coming school year, how much do you expect to do the following: ––Use the library as a quiet place to read or study? ––Read assigned materials other than textbooks in the library?"

College Student Report (NSSE; http://nsse.iub.edu/html/survey_instruments_2007.cfm)
Students surveyed: First-year and senior students
Items: "During the current school year, how much has your coursework emphasized the following mental activities: ––Synthesizing and organizing ideas, information, or experiences into new, more complex interpretations and relationships? ––Making judgments about the value of information, arguments, or methods?"

Beginning College Survey of Student Engagement (BCSSE; http://bcsse.iub.edu/pdf/BCSSE_2007.pdf)
Students surveyed: Entering students
Items: No library use or information literacy–related items

Community College Student Report (CCSSE; http://www.ccsse.org/)
Students surveyed: Community college students
Items: "During the current school year, how much has your coursework at this college emphasized the following mental activity: ––Making judgments about the value or soundness of information, arguments or methods?"

Community College Student Experience Questionnaire (CCSEQ; http://coe.memphis.edu/CSHE/CCSEQ.htm; not on Web site; must be ordered)
Students surveyed: Community college students
Items: Has Library Activities section with 7 items that ask (5 of the 7 are included): "In your experience at this college during the current school year, about how often have you done each of the following: ––Used the library as a quiet place to read or study material you brought with you? ––Read newspapers, magazines, or journals located in the library or online? ––Checked out books and other materials to read at home? ––Prepared a bibliography or set of references for a term paper or report? ––Asked the librarian for help in finding materials on some topic? ––Worked on a paper or project that required integrating ideas or information from various sources? ––Considered the accuracy and credibility of information from different sources? ––Used the Web or Internet to get information for a class project or paper? In thinking over your experiences in this college up to now, to what extent do you think you have gained or made progress in each of the following areas: ––Acquiring skills needed to use computers to access information from the library, the Internet, the Web . . ."
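Conceptually, a benchmark score is an aggregate of a student's responses to its component items. The sketch below illustrates only that idea: the item names are invented for the example, and NSSE's actual scoring procedure is more involved than a simple mean of raw responses.

```python
# Illustrative sketch: a benchmark as the mean of its component item
# responses (1 = never ... 4 = very often). Item and benchmark names
# are hypothetical, not NSSE's actual items or scoring method.
from statistics import mean

BENCHMARKS = {
    "active_collaborative_learning": ["asked_questions", "worked_with_peers", "tutored"],
    "academic_challenge": ["hours_preparing", "synthesizing_ideas", "making_judgments"],
}

def benchmark_scores(responses: dict) -> dict:
    """Average the 1-4 item responses belonging to each benchmark."""
    return {
        name: round(mean(responses[item] for item in items), 2)
        for name, items in BENCHMARKS.items()
    }

student = {
    "asked_questions": 3, "worked_with_peers": 4, "tutored": 2,
    "hours_preparing": 3, "synthesizing_ideas": 4, "making_judgments": 3,
}
print(benchmark_scores(student))
```

Grouping items this way is what makes cross-instrument comparison possible: two surveys that share component items can report on the same benchmark even if their remaining items differ.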

Student Engagement Surveys and Library Use and Information Literacy Behaviors

This section summarizes pertinent findings from three studies that particularly focused on library use and information literacy items on the NSSE and CSEQ instruments. The summary and discussion illustrate how these instruments have been used to connect student engagement in educationally purposeful behaviors to students' self-reported use of library resources, use of information technology, and behaviors related to some aspects of information literacy.

Mack and Boruff-Jones's article "Information Literacy and Student Engagement: What the National Survey of Student Engagement Reveals About Your Campus" (2003) is based on several original conceptions and assumptions. They reason that if the behaviors NSSE measures are indicators of student engagement in their learning, then information literacy must be a part of this, because some NSSE items address use of information technology, critical thinking, and ethical awareness—all aspects of information literacy behaviors. They suggest that documenting the level of institutional information literacy be added to the list of productive ways that institutions are using their NSSE reports.

Perhaps the most useful contribution of this article is the analysis of items relating to the NSSE's active and collaborative learning benchmark, connecting these items to indicators and outcomes from the Information Literacy Competency Standards for Higher Education, commonly referred to as the ACRL information literacy standards, and then comparing two research universities' scores on these items and offering suggestions for how institutions can use their NSSE findings to improve their instruction programs. Their point is that because information literacy should be pervasive across the institution's curriculum, one way to use the NSSE data is as a snapshot of how certain IL indicators and outcomes are revealed in the scores. Their findings illustrate how some NSSE items can be mapped to some ACRL indicators and outcomes. However, even if one were to complete the mapping these two authors began, the NSSE items would not reflect the totality of skills and behaviors identified in the Information Literacy Competency Standards for Higher Education. Moreover, the authors admit that the correlation of NSSE survey


items to the ACRL standards does "not unequivocally demonstrate that the rankings were necessarily enhanced by library instruction," but they underscore their intent to "correlate NSSE benchmarks with the ACRL standards in order to identify areas of strengths and weakness" (p. 490).

Another 2003 article reports findings from a longitudinal study of student responses to another instrument, the College Student Experiences Questionnaire (CSEQ). Kuh and Gonyea included a very large sample: three hundred thousand students from about three hundred four-year colleges and universities who completed the second, third, and fourth editions of the CSEQ from 1984 to 2002, and a second sample of eighty thousand full-time students from 131 baccalaureate degree-granting institutions who completed the fourth edition of the CSEQ between 1998 and 2002. Their research questions addressed (1) the extent of change in student use of various library resources between 1984 and 2002, (2) whether frequent use of the library is associated with greater gains in information literacy, and (3) how student use of library resources affects their engagement with effective educational practices.

Kuh and Gonyea report that overall frequency of library use varies with the type of student and institution; the least frequent library users were white students, math and science majors, those who have ready access to a computer, and those attending doctoral-extensive universities. However, they note that a greater number of students reported using indexes and databases to find information, but that there is a noteworthy decrease in the proportion of students who use the library as a place to read or study. They also reported that student contact with librarians increased somewhat during the eighteen-year period, suggesting "librarians may be becoming more visible and accessible to larger numbers of students" (p. 266).

Observations from the findings that address the second research question do not support greater gains in information literacy, as they have defined it, based on frequency of use of library resources. The authors state that "the most surprising (and mildly disappointing) finding is that library experiences do not seem to directly contribute to gains in information literacy, to what students gain overall from college, or to student satisfaction" (p. 267). In their discussion of findings, however, they admit that the information literacy scale created from selected CSEQ items may not be a valid proxy. This author agrees and feels that the findings may be flawed because the six items used to create the information literacy scale only partially reflect information literacy behaviors.

A more recent study authored by Nelson Laird and Kuh, "Student Experiences with Information Technology and Their Relationship to Other Aspects of Student Engagement" (2005), analyzed eighteen information technology (IT) items used only on a 2003 online version of the NSSE. Several of the items asked students how frequently (1) they used e-mail and the Web for class work and other reasons, (2) their professors used IT in the classroom, (3) they used the institution's library Web site to obtain resources for


academic work, (4) they asked a librarian at their school for help in obtaining resources for their academic work, and (5) they made judgments about the quality of information found on the Web for use in academic work.

Their analysis examines how the IT items correlate with items in several of the NSSE benchmarks; they comment that "the relative strength of the positive relationships between academic uses of IT and engagement, particularly academic challenge, student-faculty interaction and active and collaborative learning suggest that, at the very least, engagement in one area often goes hand-in-hand with engagement in other areas" (p. 230). They suggest that "the results of this study prompt us to consider how established indicators of student engagement may benefit from tying IT items to activities related to collaborative learning, for example." But they also caution that "measuring student engagement in IT may not add to our ability to explain educational outcomes above and beyond what is already captured by other measures of student engagement" (p. 232).

Information Literacy–Related Behaviors and the 2006 NSSE

The College Student Surveys Project Group began its work in June 2004 by investigating and comparing items on seven national college student surveys. Its members decided to concentrate on one survey, NSSE's College Student Report, examining it in depth. Following the suggestion of NSSE director Kuh, the group members decided to focus item development work on student behaviors that contribute to information literacy.

Broader input was obtained from a six-month, adapted Delphi process used to gather evidence from a polling of library and information science educators and practitioners. Members reviewed the findings from the field along with the items currently on the 2005 NSSE survey instrument to determine the final items for submission. After NSSE staff review and revisions, the ten items in Exhibit 8.2 were used on the 2006 NSSE as experimental items and were administered during the spring 2006 term to 12,044 students at thirty-three institutions.

Findings. NSSE's associate director gave the project group members nine tables of data, with comments for each table. Six tables report frequency data: by class rank and enrollment status for first-year students and seniors, by major field of study for seniors, by living arrangement for first-year students, by gender and class rank for first-year students and seniors, and by race and ethnic status for seniors. Another table identifies the component items included in the two scales created for the data analysis (the active learning in information literacy scale and the institutional emphasis and contributions in information literacy scale), along with reliability statistics. A further table presents descriptive statistics by class rank for each of the two scales, and another gives bivariate
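Reliability statistics for multi-item scales like these are commonly internal-consistency coefficients such as Cronbach's alpha (the chapter does not specify which statistic NSSE reported). A minimal sketch of the standard alpha formula, with invented response data:

```python
# Cronbach's alpha for a multi-item scale:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
# where k is the number of items. Data below are invented; population
# variance (pvariance) is used consistently for items and totals.
from statistics import pvariance

def cronbach_alpha(items: list[list[float]]) -> float:
    """items: one list of respondent scores per scale item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total scores
    item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Three hypothetical items answered by five respondents on a 1-4 scale.
item_a = [3, 4, 2, 3, 4]
item_b = [3, 3, 2, 4, 4]
item_c = [2, 4, 1, 3, 4]
print(round(cronbach_alpha([item_a, item_b, item_c]), 2))
```

Values near 1 indicate that the items move together and can reasonably be summed into a single scale score, which is the justification for collapsing several survey items into the two information literacy scales.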


correlations between the five NSSE benchmarks and the two information literacy scales. Table 8.1 combines the information contained in the original tables two and three.

The nine tables of data are available at the "ACRL Information Literacy Initiatives" Web page in the "Assessment" section (http://www.ala.org/ala/acrl/acrlissues/acrlinfolit/professactivity/initiatives/acrlinfolitinitiatives.cfm).

Overall, the findings are very encouraging. They support modest to high significant positive relationships between the two information literacy scales and eight scales derived from NSSE items, particularly among seniors with gains in the practical competence scale and general education scale. They also confirm that engagement in academic use of information technology goes hand in hand with engagement in other areas, such as academic challenge.

Discussion. The frequency data corroborate several points that might be expected: seniors report doing the various activities a bit more often than do first-year students; social sciences and arts and humanities majors are more actively engaged with library information resources than are the other majors listed; residential and full-time students use library information resources more frequently; and African American, Asian and Pacific Islander, and Latino seniors indicated that they "asked a librarian for help" and "went to a campus library to do research for a course assignment" more frequently than do Caucasian or white students.

Exhibit 8.2. Information Literacy–Related Items Used on 2006 NSSE as Experimental Items

1. In your experience at your institution during the current school year, about how often have you done each of the following? [Response options included very often, often, sometimes, and never.]
   A. Asked a librarian for help (in person, e-mail, chat, etc.)?
   B. Went to a campus library to do academic research?
   C. Used your institution's Web-based library resources in completing class assignments?
2. Which of the following have you done or do you plan to do before you graduate from your institution? [Response options included done, plan to do, do not plan to do, and have not decided.]
   A. Participated in an instructional session led by a librarian or other library staff member?
   B. Participated in an online library tutorial?
3. To what extent does your institution emphasize each of the following? [Response options included very much, quite a bit, some, and very little.]
   A. Developing critical, analytical abilities?
   B. Developing the ability to obtain and effectively use information for problem solving?
   C. Developing the ability to evaluate the quality of information available from various media sources (TV, radio, newspapers, magazines, etc.)?
4. To what extent has your experience at this institution contributed to your knowledge, skills, and personal development in the following areas? [Response options included very much, quite a bit, some, and very little.]
   A. Evaluating the quality of information?
   B. Understanding how to ethically use information in academic work (proper citation use, not plagiarizing, etc.)?

Perhaps of more interest is the analysis of the statistics from the two information literacy scales. Two items that were not included in the scales because they did not add anything statistically are those under number two in Exhibit 8.2. Indeed, should these items be included on the survey instrument again, the wording and placement of the items should be reviewed.

Table 8.1, which has been reformatted from the original tables two and three, reveals that the mean score is quite a bit higher (3 = often; 2 = sometimes) for the institutional emphasis and contributions in information literacy scale than for the active learning in information literacy scale. As scores increase in the information literacy scales, they also increase in the benchmarks; especially notable is the .67 positive correlation for gains in the general education scale, which is composed of the four items writing clearly and effectively, speaking clearly and effectively, acquiring a broad general education, and thinking critically and analytically.
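A bivariate correlation such as the .67 reported above is typically a Pearson coefficient between two sets of scale scores (the chapter does not specify the exact statistic). A minimal sketch with invented scores, not NSSE data:

```python
# Pearson correlation between two sets of per-student scale scores.
# r = covariance(x, y) / (sd(x) * sd(y)); ranges from -1 to 1.
from math import sqrt

def pearson_r(x: list[float], y: list[float]) -> float:
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores for five students on two scales (1-4 range).
info_lit = [2.1, 2.8, 3.0, 2.4, 3.3]  # information literacy scale
gen_ed   = [2.4, 3.1, 3.2, 2.3, 3.6]  # gains in general education scale
print(round(pearson_r(info_lit, gen_ed), 2))
```

A positive r means that students who score higher on one scale tend to score higher on the other, which is exactly the pattern the chapter describes between the information literacy scales and the NSSE benchmarks.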

The future of using these experimental items on the NSSE with a national sample is undecided. The project group encourages local and consortium use of some or all of these items. As of early October 2007, the Liberal Arts Information Literacy Consortium had formed and plans to use all ten items on the 2008 NSSE. Librarians may also want to look at the eight items on the latest edition of the CSEQ as a source for additional items. The project group hopes that many institutions will use these experimental items so that within a few years there will be data from a larger number of institutions and a more sophisticated data analysis can be accomplished.


Table 8.1. Information Literacy Scales: Component Items, Reliability, and Descriptive Statistics by Class Rank

Active Learning in Information Literacy
  First-Year Students: Count 8832; Mean 2.3; Standard deviation 0.7; Standard error of mean 0.01
  Seniors: Count 8927; Mean 2.5; Standard deviation 0.7; Standard error of mean 0.01

Institutional Emphasis and Contributions in Information Literacy
  First-Year Students: Count 8832; Mean 3.0; Standard deviation 0.7; Standard error of mean 0.01
  Seniors: Count 8927; Mean 3.1; Standard deviation 0.7; Standard error of mean 0.01


Value and Applications of Student Engagement Data

The theoretical underpinnings on which the NSSE and CCSSE are based have been shown to be valid (Kuh, 2003). Student self-reports are widely used in research on college effects, and the validity and credibility of such data have been extensively studied; the data are valid under certain conditions that are satisfied by both the NSSE and CCSSE (Kuh, 2003).

Before looking more closely at ways that academic librarians might use these survey findings, a few examples from research studies linking engagement data to outcomes data are summarized to illustrate the types of student outcomes data that have been linked and with what result.

The Georgia Institute of Technology conducted a study linking multiple years of NSSE responses from first-year and senior students to several student outcomes: freshman retention, GPA, pursuit of graduate education, and employment outcome at commencement and degree conferral. They report that their findings show "minimal explanatory power in the NSSE benchmarks for these outcomes, but a statistically-derived model from the individual NSSE items shows greater promise" (Gordon, Ludlum, and Hoey, 2006, p. 1).

A comprehensive July 2006 review of the literature on student success includes several sections identifying and summarizing findings from research that has studied the linkages between student engagement surveys and many student success outcomes, as well as looking at student characteristics and effective practices. The authors conclude that "the evidence from scores of studies over several decades strongly indicates that student engagement is related to a host of positive outcomes including persistence, grades, and satisfaction." In fact, GPA is positively related to all the effective educational practices measured by NSSE and nearly all those represented on the Community College Survey of Student Engagement (Kuh and others, 2006). Specifically, for students at four-year colleges, GPA is associated with time spent preparing for class, coming to class prepared, asking questions in class, tutoring other students, receiving prompt feedback from faculty, maintaining high-quality relationships with faculty, and having a favorable evaluation of overall educational experiences in college.

The final research project of note is the 2006 CCSSE Validation study. It is a large research project encompassing three separate studies that linked Community College Survey of Student Engagement (CCSSE) respondents from the 2002, 2003, or 2004 administrations of the instrument with such external data sources as records of student demographics, placement tests, course taking, and completion points from the Florida Department of Education; data from the Achieving the Dream project, a multiyear national initiative to help more community college students succeed, particularly student groups that have traditionally faced significant barriers to success; and student record databases maintained at community colleges that have participated in the CCSSE survey and are either Hispanic-serving institutions



or members of the Hispanic Association of Colleges and Universities. The Executive Summary reports these overall conclusions:

• There is strong support for the validity of the use of the CCSR as a measure of institutional processes and student behaviors that have an impact on student outcomes.

• The studies confirm a long tradition of research findings linking engagement to positive academic outcomes.

• There is strong consistency in the relationship between engagement factors and outcome measures across the three studies; however, some outcomes have stronger relationships to engagement than others.

• The Support for Learners benchmark was consistently correlated with measures of persistence (p. 7).

What can librarians conclude from studies linking outcomes data to engagement data from national surveys? First, although positive, and often strong, relationships are found, the direction of the association is usually not known, so cause and effect cannot be determined. Second, data from surveys of student engagement do not directly measure what a student knows or can do, not even at the institutional level. However, these data are empirically validated measures of educational practices known to be effective for student engagement, even though the connection between engagement in educationally effective practices and student learning is indirect. Lee Shulman has observed that "learning begins with student engagement, which in turn leads to knowledge and understanding . . . engagement is not solely a proxy for learning, it can be an end in itself" (Shulman, 2002, p. 40). Thus the data from the college student engagement surveys are useful as part of a librarian's assessment toolbox and for program planning and improvement.

Benefits and Implications for Academic Librarians

Assessment of information literacy programs and student learning requires multiple sources of data for evidence, one of which could be findings from NSSE, CSSE, or CSEQ items that correspond to local IL program goals and specific student learning outcomes. For example, if an IL program includes the goal of increasing information literacy across the curriculum, one way to use the NSSE data is "as a snapshot of information literacy at an institutional level," as suggested by Mack and Boruff-Jones (2003, p. 484). Another approach could be to analyze the findings for specific IL-related items so that student responses could be reviewed by broad discipline groupings, such as social sciences, physical sciences, and the like. To strengthen the evidence from engagement data, another strategy is to combine NSSE data findings with selected student records data for the same group of students to determine the relationship with student success indicators such as GPA or graduation rate.



Preparing for an accreditation agency's self-study or a campus program review also requires evidence. The NSSE Web site includes toolkits that map the NSSE survey items to each of the regional accreditation standards. Because there are only a few information literacy–related behaviors currently included on the NSSE and CCSSE instruments, this application may have a somewhat more limited value, but the data findings could still be used in the library's self-study for selected items ("making judgments about the value of information" or "worked on a paper or project that required integrating ideas or information from various sources").

Studies using student engagement data in conjunction with other outcome measures to identify high-performing institutions offer examples that can be considered for local implementation in IL or instruction programs and library service improvements. One useful source is the project brief series (http://nsse.iub.edu/institute/?view=deep/briefs), based on the 2005 publication about the Project DEEP institutions (Kuh, Kinzie, Schuh, and Whitt, 2005). The twenty institutions selected had higher-than-predicted graduation rates and, through their NSSE scores, demonstrated that they employ effective practices for fostering student success.

There are many examples of collaborations related to information literacy instruction, facilities planning, curriculum development, campus programming, and student services. However, these types of innovations are probably not the norm at most colleges and universities; nor are they institutionalized at some of the campuses that have been at the forefront of experimentation. The advantage of using student engagement survey findings is that they also document the gaps and deficiencies in desired student behaviors at an institutional level so that resources can be directed to program improvements. Even without better or more items related to information literacy and library use behaviors on NSSE, CCSSE, or other college engagement surveys, the current items and the 2006 experimental items can supply evidence to help substantiate the need and resources for program and service improvements. As documented in this chapter, academic librarians have an important role to play in becoming involved with the staff on their campus who plan for and administer these engagement surveys, so that information literacy instruction and library use can be more fully represented on these surveys and become part of the campus learning environment contributing to student success.

References

Chickering, A. W., and Gamson, Z. F. "Seven Principles for Good Practice in Undergraduate Education." AAHE Bulletin, 1987, 39(7), 3–7.

Kuh, G. D. "Assessing What Really Matters to Student Learning: Inside the National Survey of Student Engagement." Change, May–June 2001, 33(3), 10–17.

Kuh, G. D. "Conceptual Framework and Overview of Psychometric Properties: National Survey of Student Engagement." 2003. National Survey of Student Engagement (http://nsse.iub.edu/pdf/conceptual_framework_2003.pdf).



Kuh, G. D., and Gonyea, R. M. "The Role of the Academic Library in Promoting Student Engagement in Learning." College and Research Libraries, July 2003, 64(4), 256–282.

Kuh, G. D., Kinzie, J., Schuh, J. H., and Whitt, E. J. Student Success in College: Creating Conditions That Matter. San Francisco: Jossey-Bass, 2005.

Kuh, G. D., and others. What Matters to Student Success: A Review of the Literature. Commissioned Report for the National Symposium on Postsecondary Student Success: Spearheading a Dialog on Student Success. National Postsecondary Education Cooperative, July 2006 (http://nces.ed.gov/npec/pdf/Kuh_Team_Report.pdf).

Gordon, J., Ludlum, J., and Hoey, J. J. "Validating the NSSE Against Student Outcomes: Are They Related?" Paper presented at the annual forum of the Association for Institutional Research, Chicago, May 14–18, 2006. ERIC document, ED493829.

Mack, A. E., and Boruff-Jones, P. D. "Information Literacy and Student Engagement: What the National Survey of Student Engagement Reveals About Your Campus." College and Research Libraries, Nov. 2003, 64(5), 480–493.

National Survey of Student Engagement. 2006 Annual Report: Engaged Learning, Fostering Success for All Students. National Survey of Student Engagement (http://nsse.iub.edu/NSSE_2006_Annual_Report/docs/NSSE_2006_Annual_Report.pdf).

Nelson Laird, T. F., and Kuh, G. D. "Student Experiences with Information Technology and Their Relationship to Other Aspects of Student Engagement." Research in Higher Education, Mar. 2005, 46(2), 211–233.

Shulman, L. S. "Making Differences: A Table of Learning." Change, Nov.–Dec. 2002, 34(6), 36–44.

BONNIE GRATCH-LINDAUER is coordinator of library instructional services at City College of San Francisco and has served as instructor and head of instructional and information services in all types of academic libraries. She has authored many articles and book chapters on assessment, instruction, and reference services and has been active in ACRL for thirty years.
