
Are those Rose-Colored Glasses You are Wearing?:

Student and Alumni Survey Responses

Amber D. Lambert, Ph.D.
Angie L. Miller, Ph.D.

Association for the Study of Higher Education
38th Annual Conference
November 16th, 2013

Introduction & Literature Review

Surveys are a common means of assessment in higher education (Kuh & Ikenberry, 2009)

Student surveys are conducted on a variety of topics, from student engagement to use of campus resources to faculty evaluations (Kuh & Ewell, 2010)

Alumni surveys are used to gather information about satisfaction, acquired skills, and career attainment (Cabrera, Weerts, & Zulick, 2005)

Introduction & Literature Review

Institutions claim to prepare students with a multitude of transferable skills in addition to pure content knowledge from their majors (Tait & Godfrey, 1999)
Effective communication
Analytical & creative thinking

The AAC&U has recently identified many of these types of skills as essential learning outcomes for higher education

Mastery of these skills should lead to success in the workplace (Stasz, 2001)

Research Questions

Question 1: Are there differences in how students and alumni perceive aspects of their institutional experiences and the skills and competencies that they acquire at their institutions?

Question 2: Do alumni evaluate their institutions with rose-colored glasses, or do they evaluate their education more harshly once they gain a more practical knowledge of the working world?

Question 3: Finally, if differences between students and alumni do exist, whose account should be given precedence?

Methodology

This study uses data from the 2011 Strategic National Arts Alumni Project (SNAAP) and the 2012 National Survey of Student Engagement (NSSE)

What is SNAAP?

On-line annual survey of arts graduates

Investigates educational experiences and career paths

Provides findings to educators and policymakers to improve arts training, inform cultural policy, and support artists

Who does SNAAP Survey?

Graduates of:

Arts schools, departments, or programs in colleges and universities

Independent arts colleges
Arts high schools

Both graduate and undergraduate degree recipients

All arts disciplines

SNAAP Questionnaire Topics

1. Formal education and degrees

2. Institutional experience and satisfaction

3. Postgraduate resources for artists

4. Career

5. Arts engagement

6. Income and debt

7. Demographics

SNAAP 2011 Administration Information

Administered in Fall 2011

66 participating institutions: 58 postsecondary and 8 high schools

Over 36,000 total respondents

What is NSSE?

National Survey of Student Engagement

NSSE gives a snapshot of college student experiences in and outside of the classroom by surveying first-year and senior students

NSSE items represent good practices related to desirable college outcomes

Indirect, process measures of student learning and development

NSSE 2012 Administration Information

Administered in Spring 2012

546 participating U.S. institutions

Over 285,000 total respondents

Each year, experimental item sets appended at end of core survey

Methodology: Sample

For this study, participants came from 6 institutions that participated in both the SNAAP 2011 and NSSE 2012 administrations (see the data-assembly sketch after this slide)

Senior NSSE respondents from arts majors in corresponding SNAAP participating programs (n = 222)

Alumni of undergraduate SNAAP programs from graduating cohorts of 2001-2010 (n = 593)
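As a rough illustration of the sample construction described above, the sketch below assembles a combined analysis file from the two surveys. All file names, column names, and codings here are hypothetical placeholders; the actual SNAAP and NSSE data layouts are not part of these slides.

```python
import pandas as pd

# Hypothetical flat files and column names, for illustration only
nsse = pd.read_csv("nsse2012.csv")    # one row per NSSE respondent
snaap = pd.read_csv("snaap2011.csv")  # one row per SNAAP respondent

# Keep only the institutions that administered both surveys
shared = set(nsse["institution_id"]) & set(snaap["institution_id"])

# Senior NSSE respondents in arts majors matching SNAAP-participating programs
students = nsse[
    nsse["institution_id"].isin(shared)
    & (nsse["class_level"] == "senior")
    & nsse["major"].isin(snaap["program"].unique())
].assign(group="student")

# Undergraduate SNAAP alumni from the 2001-2010 graduating cohorts
alumni = snaap[
    snaap["institution_id"].isin(shared)
    & (snaap["degree_level"] == "undergraduate")
    & snaap["cohort_year"].between(2001, 2010)
].assign(group="alumni")

# Stack into one file with a group indicator for the comparisons that follow
df = pd.concat([students, alumni], ignore_index=True)
```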

Variables: SNAAP items

Variables: NSSE items

Data Analysis

Analysis of covariance (ANCOVA) was conducted to determine whether differences in reported satisfaction and skill development exist between graduating seniors and alumni (see the analysis sketch after this slide)

Control variables included gender, race, U.S. citizenship status, and first-generation status

Adjusted means and statistical significance

Cohen’s d as measure of effect size
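A minimal sketch of this analysis for a single outcome is shown below, using ordinary least squares with categorical terms (one common way to fit an ANCOVA) via statsmodels. It assumes the combined data frame `df` from the earlier sketch and hypothetical column names for the outcome and control variables.

```python
import numpy as np
import statsmodels.formula.api as smf

def compare_groups(df, outcome):
    """ANCOVA-style comparison of seniors vs. alumni on one 4-point item,
    controlling for gender, race, U.S. citizenship, and first-generation status."""
    model = smf.ols(
        f"{outcome} ~ C(group) + C(gender) + C(race) + C(citizen) + C(first_gen)",
        data=df,
    ).fit()

    # Covariate-adjusted group means: average model prediction with every
    # respondent assigned to each group in turn
    adj_student = model.predict(df.assign(group="student")).mean()
    adj_alumni = model.predict(df.assign(group="alumni")).mean()

    # Cohen's d: adjusted mean difference scaled by the pooled standard deviation
    s = df.loc[df["group"] == "student", outcome]
    a = df.loc[df["group"] == "alumni", outcome]
    pooled_sd = np.sqrt(
        ((len(s) - 1) * s.var(ddof=1) + (len(a) - 1) * a.var(ddof=1))
        / (len(s) + len(a) - 2)
    )
    d = (adj_alumni - adj_student) / pooled_sd

    # p-value for the group term (the label depends on which level is the reference)
    p = model.pvalues["C(group)[T.student]"]
    return adj_student, adj_alumni, d, p
```

With the difference defined as alumni minus students, a negative d indicates that alumni rate an item lower than graduating seniors, matching the sign convention in the tables that follow.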

Results

Adjusted means comparison for overall rating of institutional experience (4-point scale from “Poor” to “Excellent”) suggests that alumni give higher general appraisals

                       Student Mean   Alumni Mean   Sig.   Effect size (d)
Overall experience         3.27           3.39        *          .17

*p<.05; **p<.01; ***p<.001

Results (cont.)

Adjusted means comparisons for satisfaction with aspects of time at institution (4-point scale after removing “Not Relevant” option) suggest that alumni give lower specific appraisals for certain aspects

                       Student Mean   Alumni Mean   Sig.   Effect size (d)
Academic advising          2.99           2.84        *         -.16
Career advising            2.80           2.44       ***        -.35
Opp. for internships       2.68           2.41       **         -.27

*p<.05; **p<.01; ***p<.001

Results (cont.)

Adjusted means comparisons for amount of institutional contribution to acquired skills and competencies (4-point scale from “Not at all” to “Very much”) show a similar pattern, with alumni giving lower specific appraisals for certain skills

                       Student Mean   Alumni Mean   Sig.   Effect size (d)
Research skills            3.30           3.11       **         -.23
Clear writing              3.21           2.96       ***        -.30
Persuasive speaking        2.96           2.78        *         -.21
Project management         3.21           3.02        *         -.21
Financial & business       2.24           1.92       ***        -.38
Entrepreneurial            2.23           1.99       **         -.27
Leadership skills          3.05           2.88        *         -.21
Networking                 3.07           2.83       ***        -.28

*p<.05; **p<.01; ***p<.001

Discussion

Alumni may be viewing their institutional experience as a whole through rose-colored glasses when they think about “the good old days”

However, when considering more nuanced aspects of their educational experiences, alumni perceptions may have a more lackluster pallor

Discussion

Post-graduation experiences in the workplace may better enable alumni to reflect on certain aspects of their time

Alumni were less satisfied than graduating seniors in the areas of academic advising, career advising, and opportunities for internships or degree-related work

May be the case that as students, respondents do not realize the need for better advising or internships until they enter the workforce

Discussion

Alumni may also learn that they needed to develop some skills more once they have gained work experience

Writing, speaking, networking, and leadership are important aspects of communication that may be experienced differently in an applied setting, such as the workplace, in comparison to a classroom situation

Some task-based procedural skills like research, project management, finance, and entrepreneurship may also be more completely understood and valued once an individual transitions from student to employee

Also possible that once alumni enter the workforce, they reference skill levels in comparison with colleagues who are quite advanced in these skills after years (or decades) of actual use

Limitations

Effect sizes small in magnitude

May not represent ALL students and alumni, data only available for those participating in both SNAAP and NSSE (and those receiving experimental NSSE items)

Relies on self-reported data

However, most studies looking at student self-reports in higher education suggest that self-reports and actual abilities are positively related (Anaya, 1999; Hayek et al., 2002; Laing et al., 1988; Pace, 1985; Pike, 1995)

Conclusions

Important institutional information can be gained through surveying both students and alumni

Students may be better able to provide information about affective components of their experience, while alumni may be better judges of specific things needed in the workplace

Being closer in time to the experience may have an advantage in terms of memory accuracy, but temporal distance may have the advantage of reflective insight

References

Anaya, G. (1999). College impact on student learning: Comparing the use of self-reported gains, standardized test scores, and college grades. Research in Higher Education, 40, 499-526.
Cabrera, A. F., Weerts, D. J., & Zulick, B. J. (2005). Making an impact with alumni surveys. New Directions for Institutional Research, 2005, 5-17. doi:10.1002/ir.144
Hayek, J. C., Carini, R. M., O’Day, P. T., & Kuh, G. D. (2002). Triumph or tragedy: Comparing student engagement levels of members of Greek-letter organizations and other students. Journal of College Student Development, 43(5), 643-663.
Kuh, G. D., & Ewell, P. T. (2010). The state of learning outcomes assessment in the United States. Higher Education Management and Policy, 22(1), 1-20.
Kuh, G. D., & Ikenberry, S. O. (2009). More than you think, less than we need: Learning outcomes assessment in American higher education. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.
Laing, J., Sawyer, R., & Noble, J. (1988). Accuracy of self-reported activities and accomplishments of college-bound seniors. Journal of College Student Development, 29(4), 362-368.
Pace, C. R. (1985). The credibility of student self-reports. Los Angeles: The Center for the Study of Evaluation, Graduate School of Education, University of California at Los Angeles.
Pike, G. R. (1995). The relationship between self-reports of college experiences and achievement test scores. Research in Higher Education, 36(1), 1-22.
Stasz, C. (2001). Assessing skills for work: Two perspectives. Oxford Economic Papers, 53(3), 385-405.
Tait, H., & Godfrey, H. (1999). Defining and assessing competence in generic skills. Quality in Higher Education, 5(3), 245-253.

Questions or Comments?

Contact Information:
Amber D. Lambert: [email protected]
Angie L. Miller: [email protected]

Strategic National Arts Alumni Project: [email protected], snaap.indiana.edu
National Survey of Student Engagement: [email protected], nsse.indiana.edu