
Page 1: Enhancing Understanding and Usability of the GSS through a Response Behavior Survey

Enhancing Understanding and Usability of the GSS through a Response Behavior Survey

Presented to: Reg Baker, Market Strategies

Presented by: Douvan Consulting Group (Julie de Jong, Erin Ferrell, Geon Lee, and Julie Sweetman)

November 14, 2005

Page 2

Outline
1. Introduction, procedures & concerns of the GSS
2. Introduction, procedures & concerns of the RBS
3. Recommendations
   a. Improvements to the GSS
   b. Sampling procedures and frequency of administration for the RBS
   c. Improvements to the RBS design and questionnaire
   d. Other avenues to collect feedback on usability of the GSS
4. Conclusions
5. References

Page 3

Introduction to the GSS and RBS Studies

Page 4

Introduction to the GSS Study
- Funded by the National Science Foundation and the National Institutes of Health
- Census of all science, engineering, and health-related master's and doctorate-granting institutions in the US
- Purpose is to collect numbers, funding information, and demographics of US graduate students and postdoctorates
- 2004 survey: 12,000 departments in over 600 institutions
- Field period: 15 months, beginning every November
- Few major changes since 1980

Page 5

GSS Respondents: Understanding Current Respondents
- Who are they?
  - Institutional coordinators
  - Department heads
  - Support staff
- How are they identified?
  - Through the institutional coordinator for each reporting unit

Page 6

Current Concerns in the GSS
Measurement error in numerous forms:
- NSF/NIH may not know who all the respondents are
- Some respondents may not be the best people for the job
- Respondent records may not match the information requested by the GSS
  - Paper/electronic
  - Centralized/decentralized
  - Individual/aggregate

Page 7

Response Behavior Survey (RBS)
- Follow-up to the postdoctorate portion of the 2004 GSS
- n = 1,500
- Evaluates how data are collected for the GSS:
  - Delegation of responsibility for the GSS
  - Respondent knowledge
  - Record-keeping practices
  - Usability of the web instrument
- Goal is to use RBS data to reduce measurement error in the GSS

Page 8

Response Behavior Survey Concerns

Is the current RBS the most efficient way of evaluating the GSS?

Is the current RBS reaching the correct respondents?

Is the current RBS disseminated in the most efficient manner?

Page 9

Recommendations

Page 10

Circular Improvement Process (Deming, 1986 – 14 points)
[Diagram: a cycle in which improving the GSS helps improve the RBS, and improving the RBS in turn helps improve the GSS]

Page 11

Recommendations
- Improvements to the GSS
- Sampling procedures and frequency of administration for the RBS
- Improvements to the RBS design and questionnaire
- Other avenues to collect feedback on usability of the GSS

Page 12

Improvements to the GSS: Respondent Identification
[Diagram: example institutional reporting structure]
- Maria Johnson, Institutional Coordinator
- Jill Esau, Program Coordinator
- Diane Smith, Chair of Engineering Dept.
- Jim Lepkowski, Chair of Survey Methods Dept.
- Steve Hanna, Chair of Biology Dept.

Page 13

Respondent Identification, cont.

Collect name, e-mail address, and title from both the institutional coordinator and each individual respondent responsible for completing the department portion

Page 14

Respondent Identification at the Department Level

Sample GSS survey question to be completed before data can be submitted at the department level:

Please provide the name, email address, and title of everyone in this department who contributed to the completion of this survey.

Name | Email address | Job title

Page 15

Respondent Identification at the Department Level

The department fills in the required information for each person involved in the completion of the survey:

Please provide the name, email address, and title of everyone in this department who contributed to the completion of this survey.

Name | Email address | Job title
Jim Lepkowski | [email protected] | Dept. Chair
Jill Esau | [email protected] | Prog. Coordinator
Patsy Gregory | [email protected] | Admin. Asst.

Page 16

Respondent Identification at the Department Level

After the names are filled out, the respondent must answer the final survey question in order for the data to be submitted:

Please check the appropriate box next to the name of the person who contributed most to the completion of this survey.

The person who contributed the most to the completion of the survey (check only one box):
[ ] Jim Lepkowski
[ ] Jill Esau
[ ] Patsy Gregory

Page 17

Recommended Sampling Frame for the RBS
Include only the respondent who completed most of the survey for their department.

Advantages:
- Allows selection of the people who had the most hands-on interaction with the GSS
- Still permits flexibility in sampling other names if so desired

Disadvantages:
- Does not necessarily capture the perspectives of people who had different types of involvement with the GSS

Page 18

Possible Solution for Sampling Procedure for RBS
- Compile a list of all GSS respondents and information about them (e.g., title, department)
- After the GSS field period ends, select a sample of respondents from those who indicated that they completed most of the survey
- All selected respondents receive e-mail notification at the same time
- Follow up with a mailed letter to improve response rates as necessary (Kaplowitz et al., 2004)

Page 19

Possible Solution for Sampling Procedure for RBS (cont.)

Advantages:
- Allows for straightforward stratification of respondents (e.g., by institution size, respondent title, department)
- Guaranteed to get a GSS respondent
- Inexpensive programming of the web instrument

Disadvantages:
- Much time may pass between GSS and RBS administration
- Respondents may forget about the GSS
- Respondents may change jobs and/or email addresses

Page 20

Recommended Solution for Sampling Procedure for RBS
- As each respondent submits his/her data, add the respondent to the RBS sampling frame
- Systematic cluster sampling of the RBS frame, on a rolling basis, throughout the entire GSS field period
  - For example, select every fifth respondent added to the frame
- Selected respondents would receive an e-mail invitation to participate in the RBS
- Follow up with a mailed letter to improve response rates as necessary (Kaplowitz et al., 2004)

Page 21

Sampling for RBS: Stratification
Applies to both the possible and recommended solutions:
- Possible solution: stratify by the number of respondents within institutions or by the number of departments within institutions
- Recommended solution: stratify by the number of students in the university
  - Greatest indicator of variability in how respondents will complete the survey

Page 22

Illustration of Recommended Stratified Sampling Procedure

Institution size         | Sampling rate
10,000+ students         | Every zth respondent
2,000 – 10,000 students  | Every yth respondent
Less than 2,000 students | Every xth respondent

(A minimal code sketch of this rolling, stratified selection follows below.)
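For illustration only, here is a minimal Python sketch of the rolling, stratified systematic selection described on Pages 20–22. The skip intervals, the stratum cutoffs' implementation, and the respondent record format are assumptions for the example, not part of the actual GSS or RBS systems.

# Minimal sketch: select every k-th respondent added to the RBS frame, within
# each institution-size stratum. The skip intervals below are illustrative
# stand-ins for the x, y, z of the slide; a random start within the first
# interval could be added for a textbook systematic sample.
SKIP_BY_STRATUM = {"large": 3, "medium": 5, "small": 8}

def stratum_for(enrollment):
    """Assign an institution-size stratum from total student enrollment."""
    if enrollment >= 10_000:
        return "large"
    if enrollment >= 2_000:
        return "medium"
    return "small"

class RollingSampler:
    def __init__(self, skip_by_stratum):
        self.skip = skip_by_stratum
        self.counts = {s: 0 for s in skip_by_stratum}   # respondents seen per stratum
        self.sample = []                                # respondents selected for the RBS

    def add_respondent(self, respondent):
        """Call as each GSS respondent submits data; returns True if sampled for the RBS."""
        s = stratum_for(respondent["enrollment"])
        self.counts[s] += 1
        if self.counts[s] % self.skip[s] == 0:
            self.sample.append(respondent)
            return True
        return False

# Example usage with made-up respondents:
sampler = RollingSampler(SKIP_BY_STRATUM)
for r in [{"name": "Respondent A", "enrollment": 15_000},
          {"name": "Respondent B", "enrollment": 1_200},
          {"name": "Respondent C", "enrollment": 18_000},
          {"name": "Respondent D", "enrollment": 25_000}]:
    if sampler.add_respondent(r):
        print("Invite to RBS:", r["name"])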

Page 23

Considerations While Implementing the Recommended Stratified Sampling Procedure
- Large and small universities will complete the survey in different ways
- Aim is to stratify by homogeneous groups
- May want to sample universities and then people
- Individuals will have differing context effects, both between and within universities

Page 24

Recommended Solution for Sampling Procedure for RBS (cont.)

Advantages:
- GSS experiences will still be fresh in the respondents' minds
- Systematic and stratified sampling
- Ability to use weights in the analysis stage to account for any over-sampling in a rolling list (see the weighting sketch below)
- Guaranteed to get a GSS respondent

Disadvantages:
- May need advanced programming techniques to get a representative and stratified sample
- May lead to increased cost
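As a hedged illustration of the weighting idea mentioned above, base weights can be computed as the inverse of each stratum's realized selection rate; the counts below are invented for the example, not GSS or RBS figures.

# Illustrative base weights: frame count divided by sample count within each stratum.
# All numbers are made up.
frame_counts  = {"large": 400, "medium": 900, "small": 200}   # respondents added to the frame
sample_counts = {"large": 133, "medium": 180, "small": 25}    # respondents selected for the RBS

base_weights = {s: frame_counts[s] / sample_counts[s] for s in frame_counts}
print(base_weights)   # each sampled respondent carries their stratum's weight in analysis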

Page 25

Sampling for RBS
Sample size considerations:
- Respondent burden (Phipps et al., 1995)
  - The RBS is not a short survey
- Sample size dependent on the stratification procedure

Page 26

Sampling for RBS: Requirements for Calculating the Sample Size
- Compute an appropriate n for simple random sampling (SRS), then adjust n_SRS by the design effect (deff) once a clustering and/or stratification design is chosen
- Obtain estimates for these values from previous RBS data
- Statistics needed to compute the sample size:
  - Number of departments within each university
  - Number of respondents within each department
    - These numbers may not be obtainable until after the modified RBS is administered
  - Number of students in each university
- See Kish (1965) for additional guidance (a worked example of the deff adjustment follows below)
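A small worked example of the SRS-plus-design-effect calculation described above; the precision target, intraclass correlation, and cluster size are assumed values for illustration, not estimates from RBS data.

import math

# Worked example: compute an SRS sample size for a proportion, then inflate it by deff.
# All inputs are illustrative assumptions; see Kish (1965) for the underlying formulas.
p = 0.5      # assumed proportion (most conservative choice)
e = 0.05     # desired margin of error
z = 1.96     # 95% confidence level
n_srs = (z**2 * p * (1 - p)) / e**2            # about 384 under SRS

rho  = 0.05  # assumed intraclass correlation within clusters (e.g., departments)
m    = 4     # assumed average number of respondents per cluster
deff = 1 + (m - 1) * rho                       # design effect for clustering, about 1.15

n_required = math.ceil(n_srs * deff)
print(round(n_srs), round(deff, 2), n_required)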

Page 27

Frequency of RBS Administration
Recommended frequency of conducting the RBS:
- Once every 2 years, unless significant GSS or technical changes occur
- Could depend on the amount of changes made to the GSS, and on changes in technology and record-keeping over time
- Cost-efficient way to obtain data for improvements, particularly due to the overlapping nature of the GSS from year to year

Page 28

Improvements to the RBS
Additional questions to ask on the RBS:
- Obtain more details regarding respondent record-keeping practices
- Include a question to determine the frequency with which records are updated
- In the future, this information could be used to tailor the GSS to the format of records for each institution, or for different types of institutions (e.g., large and small)

Page 29

Adapt / Improve RBS Questions

Expert review of RBS instrument for both questionnaire design problems and interface usability

Laboratory experiments to guide questionnaire development and to evaluate and improve interface usability

Page 30

Expert Review
- A group of "experts" in different fields is needed to review the questionnaire
- Review the survey itself, as well as usability of the instrument
- Comments in open-ended form
- Presser & Blair (1994) concluded that, overall, expert review identified more problems than cognitive interviewing
- Inexpensive solution for efficient questionnaire design

Page 31

Laboratory Evaluations for Questionnaire Development
- "Think-aloud" interview
  - Combination of the respondent's thinking aloud and the interviewer's nondirective probing
- Respondent debriefing
  - Investigates whether respondents understand questions in the way intended by the survey designer
- Behavior coding
  - Occurs as the respondent completes the questionnaire
  - Identifies the location of problems in the questionnaire
  - Uses a "coder" and coding sheets

(DeMaio et al., 1998; Willis et al., 1999)

Page 32

Analyses with RBS Data to Improve the GSS
- Examine responses to certain RBS questions and match them to actual GSS data to measure reliability
  - E.g., "Did your institution have any post-docs?"
- Repeat other survey questions to measure reliability
- If GSS ≠ RBS, use regression models to understand predictors of reliability
  - E.g., size of institution, or the title of the respondent who completed most of the survey (a minimal modeling sketch follows below)
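A minimal sketch of the kind of regression model this could involve, assuming a 0/1 indicator of GSS/RBS agreement per respondent; the use of scikit-learn, the predictor variables, and all data values are assumptions for illustration, not the authors' specified analysis.

# Minimal sketch: model whether an RBS answer agreed with the GSS record (1 = agree)
# as a function of institution size and respondent title. All data are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: enrollment in thousands of students, 1 if the respondent is a department chair.
X = np.array([[15, 1], [12, 0], [0.8, 1], [22, 0], [0.9, 1],
              [5, 1], [18, 0], [2.5, 0], [30, 1], [1.5, 0]])
y = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1])   # 1 = RBS response matched GSS data

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("P(agree) for a 10,000-student institution with a chair respondent:",
      model.predict_proba([[10, 1]])[0, 1])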

Page 33

Analyses with RBS Data (cont.)
- Examine responses to questions about the usefulness of the paper survey and modify the paper version of the GSS as necessary
  - Possibly provide links to PDF files of the GSS for easy downloading
- Analyze responses from open-ended questions to see if useful data are gained; use them to develop coding categories that lessen respondent burden by creating closed-ended questions
  - Example: the 2004 RBS asked the open-ended question "Where do you get the CIP codes for your department?"; responses could be used to develop pre-coded categories for the next RBS

Page 34

Other Avenues to Collect Feedback on Usability of the GSS
- Lab experiments and expert review with the GSS questionnaire, both in questionnaire design and usability of the web interface
- Paradata (Couper, 1998, 2005)
  - It is possible to learn a lot about respondent behavior without even asking respondents!
  - Current advances in paradata analysis are furthering usability research
  - Compare respondent information to keystroke records, use of help screens, time of GSS completion, etc.
  - E.g., use paradata to examine which help screens are used most and to further improve upon those (see the sketch below)
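To make the paradata idea concrete, here is a minimal Python sketch that tallies help-screen openings from an event log; the log format and screen names are assumptions for illustration, not the actual GSS keystroke file layout.

# Minimal sketch: count how often each help screen is opened, using paradata events.
# The event structure below is an illustrative assumption.
from collections import Counter

events = [
    {"respondent": "r1", "action": "help_open", "screen": "postdoc_definitions"},
    {"respondent": "r1", "action": "keystroke", "field": "q3"},
    {"respondent": "r2", "action": "help_open", "screen": "cip_codes"},
    {"respondent": "r3", "action": "help_open", "screen": "postdoc_definitions"},
]

help_usage = Counter(e["screen"] for e in events if e["action"] == "help_open")
for screen, n in help_usage.most_common():
    print(screen, n)   # the most-used screens are the first candidates for improvement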

Page 35

Conclusions

Page 36

Conclusions
- We see this process as circular, with continuous improvement as the goal
- Improvements to the RBS will improve the GSS, which will improve the RBS, and so on
- W. Deming's cycle of continuous improvement (http://www.asq.org)

Page 37

Conclusions (cont.): Summary of Recommendations
1. Improve respondent identification in the GSS
2. Use improved identification to sample for the RBS on a rolling, stratified basis, every 2 years
3. Use lab experiments and expert reviews to improve the RBS questionnaire
4. Perform analyses with RBS data to further improve the GSS
5. Use other avenues to collect feedback on the GSS

Page 38

Conclusions (cont.)
- Implementing these recommendations will increase costs because of advanced programming techniques, questionnaire reviews, and analyses of RBS data
- However, these increased costs are not unreasonable given the current budget
- Web surveys are relatively inexpensive compared to other modes, allowing for the reasonable implementation of our suggestions (Schaeffer, 2001)

Page 39

References

Page 40

References
Couper, M.P. (2005). Technology trends in survey data collection. Social Science Computer Review, 23(4), 486-501.
Couper, M.P. (1998). Measuring survey quality in a CASIC environment. Proceedings of the Survey Research Methods Section, American Statistical Association, pp. 41-49.
DeMaio, T., Rothgeb, J., & Hess, J. (1998). Improving survey quality through pretesting. Statistical Research Division Working Papers in Survey Methodology #98-03. Washington, DC: U.S. Bureau of the Census. (Available at http://www.census.gov/srd/papers/pdf/sm98-03.pdf)
Deming, W.E. (1986). Out of the Crisis. Cambridge, MA: MIT Center for Advanced Engineering Study.
Kaplowitz, M.D., Hadlock, T.D., & Levine, R. (2004). A comparison of web and mail survey response rates. Public Opinion Quarterly, 68, 94-101.
Kish, L. (1965). Survey Sampling. New York: Wiley.
Phipps, P.A., Butani, S.J., & Chun, Y.I. (1995). Research on establishment survey questionnaire design. Journal of Business and Economic Statistics, July, 337-346.
Presser, S. & Blair, J. (1994). Survey pretesting: Do different methods produce different results? In P.V. Marsden (Ed.), Sociological Methodology, 24, 73-104. Washington, DC: American Sociological Association.
Schaeffer, E. (2001). Web surveying: How to collect important assessment data without any paper. Office of Information & Institutional Research, Illinois Institute of Technology.
Willis, G., Schechter, S., & Whitaker, K. (1999). A comparison of cognitive interviewing, expert review, and behavior coding: What do they tell us? Proceedings of the American Statistical Association (Survey Research Methods Section). Alexandria, VA: American Statistical Association, 28-37.

Page 41

Final Recommendations

1. Improve respondent identification in the GSS

2. Use improved identification to sample for RBS on a rolling, stratified basis, every 2 years

3. Use lab experiments and expert reviews to improve RBS questionnaire

4. Perform analysis with RBS data to further improve GSS

5. Use other avenues to collect feedback on GSS