Addressing Diversity in Planning, Implementation, and Evaluation of STEM …


ABSTRACT

This project is designed as an initial step to 1) establish criteria for conducting a culturally responsive evaluation of STEM programs and 2) establish criteria for assessing cultural responsiveness in STEM program planning. We recruited staff from education programs conducting evaluations throughout New York, as well as evaluators nationally who identified a concern with diversity or education issues in evaluation through membership in American Evaluation Association topical interest groups. We employed a standard concept mapping methodology in which participants were asked to generate statements about behaviors and attitudes toward diversity in evaluation and program planning. Additionally, relevant statements were taken from an analysis of the extant literature on conducting culturally competent evaluations and added to the data set produced from participant responses. The compiled list of statements from the initial phase of data collection was used to enrich principles of culturally responsive evaluation with examples of behaviors that practitioners engage in during planning, implementation, and evaluation. In future research, the statements will be analyzed to produce an annotated list of culturally competent behaviors that evaluators and staff can engage in to address the needs of a diverse population.

BACKGROUND

For decades, Americans have witnessed an achievement gap between students of color and students of low socioeconomic status, on the one hand, and middle-class and white students, on the other. This phenomenon is particularly disturbing when considering the importance of science and technology to the world economy. According to Kuenzi (2008):

• America now ranks 24th in science literacy and 26th in math literacy on a list of the top 40 most industrialized nations.
• The federal government has invested over $3 billion to increase minority participation in STEM.

Thus, the accountability and effectiveness of such programs should be of growing concern. In this research, we take the perspective that program initiatives aimed at minority students must operate in a culturally responsive manner in order to be effective. Accordingly, we emphasize the importance of assessing culturally responsive behaviors by program staff as well as the importance of conducting culturally responsive evaluations of STEM programs.

CULTURALLY RESPONSIVE EVALUATIONS

The literature on culturally responsive evaluation (CRE) offers several principles that can help guide culturally responsive evaluations. However, many of the principles are generic and vague. A review of the literature resulted in the following principles:

• Self-determination and stakeholder involvement
• Building trust and facilitating communication with multicultural staff
• Understanding the accuracies and possible inaccuracies of group cultural context
• Engaging a multifaceted approach to data collection
• Choosing appropriate measures and measure formats

However, the literature surrounding these principles often results in questions about how to implement and apply each idea in a practical setting. For instance:

• How does a program or evaluator elicit stakeholder involvement?
• How does a program or evaluation team “diversify” its staff?
• How does a program or evaluator go about “understanding” the group they are working with?

METHODS

PARTICIPANTS

N = 48 members from different organizations (only 41 responded to the demographic questions)

• 50% self-identified as having involvement with STEM, 4-H, or other related programs
• 40% of respondents identified as working with a primarily low-SES or racial/ethnic minority population
• 82.5% were involved in planning or implementation of the evaluation of a program
• 52.5% were involved in both the evaluation and the program planning and implementation

MATERIALS
1) Concept Systems Global (internet software)
2) KWIC Concordance (Keyword in Context)

PROCEDURE

Preparing for concept mapping
Initial preparation of the concept mapping exercise included generation of the focus statement. The focus statement is a prompt intended to capture brainstormed ideas relevant to the focus of the concept mapping study. In this project, the intent of the concept mapping study was to capture the set of behaviors that staff and evaluators use in response to the needs of diverse populations in STEM initiatives.

Focus statement: “One specific thing I do to be culturally responsive in planning, implementation, and/or evaluation of social and education programs is....”

Generating the ideas
• Participants were invited to participate via an email containing a link to the website
• Responded to 7 anonymous demographic questions
• Had unlimited time to respond to the focus prompt by completing the statement
• Could list as many ideas as desired

ANALYSIS

The analysis consisted of five steps:

1) identifying compound statements and separating them into distinct statements/ideas,
2) making statements grammatically correct,
3) reducing the statement set from 124 to 100,
4) performing a keyword-in-context analysis to find key words, and
5) using keywords to extract themes from the set of statements.

The original statement set, consisting of 93 responses from 48 participants, became 123 statements when compound ideas were disaggregated.

We performed a keyword-in-context analysis using KWIC Concordance software (Kane & Trochim, 2007) to identify keywords that were used several times in the response set and the statements in which each keyword was used. In this way, several keywords were identified (e.g., research, include, culture). Researchers then identified each statement linked to a keyword and derived the key themes based on the analysis of each keyword in context and the meaning of all statements linked to each keyword. A similar analysis of the extant literature on culturally responsive evaluation in general (not limited to STEM programs) was also conducted in order to compare and contrast respondent statements with key ideas in the literature.
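For readers unfamiliar with keyword-in-context analysis, the minimal Python sketch below illustrates the general idea: recurring words are tallied across the statement set, and each occurrence is listed with its surrounding words so a researcher can read it in context. The example statements, frequency threshold, and window size are hypothetical; the study itself used the KWIC Concordance software rather than custom code.

import re
from collections import defaultdict

# Hypothetical cleaned statements; the real set contained ~100 responses.
statements = [
    "include members of the community in all areas of program planning.",
    "seek culturally responsive evaluation theory to guide my evaluation practice.",
    "ensure that language in instrument development is culturally sensitive to stakeholders.",
]

def keyword_in_context(statements, min_count=2, min_length=4, window=4):
    """Tally candidate keywords and collect each occurrence with its context.

    Returns a dict mapping each recurring word to a list of
    (statement_index, context_string) pairs, mimicking a simple KWIC listing.
    """
    counts = defaultdict(int)
    contexts = defaultdict(list)
    for i, s in enumerate(statements):
        tokens = re.findall(r"[a-z']+", s.lower())
        for j, tok in enumerate(tokens):
            counts[tok] += 1
            left = tokens[max(0, j - window):j]
            right = tokens[j + 1:j + 1 + window]
            contexts[tok].append((i, " ".join(left + ["<" + tok + ">"] + right)))
    # Keep only words frequent and long enough to be candidate keywords.
    keywords = {w for w, c in counts.items() if c >= min_count and len(w) >= min_length}
    return {w: contexts[w] for w in keywords}

kwic = keyword_in_context(statements)
for word, occurrences in sorted(kwic.items()):
    print(word)
    for statement_index, context in occurrences:
        print("   statement", statement_index, ":", context)

In this toy run the recurring words "culturally" and "evaluation" surface as candidate keywords; in the study, the researchers read every statement linked to each such keyword to derive the themes reported below.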

RESULTS

Self-determination and stakeholder involvement

Include members of the community and stakeholders in all areas of program/evaluation planning
• “…focus on making sure diverse individuals are involved as participants, planners, and evaluators.”
• “…to develop and use a stakeholder panel or advisory group to assist in the analysis of data…”

Use cultural informants
• “…speak to a member of the group about potentially relevant topics.”
• “…routinely conduct focus groups or interviews with key informants around methods utilized.”

Address concerns of community members
• “…validate that there is much wisdom and many perspectives within the group.”
• “…use information about barriers to participation to help me to keep changes for future program activities.”

Building trust and facilitating communication with multicultural staff

Hire diverse employees
• “…ensure that we recruit from that population whenever possible.”
• “…be sure that some of the people that conduct programs are authentic.”

Hold sensitivity trainings

Maintain reciprocal communication, dialogues
• “…hold discussions with groups to get their insights on what is not working and how it might work better.”
• “…share evaluation results.”

Represent the lived and historical experiences of the group
• “…strive for balanced/mixed groups…”
• “…assemble an evaluation team whose collective lived experiences are appropriate to context of the evaluand.”

Understand the normative and/or formal language
• “…use terms or concepts in the evaluation that relate to their life experiences.”
• “…ensure that language in instrument development is culturally sensitive to stakeholders.”

Understanding the accuracies and possible inaccuracies of group cultural context

Be understanding, considerate, and aware of the group
• “…be aware of past injustice or harm that has resulted from evaluation or research with this population.”
• “…read about the population/area that the population resides in.”

Apply relevant cultural theory when possible
• “…seek culturally responsive evaluation theory to guide my evaluation practice.”
• “…avoid framing cultural variables in terms of problems or deficits.”

DISCUSSION

A review of the literature on CRE revealed several principles that can guide culturally responsive program planning and evaluation. However, though the principles are helpful, they are also generic statements of what an evaluator or program planner should try to do, without a specific mechanism for how to apply each principle practically. The preliminary data collection in this project serves to:

• enrich principles and theories in the literature with behavioral examples of how theory about cultural responsiveness translates into action;
• explore the extent to which language and methodology are used consistently across the field of evaluation and among practitioners; and
• consolidate the academic knowledge base on culturally responsive evaluation with the practical experience of practitioners.

It also serves as a preliminary step in creating a consolidated list of actions that can operationalize culturally competent behaviors for practical implementation (i.e., a checklist) in an accessible and distributable manner.

There are clear themes in the literature on evaluation regarding approaches to culturally diverse populations. However, these themes are usually globalized, abstract principles rather than concrete, operationalized behaviors. There are many instances in the literature of highly specific examples of how evaluations are conducted in a particular context. However, this type of information is not centralized and does not always have relevance to different programs working with various populations. The data from this study were gathered directly from individuals working in STEM fields and provide more specific examples of behavior.

Future Directions

A series of future studies has been designed to extend this work:

Study 1
1) To what extent do practitioners perceive each of the principles and behaviors compiled in the pilot study as feasible and important to CRE?
2) To what extent is there consistent use of language and methods in CRE?
• Evaluators and STEM program staff
• Sorting task and rating task (a sketch of the typical sort-data aggregation follows this section)
• Will result in a taxonomy of behaviors

Study 2
1) Will a student perspective provide more concrete behaviors to supplement the generalized principles drawn from the literature and from program staff/evaluators?
• Students from a STEM pipeline program such as Tech-PREP
• Statement generation, sorting task, and rating task
• Acculturation scale, attitudes toward STEM careers
• Will result in a taxonomy of behaviors as well as qualitative data on attitudes

Study 3
1) Determine the extent to which students receiving the benefits of STEM programs agree with program staff on what constitutes culturally competent behavior.
2) Attempt reconciliation of the two lists in ways that incorporate the most important and feasible culturally competent behaviors and principles from all perspectives gathered; each of the two lists generated will be used by evaluators assessing STEM programs for cultural competence.
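Under the standard concept mapping methodology (Kane & Trochim, 2007), the sorting tasks planned for Studies 1 and 2 would first be aggregated into a statement-by-statement similarity matrix before multidimensional scaling and cluster analysis yield the taxonomy of behaviors. The Python sketch below illustrates only that aggregation step; the pile data, statement IDs, and function name are hypothetical, and the actual analysis would be run in Concept Systems Global rather than custom code.

from itertools import combinations

# Hypothetical sorting-task data: each participant groups statement IDs (1..5) into piles.
sorts = [
    [[1, 2, 5], [3, 4]],   # participant A's piles
    [[1, 2], [3, 4, 5]],   # participant B's piles
]
n_statements = 5

def cooccurrence_matrix(sorts, n):
    """Count how many participants placed each pair of statements in the same pile.

    This co-occurrence (similarity) matrix is what standard concept mapping
    feeds into multidimensional scaling and hierarchical clustering.
    """
    m = [[0] * n for _ in range(n)]
    for piles in sorts:
        for pile in piles:
            for a, b in combinations(pile, 2):
                m[a - 1][b - 1] += 1
                m[b - 1][a - 1] += 1
    return m

for row in cooccurrence_matrix(sorts, n_statements):
    print(row)

Statements that many participants sort together end up with high similarity values and therefore cluster together, which is how the planned sorting and rating tasks would produce a taxonomy of culturally competent behaviors.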

REFERENCES

Kuenzi, J. J. (2008). Science, Technology, Engineering, and Mathematics (STEM) education: Background, federal policy, and legislative action. Congressional Research Service. Retrieved September 26, 2009, from www.fas.org/sgp/crs/misc/RL33434.pdf

Kane, M., & Trochim, W. M. K. (2007). Concept mapping for planning and evaluation. Thousand Oaks, CA: Sage Publications.

Steps in preparing original statements

EXAMPLE: “…ask members of the population we serve to participate in planning the program. This might include serving on an advisory committee, responding to a proposed activity, participating in a focus group related to the program.”

STEP 1: Is the response represented in paragraph form? If yes, then list each statement separately.
Yes.
Statement 1: …ask members of the population we serve to participate in planning the program.
Statement 2: This might include serving on an advisory committee, responding to a proposed activity, participating in a focus group related to the program.

STEP 2: Does either statement represent compound ideas that should be separated?
Yes, statement 2 describes three activities that can be separated into three statements:
• This might include serving on an advisory committee.
• …responding to a proposed activity.
• …participating in a focus group related to the program.

STEP 3: Is each statement grammatically correct, and does it complete the focus prompt (“One specific thing I do to be culturally responsive in planning, implementation, and/or evaluation of social and education programs is....”)?
Statement 1 completes the thought, but each distinct idea in statement 2 needs to be adjusted to finish the thought. The new list of statements from the one original response becomes:
• …ask members of the population we serve to participate in planning the program.
• …ask members of the population we serve to serve on an advisory committee.
• …ask members of the population we serve to respond to a proposed activity.
• …ask members of the population we serve to participate in a focus group related to the program.
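To make the mechanics of Steps 1 and 2 concrete, here is a minimal Python sketch that splits a paragraph-style response into candidate statements and separates one compound statement into its listed activities. The response text mirrors the worked example above; the function names are illustrative, and in the study these steps were carried out by researchers by hand, with Step 3 (rewriting each fragment so it completes the focus prompt) remaining a manual edit.

import re

# The worked example above, as one raw participant response (illustrative only).
response = ("...ask members of the population we serve to participate in planning the program. "
            "This might include serving on an advisory committee, responding to a proposed "
            "activity, participating in a focus group related to the program.")

def step1_split_sentences(text):
    """Step 1: list a paragraph-style response as separate candidate statements."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def step2_split_compound(statement):
    """Step 2: separate a statement that lists several activities into one
    candidate statement per activity. The comma split is only a first pass;
    researchers review each fragment before accepting it."""
    body = re.sub(r"^This might include\s+", "", statement)
    parts = [p.strip(" .") for p in body.split(",") if p.strip(" .")]
    return parts if len(parts) > 1 else [statement]

for statement in step1_split_sentences(response):
    for fragment in step2_split_compound(statement):
        print(fragment)

# Step 3 - adjusting each fragment to complete the focus prompt (e.g.
# "...ask members of the population we serve to serve on an advisory committee.") -
# requires human judgment and is not automated here.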