Matt EEN Presentation June 2008


Slide 1/12

Three Years into Building a Network of Environmental Evaluators:

Where do we stand? Where should we head?

Environmental Evaluators Networking Forum
Third Annual Meeting; Washington, DC

June 2008

Matt Birnbaum
Evaluation Science Officer

    National Fish & Wildlife Foundation

Slide 2/12

Overview

1) Summary of Evaluators' Assets and Concerns
2) Focus on Specific Methodological Issues Needing to Be Addressed
3) Role of AEA, EEN, and Other Entities in Building Capacity

Slide 3/12

Methodology

1. Online survey design administered as part of EEN registration: 8 sets of closed- and/or mixed-ended questions, plus 2 open-ended questions.

2. 124 respondents at the 2008 Forum; 117 respondents at the 2007 Forum; 88 respondents at the 2006 Forum.

3. Method of data analysis: closed-ended questions = descriptive statistics, principally frequencies (see the sketch after this list); open-ended questions = content analysis.

4. Nominal group methodology was applied and interpreted in data collection and content analysis from 96 participants involved with the strategic planning sessions held during the 2006 Forum, supplemented by a set of unstructured interviews in 2007 with emerging leaders in the Network.
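
As a minimal sketch of the closed-ended analysis in item 3 (descriptive statistics, principally frequencies), the Python tabulation below shows the kind of frequency table these slides report. The response values are hypothetical counts back-calculated from the 2008 organizational-type percentages, not the actual survey records, and the function name is ours.

```python
from collections import Counter

# Hypothetical responses to one closed-ended registration question
# (organizational type). Counts are back-calculated from the 2008
# percentages for illustration; the real survey records are not shown.
responses = (
    ["Academic"] * 14
    + ["Federal"] * 68
    + ["Non-Profit/Foundation"] * 19
    + ["Private Sector"] * 23
)

def frequency_table(values):
    """Count and percentage of each response category."""
    counts = Counter(values)
    total = len(values)
    return {category: (n, 100.0 * n / total) for category, n in counts.items()}

for category, (n, pct) in sorted(frequency_table(responses).items()):
    print(f"{category:<25} n={n:<4} {pct:.0f}%")
```

The open-ended questions call for content analysis, a qualitative coding step that a frequency table like this cannot replace.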

Slide 4/12

Key Attributes of EEN Participants

- Diversity among Participants (geography, education, organizational affiliation)
- Variations in Participants' Connection to Evaluation
- Commonalities of Views on Issues of Concern
- Initial Themes Emerging for Priorities

Slide 5/12

Diversity of Respondents: Organizational Type

                              2006 %   2007 %   2008 %
Academic                          12       10       11
Federal                           44       46       55
Non-Profit/Foundation             24       22       15
Private Sector                    20       18       19
State/Regional/Local/Tribal        1        3        1
Total Respondents                 86      117      124

Slide 6/12

Diversity of Respondents: Geographic Area

                      2006 %   2007 %   2008 %
DC Metro Area             62       65       72
Northeast US              12        9        9
Southeast US               7        7        7
Midwest US                 6        8        4
Mountain West US           1        3        2
Pacific US                 9        4        4
International              4        4        4
Total Respondents         86      117      124

Slide 7/12

Diversity of Respondents: Education Level

At the master's level, people were more likely to be in a professional field:
- 57% in a professional field (74% in 2007)
- 20% in either the social sciences or the life sciences

At the doctoral level, most studied in a traditional science field:
- 29% in a life science discipline (34% in 2007)
- 22% in a traditional social science discipline (29% in 2007)
- 43% in a professional program (37% in 2007)

                   2006 %   2007 %   2008 %
Undergrad Degree       13       11        8
Masters Degree         50       52       51
PhD                    32       33       37
Other                   5        4        4
Total N                78      113      124

Slide 8/12

Participants' Connection to Evaluation

Respondents vary in the time they spend on evaluation at work.

Slide 9/12

Lack of Knowledge of Many Evaluation Approaches

                                             2007 %   2008 %
Auditing                                          7       10
CBA/CEA (cost-benefit/cost-effectiveness)        26       26
Experimental Design                              28       28
Multi-Level (e.g., Site)                         28       22
Participatory                                    30       27
Impact                                           32       32
Needs Assessment                                 34       32
Process                                          38       34
Performance Mgt                                  NA       39

Slide 10/12

…and Leads to Predictable Issues of Concern:

Evaluation designs and methods were reported as concerns by more than half of the responses:

1. The illness of "metricitis."
2. Evaluative capacity building:
   a. Getting top-management buy-in, including appropriate budgets for monitoring and evaluation.
   b. Improving the processes for utilizing knowledge generated by evaluations in policy making.
3. Accounting for the human dimension:
   a. Estimating changes in human behaviors (frequently in addressing some …).
   b. Assessing changes associated with collaboration.

Slide 11/12

Initial Themes Emerging for Short-Term Priorities

We asked: What are the 1-2 highest technical and/or institutional priorities that environmental evaluators need to address over the next couple of years?

You responded:

1. Improving evaluative capacity building across the network (52%).
2. Improved designs and methods:
   a. Confounding variables and forces outside the control of many programs (e.g., climatic patterns).
   b. Constructing credible indicators for assessing both socioeconomic and ecological patterns.
   c. Assessments of human dimensions in causal models.

Slide 12/12

Future Directions

Assumption: Those doing evaluation in this arena come from many disciplines, and many have no affiliation with AEA. This means networks need to be created to bridge these boundaries.

We need new ways of thinking about laboratories for experimentation: balancing top-down intellectual labor on these and other sticky issues with bottom-up places for testing and refining new approaches.

Methodological concerns are grave given continued rising demands for outcomes-based information in other arenas.

We need to fight for and support those demanding open and common standards, and to reward learning over accountability.