
  • Professional Development for Afterschool Practitioners The First Year of the Palm Beach County Afterschool Educator Certificate Program

    Stephen Baker Tracey Lockaby Kai Guterman Kathleen Daley Susan Klumpner

    2011

  • Professional Development for Afterschool Practitioners

    Stephen Baker, Tracey Lockaby, Kai Guterman, Kathleen Daley, Susan Klumpner

    Recommended Citation

    Baker, S., Lockaby, T., Guterman, K., Daley, K., & Klumpner, S. (2011). Professional Development for Afterschool Practitioners. Chicago: Chapin Hall at the University of Chicago.

    ISSN: 1097-3125

    © 2011 Chapin Hall at the University of Chicago

    Chapin Hall at the University of Chicago 1313 East 60th Street Chicago, IL 60637

    773-753-5900 (phone) 773-753-5940 (fax)

    www.chapinhall.org

  • Acknowledgments

    This report would not have been possible without the benefit of guidance and information from participants in the Palm Beach County Afterschool Educator Certificate Program. We owe special thanks to staff at Prime Time who worked closely with us, responding to requests throughout the year and offering information and helpful insights. We are grateful to the program directors, front-line staff, and other participants in the Palm Beach County Afterschool Educator Certificate training who participated in our surveys and made time in their busy days to talk with us about their experiences. We also wish to thank Julie Spielberger and Anne Clary, who assisted with the editing of this report.

    Finally, we thank Prime Time Palm Beach County, Inc. for supporting the evaluation.

  • Table of Contents

    Introduction, Overview, and Background......................................................................................................1

    Overview and Summary of This Report....................................................................................................2

    What are the characteristics of individuals who participated in PBC-AEC?........................................2

    What did participants think about PBC-AEC prior to training and as they were participating? ..........2

    What were the perceived immediate and short-term effects of training? .............................................3

    What were the perceived effects of training on longer-term outcomes?...............................................3

    Prime Time Palm Beach County, Inc.: History and Context ....................................................................4

    Quality Improvement ............................................................................................................................4

    Professional Development ....................................................................................................................5

    Community Engagement and Supports.................................................................................................5

    Summary of the Identified Need for PBC-AEC ...................................................................................6

    Palm Beach County Afterschool Educator Certificate Program ...................................................................7

    PBC-AEC Values and Goals .....................................................................................................................8

    PBC-AEC Structure and Course Curriculum............................................................................................9

    PBC-AEC Implementation and Context .................................................................................................11

    The Evolution of the PBC-AEC during the Pilot Year ...........................................................................13

    Focus and Research Questions for 2009–2010 Evaluation .........................................................................15

    Methods ...................................................................................................................................................16

    Participant Survey ...............................................................................................................................16

    Telephone Interviews ..........................................................................................................................17

    Secondary Data ...................................................................................................................................17

    Data Analysis and Presentation...........................................................................................................17

    Implementation Study Findings...................................................................................................................19

    What Are the Characteristics of Individuals Participating in PBC-AEC? ..............................................19

  • Demographic Characteristics ..............................................................................................................19

    Educational Characteristics .................................................................................................................20

    Professional Characteristics ................................................................................................................20

    What Did Participants Think about PBC-AEC Prior to Training and as They Were Participating? ......21

    Survey Findings: Engagement, Recruitment, and Concerns prior to Starting Training .....................21

    Interview Findings: Engagement, Recruitment, and Concerns prior to Starting Training .................22

    Survey Findings: Experiences with the Program ................................................................................24

    Interview Findings: Experiences with the Program ............................................................................26

    Survey Findings: Satisfaction with the Overall Program and Specific Courses .................................28

    Interview Findings: Satisfaction with the Overall Program and Specific Courses .............................30

    What Were the Perceived Immediate and Short-Term Effects of Training? ..........................................32

    Survey Findings: Immediate and Short-Term Effects of Training .....................................................32

    Interview Findings: Immediate and Short-Term Effects of Training .................................................34

    What Were the Perceived Effects of Training (and Other Factors) on Longer-Term Outcomes?..........36

    Survey Findings: Effects on Longer-Term Outcomes ........................................................................36

    Interview Findings: Effects on Longer-Term Outcomes ....................................................................37

    Summary and Questions Going Forward.....................................................................................................39

    Diagramming PBC-AEC with a Logic Model ........................................................................................40

    Questions Going Forward .......................................................................................................................41

    What parts of the program should be retained?...................................................................................41

    How can PBC-AEC successfully blend its enriching and assessing elements?..................................41

    How should we interpret concerns raised about the amount of time the course takes?......................42

    How can we learn more about the specific elements of the High/Scope curriculum that constitutes an important part of training?...................................................................................................................42

    How can PBC-AEC respond to participant interests in ongoing support and training? .....................43

    What is the appropriate role of financial incentives in the PBC-AEC in the short and long term?....43

  • How should targeting and recruiting balance various immediate program interests and the interests of possible research designs to identify program outcomes?..............................................................43

    What is the specific contribution of PBC-AEC in shaping long-term outcomes, and how might this differ among participants?...................................................................................................................44

    Conclusion...............................................................................................................................................46

    Bibliography ................................................................................................................................................47

    Appendices...................................................................................................................................................49

  • List of Figures

    Figure 1. High/Scope Youth Development Approach as Pyramid of Youth Needs (adapted from Akiva, 2007) ............................................................................................................................................................11

    Figure 2. PBC-AEC Cycle of Learning, Trying, Reflecting, and Refining.................................................27

  • List of Tables

    Table 1 Incentive Payment Schedule for Last Three Cohorts of Pilot Year...............................................12

    Table 2 Description of Training Cohorts and Changes ..............................................................................14

    Table 3 Perceptions of the AEC prior to Program Start (as Recalled at Program Completion) that Influenced the Decision to Participate and Contributed to Expectations of the Program ...........................22

    Table 4 Survey Items Using Similar Language: Changes in Strong Agreement prior to Program (as Recalled) and at End of Program.................................................................................................................25

    Table 5 Perceptions of What Participants Learned from PBC-AEC Training ...........................................29

  • Chapin Hall at the University of Chicago 1

    Introduction, Overview, and Background

    A critical effort of the afterschool field in recent years has been to move afterschool programs beyond a deficit-based focus on reducing the known risks of unsupervised time and toward more intentional and effective positive developmental activities (Yohalem et al., 2009; Miller, 2005; Pittman et al., 2008). Research has identified, however, specific challenges to this enterprise. One is that positive youth development outcomes appear to be primarily products of high-quality programs (e.g., focused, active, explicit), rather than of diffuse, lower-quality programs (Durlak & Weissberg, 2007). A second is that many afterschool programs currently fall below critical quality thresholds for generating these positive outcomes (Granger, 2008). Given the need for high quality and its frequent absence in the field, accomplishing a transition to effective youth development as a dominant paradigm in the afterschool field will require successful quality improvement models and their broad application.

    There are many potential levers for increasing afterschool program quality. These include changes at the systemic, public policy, and funding levels (e.g., expanded and targeted funding, opportunities for sharing expertise among similar agencies, requirements for staff certification), at the individual program or organizational level (e.g., staff recruiting, staff mentoring), and at the point of service (e.g., curricula, effective assessment processes). Staff training is also a potentially important contributor to quality, and the specific structure and other attributes of training help determine its effectiveness. For example, training appears to be more effective when it is ongoing rather than a one-time event (BEST, 2002; HFRP, 2005), is hands-on and relates to staff’s actual work with youth (NIOST, 2001) while connecting to the underlying theory or rationale for the new practice, and is provided in a setting perceived as safe for trying new approaches (Metz, 2009).

    Providing training with these qualities, attracting staff to it, and ensuring that participants are enthusiastic, however, can be difficult for a range of reasons. Training approaches too often default to didactic, lecture-style formats that make training content harder to engage with. The field is characterized by staffing patterns that typically include part-time and full-time staff with varied educational backgrounds and professional experience. This can complicate efforts to tailor the content of training to the background of those trained (Kelley, 1999). There are numerous disincentives for individuals to voluntarily obtain training, or for organizations to require it of their staff. For example, in previous studies of quality improvement in Palm Beach County, Chapin Hall researchers found that many afterschool practitioners do not plan to pursue careers in the afterschool field because of low pay and the prevalence of part-time or, in some cases, short-term positions. Such weak attachment to the field enters into the cost-benefit calculations for individuals and agencies alike when considering investments in training and other professional development. A further disincentive is that staff who increase their skills are rarely financially rewarded for doing so. At the organizational level, supervisors are often not in a position to pay staff to attend training. At the same time, family and other personal responsibilities (including college requirements for some staff) can make it difficult for directors and front-line staff to prioritize time for training outside of work.

    As one response to these and other challenges, Prime Time Palm Beach County, Inc. (Prime Time) piloted a training program called the Palm Beach County Afterschool Educator Certificate (PBC-AEC) Program in the fall of 2009. That program, the focus of this report, provided front-line and supervisory staff with 12-week courses on youth development and afterschool practice, along with financial supports and incentives. Chapin Hall at the University of Chicago was contracted to conduct a process evaluation of the pilot year of implementation.

    Overview and Summary of This Report

    Key findings of this report are identified very briefly below. These findings are followed by an overview of the full report.

    What are the characteristics of individuals who participated in PBC-AEC?

    Participants were predominantly female and racially diverse. More than three-quarters were front-line staff. Compared to front-line staff, supervisory staff reported higher levels of formal education, training in afterschool, and work experience.

    What did participants think about PBC-AEC prior to training and as they were participating?

    Participants were encouraged by their supervisors to attend, and the vast majority identified the financial incentive as an important consideration in their participation.

    Participants reported mixed expectations prior to the start of training, and their reported experiences with the training itself matched or exceeded those expectations. For example, far more participants reported finding the course “fun” and learning a great deal than had expected to beforehand.


    Participants reported high levels of satisfaction with most aspects of the training about which they were asked. These include being able to attend training with coworkers and their supervisors; the hands-on and interactive teaching style; and the way in which they were able to learn concepts, try them out at their workplace, and review them in their next training session.

    The most common barriers to participation in training were time and scheduling demands.

    What were the perceived immediate and short-term effects of training?

    A high percentage of participants credited PBC-AEC training with increasing confidence in their abilities, increasing satisfaction with their job, providing materials that could be used at their workplace, and changing their workplace practices to improve their work with youth and families.

    Many staff in programs participating in the Quality Improvement System (QIS) noted the value of PBC-AEC in increasing their comfort with and understanding of the QIS.

    Almost half of the participants believed there were barriers at their workplace that would make it difficult to apply parts of the PBC-AEC training. Some of these barriers may include insufficient time for planning and the existing pressures to focus afterschool activities on academic performance.

    What were the perceived effects of training on longer-term outcomes?

    A high percentage of participants reported that PBC-AEC made them feel that they belonged in the afterschool field, and almost 75 percent reported that PBC-AEC made them feel more likely to stay in the afterschool field.

    Some staff reported that PBC-AEC inclined them to pursue additional training and educational opportunities, though many in this training were already enrolled in higher education courses.

    We begin the body of the detailed report below by describing the history, development, and role of Prime Time as an intermediary organization. We then turn to a description of the components, activities, and goals of the Palm Beach County Afterschool Educator Certificate. After describing the research questions and methods used in this research, we turn to the detailed findings that draw primarily upon surveys and interviews with training participants. We conclude with several questions for further consideration as Prime Time continues to refine the PBC-AEC program.


    Prime Time Palm Beach County, Inc.: History and Context1

    The idea for an independent, intermediary agency to support the afterschool field in Palm Beach County emerged during several years in the mid-1990s when a consortium of Palm Beach County community stakeholders began to meet and coordinate resources for existing afterschool programs. Formally incorporated as a nonprofit organization in 2001, Prime Time serves afterschool programs and practitioners, and provides supports and resources intended to increase program quality and positively affect school-age youth. With support from the Children’s Services Council (CSC) of Palm Beach County and private foundations, Prime Time currently works with key afterschool stakeholders, including the School District of Palm Beach County, the Department of Parks and Recreation, Palm Beach Health Department, Palm Beach State College, various municipalities, and others with interest in school-aged youth in the county.

    Initially, Prime Time focused its work on establishing opportunities for providers to network and developing the quality of selected programs serving elementary and middle school children in targeted low-income communities. In subsequent years, while still prioritizing support to low-income populations, Prime Time broadened the orientation of its work toward a sustainable countywide system of quality standards, supports, and resources for not-for-profit afterschool programs. The current mission of the organization is to “ensure that afterschool programs are of high quality in terms of delivery, practice and standards.”

    Prime Time activities are organized under three areas of work, with a staff member serving as director for each area: quality improvement, professional development, and community engagement and supports. The PBC-AEC draws upon the activities organized under each of these three areas—though primarily the first two—and benefits from its links to them.

    Quality Improvement

    For many afterschool programs in Palm Beach County, quality improvement efforts are grounded in Prime Time’s Quality Improvement System, a multistep intervention that begins with baseline program quality assessments by trained outside assessors on a standardized measure called the Palm Beach County Program Quality Assessment (PBC-PQA).2 The PBC-PQA uses a 5-point rating scale to measure key physical, social, and emotional aspects of program environments. In addition to external assessments at the beginning of the process, the QIS includes self-assessments by program staff, the development of program improvement plans based on areas for improvement identified by the assessments, the provision of on-site technical assistance by quality advisors, staff trainings and curricular resources to implement improvements, and reassessment by external assessors. Programs enrolled in the QIS are automatically eligible for additional program enhancements provided by outside agencies, and staff of these programs are eligible for wage supplements through a separate program.

    1 In a series of reports between 2005 and 2009, Chapin Hall documented Prime Time’s goals and strategies as they emerged as individual activities and as part of an emerging system of supports for afterschool programs serving children and youth, with a focus on the development of the QIS. More detailed information on the development, operation, and impacts of Prime Time and Prime Time’s QIS is available in reports from Chapin Hall at the University of Chicago (Spielberger & Lockaby, 2006; Spielberger & Lockaby, 2007; Spielberger, Lockaby, Mayers, & Guterman, 2008; Spielberger, Lockaby, Mayers, & Guterman, 2009; Baker, Spielberger, Lockaby, & Guterman, 2010).

    Prime Time began piloting its QIS with 34 organizations in 2005, and two years later expanded eligibility for the QIS to most programs in the county. Currently, QIS participation is required for agencies seeking certain subsidized afterschool funding, as well as other specific financial incentives for individual afterschool practitioners. Organizations participating in QIS for several years may be provided different levels of support and services based upon their evolving needs. Most of the individuals participating in the PBC-AEC during the past year worked at organizations that were participating in the QIS (and its associated assessment process), and the alignment and overlap between these two Prime Time activities are key attributes of PBC-AEC training.

    Professional Development

    In recognition of the importance of professional development in supporting quality afterschool services, Prime Time has developed several lines of work within its professional development department. One focus has been to establish agreed-upon written “core competencies” for afterschool practitioners in Palm Beach County, standards that are categorized into eight content areas and grouped into four levels of accomplishment, beginning with entry-level staff. Connected to this, Prime Time has partnered with Palm Beach State College (PBSC) and others to formalize a “noncredit pathway” and a “credit pathway” of coursework for afterschool practitioners at PBSC. The PBC-AEC operates as a class within the noncredit pathway, but was intended, in part, to inspire participants to continue their training more formally in credit courses leading to other certificates and degrees. Funding for trainings at PBSC is supported through scholarship programs administered by Prime Time, and Prime Time also independently offers a range of other individual training classes free of charge.

    Community Engagement and Supports

    Partnering with other agencies in the county, rather than trying to provide all services itself, follows directly from Prime Time’s mission as an intermediary organization, and Prime Time maintains a focus on these community partnership activities in several key ways. Prime Time seeks to increase community awareness of the value of afterschool programs and the importance of program quality through its support of and participation in local, statewide, and national events and organizations that highlight the importance of afterschool activities to youth development and community safety. Prime Time hosts regular networking events throughout the county that provide opportunities for formal learning about policies and issues relevant to afterschool providers, and opportunities for less formal networking and sharing. Prime Time also makes “program enhancements” available to eligible afterschool programs. These program enhancements are provided for different lengths of time ranging from a few weeks to a few months and involve linking a community partner with a specific area of expertise (e.g., arts or sports) to an individual program so that program participants and practitioners alike can be exposed to new content.

    2 The PBC-PQA was developed by the High/Scope Foundation, an educational research foundation that also develops curricula, conducts training, and publishes educational materials, in collaboration with Prime Time and other stakeholders.

    Summary of the Identified Need for PBC-AEC

    As noted earlier, despite many improvements in the availability and coordination of trainings and supports by Prime Time and partners during the past decade, Prime Time staff identified several important challenges as they began planning for the PBC-AEC. Many practitioners in the field reportedly either lacked an understanding of their important developmental role with youth, or understood this role but lacked effective ways to translate it into practice. Many did not understand how greater attention to youth development principles and practices could increase their own satisfaction and impact. At a higher level, many practitioners did not understand how the afterschool field might offer career opportunities, rather than serving primarily as a temporary job or stepping stone to work outside of the field, or how they could progress through levels of mastery of youth development concepts and practices. In the next section, we describe the Palm Beach County Afterschool Educator Certificate program, its orientation to these and other challenges, and its operation during the pilot year.


    Palm Beach County Afterschool Educator Certificate Program

    The Palm Beach County Afterschool Educator Certificate program was actively developed during 2009 by a committee composed of representatives from Prime Time, the Palm Beach County School District, Palm Beach State College, the Institute of Excellence in Early Care and Education, and the Department of Parks and Recreation. In the spring and summer months leading up to the start of training in the fall of 2009, program planners grappled with important questions about the structure and purpose of PBC-AEC training. The answers to these and other questions established important parameters for the program during this pilot year:

    Should participants in training complete all sessions together as a cohort? As late as the summer of 2009, PBC-AEC training was expected to be offered as separate components that individuals could complete as their schedule allowed, rather than with a cohort of classmates. This flexibility was intended to respond to concerns that practitioners would not be able to commit to a large block of training time in a short period. Similarly, there were questions about whether participants who had already completed one of the existing Prime Time training components in PBC-AEC (i.e., “Bringing Yourself to Work”) would be allowed to opt out of this part of training.

    Is the target of the program individuals taking the course, their afterschool program, or their agency? To what extent should PBC-AEC be understood as directed toward individuals or toward programs as a whole? Were any program effects, as one Prime Time staff described them, expected to “trickle down” to the program from the individuals who were trained? Or was there to be a more intentional effort to target programs or organizations in how programs were recruited and the training focused?

    How theoretical should the course be and where would it fit with credit courses offered at PBSC? Planning for the PBC-AEC occurred soon after the Business Partnership Council at PBSC had agreed upon the “credit” pathway toward associate’s degrees and bachelor’s degrees for afterschool practitioners. What should the PBC-AEC look like, given the interest in using it to inspire participants to enroll in this new credit pathway? How much should the PBC-AEC replicate or depart from that more formal educational model? How would it balance teaching key youth development concepts that could be helpful as background to additional coursework and practicing hands-on learning that could be applied directly at programs?

    How should the PBC-AEC draw upon parallel efforts in the county targeting assessment and standards? Given the range of efforts in Palm Beach County to improve quality, including the QIS, the recently developed local “Core Competencies,” and existing training opportunities (e.g., the School Age Professional Certificate), how would PBC-AEC draw upon these existing curricular resources? What elements of PBC-AEC would be created specifically for the course, and why?

    PBC-AEC Values and Goals

    In conversations with Prime Time staff, a few key attributes of the PBC-AEC training were emphasized that together constituted a distinct vision for this training. PBC-AEC was intended to provide practitioners a “whole picture” of youth development principles and how they apply to working with youth in afterschool programs, instead of covering only one specific topic area for a few hours. As one instructor described it, the training was to provide afterschool practitioners “everything that you need to know.” The ideas informing PBC-AEC were intended to be centered on positive youth development, which was viewed as a “hard concept to grasp” for many currently working in the field. The training itself, however, was intended to break down that concept into pieces that were easier to understand and implement. The training style was to avoid traditional lecturing, since that format was seen as less effective in increasing learning and changing practices. By these measures, it was to differ substantially from the existing School Age Professional Certificate training, which used a more traditional teaching style and emphasized rules and regulations rather than reinforcing youth development principles and practices.

    PBC-AEC was to be a place where supervisors and front-line staff learned new ways of working together as they completed the curriculum together, both in the classroom and in assignments between workshops. Activities during the training that grouped them side-by-side were intended, in part, to provide opportunities for front-line staff to make greater contributions and for supervisors to recognize and appreciate these contributions. More generally, PBC-AEC was expected to support peer-to-peer learning, during which participants learned from the instructor and also from each other. It was, at its foundation, to be a safe place for staff to try out new ideas and share their own experiences and thoughts.

    The specific written goals of the training sessions were to

    Provide practitioners with tools to improve the social and emotional environments of their programs;

  • Chapin Hall at the University of Chicago 9

    Teach afterschool practitioners techniques and skills that are instrumental in providing a safe, supportive, interactive, and engaging environment;

    Teach practitioners how to create an encouraging afterschool climate with opportunities for youth to strengthen self-esteem and relations with others;

    Teach practitioners how to promote enthusiasm for learning by presenting youth with fun and innovative ways to enhance academic skills;

    Give practitioners the information they need to effectively help youth develop the skills to partner with others and take on leadership roles;

    Teach practitioners how to involve families in ways that effectively meet their needs; and

    Teach practitioners how professional development opportunities fit together as a system in Palm Beach County.

    Palm Beach County Afterschool Educator Certificate goals were also described by program administrators as targeting specific immediate impacts and longer-term effects:

    Increasing the afterschool practitioner’s knowledge about how to provide quality offerings

    Increasing the afterschool practitioner’s ability (level of skill) to provide quality offerings

    Increasing how long practitioners stay in their current jobs

    Increasing how long practitioners stay in the afterschool field

    Increasing practitioners’ pursuit of higher education

PBC-AEC Structure and Course Curriculum

As implemented during the pilot year, the core of the PBC-AEC was a series of classes organized around content that is directly linked to Palm Beach County’s “Core Competencies for Afterschool Practitioners” and the PBC-PQA assessment. The training provided front-line and supervisory staff from afterschool organizations opportunities to engage in hands-on practical applications of youth development practices and principles, with the core intention of teaching important competencies in practice knowledge and skills.


    By the time training started at the end of September 2009, program designers had mapped out a schedule based upon a cohort approach and had designed a training curriculum that was intended to be completed in a particular order during a period of two-and-a-half months. Participants were expected to attend every class and arrive on time. Later components within the curriculum were designed to build, recursively, upon earlier lessons. PBC-AEC was envisioned not as a collection of discrete trainings, but as an integrated approach to training staff on what they needed to understand youth development and use it effectively in their individual work and workplace. Though program designers sought to be responsive to suggestions from potential participants about the training schedule, the structure of PBC-AEC sought to maximize learning in an intense period with a prescribed training sequence.

    The package of training selected emphasized evidence-based curriculum, curriculum that filled identified gaps in practitioner knowledge, and curriculum that had already been well received by practitioners. Some of the components of PBC-AEC were offered separately as individual trainings by Prime Time. Much of the first half of the curriculum explored ideas from the High/Scope approach, including key concepts that are measured through the QIS assessment process (see cypq.org).3 In addition, many of the specific High/Scope approaches for working with youth were invoked in later training components, including ways to incorporate youth in planning, ways of grouping students, methods for actively learning with students, and setting aside time for reflection at the end of sessions.

    Most of the High/Scope training materials referred to specific items that are in the PBC-PQA assessment, though some material linked only generally to one or more of the four general levels of the YPQA quality “pyramid.” This pyramid identifies what are considered to be foundational elements of high-quality programs (i.e., a safe and supportive environment) as well as aspects of programming that are especially important when youth rate the quality of programs (i.e., interaction and youth engagement), but that are often difficult for programs to achieve in practice (Akiva, 2005). The PBC-AEC training covers each of the four levels of the pyramid, although not in the same order in which they are shown in Figure 1 below.

    3 There are minor differences between the Palm Beach-PQA and the standard YPQA. In this report, we refer to PBC-PQA in reference to the local assessment processes and instruments, and to YPQA in reference to standard instruments and materials (e.g., the standard High/Scope training materials).


Figure 1. High/Scope Youth Development Approach as Pyramid of Youth Needs (adapted from Akiva, 2007). Pyramid levels, from base to top: Safe Environment, Supportive Environment, Peer Interaction, Youth Engagement.

    The lead PBC-AEC instructor had been offering High/Scope training through Prime Time since 2006, and the other components within PBC-AEC are a mix of new and existing training curriculum. The youth leadership component is drawn from established Advancing Youth Development (AYD) curriculum (see nti.aed.org). Training on “Bringing Yourself to Work” has been a regular offering on Prime Time’s training calendar since 2005 (Seligson & Stahl, 2003). The components around teaching academics in afterschool and “Inclusion and Play” were jointly designed by the two PBC-AEC instructors and inaugurated within PBC-AEC. Similarly, the Family Engagement component—another important measure on the QIS—was new to Prime Time and added in the spring of 2010. Also added in the spring was a component on Core Competencies and how these could be used to guide professional development in the afterschool field; this curriculum had been offered as a Prime Time training for a year prior to PBC-AEC. At different points during the year, students work on building a written portfolio of their work and training. Additional information on each of these components is attached in Appendix A.

    PBC-AEC Implementation and Context The process of recruiting participants for the PBC-AEC varied during the pilot year. The fall 2009 cohort of participants was drawn from attendance lists from recent Prime Time trainings, targeting primarily QIS programs. Participants who joined this cohort were expected by Prime Time staff to be motivated and supportive of the program as it was piloted and implemented for the first time. For the second and third cohorts that started in February 2010, Prime Time kept a list of interested agencies, including many from the school district that were not in the QIS, and randomly selected them to participate. Selection was also based upon agency location, since training for the second and third cohorts was held in different sites in the north and south parts of the county. In the summer 2010 cohort, training was held in a central location


    in the county, and staff agency location was not a factor. In each instance, Prime Time staff visited selected programs, explained the training and its requirements, and asked staff to complete necessary paperwork to sign them up.

    The PBC-AEC was offered as a noncredit course through Palm Beach State College. The instructors during the past year, with the exception of one training component during the first cohort, were staff in Prime Time’s professional development department formally hired by PBSC as adjunct instructors. Beginning in the second cohort, Prime Time administered surveys to assess participants’ level of knowledge and learning in the course.

The PBC-AEC pilot program was associated with several financial supports and incentives. Afterschool practitioners accepted into the class were eligible for scholarships, administered by Prime Time, that covered the full cost of the course and course materials. Staff were also provided financial incentives for completing individual training components, as described in Table 1. The total incentive provided for completing all 80 hours of training was $1,200, or $15 an hour, a rate estimated by program staff to exceed the hourly wage for many practitioners. Consistent with PBSC policy, to receive payment for a training session, practitioners had to arrive within 20 minutes of its scheduled start. Participants are expected to receive stipends for participation during the 2010–2011 program year as well, but stipends are not expected to be a permanent feature of the PBC-AEC program.

    Table 1. Incentive Payment Schedule for Last Three Cohorts of Pilot Year

Incentive amount   PBC-AEC courses                             Course hours
$150.00            Bringing Yourself to Work                   9
$400.00            High/Scope Approach                         30
$150.00            Academics in Afterschool                    9
$100.00            Inclusion and Play                          9
$150.00            Family Engagement (added after fall 2009)   10
$150.00            Youth Leadership                            9
$100.00            Core Competencies (added after fall 2009)   4

    Note: The total stipend amount was the same for the first cohort, though it required 14 fewer hours of course time.
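The totals reported in the text can be checked against Table 1 directly. This is a quick sketch using only the amounts and hours listed in the table; the $15 hourly rate is simply the total stipend divided by the total course hours:

```python
# Incentive amounts (dollars) and course hours from Table 1
components = {
    "Bringing Yourself to Work": (150.00, 9),
    "High/Scope Approach": (400.00, 30),
    "Academics in Afterschool": (150.00, 9),
    "Inclusion and Play": (100.00, 9),
    "Family Engagement": (150.00, 10),
    "Youth Leadership": (150.00, 9),
    "Core Competencies": (100.00, 4),
}

total_stipend = sum(amount for amount, _ in components.values())
total_hours = sum(hours for _, hours in components.values())

print(total_stipend)                 # 1200.0
print(total_hours)                   # 80
print(total_stipend / total_hours)   # 15.0 (dollars per hour)
```

The component amounts sum to the $1,200 total stipend over 80 hours, confirming the $15-an-hour figure cited above.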

    Some financial incentives were also available after training was completed. In an attempt to fashion a comprehensive and systematic approach to afterschool workforce development and connect compensation, education, and retention, Prime Time partnered with Children’s Forum, Inc., to modify and locally administer a national program, the WAGE$ Project, which had been successfully used in the early care and education sector. The final product, the Afterschool WAGE$ Florida Project, was designed to


    increase staff stability and improve afterschool program quality by reducing turnover and encouraging continued education of afterschool practitioners. The program provides education-based salary supplements for low-to-moderate wage earners who work with children and youth in grades K–12. Practitioners must work in an eligible program participating in the Quality Improvement System, be working at the same program for at least a year prior to applying, and must continue working in the same program throughout the following year. Afterschool practitioners who meet these qualifications and successfully complete the PBC-AEC, for example, are eligible for a supplement of $200 per year ($100 in December and another $100 in May). The supplement rises as high as $3,000 for a lower-paid staff member who stays at an afterschool agency and who has earned his or her bachelor’s degree in a related field (e.g., human services, education) with 30 credits in youth development. The link between PBC-AEC and WAGE$ therefore provides at least a modest incentive for individuals completing the PBC-AEC to remain with existing agencies for at least six months following training.

The Evolution of the PBC-AEC during the Pilot Year

As expected by program staff, the PBC-AEC continued to evolve during its pilot year, and additional changes are reportedly planned for the 2010–2011 cohorts. Specific details about the four training cohorts and key changes during the past year are listed in Table 2. In addition to the changes in recruiting practices among the cohorts and the expansion from five components to seven noted earlier, the location, days, and times of training changed as well. The fourth (summer) cohort was not planned until late in the spring of 2010 and was offered in part to provide opportunities to school district staff who were partly employed or not employed during the summer.


    Table 2. Description of Training Cohorts and Changes

Cohort 1: Greenacres Elementary, September 25 to December 12, 2009. Friday nights from 6:15 to 9:15 p.m.; Saturdays from 9:00 a.m. to 4:00 p.m. Key changes from earlier cohort trainings: N/A.

Cohort 2: Royal Palm Beach Elementary, February 10 to May 1. Wednesday nights from 6:15 to 9:15 p.m.; Saturdays from 9:00 a.m. to 4:00 p.m. Key changes: single instructor for the entire course, rather than different instructors for different components; addition of two more training components (“clock hours” of training increased from 66 to 80, and the course expanded from 9 to 11 weeks).

Cohort 3: Morikami Park Elementary, February 17 to May 8. Wednesday nights from 6:15 to 9:15 p.m.; Saturdays from 9:00 a.m. to 4:00 p.m. Key changes: second instructor added for teaching the entire curriculum.

Cohort 4: Palm Beach State College, May 12 to July 14. Monday, Wednesday, and Thursday nights from 6:15 to 9:15 p.m. Key changes: offered to participants not employed during training.

    In addition to this training curriculum, Prime Time added a career advisor position in April 2010. The career advisor is expected to visit programs that have participated in PBC-AEC training, observe quality at the point of service, and work with PBC-AEC graduates and others to reinforce high-quality work and support ongoing training and education. The individual in the career advisor position was completing background training during the period covered by this implementation evaluation and was not yet engaged in an outreach and assessment role.


    Focus and Research Questions for 2009–2010 Evaluation

    Chapin Hall’s evaluation of the PBC-AEC during its pilot year was organized to describe training implementation and perceptions of its immediate impacts and potential longer-term effects. The evaluation sought to capture the experiences of participants in the program at multiple points: as they were recruited, participated, sought to apply what they learned, and finished with training. As a short-term study of a pilot implementation, the research concentrated on participants’ immediate experiences with training and suggestions for what to change and retain; their perceptions of what they had learned; and their initial experiences in trying to apply their learning in their own agencies. The evaluation also solicited participants’ predictions of longer-term effects upon such things as further training, higher education, and continued employment at their agency or in the afterschool field.

Research questions for this study were negotiated with Prime Time before the start of the first training cohort and are listed below. They were crafted to reflect the program assumptions identified by Prime Time: that participating in an intensive, noncredit training program will improve the knowledge and skills of afterschool program staff, improve their sense of themselves as professionals, encourage them to remain in the afterschool field, and encourage them to continue their training and formal education.

1. What are the characteristics of PBC-AEC program participants (e.g., demographic characteristics, prior training and education, and experience)? How were they recruited and selected? Why did they participate? What are/were their expectations of PBC-AEC?

    2. What were participants’ experiences with the PBC-AEC program? What were the barriers to and facilitators of participating? How long does it take to complete each component of the curriculum?

3. How satisfied were participants with the PBC-AEC program? How do they compare it to other training and professional development experiences? What did they like and dislike about the PBC-AEC curriculum? What would participants change, if anything, in how the PBC-AEC program is structured and operates?

    4. What effects of the PBC-AEC curriculum did participants perceive on their attitudes, knowledge, and work practices?

    5. Were participants able to apply what they learned to their program? How did their work setting support or hinder them using what they learned through PBC-AEC?

    6. What are the professional goals and intentions of PBC-AEC participants at the end of the program? Do they intend to remain in the field? Do they express interest in enrolling in additional training or college-level coursework?

    7. What are the perceptions of course instructors of the PBC-AEC program and students’ abilities to complete it?

    8. What are the expectations and perceptions of program directors of the PBC-AEC program in terms of its impact on program quality and staff retention?

    9. How well does PBC-AEC fit with and support the QIS?

    These research questions were applied with an evolving emphasis as the research progressed during the year. As noted earlier, in late spring 2010, Prime Time added a summer training cohort. Chapin Hall extended its evaluation to include these participants as well, which provided an opportunity to examine some of our initial research interests and findings from the first three cohorts, including the value of participating in training while working at a program (since not all those in the summer cohort were employed during training) and the value of PBC-AEC to staff who were in organizations participating in the QIS (since not all of those in the summer cohort were QIS participants).

Methods

To answer these research questions, we employed both primary and secondary data sources. For each of the cohorts, we conducted surveys of training participants and telephone interviews with selected participants, including front-line staff, program directors, and Prime Time staff. We also participated in ongoing conference calls with Prime Time staff to review PBC-AEC operations and reviewed secondary data.

    Participant Survey

An 11-page survey was administered on the last day of each of the four training cohorts and captured responses from 111 of the 116 enrolled participants, a 96 percent response rate. The survey consisted of several kinds of questions: Likert-type items that asked respondents to indicate level of agreement or level of importance for a series of statements; items that asked respondents to rank specific components of the training on different criteria; open-ended questions; and demographic information. The survey was administered at the end of training, but also employed some retrospective pre-test questions for comparison purposes. Demographic characteristics of the survey sample are provided in the findings section below. Minor changes were made to the survey following the first cohort, primarily in response to the addition of two new courses to the PBC-AEC curriculum.

Survey data were analyzed using SPSS; we examined totals across all cohorts and used crosstabs and chi-square analysis to identify differences across cohorts and between front-line staff and supervisors.
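The analyses themselves were run in SPSS. As a rough sketch of the kind of crosstab comparison involved, a Pearson chi-square statistic for a hypothetical 2x2 table (the counts below are invented for illustration and are not the study's data) can be computed as follows:

```python
# Hypothetical 2x2 crosstab: rows are job roles, columns are agree/disagree
# counts on some survey item. These numbers are invented for illustration.
observed = [
    [40, 50],  # front-line staff
    [15, 6],   # supervisors/directors
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Pearson chi-square: sum over cells of (observed - expected)^2 / expected,
# where expected = row total * column total / grand total
chi_square = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / n
        chi_square += (obs - expected) ** 2 / expected

# A 2x2 table has 1 degree of freedom; the .05 critical value is 3.84,
# so a statistic above 3.84 indicates a significant role difference.
print(round(chi_square, 2), "significant" if chi_square > 3.84 else "not significant")
# → prints "4.96 significant"
```

SPSS (and libraries such as scipy) would report the same statistic along with an exact p-value; the sketch above only shows the computation behind the comparisons described in the text.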

    Telephone Interviews

    We conducted interviews with 34 participants in the four cohorts, including 12 of the 17 program directors who either participated in PBC-AEC training or sent staff, and 22 participating front-line staff. Together, these interviews included at least one representative from all of the agencies participating in PBC-AEC training for the first three cohorts and all but one of the agencies participating across the four cohorts. We also interviewed PBC-AEC instructors, including a follow-up interview with the instructor who taught three of the four cohorts during the year.

    Interviews were semistructured. Key topics were covered with all respondents, and other topics were tailored to the position and background of the informant. Interviews were recorded, transcribed, and summarized, and then coded and analyzed by research question within and across training cohorts.

    Copies of the survey used for the spring and summer 2010 training cohorts, and the interview guides for the PBC-AEC instructors and training participants, can be found in Appendix B.

    Secondary Data

Secondary data reviewed for this study included documents describing the components, goals, and expected outcomes of PBC-AEC; the PBC-AEC training manual; minutes of meetings; Prime Time’s documents about the program and participant characteristics; course evaluations; and participant “reflections” conducted by instructors. Information from these sources is used throughout this report, often as the basis of contextual or descriptive information about the program. A summary of findings specific to the PBSC course evaluations is attached as Appendix C.

    Data Analysis and Presentation

Our research proposal did not hypothesize differences across the cohorts during the year, nor did we make initial plans to compare participants across cohorts. As the program was implemented during the pilot year, however, we became aware of differences in how individuals were recruited across the cohorts and differences in the demographics of the cohorts (primarily in the fourth, summer cohort). Our analysis included comparisons across cohorts, and in most cases, findings were not statistically different across cohorts. Where there was evidence of differences across cohorts, we specify these. Similarly, we did not know that supervisors would be joining front-line practitioners in training when we drafted our research plan. Given important differences between front-line and supervisory staff (e.g., in education, employment status, background), we also examined these two populations separately as part of our analysis.

    Data collection for this study was guided by the nine research questions identified earlier, and these questions were used to organize the initial stages of analysis for each data source (e.g., interviews, surveys, secondary data sources). In later analytic stages, we synthesized findings from these research questions under four umbrella questions that also approximate the chronological experiences of participants going through PBC-AEC training. These four questions incorporate the nine original research questions:

    What are the characteristics of individuals participating in PBC-AEC? (Research question 1)

    What did participants think about PBC-AEC prior to training, and as they were participating? (Research questions 2 and 3)

What were the perceived immediate and short-term effects of training? (Research questions 4, 5, 7, and 9)

    What were the perceived effects of training on longer-term outcomes? (Research questions 6 and 8)

In organizing the findings of this study, we typically interleaved survey and interview results. Survey findings provide data on the experiences of 96 percent of all participants from the pilot year and can be interpreted as closely approximating the experience of program participants as a whole. Interview findings, by contrast, represent the experiences of about one-third of participants and use participants’ own language to describe these experiences. These interviews are not a truly random sample of the training cohort, and should be understood as helping to identify the qualitative experiences of participants, rather than as data from a proportional and generalizable sample.

    One of the benefits of using multiple forms of data collection is the opportunity to reflect on the unique contributions generated from each method, as well as to understand the overlap that exists between them. In our presentation of findings, we explore the ways in which different sources of data are complementary to our understanding of program implementation as a whole.


    Implementation Study Findings

    As noted, we adopted four questions to frame our presentation of findings in this section. Because some of PBC-AEC’s possible effects are both short term and potentially longer term (e.g., improvements in the quality of service provided by staff), these categories are not always mutually exclusive. We provide them as one narrative strategy for organizing the experience of participants in the PBC-AEC program.

    What Are the Characteristics of Individuals Participating in PBC-AEC?

    Demographic Characteristics

Survey data collected from participants in all four cohorts of the PBC-AEC program provides a picture of PBC-AEC participants as a whole during the period from fall 2009 to summer 2010. Survey respondents4 across all four cohorts were predominantly female (79%) but racially and ethnically diverse (see Table 1 in Appendix D). Forty-one percent of respondents identified themselves as “White, not Hispanic,” 29 percent identified themselves as “Black, not Hispanic,” and 30 percent identified themselves in other racial and ethnic categories. Respondents were front-line staff (81%) and program directors or assistant program directors (19%) from 17 different afterschool programs.5 Twelve of the seventeen participating program directors attended the same cohort with other staff members from their programs. Two of the participating program directors participated in different cohorts from their staff, and three program directors did not attend any of the four cohorts. A diagram of supervisors and front-line staff participation across the four training cohorts is attached as Appendix E.

    4 Because our sample includes 96% of the participants, we sometimes adopt the shorthand “participants” in describing survey sample results.

5 Seven respondents identified themselves by “other” titles, including paraprofessional, clerical assistant, office person, learning coordinator, volunteer, and department director. For the purposes of consistency in comparing “all” participants and differences between front-line staff and supervisors, we have excluded these 7 “other” individuals from our survey analysis.


    Respondents ranged broadly in age from 18 to 66, with most participants (66%) between 18 and 25 years old. Front-line staff were in general younger than program directors; most front-line staff (75%) were between the ages of 18 and 25, and most program directors (89%) were between the ages of 26 and 58.

    Educational Characteristics

The respondents in our sample started PBC-AEC with a wide range of prior education, training, and experience, with program directors generally having higher levels than front-line staff. Seventy-seven percent of front-line staff reported that the highest education they had completed was high school/GED or some college, while 68 percent of directors/assistant directors reported having an associate’s degree or higher (see Table 2 in Appendix D). Fifty-three percent of directors/assistant directors had earned the 40-hour Child Care Certificate with the Advancing Youth Development (AYD) curriculum, compared to 29 percent of front-line staff. While the reasons are unclear, of the 34 percent of respondents who had earned the 40-hour Child Care Certificate across all cohorts, more than twice as many in the first and third cohorts held this certification as in the second and fourth cohorts. Fifty-three percent of directors had received the Director’s Credential. Only 13 percent of all respondents, however, had earned the School Age Professional Certificate.

    Professional Characteristics

Seventy-nine percent of all respondents reported working for school-based organizations (see Table 3a in Appendix D). Respondents included both part-time and full-time staff (full time defined as more than 30 hours per week), and the vast majority (82%) of front-line staff working less than full time in the afterschool field reported that they would like to work full time.

Overall, directors and assistant directors were far more likely to have long-term plans to stay in the field. Eighty-three percent of directors and assistant directors indicated they were planning to stay 6 years or more; the comparable figure for front-line staff was 35 percent. Thirty-two percent of front-line staff said they planned to stay in the field only 2 to 3 years. Thus, the PBC-AEC program served a diverse population of staff during its pilot year, with different levels of formal education, experience, and expectations for remaining in the field following training.


    What Did Participants Think about PBC-AEC Prior to Training and as They Were Participating?

    Survey Findings: Engagement, Recruitment, and Concerns prior to Starting Training

Participants were asked to rate their level of agreement with a series of questions about PBC-AEC training. These questions were all asked at the conclusion of training, but some asked participants to recall the different influences on their earlier decision to participate in training. As indicated in Table 3 below, 96 percent of survey respondents reported that their supervisors or program administrators encouraged them to participate in the PBC-AEC program, a figure that was similar for program directors/assistant directors and for front-line staff. (Tables 4a and 4b in Appendix D provide the data delineated by job role.) Almost three-fourths of program directors/assistant program directors and more than half of front-line staff identified concerns about the time training would take. Front-line staff were more worried than program directors that the course material would be too hard (20% and 5%, respectively). Most respondents (63%) indicated that they had not experienced “challenging barriers” to participating in the AEC.

    Eighty-three percent of all respondents agreed that the PBC-AEC financial incentive was an important consideration in their decision to participate. Ninety-four percent of respondents agreed or strongly agreed that they expected the program “to teach me a lot” and that attending with coworkers would be beneficial. Most respondents also agreed or strongly agreed with the idea that PBC-AEC would be like a lot of other afterschool training (74%) and that participating in the program would make them more likely to stay with their current jobs (71%).


Table 3. Perceptions of the AEC prior to Program Start (as Recalled at Program Completion) that Influenced the Decision to Participate and Contributed to Expectations of the Program

Values are percentages: Strongly disagree / Disagree / Agree / Strongly agree.

I expected that PBC-AEC would be fun.  3 / 21 / 57 / 20
I expected that PBC-AEC would teach me a lot.  1 / 5 / 48 / 46
I believed that attending PBC-AEC with others from my school/agency would be a good idea.  4 / 2 / 27 / 67
I thought that participating in PBC-AEC would make me more likely to stay with my current job.  1 / 28 / 50 / 21
The Prime Time Education incentive award was important in my decision to participate.  2 / 14 / 42 / 41
The WAGE$ program influenced my decision to participate in PBC-AEC.  23 / 38 / 30 / 9
My supervisor or program administrator encouraged me to participate.  2 / 2 / 49 / 47
My coworkers influenced my decision to participate.  11 / 20 / 48 / 21
I thought PBC-AEC would be a lot like other afterschool training.  4 / 22 / 58 / 16
I was worried about the time it would take.  11 / 29 / 43 / 17
I was worried that the course material would be too hard.  28 / 55 / 13 / 4
I had to overcome challenging barriers to be able to participate in PBC-AEC training.  12 / 51 / 25 / 12

    Interview Findings: Engagement, Recruitment, and Concerns prior to Starting Training

Recruitment

Data from interviews with 34 participants during the year were consistent with and expanded upon these survey findings. Some front-line staff were encouraged by their director to participate in the PBC-AEC as a way to further their education and skills; in other cases, program directors removed the decision-making power from the front-line staff entirely and mandated participation. Several staff also reported that Prime Time successfully encouraged their participation by making presentations at their agencies and through other existing, ongoing relationships with Prime Time. Several participants described their interest in participating in this training as influenced by either their generally favorable views of Prime Time or specific prior experiences with Prime Time training. It was material to many participants that Prime Time was the organization offering this training.

Barriers to and Facilitators of Participation

Our interviews were conducted only with individuals who had been able to overcome any barriers to involvement in PBC-AEC and remain in training for its duration. As such, our sample cannot speak to the direct experience of staff who may have been interested in training but were unable to attend. (Prime Time staff identified only a few who vocalized an intention to join training, or even signed up for it, but did not participate.) Several staff we talked to specifically said they encountered no barriers as they made their decision, but some training participants were able to recall concerns as they considered whether or not to sign up for PBC-AEC.

The two most common barriers, in all cohorts, related to time and scheduling demands. Specific concerns included the long "work" day that resulted from evening training during the work week, class being held on Friday night (a first-cohort training evening), and even the difficulty of waking up on a Saturday morning. However, in the summer cohort, one participant thought a weekend day and two nights would have been preferable to the three-weeknight schedule, hinting at the idiosyncratic nature of scheduling difficulties and preferences. A related concern was that the total number of hours and weeks required by PBC-AEC training was simply too great. As one participant calculated:

    The length of the actual sessions is almost like three months, right? So, I think that’s a lot of wear and tear, especially when you have a family….Basically if you work full time you’re going Friday night until 9:30 and then Saturday all day, you really don’t have any time, personal time.

    Some of the additional barriers identified prior to starting training included concerns that the class would not be interesting, that there would be too much homework, and that transportation to the training location would be difficult.

To understand more about the PBC-AEC's potential from a broader policy perspective, we also asked participants about potential barriers for other staff at their program or other afterschool programs. Their responses identified some of the same concerns mentioned above (e.g., scheduling, insufficient time for additional training, and location), but also raised questions about dispositional or contextual differences between those who were participating and those who were not. Staff speculated that it might take a lot of time and effort to get some practitioners interested; that practitioners may not know about Prime Time or its good reputation (and may therefore not be as predisposed to join this particular Prime Time training); that practitioners may not be interested enough in the field to spend time in trainings; or that they may be unwilling to try the PBC-AEC because they are "set in their old ways." In the summer cohort, planned vacations were reportedly another barrier for some colleagues who opted not to participate.

Our interview informants also identified a range of factors that facilitated joining training. Among program directors, some were drawn to PBC-AEC by the possibility of attending training with their staff. The social aspect of participating with colleagues, or wanting to understand what colleagues had experienced and not be left out, also encouraged participation. As one respondent, whose coworkers took the AEC in an earlier cohort, explained:

    I really wanted to learn everything that they were learning. They were coming back and telling me about stuff from class and I had no clue what they’re talking about. I really wanted to be able to be on their level.

    In addition, specific logistical features (e.g., location of training, availability in their schedule) were influential for many participants. Knowing specific details of the program, particularly regarding the financial incentive, also facilitated participation. One participant in the summer cohort suggested that having the PBC-AEC over summer break made it more accessible for college-aged participants. Finally, and critically, almost all respondents (again, reflecting upon their earlier decision at a point when they had completed training) expressed an intrinsic motivation to improve their knowledge and work practices, and were especially interested in PBC-AEC as the forum to pursue this because of Prime Time’s reputation as a provider of high-quality training and support. This was particularly true among program directors, who served as key recruiting points of contact with the PBC-AEC during the pilot year.

    Survey Findings: Experiences with the Program

Overall, respondents’ experiences with the program matched or exceeded their expectations. For some of the survey items regarding respondents’ expectations of the program prior to its start, we asked the same (or very similar) questions about their views at the end of the program (see Table 4). The notable changes for these items are at the extremes of the rating scale (i.e., strongly agree and strongly disagree). For example, when asked at program completion to recall their feelings about the training prior to its start, 20 percent strongly agreed that they had thought it would be “fun,” but 74 percent strongly agreed that it was “fun” at completion (and 100% either agreed or strongly agreed it was “fun”). Forty-six percent strongly agreed that they had thought PBC-AEC “would teach me a lot,” but by the end of training, 77 percent strongly agreed that the PBC-AEC had taught them a lot (and 100% either agreed or strongly agreed that the AEC had taught them a lot).


Table 4. Survey Items Using Similar Language: Changes in Strong Agreement prior to Program (as Recalled) and at End of Program

| Item | % Strongly agree (prior to start) | % Strongly agree (at completion) | Amount of change |
|---|---|---|---|
| I expected that PBC-AEC would be fun / PBC-AEC was fun | 20 | 74 | +54 |
| I expected that PBC-AEC would teach me a lot / PBC-AEC taught me a lot | 46 | 77 | +31 |
| I believed that attending PBC-AEC with others from my school/agency would be a good idea / Attending PBC-AEC with others from my agency was a good idea | 67 | 74 | +7 |
| I believed there was much more that I could learn about providing quality afterschool care / I believe there is much more I can learn about high-quality afterschool care | 35 | 48 | +13 |

Responses to other questions about views at the end of training also demonstrate a pattern of positive attitudes and high levels of support for the program (see Table 6 in Appendix D). For example, 60 percent of respondents either agreed or strongly agreed that they were “worried about the time it would take” when thinking back to the start of training, but only 12 percent either agreed or strongly agreed at the end that the program was not “worth the time I invested.” A full 100 percent of respondents reported that they “learned more from PBC-AEC than I expected.” Finally, while 76 percent of participants either agreed (61%) or strongly agreed (15%) that the PBC-AEC “would be a lot like other afterschool training” prior to the program’s start, 96 percent agreed (44%) or strongly agreed (52%) upon completion that “the PBC-AEC was better than other afterschool training I’ve had.”

Respondents also indicated that, in most ways, participating in the training series with their colleagues was beneficial to them (see Table 7 in Appendix D). For example, 97 percent reported that participating with their colleagues “helped reinforce my learning while at work” and “provided good opportunities for me to teach my coworkers.” In addition, 96 percent responded that participating with colleagues “provided important social support for doing the work,” and 95 percent agreed that participating with colleagues “helped me learn from my coworkers.”


    Interview Findings: Experiences with the Program

Consistent with the survey findings, interview participants described the PBC-AEC in largely positive language in terms of how they felt during training (e.g., it was “fun”), what they felt they learned, and how they benefited. At the same time, some respondents were able to identify “barriers” they experienced during the operation of the program. Some of these barriers were similar to those they considered in their initial decision to participate (e.g., scheduling, length of day, conflict with existing events on their calendars, transportation). Others were not foreseen and arose only when the program started. A common complaint related to transportation and commuting difficulties. Others included the lack of air conditioning in the training room (first cohort), the pressure to arrive on time to class, and the individual workload of quizzes, tests, and homework (second and fourth cohorts). While most participants believed the use of small groups during training facilitated their continued involvement, we heard from several participants for whom this approach was not comfortable at first.6

    Indeed, the hands-on, group-based, interactive teaching style was commonly mentioned as one of the leading aspects of the program that made it easy for participants to keep coming to training, though participants cited several others as well. Participants often mentioned enjoying teamwork and friendships within their class, the instructors, how much fun they had in training, the way that a feeling of teamwork supported a sense of accountability for coming to training and doing the work, and the convenient time and location for training. Additionally, the learning and sense of achievement they experienced during the PBC-AEC program encouraged participants from each cohort.

    One aspect of the program that encouraged regular and sustained involvement was the weekly feedback loop of being able to learn a concept in the course, apply it at the program site, and reconvene to discuss implementation and possible changes (see Figure 2 below).

    6 The PBC-AEC training curriculum acknowledges that some learners are most comfortable starting with hands-on, social experiences, while others are most comfortable starting with conceptual presentations. The intention of the PBC-AEC, as noted elsewhere in this report, is to provide multiple kinds of learning experiences over time, so that individuals have a chance to connect to the learning materials at different points and in different ways.


Figure 2. PBC-AEC Cycle of Learning, Trying, Reflecting, and Refining

[Figure 2 depicts a three-step cycle: introduce or refine a PBC-AEC concept → implement at participants’ agencies → solicit feedback in the next PBC-AEC session.]

This cycle of learning was identified as the favorite part of the AEC by several participants. One respondent explained its value, saying that she enjoyed

…being able to take what I learned and applying it right away…and not having to fumble through things. In other words, it was easy to understand since I didn’t have to go back and read it all over again to get it.7

This feedback loop was absent for some summer participants who were not working with children at the time of their participation in the AEC. One participant described this arrangement as creating a barrier to implementation due to the lag between learning and applying concepts, saying it would have been better to “use it right away,” as it would have “stayed in my mind.” Other summer cohort participants, however, were employed during training and were able to experience this feedback loop without this lag.

7 Most participants in PBC-AEC training are female, and for purposes of confidentiality we have uniformly adopted the female gender when quoting respondents to avoid awkward “she/he” language.

The fact that participants in training were typically present with others from their own agency also created dynamics that supported continued involvement in training. Practitioners for whom training was mandated or strongly encouraged by supervisors (many of whom were also present) likely found it more difficult to miss training or arrive late. Participant attendance overall was almost perfect, according to Prime Time staff, creating continuity within the cohorts. But participants described additional relational and reciprocal benefits to participating with others from their workplace. Having staff from agencies participate together increased comfort in consulting with coworkers, eased anxiety about group activities during training, facilitated the learning process, and more generally increased social ties among staff from the same agency.

Participants often mentioned feeling closer to and more familiar with their coworkers, as individuals and as a team. Directors were especially articulate in describing the value of working with their staff outside the agency context and being seen more as equals and classmates. As one director said, “The staff saw a different side of me. I enjoyed spending this time with them outside of work.” Several front-line staff echoed this sentiment.

    Furthermore, a number of participants noted that being at the training with others from their program allowed for specific and realistic dialogue about their program and children in relation to PBC-AEC lessons and concepts. Some, indeed, could not identify any drawbacks to having all of the staff from their agency participate in training. However, a few participants who took the course on their own did suggest that there was some value in not participating with colleagues, such as being able to think more for themselves and have their own, independent opinion and voice in the course.

    Survey Findings: Satisfaction with the Overall Program and Specific Courses

Training participants indicated very high levels of satisfaction with the PBC-AEC. Ninety-eight percent of all respondents agreed or strongly agreed that “overall, I was very satisfied with PBC-AEC training.” Eighty-four percent of all respondents agreed that the “PBC-AEC classes fit my schedule,” although directors and assistant directors were somewhat more likely than front-line staff to disagree with this statement. As noted earlier, in comparison to other trainings respondents had participated in, 96 percent agreed with the statement that PBC-AEC “was better than other afterschool training I have had.”

As indicated in Table 5 below, respondents were exceptionally positive about what they learned from the program, with almost 100 percent reporting that they agreed or strongly agreed that the PBC-AEC was “very effective” in teaching core course concepts.


Table 5. Perceptions of What Participants Learned from PBC-AEC Training

| PBC-AEC was very effective in teaching… | % Agree | % Strongly agree |
|---|---|---|
| …how professional development opportunities fit together in Palm Beach County (n = 51)* | 38 | 60 |
| …how to improve relationships with coworkers | 35 | 63 |
| …how to involve families in ways that meet their needs effectively (n = 53)* | 33 | 63 |
| …how to improve the overall social and emotional environments of programs | 31 | 68 |
| …how to foster an environment that is built on a foundation of empathy for others | 32 | 68 |
| …how to promote enthusiasm for learning | 30 | 70 |
| …how to develop or strengthen youth leadership skills and governance by creating youth advisory councils | 25 | 73 |
| …how to engage youth into peer staff positions, leadership roles, and partnerships | 24 | 74 |
| …how to use play to encourage positive self-expression and creativity | 25 | 74 |
| …how to provide a program environment that is safe, supportive, and engaging | 22 | 78 |

*These components were not part of the curriculum for the first cohort, so the number of responses is lower for these items.

Although these areas of learning are designed to map closely onto the courses within the training series, respondents’ views were more varied when asked about the courses individually by name (see Tables 9 and 9a in Appendix D). In general, Inclusion and Play was the course most highly regarded by respondents; however, Youth Leadership, Bringing Yourself to Work (BYTW), and High/Scope were also highly regarded.8 For example:

- Over half of participants (53%) identified either High/Scope or Inclusion and Play as the course they “would recommend most to others in the afterschool field.”

- Nearly half of participants (46%) identified Youth Leadership or Inclusion and Play as the “one component that was the most valuable to my own work practices.”

- Nearly half of participants (47%) reported that BYTW or Inclusion and Play “had the biggest positive impact on my attitude about my work.”

- Nearly half of participants (46%) reported that Youth Leadership or Inclusion and Play “inspired me the most to want to learn more.”

    8 Family Engagement and Core Competencies courses were not included in the curriculum for the first cohort, so data are drawn from the second, third, and fourth cohorts to ensure comparability of responses using the largest sample of respondents.


In general, the Core Competencies and Family Engagement courses, which were added after the first cohort, were held in lower regard by respondents. For example, almost half of participants (45%) identified either Core Competencies (23%) or Family Engagement (22%) as the “most difficult component for me.” When considering how to interpret the responses to the items pertaining to the different courses, it is important to note that for the negatively worded survey items, several participants responded with “Not Applicable” or did not respond. Although “Not Applicable” was an option on most of the items in the survey, it was most often used in response to questions that were phrased negatively (i.e., “this component was the most difficult for me,” “less teaching time should be devoted to this one component,” and “this component was the least valuable to my own work”). The very positive findings about the PBC-AEC program in other survey items raise the question of whether these “Not Applicable” responses reflect respondents’ difficulty in critiquing individual courses as, for example, “least valuable” or “most difficult.”

    Interview Findings: Satisfaction with the Overall Program and Specific Courses

    Consistent with the overwhelmingly positive survey findings, when asked directly about their satisfaction with the program in an interview, every respondent indicated that, on the whole, they were satisfied with the program. Program directors described themselves as satisfied with their own experiences and with the experiences of their participating staff. Also consistent with survey findings, the respondents frequently said that their experience exceeded their expectations:

    It blew my expectations away. I thought it was going to be a repeat of everything that I knew. But it wasn’t. It was nothing like I knew. Nothing at all. I thought I knew, I thought I was doing the job like I was supposed to.

    Specific aspects identified as most satisfying included the instructors; materials provided to take back to work; the way training served to bring together members of different programs to collaborate and share ideas and knowledge; and the interactive group structures and activities. Furthermore, several respondents simply mentioned the satisfaction of achievement upon completing the PBC-AEC.

    Many participants did not identify any aspects of the program that they disliked. Among those who did, several common sources of dissatisfaction related to issues that were also noted as “barriers” in our earlier discussion, such as the commute to class, the time commitment, and the weekly quizzes.

    For the training curriculum in particular, about half of those we talked to had no suggestions for changes. Those commenting positively on the curriculum mentioned its accessibility to staff with a range of educational and professional backgrounds; its direct ties to QIS assessment; its incorporation of materials that could be brought back to work; and the hands-on way in which the curriculum was taught.


    Participants frequently mentioned satisfaction with specific lessons or skills that were threaded throughout the program, including grouping strategies, reflections, giving children voice, and program scheduling.

    Concerns that were raised about the curriculum centered on the experiences of the first cohort with the Bringing Yourself to Work segment (though one individual also identified this as her favorite part of the curriculum) and the weekly quizzes. Additionally, a few participants working with children in an older or younger age group felt the curriculum was less directly applicable to their work. Several participants also expressed some uncertainty regarding the relevance and implementation feasibility of aspects of the Youth Advisory Council curriculum.9

Because PBC-AEC is one of many training opportunities offered to staff in Palm Beach County, and it includes training elements that are available separately, we were interested in understanding more about how it compared with other offerings familiar to PBC-AEC participants. The baseline perception of Prime Time was on the whole very positive, but, consistent with survey findings, PBC-AEC was described overall as exceeding Prime Time’s other trainings. A number of respondents did not have previous experience with Prime Time trainings, and some of them described a new interest in participating in future Prime Time offerings.

Compared to other Prime Time training, PBC-AEC was described as more intense (e.g., longer, with more training sessions each week), which some participants felt allowed the instructor to make sure everyone understood the material. It was also described as more interactive, with greater opportunities for staff to talk with and learn from each other. The stipend was mentioned as a distinguishing feature, as was training alongside many others who appeared to genuinely want to participate. More generally, respondents mentioned the helpful way in which PBC-AEC training expanded upon other training provided by Prime Time and allowed the material to be explored in greater depth. The setting and structure of the PBC-AEC, including small classrooms, were also described as supporting learning.

Those with experience in both PBC-AEC and non–Prime Time training described an even larger contrast than that between PBC-AEC and other Prime Time trainings. They felt PBC-AEC was more interactive and hands-on than the trainings offered through the school district; provided more varied and creative tools for afterschool providers; encouraged staff to think flexibly about how to apply what was presented, rather than promoting a “one correct way” perspective; offered more relevant training than school district offerings; and more systematically encouraged all participants to be actively engaged in training. Overall, individuals described PBC-AEC as “better,” “different,” and “sim