
    LANGUAGE AND VOCABULARY DEVELOPMENT SPECIAL EDUCATION RESEARCH GRANTS PROGRAM

    CFDA NUMBER: 84.324

    RELEASE DATE: May 6, 2005

    REQUEST FOR APPLICATIONS NUMBER: NCSER-06-03

    INSTITUTE OF EDUCATION SCIENCES
    http://www.ed.gov/about/offices/list/ies/programs.html

    LETTER OF INTENT RECEIPT DATE: June 6, 2005

    APPLICATION RECEIPT DATE: August 4, 2005, 8:00 p.m. Eastern time

    THIS REQUEST FOR APPLICATIONS CONTAINS THE FOLLOWING INFORMATION:
    1. Request for Applications
    2. Overview of the Institute's Research Programs
    3. Purpose and Background
    4. Requirements of the Proposed Research
    5. Applications Available
    6. Mechanism of Support
    7. Funding Available
    8. Eligible Applicants
    9. Special Requirements
    10. Letter of Intent
    11. Submitting an Application
    12. Contents and Page Limits of Application
    13. Application Processing
    14. Peer Review Process
    15. Review Criteria for Scientific Merit
    16. Receipt and Review Schedule
    17. Award Decisions
    18. Where to Send Inquiries
    19. Program Authority
    20. Applicable Regulations
    21. References

    1. REQUEST FOR APPLICATIONS

    The Institute of Education Sciences (Institute) invites applications for research projects that will contribute to its Language and Vocabulary Development Special Education Research program (Language/Vocabulary). For this competition, the Institute will consider only applications that meet the requirements outlined below under the section on Requirements of the Proposed Research.


    For the purpose of this Request for Applications (RFA), students with disabilities are as defined in the Individuals with Disabilities Education Act as a child (i) with mental retardation, hearing impairments (including deafness), speech or language impairments, visual impairments (including blindness), serious emotional disturbance (referred to in this title as emotional disturbance), orthopedic impairments, autism, traumatic brain injury, other health impairments, or specific learning disabilities; and (ii) who, by reason thereof, needs special education and related services. (Part A, Sec. 602). Students who have already been identified with a disability as defined here, as well as students at high risk of developing a disability without preventive services, are of interest under this RFA.

    2. OVERVIEW OF THE INSTITUTE'S RESEARCH PROGRAMS

    The Institute supports research that contributes to improved academic achievement for all students, and particularly for those whose education prospects are hindered by conditions associated with poverty, minority status, disability, family circumstance, and inadequate education services. Although many conditions may affect academic outcomes, the Institute supports research on those that are within the control of the education system, with the aim of identifying, developing, and validating effective education programs and practices. The conditions of greatest interest to the Institute are curriculum, instruction, assessment and accountability, the quality of the teaching and administrative workforce, resource allocation, and the systems and policies that affect these conditions and their interrelationships. In this section, the Institute describes the overall framework for its research grant programs. Specific information on the competition(s) described in this announcement begins in Section 3.

    The Institute addresses the educational needs of typically developing students through its Education Research programs and the needs of students with disabilities through its Special Education Research programs. Both the Education Research and the Special Education Research programs are organized by academic outcomes (e.g., reading, mathematics), type of education condition (e.g., curriculum and instruction; teacher quality; administration, systems, and policy), grade level, and research goals.

    a. Outcomes. The Institute's research programs focus on improvement of the following education outcomes: (a) readiness for schooling (pre-reading, pre-writing, early mathematics and science knowledge and skills, and social development); (b) academic outcomes in reading, writing, mathematics, and science; (c) student behavior and social interactions within schools that affect the learning of academic content; (d) skills that support independent living for students with significant disabilities; and (e) educational attainment (high school graduation, enrollment in and completion of post-secondary education).

    b. Conditions. In general, each of the Institute's research programs focuses on a particular type of condition (e.g., curriculum and instruction) that may affect one or more of the outcomes listed previously (e.g., reading). The Institute's research programs are listed below according to the primary condition that is the focus of the program.

    (i) Curriculum and instruction. Several of the Institute's programs focus on the development and evaluation of curricula and instructional approaches. These programs include: (1) Reading and Writing Education Research, (2) Mathematics and Science Education Research, (3) Cognition and Student Learning Education Research, (4) Reading and Writing Special Education Research, (5) Mathematics and Science Special Education Research, (6) Language and Vocabulary Development Special Education Research, (7) Serious Behavior Disorders Special Education Research, (8) Early Intervention and Assessment for Young Children with Disabilities Special Education Research, and (9) Secondary and Post-Secondary Outcomes Special Education Research.

    (ii) Teacher quality. A second condition that affects student learning and achievement is the quality of teachers. The Institute funds research on how to improve teacher quality through its programs on (10) Teacher Quality Read/Write Education Research, (11) Teacher Quality Math/Science Education Research, (12) Teacher Quality Read/Write Special Education Research, and (13) Teacher Quality Math/Science Special Education Research.

    (iii) Administration, systems, and policy. A third approach to improving student outcomes is to identify systemic changes in the ways in which schools and districts are led, organized, managed, and operated that may be directly or indirectly linked to student outcomes. The Institute takes this approach in its programs on (14) Individualized Education Programs Special Education Research, (15) Education Finance, Leadership, and Management Research, (16) Assessment for Accountability Special Education Research, and (18) Research on High School Reform.

    Applicants should be aware that some of the Institute's programs cover multiple conditions. Of the programs listed above, these include (3) Cognition and Student Learning, (14) Individualized Education Programs Special Education Research, and (15) Education Finance, Leadership, and Management. Finally, the Institute's National Center for Education Statistics supports the (17) National Assessment of Educational Progress (NAEP) Secondary Analysis Research Program. The NAEP Secondary Analysis program funds projects that cut across conditions (programs, practices, and policies) and types of students (regular education and special education students).

    c. Grade levels. The Institute's research programs also specify the ages or grade levels covered in the research program. The specific grades vary across research programs and within each research program, and grades may vary across the research goals. In general, the Institute supports research for (a) pre-kindergarten and kindergarten, (b) elementary school, (c) middle school, (d) high school, (e) post-secondary education, (f) vocational education, and (g) adult education.

    d. Research goals. The Institute has established five research goals for its research programs (http://www.ed.gov/about/offices/list/ies/programs.html). Within each research program one or more of the goals may apply: (a) Goal One: identify existing programs, practices, and policies that may have an impact on student outcomes and the factors that may mediate or moderate the effects of these programs, practices, and policies; (b) Goal Two: develop programs, practices, and policies that are potentially effective for improving outcomes; (c) Goal Three: establish the efficacy of fully developed programs, practices, or policies that either have evidence of potential efficacy or are widely used but have not been rigorously evaluated; (d) Goal Four: provide evidence on the effectiveness of programs, practices, and policies implemented at scale; and (e) Goal Five: develop or validate data and measurement systems and tools.

    Applicants should be aware that the Institute does not fund research on every condition and every outcome at every grade level in a given year. For example, at this time, the Institute is not funding research on science education interventions (curriculum, instructional approaches, teacher preparation, teacher professional development, or systemic interventions) at the post-secondary or adult education levels. Similarly, at this time, the Institute is not funding research on measurement tools relevant to systemic conditions at the post-secondary or adult levels.

    For a list of the Institute's FY 2006 grant competitions, please see Table 1 below. This list includes the Postdoctoral Research Training Fellowships in the Education Sciences, which is not a research grant program. Funding announcements for these competitions may be downloaded from the Institute's website at http://www.ed.gov/about/offices/list/ies/programs.html. Release dates for the Requests for Applications vary by competition.

    Table 1: FY 2006 Research Grant Competitions
    1. Reading and Writing Education Research
    2. Mathematics and Science Education Research
    3. Cognition and Student Learning Education Research
    4. Reading and Writing Special Education Research
    5. Mathematics and Science Special Education Research
    6. Language and Vocabulary Development Special Education Research
    7. Serious Behavior Disorders Special Education Research
    8. Early Intervention and Assessment for Young Children with Disabilities Special Education Research
    9. Secondary and Post-Secondary Outcomes Special Education Research
    10. Teacher Quality Read/Write Education Research
    11. Teacher Quality Math/Science Education Research
    12. Special Education Teacher Quality Research Read/Write
    13. Special Education Teacher Quality Research Math/Science
    14. Individualized Education Programs Special Education Research
    15. Education Finance, Leadership, and Management Research
    16. Assessment for Accountability Special Education Research
    17. National Assessment of Educational Progress Secondary Analysis Research Program
    18. Research on High School Reform
    19. Education Research and Development Centers
    20. Postdoctoral Research Training Fellowships in the Education Sciences

    3. PURPOSE AND BACKGROUND

    A. Purpose of the Language/Vocabulary Special Education Research Program
    Through its Language/Vocabulary research program, the Institute intends to contribute to the improvement of language and vocabulary skills and reading outcomes of students with disabilities by: (a) identifying curriculum and instructional practices that are potentially effective for improving language and vocabulary outcomes for children with disabilities; (b) developing interventions for addressing the underlying causes of language and vocabulary difficulties of students with disabilities from kindergarten through middle school; (c) establishing the efficacy of existing interventions and approaches for reducing/preventing language and vocabulary difficulties of students with disabilities from kindergarten through middle school; (d) providing evidence on the effectiveness of language or vocabulary development interventions implemented at scale, whether they focus on language and vocabulary outcomes or reading/writing outcomes; and (e) developing and validating language or vocabulary assessment tools for students with disabilities in kindergarten through middle school. Interventions appropriate for development and/or evaluation under this program are interventions intended to improve language or vocabulary outcomes of children with disabilities or children at high risk for learning disabilities. Interventions may be school-based interventions or home-based interventions that are integrated with school-based interventions and intended to support children's success in school. The long-term outcome of this program will be an array of tools and strategies (e.g., assessment tools, instructional approaches) that have been documented to be effective for improving the language and vocabulary skills of children with disabilities from kindergarten through middle school.

    B. Background of the Language/Vocabulary Special Education Research Program
    Several noteworthy and comprehensive reading research reviews conducted in the last decade (Adams, 1990; National Research Council, 1998; National Reading Panel, 2000) now make conspicuously apparent the role of oral vocabulary in language development, beginning reading instruction, and reading comprehension. In the early preschool and primary years, oral vocabulary far outstrips print vocabulary (Kamil & Hiebert, in press, p. 1-4) when words are known largely as oral representations. However, print vocabulary quickly takes on an increasingly larger role in literacy than does oral vocabulary (Kamil & Hiebert, in press) as children move from learning to read to reading to learn (Chall, 1983; Kameenui, Adams, & Lyon, 1990) in more sophisticated content area texts, and the demands of learning unfamiliar vocabulary words in both oral and print form increase dramatically. The demands of unfamiliar words in both oral and written form exact a range of consequences on the productive (speaking or writing) and receptive (listening or reading) vocabulary knowledge of students with disabilities, who are often challenged mightily in their efforts to gain cognitive access to words in whatever form or context. Students with disabilities do not attain the same performance thresholds as their peers on a range of language, reading, mathematics, and state outcome measures.

    Vocabulary learning manifests its greatest growth and development during the schooling years, while systematically and dependably exerting its greatest impact on academic achievement measures at successive grade levels and beyond (Nagy, in press). Plainly, a long-term, comprehensive program of vocabulary instruction and learning throughout a child's growth and development in both home and school is essential for all children, but particularly for children with disabilities, who are often at serious risk in early language development, intellectual growth, and reading competence, and concomitantly at serious risk for later academic and social development as young adults and adults as full participants in the life of a community.

    Current research on vocabulary learning and development offers general education practitioners a reasonable set of tenets and a range of effective instructional strategies to guide vocabulary instruction (Baumann & Kameenui, 2004; Kamil & Hiebert, in press; NICHD, 2000), including: (a) teaching individual vocabulary words directly to comprehend specific texts, (b) providing multiple exposures to vocabulary words in multiple contexts, (c) utilizing rich contexts for vocabulary learning, (d) restructuring vocabulary tasks to clarify student response requirements, (e) maximizing active engagement when learning vocabulary, (f) recognizing the efficacy of incidental learning of vocabulary, and (g) relying on multiple instructional methods for a range of vocabulary learning tasks and outcomes. The extension and application of these strategies to the vocabulary learning and development of students with disabilities require further programmatic research.

    The Institute intends to address the need for education interventions to improve language and vocabulary development of students with disabilities through its Language/Vocabulary research program. For example, how many words, and which particular words, should be taught and learned, at what particular points in a child's vocabulary growth and reading development from kindergarten through middle school, for what specific social and academic purposes and contexts, and to what criterion levels of performance? What levels of instructional intensity (e.g., high intensity includes daily levels of frequent and distributed practice on selected word meanings in a range of text or non-text conditions), specificity (e.g., highly specified instruction includes explicit teacher scaffolding of verbal support and prompting), or linguistic or literacy emphasis (e.g., complex linguistic emphasis includes vocabulary words taught in rich, connected text; simple linguistic emphasis includes vocabulary taught through a sequence of word association examples) are necessary to ensure high threshold levels of vocabulary performance on a range of vocabulary and reading measures? The program will support research to develop and evaluate the effectiveness of interventions aimed at enhancing vocabulary growth and independent reading/writing for students with disabilities. The Institute also recognizes the need for, and seeks applications to develop, measures that are technically sound, sensitive to instruction, and permit an analysis of vocabulary growth and development in different linguistic units at different points in time.

    4. REQUIREMENTS OF THE PROPOSED RESEARCH

    A. General Requirements

    a. Interventions/assessments intended for individuals with disabilities. This competition is restricted to research directed to individuals with disabilities and individuals at high risk for developing disabilities without preventive services, as previously defined (see 1. Request for Applications).

    b. Applying to multiple competitions. Applicants may submit proposals to more than one of the Institute's FY 2006 competitions. Applicants may submit more than one proposal to a particular competition. However, applicants may only submit a given proposal once (i.e., applicants may not submit the same proposal or very similar proposals to multiple competitions or to multiple goals in the same competition).

    c. Applying to a particular goal within a competition. To submit an application to one of the Institute's education research programs, applicants must choose the specific goal under which they are applying. Each goal has specific requirements.


    d. Inclusions and restrictions for the Language/Vocabulary research program.

    (i) For the FY 2006 Language/Vocabulary competition, applicants must submit under Goal One, Goal Two, Goal Three, Goal Four, or Goal Five. Goal One incorporates efforts to identify conditions that are associated with and are potential determinants of language and vocabulary development for students with disabilities. The understanding developed through Goal One awards is expected to be relevant to the design and implementation of future interventions. The typical methodology for Goal One will be the analysis of existing databases, including state longitudinal databases, using statistical approaches that allow for testing models of the relationships among variables in ways that strengthen hypotheses about paths of influence. For the FY 2006 Language/Vocabulary competition, Goal One is limited to students with disabilities from kindergarten through middle school.

    (ii) Applicants proposing to develop new interventions should apply under Goal Two. Under Goal Three, the Institute will accept proposals to conduct efficacy or replication trials of interventions. Goal Four targets evaluations of the effectiveness of interventions implemented at scale. The second through fourth goals can be seen as a progression from development (Goal Two) to efficacy (Goal Three) to effectiveness at scale (Goal Four). Goals Two, Three, and Four are limited to interventions for students with disabilities and students at high risk for disabilities from kindergarten through middle school.

    (iii) Goal Five is to develop and validate measurement tools for students with disabilities and students at high risk for disabilities from kindergarten through middle school.

    (iv) Applicants interested in interventions or assessments for young children with disabilities at the pre-kindergarten level should refer to the Institute's Early Intervention and Assessment of Young Children with Disabilities research program (http://www.ed.gov/about/offices/list/ies/programs.html).

    B. Applications under Goal One (Identification)

    Because the requirements for Goals One through Four are essentially the same across the Institute's competitions, a generic description is used in all of the relevant funding announcements. Consequently, the examples provided may not apply to a particular competition.

    a. Purpose of identification studies. Through all of its research programs that include the Identification goal (Goal One), the Institute is primarily interested in analyses of multivariate data, such as longitudinal individual student data that exist in a number of state-level and district-level databases, to identify existing programs, practices, and policies that may have an impact on academic outcomes and to examine factors that may mediate or moderate the effects of these programs, practices, and policies.


    For Goal One, the Institute expects investigators typically to use existing longitudinal data sets to capitalize on natural variation or discontinuities in education practices or policies. For example, in a particular year, a large district might have implemented a policy to hire master reading teachers for elementary schools. An investigator might propose interrupted time series analyses of the district's longitudinal datasets to examine changes in student outcomes that follow the implementation of the new policy. As a second example, a state may have implemented a new policy for early intervention for children with disabilities that provides financial incentives for existing daycare and preschool providers to include children with disabilities within their programs. A researcher might utilize the state's administrative database on preschool programs funded through the state's pre-kindergarten initiative to determine the degree to which the new policy changed the rate of inclusion, the conditions that were correlated with the variations in the uptake of the new policy by individual preschool providers, and the performance of children with disabilities on assessments. These quantitative data might be augmented by interviews with administrators and teachers to garner impressions on barriers and challenges to implementing the new policy. The aim would be to develop a quantitative description of how the new policy is working, and hypotheses derived from both quantitative and qualitative data about how it could be made to work more effectively.
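    As an illustration of the kind of interrupted time series analysis mentioned in the example above, the minimal Python sketch below fits a simple segmented regression. The file name, column names, and policy year are hypothetical placeholders, and this is only one of many analytic approaches an applicant might justify.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical district extract: one row per school-year with a mean reading
        # score; the policy of interest is assumed to begin in 2003.
        df = pd.read_csv("district_scores.csv")      # assumed columns: year, mean_reading_score
        policy_year = 2003

        df["post"] = (df["year"] >= policy_year).astype(int)           # level shift after the policy
        df["years_since"] = (df["year"] - policy_year).clip(lower=0)   # slope change after the policy

        # Segmented regression: pre-existing trend, level change at implementation,
        # and change in trend following implementation.
        model = smf.ols("mean_reading_score ~ year + post + years_since", data=df).fit()
        print(model.summary())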

    Value-added analyses can often strengthen the conclusions drawn from multivariate or interrupted time series analyses. Value-added analyses use statistically adjusted gain scores for individual students to control for student characteristics when estimating the effects of other variables. For example, the analysis of the relationship between teacher professional development and reading outcomes described previously would be more persuasive if individual student outcomes in a particular year were adjusted for student scores on the same or a similar assessment at the end of the previous school year.
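    A minimal sketch of this kind of adjustment is given below, assuming a hypothetical student-level file and invented variable names; the coefficient on the program indicator, conditioned on the prior-year score, plays the role of the statistically adjusted gain described above.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical file: this year's score, last year's score on the same or a
        # similar assessment, and a 0/1 indicator for exposure to the program of interest.
        df = pd.read_csv("student_scores.csv")   # assumed columns: score_2005, score_2004, in_program

        # Adjusting for the prior-year score yields a value-added style estimate of the
        # program's association with current-year outcomes.
        adjusted = smf.ols("score_2005 ~ score_2004 + in_program", data=df).fit()
        print(adjusted.params["in_program"])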

    Evidence of the potential effectiveness of a program, practice, or policy obtained through a Goal One project may be used to support a subsequent application for a Goal Three (Efficacy) project.

    b. Methodological requirements.

    (i) Database. The applicant should describe clearly the database(s) to be used in the investigation, including information on sample characteristics, variables to be used, and the ability to ensure access to the database if the applicant does not already have access to it. The database should be described in sufficient detail so that reviewers will be able to judge whether or not the proposed analyses may be conducted with the database. If multiple databases will be linked to conduct analyses, applicants should provide sufficient detail for reviewers to be able to judge the feasibility of the plan.

    The applicant should describe the primary outcome measures to be used, including reliability and validity. In particular, applicants should provide sufficient information on the construct validity of the proposed measures. For example, if the applicant proposes to use a state database from which the primary outcome measure will be high school dropout rates, the applicant should detail how the high school dropout rates are derived.


    (ii) Primary data collection (optional). For some projects, applicants may need to collect original data; these data will generally be used to supplement an existing longitudinal database in order to answer the question of interest. In such cases, the application must detail the methodology and procedures proposed for the primary data collection. Applicants should describe the sample and how the sample is related to or links to the proposed secondary database, the measures to be used (including information on the reliability and validity of the proposed instruments), and data collection procedures.

    (iii) Data analysis. The applicant must include detailed descriptions of data analysis procedures. Because predictor variables relevant to education outcomes (e.g., student characteristics, teacher characteristics, school and district characteristics) often covary, the Institute expects investigators to utilize the most appropriate state-of-the-art analytic techniques to isolate the possible effects of variables of interest. Analytic strategies should allow investigators to examine mediators and moderators of programs and practices. The relation between hypotheses, measures, independent and dependent variables should be well specified.
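    For illustration only, one common way to examine a moderator is an interaction term in a regression model. The sketch below assumes a hypothetical extract from a longitudinal database with invented column names; it is not a required or endorsed analytic approach.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical student-level extract: a reading outcome, a 0/1 indicator for
        # exposure to the program of interest, and a candidate moderator (English learner status).
        df = pd.read_csv("state_extract.csv")    # assumed columns: reading, in_program, ell

        # The in_program:ell interaction estimates whether the program's association
        # with reading outcomes differs for English learners (moderation).
        model = smf.ols("reading ~ in_program * ell", data=df).fit()
        print(model.params)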

    c. Personnel and resources. Competitive applicants will have research teams that collectively demonstrate expertise in (a) the relevant academic content area (e.g., reading, mathematics), including, where applicable, teacher education; and (b) implementation of and analysis of results from the research design that will be employed. Competitive applicants will have access to institutional resources that adequately support research.

    d. Awards. Typical awards for projects at this level are $100,000 to $250,000 (total cost = direct + indirect costs) per year for 1 or 2 years. The size of the award depends on the scope of the project.

    C. Applications under Goal Two (Development)

    a. Purpose of Goal Two (Development). Through all of its research programs that include the Development goal (Goal Two), the Institute intends to support the development of interventions (programs, practices, and policies). From the Institute's standpoint, a funded development project would be successful if, at the end of the 2- or 3-year development award, the investigators had a fully developed version of the proposed intervention, including, for example, materials for students and teachers and preliminary data demonstrating the potential of the intervention for improving student outcomes. The Institute anticipates that investigators with successful development projects would submit proposals to subsequent competitions for Goal Three (Efficacy) awards. Thus, Goal Two applicants should be aware that the type of data (e.g., measures of student learning and achievement) they propose to collect under Goal Two awards should prepare them to apply for Goal Three awards.

    The Institute recognizes that research on children with disabilities often utilizes alternative research designs due to low incidence of specific disabilities. In such cases, single subject designs are appropriate.

    b. Requirements for proposed intervention. Under Goal Two, the Institute will consider interventions that are in the early stages of development (e.g., those that do not have an entire curriculum ready to evaluate). Applicants should provide a strong rationale to support the use of the proposed intervention (e.g., curriculum, instructional practice, teacher professional development program or professional development delivery model). Reviewers will consider whether there is a strong theoretical foundation for the proposed intervention and whether the proposed intervention is grounded in empirical research. For example, a proposed language intervention might be based on empirical data on language and vocabulary development for students with disabilities obtained through laboratory studies or on the gaps reported in the National Reading Panel Report (NICHD, 2000). In other cases, applicants might have already developed some components of the intervention and have pilot data showing the potential efficacy of those components. In such cases, the proposed project might be to complete the development of the intervention and collect data on the potential efficacy of the intervention. Alternatively, one could imagine a proposal to develop an intervention for struggling high school readers that is based on an intervention developed for upper elementary and middle school students and for which there are some data showing the potential of the intervention for improving reading comprehension. In this case, the applicant would be proposing to modify this existing intervention to make it appropriate for high school students who are struggling readers and to collect data on the potential efficacy of the modified intervention. The point is that applicants should clearly and concisely articulate why the proposed intervention, as opposed to some other type of intervention, should be developed. Why is the proposed intervention likely to be successful for improving student learning and achievement?

    In the rationale to support the proposed intervention, applicants should also address the practical importance of the proposed intervention. For example, when the proposed intervention is fully developed, will it form a set of math instructional strategies that has the potential to improve students' mathematics test scores in educationally meaningful increments, if it were implemented over the course of a semester or school year? Is the planned intervention sufficiently comprehensive, for instance, to address multiple types of difficulties that students encounter in mastering algebra and to lead to improvements in students' grades or mathematics achievement test scores? In addition, would the proposed intervention be both affordable for schools and easily implemented by schools (e.g., not involve major adjustments to normal school schedules)? Appropriate applications for Goal Two may include, for example, proposals to develop and test curriculum materials that ultimately could be combined to form a complete stand-alone curriculum for a grade. Also appropriate would be proposals to develop supplementary materials that would be used in conjunction with existing curricula.

    Finally, the Institute recognizes there are some fully developed interventions that would not qualify for investigation under Goal Three because there are no student outcome data indicating potential efficacy (as defined below) nor is there widespread use. In such cases, applicants may apply under Goal Two for support to conduct a small study to test whether the intervention shows evidence of potential efficacy as defined below. Such projects are limited to a maximum of 2 years of support because the Institute expects the investigator to be ready to implement the intervention in schools or other education delivery settings at the beginning of the award period. The applicant should clearly state in the beginning of the research narrative that he or she is applying under Goal Two with a fully developed intervention that has not been previously evaluated using student outcome measures.


    c. Methodological requirements. In addition to providing a strong rationale for the proposed intervention, applicants should clearly and completely describe the proposed research methods for obtaining evidence of the potential efficacy of the proposed intervention. By potential efficacy, the Institute means that there are student outcome data indicating that exposure to the intervention is at least correlated with increases in student performance. For example, the applicant might compare pre-intervention to post-intervention gain scores on a standard measure of reading comprehension between students whose teachers received a new professional development program on reading instruction and students whose teachers did not receive professional development on reading instruction. Alternatively, the applicant might compare end-of-year science achievement scores in classrooms using the intervention with district scores for the same grade level. The Institute recognizes that such data do not provide causal evidence of the impact of the intervention on student outcomes. However, the purpose of the Development goal is to provide funds to develop interventions that, on the basis of the theoretical rationale and relevant empirical evidence, appear to have the potential to improve student learning, and to collect preliminary data that would permit a reasonable evaluation of whether or not the intervention has sufficient potential to merit further investment.
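    To make the gain-score comparison in the example above concrete, a minimal sketch is given below; the file and column names are hypothetical, and, as the text notes, the comparison yields correlational evidence of potential efficacy rather than a causal estimate.

        import pandas as pd
        from scipy import stats

        # Hypothetical pilot data: one row per student, with fall and spring scores on a
        # standard reading comprehension measure and a 0/1 flag for whether the student's
        # teacher received the new professional development program.
        df = pd.read_csv("pilot_scores.csv")        # assumed columns: fall, spring, pd_group

        df["gain"] = df["spring"] - df["fall"]      # pre-to-post gain for each student

        # Compare mean gains for the two groups of students.
        treated = df.loc[df["pd_group"] == 1, "gain"]
        comparison = df.loc[df["pd_group"] == 0, "gain"]
        print(stats.ttest_ind(treated, comparison, equal_var=False))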

    (i) Sample. The applicant should define, as completely as possible, the sample to be selected and sampling procedures to be employed for the proposed study. Additionally, if the applicant proposes a longitudinal study, the applicant should show how the long-term participation of those sampled would be assured.

    (ii) Design. The applicant must provide a detailed research design. Applicants should describe how potential threats to internal and external validity will be addressed. Although procedures differ between group evaluations and single subject evaluations, regardless of approach, these issues should be addressed. For example, in single subject designs, applicants should consider the anticipated size of the intervention effect and how they would handle within-subject variability in the outcome measure in the baseline or comparison condition, variability in response to treatment within participants across time, and variability in response to treatment between subjects. Applicants should indicate the number of replications anticipated to establish external validity. In essence, what criteria will the applicant use to determine if the response to the treatment is sufficiently large and sufficiently replicated to be generalizable beyond the participants included in the study?

    (iii) Measures. For all proposals under Goal Two, investigators must include measures of relevant student outcomes (e.g., measures of reading or mathematics achievement). Applicants to the Teacher Quality competitions must include measures of teacher practice, as well as measures of student learning and achievement. The applicant should provide information on the reliability and validity of the selected measures and justify the appropriateness of the proposed measures.

    All applicants should note that data that only describe process (e.g., observations of student behavior during planned lessons, a case study of the implementation of the curriculum, a discourse analysis of classroom discussions) or data only on teacher or student perception of improvement or ease of use will not be considered as sufficient evidence of the potential efficacy of the intervention.

    (iv) Process data. Although the applicant must include relevant student outcome data to address the question of potential efficacy, this requirement does not preclude the collection of process data. In fact, the Institute encourages the collection of such data, which can help the researcher refine the intervention and provide insight into why an intervention does or does not work, and is or is not well implemented. Observational, survey, or qualitative methodologies are encouraged as a complement to quantitative measures of student outcomes to assist in the identification of factors that may, for example, explain the effectiveness or ineffectiveness of the intervention or identify conditions that hinder implementation of the intervention.

    (v) Data analysis. The applicant must include detailed descriptions of data analysis procedures. For quantitative data, specific statistical procedures should be cited. The relation between hypotheses, measures, independent and dependent variables should be clear. For qualitative data, the specific methods used to index, summarize, and interpret data should be delineated. For single subject studies, where applicable, the Institute expects applicants to describe what statistical methods (e.g., time series analyses) will be conducted to determine if the size of the effect is significant.

    d. Personnel and resources. Competitive applicants will have research teams that collectively demonstrate expertise in (a) the specific academic domain (e.g., vocabulary development, or reading, mathematics, and, if applicable, teacher education), (b) implementation of and analysis of results from the research design that will be employed, and (c) working with teachers, schools, or other education delivery settings that will be employed. Competitive applicants will have access to institutional resources that adequately support research activities and access to education delivery settings in which to conduct the research.

    An applicant may involve for-profit entities in the project. Involvement of the commercial developer or distributor must not jeopardize the objectivity of the evaluation. Collaborations including for-profit developers or distributors of education products must justify the need for Federal assistance to undertake the evaluation of programs that are marketed to consumers and consider sharing the cost of the evaluation.

    e. Awards. Typical awards for projects at this level are $150,000 to $500,000 (total cost = direct + indirect costs) per year for 2 to 3 years. The size of the award depends on the scope of the project.

    D. Applications under Goal Three (Efficacy and Replication Trials)

    Under Goal Three, the Institute requests proposals to test the efficacy of fully developed interventions that already have evidence of potential efficacy. By efficacy, the Institute means the degree to which an intervention has a net positive impact on the outcomes of interest in relation to the program or practice to which it is being compared.


    a. Purpose of efficacy and replication trials. Through all of its research programs that include the Efficacy and Replication goal (Goal Three), the Institute intends to fund efficacy trials to determine whether or not fully developed interventions (programs, practices, policies) are effective under specified conditions (e.g., large urban high schools with large class sizes and high turnover rate among teachers) and with specific types of students (e.g., low income or a high proportion of English language learners). Results from efficacy projects have less generalizability than results from effectiveness trials under Goal Four. The limited generalizability can arise both from the lack of a full range of types of settings and participants in the study, as well as through the intensive involvement of the developers and researchers in the implementation of the intervention. A well designed efficacy trial provides evidence on whether an intervention can work, but not whether it would work if deployed widely. Under Goal Three, applicants may propose an efficacy trial to determine if an intervention will work under specific conditions or a replication trial to determine if an intervention shown to produce a net positive impact in one setting will produce a net positive impact in a different setting or with a different population of students.

    Under Goal Three, an applicant might propose to examine the efficacy of the intervention in an experimental study in which half of the classrooms are randomly assigned to the intervention condition and half of the classrooms are assigned to continue to use the district's standard curriculum. If the research team hypothesized that the level of teacher professional development would meaningfully affect implementation and student outcomes, the team might propose instead to randomly assign one-third of the classrooms to an intervention condition in which teachers receive a training workshop for implementing the treatment curriculum at the beginning of the year, one-third of the classrooms to an intervention condition in which teachers receive the training workshop on implementation of the treatment curriculum with follow-up coaching sessions during the year, and one-third of the classrooms to continue to use the district's standard curriculum. The point is that applicants should use efficacy and replication trials to determine the conditions, if any, under which an intervention produces meaningful improvement on academic outcomes.

    Also of interest to the Institute are proposals to compare the impact of two interventions that are based on different theoretical models. In such cases, the purpose might be to compare the efficacy of two well-developed approaches to improving student learning.

    From the Institute's standpoint, a funded Efficacy/Replication project would be methodologically successful if, at the end of the grant period, the investigators had rigorously evaluated the impact of a clearly specified intervention on relevant student outcomes and under clearly described conditions, using a research design that meets the Institute's What Works Clearinghouse Level 1 study criteria (http://whatworks.ed.gov), whether or not the intervention is found to improve student outcomes relative to the comparison condition. Further, the Institute would consider methodologically successful projects to be pragmatically successful if the rigorous evaluation determined that the intervention has a net positive impact on student outcomes in relation to the program or practice to which it is being compared.

    The Institute recognizes that research on children with disabilities often utilizes alternative research designs for determining the causal impact of an intervention due to small populations of children with specific disabilities.


    (i) Sample. The applicant should define, as completely as possible, the sample to be selected and sampling procedures to be employed for the proposed study. Additionally, the applicant should describe strategies to ensure that participants will remain in the study over the course of the evaluation.

    (ii) Design. The applicant must provide a detailed research design. Applicants should describe how potential threats to internal and external validity will be addressed. Studies using randomized assignment to treatment and comparison conditions are strongly preferred. When a randomized trial is used, the applicant should clearly state the unit of randomization (e.g., students, classroom, teacher, or school). Choice of randomizing unit or units should be grounded in a theoretical framework. Applicants should explain the procedures for assignment of groups (e.g., schools, classrooms) or participants to treatment and comparison conditions.

    Only in circumstances in which a randomized trial is not possible may alternatives that substantially minimize selection bias or allow it to be modeled be employed. Applicants proposing to use a design other than a randomized design must make a compelling case that randomization is not possible. Acceptable alternatives include appropriately structured regression-discontinuity designs or other well-designed quasi-experimental designs that come close to true experiments in minimizing the effects of selection bias on estimates of effect size. A well-designed quasi-experiment is one that reduces substantially the potential influence of selection bias on membership in the intervention or comparison group. This involves demonstrating equivalence between the intervention and comparison groups at program entry on the variables that are to be measured as program outcomes (e.g., reading achievement test scores), or obtaining such equivalence through statistical procedures such as propensity score balancing or regression. It also involves demonstrating equivalence or removing statistically the effects of other variables on which the groups may differ and that may affect intended outcomes of the program being evaluated (e.g., demographic variables, experience and level of training of teachers, motivation of parents or students). Finally, it involves a design for the initial selection of the intervention and comparison groups that minimizes selection bias or allows it to be modeled. For example, a very weak quasi-experimental design that would not be acceptable as evidence of program efficacy would populate the intervention condition with students who volunteered for the program to be evaluated, and would select comparison students who had the opportunity to volunteer but did not. In contrast, an acceptable design would select students in one particular geographical area of a city to be in the intervention, whereas students in another geographical area, known to be demographically similar, would be selected to be in the comparison condition. In the former case, self-selection into the intervention is very likely to reflect motivation and other factors that will affect outcomes of interest and that will be impossible to equate across the two groups. In the latter case, the geographical differences between the participants in the two groups would ideally be unrelated to outcomes of interest, and in any case, could be measured and controlled for statistically.

    For instances in which small populations of children with specific disabilities restrict the possibilities for conducting group evaluations, rigorous single subject designs are appropriate for demonstrating the efficacy of an intervention. See 4.D.d. Requirements for single subject designs for details.

    (iii) Power. Applicants should clearly address the power of the evaluation design to detect a reasonably expected and minimally important effect. Many evaluations of education interventions are designed so that clusters or groups of students, rather than individual students, are randomly assigned to treatment and comparison conditions. In such cases, the power of the design depends in part on the degree to which the observations of individuals within groups are correlated with each other on the outcomes of interest. For determining the sample size, applicants need to consider the number of clusters, the number of individuals within clusters, the potential adjustment from covariates, the desired effect, the intraclass correlation (i.e., the variance between clusters relative to the total variance between and within clusters), and the desired power of the design (note that other factors may also affect the determination of sample size, such as using one-tailed vs. two-tailed tests, repeated observations, attrition of participants, etc.; see Donner & Klar, 2000; Murray, 1998; W.T. Grant Foundation, http://www.wtgrantfoundation.org/info-url_nocat3040/info-url_nocat_show.htm?doc_id=225435&attrib_id=9485). When calculating the power of the design, applicants should anticipate the degree to which the magnitude of the expected effect may vary across the primary outcomes of interest. (A brief numerical sketch of this clustering adjustment appears after this list.)

    (iv) Measures. Investigators should include relevant standardized measures of student achievement (e.g., standardized measures of mathematics achievement or reading achievement) in addition to other measures of student learning and achievement (e.g., researcher-developed measures). For Special Education Teacher Quality applications, applicants must also include measures of teacher practices. The applicant should provide information on the reliability, validity, and appropriateness of proposed measures.

    (v) Fidelity of implementation of the intervention. Researchers should attend to questions of implementation and how best to train and support teachers in the use of these interventions. The applicant should specify how the implementation of the intervention will be documented and measured. The proposal should either indicate how the intervention will be maintained consistently across multiple groups (e.g., classrooms and schools) over time or describe the parameters under which variations in the implementation may occur. Investigators should propose research designs that permit the identification and assessment of factors impacting the fidelity of implementation.

    (vi) Comparison group, where applicable. The applicant should describe strategies they intend to use to avoid contamination between treatment and comparison groups. Comparisons of interventions against other conditions are only meaningful to the extent that one can tell what students in the comparison settings receive or experience. Applicants should include procedures for describing practices in the comparison groups. Applicants should be able to compare intervention and comparison groups on the implementation of key features of the intervention so that, for example, if there is no observed difference in student performance between intervention and comparison students, they can determine if key elements of the intervention were also practiced and implemented in the comparison groups.


    In evaluations of education interventions, students in the comparison group typically receive some kind of treatment (i.e., the comparison group is generally not a "no-treatment" control because the students are still in school experiencing the school's curriculum and instruction). For some evaluations, the primary question is whether the treatment is more effective than a particular alternative treatment. In such instances, the comparison group receives a well-defined treatment that is usually an important comparison to the target intervention for theoretical or pragmatic reasons. In other cases, the primary question is whether the treatment is more effective than what is generally available and utilized in schools. In such cases, the comparison group might receive what is sometimes called "business-as-usual." That is, the comparison group receives whatever the school or district is currently using or doing in a particular area. Business-as-usual generally refers to situations in which the standard or frequent practice across the nation is a relatively undefined education treatment. However, business-as-usual may also refer to situations in which a branded intervention (e.g., a published curriculum) is implemented with no more support from the developers of the program than would be available under normal conditions. In either case, using a business-as-usual comparison group is acceptable. When business-as-usual is one or another branded intervention, applicants should specify the treatment or treatments received in the comparison group. In all cases, applicants should account for the ways in which what happens in the comparison group is important to understanding the net impact of the experimental treatment. As noted in the preceding paragraph, applicants should be able to compare the intervention and comparison groups on key features of the intervention.

    The purpose here is to obtain information useful for post hoc explanations of why the experimental treatment does or does not improve student learning relative to the counterfactual.

    (vii) Mediating and moderating variables. Observational, survey, or qualitative methodologies are encouraged as a complement to experimental methodologies to assist in the identification of factors that may explain the effectiveness or ineffectiveness of the intervention. Mediating and moderating variables that are measured in the intervention condition and that are also likely to affect outcomes in the comparison condition should be measured in the comparison condition (e.g., student time-on-task, teacher experience/time in position).

    The evaluation should be designed to account for sources of variation in outcomes across settings (i.e., to account for what might otherwise be part of the error variance). Applicants should provide a theoretical rationale to justify the inclusion (or exclusion) of factors/variables in the design of the evaluation that have been found to affect the success of education programs (e.g., teacher experience, fidelity of implementation, characteristics of the student population). The research should demonstrate the conditions and critical variables that affect the success of a given intervention. The most scalable interventions are those that can produce the desired effects across a range of education contexts.


    (viii) Data analysis. All proposals must include detailed descriptions of data analysis procedures. For quantitative data, specific statistical procedures should be described. The relation between hypotheses, measures, independent and dependent variables should be clear. For qualitative data, the specific methods used to index, summarize, and interpret data should be delineated.

    Most evaluations of education interventions involve clustering of students in classes and schools and require the effects of such clustering to be accounted for in the analyses, even when individuals are randomly assigned to condition. For random assignment studies, applicants need to be aware that typically the primary unit of analysis is the unit of random assignment.

    Finally, documentation of the resources required to implement the program and a cost analysis need to be part of the study.
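    As a numerical illustration of the clustering adjustment discussed under (iii) Power above, the sketch below computes the design effect and the resulting effective sample size. The cluster size, number of clusters, and intraclass correlation are assumed values chosen only for illustration; they are not thresholds set by the Institute.

        # Assumed values for illustration: 20 classrooms per condition, 25 students per
        # classroom, and an intraclass correlation of 0.15 on the primary outcome.
        clusters_per_condition = 20
        students_per_cluster = 25
        icc = 0.15

        # Design effect: how much clustering inflates the variance of a group mean
        # relative to simple random sampling of individual students.
        design_effect = 1 + (students_per_cluster - 1) * icc

        n_per_condition = clusters_per_condition * students_per_cluster
        effective_n_per_condition = n_per_condition / design_effect

        print(design_effect)                # 4.6
        print(effective_n_per_condition)    # roughly 109 students per condition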

    d. Requirements for single subject designs. By single-subject designs, the Institute refers to experimental studies using reversal or multiple baseline or interrupted time series designs intended to demonstrate a causal relationship between two variables using a small number of participants or cases. We are not referring to descriptive case studies.

    (i) Sample. Applicants must define the criteria used for selecting participants and the settingfrom which participants are recruited with sufficient detail to allow other researchers toidentify similar individuals from similar settings. Defining selection criteria typicallyrequires specifying a particular disability, the measurement instrument, and criterion usedto identify the disability.

    (ii) Intervention. In addition to meeting the requirements for interventions listed above insubsection 4.D.b.Requirements for proposed intervention, applicants must describe theintervention in sufficient detail to allow other researchers to reliably replicate theintervention. Applicants must clearly specify how, when, and under what conditions theintervention will be implemented.

(iii) Fidelity of implementation. Applicants must describe how treatment fidelity will be measured, how frequently it will be assessed, and what degree of variation in treatment fidelity will be accepted over the course of the study.

(iv) Comparison condition. Applicants must describe the baseline or comparison condition in sufficient detail to allow other researchers to replicate the baseline condition.

(v) Measures. Applicants must identify and describe the dependent variables (DVs) or outcome measures, provide technical information on the reliability and validity of the measures, detail procedures for collecting observations, and, where applicable, specify procedures for determining inter-observer reliability and for monitoring inter-observer reliability during the study and over both baseline and treatment conditions (a minimal illustration follows).
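
As an illustration of one way inter-observer reliability is sometimes quantified, the sketch below computes simple percent agreement and Cohen's kappa for two observers coding the same intervals; the data, the coding scheme, and the choice of statistics are assumptions made for the example rather than requirements of this RFA.

```python
# Illustrative sketch: percent agreement and Cohen's kappa for two observers
# coding the same behavior (1 = behavior observed, 0 = not observed) across
# intervals. The data are hypothetical.
from collections import Counter

observer_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
observer_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]

n = len(observer_a)
agreement = sum(a == b for a, b in zip(observer_a, observer_b)) / n

# Cohen's kappa corrects percent agreement for agreement expected by chance.
counts_a = Counter(observer_a)
counts_b = Counter(observer_b)
p_chance = sum((counts_a[c] / n) * (counts_b[c] / n)
               for c in set(observer_a) | set(observer_b))
kappa = (agreement - p_chance) / (1 - p_chance)

print(f"Percent agreement: {agreement:.2f}")
print(f"Cohen's kappa:     {kappa:.2f}")
```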


of interest is, "Does this intervention produce a net positive increase in student learning and achievement relative to the variety of products or practices that are currently available and utilized by schools?"

b. Requirements for proposed intervention. To be considered for Goal Four awards, applicants should provide a clear rationale for the practical importance of the intervention. Applicants should address three questions. (1) Is the intervention likely to produce educationally meaningful effects on outcomes that are important to educational achievement (e.g., grades, achievement test scores) and, therefore, are of interest to parents, teachers, and education decision makers? (2) Is the intervention reasonably affordable to schools and other education delivery entities? (3) Is the intervention designed so that it is feasible for schools and other education delivery entities to implement the intervention? Interventions appropriate for study under Goal Four are interventions that have not yet been implemented at scale but have evidence of the efficacy of the program on a limited scale.

Applicants must provide strong evidence of the efficacy of the program as implemented on a small scale to justify the proposal to conduct a large-scale evaluation of the effectiveness of the intervention. As an example of strong evidence of efficacy, an applicant might describe the results of two or more small-scale, rigorously conducted evaluations using random assignment to intervention and comparison conditions in which the efficacy of the intervention is demonstrated with different populations of students (e.g., students from middle income families in suburban high schools and students from low income families in poor rural high schools). Evidence of efficacy from single-subject experimental designs would involve multiple studies in different settings that demonstrate causal effects. Alternatively, a single efficacy evaluation that involved schools from more than one district and included a diverse population of students could alone constitute sufficient evidence of the efficacy of the intervention. Importantly, the evidence of efficacy must be based on the results of randomized field trials or well-designed quasi-experimental evaluations.

c. Implementation of the intervention. One goal of evaluations of interventions implemented at scale is to determine if programs are effective when implemented at a distance from the developers of the program and with no more support from the developers of the program than would be available under normal conditions. A second goal is to determine if programs implemented under these conditions are effective in a variety of settings. Interventions that are effective at scale are those that can produce the desired effects across a range of education contexts. For Goal Four, the applicant should detail the conditions under which the intervention will be implemented and provide procedures that will capture the conditions and critical variables that affect the success of a given intervention.

d. Methodological requirements. For the methodological requirements for Goal Four projects, please refer to the methodological requirements listed under Goal Three.

e. Personnel and resources. Competitive applicants will have research teams that collectively demonstrate expertise in (a) the relevant academic content areas (e.g., reading, science, and where applicable, teacher education), (b) implementation of, and analysis of results from, the research design that will be employed, and (c) working with teachers, schools, or other education delivery settings.

An applicant may involve curriculum developers or distributors (including for-profit entities) in the project in a range of ways, from having the curriculum developers as full partners in its proposal to using off-the-shelf curriculum materials without involvement of the developer or publisher. Involvement of the curriculum developer or distributor must not jeopardize the objectivity of the evaluation. Collaborations including for-profit distributors of curriculum materials should justify the need for Federal assistance to undertake the evaluation of programs that are marketed to consumers and should consider sharing the cost of the evaluation.

When the developer of the intervention is involved in the project (whether or not the developer is a for-profit entity), applicants should clearly describe the role that the developer will take in the evaluation. Developers may not provide any training or support for the implementation that is not normally available to users of the intervention.

Competitive applicants will have access to institutional resources that adequately support research activities and access to schools in which to conduct the research. Applicants are required to document, via a letter of support from the education organization, the availability and cooperation of the schools or other education delivery settings that will be required to carry out the research proposed in the application.

f. Awards. The scope of Goal Four projects may vary. A smaller project might involve several schools within a large urban school district in which student populations vary in terms of SES, race, and ethnicity. A larger project might involve large numbers of students in several school districts in different geographical areas.

Awards for Goal Four projects may go up to a limit of $6,000,000 (total cost = direct + indirect costs) over a 5-year period; typical awards are less. Awards depend in part on the number of sites, the cost of data collection, and the cost of implementation. The size of the award depends on the scope of the project.

F. Applications under Goal Five (Measurement)

Across the Institute's research programs, the Measurement goals differ in purpose. Requirements described below apply to the Language/Vocabulary research program.

a. Requirements for Goal Five (Measurement) proposals to the Language/Vocabulary research program.

(i) Purpose of Goal Five proposals. Through Goal Five, the Institute intends to support the development of assessment tools for four purposes: screening, diagnosis, progress monitoring, and outcome evaluation.

Screening involves brief assessments conducted with all children at the beginning of the school year and targets skills that are strongly predictive of important future outcomes. The goal of screening is to identify children who are at high risk of failure and likely to need additional or alternative forms of instruction, either to supplement or supplant conventional instruction.

Diagnosis refers to more in-depth assessment of strengths and weaknesses in a particular domain, and should not be confused with assessment for the purpose of labeling children with disabilities. The goal of diagnostic assessment is to provide teachers with a profile of skills and deficits to guide instruction.

Progress monitoring is assessment of students' performance on critical criterion performance skills a minimum of three times a year, but typically more frequently (e.g., weekly, monthly, or quarterly), using alternate forms of a test. The purpose of progress monitoring is to estimate rates of improvement and to identify children who are not demonstrating adequate progress and therefore require supplementary instruction. Progress monitoring assessment provides information on a student's performance on an ongoing basis (e.g., weekly data on whether students are benefiting from a particular type of instruction). This information can be used to compare different types of instruction for a particular child on a frequent basis. Such monitoring provides a means for designing or redesigning instructional programs to accommodate the instructional needs of students with disabilities. (A minimal illustration of estimating a rate of improvement appears after these definitions.)

Outcome assessment is designed to determine whether or not students have achieved grade-level performance and whether or not their performance has improved.
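
For illustration only, the following minimal sketch shows one way a rate of improvement might be estimated from repeated progress-monitoring scores, by fitting a least-squares slope to weekly data; the scores and the goal-line comparison mentioned in the comments are hypothetical, not prescribed by this RFA.

```python
# Illustrative sketch: estimating a rate of improvement (slope) from weekly
# progress-monitoring scores for one student. The scores are hypothetical;
# any alternate-form assessment with repeated measures could be summarized
# this way.
import numpy as np

weeks = np.arange(1, 9)                               # eight weekly assessments
scores = np.array([12, 14, 13, 16, 18, 17, 20, 21])   # e.g., words read correctly

slope, intercept = np.polyfit(weeks, scores, deg=1)
print(f"Estimated rate of improvement: {slope:.2f} points per week")

# A rate well below the goal line (the growth needed to reach a benchmark by
# year's end) would flag the student for supplementary or modified instruction.
```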

(ii) Requirements of proposed assessments. Applicants under Goal Five should propose to develop assessments that can be used in education delivery settings for students with disabilities from kindergarten through middle school. Applications that would be appropriate for consideration under Goal Five include, but are not limited to: (a) proposals to develop new assessments; (b) proposals to modify or adapt existing assessments; and (c) proposals to adapt assessments originally designed and used for research purposes for broader use in instructional settings.

Applicants must provide a compelling rationale to support the development of the proposed assessment. Reviewers will consider the strength of the theoretical foundation for the proposed assessment, the existing empirical evidence supporting the proposed assessment, and whether the proposed assessment duplicates existing assessments. In developing these assessments, researchers should keep in mind the pragmatic constraints (e.g., number of students, limited class time, time required to train teachers to use the assessments, costs) that teachers and administrators will consider to determine whether the instrument is a viable option for use in classrooms and other education delivery settings. Applications should provide sufficient description of the proposed assessment and how it could be utilized within education delivery settings for reviewers to judge the practicality of the proposed assessment for instructional purposes.

(iii) Methodological requirements. Applicants should detail the proposed procedures for developing the assessment instrument (e.g., procedures for determining which language difficulties are being tapped by the instrument (i.e., construct validity), for selecting target sentences or problems to be used in the assessment, for assessing the difficulty of selected text or problems, or for obtaining representative responses to questions). Applicants must clearly describe the research plans for assessing the validity and reliability of the instrument. Applicants should describe the characteristics and size of the samples to be used in each study, procedures for collecting data, measures to be used, and data analytic strategies.
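
As one illustration of the kind of reliability evidence an applicant might plan to report, the sketch below computes Cronbach's alpha (an index of internal consistency) from simulated item-level scores. The data, the specific statistic, and the helper function are assumptions made for the example; applicants would propose whatever reliability and validity analyses fit their instrument.

```python
# Illustrative sketch: Cronbach's alpha computed from item-level scores on a
# pilot form of an assessment. The simulated data are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

# Simulate 200 examinees responding to 20 items that share a common factor.
ability = rng.normal(0, 1, size=(200, 1))
items = (ability + rng.normal(0, 1, size=(200, 20)) > 0).astype(float)

def cronbach_alpha(scores):
    """scores: examinees x items matrix of item scores."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
```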

b. Personnel and resources. Competitive applicants will have research teams that collectively demonstrate expertise in (a) the content area (e.g., language), (b) assessment, (c) implementation of, and analysis of results from, the research design that will be employed, and (d) working with teachers, schools, or other education delivery settings in which the proposed assessment might be used. Competitive applicants will have access to institutional resources that adequately support research activities and access to schools in which to conduct the research.

c. Awards. Typical awards under Goal Five will be $150,000 to $400,000 (direct plus indirect costs) per year for up to 4 years. Larger budgets will be considered if a compelling case can be made for such support. The size of the award depends on the scope of the project.

    5. APPLICATIONS AVAILABLE

Application forms and instructions for the electronic submission of applications will be available for the programs of research listed in this RFA from the following web site:

    https://ies.constellagroup.com

    by the following date:

    Language/Vocabulary July 7, 2005

    6. MECHANISM OF SUPPORT

The Institute intends to award grants for periods up to 5 years pursuant to this request for applications. Please see specific details for each goal in the Requirements of the Proposed Research section of the announcement.

    7. FUNDING AVAILABLE

The size of the award depends on the scope of the project. Please see specific details in the Requirements of the Proposed Research section of the announcement. Although the plans of the Institute include this program of research, awards pursuant to this request for applications are contingent upon the availability of funds and the receipt of a sufficient number of meritorious applications. The number of projects funded under a specific goal depends upon the number of high quality applications submitted to that goal. The Institute does not have plans to award a specific number of grants under each particular goal.

    8. ELIGIBLE APPLICANTS

Applicants that have the ability and capacity to conduct scientifically valid research are eligible to apply. Eligible applicants include, but are not limited to, non-profit and for-profit organizations and public and private agencies and institutions, such as colleges and universities.


    9. SPECIAL REQUIREMENTS

    Research supported through this program must be relevant to U.S. schools.

Recipients of awards are expected to publish or otherwise make publicly available the results of the work supported through this program. Beginning June 1, 2005, the Institute asks IES-funded investigators to submit voluntarily to the Educational Resources Information Center (ERIC) an electronic version of the author's final manuscript, upon acceptance for publication in a peer-reviewed journal, resulting from research supported in whole or in part with direct costs from the Institute. The author's final manuscript is defined as the final version accepted for journal publication, and it includes all modifications from the peer review process. Details of the Institute's policy are posted on the Institute's website at http://www.ed.gov/ies.

Applicants should budget for one meeting each year in Washington, DC, with other grantees and Institute staff. At least one project representative should attend the two-day meeting.

The Institute anticipates that the majority of the research will be conducted in field settings. Hence, the applicant is reminded to apply its negotiated off-campus indirect cost rate, as directed by the terms of the applicant's negotiated agreement.

Research applicants may collaborate with, or be, for-profit entities that develop, distribute, or otherwise market products or services that can be used as interventions or components of interventions in the proposed research activities. Involvement of the developer or distributor must not jeopardize the objectivity of the evaluation. Applications from or collaborations including such organizations should justify the need for Federal assistance to undertake the evaluation of programs that are marketed to consumers and should consider sharing the cost of the evaluation, as well as sharing all or a substantial portion of the cost of implementing the product being evaluated (e.g., sharing the cost of textbooks for students).

    10. LETTER OF INTENT

A letter indicating a potential applicant's intent to submit an application is optional, but encouraged, for each application. The letter of intent must be submitted electronically by the date listed at the beginning of this document, using the instructions provided at the following web site:

    https://ies.constellagroup.com

The letter of intent should include a descriptive title; the goal that the application will address; a brief description of the research project (about 3,500 characters including spaces, which is approximately one page, single-spaced); the name, institutional affiliation, address, telephone number, and e-mail address of the principal investigator(s); and the name and institutional affiliation of any key collaborators. The letter of intent should indicate the duration of the proposed project and provide an estimated budget request by year and a total budget request. Although the letter of intent is optional, is not binding, and does not enter into the review of subsequent applications, the information that it contains allows Institute staff to estimate the potential workload and plan the review.


    11. SUBMITTING AN APPLICATION

Applications must be submitted electronically by 8:00 p.m. Eastern time on the application receipt date, using the ED standard forms and the instructions provided at the following web site: https://ies.constellagroup.com

Application forms and instructions for the electronic submission of applications will be available by the following date:

    Language/Vocabulary July 7, 2005

Potential applicants should check this site for information about the electronic submission procedures that must be followed and the software that will be required.

    The application form approved for this program is OMB Number 1890-0009.

12. CONTENTS AND PAGE LIMITS OF APPLICATION

All applications and proposals for Institute funding must be self-contained within specified page limitations. Internet Web site addresses (URLs) may not be used to provide information necessary to the review because reviewers are under no obligation to view the Internet sites.

Sections described below, and summarized in Table 2, represent the body of a proposal submitted to the Institute and should be organized in the order listed below. Sections a (ED 424) through i (Appendix A) are required parts of the proposal. Section j (Appendix B) is optional. All sections must be submitted electronically.

Observe the page number limitations given in Table 2.

Table 2

Section | Page Limit | Additional Information
a. Application for Federal Education Assistance (ED 424) | n/a |
b. Budget Information Non-Construction Programs (ED 524), Sections A and B | n/a |
c. Budget Information Non-Construction Programs (ED 524), Section C | n/a |
d. Project Abstract | 1 |
e. Research Narrative | 20 | Figures, charts, tables, and diagrams may be included in Appendix A
f. Reference List | no limit | Complete citations, including titles and all authors
g. Curriculum Vita of Key Personnel | 4 per CV | No more than 4 pages for each key person
h. Budget Justification | no limit |
i. Appendix A | 15 |
j. Appendix B | 10 | See restrictions

    A. Application for Federal Education Assistance (ED 424)

    The form and instructions are available on the website.

B. Budget Information Non-Construction Programs (ED 524), Sections A and B

The application should include detailed budget information for each year of support requested and a cumulative budget for the full term of requested Institute support. Applicants should provide budget information for each project year using the ED 524 form (a link to the form is provided on the application website at https://ies.constellagroup.com). The ED 524 form has three sections: A, B, and C. Instructions for Sections A and B are included on the form.

C. Budget Information Non-Construction Programs (ED 524), Section C

Instructions for ED 524 Section C are as follows. Section C is a document constructed or generated by the applicant and is typically an Excel or Word table. Section C should provide a detailed itemized budget breakdown, for each project year, for each budget category listed in Sections A and B. For each person listed in the personnel category, include a listing of percent effort for each project year, as well as the cost. Section C should also include a breakdown of the fees to consultants, a listing of each piece of equipment, itemization of supplies into separate categories, and itemization of travel requests (e.g., travel for data collection, conference travel, etc.) into separate categories. Any other expenses should be itemized by category and unit cost.

    D. Project Abstract

The abstract is limited to one page, single-spaced (about 3,500 characters including spaces), and should include: (1) the title of the project; (2) the RFA goal under which the applicant is applying (e.g., development, efficacy); and brief descriptions of (3) the purpose (e.g., to develop and obtain preliminary evidence of potential efficacy of a reading comprehension intervention for struggling high school readers); (4) the setting in which the research will be conducted (e.g., 4 high schools from a rural school district in Alabama); (5) the population(s) from which the participants of the study(ies) will be sampled (age groups, race/ethnicity, SES); (6) if applicable, the intervention or assessment to be developed, evaluated, or validated; (7) if applicable, the control or comparison condition (e.g., what will participants in the control condition experience); (8) the primary research method (e.g., experimental, quasi-experimental, single-subject, correlational, observational, descriptive); (9) measures of key outcomes; and (10) the data analytic strategy.

    E. Research Narrative

Incorporating the requirements outlined under the section on Requirements of the Proposed Research, the research narrative provides the majority of the information on which reviewers will evaluate the proposal. The research narrative must include the four sections described below (a. "Significance" through d. "Resources") in the order listed and must conform to the format requirements described in section e.

a. Significance (suggested: 2-3 pages). Describe the contribution the study will make to providing a solution to an education problem identified in the Background Section of this RFA.


Provide a compelling rationale addressing, where applicable, the theoretical foundation, relevant prior empirical evidence, and the practical importance of the proposed project. For projects in which an intervention is proposed (whether to be developed or to be evaluated), include a description of the intervention along with the theoretical rationale and empirical evidence supporting the intervention. For projects in which an assessment is proposed (whether to be developed or evaluated), include a description of the assessment and a compelling rationale justifying the development or evaluation of the assessment. (Applicants proposing an intervention or assessment may use Appendix B to include up to 10 pages of examples of curriculum material, computer screens, and/or test items.)

b. Research Narrative (suggested: 13-16 pages).

(i) Include clear, concise hypotheses or research questions;

(ii) Present a clear description of, and a rationale for, the sample or study participants, including justification for exclusion and inclusion criteria and, where groups or conditions are involved, strategies for assigning participants to groups;

    (iii) Provide clear descriptions of, and rationales for, data collection procedures;

(iv) Provide clear descriptions of and justification for measures to be used, including information on the reliability and validity of measures; and

(v) Present a detailed data analysis plan that justifies and explains the selected analysis strategy, shows clearly how the measures and analyses relate to the hypotheses or research questions, and indicates how the results will be interpreted. Quantitative studies should, where sufficient information is available, include an appropriate power analysis to provide some assurance that the sample is of sufficient size (see the illustrative sketch following this list).
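
For illustration only, the sketch below shows a minimal power analysis for a two-group comparison, with a simple design-effect adjustment for clustered samples. The effect size, intraclass correlation, and cluster size are hypothetical placeholders, and the use of Python's statsmodels library is an assumption of the example rather than a requirement of this RFA.

```python
# Minimal power-analysis sketch for a two-group comparison, with a simple
# design-effect adjustment for clustered samples. The effect size, intraclass
# correlation (ICC), and cluster size below are hypothetical placeholders.
from statsmodels.stats.power import TTestIndPower

effect_size = 0.30   # assumed standardized mean difference
alpha = 0.05
power = 0.80

# Sample size per group if students were independent observations.
n_per_group = TTestIndPower().solve_power(effect_size=effect_size,
                                          alpha=alpha, power=power)

# Inflate for clustering: design effect = 1 + (m - 1) * ICC,
# where m is the average number of students per classroom or school.
icc, cluster_size = 0.15, 25
design_effect = 1 + (cluster_size - 1) * icc
n_clustered = n_per_group * design_effect

print(f"Unadjusted n per group: {n_per_group:.0f}")
print(f"Adjusted for clustering: {n_clustered:.0f} "
      f"(design effect = {design_effect:.2f})")
```

Studies that randomize intact classrooms or schools would typically go further and use multilevel power-analysis methods, but the design-effect adjustment conveys the basic point that clustering reduces the effective sample size.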

c. Personnel (suggested: 1-2 pages). Include brief descriptions of the qualifications of key personnel (information on personnel should also be provided in their curriculum vitae). For each of the key personnel, please describe the roles, responsibilities, and percent of time devoted to the project.

d. Resources (suggested: 1-2 pages). Provide a description of the resources available to support the project at the applicant's institution and in the field settings in which the research will be conducted.

e. Format requirements. The research narrative is limited to the equivalent of 20 pages, where a page is 8.5 in. x 11 in., on one side only, with 1 inch margins at the top, bottom, and both sides. Single-space all text in the research narrative. To ensure that the text is easy for reviewers to read and that all applicants have the same amount of available space in which to describe their projects, applicants must adhere to the type size and format specifications for the entire research narrative, including footnotes. See the frequently asked questions available at https://ies.constellagroup.com on or before June 6, 2005.


    Conform to the following four requirements:

    (i) The height of the letters must not be smaller than 12 point;

(ii) Type density, including characters and spaces, must be no more than 15 characters per inch (cpi). For proportional spacing, the average for any representative section of text must not exceed 15 cpi;

    (iii) No more than 6 lines of type within a vertical inch;

    (iv) Margins, in all directions, must be at least 1 inch.

Applicants should check the type size using a standard device for measuring type size, rather than relying on the font selected for a particular word processing/printer combination. Figures, charts, tables, and figure legends may be smaller in size but must be readily legible. The type size and format used must conform to all four requirements. Small type makes it difficult for reviewers to read the application; consequently, the use of small type will be grounds for the Institute to return the application without peer review. Adherence to the type size and line spacing requirements is also necessary so that no applicant gains an unfair advantage by using smaller type or providing more text in the application. Note that these requirements apply to the PDF file as submitted. As a practical matter, applicants who use 12 point Times New Roman without compressing, kerning, condensing, or other alterations typically meet these requirements.

Use only black and white in graphs, diagrams, tables, and charts. The application must contain only material that reproduces well when photocopied in black and white.

The 20-page limit does not include the ED 424 form, the one-page abstract, the ED 524 form and narrative budget justification, the curriculum vitae, or the reference list. Reviewers are able to conduct the highest quality review when applications are concise and easy to read, with pages numbered consecutively.

    F. Reference List

Please include complete citations, including titles and all authors, for literature cited in the research narrative.

    G. Brief Curriculum Vita of Key Personnel

Abbreviated curriculum vitae should be provided for the principal investigator(s) and other key personnel. Each vita is limited to 4 pages and should include information sufficient to demonstrate that personnel possess training and expertise commensurate with their duties (e.g., publications, grants, relevant research experience) and have adequate time devoted to the project to carry out their duties (e.g., list current and pending grants with the proportion of the individual's time allocated to each project). Each curriculum vita must adhere to the margin, format, and font size requirements described in the research narrative section.

    H. Budget Justification


The budget justification should provide sufficient detail to allow reviewers to judge whether reasonable costs have been attributed to the project. It should include the time commitments of key personnel and brief descriptions of their responsibilities. The budget justification should correspond to the itemized breakdown of project costs provided in Section C. For consultants, the narrative should include the number of days of anticipated consultation, the expected rate of compensation, travel, per diem, and other related costs. A justification for equipment purchases, supplies, travel, and other related project costs should also be provided in the budget narrative for each project year outlined in Section C. For applications that include subawards for work conducted at collaborating institutions, applicants should submit an itemized budget spreadsheet for each subaward for each project year, and the details of the subaward costs should be included in the budget narrative. Applicants should use their institution's federal indirect cost rate and apply the off-campus indirect cost rate where appropriate (see the instructions under Section 9, Special Requirements). If less than 75 percent of total indirect costs are based on application of the off-campus rate, the applicant should provide a detailed justification.

    I. Appendix A

The purpose of Appendix A is to allow the applicant to include any figures, charts, or tables that supplement the research text, examples of measures to be used in the project, and letters of agreement from partners (e.g., schools) and consultants. In addition, in the case of a resubmission, the applicant may use up to 3 pages of the appendix to describe the ways in which the revised proposal is responsive to prior reviewer feedback. These are the only materials that may be included in Appendix A; all other materials will be removed prior to review of the application. Narrative text related to any aspect of the project (e.g., descriptions of the proposed sample, the design of the study, or previous research conducted by the applicant) should be included in the 20-page research narrative. Letters of agreement should include enough information to make it clear that the author of the letter understands the nature of the commitment of time, space, and resources to the research project that will be required if the application is funded. The appendix is limited to 15 pages.

    J. Appendix B (optional)

The purpose of Appendix B is to allow applicants who are proposing an intervention or assessment to include examples of curriculum material, computer screens, test items, or other materials used in the intervention or assessment. These are the only materials that may be included in Appendix B; all other materials will be removed prior to review of the application. Appendix B is limited to 10 pages. Narrative text related to the intervention (e.g., descriptions of research that supports the use of the intervention/assessment, the theoretical rationale for the intervention/assessment, or details regarding the implementation or use of the intervention/assessment) should be included in the 20-page research narrative.

K. Additional Forms

Please note that applicants selected for funding will be required to submit the following certifications and assurances before a grant is issued:

(1) SF 424B - Assurances - Non-Construction Programs
(2) ED-80-0013 - Certification Regarding Lobbying, Debarment, Suspension and other Responsibility Matters; and Drug-Free Workplace Requirements
