
OLSD Project 2020 Report


Project 2020 Findings Report

Presented to the Olentangy Board of Education

February 25, 2013


Introductions and Acknowledgments: A special thank you is extended to the four original members of the Project 2020 committee who have provided continual support and guidance as we've moved through the process of research, exploration and findings: Martin Johnson, Project 2020 Chairman, Harvey Alston, Mark Gutentag and John Scherer, all active in the community and members of the Project 2020 steering committee. The donation of their time, wisdom and energy allowed the committee to investigate the countless options and opportunities presented throughout this report.

Executive Summary: In spring 2011, the Project 2020 committee was launched and charged with researching alternatives to address evolving student educational needs and facility demands as the district looks to the future. The committee researched avenues other than traditional construction for the district to consider in an effort to accommodate skyrocketing enrollment and ever-changing student needs. While enrollment continues to climb, state funding per pupil for the district has decreased at a steady rate. In addition, the district wants to reduce the frequency of placing issues on the ballot, if at all possible. Meanwhile, educational options for today's and future students are constantly evolving beyond the traditional classroom and teaching methods as advancements in technology change the face of education. The committee's research was framed by reviews of reports, statistics and projections, and by visits to alternative and existing facilities, in an effort to determine viable solutions for the future.
Presentations from the DeJong-Richter consulting firm, administrators, students, teachers, parents and other key stakeholders gave the Project 2020 committee multiple angles to consider while developing strategic and tactical ideas targeted toward a more sustainable and cost-efficient environment for the future educational needs of Olentangy schools. Project 2020 culminated with a two-day summit in January 2013, facilitated by DeJong-Richter, that brought parents, community members, business owners, teachers and administrators together to provide feedback and suggestions related to student and district needs in the near future and beyond. Throughout the process, the Project 2020 committee used the following key considerations and constraints:

• The continued burden on the local taxpayer
• The rapid rate of growth in the district
• The ever-changing face of education and cutting-edge teaching methods
• The continued reduction in state funding to Olentangy Local Schools

Over the past two years, the Project 2020 committee noted that several key factors surfaced:

• Opportunities exist for middle and high school building capacity to be flexible and increased when considering student scheduling and the use of teacher work stations


• The Development Committee's enrollment projections and reports point to a clear need for a change in how Olentangy meets student needs over the upcoming ten-year period

• Using current facilities differently and/or acquiring existing facilities within the district may provide relief without building a fourth, traditional high school

• Advancements in technology will provide students opportunities to take advantage of cutting-edge teaching methods, including flipped classrooms, online coursework and blended-learning opportunities

In conclusion, as enrollment continues to increase steadily, the question remains: can the needs of a growing number of students be met with the existing facilities?

Project 2020 committee formation: The concept of "Project 2020" began in the summer of 2010 during a discussion about Olentangy's future needs between the Superintendent and the Board president and vice president. From that initial conversation, the committee was formed as a Superintendent's Committee, and work began in spring 2011. The committee's work was completed over the past two years through monthly meetings, site visits to alternative buildings, and research into multiple concepts and options. Martin Johnson, community member and parent, led the committee throughout the project, and the committee presented periodic updates to the Board of Education and the Superintendent throughout the course of its work.

Project 2020 charter:

Through comprehensive stakeholder engagement, pervasive communication and evaluation of all available operational and technological avenues, to develop a long-term strategic and tactical plan for a more sustainable and cost-efficient environment for managing the growth of Olentangy Local Schools.

Project 2020 objectives:

1. Evaluate all operational and technology-based options to minimize or eliminate the need for additional "brick and mortar" facilities

2. Establish a means by which the needs and interests of all stakeholder groups (community, administration, educators, students, etc.) are identified, understood and incorporated into the revised operational models.

3. Drive continuous improvements and operational efficiencies while maintaining the highest expectations for academic achievement


4. Develop improved methods for ongoing communication and collaboration with critical stakeholder groups.

Project 2020 communications: Throughout the course of Project 2020, the committee has communicated with the community, teachers and staff in the following ways:

• Project 2020 web page link on the district webpage
• Articles in the district newsletter (three times)
• Project 2020 email inbox for the committee to receive questions, suggestions and information
• Videos of Martin Johnson, chair, and Harvey Alston, committee member, sharing information about Project 2020 and concluding with an online survey link
• Online survey for community members and staff members

Project 2020 committee members: The following community members, parents, teachers and administrators are members of the Project 2020 committee:

Martin Johnson, Chairperson, community member
Harvey Alston, community member
Sue Andrews, Liberty High School, work and family instructor
Elaine Eddy, Orange High School, physical education instructor, OTA President
Greta Gnagy, Indian Springs Elementary School, instructor
Mark Gutentag, community member
David Hinds, Olentangy High School, theater/drama instructor
Dave King, Olentangy School Board member
Cindy Long, Liberty Middle School, technology instructor
Erica O'Keeffe, Liberty High School, foreign language instructor
Keith Pomeroy, Olentangy Local Schools, Technology Director
Linda Martin, Olentangy Local Schools, Assistant Superintendent
John Scherer, community member
Randy Wright, Liberty High School, Principal

Project 2020 key milestones:

• February 1, 2011 – Charter approved/organization set
• April 1, 2011 – Support group assignments defined
• December 15, 2011 – Initial data collection
• February 1, 2012 – Analysis of data
• April 1, 2012 – Stakeholder reviews
• January 16-17, 2013 – Project 2020 summit
• February 2013 – Present committee findings to the Board of Education


Project 2020 Background (Supportive Evidence):

State funding history: Over the past ten years, Olentangy Local Schools' state funding has remained relatively flat. The last substantial increase the district received in state funding came in 2005, when the district had approximately 9,000 students and 12 buildings. Since then, the district has added roughly 8,000 students and 11 more schools, and enrollment is projected to keep growing for at least the next ten years. Currently, Olentangy Schools receives $363 per pupil, while the average Ohio school district receives nearly nine times that amount: $3,200. During this same period, Olentangy taxpayers approved three bond issues and two operating levies in less than six years to cover the gap between funds received from the state and actual need. Meanwhile, Olentangy's latest per-pupil expenditure is $9,465, one of the lowest in central Ohio, while the district's state Performance Index score of 107.3 is one of the highest in the region. Please see the attached state funding spreadsheet, outlining funds received from the state over the past ten years.

Community and student demographics: The Olentangy Local School District comprises 95 square miles. Most of the district lies in Delaware County, with a small southern portion in Franklin County. The district serves all or part of numerous municipalities, including Berkshire Township, Berlin Township, Concord Township, Delaware Township, Genoa Township, Liberty Township, Orange Township, the City of Columbus, the City of Delaware, the City of Powell, and the City of Westerville. While the number continues to grow, the district currently includes more than 32,000 residential households. The current average property valuation in the district is just over $200,000, and the median income is $71,487. Approximately 95% of district residents have a high school diploma, and 50% have earned a bachelor's degree or higher. Total enrollment was 17,540 across all of the district's 24 schools at the beginning of February 2013. Olentangy high schools have a graduation rate of 99%. More than 90% of Olentangy students from all three high schools go on to post-secondary education, and a vast majority of those students enroll in a four-year university upon receiving their diploma. In addition, all three Olentangy high schools are ranked in the top 20 in the state, according to US News and World Report. Please see the attached Cupp Report for additional information.
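As a rough sanity check on the per-pupil funding gap described above, the figures can be compared directly. The sketch below is illustrative only and is not part of the original report; the values are taken from line 56 of the attached Cupp Report:

```python
# Per-pupil formula funding, FY11 (from the attached Cupp Report, line 56)
olentangy = 363.45   # Olentangy Local SD, Delaware
statewide = 3205.90  # statewide average of Local, E.V., & City districts

ratio = statewide / olentangy
gap = statewide - olentangy

print(f"Statewide average is {ratio:.1f}x Olentangy's formula funding")  # ~8.8x
print(f"Per-pupil gap: ${gap:,.2f}")  # $2,842.45
```

The precise ratio works out to about 8.8, consistent with the report's rounded "nine times" figure.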


Enrollment history and projections: "Explosive growth" has defined Olentangy Local Schools for the past two decades. Over the past twenty years, Olentangy's 95 square miles have experienced significant development, adding students to the district at a rapid rate. Olentangy has been the fastest-growing district in Ohio for the past ten years, adding approximately 11,000 students during that time. Ten years ago, Olentangy was the 44th-largest district in the state; today it is the 7th-largest in Ohio. Looking ahead, Olentangy's Development Committee projects enrollment will grow from the 17,540 students enrolled in the district's 23 buildings in the 2012-13 academic year to approximately 21,052 students in 2022-23. Please see the attached Development Committee enrollment projection reports and DeJong-Richter reports.

Performance and student achievement: Olentangy students excel in all areas and measures of academic competency. The district received the "Excellent with Distinction" rating, the highest a district can receive from the state, in 2012, and has been rated "Excellent" or better for nine straight years. The district met all 26 indicators on the state report card. In addition, Olentangy has a state Performance Index of 107.3, which places it in the top 6% of districts in the state. Please see the attached copy of the Ohio Department of Education local report card for Olentangy Local Schools.

District construction history and future projects: Through the early 1900s, one-room schoolhouses scattered across the landscape made up the local education system. Beginning in 1911, these schools consolidated into four K-12 facilities. Construction began in 1952 on one consolidated school building for the district, Olentangy High School, where all students attended grades 9-12; the Shanahan building (814 Shanahan Road) is still in use today. In 1990, district construction began in earnest with the building of a new Olentangy High School, leaving the Shanahan building to house grades K-8 for the entire district. In 1993, Wyandot Run Elementary School opened its doors, beginning 20 years of rapid growth and construction. Over the past two decades, Olentangy has built or added to existing facilities 25 times to accommodate the district's growing needs. Looking ahead, the district has no firm plans in place for new construction, nor has it sold the bonds approved by voters for a proposed elementary school #16. As Olentangy's buildings age, the district will need to shift from primarily new construction to maintenance and upkeep of current facilities, requiring a reallocation of current resources. However, student enrollment continues to rise; therefore, the Development Committee presented several proposals in fall 2011 to address the need. Please see the attached Development Committee documents: Enrollment Projections, School Openings, and Project 2020 Development Committee Report (10.12.11).
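The enrollment projection above implies a modest but steadily compounding growth rate. A quick illustrative calculation (not part of the original report) makes the scale of the ten-year projection concrete:

```python
# Development Committee projection: 17,540 students in 2012-13
# growing to ~21,052 in 2022-23.
current = 17_540
projected = 21_052
years = 10

# Compound annual growth rate implied by the ten-year projection
cagr = (projected / current) ** (1 / years) - 1
added = projected - current

print(f"Implied annual growth: {cagr:.2%}")        # ~1.84% per year
print(f"Students added over ten years: {added:,}")  # 3,512
```

At roughly 1.8% per year, the district would absorb the equivalent of several additional school buildings' worth of students over the projection window.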


Project 2020 Overall Analysis:

Analysis process: The Project 2020 committee spent multiple meetings reviewing and analyzing the background information and data collected. The committee consulted numerous groups supporting the school district to further understand the climate and the factors shaping the district's future. Throughout, the committee kept the key considerations and constraints in the forefront as information was analyzed.

Support group identification:

• Development Committee
• Innovation and Creativity Committee
• Cost Savings and Efficiency Committee
• Olentangy Local Schools building administrators
• Olentangy Local Schools curriculum department
• Olentangy Local Schools OASIS program administration
• Olentangy Local Schools technology department
• Olentangy Local Schools communications department
• Olentangy Local Schools operations department
• Olentangy Local Schools business and facilities personnel
• Olentangy Local Schools guidance counselors
• Delaware County officials
• City of Powell officials
• Delaware County Auditor
• Liberty Township officials
• Berlin Township officials
• Delaware County Planner
• Delaware County Economic Development Director
• Delaware City Engineer
• Delaware County Commissioner
• Delaware Sewer Planner
• Delaware County Administrator
• Columbus State President
• Delaware Career Center
• Gahanna High School
• Local Realtors
• Olentangy High School DECA students
• PTO groups at all buildings
• DeJong-Richter consulting firm

Project 2020 Data/Findings:


Financial data and information: Olentangy Local Schools has continued to operate on less, with more students entering its facilities each year. Despite minimal dollars received via state funding, in early 2012 the Board of Education committed not to return to local taxpayers for additional bond or levy issues for four years, recognizing that local taxpayers have frequently shouldered the tax burden through recent levies and bond issues. House Bill 1, passed in June 2009, returned Olentangy to a guarantee district, significantly decreasing anticipated state aid without acknowledging the district's tremendous growth. House Bill 153 accelerates the phase-out of Tangible Personal Property (TPP) reimbursements, which adds up to a loss of over $7 million to the district in FY12 and FY13. Property valuation within the district reflects the current economic downturn, producing a 6% decrease in dollars in FY11; this trend is expected to continue over the next few years. Meanwhile, continued enrollment growth will add operating costs, as will rising fuel and energy expenses. Please see the attached October 2012 Five Year Forecast and Risk Assumptions for more detailed information.

Construction and development options: As noted previously, new construction within the district over the past 20 years has been steady and constant. Looking ahead, the district may consider several options, from continued new construction to other alternatives. The Development Committee presented a report to the Board of Education in fall 2011 outlining five options for consideration:

• Build a new, fourth high school
• Expand existing high school buildings
• Grade reconfiguration
• Build a stand-alone facility
• Acquire and renovate an existing facility within district boundaries

Please see the attached Project 2020 Development Committee report for additional details.

Schedule and program options: Modifying the current high school schedules would allow the district to accommodate additional students. Mark Raiff, Executive Director of Academics, presented a report to the Board of Education and the Project 2020 committee in 2011. The report presented options that included ways to work within the current high school schedule and structure, modifications to the current schedule, and alternative instructional delivery methods, including online learning, blended learning and off-site programming. Any combination of these options would allow Olentangy Schools to increase the capacity of all three high schools while continuing to provide an excellent educational program. Please see the attached "Enhancing Academic Delivery and Improving College and Career Readiness" report for additional details.

Shared services/partner options: The Cost Efficiency committee has investigated the possibility of shared services with the local townships in the areas of maintenance and technology. To date, the district and local townships have been unable to find common ground in those two areas. Looking ahead, the Cost Efficiency committee is beginning to explore potential options to share bus garage space with several neighboring townships.

Alternative learning options: In fall 2011, the departments of curriculum, pupil services and technology analyzed opportunities and reported to the Project 2020 committee on current pilot projects within the district related to alternative learning options. The departments reported on opportunities for increased service learning, allowing students real-world experience for credit. In addition, the departments analyzed opportunities for blended approaches with other educational institutions and for alternative configurations of the school day and year. Finally, the departments reported on the progress of ongoing pilot programs for online courses and blended learning, as well as the possibility of building Olentangy's own online courses versus purchasing them. In January 2013, an Alternative Learning Environment Committee (ALEC) was established to review existing and proposed alternative learning options for Olentangy students. The committee is comprised of district administrators and teachers; its goal is to make recommendations about potential alternative learning options to district administration and the teachers association.
Types of instruction currently being examined are: dual enrollment, online courses, summer school, OASIS, credit flex, ventures, evening classes and distance learning. Please see the attached report, 2020 Data Collection, for additional information.

Dual Enrollment: Beginning in fall 2013, Olentangy Local School District will partner with Columbus State Community College to offer a dual enrollment option for Olentangy students. Students who take advantage of the program will have the opportunity to enroll in classes, taught by Olentangy educators, at Columbus State's Delaware County facility and earn college and high school credit concurrently. During the 2013-14 academic year, a total of seven sections of five different classes will be offered through the dual enrollment option; the maximum number of students that could be served in the first year is 136. In the future, course offerings could expand to include a wider variety of options at the Columbus State Delaware County site. In addition, holding dual enrollment courses in another outside facility would substantially increase the program's student capacity. Please see the attached Dual Enrollment report for detailed information.

One-to-One Computing: District administrators have explored the concept of one-to-one computing with the Licking Valley school district, which recently adopted a one-to-one computing program for the 2013-14 school year. Every student in Licking Valley High School was given a new laptop to be used for instruction, which each student will keep through the remainder of their high school years. The district invested in each student, in addition to the costs of upgrading its wireless infrastructure and bandwidth. The additional cost was offset by adjustments in personnel and by reducing the number of desktop computers purchased annually, and was further defrayed by a grant from Intel and Educational Collaborators. Please see the attached article from the Advocate newspaper.

Local Real Estate: Over the course of two years, Project 2020 committee members took several trips to tour Clark Hall, Gahanna-Lincoln Schools' premier facility for juniors and seniors. Clark Hall provides an environment emphasizing collaboration and independent thinking, mimicking a higher-education setting. The committee also looked at at least six properties within the Olentangy school district that could be purchased and used as alternative facilities for any of the district's programs. Jeff Meyer, a commercial Realtor, showed committee members the facilities available within the district boundaries.
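The first-year dual-enrollment capacity figures quoted above imply an average section size of roughly 19-20 students. That per-section cap is an inference from the stated totals, not a figure given in the report; the sketch below simply works it out and shows how capacity would scale with additional sections:

```python
# First-year dual enrollment capacity (figures stated in the report)
sections = 7
max_students = 136

# Implied average cap per section -- inferred, not stated in the report
per_section = max_students / sections
print(f"~{per_section:.1f} students per section")  # ~19.4

# Each added section at that size raises capacity proportionally
for extra in (1, 3, 5):
    print(f"+{extra} sections -> ~{round((sections + extra) * per_section)} students")
```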
Community and student feedback: In fall 2011, Olentangy High School DECA students completed a creative marketing research project for a class assignment, selecting Project 2020 as their focus. The students surveyed students and teachers at all three high schools, as well as community members, collecting feedback on multiple innovative options for the district to consider. Topics students weighed in on included a desire to avoid redistricting at the high school level, an off-campus Advanced Placement (AP) site/building for students, and block scheduling/blended learning. Students and staff also commented on cost efficiencies and new, creative teaching methods while still maintaining the traditional feel of the current high school experience. During the spring of 2012, Paul Fallon conducted an in-depth survey of randomly selected community residents and held a focus group of district residents. During the phone survey and the focus group, district residents were questioned on their attitudes and assumptions about various alternative teaching strategies, including online education. In addition, residents discussed and gave feedback on the district building a fourth high school, given the current factors. The results were compiled in a comprehensive report for the Board of Education. Please see the attached DECA report and Fallon survey results for additional information.

Project 2020 Summit: DeJong-Richter, a consulting firm, facilitated a full two-day summit for the district in mid-January 2013. The Project 2020 summit included a diverse group of parents, community and business leaders, teachers and administrators from the district. The intent of the summit was to review information developed and gathered over the past two years by the Project 2020 committee and DeJong-Richter, identify potential challenge areas regarding projected facility utilization, and explore possible solutions to those issues. Small- and large-group discussion provided participants the opportunity to share thoughts and solutions for the future of Olentangy Local Schools. Please see the attached report, Project 2020 Summit: Summary of Findings.

Attachments:

• State Funding Spreadsheet
• Cupp Report
• The Ohio Department of Education's local report card for Olentangy Local Schools
• Development Committee documents: Enrollment projections, school openings, and Project 2020 Development Committee report
• October 2012 Five Year Forecast and Risk Assumptions
• Enhancing Academic Delivery and Improving College and Career Readiness report, 2012 (written by Mark Raiff)
• 2020 Data Collection (Achievement and Accountability, Pupil Services, Technology), December 2011 (written by Keith Pomeroy)
• Dual enrollment report (written by Jack Fette)
• The Newark Advocate newspaper article
• Olentangy High School DECA student presentation (http://prezi.com/7bgqjzf4gtyv/copy-of-creative-marketing/)
• Fallon 2012 survey results
• Project 2020 Summit report (provided by DeJong-Richter)


Actual Unrestricted Grants-in-Aid (state per-pupil funding) per five-year forecasts:

FY12  $7,330,465
FY11  $7,009,414
FY10  $7,458,556
FY09  $7,853,156
FY08  $5,935,817
FY07  $6,592,875
FY06  $6,141,534
FY05  $6,717,805
FY04  $4,695,360
FY03  $3,544,294
FY02  $2,620,907


District Profile (Cupp) Report comparison. Each numbered row below lists three values, in order: Olentangy Local SD, Delaware | Similar District Average | Statewide average of Local, E.V., & City Districts.

A - Demographic Data:

1 School District Area Square Mileage (FY12) 95.00 33.33 67.71

2 District Pupil Density (FY12) 168.10 243.78 42.83

3 Total Average Daily Membership (FY11) 15,969.80 8,125.96 2,900.13

4 Total Year-End Enrollment (FY11) 16,263.00 7,863.67 2,688.39

5 Asian Students As % Of Total (FY11) 7.23% 6.93% 1.72%

6 Pacific Islander Students As % Of Total (FY11) 0.01% 0.03% 0.03%

7 Black Students As % Of Total (FY11) 3.95% 5.37% 16.21%

8 American Indian/Alaskan Native Students As % Of Total (FY11) 0.10% 0.11% 0.14%

9 Hispanic Students As % Of Total (FY11) 2.27% 2.88% 3.53%

10 White Students As % Of Total (FY11) 82.75% 80.69% 74.18%

11 Multiracial Students As % Of Total (FY11) 3.69% 3.99% 4.19%

12 % Of Students In Poverty (FY11) 7.42% 13.10% 43.01%

13 % Of Students With Limited English Proficiency (FY11) 1.71% 3.13% 1.99%

14 % Of Students With Disability (FY11) 8.98% 10.02% 13.29%

B - Personnel Data:

15 FTE Number Of Regular Education Classroom Teachers (FY08) 619.78 354.06 122.95

16 Classroom Teachers' Average Salary (FY11) $63,783.07 $66,852.13 $57,904.47

17 % Teachers With 0-4 Years Experience (FY11) 19.80% 23.22% 22.62%

18 % Teachers With 4-10 Years Experience (FY11) 36.65% 20.78% 18.82%

19 % Teachers With 10+ Years Experience (FY11) 43.55% 56.00% 58.56%

20 K-12 Regular Education Pupil Teacher Ratio (FY08) 18.32 18.75 18.47

21 FTE Number Of Administrators (FY11) 72.50 39.15 17.87

22 Administrators' Average Salary (FY11) $80,365.16 $89,053.24 $76,037.71

23 Pupil Administrator Ratio (FY11) 219.70 205.70 159.34

C - Property Valuation And Tax Data:

24 Assessed Property Valuation Per Pupil (TY10 [FY12]) $200,479.34 $188,256.22 $140,481.02

25 Res & Agr Real Property Valuation As % Of Total (TY10 [FY12]) 83.80% 79.10% 74.25%

26 All Other Real Property Valuation As % Of Total (TY10 [FY12]) 13.62% 18.77% 21.68%

27 Public Utility Tangible Value As % Of Total (TY10 [FY12]) 2.41% 2.01% 3.94%

28 General Tangible Value As % Of Total (TY10 [FY12]) 0.17% 0.12% 0.13%

29 Business Valuation As % Of Total (TY10 [FY12]) 16.21% 20.91% 25.82%

30 Per Pupil Revenue Raised By One Mill Property Tax (TY10 [FY12]) $200.48 $188.26 $140.48

31 Total Property Tax Per Pupil (TY10 [FY12]) $7,661.97 $7,477.49 $4,853.09

32 Rollback & Homestead Per Pupil (FY11) $947.16 $862.12 $564.29

33 OSFC 3-Year Adjusted Valuation Per Pupil (FY12) $246,524.01 $221,556.75 $139,468.72

34 District Ranking Of OSFC Valuation Per Pupil (FY12) 576 NA NA

35 Median Income (TY09) $71,487.00 $48,961.00 $30,827.00

36 Average Income (TY09) $105,402.00 $82,626.78 $58,564.00

D - Local Effort Data:

37 Current Operating Millage Including JVS Mills (TY10 [FY12]) 64.80 68.94 48.22

38 Effective Class 1 Millage Including JVS Mills (TY10 [FY12]) 37.61 37.73 28.49

39 Effective Class 2 Millage Including JVS Mills (TY10 [FY12]) 36.93 42.28 32.07

40 School Inside Millage (TY10 [FY12]) 5.00 4.90 4.48

41 School District Income Tax Per Pupil (FY11) $0.00 $1,001.15 $1,009.84

42 Local Tax Effort Index (FY11) 0.5484 0.6499 1.0000

E - Expenditure Per Pupil Data:

43 Administration Expenditure Per Pupil (FY11) $867.55 $1,068.87 $1,229.44

44 Building Operation Expenditure Per Pupil (FY11) $1,720.80 $1,904.84 $2,036.62

45 Instructional Expenditure Per Pupil (FY11) $5,731.40 $6,440.07 $5,941.27

46 Pupil Support Expenditure Per Pupil (FY11) $934.29 $1,195.25 $1,093.35

47 Staff Support Expenditure Per Pupil (FY11) $210.87 $402.86 $396.25

48 Total Expenditure Per Pupil (FY11) $9,464.92 $11,011.90 $10,696.94

F - Revenue By Source Data:

49 State Revenue Per Pupil (FY11) $1,384.14 $2,940.46 $4,509.89

50 State Revenue As % Of Total (FY11) 15.72% 28.16% 42.49%

51 Local Revenue Per Pupil (FY11) $7,140.02 $7,066.06 $5,053.49

52 Local Revenue As % Of Total (FY11) 81.11% 67.67% 47.61%

53 Federal Revenue Per Pupil (FY11) $278.82 $435.19 $1,050.92

54 Federal Revenue As % Of Total (FY11) 3.17% 4.17% 9.90%

55 Total Revenue Per Pupil (FY11) $8,802.98 $10,441.71 $10,614.30

56 Total Formula Funding Per Pupil (FY11) $363.45 $1,414.05 $3,205.90

57 Total Formula Funding As % Of Income Tax Liability (FY11) 4.56% 18.24% 75.49%

G - District Financial Status From Five Year Forecast Data:

58 Salaries As % Of Operating Expenditures (FY11) 62.95% 63.69% 56.79%

59 Fringe Benefits As % Of Operating Expenditures (FY11) 22.58% 22.31% 21.72%

60 Purchased Services As % Of Operating Expenditures (FY11) 7.62% 9.14% 16.48%

61 Supplies & Materials As % Of Operating Expenditures (FY11) 2.80% 2.68% 2.96%

62 Other Expenses As % Of Operating Expenditures (FY11) 4.05% 2.18% 2.05%

Source: District Profile Report for City, Exempted Village and Local School Districts, Office of School Options and Finance, Ohio Department of Education. Olentangy Local SD, Delaware (district IRN 46763).

Ohio Department of Education Local Report Card: Olentangy Local School District, 814 Shanahan Rd Ste 100, Lewis Center, OH 43035-9078, Delaware County (IRN# 046763). Current Superintendent: Wade E. Lucas, (740) 657-4050. Rating: Excellent with Distinction. [The report card's detailed tables of state assessment passage rates by grade and subject did not survive extraction; see the attached report card for the underlying figures.]

Page 16: OLSD Project 2020 Report

Olentangy Local School District, Delaware County

[Achievement test passage rates by grade and subject; table values garbled in the extraction.]

* Cumulative results for students who took the tests as 10th or 11th graders.

Page 17: OLSD Project 2020 Report

Olentangy Local School District, Delaware County


Page 18: OLSD Project 2020 Report

Olentangy Local School District, Delaware County

[Accountability table: indicators shown as "Met" or "NR" and participation/attendance percentages; layout not recoverable from the extraction.]

Page 19: OLSD Project 2020 Report

Olentangy Local School District, Delaware County


Page 20: OLSD Project 2020 Report

Olentangy Local School District, Delaware County

No Schools Identified for Improvement in District.

Page 21: OLSD Project 2020 Report

Olentangy Local School District, Delaware County

Page 22: OLSD Project 2020 Report


Olentangy Local School District, Delaware County

Page 23: OLSD Project 2020 Report

REPORT

Enrollment Projection Update October 17, 2012

Page 24: OLSD Project 2020 Report

REPORT Olentangy Local Schools

1

ACKNOWLEDGEMENTS

DeJONG-HEALY would like to extend our appreciation to the Olentangy Local Schools for choosing

us to assist them in developing these enrollment projections. In addition, thank you to Michelle

Murphy and Ralph Au.

Future growth in the Olentangy Local School District Community will have a significant impact on the

school facilities and student population. The following information will continue to provide the District

with a valuable planning tool to assist in determining the future direction of its school system.

As a consulting team, we appreciate this opportunity to serve your school community as you embark

on your vision for the future of education in the Olentangy Local Schools.

Tracy Healy

President

Page 25: OLSD Project 2020 Report

REPORT Olentangy Local Schools

2

INTRODUCTION

DeJONG-HEALY has provided enrollment projections for the Olentangy Local Schools annually since

1998. For several years, enrollment was projected using a methodology based on new housing starts

due to the explosive growth seen in the District primarily from 1997-2005. Single-family building

permits peaked in 2004 at over 1,700 permits. In 2006, single-family permits dropped to 768.

Currently, just over 350 permits have been issued this year.

2008 Projections

In 2008, DeJONG-HEALY developed enrollment projections based on a methodology to better reflect

the economic conditions of the time. Three basic assumptions were made:

Kindergarten enrollment would continue to be negatively impacted by the reduced

housing starts for the next few years. It would then start to level off.

First grade growth would continue as it has. Traditionally, there is an increase in first

grade enrollment from kindergarten the previous year. This is due, in great part, to

parents choosing a private, all-day kindergarten program then enrolling their child in

the district for first grade.

Grades 2-12 would see a greater growth than previously anticipated through 2009-

10 due to students leaving the private and parochial schools for financial reasons.

After 2009-10, the growth rate would then slow.

2009 Projections

While the enrollment projections from 2008 were highly accurate [within 10 students total for the first

year], there was concern raised regarding the kindergarten enrollment. Actual kindergarten

enrollment for 2009-10 was 71 students higher than projected. Based on this, kindergarten

enrollment was projected to grow at 4.6% in 2010-11, from a base of 1,344 students [kindergarten

registration count]. Following 2010-11, growth was projected to be approximately 0.5% from the

previous year until 2019-20.

Page 26: OLSD Project 2020 Report

REPORT Olentangy Local Schools

3

2010 Projections

While the enrollment projections from 2009 were within 1 percent of the actual enrollment,

there was again concern raised regarding the kindergarten enrollment. Actual kindergarten

enrollment for 2010-11 was 63 students lower than projected.

To better address kindergarten enrollment, births in the area were incorporated into the projections for

years 2011-12 – 2013-14. In 2013-14, a growth factor of 7.2% was applied to account for growth

due to the implementation of all-day kindergarten. Starting in the 2014-15 school year, a 0.5%

growth factor was applied to the previous year’s enrollment. For first grade, the average survival ratio

of the past 2 years was used for kindergarten to 1st grade through 2013-14. Starting in the 2014-15

school year, the growth factor was reduced to 6% to account for the increase in kindergarten

enrollment due to all-day kindergarten implementation.

2011 Projections

Overall, the methodology used in 2010 was successful. Kindergarten enrollment was within 7

students of the projection. The difference between the actual and projected enrollment for Pre-K-12

was 83 students, or 0.49 percent.

For 2011, we used the same methodology with two minor modifications. Due to the change in

legislation regarding all-day Kindergarten, we did not incorporate the additional growth factor for

kindergarten in the 2013-14 school year. We incorporated births in the area for years 2012-13 –

2013-14. For the years 2014-15 – 2021-22, a 0.5% growth factor was applied to the previous

year’s enrollment. For first grade, the average survival ratio of the past 2 years was used.

Additionally, an “end of year” projection for the 2011-12 school year was included.

Page 27: OLSD Project 2020 Report

REPORT Olentangy Local Schools

4

2012 Projections

Overall, the methodologies used in 2010 and 2011 were successful. The difference between the

actual and projected enrollment for Pre-K-12 was 117 students, or 0.66 percent.

This year, we have used the same methodology as 2011 with one modification. The preschool

projection is based on kindergarten enrollment rather than simply adding 34 students each year as we

did in 2011.

As with any enrollment projection, there are a number of factors that can influence future growth—

either positively or negatively. Given the current economic uncertainty, it is impossible to know

how quickly, or to what extent, the housing market will recover.

Page 28: OLSD Project 2020 Report

REPORT Olentangy Local Schools

5

Grade        Actual 2012-13*   Projected   Difference   Percentage
Pre-K                    315         356           41       13.02%
K                      1,241       1,319           78        6.29%
1                      1,503       1,498           -5       -0.33%
2                      1,520       1,503          -17       -1.12%
3                      1,559       1,529          -30       -1.92%
4                      1,510       1,531           21        1.39%
5                      1,450       1,423          -27       -1.86%
K-5 Total              8,783       8,803           20        0.23%
6                      1,440       1,458           18        1.25%
7                      1,428       1,440           12        0.84%
8                      1,283       1,276           -7       -0.55%
6-8 Total              4,151       4,174           23        0.55%
9                      1,272       1,271           -1       -0.08%
10                     1,125       1,125            0        0.00%
11                     1,039       1,065           26        2.50%
12                     1,067       1,075            8        0.75%
9-12 Total             4,503       4,536           33        0.73%
Total                 17,752      17,869          117        0.66%

* as of October 3, 2012

Source: Olentangy Local School District

Olentangy Local School District

ENROLLMENT COMPARISON

As mentioned previously, the difference between the actual and projected enrollment for the 2012-13

school year is less than 1 percent. The grades with the greatest difference were Pre-K and K at 41 and

78, respectively.

The following table illustrates the difference by grade.

Page 29: OLSD Project 2020 Report

REPORT Olentangy Local Schools

6

Grade      2003-04  2004-05  2005-06  2006-07  2007-08  2008-09  2009-10  2010-11  2011-12  2012-13*
Pre-K          107      127      147      172      234      300      276      293      321      315
K              848      860    1,080    1,150    1,203    1,253    1,319    1,343    1,341    1,241
1              825      963    1,011    1,237    1,320    1,318    1,422    1,488    1,477    1,503
2              762      868    1,023    1,087    1,270    1,334    1,349    1,460    1,499    1,520
3              788      803      934    1,069    1,123    1,320    1,356    1,378    1,487    1,559
4              730      852      870    1,015    1,106    1,169    1,362    1,399    1,410    1,510
5              719      790      896      927    1,032    1,168    1,167    1,379    1,408    1,450
K-5 Total    4,672    5,136    5,814    6,485    7,054    7,562    7,975    8,447    8,622    8,783
6              675      775      849      942      948    1,061    1,204    1,217    1,415    1,440
7              628      717      803      887      982      985    1,086    1,203    1,258    1,428
8              557      660      756      838      906      993    1,002    1,091    1,231    1,283
6-8 Total    1,860    2,152    2,408    2,667    2,836    3,039    3,292    3,511    3,904    4,151
9              588      603      689      815      848      948    1,023    1,043    1,116    1,272
10             481      610      629      725      822      863      952    1,042    1,042    1,125
11             468      493      629      640      728      839      871      968    1,069    1,039
12             441      474      515      626      641      720      838      875      974    1,067
9-12 Total   1,978    2,180    2,462    2,806    3,039    3,370    3,684    3,928    4,201    4,503
Total        8,617    9,595   10,831   12,130   13,163   14,271   15,227   16,179   17,048   17,752

* as of October 3, 2012

Historical Enrollment

Source: Olentangy Local School District

Olentangy Local School District

HISTORICAL ENROLLMENT

Over the past ten years, the PreK-12 enrollment in the Olentangy Local Schools has increased by

9,135 students or 106 percent. Total enrollment for the 2012-13 school year is 17,752, an increase

of 704 students, or 4%, from the previous school year.
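As a quick arithmetic check of these growth figures (a Python sketch added for illustration; all totals are as reported in this section):

```python
# Ten-year and one-year PreK-12 enrollment growth, computed from the
# district's reported enrollment totals.
start = 8_617    # total enrollment, 2003-04
end = 17_752     # total enrollment, 2012-13
prior = 17_048   # total enrollment, 2011-12

ten_year_increase = end - start              # students added over ten years
ten_year_pct = ten_year_increase / start     # as a fraction of the 2003-04 total

one_year_increase = end - prior              # students added in one year
one_year_pct = one_year_increase / prior     # as a fraction of the 2011-12 total

print(ten_year_increase, round(ten_year_pct * 100), one_year_increase, round(one_year_pct * 100, 1))
# 9135 106 704 4.1
```

The computed values match the report's figures of 9,135 students (106 percent) over ten years and 704 students (about 4 percent) over the last year.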

The following table and graphs illustrate the District’s enrollment history from 2003-04 through 2012-

13.

Page 30: OLSD Project 2020 Report

REPORT Olentangy Local Schools

7

Page 31: OLSD Project 2020 Report

REPORT Olentangy Local Schools

8

LIVE BIRTH DATA

Utilization of live birth data is recommended when projecting future kindergarten enrollments. This

data provides a helpful overall trend. Large bubbles in birth counts, either up or down, can also be

planned for or anticipated by the District.

In addition, the live birth counts are used in determining a birth-to-kindergarten survival ratio. This

ratio identifies the percentage of children born in a representative area who attend kindergarten in the

District five years later. The survival ratios for birth-to-kindergarten as well as grades 1-12 can be

found later in this report.
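As an illustration of that ratio (a Python sketch; both figures are taken from the birth and enrollment tables in this report), the birth-to-kindergarten survival ratio divides a kindergarten class by the live births counted five years earlier:

```python
# Birth-to-kindergarten survival ratio: the share of children born in the
# representative zip codes who enroll in district kindergarten five years
# later. Figures below come from this report's tables.
births_2004 = 1_023        # "births used" for 2004 (zip codes 43035, 43065, 43240)
kindergarten_2009 = 1_319  # district kindergarten enrollment, 2009-10

survival_ratio = kindergarten_2009 / births_2004
print(f"{survival_ratio:.1%}")  # 128.9%
```

A ratio above 100 percent means more kindergartners enrolled than were born locally, reflecting families moving into the district; 128.9% matches the birth->K entry in the survival-ratio table.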

The Ohio Department of Health [ODH] data warehouse provides information about live birth events

for Ohio residents. Information about events occurring outside of Ohio to Ohio residents is included.

Information about events occurring inside Ohio to non-Ohio residents is not included.

Data is arranged by the residence of the mother. For example, if a mother lives in Powell, Delaware

County but delivers her baby in Columbus, Franklin County, the birth is counted in Powell, Delaware

County.

The number of live births is recorded by:

State

County

City/Town

Census Tract

Address [not available to the public]

Live birth counts are different from live birth rates. The live birth count is simply the actual number of

live births. A birth rate is the number of births per 1,000 women in a specified population group.

Birth rates are provided for counties and for 9 age groups from 10-14 years to 45+ years.

Page 32: OLSD Project 2020 Report

REPORT Olentangy Local Schools

9

Olentangy Local School District
Live Birth Counts by Zip Code

Year    43015  43016  43017  43021  43035  43065  43074  43081  43082  43235  43240   Total # of Births   Total # of Births Used
1996      423    211    462     47    127    377     92    663    143    468      0               3,013                      504
1997      478    277    458     45    153    392    117    671    167    481      0               3,239                      545
1998      467    275    481     55    192    410     92    703    220    480      4               3,379                      606
1999      536    295    468     58    260    370    108    732    226    460     12               3,525                      642
2000      525    296    470     65    317    469     99    701    301    437     34               3,714                      820
2001      594    310    452     62    346    456    112    676    329    452     33               3,822                      835
2002      646    361    466     71    358    481    117    675    354    397     36               3,962                      875
2003      694    361    441     82    429    542    153    654    406    451     48               4,261                    1,019
2004      678    419    449     94    434    522    154    650    413    421     67               4,301                    1,023
2005      622    398    409     89    437    525    151    620    372    409     53               4,085                    1,015
2006      694    465    428    103    507    570    137    668    368    439     63               4,442                    1,140
2007      732    494    460    129    447    477    149    699    325    479     75               4,466                      999
2008      683    495    436    112    447    505    142    702    308    467     71               4,368                    1,023
2009      654    497    430    116    458    441    129    697    289    497     84               4,292                      983
2010      640    521    463    113    431    490    139    711    312    516     73               4,409                      994
2011*     634    522    418    118    437    435    123    769    291    510     69               4,326                      941

* provisional

Source: Ohio Department of Health, Vital Statistics

The following chart and graph include the live birth count for zip codes 43015, 43016, 43017,

43021, 43035, 43065, 43074, 43081, 43082, 43235, and 43240. However, upon analysis of the

map on page 10, only zip codes 43035, 43065, and 43240 were used for projection purposes.

Page 33: OLSD Project 2020 Report

REPORT Olentangy Local Schools

10

Page 34: OLSD Project 2020 Report

REPORT Olentangy Local Schools

11

Delaware County

Per Capita Income          $26,708
Median Household Income    $50,502
Persons Below Poverty        15.9%

General Demographic Information

Source: US Census, American Community Survey 2010

                        2000 Census      2010
Delaware County             109,989   174,214
Columbus City (part)          1,891     7,245
Delaware City                25,243    34,753
Powell City                   6,247    11,500
Berkshire Township#           1,946     2,428
Berlin Township               3,315     6,498
Concord Township              4,507     9,294
Delaware Township*            1,559     1,964
Genoa Township               11,293    23,093
Liberty Township**            9,182    14,581
Orange Township              12,464    26,269

# does not include Galena or Sunbury villages

Total Population

Source: ODOD Policy Research & Strategic Planning Office, August 2011

* does not include Delaware City

** does not include Delaware City or Powell City

DEMOGRAPHICS

The Olentangy Local School District is located in southern Delaware County and includes Columbus

City, Delaware City, Powell City, Berkshire Township, Berlin Township, Concord Township, Delaware

Township, Genoa Township, Liberty Township, and Orange Township. General demographic data is

included in the following tables for the areas located completely or partially in the District.

Page 35: OLSD Project 2020 Report

REPORT Olentangy Local Schools

12

Included are block group estimates and projections provided by ESRI Business Information Solutions

(ESRI BIS). ESRI BIS uses a time series of estimates from the U.S. Census Bureau that includes the

latest estimates and intercensal estimates adjusted for error of closure. The Census Bureau’s time

series is consistent, but testing has revealed improved accuracy by using a variety of sources to track

county population trends.

ESRI BIS also employs a time series of building permits and housing starts plus residential deliveries.

Finally, local data sources that tested well against Census 2000 are reviewed. Data sources are

integrated and then analyzed by Census Block Groups.

Sources of data include:

Supplementary Surveys of the Census Bureau

Bureau of Labor Statistics’ (BLS) Local Area Unemployment Statistics

BLS Occupational Employment Statistics

InfoUSA

U.S. Bureau of the Census’ Current Population Survey

National Planning Association Data Service

Below is a list of definitions as they appear on the U.S. Census Bureau website, to aid in interpretation

of the following tables and maps.

Household:

A household includes all the people who occupy a housing unit as their usual place of residence.

Average family size:

A measure obtained by dividing the number of members of families by the total number of families (or

family householders).

Page 36: OLSD Project 2020 Report

REPORT Olentangy Local Schools

13

Family household (Family):

A family includes a householder and one or more people living in the same household who are related

to the householder by birth, marriage, or adoption. All people who are related to the householder are

regarded as members of his or her family. A family household may contain people not related to the

householder, but those people are not included as part of the householder’s family in census

tabulations. Thus, the number of family households is equal to the number of families, but family

households may include more members than do families. A household can contain only one family for

purposes of census tabulations. Not all households contain families since a household may comprise

a group of unrelated people or one person living alone.

Householder:

The person, or one of the people, in whose name the home is owned, being bought, or rented. If

there is no such person present, any household member 15 years old and over can serve as the

householder for the purposes of the census. Two types of householders are distinguished: a family

householder and a nonfamily householder. A family householder is a householder living with one or

more people related to him or her by birth, marriage, or adoption. The householder and all people in

the household related to him are family members. A nonfamily householder is a householder living

alone or with nonrelatives only.

Page 37: OLSD Project 2020 Report

REPORT Olentangy Local Schools

14

The following tables illustrate the current estimates and 5-year population projections based on block

groups that comprise the state and school district, indicating areas of current and projected growth.

The tables have been developed to determine selected age group projections and projections for

household income, family size, and family income.

The total population in the State of Ohio is 11,605,005. This population is projected to increase by

79,971 people, or approximately 1% over a 5-year period. The 0-18 year-old population in the State

currently totals 2,909,667. This population is projected to decrease by 24,064 children, or

approximately 1 percent.

State of Ohio       2010 Population Estimate   2015 Population Projection
Total Population                  11,605,005                   11,684,976
Ages 0-4                             761,241                      753,923
ES Ages (5-10)                       912,087                      912,641
MS Ages (11-13)                      453,226                      467,708
HS Ages (14-18)                      783,113                      751,331
Total Ages 0-18                    2,909,667                    2,885,603
Average Age                               37                           38

Source: ESRI BIS

[Chart: State of Ohio Population Estimates & Projections by Age Group, comparing the 2010 Population Estimate and 2015 Population Projection for Ages 0-4, ES Ages (5-10), MS Ages (11-13), and HS Ages (14-18)]

Page 38: OLSD Project 2020 Report

REPORT Olentangy Local Schools

15

The total population in the District is 65,985. This population is projected to increase by 11,115

people, or approximately 17% over a 5-year period. The 0-18 year-old population in the District

currently totals 21,457. This population is projected to increase by 3,267 children, or approximately

15 percent.

Olentangy Local School District   2010 Population Estimate   2015 Population Projection
Total Population                                    65,985                       77,100
Ages 0-4                                             5,840                        6,455
ES Ages (5-10)                                       7,125                        8,224
MS Ages (11-13)                                      3,611                        4,099
HS Ages (14-18)                                      4,880                        5,946
Total Ages 0-18                                     21,457                       24,724
Average Age                                             34                           34

Source: ESRI BIS

Page 39: OLSD Project 2020 Report

REPORT Olentangy Local Schools

16

Average household and family incomes in the State are projected to increase by 10% and 12%,

respectively over a 5-year period. Average family size is projected to remain relatively the same.

Average household and family incomes in the District are projected to increase by 10% and 8%,

respectively over a 5-year period. Average family size is projected to remain the same.

Olentangy Local School District   2010 Population Estimate   2015 Population Projection
Average Household Income                          $125,562                     $138,653
Average Family Size                                   3.13                         3.13
Average Family Income                             $130,688                     $141,217

Source: ESRI BIS

The maps on the following pages illustrate the data identified in the tables. The color coding identifies

areas within the District that may be increasing or decreasing at different rates than others.

State of Ohio              2010 Population Estimate   2015 Population Projection
Average Household Income                    $60,721                      $66,573
Average Family Size                            2.99                         2.98
Average Family Income                       $66,852                      $74,726

Source: ESRI BIS

Page 40: OLSD Project 2020 Report

REPORT Olentangy Local Schools

17

Page 41: OLSD Project 2020 Report

REPORT Olentangy Local Schools

18

Page 42: OLSD Project 2020 Report

REPORT Olentangy Local Schools

19

Page 43: OLSD Project 2020 Report

REPORT Olentangy Local Schools

20

Page 44: OLSD Project 2020 Report

REPORT Olentangy Local Schools

21

Page 45: OLSD Project 2020 Report

REPORT Olentangy Local Schools

22

Page 46: OLSD Project 2020 Report

REPORT Olentangy Local Schools

23

Olentangy Local School District
Single Family Building Permit Data

        Berkshire  Berlin  Concord  Delaware  Genoa  Liberty  Orange  Delaware  City of   City of
Year          Twp     Twp      Twp       Twp    Twp      Twp     Twp      City   Powell  Columbus    Total
2001          n/a     182      n/a       n/a    n/a      198     532        17      105        84    1,118
2002            1     154      101         8    179      227     494        25      127       180    1,496
2003            1     123       94        17    144      170     574        99      261       101    1,584
2004            8      97       72         4    154      179     767       102      209       139    1,731
2005           12      84       52         9    194      165     405        92      216       164    1,393
2006           20      66       13         8     96      102     216        72      138        37      768
2007           23      40       12         0     78       75     227        58       94        42      649
2008            8      30       12         0     44       69     142        23       41        18      387
2009           19      20        8         1     52       30     129        24       32        14      329
2010            6      35        9         0     52       45     117        52       34        20      370
2011            9      40       11         2     65       66     170        46       45        14      468
2012*           8      17        9         5     43       65     111        53       33        15      359

* all through August 2012, except Delaware City which is as of 10/10/12

Source: Delaware County Regional Planning Commission; City of Powell; Delaware City; Columbus City

HOUSING INFORMATION

The following chart illustrates housing trends in the Olentangy Local School District since 2001. The

number of single-family permits issued increased steadily through 2004 then declined. Last year, the

permits totaled over 400 for the first time since 2007.

Page 47: OLSD Project 2020 Report

REPORT Olentangy Local Schools

24

from  to     birth->K    K->1    1->2    2->3    3->4    4->5    5->6    6->7    7->8    8->9   9->10  10->11  11->12
2003  2004     134.0%  113.6%  105.2%  105.4%  108.1%  108.2%  107.8%  106.2%  105.1%  108.3%  103.7%  102.5%  101.3%
2004  2005     131.7%  117.6%  106.2%  107.6%  108.3%  105.2%  107.5%  103.6%  105.4%  104.4%  104.3%  103.1%  104.5%
2005  2006     137.7%  114.5%  107.5%  104.5%  108.7%  106.6%  105.1%  104.5%  104.4%  107.8%  105.2%  101.7%   99.5%
2006  2007     137.5%  114.8%  102.7%  103.3%  103.5%  101.7%  102.3%  104.2%  102.1%  101.2%  100.9%  100.4%  100.2%
2007  2008     123.0%  109.6%  101.1%  103.9%  104.1%  105.6%  102.8%  103.9%  101.1%  104.6%  101.8%  102.1%   98.9%
2008  2009     128.9%  113.5%  102.4%  101.6%  103.2%   99.8%  103.1%  102.4%  101.7%  103.0%  100.4%  100.9%   99.9%
2009  2010     132.3%  112.8%  102.7%  102.1%  103.2%  101.2%  104.3%   99.9%  100.5%  104.1%  101.9%  101.7%  100.5%
2010  2011     131.6%  110.0%  100.7%  101.8%  102.3%  100.6%  102.6%  103.4%  102.3%  102.3%   99.9%  102.6%  100.6%
2011  2012     124.2%  112.1%  102.9%  104.0%  101.5%  102.8%  102.3%  100.9%  102.0%  103.3%  100.8%   99.7%   99.8%
average       131.21% 113.15% 103.48% 103.80% 104.77% 103.50% 104.20% 103.20% 102.70% 104.34% 102.10% 101.64% 100.57%
std. dev.      4.864%  2.322%  2.189%  1.792%  2.643%  2.774%  2.044%  1.802%  1.680%  2.220%  1.776%  1.040%  1.519%

SURVIVAL RATIOS

The chart below demonstrates the changes in enrollment as students move through the system.

Percentages greater than 100 indicate that there are more students than there were in the previous

grade the previous year. In other words, there was growth and new students were added to the

system. Percentages less than 100 indicate that there was decline or students left the system.

The following table illustrates the survival ratios used in developing the enrollment projections for the

Olentangy Local Schools.
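That computation is simply one year's enrollment in a grade divided by the prior year's enrollment one grade below; as an illustration (a Python sketch, using two figures from the historical enrollment table):

```python
# Survival ratio: this year's enrollment in a grade divided by last
# year's enrollment in the grade below. Values are district counts
# from the historical enrollment table.
k_2011 = 1_341       # kindergarten, 2011-12
grade1_2012 = 1_503  # first grade, 2012-13

k_to_1 = grade1_2012 / k_2011
print(f"K->1 (2011 to 2012): {k_to_1:.1%}")  # 112.1%
```

The result matches the 112.1% K->1 entry for 2011 to 2012 in the survival-ratio table, a ratio above 100 percent indicating that students were added to the system.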

Page 48: OLSD Project 2020 Report

REPORT Olentangy Local Schools

25

ENROLLMENT PROJECTIONS

Based on the current trends seen in the District, DeJONG-HEALY recommends the following enrollment projection methodology.

Pre-K

The preschool projection is based on kindergarten enrollment rather than simply adding 34.15

students each year as we did in 2011. For the past three years, preschool enrollment has equaled

24.66% of the kindergarten enrollment. We applied this percentage to the kindergarten projection to

determine the Pre-K projection.

Kindergarten

For kindergarten, the average survival ratio of the past 3 years was used for birth to kindergarten through 2013-

14. Starting in the 2014-15 school year, a 0.5% growth factor was applied to the previous year’s

enrollment.

Grades 1-12

For grades 1-12, the average survival ratio of the past 2 years was applied.
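The methodology above can be expressed as a minimal cohort-survival sketch (Python added for illustration; function names are ours, and the published projections may differ by a student or two where DeJONG-HEALY's intermediate rounding is not documented):

```python
# Sketch of the cohort-survival projection methodology described above.
# Default parameters (0.5% growth, 24.66% Pre-K share) come from this report.

def project_kindergarten(prev_k: int, growth: float = 0.005) -> int:
    """From 2014-15 onward, kindergarten grows 0.5% over the prior year."""
    return round(prev_k * (1 + growth))

def project_grade(prev_lower_grade: int, avg_survival_ratio: float) -> int:
    """Grades 1-12: prior year's enrollment one grade below, times the
    average survival ratio of the past two years."""
    return round(prev_lower_grade * avg_survival_ratio)

def project_pre_k(projected_k: int, share: float = 0.2466) -> int:
    """Pre-K has averaged 24.66% of kindergarten for the past three years."""
    return round(projected_k * share)

# A projected kindergarten of 1,258 for 2014-15 yields 1,264 for 2015-16,
# and a 2013-14 kindergarten projection of 1,309 yields a Pre-K of 323.
print(project_kindergarten(1_258), project_pre_k(1_309))  # 1264 323
```

Both sample outputs agree with the corresponding entries in the projected enrollment table.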

2012-13 End of Year Projection

The 2012-13 “End of Year” projection was determined by taking the 0.66% difference between the

actual October 3, 2012 enrollment and the previously projected 2012-13 enrollment, and adding it to the

current enrollment.

For example, the current kindergarten enrollment is 1,241. By adding .66% [or 8 students] to that

number, the end of year kindergarten enrollment is projected to be 1,249 students.
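That adjustment can be checked with a short snippet (a Python sketch; figures are from the kindergarten example above):

```python
# End-of-year projection: add the 0.66% projection-vs-actual difference
# to the current October count, using kindergarten as in the example.
october_count = 1_241   # kindergarten enrollment as of October 3, 2012
difference = 0.0066     # 0.66% Pre-K-12 projection vs. actual difference

end_of_year = round(october_count * (1 + difference))
print(end_of_year)  # 1249
```

The 0.66% adjustment adds about 8 students, giving the 1,249 end-of-year figure stated above.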

Page 49: OLSD Project 2020 Report

REPORT Olentangy Local Schools

26

            2012-13  2012-13
Grade       Actual*      EOY  2013-14  2014-15  2015-16  2016-17  2017-18  2018-19  2019-20  2020-21  2021-22  2022-23
Pre-K           315      318      323      311      312      314      315      317      318      320      322      323
K             1,241    1,249    1,309    1,258    1,264    1,270    1,277    1,283    1,290    1,296    1,303    1,309
1             1,503    1,513    1,386    1,461    1,404    1,411    1,418    1,425    1,432    1,439    1,447    1,454
2             1,520    1,530    1,535    1,415    1,492    1,434    1,441    1,448    1,455    1,462    1,470    1,477
3             1,559    1,569    1,561    1,576    1,453    1,532    1,472    1,479    1,487    1,494    1,501    1,509
4             1,510    1,520    1,596    1,598    1,613    1,487    1,568    1,506    1,514    1,521    1,529    1,537
5             1,450    1,460    1,534    1,621    1,623    1,639    1,510    1,592    1,530    1,538    1,545    1,553
K-5 Total     8,783    8,841    8,921    8,929    8,849    8,773    8,686    8,733    8,708    8,750    8,795    8,839
6             1,440    1,450    1,495    1,581    1,671    1,672    1,689    1,556    1,641    1,577    1,585    1,593
7             1,428    1,437    1,461    1,516    1,603    1,694    1,696    1,712    1,578    1,664    1,599    1,607
8             1,283    1,291    1,451    1,484    1,540    1,629    1,721    1,723    1,739    1,603    1,690    1,624
6-8 Total     4,151    4,178    4,407    4,581    4,814    4,995    5,106    4,991    4,958    4,844    4,874    4,824
9             1,272    1,280    1,325    1,498    1,532    1,590    1,682    1,777    1,779    1,796    1,655    1,745
10            1,125    1,132    1,283    1,336    1,511    1,545    1,603    1,696    1,792    1,794    1,811    1,669
11            1,039    1,046    1,140    1,300    1,354    1,531    1,566    1,625    1,718    1,816    1,818    1,835
12            1,067    1,074    1,043    1,144    1,304    1,358    1,536    1,570    1,629    1,724    1,821    1,823
9-12 Total    4,503    4,532    4,791    5,278    5,701    6,024    6,387    6,668    6,918    7,130    7,105    7,072
Total        17,752   17,869   18,442   19,099   19,676   20,106   20,494   20,709   20,902   21,044   21,096   21,058

* as of October 3, 2012

Olentangy Local School District
Projected Enrollment

Source: DeJONG-HEALY

Student enrollment is projected to increase by 3,306 students in grades Pre-K-12 from the 2012-13 to

the 2022-23 school year. The following table and graphs illustrate projected enrollments by grade

and grade group through the 2022-23 school year.

Page 50: OLSD Project 2020 Report

REPORT Olentangy Local Schools

27

Page 51: OLSD Project 2020 Report

REPORT Olentangy Local Schools

28

            2011-12  2011-12
Grade       Actual*      EOY  2012-13  2013-14  2014-15  2015-16  2016-17  2017-18  2018-19  2019-20  2020-21  2021-22
Pre-K           321      323      356      390      424      458      492      526      561      595      629      663
K             1,341    1,348    1,319    1,351    1,358    1,364    1,371    1,378    1,385    1,392    1,399    1,406
1             1,477    1,484    1,498    1,473    1,509    1,516    1,524    1,532    1,539    1,547    1,555    1,562
2             1,499    1,506    1,503    1,524    1,499    1,534    1,542    1,550    1,558    1,565    1,573    1,581
3             1,487    1,494    1,529    1,533    1,554    1,528    1,565    1,573    1,581    1,589    1,597    1,605
4             1,410    1,417    1,531    1,574    1,577    1,599    1,573    1,611    1,619    1,627    1,635    1,643
5             1,408    1,415    1,423    1,544    1,588    1,591    1,614    1,587    1,625    1,633    1,641    1,650
K-5 Total     8,622    8,664    8,803    8,999    9,085    9,132    9,189    9,231    9,307    9,353    9,400    9,447
6             1,415    1,422    1,458    1,473    1,598    1,644    1,647    1,670    1,642    1,682    1,690    1,699
7             1,258    1,264    1,440    1,483    1,498    1,626    1,671    1,675    1,699    1,670    1,710    1,719
8             1,231    1,237    1,276    1,460    1,503    1,519    1,648    1,695    1,698    1,722    1,694    1,734
6-8 Total     3,904    3,923    4,174    4,416    4,599    4,789    4,966    5,040    5,039    5,074    5,094    5,152
9             1,116    1,121    1,271    1,317    1,506    1,551    1,568    1,701    1,749    1,753    1,778    1,748
10            1,042    1,047    1,125    1,281    1,327    1,518    1,564    1,580    1,715    1,763    1,767    1,792
11            1,069    1,074    1,065    1,150    1,309    1,357    1,552    1,598    1,615    1,752    1,802    1,806
12              974      979    1,075    1,071    1,156    1,316    1,363    1,560    1,606    1,623    1,761    1,811
9-12 Total    4,201    4,221    4,536    4,819    5,298    5,742    6,047    6,439    6,685    6,891    7,108    7,157
Total        17,048   17,131   17,869   18,624   19,406   20,121   20,694   21,236   21,592   21,913   22,231   22,419

Olentangy Local School District
Projected Enrollment 2011

Source: DeJONG-HEALY

COMPARISON TO 2011 ENROLLMENT PROJECTION

Last year’s projection is illustrated in the following table. For the 2021-22 school year, the difference

between this and the current projection is 1,323 students.

The biggest difference is at the Pre-K level with 341 students, and at the elementary school level with

652 students, or 109 students per grade.

The difference at the middle school level is 278 students (93 students per grade) and at the high

school level is 52 students (13 students per grade).
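These grade-level differences can be reproduced from the two projection tables (a Python sketch; all figures are the tables' 2021-22 values):

```python
# Differences between last year's (2011) and the current (2012) projections
# for the 2021-22 school year, using grade-group totals from this report.
proj_2011 = {"Pre-K": 663, "K-5": 9_447, "6-8": 5_152, "9-12": 7_157, "Total": 22_419}
proj_2012 = {"Pre-K": 322, "K-5": 8_795, "6-8": 4_874, "9-12": 7_105, "Total": 21_096}

diff = {level: proj_2011[level] - proj_2012[level] for level in proj_2011}
print(diff)  # {'Pre-K': 341, 'K-5': 652, '6-8': 278, '9-12': 52, 'Total': 1323}
```

Dividing each grade-group difference by its number of grades gives the per-grade figures cited above (652 / 6 is about 109; 278 / 3 is about 93; 52 / 4 is 13).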

Page 52: OLSD Project 2020 Report

REPORT Olentangy Local Schools

29

CONCLUSION

As with any projections, the District should continue to pay close attention to live birth counts,

enrollment in the elementary schools, housing growth, and demographics. Each of these factors will

have an impact on future student enrollment.

DeJONG-HEALY is pleased to have had the opportunity to provide the District with enrollment

projection services. We hope this document will provide the necessary information to make informed

decisions about the future of the Olentangy Local Schools.

Page 53: OLSD Project 2020 Report

OLENTANGY LOCAL SCHOOL DISTRICT - Facility Openings
Last Updated August 1, 2012

Building 1st Year Opened Paid For

Olentangy High School 1990-91

OHS Addition July-97

Wyandot Run Elementary School 1993-94

Alum Creek Elementary School 1996-97

Arrowhead Elementary School 1998-99

Scioto Ridge Elementary School 1998-99

Oak Creek Elementary School 2000-01 2000 Bond Issue

Tyler Run Elementary School 2001-02 1999 Bond Issue

Olentangy Liberty Middle School 2001-02 1999 Bond Issue

Olentangy Liberty High School 2003-04 2001 Bond Issue

Walnut Creek Elementary School 2003-04 2002 Bond Issue

Indian Springs Elementary School 2003-04 2002 Bond Issue

Olentangy Orange Middle School 2004-05 2001 Bond Issue

Home Road Bus/Garage Facility February-03 2001 Bond Issue

Glen Oak Elementary School 2005-06 2004 Bond Issue

Olentangy Meadows Elementary School 2006-07 2004 Bond Issue

Olentangy Hyatts Middle School 2007-08 2004 Bond Issue

Liberty Tree Elementary School 2007-08 2004 Bond Issue

Johnnycake Corners Elementary School 2007-08 2005 Bond Issue

Olentangy Orange High School 2008-09 2005 Bond Issue

Berkshire Road Bus Facility January-09 2008 Bond Issue

Freedom Trail Elementary School 2009-10 2008 Bond Issue

OHS Expansion & Renovations 2009-10 2008 Bond Issue

Cheshire Elementary School 2010-11 2008 Bond Issue

Olentangy Berkshire Middle School 2011-12 2008 Bond Issue

Heritage Elementary School 2011-12 2008 Bond Issue


OLENTANGY LOCAL SCHOOL DISTRICT - DELAWARE COUNTY
Schedule of Revenue, Expenditures and Changes in Fund Balances
Actual and Forecasted Operating Fund

(Each row lists, in order: Actual FY2010, FY2011, FY2012; Forecasted FY2013, FY2014, FY2015, FY2016, FY2017. A dash indicates no amount; parentheses indicate negative amounts.)

Revenue:
1.010 General Property Tax (Real Estate): 95,938,250; 96,946,482; 106,623,772; 117,832,378; 118,898,094; 120,638,068; 122,763,489; 125,259,680
1.020 Tangible Personal Property Tax: 5,142,342; 5,170,660; 5,283,158; 5,811,401; 5,927,629; 6,046,182; 6,167,105; 6,228,166
1.030 Income Tax: - in all years
1.035 Unrestricted Grants-in-Aid: 7,458,556; 7,009,414; 7,330,465; 7,330,465; 7,330,464; 7,330,464; 7,330,464; 7,330,464
1.040 Restricted Grants-in-Aid: 561,288; 671,088; 19,679; 19,675; 19,675; 19,675; 19,675; 19,675
1.045 Restricted Federal Grants-in-Aid - SFSF: -; -; 394,926; -; -; -; -; -
1.050 Property Tax Allocation: 15,312,340; 16,852,285; 15,803,622; 14,873,589; 15,039,413; 15,265,290; 15,549,578; 15,889,509
1.060 All Other Operating Revenues: 13,731,899; 13,739,404; 18,361,120; 18,321,101; 18,812,707; 18,566,500; 18,759,333; 18,953,129
1.070 Total Revenue: 138,144,675; 140,389,333; 153,816,742; 164,188,608; 166,027,982; 167,866,178; 170,589,643; 173,680,622

Other Financing Sources:
2.010 Proceeds from Sale of Notes: - in all years
2.020 State Emergency Loans and Advancements: - in all years
2.040 Operating Transfers-In: -; -; 8,382; -; -; -; -; -
2.050 Advances-In: - in all years
2.060 All Other Financing Sources: 10,272; 38,567; 127; 25,000; 25,000; 25,000; 25,000; 25,000
2.070 Total Other Financing Sources: 10,272; 38,567; 8,509; 25,000; 25,000; 25,000; 25,000; 25,000

2.080 Total Revenues and Other Financing Sources: 138,154,947; 140,427,900; 153,825,251; 164,213,608; 166,052,982; 167,891,178; 170,614,643; 173,705,622

Expenditures:
3.010 Personnel Services: 85,642,580; 89,157,154; 93,542,391; 96,920,093; 102,334,337; 108,103,611; 113,628,425; 118,941,535
3.020 Employees' Retirement/Insurance Benefits: 29,928,348; 34,290,693; 34,642,140; 36,738,602; 40,608,886; 45,005,666; 49,726,813; 54,673,699
3.030 Purchased Services: 8,470,542; 10,994,276; 10,258,527; 11,394,854; 11,850,648; 12,324,674; 12,817,661; 13,330,367
3.040 Supplies and Materials: 5,229,198; 3,762,705; 4,012,908; 4,260,596; 5,214,243; 4,824,764; 5,411,478; 5,654,994
3.050 Capital Outlay: 284,063; -; 214,604; 251,060; 256,081; 261,203; 266,427; 271,756
3.060 Intergovernmental: - in all years

Debt Service:
4.010-4.060 Principal (All Years, Notes, State Loans, State Advances, HB264 Loan, Other) and Interest and Fiscal Charges: - in all years
4.300 Other Objects: 4,210,071; 5,635,742; 6,482,136; 7,629,259; 8,217,188; 8,850,838; 9,471,424; 10,119,421
4.500 Total Expenditures: 133,764,802; 143,840,570; 149,152,706; 157,194,464; 168,481,383; 179,370,755; 191,322,228; 202,991,772

Other Financing Uses:
5.010 Operating Transfers-Out: -; -; 8,382; -; -; -; -; -
5.020 Advances-Out: - in all years
5.030 All Other Financing Uses: - in all years
5.040 Total Other Financing Uses: -; -; 8,382; -; -; -; -; -

5.050 Total Expenditures and Other Financing Uses: 133,764,802; 143,840,570; 149,161,088; 157,194,464; 168,481,383; 179,370,755; 191,322,228; 202,991,772

6.010 Excess of Revenues and Other Financing Sources Over (Under) Expenditures and Other Financing Uses: 4,390,145; (3,412,670); 4,664,163; 7,019,144; (2,428,401); (11,479,577); (20,707,585); (29,286,150)

7.010 Cash Balance July 1 - Excluding Proposed Renewal/Replacement and New Levies: 23,046,870; 27,437,015; 24,024,345; 28,688,508; 35,707,652; 33,279,250; 21,799,673; 1,092,087
7.020 Cash Balance June 30: 27,437,015; 24,024,345; 28,688,508; 35,707,652; 33,279,250; 21,799,673; 1,092,087; (28,194,063)

8.010 Estimated Encumbrances June 30: 2,894,084; 2,366,809; 2,719,602; 2,600,000; 2,678,000; 2,758,340; 2,841,090; 2,926,323

Reservations of Fund Balance:
9.010 Textbooks and Instructional Materials; 9.020 Capital Improvements; 9.040 DPIA; 9.050 Debt Service; 9.060 Property Tax Advances; 9.070 Bus Purchases: - in all years
9.030 Budget Reserve: 58,478; then - in all later years
9.080 Subtotal: 58,478; then - in all later years

10.010 Fund Balance June 30 for Certification of Appropriations: 24,484,453; 21,657,536; 25,968,906; 33,107,652; 30,601,250; 19,041,333; (1,749,003); (31,120,386)

Revenue from Replacement/Renewal Levies:
11.010 Income Tax - Renewal; 11.020 Property Tax - Renewal or Replacement; 11.030 Cumulative Balance of Replacement/Renewal Levies: - in all years

12.010 Fund Balance June 30 for Certification of Contracts, Salary and Other Obligations: 24,484,453; 21,657,536; 25,968,906; 33,107,652; 30,601,250; 19,041,333; (1,749,003); (31,120,386)

Revenue from New Levies:
13.010 Income Tax - New; 13.020 Property Tax - New; 13.030 Cumulative Balance of New Levies: - in all years

14.010 Revenue from Future State Advancements: - in all years

15.010 Unreserved Fund Balance June 30: 24,484,453; 21,657,536; 25,968,906; 33,107,652; 30,601,250; 19,041,333; (1,749,003); (31,120,386)

ADM Forecasts (FY2013 through FY2017):
20.010 Kindergarten - October Count: 1,319; 1,351; 1,358; 1,364; 1,371
20.015 Grades 1-12 - October Count: 16,194; 16,883; 17,624; 18,299; 18,831
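The schedule above is a simple roll-forward: each year's June 30 cash balance (line 7.020) equals the July 1 balance (7.010) plus total revenues and other financing sources (2.080), minus total expenditures and other financing uses (5.050). A quick arithmetic check, using the FY2010 actuals from the schedule:

```python
# Cash-balance roll-forward check using the FY2010 actuals from the schedule.
beginning_balance = 23_046_870   # 7.010 - Cash Balance July 1
total_sources = 138_154_947      # 2.080 - Total Revenues and Other Financing Sources
total_uses = 133_764_802         # 5.050 - Total Expenditures and Other Financing Uses

ending_balance = beginning_balance + total_sources - total_uses
print(ending_balance)  # 27437015, matching 7.020 - Cash Balance June 30 for FY2010
```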


09/14/12 Page 1

OLENTANGY LOCAL SCHOOL DISTRICT
NOTES AND ASSUMPTIONS RELATED TO THE FIVE YEAR FORECAST
OCTOBER 2012

OVERVIEW

The October 2012 forecast considers not only the 2011 property value reappraisal by the County Auditor, but also future property valuation growth estimates and the effects of HB153, passed by the State legislature and signed by the Governor. HB153 went into effect on July 1, 2011 and created a "Bridge Formula" for calculating state foundation funding until a new formula can be developed. Under the Bridge Formula, Olentangy remains a guarantee district, meaning we will receive approximately the same amount of state foundation funding in FY13 that we received in FY12. The most significant consequence of being a guarantee district is that we receive no additional funding for our tremendous student growth: while Olentangy continues to grow by 700-800 students per year, our state funding has remained flat.

The most significant negative outcome of HB153 is the accelerated phase-out of state Tangible Personal Property (TPP) reimbursements. The District did not anticipate that the phase-out would be expedited so that all TPP reimbursement is lost by FY13, as opposed to the original phase-out deadline of FY18. That equates to a combined loss of over $7 million for FY12 and FY13.

In its continual effort to find more efficient ways to deliver education, the District administration has already begun planning additional expenditure reductions while still delivering the same level of service to our students and community. Those reductions are discussed throughout the notes to the financial statements. This process is ongoing, and the forecast will continue to be updated as reductions are made.


GENERAL

For planning purposes, the number of students is essential, along with the timing of opening new buildings.

For the estimation of student population, the District used the most recent enrollment projections as presented by the Development Committee. This plan was approved by the District’s Development Committee on November 2, 2011 and was presented to and approved by the Board on November 9, 2011.

Enrollment figures do not include preschool students.

The District used the Development Committee's Enrollment Projections dated November 2011 (a summary of the DeJong and Associates report) to determine staffing requirements. The following student enrollment projections were used (FY13 actual as of the date of the forecast):

          Actual    ------------- Projected -------------
Grades    2013      2013      2014      2015      2016      2017
K-5       8,790     8,803     8,999     9,085     9,132     9,189
6-8       4,151     4,174     4,416     4,599     4,789     4,966
9-12      4,517     4,536     4,819     5,298     5,742     6,047
Total     17,458    17,513    18,234    18,982    19,663    20,202

The District opened one new elementary school in fiscal year 2012.

The District opened its fifth middle school in fiscal year 2012.

No buildings are projected to be opened in fiscal years 2013 through 2017.

Staffing Detail

                        2013      2014      2015      2016      2017
# of Buildings          23        23        23        23        23
Certified Staff         1,101     1,134     1,171     1,208     1,237
Classified Staff        674       690       695       700       703
Administrative Staff    70        70        70        70        70
Pupil/Teacher Ratio     15.86     16.08     16.21     16.27     16.33
Enrollment              17,458    18,234    18,982    19,663    20,202

REVENUE

GENERAL PROPERTY TAX (REAL ESTATE)

The County Auditor conducted a reappraisal of existing property values in calendar year 2011. The economic downturn has had a negative impact on property values, and property values in total in Olentangy did not see any increase during this or the previous reappraisal. The 2011 reappraisal resulted in a decrease in property valuation of approximately 6%. This is an unusual occurrence, as the District saw double-digit percentage growth in the prior two reappraisals. Due to the effect of HB920, this decrease in valuation will negatively affect our un-voted, or inside, millage: the total valuation against which our five (5) inside mills are levied is decreasing, which decreases that revenue.


Also, HB920 causes the District's voted, or outside, millage to adjust so that the District does not receive less revenue than the voted mills provided when they were approved by the voters. This is true of all existing operating levies except the levy passed in March 2008. The 2008 reappraisal warranted no adjustment to real estate values; therefore, that levy was never rolled back and is still collected at its original 7.9 mills. Because it is a fixed-rate levy already at its voted rate, there is no room for HB920 to adjust it upward, and OLSD will lose approximately $1.2 million in annual real estate tax collections. The reduction in valuation also causes our May 2011 levy to produce approximately $1.2 million less revenue than it would have: the County Auditor's estimate of what the levy would generate was based on our 2010 values, but the levy began generating revenue based on our 2011 values, which were reduced by the reappraisal.
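The HB920 mechanics described above can be sketched as follows. All dollar figures here are illustrative assumptions, not District data; the sketch only shows why a fixed-rate levy already collected at its voted rate cannot be rolled up when valuation falls:

```python
# HB920 in miniature: a voted levy's effective millage is adjusted so the levy
# keeps producing its original dollar yield, but can never exceed the voted
# rate. The valuation figures below are illustrative assumptions.

def effective_millage(original_yield, current_valuation, voted_mills):
    """Mills (dollars per $1,000 of value) needed to keep producing
    original_yield, capped at the voted rate."""
    needed = original_yield / current_valuation * 1000
    return min(needed, voted_mills)

voted_mills = 7.9
valuation_at_passage = 3_000_000_000                          # assumed
original_yield = valuation_at_passage * voted_mills / 1000    # ~$23.7M

# If valuation falls 6%, the rate would need to rise above 7.9 mills to hold
# revenue constant -- but HB920 cannot push a fixed-rate levy past its voted
# rate, so the levy simply collects less.
new_valuation = valuation_at_passage * 0.94
rate = effective_millage(original_yield, new_valuation, voted_mills)
revenue = new_valuation * rate / 1000
shortfall = original_yield - revenue   # ~$1.4M under these assumed figures
```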

The forecast assumes that growth in new residential and commercial real estate will remain slow due to the economy. Therefore, outside of the impact of a new levy, tax collections are anticipated to grow at a rate consistent with new construction. The District continues to have conversations with the Delaware County Auditor concerning this matter.

The District estimates a collection rate of approximately 97% based on historical trends. Delinquencies are expected to remain at their current level.

The District has experienced a rapid decline in new construction over the past several years. In conversations with County officials, we anticipate new construction to gradually begin to increase as noted below.

Delaware County, like many counties, has also seen a large influx of board of revision (BOR) real estate value complaint cases, in which property owners, both residential and commercial, can request to have their property values reduced by the BOR. The BOR is comprised of a county commissioner, the county auditor and the county treasurer. Not all requests are granted; however, enough have been approved that the District has seen a significant negative impact on real estate tax revenues due to the resulting reduction in real estate values. HB920 provides some relief: if one property owner's taxes are reduced, other property owners' taxes must be increased to comply with HB920, because as total valuation decreases due to BOR cases, effective millage is rolled up so that the District collects about the same amount of revenue from each levy. However, the District still loses revenue on the inside millage, on levies that have no room to be rolled up, and on the first-year reduction in value, because total values are not adjusted until the following year.

It is worth noting that a half percent change in valuation equates to approximately $500k in revenue, so even a minor change in new construction, BOR cases, delinquencies, etc. will have a significant dollar impact on the forecast.
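As a back-of-the-envelope illustration of that sensitivity (both the total valuation and the blended effective millage below are assumed round numbers, not District figures):

```python
# Rough sensitivity check behind the "half percent of valuation ~= $500k"
# rule of thumb. Both inputs are assumed illustrative figures.
total_valuation = 3_400_000_000   # assumed total assessed value
effective_mills = 29.5            # assumed blended effective operating millage

change = 0.005 * total_valuation               # a half-percent valuation swing
revenue_impact = change * effective_mills / 1000
# ~$501,500 -- on the order of the $500k cited above
```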


                              Projected Collection Year
                              2013      2014      2015      2016      2017
Residential
  Inflation/Reappraisal/BOR   -0.50%    -0.50%    0.00%     0.00%     0.00%
  New Construction            1.20%     1.50%     1.75%     2.00%     2.40%
Commercial
  Inflation/Reappraisal/BOR   0.00%     0.00%     0.00%     0.00%     0.00%
  New Construction            1.40%     1.40%     1.40%     1.40%     1.40%

Based on the above discussion, the real estate tax revenue is projected as follows:

               Actual          Forecasted
               FY2012          FY2013          FY2014          FY2015          FY2016          FY2017
Real Estate
Tax Revenue    $106,623,772    $117,832,378    $118,898,094    $120,638,068    $122,763,489    $125,259,680
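Mechanically, overall revenue growth in any collection year is a valuation-weighted blend of the residential and commercial assumptions above. A sketch, with an assumed (hypothetical) 80/20 residential/commercial valuation split:

```python
# Blending the class-level growth assumptions into one overall growth rate.
# The 80/20 residential/commercial split is an assumption for illustration,
# not a District statistic.
res_share, com_share = 0.80, 0.20

# Collection year 2014 assumptions from the table above:
res_growth = -0.0050 + 0.0150   # inflation/reappraisal/BOR + new construction
com_growth =  0.0000 + 0.0140

blended = res_share * res_growth + com_share * com_growth
# 0.8 * 1.0% + 0.2 * 1.4% = about 1.1% overall growth
```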

TANGIBLE PERSONAL PROPERTY TAX

The forecast takes into account the impact of HB153, the new biennial budget for FY12-13. Per HB153, the phase out of state tangible personal property reimbursement is accelerated and will be completely eliminated after FY2013. HB153 causes a total loss in tangible personal property reimbursements of $2.5 million in FY2012 and $4.6 million per year thereafter. These reimbursements are recorded in “Property Tax Allocation”.

Personal property utility tax (PPUT) is the tangible personal property used in the operations of a public utility company, such as telephone and electric lines. The District expects to continue collecting this portion of taxes, which is based on voted millage.

                    Actual        Forecasted
                    FY2012        FY2013        FY2014        FY2015        FY2016        FY2017
Personal Property
Utility Tax         $5,283,158    $5,811,401    $5,927,629    $6,046,182    $6,167,105    $6,228,166

UNRESTRICTED GRANTS-IN-AID

In the prior biennial budget, the State funding model saw substantial changes as it went through the legislative process. The State legislature approved the biennial budget known as House Bill 1 (HB1) on June 30, 2009, which was then signed by the Governor. HB1 implemented an evidence-based model (EBM) of funding, and the legislature put gain caps on this funding model. Olentangy went from being a formula-funded school district to a guarantee district. This is significant in that the District had started seeing growth in State funding based on our enrollment growth beginning in FY2009; however, with the passage of HB1 and becoming a guarantee district, Olentangy received no additional funding for our tremendous student growth during that time.

The current biennial budget, HB 153, took effect on July 1, 2011. HB153 created a “Bridge Formula” for calculating state foundation funding until a new formula can be created. This Bridge Formula basically means Olentangy is still a guarantee district, hence we will receive approximately the same amount of state foundation funding in FY2013 that we received in FY2012. While Olentangy continues to grow at 700-800 students per year, our state funding has remained flat. Also, HB153 does not replace the State Fiscal Stabilization Funds we had previously received as part of the federal American Recovery and Reinvestment Act, which is a loss of close to $600k compared to FY2011.

The remainder of the forecast period will span two new State biennial budgets. We expect to have a new funding formula beginning in FY2014; however, no information is available at this time.

                  Actual        Forecasted
                  FY2012        FY2013        FY2014        FY2015        FY2016        FY2017
State Basic Aid   $7,330,465    $7,330,465    $7,330,464    $7,330,464    $7,330,464    $7,330,464

RESTRICTED GRANTS-IN-AID

Special State funding programs are included in this category. The District receives money from the State to assist in career tech funding. A portion of funding which flows through ODE is federal stimulus money. The Ed Jobs funding is a federal program designed to save or create education jobs. This one-time funding was received in FY2012.

                               Actual      Forecasted
                               FY2012      FY2013     FY2014     FY2015     FY2016     FY2017
Restricted Grants-in-Aid       $19,679     $19,675    $19,675    $19,675    $19,675    $19,675
Restricted Federal - Ed Jobs   $394,926    -          -          -          -          -
Total                          $414,605    $19,675    $19,675    $19,675    $19,675    $19,675

PROPERTY TAX ALLOCATION

A majority of these funds are reimbursements from the State for tax credits given to owner-occupied residences, known as homestead/rollback, equaling 12.5% of the gross property taxes charged to residential taxpayers and up to 10% for commercial and industrial taxpayers. These amounts will increase and decrease with property valuation fluctuations and the number of residents applying for the credit.
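In miniature, the reimbursement works like this (the gross tax figure is an assumed illustration; only the 12.5% owner-occupied credit rate comes from the narrative above):

```python
# State rollback/homestead reimbursement in miniature. The gross tax figure
# is an assumed illustration; the 12.5% owner-occupied credit rate is from
# the narrative above.
OWNER_OCCUPIED_CREDIT = 0.125   # 10% rollback + 2.5% homestead

gross_residential_taxes = 110_000_000   # assumed gross charge, owner-occupied
reimbursement = gross_residential_taxes * OWNER_OCCUPIED_CREDIT

# Taxpayers pay the net bill; the State reimburses the credited portion,
# so the District still receives the full gross amount.
net_taxpayer_bill = gross_residential_taxes - reimbursement
district_receipts = net_taxpayer_bill + reimbursement
```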

Additionally, the State reimbursement for the phasing out of tangible personal property taxes (PPT) is included in this category. See discussion of these reimbursements in the "Tangible Personal Property Tax" section.

See discussion of levy and construction growth in the “General Property Tax (Real Estate)” section.


                     Actual         Forecasted
                     FY2012         FY2013         FY2014         FY2015         FY2016         FY2017
Reimbursements:
PPT Reimbursement    $2,121,202     -              -              -              -              -
Rollback/Homestead   $13,682,420    $14,873,589    $15,039,413    $15,265,290    $15,549,578    $15,889,509
Total                $15,803,622    $14,873,589    $15,039,413    $15,265,290    $15,549,578    $15,889,509

ALL OTHER REVENUES

Included in this category are various items such as tax increment financing (TIF) payments, investment income, facility rentals, pay-to-participate fees, tuition, donations, income tax sharing agreements and other miscellaneous items. A large portion of this revenue comes from tax sharing agreements with the City of Westerville and the City of Columbus. These two entities granted tax abatements to several businesses and, due to the size of the abatements, are required to share tax revenue with the District. This portion of revenue is expected to increase approximately 6% per year based on discussions with the City of Westerville finance department. Also included in Other Revenue for FY2012 is $2.1 million for the settlement from leaving the CDMU insurance consortium, a one-time payment that essentially reimburses the District for run-out claims incurred after leaving the consortium.

TIF payments make up the majority of revenue in this category at approximately $14 million of the total. The TIF district includes Bank One (Chase), the Polaris Mall and most of the commercial property along the Polaris corridor. The May 2011 levy caused an increase in TIF revenue for FY2012 and beyond. Also, the District should see some increase in TIF valuation in FY2013 and beyond due to phase two of the Chase Bank TIF expiring. A portion of this increase could be offset by a decrease in valuation caused by various other businesses filing with the County to decrease their respective valuations through the BOR process.

The District began receiving a guaranteed payment from Citicorp in the amount of $453,000 per year beginning in FY2008 as part of a 15 year CRA agreement. The District also receives $120,000 per year from the Kroger Company for a CRA agreement that expires on 12/31/2013. As TIF and CRA agreements expire, resulting in a loss of revenue in the Other Revenue line, that valuation becomes taxable and increases revenue in the Property Tax line.

                      Actual         Forecasted
                      FY2012         FY2013         FY2014         FY2015         FY2016         FY2017
TIF Tax Revenue       $12,421,771    $14,221,771    $14,592,880    $14,345,574    $14,417,302    $14,482,180
Tax Revenue Sharing   $1,588,344     $1,704,484     $1,806,753     $1,915,158     $2,030,068     $2,151,872
CRAs                  $573,000       $573,000       $573,000       $453,000       $453,000       $453,000
Other Revenue         $3,778,005     $1,821,846     $1,840,074     $1,852,768     $1,858,963     $1,866,077
Total                 $18,361,120    $18,321,101    $18,812,707    $18,566,500    $18,759,333    $18,953,129

TOTAL OTHER FINANCING SOURCES

Included in this category are operating transfers, advances-in, refunds and sales of notes.


EXPENDITURES

PERSONAL SERVICES

Based on a continued lack of growth in state funding and the District's commitment to maintain its levy promise, District employees have continued to make concessions in salaries and benefits. Expenditures for FY2012 reflect all administrators taking a voluntary pay freeze, as well as all classified union and non-union employees foregoing cost-of-living and step increases. Certified union employees (teachers) negotiated to forego cost-of-living increases in FY2012, but still received their step increase.

The certified union approved a three-year contract beginning July 1, 2012 that resulted in significant savings to the forecast. The contract calls for no step increase in FY13, saving approximately $1.8 million in that year. Base increases will be 1% in FY2013 and 0.5% in FY2014 and FY2015. The most significant savings came from changes to the health insurance plan made as part of the contract; these changes shift more of the costs to the employee and created a high-deductible option. Step increases are expected to average 2.7% in FY2014 through FY2017.

Following the approval of the certified union contract, the classified unions also approved three-year contracts beginning July 1, 2012. Base increases under these contracts will be 1% in FY2013 and 0.5% in FY2014 and FY2015. The contracts also include a one-time payment of $200 and step increases each year. The classified unions agreed to the same insurance concessions as the certified union. Administrators will have the same health insurance plan as the unions, while receiving 2% raises in FY2013 through FY2015.

The District has three unions: OTA for teachers, OAPSE for bus drivers, and OAPSE CMF for custodial, maintenance, and field service technicians.

In addition to annual raises and step increases, certified staff can increase their salaries by furthering their education (e.g., a Bachelor's degree to a Master's degree). This cost will continue over the years as the number of employees continues to grow.

The District estimates future staffing needs based on student enrollment projections and the opening of schools. (See discussion relating to growth in the “General Assumptions” section).

                     Actual         Forecasted
                     FY2012         FY2013         FY2014          FY2015          FY2016          FY2017
Base Wages           $87,618,790    $93,542,391    $96,920,093     $102,334,337    $108,103,611    $113,628,425
Pay/Merit Increase   $2,032,243     $1,481,583     $2,850,946      $3,127,409      $2,831,525      $2,976,236
Education Adv        $711,000       $700,000       $735,000        $771,750        $810,338        $850,855
New Staffing         $3,180,358     $1,196,119     $1,828,298      $1,870,115      $1,882,951      $1,486,019
Total                $93,542,391    $96,920,093    $102,334,337    $108,103,611    $113,628,425    $118,941,535

Classroom/Teacher ratio for hiring purposes is based on 25:1 at the middle school/high school level and 24:1 at the elementary level.

Note that certified staff includes special classes such as librarians, physical education, art, music, foreign language, etc. which decreases the actual classroom ratio.
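The hiring ratios can be turned into classroom-teacher counts directly; a sketch using the FY2013 enrollment figures above (this intentionally ignores the special-subject staff just noted, so actual certified staff counts are higher):

```python
import math

# Classroom-teacher counts implied by the hiring ratios (24:1 elementary,
# 25:1 middle/high school). This ignores librarians, art, music, foreign
# language, etc., so the certified staff totals in the tables above are higher.
HIRING_RATIO = {"K-5": 24, "6-8": 25, "9-12": 25}
enrollment_fy13 = {"K-5": 8_790, "6-8": 4_151, "9-12": 4_517}

classroom_teachers = {
    level: math.ceil(enrollment_fy13[level] / HIRING_RATIO[level])
    for level in enrollment_fy13
}
# e.g. K-5: ceil(8,790 / 24) = 367 classroom teachers
```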


Pupil/Teacher ratios are as follows:

         FY13 (Actual)           FY14 (Projected)        FY15 (Projected)        FY16 (Projected)        FY17 (Projected)
Grade    Staff   Enroll   Ratio  Staff   Enroll   Ratio  Staff   Enroll   Ratio  Staff   Enroll   Ratio  Staff   Enroll   Ratio
K-5      539     8,790    16.31  551     8,999    16.33  558     9,085    16.28  564     9,132    16.19  569     9,189    16.15
6-8      280     4,151    14.83  288     4,416    15.33  296     4,599    15.54  306     4,789    15.65  316     4,966    15.72
9-12     282     4,517    16.02  295     4,819    16.34  317     5,298    16.71  338     5,742    16.99  352     6,047    17.18
Total    1,101   17,458   15.86  1,134   18,234   16.08  1,171   18,982   16.21  1,208   19,663   16.28  1,237   20,202   16.33

EMPLOYEES' RETIREMENT/INSURANCE BENEFITS

Benefits include the following:

Employer pension payments to STRS/SERS equal to 14% of payroll. Participation in STRS/SERS, and the 14% rate, are governed by the Ohio Revised Code (ORC). SERS also applies a surcharge for any employee earning less than $35,600 per year: the District is charged as if such an employee were paid the surcharge amount. There is a maximum surcharge per year based on total payroll. The surcharge threshold is dictated by the SERS board; the District has no control over these rates.
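The surcharge logic can be sketched as follows (the `employer_sers_cost` helper and the sample salaries are hypothetical; the 14% rate and $35,600 floor come from the text, and the per-year maximum based on total payroll is omitted for simplicity):

```python
# Employer pension cost per the narrative: 14% of payroll, and for SERS
# members below the surcharge floor, the District is charged as if the
# employee earned the floor amount. (The annual maximum based on total
# payroll is omitted for simplicity.) Hypothetical helper for illustration.
RATE = 0.14
SERS_SURCHARGE_FLOOR = 35_600

def employer_sers_cost(salary):
    return RATE * max(salary, SERS_SURCHARGE_FLOOR)

# An employee earning $28,000 is charged to the District at the $35,600
# floor: 0.14 * 35,600 = $4,984, not 0.14 * 28,000 = $3,920.
low_salary_cost = employer_sers_cost(28_000)
high_salary_cost = employer_sers_cost(50_000)   # 0.14 * 50,000 = $7,000
```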

The District also pays pick-up on the pick-up for all administrative staff as part of their compensation package. An additional 11% on administrative salaries only is also included in this line.

Medical insurance premiums increased by 9.7% in FY2012. The District is currently fully insured after exiting the CDMU insurance consortium effective September 1, 2010. Based on high claims utilization during FY2012, the prior forecast projected a 25% increase for FY2013; however, the insurance plan negotiated by the unions beginning in FY2013 will help offset some of that increase. The treasurer also negotiated with the insurance carrier to make the changes effective 1/1/13 and waive the 25%-30% premium increase on the current plan which was to be effective 9/1/12. This has a substantial positive effect on the District’s financial position. Future increases are expected to be approximately 12% to 13% based on input from the District’s insurance broker. A board insurance task force was formed to continue to evaluate and recommend potential cost savings in this area.

A. STRS/SERS

                     Actual         Forecasted
                     FY2012         FY2013         FY2014         FY2015         FY2016         FY2017
STRS/SERS/Pickup     $13,244,435    $14,074,773    $14,547,652    $15,305,646    $16,113,344    $16,886,818
Pay/Merit Increase   $284,514       $207,422       $399,132       $437,837       $396,414       $416,673
Education Adv        $99,540        $98,000        $102,900       $108,045       $113,447       $119,120
New Staffing         $446,284       $167,457       $255,962       $261,816       $263,613       $208,043
Total                $14,074,773    $14,547,652    $15,305,646    $16,113,344    $16,886,818    $17,630,654


B. Health Insurance

                        Actual         Forecasted
                        FY2012         FY2013         FY2014         FY2015         FY2016         FY2017
Base Cost               $17,219,746    $19,198,481    $22,558,112    $26,122,101    $29,944,106    $34,262,202
Run Out Claims          $49,037        -              -              -              -              -
New Staff/Open Enroll   $1,180,974     $764,450       $765,192       $613,708       $647,146       $442,613
Total                   $18,449,757    $19,962,931    $23,323,304    $26,735,809    $30,591,252    $34,704,815

C. Other Insurances

                     Actual        Forecasted
                     FY2012        FY2013        FY2014        FY2015        FY2016        FY2017
Medicare Premiums    $1,361,164    $1,431,931    $1,483,848    $1,650,503    $1,732,613    $1,811,777
BWC/Unempl/Tuition   $756,446      $796,088      $496,088      $506,010      $516,130      $526,453
Total                $2,117,610    $2,228,019    $1,979,936    $2,156,513    $2,248,743    $2,338,230

PURCHASED SERVICES

Purchased services include various contracted services such as utilities, legal fees, insurance, professional development, and substitute teachers hired through the Educational Service Center of Central Ohio Council of Governments. Other factors include:

Utilities are assumed to increase between 6% and 8% each year, per discussion with the Business Manager.

Utility savings of approximately $100k beginning in FY2012 due to implementing four-day, ten-hour work weeks in the summer.

Utility savings of approximately $500k beginning in FY2013 due to capital improvements to gain efficiencies (e.g., occupancy sensors, air quality sensors). On August 8, 2012 the PUCO issued a modified electric security plan (ESP) for AEP to establish generation rates through May 2015. Base electric rates are frozen at this time; however, riders that will be implemented are expected to increase rates by 5% to 7% on average starting September 2012, with a 12% cap on increases. This increase will likely offset a portion of the savings noted above.

Community school costs, post-secondary education costs, and other foundation payments are assumed to increase 4% per year.

Additionally, in FY2012, the District incurred utility costs for a new elementary school and a new middle school.

                          Actual         Forecasted
                          FY2012         FY2013         FY2014         FY2015         FY2016         FY2017
Purchased Services        $4,792,479     $5,238,872     $5,358,932     $5,478,425     $5,597,006     $5,714,296
Foundation Payments       $1,405,317     $1,681,250     $1,748,500     $1,818,440     $1,891,178     $1,966,825
Utilities                 $3,650,731     $4,474,732     $4,743,216     $5,027,809     $5,329,477     $5,649,246
Utilities - New Schools   $410,000       -              -              -              -              -
Total                     $10,258,527    $11,394,854    $11,850,648    $12,324,674    $12,817,661    $13,330,367


SUPPLIES AND MATERIALS

Overall, supplies and materials are expected to increase 5% per year to keep up with growth and inflation. However, building budgets were reduced by 10% for FY2012 and 20% for FY2013, which predominantly impacts supplies and materials. Each year a school opens, there is an additional building budget, and the opening-year budget is larger than normal in order to help equip the school. The District has a curriculum plan to revisit various subjects over the next seven years, with costs of implementing new textbooks, and re-evaluates the plan annually. In addition, the state adoption of the Common Core curriculum will require the purchase of additional curriculum resources to align with the Common Core standards.

Curriculum adoptions at Olentangy Local Schools take place on a rotating basis according to the initial curriculum map adoption. During an adoption year, the following filters are used to determine the new adoption:

Student achievement data

State comparison district adoption data

Independent research on proposed adoption materials

Alignment to the Olentangy curriculum map

All projections for textbook adoptions are based on the current District adoption costs and projected using a 3% inflationary/growth adjustment for each year beyond the current pricing year. Included in all projections is a 10% shipping charge.

FY2014 Math

FY2015 Science, Social Studies, AP Spanish

FY2016 Language Arts

FY2017 Language Arts
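The projection method described above (3% annual inflation beyond the current pricing year, plus a 10% shipping charge) can be sketched as follows; the $700,000 base adoption cost is an assumed figure:

```python
# Textbook adoption cost projection per the method above: current adoption
# cost, grown 3% per year beyond the current pricing year, plus a 10%
# shipping charge. The $700,000 base cost is an assumed figure.
def projected_adoption_cost(current_cost, years_out,
                            inflation=0.03, shipping=0.10):
    inflated = current_cost * (1 + inflation) ** years_out
    return inflated * (1 + shipping)

# e.g. an adoption priced at $700,000 today, purchased two years out:
cost = projected_adoption_cost(700_000, 2)   # 700,000 * 1.03^2 * 1.10
```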

                         Actual        Forecasted
                         FY2012        FY2013        FY2014        FY2015        FY2016        FY2017
Supplies and Materials   $3,702,908    $4,260,596    $4,412,323    $4,694,364    $4,911,478    $5,154,994
New Building Budgets     $310,000      -             -             -             -             -
New Txtbk Adoptions      -             -             $801,920      $130,400      $500,000      $500,000
Total                    $4,012,908    $4,260,596    $5,214,243    $4,824,764    $5,411,478    $5,654,994

CAPITAL OUTLAY

Capital outlay consists of equipment-type items the District purchases for more than $2,000. The capital outlay projection for FY2013 is based on the FY2013 budget and is expected to increase approximately 2% each year. The majority of the District's capital outlay expenditures are paid from bond funds; therefore, capital outlay from the General fund is low compared to the size of the District.


OTHER OBJECTS

The majority of expenses in this category relate to contracted services with the Educational Service Center of Central Ohio (ESCCO), most of which are for contract services needed for special education curriculum and various other ESC costs. Additionally, the County Auditor fees for collection of taxes are included; as tax revenue increases, collection fees also increase, and the District has no control over the collection fee assessed by the County Auditor. The expected increase each year is 2%, plus additional services contracted with the ESCCO as the District's enrollment continues to grow.

The spike in FY2012 Other Objects expenditures is mainly due to the District using IDEA-B ARRA (federal stimulus monies) to offset some of the ESCCO costs in FY2011. These funds are no longer available; therefore, the costs must be put back into the General fund. ARRA expenditures were approximately $1.1 million in FY2011.

                          FY2012       FY2013       FY2014       FY2015       FY2016       FY2017
ESC Services              $4,328,354   $5,277,951   $5,846,402   $6,430,411   $7,000,296   $7,596,510
County Auditor Fees       $1,899,026   $2,106,124   $2,148,246   $2,191,211   $2,235,036   $2,279,736
Addtnl Fees and Svcs      $254,756     $245,184     $222,540     $229,216     $236,092     $243,175
Total                     $6,482,136   $7,629,259   $8,217,188   $8,850,838   $9,471,424   $10,119,421

(FY2012 actual; FY2013-FY2017 forecasted)

RISK ASSESSMENT

One of the most critical areas of risk in this forecast is the Unrestricted Grants-in-Aid, or State funding, payments. State funding was completely changed by HB1, passed by legislators in June 2009. HB1 placed the District back on a guarantee, which significantly decreased anticipated State aid just as the District was beginning to receive additional money resulting from our rapid student growth. Adding to this loss of anticipated revenue, HB153 accelerates the phase-out of TPP reimbursement to OLSD. While HB153 keeps the District on a guarantee for state foundation funding for FY2013, there is no formula in place for FY2014 or beyond. This forecast assumes our state foundation funding will remain relatively flat, but the new formula could have a significant impact on these projections. Another important provision of HB153 is the elimination of automatic step increases in favor of a merit-based pay system. Much work remains to be done in this area at the state level, and the mechanics of merit pay are yet to be determined.

Enrollment may be a key driver in the Unrestricted Grants-in-Aid revenue area or State foundation payments. New growth will impact not only personnel/benefits, but also operating costs associated with opening new buildings.

Another area of risk in this forecast is property valuations, given the current economic climate. The District went through the reappraisal process for calendar year 2011, which resulted in a 6% decrease in property values. See the Property Tax Note for the effect on current revenue; however, this will also have a significant effect on future revenues. The amount of revenue generated by each mill of a future levy will decrease due to that mill being applied to a lower anticipated tax base.
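As a rough illustration of the levy point above: in Ohio, one mill raises $1 per $1,000 of taxable (assessed) value, so a lower tax base directly reduces what each mill of a future levy yields. The $3.0 billion base below is a hypothetical figure for illustration only, not the District's actual valuation.

```python
def revenue_per_mill(taxable_value):
    """One mill raises $1 per $1,000 of taxable (assessed) value."""
    return taxable_value * 0.001

# Hypothetical taxable base before the 2011 reappraisal:
base = 3_000_000_000
before = revenue_per_mill(base)              # $3,000,000 raised per mill
after = revenue_per_mill(base * (1 - 0.06))  # after a 6% valuation decrease
# Each mill of a future levy would raise $180,000 less on this base.
```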


Also, due to the current financial crisis, there is a risk of increased delinquencies in tax collections, as well as the risk that collections recover more slowly than this forecast anticipates. The economy has also caused an increase in the number of residential and commercial Board of Revision cases, which could further decrease our tax base.

As in prior years, there is risk to the medical insurance premiums. As stated in the notes, the District separated from the healthcare consortium to which it belonged, with the goal of long term savings and more control over the level of coverage. However, based on claims experience, these premiums can fluctuate rather significantly from year to year. Constant attention will be paid to healthcare premiums.

Utility costs are also a risk factor, depending on weather conditions and cost increases from year to year. Ohio experienced one of the warmest winters in recorded history in FY2012, which led to significant savings in utilities. This forecast assumes FY2012 weather conditions were an anomaly and that FY2013 utility costs and beyond will be more in line with historical trends. There is also a significant risk related to electricity costs. On August 8, 2012, the PUCO issued a modified electric security plan (ESP) for AEP to establish generation rates through May 2015. Base electric rates are frozen at this time; however, riders to be implemented are expected to increase rates on average by 5% to 7% starting September 2012, with a 12% cap on increases.

The rising cost of fuel in the transportation budget is a risk factor for the Supplies and Materials line. Fuel prices have been increasing much faster than anticipated. Fuel expenditures were significantly over budget in FY2012, causing a projected increase throughout the forecast.

Reimbursements for State and Federal mandates continue to be at risk. For example, the loss of ARRA funding from FY2010 and FY2011 added over $3 million in annual costs to the District’s General fund. State grants have been cut altogether and cuts to our Federal grants could be a further burden on the General fund. Other current and future mandates and reimbursements will continue to be susceptible to state and federal budget cuts.

HB136 is pending legislative approval, which could have a negative financial impact on the District. This legislation would offer vouchers to students in every Ohio school district, regardless of the district’s academic performance. The only qualifier for eligibility is household income. This legislation would also allow students currently enrolled in private schools to be eligible for the voucher. Funding for these vouchers would be deducted from public school districts’ state funding.

Elementary #16 has been pushed out of the forecast in an effort to reduce operating costs, as well as the associated construction costs that would be expended out of bond funds. The timing of future buildings is a risk factor to the forecast due to the operating costs a new building adds. The 2020 committee is currently looking at ways to continue to offer the same level of education, while reducing the need for new buildings such as a potential fourth high school.

On November 3, 2009, Ohio voters passed the Ohio casino ballot issue, which allowed for the opening of four (4) casinos, one each in Cleveland, Toledo, Columbus and Cincinnati. The casinos in Cleveland and Toledo opened in spring 2012, generating $6,718,000 in revenue for the School District Fund as of June 30, 2012. The Columbus and Cincinnati casinos are slated to open in fall 2012 and spring 2013, respectively. Thirty-three percent (33%) of gross casino revenue will be collected as a tax, and school districts will receive 34% of that total tax, paid into a student fund at the state level. These funds will be distributed to school districts in January and August of each year, beginning January 2013.


Projections of $36 to $40 million in the School District Fund through December 2012 suggest that districts might expect to receive $18 to $22 per pupil in FY13, with the possibility of more than $80 per pupil by FY15, based on Ohio Department of Taxation (ODT) revenue assumptions of approximately $643 million in casino revenue once full collections from all four (4) casinos are realized. If this holds true, this could add approximately $367,000 in additional revenue in the first year.

No official ODT guidance has been given as to the exact amount of these payments. It is a safe assumption that the District will receive casino revenue in FY13; however, the amount and funding beyond FY13 is uncertain. Due to the lack of guidance from the State, Olentangy Local School District has not included casino revenue at this time. Once more guidance is given from the ODT, the District will project these revenues accordingly.
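A quick sanity check of the per-pupil figures above, assuming roughly 17,000 students (the enrollment figure cited elsewhere in this report):

```python
# First-year casino revenue implied by the projected $18-$22 per-pupil range.
enrollment = 17_000
low, high = 18 * enrollment, 22 * enrollment   # $306,000 to $374,000
# The ~$367,000 first-year estimate falls within this range.
```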


ENHANCING ACADEMIC DELIVERY & IMPROVING COLLEGE & CAREER READINESS

Project 2020 High School Scheduling Presentation


ADDITIONAL CAPACITY AT EACH HIGH SCHOOL

! AVAILABLE CLASSROOM SPACE
! 65 Core Area Teachers Assigned to a Classroom
! Per Contractual Requirements, Each Teacher Has Unassigned Class Periods Per Day for Planning and Collaboration; There Are Approximately 65 Rooms Open Two Periods Per Day
! 130 Open Class Spaces Times 25 Seats Equals 3,250 Seats
! 3,250 Seats Divided by 6 Periods Equals Approximately 540 Students per High School
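The seat arithmetic above can be verified with a short sketch (the exact quotient is 541, which the slide rounds to roughly 540):

```python
# Capacity gained from unassigned classroom periods, per high school.
rooms = 65
open_periods_per_room = 2
seats_per_room = 25
periods_per_student = 6

open_spaces = rooms * open_periods_per_room          # 130 open class spaces
seat_slots = open_spaces * seats_per_room            # 3,250 seats
added_students = seat_slots // periods_per_student   # 541, cited as ~540
```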


ADDITIONAL CAPACITY AT EACH HIGH SCHOOL

! FLEX SCHEDULE TO ADD TWO PERIODS TO SCHOOL DAY AND ALLOW LATE ARRIVAL / EARLY DISMISSAL
! Students Needing Transportation Start at 7:20 a.m. and End at 2:35 p.m.
! Self-Transporting Students Start at 9:05 a.m. and End at 4:15 p.m.
! Adds Seats for an Additional 540 Students per High School, Similar to Previous Example


ADDITIONAL CAPACITY AT EACH HIGH SCHOOL

! REPLACE FIVE COMPUTER LABS WITH WIRELESS COMPUTER CARTS
! Five Labs Times Eight Periods Per Day Equals 40 Class Periods
! 40 Class Periods Times 25 Students Equals 1,000 Class Seats
! 1,000 Class Seats Divided by Six Class Periods Per Student Equals an Additional 167 Students
! Similar to What Has Been Done at Elementary Schools to Increase Capacity
! Minimal Costs for This Conversion
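The lab-conversion arithmetic works out as follows:

```python
# Capacity gained by converting computer labs to wireless cart classrooms.
labs = 5
periods_per_day = 8
seats_per_class = 25
periods_per_student = 6

class_periods = labs * periods_per_day                  # 40 class periods
class_seats = class_periods * seats_per_class           # 1,000 class seats
added_students = round(class_seats / periods_per_student)  # ~167 students
```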


ADDITIONAL CAPACITY OF EACH HIGH SCHOOL

! Design Capacity Equals 1,600 Students
! Additional Capacity Option #1 Equals Approximately 500 Students
! Additional Capacity Option #2 Equals Approximately 500 Students
! Additional Capacity Option #3 Equals Approximately 167 Students
! Potential Capacity for Greater Than 2,000 Students


ADDITIONAL CAPACITY OF EACH HIGH SCHOOL

! Common Space Considerations
! Lunch Commons
! Open Lunch for Juniors & Seniors
! Elective Space Considerations
! Art, Music, Physical Education, Technology, & Family/Consumer Sciences
! Extra-Curricular Considerations
! Perfect Scheduling Scenario
! Negotiated Agreement
! What Does an Instructional Day Look Like in This School?


ADDITIONAL ALTERNATIVES & CONSIDERATIONS

! INSTRUCTIONAL DELIVERY MODELS
! OFF-SITE PROGRAMMING
! EXAMINATION OF GRADUATION REQUIREMENTS, COURSE OFFERINGS & CURRENT SCHEDULE


ON-LINE & BLENDED LEARNING

! Students Currently Enrolled in On-Line Coursework Through APEX
! Currently Developing the Olentangy Curriculum to be Delivered On-Line Through the O2A Project
! Priority to Train Teachers to Use Blended Learning to “Flip” Their Classes
! Research Indicating Improved Achievement
! Lecture/Recitation Approach Like College Experience
! Allows for Additional Flexible Use of Space, Thereby Potentially Increasing Capacity of Schools


INTRA-DISTRICT DISTANCE LEARNING

! Technology Upgrades Will Allow Low-Enrollment Courses to be Offered in One Location in the District
! Can Also be Used for Larger-Scale Blended Learning Delivery
! Lecture/Recitation Model Delivered Throughout Three Schools


OFF-SITE PROGRAMMING

! Junior/Senior College Credit Academy
! Goal to Have 100-300 Students Participating Within Two Years
! Currently Aligning Twelve Courses With CSCC/OSU That Have Guaranteed Transfer Credit to OSU & Ohio Public Colleges
! English Composition, English Literature, Spanish, Pre-Calculus, Calculus, Statistics, Biology, Chemistry, Physics, Psychology, Sociology, & Accounting
! Students Will be Able to Choose From a Variety of Courses
! Taught by Olentangy Instructors at an Alternative Location
! Discussing Dual AP/College Credit to Benefit Students Considering Going Out-of-State for College


POTENTIAL MODIFICATIONS TO OUR CURRENT REQUIREMENTS

! Will Allow For More Student Choice and Flexibility in Junior/Senior Year

! Still Maintain 22 Credits for Graduation
! All of These Topics Can be Taught in Other Curricular Areas
! Additional Half-Credit of Social Studies
! Half-Credit of Post-Secondary Planning/Personal Finance
! Half-Credit Speech Requirement
! Consider a Required On-Line or Blended Learning Class


EXAMINE LOCAL GRADUATION REQUIREMENTS

7th Grade (1.0 credit):
Algebra 1 (1.0)

8th Grade (2.0 credits):
Foreign Language 1 (1.0); Algebra 1 or Hon. Geometry (1.0)

9th Grade (7.75 credits):
CP English 9 (1.0); Geometry or Hon. Algebra 2 (1.0); Physical Science (1.0); World History (1.0); For. Lang. 1 or For. Lang. 2 (1.0); Health (0.5); Physical Ed. (0.25); Fine Art (1.0); Elective (1.0)

10th Grade (7.75 credits):
CP English 10 (1.0); Alg. 2 or Pre-Calculus (1.0); Biology (1.0); History (1.0); For. Lang. 2 or For. Lang. 3 (1.0); Speech (0.5); Physical Ed. (0.25); Elective (1.0); Elective (1.0)

11th Grade (8.0 credits):
CP English 11 or AP Literature (1.0); Pre-Calculus or AP Calculus AB (1.0); Chemistry or AP Science (1.0); SS Elective (1.0); SS Elective (0.5); PSPPF (0.5); Elective (1.0); Elective (1.0); Elective (1.0)

12th Grade (8.0 credits):
CP English 12 or AP Composition (1.0); Math Elective, AP Calculus BC, or AP Statistics (1.0); Science Elective, Physics, or AP Science (1.0); American Govt. (0.5); Elective (0.5); Elective (1.0); Elective (1.0); Elective (1.0); Elective (1.0)

Running Total Credits: 1.0 (7th); 1.0-3.0 (8th); 8.75-10.75 (9th); 16.5-18.5 (10th); 24.5-26.5 (11th); 32.5-34.5 (12th)


ACADEMICALLY RIGOROUS & FISCALLY RESPONSIBLE

! Sustainability of the Current Schedule for All Grades
! Study Hall Enrollment: Continued Availability?
! Open Campus Lunch for Juniors & Seniors
! Shifting the Paradigm That Olentangy Has 1,600-Student High Schools
! No Need for High School Re-Districting!


THANK YOU!

! Many Options to Consider
! Most Have Little or No Additional Cost
! Many Will Improve Operational Efficiency
! Students Will Have Increased College and Career Readiness


2020 DATA COLLECTION

ACHIEVEMENT AND ACCOUNTABILITY, PUPIL SERVICES, TECHNOLOGY

DATA COLLECTED THROUGH DECEMBER 2011

Olentangy


TABLE OF CONTENTS

Task: Analyze the opportunities for community service (real-world experience) for school credit

Task: Report on current pilots for online courses, blended learning opportunities, and build versus buy options

Task: Analyze opportunities for a blended approach with other educational institutions and alternative configuration of school days/school year

Research


DEFINITIONS

It is important to define the options that will be presented in this report. This report will reference both online and blended options. The following definitions from the Hanover Research K-12 Online and Blended Classes study explain the difference in approaches.

! Full-time online programs: Draw students from across multiple districts, and often an entire state.

! Supplemental online programs: A supplemental online program generally takes place within the structure of a degree-granting high school. Students select online courses for their mainstream, credit recovery, specialized, or AP courses.

! Blended courses: Combine two delivery modes of instruction, online and face-to-face. Instruction involves increased interaction between student and instructor, student and content, and student and student.

! Content providers: Content providers (or providers) are generally other schools (charter, public, or private), educational services companies, or state agencies. They almost always charge a fee, but there are different pricing models; some charge per class enrollment, whereas others charge a subscription fee.

TASK: ANALYZE THE OPPORTUNITIES FOR COMMUNITY SERVICE (REAL-WORLD EXPERIENCE) FOR SCHOOL CREDIT

Previously, Ohio legislation required Service Learning as part of the graduation requirement for all students. At this time, there is no ODE graduation requirement for service hours. Currently, each of our high schools, 4 middle schools, and 4 elementary schools (11 of 23 buildings total) have a student council or service-learning club. These clubs provide opportunities for approximately 1,000 students to be involved in various community service projects throughout the year. The projects include volunteering, collecting items for donation, and fundraising for local and national community organizations.

Currently, the district does not spend any money on community service clubs. Teachers volunteer their time to advise these groups without a stipend. High school principals have proposed reallocating supplemental contracts from some groups with fewer students toward service clubs. This would require no additional costs but would need to be approved by the supplemental contract committee.

Ohio Revised Code 3313.605 states that districts may give high school credit for a community service course. Olentangy would be required to create a course in which half the class is devoted to direct classroom instruction in service learning and volunteerism, and the other half to engagement in service activities. In order to implement this course for all students, each building would need about 2.6 FTEs. This would cost the district approximately $156,000 per building, per year. The number of FTEs and the cost would decrease if the course were an elective or a one-semester course for half credit. However, any additional course or elective may have no additional costs if another elective is eliminated.

While service learning is a wonderful opportunity for students and our community, giving students credit would not be financially beneficial for the district. A large number of students already participate in and benefit from the current structure, which has a minimal financial impact on the district and a largely positive impact on our surrounding community.
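The staffing estimate above implies an average cost of about $60,000 per FTE. The district-wide figure below assumes the course would run at the three high schools, which the report does not state explicitly:

```python
# Implied cost of the community-service course estimate above.
cost_per_building = 156_000
ftes_per_building = 2.6
cost_per_fte = cost_per_building / ftes_per_building  # ~$60,000 per FTE

# Illustrative district-wide total, assuming three high schools:
district_cost = cost_per_building * 3                 # $468,000 per year
```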


TASK: REPORT ON CURRENT PILOTS FOR ONLINE COURSES, BLENDED LEARNING OPPORTUNITIES, AND BUILD VERSUS BUY OPTIONS

Olentangy is currently piloting online courses in three different ways. The district is using Apex Learning, Passkey, and online courses that are built by Olentangy teachers in either 1st class Ed or Schoology. The data below give an idea of the courses and the methods being used:

Apex Learning-

Olentangy has purchased one hundred seats in Apex Learning in each of the last two years. These seats allow students to rotate through as spaces become available; each time a student completes a course, that seat becomes available for another student. The Olentangy Academy: Supporting Individualized Success (OASIS) program has been using Apex to support many of its students. There are currently sixty-five students enrolled in the OASIS program. Each high school has also used Apex to support accelerating students beyond their current curriculum.

Apex is available at $157.50 per seat (200 seats, plus $2,200 for professional development). This has been funded through proceeds collected from summer school offerings.

http://www.apexlearning.com/
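The Apex purchase described above works out as follows (treating the $2,200 professional development fee as a one-time, district-wide charge, which is an assumption):

```python
# Cost of the Apex Learning seat purchase.
seats = 200
per_seat = 157.50
professional_development = 2_200

license_cost = seats * per_seat                          # $31,500
total_program_cost = license_cost + professional_development  # $33,700
# Because seats recycle as students finish courses, the effective
# per-enrollment cost falls as more students rotate through each seat.
```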

Course Usage (Apex Learning)

Columns: Enrolled Students / Active Enrollments / Enrollments Progressing / Avg QOW / Completed Students / Completed Enrollments

OASIS
  Advanced Placement   English          2   2   0   0%   0   0
                       Math             0   0   0   0%   0   0
                       Science          3   3   0   0%   0   0
                       Social Studies   3   4   0   0%   0   0
                       World Language   0   0   0   0%   0   0
  Core                 Electives       32  49   0   0%  32  34
                       English         29  32   0   0%   6   7
                       Math            41  47   0   0%   0   0
                       Science         22  23   0   0%   2   2
                       Social Studies  46  53   0   0%   5   5
                       World Language  16  16   0   0%   0   0
  Foundations          English         15  19   0   0%   0   0
                       Math             3   4   0   0%   0   0
                       Science          9   9   0   0%   0   0
  Honors               English          4   4   0   0%   1   1
                       Math             5   5   0   0%   0   0
                       Science          2   2   0   0%   0   0
                       Social Studies   1   1   0   0%   0   0
                       World Language   0   0   0   0%   0   0
  Literacy Advantage   English         16  17   0   0%   1   1
                       Math            14  15   0   0%   0   0
                       Science         11  12   0   0%   1   1
                       Social Studies   6   6   0   0%   0   0

Olentangy High School
  Advanced Placement   English          1   2   0   0%   0   0
                       Science          1   1   0   0%   0   0
                       Social Studies   2   4   0   0%   0   0
                       World Language   1   1   0   0%   0   0
  Core                 Electives        1   1   0   0%   0   0
                       English          5   9   0   0%   0   0
                       Math             6   6   0   0%  12  12
                       Science          6   6   0   0%   0   0
                       Social Studies   5   7   0   0%   1   1
                       World Language   0   0   0   0%   0   0
  Foundations          English          1   2   0   0%   0   0
                       Math             0   0   0   0%   0   0
  Honors               Math             0   0   0   0%  22  22
                       Social Studies   0   0   0   0%   0   0
                       World Language   1   2   0   0%   0   0
  Literacy Advantage   English          1   3   0   0%   0   0

Olentangy Liberty High School
  Core                 Electives        6   6   0   0%   1   1
                       English          2   2   0   0%   0   0
                       Math             4   4   0   0%   0   0
                       Science          1   1   0   0%   0   0
                       Social Studies   6   7   0   0%  19  20
                       World Language   0   0   0   0%   0   0
  Foundations          English          0   0   0   0%   1   1
                       Math             0   0   0   0%   0   0
  Honors               Math             3   3   0   0%   1   1

Olentangy Orange High School
  Advanced Placement   Science          0   0   0   0%   0   0
                       Social Studies   0   0   0   0%   0   0
  Core                 Electives        0   0   0   0%   5   5
                       English          3   3   0   0%   1   1
                       Math             0   0   0   0%   1   1
                       Science          2   2   0   0%   0   0
                       Social Studies   8   9   0   0%   1   1
                       World Language   0   0   0   0%   0   0
  Foundations          English          2   3   0   0%   0   0
                       Math             3   3   0   0%   0   0
  Honors               Math             0   0   0   0%   0   0
  Literacy Advantage   English          0   0   0   0%   1   1

Passkey –

Passkey has been used for the past three years to support credit recovery. It is known to be a lesser-quality option, but it is much more affordable and provides a good option for students who are behind in credits. Seats in Passkey work in a very similar fashion, with the ability to rotate students through the seats. Transcripts with a course completed through Passkey will include a *cr notation with the course to denote that it is a credit recovery option.

The district has purchased 100 seats in Passkey at $40 a seat in each of the last three years. The district recently added 10 additional seats to bring the total to 110.

http://www.passkeylearning.com/

Olentangy developed courses –

Olentangy is currently offering 3 middle school electives, post-secondary finance, and physical education as online courses developed by Olentangy teachers. These courses are being delivered through the district’s learning management system, Schoology. These pilots are in their 2nd year, and the number of students registering for these courses increased in the 2011-2012 school year.


In addition, there are currently 45 teachers involved in the Olentangy Online Academy (O2A). This academy of teachers is working to move courses into a completely online environment as well as supporting a flipped classroom.

The district has become a member of the CampusEAI Consortium in order to combine access to multiple learning tools in one portal. This portal (myCampus) will give students at all grade levels access to the learning management system (Schoology) as well as their gradebook (PowerSchool).

The myCampus platform will support all 17,000 students at a price of $2.94 per user. As Olentangy’s student population increases, the cost of the myCampus platform will remain the same, causing the price per pupil to decrease.
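The myCampus pricing works out as below; the 20,000-student figure is a hypothetical future enrollment used only to illustrate the declining per-pupil cost:

```python
# myCampus: a fixed total cost based on today's enrollment, so the
# per-pupil cost falls as enrollment grows.
current_enrollment = 17_000
per_user = 2.94
total_cost = current_enrollment * per_user         # ~$49,980 per year

future_enrollment = 20_000                         # hypothetical growth
per_pupil_later = total_cost / future_enrollment   # ~$2.50 per pupil
```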


What is the flipped classroom?

The flipped classroom inverts traditional teaching methods, delivering instruction online outside of class and moving “homework” into the classroom.

Knewton infographic: http://www.knewton.com/flipped-classroom/


TASK: ANALYZE OPPORTUNITIES FOR A BLENDED APPROACH WITH OTHER EDUCATIONAL INSTITUTIONS AND ALTERNATIVE CONFIGURATION OF SCHOOL DAYS/SCHOOL YEAR

Olentangy will offer a dual-enrollment program next year that will be housed at the Columbus State Community College Delaware Campus. The coursework will be offered sequentially in the mornings. All college-ready juniors and seniors will be eligible to participate. College readiness is determined by a student’s performance on the ACT or PLAN.

Olentangy teachers will provide course instruction as adjunct professors at Columbus State. To become an adjunct, a teacher needs a master’s degree in the content area he or she is teaching, or must have taken master’s coursework in that content area and have a plan to complete the master’s degree within three years. These teachers will teach three or four sections of their class at Columbus State in the morning, then return to their home high school to finish the day.

The intent of the program is to offer AP courses as dual-enrollment courses. All courses will align with general requirements at Ohio public universities, and credit will be transferable, so that students can earn up to one full year of college credit during their junior and senior years.

In the first year (2012-13) the program will serve approximately 100 students. As the Delaware campus expands, the program will have the potential to offer more courses to more students, which will help to alleviate space concerns at each high school.

Columbus State will not charge any fees in the first year. In future years, fees of $50 to $100 per student may be required. As long as Olentangy teachers are teaching the courses, Columbus State costs are minimal.

2011-2012 Pre-Implementation Plan


1. Identify OLSD teachers who are qualified to teach dual enrollment courses

2. Determine preliminary course offerings and evaluate impact on home high schools if those courses and teachers were (partially) moved off site
   • Courses to be taught must be Ohio TAG courses (transferable credit to any state public college or university)
   • High preference for AP courses
   • Courses must be easily applied as a general requirement for a majority of programs at any state college or university

3. Make official selection of courses to be offered in 2012-13 (4-12 courses)

4. OLSD and CSCC instructors collaborate over course content

5. AP syllabi submitted to College Board as needed

6. OLSD guidance counselors and CSCC academic advisors collaborate

7. Market dual enrollment offerings to parents and students

8. Schedule students for 2012-13

We have also analyzed enrollment data to look for opportunities within the AP classes that are offered. The items highlighted in yellow below appear to provide some opportunities:

Term   Course                                       Student_N
11-12  02510-AP Art History                         38
11-12  05510-AP English (Literature & Composition)  303
11-12  05520-AP English (Language & Composition)    271
11-12  06510-AP Spanish                             75
11-12  06520-AP German                              4
11-12  11510-AP Calculus AB                         195
11-12  11520-AP Calculus BC                         88
11-12  11530-AP Statistics                          239
11-12  11540-AP Computer Science                    27
11-12  12510-AP Music Theory                        23
11-12  13510-AP Biology                             124  lab??
11-12  13520-AP Chemistry                           131  lab??
11-12  13530-AP Physics                             93   lab??
11-12  13540-AP Environmental Science               84   lab??
11-12  15510-AP U.S. Government/Politics            427
11-12  15520-AP U.S. History                        204
11-12  15530-AP European History                    93
11-12  15560-AP Psychology                          534
S1     15540-AP Macro Economics                     125
S1     15550-AP Micro Economics                     107
S2     15540-AP Macro Economics                     107
S2     15550-AP Micro Economics                     128


RESEARCH

The addendum includes two research studies on online learning. The first is the U.S. Department of Education study titled Evaluation of Evidence-Based Practices in Online Learning. The second is a Hanover Research study titled K-12 Online and Blended Classes.

Key highlights from the Evaluation of Evidence-Based Practices in Online Learning:

• Online learning—for students and for teachers—is one of the fastest growing trends in educational uses of technology. The National Center for Education Statistics (2008) estimated that the number of K-12 public school students enrolling in a technology-based distance education course grew by 65 percent in the two years from 2002-03 to 2004-05. On the basis of a more recent district survey, Picciano and Seaman (2009) estimated that more than a million K-12 students took online courses in school year 2007-08.

• These activities were undertaken to address four research questions:

1. How does the effectiveness of online learning compare with that of face-to-face instruction?

2. Does supplementing face-to-face instruction with online instruction enhance learning?

3. What practices are associated with more effective online learning?

4. What conditions influence the effectiveness of online learning?

• Students in online conditions performed modestly better, on average, than those learning the same material through traditional face-to-face instruction. Learning outcomes for students who engaged in online learning exceeded those of students receiving face-to-face instruction, with an average effect size of +0.20 favoring online conditions. The mean difference between online and face-to-face conditions across the 50 contrasts is statistically significant at the p < .001 level. Interpretations of this result, however, should take into consideration the fact that online and face-to-face conditions generally differed on multiple dimensions, including the amount of time that learners spent on task. The advantages observed for online learning conditions therefore may be the product of aspects of those treatment conditions other than the instructional delivery medium per se.

• Instruction combining online and face-to-face elements had a larger advantage relative to purely face-to-face instruction than did purely online instruction. The mean effect size in studies comparing blended with face-to-face instruction was +0.35, p < .001. This effect size is larger than that for studies comparing purely online and purely face-to-face conditions, which had an average effect size of +0.05, p = .46. In fact, the learning outcomes for students in purely online conditions and those for students in purely face-to-face conditions were statistically equivalent. An important issue to keep in mind in reviewing these findings is that many studies did not attempt to equate (a) all the curriculum materials, (b) aspects of pedagogy and (c) learning time in the treatment and control conditions. Indeed, some authors asserted that it would be impossible to have done so. Hence, the observed advantage for blended learning conditions is not necessarily rooted in the media used per se and may reflect differences in content, pedagogy and learning time.

• Effect sizes were larger for studies in which the online instruction was collaborative or instructor-directed than in those studies where online learners worked independently. The type of learning experience moderated the size of the online learning effect (Q = 6.19, p < .05). The mean effect sizes for collaborative instruction (+0.25) and for instructor-directed instruction (+0.39) were significantly positive, whereas the mean effect size for independent learning (+0.05) was not.

• Most of the variations in the way in which different studies implemented online learning did not affect student learning outcomes significantly. Analysts examined 13 online learning practices as potential sources of variation in the effectiveness of online learning compared with face-to-face instruction. Of those variables, the two mentioned above (i.e., the use of a blended rather than a purely online approach and instructor-directed or collaborative rather than independent, self-directed instruction) were the only statistically significant influences on effectiveness. The other 11 online learning practice variables that were analyzed did not affect student learning significantly. However, the relatively small number of studies contrasting learning outcomes for online and face-to-face instruction that included information about any specific aspect of implementation impeded efforts to identify online instructional practices that affect learning outcomes.

•The effectiveness of online learning approaches appears quite broad across

different content and learner types. Online learning appeared to be an effective option for

both undergraduates (mean effect of +0.30, p < .001) and for graduate students and professionals

(+0.10, p < .05) in a wide range of academic and professional studies. Though positive, the mean

effect size is not significant for the seven contrasts involving K–12 students, but the number of K–12

studies is too small to warrant much confidence in the mean effect estimate for this learner group.

Three of the K–12 studies had significant effects favoring a blended learning condition, one had a

significant negative effect favoring face-to-face instruction, and three contrasts did not attain statistical

significance. The test for learner type as a moderator variable was nonsignificant. No significant

differences in effectiveness were found that related to the subject of instruction.

•Effect sizes were larger for studies in which the online and face-to-face

conditions varied in terms of curriculum materials and aspects of instructional

approach in addition to the medium of instruction. Analysts examined the characteristics

of the studies in the meta-analysis to ascertain whether features of the studies’ methodologies could

account for obtained effects. Six methodological variables were tested as potential moderators: (a)

sample size, (b) type of knowledge tested, (c) strength of study design, (d) unit of assignment to

condition, (e) instructor equivalence across conditions, and (f) equivalence of curriculum and

instructional approach across conditions. Only equivalence of curriculum and instruction emerged as a

significant moderator variable (Q = 6.85, p < .01). Studies in which analysts judged the curriculum and

instruction to be identical or almost identical in online and face-to-face conditions had smaller effects

than those studies where the two conditions varied in terms of multiple aspects of instruction (+0.13

compared with +0.40, respectively). Instruction could differ in terms of the way activities were

organized (for example as group work in one condition and independent work in another) or in the

inclusion of instructional resources (such as a simulation or instructor lectures) in one condition but

not the other.

Key highlights from the Hanover Research K-12 and Blended Classes Study:

Benefits to students from hybrid and online classes include:

! Struggling students can take credit recovery courses online, and advanced or gifted students can avail themselves of enriched curricula that may otherwise be unavailable.

! Online education’s ability to meet diverse student needs is especially relevant for small, rural, or inner-city schools, as online options can go well beyond what schools with limited resources can offer.

! For students who have not had ready access to a range of current computing and communications devices, online courses teach technological skills as they teach academic subject matter.

! As a practical, cost-effective, and flexible way to enrich a school’s curriculum, online education may be capable of transforming small, underserved schools into ones with nearly unlimited resources.

! Students particularly well-suited to online learning include those requiring flexible learning schedules, credit recovery students, independent and gifted learners, and students with disabilities.

Other points of interest include:

! For supplemental online programs that will be located on campus, identifying a site coordinator from the earliest stages of the project will likely be necessary.

! Teacher buy-in and training go hand in hand: surveys show that teachers are more likely to use teaching technologies when they receive adequate support and training.

! While the diversity of online classes makes comparative and aggregate analyses difficult, preliminary research suggests that online teaching methods are effective.

! In the best online courses, students have the opportunity to interact with instructors on a personalized level, one that supports them through the specific challenges they face.

! Laws regarding state funding of online education vary widely by state and may undergo frequent changes as the landscape of virtual education is continually transformed. While students enrolled in full-time online programs usually generate per-pupil revenue, supplemental or part-time programs generally do not.

! For most schools that are new to online education, partnering with a provider is likely to be the most efficient means to achieve their objectives.

! In many cases, schools will already have the basic hardware and infrastructure necessary for online or hybrid programs. The main investments will be directly tied to supporting the unique technologies of the program.

Evaluation of Evidence-Based Practices in Online Learning

A Meta-Analysis and Review of Online Learning Studies

Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and

Review of Online Learning Studies

U.S. Department of Education

Office of Planning, Evaluation, and Policy Development

Policy and Program Studies Service

Revised September 2010

Prepared by

Barbara Means

Yukie Toyama

Robert Murphy

Marianne Bakia

Karla Jones

Center for Technology in Learning

This report was prepared for the U.S. Department of Education under Contract number ED-04-CO-0040 Task 0006 with SRI International. Bernadette Adams Yates served as the project

manager. The views expressed herein do not necessarily represent the positions or policies of the

Department of Education. No official endorsement by the U.S. Department of Education is

intended or should be inferred.

U.S. Department of Education

Arne Duncan

Secretary

Office of Planning, Evaluation and Policy Development

Carmel Martin

Assistant Secretary

Policy and Program Studies Service

Alan Ginsburg

Director

Office of Educational Technology

Karen Cator

Director

September 2010

This report is in the public domain. Authorization to reproduce this report in whole or in part is granted.

Although permission to reprint this publication is not necessary, the suggested citation is: U.S.

Department of Education, Office of Planning, Evaluation, and Policy Development, Evaluation of

Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies,

Washington, D.C., 2010.

This report is also available on the Department’s Web site at

www.ed.gov/about/offices/list/opepd/ppss/reports.html.

On request, this publication is available in alternate formats, such as braille, large print, or computer

diskette. For more information, please contact the Department’s Alternate Format Center at

(202) 260-0852 or (202) 260-0818.

Contents

EXHIBITS ...................................................................................................................................................................... V

ACKNOWLEDGMENTS ................................................................................................................................................ VII

ABSTRACT ................................................................................................................................................................... IX

EXECUTIVE SUMMARY ............................................................................................................................................... XI

Literature Search .................................................................................................................................................... xii
Meta-Analysis ....................................................................................................................................................... xiii
Narrative Synthesis ................................................................................................................................................ xiv
Key Findings .......................................................................................................................................................... xiv
Conclusions ......................................................................................................................................................... xviii

1. INTRODUCTION ......................................................................................................................................................... 1

Context for the Meta-analysis and Literature Review .............................................................................................. 2
Conceptual Framework for Online Learning ............................................................................................................ 3
Findings From Prior Meta-Analyses ......................................................................................................................... 6
Structure of the Report .............................................................................................................................................. 7

2. METHODOLOGY ........................................................................................................................................................ 9

Definition of Online Learning .................................................................................................................................. 9
Data Sources and Search Strategies ........................................................................................................................ 10
Electronic Database Searches ................................................................................................................................. 10
Additional Search Activities ................................................................................................................................... 10
Screening Process ................................................................................................................................................... 11
Effect Size Extraction ............................................................................................................................................. 13
Coding of Study Features ....................................................................................................................................... 14
Data Analysis .......................................................................................................................................................... 15

3. FINDINGS ................................................................................................................................................................. 17

Nature of the Studies in the Meta-Analysis ............................................................................................................ 17
Main Effects ............................................................................................................................................................ 18
Test for Homogeneity ............................................................................................................................................. 27
Analyses of Moderator Variables ........................................................................................................................... 27
Practice Variables ................................................................................................................................................... 28
Condition Variables ................................................................................................................................................ 30
Methods Variables .................................................................................................................................................. 31

4. NARRATIVE SYNTHESIS OF STUDIES COMPARING VARIANTS OF ONLINE LEARNING ........................................ 37

Blended Compared With Pure Online Learning ..................................................................................................... 38
Media Elements ...................................................................................................................................................... 40
Learning Experience Type ...................................................................................................................................... 41
Computer-Based Instruction ................................................................................................................................... 43
Supports for Learner Reflection .............................................................................................................................. 44
Moderating Online Groups ..................................................................................................................................... 46
Scripts for Online Interaction .................................................................................................................................. 46
Delivery Platform ................................................................................................................................................... 47
Summary ................................................................................................................................................................. 48

5. DISCUSSION AND IMPLICATIONS ............................................................................................................................ 51

Comparison With Meta-Analyses of Distance Learning ........................................................................................ 52
Implications for K–12 Education ............................................................................................................................ 53

REFERENCES ............................................................................................................................................................... 55

Reference Key ........................................................................................................................................................ 55

APPENDIX META-ANALYSIS METHODOLOGY ....................................................................................................... A-1

Terms and Processes Used in the Database Searches .......................................................................................... A-1
Additional Sources of Articles ............................................................................................................................. A-3
Effect Size Extraction .......................................................................................................................................... A-4
Coding of Study Features .................................................................................................................................... A-5

Exhibits

Exhibit 1. Conceptual Framework for Online Learning ................................................................................................ 5

Exhibit 2. Bases for Excluding Studies During the Full-Text Screening Process ....................................................... 13

Exhibit 3. Effect Sizes for Contrasts in the Meta-Analysis ......................................................................................... 20

Exhibit 4a. Purely Online Versus Face-to-Face (Category 1) Studies Included in the Meta-Analysis ........................ 21

Exhibit 4b. Blended Versus Face-to-Face (Category 2) Studies Included in the Meta-Analysis ................................ 24

Exhibit 5. Tests of Practices as Moderator Variables .................................................................................................. 29

Exhibit 6. Tests of Conditions as Moderator Variables ............................................................................................... 30

Exhibit 7. Studies of Online Learning Involving K–12 Students ................................................................................ 32

Exhibit 8. Tests of Study Features as Moderator Variables ......................................................................................... 34

Exhibit 9. Learner Types for Category 3 Studies ........................................................................................................ 37

Exhibit A-1. Terms for Initial Research Database Search ........................................................................................ A-2

Exhibit A-2. Terms for Additional Database Searches for Online Career Technical Education and Teacher

Professional Development ............................................................................................................................... A-2

Exhibit A-3. Sources for Articles in the Full-Text Screening ................................................................................... A-3

Exhibit A-4. Top-level Coding Structure for the Meta-analysis ............................................................................... A-6

Acknowledgments

This revision to the 2009 version of the report contains corrections made after the discovery of

several transcription errors by Shanna Smith Jaggars and Thomas Bailey of the Community

College Research Center of Teachers College, Columbia University. We are indebted to Jaggars

and Bailey for their detailed review of the analysis.

We would like to acknowledge the thoughtful contributions of the members of our Technical

Work Group in reviewing study materials and prioritizing issues to investigate. The advisors

consisted of Robert M. Bernard of Concordia University, Richard E. Clark of the University of

Southern California, Barry Fishman of the University of Michigan, Dexter Fletcher of the

Institute for Defense Analyses, Karen Johnson of the Minnesota Department of Education, Mary

Kadera of PBS, James L. Morrison, an independent consultant, Susan Patrick of the North

American Council for Online Learning, Kurt D. Squire of the University of Wisconsin, Bill

Thomas of the Southern Regional Education Board, Bob Tinker of The Concord Consortium,

and Julie Young of the Florida Virtual School. Robert M. Bernard, the Technical Work Group’s

meta-analysis expert, deserves a special thanks for his advice and sharing of unpublished work

on meta-analysis methodology as well as his careful review of an earlier version of this report.

Many U.S. Department of Education staff members contributed to the completion of this report.

Bernadette Adams Yates served as project manager and provided valuable substantive guidance

and support throughout the design, implementation and reporting phases of this study. We would

also like to acknowledge the assistance of other Department staff members in reviewing this

report and providing useful comments and suggestions, including David Goodwin, Daphne

Kaplan, Tim Magner, and Ze’ev Wurman.

We appreciate the assistance and support of all of the above individuals; any errors in judgment

or fact are of course the responsibility of the authors.

The Study of Education Data Systems and Decision Making was supported by a large project

team at SRI International. Among the staff members who contributed to the research were Sarah

Bardack, Ruchi Bhanot, Kate Borelli, Sara Carriere, Katherine Ferguson, Reina Fujii, Joanne

Hawkins, Ann House, Katie Kaattari, Klaus Krause, Yessica Lopez, Lucy Ludwig, Patrik Lundh,

L. Nguyen, Julie Remold, Elizabeth Rivera, Luisana Sahagun Velasco, Mark Schlager, and Edith

Yang.

Abstract

A systematic search of the research literature from 1996 through July 2008 identified more than

a thousand empirical studies of online learning. Analysts screened these studies to find those that

(a) contrasted an online to a face-to-face condition, (b) measured student learning outcomes, (c)

used a rigorous research design, and (d) provided adequate information to calculate an effect

size. As a result of this screening, 50 independent effects were identified that could be subjected

to meta-analysis. The meta-analysis found that, on average, students in online learning

conditions performed modestly better than those receiving face-to-face instruction. The

difference between student outcomes for online and face-to-face classes—measured as the

difference between treatment and control means, divided by the pooled standard deviation—was

larger in those studies contrasting conditions that blended elements of online and face-to-face

instruction with conditions taught entirely face-to-face. Analysts noted that these blended

conditions often included additional learning time and instructional elements not received by

students in control conditions. This finding suggests that the positive effects associated with

blended learning should not be attributed to the media, per se. An unexpected finding was the

small number of rigorous published studies contrasting online and face-to-face learning

conditions for K–12 students. In light of this small corpus, caution is required in generalizing to

the K–12 population because the results are derived for the most part from studies in other

settings (e.g., medical training, higher education).

Executive Summary

Online learning—for students and for teachers—is one of the fastest growing trends in

educational uses of technology. The National Center for Education Statistics (2008) estimated

that the number of K-12 public school students enrolling in a technology-based distance

education course grew by 65 percent in the two years from 2002-03 to 2004-05. On the basis of a

more recent district survey, Picciano and Seaman (2009) estimated that more than a million K–12 students took online courses in school year 2007–08.

Online learning overlaps with the broader category of distance learning, which encompasses

earlier technologies such as correspondence courses, educational television and

videoconferencing. Earlier studies of distance learning concluded that these technologies were

not significantly different from regular classroom learning in terms of effectiveness. Policy-makers reasoned that if online instruction is no worse than traditional instruction in terms of

student outcomes, then online education initiatives could be justified on the basis of cost

efficiency or need to provide access to learners in settings where face-to-face instruction is not

feasible. The question of the relative efficacy of online and face-to-face instruction needs to be

revisited, however, in light of today’s online learning applications, which can take advantage of a

wide range of Web resources, including not only multimedia but also Web-based applications

and new collaboration technologies. These forms of online learning are a far cry from the

televised broadcasts and videoconferencing that characterized earlier generations of distance

education. Moreover, interest in hybrid approaches that blend in-class and online activities is

increasing. Policy-makers and practitioners want to know about the effectiveness of Internet-based, interactive online learning approaches and need information about the conditions under

which online learning is effective.

The findings presented here are derived from (a) a systematic search for empirical studies of the

effectiveness of online learning and (b) a meta-analysis of those studies from which effect sizes

that contrasted online and face-to-face instruction could be extracted or estimated. A narrative

summary of studies comparing different forms of online learning is also provided.

These activities were undertaken to address four research questions:

1. How does the effectiveness of online learning compare with that of face-to-face

instruction?

2. Does supplementing face-to-face instruction with online instruction enhance learning?

3. What practices are associated with more effective online learning?

4. What conditions influence the effectiveness of online learning?

This meta-analysis and review of empirical online learning research are part of a broader study

of practices in online learning being conducted by SRI International for the Policy and Program

Studies Service of the U.S. Department of Education. The goal of the study as a whole is to

provide policy-makers, administrators and educators with research-based guidance about how to

implement online learning for K–12 education and teacher preparation. An unexpected finding of

the literature search, however, was the small number of published studies contrasting online and

face-to-face learning conditions for K–12 students. Because the search encompassed the research

literature not only on K–12 education but also on career technology, medical and higher

education, as well as corporate and military training, it yielded enough studies with older learners

to justify a quantitative meta-analysis. Thus, analytic findings with implications for K–12

learning are reported here, but caution is required in generalizing to the K–12 population because

the results are derived for the most part from studies in other settings (e.g., medical training,

higher education).

This literature review and meta-analysis differ from recent meta-analyses of distance learning in

that they

! Limit the search to studies of Web-based instruction (i.e., eliminating studies of video-

and audio-based telecourses or stand-alone, computer-based instruction);

! Include only studies with random-assignment or controlled quasi-experimental designs;

and

! Examine effects only for objective measures of student learning (e.g., discarding effects

for student or teacher perceptions of learning or course quality, student affect, etc.).

This analysis and review distinguish between instruction that is offered entirely online and

instruction that combines online and face-to-face elements. The first of the alternatives to

classroom-based instruction, entirely online instruction, is attractive on the basis of cost and

convenience as long as it is as effective as classroom instruction. The second alternative, which

the online learning field generally refers to as blended or hybrid learning, needs to be more

effective than conventional face-to-face instruction to justify the additional time and costs it

entails. Because the evaluation criteria for the two types of learning differ, this meta-analysis

presents separate estimates of mean effect size for the two subsets of studies.

Literature Search

The most unexpected finding was that an extensive initial search of the published literature from

1996 through 2006 found no experimental or controlled quasi-experimental studies that both

compared the learning effectiveness of online and face-to-face instruction for K–12 students and

provided sufficient data for inclusion in a meta-analysis. A subsequent search extended the time

frame for studies through July 2008.

The computerized searches of online databases and citations in prior meta-analyses of distance

learning as well as a manual search of the last three years of key journals returned 1,132

abstracts. In two stages of screening of the abstracts and full texts of the articles, 176 online

learning research studies published between 1996 and 2008 were identified that used an

experimental or quasi-experimental design and objectively measured student learning outcomes.

Of these 176 studies, 99 had at least one contrast between an included online or blended learning

condition and face-to-face (offline) instruction that potentially could be used in the quantitative

meta-analysis. Just nine of these 99 involved K–12 learners. The 77 studies without a face-to-face condition compared different variations of online learning (without a face-to-face control

condition) and were set aside for narrative synthesis.

Meta-Analysis

Meta-analysis is a technique for combining the results of multiple experiments or quasi-experiments to obtain a composite estimate of the size of the effect. The result of each

experiment is expressed as an effect size, which is the difference between the mean for the

treatment group and the mean for the control group, divided by the pooled standard deviation. Of

the 99 studies comparing online and face-to-face conditions, 45 provided sufficient data to

compute or estimate 50 independent effect sizes (some studies included more than one effect).

Four of the nine studies involving K–12 learners were excluded from the meta-analysis: Two

were quasi-experiments without statistical control for preexisting group differences; the other

two failed to provide sufficient information to support computation of an effect size.
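The effect size defined above (treatment mean minus control mean, divided by the pooled standard deviation) can be sketched in a few lines; the study numbers below are hypothetical, chosen only to illustrate the arithmetic.

```python
import math

def pooled_effect_size(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference: (treatment mean - control mean) / pooled SD."""
    pooled_sd = math.sqrt(
        ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    )
    return (mean_t - mean_c) / pooled_sd

# Hypothetical study: online group scores 78 (SD 10, n = 40),
# face-to-face group scores 75 (SD 10, n = 40)
d = pooled_effect_size(78, 10, 40, 75, 10, 40)
# d = 3 / 10 = 0.30
```

By this measure, the report's overall mean of +0.20 corresponds to the online group scoring a fifth of a standard deviation above the face-to-face group.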

Most of the articles containing the 50 effects in the meta-analysis were published in 2004 or

more recently. The split between studies of purely online learning and those contrasting blended

online/face-to-face conditions against face-to-face instruction was fairly even, with 27 effects in

the first category and 23 in the second. The 50 estimated effect sizes included seven contrasts

from five studies conducted with K–12 learners—two from eighth-grade students in social

studies classes, one for eighth- and ninth-grade students taking Algebra I, two from a study of

middle school students taking Spanish, one for fifth-grade students in science classes in Taiwan,

and one from elementary-age students in special education classes. The types of learners in the

remaining studies were about evenly split between college or community college students and

graduate students or adults receiving professional training. All but two of the studies involved

formal instruction. The most common subject matter was medicine or health care. Other content

types were computer science, teacher education, mathematics, languages, science, social science,

and business. Among the 48 contrasts from studies that indicated the time period over which

instruction occurred, 19 involved instructional time frames of less than a month, and the

remainder involved longer periods. In terms of instructional features, the online learning

conditions in these studies were less likely to be instructor-directed (8 contrasts) than they were

to be student-directed, independent learning (17 contrasts) or interactive and collaborative in

nature (22 contrasts).

Effect sizes were computed or estimated for this final set of 50 contrasts. Among the 50

individual study effects, 11 were significantly positive, favoring the online or blended learning

condition. Three contrasts found a statistically significant effect favoring the traditional face-to-face condition.1

1 When an α < .05 level of significance is used for contrasts, one would expect approximately 1 in 20 contrasts to

show a significant difference by chance. For 50 contrasts, then, one would expect 2 or 3 significant differences by

chance. The finding of 3 significant contrasts associated with face-to-face instruction is within the range one

would expect by chance; the 11 contrasts associated with online or hybrid instruction exceeds what one would

expect by chance.
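Footnote 1's chance-expectation reasoning can be checked directly with a binomial calculation; the helper function below is a generic sketch, not part of the report's methodology.

```python
from math import comb

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more significant
    contrasts out of n when the true effect is zero everywhere."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n, alpha = 50, 0.05
expected = n * alpha                 # 2.5 significant contrasts expected by chance
p_three = binom_tail(n, 3, alpha)    # chance of 3+ spurious results: quite plausible
p_eleven = binom_tail(n, 11, alpha)  # chance of 11+ spurious results: very unlikely
```

With an expected count of 2.5, three significant face-to-face contrasts are unremarkable, while the probability of eleven or more arising by chance is well below one in a thousand, consistent with the footnote's conclusion.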

Narrative Synthesis

In addition to the meta-analysis comparing online learning conditions with face-to-face

instruction, analysts reviewed and summarized experimental and quasi-experimental studies

contrasting different versions of online learning. Some of these studies contrasted purely online

learning conditions with classes that combined online and face-to-face interactions. Others

explored online learning with and without elements such as video, online quizzes, assigned

groups, or guidance for online activities. Five of these studies involved K–12 learners.

Key Findings

The main finding from the literature review was that

• Few rigorous research studies of the effectiveness of online learning for K–12 students

have been published. A systematic search of the research literature from 1994 through

2006 found no experimental or controlled quasi-experimental studies comparing the

learning effects of online versus face-to-face instruction for K–12 students that provide

sufficient data to compute an effect size. A subsequent search that expanded the time

frame through July 2008 identified just five published studies meeting meta-analysis

criteria.

The meta-analysis of 50 study effects, 43 of which were drawn from research with older learners,

found that2

• Students in online conditions performed modestly better, on average, than those learning

the same material through traditional face-to-face instruction. Learning outcomes for

students who engaged in online learning exceeded those of students receiving face-to-

face instruction, with an average effect size of +0.20 favoring online conditions.3 The

mean difference between online and face-to-face conditions across the 50 contrasts is

statistically significant at the p < .001 level.4 Interpretations of this result, however,

should take into consideration the fact that online and face-to-face conditions generally

differed on multiple dimensions, including the amount of time that learners spent on task.

The advantages observed for online learning conditions therefore may be the product of

aspects of those treatment conditions other than the instructional delivery medium per se.

2 The meta-analysis was also run with just the 43 studies with older learners. Results were very similar to those for

the meta-analysis including all 50 contrasts. Variations in findings when K-12 studies are removed are described

in footnotes.

3 The + sign indicates that the outcome for the treatment condition was larger than that for the control condition. A

– sign before an effect estimate would indicate that students in the control condition had stronger outcomes than

those in the treatment condition. Cohen (1992) suggests that effect sizes of .20 can be considered “small,” those of

approximately .50 “medium,” and those of .80 or greater “large.”

4 The p-value represents the likelihood that an effect of this size or larger will be found by chance if the two

populations under comparison do not differ. A p-value of less than .05 indicates that there is less than 1 chance in

20 that a difference of the observed size would be found for samples drawn from populations that do not differ.


• Instruction combining online and face-to-face elements had a larger advantage relative

to purely face-to-face instruction than did purely online instruction. The mean effect size

in studies comparing blended with face-to-face instruction was +0.35, p < .001. This

effect size is larger than that for studies comparing purely online and purely face-to-face

conditions, which had an average effect size of +0.05, p =.46. In fact, the learning

outcomes for students in purely online conditions and those for students in purely face-to-

face conditions were statistically equivalent. An important issue to keep in mind in

reviewing these findings is that many studies did not attempt to equate (a) all the

curriculum materials, (b) aspects of pedagogy and (c) learning time in the treatment and

control conditions. Indeed, some authors asserted that it would be impossible to have

done so. Hence, the observed advantage for blended learning conditions is not necessarily

rooted in the media used per se and may reflect differences in content, pedagogy and

learning time.

• Effect sizes were larger for studies in which the online instruction was collaborative or

instructor-directed than in those studies where online learners worked independently.5

The type of learning experience moderated the size of the online learning effect (Q =

6.19, p < .05).6 The mean effect sizes for collaborative instruction (+0.25) and for

instructor-directed instruction (+0.39) were significantly positive whereas the mean effect

size for independent learning (+0.05) was not.

• Most of the variations in the way in which different studies implemented online learning

did not affect student learning outcomes significantly. Analysts examined 13 online

learning practices as potential sources of variation in the effectiveness of online learning

compared with face-to-face instruction. Of those variables, the two mentioned above (i.e.,

the use of a blended rather than a purely online approach and instructor-directed or

collaborative rather than independent, self-directed instruction) were the only statistically

significant influences on effectiveness. The other 11 online learning practice variables

that were analyzed did not affect student learning significantly. However, the relatively

small number of studies contrasting learning outcomes for online and face-to-face

instruction that included information about any specific aspect of implementation

impeded efforts to identify online instructional practices that affect learning outcomes.

• The effectiveness of online learning approaches appears quite broad across different

content and learner types. Online learning appeared to be an effective option for both

undergraduates (mean effect of +0.30, p < .001) and for graduate students and

professionals (+0.10, p < .05) in a wide range of academic and professional studies.

Though positive, the mean effect size is not significant for the seven contrasts involving

K–12 students, but the number of K–12 studies is too small to warrant much confidence

in the mean effect estimate for this learner group. Three of the K–12 studies had

5 Online experiences in which students explored digital artifacts and controlled the specific material they wanted to

view were categorized as independent learning experiences.

6 This contrast is not statistically significant (p = .13) when the five K–12 studies are removed from the analysis.


significant effects favoring a blended learning condition, one had a significant negative

effect favoring face-to-face instruction, and three contrasts did not attain statistical

significance. The test for learner type as a moderator variable was nonsignificant. No

significant differences in effectiveness were found that related to the subject of

instruction.

• Effect sizes were larger for studies in which the online and face-to-face conditions varied

in terms of curriculum materials and aspects of instructional approach in addition to the

medium of instruction. Analysts examined the characteristics of the studies in the meta-

analysis to ascertain whether features of the studies’ methodologies could account for

obtained effects. Six methodological variables were tested as potential moderators: (a)

sample size, (b) type of knowledge tested, (c) strength of study design, (d) unit of

assignment to condition, (e) instructor equivalence across conditions, and (f) equivalence

of curriculum and instructional approach across conditions. Only equivalence of

curriculum and instruction emerged as a significant moderator variable (Q = 6.85, p <

.01). Studies in which analysts judged the curriculum and instruction to be identical or

almost identical in online and face-to-face conditions had smaller effects than those

studies where the two conditions varied in terms of multiple aspects of instruction (+0.13

compared with +0.40, respectively). Instruction could differ in terms of the way activities

were organized (for example as group work in one condition and independent work in

another) or in the inclusion of instructional resources (such as a simulation or instructor

lectures) in one condition but not the other.

The narrative review of experimental and quasi-experimental studies contrasting different online

learning practices found that the majority of available studies suggest the following:

• Blended and purely online learning conditions implemented within a single study

generally result in similar student learning outcomes. When a study contrasts blended

and purely online conditions, student learning is usually comparable across the two

conditions.

• Elements such as video or online quizzes do not appear to influence the amount that

students learn in online classes. The research does not support the use of some frequently

recommended online learning practices. Inclusion of more media in an online application

does not appear to enhance learning. The practice of providing online quizzes does not

seem to be more effective than other tactics such as assigning homework.

• Online learning can be enhanced by giving learners control of their interactions with

media and prompting learner reflection. Studies indicate that manipulations that trigger

learner activity or learner reflection and self-monitoring of understanding are effective

when students pursue online learning as individuals.

• Providing guidance for learning for groups of students appears less successful than does

using such mechanisms with individual learners. When groups of students are learning

together online, support mechanisms such as guiding questions generally influence the

way students interact, but not the amount they learn.


Conclusions

In recent experimental and quasi-experimental studies contrasting blends of online and face-to-

face instruction with conventional face-to-face classes, blended instruction has been more

effective, providing a rationale for the effort required to design and implement blended

approaches. When used by itself, online learning appears to be as effective as conventional

classroom instruction, but not more so.

However, several caveats are in order: Despite what appears to be strong support for blended

learning applications, the studies in this meta-analysis do not demonstrate that online learning is

superior as a medium. In many of the studies showing an advantage for blended learning, the

online and classroom conditions differed in terms of time spent, curriculum and pedagogy. It was

the combination of elements in the treatment conditions (which was likely to have included

additional learning time and materials as well as additional opportunities for collaboration) that

produced the observed learning advantages. At the same time, one should note that online

learning is much more conducive to the expansion of learning time than is face-to-face

instruction.

In addition, although the types of research designs used by the studies in the meta-analysis were

strong (i.e., experimental or controlled quasi-experimental), many of the studies suffered from

weaknesses such as small sample sizes; failure to report retention rates for students in the

conditions being contrasted; and, in many cases, potential bias stemming from the authors’ dual

roles as experimenters and instructors.

Finally, the great majority of estimated effect sizes in the meta-analysis are for undergraduate

and older students, not elementary or secondary learners. Although this meta-analysis did not

find a significant effect by learner type, when learners’ age groups are considered separately, the

mean effect size is significantly positive for undergraduate and other older learners but not for

K–12 students.

Another consideration is that various online learning implementation practices may have

differing effectiveness for K–12 learners than they do for older students. It is certainly possible

that younger online students could benefit from practices (such as embedding feedback, for

example) that did not have a positive impact for college students and older learners. Without new

random assignment or controlled quasi-experimental studies of the effects of online learning

options for K–12 students, policy-makers will lack scientific evidence of the effectiveness of

these emerging alternatives to face-to-face instruction.


1. Introduction

Online learning has roots in the tradition of distance education, which goes back at least 100

years to the early correspondence courses. With the advent of the Internet and the World Wide

Web, the potential for reaching learners around the world increased greatly, and today’s online

learning offers rich educational resources in multiple media and the capability to support both

real-time and asynchronous communication between instructors and learners as well as among

different learners. Institutions of higher education and corporate training were quick to adopt

online learning. Although K–12 school systems lagged behind at first, this sector’s adoption of e-

learning is now proceeding rapidly.

The National Center for Education Statistics estimated that 37 percent of school districts had

students taking technology-supported distance education courses during school year 2004–05

(Zandberg and Lewis 2008). Enrollments in these courses (which included two-way interactive

video as well as Internet-based courses) were estimated at 506,950, a 60 percent increase over

the estimate based on the previous survey for 2002-03 (Selzer and Lewis 2007). Two district

surveys commissioned by the Sloan Consortium (Picciano and Seaman 2007; 2008) produced

estimates that 700,000 K–12 public school students took online courses in 2005–06 and over a

million students did so in 2007–08—a 43 percent increase.7 Most of these courses were at the

high school level or in combination elementary-secondary schools (Zandberg and Lewis 2008).

These district numbers, however, do not fully capture the popularity of programs that are entirely

online. By fall 2007, 28 states had online virtual high school programs (Tucker 2007). The

largest of these, the Florida Virtual School, served over 60,000 students in 2007–08. In addition,

enrollment figures for courses or high school programs that are entirely online reflect just one

part of overall K–12 online learning. Increasingly, regular classroom teachers are incorporating

online teaching and learning activities into their instruction.

Online learning has become popular because of its potential for providing more flexible access to

content and instruction at any time, from any place. Frequently, the focus entails (a) increasing

the availability of learning experiences for learners who cannot or choose not to attend traditional

face-to-face offerings, (b) assembling and disseminating instructional content more cost-

efficiently, or (c) enabling instructors to handle more students while maintaining learning

outcome quality that is equivalent to that of comparable face-to-face instruction.

Different technology applications are used to support different models of online learning. One

class of online learning models uses asynchronous communication tools (e.g., e-mail, threaded

discussion boards, newsgroups) to allow users to contribute at their convenience. Synchronous

technologies (e.g., webcasting, chat rooms, desktop audio/video technology) are used to

approximate face-to-face teaching strategies such as delivering lectures and holding meetings

with groups of students. Earlier online programs tended to implement one model or the other.

More recent applications tend to combine multiple forms of synchronous and asynchronous

online interactions as well as occasional face-to-face interactions.

7 The Sloan Foundation surveys had very low response rates, suggesting the need for caution with respect to their

numerical estimates.


In addition, online learning offerings are being designed to enhance the quality of learning

experiences and outcomes. One common conjecture is that learning a complex body of

knowledge effectively requires a community of learners (Bransford, Brown and Cocking 1999;

Riel and Polin 2004; Schwen and Hara 2004; Vrasidas and Glass 2004) and that online

technologies can be used to expand and support such communities. Another conjecture is that

asynchronous discourse is inherently self-reflective and therefore more conducive to deep

learning than is synchronous discourse (Harlen and Doubler 2004; Hiltz and Goldman 2005;

Jaffee et al. 2006).

This literature review and meta-analysis have been guided by four research questions:

1. How does the effectiveness of online learning compare with that of face-to-face

instruction?

2. Does supplementing face-to-face instruction with online instruction enhance learning?

3. What practices are associated with more effective online learning?

4. What conditions influence the effectiveness of online learning?

Context for the Meta-analysis and Literature Review

The meta-analysis and literature review reported here are part of the broader Evaluation of

Evidence-Based Practices in Online Learning study that SRI International is conducting for the

Policy and Program Studies Service of the U.S. Department of Education. The overall goal of the

study is to provide research-based guidance to policy-makers, administrators and educators for

implementing online learning for K–12 education. This literature search, analysis, and review

has expanded the set of studies available for analysis by also addressing the literature concerning

online learning in career technical education, medical and higher education, corporate and

military training, and K–12 education.

In addition to examining the learning effects of online learning, this meta-analysis has considered

the conditions and practices associated with differences in effectiveness. Conditions are those

features of the context within which the online technology is implemented that are relatively

impervious to change. Conditions include the year in which the intervention took place, the

learners’ demographic characteristics, the teacher’s or instructor’s qualifications, and state

accountability systems. In contrast, practices concern how online learning is implemented (e.g.,

whether or not an online course facilitator is used). In choosing whether or where to use online

learning (e.g., to teach mathematics for high school students, to teach a second language to

elementary students), it is important to understand the degree of effectiveness of online learning

under differing conditions. In deciding how to implement online learning, it is important to

understand the practices that research suggests will increase effectiveness (e.g., community

building among participants, use of an online facilitator, blending work and training).


Conceptual Framework for Online Learning

Modern online learning includes offerings that run the gamut from conventional didactic lectures

or textbook-like information delivered over the Web to Internet-based collaborative role-playing

in social simulations and highly interactive multiplayer strategy games. Examples include

primary-grade students working on beginning reading skills over the Internet, middle school

students collaborating with practicing scientists in the design and conduct of research, and

teenagers who dropped out of high school taking courses online to attain the credits needed for

graduation. The teachers of K–12 students may also participate in online education, logging in to

online communities and reference centers and earning inservice professional development credit

online.

To guide the literature search and review, the research team developed a conceptual framework

identifying three key components describing online learning: (a) whether the activity served as a

replacement for or an enhancement to conventional face-to-face instruction, (b) the type of

learning experience (pedagogical approach), and (c) whether communication was primarily

synchronous or asynchronous. Each component is described in more detail below.

One of the most basic characteristics for classifying an online activity is its objective—whether

the activity serves as a replacement for face-to-face instruction (e.g., a virtual course) or as an

enhancement of the face-to-face learning experience (i.e., online learning activities that are part

of a course given face-to-face). This distinction is important because the two types of

applications have different objectives. A replacement application that is equivalent to

conventional instruction in terms of learning outcomes is considered a success if it provides

learning online without sacrificing student achievement. If student outcomes are the same

whether a course is taken online or face-to-face, then online instruction can be used cost-

effectively in settings where too few students are situated in a particular geographic locale to

warrant an on-site instructor (e.g., rural students, students in specialized courses). In contrast,

online enhancement activities that produce learning outcomes that are only equivalent to (not

better than) those resulting from face-to-face instruction alone would be considered a waste of

time and money because the addition does not improve student outcomes.

A second important dimension is the type of learning experience, which depends on who (or

what) determines the way learners acquire knowledge. Learning experiences can be classified in

terms of the amount of control that the student has over the content and nature of the learning

activity. In traditional didactic or expository learning experiences, content is transmitted to the

student by a lecture, written material, or other mechanisms. Such conventional instruction is

often contrasted with active learning in which the student has control of what and how he or she

learns. Another category of learning experiences stresses collaborative or interactive learning

activity in which the nature of the learning content is emergent as learners interact with one

another and with a teacher or other knowledge sources. Technologies can support any of these

three types of learning experience:

• Expository instruction—Digital devices transmit knowledge.

• Active learning—The learner builds knowledge through inquiry-based manipulation of

digital artifacts such as online drills, simulations, games, or microworlds.


• Interactive learning—The learner builds knowledge through inquiry-based collaborative

interaction with other learners; teachers become co-learners and act as facilitators.

This dimension of learning-experience type is closely linked to the concept of learner control

explored by Zhang (2005). Typically, in expository instruction, the technology delivers the

content. In active learning, the technology allows students to control digital artifacts to explore

information or address problems. In interactive learning, technology mediates human interaction

either synchronously or asynchronously; learning emerges through interactions with other

students and the technology.

The learner-control category of interactive learning experiences is related to the so-called “fifth

generation” of distance learning, which stresses a flexible combination of independent and group

learning activities. Researchers are now using terms such as “distributed learning” (Dede 2006)

or “learning communities” to refer to orchestrated mixtures of face-to-face and virtual

interactions among a cohort of learners led by one or more instructors, facilitators or coaches

over an extended period of time (from weeks to years).

Finally, a third characteristic commonly used to categorize online learning activities is the extent

to which the activity is synchronous, with instruction occurring in real time whether in a physical

or a virtual place, or asynchronous, with a time lag between the presentation of instructional

stimuli and student responses. Exhibit 1 illustrates the three dimensions in the framework

guiding this meta-analysis of online learning offerings. The descriptive cells in the table

illustrate uses of online learning for each possible combination of learning experience,

synchronicity, and objective (an alternative or a supplement to face-to-face instruction).


Exhibit 1. Conceptual Framework for Online Learning

Expository
  Synchronous
    Face-to-Face Alternative: Live, one-way webcast of online lecture course with limited learner control (e.g., students proceed through materials in set sequence)
    Face-to-Face Enhancement: Viewing webcasts to supplement in-class learning activities
  Asynchronous
    Face-to-Face Alternative: Math course taught through online video lectures that students can access on their own schedule
    Face-to-Face Enhancement: Online lectures on advanced topics made available as a resource for students in a conventional math class

Active
  Synchronous
    Face-to-Face Alternative: Learning how to troubleshoot a new type of computer system by consulting experts through live chat
    Face-to-Face Enhancement: Chatting with experts as the culminating activity for a curriculum unit on network administration
  Asynchronous
    Face-to-Face Alternative: Social studies course taught entirely through Web quests that explore issues in U.S. history
    Face-to-Face Enhancement: Web quest options offered as an enrichment activity for students completing their regular social studies assignments early

Interactive
  Synchronous
    Face-to-Face Alternative: Health-care course taught entirely through an online, collaborative patient management simulation that multiple students interact with at the same time
    Face-to-Face Enhancement: Supplementing a lecture-based course through a session spent with a collaborative online simulation used by small groups of students
  Asynchronous
    Face-to-Face Alternative: Professional development for science teachers through “threaded” discussions and message boards on topics identified by participants
    Face-to-Face Enhancement: Supplemental, threaded discussions for pre-service teachers participating in a face-to-face course on science methods

Exhibit reads: Online learning applications can be characterized in terms of (a) the kind of learning experience they provide, (b) whether computer-mediated instruction is primarily synchronous or asynchronous and (c) whether they are intended as an alternative or a supplement to face-to-face instruction.
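The three classification dimensions of the framework can be modeled as a small data structure. This is a hypothetical sketch (the type and field names are mine, not the report's):

```python
from dataclasses import dataclass
from enum import Enum

class LearningExperience(Enum):
    EXPOSITORY = "expository"      # technology transmits content to the learner
    ACTIVE = "active"              # learner manipulates digital artifacts
    INTERACTIVE = "interactive"    # knowledge emerges from learner interaction

class Synchronicity(Enum):
    SYNCHRONOUS = "synchronous"    # instruction occurs in real time
    ASYNCHRONOUS = "asynchronous"  # time lag between stimulus and response

class Objective(Enum):
    ALTERNATIVE = "face-to-face alternative"   # replaces classroom instruction
    ENHANCEMENT = "face-to-face enhancement"   # supplements classroom instruction

@dataclass
class OnlineOffering:
    description: str
    experience: LearningExperience
    synchronicity: Synchronicity
    objective: Objective

# One cell of Exhibit 1 expressed in this scheme:
webquest_course = OnlineOffering(
    description="Social studies course taught entirely through Web quests",
    experience=LearningExperience.ACTIVE,
    synchronicity=Synchronicity.ASYNCHRONOUS,
    objective=Objective.ALTERNATIVE,
)
print(webquest_course.experience.value)  # active
```

Each offering occupies exactly one of the twelve combinations, which is what makes the framework usable for coding studies.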


Many other features also apply to online learning, including the type of setting (classroom,

home, informal), the nature of the content (both the subject area and the type of learning such as

fact, concept, procedure or strategy), and the technology involved (e.g., audio/video streaming,

Internet telephony, podcasting, chat, simulations, videoconferencing, shared graphical

whiteboard, screen sharing).

The dimensions in the framework in Exhibit 1 were derived from prior meta-analyses in distance

learning. Bernard et al. (2004) found advantages for asynchronous over synchronous distance

education. In examining a different set of studies, Zhao et al. (2005) found that studies of

distance-learning applications that combined synchronous and asynchronous communication

tended to report more positive effects than did studies of distance learning applications with just

one of these interaction types.8 Zhao et al. also found (a) advantages for blended learning (called

“Face-to-Face Enhancement” in the Exhibit 1 framework) over purely online learning

experiences and (b) advantages for courses with more instructor involvement compared with

more “canned” applications that provide expository learning experiences. Thus, the three

dimensions in Exhibit 1 capture some of the most important kinds of variation in distance

learning and together provide a manageable framework for differentiating among the broad array

of online activities in practice today.

Findings From Prior Meta-Analyses

Prior meta-analyses of distance education (including online learning studies and studies of other

forms of distance education) and of Web-based or online learning have been conducted. Overall,

results from Bernard et al. (2004) and other reviews of the distance education literature

(Cavanaugh 2001; Moore 1994) indicate no significant differences in effectiveness between

distance education and face-to-face education, suggesting that distance education, when it is the

only option available, can successfully replace face-to-face instruction. Findings of a recent

meta-analysis of job-related courses comparing Web-based and classroom-based learning

(Sitzmann et al. 2006) were even more positive. They found online learning to be superior to

classroom-based instruction in terms of declarative knowledge outcomes, with the two being

equivalent in terms of procedural learning.

However, a general conclusion that distance and face-to-face instruction result in essentially

similar learning ignores differences in findings across various studies. Bernard et al. (2004)

found tremendous variability in effect sizes (an effect size is the difference between the mean for

the treatment group and the mean for the control group, divided by the pooled standard

deviation), which ranged from –1.31 to +1.41.9 From their meta-analysis, which included coding

for a wide range of instructional and other characteristics, the researchers concluded that selected

8 Both of these meta-analyses included video-based distance learning as well as Web-based learning and also

included studies in which the outcome measure was student satisfaction, attitude or other nonlearning measures.

The meta-analysis reported here is restricted to an analysis of effect sizes for objective student learning measures

in experimental, controlled quasi-experimental, and crossover studies of applications with Web-based

components.

9 Cohen (1992) suggests that effect sizes of .20 can be considered “small,” those of approximately .50 “medium,”

and those of .80 or greater “large.”


conditions and practices were associated with differences in outcomes. For example, they found

that distance education that used synchronous instruction was significantly negative in its effect,

with an average effect size of –0.10, whereas the average effect size for studies using

asynchronous instruction was significantly positive (+0.05). However, the studies that Bernard et

al. categorized as using synchronous communication involved “yoked” classrooms; that is, the

instructor’s classroom was the center of the activity, and one or more distant classrooms

interacted with it in “hub and spoke” fashion. These satellite classes are markedly different from

today’s Web-based communication among the multiple nodes in a “learning network.”
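The effect size metric defined above (treatment mean minus control mean, divided by the pooled standard deviation) can be sketched in code. The scores below are hypothetical, purely for illustration:

```python
from statistics import mean, stdev

def pooled_sd(treatment: list, control: list) -> float:
    """Pooled standard deviation of two independent samples."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    return (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5

def effect_size(treatment: list, control: list) -> float:
    """Standardized mean difference: positive values favor the treatment group."""
    return (mean(treatment) - mean(control)) / pooled_sd(treatment, control)

# Hypothetical exam scores for an online (treatment) and a face-to-face
# (control) section -- illustrative numbers only, not data from any study.
online = [78.0, 85.0, 82.0, 90.0, 74.0, 88.0]
face_to_face = [75.0, 80.0, 79.0, 84.0, 70.0, 82.0]

d = effect_size(online, face_to_face)
# Cohen (1992): ~.20 "small", ~.50 "medium", .80 or greater "large".
print(round(d, 2))  # 0.8
```

A negative value of d would indicate stronger outcomes in the control condition, mirroring the sign convention used throughout the report.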

Machtmes and Asher’s earlier (2000) meta-analysis of telecourses sheds light on this issue.10

Although detecting no difference between distance and face-to-face learning overall, they found

results more favorable for telecourses when classrooms had two-way, as opposed to one-way,

interactions.

Although earlier meta-analyses of distance education found it equivalent to classroom instruction

(as noted above), several reviewers have suggested that this pattern may change. They argue that

online learning as practiced in the 21st century can be expected to outperform earlier forms of

distance education in terms of effects on learning (Zhao et al. 2005).

The meta-analysis reported here differs from earlier meta-analyses because its focus has been restricted to studies that did the following:

• Investigated significant use of the Web for instruction
• Had an objective learning measure as the outcome measure
• Met higher quality criteria in terms of study design (i.e., an experimental or controlled quasi-experimental design)

Structure of the Report

Chapter 2 describes the methods used in searching for appropriate research articles, in screening those articles for relevance and study quality, in coding study features, and in calculating effect sizes. Chapter 3 describes the 50 study effects identified through the article search and screening and presents findings in the form of effect sizes for studies contrasting purely online or blended learning conditions with face-to-face instruction. Chapter 4 provides a qualitative narrative synthesis of research studies comparing variations of online learning interventions. Finally, chapter 5 discusses the implications of the literature search and meta-analysis for future studies of online learning and for K–12 online learning practices.

10 Like the present meta-analysis, Machtmes and Asher limited their study corpus to experiments or quasi-experiments with an achievement measure as the learning outcome.


2. Methodology

This chapter describes the procedures used to search for, screen and code controlled studies of the effectiveness of online learning. The products of these search, screening and coding activities were used for the meta-analysis and narrative literature review, which are described in chapters 3 and 4, respectively.

Definition of Online Learning

For this review, online learning is defined as learning that takes place partially or entirely over the Internet. This definition excludes purely print-based correspondence education, broadcast television or radio, videoconferencing, videocassettes, and stand-alone educational software programs that do not have a significant Internet-based instructional component.

In contrast to previous meta-analyses, this review distinguishes between two purposes for online learning:

• Learning conducted totally online as a substitute or alternative to face-to-face learning
• Online learning components that are combined or blended (sometimes called “hybrid”) with face-to-face instruction to provide learning enhancement

As indicated in chapter 1, this distinction was made because of the different implications of finding a null effect (i.e., no difference in effects between the treatment and the control group) under the two circumstances. Equivalence between online learning and face-to-face learning justifies using online alternatives, but online enhancements need to be justified by superior learning outcomes. These two purposes of online learning defined the first two categories of study in the literature search:

• Studies comparing an online learning condition with a face-to-face control condition (Category 1)
• Studies comparing a blended condition with a face-to-face control condition without the online learning components (Category 2)

In addition, researchers sought experimental and controlled quasi-experimental studies that compared the effectiveness of different online learning practices. This third study category consisted of the following:

• Studies testing the learning effects of variations in online learning practices, such as online learning with and without interactive video (Category 3)


Data Sources and Search Strategies

Relevant studies were located through a comprehensive search of publicly available literature published from 1996 through July 2008.11 Searches of dissertations were limited to those published from 2005 through July 2008 to allow researchers to use UMI ProQuest Digital Dissertations for retrieval.

Electronic Database Searches

Using a common set of keywords, searches were performed in five electronic research databases: ERIC, PsycINFO, PubMed, ABI/INFORM, and UMI ProQuest Digital Dissertations. The appendix lists the terms used for the initial electronic database search and for additional searches for studies of online learning in the areas of career technical education and teacher education.

Additional Search Activities

The electronic database searches were supplemented with a review of articles cited in recent meta-analyses and narrative syntheses of research on distance learning (Bernard et al. 2004; Cavanaugh et al. 2004; Childs 2001; Sitzmann et al. 2006; Tallent-Runnels et al. 2006; WestEd with Edvance Research 2008; Wisher and Olson 2003; Zhao et al. 2005), including those for teacher professional development and career technical education (Whitehouse et al. 2006; Zirkle 2003). The analysts examined references from these reviews to identify studies that might meet the criteria for inclusion in the present review.

Abstracts were manually reviewed for articles published since 2005 in the following key journals: American Journal of Distance Education, Journal of Distance Education (Canada), Distance Education (Australia), International Review of Research in Distance and Open Education, and Journal of Asynchronous Learning Networks. In addition, the Journal of Technology and Teacher Education and Career and Technical Education Research were searched manually. Finally, the Google Scholar search engine was used with a series of keywords related to online learning (available from the authors). Article abstracts retrieved through these additional search activities were reviewed to remove duplicates of articles identified earlier.

11 Literature searches were performed in two waves: in March 2007 for studies published from 1996–2006 and in July 2008 for studies published from 2007 to July 2008.


Screening Process

Screening of the research studies obtained through the search process described above was carried out in two stages. The intent of the two-stage approach was to gain efficiency without risking exclusion of potentially relevant, high-quality studies of online learning effects.

Initial Screen for Abstracts From Electronic Databases

The initial electronic database searches (excluding the additional searches conducted for teacher professional development and career technical education) yielded 1,132 articles.12 Citation information and abstracts of these studies were examined to ascertain whether they met the following three initial inclusion criteria:

1. Does the study address online learning as this review defines it?
2. Does the study appear to use a controlled design (experimental/quasi-experimental design)?
3. Does the study report data on student achievement or another learning outcome?

At this early stage, analysts gave studies “the benefit of the doubt,” retaining those that were not clearly outside the inclusion criteria on the basis of their citations and abstracts. As a result of this screening, 316 articles were retained and 816 articles were excluded. During this initial screen, 45 percent of the excluded articles were removed primarily because they did not have a controlled design. Twenty-six percent of excluded articles were eliminated because they did not report learning outcomes for treatment and control groups. Twenty-three percent were eliminated because their intervention did not qualify as online learning, given the definition used for this meta-analysis and review. The remaining 6 percent of excluded articles posed other difficulties, such as being written in a language other than English.

Full-text Screen

From the other data sources (i.e., references in earlier reviews, manual review of key journals, recommendation from a study advisor, and Google Scholar searches), researchers identified and retrieved an additional 186 articles, yielding a total of 502 articles that they subjected to a full-text screening for possible inclusion in the analysis. Nine analysts who were trained on a set of full-text screening criteria reviewed the 502 articles for both topical relevance and study quality.

A study had to meet content relevance criteria to be included in the meta-analysis. Thus, qualifying studies had to

1. Involve learning that took place over the Internet. The use of the Internet had to be a major part of the intervention. Studies in which the Internet was only an incidental component of the intervention were excluded. In operational terms, to qualify as online learning, a study treatment needed to provide at least a quarter of the instruction/learning of the content assessed by the study’s learning measure by means of the Internet.

12 This number includes multiple instances of the same study identified in different databases.

2. Contrast conditions that varied in terms of use of online learning. Learning outcomes had to be compared against conditions falling into at least one of two study categories: Category 1, online learning compared with offline/face-to-face learning, and Category 2, a combination of online plus offline/face-to-face learning (i.e., blended learning) compared with offline/face-to-face learning alone.

3. Describe an intervention study that had been completed. Descriptions of study designs, evaluation plans or theoretical frameworks were excluded. The length of the intervention/treatment could vary from a few hours to a quarter, semester, year or longer.

4. Report a learning outcome that was measured for both treatment and control groups. A learning outcome needed to be measured in the same way across study conditions. A study was excluded if it explicitly indicated that different examinations were used for the treatment and control groups. The measure had to be objective and direct; learner or teacher/instructor self-report of learning was not considered a direct measure. Examples of learning outcome measures included scores on standardized tests, scores on researcher-created assessments, grades/scores on teacher-created assessments (e.g., assignments, midterm/final exams), and grades or grade point averages. Examples of learning outcome measures for teacher learners (in addition to those accepted as student outcomes) included assessments of content knowledge, analysis of lesson plans or other materials related to the intervention, observation (or logs) of class activities, analysis of portfolios, or supervisor’s rating of job performance. Studies that used only nonlearning outcome measures (e.g., attitude, retention, attendance, level of learner/instructor satisfaction) were excluded.

Studies also had to meet basic quality (method) criteria to be included. Thus, qualifying studies had to

5. Use a controlled design (experimental or quasi-experimental). Design studies, exploratory studies or case studies that did not use a controlled research design were excluded. For quasi-experimental designs, the analysis of the effects of the intervention had to include statistical controls for possible differences between the treatment and control groups in terms of prior achievement.

6. Report sufficient data for effect size calculation or estimation as specified in the guidelines provided by the What Works Clearinghouse (2007) and by Lipsey and Wilson (2000).

Studies that contrasted different versions of online learning (Category 3) needed to meet Criteria 1 and 3–5 to be included in the narrative research summary.


An analyst read each full text, and all borderline cases were discussed and resolved either at project meetings or through consultation with task leaders. To prevent studies from being mistakenly screened out, two analysts coded studies on features that were deemed to require significant degrees of inference. These features consisted of the following:

• Failure to have students use the Internet for a significant portion of the time that they spent learning the content assessed by the study’s learning measure
• Lack of statistical control for prior abilities in quasi-experiments

From the 502 articles, analysts identified 522 independent studies (some articles reported more than one study). When the same study was reported in different publication formats (e.g., conference paper and journal article), only the more formal journal article was retained for the analysis.

Of the 522 studies, 176 met all the criteria of the full-text screening process. Exhibit 2 shows the bases for exclusion for the 346 studies that did not meet all the criteria.

Exhibit 2. Bases for Excluding Studies During the Full-Text Screening Process

Primary Reason for Exclusion | Number Excluded | Percentage Excluded
Did not use statistical control | 137 | 39
Was not online as defined in this review | 90 | 26
Did not analyze learning outcomes | 52 | 15
Did not have a comparison group that received a comparable treatment | 22 | 7
Did not fit into any of the three study categories | 39 | 11
Excluded for other reasons [a] | 6 | 2

Exhibit reads: The most common reason for a study’s exclusion from the analysis was failure to use statistical control (in a quasi-experiment).

[a] Other reasons for exclusion included (a) did not provide enough information, (b) was written in a language other than English, and (c) used different learning outcome measures for the treatment and control groups.

Effect Size Extraction

Of the 176 independent studies, 99 had at least one contrast between online learning and face-to-face/offline learning (Category 1) or between blended learning and face-to-face/offline learning (Category 2). These studies were subjected to quantitative analysis to extract effect sizes.

Of the 99 studies, only nine were conducted with K–12 students (Chang 2008; Englert et al. 2007; Long and Jennings 2005; O’Dwyer, Carey and Kleiman 2007; Parker 1999; Rockman et al. 2007; Stevens 1999; Sun, Lin and Yu 2008; Uzunboylu 2004). Of them, four were excluded from the meta-analysis: Chang (2008), Parker (1999), and Uzunboylu (2004) did not provide sufficient statistical data to compute effect sizes, and the Stevens (1999) study was a quasi-experiment without a statistical control for potential existing differences in achievement.


An effect size is similar to a z-score in that it is expressed in units of standard deviation. It is defined as the difference between the treatment and control means, divided by the pooled standard deviation. Effect sizes can be calculated (a) from the means and standard deviations for the two groups or (b) on the basis of information provided in statistical tests such as t-tests and analyses of variance. Following the guidelines from the What Works Clearinghouse (2007) and Lipsey and Wilson (2000), numerical and statistical data contained in the studies were extracted so that Comprehensive Meta-Analysis software (Biostat Solutions 2006) could be used to calculate effect sizes (g). The precision of each effect estimate was determined by using the estimated standard error of the mean to calculate the 95-percent confidence interval for each effect.
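The two routes to an effect size described above can be sketched in a few lines. This is an illustrative reimplementation of the standard formulas (pooled standard deviation, small-sample correction, and recovery from a t statistic), not the Comprehensive Meta-Analysis software itself; the function names are invented for the example:

```python
import math

def hedges_g(m_t, sd_t, n_t, m_c, sd_c, n_c):
    """Effect size (Hedges' g) and its 95% CI from group means and SDs."""
    # Pooled standard deviation across the treatment and control groups
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    d = (m_t - m_c) / sd_pooled          # standardized mean difference
    j = 1 - 3 / (4 * (n_t + n_c) - 9)    # small-sample bias correction
    g = j * d
    # Approximate standard error of g, used for the confidence interval
    se = math.sqrt((n_t + n_c) / (n_t * n_c) + g**2 / (2 * (n_t + n_c)))
    return g, (g - 1.96 * se, g + 1.96 * se)

def d_from_t(t, n_t, n_c):
    """Standardized mean difference recovered from an independent-samples t."""
    return t * math.sqrt(1 / n_t + 1 / n_c)
```

The second function covers route (b): a reported t statistic and the two group sizes are enough to recover the effect size when means and standard deviations are not published.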

The review of the 99 studies to obtain the data for calculating effect size produced 50 independent effect sizes (27 for Category 1 and 23 for Category 2) from 45 studies. Fifty-four studies did not report sufficient data to support calculating effect size.

Coding of Study Features

All studies that provided enough data to compute an effect size were coded for their study features and for study quality. Building on the project’s conceptual framework and the coding schemes used in several earlier meta-analyses (Bernard et al. 2004; Sitzmann et al. 2006), a coding structure was developed and pilot-tested with several studies. The top-level coding structure, incorporating refinements made after pilot testing, is shown in Exhibit A-4 of the appendix.

To determine interrater reliability, two researchers coded 20 percent of the studies, achieving an interrater reliability of 86 percent across those studies. Analysis of coder disagreements resulted in the refinement of some definitions and decision rules for some codes; other codes that required information that articles did not provide or that proved difficult to code reliably were eliminated (e.g., whether or not the instructor was certified). A single researcher coded the remaining studies.


Data Analysis

Before combining effects from multiple contrasts, effect sizes were weighted to avoid undue influence of studies with small sample sizes (Hedges and Olkin 1985). For the total set of 50 contrasts and for each subset of contrasts being investigated, a weighted mean effect size (Hedges’ g+) was computed by weighting the effect size for each study contrast by the inverse of its variance. The precision of each mean effect estimate was determined by using the estimated standard error of the mean to calculate the 95 percent confidence interval. Using a fixed-effects model, the heterogeneity of the effect size distribution (the Q-statistic) was computed to indicate the extent to which variation in effect sizes was not explained by sampling error alone.
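The fixed-effects computation just described can be sketched as follows. This is a minimal illustration of inverse-variance weighting and the Q statistic under standard formulas, not the software the report's analysts actually used:

```python
import math

def fixed_effect_summary(effects, ses):
    """Inverse-variance weighted mean effect (g+), its 95% CI, and the
    Q statistic measuring heterogeneity beyond sampling error."""
    weights = [1 / se**2 for se in ses]      # inverse-variance weights
    w_total = sum(weights)
    g_plus = sum(w * g for w, g in zip(weights, effects)) / w_total
    se_mean = math.sqrt(1 / w_total)         # SE of the weighted mean
    ci = (g_plus - 1.96 * se_mean, g_plus + 1.96 * se_mean)
    # Under homogeneity, Q follows a chi-square distribution with k-1 df
    q = sum(w * (g - g_plus)**2 for w, g in zip(weights, effects))
    return g_plus, ci, q
```

Because weights are inversely proportional to variance, a large, precisely estimated study pulls the mean far more than a small one, which is exactly the protection against small-sample influence described above.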

Next, a series of post-hoc subgroup and moderator variable analyses were conducted using the Comprehensive Meta-Analysis software. A mixed-effects model was used for these analyses to model within-group variation.13 A between-group heterogeneity statistic (QBetween) was computed to test for statistical differences in the weighted mean effect sizes for various subsets of the effects (e.g., studies using blended as opposed to purely online learning for the treatment group). Chapter 3 describes the results of these analyses.

13 Meta-analysts need to choose between a mixed-effects and a fixed-effects model for investigating moderator variables. A fixed-effects analysis is more sensitive to differences related to moderator variables, but has a greater likelihood of producing Type I errors (falsely rejecting the null hypothesis). The mixed-effects model reduces the likelihood of Type I errors by adding a random constant to the standard errors, but does so at the cost of increasing the likelihood of Type II errors (incorrectly accepting the null hypothesis). Analysts chose the more conservative mixed-effects model for this investigation of moderator variables.


3. Findings

This chapter presents the results of the meta-analysis of controlled studies that compared the effectiveness of online learning with that of face-to-face instruction. The next chapter presents a narrative synthesis of studies that compared different versions of online learning with each other rather than with a face-to-face control condition.

Nature of the Studies in the Meta-Analysis

As indicated in chapter 2, 50 independent effect sizes could be abstracted from the study corpus of 45 studies.14 The number of students in the studies included in the meta-analysis ranged from 16 to 1,857, but most of the studies were modest in scope. Although large-scale applications of online learning have emerged, only five studies in the meta-analysis corpus included more than 400 learners. The types of learners in these studies were about evenly split between students in college or earlier years of education and learners in graduate programs or professional training. The average learner age ranged from 13 to 44. Nearly all the studies involved formal instruction, with the most common subject matter being medicine or health care. Other content types included computer science, teacher education, social science, mathematics, languages, science and business. Roughly half of the learners were taking the instruction for credit or as an academic requirement. Of the 48 contrasts for which the study indicated the length of instruction, 19 involved instructional time frames of less than a month and the remainder involved longer periods.

In terms of instructional features, the online learning conditions in these studies were less likely to be instructor-directed (8 contrasts) than they were to be student-directed, independent learning (17 contrasts) or interactive and collaborative in nature (22 contrasts). Online learners typically had opportunities to practice skills or test their knowledge (41 effects were from studies reporting such opportunities). Opportunities for learners to receive feedback were less common, reported in the studies associated with 23 effects. The opportunity for online learners to have face-to-face contact with the instructor during the time frame of the course was present in the case of 21 out of 50 effects. The details of instructional media and communication options available to online learners were absent in many of the study narratives. Among the 50 contrasts, analysts could document the presence of one-way video or audio in the online condition for 14 effects. Similarly, 16 contrasts involved online conditions that allowed students to communicate with the instructor through asynchronous communication only; 8 allowed both asynchronous and synchronous online communication; and 26 contrasts came from studies that did not document the types of online communication provided between the instructor and learners.

14 After the first literature search, which yielded 29 independent effects, the research team ran additional analyses to find out how many more studies could be included if the study design criterion were relaxed to include quasi-experiments with pre- and posttests with no statistical adjustments made for preexisting differences. The relaxed standard would have increased the corpus for analysis by just 10 studies, nearly all of which were in Category 1 and which had more positive effect sizes than the Category 1 studies with stronger analytic designs. Analysts decided not to include those studies in the meta-analysis. Instead, the study corpus was enlarged by conducting a second literature search in July 2008.


Among the 50 individual contrasts between online and face-to-face instruction, 11 were significantly positive, favoring the online or blended learning condition. Three significant negative effects favored traditional face-to-face instruction. The fact that multiple comparisons were conducted should be kept in mind when interpreting this pattern of findings. Because analysts used an α < .05 level of significance for testing differences, one would expect approximately 1 in 20 contrasts to show a significant difference by chance alone. For 50 contrasts, then, one would expect 2 or 3 significant differences by chance. The finding of 3 significant contrasts favoring face-to-face instruction is within the range one would expect by chance; the 11 contrasts favoring online or hybrid instruction exceed what one would expect by chance.
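The chance calculation above can be made explicit with a binomial model, under the simplifying assumption (not stated in the report) that the 50 contrasts are independent:

```python
from math import comb

def chance_significant(k, alpha, observed):
    """Expected count of significant results among k independent tests at
    level alpha, and the binomial probability of seeing at least
    `observed` significant results by chance alone."""
    expected = alpha * k
    p_tail = sum(comb(k, i) * alpha**i * (1 - alpha)**(k - i)
                 for i in range(observed, k + 1))
    return expected, p_tail
```

With k = 50 and alpha = .05, about 2.5 significant contrasts are expected by chance, so 3 negative findings are unremarkable, while the chance probability of 11 or more significant results is well under one in a thousand.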

Exhibit 3 illustrates the 50 effect sizes derived from the 45 articles.15 Exhibits 4a and 4b present the effect sizes for Category 1 (purely online versus face-to-face) and Category 2 (blended versus face-to-face) studies, respectively, along with standard errors, statistical significance, and the 95-percent confidence interval.

Main Effects

The overall finding of the meta-analysis is that classes with online learning (whether taught completely online or blended) on average produce stronger student learning outcomes than do classes with solely face-to-face instruction. The mean effect size for all 50 contrasts was +0.20, p < .001.

The conceptual framework for this study, which distinguishes between purely online and blended forms of instruction, calls for creating subsets of the effect estimates to address two more nuanced research questions:

• How does the effectiveness of online learning compare with that of face-to-face instruction? Looking only at the 27 Category 1 effects that compared a purely online condition with face-to-face instruction, analysts found a mean effect of +0.05, p = .46. This finding is similar to that of previous summaries of distance learning (generally from pre-Internet studies), in finding that instruction conducted entirely online is as effective as classroom instruction but no better.

15 Some references appear twice in Exhibit 3 because multiple effect sizes were extracted from the same article. Davis et al. (1999) and Caldwell (2006) each included two contrasts: online versus face-to-face (Category 1) and blended versus face-to-face (Category 2). Rockman et al. (2007) and Schilling et al. (2006) report findings for two distinct learning measures. Long and Jennings (2005) report findings from two distinct experiments, a “wave 1” in which teachers were implementing online learning for the first time and a “wave 2” in which teachers implemented online learning a second time with new groups of students.


• Does supplementing face-to-face instruction with online instruction enhance learning? For the 23 Category 2 contrasts that compared blended conditions of online plus face-to-face learning with face-to-face instruction alone, the mean effect size of +0.35 was significant (p < .0001). Blends of online and face-to-face instruction, on average, had stronger learning outcomes than did face-to-face instruction alone.

A test of the difference between Category 1 and Category 2 studies found that the mean effect size was larger for contrasts pitting blended learning against face-to-face instruction (g+ = +0.35) than for those of purely online versus face-to-face instruction (g+ = +0.05); the difference between the two subsets of studies was statistically significant (Q = 8.37, p < .01).
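A between-group test of this kind can be sketched as follows. Note this simplified version uses fixed-effects weights for clarity, whereas the report's mixed-effects analysis would add a between-study variance component to each weight, so it is illustrative only:

```python
def q_between(subgroups):
    """Between-group heterogeneity statistic: weighted squared deviations
    of subgroup mean effects from the grand mean, referred to a
    chi-square distribution with (number of subgroups - 1) df."""
    totals = []
    for group in subgroups:              # each group: [(effect, se), ...]
        weights = [1 / se**2 for _, se in group]
        weighted_sum = sum(w * g for w, (g, _) in zip(weights, group))
        totals.append((sum(weights), weighted_sum))
    grand_mean = sum(s for _, s in totals) / sum(w for w, _ in totals)
    return sum(w * (s / w - grand_mean)**2 for w, s in totals)
```

A value of 0 means the subgroup means coincide; larger values indicate that the subgroup means (here, blended versus purely online) differ by more than sampling error would allow.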


Exhibit 3. Effect Sizes for Contrasts in the Meta-Analysis

[Forest plot not reproduced in this text extraction.]

Exhibit reads: The effect size estimate for Schoenfeld-Tacher, McConnell and Graham (2001) was +0.80, with a 95 percent probability that the true effect size lies between –0.10 and +1.70.


Exhibit 4a. Purely Online Versus Face-to-Face (Category 1) Studies Included in the Meta-Analysis

Authors Title Effect Size

95-Percent Confidence

Interval

Test of Null Hypothesis

(2-tail)

Retention Rate

(percentage) Number of Units

Assigneda g SE

Lower Limit

Upper Limit Z-Value Online

Face-to-Face

Beeckman et al. (2008)

Pressure ulcers: E-learning to improve classification by nurses and nursing students +0.294 0.097 0.104 0.484 3.03** Unknown Unknown

426 participants

Bello et al. (2005) Online vs. live methods for teaching difficult airway management to anesthesiology residents

+0.278 0.265 -0.241 0.797 1.05 100 10056

participants

Benjamin et al. (2007)

A randomized controlled trial comparing Web to in-person training for child care health consultants

+0.046 0.340 -0.620 0.713 0.14 Unknown Unknown23

participants

Beyea et al. (2008) Evaluation of a particle repositioning maneuver Web-based teaching module +0.790 0.493 -0.176 1.756 1.60 Unknown Unknown

17–20 participants

b

Caldwell (2006) A comparative study of traditional, Web-based and online instructional modalities in a computer programming course

+0.132 0.310 -0.476 0.740 0.43 100 100 60 students

Cavus, Uzonboylu and Ibrahim (2007)

Assessing the success rate of students using a learning management system together with a collaborative tool in Web-based teaching of programming languages +0.466 0.335 -0.190 1.122 1.39 Unknown Unknown 54 students

Davis et al. (1999) Developing online courses: A comparison of Web-based instruction with traditional instruction

-0.379 0.339 -1.042 0.285 -1.12 Unknown Unknown2 courses/ classrooms

Hairston (2007) Employees’ attitudes toward e-learning: Implications for policy in industry environments

+0.028 0.155 -0.275 0.331 0.18 70 58.33 168 participants

Harris et al. (2008) Educating generalist physicians about chronic pain with live experts and online education -0.285 0.252 -0.779 0.209 -1.13 84.21 94.44 62 participants

Hugenholtzet al. (2008)

Effectiveness of e-learning in continuing medical education for occupational physicians +0.106 0.233 -0.351 0.564 0.46 Unknown Unknown 72 participants

Jang et al. (2005) Effects of a Web-based teaching method on undergraduate nursing students’ learning of electrocardiography -0.530 0.197 -0.917 -0.143 -2.69** 85.71 87.93 105 students

Page 145: OLSD Project 2020 Report

22

Exhibit 4a. Purely Online Versus Face-to-Face (Category 1) Studies Included in the Meta-Analysis (continued)

Authors Title Effect Size

95-Percent Confidence

Interval

Test of Null Hypothesis

(2-tail)

Retention Rate

(percentage) Number of Units

Assigneda

g SE

Lower Limit

Upper Limit Z-Value Online

Face-to-Face

Lowry (2007) Effects of online versus face-to-face professional development with a team-based learning community approach on teachers’ application of a new instructional practice -0.281 0.335 -0.937 0.370 -0.84 80 93.55 53 students

Mentzer, Cryan and Teclehaimanot (2007)

A comparison of face-to-face and Web-based classrooms

-0.796 0.339 -1.460 -0.131 -2.35* Unknown Unknown 36 students

Nguyen et al. (2008)

Randomized controlled trial of an Internet-based versus face-to-face dyspnea self-management program for patients with chronic obstructive pulmonary disease: Pilot study +0.292 0.316 -0.327 0.910 0.93 Unknown Unknown

39 participants

Ocker and Yaverbaum (1999)

Asynchronous computer-mediated communication versus face-to-face collaboration: Results on student learning, quality and satisfaction -0.030 0.214 -0.449 0.389 -0.14 Unknown Unknown 43 students

Padalino and Peres (2007)

Exhibit 4a: Purely Online versus Face-to-Face (Category 1) Studies Included in the Meta-analysis (continued)

| Authors | Title | Effect size (g) | SE | 95% CI lower | 95% CI upper | Z-value (2-tail) | Retention, online (%) | Retention, face-to-face (%) | Units assigned^a |
|---|---|---|---|---|---|---|---|---|---|
| | E-learning: A comparative study for knowledge apprehension among nurses | +0.115 | 0.281 | -0.437 | 0.666 | 0.41 | Unknown | Unknown | 49 participants |
| Peterson and Bond (2004) | Online compared to face-to-face teacher preparation for learning standards-based planning skills | -0.100 | 0.214 | -0.520 | 0.320 | -0.47 | Unknown | Unknown | 4 sections |
| Schmeeckle (2003) | Online training: An evaluation of the effectiveness and efficiency of training law enforcement personnel over the Internet | -0.106 | 0.198 | -0.494 | 0.282 | -0.53 | Unknown | Unknown | 101 students |
| Schoenfeld-Tacher, McConnell and Graham (2001) | Do no harm: A comparison of the effects of online vs. traditional delivery media on a science course | +0.800 | 0.459 | -0.100 | 1.700 | 1.74 | 100 | 99.94 | Unknown |
| Sexton, Raven and Newman (2002) | A comparison of traditional and World Wide Web methodologies, computer anxiety, and higher order thinking skills in the inservice training of Mississippi 4-H extension agents | -0.422 | 0.385 | -1.177 | 0.332 | -1.10 | Unknown | Unknown | 26 students |
| Sun, Lin and Yu (2008) | A study on learning effect among different learning styles in a Web-based lab of science for elementary school students | +0.260 | 0.188 | -0.108 | 0.628 | 1.38 | Unknown | Unknown | 4 classrooms |
| Turner et al. (2006) | Web-based learning versus standardized patients for teaching clinical diagnosis: A randomized, controlled, crossover trial | +0.242 | 0.367 | -0.477 | 0.960 | 0.66 | Unknown | Unknown | 30 students |
| Vandeweerd et al. (2007) | Teaching veterinary radiography by e-learning versus structured tutorial: A randomized, single-blinded controlled trial | +0.144 | 0.207 | -0.262 | 0.550 | 0.70 | Unknown | Unknown | 92 students |
| Wallace and Clariana (2000) | Achievement predictors for a computer-applications module delivered online | +0.109 | 0.206 | -0.295 | 0.513 | 0.53 | Unknown | Unknown | 4 sections |
| Wang (2008)^c | Developing and evaluating an interactive multimedia instructional tool: Learning outcomes and user experiences of optometry students | -0.071 | 0.136 | -0.338 | 0.195 | -0.53 | Unknown | Unknown | 4 sections |
| Zhang (2005) | Interactive multimedia-based e-learning: A study of effectiveness | +0.381 | 0.339 | -0.283 | 1.045 | 1.12 | Unknown | Unknown | 51 students |
| Zhang et al. (2006) | Instructional video in e-learning: Assessing the effect of interactive video on learning effectiveness | +0.498 | 0.244 | 0.020 | 0.975 | 2.04* | Unknown | Unknown | 69 students |

Exhibit reads: The effect size for the Hugenholtz et al. (2008) study of online medical education was +0.11, which was not significantly different from 0.

*p < .05, **p < .01. SE = standard error.
a The number given represents the assigned units at study conclusion. It excludes units that attrited.
b Two outcome measures were used to compute one effect size. The first outcome measure was completed by 17 participants, and the second by 20 participants.
c This study is a crossover study. The number of units represents those assigned to treatment and control conditions in the first round.
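The confidence intervals and Z-values in these exhibits follow directly from each study's g and standard error. A quick check in Python, using the Zhang et al. (2006) row as an example (small rounding differences against the printed table are expected):

```python
g, se = 0.498, 0.244     # Zhang et al. (2006): effect size and standard error
z = g / se               # test statistic for the null hypothesis: effect = 0
lower = g - 1.96 * se    # 95-percent confidence interval bounds
upper = g + 1.96 * se
print(round(z, 2), round(lower, 3), round(upper, 3))
```

Because z exceeds 1.96 and the interval excludes 0, the effect is significant at p < .05, matching the asterisk on that row.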

Exhibit 4b. Blended versus Face-to-Face (Category 2) Studies Included in the Meta-analysis

| Authors | Title | Effect size (g) | SE | 95% CI lower | 95% CI upper | Z-value (2-tail) | Retention, online (%) | Retention, face-to-face (%) | Units assigned^a |
|---|---|---|---|---|---|---|---|---|---|
| Aberson et al. (2003) | Evaluation of an interactive tutorial for teaching hypothesis testing concepts | +0.580 | 0.404 | -0.212 | 1.372 | 1.44 | Unknown | .75 | 2 sections |
| Al-Jarf (2004) | The effects of Web-based learning on struggling EFL college writers | +0.740 | 0.194 | 0.360 | 1.120 | 3.82*** | Unknown | Unknown | 113 students |
| Caldwell (2006) | A comparative study of traditional, Web-based and online instructional modalities in a computer programming course | +0.251 | 0.311 | -0.359 | 0.861 | 0.81 | 100 | 100 | 60 students |
| Davis et al. (1999) | Developing online courses: A comparison of Web-based instruction with traditional instruction | -0.335 | 0.338 | -0.997 | 0.327 | -0.99 | Unknown | Unknown | 2 courses/classrooms |
| Day, Raven and Newman (1998) | The effects of World Wide Web instruction and traditional instruction and learning styles on achievement and changes in student attitudes in a technical writing in agricommunication course | +1.113 | 0.289 | 0.546 | 1.679 | 3.85*** | 89.66 | 96.55 | 2 sections |
| DeBord, Aruguete and Muhlig (2004) | Are computer-assisted teaching methods effective? | +0.110 | 0.188 | -0.259 | 0.479 | 0.69 | Unknown | Unknown | 112 students |
| El-Deghaidy and Nouby (2008) | Effectiveness of a blended e-learning cooperative approach in an Egyptian teacher education program | +1.049 | 0.406 | 0.253 | 1.845 | 2.58** | Unknown | Unknown | 26 students |
| Englert et al. (2007) | Scaffolding the writing of students with disabilities through procedural facilitation using an Internet-based technology | +0.740 | 0.345 | 0.064 | 1.416 | 2.15* | Unknown | Unknown | 6 classrooms from 5 urban schools |
| Frederickson, Reed and Clifford (2005) | Evaluating Web-supported learning versus lecture-based teaching: Quantitative and qualitative perspectives | +0.138 | 0.345 | -0.539 | 0.814 | 0.40 | Unknown | Unknown | 2 sections |
| Gilliver, Randall and Pok (1998) | Learning in cyberspace: Shaping the future | +0.477 | 0.111 | 0.260 | 0.693 | 4.31*** | Unknown | Unknown | 24 classes |
| Long and Jennings (2005) [Wave 1]^c | The effect of technology and professional development on student achievement | +0.025 | 0.046 | -0.066 | 0.116 | 0.53 | Unknown | Unknown | 9 schools |
| Long and Jennings (2005) [Wave 2]^c | The effect of technology and professional development on student achievement | +0.554 | 0.098 | 0.362 | 0.747 | 5.65*** | Unknown | Unknown | 6 teachers |
| Maki and Maki (2002) | Multimedia comprehension skill predicts differential outcomes of Web-based and lecture courses | +0.171 | 0.160 | -0.144 | 0.485 | 1.06 | 91.01 | 88.10 | 155 students |
| Midmer, Kahan and Marlow (2006) | Effects of a distance learning program on physicians' opioid- and benzodiazepine-prescribing skills | +0.332 | 0.213 | -0.085 | 0.750 | 1.56 | Unknown | Unknown | 88 students |
| O'Dwyer, Carey and Kleiman (2007) | A study of the effectiveness of the Louisiana algebra I online course | +0.373 | 0.094 | 0.190 | 0.557 | 3.99*** | 88.51 | 64.4 | Unknown^b |
| Rockman et al. (2007) [Writing]^c | ED PACE final report | -0.239 | 0.102 | -0.438 | -0.039 | -2.34* | Unknown | Unknown | 28 classrooms |
| Rockman et al. (2007) [Multiple-choice test]^c | ED PACE final report | -0.146 | 0.102 | -0.345 | 0.054 | -1.43 | Unknown | Unknown | 28 classrooms |
| Schilling et al. (2006) [Search strategies]^c | An interactive Web-based curriculum on evidence-based medicine: Design and effectiveness | +0.585 | 0.188 | 0.216 | 0.953 | 3.11** | 68.66 | 59.62 | Unknown |
| Schilling et al. (2006) [Quality of care calculation]^c | An interactive Web-based curriculum on evidence-based medicine: Design and effectiveness | +0.926 | 0.183 | 0.567 | 1.285 | 5.05*** | 66.42 | 86.54 | Unknown |
| Spires et al. (2001) | Exploring the academic self within an electronic mail environment | +0.571 | 0.357 | -0.130 | 1.271 | 1.60 | Unknown | 100.00 | 31 students |
| Suter and Perry (1997) | Evaluation by electronic mail | +0.140 | 0.167 | -0.188 | 0.468 | 0.84 | Unknown | Unknown | Unknown |
| Urban (2006) | A comparison of computer-based distance education and traditional tutorial sessions in supplemental instruction for students at-risk for academic difficulties | +0.264 | 0.192 | -0.112 | 0.639 | 1.37 | 96.86 | 73.85 | 110 students |
| Zacharia (2007) | Comparing and combining real and virtual experimentation: An effort to enhance students' conceptual understanding of electric circuits | +0.570 | 0.216 | 0.147 | 0.993 | 2.64** | 100 | 95.56 | 88 students |

Exhibit reads: The effect size for the Aberson et al. (2003) study of an interactive tutorial on hypothesis testing was +0.58, which was not significantly different from 0.

*p < .05, **p < .01, ***p < .001. SE = standard error.
a This number represents the assigned units at study conclusion. It excludes units that attrited.
b The study involved 18 online classrooms from six districts and two private schools; the same six districts were asked to identify comparable face-to-face classrooms, but the study does not report how many of those classrooms participated.
c Two independent contrasts were contained in this article, which therefore appears twice in the table.

Test for Homogeneity

Both the Category 1 and Category 2 studies contrasted a condition containing online elements against face-to-face instruction only. Analysts used the larger corpus of 50 effects that were either Category 1 or Category 2 to explore the influence of possible moderator variables.

The individual effect size estimates included in this meta-analysis ranged from a low of –0.80 (tendency for higher performance in the face-to-face condition) to a high of +1.11 (favoring online instruction). A test for homogeneity of effect size found significant differences across studies (Q = 168.86, p < .0001). Because of these significant differences, analysts investigated the variables that may have influenced the differing effect sizes.
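The homogeneity test pools the study effects with inverse-variance weights and asks whether the observed spread exceeds what sampling error alone would produce. A minimal sketch of that computation, using an illustrative subset of four (g, SE) pairs drawn from Exhibits 4a and 4b rather than the full 50-effect corpus, so the resulting Q will not match the report's 168.86:

```python
# Fixed-effect homogeneity test (Cochran's Q) over a small illustrative subset.
effects = [(0.498, 0.244), (-0.106, 0.198), (0.260, 0.188), (0.740, 0.194)]
weights = [1 / se ** 2 for _, se in effects]          # inverse-variance weights
pooled = sum(w * g for (g, _), w in zip(effects, weights)) / sum(weights)
q = sum(w * (g - pooled) ** 2 for (g, _), w in zip(effects, weights))
df = len(effects) - 1
# Compare q against the chi-square critical value on df degrees of freedom
# (7.81 at p = .05 for df = 3); a larger q indicates heterogeneous effects.
print(round(pooled, 3), round(q, 2), df)
```

With these four effects, Q exceeds the df = 3 critical value, illustrating on a small scale the heterogeneity the report found across its full corpus.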

Analyses of Moderator Variables

As noted in chapter 1, this meta-analysis distinguishes between practice variables, which can be considered part of intervention implementation, and conditions, which are status variables fairly impervious to outside influence. Relying on prior research, the research team identified variables of both types that might be expected to correlate with the effectiveness of online learning. The researchers also considered the potential influence of study method variables, which often vary with effect size; typically, more poorly controlled studies show larger effects. Each study in the meta-analysis was coded for these three types of variables (practice, status and study method) using the coding categories shown in the appendix.

Many of the studies did not provide information about features considered to be potential moderator variables, a predicament noted in previous meta-analyses (see Bernard et al. 2004). Many of the reviewed studies, for example, did not indicate (a) whether the online instructor had received training in the method of instruction, (b) rates of attrition from the contrasting conditions or (c) the extent of contamination between conditions.

For some of the variables, too few studies provided enough information to determine whether the feature was present, so no meaningful analysis was possible. Analysts identified those variables for which at least two contrasting subsets of studies, each containing six or more study effects, could be constructed. In some cases, this criterion could be met by combining related feature codes; in a few cases, the inference was made that failure to mention a particular practice or technology (e.g., one-way video) denoted its absence. Practice, condition and method variables for which study subsets met the size criterion were included in the search for moderator variables.

Page 151: OLSD Project 2020 Report

28

Practice Variables

Exhibit 5 shows the variation in effectiveness associated with 12 practice variables. Analysis of these variables addresses the third research question:

What practices are associated with more effective online learning?

Exhibit 5 and the two data exhibits that follow show significance results both for the various subsets of studies considered individually and for the test of the dimension used to subdivide the study sample (i.e., the potential moderator variable). For example, in the case of Computer-Mediated Communication With Peers, the table shows both the 17 contrasts in which students in the online condition had only asynchronous communication with peers and the 6 contrasts in which online students had both synchronous and asynchronous communication with peers. The two subsets had mean effect sizes of +0.27 and +0.17, respectively, and only the former was statistically different from 0. The Q-statistic of homogeneity tests whether the variability in effect sizes for these contrasts is associated with the type of peer communication available. The Q-statistic for Computer-Mediated Communication With Peers (0.32) is not statistically different from 0, indicating that the addition of synchronous communication with peers is not a significant moderator of online learning effectiveness.

The test of the moderator variable most central to this study (whether a blended online condition including face-to-face elements is associated with greater advantages over classroom instruction than is pure online learning) was discussed above. As noted there, the effect size for blended approaches contrasted against face-to-face instruction is larger than that for purely online approaches contrasted against face-to-face instruction. The other two practice variables included in the chapter 1 conceptual framework, learning experience type and synchronous versus asynchronous communication with the instructor, were tested in a similar fashion. The former was found to significantly moderate the size of the online learning effect (Q = 6.19, p < .05).16 The mean effect sizes for collaborative instruction (+0.25) and for instructor-directed instruction (+0.39) were significantly positive, whereas the mean effect size for independent, active online learning (+0.05) was not.17

Among the other 10 practices, which were not part of the conceptual model, none attained statistical significance.18 The amount of time that students in the treatment condition spent on task compared with students in the face-to-face condition did approach statistical significance as a moderator of effectiveness (Q = 3.62, p = .06).19 The mean effect size for studies with more time spent on task by online learners was +0.45, compared with +0.18 for studies in which the learners in the face-to-face condition spent as much or more time on task.

16 This contrast is not statistically significant (p = .13) when the five K-12 studies are removed from the analysis.
17 Online experiences in which students explored digital artifacts and controlled the specific material they wanted to view were categorized as "independent" learning experiences.
18 When the five K-12 studies are removed from the analysis, two additional practices are found to be statistically significant moderators of the effects of online learning: time spent on task and opportunities for face-to-face interactions with peers.
19 Time on task as a moderator becomes statistically significant (Q = 4.44, p < .05) when the five K-12 studies are removed from the analysis.

Exhibit 5. Tests of Practices as Moderator Variables

| Variable | Contrast | Number of studies | Weighted effect size | Standard error | Lower limit | Upper limit | Q-statistic |
|---|---|---|---|---|---|---|---|
| Pedagogy/learning experience^a | Instructor-directed (expository) | 8 | 0.386** | 0.120 | 0.150 | 0.622 | 6.19* |
| | Independent (active) | 17 | 0.050 | 0.082 | -0.110 | 0.210 | |
| | Collaborative (interactive) | 22 | 0.249*** | 0.075 | 0.102 | 0.397 | |
| Computer-mediated communication with instructor^a | Asynchronous only | 16 | 0.239* | 0.108 | 0.027 | 0.451 | 1.20 |
| | Synchronous + asynchronous | 8 | 0.036 | 0.151 | -0.259 | 0.331 | |
| Computer-mediated communication with peers^a | Asynchronous only | 17 | 0.272** | 0.091 | 0.093 | 0.450 | 0.32 |
| | Synchronous + asynchronous | 6 | 0.168 | 0.158 | -0.141 | 0.478 | |
| Treatment duration^a | Less than 1 month | 19 | 0.140 | 0.089 | -0.034 | 0.314 | 0.69 |
| | More than 1 month | 29 | 0.234*** | 0.069 | 0.098 | 0.370 | |
| Media features^a | Text-based only | 14 | 0.208 | 0.111 | -0.009 | 0.425 | 0.00 |
| | Text + other media | 32 | 0.200** | 0.066 | 0.071 | 0.329 | |
| Time on task^a | Online > face-to-face | 9 | 0.451*** | 0.113 | 0.229 | 0.673 | 3.62 |
| | Same, or face-to-face > online | 18 | 0.183* | 0.083 | 0.020 | 0.346 | |
| One-way video or audio | Present | 14 | 0.092 | 0.091 | -0.087 | 0.271 | 2.15 |
| | Absent/not reported | 36 | 0.254*** | 0.062 | 0.133 | 0.375 | |
| Computer-based instruction elements | Present | 29 | 0.182** | 0.065 | 0.054 | 0.311 | 0.25 |
| | Absent/not reported | 21 | 0.234** | 0.081 | 0.075 | 0.393 | |
| Opportunity for face-to-face time with instructor | During instruction | 21 | 0.298*** | 0.074 | 0.154 | 0.442 | 3.70 |
| | Before or after instruction | 11 | 0.050 | 0.118 | -0.181 | 0.281 | |
| | Absent/not reported | 18 | 0.150 | 0.091 | -0.028 | 0.327 | |
| Opportunity for face-to-face time with peers | During instruction | 21 | 0.300*** | 0.072 | 0.159 | 0.442 | 5.20 |
| | Before or after instruction | 12 | 0.001 | 0.111 | -0.216 | 0.218 | |
| | Absent/not reported | 17 | 0.184* | 0.093 | 0.001 | 0.367 | |
| Opportunity to practice | Present | 41 | 0.212*** | 0.056 | 0.102 | 0.322 | 0.15 |
| | Absent/not reported | 9 | 0.159 | 0.124 | -0.084 | 0.402 | |
| Feedback provided | Present | 23 | 0.204** | 0.078 | 0.051 | 0.356 | 0.00 |
| | Absent/not reported | 27 | 0.203** | 0.070 | 0.066 | 0.339 | |

Exhibit reads: Studies in which time spent in online learning exceeded time in the face-to-face condition had a mean effect size of +0.45 compared with +0.18 for studies in which face-to-face learners had as much or more instructional time.

*p < .05, **p < .01, ***p < .001.
a The moderator analysis for this variable excluded studies that did not report information for this feature.
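A moderator test of this kind can be reproduced from table values alone. A small sketch, using the subgroup means and standard errors that Exhibit 5 reports for Computer-Mediated Communication With Peers (a fixed-effect between-subgroups Q; because the table values are rounded, the result only approximates the reported 0.32):

```python
# Between-subgroups Q for a moderator: do the two pooled subgroup effects
# differ by more than chance? Values from Exhibit 5, "Computer-mediated
# communication with peers": asynchronous only vs. synchronous + asynchronous.
subgroups = [(0.272, 0.091), (0.168, 0.158)]   # (weighted effect size, SE)
weights = [1 / se ** 2 for _, se in subgroups]
grand = sum(w * g for (g, _), w in zip(subgroups, weights)) / sum(weights)
q_between = sum(w * (g - grand) ** 2 for (g, _), w in zip(subgroups, weights))
# On 1 degree of freedom the p = .05 chi-square critical value is 3.84;
# q_between falls far below it, matching the report's conclusion that
# synchronous peer communication does not significantly moderate effectiveness.
print(round(q_between, 2))
```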

Condition Variables

To investigate whether study effect sizes varied with publication year, taken as a proxy for the sophistication of available technology, the study sample was split into two nearly equal subsets: studies published between 1997 and 2003 versus those published from 2004 through July 2008.

The studies were also divided into three subsets by learner type: K–12 students, undergraduate students (the largest single group), and other types of learners (graduate students or individuals receiving job-related training). As noted above, the studies covered a wide range of subjects, but medicine and health care were the most common; accordingly, these studies were contrasted against studies in other fields. Tests of these conditions as potential moderator variables addressed the study's fourth research question:

What conditions influence the effectiveness of online learning?

None of the three conditions tested emerged as a statistically significant moderator variable. In other words, for the range of student types for which studies are available, the effectiveness of online learning was equivalent in older and newer studies, with undergraduate and older learners, and in both medical and other subject areas. Exhibit 6 provides the results of the analysis of conditions.

Exhibit 6. Tests of Conditions as Moderator Variables

| Variable | Contrast | Number of contrasts | Weighted effect size | Standard error | Lower limit | Upper limit | Q-statistic |
|---|---|---|---|---|---|---|---|
| Year published | 1997–2003 | 13 | 0.195 | 0.105 | -0.010 | 0.400 | 0.00 |
| | 2004 or after | 37 | 0.203*** | 0.058 | 0.088 | 0.317 | |
| Learner type | K–12 students | 7 | 0.166 | 0.118 | -0.065 | 0.397 | 3.25 |
| | Undergraduate | 21 | 0.309*** | 0.083 | 0.147 | 0.471 | |
| | Graduate student/other | 21 | 0.100 | 0.084 | -0.064 | 0.264 | |
| Subject matter | Medical/health care | 16 | 0.205* | 0.090 | 0.028 | 0.382 | 0.00 |
| | Other | 34 | 0.199** | 0.062 | 0.077 | 0.320 | |

Exhibit reads: The positive effect associated with online learning over face-to-face instruction was significant both for studies published between 1997 and 2003 and for those published in 2004 or later; the effect size does not vary significantly with period of publication.

*p < .05, **p < .01, ***p < .001.

Because of the Evaluation of Evidence-Based Practices in Online Learning study's emphasis on K-12 education, the online learning studies involving K-12 students were of particular interest. The meta-analysis includes seven contrasts from five studies of K-12 students' online learning. Exhibit 7 describes these studies.

Given the small number of studies that addressed K-12 learners in the meta-analysis, attempts to test for statistical differences between the mean effect for K-12 learners and those for other types of learners should be viewed as merely suggestive. At +0.17, the average effect size for the seven contrasts involving K-12 learners appears similar to that for graduate and other students (+0.10) but less positive than that for undergraduates (+0.31). When learner type was tested as a moderator variable, however, the resulting Q-statistic was not significant.

Methods Variables

The advantage of meta-analysis is its ability to uncover general effects by looking across a range of studies that have operationalized the construct under study in different ways, studied it in different contexts, and used different methods and outcome measures. However, the inclusion of poorly designed and small-sample studies in the meta-analysis corpus poses concerns because doing so may give undue weight to spurious effects. Study method variables were examined as potential moderators to explore this issue. The results are shown in Exhibit 8.

The influence of study sample size was examined by dividing studies into three subsets according to the number of learners for which outcome data were collected. Sample size was not found to be a statistically significant moderator of online learning effects. Thus, there is no evidence that the inclusion of small-sample studies in the meta-analysis was responsible for the overall finding of a positive outcome for online learning.

Comparisons of the three designs deemed acceptable for this meta-analysis (random-assignment experiments, quasi-experiments with statistical control and crossover designs) indicate that study design is not significant as a moderator variable (see Exhibit 8). Moreover, in contrast with early meta-analyses of computer-based instruction, in which effect size was inversely related to study design quality (Pearson et al. 2005), the experiments in the present corpus that used random assignment produced significant positive effects (+0.25, p < .001) while the quasi-experiments and crossover designs did not.


Exhibit 7. Studies of Online Learning Involving K–12 Students

The corpus for this meta-analysis included five articles reporting on studies involving K–12 students. All of these studies compared student learning in a blended condition with student learning in a face-to-face condition. One of the studies (the Long and Jennings 2005 Wave 1 study) was a randomized control trial; the others were quasi-experiments. One of the quasi-experiments (Rockman et al. 2007) provided two effect sizes that favored the face-to-face condition; the other studies provided five effects favoring the blended condition (ranging from +0.03 to +0.74).

Rockman et al. (2007) used a quasi-experimental matched comparison design to evaluate the effectiveness of Spanish courses offered to middle schools (seventh and eighth grades) through the West Virginia Virtual School. This virtual school program used a blended model of instruction that combined face-to-face and virtual instruction as well as paper-and-pencil and Web-based activities. The program was delivered by a three-member teacher team: a lead teacher (a certified Spanish teacher) responsible for the design and delivery of the daily lesson plan and weekly phone conversations with each class; an adjunct teacher (a certified Spanish teacher) who provided content-related feedback by e-mail and voice mail and who graded student tests and products; and a classroom facilitator (a certified teacher, but not a Spanish teacher) who guided students on site to ensure that they stayed on task and completed assignments on time. The hybrid Spanish course was offered to students in 21 schools that did not have the resources to provide face-to-face Spanish instruction. The students in the face-to-face group came from seven schools that matched the virtual schools with respect to average language arts achievement and school size. The study involved a total of 463 students.

Information needed to compute effect sizes was reported for two of the student learning measures used in the study. For the first, a multiple-choice test including subtests on oral and written comprehension of Spanish, the mean estimated effect was –0.15, and the difference between the two conditions was not statistically significant. The other measure was a test of students' writing ability; the effect size for this skill was –0.24, with students receiving face-to-face instruction doing significantly better than those receiving the online blended version of the course.

Contrasting results were obtained in the other large-scale K–12 study, conducted by O'Dwyer, Carey and Kleiman (2007). These investigators used a quasi-experimental design to compare the learning of students participating in the Louisiana Algebra I Online initiative with the learning of students in comparison classrooms that were "similar with regard to mathematics ability, environment, and size, but where teachers used traditional 'business as usual' approaches to teaching algebra" (p. 293). Like the West Virginia Virtual School program, this initiative used a blended model of instruction that combined face-to-face and Web-based activities with two teachers: one in class and the other online. Matched pre- and posttest scores on researcher-developed multiple-choice tests were collected from a total of 463 students (231 from the treatment group, 232 from the comparison group) from multiple schools and school districts. An effect size of +0.37 was obtained, with online students performing better than their peers in conventional classrooms.

Long and Jennings (2005) examined whether the performance of eighth-grade students whose teachers integrated the Pathways to Freedom Electronic Field Trips (an online collection of interactive activities designed by Maryland Public Television) improved compared with the performance of students whose teachers taught the same content without the online materials. The study provided two sets of analyses from two waves of data collection, yielding two independent effect sizes. The first set of analyses involved data from nine schools in two Maryland districts. Schools were assigned randomly to conditions. Teachers in both conditions covered the same learning objectives related to slavery and the Underground Railroad, with the treatment teachers using the Pathways to Freedom Electronic Field Trips materials. A small effect size of +0.03 favoring the online condition was computed from change scores on researcher-developed multiple-choice tests administered to 971 students.

Long and Jennings' (2005, Wave 2) second study involved a subset of teachers from one of the two participating districts, which was on a semester schedule. The teachers from this district covered the same curriculum twice during the year for two different sets of students. The gain scores of 846 students of six teachers (three treatment and three control) from both semesters were collected. Regression analysis indicated an effect size of +0.55 favoring the online condition. This study also examined the maturation effects of teachers using the online materials a second time; as hypothesized, the results showed that the online materials were used more effectively in the second semester.

Sun, Lin and Yu (2008) conducted a quasi-experimental study to examine the effectiveness of a virtual Web-based science lab with 113 fifth-grade students in Taiwan. Although both treatment and control groups received an equal number of class hours and both groups conducted manual experiments, students in the treatment condition used the virtual Web-based science lab for part of their lab time. The Web-based lab enabled students to conduct virtual experiments while teachers observed student work and corrected errors online; the control group conducted equivalent experiments using conventional lab equipment. Matched pre- and posttest scores on researcher-developed assessments were collected for a total of 113 students (56 treatment, 57 comparison) in four classrooms from two randomly sampled schools. An effect size of +0.26 favoring the virtual lab condition was obtained from analysis of covariance results, controlling for pretest scores.

A small-scale quasi-experiment was conducted by Englert et al. (2007). This study examined the effectiveness of a Web-based writing support program with 35 elementary-age students from six special education classrooms across five urban schools. Students in the treatment group used a Web-based program that supported writing performance by prompting attention to the topical organization and structure of ideas during the planning and composing phases of writing. Control students used similar writing tools in traditional paper-and-pencil formats. Pre- and posttests of student writing, scored on a researcher-developed rubric, were used as outcome measures. An effect size of +0.74 favoring the online condition was obtained from an analysis of covariance controlling for writing pretest scores.
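The effect sizes quoted in these summaries are standardized mean differences (the exhibits label them g, i.e., Hedges' g), computed from group means, standard deviations and sample sizes. A minimal sketch of one common form of that computation, with purely illustrative numbers that are not drawn from any study above:

```python
import math

def hedges_g(m_t, sd_t, n_t, m_c, sd_c, n_c):
    """Standardized mean difference with Hedges' small-sample correction."""
    # Pooled standard deviation across treatment and control groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (m_t - m_c) / sp
    # Small-sample correction factor J shrinks d slightly toward zero
    j = 1 - 3 / (4 * (n_t + n_c - 2) - 1)
    return j * d

# Illustrative group statistics only (not from any study in the exhibits)
print(round(hedges_g(78.0, 10.0, 56, 75.5, 10.0, 57), 3))
```

A positive g indicates higher performance in the treatment (online or blended) group; a negative g, as in the Rockman et al. (2007) contrasts, favors the face-to-face group.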

Exhibit 8. Tests of Study Features as Moderator Variables

| Variable | Contrast | Number of studies | Weighted effect size | Standard error | Lower limit | Upper limit | Q-statistic |
|---|---|---|---|---|---|---|---|
| Sample size | Fewer than 35 | 11 | 0.203 | 0.139 | -0.069 | 0.476 | 0.01 |
| | From 35 to 100 | 20 | 0.209* | 0.086 | 0.039 | 0.378 | |
| | More than 100 | 19 | 0.199** | 0.072 | 0.058 | 0.339 | |
| Type of knowledge tested^a | Declarative | 12 | 0.180 | 0.097 | -0.010 | 0.370 | 0.37 |
| | Procedural/procedural and declarative | 30 | 0.239*** | 0.068 | 0.106 | 0.373 | |
| | Strategic knowledge | 5 | 0.281 | 0.168 | -0.047 | 0.610 | |
| Study design | Random assignment control | 32 | 0.249*** | 0.065 | 0.122 | 0.376 | 1.50 |
| | Quasi-experimental design with statistical control | 13 | 0.108 | 0.095 | -0.079 | 0.295 | |
| | Crossover design | 5 | 0.189 | 0.158 | -0.120 | 0.499 | |
| Unit of assignment to conditions^a | Individual | 32 | 0.169* | 0.066 | 0.040 | 0.298 | 4.73 |
| | Class section | 7 | 0.475*** | 0.139 | 0.202 | 0.748 | |
| | Course/school | 9 | 0.120 | 0.103 | -0.083 | 0.323 | |
| Instructor equivalence^a | Same instructor | 20 | 0.176* | 0.078 | 0.024 | 0.329 | 0.73 |
| | Different instructor | 19 | 0.083 | 0.077 | -0.067 | 0.233 | |
| Equivalence of curriculum/instruction^a | Identical/almost identical | 29 | 0.130* | 0.063 | 0.007 | 0.252 | 6.85** |
| | Different/somewhat different | 17 | 0.402*** | 0.083 | 0.239 | 0.565 | |

Exhibit reads: The average effect size was significantly positive for studies with a sample size of less than 35 as well as for those with 35 to 100 and those with a sample size larger than 100; the weighted average effect did not vary with size of the study sample.

*p < .05, **p < .01, ***p < .001.
a The moderator analysis excluded some studies because they did not report information about this feature.

Page 158: OLSD Project 2020 Report

35

Effect sizes do not vary depending on whether or not the same instructor or instructors taught in

the face-to-face and online conditions (Q = 0.73, p > .05). The average effect size for the 20

contrasts in which instructors were the same across conditions was +0.18, p < .05. The average

effect size for contrasts in which instructors varied across conditions was +0.08, p > .05. The

only study method variable that proved to be a significant moderator of effect size was

comparability of the instructional materials and approach for treatment and control students.

The analysts coding study features examined the descriptions of the instructional materials and

the instructional approach for each study and coded them as “identical,” “almost identical,”

“different” or “somewhat different” across conditions. Adjacent coding categories were

combined (creating the two study subsets Identical/Almost Identical and Different/Somewhat

Different) to test Equivalence of Curriculum/Instruction as a moderator variable. Equivalence of

Curriculum/Instruction was a significant moderator variable (Q = 6.85, p < .01). An examination

of the study subgroups shows that the average effect for studies in which online learning and

face-to-face instruction were described as identical or nearly so was +0.13, p < .05, compared

with an average effect of +0.40 (p < .001) for studies in which curriculum materials and

instructional approach varied across conditions.

The moderator variable analysis for aspects of study method also found additional patterns that

did not attain statistical significance but that should be re-tested once the set of available rigorous

studies of online learning has expanded. The type of learning outcome tested, for example, may

influence the magnitude of effect sizes. Twelve studies measured declarative knowledge

outcomes only, typically through multiple-choice tests. A larger group of studies (30) looked at

students’ ability to perform a procedure, or they combined procedural and declarative knowledge

outcomes in their learning measure. Five studies used an outcome measure that focused on

strategic knowledge. (Three studies did not describe their outcome measures in enough detail to

support categorization.) Among the subsets of studies, the average effect for studies that included

procedural knowledge in their learning outcome measure (effect size of +0.24) and that for

studies that measured strategic knowledge (effect size of +0.28) appeared larger than the mean

effect size for studies that used a measure of declarative knowledge only (+0.18). Even so, the

Type of Knowledge Tested was not a significant moderator variable (Q = 0.37, p > .05).
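For readers unfamiliar with the Q statistics reported above: a moderator analysis of this kind typically computes a between-subgroup heterogeneity statistic from inverse-variance-weighted effect sizes. The following is an illustrative sketch only, using hypothetical effect sizes and variances rather than the report's own data or analysis code:

```python
import math

def weighted_mean_effect(effects, variances):
    """Inverse-variance-weighted mean effect size and its variance."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    return mean, 1.0 / sum(weights)

def q_between(subgroups):
    """Between-subgroup heterogeneity statistic Q_B for a moderator analysis.

    subgroups: list of (effects, variances) pairs, one per moderator level.
    Q_B = sum_j W_j * (mean_j - grand_mean)**2, where W_j is the sum of the
    inverse-variance weights in subgroup j. Under the null hypothesis of no
    moderator effect, Q_B is chi-square distributed with k - 1 df.
    """
    means, group_weights = [], []
    for effects, variances in subgroups:
        m, v = weighted_mean_effect(effects, variances)
        means.append(m)
        group_weights.append(1.0 / v)
    grand = sum(w * m for w, m in zip(group_weights, means)) / sum(group_weights)
    return sum(w * (m - grand) ** 2 for w, m in zip(group_weights, means))

# Hypothetical effect sizes and sampling variances for two moderator levels
identical = ([0.10, 0.15, 0.14], [0.02, 0.03, 0.025])
different = ([0.35, 0.45, 0.40], [0.02, 0.03, 0.025])
q_b = q_between([identical, different])
# For df = 1, the chi-square survival function equals erfc(sqrt(Q/2))
p = math.erfc(math.sqrt(q_b / 2.0))
```

With k subgroups, Q_B is referred to a chi-square distribution with k − 1 degrees of freedom; for the two-subgroup case (df = 1) the p-value can be obtained from the complementary error function, as above.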


4. Narrative Synthesis of Studies Comparing Variants of Online Learning

This chapter presents a narrative summary of Category 3 studies—those that examined the

learning effects of variations in online practices such as different versions of blended instruction

or online learning with and without immediate feedback to the learner. The literature search and

screening (described in chapter 2) identified 84 Category 3 studies reported in 79 articles.20

Within the set of Category 3 studies, five used K–12 students as subjects and 10 involved K–12

teacher education or professional development. College undergraduates constituted the most

common learner type (see Exhibit 9). All Category 3 studies involved formal education. Course

content for Category 3 studies covered a broad range of subjects, including observation skills,

understanding Internet search engines, HIV/AIDS knowledge and statistics.

When possible, the treatment manipulations in Category 3 studies were coded using the practice

variable categories that were used in the meta-analysis to facilitate comparisons of findings

between the meta-analysis and the narrative synthesis. No attempt was made to statistically

combine Category 3 study results, however, because of the wide range of conditions compared in

the different studies.

Exhibit 9. Learner Types for Category 3 Studies

Educational Level                        Number of Studies
K–12                                     5
Undergraduate                            37
Graduate                                 4
Medical (a)                              18
Teacher professional development (b)     10
Adult training                           4
Other (c)                                4
Not available                            2
Total                                    84

Exhibit reads: K–12 students were the learners in 5 of the 84 studies of alternative online practices.
(a) The medical category spans undergraduate and graduate educational levels and includes nursing and related training.
(b) Teacher professional development includes preservice and inservice training.
(c) The Other category includes populations consisting of a combination of learner types, such as student and adult learners or undergraduate and graduate learners.

20 Some articles contained not only contrasts that fit the criteria for Category 1 or 2 but also contrasts that fit

Category 3. The appropriate contrasts between online and face-to-face conditions were used in the meta-analysis;

the other contrasts were reviewed as part of the Category 3 narrative synthesis presented here.


Blended Compared With Pure Online Learning

The meta-analysis of Category 1 and 2 studies described in chapter 3 found that effect sizes were

larger for studies that compared blended learning conditions with face-to-face instruction than

for studies that compared purely online learning with face-to-face instruction. Another way to

investigate the same issue is by conducting studies that incorporate both blended and purely

online conditions to permit direct comparisons of their effectiveness.
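The effect sizes discussed in this report are standardized mean differences. As a minimal sketch of how such a contrast is quantified (using hypothetical group statistics, not data from any study cited here), Cohen's d with a pooled standard deviation can be computed as:

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference (Cohen's d) using a pooled SD."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Hypothetical exam scores: blended group (treatment) vs. purely online group
d = cohens_d(mean_t=78.0, sd_t=10.0, n_t=40, mean_c=75.0, sd_c=10.0, n_c=40)
# d = (78 - 75) / 10 = 0.3
```

A positive d indicates the treatment group outscored the comparison group, in standard-deviation units; values around +0.2 are conventionally read as small and +0.5 as moderate.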

The majority of the 10 Category 3 studies that directly compared purely online and blended

learning conditions found no significant differences in student learning. Seven studies found no

significant difference between the two, two found statistically significant advantages for purely

online instruction, and one found an advantage for blended instruction. The descriptions of some

of these studies, provided below, make it clear that although conditions were labeled as

“blended” or “purely online” on the basis of their inclusion or exclusion of face-to-face

interactions, conditions differed in terms of content and quality of instruction. Across studies,

these differences in the nature of purely online and blended conditions very likely contributed to

the variation in outcomes.

Keefe (2003), for example, contrasted a section of an organizational behavior course that

received lectures face-to-face with another section that watched narrated PowerPoint slides

shown online or by means of a CD-ROM. Both groups had access to e-mail, online chat rooms,

and threaded discussion forums. All course materials were delivered electronically to all students

at the same time. On the course examination, students in the purely online section scored almost

8 percent lower than those receiving face-to-face lectures in addition to the online learning

activities. Keefe’s was the only study in the review that found a significant decrement in

performance for the condition without face-to-face instructional elements.

Poirier and Feldman (2004) compared a course that was predominantly face-to-face but also used

an online discussion board with a course taught entirely online. Students in the predominantly

face-to-face version of the course were required to participate in three online discussions during

the course and to post at least two comments per discussion to an online site; the site included

content, communication and assessment tools. In the purely online version of the course, students

and the instructor participated in two online discussions each week. Poirier and Feldman found a

significant main effect favoring the purely online course format for examination grades but no

effect on student performance on writing assignments.

Campbell et al. (2008) compared a blended course (in which students accessed instruction online

but attended face-to-face discussions) with a purely online course (in which students accessed

instruction and participated in discussions online). Tutors were present in both discussion

formats. Students were able to select the type of instruction they wanted, blended or online.

Mean scores for online discussion students were significantly higher than those for the face-to-

face discussion group.

As a group, these three studies suggest that the relative efficacy of blended and purely online

learning approaches depends on the instructional elements of the two conditions. For the most

part, these studies did not control instructional content within the two delivery conditions (blend


of online and face-to-face versus online only). For example, the lecturer in the Keefe (2003)

study may have covered material not available to the students reviewing the lecture’s PowerPoint

slides online. Conversely, in the Poirier and Feldman (2004) study, students interacting with the

instructor in two online discussions a week may have received more content than did those

receiving face-to-face lectures.

Davis et al. (1999) attempted to equate the content delivered in their three class sections (online,

traditional face-to-face, and a blended condition in which students and instructor met face-to-

face but used the online modules). Students in an educational technology course were randomly

assigned to one of the three sections. No significant differences among the three conditions were

found in posttest scores on a multiple-choice test.

An additional six studies contrasting purely online conditions and blended conditions (without

necessarily equating learning content across conditions) also failed to find significant differences

in student learning. Ruchti and Odell (2002) compared test scores from two groups of students

taking a course on elementary science teaching methods. One group took online modules; the

other group received instruction in a regular class, supplemented with an online discussion board

and journal (also used in the online course condition). No significant difference between the

groups was found.

Beile and Boote (2002) compared three groups: one with face-to-face instruction alone, another

with face-to-face instruction and a Web-based tutorial, and a third with Web-based instruction

and the same Web-based tutorial. The final quiz on library skills indicated no significant

differences among conditions.

Gaddis et al. (2000) compared composition students’ audience awareness between a blended

course and a course taught entirely online. The same instructor taught both groups, which also

had the same writing assignments. Both groups used networked computers in instruction, in

writing and for communication. However, the “on campus” group met face-to-face, giving

students the opportunity to communicate in person, whereas the “off campus” group met only

online. The study found no significant difference in learner outcomes between the two groups.

Similarly, Caldwell (2006) found no significant differences in performance on a multiple-choice

test between undergraduate computer science majors enrolled in a blended course and those

enrolled in an online course. Both groups used a Web-based platform for instruction, which was

supplemented by a face-to-face lab component for the blended group.

Scoville and Buskirk (2007) examined whether the use of traditional or virtual microscopy

would affect learning outcomes in a medical histology course. Students were assigned to one of

four sections: (a) a control section where learning and testing took place face-to-face, (b) a

blended condition where learning took place virtually and the practical examination took place

face-to-face, (c) a second blended condition where learning took place face-to-face and testing

took place virtually, and (d) a fully online condition. Scoville and Buskirk found no significant

differences in unit test scores by learning groups.

Finally, McNamara et al. (2008) studied the effectiveness of different approaches to teaching a

weight-training course. They divided students into three groups: a control group that received


face-to-face instruction, a blended group that received a blend of online and face-to-face

instruction, and a fully online group. The authors did not find a significant main effect for group

type.21

Thus, as a group, these studies do not provide a basis for choosing between purely online and blended instructional conditions.

Media Elements

Eight studies in the Category 3 corpus compared online environments using different media

elements such as one-way video (Maag 2004; McKethan et al. 2003; Schmeeckle 2003;

Schnitman 2007; Schroeder 2006; Schutt 2007; Tantrarungroj 2008; Zhang et al. 2006). Seven of

the eight studies found no significant differences among media combinations. In the study that

found a positive effect from enhanced media features, Tantrarungroj (2008) compared two

instructional approaches for teaching a neuroscience lesson to undergraduate students enrolled in

computer science classes. The author contrasted an experimental condition in which students

were exposed to online text with static graphics and embedded video with a control condition in

which students did not have access to the streaming video. Tantrarungroj found no significant

difference in grades for students in the two conditions on a posttest administered immediately

after the course; however, the treatment group scored significantly higher on a knowledge

retention test that was administered 4 weeks after the intervention.

The other seven studies found no effect on learning from adding additional media to online

instruction. For example, Schnitman (2007) sought to determine whether enhancing text with

graphics, navigation options, and color would affect learning outcomes. The author randomly

assigned students to one of two conditions in a Web-based learning interface; the control group

accessed a plain, text-based interface, and the treatment group accessed an enhanced interface

that featured additional graphics, navigational options, and an enhanced color scheme.

Schnitman found no significant differences in learning outcomes between the treatment and

control groups.

The fact that the majority of studies found no significant difference across media types is

consistent with the theoretical position that the medium is simply a carrier of content and is

unlikely to affect learning per se (Clark 1983, 1994). A study by Zhang et al. (2006) suggests

that the way in which a medium is used is more important than merely having access to it. Zhang

et al. found that the effect of video on learning hinged on the learner’s ability to control the video

(“interactive video”). The authors used four conditions: traditional face-to-face and three online

environments—interactive video, noninteractive video, and nonvideo. Students were randomly

assigned to one of the four groups. Students in the interactive video group performed

significantly better than the other three groups. There was no statistical difference between the

online group that had noninteractive video and the online group that had no video.

21 However, in tests of cognitive knowledge and strength, both the control and blended sections showed significant

improvements, whereas the fully online section showed no significant pre- to posttest growth for either outcome.


In summary, many researchers have hypothesized that the addition of images, graphics, audio,

video or some combination would enhance student learning and positively affect achievement.

However, the majority of studies to date have found that these media features do not affect

learning outcomes significantly.

Learning Experience Type

Other Category 3 studies manipulated different features of the online learning environment to

investigate the effects of learner control or type of learning experience. The learning experience

studies provide some evidence that suggests an advantage for giving learners an element of

control over the online resources with which they engage; however, the studies’ findings are

mixed with respect to the relative effectiveness of the three learning experience types in the

conceptual framework presented in chapter 2.

Four studies (Cavus et al. 2007; Dinov, Sanchez and Christou 2008; Gao and Lehman 2003;

Zhang 2005) provide preliminary evidence supporting the hypothesis that conditions in which

learners have more control of their learning (either active or interactive learning experiences in

our conceptual framework) produce larger learning gains than do instructor-directed conditions

(expository learning experiences). Three other studies failed to find such an effect (Cook et al.

2007; Evans 2007; Smith 2006).

Zhang (2005) reports on two studies comparing expository learning with active learning, both of which found statistically significant results favoring active learning. Zhang manipulated the

functionality of a Web course to create two conditions. For the control group, video and other

instruction received over the Web had to be viewed in a specified order, videos had to be viewed

in their entirety (e.g., a student could not fast forward) and rewinding was not allowed. The

treatment group could randomly access materials, watching videos in any sequence, rewinding

them and fast forwarding through their content. Zhang found a statistically significant positive

effect in favor of learner control over Web functionality (see also the Zhang et al. 2006 study

described above). Gao and Lehman (2003) found that students who were required to complete a

“generative activity” in addition to viewing a static Web page performed better on a test about

copyright law than did students who viewed only the static Web page. Cavus, Uzunboylu and

Ibrahim (2007) compared the success rates of students learning the Java programming language

who used a standard collaborative tool with the success rate of those who used an advanced

collaborative tool that allowed compiling, saving and running programs inside the tool. The

course grades for students using the advanced collaborative tool were higher than those of

students using the more standard tool. Similarly, Dinov, Sanchez and Christou (2008) integrated

tools from the Statistics Online Computational Resource in three courses in probability and

statistics. For each course, two groups were compared: one group of students received a “low-

intensity” experience that provided them with access to a few online statistical tools; the other

students received a “high-intensity” condition with access to many online tools for acting on

data. Across the three classes, pooling all sections, students in the more active, high-intensity

online tool condition demonstrated better understanding of the material on mid-term and final

examinations than did the other students.

These studies that found positive effects for learner control and nondidactic forms of instruction

are counterbalanced by studies that found mixed or null effects from efforts to provide a more


active online learning experience. Using randomly assigned groups of nurses who learned about

pain management online, Smith (2006) altered the instructional design to compare a text-based,

expository linear design with an instructional design involving participant problem solving and

inquiry. No significant difference was found between the two groups in terms of learning

outcomes. Cook et al. (2007) found no differences in student learning between a condition with

end-of-module review questions that required active responses and a condition with expository

end-of-module activities. Evans (2007) explored the effects of more and less expository online

instruction for students learning chemistry lab procedures. After asking students to complete an

online unit that was either text-based or dynamic and interactive, Evans found that SAT score

and gender were stronger predictors of student performance on a posttest with conceptual and

procedural items than was the type of online unit to which students were exposed.

Golanics and Nussbaum (2008) examined the effect of “elaborated questions” and “maximizing

reasons” prompts on students’ ability to construct and critique arguments. Students were

randomly divided into groups of three; each group engaged in asynchronous discussions. Half of

the groups received “elaborated questions,” which explicitly instructed them to think of

arguments and counterarguments, whereas the other half of the groups viewed unelaborated

questions. In addition, half of the groups randomly received prompts to provide justifications and

evidence for their arguments (called the “maximizing reasons” condition); half of the groups did

not receive those prompts. Elaborated questions stimulated better-developed arguments, but

maximizing reasons instructions did not.

Chen (2007) randomly assigned students in a health-care ethics class to one of three Web-based

conditions: (a) a control group that received online instruction without access to an advance organizer; (b) a treatment group that studied a text-based advance organizer before online instruction; and (c) a second treatment group that reviewed a Flash-based concept-map advance organizer before engaging in online learning.22 The author hypothesized that both the advance organizer and the concept map would help students access relevant prior knowledge

and increase their active engagement with the new content. Contrary to expectations, Chen found

no significant differences in learning achievement across the three groups.

Suh (2006) examined the effect of guiding questions on students’ ability to produce a good

educational Web site as required in an online educational technology course. Students in the

guiding-question condition received questions through an electronic discussion board and were

required to read the questions before posting their responses. E-mails and online postings

reminded them to think about the guiding questions as they worked through the problem

scenario. Guiding questions were found to enhance the performance of students working alone,

but they did not produce benefits for students working in groups. One possible explanation

offered by the author is that students working in groups may scaffold each other’s work, hence

reducing the benefit derived from externally provided questions.

22 Flash animations are created using Flash software from Adobe; a concept map is a graphic depiction of a set of

ideas and the linkages among them.


Computer-Based Instruction

The advantage of incorporating elements that are generally found in stand-alone computer-based

instruction into online learning seems to depend on the nature of the contrasting conditions.

Quizzes, simulations and individualized instruction, all common to stand-alone computer-based

instruction, appear to vary in their effectiveness when added to an online learning environment.

Online Quizzes

Research on incorporating quizzes into online learning does not provide consistent evidence that the

practice is effective. The four studies that examined the effectiveness of online quizzes (Lewis

2002; Maag 2004; Stanley 2006; Tselios et al. 2001) had mixed findings. Maag (2004) and

Stanley (2006) found no advantage for the inclusion of online quizzes. Maag included online

quizzes in a treatment condition that also provided students with online images, text and some

animation; the treatment group was compared with other groups, which differed both in the

absence of online quizzes and in terms of the media used (one had the same text and images

delivered online, one had printed text only, and one had printed text plus images). Maag found

no significant difference between the online group that had the online quizzes and the online

group that did not. Stanley (2006) found that outcomes for students taking weekly online quizzes

did not differ statistically from those for students who completed homework instead.

Two other studies suggested that whether or not quizzes positively affect learning may depend

on the presence of other variables. Lewis (2002) grouped students into two cohorts. For six

modules, Group 1 took online quizzes and Group 2 participated in online discussions. For six

other modules, the groups switched so that those who had been taking the online quizzes

participated in online discussions and vice versa. When Group 1 students took the online quizzes,

they did significantly better than those participating in discussions, but no difference was found

between the groups when Group 2 took the online quizzes in the other six modules. The

researchers interpreted this interaction between student group and condition in terms of the

degree of interactivity in the online discussion groups. Group 1 was more active in the online

discussions, and the authors suggested that this activity mitigated any loss in learning otherwise

associated with not taking quizzes.

Tselios et al. (2001) suggest that the software platform used to deliver an online quiz may affect

test performance. In their study, students completing an online quiz in WebCT performed

significantly better than students taking the online quiz on a platform called IDLE. The

educational content in the two platforms was identical and their functionality was similar;

however, they varied in the details of their user interfaces.

Simulations

The results of three studies exploring the effects of including different types of online simulations

were modestly positive. Two of the studies indicated a positive effect from including an online

simulation; however, one study found no significant difference. In an online module on

information technology for undergraduate psychology students, Castaneda (2008) contrasted two

simulation conditions (one provided a simulation that students could explore as they chose, and

the other guided the students’ interaction with the simulation, providing some feedback and


expository material) with a condition that included no simulation. Castaneda also manipulated

the sequencing of instructional activities, with the interaction with the simulation coming either

before or after completion of the expository portion of the instructional module. Knowledge

gains from pre- to posttest were greater for students with either type of simulation, provided they

were exposed to it after, rather than before, the expository instruction.

Hilbelink (2007) explored the effectiveness of using two-dimensional versus three-dimensional images of human anatomy in an online undergraduate human anatomy lab. The group of students that used three-dimensional images had a small but significant advantage in identifying

anatomical parts and spatial relationships. Contrasting results were obtained by Loar (2007) in an

examination of the effects of computer-based case study simulations on students’ diagnostic

reasoning skills in nurse practitioner programs. All groups received identical online lectures,

followed by an online text-based case study for one group and by completion of a computer-

simulated case study for the other. No difference was found between the group receiving the case

simulation versus that receiving the text-based version of the same case.

Individualized Instruction

The online learning literature has also explored the effects of using computer-based instruction

elements to individualize instruction so that the online learning module or platform responds

dynamically to the participant’s questions, needs or performance. There were only two online

learning studies of the effects of individualizing instruction, but both found a positive effect.

Nguyen (2007) compared the experiences of people learning to complete tax preparation

procedures, contrasting those who used more basic online training with those who used an

enhanced interface that incorporated a context-sensitive set of features, including integrated

tutorials, expert systems, and content delivered in visual, aural and textual forms. Nguyen found

that this combination of enhancements had a positive effect.

Grant and Courtoreille (2007) studied the use of post-unit quizzes presented either as (a) fixed items that provided feedback only on whether the student's response was correct or (b) response-sensitive items that gave the student the opportunity for additional practice on item types that

had been answered incorrectly. The response-sensitive version of the tutorial was found to be

more effective than the fixed-item version, resulting in greater changes between pre- and posttest

scores.

Supports for Learner Reflection

Nine studies (Bixler 2008; Chang 2007; Chung, Chung and Severance 1999; Cook et al. 2005;

Crippen and Earl 2007; Nelson 2007; Saito and Miwa 2007; Shen, Lee and Tsai 2007; Wang et

al. 2006) examined the degree to which promoting aspects of learner reflection in a Web-based

environment improved learning outcomes. These studies found that a tool or feature prompting

students to reflect on their learning was effective in improving outcomes.

For example, Chung, Chung and Severance (1999) examined how computer prompts designed to

encourage students to use self-explanation and self-monitoring strategies affected learning, as

measured by students’ ability to integrate ideas from a lecture into writing assignments. Chung et


al. found that students in the group receiving the computer prompts integrated and elaborated a

significantly higher number of the concepts in their writing than did those in the control group.

In a quasi-experimental study of Taiwan middle school students taking a Web-based biology

course, Wang et al. (2006) found that students in the condition using a formative online self-

assessment strategy performed better than those in conditions using traditional tests, whether the

traditional tests were online or administered in paper-and-pencil format. In the formative online

assessment condition, when students answered an item incorrectly, they were told that their

response was not correct, and they were given additional resources to explore to find the correct

answer. (They were not given the right answer.) This finding is similar to that of Grant and

Courtoreille (2007) described above.

Cook et al. (2005) investigated whether the inclusion of “self-assessment” questions at the end of

modules improved student learning. The study used a randomized, controlled, crossover trial, in

which each student took four modules, two with the self-assessment questions and two without.

The order of modules was randomly assigned. Student performance was statistically higher on

tests taken immediately after completion of modules that included self-assessment questions than

after completion of those without such questions—an effect that the authors attributed to the

stimulation of reflection. This effect, however, did not persist on an end-of-course test, on which

all students performed similarly.

Shen, Lee and Tsai (2007) found a combination of effects for self-regulation and opportunities to

learn through realistic problems. They compared the performance of students who did and did

not receive instruction in self-regulated learning strategies such as managing study time, goal-

setting and self-evaluation. The group that received instruction in self-regulated learning

performed better in their online learning.

Bixler (2008) examined the effects of question prompts asking students to reflect on their

problem-solving activities. Crippen and Earl (2007) investigated the effects of providing students

with examples of chemistry problem solutions and prompts for students to provide explanations

regarding their work. Chang (2007) added a self-monitoring form for students to record their

study time and environment, note their learning process, predict their test scores and create a

self-evaluation. Saito and Miwa (2007) investigated the effects of student reflection exercises

during and after online learning activities. Nelson (2007) added a learning guidance system

designed to support a student’s hypothesis generation and testing processes without offering

direct answers or making judgments about the student’s actions. In all of these studies, the

additional reflective elements improved students’ online learning.

Overall, the available research evidence suggests that promoting self-reflection, self-regulation

and self-monitoring leads to more positive online learning outcomes. Features such as prompts

for reflection, self-explanation and self-monitoring strategies have shown promise for improving

online learning outcomes.


Moderating Online Groups

Organizations providing or promoting online learning generally recommend the use of

instructors or other adults as online moderators, but research support for the effects of this

practice on student learning is mixed. A study by Bernard and Lundgren-Cayrol (2001) suggests

that instructor moderation may not improve learning outcomes in all contexts. The study was

conducted in a teacher education course on educational technology in which the primary

pedagogical approach was collaborative, project-based learning. Students in the course were

randomly assigned to groups receiving either low or high intervention on the part of a moderator

and composed of either random or self-selected partners. The study did not find a main effect for

moderator intervention. In fact, the mean examination scores of the low-moderation, random-

selection groups were significantly higher than those of the other groups. A study by De Wever,

Van Winckel and Valcke (2008) also found mixed effects resulting from instructor moderation.

This study was conducted during a clinical rotation in pediatrics in which knowledge of patient

management was developed through case-based asynchronous discussion groups. Researchers

used a crossover design to create four conditions based on two variables: the type of moderator

(instructor moderator versus student moderator) and the presence of a developer of alternatives

for patient management (assigned developer versus no assigned developer). The presence of a

course instructor as moderator was found not to improve learning outcomes significantly. When

no developer of alternatives was assigned, the two moderator conditions performed

equivalently. When a developer of alternatives was specified, the student-moderated groups

performed significantly better than the instructor-moderated groups.

In contrast, Zhang (2004) found that an externally moderated group scored significantly higher

on problems calling for use of statistical knowledge and problem-solving skills than a peer-

controlled group on both well- and ill-structured problems. Zhang’s study compared the

effectiveness of peer versus instructor moderation of online asynchronous collaboration.

Students were randomly assigned to one of two groups. One group had a “private” online space

where students entirely controlled discussion. The other group’s discussion was moderated by

the instructor, who also engaged with students through personal e-mails and other media.

Scripts for Online Interaction

Four Category 3 studies investigated alternatives to human moderation of online discussion in

the form of “scaffolding” or “scripts” designed to produce more productive online interaction.

The majority of these studies indicated that the presence of scripts to guide interactions among

groups learning together online did not appear to improve learning outcomes.

The one study that found positive student outcomes for learners who had been provided scripts

was conducted by Weinberger et al. (2005). These researchers created two types of scripts:

“epistemic scripts,” which specified how learners were to approach an assigned task and guided

learners to particular concepts or aspects of an activity, and “social scripts,” which structured

how students should interact with each other through methods such as gathering information

from each other by asking critical questions. They found that social scripts improved

performance on tests of individual knowledge compared with a control group that participated in

online discussions without either script (whether or not the epistemic script was provided).


The remaining three studies that examined the effect of providing scripts or scaffolds for online

interaction found no significant effect on learning (Choi, Land and Turgeon 2005; Hron et al.

2000; Ryan 2007). Hron et al. (2000) used an experimental design to compare three groups: (a) a

control that received no instructions regarding a 1-hour online discussion, (b) a group receiving

organizing questions to help structure their online communication and (c) a group receiving both

the organizing questions and rules for discussion. The discussion rules stated that group members

should discuss only the organizing questions; that discussion of one question had to be

completed before the next discussion was begun; that the discussion needed to be structured as

an argument, with claims justified and alternative viewpoints considered; and that all participants

should take turns moderating the discussion and making sure that the discussion adhered to the

rules. Hron et al. found statistically significant differences across conditions in the content and

coherence of student postings, but no difference across the three groups in terms of knowledge

acquisition as measured by a multiple-choice test.

Ryan’s study (2007) reached conclusions similar to those of Hron et al. Ryan hypothesized that

exposure to collaborative tools would affect student performance. He compared two groups of

middle school students: a treatment group, which engaged in online learning that included

interaction with instructors and peers using online collaboration tools, and a control group, which

did not have access to or instruction in the use of collaboration tools. Like Hron et al., Ryan

found no significant difference in academic performance between the two groups of online

students.

Choi, Land and Turgeon (2005) used a time-series control-group design to investigate the effects

of providing online scaffolding for generating questions to peers during online group

discussions. Although scaffolds were found to increase the number of questions asked, they did

not affect question quality or learner outcomes.

In summary, mechanisms such as scaffolds or scripts for student group interaction online have

been found to influence the way students engage with each other and with the online material,

but have not been found to improve learning.

Delivery Platform

Several platform options are available for online learning: an exclusively Web-based

environment, e-mail, or mobile phone. The alternative platforms can be used as primary

delivery channels or as supplements to Web-based instruction. Neither of the two studies that

addressed this issue found significant differences across delivery platforms. Shih (2007)

investigated whether student groups who accessed online materials by means of mobile phone

demonstrated significantly different learning outcomes from groups who did so using a

traditional computer; the author found no statistical difference between the two groups.

Similarly, Kerfoot (2008) compared the effects of receiving course materials and information

through a series of e-mails spaced out over time versus accessing the online materials all at once

by means of a traditional Web site and found no statistical difference.

Overall, the controlled studies are too few to support even tentative conclusions concerning the

learning effects of using alternative or multiple delivery platforms for online learning.


Summary

This narrative review has illustrated the many variations in online, individual and group, and

synchronous and asynchronous activities that can be combined in a course or instructional

intervention. The number of Category 3 studies concerning any single practice was insufficient

to warrant a quantitative meta-analysis, and the results varied to such an extent that only

tentative, rather than firm, conclusions can be drawn about promising online learning practices.

The direct comparison of blended and purely online conditions in 10 studies produced mostly

null results, tempering what appeared to be an advantage of blended compared with purely

online instruction in the moderator variable analysis that was conducted as part of the meta-

analysis presented in chapter 3. Although a fair number of Category 3 studies contrasted these

two versions of online learning, few equated instructional content or activities across conditions,

making it difficult to draw conclusions.

With respect to incorporation of multiple media, the evidence available in the Category 3 studies

suggests that inclusion of more media in an online application does not enhance learning when

content is controlled, but some evidence suggests that the learner’s ability to control the learning

media is important (Zhang 2005; Zhang et al. 2006). In contrast, the set of studies using various

manipulations to try to stimulate more active engagement on the part of online learners (such as

use of advance organizers, conceptual maps, or guiding questions) had mostly null results.

The clearest recommendation for practice that can be made on the basis of the Category 3

synthesis is to incorporate mechanisms that promote student reflection on their level of

understanding. A dozen studies have investigated what effects manipulations that trigger learner

reflection and self-monitoring of understanding have on individual students’ online learning

outcomes. Ten of the studies found that the experimental manipulations offered advantages over

online learning that did not provide the trigger for reflection.

Another set of studies explored features usually associated with computer-based instruction,

including the incorporation of quizzes, simulations, and techniques for individualizing

instruction. Providing simple multiple-choice quizzes did not appear to enhance online

learning. The incorporation of simulations produced positive effects in two out of three studies

(Castaneda 2008; Hilbelink 2007). Individualizing online learning by dynamically generating

learning content based on the student’s responses was found to be effective in the two studies

investigating this topic (Grant and Courtoreille 2007; Nguyen 2007).

Attempts to guide the online interactions of groups of learners were less successful than the use

of mechanisms to prompt reflection and self-assessment on the part of individual learners. Some

researchers have suggested that students who learn in online groups provide scaffolds for one

another (Suh 2006).


Finally, readers should be cautioned that research on alternative online learning practices has

been conducted for the most part by professors and other instructors studying their own

courses. Moreover, the combinations of technology, content and activities used

in different experimental conditions have often been ad hoc rather than theory based. As a result,

the field lacks a coherent body of linked studies that systematically test theory-based approaches

in different contexts.


5. Discussion and Implications

The meta-analysis reported here differs from prior meta-analyses of distance learning in several

important respects:

• Only studies of Web-supported learning have been included.

• All effects have been based on objective measures of learning.

• Only studies with controlled designs that met minimum quality criteria have been included.

The corpus of 50 effect sizes extracted from 45 studies meeting these criteria was sufficient to

demonstrate that in recent applications, online learning has been modestly more effective, on

average, than the traditional face-to-face instruction with which it has been compared. It should

be noted, however, that this overall effect can be attributed to the advantage of blended learning

approaches over instruction conducted entirely face-to-face. Of the 11 individual studies with

significant effects favoring the online condition, 9 used a blended learning approach.
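The effect sizes pooled here are standardized mean differences of the kind described by Hedges and Olkin (1985). The following sketch, written in Python with invented study numbers rather than the report's actual data, illustrates how a bias-corrected effect (Hedges' g) is computed for each study and then combined with inverse-variance weights:

```python
# Illustrative sketch (hypothetical numbers, not the report's data):
# computing Hedges' g per study and pooling with inverse-variance weights.
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Bias-corrected standardized mean difference for one study."""
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled
    correction = 1 - 3 / (4 * (n_t + n_c) - 9)  # small-sample correction
    return d * correction

def variance_of_g(g, n_t, n_c):
    """Approximate sampling variance of g."""
    return (n_t + n_c) / (n_t * n_c) + g**2 / (2 * (n_t + n_c))

# Hypothetical studies: (mean_t, mean_c, sd_t, sd_c, n_t, n_c)
studies = [(78, 74, 10, 10, 40, 40),
           (71, 70, 12, 11, 60, 55),
           (83, 77, 9, 10, 25, 30)]

effects = [hedges_g(*s) for s in studies]
weights = [1 / variance_of_g(g, s[4], s[5]) for g, s in zip(effects, studies)]
pooled = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
print(f"pooled effect size: {pooled:+.2f}")
```

With these made-up inputs the weighted average comes out modestly positive; the report's own pooled estimate rests on the 50 extracted effects, not on this arithmetic.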

The test for homogeneity of effects found significant variability in the effect sizes for the

different online learning studies, justifying a search for moderator variables that could explain

the differences in outcomes. The moderator variable analysis found only three moderators

significant at p < .05. Effects were larger when a blended rather than a purely online condition

was compared with face-to-face instruction; when students in the online condition were engaged

in instructor-led or collaborative instruction rather than independent learning; and when the

curricular materials and instruction varied between the online and face-to-face conditions. This

pattern of significant moderator variables is consistent with the interpretation that the advantage

of online conditions in these recent studies stems from aspects of the treatment conditions other

than the use of the Internet for delivery per se.
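A homogeneity test of the kind referred to above is typically Cochran's Q: the weighted sum of squared deviations of each study's effect from the pooled mean, compared against a chi-square distribution with k − 1 degrees of freedom. A minimal sketch, again with invented effects and weights rather than the report's data:

```python
# Hypothetical per-study effects and inverse-variance weights; Q tests
# whether the effects plausibly share a single true value.
effects = [0.05, 0.55, -0.10, 0.70]
weights = [30.0, 25.0, 35.0, 20.0]

pooled = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
Q = sum(w * (g - pooled)**2 for w, g in zip(weights, effects))
df = len(effects) - 1

# The .05 critical value of chi-square with 3 df is 7.815; a Q above it
# indicates significant heterogeneity, justifying a moderator search.
print(f"Q = {Q:.2f} on {df} df -> heterogeneous: {Q > 7.815}")
```

Here the invented effects are spread widely enough that Q exceeds the critical value, mirroring the situation in the meta-analysis that motivated the moderator variable analysis.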

Clark (1983) has cautioned against interpreting studies of instruction in different media as

demonstrating an effect for a given medium inasmuch as conditions may vary with respect to a

whole set of instructor and content variables. That caution applies well to the findings of this

meta-analysis, which should not be construed as demonstrating that online learning is superior as

a medium. Rather, it is the combination of elements in the treatment conditions, which are likely

to include additional learning time and materials as well as additional opportunities for

collaboration, that has proven effective. The meta-analysis findings do not support simply

putting an existing course online, but they do support redesigning instruction to incorporate

additional learning opportunities online.

Several practices and conditions associated with differential effectiveness in distance education

meta-analyses (most of which included nonlearning outcomes such as satisfaction) were not

found to be significant moderators of effects in this meta-analysis of Web-based online learning.

Nor did tests for the incorporation of instructional elements of computer-based instruction (e.g.,

online practice opportunities and feedback to learners) find that these variables made a

difference. Online learning conditions produced better outcomes than face-to-face learning alone,

regardless of whether these instructional practices were used.


The meta-analysis did not find differences in average effect size between studies published

before 2004 (which might have used less sophisticated Web-based technologies than those

available since) and studies published from 2004 on (possibly reflecting the more sophisticated

graphics and animations or more complex instructional designs available). Nor were differences

associated with the nature of the subject matter involved.

Finally, the examination of the influence of study method variables found that effect sizes did not

vary significantly with study sample size or with type of design. It is reassuring to note that, on

average, online learning produced better student learning outcomes than face-to-face instruction

in those studies with random-assignment experimental designs (p < .001) and in those studies

with the largest sample sizes (p < .01).

The relatively small number of studies meeting criteria for inclusion in this meta-analysis limits

the power of tests for moderator variables. A few contrasts that did not attain significance (e.g.,

time on task or type of knowledge tested) might have emerged as significant influences under a

fixed-effects analysis and may prove significant when tested in future meta-analyses with a

larger corpus of studies.

The narrative synthesis of studies comparing variations of online learning provides some

additional insights with respect to designing effective online learning experiences. The practice

with the strongest evidence of effectiveness is inclusion of mechanisms to prompt students to

reflect on their level of understanding as they are learning online. In a related vein, there is some

evidence that online learning environments with the capacity to individualize instruction to a

learner’s specific needs improve effectiveness.

As noted in chapter 4, the results of studies using purely online and blended conditions cast some

doubt on the meta-analysis finding of larger effect sizes for studies blending online and face-to-

face elements. The inconsistency in the implications of the two sets of studies underscores the

importance of recognizing the confounding of practice variables in most studies. Studies using

blended learning also tend to involve more learning time, additional instructional resources, and

course elements that encourage interactions among learners. This confounding leaves open the

possibility that one or all of these other practice variables, rather than the blending of online and

offline media per se, accounts for the particularly positive outcomes for blended learning in the

studies included in the meta-analysis.

Comparison With Meta-Analyses of Distance Learning

Because online learning has much in common with distance learning, it is useful to compare the

findings of the present meta-analysis with the most comprehensive recent meta-analyses in the

distance-learning field. The two most pertinent earlier works are those by Bernard et al. (2004)

and Zhao et al. (2005). As noted above, the corpus in this meta-analysis differed from the earlier

quantitative syntheses, not only in including more recent studies but also in excluding studies

that did not involve Web-based instruction and studies that did not examine an objective student

learning outcome.

Bernard et al. (2004) found advantages for asynchronous over synchronous distance education, a

finding that on the surface appears incongruent with the results reported here. On closer


inspection, however, it turns out that the synchronous distance-education studies in the Bernard

et al. corpus were mostly cases of a satellite classroom yoked to the main classroom where the

instructor taught. It is likely that the nature of the learning experience and extent of collaborative

learning were quite different in the primary and distant classrooms in these studies. For

asynchronous distance education, Bernard et al. also found that the distance-education condition

tended to have more favorable outcomes when opportunities for computer-mediated

communication were available. Online learners in all of the studies in this meta-analysis had

access to computer-mediated communication and in every case there were mechanisms for

asynchronous communication.

Zhao et al. (2005) found advantages for blended learning (combining elements of online and

face-to-face communication) over purely online learning experiences, a finding similar to that of

this meta-analysis. Zhao et al. also found that instructor involvement was a strong mediating

variable. Distance learning outcomes were less positive when instructor involvement was low (as

in “canned” applications), with effects becoming more positive, up to a point, as instructor

involvement increased. At the highest level of instructor involvement (which would suggest that

the instructor became dominant and peer-to-peer learning was minimized), effect size started to

decline in the corpus of studies Zhao et al. examined. Although a somewhat different construct

was tested in the Learning Experience variable used here, the present results are consonant with

those of Zhao et al. Studies in which the online learners worked with digital resources with little

or no teacher guidance were coded here as “independent/active,” and this category was the one

learner experience category for which the advantage of online learning failed to attain statistical

significance at the p < .05 level or better.

The relative disadvantage of independent online learning (called “active” in our conceptual

model) should not be confused with automated mechanisms that encourage students to be more

reflective or more actively engaged with the material they are learning online. As noted above, a

number of studies reviewed in chapter 4 found positive effects for techniques such as prompts

that encourage students to assess their level of understanding or set goals for what they will learn

whereas mechanisms such as guiding questions or advance organizers had mostly null results.

Implications for K–12 Education

The impetus for this meta-analysis of recent empirical studies of online learning was the need to

develop research-based insights into online learning practices for K–12 students. The research

team realized at the outset that a look at online learning studies in a broader set of fields would

be necessary to assemble sufficient empirical research for meta-analysis. As it happened, the

initial search of the literature published between 1996 and 2006 found no studies contrasting K–12 online learning with face-to-face instruction that met methodological quality criteria.23 By performing a second literature search with an expanded time frame (through July 2008), the team was able to greatly expand the corpus of studies with controlled designs and to identify five controlled studies of K–12 online learning with seven contrasts between online and face-to-face conditions. This expanded corpus still comprises a very small number of studies, especially considering the extent to which secondary schools are using online courses and the rapid growth of online instruction in K–12 education as a whole. Educators making decisions about online learning need rigorous research examining the effectiveness of online learning for different types of students and subject matter as well as studies of the relative effectiveness of different online learning practices.

23 The initial literature search identified several K–12 online studies comparing student achievement data collected from both virtual and regular schools (e.g., Cavanaugh et al. 2004; Schollie 2001), but these studies were neither experiments nor quasi-experiments with statistical control for preexisting differences between groups. Some of these K–12 studies used a pre-post, within-subject design without a comparison group; others were quasi-experiments without a statistical control for preexisting differences among study conditions (e.g., Karp and Woods 2003; Long and Stevens 2004; Stevens 1999). Several studies used experimental designs with K–12 students but did not report the data needed to compute or estimate effect sizes. A few experiments compared a K–12 online intervention with a condition in which there was no instruction (e.g., Teague and Riley 2006). Many of the references (8 out of 14) used for the Cavanaugh et al. (2004) meta-analysis of K–12 online studies were databases of raw student performance data and did not describe learning conditions, technology use or learner/instructor characteristics. A recent large-scale study by the Florida TaxWatch (2007) failed to control for preexisting differences between the students taking courses online and those taking them in conventional classrooms.


References

Reference Key

aCategory 1 and 2 studies used in the meta-analysis

bCategory 3 studies used in the narrative summary

cCategory 3 studies also included in the Category 1 and 2 studies

dCategory 3 studies reviewed but not cited in the narrative summary

eAdditional cited references.

aAberson, C. L., D. E. Berger, M. R. Healy, and V. L. Romero. 2003. Evaluation of an

interactive tutorial for teaching hypothesis testing concepts. Teaching of Psychology 30

(1):75–78.

aAl-Jarf, R. S. 2004. The effects of Web-based learning on struggling EFL college writers.

Foreign Language Annals 37 (1):49–57.

dBeal, T., K. J. Kemper, P. Gardiner, and C. Woods. 2006. Long-term impact of four different

strategies for delivering an online curriculum about herbs and other dietary supplements.

BMC Medical Education 6:39.

bBeeckman, D., L. Schoonhoven, H. Boucque, G. VanMaele, and T. Defloor. 2008. Pressure

ulcers: E-learning to improve classification by nurses and nursing students. Journal of

Clinical Nursing 17 (13):1697–1707.

bBeile, P. M., and D. N. Boote. 2002. Library instruction and graduate professional development:

Exploring the effect of learning environments on self-efficacy and learning outcomes.

Alberta Journal of Educational Research 48 (4):364–67.

aBello, G., M. A. Pennisi, R. Maviglia, S. M. Maggiore, M. G. Bocci, L. Montini, and M.

Antonelli. 2005. Online vs. live methods for teaching difficult airway management to

anesthesiology residents. Intensive Care Medicine 31 (4):547–52.

aBenjamin, S. E., D. F. Tate, S. I. Bangdiwala, B. H. Neelon, A. S. Ammerman, J. M. Dodds, and

D.S. Ward. 2008. Preparing child care health consultants to address childhood overweight: A

randomized controlled trial comparing Web to in-person training. Maternal and Child Health

Journal 12(5):662-669.

eBernard, R. M., P. C. Abrami, Y. Lou, E. Borokhovski, A. Wade, L. Wozney, P.A. Wallet, M.

Fiset, and B. Huang. 2004. How does distance education compare with classroom

instruction? A meta-analysis of the empirical literature. Review of Educational Research 74

(3):379–439.

bBernard, R. M., and K. Lundgren-Cayrol. 2001. Computer conferencing: An environment for

collaborative project-based learning in distance education. Educational Research and

Evaluation 7 (2–3):241–61.

aBeyea, J. A., E. Wong, M. Bromwich, W. W. Weston, and K. Fung. 2008. Evaluation of a

particle repositioning maneuver web-based teaching module. The Laryngoscope 118 (1):175–

180.


eBiostat Solutions. 2006. Comprehensive Meta-Analysis (Version 2.2.027). [Computer software].

Mt. Airy, Md.: Biostat Solutions.

bBixler, B. A. 2008. The effects of scaffolding student’s problem-solving process via question

prompts on problem solving and intrinsic motivation in an online learning environment. PhD

diss., The Pennsylvania State University, State College, Penn.

eBransford, J. D., A. L. Brown, and R. R. Cocking. 1999. How people learn: Brain, mind,

experience, and school. Washington, D.C.: National Academy Press.

dBrown, B. A., and K. Ryoo. 2008. Teaching science as a language: A “content-first” approach

to science teaching. Journal of Research in Science Teaching 45 (5):529–53.

b, cCaldwell, E. R. 2006. A comparative study of three instructional modalities in a computer

programming course: Traditional instruction, Web-based instruction, and online instruction.

PhD diss., University of North Carolina at Greensboro.

bCampbell, M., W. Gibson, A. Hall, D. Richards, and P. Callery. 2008. Online vs. face-to-face

discussion in a Web-based research methods course for postgraduate nursing students: A

quasi-experimental study. International Journal of Nursing Studies 45 (5):750–59.

bCastaneda, R. 2008. The impact of computer-based simulation within an instructional sequence

on learner performance in a Web-based environment. PhD diss., Arizona State University,

Tempe.

eCavanaugh, C. 2001. The effectiveness of interactive distance education technologies in K–12

learning: A meta-analysis. International Journal of Educational Telecommunications 7

(1):73–78.

eCavanaugh, C., K. J. Gillan, J. Kromrey, M. Hess, and R. Blomeyer. 2004. The effects of

distance education on K–12 student outcomes: A meta-analysis. Naperville, Ill.: Learning

Point Associates. http://www.ncrel.org/tech/distance/index.html (accessed March 5, 2009).

b, cCavus, N., H. Uzunboylu, and D. Ibrahim. 2007. Assessing the success rate of students using a

learning management system together with a collaborative tool in Web-based teaching of

programming languages. Journal of Educational Computing Research 36 (3):301–21.

eChang, C. C. 2008. Enhancing self-perceived effects using Web-based portfolio assessment.

Computers in Human Behavior 24 (4):1753–71.

bChang, M. M. 2007. Enhancing Web-based language learning through self-monitoring. Journal

of Computer Assisted Learning 23 (3):187–96.

bChen, B. 2007. Effects of advance organizers on learning and retention from a fully Web-based

class. PhD diss., University of Central Florida, Orlando.

eChilds, J. M. 2001. Digital skill training research: Preliminary guidelines for distributed

learning (Final report). Albuquerque, N. Mex.: TRW. Available through

http://www.stormingmedia.us/24/2471/A247193.html (accessed March 5, 2009).

bChoi, I., S. M. Land, and A. J. Turgeon. 2005. Scaffolding peer-questioning strategies to

facilitate metacognition during online small group discussion. Instructional Science 33 (5–

6):483–511.


bChung, S., M.-J. Chung, and C. Severance. 1999, October. Design of support tools and

knowledge building in a virtual university course: Effect of reflection and self-explanation

prompts. Paper presented at the WebNet 99 World Conference on the WWW and Internet

Proceedings, Honolulu, Hawaii. (ERIC Document Reproduction Service No. ED448706).

eClark, R. E. 1983. Reconsidering research on learning from media. Review of Educational

Research 53 (4):445–49.

eClark, R. E. 1994. Media will never influence learning. Educational Technology Research and

Development 42 (2):21–29.

eCohen, J. 1992. A power primer. Psychological Bulletin 112:155–59.

bCook, D. A., D. M. Dupras, W. G. Thompson, and V. S. Pankratz. 2005. Web-based learning in

residents’ continuity clinics: A randomized, controlled trial. Academic Medicine 80 (1):90–

97.

bCook, D. A., M. H. Gelula, D. M. Dupras, and A. Schwartz. 2007. Instructional methods and

cognitive and learning styles in Web-based learning: Report of two randomised trials.

Medical Education 41 (9):897–905.

bCrippen, K. J., and B. L. Earl. 2007. The impact of Web-based worked examples and self-

explanation on performance, problem solving, and self-efficacy. Computers & Education 49

(3):809–21.

b, cDavis, J. D., M. Odell, J. Abbitt, and D. Amos. 1999, March. Developing online courses: A

comparison of Web-based instruction with traditional instruction. Paper presented at the

Society for Information Technology & Teacher Education International Conference,

Chesapeake, Va.

http://www.editlib.org/INDEX.CFM?fuseaction=Reader.ViewAbstract&paper_id=7520

(accessed March 5, 2009).

aDay, T. M., M. R. Raven, and M. E. Newman. 1998. The effects of World Wide Web

instruction and traditional instruction and learning styles on achievement and changes in

student attitudes in a technical writing in agricommunication course. Journal of Agricultural

Education 39 (4):65–75.

aDeBord, K. A., M. S. Aruguete, and J. Muhlig. 2004. Are computer-assisted teaching methods

effective? Teaching of Psychology 31 (1):65–68.

eDede, C., ed. 2006. Online professional development for teachers: Emerging models and

methods. Cambridge, Mass.: Harvard Education Publishing Group.

bDe Wever, B., M. Van Winckel, and M. Valcke 2008. Discussing patient management online:

The impact of roles on knowledge construction for students interning at the paediatric ward.

Advances in Health Sciences Education 13 (1):25–42.

bDinov, I. D., J. Sanchez, and N. Christou. 2008. Pedagogical utilization and assessment of the

statistic online computational resource in introductory probability and statistics courses.

Computers & Education 50 (1):284–300.

dDuphorne, P. L., and C. N. Gunawardena. 2005. The effect of three computer conferencing

designs on critical thinking skills of nursing students. American Journal of Distance

Education 19 (1):37–50.


aEl-Deghaidy, H., and A. Nouby. 2008. Effectiveness of a blended e-learning cooperative

approach in an Egyptian teacher education programme. Computers & Education 51 (3):988–

1006.

aEnglert, C. S., Y. Zhao, K. Dunsmore, N. Y. Collings, and K. Wolbers. 2007. Scaffolding the

writing of students with disabilities through procedural facilitation: Using an Internet-based

technology to improve performance. Learning Disability Quarterly 30 (1):9–29.

bEvans, K. L. 2007. Learning stoichiometry: A comparison of text and multimedia instructional

formats. PhD diss., University of Pittsburgh, Penn.

eFlorida TaxWatch. 2007. Final report: A comprehensive assessment of Florida virtual school.

Tallahassee, Fla.: Florida TaxWatch Center for Educational Performance and Accountability.

www.floridataxwatch.org/resources/pdf/110507FinalReportFLVS.pdf (accessed March 5,

2009).

dFox, E. J., and H. J. Sullivan. 2007. Comparing strategies for teaching abstract concepts in an

online tutorial. Journal of Educational Computing Research 37 (3):307–30.

aFrederickson, N., P. Reed, and V. Clifford. 2005. Evaluating Web-supported learning versus

lecture-based teaching: Quantitative and qualitative perspectives. Higher Education 50

(4):645–64.

bGaddis, B., H. Napierkowski, N. Guzman, and R. Muth. 2000, October. A comparison of

collaborative learning and audience awareness in two computer-mediated writing

environments. Paper presented at the National Convention of the Association for Educational

Communications and Technology, Denver, Colo. (ERIC Document Reproduction Service

No. ED455771).

bGao, T., and J. D. Lehman. 2003. The effects of different levels of interaction on the

achievement and motivational perceptions of college students in a Web-based learning

environment. Journal of Interactive Learning Research 14 (4):367–86.

aGilliver, R. S., B. Randall, and Y. M. Pok. 1998. Learning in cyberspace: Shaping the future.

Journal of Computer Assisted Learning 14 (3):212–22.

bGolanics, J. D., and E. M. Nussbaum. 2008. Enhancing online collaborative argumentation

through question elaboration and goal instructions. Journal of Computer Assisted Learning

24 (3):167–80.

bGrant, L. K., and M. Courtoreille. 2007. Comparison of fixed-item and response-sensitive

versions of an online tutorial. Psychological Record 57 (2):265–72.

dGriffin, T. J. 2007. Liberating the spacing effect from the laboratory: A practical application in a

worldwide Web-based religious education volunteer-teacher training program. PhD diss.,

Utah State University, Logan.

aHairston, N. R. 2007. Employees’ attitudes toward e-learning: Implications for policy in

industry environments. PhD diss., University of Arkansas, Fayetteville.

eHarlen, W., and S. Doubler. 2004. Can teachers learn through enquiry online? Studying

professional development in science delivered online and on-campus. International Journal

of Science Education 26 (10):1247–67.


aHarris, J. M., T. E. Elliott, B. E. Davis, C. Chabal, J. V. Fulginiti, and P. G. Fine. 2007.

Educating generalist physicians about chronic pain: Live experts and online education can

provide durable benefits. Pain Medicine 9 (5):555–63.

eHedges, L. V., and I. Olkin. 1985. Statistical methods for meta-analysis. Orlando, Fla.:

Academic Press.

bHilbelink, A. J. 2007. The effectiveness and user perception of 3-dimensional digital human

anatomy in an online undergraduate anatomy laboratory. PhD diss., University of South Florida, Tampa.

eHiltz, S. R., and R. Goldman, eds. 2005. Learning together online: Research on asynchronous

learning networks. Mahwah, N.J.: Lawrence Erlbaum.

bHron, A., F. W. Hesse, U. Cress, and C. Giovis. 2000. Implicit and explicit dialogue structuring

in virtual learning groups. British Journal of Educational Psychology 70 (1):53–64.

eHu, P. J. H., W. Hui, T. H. K. Clark, and K. Y. Tam. 2007. Technology-assisted learning and

learning style: A longitudinal field experiment. IEEE Transactions on Systems, Man, and Cybernetics, Part A 37 (6):1099–1112.

aHugenholtz, N. I. R., E. M. de Croon, P. B. Smits, F. J. H. van Dijk, and K. Nieuwenhuijsen.

2008. Effectiveness of e-learning in continuing medical education for occupational

physicians. Occupational Medicine 58 (5):370–72.

eJaffe, R., E. Moir, E. Swanson, and G. Wheeler. 2006. EMentoring for Student Success: Online

mentoring and professional development for new science teachers. In Online professional

development for teachers: Emerging models and methods, ed. C. Dede, 89–116. Cambridge,

Mass.: Harvard Education Press.

aJang, K. S., S. Y. Hwang, S. J. Park, Y. M. Kim, and M. J. Kim. 2005. Effects of a Web-based

teaching method on undergraduate nursing students’ learning of electrocardiography. The

Journal of Nursing Education 44 (1):35–39.

dJia, J. 2007. The effects of concept mapping as advance organizers in instructional designs for

distance learning programs. PhD diss., Wayne State University, Detroit, Mich.

eKarp, G. G., and M. L. Woods. 2003. Wellness NutriFit online learning in physical education

for high school students. The Journal of Interactive Online Learning 2,

http://www.ncolr.org/jiol/Issues/PDF/2.2.3.pdf (accessed March 5, 2009).

bKeefe, T. J. 2003. Using technology to enhance a course: The importance of interaction.

EDUCAUSE Quarterly 1:24–34.

dKemper, K. J., P. Gardiner, J. Gobble, A. Mitra, and C. Woods. 2006. Randomized controlled

trial comparing four strategies for delivering e-curriculum to health care professions. BMC

Medical Education 6: 2. http://www.biomedcentral.com/1472-6920/6/2 (accessed March 5,

2009).

bKerfoot, B. P. 2008. Interactive spaced education versus Web-based modules for teaching

urology to medical students: a randomized controlled trial. The Journal of Urology 179

(6):2351–57.


dKerfoot, B. P., H. E. Baker, M. O. Koch, D. Connelly, D. B. Joseph, and M. L. Ritchey. 2007.

Randomized, controlled trial of spaced education to urology residents in the United States

and Canada. The Journal of Urology 177 (4):1481–87.

dKerfoot, B. P., P. R. Conlin, T. Travison, and G. T. McMahon. 2007. Web-based education in

systems-based practice: A randomized trial. Archives of Internal Medicine 167 (4):361–66.

dKim, K.-H. 2006. Enhancement of secondary special education teachers’ knowledge and

competencies in working with families through online training modules. PhD diss.,

University of Kansas, Lawrence.

dKock, N., and R. C. J. Chatelain-Jardon. 2008. An experimental study of simulated Web-based

threats and their impact on knowledge communication effectiveness. IEEE Transactions on

Professional Communication 51:183–97.

dLawless, K. A., P. G. Schrader, and H. J. Mayall. 2007. Acquisition of information online:

Knowledge, navigation and learning outcomes. Journal of Literacy Research 39 (3):289–

306.

dLee, J.-L., G. Orwig, G. A. Gunter, and L. Witta. 2006. The effect of cognitive styles upon the

completion of a visually-oriented component of online instruction. PhD diss., University of

Central Florida, Orlando.

bLewis, B. A. 2002. The effectiveness of discussion forums in online learning. Brazilian Review

of Open and Distance Learning 1 (1),

http://www.abed.org.br/publique/cgi/cgilua.exe/sys/start.htm?infoid=16&sid=73&UserActiv

eTemplate=1por (accessed March 5, 2009).

eLipsey, M. W., and D. B. Wilson. 2000. Practical meta-analysis. Vol. 49 of Applied social

research methods series. Thousand Oaks, Calif.: Sage.

bLoar, R. S. 2007. The impact of a computer simulated case study on nurse practitioner students’

declarative knowledge and clinical performance. PhD diss., University of Illinois at Urbana-

Champaign.

aLong, M., and H. Jennings. 2005. “Does it work?”: The impact of technology and professional

development on student achievement. Calverton, Md.: Macro International.

eLong, J. D., and K. R. Stevens. 2004. Using technology to promote self-efficacy for healthy

eating in adolescents. Journal of Nursing Scholarship 36 (2):134–39.

aLowry, A. E. 2007. Effects of online versus face-to-face professional development with a team-

based learning community approach on teachers’ application of a new instructional practice.

PhD diss., Johns Hopkins University, Baltimore, Md.

dLu, R., and L. Bol. 2007. A comparison of anonymous versus identifiable e-peer review on

college student writing performance and the extent of critical feedback. Journal of

Interactive Online Learning 6 (2):100–115.

bMaag, M. 2004. The effectiveness of an interactive multimedia learning tool on nursing

students’ math knowledge and self-efficacy. Computers, Informatics, Nursing 22 (1):26–33.

eMachtmes, K., and J. W. Asher. 2000. A meta-analysis of the effectiveness of telecourses in

distance education. The American Journal of Distance Education 14 (1):27–46.


aMaki, W. S., and R. H. Maki. 2002. Multimedia comprehension skill predicts differential

outcomes of Web-based and lecture courses. Journal of Experimental Psychology: Applied 8

(2):85–98.

bMcKethan, R. N., M. W. Kernodle, D. Brantz, and J. Fischer. 2003. Qualitative analysis of the

overhand throw by undergraduates in education using a distance learning computer program.

Perceptual and Motor Skills 97 (3 Pt. 1):979–89.

bMcNamara, J. M., R. L. Swalm, D. J. Stearne, and T. M. Covassin. 2008. Online weight

training. Journal of Strength and Conditioning Research 22 (4):1164–68.

aMentzer, G. A., J. Cryan, and B. Teclehaimanot. 2007. A comparison of face-to-face and Web-

based classrooms. Journal of Technology and Teacher Education 15 (2):233–46.

aMidmer, D., M. Kahan, and B. Marlow. 2006. Effects of a distance learning program on

physicians’ opioid- and benzodiazepine-prescribing skills. The Journal of Continuing

Education in the Health Professions 26 (4):294–301.

eMoore, M. 1994. Administrative barriers to adoption of distance education. The American

Journal of Distance Education 8 (3):1–4.

bNelson, B. C. 2007. Exploring the use of individualized, reflective guidance in an educational

multi-user virtual environment. Journal of Science Education and Technology 16 (1):83–97.

bNguyen, F. 2007. The effect of an electronic performance support system and training as

performance interventions. PhD diss., Arizona State University, Tempe.

aNguyen, H. Q., D. Donesky-Cuenco, S. Wolpin, L. F. Reinke, J. O. Benditt, S. M. Paul, and V.

Carrieri-Kohlman. 2008. Randomized controlled trial of an Internet-based versus face-to-face

dyspnea self-management program for patients with chronic obstructive pulmonary disease:

Pilot study. Journal of Medical Internet Research 10 (2). http://www.jmir.org/2008/2/e9/

(accessed March 5, 2009).

aOcker, R. J., and G. J. Yaverbaum. 1999. Asynchronous computer-mediated communication

versus face-to-face collaboration: Results on student learning, quality and satisfaction. Group

Decision and Negotiation 8 (5):427–40.

aO’Dwyer, L. M., R. Carey, and G. Kleiman. 2007. A study of the effectiveness of the Louisiana

Algebra I online course. Journal of Research on Technology in Education 39 (3):289–306.

dO’Leary, P. F., and T. J. Quinlan. 2007. Learner-instructor telephone interaction: Effects on

satisfaction and achievement of online students. American Journal of Distance Education 21

(3):133–43.

aPadalino, Y., and H. H. C. Peres. 2007. E-learning: A comparative study for knowledge

apprehension among nurses. Revista Latino-Americana de Enfermagem 15:397–403.

eParker, M. J. 1999, June 22–24. Are academic behaviors fostered in Web-based environments?

Paper presented at the National Education Computing Conference, Atlantic City, N.J. (ERIC

Document Reproduction Service No. ED432993).

ePearson, P. D., R. E. Ferdig, R. L. Blomeyer, Jr., and J. Moran. 2005. The effects of technology

on reading performance in the middle school grades: A meta-analysis with recommendations


for policy. Naperville, Ill.: Learning Point Associates. (ERIC Document Reproduction

Service No. ED489534).

aPeterson, C. L., and N. Bond. 2004. Online compared to face-to-face teacher preparation for

learning standards-based planning skills. Journal of Research on Technology in Education 36

(4):345–61.

ePicciano, A. G., and J. Seaman. 2007. K–12 online learning: A survey of U.S. school district

administrators. Boston: Sloan Consortium. http://www.sloan-c.org/publications/survey/K-

12_06.asp (accessed March 5, 2009).


bPoirier, C. R., and R. S. Feldman. 2004. Teaching in cyberspace: Online versus traditional

instruction using a waiting-list experimental design. Teaching of Psychology 31 (1):59–62.

eRiel, M., and L. Polin. 2004. Online communities: Common ground and critical differences in

designing technical environments. In Designing for virtual communities in the service of

learning, ed. S. A. Barab, R. Kling, and J. H. Gray, 16–50. Cambridge, Mass.: Cambridge

University Press.

aRockman et al. 2007. ED PACE final report. Submitted to the West Virginia Department of

Education. San Francisco: Author. www.rockman.com/projects/146.ies.edpace/finalreport

(accessed March 5, 2009).

dRomanov, K., and A. Nevgi. 2006. Learning outcomes in medical informatics: Comparison of a

WebCT course with ordinary web site learning material. International Journal of Medical

Informatics 75 (2):156–62.

bRuchti, W. P., and M. R. Odell. 2002, February. Comparison and evaluation of online and

classroom instruction in elementary science teaching methods courses. Paper presented at the

1st Northwest NOVA Cyber-Conference, Newberg, Ore.

http://nova.georgefox.edu/nwcc/arpapers/uidaho.pdf (accessed March 5, 2008).

bRyan, R. 2007. The effects of Web-based social networks on student achievement and

perception of collaboration at the middle school level. PhD diss., Touro University

International, Cypress, Calif.

bSaito, H., and K. Miwa. 2007. Construction of a learning environment supporting learners’

reflection: A case of information seeking on the Web. Computers & Education 49 (2):214–

29.

dSchaad, D. C., E. A. Walker, F. M. B. Wolf, M. Douglas, S. M. Thielke, and L. Oberg. 1999.

Evaluating the serial migration of an existing required course to the World Wide Web.

Academic Medicine 74 (10):84–86.

dScheines, R., G. Leinhardt, J. Smith, and K. Cho. 2005. Replacing lecture with Web-based

course materials. Journal of Educational Computing Research 32 (1):1–25.

aSchilling, K., J. Wiecha, D. Polineni, and S. Khalil. 2006. An interactive Web-based curriculum

on evidence-based medicine: Design and effectiveness. Family Medicine 38 (2):126–32.


b, cSchmeeckle, J. M. 2003. Online training: An evaluation of the effectiveness and efficiency of

training law enforcement personnel over the Internet. Journal of Science Education and

Technology 12 (3):205–60.

dSchmidt, K. 2002. Classroom action research: A case study assessing students’ perceptions and

learning outcomes of classroom teaching versus online teaching. Journal of Industrial

Teacher Education 40 (1):45–59.

bSchnitman, I. 2007. The dynamics involved in Web-based learning environment (WLE)

interface design and human-computer interactions (HCI): Connections with learning

performance. PhD diss., West Virginia University, Morgantown.

aSchoenfeld-Tacher, R., S. McConnell, and M. Graham. 2001. Do no harm: A comparison of the

effects of online vs. traditional delivery media on a science course. Journal of Science

Education and Technology 10 (3):257–65.

eSchollie, B. 2001. Student achievement and performance levels in online education research

study. Edmonton, Alberta: Alberta Online Consortium.

http://www.albertaonline.ab.ca/pdfs/AOCresearch_full_report.pdf (accessed March 5, 2008).

bSchroeder, B. A. 2006. Multimedia-enhanced instruction in online learning environments. PhD

diss., Boise State University, Boise, Idaho.

bSchutt, M. 2007. The effects of instructor immediacy in online learning environments. PhD

diss., University of San Diego and San Diego State University, San Diego, Calif.

eSchwen, T. M., and N. Hara. 2004. Community of practice: A metaphor for online design. In

Designing for virtual communities in the service of learning, ed. S. A. Barab, R. Kling, and J.

H. Gray, 154–78. Cambridge, U.K.: Cambridge University Press.

bScoville, S. A., and T. D. Buskirk. 2007. Traditional and virtual microscopy compared

experimentally in a classroom setting. Clinical Anatomy 20 (5):565–70.

aSexton, J. S., M. R. Raven, and M. E. Newman. 2002. A comparison of traditional and World

Wide Web methodologies, computer anxiety, and higher order thinking skills in the inservice

training of Mississippi 4-H extension agents. Journal of Agricultural Education 43 (3):25–

36.

eSetzer, J. C., and L. Lewis. 2005. Distance education courses for public elementary and

secondary school students: 2002–03. NCES No. 2005-010. Washington, D.C.: National

Center for Education Statistics.

bShen, P. D., T. H. Lee, and C. W. Tsai. 2007. Applying Web-enabled problem-based learning

and self-regulated learning to enhance computing skills of Taiwan’s vocational students: A

quasi-experimental study of a short-term module. Electronic Journal of e-Learning 5

(2):147–56.

bShih, Y. E. 2007. Dynamic language learning: Comparing mobile language learning with online

language learning. PhD diss., Capella University, Minneapolis, Minn.

eSitzmann, T., K. Kraiger, D. Stewart, and R. Wisher. 2006. The comparative effectiveness of

Web-based and classroom instruction: A meta-analysis. Personnel Psychology 59:623–64.


bSmith, C. M. 2006. Comparison of Web-based instructional design strategies in a pain

management program for nursing professional development. PhD diss., State University of

New York at Buffalo.

aSpires, H. A., C. Mason, C. Crissman, and A. Jackson. 2001. Exploring the academic self within

an electronic mail environment. Research and Teaching in Developmental Education 17

(2):5–14.

bStanley, O. L. 2006. A comparison of learning outcomes by ‘in-course’ evaluation techniques

for an online course in a controlled environment. The Journal of Educators Online 3 (2):1–

16.

eStevens, K. 1999. Two Canadian approaches to teaching biology, chemistry, mathematics and

physics to senior high school students in virtual classes. Paper presented at the Australasian

Science Education Research Association, Rotorua, New Zealand. (ERIC Document

Reproduction Service No. ED451987).

bSuh, S. 2006. The effect of using guided questions and collaborative groups for complex

problem solving on performance and attitude in a Web-enhanced learning environment. PhD

diss., Florida State University, Tallahassee.

aSun, K., Y. Lin, and C. Yu. 2008. A study on learning effect among different learning styles in a

Web-based lab of science for elementary school students. Computers & Education 50

(4):1411–22.

aSuter, W. N., and M. K. Perry. 1997. Evaluation by electronic mail. Paper presented at the

annual meeting of the Mid-South Educational Research Association, Memphis, Tenn. (ERIC

Document Reproduction Service No. ED415269).

eTallent-Runnels, M. K., J. A. Thomas, W. Y. Lan, S. Cooper, T. C. Ahern, S. M. Shaw, and X.

Liu. 2006. Teaching courses online: A review of research. Review of Educational Research

76:93–135.

bTantrarungroj, P. 2008. Effect of embedded streaming video strategy in an online learning

environment on the learning of neuroscience. PhD diss., Indiana State University, Terre

Haute.

eTeague, G., and R. H. Riley. 2006. Does it improve high school students’ ability to perform

cardiopulmonary resuscitation in a simulated environment? Resuscitation 71 (3):352–57.

dTian, L., S. Tang, W. Cao, K. Zhang, V. Li, and R. Detels. 2007. Evaluation of a Web-based

intervention for improving HIV/AIDS knowledge in rural Yunnan, China. AIDS 21 (8):137–

42.

bTselios, N. K., N. M. Avouris, A. Dimitracopoulou, and S. Daskalaki. 2001. Evaluation of

distance-learning environments: Impact of usability on student performance. International

Journal of Educational Telecommunications 7 (4):355–78.

eTucker, B. 2007, June. Laboratories of reform: Virtual high schools and innovation in public

education. Washington, D.C.: Education Sector Reports.

http://www.butlertech.org/ek_sitepath/uploadedfiles/Teen_Education/Butler_Tech_Online/L

aboratories%20of%20Reform.pdf (accessed March 5, 2009).


aTurner, M. K., S. R. Simon, K. C. Facemyer, L. M. Newhall, and T. L. Veach. 2006. Web-based

learning versus standardized patients for teaching clinical diagnosis: A randomized,

controlled, crossover trial. Teaching and Learning in Medicine 18 (3):208–14.

aUrban, C. Q. 2006. The effects of using computer-based distance education for supplemental

instruction compared to traditional tutorial sessions to enhance learning for students at-risk

for academic difficulties. PhD diss., George Mason University, Fairfax, Va.

eUzunboylu, H. 2004. The effectiveness of Web assisted English language instruction on the

achievement and attitude of the students. In Proceedings of World Conference on

Educational Multimedia, Hypermedia and Telecommunications 2004, ed. L. Cantoni & C.

McLoughlin, 727–33. Chesapeake, Va.: Association for the Advancement of Computing in

Education. (ERIC Document Reproduction Service No. ED490528).

dVance, D., J. Dawson, V. Wadley, J. Edwards, D. Roenker, and M. Rizzo. 2007. The accelerate

study: The longitudinal effect of speed of processing training on cognitive performance of

older adults. Rehabilitation Psychology 52 (1):89–96.

aVandeweerd, J.-M. E. F., J. C. Davies, G. L. Pichbeck, and J. C. Cotton. 2007. Teaching

veterinary radiography by e-learning versus structured tutorial: A randomized, single-blinded

controlled trial. Journal of Veterinary Medical Education 34 (2):160–67.

eVrasidas, C., and G. V. Glass. 2004. Teacher professional development: Issues and trends. In

Online professional development for teachers, ed. C. Vrasidas and G. V. Glass, 1–12.

Greenwich, Conn.: Information Age.

aWallace, P. E., and R. B. Clariana. 2000. Achievement predictors for a computer-applications

module delivered online. Journal of Information Systems Education 11 (1/2):13–18.

aWang, L. 2008. Developing and evaluating an interactive multimedia instructional tool:

Learning outcomes and user experiences of optometry students. Journal of Educational

Multimedia and Hypermedia 17 (1):43–57.

bWang, K. H., T. H. Wang, W. L. Wang, and S. C. Huang. 2006. Learning styles and formative

assessment strategy: Enhancing student achievement in Web-based learning. Journal of

Computer Assisted Learning 22 (3):207–17.

bWeinberger, A., B. Ertl, F. Fischer, and H. Mandl. 2005. Epistemic and social scripts in

computer supported collaborative learning. Instructional Science 33 (1):1–30. [Two studies

are reported.]

eWestEd with Edvance Research. 2008. Evaluating online learning: Challenges and strategies for

success. Washington, D.C.: U.S. Department of Education. http://evalonline.ed.gov/

(accessed March 5, 2009).

eWhat Works Clearinghouse. 2007. Technical details of WWC-conducted computations.

Washington, D.C.: U.S. Department of Education.

http://ies.ed.gov/ncee/wwc/pdf/conducted_computations.pdf (accessed March 5, 2009).

eWhitehouse, P. L., L. A. Breit, E. M. McCloskey, D. J. Ketelhut, and C. Dede. 2006. An

overview of current findings from empirical research on online teacher professional

development. In Online professional development for teachers: Emerging models and

methods, ed. C. Dede, 13–29. Cambridge, Mass.: Harvard Education Press.


eWisher, R. A., and T. M. Olson. 2003. The effectiveness of Web-based training. Research

Report No. 1802. Alexandria, Va.: U.S. Army Research Institute.

aZacharia, Z. C. 2007. Comparing and combining real and virtual experimentation: An effort to

enhance students’ conceptual understanding of electric circuits. Journal of Computer

Assisted Learning 23 (2):120–32.

eZandberg, I., and L. Lewis. 2008. Technology-based distance education courses for public

elementary and secondary school students: 2002–03 and 2004–05. NCES No. 2008-008.

Washington, D.C.: National Center for Education Statistics, Institute of Education Sciences,

U.S. Department of Education.

b, cZhang, D. 2005. Interactive multimedia-based e-learning: A study of effectiveness. American

Journal of Distance Education 19 (3):149–62.

b, cZhang, D., L. Zhou, R. O. Briggs, and J. F. Nunamaker, Jr. 2006. Instructional video in e-

learning: Assessing the impact of interactive video on learning effectiveness. Information

and Management 43 (1):15–27.

bZhang, K. 2004. Effects of peer-controlled or externally structured and moderated online

collaboration on group problem solving processes and related individual attitudes in well-

structured and ill-structured small group problem solving in a hybrid course. PhD diss.,

Pennsylvania State University, State College.

eZhao, Y., J. Lei, B. Yan, C. Lai, and H. S. Tan. 2005. What makes the difference? A practical

analysis of research on the effectiveness of distance education. Teachers College Record 107

(8):1836–84.

eZirkle, C. 2003. Distance education and career and technical education: A review of the research

literature. Journal of Vocational Education Research 28 (2):161–81.


Appendix: Meta-Analysis Methodology

Terms and Processes Used in the Database Searches

In March 2007, researchers performed searches through the following four data sources:

1. Electronic research databases. Using a common set of keywords (see Exhibit A-1),

searches were performed in ERIC, PsycINFO, PubMed, ABI/INFORM, and UMI

ProQuest Digital Dissertations. In addition, to make sure that studies of online

learning in teacher professional development and career technical education were

included, the additional sets of keywords shown in Exhibit A-2 were used in further searches of ERIC and PsycINFO.

2. Recent meta-analyses and narrative syntheses. Researchers reviewed the lists of

studies included in Bernard et al. (2004), Cavanaugh et al. (2004), Childs (2001),

Sitzmann et al. (2006), Tallent-Runnels et al. (2006), Wisher and Olson (2003), and

Zhao et al. (2005) for possible inclusions. Additionally, for teacher professional

development and career technical education, references from recent narrative research

syntheses in those fields (Whitehouse et al. 2006; Zirkle 2003) were examined to

identify potential studies for inclusion.

3. Key journals. Abstracts were manually reviewed for articles published since 2005 in

American Journal of Distance Education, Journal of Distance Education (Canada),

Distance Education (Australia), International Review of Research in Distance and

Open Education, and Journal of Asynchronous Learning Networks. In addition, the

Journal of Technology and Teacher Education and Career and Technical Education

Research (formerly known as Journal of Vocational Education Research) were

manually searched.

4. Google Scholar searches. To complement these targeted searches, researchers used

limiting parameters and sets of keywords (available from the authors of this report) in

the Google Scholar search engine.


Exhibit A-1. Terms for Initial Research Database Search

Technology and Education/Training Terms: Distance education; Distance learning; E-learning; Online education; Online learning; Online training; Online course; Virtual learning; Virtual training; Virtual & course; Internet & learning; Internet & training; Internet & course; Web-based learning; Web-based instruction; Web-based course; Web-based training; "Distributed learning"

Study Design Terms (a): Control group; Comparison group; Treatment group; Experimental

a. All four terms were used in one query with "OR" if the database allowed.
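Per that footnote, queries of this shape can be assembled programmatically. A hypothetical sketch in Python: the term lists here are a small subset of Exhibit A-1 for illustration, and the AND/OR syntax is generic rather than any particular database's.

```python
# Hypothetical query assembly; a subset of the Exhibit A-1 terms.
TECH_TERMS = ["Distance education", "E-learning", "Online learning", "Web-based instruction"]
DESIGN_TERMS = ["Control group", "Comparison group", "Treatment group", "Experimental"]

# The four study-design terms are OR-combined once...
design_clause = "(" + " OR ".join(f'"{t}"' for t in DESIGN_TERMS) + ")"
# ...and each technology term is then searched against that clause.
queries = [f'"{tech}" AND {design_clause}' for tech in TECH_TERMS]
```

Each resulting string is one search, e.g. `"Distance education" AND ("Control group" OR "Comparison group" OR "Treatment group" OR "Experimental")`.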

Exhibit A-2. Terms for Additional Database Searches for Online Career Technical Education and Teacher Professional Development

Education Terms: Career education; Vocational education; Teacher education; Teacher mentoring; Teacher professional development; Teacher training; Technical education

Technology Terms: Distance; Distributed; E-learning; Internet; Online; Virtual; Web-based

Study Design Terms: Control group; Comparison group; Experimental; Randomized; Treatment group


Additional Sources of Articles

Exhibit A-3 lists the sources for the resulting 502 articles that went through full-text screening.

Exhibit A-3. Sources for Articles in the Full-Text Screening (Number of Articles Identified and Passing Initial Screening)

Total retained for full-text screen: 502

Source of articles in full-text screen:
Electronic research database searches: 316
Additional database searches for teacher professional development and career technical education: 6
Recent meta-analyses: 171
Manual review of key journals: 19
Google Scholar searches: 31
Recommendations from experts: 3
Overlaps: –36
Unretrievable: –8
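The counts in Exhibit A-3 reconcile exactly: the per-source tallies, less overlaps and unretrievable items, sum to the 502 articles retained. A quick arithmetic check (illustrative Python, not part of the original report):

```python
# Counts transcribed from Exhibit A-3; negative entries are deductions.
sources = {
    "Electronic research database searches": 316,
    "Teacher PD / career technical education searches": 6,
    "Recent meta-analyses": 171,
    "Manual review of key journals": 19,
    "Google Scholar searches": 31,
    "Recommendations from experts": 3,
    "Overlaps": -36,
    "Unretrievable": -8,
}
total_retained = sum(sources.values())  # 502, the full-text screening total
```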


Effect Size Extraction

Of the 176 studies passing the full-text screening, 99 were identified as having at least one

contrast between online learning and face-to-face or offline learning (Category 1) or between

blended learning and face-to-face/offline learning (Category 2). These studies were transferred to

quantitative analysts for effect size extraction.

Numerical and statistical data contained in the studies were extracted for analysis with

Comprehensive Meta-Analysis software (Biostat Solutions 2006). Data provided in the form of t-

tests, F-tests, correlations, p-levels, and frequencies were used for this purpose.

During the data extraction phase, it became apparent that one set of studies rarely provided

sufficient data for Comprehensive Meta-Analysis calculation of an effect size. Quasi-

experimental studies that used hierarchical linear modeling or analysis of covariance with

adjustment for pretests and other learner characteristics through covariates typically did not

report some of the data elements needed to compute an effect size. For studies using hierarchical

linear modeling to analyze effects, typically the regression coefficient on the treatment status

variable (treatment or control), its standard error, and a p-value and sample sizes for the two

groups were reported. For analyses of covariance, typically the adjusted means and F-statistic

were reported along with group sample sizes. In almost all cases, the unadjusted standard

deviations for the two groups were not reported and could not be computed because the pretest-

posttest correlation was not provided. Following the advice of Robert Bernard, the chief meta-

analysis expert on the project’s Technical Working Group, analysts decided to retain these

studies and to use a conservative estimate of the pretest-posttest correlation (r = .70) in

estimating an effect size for those studies where the pretest was the same measure as the posttest

and using a pretest-posttest correlation of r = .50 when it was not. These effect sizes were

flagged in the coding as “estimated effect sizes,” as were effect sizes computed from t tests, F

tests, and p levels.
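The estimation procedure described above reduces to simple arithmetic. A minimal sketch in Python (illustrative only; the function names and inputs are assumptions, not tooling from the project) of recovering a standardized mean difference from a t-statistic, or from an ANCOVA F on adjusted means rescaled by the assumed pretest-posttest correlation:

```python
import math

def d_from_t(t_stat, n1, n2):
    """Standardized mean difference from an independent-samples t-test:
    d = t * sqrt(1/n1 + 1/n2)."""
    return t_stat * math.sqrt(1.0 / n1 + 1.0 / n2)

def d_from_ancova(f_stat, n1, n2, r_pre_post=0.70):
    """Estimate d from an ANCOVA F-statistic on adjusted means.

    The F-test uses the covariate-adjusted error term, so the implied
    difference is in the adjusted metric; multiplying by sqrt(1 - r^2)
    rescales it to the raw-score metric, where r is the assumed
    pretest-posttest correlation (.70 when pretest and posttest were
    the same measure, .50 otherwise, per the rule described above).
    """
    d_adjusted = math.sqrt(f_stat * (n1 + n2) / (n1 * n2))
    return d_adjusted * math.sqrt(1.0 - r_pre_post ** 2)
```

For example, F = 4.0 with 50 students per arm implies an adjusted d of 0.40, which the r = .70 rescaling shrinks to roughly 0.29 in the raw-score metric.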

In extracting effect size data, the analysts followed a set of rules:

• The unit of analysis was the independent contrast between online condition and face-to-face condition (Category 1) or between blended condition and face-to-face condition

(Category 2). Some studies reported more than one contrast, either by reporting more

than one experiment or by having multiple treatment conditions (e.g., online vs. blended

vs. face-to-face) in a single experiment.

• When there were multiple treatment groups or multiple control groups and the nature of

the instruction in the groups did not differ considerably (e.g., two treatment groups both

fell into the “blended” instruction category), then the weighted mean of the groups and

pooled standard deviation were used.

• When there were multiple treatment groups or multiple control groups and the nature of

the instruction in the groups did differ considerably (e.g., one treatment was purely online

whereas the other treatment was blended instruction, both compared against the face-to-

face condition), then analysts treated them as independent contrasts.


• In general, one learning outcome finding was extracted from each study. When multiple

learning outcome data were reported (e.g., assignments, midterm and final examinations,

grade point averages, grade distributions), the outcome that could be expected to be more

stable and more closely aligned to the instruction was extracted (e.g., final examination

scores instead of quizzes). However, in some studies, no learning outcome had obvious

superiority over the others. In such cases, analysts extracted multiple contrasts from the

study and calculated the weighted average of the multiple outcome scores if the outcome

measures were similar (e.g., two final tests, one testing procedural skills and the other

testing declarative knowledge). For example, in one study, analysts retained two outcome

findings because the outcome measures were quite different (Schilling et al. 2006). One

measure was a multiple-choice test, examining basic knowledge, whereas the other was a

performance-based assessment, testing students’ strategic and problem-solving skills in

the context of ill-structured problems.

• Learning outcome findings were extracted at the individual level. Analysts did not extract group-level learning outcomes (e.g., scores for a group product). Too few group products were included in the studies to support analyses of this variable.

The review of the 99 studies for effect size calculation produced 50 independent effect sizes (27 for Category 1 and 23 for Category 2) from 45 studies; 54 studies did not report sufficient data to support effect-size calculation.
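The collapsing rule above (n-weighted mean plus pooled standard deviation when arms are comparable) and the resulting independent contrast can be sketched as follows. This is a minimal illustration of the stated rules, not the analysts' actual code; it assumes each group is summarized as an (n, mean, sd) triple, and the numbers are hypothetical.

```python
import math

def combine_groups(groups):
    """Collapse comparable arms (e.g., two 'blended' treatment groups)
    into one, using the n-weighted mean and the pooled standard deviation.
    Each group is an (n, mean, sd) triple."""
    n_total = sum(n for n, _, _ in groups)
    mean = sum(n * m for n, m, _ in groups) / n_total
    pooled_var = sum((n - 1) * sd ** 2 for n, _, sd in groups) / (n_total - len(groups))
    return n_total, mean, math.sqrt(pooled_var)

def contrast_effect_size(treatment, control):
    """Standardized mean difference for one independent contrast
    (treatment vs. face-to-face control), each an (n, mean, sd) triple."""
    n_t, m_t, sd_t = treatment
    n_c, m_c, sd_c = control
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    return (m_t - m_c) / pooled_sd

# Two comparable blended arms collapsed, then contrasted with face-to-face:
blended = combine_groups([(20, 78.0, 10.0), (30, 80.0, 10.0)])
```

A treatment arm that differed considerably (e.g., purely online) would instead be passed to `contrast_effect_size` as its own independent contrast.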

Coding of Study Features

All studies that provided enough effect size data were coded for their study features and for study

quality. The top-level coding structure, incorporating refinements made after pilot testing, is

shown in Exhibit A-4. (The full coding structure is available from the authors of this report.)

Twenty percent of the studies with sufficient data to compute effect size were coded by two

researchers. The interrater reliability across these double-coded studies was 86.4 percent. As a

result of analyzing coder disagreements, some definitions and decision rules for some codes were

refined; other codes that required information missing in the vast majority of documents or that

proved difficult to code reliably (e.g., indication of whether the instructor was certified or not)

were eliminated. A single researcher coded the remaining studies.
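The 86.4 percent figure above is a percent-agreement statistic. A minimal sketch of how such a figure can be computed, assuming each coder's decisions are collected in a parallel list (our framing, not the report's):

```python
def percent_agreement(coder_a, coder_b):
    """Interrater reliability as simple percent agreement: the share of
    coding decisions on which two coders assigned the same value."""
    if len(coder_a) != len(coder_b):
        raise ValueError("coders must rate the same set of items")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

# e.g., two coders agreeing on 3 of 4 coding decisions
agreement = percent_agreement(["online", "K-12", "quiz", "yes"],
                              ["online", "K-12", "exam", "yes"])
```

Disagreements flagged this way are exactly the cases the text describes: they prompt refined definitions and decision rules, or elimination of codes that cannot be applied reliably.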


Exhibit A-4. Top-level Coding Structure for the Meta-analysis

Study Feature Coding Categories
• Study type
• Type of publication
• Year of publication
• Study author
• Whether the instructor was trained in online instruction
• Learner type
• Learner age
• Learner incentive for involvement in the study
• Learning setting
• Subject matter
• Treatment duration
• Dominant approach to learner control
• Media features
• Opportunity for face-to-face contact with the instructor
• Opportunity for face-to-face contact with peers
• Opportunity for asynchronous computer-mediated communication with the instructor
• Opportunity for asynchronous computer-mediated communication with peers
• Opportunity for synchronous computer-mediated communication with the instructor
• Opportunity for synchronous computer-mediated communication with peers
• Use of problem-based or project-based learning
• Opportunity for practice
• Opportunity for feedback
• Type of media-supported pedagogy
• Nature of outcome measure
• Nature of knowledge assessed

Study Design Codes
• Unit of assignment to conditions
• Sample size for unit of assignment
• Student equivalence
• Whether equivalence of groups at preintervention was described
• Equivalence of prior knowledge/pretest scores
• Instructor equivalence
• Time-on-task equivalence
• Curriculum material/instruction equivalence
• Attrition equivalence
• Contamination


MARKET EVALUATION SURVEYING DATA ANALYSIS BENCHMARKING ORGANIZATIONAL STRATEGY

1101 Connecticut Ave. NW, Suite 300, Washington, DC 20036 P 202.756.2971 F 866.808.6585 www.hanoverresearch.com

K-12 Online and Blended Classes

In this report, Hanover Research reviews trends and research in online and blended classrooms at the primary and secondary school levels. A key findings section first highlights the most salient points. After an overview of the types of online courses that are available, the report examines the suitability of online classes for different populations of students, then practical considerations for implementing and managing online course programs. Finally, it reviews mid-sized


© 2011 Hanover Research | District Administration Practice

HANOVER RESEARCH MARCH 2011

Key Findings

Some benefits to students from hybrid and online classes include:

• Struggling students can take credit recovery courses online, and advanced or gifted students can avail themselves of enriched curricula that may otherwise be unavailable.
• The ability to meet diverse student needs is especially relevant for small, rural, or inner-city schools, as online options can go well beyond what schools with limited resources can offer.
• For students that have not had ready access to a range of current computing and communications devices, online courses teach technological skills as they teach academic subject matter.
• As a practical, cost-effective option, online education may be capable of transforming small, underserved schools into ones with nearly unlimited resources.
• Students particularly well-suited to online learning include: students requiring flexible learning schedules, credit recovery students, independent learners and gifted students, and disabled students.

Other points of interest include:

• For supplemental online programs that will be located on campus, identifying a site coordinator from the earliest stages of the project will likely be necessary.
• Teacher buy-in and training go hand in hand, as surveys show that teachers are more likely to use teaching technologies when they receive adequate support and training.
• While the diversity of online classes makes comparative and aggregate analyses difficult, preliminary research suggests that online teaching methods are effective.
• In the best online courses, students have the opportunity to interact with instructors on a personalized level, one that supports them with the specific challenges they face.
• Laws regarding state funding of online education vary widely by state and may undergo frequent changes as the landscape of virtual education is continually transformed. While students enrolled in full-time online programs usually generate per-pupil revenue, supplemental or part-time programs generally do not.
• For most schools that are new to online education, partnering with a provider is likely to be the most efficient means to achieve their objectives.
• In many cases, schools will already have the basic hardware and infrastructure necessary for online or hybrid programs. The main investments will be directly tied to supporting the unique technologies of the program.



For school districts that are new to online education, partnering with a content provider offers one of the simplest ways to launch a program. Regardless of the approach a school district selects, implementation requires that the district assess student needs, build teacher support, and select an administrator.

Introduction to K-12 Online Education

The marketplace for K-12 online schools, programs, and courses has grown rapidly over the past decade. Online learning takes many different forms, but in its most basic form it consists of students using Internet-based learning tools for a substantial portion of a course. School districts launch online programs for a number of reasons, but overall, these programs serve as an efficient means to expand district offerings. Students' reasons for enrolling are similarly diverse and include, most prominently, the need for flexible schedules, credit recovery, specialized study, or advanced classes. Although research evidence on the subject is still scarce, students in online courses have generally performed as well as, if not better than, their counterparts in traditional classes.

Online education takes many forms, and school districts have a range of options for building an online program. Schools that elect to work with a provider should look for one that aligns with their requirements as much as possible. Aside from the technology used in managing online courses, schools may already have some of the key components necessary for an online program, including computers, internet connections, and classroom space. Funding will likely be required, but in the competitive marketplace for online education, costs should be reasonable and generally approximate those associated with adding another set of courses or a new teacher.

There is no one-size-fits-all model for K-12 online programs. Models range from full-time online programs, where students never set foot on a traditional brick-and-mortar campus, to supplemental online programs, where students may take just one class or part of a class online. The boundaries between different types of online education and traditional models are blurring. Supplemental programs can include blended or hybrid models, in which students alternate between internet learning and traditional classrooms. Because very few reporting requirements exist for single-district online programs, the number of students in these programs is unknown. Research, as well as other published reports, suggests that about 50 percent of all public school districts are operating or planning some kind of online and blended learning program.1

1 Watson et al., 2010. Keeping Pace with K-12 Online Learning, p. 6.



Students in online courses are perhaps as diverse as those found on brick-and-mortar campuses. Online courses cater to students of all ages and backgrounds, including students with different skill levels, learning styles, and levels of academic achievement. The online learning experience may be just as rich as it is in traditional schools.

Online courses are not correspondence courses: they are not necessarily easy to pass (or cheat in), and students do participate in a substantial amount of interaction with their fellow students.2 In the best online courses, students have the opportunity to interact with instructors on a personalized level, one that supports them with the specific challenges they face.3 Because teachers and students are in such close communication, the teacher may get to know students and can usually recognize when they are not submitting their own work.4

While there is a broad range of online offerings at the district level, most single-district programs share the following attributes:5

• Programs often combine fully online and face-to-face components in blended courses or programs.
• While programs are mostly supplemental, some serve full-time students.
• Programs often focus on credit recovery or at-risk students.
• Programs are funded primarily by the district from public funds. In most cases, there is no difference in funding between online students and students in the physical setting.
• Programs primarily serve high school grade levels and, more occasionally, middle school levels. A smaller number of districts are beginning to create online and blended options for elementary students.

This report focuses on the most common public school online programs: those that generally fit the description above. For ease of reading, this report defines, in general, online programs as those in which significant portions of the course are offered through the internet. The following definitions are helpful for the discussion and analysis in this report:

Online courses: Delivered via a web-based educational system that provides a structured learning environment, they may be synchronous (communication in which participants interact in real time) or asynchronous (communication that is separated by time, e.g., email or online discussion forums).6

1 http://www.kpk12.com/wp-content/uploads/KeepingPaceK12_2010.pdf
2 Watson, J., p. 4. http://www.inacol.org/research/docs/national_report.pdf
3 Watson et al., 2010. Op. cit., 44.
4 Watson, 2007. Op. cit., 17.
5 Watson et al., 2010. Op. cit., 34.

Full-time online programs: Draw students from across multiple districts, and often an entire state.7

Supplemental online programs: A supplemental online program generally takes place within the structure of a degree-granting high school. Students select online courses for their mainstream, credit recovery, specialized, or AP courses.8

Blended courses: Combine two delivery modes of instruction, online and face-to-face. Instruction involves increased student-to-instructor, student-to-content, and student-to-student interaction.9

Content Provider: Content providers (or providers) are generally other schools (charter, public, or private), educational services companies, or state agencies. They almost always charge a fee, but there are different models for this as well; some charge per class enrollment, whereas others charge a subscription fee.

6 Watson et al., 2010. Op. cit., 7.
7 Ibid.
8 Watson et al., 2010. Op. cit., 35.
9 Ibid., 40.


Table 1: K-12 Online Learning at a Glance

• During the 2008-2009 school year, state-sponsored virtual K-12 public school programs operating in 27 states provided roughly 320,000 course enrollments (i.e., one student taking one semester-long course) in for-credit courses.
• An estimated one million K-12 students took an online course during 2007-2008.
• In 2009, more than half of the school districts in the United States offered online courses and services, and online learning is growing rapidly, at 30 percent annually.
• In 2009, more than 40 percent of high school and middle school students expressed interest in taking an online course.
• K-12 online learning is a marketplace of over 500 million dollars, with a 30 percent annual growth rate overall and 20-45 percent growth rates for individual schools.
• There are supplemental or full-time opportunities in 48 of the 50 states as well as Washington, DC, and 27 states have full-time online schools.
• 75 percent of school districts have had at least one student enrolled in an online or blended learning course.
• State virtual schools exist in 39 states. Their size varies from schools with fewer than 2,500 course enrollments (one student taking one semester-long course) to the Florida Virtual School, with more than 220,000 course enrollments.
• Research, as well as other published reports, suggests that about 50 percent of all districts are operating or planning online and blended learning programs.

Sources:
Muller, E., 2010. Virtual K-12 Public School Programs and Students with Disabilities: Issues and Recommendations. http://www.projectforum.org/docs/VirtualK-12PublicSchoolProgramsandSwD-IssuesandRecommendations.pdf
iNACOL Fast Facts, Oct. 2010. http://www.education.virginia.gov/docs/NACOL_fastfacts-lr%20Oct%202010.pdf
Keeping Pace with K-12 Online Learning, 2009. http://www.kpk12.com/wp-content/uploads/KeepingPaceK12_2009.pdf
Blended Learning: The Convergence of Online and Face-to-Face Education. http://www.inacol.org/research/promisingpractices/NACOL_PP-BlendedLearning-lr.pdf


For school districts new to online education, providers can hold the key to launching a successful program. Generally, they include the following types of organizations:10

• Charter schools within a district
• Charter schools outside of a district
• Multi-district programs
• State-supported virtual schools within a state
• State-supported virtual schools outside of a state
• State technology service agencies
• Colleges and universities
• Private, for-profit entities that offer selected courses
• Private, for-profit virtual schools

Besides the difficulty of producing a large variety of materials in-house, many districts have insufficient resources to produce high-quality content. Providers, on the other hand, have dedicated technical staffs, subject-matter experts, and seasoned online teachers. Providers are able to fine-tune their products through user testing and extensive feedback, whereas districts are likely to find the process of continually revising courses and technology difficult and prohibitively costly.11 In addition to offering the content of online courses, providers can implement programs and manage them, and many offer 24-hour technical support.12 Providers offer a way for mid-sized or smaller school districts to navigate a crowded marketplace that can have substantial entry costs.

Contracting with a provider, however, is not an all-or-nothing proposition. School districts can be both producers and consumers of content, contracting with providers for some courses while licensing some of their own courses to other schools. Providers can also ease the process of converting courses to an online format for those districts that choose to eventually create and deliver their own offerings. Converting a single high school or middle school course into a fully online format is a substantial task. Moving portions of a course from face-to-face to online, starting with a small module and then moving on to the next, is a standard course development approach that can ease the development process for teachers and districts.13

A 2007 survey of 60 online schools suggests that the percentage of courses that are licensed or built in-house is highly variable among online programs. According to the survey, 23% of online programs had licensed all of their courses, and 23% of programs had developed all of their courses; 53% had licensed half or more of their courses, while 55% had licensed half or less of their courses.14 Providers likely exist for almost every kind of learning style or need. Table 2 illustrates the main dimensions of online education programs and provides a glimpse of just how many options and arrangements have come to exist in online learning.

10 Picciano, A. and J. Seaman. K-12 Online Learning: A Survey. http://sloanconsortium.org/publications/survey/K-12_06
11 Watson, J. and B. Gemin, p. 7. http://www.inacol.org/research/promisingpractices/iNACOL_PP_MgmntOp_042309.pdf
12 Ibid., 16.
13 Picciano. Op. cit.

Table 2: Dimensions of Online Education Programs

Comprehensiveness: Part-time program (individual courses); Full-time school (full course load)
Reach: District; Multi-district; State; Multi-state; National; Global
Type: District; Magnet; Contract; Charter; Private; Home
Location: School; Home; Other
Delivery: Asynchronous; Synchronous
Operational Control: Local board; Consortium; Regional authority; University; State; Independent; Vendor

Source: G. Vanourek. 2006. A Primer on Virtual Charter Schools: Issue Brief for the National Association of Charter School Authorizers. http://www.qualitycharters.org/images/stories/publications/Issue_Briefs/IssueBriefNo10_Roles_Virtual_Charters.pdf

14 Ibid., 7.


In an era in which many school districts face budget constraints and additional mandates, online classes may offer an efficient means of serving very different student populations.

Online or blended learning provides opportunities for students to gain practical technological proficiency, workplace skills, and experience learning independently.

Section One: Teaching and Learning in Online Courses

Potential Benefits of Online Courses

Online classes can offer a range of benefits to school systems and the students they serve. Struggling students can take credit recovery courses online, and advanced or gifted students can avail themselves of enriched curricula that may otherwise be unavailable. The ability to meet diverse student needs is especially relevant for small, rural, or inner-city schools, as online options can go well beyond what schools with limited resources can offer. They may also increase overall efficiency by optimizing the use of district resources. As a practical, cost-effective option, online education may be capable of transforming small, underserved schools into ones with nearly unlimited resources. For example, if only three students at a particular high school want to take an advanced class such as Calculus III, that number would probably be too small to justify offering the course, and the district would probably not look into hiring or training a teacher if one were not already on staff. Under some dual-enrollment arrangements, students could request to take the course at a local community college or university, and either they or the district would have to pay tuition.15 With online courses as part of the curriculum, however, the students could take the course at home or in the school via the internet. In the end, the school could offer students a valuable opportunity for advanced coursework, and the costs to both the students and school would be lower than any of the alternatives.

All students can benefit from taking some online classes, regardless of whether they want to take advanced classes or need to repeat the ones they failed.16 According to Keeping Pace with K-12 Online Learning, an annual review of trends in online education, online learning helps students build collaborative relationships, problem solve, and communicate in a diverse global environment.17 With many students exposed at an early age to a range of high-tech communication devices, online learning duplicates and builds on the way that students have already begun to learn and receive

15 p. 10. http://www2.ed.gov/admins/lead/academic/advanced/coursesonline.pdf
16 Watson, J., p. iv. http://www.inacol.org/research/docs/national_report.pdf
17 Watson, Murin, Vashaw, Gemin, and Rapp. Op. cit., 44.


information.18 Online courses can also hone students' technological proficiencies, building them into usable workplace skills. For students that have not had ready access to a range of current computing and communications devices, online courses teach technological skills as they teach academic subject matter.

Curriculums, Classes, and Outcomes

Just as there are many approaches to online education, there are numerous options for designing programs, curricula, and classes. Rather than attempting to describe all of these, this report highlights some key characteristics and trends.

Online courses are often divided into lessons and units, with much of the course material offered online, typically via a course management system.

Course content may include texts, video, audio, animations, message boards, and interactive tools. These multimedia materials may be complemented with traditional offline materials, such as textbooks and hands-on materials.

Teacher preferences and the curriculum ultimately define course content and tools. For example, an English course might rely heavily on online and offline text; foreign languages might utilize audio players and recording devices; a biology course might use interactive models to demonstrate the internal structures of living beings. Class discussions and exchanges often take place through message boards and short-response assignments.

Assessments include different types of questions, such as essays, short-answer writing assignments, multiple-choice questions, and matching.19 In addition to all of these technological methods, good courses work to engage various learning styles, accommodate students with disabilities, and provide opportunities to engage in abstract thinking and critical reasoning.20

Online courses can also work to facilitate many of the same opportunities for student-teacher and student-student interaction as traditional classes. There are different means to achieve close levels of contact within online classes. Class sizes should probably be no larger than their traditional equivalents. Some programs utilize regular phone calls between teachers and students to supplement communication via the Internet. Programs may require that students contact their teachers three times a week, or that teachers check email at least once every school day and respond the same day, and blended classes may have regular meetings for class discussions.21

18 Watson, 2007. Op. cit., iv.
19 Ibid., 10.
20 Department of Education, 2007, 34.
21 Ibid.



For students supplementing traditional courses with online ones, online course grades typically become part of their overall grade point average. For students taking AP courses, the results of their tests may be another method of assessing their performance.22 Students work at their own pace and are required to meet certain goals and hand in assignments by prescribed deadlines.23 In most cases, the academic calendars of online schools closely approximate traditional ones. If schools are working with a provider, they should ensure prior to the beginning of the course (and perhaps before the selection of the provider) that the academic calendars align.24

While the diversity of online classes makes comparative and aggregate analyses difficult, preliminary research suggests that online teaching methods are effective. According to the U.S. Department of Education (2009), students in online courses have been performing better on average than those in traditional classrooms. The differences in achievement were greatest in conditions that blended online and face-to-face education, which frequently resulted in additional learning time.25 Surveys comparing AP scores of online and traditional classroom students found that students enrolled in online courses performed better than students in traditional AP programs.26 Even higher achievement rates have come from blended classes, as regular contact helps students stay on track and provides time to go over any problems that students may be facing with specific issues.

These success stories must not be taken as a comprehensive picture, however. Although there appear to be few research studies on the population of at-risk students enrolled in online programs, there has been speculation that many of the students enrolled in online programs can be classified as such.27 Despite this perception, there has been little discussion or research on how at-risk students fare in online programs, and research studies have focused more on the performance of high-ability students in online classes.28 Preliminary studies, however, have indicated that programs have had various successes with at-risk students, as measured by increases in course completion rates, standardized test scores, and graduation rates, in addition to decreases in course drops, student absences, truancy, and other behavioral issues.29

22 Watson, Murin, Vashaw, Gemin, and Rapp. Op. cit., 10.
23 Department of Education. Op. cit., 24.
24 Southern Regional Educational Board. Op. cit., 3.
25 Muller, E., 2010. Virtual K-12 Public School Programs and Students with Disabilities: Issues and Recommendations, p. 3. http://www.projectforum.org/docs/VirtualK-12PublicSchoolProgramsandSwD-IssuesandRecommendations.pdf
26 Watson, 2007. Op. cit., 24.
27 p. 19. http://www.inacol.org/research/docs/iNACOL_AtRiskStudentOnlineResearch.pdf

Suitability and Student Characteristics for Success

Because online learning involves independent reading and analysis, students should already have an average or high level of reading comprehension.30 Research has identified characteristics of successful online students, which include intrinsic motivation, independent learning skills, interest in computers, and involvement in outside hobbies, activities and relationships.31 For students spending a substantial amount of time with individual online study, this last set of factors may be more important, as online study, no matter how effective, may not be conducive to the same kinds of relationships as classroom study. The paragraphs below discuss the student populations that might benefit most from online course options.

Students Requiring Flexible Learning Schedules: In terms of flexibility, online programs provide options to students that face time constraints or pressures that make attending regular classes difficult. Often, a student may simply face a scheduling conflict, or perhaps a required or core curriculum course (e.g., math) is offered at the same time as an elective course (e.g., music) that is of particular interest to a student. In other cases, a required class may be offered at the same time that a student athlete regularly needs to travel to competitions. The following groups of students requiring flexible schedules may also benefit from the flexibility of being able to take some or all of their courses online:32

• Serious performers and athletes
• Students that want to enroll in advanced courses
• Dropouts returning to school
• Students that become pregnant
• Migrants

28 Ibid., 18.
29 Ibid., 13.
30 Smith, p. 33. http://www.ncrel.org/tech/synthesis/synthesis.pdf
31 p. 24. http://www.marylandpublicschools.org/NR/rdonlyres/D895AEF0-476A-46CF-86E5-A77C87A4E129/27450/OnlineLearning_MD_2010_2011.pdf
32 Watson. Op. cit., 5.



• Students that are homebound or have serious physical handicaps
• Students with serious illnesses

Credit Recovery and At-Risk Students: Online courses draw students who, for whatever reason, want to pursue options outside of traditional classrooms. With at-risk students, the reasons may have to do with disciplinary or social factors, but online courses may not be appropriate for students that already face serious difficulties in traditional classrooms.33 In other words, online classes may help students with some difficulties who are perhaps off-track from graduation, but they may do little good for high-risk students. Because online courses appear to be popular and effective options for students that need to recover credits from failed courses, it may be important to draw finer distinctions between different at-risk populations, particularly between students that may have failed courses and students with deeper problems. Online classes may be especially appropriate for credit recovery because they do not replicate the circumstances in which a student failed to master a required subject on the first attempt.34 Students may already have a working knowledge of the subject matter, and in a different environment and context, they may be more successful. In fact, many brick-and-mortar schools with established online programs began as a means for students to make up the classes they initially failed.

Independent Learners and Gifted Students: In terms of achievement, this set of students may come from both sides of the spectrum. For example, online classes may be a good fit for students that have proven their ability to excel in a subject and want to progress quickly in it in order to take advanced classes. They may have earned top grades in difficult subjects and are simply ready for the next step. The other sub-set of this group may not have shown such high levels of achievement. Perhaps they show the potential for higher-level work, but fail to do all of the required assignments, oftentimes because they fail to see the value in doing them. This second group of potentially gifted students, the ones that achieve little in class but still perform well on tests, may be well-served by

33 Archambault and Diamond, Op. Cit., 18.
34 Watson and Gemin, 2010, Op. Cit., 24.


programs that allow them to go at their own pace. Students that fit this profile may also have had problems with the traditional environment and distractions of school life; others are highly motivated and have a deep interest and ability in one or two subjects.

Disabled Students: Online education may be ideal for students with disabilities that would benefit from an individualized program and extensive opportunities for parental involvement. Some online programs also allow for the easy use of technology as an extension of any assistive technology the student may already be using.35 Furthermore, they provide frequent and immediate feedback and a variety of formats and personalized approaches to instruction. Although disabled students may benefit from online classes, there are still additional legal and administrative frameworks they must pass through which have not caught up with virtual education. These include adherence to plans for accountability and evaluation, which can pose an additional challenge in enrolling students in online classes.36

Online Courses and One-to-One Computer Initiatives: When to Start?

There appears to be little theoretical discussion surrounding the issue of age appropriateness for either online courses in K-12 classrooms or for one-to-one computer initiatives. While most discussions of online education for primary and secondary students refer to trends within K-12 as a whole, very few mention how programs have been implemented at the elementary or even middle school levels. Among districts with supplementary online high school programs, there seems to be little corresponding mention of equivalent online programs in middle school. Still, given that the technological proficiency level required for online classes is low, it is reasonable to expect that exposure to computers and the ability to take online classes are not mutually exclusive. One of the most common approaches to phasing in online initiatives appears to be the use of special online programs for additional practice and remedial skill-building in subjects like math or reading. It would be difficult to characterize such programs as more than temporary skill-building interventions, but they do allow students to

35 Muller, Op. Cit.
36 Muller, Op. Cit., 3-4.


become accustomed to Internet-based learning. Nationally, it appears that the most common grade level at which to implement one-to-one initiatives is middle school, although implementation can occur at any grade level if there is a clear plan for how computers will be used in the classroom.37 One example is Michigan's Freedom to Learn program, which deploys at least some computers to elementary school students.38

37 Education Week. 6 August 2009. Chat: One-to-One Computing. http://www.edweek.org/ew/events/chats/2009/08/06/index.html
38 Wilson, L.A. et al. 2006. Measuring the Value of One-to-One Computing: A Case Study Perspective. The Consortium for School Networking. http://www.techlearning.com/techlearning/events/techforum06/LeslieWilson_1-One_to_One_Computing.pdf


Section Two: Implementing and Managing Online Programs

Districts that decide to provide online classes do so for many reasons, and ultimately, their unique needs should determine the characteristics of the program. For most schools that are new to online education, partnering with a provider is likely to be the most efficient means to achieve their objectives. While some of the academic features of providers were discussed in Section One, schools will want to learn more about prospective providers' operations. If a school elects to partner with an online provider, the providers are responsible for ensuring that instructors are qualified and effective.39

In selecting a provider, school districts should inquire about initial instructor preparation, support available for continued professional development, and methods for evaluating effectiveness.40 State-sponsored providers, for example, generally require that teachers be highly qualified and certified in their area of online instruction. Private providers, however, require that teachers be certified in the area they teach, but not in the state where their students reside.41 Any issues associated with the use of copyrighted materials should be addressed in advance.42 There should be clear procedures and policies for student privacy and academic honesty. If online classes or teachers fail to perform as agreed upon, there should also be clear procedures for appeal and recourse.43 Furthermore, teachers and students should receive training.44

39 Department of Education, Op. Cit., 29.
40 Ibid., 30.
41 Ibid., 31.
42 Southern Regional Education Board. 2000-2001. Essential Principles of Quality: Guidelines for Web-based Courses. http://info.sreb.org/programs/EdTech/pubs/EssentialPrincipals/EssentialPrinciples.pdf
43 Ibid.
44 Ibid.


Implementing an online program also involves substantial effort within schools themselves. For supplemental online programs that will be located on campus, identifying a site coordinator, who serves as a liaison between the school and the provider, will probably be necessary or recommended from the earliest stages of the project. Beyond their liaison role, which includes reporting, payments, and grade transcription, site coordinators handle a range of day-to-day administrative duties.45 The site coordinator also generally recruits, enrolls, and counsels students enrolled in online programs and ensures that students always feel that they have a knowledgeable source available to help answer any questions or concerns. Recruitment is a key component of any online program. Students must be made aware that online options exist and given a sense of how they work. Schools may also want to establish a set of prerequisites and standards that students must meet before they qualify.46

Teacher Buy-Ins and Training

Building support and a pool of qualified teachers is a potentially long-term challenge for any new online program. While there is very little mention of online programs causing controversy among faculty members, one contact that Hanover Research interviewed expressed that their online program faced faculty opposition and skepticism in its early stages. While online programs are gradually becoming an established part of the educational landscape, it may be natural for teachers to express concerns about the quality of online classes and how they could impact faculty positions. Supplementary programs, such as the kinds typically offered in

45 Department of Education, Op. Cit., 17-18.
46 Department of Education, 2007, 39.

Checklist for Schools Implementing Online Programs

- Conduct a needs survey for online learning with students and parents
- Identify an online learning provider that can meet identified needs
- Ensure courses are aligned to local, state, and national standards
- Establish procedures for course payment
- Ensure the school can meet the technical requirements for students to access online courses
- Identify and train site coordinators and site-based mentors
- Recruit students; draft and publish a contract for online course enrollment and participation
- Prepare student and parent orientation materials and provide orientation meetings

Source: Accessed March 14, 2011, from http://www2.ed.gov/admins/lead/academic/advanced/coursesonline.pdf


collaboration with providers, tend to offer advanced or credit recovery classes and are seldom designed to replace teachers or traditional classes. When and if school districts do decide to move portions of their own curriculums online, teachers are still needed to administer, teach, and grade them. In sum, online courses should never be a risk to job security; rather, they serve to expand the range of courses that a school is capable of providing. Because new online programs have to clear higher administrative hurdles than other forms of classroom technology, traditional strategies for buy-ins, such as building a group of

teachers that embrace technology early on and serve as an example, may not work.47 Still, schools that have faced challenges in implementing online programs recommend a grassroots approach that gives teachers the initiative in designing and implementing the new program. Another principle that can be adapted from the early-adopter buy-in strategy is for districts to proceed slowly and build on any early successes; teachers that tend to embrace new technology in their classroom may be more inclined to support online courses. By allowing initiatives for online learning to be teacher-led, early opposition may be reduced, buy-ins may be easier, and programs may become more responsive in the long term.

Teacher buy-ins and training go hand-in-hand as surveys show that teachers are more likely to use teaching technologies when they receive adequate support and training.48 Currently, few teachers in traditional classrooms have had training in online teaching, which makes training and teacher preparation a critical success factor.49 The elements of learning to teach online fall into two categories: the first, learning the technology and tools of the learning management system, is fairly straightforward. The second element of teaching online, which involves harnessing these tools and technologies to promote a stimulating exchange of information, is much more complex. At a basic level, it is the difference between knowing how to post messages on a discussion board, versus understanding how to use technology to promote a lively and educational exchange of ideas.50

47 Teacher Buy-in Strategies. http://www2.ed.gov/pubs/EdReformStudies/EdTech/teacherbuyin.html
48 Smith, Clark, and Blomeyer, Op. Cit., 10.
49 Watson and Gemin, 2009, Management, Op. Cit., 10.
50 Watson, 2007, Op. Cit., 13.


The specialized nature of online teaching is another factor that makes partnering with a provider an attractive option for schools looking to start their own online programs. Reputable providers generally employ teachers that have considerable training and experience with online teaching methods, but in-house training opportunities should still play an important role. School districts considering a provider should pay special attention to what kinds of opportunities are available to train their teachers in the new technology. Given the right training and experience, teachers can transition to producing online classes for their own schools. Ann Arbor Public Schools, for example, has begun to transition away from offering solely provider-delivered classes to a mix that includes online courses provided by teachers from the district. For Ann Arbor teachers interested in teaching online, the school district usually recommends that they try their first class in the summer, when enrollments and the pressures of working with many students in an unfamiliar setting are reduced. Teachers who have taught a course online often report that teaching online changes the way they approach their traditional classes in positive ways. Often, these changes have less to do with their use of technology and more to do with the way they plan and deliver their lessons. These include providing more scope for independent work and giving students clearer instructions.51

51 Smith, Clark, and Blomeyer, Op. Cit., 27.

From Content Consumer to Producer: Starting out by working with a content provider and focusing on a small segment of the student population can be a good strategy for long-term growth. Frederick County Public Schools in Maryland, for example, began its online program as an alternative for at-risk students, but grew to serve students with a variety of needs, including Advanced Placement students and student athletes that faced scheduling conflicts. The overall online program, comprising the Virtual School and Evening High School, generated approximately 400 course enrollments in fall semester 2010. In the evening school, students work online, but come to a physical location to meet. In the virtual school, students meet with their instructor for orientation, but later work on their own. Although Frederick County initially relied on outside education vendors, all of the courses are now delivered by Frederick County teachers, who have undergone training in effective online teaching methods. Source: Watson, John and Butch Gemin. 2010. p. 10. http://www.marylandpublicschools.org/NR/rdonlyres/D895AEF0-476A-46CF-86E5-A77C87A4E129/27450/OnlineLearning_MD_2010_2011.pdf


Financing Online Programs

Funding arrangements for online programs will vary according to the type of program a school selects, state laws, and the availability of special funds such as grants. Laws regarding state funding of online education vary widely by state, and may undergo frequent changes as the landscape of virtual education is continually transformed. It is worth noting, however, that the funding of online students, particularly full-time students in online charter schools, has been controversial in several states, due in part to the fact that online schools sometimes draw students and funding across district lines.52 While students enrolled in full-time online programs usually generate per pupil revenue, supplemental or part-time programs generally do not.53 There do not appear to be many sources of outside funding for online courses, and for the majority of schools mentioned in this analysis, funding has come from the general budget. In general, the operating costs of an online program are equivalent to the operating costs for similar programs in brick-and-mortar campuses.54 The largest costs are likely to come from technological components, with associated costs for hardware, bandwidth, and the like, which are critical to supporting the teaching and learning process. In addition, other costs, such as teacher travel for face-to-face training, telephone technology, and technical support, must also be taken into consideration. Finally, it is worth noting that if online or blended classes become a substantial part of students' schedules, they can reduce the demands placed on a district's physical infrastructure. Schools facing rising enrollment rates and overcrowding are among the types of schools that could benefit most from having students on their brick-and-mortar campuses for less of the school day.55

Technology

Technologies that effectively present online course content as well as facilitate the kinds of exchanges that occur in traditional classrooms are vital elements of any online program. Platforms for presentation and communication must be versatile and integrate a range of functions within one space that is easy to access, as course content may include text, graphics, video, audio, animations, and other interactive

52 Watson, 2007, Op. Cit., 8.
53 Trujillo Commission on Online Education, Final Report, p. 4. http://www.cde.state.co.us/onlinelearning/download/TrujilloCommissionOnlineEducationFinalReport-2-15-2007.pdf
54 Ibid., 10.
55 Watson, Murin, Vashaw, Gemin, and Rapp, Op. Cit., 44.


tools. In many cases, schools will already have the basic hardware and infrastructure, such as computer labs and internet connections, and the main investments that they will have to make will be directly tied to supporting the unique technologies of the program. The following items represent the major investments that districts will need to make:

Course management systems (CMS) host the course content, communication tools, grade books, and other elements of the course. They are only rarely designed or produced by schools themselves. Although they pose a significant expense for online programs, there are numerous competing products that are keeping costs in check. Open-source resources such as Moodle56 offer platforms that are ideal for schools that have begun to produce their own online courses.

Student information systems (SIS) keep track of student demographic, contact, and assessment information for reporting and decision-making.

Audio and video plug-ins: Teachers and students will usually need a media player for video and audio, and real-time web conferencing.

Basic productivity software: Students typically need software for web browsing, word processing, text documents, and presentations (e.g., Microsoft Office, Adobe, Internet Explorer).

Internet Access, Servers, and Bandwidth: An online program needs a server that hosts courses and the bandwidth to deliver them. Most course management system providers also have an option to provide hosting. Broadband internet access requires sufficient bandwidth to host courses and online services, and to sustain peak periods of teacher and student usage.

Computers: Providers generally require the school to provide access to a computer lab so that students can access courses from the school. Programs that serve full-time students sometimes loan computers to their students.

Basic work environment: Students need a quiet place for the computer, desk, etc. Computer labs with relatively open access work fine, but separate classrooms dedicated to online learning, where students can interact with their coordinator or take proctored tests, are ideal.57
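To make the bandwidth item above concrete, the back-of-envelope sketch below estimates peak demand. Every per-student rate and the usage mix here are illustrative assumptions, not figures from this report.

```python
# Hypothetical peak-bandwidth estimate for sizing a school's connection.
# All numbers below are assumed, illustrative figures.
concurrent_students = 120   # assumed peak number of simultaneous users
video_kbps = 1500           # assumed rate for a student streaming video
page_kbps = 100             # assumed rate for text/LMS page traffic
video_share = 0.3           # assumed fraction of users on video at peak

# Weighted average per-student rate times the number of concurrent users.
peak_kbps = concurrent_students * (
    video_share * video_kbps + (1 - video_share) * page_kbps
)

print(round(peak_kbps / 1000, 1))  # peak demand in Mbps
```

A district would substitute its own enrollment and traffic assumptions; the point is that peak concurrency, not average usage, should drive the bandwidth purchase.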

56 www.moodle.org
57 Watson, 2007, Op. Cit., 21-22.


Textbooks: When courses require textbooks, providers often create links to online retailers that can mail the book and materials to the student or school directly.58 In other cases, providers post materials online.59

58 Department of Education, 2007, Op. Cit., 29.
59 Ibid.


Section Three: School District Case Studies

In selecting districts that have successfully implemented online programs, Hanover Research sought to focus on mid-sized public school districts, and looked at districts engaged in online education for varying periods of time and using different models. Hanover then placed calls to school principals or online learning coordinators to learn more about each program's implementation and administration.

Ann Arbor Public Schools

Ann Arbor Public Schools (AAPS) in Michigan, a district with approximately 16,000 students, has had an online program for the past 10 years. With approximately 300 course enrollments, the program has gradually diversified to offer a wide variety of courses and online options. The students taking online classes come from a range of backgrounds, including students that may be considered at-risk academically or gifted. High-risk students have not been a good fit for online classes and have generally not been enrolled. Typically, AAPS students taking online courses fall into one of several categories:

- Students looking to advance a level in a subject area by taking two courses in the same subject or through AP courses that are not regularly offered
- Students that have displayed a particular aptitude in a subject, but have underperformed in traditional classroom settings
- Students that need credit recovery
- Transfer students that come from districts or areas where curricula do not align to AAPS requirements
- Students that have a specific learning disability, such as problems with memory
- Students with serious health issues

When the online program first began, the district partnered with the state-endorsed provider of online classes, the Michigan Virtual School (MVS). AAPS still utilizes MVS for AP classes, but has begun to offer a mix of provider services for students with different needs. It has begun to use Education2020, a provider for credit recovery classes that take place in on-site labs, and Aleks, a content provider used for mastery learning classes and self-paced math classes. Finally, it has recently begun to produce its own blended learning classes, which are hosted on Moodle, an open source content management service. In terms of face-to-face requirements, courses follow several learning models, ranging from ones that have an orientation and two proctored tests, to weekly face-to-face meetings.

Teachers that display an interest in teaching online courses typically begin teaching them during a special summer program. Low enrollments allow teachers to become accustomed to the differences between online and traditional teaching in a low-pressure environment


and gain a better sense of whether online teaching is something that they want to continue in the future. The summer online program is also what sustains the online program during the school year. Aimed at students that want to free up their schedules for the school year for advanced classes or extracurricular activities, it charges fees that are used to pay for the online classes offered during the school year.60

Sioux Falls School District

Sioux Falls School District (SFSD), with nearly 22,000 students, has had an online program for three years. It has offered courses from Apex since beginning the program, one course from Florida Virtual School during the 2010-2011 school year, and recently began to put some of its own classes online using videocasts and Moodle. The program has slots for over 700 students; approximately 10% are enrolled in AP classes, 10% in remedial or foundational classes, and 80% in mainstream or core classes. Funding comes out of the central budget, with each class enrollment costing about $125.

SFSD's experience illustrates the advantages of an implementation approach that engages stakeholders and vendors. The district thoroughly researched content providers before sending a questionnaire to seven finalist companies. The questionnaire covered whether providers would be able to meet the specific demands of SFSD, which included that prospective providers target all types of learners and be aligned to state requirements. After identifying the top two candidate companies, SFSD invited them to Sioux Falls to present to a group of administrators and teachers. Although it has only offered online classes for three years, it repeated the provider selection method during the second year to see whether there were options that would better suit its needs and be more cost effective.61
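The enrollment and cost figures above imply a simple budget model; the sketch below reproduces them. The variable names are ours, and the $125 per-enrollment figure and the 10/10/80 mix are taken from the paragraph above.

```python
# Back-of-envelope annual budget for a supplemental program sized like SFSD's.
slots = 700                  # enrollment slots reported above
cost_per_enrollment = 125    # approximate cost per class enrollment ($)
mix = {"AP": 0.10, "remedial": 0.10, "core": 0.80}  # reported enrollment mix

annual_cost = slots * cost_per_enrollment
by_type = {name: round(slots * share) for name, share in mix.items()}

print(annual_cost)  # 87500 (dollars)
print(by_type)      # {'AP': 70, 'remedial': 70, 'core': 560}
```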

Fairbanks North Star Borough School District

Fairbanks North Star Borough School District (FNSBSD), a school district with approximately 14,000 students, has operated an online learning program for grades 6-12 for the past three years. Its Building Educational Success Together (BEST) Program offers more than 90 classes free to district students. Partnering with Advanced Academics, a subsidiary of DeVry Inc. that provides all of the content for the program, FNSBSD sought to target home-schooled students that would not otherwise attend

60 Jaquette, S. Telephone Interview. March 16, 2011.
61 Raeder, L. Telephone Interview. March 14, 2011.


traditional classes.

Home-schooled students, for example, take the bulk of their curriculum online, but have the option of participating in extracurricular activities or special courses, such as band. Its approach is not an example of blended learning, however. There is no need to take classes on a brick-and-mortar campus, and classes are completely asynchronous, so they can be completed at any time. Students that complete the program will have their diplomas issued by FNSBSD, regardless of whether they set foot in one of the schools or not. The BEST Program, being accredited through a traditional school district like FNSBSD, adds another level of recognition to a curriculum provided by the content provider, Advanced Academics/DeVry.

With students, teachers, and administrators largely supportive, Fairbanks North Star had a relatively easy time implementing the program. All of the funding flows through the district, and because state funding for enrolled students can be put towards online students, there is generally no fee for Fairbanks residents.62

Minot Public Schools

Minot Public Schools (MPS), a district in North Dakota with over 6,500 students, has had a longstanding relationship with two online education providers: the North Dakota Center for Distance Education (ND CDE) and PLATO Learning. Prior to offering online courses, MPS was already using video technology to offer classes to smaller, nearby school districts. Because of its size, MPS is already equipped to provide a range of AP and dual credit courses, and most students taking classes online enroll in courses for credit recovery. In some instances, students facing scheduling difficulties also enroll in online courses.

MPS has been largely reliant on independent funding, although the district receives state aid for students that are also enrolled in its Job Corps program. To pay for the classes, Minot buys a number of site licenses/enrollments from both PLATO and ND CDE. Only a limited number of students is permitted to be logged in and taking classes at one time. Most of the infrastructure for online classes was already in place, and classes take place throughout the school, but the district did need to hire a coordinator.

According to an administrator at MPS, the district faced relatively few difficulties in integrating online classes into the curriculum. The faculty was supportive and plenty of training opportunities were provided. Because the courses require students to take proctored tests, the greatest challenge has been finding teachers that are trained and available to administer the tests. Teachers, however, have been willing to go through the training. The school district also did not need to build special facilities or acquire new equipment because classes take place all over the school. MPS does not

62 BEST Program Admissions Representative. Telephone Interview. March 17, 2011.


offer online classes to middle school students, but does offer skill-building programs.63

Bismarck Public Schools

Bismarck Public Schools (BPS), a mid-sized school district with nearly 11,000 students, recently started an online program aimed at credit recovery. Instead of using the state-sponsored virtual school, it chose to partner with two charter schools that have expanded into providing content for other school districts. Payments to the provider schools, Odyssey and Jeffco, come out of the general budget. Students are screened prior to enrollment, and summer school is generally prioritized over enrollment in the online courses. The students enrolled are typically the ones at risk of not graduating, and taking classes online may be one of the only options for them to recover credits.

BPS worked closely with the content providers to tailor the program to its needs, ensuring, for example, that English classes online resembled and aligned with ones at BPS. Funding for credit recovery comes from the general budget, and students interested in taking AP or general classes that are not offered as part of the general curriculum may still do so through ND CDE. These students still require special approval, however, and must provide their own source of tuition funding.64

63 Joyal, S. Telephone Interview. March 15, 2011.
64 Morbin, J. Telephone Interview. March 15, 2011.


Project Evaluation Form

Hanover Research is committed to providing a work product that meets or exceeds member expectations. In keeping with that goal, we would like to hear your opinions regarding our reports. Feedback is critically important and serves as the strongest mechanism by which we tailor our research to your organization. When you have had a chance to evaluate this report, please take a moment to fill out the following questionnaire. http://www.hanoverresearch.com/evaluation/index.php

Note

This brief was written to fulfill the specific request of an individual member of Hanover Research. As such, it may not satisfy the needs of all members. We encourage any and all members who have additional questions about this topic or any other to contact us.

Caveat

The publisher and authors have used their best efforts in preparing this brief. The publisher and authors make no representations or warranties with respect to the accuracy or completeness of the contents of this brief and specifically disclaim any implied warranties of fitness for a particular purpose. There are no warranties which extend beyond the descriptions contained in this paragraph. No warranty may be created or extended by representatives of Hanover Research or its marketing materials. The accuracy and completeness of the information provided herein and the opinions stated herein are not guaranteed or warranted to produce any particular results, and the advice and strategies contained herein may not be suitable for every member. Neither the publisher nor the authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages. Moreover, Hanover Research is not engaged in rendering legal, accounting, or other professional services. Members requiring such services are advised to consult an appropriate professional.

Page 224: OLSD Project 2020 Report

2020 Data Collection

Task

Grade Level(s) Affected:
KG 1 2 3 4 5 6 7 8 9 10 11 12
Typical / Preschool / IEP / Gifted / ELL

Categories of students who may benefit from this opportunity

How many students could use/benefit? (range, # of students)

Cost

Cost per pupil (if applicable)

Cost benefit analysis

Implication/Outcomes (pros/cons; positives/negatives)

Facility needs

How does it change current facilities

How does it change current staffing

Quality Rating (1 2 3 4 5)

What concrete evidence exists to support success (provide supporting documentation, research, etc.)?

Future impact on student career path (college considerations, competitive college application/transcript, NCAA, future employment or vocation, etc.)

Person Responsible

Page 225: OLSD Project 2020 Report

Dual Enrollment
Olentangy Local Schools
February 2013

Initial Implementation

The OLSD/CSCC Dual Enrollment partnership is scheduled to begin offering classes in 2013-14. Three teachers have signed up and committed to teach a total of 8 sections of 5 different classes.

• ENGL 1100 x 2 sections (full year)
• PHYS 1200 x 1 section
• PHYS 1201 x 1 section
• MATH 1151 x 2 sections
• MATH 1152 x 2 sections

The lab capacity at CSCC is 24 students for Physics. The other classroom capacities are 28. If each available section maximizes enrollment, we could serve up to 216 student seats. Because PHYS 1200 and MATH 1151 are prerequisites for subsequent courses, students who take the second course would have to have been in the first. Therefore, the maximum possible number of students enrolled will be 136.

Current Capacity

Useable classroom space at Columbus State Delaware is mainly available between 7am and 11am. After 11am the classrooms reach full capacity with traditional college-level courses. That means each teacher can only teach up to 4 sections of dual enrollment. In addition to our 3 current dual enrollment teachers, the district has 9 other teachers who meet the qualifications for at least 1 transfer-assured dual enrollment course that could potentially be offered to Olentangy students. If all 12 qualified teachers taught 4 full sections of a dual enrollment course at CSCC-D, we could serve up to 2,112 student seats. Because of prerequisites, the maximum number of students served would be 1,280. If these teachers had a separate facility in which to offer a full load of courses throughout the day (typically 6), capacity would expand to 3,216 student seats and a maximum of 1,944 students served. It is important to note that "maximum number of students served" is calculated as though each served student takes only one course. If each dual enrollment student were to enroll in a full day's worth of courses (6) at that facility, our current staffing capacity would be 324 students, who would each potentially earn 22 college credits in one year. Those courses would include:

• Fine Arts/Basic Drawing
• Literature 1 and 2
• Composition 1 and 2

Page 226: OLSD Project 2020 Report

• Psychology
• Physics 1 and 2
• Spanish 1 and 2
• Calculus 1 and 2
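The seat arithmetic behind the 216-seat and 136-student figures above can be sketched in a few lines of Python. This is only an illustration of the calculation described in the memo; the section counts and room capacities come from the list above, and the variable names are ours:

```python
# Per-room capacities as stated in the memo.
PHYSICS_LAB = 24   # CSCC physics lab capacity
CLASSROOM = 28     # all other classrooms

# (course, number of sections, seats per section)
sections = [
    ("ENGL 1100", 2, CLASSROOM),
    ("PHYS 1200", 1, PHYSICS_LAB),
    ("PHYS 1201", 1, PHYSICS_LAB),
    ("MATH 1151", 2, CLASSROOM),
    ("MATH 1152", 2, CLASSROOM),
]

total_seats = sum(n * cap for _, n, cap in sections)
print(total_seats)  # 216

# PHYS 1201 and MATH 1152 seats are filled by students continuing from
# the prerequisite course, so those seats do not represent new students.
continuation_seats = 1 * PHYSICS_LAB + 2 * CLASSROOM
max_students = total_seats - continuation_seats
print(max_students)  # 136
```

Subtracting the continuation seats is what turns "student seats" into the memo's "maximum number of students served."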

Potential Capacity If the district operated a “full day” dual enrollment program at a separate site and staffed the program accordingly with qualified personnel, and if the program sought to only provide transfer-­‐assured courses, the potential course capacity for the program is as follows:

• Fine Arts/Basic Drawing
• Literature 1 and 2
• Composition 1 and 2
• Psychology
• Physics 1 and 2
• Spanish 1 and 2
• Calculus 1 and 2
• Accounting 1 and 2
• Biology
• Anatomy and Physiology
• Chemistry 1 and 2
• Oral Communications
• Economics 1 and 2
• French 1 and 2
• German 1 and 2
• American History 1 and 2
• World History 1 and 2
• Marketing
• American Government
• Sociology

The full spectrum of these courses offers approximately 106 total college credits and the maximum number of students served would be limited only by the size of the facility.

Page 227: OLSD Project 2020 Report

New program provides netbooks for every Licking Valley High School student

6:34 AM, Aug. 24, 2012

Licking Valley seniors Ashton Lovell, left, and Payton Roback learn about their new school laptops on the first day of school.

Written by Anna Jeffries, Advocate Reporter

HANOVER -- Amanda Yates isn't using any textbooks or worksheets in her classroom this year. In fact, the science teacher probably won't be using any paper at all.

All of Yates' assignments and readings will be posted online on the website she created for her physical science classes.

Her students at Licking Valley High School will be able to access the site at home and at school on their new Intel netbooks.

Page 228: OLSD Project 2020 Report

When school started Thursday at LVHS, the district's new one-to-one computing program officially began. Every student was carrying a laptop, which will be theirs -- all day, every day -- for the remainder of their high school careers.

After two years of preparation and planning, teachers and administrators spent the first day of school talking with the students about how the netbooks will be used every day in their classrooms.

The new technology marks a major transition for students and staff, Principal Wes Weaver said.

"It will do nothing less than totally transform the instruction in and out of the building," he said. "I'm just ecstatic."

Out of the dead zone There was a time when cellphones and other devices weren't really welcome at Licking Valley, Superintendent David Hile said.

But limiting technology wasn't helping students prepare for a global world.

"A couple years ago, it dawned on me that our schools were technology dead zones," he said. "When students walked in, they had to turn off their cellphones and put them away. They only used computers in labs or classrooms and only a few students could use them at a time. That's not how the world works today."

Hile decided to start a one-to-one program at Licking Valley after seeing the success of a program in Hicksville.

"The question changed from, 'Should we do this?' to, 'How do we make this happen?'" he said.

A team was put together to figure out how to get laptops into the hands of every high school student. The group decided Intel netbooks were the best fit.

"It's an equalizing factor for our kids. Not all of them come from a home where there is a computer or Internet access," Weaver said. "It's leveling the playing field."

Page 229: OLSD Project 2020 Report

The one-to-one program cost the district about $422,400 for 704 netbooks and software, Hile said.

About $50,000 in upgrades were made to the high school to improve its wireless network.

The district paid for a portion of the project by not filling two teaching positions. Technology funds paid for the rest, Hile said.

With laptops at the high school, the district has to lease fewer desktop computers. Those savings were put toward the one-to-one project.

Families pay a $50 maintenance fee every year for the laptops. A help desk has been set up at the school so students can get technical support in the building, Weaver said.

More than 50 students went through training over the summer so they could work on the help desk.

Students and district employees spent the summer formatting all 704 laptops to get them ready for the first day of school. Although a manufacturing delay caused some of the laptops to arrive late, all of them were ready for students, Weaver said.

"It was one of the true miracle stories of this summer," he said.

Changing education Students and administrators weren't the only ones preparing for the new laptops.

Teachers spent months going through professional development and changing lesson plans to adapt to the new technology.

"You can't teach in the classroom when everyone has a computer the same way you do when everyone had a pencil," Hile said.

Teachers will be using videos, Google Docs and blogs in the classroom on a regular basis, Weaver said.

He recently created a Twitter page for the high school and a blog to keep students informed.

Page 230: OLSD Project 2020 Report

"This will increase the amount of communication between teachers and students," he said. "It will expand the world of the classroom."

Yates said she plans to provide her students with additional online readings and video tutorials with every lesson.

"If the kids aren't understanding what I'm saying, they can watch a video or read something else that will help them," she said.

Giving students access to laptops makes it easy for them to look up facts. The new challenge for teachers is to get students to think critically about what they find, said Brian Ledford, an algebra teacher.

"You can just see in the first couple hours here, they are being forced to use technology for reasons other than social media," he said. "They are going to learn to use technology to educate themselves."

Junior Amber Lacy and senior Samantha Davis said they are looking forward to carrying small laptops instead of heavy bags full of textbooks.

But the girls had other reasons to be excited.

Having computers in class will make it easier to prepare for college, Amber said.

"I think it will be better," Samantha said. "We are so used to it. That's what we do at home and now it's what we do at school. You might as well bring the technology in."

ajeffries@newarkadvocate.com; 740-328-8544; Twitter: amsjeffries

Page 231: OLSD Project 2020 Report

1

OLENTANGY LOCAL SCHOOL DISTRICT

Delaware County, Ohio

2/12 – 2/15/2012

N=402, +/- 4.88% (n=200 district parents and n=202 non-parents) (percentages may not add up to 100% due to rounding)
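The stated margin of error is consistent with the standard formula for a simple random sample at a 95% confidence level under the worst-case assumption p = 0.5. A quick check (an illustrative sketch, not part of the pollster's methodology statement):

```python
import math

def margin_of_error(n, z=1.96):
    """Worst-case (p = 0.5) sampling margin of error, in percentage points."""
    return z * math.sqrt(0.25 / n) * 100

print(round(margin_of_error(402), 2))  # 4.89
```

This yields roughly +/- 4.89 points for n = 402; the small gap from the reported 4.88% comes down to rounding or the exact z-value used.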

Q. 1. Generally speaking, would you say that your community is going in the right direction, or has it gotten off onto the wrong track?

75.4% Right direction

11.9 Wrong track

5.2 Mixed/both (volunteered)

7.5 Unsure/no answer

Q. 2. Thinking about the Olentangy Local School District that serves your community, would you say that it is generally going in the right direction, or has it gotten off onto the wrong track?

70.4% Right direction

16.9 Wrong track

4.5 Mixed/both (volunteered)

8.2 Unsure/no answer

Q. 3. Do you currently have any children enrolled in an Olentangy Local School District public school?

49.8% Yes

50.2 No

0 Unsure/no answer

Regardless of whether you have any children enrolled in an Olentangy Local School District …

Q. 4. Overall, how would you rate the quality of education being provided by the Olentangy Local School District? Would you say it is…(sequentially rotated)…excellent, good, fair, poor or very poor?

90.3% TOTAL POSITIVE RATING

58 Excellent

32.3 Good

3.5% Fair

.7% TOTAL NEGATIVE RATING

.5 Poor

.2 Very poor

www.FallonResearch.com

Page 232: OLSD Project 2020 Report

2

5.5% Unsure/no answer

Q. 5. In your opinion, how would you rate the job the Olentangy Local School District has done spending its money in an effective and responsible manner? Would you say it is…(sequentially rotated)…excellent, good, fair, poor or very poor?

54.8% TOTAL POSITIVE RATING

19.2 Excellent

35.6 Good

26.1% Fair

10.2% TOTAL NEGATIVE RATING

6.2 Poor

4 Very poor

9% Unsure/no answer

Q. 6. Would you say that overcrowding of public schools in the Olentangy Local School District is…(sequentially rotated)…a very big problem, somewhat of a problem, not much of a problem or not a problem at all?

5% Very big problem

32.6 Somewhat of a problem

29.1 Not much of a problem

20.1 Not a problem

13.2 Unsure/no answer

Q. 7. Over the last few years, do you think that enrollment of students in schools in the Olentangy Local School District has...(sequentially rotated)…increased, stayed about the same, or decreased?

86.6% Increased

6.5 Same

.2 Decreased

6.7 Unsure/no answer

To give you some more information…

Despite the downturn in the housing market, the enrollment of the Olentangy Local School District continues to increase because of growing families and district students transferring from private schools. As a result, some of the Olentangy Local School District school buildings are expected to exceed their capacity within the next few years.

Page 233: OLSD Project 2020 Report

3

Q. 8. In order to accommodate the additional enrollment of students, do you think that the School District should…(rotated)…begin making plans to build additional buildings and classrooms, even though it will require more money and time to complete…or…use the existing classroom space and technology to adapt to more students, even though it will require some changes from the traditional ways education is delivered?

26.6% Begin making plans to build additional buildings

54.2 Use the existing classroom space and technology to adapt

10 Both/combination (volunteered)

9.2 Unsure/no answer

Looking more closely at technology…

One idea that is being considered is for the Olentangy Local School District to create an on-line instructional program, enabling students to use the Internet and technology for learning in various ways.

Q. 9. Generally speaking, do you think that this is a good idea or bad idea?

64.2% Good idea

24.4 Bad idea

5.5 Mixed opinion (volunteered)

6 Unsure/no answer

(RANDOMLY ROTATED NEXT 6 QUESTIONS)

Thinking more about this…

(PARENTS ONLY) Q. 10. Would your opinion of the on-line instructional program be more favorable or less favorable, if you learned that parents and students could choose how many of their courses are taken on-line within school or from home?

68.5% More favorable

21.5 Less favorable

6 No difference/does not matter (volunteered)

4 Unsure/no answer

(PARENTS ONLY) Q. 11. Would your opinion of the on-line instructional program be more favorable or less favorable, if you learned that parents and students could choose between a mix of on-line classes and traditional ones?

76.5% More favorable

15 Less favorable

5.5 No difference/does not matter (volunteered)

3 Unsure/no answer

Page 234: OLSD Project 2020 Report

4

(PARENTS ONLY) Q. 12. Would your opinion of the on-line instructional program be more favorable or less favorable, if you learned that students could choose to take on-line courses in school, so they would have teacher supervision and could take remote classes that would not otherwise be available to them?

78.5% More favorable

13 Less favorable

4.5 No difference/does not matter (volunteered)

4 Unsure/no answer

(PARENTS ONLY) Q. 13. Would your opinion of the on-line instructional program be more favorable or less favorable, if you learned that it could be offered in a way that combines traditional teaching methods and individual learning opportunities, at the student’s own pace?

76.5% More favorable

15 Less favorable

6 No difference/does not matter (volunteered)

2.5 Unsure/no answer

(PARENTS ONLY) Q. 14. Would your opinion of the on-line instructional program be more favorable or less favorable, if you learned that students, who take all of their coursework at home, could still participate in sports and other extracurricular activities?

52.5% More favorable

32.5 Less favorable

12 No difference/does not matter (volunteered)

3 Unsure/no answer

(PARENTS ONLY) Q. 15. Would your opinion of the on-line instructional program be more favorable or less favorable, if you learned students could receive a “blended-style” of on-line learning in which they access much of the course content and material at home, but do assignments in a classroom setting that is supervised by a teacher?

65.5% More favorable

20.5 Less favorable

9 No difference/does not matter (volunteered)

5 Unsure/no answer

(PARENTS ONLY) Q. 16. Supposing for a moment that you had a child that wanted to participate in one of the various forms of on-line instruction that might be offered, would you allow him or her to do it?

69% Yes

15.5 No

12.5 Maybe/possibly/depends age (volunteered)

Page 235: OLSD Project 2020 Report

5

3 Unsure/no answer

(PARENTS ONLY) Q. 17. Supposing for a moment that you had a child that wanted to take some of his or her classes from home, would you allow him or her to do it?

54% Yes

33.5 No

10 Maybe/possibly/depends age (volunteered)

2.5 Unsure/no answer

Another idea that is being considered is for the Olentangy Local School District to create a dual-enrollment program, so high school juniors and seniors could take classes for college credit from Columbus State University that would be guaranteed to transfer to any college or university in Ohio…

Q. 18. Generally speaking, do you think that this is a good idea or bad idea?

90.5% Good idea

6 Bad idea

.7 Mixed opinion (volunteered)

2.7 Unsure/no answer

(RANDOMLY ROTATED NEXT 4 QUESTIONS)

Thinking more about this…

(PARENTS ONLY) Q. 19. Would your opinion of the dual-enrollment program be more favorable or less favorable, if you learned that the classes would be taught at an existing Olentangy Local School District building in another part of the district, so the classes would only be for high school juniors and seniors?

74% More favorable

15.5 Less favorable

7 No difference/does not matter (volunteered)

3.5 Unsure/no answer

(PARENTS ONLY) Q. 20. Would your opinion of the dual-enrollment program be more favorable or less favorable, if you learned that the classes would be taught by Olentangy Local School District teachers who have Master’s Degrees and other credentials comparable to college instructors?

75% More favorable

12.5 Less favorable

9.5 No difference/does not matter (volunteered)

3 Unsure/no answer

Page 236: OLSD Project 2020 Report

6

(SPLIT SAMPLE – Randomly assigned between versions A and B)

(PARENTS ONLY) Q. 21a. Would your opinion of the dual-enrollment program be more favorable or less favorable, if you learned that the classes would be offered free of charge, so students could start accumulating college credits without paying tuition costs?

91.3% More favorable

4.8 Less favorable

3.8 No difference/does not matter (volunteered)

0 Unsure/no answer

(PARENTS ONLY) Q. 21b. Would your opinion of the dual-enrollment program be more favorable or less favorable, if you learned that the classes would be offered to students at a tuition rate of approximately $25 per credit hour?

76% More favorable

14.6 Less favorable

5.2 No difference/does not matter (volunteered)

4.2 Unsure/no answer

(PARENTS ONLY) Q. 22. Supposing for a moment that you had a child that wanted to participate in the dual-enrollment program that might be offered, would you allow him or her to do it?

89% Yes

4.5 No

6 Maybe/possibly (volunteered)

.5 Unsure/no answer

Finally, I have a few short questions for statistical purposes...

Q. 23. I would like to read you a list of age groups. Please stop me when I get to the one you are in.

3.7% 18 to 29

32.3 30 to 44

33.3 45 to 59

29.9 60 and older

.7 Refused

0 Unsure

Gender:

48.3% Men

51.7 Women

Page 237: OLSD Project 2020 Report

Olentangy Local Schools: Project 2020 Summit

OLENTANGY | LOCAL SCHOOL DISTRICT Project 2020 Summit: Summary of Findings January 16-17, 2013

Page 238: OLSD Project 2020 Report


Executive Summary

In January 2013, DeJONG-RICHTER facilitated a Project 2020 Summit. A diverse group of parents and OLSD staff were invited to view a series of informative presentations and participate in group discussions throughout both days. The intent of this summit was to review the information developed and gathered by the 2020 committee and DeJONG-RICHTER, identify potential challenge areas regarding projected facility utilization, and explore possible solutions to those issues. Detailed information including results of all of the group discussions can be found later in this report. These first three pages are intended only as a summary of findings.

Presentation I: Demographics and Capacity

In November 2012, DeJONG-RICHTER provided a capacity analysis of all middle and high schools in the district. According to the committee's assumption, the historical understanding has been that the capacities for each middle and high school are 900 and 1,600 students respectively. In this new analysis, DeJONG-RICHTER assessed each facility assuming that 25 students would be allocated to each teaching station. A teaching station is defined as a room that has students assigned to it in the master schedule throughout the day. (A resource computer lab that teachers can reserve to take a class to is not counted as a teaching station in this analysis.) Once all of the teaching stations are counted, a maximum capacity is calculated for each facility. That is the maximum number of students that can fit into a building assuming that every teaching station is filled to the maximum of 25. A load factor of 85% is then applied to the maximum capacity. This gives the facility room for teachers to have planning periods and flexibility of course offerings. The table below illustrates the program capacities for each of the middle and the high schools. These program capacities represent 85% of the maximum capacity of each facility.

Program Capacity

Middle Schools

School   Capacity
BMS      1,099
HMS      1,071
LMS      1,001
OMS      1,033
SMS      1,237
Total    5,441

High Schools

School   Capacity
LHS      2,061
OHS      1,743
OOHS     1,968
Total    5,772
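The program-capacity calculation described above (25 students per teaching station, scaled by an 85% load factor) reduces to a one-line formula. The sketch below is only an illustration: the station count used in the example is hypothetical, since per-building teaching-station counts are not given in this summary:

```python
def program_capacity(teaching_stations, per_station=25, load_factor=0.85):
    # Maximum capacity (every station filled to 25) scaled by the load factor.
    return teaching_stations * per_station * load_factor

# A hypothetical building with 52 teaching stations:
print(program_capacity(52))  # 1105.0
```

The load factor is what leaves room for teacher planning periods and scheduling flexibility rather than assuming every room is full every period.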

The following table shows the projected enrollment vs. the program capacity for each of the middle and high school facilities over the next 10 years. Detailed charts showing the elementary utilization can be found later in this document.

School 2013-14 2014-15 2015-16 2016-17 2017-18 2018-19 2019-20 2020-21 2021-22 2022-23

BMS 75.6% 80.1% 85.6% 89.5% 95.3% 101.7% 104.2% 100.8% 97.6% 96.4%

HMS 74.6% 75.9% 77.3% 77.3% 75.2% 70.7% 70.0% 69.6% 70.3% 69.7%

LMS 83.0% 80.8% 79.3% 81.6% 81.3% 76.3% 74.2% 74.1% 76.8% 75.9%

OMS 87.0% 93.1% 99.2% 103.6% 110.5% 107.3% 105.7% 101.8% 105.5% 104.4%

SMS 78.5% 83.6% 92.0% 97.5% 96.9% 92.5% 90.8% 88.8% 88.2% 87.2%

Total 79.5% 82.6% 86.8% 90.1% 92.1% 90.1% 89.5% 87.4% 87.9% 87.0%

LHS 86.4% 92.1% 98.2% 100.8% 103.1% 105.0% 102.7% 100.4% 96.6% 94.7%

OHS 85.4% 97.0% 106.7% 113.7% 123.6% 132.0% 141.8% 151.9% 151.8% 149.3%

OOHS 77.6% 85.8% 92.6% 100.3% 107.2% 112.0% 118.5% 122.7% 125.6% 127.9%

Total 83.1% 91.4% 98.9% 104.5% 110.7% 115.6% 119.9% 123.6% 123.1% 122.5%

Utilization @ 85% Load Factor
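Each percentage in the table is simply projected enrollment divided by program capacity. As a quick illustration (the enrollment figure here is hypothetical; 1,743 is the OHS program capacity from the capacity table):

```python
def utilization(projected_enrollment, program_capacity):
    # Utilization expressed as a percentage of program capacity.
    return 100 * projected_enrollment / program_capacity

# A hypothetical projected enrollment of 1,800 students at OHS (capacity 1,743):
print(round(utilization(1800, 1743), 1))  # 103.3
```

Values over 100% mean a building is projected to hold more students than its program capacity allows, which is exactly the condition the summit groups were asked to set thresholds for.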

Page 239: OLSD Project 2020 Report


Questionnaire & Discussion I

At the conclusion of the presentation, those attending the summit were divided into groups and given a questionnaire to complete. The goal of the questionnaire was to identify specific points in the future when action will be required, either at an individual school or districtwide. Below are general observations of those discussions. Details can be found later in this report.

Elementary

The overall sentiment was that utilizations of 100% were acceptable for individual elementary schools. Districtwide utilization fell within similar margins. Many groups commented that student achievement needed to be considered above utilization.

Middle Schools

The feedback regarding utilization at the middle school level was not as consistent. Some groups thought that the utilization should be kept at 95% or lower at the individual school level. Indications were that discipline problems were more prevalent at higher utilizations. Others thought that 100% utilization or greater was acceptable with additional professional space for staff. They also commented that high utilizations have already been experienced at some middle schools.

High Schools

Most of the groups thought that 95% was an acceptable utilization for the high schools, citing logistical and disciplinary concerns. Other groups skipped ahead and commented on solutions that would allow higher utilizations to be acceptable.

Benefits and Challenges of Increased Utilization

Next the groups were asked to list some benefits and challenges of increased utilization in general.

Benefits include:

Financial - No capital investment in new facilities

Social - Avoids redistricting, which is tough on the students

Challenges include:

Resistance to change

Discipline, safety

Creative scheduling

Page 240: OLSD Project 2020 Report


Presentation II: The Future

The purpose of this presentation and following exercises was to share

insight about what the future of education may look like, in order to

better inform solution development.

Some of the concepts include:

Partnerships with higher education

Online course offerings

Multi-disciplinary teaching (Interdepartmental collaboration)

Multiple Modalities - Students accessing the same curriculum via

different media

Student Centered Learning

Teacher as a facilitator and resource to the students

Envisioning educational tools of the future

Questionnaire & Discussion II

At the conclusion of the presentation all the groups were given a

questionnaire to complete. The goal of the questionnaire was to get

the groups future thinking and foster a high level discussion before

talking about solutions. Below are some general observations of

those discussions. Details can be found later in this report.

In general the groups were very receptive to these ideas. After the

presentation they went into group discussion.

The groups were very accepting of partnerships with outside institu-

tions and professionals to supplement existing curriculum.

When asked about addressing ever-changing technology, the groups

thought that allowing students to bring their own technology was very

cost effective, however, considerations need to be made for families

that may not be able to afford some devices. There would also need

to be some restrictions with some of the devices, for example, no tex-

ting during class.

When the groups were asked about how they felt about the level

of technology integration into the standard curriculum, some of

the groups thought that it was adequate, some inadequate, and

others were not sure.

Comments include:

OLSD is very conservative with its technology expenditures, therefore the PTOs raise a lot of money to supplement.

There are not enough laptops at the elementary level.

The level of technology integration can vary by teacher.

The network infrastructure struggles to keep up with all of the personal technology.

When asked about student preparedness in general, comments include:

Students are prepared to enter college, but we are unsure about their surviving / excelling in college. Can we track this?

Olentangy seems to be producing students who are highly prepared for college and careers.

When asked about facilities in general, comments include:

We are blessed with quality modern facilities.

Setup is not conducive to collaborative environments for cross curricular collaboration.

We are wired and ready, but curriculum is probably not fully utilizing it.

Use the facility beyond the school day.

Page 241: OLSD Project 2020 Report


Presentation III: Solutions

The purpose of the solutions presentation was to evaluate possible

solutions to the problem areas that were identified during the first

presentation.

Solutions presented include: Building additions onto existing facilities

Career readiness facility

Dual enrollment

Building a flexible facility - Able to adapt to different grade con-

figurations Grade 10 - 14 facility

Grade configuration change

Independent study

New construction - Status quo Online courses

Open enrollment within district

Redistricting

Split schedule - Two separate shifts

Staggered schedule - Early start, late end, and flexible schedul-

ing throughout. Student overflow

Questionnaire & Discussion III

At the conclusion of the presentation the groups were given a ques-

tionnaire to complete. The goal of this exercise was to evaluate and

apply potential solutions to the problem areas identified in the first ex-

ercise. Below are some general observations of those discussions.

Details can be found later in this report.

There was a lot of support solutions that did not require a major

capital investment.

All of the groups agreed with the following as viable solutions:

Dual Enrollment

Grade configuration change

Independent study - two groups had no opinion

Online courses

Staggered schedule

The following solutions did not receive much support from the

groups:

10 - 14 Facility

New Construction

Open Enrollment with the district

Split Schedule

Overflow

Next, the groups developed strengths and challenges for each of

the solutions listed. Detailed results can be found later in this re-

port.

Conclusion:

The groups participating in this summit are not necessarily representative of the community at large, but they have shared insight into some of the issues at hand. It did not appear that there was much support to continue operating along the path of continuous school construction to accommodate projected growth. The groups were open to change and showed willingness to adapt. This summit should be considered as the summation of Project 2020 Committee efforts and the first step towards a comprehensive planning process to be supported by district officials, school administrators, faculty, and the community at large.

Page 242: OLSD Project 2020 Report


Questionnaire & Discussion I: Detailed Results

1. Which percentage of utilization should an individual elementary school reach before a change would be necessary?

2. Which percentage of utilization, for elementary schools districtwide, should facilities reach before a change would be necessary?

Please list any comments you may have regarding questions 1 and 2.

There is a consistent trend over a couple of years. Don't respond to a specific spike. Consider student achievement. K-2 student ratio should be lowest. Let's consider OLS mission statement: maximize learning.

Some schools already at or above capacity and functioning.

There is a comfort level in the community of less utilized buildings.

Why don't we have a choice of a lower % utilization?

For question 2 we felt that 100% is doable so this should be considered the minimum. We might be able to push this a little, but at 105% a change is definitely necessary. Question 3: Taking this on an average across the district does not seem practical.

Districtwide consensus concerns us because of the individual nature/student needs of each school's demographic. Our group would be more comfortable with 85-90% load capacity.

This needs to take into account pre-school and SLC numbers.

The plan to act at 105% must be developed prior to the actual need, so that the plan is enacted immediately.

Consideration for preschool and specialized learning student numbers needs to be taken into account for districtwide utilization. This year there are 24 classrooms and 16 office spaces.

[Chart: Maximum Individual Elementary School Utilization Threshold - 95% Utilization: 2 groups; 100% Utilization: 2 groups; 105% Utilization: 4 groups]

[Chart: Maximum Districtwide Elementary Utilization Threshold - 95% Utilization: 3 groups; 100% Utilization: 2 groups; 105% Utilization: 3 groups]

Page 243: OLSD Project 2020 Report


3. Which percentage of utilization, for elementary schools districtwide, should facilities reach before a change would be necessary? (Total utilization for all 15 elementaries)

4. Which percentage of utilization, for middle schools districtwide, should facilities reach before a change would be necessary? (Total utilization for all 5 middle schools)

Please list any comments you may have regarding questions 3 and 4.

More flexibility in curriculum style (online courses) and higher utilization/higher class sizes because of flexibility with regard to students' maturity and responsibility levels.

For the middle school series of questions we actually think the percentage utilization should be lower than the choices offered.

There would need to be additional professional space for staff. We have already experienced this amount of utilization at the middle school level.

Lunch, safety, and discipline issues when over 95%.

[Chart] Maximum Individual Middle School Utilization Threshold: 95%: 4 groups; 100%: 2 groups; 110%: 1 group; >110%: 1 group.

[Chart] Maximum Districtwide Middle School Utilization Threshold: 95%: 4 groups; 105%: 2 groups; 110%: 1 group; >110%: 1 group.


5. Which percentage of utilization should an individual high school reach before a change would be necessary?

6. Which percentage of utilization, for high schools districtwide, should facilities reach before a change would be necessary? (Total utilization for all 3 high schools)

Please list any comments you may have regarding questions 5 and 6.

Balance a blended approach for students utilizing technology and more traditional learning.

Need to creatively challenge existing logistical assumptions. Our answers are only as good as our assumptions.

For the high school series of questions we think the percentage utilization should be lower than the choices offered.

This would consider creative scheduling and programming, i.e. flexible periods (6:30A-6:30P) and courses offered in certain buildings.

The averages presented seem flawed and skew the data because 3rd period is the only true reflection of load capacity.

There are many flexible options that have yet to be utilized to handle these types of percentages.

Lunch, safety, and discipline issues when over 95%.

[Chart] Maximum Individual High School Utilization Threshold: 95%: 4 groups; 105%: 3 groups; >110%: 1 group.

[Chart] Maximum Districtwide High School Utilization Threshold: 95%: 4 groups; >110%: 3 groups.


What are the benefits and challenges of allowing increased utilization (overall)?

Group Responses

Benefits: financial, social (redistricting is so tough for the students), and academic. Technology would give options to meet individual student needs. Challenges: space, ineffective student-to-teacher ratio, and traffic flow.

Allows us to come up with creative solutions. Any changes must not impact quality. Challenges include resistance to change.

Benefits are saving the cost of building to ensure we don't have underutilization in the future. The challenges include figuring out how to accommodate the needs of the students with the overcrowding. Decreased operational efficiency, risk management, logistics and operations are all challenges.

Benefit - We don't have to build as many buildings and we lower operating costs (ex. transportation/staffing). Challenges - Fitting students into space, keeping everyone happy, transportation, feeding students, shorter life span of furniture and equipment, higher wear and tear; extracurricular activities, we felt, provide both increased and decreased opportunity.

Benefit = Not building new buildings. Challenges = Creative scheduling, blocks, and day schedule.

Challenge is the current paradigm--do students need to come to school all day? Can they earn credit for internship opportunities? Must we use the current model for the master schedule building process? Why can't teachers offer classes at night, on weekends, and to students at all 3 high schools rather than their home school? What about the negotiated agreements? Transportation is a huge challenge as well for underclassmen.

Benefits - Allows us to use our present space in an increasingly efficient and flexible manner. Challenges - Change in current philosophy and ensuring that our students continue to receive an excellent education.

Benefits: Cost savings Challenges: Safety, lunch, and discipline

What are the benefits and challenges of allowing decreased utilization (overall)? (Decreased = < 75%)

Group Responses

We will all be long gone before that happens.

Benefits would obviously be plenty of space for initiatives, testing rooms, etc. Challenges are operational expenses.

Benefits: Smaller class size and less wear and tear. Challenges: Operating costs/student go up, and community perception.

We do not see this as reality in OLS. Space would be more available, discipline issues would be fewer, and common areas would be less crowded.

The benefits are more scheduling flexibility and course offerings for kids and teachers. Is it fair to assume greater student safety in terms of class exchanges, fire drills, and other emergency situations?

Benefits - More opportunities for students and smaller class sizes. Challenge - Inefficient use of taxpayer dollars and lack of sustainability.

Benefits: Safe environment and fewer discipline issues. Flexible spacing can improve student achievement.

7. In your opinion, how important is equal access to the same quality programs at each facility in the district?

[Chart] Group responses: Very Important: 5; Important: 1; No Opinion: 2.


Olentangy Local Schools: Project 2020 Summit

Questionnaire & Discussion II: Detailed Results

1. In your opinion, how important are the following areas when preparing students for the year 2030?

2. Considering that technological advances are continually occurring, what do you believe is the most cost-effective way for the District to handle those advances?

BYOT (Bring your own technology) with the district possibly supplementing as needed.

Subsidized device vouchers from the district in lieu of textbooks for parents to purchase devices.

Individual contributions with grants and donations.

Partner with other agencies. Utilize others for ownership of technology.

Planning for use of personal devices. Leveraging our size for purchasing opportunities for parents.

Allow students to bring their own technology and provide technology for those who do not have it.

Sounds like a question for a technology committee.

Content; Professional Development.

[Chart] Group responses rating the importance of each area (response options: Very Important, Important, Somewhat Important, Not Important, No Opinion):
A. Increasing academic rigor (e.g. …
B. Integration of career readiness into …
C. Expanded focus on college …
D. Joint ventures with private industry …
E. Collaboration with post-secondary …
F. Creating flexible learning …
G. Innovation and learning using …


3. What are your thoughts about personal technology (cell/smart phones, tablets, etc.) used as learning devices in the classroom?

4. Please write any general comments you have regarding Question 3.

Students need to be using technology for appropriate learning. Students need an acceptable use policy. We have concerns about privacy, academic honesty, abuse/harassment, and safety.

Connectivity and who pays for it?

The restrictions would be about managing the devices to limit distractions and interruptions. School policy must identify acceptable use.

We assume restrictions are network restrictions.

Having students bring their personal technology is great but also presents many other issues (security, students who don't have devices, etc.).

Varies depending on age, digital citizenship, internet responsibility, and technology etiquette.

5. How would you rate the overall level of technology integration into the program delivery for each grade level?

[Chart] Acceptability of personal technology as learning devices (A. Cellular/Smart Phone; B. Tablet; C. Personal Laptop; D. School Issued Laptop): for each device, 1 group responded Acceptable and 7 groups Acceptable w/ Restrictions.

[Chart] Rating of technology integration by level (A. Elementary Schools; B. Middle Schools; C. High Schools); response options: Very Adequate, Adequate, No Opinion/Don't Know, Inadequate.


6. Please share any general comments you have regarding technology in Olentangy Local Schools.

What is available is a result of teacher-initiated work (grants) and individual teachers' interest in and comfort level using technology.

Varies by teacher and content. Need time for training and professional development.

We should create the ability for students to connect but leave the devices to the students. Adequate to current standards but probably inadequate to where we are headed.

Olentangy is very conservative with their technology expenditures. We could invest more in technology. Parent and Teacher Organizations are spending a significant amount of money on these items.

There are not enough laptops or tablets at the elementary level. Only 1-2 carts of laptops for 675 students. There are virtually no tablets at all unless they were donated.

Our network is slow. Technology is lagging. We need more professional development offered on our professional development days.

Varies depending on the teacher. Maybe have a tech support person at each school all day.

7. Please share any general comments you have regarding student preparation in Olentangy Local Schools.

We feel that when stressing increased rigor we mean that it be individualized according to what rigor is for each student.

Students are prepared to enter college, but we are unsure about surviving/excelling in college. Can we track these students?

Olentangy seems to be producing students who are highly prepared for college and career.

Our students are well prepared.

We would like data on how many of our students have to take remedial courses at college and how many graduate from college. We think we are providing a quality education.

We are very happy.

8. Please share any general comments you have regarding facilities in Olentangy Local Schools.

We are blessed with quality and modern facilities.

The setup is not conducive to collaborative environments; cross-curricular collaboration spaces and charging stations would be needed.

We are wired and ready, but the curriculum probably does not fully utilize it.

For the most part, we have beautiful facilities.

9. Please share any general comments you have regarding facility usage in Olentangy Local Schools.

We need to shift our thinking about what school looks like and how to get teachers to be/feel empowered.

Would like to explore professional work space. We lack large group testing rooms.

Use facilities beyond the school day.


Olentangy Local Schools: Project 2020 Summit

Questionnaire & Discussion III: Detailed Results

1. In order to successfully accommodate future growth and utilization of facilities, please rate your level of agreement with the following potential solutions.

[Chart] Group agreement with each potential solution (response options: Agree, Strongly Agree, No Opinion, Disagree, Strongly Disagree):
A. Building Addition(s) to Existing Facilities
B. Career Readiness Tech. (Vocational)
C. Dual Enrollment
D. Flexible Facilities
E. Grade 10-14 Facility
F. Grade Configuration Change
G. Independent Study
H. New Construction
I. Online Courses
J. Open Enrollment Within District
K. Redistricting
L. Split Schedule
M. Staggered Schedule
N. Student Overflow


2. In order to successfully accommodate future growth and utilization of facilities, please rate your level of agreement with the following potential solutions. (Each group was asked to select two of the provided solutions that they believe to be the best fit.)

[Chart] Number of times each solution was selected (scale 0-7): Additions to Existing Facilities, Career Readiness Facility, Dual Enrollment, Flexible Facility, Grade 10-14 Facility, Grade Configuration Change, Independent Study, New Construction, Redistricting, Online Courses, Open Enrollment within District, Split Schedule, Staggered Schedule, Overflow.


3. Please list any strengths and/or challenges for implementing building addition(s) to existing facilities.

Notes:
We do not agree with adding trailers; only modifications to existing buildings.
Include considerations for common areas and for travel to and from classes that is not overly congested.

4. Please list any strengths and/or challenges for implementing career readiness tech (vocational).

Notes:
How many students would one of these facilities accommodate?

5. Please list any strengths and/or challenges for implementing dual enrollment.

6. Please list any strengths and/or challenges for implementing flexible facilities.

Strengths
Allows for flexible use
Keeping students in their facility

Challenges
A building added as an extension to HES should not look like the current building.
Cost
Problems feeding the kids
Community support/tax dollars; timing - how long will it take?
Possible future underutilization, traffic flow

Strengths
Add career building to current building, i.e. HES, due to location.
Meeting student needs

Challenges
Parent/student expectations
Would enough students enroll?
Community buy-in/perception, expense

Strengths
Offers college credit, fiscally prudent, team with higher ed in our district
Great opportunity for some students to earn college credit.

Challenges
Teachers who teach these courses must be given time to travel; suggest placing them in a blended building.
Limited number of students who qualify to take these courses, not enough qualified teachers per CSCC standards, not enough courses offered at this time to have an impact on moving students out of the building.

Strengths
Centrally locate to increase learning opportunities and to house blended learning opportunities (i.e. online, career programs, dual enrollment, independent studies, mentorships/internships, P-K, high-risk students)
Can be used for anything; there's a capital cost, but it could be used for career readiness, preschool, or flexible grade configurations.

Challenges
Transportation; limits if the building is set up for a certain grade level in terms of students' age/size.


7. Please list any strengths and/or challenges for implementing a grades 10-14 facility.

8. Please list any strengths and/or challenges for implementing a grade configuration change.

9. Please list any strengths and/or challenges for implementing independent study.

10. Please list any strengths and/or challenges for implementing new construction.

Strengths
Keeps students in our buildings longer

Challenges
20-year-olds with 16-year-olds
Already exists with Columbus State
Concerned about starting to include collegiate considerations in addition to our own K-12. Not recommended.
Terrible idea to keep students beyond graduation when we don't have facility space

Strengths
Grade 6 building, or K-6 building (depending on building needs), grade 11-12 programs in flexible facility
Targeted possibilities to address growth issue
LOVE this one! Our first choice by far. Good for kids and good for facility usage. Elementaries are easier and less expensive to build.

Challenges
Teachers will be moved
Short term solution
Community support

Strengths
Gives another option for students
Student flexibility and can partner with another possibility
Can be good if done well. Supports project based learning.
Gives students more options

Challenges
Difficult for teachers to manage.
Teacher supervision, student accountability, not sure it alleviates overutilization.

Strengths
It is traditional, what parents know, less travel for students, safer
New construction for new facility/flexible programs

Challenges
Financial, future vacancy based on growth
High cost, upset taxpayers, poor economy
Cost, future underutilization


11. Please list any strengths and/or challenges for implementing

online courses.

13. Please list any strengths and/or challenges for implementing

redistricting.

12. Please list any strengths and/or challenges for implementing

open enrollment within the district.

14. Please list any strengths and/or challenges for implementing

split schedule.

Strengths
Convenient, opens space, different learning styles, 24-7, no building
Support, but only as part of the solution...not the whole solution.
Students aren't taking up seats

Challenges
Must be rigorous, must be OLS curriculum, convenient; need student and teacher support; technology must be in place
High quality courses have to be maintained. Has to have teaching interaction.
Teacher supervision, vetting the online courses, giving teachers time to develop the courses with our curriculum

Strengths
As needed...Make new friends!!
Already do it...always an option
If it's one of the only options, it would be ok.
Solves a problem temporarily

Challenges
We realize this creates challenges!
Been there, done that; please don't make us do it again. We'd rather look at other options first.
Shifting students many times in their school career is very difficult for the students/families; bad public relations, transportation

Strengths
Centralize programs in a flexible program building; courses available to all, so no need to move schools (athletes must be in their home school)

Challenges
No solution ~ doesn't reduce the number of students
Horrible idea. Could close out students from the closest school. Could fill up one building and leave another one half empty. We have not seen any positive results from other districts. Transportation costs.
Processes of how students open enroll...will we stack athletic teams (OHSAA guidelines)? Will this alleviate overutilization over time?

Strengths
Flexibility
Double capacity

Challenges
Sports, support staff
First shift vs. second shift means additional staff is needed; restricts participation in extracurriculars.
Feels as if it would make two separate schools within one school building. Extracurriculars would have to be expanded.
Maybe athletics, possible increase in staffing costs, community may not be supportive


15. Please list any strengths and/or challenges for implementing a staggered schedule.

16. Please list any strengths and/or challenges for implementing student overflow.

Challenges
Holes during the day for students. Transportation problems.
Staffing, transportation

Strengths

We do this at elementaries

Done as needed

Challenges
Too difficult for the students; not student-centered. Seems as if it's a band-aid solution.
Similar to redistricting

Strengths
More flexibility
Provides good flexibility. More like college. Could be done with upperclassmen. Would require creative scheduling techniques.
Could be a viable option for overutilization