Bootstrapping up your Analytics


Bootstrapping up your Analytics. Vernon C. Smith, Ph.D., Vice President, Academic Affairs, Rio Salado College. October 3, 2011. Objectives: review of Rio Salado as "The Outlier"; examine RioPACE as a predictive model; identify the steps in bootstrapping predictive modeling.



    Vernon C. Smith, Ph.D.
    Vice President, Academic Affairs
    Rio Salado College

    October 3, 2011


  • OBJECTIVES
    Review of Rio Salado as "The Outlier"
    Examine RioPACE as a predictive model
    Identify the steps in bootstrapping predictive modeling

  • THE OUTLIER: "You really shouldn't exist."

  • Located in Tempe, AZ. Part of the Maricopa County Community College District. Largest public, non-profit online 2-year college.

    FY10-11 total unduplicated headcount: 69,619*
    43,093 distance students.**

    Unique attributes:
    One course, many sections
    48 weekly start dates
    23 faculty; 1,300+ adjunct faculty
    RioLearn, a highly scalable LMS

    * Includes credit, non-credit, & ABE/GED.
    ** Includes students who took online, print-based, or mixed-media courses.


  • WHY SHOULD YOUR INSTITUTION BE DEVELOPING PREDICTIVE ANALYTICS?
    Tremendous growth in online community college enrollment (Allen & Seaman, 2008). Institutions need practical responses to the challenges of online retention and success.

    To identify at-risk online students and drive early-alert systems.

    To facilitate and strengthen linkages between instructors and at-risk students within a dynamic online environment.

  • THE MODELS: What if you could use this for good?

  • FIVE STEPS OF ANALYTICS (CAMPBELL & OBLINGER, 2007)
    1. Capture
    2. Report
    3. Predict
    4. Act
    5. Refine


  • RIO SALADO'S JOURNEY
    Which factors are effective as early or point-in-time predictors of course outcome* in an online environment?

    Can we predict course outcomes using data retrieved from our SIS and LMS? If so, can we generate early and/or rolling predictions?

    How do we respond to students flagged as at-risk?

    * Successful = C grade or higher.

  • CAPTURE
    Five Steps of Analytics (Campbell & Oblinger, 2007)


  • [Data-flow diagram: RioLearn, CRM, localized AD, and district PeopleSoft feed a reporting data store (RDS, refreshed nightly) and a central data store (CDS, refreshed hourly), with SQL Reporting and Blue/Explorance reporting servers.]


  • PREDICTIVE LMS FACTORS = LSP
    Logins: frequency of log-ins to the course section homepage
    Site engagement: frequency of LMS activities that suggest at least some level of course engagement (e.g., opening a lesson, viewing assignment feedback)
    Pace: measured using total points submitted for grading
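The three LSP factors above are simple counts over LMS event data. A minimal sketch of how they might be derived, assuming a hypothetical event-log shape (the event names, the tuple layout, and the points field are illustrative assumptions, not RioLearn's actual schema):

```python
# Hypothetical LMS event log: (student_id, event_type, points) tuples.
events = [
    ("s1", "login", 0), ("s1", "open_lesson", 0), ("s1", "submit", 25),
    ("s1", "login", 0), ("s2", "login", 0), ("s2", "view_feedback", 0),
]

# Assumed set of event types that count as "site engagement".
ENGAGEMENT_EVENTS = {"open_lesson", "view_feedback"}

def lsp_factors(events, student_id):
    """Return (logins, site engagement, pace) for one student."""
    logins = sum(1 for s, e, _ in events if s == student_id and e == "login")
    engagement = sum(1 for s, e, _ in events
                     if s == student_id and e in ENGAGEMENT_EVENTS)
    # Pace: total points submitted for grading, per the slide's definition.
    pace = sum(p for s, e, p in events if s == student_id and e == "submit")
    return logins, engagement, pace

print(lsp_factors(events, "s1"))  # (2, 1, 25)
```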

  • REPORT
    Five Steps of Analytics (Campbell & Oblinger, 2007)

  • LOGINS: Chemistry 151, Summer & Fall 2009 (N at unit 3 = 159)

  • SITE ENGAGEMENT: Chemistry 151, Summer & Fall 2009 (N at unit 3 = 159)

  • PACE: Chemistry 151, Summer & Fall 2009 (N at unit 3 = 159)

  • ACTIVITY WEIGHTING
    In a practical application, recent behavior is most relevant.

    Log-in and site-engagement factors are weighted based on when each event occurred relative to the course start and end dates.
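One way to weight events by recency relative to the course start and end dates is a linear ramp. The linear form is an illustrative assumption; the slides only say that the weights depend on when the event occurred relative to those dates:

```python
def recency_weight(event_day, start_day, end_day):
    """Linear recency weight in [0, 1]: 0 at course start, 1 at course end.
    (Assumed functional form; the actual weighting scheme is not specified.)"""
    return (event_day - start_day) / (end_day - start_day)

def weighted_logins(login_days, start_day, end_day):
    """Sum of recency weights over a student's log-in days."""
    return sum(recency_weight(d, start_day, end_day) for d in login_days)

# A 16-week (112-day) course: later log-ins contribute more.
print(weighted_logins([0, 56, 112], 0, 112))  # 0.0 + 0.5 + 1.0 = 1.5
```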

  • ACTIVITY WEIGHTING: Chemistry 151, Summer & Fall 2009 (N at unit 3 = 159)
    Weighted log-ins
    Weighted site engagement

  • PER-WEEK** CORRELATIONS
    Significant correlation between log-ins and course outcome.
    Significance of the correlation increases throughout the duration of the course.
    Similar findings with other LMS activity measures.

    * Significant at the .05 level.
    ** Scaled weeks (16-unit scale).

    Factor | Statistic | Wk 3 | Wk 4 | Wk 5 | Wk 6 | Wk 7 | Wk 8 | Wk 9
    Log in | Pearson r | 0.162, p=0.041* | 0.136, p=0.089 | 0.148, p=0.065 | 0.149, p=0.067 | 0.176, p=0.031* | 0.178, p=0.036* | 0.212, p=0.016*
    Log in | Spearman | 0.146, p=0.066 | 0.082, p=0.309 | 0.087, p=0.282 | 0.098, p=0.232 | 0.124, p=0.132 | 0.153, p=0.074* | 0.18, p=0.041*
    Weighted log in | Pearson r | 0.103, p=0.198 | 0.072, p=0.367 | 0.109, p=0.177 | 0.127, p=0.118 | 0.191, p=0.019* | 0.198, p=0.020* | 0.258, p=0.003*
    Weighted log in | Spearman | 0.086, p=0.278 | -0.004, p=0.955 | 0.023, p=0.778 | 0.087, p=0.286 | 0.179, p=0.029* | 0.232, p=0.006* | 0.272, p=0.002*

    Factor | Statistic | Wk 10 | Wk 11 | Wk 12 | Wk 13 | Wk 14 | Wk 15 | Wk 16
    Log in | Pearson r | 0.218, p=0.014* | 0.231, p=0.009* | 0.246, p=0.006* | 0.247, p=0.006* | 0.269, p=0.002* | 0.273, p=0.002* | 0.288, p=0.001*
    Log in | Spearman | 0.218, p=0.015* | 0.226, p=0.011* | 0.244, p=0.006* | 0.258, p=0.004* | 0.288, p=0.001* | 0.273, p=0.003* | 0.324, p=0.000*
    Weighted log in | Pearson r | 0.218, p=0.016* | 0.274, p=0.002* | 0.295, p=0.001* | 0.285, p=0.001* | 0.32, p=0.000* | 0.273, p=0.004* | 0.336, p=0.000*
    Weighted log in | Spearman | 0.218, p=0.017* | 0.305, p=0.001* | 0.335, p=0.000* | 0.354, p=0.000* | 0.381, p=0.000* | 0.273, p=0.005* | 0.415, p=0.000*
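The Pearson r values in the table can be reproduced for any factor/outcome pair directly from the definition of the coefficient. A small self-contained sketch on toy data (the numbers below are illustrative, not Rio Salado's):

```python
from statistics import mean, pstdev

def pearson_r(x, y):
    """Pearson correlation: covariance divided by the product of stdevs."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

# Toy data: log-in counts vs. course outcome (1 = C or higher, 0 = otherwise).
logins = [2, 5, 1, 8, 3, 7, 0, 6, 4, 9]
outcome = [0, 1, 0, 1, 0, 1, 0, 1, 1, 1]

print(round(pearson_r(logins, outcome), 3))  # 0.853
```

Because the outcome is binary, the Pearson r here is a point-biserial correlation; the slides' Spearman values would be computed the same way on rank-transformed data.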

  • PREDICT
    Five Steps of Analytics (Campbell & Oblinger, 2007)

  • PREDICTIVE MODEL #1 (8TH-DAY AT-RISK)
    Purpose: Run only on the 8th day of class. Derive an estimated probability of success and generate warning levels: Low, Moderate, High.
    Factors: 30 factors selected, covering a broad spectrum of LMS behavioral data and enrollment information.
    Methodology: Naïve Bayes classification model. Accurate, robust, fast, easy to implement (Lewis, 1998; Domingos & Pazzani, 1997).
    Accuracy:** 70% of unsuccessful* students correctly predicted for 6 participating disciplines. Warning levels correlated with course outcome.

    * Success = C or higher.
    ** Tested using random sub-sampling cross-validation (10 repetitions).
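A naïve Bayes classifier of the kind named above can be sketched in a few lines. This is a minimal Gaussian variant with two toy features standing in for the 30 real factors; the feature choices, training values, and class labels are all illustrative assumptions:

```python
import math

# Toy training data per class: [logins in week 1, points submitted in week 1].
train = {
    1: [[5, 40], [7, 55], [6, 50], [8, 60]],  # 1 = successful (C or higher)
    0: [[1, 5], [0, 0], [2, 10], [1, 8]],     # 0 = unsuccessful
}

def fit(data):
    """Per class: prior probability plus mean/variance of each feature."""
    n = sum(len(rows) for rows in data.values())
    model = {}
    for cls, rows in data.items():
        stats = []
        for col in zip(*rows):
            mu = sum(col) / len(col)
            var = sum((v - mu) ** 2 for v in col) / len(col) + 1e-9
            stats.append((mu, var))
        model[cls] = (len(rows) / n, stats)
    return model

def predict_proba(model, x):
    """P(class | x) under the naive conditional-independence assumption."""
    scores = {}
    for cls, (prior, stats) in model.items():
        logp = math.log(prior)
        for v, (mu, var) in zip(x, stats):
            logp += -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
        scores[cls] = logp
    z = max(scores.values())
    exps = {c: math.exp(s - z) for c, s in scores.items()}
    total = sum(exps.values())
    return {c: e / total for c, e in exps.items()}

model = fit(train)
p = predict_proba(model, [1, 6])       # low early activity
print("P(success) =", round(p[1], 3))  # near 0 -> High warning level
```

Estimated probabilities like `p[1]` are what get bucketed into the Low/Moderate/High warning levels.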

  • REFINE
    Five Steps of Analytics (Campbell & Oblinger, 2007)

  • PREDICTIVE MODEL #2 (RioPACE)
    Rio Progress And Course Engagement
    Institutionalization of predictive modeling into the LMS at Rio Salado
    Piloted April 2010
    Automatically updates weekly (every Monday)
    Integrated within RioLearn course rosters

  • PREDICTIVE MODEL #2 (RioPACE)
    Warning levels: generated using a naïve Bayes model with 5 input factors:
    Weighted log-in frequency
    Weighted site engagement
    Points earned
    Points submitted
    Current credit load
    High warning = student has a low probability of success if his/her current trajectory does not change.

  • PREDICTIVE MODEL #2 (RioPACE)
    Activity metrics:
    Log in: Excellent, Good, or Below Average
    Site engagement: Excellent, Good, or Below Average
    Pace: Working ahead, Keeping pace, or Falling behind
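Mapping a raw metric to these three labels is a simple thresholding step. A sketch for the pace metric, assuming a hypothetical ±10% tolerance band around the expected points (the slides do not give the actual thresholds):

```python
def pace_label(points_submitted, points_expected):
    """Map relative pace to the three RioPACE-style labels.
    The 10% tolerance band is an illustrative assumption."""
    ratio = points_submitted / points_expected
    if ratio > 1.1:
        return "Working ahead"
    if ratio >= 0.9:
        return "Keeping pace"
    return "Falling behind"

print(pace_label(120, 100))  # Working ahead
print(pace_label(95, 100))   # Keeping pace
print(pace_label(40, 100))   # Falling behind
```

The log-in and site-engagement labels (Excellent / Good / Below Average) would follow the same pattern with different cut points.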

  • PREDICTIVE MODEL #2 (RioPACE)
    Warning-level distribution: approximately uniform at the beginning of class; the Moderate share decreases while the Low and High shares increase over time.

  • PREDICTIVE MODEL #2 (RioPACE)
    Accuracy* [chart: accuracy results, including other courses]

    * Obtained using random sub-sampling cross-validation (50 repetitions).



  • ACT
    Five Steps of Analytics (Campbell & Oblinger, 2007)

  • AT-RISK INTERVENTIONS
    Course welcome emails: encourage students to engage early. Gen-ed students who log in on the 1st day of class succeed 21% more often than students who do not.* A small trial in Fall '09 showed a 40% decrease in drop rate. The result could not be duplicated when expanded to a large scale; more investigation is needed.

    8th-day at-risk interventions: a trial in Summer & Fall '09 showed no overall increase in success. Low contact rate: it was difficult for faculty to reach students. However, students who did receive direct contact succeeded more often than those who were unreachable.

    * Obtained from Spring 2009 online general education courses at Rio Salado College.

  • ROLES FOR SUCCESSFUL PREDICTIVE MODELING
    Project champion / institutional support: Predictive modeling requires resources and someone who can champion the cause.
    Stakeholders: Could include administration, faculty, student services, and people from the IT department. The stakeholders need to be willing to review models and provide insight and feedback as the model is developed.
    IT department: Something will be needed from IT, whether it be data or implementing the model in a production setting.
    Predictive modeler: Contrary to some marketing brochures, predictive modeling is not a turnkey solution.
    Programmer/analyst: Support from a programmer/analyst can help the person doing the modeling be more efficient. A great deal of the work that goes into predictive modeling can be supported by a programmer/analyst.


    The stakeholders, especially those making use of the outcomes of the project, need to be invested. If they do not buy into the process, they will not use it. If they are not involved in the development (or represented in the development process), they will not use it. If they do not understand the output, they will not use it.

    Having a good working relationship with the IT department is essential; generally, IT has the data and other resources that may be needed.

    Time is key for many reasons: time is needed for model development, testing, training, and deployment to production.

    Institutional support includes many things, such as software, hardware, training, conferences, and time for research.

  • CONCLUSIONS
    LSP matters! Log-ins, site engagement, and pace are correlated with course outcome, even at an early point in the course.

    Colleges can build and bootstrap predictive models using existing LMS data to identify online students who are struggling or are likely to struggle.

    Simple metrics can be used to assess activity performance, which might help instructors launch more customized interventions.

    More research is needed on the intervention side, but the best step is to just get started.

  • REFERENCES
    Allen, I. E., & Seaman, J. (2008). Staying the Course: Online Education in the United States. The Sloan Consortium.
    Campbell, J., & Oblinger, D. (2007). Academic Analytics. EDUCAUSE White Paper.
    Green, K. (2009, November 4). LMS 3.0. Inside Higher Ed.
    […], L., Arnold, K., & Pistilli, M. (2008, March 4). Mining real-time data to improve student su[…]