Empowering the Instructor with Learning Analytics
Leveraging data to improve your online course
Presentation by
Michael Wilder
Agenda
• What is learning analytics?
• Research: past and present
• What are the challenges and issues?
• Case Study: Journalism 444
• Conclusions
• Questions
Definitions
What is learning analytics?
“Learning analytics refers to the interpretation of a wide range of data produced by and gathered on behalf of students in order to assess academic progress, predict future performance, and spot potential issues.”
(Johnson, Adams, and Cummins, 2012)
Definitions
What is learning analytics?
“Learning analytics is the use of intelligent data, learner-produced data, and analysis models to discover information and social connections, and to predict and advise on learning.”
(Siemens, 2010)
Definitions
What is learning analytics?
“Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”
(Society for Learning Analytics Research, 2012)
Definitions
What is learning analytics?
“LA collects and analyzes the ‘digital breadcrumbs’ that students leave as they interact with various computer systems to look for correlations between those activities and learning outcomes.”
(Educause, 2011)
Process
Characterized as “an engine with five stages”
• Capture
• Report
• Predict
• Act
• Refine
(Campbell, DeBlois, and Oblinger, 2007)
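The five-stage cycle can be sketched in code. The following Python sketch is purely illustrative: the function names, data shapes, and threshold logic are assumptions invented for this example, not any real LMS API.

```python
# Illustrative sketch of the five-stage analytics cycle
# (Campbell, DeBlois, and Oblinger, 2007). All names here are
# hypothetical -- no specific LMS or vendor tool is implied.

def capture(raw_events):
    """Capture: collect raw activity events attributable to a student."""
    return [e for e in raw_events if e.get("student")]

def report(events):
    """Report: aggregate event counts per student."""
    counts = {}
    for e in events:
        counts[e["student"]] = counts.get(e["student"], 0) + 1
    return counts

def predict(counts, threshold=3):
    """Predict: flag students whose activity falls below a threshold."""
    return {s for s, n in counts.items() if n < threshold}

def act(flagged):
    """Act: draft an intervention (here, simply a reminder message)."""
    return {s: f"Reminder sent to {s}" for s in sorted(flagged)}

def refine(threshold, flagged, class_size):
    """Refine: loosen the threshold if it flags more than half the class."""
    return threshold - 1 if len(flagged) > class_size / 2 else threshold
```

In practice each stage would be far richer (the "Act" stage, for example, is usually mediated by an instructor or advisor rather than automated), but the loop structure is the point of the sketch.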
Adoption
Time-to-adoption horizon
The 2012 higher education NMC Horizon Report places learning analytics in a two- to three-year adoption window.
(Johnson, Adams, and Cummins, 2012)
Opportunities
Application
• Predicting outcome achievement
• Course and program dashboards
• Curricular evaluation
• Prioritizing learning outcomes
• Setting course and instructional policies
• Defining academic quality
(Bach, 2011)
Research
Pre-existing areas of research
• Social network analysis
• Latent semantic analysis
• Dispositions analysis
(Ferguson, 2012)
Research
Newer areas of interest
• Social learning analytics
• Analytics for reflective learning
• Visual analytics
• Textual analytics
• Analytics infrastructure
(Ferguson, 2012)
Research
Major research groups
• Educational data mining
“How can we extract value from these big sets of learning-related data?”
• Learning analytics
“How can we optimize opportunities for online learning?”
• Academic analytics
“How can we substantially improve learning opportunities and educational results at national or international levels?”
(Ferguson, 2012)
Research
Predictors and indicators
• Activity indicators (number of logins, time spent in the learning environment)
• Performance indicators (grades, quiz scores)
• Dispositional indicators (age, GPA, prior learning experience, financial status)
• Student artifacts (essays, blog and discussion forum posts, media productions)
(Brown, 2012)
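To make these indicator types concrete, here is a minimal, hypothetical sketch of blending activity indicators and a performance indicator into a single engagement score. The function name, fields, weights, and normalization constants are invented for illustration; a real model would be fit to course data rather than hand-weighted.

```python
# Hypothetical composite of the indicator types listed above.
# Weights and nominal expectations (35 logins, 900 minutes over a
# five-week course) are illustrative assumptions, not research findings.

def engagement_score(logins, minutes_online, quiz_avg,
                     w_activity=0.4, w_performance=0.6):
    """Blend activity indicators (logins, time online) with a
    performance indicator (quiz average, 0-100) into a 0-100 score."""
    # Normalize activity against nominal expectations, capped at 1.0.
    activity = min(1.0, (logins / 35) * 0.5 + (minutes_online / 900) * 0.5)
    performance = max(0.0, min(1.0, quiz_avg / 100))
    return round(100 * (w_activity * activity + w_performance * performance), 1)
```

Dispositional indicators and student artifacts are deliberately left out of the sketch: they require qualitative judgment, which is exactly why (as argued later) the educator remains the best interpreter of the data.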
Research
Example projects/research tools
• The Signals project
• Contextualized Attention Metadata (CAM)
• LOCO-Analyst
• SMILI Open Learner Modelling Framework
• Social Networks Adapting Pedagogical Practice (SNAPP)
• Gephi
(Ferguson, 2012)
Challenges
Learning analytics researchers will need to address:
• Development of new tools, techniques, and people
• Data: openness, ethics, and scope
• Target of analytics activity
• Connections to related fields
(Siemens, 2012)
Challenges
• What is an efficient way to visualize the data? All at once or separate “dashboards”? Wordle or Google Fusion Tables?
• How should interventions and responses be implemented? Fully automated by an LMS? Mediated by educator and advisor?
(Brown, 2012)
Issues
LMS-centric
The initial model of gathering educational data focused on the use of learning management systems.
“Although the latest LMSs offer an increasing set of features, students are beginning to reach the educational institution with solid experience on how to interact with their peers in ways not covered by the LMS.”
(Pardo and Kloos, 2011)
Issues
Proprietary software
“Early indications from vendors suggest that findings will be treated as proprietary and will not be made available to other researchers.”
• How are various factors weighted in an algorithm?
• Are the concepts being analyzed the right ones?
• Can researchers adjust the algorithms of vendor tools to conduct experiments on other factors that might impact learning?
(Siemens, 2012)
Issues
Concerns about ethics and privacy
• Should students be told that their activity is being tracked?
• How much information should be provided to students, faculty, parents, issuers of scholarships, and others?
• How should faculty members react?
• Do students have an obligation to seek assistance?
(Ferguson, 2012)
Issues
Concerns about ethics and privacy
• What data is appropriate to collect about students? What data is inappropriate?
• Who should be able to access the data and view results? Which data should be reported anonymously?
• What is the impact of showing faculty modeling results? Do any of the data bias faculty instruction and evaluation of students?
(Bach, 2010)
Issues
Misuse/misinterpretation of data
• Who gets to interpret the data?
• Who owns the data?
• Who benefits from the data? The institution, the instructor, or the student?
• Can data be used against instructors/departments/institutions to determine employment, tenure, or funding?
• Is time in an LMS representative of engagement? Simply because educational content pages are open in a browser, does it mean that a student is processing the information?
Case Study
Journalism 444
Advanced Interactive Media Design
Case Study
Goals of this study
The primary goal of this study is to explore ways that individual instructors can use contemporary network data as well as traditional feedback methods to iteratively improve the quality of online course curriculum.
This study will use the data generated by the Summer 2012 offering of Journalism 444 as a case study.
No confidential data is shared in this study.
Case Study
Journalism 444
• Introduces digital media strategies to young journalists transitioning from print media to the Internet
• Incorporates a variety of traditional Web development and Web 2.0 technologies into the curriculum, including:
• HTML
• Blogging
• Social media
• Web-based multimedia
• Provided for senior-level undergraduate journalism students
Case Study
Journalism 444
• Five-week course delivered between July 9, 2012 and August 12, 2012.
• Nine students participated in the course.
• Course used both the institutional learning management system (WebCT) and an external Web server with traditional HTML pages and a WordPress blogging system.
• Modules/lessons were delivered through external HTML pages.
• Announcements, assignments, discussions, and the grade book were delivered through the LMS.
Case Study
Assumptions
• Educational data is highly contextual.
• Data requires comparison over time in order to be valid.
• The best person to interpret educational data is the educator.
• Educators can improve the educational experience by:
• Improving the quality, scope, and sequence of the curriculum;
• Improving the interface of the learning environment;
• Understanding the behavior of students.
Case Study
Data-gathering opportunities
• Student tracking reports from the LMS
• Additional server log analysis for blogs and HTML pages via Google Analytics
• Qualitative data from:
• End-of-module surveys
• End-of-course institutional evaluations
• Student reflective journals
• Student e-mails
• Student discussion board posts
• End-of-course surveys
Case Study
LMS Tracking Reports
Case Study
LMS tracking reports
• Summary of Activity
Provides an overview of general student activity.
• Tool Usage
Provides an overview of how often tools, such as Assessments, Assignments, or Discussions, are used.
• Course Item Usage
Provides an overview of how often items, such as a quiz, an assignment, or a discussion topic, are used.
Case Study
LMS tracking reports
• Entry Page or Tool
Provides an overview of the pages or tools most frequently used as course entry points.
• Exit Page or Tool
Provides an overview of the pages or tools most frequently used as course exit points.
• File Usage
Provides an overview of the files viewed most frequently.
• Student Tracking
Provides a detailed summary of activity information for individual students.
LMS Reports
Summary of Activity Report: 1001 - 2012 Sumr
July 9, 2012 to August 12, 2012

Total user sessions: 510
Average user session length: 00:25:56
Average user sessions per day: 15
Average user sessions per day on weekdays: 14
Average user sessions per day on weekends: 16
Most active day: July 16, 2012
Least active day: July 13, 2012
Most active hour of the day: 8:00 PM - 9:00 PM
Least active hour of the day: 5:00 AM - 6:00 AM
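Statistics like those in this report could also be recomputed from raw session logs outside the LMS, which is useful when curriculum lives on an external server. A minimal sketch, assuming a hypothetical log format of (start, end) datetime pairs per session:

```python
# Sketch of recomputing a "Summary of Activity" style report from
# raw session logs. The (start, end) tuple format is an assumption
# for illustration; real server logs would need parsing first.
from collections import Counter
from datetime import datetime

def summarize(sessions):
    """sessions: list of (start, end) datetime pairs, one per session."""
    total = len(sessions)
    avg_len = sum((end - start).total_seconds() for start, end in sessions) / total
    by_day = Counter(start.date() for start, _ in sessions)
    by_hour = Counter(start.hour for start, _ in sessions)
    return {
        "total_sessions": total,
        "avg_session_minutes": round(avg_len / 60, 1),
        "most_active_day": max(by_day, key=by_day.get),
        "most_active_hour": max(by_hour, key=by_hour.get),
    }
```

The same counters extend naturally to weekday/weekend splits or least-active days by taking the minimum instead of the maximum.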
LMS Reports
Summary of Activity Report
Interpretation
• Most students log in to the LMS at least once a day with some students logging in several times.
• Students log in about the same number of times during the weekend as they do during the week.
• Students log in to the LMS both the most and the least during the first week of class.
• Students prefer to engage with the curriculum generally after dinner, perhaps after work and perhaps after children have been put to bed.
LMS Reports
Tool usage
Interpretation
• The Weblinks tool is by far the most used LMS tool, which makes sense since all curricular modules are linked outside the system.
• The next most popular tools are the discussion, assignment, folder, and mail tools.
• The Who’s Online tool is used the least and should probably be removed from this course.
Case Study
Google Analytics
Google Analytics provides:
• Activity tracking outside of an LMS
• Traffic statistics over a defined period of time
• Visualization of traffic through a Web site (such as online curriculum)
• Browser type
• Operating system
• Internet service provider
• Access location (such as country and city)
• Mobile access
• More
Google Analytics
Visit overview interpretation
• Course activity is slow during the first week of class, but picks up speed rapidly during the second week.
• Activity at the end of the third week and during the fourth week is volatile with spikes of heavy activity followed by valleys of low activity, possibly indicating workload juggling between different courses.
• A slowing of activity during the fifth week mirrors a decreased workload during the last week.
Google Analytics
Visit flow interpretation
The bulk of the traffic flow is primarily within module three (“Introduction to the Dreamweaver Interface”). This module introduces many new ideas to students and has five lessons.
Google Analytics
Browser
Browser interpretation
• Most of the students are using Safari, an Apple product, followed by Firefox.
• Safari is also the default browser for iOS mobile devices.
• Android browsers are also present.
• Curriculum development should take into consideration the limitations of certain browsers, as well as the potential for mobile learning.
Google Analytics
City
City interpretation
Most of the traffic is coming from Las Vegas and Henderson (local to the University), but a proportion of the traffic is coming from outside the local area. This accurately reflects statements made by students indicating they would be out of town during part of the semester.
Google Analytics
Service provider
Service provider interpretation
Only about an eighth of the traffic comes from on campus, indicating that most students access this online course remotely.
Google Analytics
Mobile devices
Mobile access interpretation
• Around 10% of the traffic to the online curriculum is via mobile devices.
• More than 50% of this traffic is via Apple iPhone or iPad.
• These figures justify further mobile access development.
Case Study
End-of-module surveys
End-of-module surveys
In order to gather qualitative student feedback that can inform the improvement of online curriculum, end-of-module surveys were provided.
During the summer 2012 semester, these surveys were not mandatory.
End-of-module surveys
Following are the questions provided in each end-of-module survey:
• The content of this unit was useful and relevant to the objectives of this course.
• The content of this unit was clear and organized.
• The amount of time provided to accomplish the tasks
• What elements of this unit did you find the most enjoyable?
• What elements of this unit did you find the least enjoyable?
• What suggestions could you give the instructor to improve this unit?
End-of-module surveys
Example response:
What elements of this unit did you find the least enjoyable? “Tracking down an interview along with doing with the rest of the module was difficult to do in time.”
What suggestions could you give the instructor to improve this unit? “Make an announcement or better reminder for the interview.”
End-of-module surveys
Example response:
What elements of this unit did you find the least enjoyable?“CSS was very difficult.”
What suggestions could you give the instructor to improve this unit? “The directions provided in the CSS didn't always match my Dreamweaver. The directions shown weren't options in my Dreamweaver. However, I don't know how to fix this but it made it difficult and time consuming.”
End-of-module surveys
Example response:
What elements of this unit did you find the least enjoyable?“This stuff is so confusing to me! I'm not the most tech savvy person, and this is an online course so I'm sure that doesn't help. Haha”
What suggestions could you give the instructor to improve this unit? “Nothing, I'm sure I'll get the hang of it.”
End-of-module surveys
End-of-module survey interpretation
• The “Professional Web Developer Interview” assignment may need more scaffolding and time to complete.
• The module on Cascading Style Sheets will need more development, with a particular eye on compatibility between versions of Dreamweaver.
• Students feel challenged in some modules, but still feel confident they can continue.
Case Study
Reflective Journals
Reflective Journals
At the end of every semester, students are asked to reflect on the course as a whole.
Following is the assignment prompt:
“Take a moment to think about some of the technologies and strategies you've explored during this class. Write a discussion post reflecting on your work this last semester. Identify any skills as an instructional media designer that you've learned that might be useful to you in your future career endeavors. Also identify any online technologies that would be useful to learn in the future.”
Reflective Journals
Example response:
“Throughout this class we have spent numerous hours on the computer learning about technologies and strategies on how to create websites. I can honestly say that in the beginning I dreaded doing homework for the class. After seeing the final result and knowing that my hard work paid off I am truly grateful that I had the opportunity to partake in this class. There are those classes that you are just happy to get through and there are those classes that you actually learn and take wisdom away from, this was one of those classes.”
Reflective Journals
Example response:
“This class pushed me beyond my comfort zone, made me frustrated and was incredibly time consuming - something my non-health science related classes have not done here at UNLV. I’d say that it’s a good thing because that’s what classes are supposed to do. Maybe it’s weird to say, but I’m used to skating through the majority of my classes and winging about 99% of the work, but that’s not actually possible in this one - and I’m glad because I feel like I finished with a usable skill (even though I’m clearly a novice.)”
Reflective Journals
Example response:
“This class definitely reminded me of when I used to mess around with codes on MySpace. I didn't know what they meant and I would just change the sizes and make titles such as "About Me” on MySpace. Back then, it took a long time to figure out what codes was changing what and what codes made the font different. After taking this class, I understand what they meant. I have learned so much in this class about HTML codes and CSS. I really enjoyed editing my layout to how I wanted it to be. My layout was what I wanted and it wasn’t premade. It was my own idea.”
Reflective Journals
Reflective journal interpretation
Although the course provided a rigorous academic framework, student satisfaction was high. Students felt a sense of accomplishment and that their work was relevant and useful for their future endeavors.
Conclusions
• Learning analytics can be useful to the individual instructor in the pursuit of online curricular improvement.
• Instructors can improve online curriculum by understanding the context in which students engage with the environment (such as time of day, heavy traffic patterns, browser type, mobile devices, etc.).
• Instructors should not discount qualitative methods for learning how to improve the scope, sequence, and efficiency of online curriculum.
• Instructors should be very aware of the ethical and privacy issues of gathering data.
• Teaching and learning is contextual, and the best person to interpret data is the instructor.
Conclusions
A 2011 Educause Learning Initiative brief describes learning analytics as “the coming third wave, a new technology with great potential to increase student academic success.”
(Brown, 2012)
Conclusions
Potential
“Theoretically, LA has the potential to dramatically impact the existing models of education and to generate new insights into what works and what does not work in teaching and learning. The results are potentially transformative to all levels of today’s education system.”
(Siemens, 2012)
References
Bach, C., 2010. Learning Analytics: Targeting Instruction, Curricula and Student Support. Office of the Provost, Drexel University.
Campbell, J., DeBlois, P., and Oblinger, D., 2007. Academic analytics: A new tool for a new era. Educause Review, 42(4): 40-57.
Educause (2011). 7 things you should know about first-generation learning analytics
Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges. Technical Report KMI-12-01, Knowledge Media Institute, The Open University, UK.
Pardo, A., and Kloos, C., 2011. Stepping out of the box: Towards analytics outside the learning management system. LAK’11.
Siemens, G., 2012. Learning analytics: Envisioning a research discipline and a domain of practice. LAK’12, 29.
Society for Learning Analytics Research, 2012. About. http://www.solaresearch.org/about/
Credits/Contacts
This presentation made by:
Michael WilderInstructional Design Coordinator
E-mail | Web Site