
Predictive Learning Analytics in Large Online Course Delivery


Michelle Troberg & Eugenia Suh
University of Toronto Mississauga, Dept. of Language Studies

Annual Conference of the Society for Teaching and Learning in Higher Education
University of Sherbrooke, Quebec
June 20, 2018
Research talk
The issue: student engagement → success
• A large body of research shows that student engagement is associated with effective learning
  (Astin 1993; Braxton et al. 2004; Kuh 2001; Kuh et al. 2007; Pascarella & Terenzini 2005…)
• We teach a high-enrolment introductory English Grammar course delivered as an online and hybrid course (approx. 1,000 students/year)
• What does it mean for a student to be engaged in such a course?
• We take a behavioural perspective, considering student interaction with the LME in particular.
Research questions

[Diagram: achievement – student characteristics – activity in the LME]

1) What is the relationship between measures of student online behaviour and achievement in our course?
   • Previous research: time and frequency on task; participation in virtual discussions

2) How does student demographic information correlate with online course activity and achievement?
   • Previous research: high school grades, CGPA, financial assistance

See: Morris et al. (2005), Campbell et al. (2007), and Whitmer et al. (2012)
Findings I: Correlates and Predictors of Success
• Performance on the final exam appears to be the most reliable correlate of effective learning.
  Frequent mid- to low-stakes assessments are not good indicators of how much knowledge the student is building; students rely heavily on external resources to complete assessments.
• The participation mark seems to be the single most reliable predictor distinguishing students with a grade of A on the final exam. Given the nature of the participation assessments, it seems to be an indicator of commitment to learning in the course more than anything else.
  • Therefore, simple commitment to learning appears to result in successful learning in this course!
• Students in years 2, 3, and 4 perform better on the final exam (grade of B or higher)
Findings II: Students at Risk
- Under 20 hours logged into the course shell
- Under 50 clicks in the Course Materials folder
- Under 75% on term course work
- Under 65% on Participation
- A CGPA of under 2.0

Credit: University of Toronto Business Intelligence (data@utoronto.ca)
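The risk criteria above amount to a simple rule-based flag. A minimal sketch of how such a flag could be computed from LME metrics is below; the record type, field names, and example values are illustrative, not the university's actual pipeline — only the thresholds come from the slide:

```python
# Rule-based "at risk" flag built from the five thresholds on this slide.
# The StudentMetrics record and its field names are hypothetical.
from dataclasses import dataclass

@dataclass
class StudentMetrics:
    hours_logged: float       # total hours in the course shell
    material_clicks: int      # clicks in the Course Materials folder
    term_work_pct: float      # percent on term course work
    participation_pct: float  # percent on Participation
    cgpa: float               # cumulative GPA (4.0 scale)

def risk_flags(s: StudentMetrics) -> list[str]:
    """Return the list of risk criteria a student meets."""
    flags = []
    if s.hours_logged < 20:
        flags.append("under 20 hours logged")
    if s.material_clicks < 50:
        flags.append("under 50 clicks on course materials")
    if s.term_work_pct < 75:
        flags.append("under 75% on term work")
    if s.participation_pct < 65:
        flags.append("under 65% on participation")
    if s.cgpa < 2.0:
        flags.append("CGPA under 2.0")
    return flags

# Example: a student who trips two of the five criteria
student = StudentMetrics(hours_logged=12, material_clicks=80,
                         term_work_pct=82, participation_pct=60, cgpa=2.8)
print(risk_flags(student))  # two flags: low hours, low participation
```

In practice each flag would feed an early-alert report rather than a hard classification, since the slide presents these as risk measures still being validated.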
Validation of common sense notions
• If engagement = success, what is success in LIN204?
  • here, we define it as achieving a B or higher on the final exam.
• What student characteristics are predictors of success in the course?
  • previous success in other courses (CGPA)
  • commitment to learning (motivation as reflected in the participation mark)
• What behaviours correlate with success in the course?
  • time engaged with the course: over 50 hrs logged in and 175 clicks on course materials
Non-intuitive finding
• Impressionistically: TAs observe insecurity/anxiety in the non-native speakers that doesn’t seem to be as noticeable in the native speakers.
• Expectation: native speakers will do well on the final exam
• However, on the final exam…
  • 35% of native speakers obtain a D or lower
  • 29% of non-native speakers obtain a D or lower
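Whether the 35%-vs-29% gap is statistically reliable depends on cohort sizes, which the talk does not give. As a sketch, a two-proportion z-test (one standard way to compare the two rates; not necessarily the analysis the authors plan) looks like this, with hypothetical cohort sizes of 200 each:

```python
# Two-proportion z-test comparing the D-or-lower rates of native vs
# non-native speakers. Cohort sizes are NOT given in the talk; the
# counts below are hypothetical placeholders chosen to match 35%/29%.
from math import sqrt, erf

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Pooled two-proportion z statistic and two-sided p-value."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 70/200 = 35% (native) vs 58/200 = 29% (non-native)
z, p = two_proportion_z(70, 200, 58, 200)
print(round(z, 2), round(p, 3))
```

At these hypothetical sizes the difference would not reach conventional significance, which is exactly why the "Next steps" slide calls for establishing statistically reliable correlations on the full data.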
Instructor’s role: communication of findings
• frequent and sustained time on task is a behaviour associated with high-achieving students (but which material, and when, and how?)
• motivation is the strongest predictor of achievement
• performance on low- and mid-stakes assessments may NOT be an accurate predictor of how students will perform on the final exam
Next steps
• Analyse data from 2016 (fall), 2017, 2018 (summer)
• Establish statistically reliable correlations (if any)
• Continue to collect risk measures for each iteration
  • How much variation is there from iteration to iteration?
• Establish reliable risk measures
  • How do our correlations relate to risk measures?
• Conduct a qualitative study by way of surveys
• Feed our findings back to students in a meaningful way
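Establishing correlations between a behaviour measure and exam achievement can start from the sample Pearson coefficient. A minimal stdlib-only sketch follows; the data points are invented for illustration and do not come from the course:

```python
# Sample Pearson correlation between an online-behaviour measure
# (e.g. hours logged) and final-exam score. Data points are invented.
from math import sqrt

def pearson_r(xs: list[float], ys: list[float]) -> float:
    """Sample Pearson correlation coefficient of paired observations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

hours = [10, 22, 35, 48, 55, 70]   # hours logged in course shell (invented)
exam  = [52, 58, 66, 71, 78, 85]   # final exam percentage (invented)
print(round(pearson_r(hours, exam), 3))
```

A real analysis would also report a p-value and check iteration-to-iteration stability, per the bullets above; with roughly 1,000 students per year even modest coefficients can be estimated reliably.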
Selected References
• Astin, A. W. 1993. What matters in college? Four critical years revisited. San Francisco: Jossey-Bass.
• Braxton, J. M., Hirschy, A. S., & McClendon, S. A. 2004. Understanding and reducing college student departure. ASHE-ERIC Higher Education Report, Vol. 30, No. 3. Washington, DC: School of Education and Human Development, The George Washington University.
• Campbell, J., DeBlois, P., & Oblinger, D. 2007. Academic analytics: A new tool for a new era. EDUCAUSE Review 42(2): 40–57.
• Fritz, J. 2013. Using analytics at UMBC: Encouraging student responsibility and identifying effective course designs (Research Bulletin). Retrieved from https://library.educause.edu/resources/2013/4/using-analytics-at-umbc-encouraging-student-responsibility-and-identifying-effective-course-designs
• Kahu, E. 2013. Framing student engagement in higher education. Studies in Higher Education 38(5): 758–773.
• Kuh, G. D. 2003. What we’re learning about student engagement from NSSE. Change 35(2): 24–32.
• Kuh, G. D., Kinzie, J., Buckley, J., Bridges, B., & Hayek, J. C. 2007. Piecing together the student success puzzle: Research, propositions, and recommendations. ASHE Higher Education Report 32(5). San Francisco: Jossey-Bass.
• Morris, L. V., Finnegan, C., & Wu, S.-S. 2005. Tracking student behavior, persistence, and achievement in online courses. The Internet and Higher Education 8(3): 221–231.
• Pascarella, E. T., & Terenzini, P. T. 2005. How college affects students: A third decade of research (Vol. 2). San Francisco: Jossey-Bass.
• Whitmer, J., Fernandes, K., & Allen, W. R. 2012. Analytics in progress: Technology use, student characteristics, and student achievement. EDUCAUSE Review Online. Retrieved from https://er.educause.edu/articles/2012/8/analytics-in-progress-technology-use-student-characteristics-and-student-achievement
