Using data to transform the learning experience

Coursera Blog

by Alexandra Urban, Coursera Teaching and Learning Specialist, & Talia Greenblatt-Kolodny, Coursera Partner Learning and Development Manager

Courses on Coursera are reaching millions of learners worldwide, but not all of them make it through to completion. How can we help more learners succeed? The Coursera Teaching & Learning team has recently partnered with several university course teams to try to answer this question. Together, we’re tracking metrics, analyzing data, and designing content tests that are improving our understanding of the factors that lead to learner success. Instead of viewing the course launch as the end of development, this joint initiative reinforces the dynamic iteration and feedback cycle we hope to create with our university partners for each and every course.

One of our recent findings shows that expanding practice opportunities, such as formative (ungraded) assessments, can help learners progress further and pass a greater number of graded assessments. We’re excited to share some insight into the process we used to arrive at this conclusion, along with some tips for course teams looking to increase opportunities for practice in their courses.

Our research process

How do we identify opportunities to increase learner success in a course? Would a certain assignment benefit from more scaffolding, or do learners need additional practice to grasp a particular concept? Each course is different, so we start by diving into the specifics: looking at where learners struggle or become inactive, reading course and content-item reviews, and identifying content that multiple learners have flagged. We work closely with the course team to home in on areas to improve, and then brainstorm next steps for iteration and testing.

Then, we set up content tests*—A/B tests that split the course into two versions, each with an equal number of learners—to investigate the impact of specific improvements. This setup allows us to measure how changes to the content affect learner behavior and grades throughout the course.
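As an illustration of how such a 50/50 split could work (Coursera's actual assignment mechanism isn't described in this post, and all names below are hypothetical), a deterministic hash of the learner ID keeps each learner in the same course version for the duration of the test while dividing a large cohort roughly in half:

```python
import hashlib

def assign_version(learner_id: str, course_id: str) -> str:
    """Illustrative 50/50 bucketing: hash the learner and course IDs
    so the same learner always sees the same course version."""
    digest = hashlib.sha256(f"{course_id}:{learner_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# A large cohort lands in the same buckets on every call and splits
# close to evenly between the two versions.
buckets = [assign_version(f"learner-{i}", "course-42") for i in range(10_000)]
share_a = buckets.count("A") / len(buckets)
```

Hash-based bucketing like this avoids storing an assignment table and guarantees a learner never flips between versions mid-course, which matters when the metric of interest is progression over many weeks.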

The impact of practice

We frequently recommend adding more practice assessments to courses, especially when we see many learners struggling or dropping out around high-stakes graded assessments. It’s generally accepted that practice questions and activities are beneficial—these resources give learners an opportunity to apply their knowledge and verify that they have the skills to succeed before they attempt a graded assessment.

But does practice really work? In some cases, too many practice assessments could actually overwhelm learners and slow them down, decreasing progression and success. We wanted to be scientific in our approach to this recommendation, so we split several courses into A/B versions with and without extra practice assessments to analyze learner progress and performance in both versions.

Excitingly, we found that when the weekly workload matched learner expectations, adding practice assessments not only increased participation and success on related graded assessments but also led to greater overall persistence and progress. This finding was reproduced across several courses in a variety of disciplines, from data science and programming to art, suggesting that course structure can have a significant impact regardless of the topic of the course.
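To give a sense of how one might check that a difference in pass rates between two course versions is statistically meaningful (the counts below are entirely hypothetical, not Coursera's data), a standard two-proportion z-test can be sketched with just the standard library:

```python
from math import erf, sqrt

def two_proportion_z(pass_a: int, n_a: int, pass_b: int, n_b: int):
    """Two-sided two-proportion z-test: did version B (with added
    practice) change the graded-assessment pass rate vs. control A?"""
    p_a, p_b = pass_a / n_a, pass_b / n_b
    p_pool = (pass_a + pass_b) / (n_a + n_b)          # pooled pass rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 1,200 of 3,000 control learners passed the
# graded assessment vs. 1,450 of 3,000 in the practice version.
z, p = two_proportion_z(1200, 3000, 1450, 3000)
```

With counts this large, even a modest lift in pass rate yields a clearly significant result; in practice one would run such a comparison per module to confirm the gains persist through the course.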

Here is one course example of the increased learner progression we witnessed, with significantly more learners completing not only Module 1 with the addition of practice assessments, but also Modules 2, 3, 4, and 5:

[Figure: blog-post-graph.png]

Figure. In this course, more learners (y-axis) in the course version with added practice opportunities (blue line) passed graded assessments and progressed through each module. We saw significant increases in learner progression across Modules 1 through 5 (x-axis).

What does this mean for your courses?

Imagine if you could increase the number of learners completing your first graded assessment by more than 50 percent—without actually changing anything about the assessment itself—and the number of learners completing the first half of your course by almost 75 percent. These are the results we saw when adding more practice with tailored feedback across several courses on Coursera.

If you are an educator, consider adding formative assessments to your course to help learners practice applying concepts before they attempt graded assessments, as well as to keep them engaged with the material. Remember to build practice opportunities into your course structure instead of expecting learners to create that practice for themselves. And be sure to include feedback for specific answers so learners know what they did wrong and where to go to review relevant material.

At Coursera, we’re proud to partner with the world’s best instructors to create an outstanding online learning experience. By researching content design and applying these findings to more courses, we can set learners up for success and amplify your efforts to create content that will transform lives around the world.

If you would like to talk to us about adding meaningful practice opportunities in your course, contact us at pedagogy-research@coursera.org.

* At Coursera, we use Content A/B Tests to investigate the impact of specific changes. We only run this type of A/B test when the impact of a change is uncertain—that is, when there are reasons to believe the change could help, hinder, or have no effect on learning outcomes. In this case, adding extra assessments might have created too much work per week for learners and actually decreased progression. We used A/B tests to identify the best way to integrate practice assessments, and we’re now working to implement expanded practice in more courses.

