Learning Analytics - Beyond Quiz Scores and Completion

Apr 12, 2018 by Tim Sarchet


Four categories of data we can use to measure ROI in learning

As new digital learning approaches emerge, so do new ways of measuring learning. Just like in other spheres, a move to digital means data. Lots of data. And that’s part of the challenge we now face in learning. What data is going to be most valuable and move us closer to being able to “prove” the elusive ROI in learning?

One thing we know is that quiz scores and completion, the traditional metrics for e-learning, are no longer enough. They’re still important pieces of the puzzle, but we need more.

Here are four emerging categories of data that Nomadic and other digital learning companies are using to help our clients develop a more complete picture of learning’s impact:

Attention

For some of the newer learning experience platforms, attention means clicks. Borrowed from the world of online advertising, clicks are a helpful data point for telling us which pieces of content earn the attention of our learners. But a click doesn't tell us whether the content is any good. It might help us understand which types of content appeal to learners, but a high click rate may simply reflect an interesting headline. Unless we dig deeper into things like scroll depth, time on page, and bounce rate, and understand what learners do after opening that particular piece of content, measuring clicks has limited utility.

Interactions

This is where we start to get much more interesting data. There are two broad types of interaction we can measure: interactions with the content and interactions with fellow learners.

Interaction with the content can be things like how often a piece of content is shared or liked (which is typically a better metric of the value of a piece of content than clicks) and at which point in a piece of content learners stop paying attention. Knowing how far along people are getting in a video before they stop watching, or at which point in a course the learner drops out can tell us a lot about how well our content is working to earn and sustain the attention of our learners.
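The drop-off analysis described above can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical log of how many seconds each viewing session lasted; it buckets the point at which each session ended so you can see where attention fades.

```python
from collections import Counter

def dropoff_histogram(watch_seconds, video_length, bucket=10):
    """Bucket each session's last-watched second to show where viewers stop.

    watch_seconds: seconds watched per session (hypothetical event data).
    Returns {bucket_start_second: session_count}.
    """
    counts = Counter()
    for s in watch_seconds:
        end = min(s, video_length)          # cap at the video's length
        counts[(end // bucket) * bucket] += 1
    return dict(counts)

# Most sessions on this 120-second video end in the 60s bucket:
sessions = [65, 62, 118, 120, 64, 30, 67]
print(dropoff_histogram(sessions, 120))  # → {60: 4, 110: 1, 120: 1, 30: 1}
```

A spike in one bucket, as in the 60-second mark here, points at the moment in the content where attention is being lost.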

Interaction with fellow learners goes beyond measuring how much users like, understand, or complete the content. In collaborative, digital learning experiences we can see how often and for how long users are directly interacting with each other, and we can start to identify data that illustrates how effective a team is at collaborating on a project or specific task. While not directly demonstrating ROI, being able to measure how well a team collaborates is pretty compelling data, unavailable elsewhere.

Influence

If we're able to measure how well a team is collaborating, we should also be able to measure who on that team has the most influence. Peer assessment of comments and submissions, analysis of whose comments spark the most conversation, and influence maps that show which users attract the most attention all help us identify who has the most influence. They also show us exactly what those people are doing to gain that influence. When we get this right, learning can become a powerful tool for talent and performance management and a powerful indicator of leadership effectiveness in the real (digital) world.
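One simple way to approximate the influence map idea is to treat a discussion log as a graph and count how many distinct peers each user's comments draw in. This is a hedged sketch over a hypothetical reply log, not a description of any particular platform's analytics:

```python
from collections import defaultdict

def influence_scores(replies):
    """Score each author by how many distinct peers reply to them.

    replies: list of (replier, original_author) pairs from a
    hypothetical discussion log. Returns {author: distinct_repliers},
    a crude proxy for who attracts the most attention.
    """
    attracted = defaultdict(set)
    for replier, author in replies:
        if replier != author:               # ignore self-replies
            attracted[author].add(replier)
    return {author: len(peers) for author, peers in attracted.items()}

log = [("bea", "al"), ("cy", "al"), ("al", "bea"), ("bea", "al")]
print(influence_scores(log))  # → {'al': 2, 'bea': 1}
```

Richer versions of this idea weight replies by conversation length or apply graph centrality measures, but even a distinct-replier count starts to surface who is driving the conversation.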

Ideas & Opinion

Digital learning also has the potential to be the most effective tool for capturing employee ideas and opinions, and it allows that to happen honestly and at scale. Carefully designed poll questions embedded in the learning experience elicit more honest responses than a survey whose explicit intention is to gather employee feedback. Building real work challenges into the learning experience can generate thousands of suggested solutions. By combining peer assessment of those solutions with automated text analysis, we can uncover and shortlist the ideas with the biggest potential to change our organizations.
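The peer-assessment-plus-text-analysis pipeline can be sketched as a simple ranking. In this hypothetical example, each submitted idea carries a list of peer ratings; the mean rating does the heavy lifting, and word count stands in as a placeholder for whatever automated text analysis a real system would apply:

```python
def shortlist(ideas, top_n=2):
    """Rank submitted ideas by mean peer rating, breaking ties by
    word count (a stand-in for real automated text analysis).

    ideas: list of (text, [peer ratings]) pairs (hypothetical data).
    Returns the top_n idea texts.
    """
    def score(item):
        text, ratings = item
        return (sum(ratings) / len(ratings), len(text.split()))
    ranked = sorted(ideas, key=score, reverse=True)
    return [text for text, _ in ranked[:top_n]]

ideas = [
    ("automate onboarding checklists", [5, 4, 5]),
    ("more coffee in the break room", [3, 2]),
    ("mentor matching across regions", [4, 5, 4]),
]
print(shortlist(ideas))
# → ['automate onboarding checklists', 'mentor matching across regions']
```

The point is less the scoring function than the workflow: peers filter thousands of submissions down to a shortlist no central team could review by hand.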

None of these categories of data appears on its own. Capturing meaningful data requires that the learning be designed with these data outcomes in mind.

Sign up for a free demo of Nomadic Learning and we'll show you how we go about designing learning experiences that generate the most meaningful data in our industry.