How to use analytics to improve your learning programs


When was the last time you checked the performance analytics of your last training program?

Do you even have performance analytics to check?

If your answer is a guilty “Not really”, you’re not alone. Far too often, HR and learning design teams are guilty of putting all their effort and excitement into a program and releasing it to the world, only to never check up on it again.

It’s easy to put the task in the too-hard basket. Sometimes the software used to deliver the training just doesn’t offer many metrics, or you’re not sure how to interpret the ones it does. Qualitative data can be incredibly valuable, but it’s also hard to gather and quantify.

Yet the evaluation process is part of almost every learning development model out there, whether it’s a waterfall process like ADDIE, or an iterative model like SAM, and for a lot of good reasons.


The benefits of learning analytics

The ultimate goal of learning analytics is to identify which elements of your program are successful, and which are falling short.

Without this understanding, mistakes can be easily carried forward into future projects, crippling them before you’ve even begun.

You may even discover that an entire course is showing consistently poor performance, indicating that it needs an overhaul, or can be cut from your program entirely.

On the flip side, you may discover a course with consistently outstanding performance, indicating a chance to dive deeper into the reason for its success.

Analytics can also help you to predict which learners are likely to succeed and which aren’t, making it easier for you to identify struggling learners and plan intervention early on.

Overall, analysis and evaluation of your eLearning courses are crucial for informed planning and decision-making, lowering costs and increasing learner success.

How to read learning analytics

Sometimes, the issue is not a lack of data, but rather how to understand what it means. Data is only useful if it can be interpreted into a story that answers why the numbers appear as they are. 

Most learning platforms offer the following basic completion analytics at a minimum:

  • The number of attempts at the course
  • Whether the learner passed or failed the assessment
  • What percentage of the course the learner has completed.

Sometimes these numbers can seem self-explanatory, but be careful of jumping to conclusions, as your assumptions may not be correct. Instead, see if your hypothesis can be corroborated by additional data, either from other numbers that tell the same story, or by gathering qualitative data.

If you’re new to reading learning analytics, here is a quick guide to understanding what these basic metrics may be indicating.
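To make this concrete, suppose your platform can export those three metrics per learner. A few lines of Python (a sketch only - the field names and figures here are invented, not from any particular platform) can roll them up per course:

```python
from collections import defaultdict

# Hypothetical export: one row per learner per course, covering the
# three basic metrics above. Field names are assumptions.
rows = [
    {"course": "Safety 101", "attempts": 1, "passed": True,  "pct_complete": 100},
    {"course": "Safety 101", "attempts": 4, "passed": False, "pct_complete": 60},
    {"course": "Onboarding", "attempts": 1, "passed": True,  "pct_complete": 100},
]

summary = defaultdict(lambda: {"learners": 0, "attempts": 0, "passes": 0, "pct": 0})
for r in rows:
    s = summary[r["course"]]
    s["learners"] += 1
    s["attempts"] += r["attempts"]
    s["passes"] += int(r["passed"])
    s["pct"] += r["pct_complete"]

for course, s in summary.items():
    n = s["learners"]
    print(f"{course}: avg attempts {s['attempts'] / n:.1f}, "
          f"pass rate {s['passes'] / n:.0%}, avg completion {s['pct'] / n:.0f}%")
```

Even a rough roll-up like this makes it much easier to spot the outlier courses worth investigating.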


Number of attempts

Number of attempts typically refers to how many times a learner has begun the course assessment. Failing to complete or pass the assessment requires the learner to attempt it again.

Therefore, a high number of attempts at the assessment could indicate that the learning material has not sufficiently prepared the learner for the assessment. Mistakes to look for could include:

  • Do your questions ask for small, irrelevant details from the course?
  • Do your questions ask for answers that were not given in the course at all?
  • Is the language of your questions and/or answers unclear, or do they use different terminology from the learning content?
  • Have you accidentally set up a question incorrectly?
  • Are your learners being taught conflicting information from their managers or colleagues?

Another possibility is that your learners do not have enough time to sit and complete the entire assessment. This can be the case if your course or assessment is very long, and your learners are unable to set aside enough time to finish it in one sitting. Ensure your course and assessment are only as long as they need to be, and that your course settings allow learners to save their progress and, if needed, skip straight to the assessment when re-launching the course.

Pass or fail

Pass or fail data is a little more straightforward. Did the learner meet the minimum requirements for passing the assessment on their last attempt, or not?

Matching pass/fail data with number of attempts can help you to determine if learners are making multiple attempts because they are failing the assessment, or because they don’t have enough time to complete the assessment.
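A minimal sketch of that cross-check, assuming you can export attempt counts, last-attempt results, and completion percentages (all names and thresholds here are illustrative, and the 3-attempt cut-off should be tuned to your own data):

```python
def diagnose(attempts, passed, pct_complete):
    """Rough triage combining attempt count with pass/fail and completion.
    The 3-attempt threshold is an assumption, not a standard."""
    if attempts < 3:
        return "no red flags"
    if pct_complete < 100:
        return "quitting mid-course - likely a time problem"
    if not passed:
        return "finishing but failing - review the assessment questions"
    return "struggled but passed - review content clarity"

# Hypothetical records: (attempts, passed last attempt, % complete)
for name, rec in {"A": (5, False, 100), "B": (5, False, 40), "C": (1, True, 100)}.items():
    print(name, "-", diagnose(*rec))
```

The point is the combination: attempt counts alone can't separate "failing repeatedly" from "running out of time", but paired with completion data they can.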

Pass/fail rates can also give you insight into how easy or difficult your assessment is. It can be difficult to find the sweet spot here. Too easy, and your assessment will not accurately differentiate between learners who have a good grasp of the content and those who do not. Too hard, and your assessment will frustrate your learners, require them to spend additional time on multiple attempts, and foster a negative attitude towards company training.

If you have a high failure rate, try to determine if specific questions are tripping up your learners, or else review if your questions are falling into any of the traps mentioned under ‘Number of attempts’.

If your pass rate is 100% on the first attempt, you may want to consider adjusting your assessment to be a little more difficult in order to challenge learners who skipped through the learning content.

Percentage completion

If your learners have not finished your course, your learning platform will likely show how far through the course they made it before quitting.

Completion rates typically reflect how engaged your learners were with the course content (the more engaged, the higher the rate), or how much time they had to complete the course (the less time, the lower the rate).

If your completion rate is unsatisfactory, it’s important to check other data for signs of potential issues in the course design. If there is no clear picture to be found in the numbers, then go to the people! Try the following:

  • Chat to the manager in charge of the learners - are they being given enough time and the right devices to complete the course?
  • Ask the manager’s own opinion about the training. You may find that the manager themselves has a low view of training, creating a negative culture that is reflected in the numbers.
  • Talk to the learners, or run a feedback survey. You may discover technical or cultural issues, or frustration with the learning content.
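Before those conversations, a quick threshold check over an export can tell you which learners to start with. A sketch with made-up data; the 70% cut-off is an arbitrary assumption:

```python
# Hypothetical completion data: learner -> % of course completed.
completion = {"A": 100, "B": 45, "C": 80, "D": 30}
THRESHOLD = 70  # assumption: below this, follow up in person

to_follow_up = [name for name, pct in completion.items() if pct < THRESHOLD]
print(to_follow_up)  # ['B', 'D']
```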

How to institute learning data analytics

Ultimately, learning data analysis and evaluation requires a team culture committed to continual improvement. But how do you establish a culture like this? Here are 10 learning analytics must-dos to get started.

1. Create a trial analytics project 

Start with a real problem to solve - and just choose one. Where is your department losing the most money, or struggling to prove ROI? Perhaps you can identify an area of learning that is causing you headaches but where you have limited visibility of what’s actually causing the problem. Either way, choose an area that you can isolate, and that you can easily access data for. Maximise your chance of success by starting with the lowest-hanging fruit.

2. Find your cheer squad

Figure out who needs to be involved in getting this problem solved. Is it a priority for anyone else in your organization? Who needs to be engaged and on-board to get this done? 

3. Scope your information needs

Identify the questions you need answered in order to solve your problem. What information do you need? Does it exist, or does it need to be created? Who owns it? Where does it live, and in what format? 

4. Identify the returns

Chances are, you’ll need to sell your project internally. This will be so much easier if you can prove ROI. Work out how much the problem is costing the business, how much of the problem you think you could solve, and come up with an ROI estimation to take to the board.
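The estimate itself is simple arithmetic. A hedged sketch, with every figure invented purely for illustration:

```python
# All figures are illustrative assumptions, not benchmarks.
annual_cost_of_problem = 120_000  # e.g. rework caused by failed training
solvable_fraction = 0.25          # how much of it you believe analytics can fix
project_cost = 18_000             # analytics work plus implementing any changes

expected_saving = annual_cost_of_problem * solvable_fraction
roi = (expected_saving - project_cost) / project_cost

print(f"Expected saving: ${expected_saving:,.0f}, ROI: {roi:.0%}")
```

Even a back-of-the-envelope number like this gives the board something concrete to weigh the project against.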

5. Identify your budget

Consider how much your project is likely to cost, and how much you can afford. Consider not just the cost of the analytics, but the costs of implementing any change that may come out of it, too. How will you fund it? Is there anyone else who might benefit from it that you can get on board?

6. Check your data

Unless you already have an established Learning Management System in place, you will need to check that your organization holds the right type and amount of clean, quality, standardized data for the job.

7. Build your team

Who do you need in order to get the work done? Depending on whether you use an LMS or not, you might need to build an extensive team to extract and interpret the data. Identify your corporate capabilities to determine the next step. And don’t forget to resource for change planning and execution once you get the results.  

8. Check your tool box

If you’re using an LMS, you’re in luck! Many systems already provide a robust range of analytics tools and reports as standard. If not, it’s worth looking at alternative software such as SNAPP and Netlytic, both of which gauge learner engagement with content across various channels, including social media.

9. Make sure your governance is watertight

Data capture and storage need to be carefully managed in order to stay within the lines of legality and ethics. Make sure your governance and infrastructure are watertight by having data protection policies and practices securely in place.

10. You’re ready!

Once you have these steps in place, you’re ready to go. Remember to start small and specific, and enjoy watching your learning programs go from strength to strength.

Oct 10, 2022
