Intro to Learning Analytics for Training Programs

Has your corporate learning program been effective?
How can you tell?
Most often, learning program analytics are an afterthought. Due to time constraints, many corporate training leaders forgo a detailed measurement plan.
While setting clear measurement goals and analytics takes time, it is an absolute necessity. All learning programs should have clearly defined goals and collect data at each step of the learning journey.
If you’re new to learning analytics and not sure where to start, here is what you need to know.
What is learning analytics?
Learning analytics is the measurement, collection, analysis and reporting of data about learning to understand and optimize learning experiences and their organizational impact.
Measurement is the simplest form of analytics. This is where you track learning activities and record values.
These recorded values are collected into a larger pool of data. Collection can occur passively, where software gathers information automatically, or actively, where a human must intervene to collect information, such as through personal interviews or surveys.
Why measure learning?
Measuring learning isn’t only about the learning itself. It’s about business outcomes.
The reason we measure is to understand the learning program’s effectiveness in solving a specific business problem.
What business problem is your learning program aiming to solve?
First, understand why a learning program is in place at all. Most often, it’s in place to produce specific business outputs.
The business output goals should be identified and codified before a learning program is developed. Before you can collect data on a program, you need to know how the data connects to the business.
This can be achieved by meeting with stakeholders and asking questions, such as:
- What are the current problems?
- What are their top priorities?
- What does success look like to them?
These questions are the basis for developing your learning program. They provide insight into what analytics are important to measure throughout the program.
Example:
If your company is implementing new enterprise-wide ERP software, the business goal is to skill up every employee to reduce lost productivity time. Stakeholders have a vested interest in skilling up employees in the quickest time, at the lowest cost.
The instructional designer must ask specific questions of the stakeholders upfront to clearly define the business outcome. Gathering this information on the end-business goal will help the designer work backwards to identify the key points of measurement along the learning journey.
How to Gather Learning Data
Collecting learning data can come in many forms.
It can be simple, such as automatically collecting quiz scores through an LMS. Or it can be complex, such as conducting one-on-one interviews with team members to understand how they solve complicated issues.
Most often, a data gathering strategy combines two approaches:
- Looking at data you already have
- Collecting new data based on the learning program goals
It’s typically best to start with the data that has already been collected.
Your current data might be in Excel files, Google Sheets, or stored in an LMS database. Take a look at your learning goals and prioritize your available data accordingly.
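For instance, here is a minimal sketch of pulling together existing data with Python’s pandas library. The file names and column names are placeholders standing in for whatever exports your organization already has.

```python
# A minimal sketch of consolidating existing learning data.
# File and column names below are hypothetical placeholders.
import pandas as pd

# Load data already on hand from common sources
quiz_scores = pd.read_csv("lms_quiz_export.csv")           # e.g., learner_id, course_id, score
survey_results = pd.read_excel("post_course_survey.xlsx")  # e.g., learner_id, satisfaction

# Join the sources on a shared learner identifier
combined = quiz_scores.merge(survey_results, on="learner_id", how="left")

# Prioritize against a learning goal, e.g., surface the lowest scorers first
print(combined.sort_values("score").head(10))
```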
Gathering new data requires you to plan practically and think long-term.
Remember that learning data changes continuously over time as business needs evolve. It’s important to ensure data is collected in a reliable format and stored in a common location.
The format should be accessible and easy to read. Many LMS platforms utilize xAPI technology that allows for tracking across a multitude of learning media. xAPI can track whether a learner has completed a course, read a book, attended an event, listened to a podcast, and so on. The format is easily customizable to meet your learning program goals.
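As an illustration, here is a minimal sketch of recording an xAPI statement to a Learning Record Store (LRS) with Python’s requests library. The verb ID comes from the standard ADL vocabulary, but the LRS endpoint, credentials, and course identifier are hypothetical placeholders.

```python
# A minimal sketch of sending an xAPI "completed" statement to an LRS.
# Endpoint URL, credentials, and course IRI are placeholders.
import requests

statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",  # standard ADL verb
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/erp-onboarding",  # hypothetical course IRI
        "definition": {"name": {"en-US": "ERP Onboarding Course"}},
    },
}

response = requests.post(
    "https://lrs.example.com/xapi/statements",  # hypothetical LRS endpoint
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_user", "lrs_password"),  # placeholder credentials
)
response.raise_for_status()
```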
You’ll also need to consider your long-term data storage options – making sure the data is stored and categorized in a location your organization will use into the future. Does your organization already have a data storage option? Does your LMS have this capability? Explore your options and find the best storage solution for your organization.
And finally, as you craft your new learning data collection strategy, make sure the process is repeatable and scalable.
Does it require a lot of manual management? Or can the data be collected and processed automatically? Consider how quickly your company is growing. The process you establish should be able to adapt to expanding teams and changing organizational landscapes.
Types of Learning Analytics
To fully analyze a learning program, there are three types of analytics to consider.
- Experience Analytics: focuses on the actual learning experiences and activities
- Learner Analytics: focuses on behaviors of the individual or group of learners
- Program Analytics: focuses on the overall program and its effectiveness in achieving the business goals
Let’s take a look at each analytics category in more depth.
Experience Analytics
Focusing on the actual learning experiences, experience analytics looks into the navigation, UX, content, and usage patterns of learners.
For example, analyzing the interactions of an eLearning course. Or understanding hiccups in the UX navigation of a mobile learning program.
The more experiential data points you can measure, the deeper insights you’ll have for your experience analysis.
Examples of experience analytics include:
- Test and quiz scores
- Amount of time spent on specific modules
- Time of day activities are completed
- Course completion rates
- Interaction with materials
- Content shared
- Most popular discussion topics
- Posts with most comments
- Individuals with most engagement / interactions
- Instructional video pause points
Once this data is measured, it then needs to be evaluated. This helps determine which aspects of the learning experience are effective. And which are not.
This is where we look at the test and quiz scores and aim to make sense of the information they provide. Have only 15% of participants passed the final exam? Have we received feedback from the survey stating that a button isn’t working properly?
These insights help determine barriers to learning. They help guide the training analyst’s attention toward solving specific problems.
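To make this concrete, here is a minimal sketch of computing a pass rate and a completion rate from quiz records, assuming hypothetical data and a 70% passing threshold.

```python
# A minimal sketch of evaluating experience data.
# Records and the 70% passing threshold are hypothetical.
records = [
    {"learner": "a", "final_exam": 62, "completed_course": True},
    {"learner": "b", "final_exam": 88, "completed_course": True},
    {"learner": "c", "final_exam": 45, "completed_course": False},
]

pass_rate = sum(r["final_exam"] >= 70 for r in records) / len(records)
completion_rate = sum(r["completed_course"] for r in records) / len(records)

print(f"Final exam pass rate: {pass_rate:.0%}")        # a low rate flags a learning barrier
print(f"Course completion rate: {completion_rate:.0%}")
```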
Learner Analytics
Learner analytics looks at the individual or group of learners and analyzes their behavior.
You may find that a handful of managers haven’t finished taking an eLearning course. And that most of their peers have fully completed the eLearning, final assessment, and required collaboration activities.
These varying behaviors provide predictive insight into the knowledge and skill acquisition of each population. They allow us to identify top performers and those who need some extra development.
It helps address questions, such as:
- What skills are being attained? And by whom?
- Which groups have completed the most training?
- Which team members are high performers? And which members need more development?
- What topics are most interesting/useful/challenging for the learners?
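Building on the manager example above, here is a minimal sketch that flags learners who haven’t finished the required activities. The learner names and activity labels are hypothetical.

```python
# A minimal sketch of a learner-analytics completion check.
# Learner names and activity labels are placeholders.
required = {"elearning", "final_assessment", "collaboration"}

completions = {
    "manager_1": {"elearning"},
    "manager_2": {"elearning", "final_assessment", "collaboration"},
    "manager_3": {"elearning", "final_assessment"},
}

# Flag learners whose completed activities fall short of the required set
needs_follow_up = {
    learner: sorted(required - done)
    for learner, done in completions.items()
    if required - done
}
print(needs_follow_up)  # e.g., {'manager_1': [...], 'manager_3': ['collaboration']}
```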
To make sense of learner analytics, you’ll first need to define the skills and knowledge that learners should attain. Otherwise, how do you know what to measure? This can be done by creating a competency framework.
What is a competency framework?
A competency framework outlines what individuals need to do to be effective in their jobs, connecting individual performance to business success.
Competencies can be defined as a combination of skills, knowledge, and judgment an individual should have to effectively perform their job.
Typically, a company-wide competency framework is created, with competencies defined for each function within the organization.
For example, if you have a customer service team, each team member should have competencies such as communication, rapport building, empathy, customer focus, and technical knowledge.
A competency framework documents these attributes. It provides a way to measure the journey of skill and knowledge acquisition.
Creating a competency framework
Let’s take a look at a high-level overview of creating a competency framework. Please note: the proper development of a competency framework takes tons of time, dedication, and often external consulting resources. This is a simplified overview to help understand the basics of the process.
1. Define the scope and needs
How will your competency framework be used?
Some organizations create company-wide frameworks, reaching all departments and all people. Others leave it up to the individual departments themselves. Each department will require different competencies. The accounting team requires different skills and knowledge than the IT team.
This step is all about defining the goals of creating a competency framework and the broader context: the who, what, why, where, and how.
2. Gather data
Your scope is defined. Before building the framework, you’ll need to gather data about the current state of competencies. This is perhaps the most important step in creating your framework.
What skills are currently needed to perform in certain roles? Where are the gaps? Information can be gathered through on-the-job observation, personal interviews, surveys, and other forms of analysis. Make sure to identify skills, techniques, and behaviors. And survey a diverse sample of team members representative of the greater population.
Aim to capture the actual behaviors that team members are performing. For a customer service team, you might capture behaviors such as, “asks problem-identifying questions, solves technical software issues, understands complex functions of the software.”
3. Create the framework
This step is about organizing the behavior statements into competencies.
You’ll first need to compile behavior statements into similar groups. Keep this as broad as possible to begin – aiming for three buckets. For example, this could be “technical skills, communication skills, and decision-making skills.”
From here, continue breaking up competencies into smaller sub-groups. These can typically be pulled from the data-gathering step. Under the larger category of “technical skills,” you might have “identifies software bugs, solves software issues, understands complex functions.” These skills are more specific, and all fall into the category of “technical skills.”
At the end of this step, you should have a list of broad categories, and a list of specific skills/knowledge within those categories.
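As a simple illustration, the result can be represented as a mapping from broad categories to specific behavior statements. This sketch reuses the customer service examples from this section; the exact groupings are illustrative.

```python
# A minimal sketch of organizing behavior statements into competencies.
# Categories and statements are illustrative examples from this article.
framework = {
    "technical skills": [
        "identifies software bugs",
        "solves software issues",
        "understands complex functions",
    ],
    "communication skills": [
        "asks problem-identifying questions",
        "builds rapport with customers",
    ],
}

for competency, behaviors in framework.items():
    print(f"{competency}: {len(behaviors)} behavior statements")
```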
Once all competencies are identified, you now must add measurement. How are you going to measure each competency?
Measurement is where learning analytics comes into full swing. This is about understanding your current data collection tools. Then setting up a system that gathers metrics on both objective and subjective competency mastery.
Look at each behavior statement and brainstorm the most effective way to measure mastery. Some examples might be: viewing training completion rates, analyzing backend ERP system data to track user movements and time spent, or asking for peer-review feedback amongst team members.
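Here is a minimal sketch of that brainstorm captured as data: each behavior statement paired with a candidate measurement source. The pairings are illustrative, not prescriptive.

```python
# A minimal sketch pairing each behavior statement with a measurement method.
# The metrics shown are illustrative suggestions, not a fixed standard.
measurements = {
    "identifies software bugs": "backend system data: bugs logged per month",
    "solves software issues": "ERP system data: tickets resolved and time spent",
    "understands complex functions": "training completion rates and quiz scores",
    "asks problem-identifying questions": "peer-review feedback from team members",
}

for behavior, metric in measurements.items():
    print(f"{behavior} -> measured by: {metric}")
```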
4. Launch the framework
As you launch the framework, it’s important to make sure communication is clear to all team members. Everyone needs to understand what the framework is, why it’s there, and how it’s being measured. As with launching any new internal program, you’ll want to consider a robust training and change management plan.
As your program matures and evolves, you will learn what is working for your organization and what needs to change. The important part is to set your competency framework up correctly from the beginning. Create a robust framework that is flexible enough to adapt to changing business needs.
Program Analytics
Along with experience and learner analytics is program analytics. Program analytics focuses on the overall program and its effectiveness in achieving the business goals.
It is not looking at skill acquisition of individuals and groups. That’s learner analytics.
Program analytics focuses on understanding the broader performance of the program, connected to the business. In other words, has the learning program done what it was supposed to do?
Some questions that program analytics aims to answer:
- What was the ROI of the learning program?
- Did the learning program succeed in meeting its goals?
- Has company performance improved?
- How could this learning program be even more effective in the future?
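For the ROI question, a common formula is net program benefit divided by program cost, expressed as a percentage. Here is a minimal sketch with hypothetical figures.

```python
# A minimal sketch of a common training ROI calculation:
# ROI (%) = (net program benefit / program cost) * 100
# All figures below are hypothetical.
program_cost = 50_000          # design, delivery, and tooling
productivity_gain = 80_000     # estimated value of reduced lost-productivity time

roi = (productivity_gain - program_cost) / program_cost * 100
print(f"Learning program ROI: {roi:.0f}%")  # 60%
```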

About the Author:
Andrew DeBell is a learning experience strategist and content developer on the customer education team at Atlassian. Connect with him on LinkedIn for more.