Components Of Learning Engagement
This is just how I think of engagement and motivation, but it may help you as well. Think of engagement and motivation as a professional journey. You have a vision (you can see the results in your mind’s eye). Motivation is the fuel. It can drive whatever vehicle you choose. You might think of this fuel as intrinsic (the driving force is the feeling of accomplishment) or extrinsic (when you reach your destination, you might get promoted). You start with a full tank.
But fuel without a vehicle is just excitement without progress. Engagement is the vehicle that moves you through this journey. Let’s say you pick a car as your vehicle. A car with a full tank. You’re ready to hit the road. Driving around in circles quickly burns fuel while you get nowhere. Running urgent errands instead of staying on track also wastes fuel. To drive a car, you’re going to need at least three different types of engagement:
- Behavioral or physical
That is, turning the wheel, pressing the gas and brake, turning on the signal, etc.
- Cognitive or mental
That is, making decisions about how fast you go, where you go, when to take a rest, how to avoid accidents, etc.
- Affective or emotional
That is, how you feel about the trip, how excited you are to see new places, and how satisfied you feel about the progress.
All three components together (behavioral, cognitive, and affective) contribute to what we often call simply (learner) engagement. The challenge is to design an experience that balances these components: what people do, what mental decisions and actions they take, and how emotionally involved they get. A journey is progress through both time and space, and the intensity of these three engagement elements may vary throughout it.
What Does The Research On Learner Engagement Say?
My journey-through-space-and-time analogy is nice, but is there research data backing it up? A recent Learning Guild publication is a good start because it focuses on the relationship between learner engagement and instructional outcomes. I highly recommend reading the whole paper, “Learner Engagement and Instructional Outcomes.” Dr. Jane Bozarth, Director of Research at the Learning Guild, does a great job “translating” the research language and findings of Charles Dye into a practical, more digestible piece for those with eyes untrained in the language of academia: “Among the findings particularly relevant to our readership were isolation of three dimensions of learner engagement: affective, cognitive, and situational.” Trowler (2010) summarizes the factors of the learner engagement construct as:
- Behavioral engagement: relating to students’ actions. For example, class attendance, submission of work, contribution to class discussion, or participation in school-related activities (e.g., extracurricular sports or school governance).
- Emotional engagement: relating to students’ affective reactions to their learning. For example, an emotionally engaged student might report being interested in their course and enjoying learning.
- Cognitive engagement: relating to students’ psychological investment in their learning. For example, the desire to go beyond the requirements of the class and the adoption of metacognitive learning strategies.
According to Charles Dye, what’s missing from these comprehensive studies is the learning environment as a critical element: “The focus on learning environment as an element of learner engagement is of concern because while adult learners in the workplace are essentially the same as those of 30 years ago, learning environments are not.” The way we live, shop, entertain ourselves, and get things done is dramatically different from the way it was decades ago. The technology available for delivering and facilitating learning in the workplace has also changed. I deliberately used the word “available,” because many of us are still stuck with the old-fashioned “content delivery and management” approach, without the necessary personalization and adaptive technology.
Is There A Correlation Between Learner Engagement And Learning Outcome?
“Learner performance and learner engagement were found to be highly positively correlated, r(326) = .96, p < .001. It must be noted that there is no claim of a causal relationship in the data, although that is the ultimate goal of this line of research.” As for the notion of the intensity of these components changing as you progress through the journey:
Moreover, data consistently supported the idea that learner engagement was not fixed throughout a learning experience, but rather changed—often quite quickly—as a consequence of both the learner and the environment. Many practitioners and organizations have focused on driving the learner to “engage” in the learning experience to improve outcomes, reduce attrition, and accomplish the organizational goals of the training program.
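To make the r(326) = .96 statistic concrete, here is a minimal Python sketch on synthetic data (an illustration of my own, not the paper’s dataset). Note how the toy data is built so that “performance” literally follows “engagement” plus noise: that construction guarantees a high correlation, which is exactly the kind of causal knowledge a real correlational study cannot give you.

```python
import math
import random

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: 328 learners, so degrees of freedom = n - 2 = 326,
# matching the paper's r(326) notation. "Performance" is engagement plus
# noise BY CONSTRUCTION, so a strong positive r is guaranteed here.
random.seed(1)
engagement = [random.uniform(1, 10) for _ in range(328)]
performance = [0.8 * e + random.gauss(0, 0.7) for e in engagement]

r = pearson_r(engagement, performance)
print(f"r({len(engagement) - 2}) = {r:.2f}")  # a strong positive correlation
```

With real learners, of course, we only observe the two variables; the data alone cannot tell us whether engagement drives performance, performance drives engagement, or a third factor drives both.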
Caution! When The Journey Is Mandated…
“Participants interviewed in this study noted much higher dissatisfaction with mandatory training not directly useful or applicable to them as learners in their role, with one participant noting ‘most mandatory company-wide training is a waste of time.’” Other studies indicate the same results:
Additionally, when employees feel like they’re being controlled, says Dobbin, organisational studies show they tend to react negatively. So, when diversity training is designated as mandatory—which Dobbin’s research found was the case at 80% of corporations in the US—employees can perceive these sessions as much less palatable than if they were voluntary.
My Main Takeaways From The Paper On Learner Engagement
Personally, I took four key points away from the publication (plus a fifth from my own experience):
1. Design For All Components
First, no one factor of learner engagement is sufficient in and of itself to result in an engaged learner. This has profound importance for both instructional design and delivery. There are myriad new learning environments; it is critical to engaging the learner that they be placed in an effective and supportive learning environment and receive relevant and authentic instructional content.
I completely agree with the first part of the statement: focusing on a single component out of the three is not sufficient. For example, during my interviews with new hires about their onboarding, 95% of them recalled one particular course they liked (a gamified arcade), yet only 30% remembered what they learned there.
2. Perception Of The Learner And Alignment (Relevance) Count
“The second implication of this model is that the perception of the learner of both the training program and the alignment of intent between the learner and the instructional program being delivered is critical to learner engagement.”
First, note this implication says “perception of the learner” and not reality. Does this mean L&D needs to be skilled in not only learning design but psychology? Surprise! Second, alignment will not happen until we move from a content delivery service to enablement that is personal and adaptive based on current and desired skills.
3. Engagement Changes
The third implication is that the intensity and frequency of engagement through these components change over time. This brings up the question: when and how to observe and measure engagement? During the experience itself? Right at the end? A day or two later? How does the peak-end rule affect the perception and recall of the experience?
“The peak-end rule is a psychological heuristic that changes the way we recall past events. We remember a memory or judge an experience based on how they felt at the peak moments, as well as how they felt at the end.”
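As a toy illustration, here is a sketch of the peak-end heuristic in code. The plain average of the peak moment and the final moment is my own simplification for illustration, not a formula from the article or the research it cites:

```python
def remembered_rating(moment_ratings):
    """Toy peak-end model: the remembered judgment is the average of the
    most intense moment and the final moment, ignoring duration and
    everything in between."""
    if not moment_ratings:
        raise ValueError("need at least one moment")
    peak = max(moment_ratings, key=abs)  # most intense moment
    end = moment_ratings[-1]             # final moment
    return (peak + end) / 2

# Two courses with the same moment-by-moment average (5.0) can be
# remembered very differently depending on where the high point falls.
flat_course = [5, 5, 5, 5, 5]
strong_finish = [3, 4, 5, 5, 8]  # same average, but peak=8 and end=8

print(remembered_rating(flat_course))    # 5.0
print(remembered_rating(strong_finish))  # 8.0
```

If something like this model holds, measuring engagement only at the end of an experience will over-weight the finale and the single most intense moment, which is exactly why the timing of measurement matters.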
4. Mandatory Trainings
The fourth implication concerns mandatory training. Research shows that forcing people to take training undermines intrinsic motivation and breeds negativity. On top of that, mandatory trainings are often generic, which makes it even harder to find relevance and alignment. I once had an ethics training scenario where I had to decide whether it’s ethical/legal to accept a yacht trip around the world from a vendor. Hmmm… let me bridge that river when I get there.
5. The Fifth Element
The final critical component does not come from the paper, but rather from my decades of experience in workplace learning. You can easily fall into analysis paralysis when it comes to designing the perfect learning experience. And so, for those of you who are in workplace learning, my guidance is this: “the goal is not to be right, it is to make a difference.”
You don’t need to write a peer-reviewed research paper. The goal is not to be right, but rather to make a difference. The difference you have to make happens under your specific conditions: culture, corporate biases, etc. You are the only one who knows all the details. So, do your research, but don’t expect step-by-step instructions. Use evidence-based approaches to minimize trial and error. Start where you are, measure the right things, and iterate from there.
If you tell the business that you’re 80% sure of your solution today but could probably get to 99% in six months with more research, I guarantee they will take the 80%. Why? The priorities you’re aligned with and the problems you’re solving today may change completely in six months, so you might end up with a perfect solution for a problem you no longer have (or no longer care about). A quick win (making a difference) today builds trust and street cred, so you can influence stakeholders in the future.
So far, we’ve established the following about learner engagement:
- There are three components of engagement (physical, cognitive, and emotional).
- Focusing on only one of these components is not sufficient.
- The intensity of these components may vary during the experience.
- Mandatory trainings are often doomed. 😊
- Workplace-learning guidance: you don’t need to write a peer-reviewed research paper on this. The goal is not to be right but to make a difference. Start where you are, measure the right things, and iterate from there.
How To Make The Learning Experience Engaging?
Knowing what builds engagement, the fundamental question is how to design it. The rest of the article (along with the third article in the series, which will follow shortly) will go through some of the common approaches. However, before you try any of these solutions, make sure you do need to design a learning experience! If the problem is not solvable by formal learning, all of these approaches are ineffective. Making any course engaging when the course doesn’t make a difference is a waste of time. And time, as you will see, is one of the top learning barriers cited.
Technology Is Never The Answer
Technology vendors often see the problem in… surprise! Technology: “if only there was a feature…”
In fact, 67 percent of organizations say that user engagement is the top barrier to adopting technology-enabled training. One reason for this could be the lack of compelling functionality. Only 39 percent of LMS users in a recent survey gave their system’s feature set a high rating. 
According to Charles Coy, senior director of Analyst and Community Relations at Cornerstone OnDemand:
The idea of rating learning courses in the same way that you would rate a book on Amazon has two benefits. One, the good or popular courses bubble to the top. Two, it encourages people to go back in after they’ve completed something they thought was useful to let their colleagues know about it. They can earn a “top-reviewer” status.
The year was 2014. A lot has changed in the last eight years. Anecdotally, I keep hearing people complaining about their LMS, and about L&D transitioning from one platform to another. But it’s not the lack of features that people complain about! There are plenty of features, along with a sea of content you can drown in. User engagement is not low because the LMS does its designed job poorly. It was designed for “managing” learning courses, and for admins there are plenty of features to manage course content. End users are different. Their problem does not start with a lack of features. It starts with the end: measuring the wrong things.
You must measure the right things. Beyond counting the features a system offers, look at their quality, practicality, and value for end users. Unfortunately, in many organizations, the people who decide which systems to buy are not the ones who use them. And as we know, training is often the tool used to compensate for bad UX design, mismatches between the application and the problem, or broken processes. The value of L&D is not in the visible content of courses. It is in the invisible change that happens in people’s brains, which leads to the application of knowledge on the job and, in turn, to its impact on performance.
“Lack Of Time” To Learn
“Lack of time” for learning emerged as one of the top barriers in multiple surveys across multiple clients I’ve worked with over the last five years. The lack of time (or perceived lack of time) also resonated with other L&D leaders across the industry during our recent EdTech board meeting. (Note that “lack of time” does not mean people don’t learn. It means they deprioritize the time-consuming effort of trying to find a formal course that may or may not solve their problem.) Stakeholders often react by demanding shorter courses. I’m not convinced this addresses the root of the problem. Again, if we measure the value of our work through the duration of courses, we’re focusing on the wrong metrics. A course needs to be as long as it needs to be: sometimes 5 minutes, sometimes 15 minutes, sometimes 0 minutes.
Microlearning To The Rescue
A quick googling reveals that one of the top contenders for increasing learning engagement is still microlearning. If lack of time is a challenge and a reason for disengagement, then shorter learning events must be the answer. Microlearning has been the industry frontrunner for addressing this problem for years. A shorter, more digestible, “snackable” learning design sounds like exactly what the busy workplace needs. Is microlearning what L&D professionals prefer? According to an engagement-strategy tip blog, yes: “94% of L&D professionals prefer microlearning vs. traditional.”
Beware! Any time you read “something shiny vs. traditional,” be skeptical. As Edward Tufte says, approach novel findings with an open mind but not an empty head. So, yes! We may be onto something here. Let’s make sure we can also cite the source: the blog refers to www.softwareadvice.com. Unfortunately, this is not an article or publication but a site. I searched high and low but couldn’t locate the source graphics.
Plan B: googling the text itself leads to some articles referencing research by Michael Boyette in 2012. That’s a good sign! We’re back in business again! After more searches and links, I finally land on the actual Boyette research from 2012.
Skepticism Pays Off
Well, this is not exactly research. It is a poll conducted by Michael Boyette (Rapid Learning Institute) at an ATD (formerly ASTD) conference. Michael’s exact poll statement was slightly different: “94% said that e-learners prefer short form modules (10 minutes or less) for soft-skills training.”
This changes the original idea: it’s not that 94% of L&D professionals prefer microlearning, but rather that 94% of L&D professionals participating in a poll said eLearners prefer it, specifically for soft-skills training. At the end of the article, they include the limitation of the study: “With a sample size of 43, RLI considers these early results directional and not definitive, and plans to survey more learning professionals to validate the findings.”
Sample Size Matters!
So, let me recap: 94% of 43 people is 40.42, so presumably not everyone responded, or some rounding happened. Let’s assume the best case: 40 people. The original out-of-context statement that 94% of L&D professionals prefer microlearning should be interpreted as: “Among the 43 people at the ASTD conference who responded to the poll, 40 believed that learners prefer microlearning for soft-skills training.”
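The back-of-the-envelope math is worth running yourself. Here is a rough sketch; the confidence interval uses a normal approximation, which is my own illustrative addition (and is itself shaky for a proportion this close to 1 with only 43 respondents; a Wilson interval would be more defensible):

```python
import math

n = 43        # poll respondents
p_hat = 0.94  # the headline "94%"

# 94% of 43 is not a whole number of people, so some rounding happened.
print(p_hat * n)  # 40.42, i.e. roughly 40 of the 43 respondents

# A rough 95% confidence interval (normal approximation) shows how wide
# the uncertainty around a 43-person poll really is.
se = math.sqrt(p_hat * (1 - p_hat) / n)
low = p_hat - 1.96 * se
high = min(p_hat + 1.96 * se, 1.0)  # cap at 100%
print(f"95% CI: {low:.0%} to {high:.0%}")  # roughly 87% to 100%
```

Even taking the poll at face value, the honest headline would be “somewhere in the high eighties to high nineties, among 43 conference attendees,” which is a much weaker foundation for an engagement strategy than a bare “94%.”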
To Michael Boyette’s credit, the limitation of the survey is mentioned right there. Yet everyone who cited this poll ignored it. Now, imagine someone grabbing this 94% statistic and building an engagement strategy around it! Who’s responsible for checking citations and sources? Readers? Editors? Authors?
I understand many of you don’t have the time (and often the skills, or the access to research papers) to track down every single infographic you find online. But be skeptical enough to quickly check the cited source. You’ll find red flags such as sources citing themselves, citing a different article, or sometimes citing nothing at all. Use critical thinking: if a company sells widget X and the citation is about how the world needs X, double-check for confirmation bias. Follow people who decipher these studies and translate the findings (and limitations) into L&D language. Here’s a list of people you might want to follow: Jane Bozarth, Clark Quinn, Will Thalheimer, Mirjam Neelen, Julie Dirksen, Karl Kapp, Donald Clark, Nick Shackleton-Jones.
And one more tip: don’t just follow people you agree with. Every time someone disagrees with me I learn something. In today’s world, the ability to respectfully disagree has become the exception. Even with the people I mentioned above, I don’t always agree 100%. Ultimately, it is my project, my organization, and my circumstances that determine the best way forward. The decision and the responsibility are mine. However, that decision must be evidence-based and informed.
In the third article of this series, we’re going to look into the rest of the approaches to address learner engagement:
- Microlearning
It’s shorter, will they love it?
- Faster playback
They can speed it up?
- Microvideos
Who doesn’t want a viral microvideo?
- Animations
The more movement the better?
- Highly interactive course
Clicky-clicky-bang-bang, as Cammy Bean (The Accidental Instructional Designer) might put it, keeps them active?
- Humor
Let’s spice up the script? Hahaha!
- Game shows
Who wants to be a millionerd? Weakest cyberlink? JeoparDISC? Game shows are always fun! Are they?
- Games and gamification
You’ve logged in! Congrats! 300 points for being awake?
- Brain-based learning (whatever that is)
With hocus-pocus about how neurons throw a dopamine party every time you hear a Bloom action verb. Got neuro?
- Learning styles
90% of our learners are visual. Let’s show them more pictures?
- Immersive technology
Let’s dazzle them with AR/VR/XR/metaverse?
- If everything fails, give them cookies!
This article is the second of a three-part series exploring questions related to learning, engagement, and ways to increase learner engagement. You can read the first article of the series here; the third will be published shortly.