How to Measure Attention in Educational Settings
The learning process has long had a stage that was considered a black box. Teachers and researchers could assess students' emotions and attention levels before and after a course, but not while it happened.
Beyond mere observation, we didn't have metrics that could assess how students reacted to the content and how it was being conveyed to them. But that has been changing thanks to today's advancements in facial recognition and eye tracking, which let us assess the learning process as it unfolds.
Let's look at the variety of metrics that allow us to tap into the minds of our students and make significant changes to improve their learning experience.
Measuring your students' emotions
Our face reveals more about ourselves than we know.
Whenever we have an emotional reaction, we tend to change the distribution of our facial muscles. We yawn when we’re bored, we smile when we find something amusing, or we wrinkle our noses and pull our upper lips up when we feel disgusted. Our face conveys to others how we feel at that moment.
Digital mapping of our facial expressions enables software to estimate how likely we are to be showing a certain emotion. It tracks each facial muscle in a way a human observer can't. Because each emotion has its own facial configuration, the software combines these features and outputs a score from 0 to 100 for each emotion being tracked. A score of 90 for happiness means the person is 90% likely to be expressing that particular emotion.
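To make the output concrete, here's a minimal sketch of how those per-emotion scores might be interpreted downstream. The emotion names and the 0-100 scale follow the description above; the function, threshold, and sample data are illustrative assumptions, not the API of any specific facial-analysis product.

```python
# Hypothetical sketch: interpreting per-emotion scores (0-100) produced
# by a facial-analysis tool. The threshold value is an assumption.

def dominant_emotion(scores: dict, threshold: float = 50.0):
    """Return the most likely emotion if its score clears the threshold,
    otherwise (None, top_score) to signal an ambiguous frame."""
    emotion, score = max(scores.items(), key=lambda item: item[1])
    return (emotion, score) if score >= threshold else (None, score)

# One video frame's scores, made up for illustration:
frame_scores = {"happy": 90.0, "surprised": 20.0, "disgusted": 5.0}
print(dominant_emotion(frame_scores))  # ('happy', 90.0)
```

Running this per frame yields the moment-by-moment emotional map described below.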
By analyzing the student's face, you can create an emotional map for each moment during your course. You can know exactly how they feel when you provide an anecdote to support the historical date you're teaching them, or their emotion while learning the offside rule in football. So why is that useful?
When learning is linked to strong emotions, it is better stored in our memory and can be retrieved more easily in the future. Emotions also influence our attention process: they help select what we pay attention to and what we try to avoid. Emotions are part of the learning process from perception and memory to reasoning and problem-solving. If you don't address the emotional component, you'll be less likely to have an effective course.
Measuring your students' attention
We need to capture our students' attention, or else they won't learn or perform well afterwards. Attention is the first stage of the learning process: if we don't have their eyeballs, we won't have their minds working on our content.
Thanks to the video feedback we get from our students while they participate in the learning process, we can analyze their level of attention and infer their state of mind. Several metrics allow us to dig deeper into how focused or unfocused they are on what we're trying to teach them.
The most intuitive one is to track their eye movement with a hardware eye tracker or a web-based system. Both serve the same purpose but can vary in their precision and the depth of information they offer. Either way, they work on the same principle: track eye movements. These measures let us know where students are looking at all times, how much time they spend looking at one thing, how long it takes them to look at something, whether they revisit certain elements, and much more. All these patterns help us understand how our content is being received and give us valuable insights into how to improve what we teach.
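The metrics above can be sketched in a few lines. This is an illustrative example, not the API of any particular eye tracker: it assumes gaze samples have already been mapped to named areas of interest (AOIs) at a fixed sampling interval, both of which are simplifying assumptions.

```python
# Hypothetical sketch: summarising gaze data as (timestamp_ms, aoi) samples,
# where "aoi" is the area of interest the student was looking at.

def gaze_summary(samples, sample_interval_ms=50):
    dwell = {}        # total time (ms) spent on each AOI
    first_look = {}   # timestamp of the first fixation on each AOI
    visits = {}       # number of separate visits (revisits show up here)
    previous = None
    for t, aoi in samples:
        dwell[aoi] = dwell.get(aoi, 0) + sample_interval_ms
        first_look.setdefault(aoi, t)
        if aoi != previous:              # gaze moved to a new AOI
            visits[aoi] = visits.get(aoi, 0) + 1
        previous = aoi
    return dwell, first_look, visits

# The student looks at the slide, glances at their notes, then returns:
samples = [(0, "slide"), (50, "slide"), (100, "notes"), (150, "slide")]
print(gaze_summary(samples))
```

A visit count above one for an element (here, "slide") is the revisit pattern mentioned above.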
We can also track their blink rate. We usually blink between 8 and 21 times a minute, and how frequently we blink can indicate our level of attention. For instance, when someone is focused they may blink around 4.5 times a minute, while at low levels of attention the rate can rise to around 32.5 blinks per minute. By assessing a student's blink rate, we have an additional measure of how focused they are on our course.
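A minimal sketch of such a classifier, using the figures from the text (~4.5 blinks/minute when focused, a typical resting range of 8-21, and ~32.5 when attention is low). The exact cut-off values are assumptions for illustration, not validated thresholds.

```python
# Hedged sketch: mapping a measured blink rate to a coarse attention label.
# Cut-offs (8 and 21) are assumed from the typical resting range.

def attention_from_blink_rate(blinks_per_minute: float) -> str:
    if blinks_per_minute <= 8:
        return "focused"        # near the ~4.5/min focused rate
    elif blinks_per_minute <= 21:
        return "typical"        # within the normal resting range
    else:
        return "low attention"  # approaching the ~32.5/min rate

print(attention_from_blink_rate(5))   # focused
print(attention_from_blink_rate(30))  # low attention
```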
Sound also influences our level of attention. In a study analyzing different noise levels inside a classroom, researchers found that high levels of noise (above 75 dB) have a negative influence on students' concentration. They show lower levels of focus and take longer to react to the teacher's input. For instance, students fixate more times on the same information on the screen when there's high background noise, which suggests they need more effort to process that information compared to students in classrooms with low background noise.
There are other features we can track to determine people's attention level, like head orientation and body posture. By combining all these factors, we can build a more precise assessment of our students' overall focus on our lecture. The higher the attention, the better their learning experience.
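One simple way to combine the signals discussed above (gaze on content, blink rate, head orientation, posture) is a weighted average. This is a hedged sketch of the idea only: the weights and the normalised 0-1 inputs are illustrative assumptions, not values from any validated attention model.

```python
# Hypothetical sketch: fusing several attention signals into one 0-1 score.
# Each input is assumed to be pre-normalised to [0, 1]; weights are made up.

def attention_score(gaze_on_content: float,   # fraction of time on content
                    blink_score: float,       # 1 = focused blink rate
                    head_facing: float,       # 1 = facing screen/teacher
                    posture_upright: float):  # 1 = engaged posture
    weights = {"gaze": 0.4, "blink": 0.2, "head": 0.25, "posture": 0.15}
    score = (weights["gaze"] * gaze_on_content
             + weights["blink"] * blink_score
             + weights["head"] * head_facing
             + weights["posture"] * posture_upright)
    return round(score, 2)

print(attention_score(0.9, 0.5, 1.0, 0.8))
```

Gaze gets the largest weight here simply because it is the most direct of the signals; in practice the weighting would need to be tuned against real classroom data.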
Our students' feedback is vital to understanding their learning experience and improving it when something is off. Without their input, we'll be teaching courses over and over again without really focusing on the most important person in the room and attending to their emotional and cognitive needs. If we keep doing the same and expecting different results, we're destined to fail.
Unboxing the learning experience is vital to assess our course's quality and improve it when we need to. Here's where technologies like facial recognition and eye tracking come in handy. By analyzing people's facial expressions, eye movements, and other body features, we can understand more thoroughly what's happening in the classroom, whether it's a physical or online setting. Once we know what's wrong (or right), we can implement the changes that'll make the experience more pleasant and useful for everyone involved, teacher and student alike.
Are you prepared to use these technologies to enhance your students' learning experience?