Algorithms that Enhance Empathy? The Potential and Limitations of AI in SEL
Social-emotional learning (SEL) has been gaining momentum in school districts as educators recognize that teaching students to manage emotions, build healthy relationships, and make responsible decisions is crucial to their overall success.1 Despite this growing recognition, however, adoption remains a stubborn challenge.
Recent data shows that while nearly two-thirds of schools have implemented a formal SEL curriculum, a third have not. Among those without such programs, 46% of school leaders cite time as the biggest barrier to adoption. Even among schools with a formal SEL curriculum, 72% report struggling to implement these lessons effectively due to time constraints.2
Given this overall struggle surrounding its implementation, how can we make SEL more doable amidst most districts’ busy schedules and limited resources? This is where AI can come into the picture. Taking advantage of machine learning may just be the solution we need to make SEL in education not only more feasible but perhaps more scalable and more personal as well.
Why SEL Matters
SEL isn't just an education buzzword—it's an approach for equipping students with the tools needed for them to succeed in life. Think about it: self-regulation, social skills development, and responsible decision-making.3 All these have an impact on every single aspect of a student's life, from their performance in school to their future career and personal relationships.4
The biggest issue, however, is not defining SEL, but fitting such programs into an already jam-packed school day. As a former elementary school teacher, I can clearly remember how hard it already was to juggle learning objectives and try to meet the diverse needs of all of my students. I can only imagine that the added pressure of having to integrate SEL into my curriculum, along with everything else, would have felt like an impossible task.
However, we cannot overlook the immense benefits this type of learning has. Numerous studies indicate that students who participate in an SEL program do better academically, have more positive attitudes, and end up developing fewer symptoms of anxiety and depression.5 So, while it might feel like just another addition to the curriculum, SEL is actually one of the most important ingredients for a well-rounded education.6 The real question is: how can we implement it efficiently and effectively?
How AI Could Transform SEL
Introducing artificial intelligence (AI) technology to classroom settings isn’t a completely new thing. In recent years, school systems have started bringing in online chatbots to instruct students independently or alongside teachers. Such advancements have already helped offer additional support—especially in mixed-ability classrooms—by reducing the pressure on educators and providing a more personalized learning experience for students.
With this in mind, it only seems like a natural (or rather, artificial) progression to adapt AI technology to the context of SEL. Here are just a few ways this can be accomplished.
Real-Time Emotional Feedback & Personalized Support
Back when I had my own classroom, one of the hardest challenges I faced was identifying when a student was struggling emotionally. Whether they were feeling anxious, frustrated, or simply overwhelmed, it wasn’t always clear those emotions existed until they started to interfere with the student’s ability to focus or participate in class. This is where I see the potential for AI to make the biggest difference—by providing real-time emotional feedback that can flag these issues before they grow into bigger concerns.
Take Hume’s Empathic Voice Interface (EVI) as an example: a tool that analyzes vocal patterns to detect emotions like frustration or sadness based on how something is said, not just what is said. Interfaces like this could be incorporated into classrooms to help identify when students are experiencing negative feelings during discussions or assignments.
In the future, this might look a little something like this: during lessons or group activities, students might use tablets or computers running this kind of software in the background. In theory, Hume's EVI could one day be able to provide teachers with a general overview of the emotional insights regarding their students—such as noticeable increases in stress or disengagement. While AI could recommend follow-up strategies like mindfulness activities, teachers would ultimately be the ones to decide if these interventions are appropriate for each student or not. This approach ensures that insights are channeled through the teacher, preserving both student privacy and teacher agency in decision-making.
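As a purely illustrative sketch of this teacher-in-the-loop design (the scores, thresholds, and function below are hypothetical, not Hume's actual API), a classroom overview might aggregate per-student signals over a lesson and surface only flagged trends, leaving any intervention decision to the teacher:

```python
from statistics import mean

# Hypothetical per-student scores (0-1) produced by an emotion-detection
# model over the course of a lesson; these values are illustrative only.
emotion_scores = {
    "student_a": {"stress": [0.5, 0.6, 0.8, 0.9], "engagement": [0.8, 0.6, 0.3, 0.2]},
    "student_b": {"stress": [0.1, 0.2, 0.2, 0.1], "engagement": [0.8, 0.9, 0.9, 0.8]},
}

STRESS_THRESHOLD = 0.6      # assumed cutoffs a district would need to tune
ENGAGEMENT_THRESHOLD = 0.5

def flag_for_review(scores):
    """Return (student, concern) pairs for the teacher to review.

    The system only flags persistent patterns; the teacher decides
    whether any follow-up (e.g., a mindfulness activity) is appropriate.
    """
    flags = []
    for student, signals in scores.items():
        if mean(signals["stress"]) > STRESS_THRESHOLD:
            flags.append((student, "elevated stress"))
        if mean(signals["engagement"]) < ENGAGEMENT_THRESHOLD:
            flags.append((student, "possible disengagement"))
    return flags
```

The key design choice is that raw scores never reach a dashboard; only lesson-level averages that cross a threshold are shown, which is one way to preserve the privacy and teacher-agency balance described above.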
But, of course, it’s not that simple. Emotional AI isn’t foolproof.7 Although these systems rely on vast datasets and patterns to make their predictions, any teacher knows that feelings aren’t all that predictable. Humans—and especially children—don’t always express frustration in obvious ways. Sometimes, we hide our emotions behind sarcasm or polite responses, which could easily confuse an AI (just as it can confuse other people!). There’s also the risk of algorithmic and racial bias, where technology might misinterpret emotions due to individual or cultural differences. For instance, studies have shown that emotional AI can disproportionately attribute negative emotions to certain racial groups, which raises a serious concern in diverse classrooms.8
Given these complexities, while tools like Hume’s EVI have the potential to provide valuable insights, they should be used cautiously and always in tandem with human judgment. AI can be an effective aid for flagging emotional signals that teachers might otherwise miss, but it should not replace the teacher's own observations or the need for direct, empathetic human connection.
Scalable Solutions for Different Classrooms
As we previously discussed, one of the biggest barriers to fully integrating SEL into classrooms is time. Teachers are already stretched thin, and finding a spare hour in the day to develop and implement SEL activities can feel overwhelming. This is where AI tools come in, offering scalable solutions that adapt to the diverse needs of students.
For example, large language models (LLMs) such as ChatGPT can assist teachers by generating SEL activities and lessons through tailored prompts. To illustrate, teachers can input prompts like:
“Create a reflective journaling activity for students to describe a time when they felt brave, such as speaking up in class or making a new friend.”
“Design a mindfulness exercise for a student who seems nervous about an upcoming math test, focusing on helping them relax and feel confident.”
Resources such as the Prompt Library dedicated to SEL, developed by the AI for Education team in collaboration with educator Margot Toppen, allow teachers to instantly access a variety of customized activities, from role-playing scenarios to conflict resolution strategies. These AI-generated exercises can make SEL more accessible, ensuring that even the busiest classrooms can benefit from this kind of support.
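A minimal sketch of how such prompts could be templated so that teachers only fill in classroom specifics (the template wording below is my own illustration, not drawn from the AI for Education Prompt Library):

```python
# Hypothetical prompt template for generating SEL activities with an LLM.
# The resulting string would be sent to whichever chat model a district uses.
SEL_PROMPT_TEMPLATE = (
    "Create a {activity_type} for {grade_level} students focused on "
    "{sel_skill}. Keep it under {minutes} minutes and include clear "
    "teacher instructions."
)

def build_sel_prompt(activity_type, grade_level, sel_skill, minutes=10):
    """Fill in the template so teachers only specify the classroom details."""
    return SEL_PROMPT_TEMPLATE.format(
        activity_type=activity_type,
        grade_level=grade_level,
        sel_skill=sel_skill,
        minutes=minutes,
    )

prompt = build_sel_prompt(
    "reflective journaling activity", "3rd-grade", "recognizing bravery"
)
```

Templates like this are one way a prompt library scales: the SEL expertise lives in the template, while the teacher supplies only grade level, skill, and time budget.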
However, as with emotional feedback, AI is not without its challenges when it comes to scalability. As helpful as these tools can be, there’s always the risk that one-size-fits-all algorithms might not capture the nuances of individual classroom dynamics. The danger lies in overreliance on automated prompts without considering the emotional context or the cultural sensitivities of students. AI may be able to suggest mindfulness exercises or journaling activities, but these might not always align with the unique emotional landscapes of a specific classroom that only the teacher will truly understand.
That being said, AI still offers an incredible opportunity to fill gaps in SEL implementation. Such tools can help teachers create cooperative learning projects that blend academic content with social-emotional growth, as well as guide students in building positive relationships, fostering a more inclusive and empathetic learning environment. These solutions can save time, freeing up teachers to focus on deeper interpersonal connections with their students.
The Other Side of AI in SEL
It is easy to be excited about what AI can do for SEL, but we must not lose sight of the ethical issues surrounding it.9 As we have begun to discuss, there are several concerns that teachers, schools, and districts should keep an eye out for while integrating such algorithms into the classroom. Here are just three of them.
Human interactions
Human interactions are at the heart of SEL. How else can students learn how to interpret emotions and form appropriate responses than by practicing with their peers and mentors?
At this point in time, technology still cannot feel anything—although it comes pretty close with “computational empathy,” its ability to simulate understanding and respond accordingly. However, we all know that emotions do not work like an algorithm, with specific inputs leading to predictable outputs. With this in mind, educators must strike a balance, ensuring that AI enhances—rather than replaces—the personal connections that are essential for effective SEL.
Data Privacy
To provide adequate recommendations, AI systems must deal with sensitive information about students' emotions, behaviors, and interactions. It is essential that this information remains secure so that it is not used and distributed in any way outside of the students’ and parents’ knowledge. Schools should have proper policy frameworks in place that clearly communicate what data they are collecting, why they are collecting it, and how.
Bias
The last, and perhaps most critical, concern is bias.9 At the end of the day, an AI system will reinforce whatever biases exist in the data it is trained on. This matters enormously for SEL, since these predictions can have lasting impacts. An AI tool that misreads the emotions of children from certain backgrounds could perpetuate unfair treatment or, even worse, lost opportunities.11
For instance, research on speech recognition systems found an average word error rate of 0.35 for Black speakers, compared to 0.19 for white speakers.11 In a classroom setting, this disparity means that an AI-driven SEL tool that interprets vocal patterns could consistently fail to accurately interpret the emotions of Black students. Say, for example, the system misreads a Black student's neutral tone as expressing frustration or disengagement. In this case, the teacher might be prompted to intervene unnecessarily, making the student feel singled out or misunderstood. Conversely, if the system fails to recognize genuine signs of distress, the student might not receive needed support, leaving the problem unaddressed.
Such misinterpretations can lead to a lack of trust in the tool, lost opportunities for genuine emotional support, and even perpetuate the alienation or mistreatment of students. This is why it becomes so important to make sure that the AI tools entering schools are trained on unbiased data and tested with fairness in mind.
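To make the disparity concrete, a fairness audit of such a tool might compare word error rates (edit operations divided by reference length) across speaker groups. The error counts below are illustrative, chosen only to reproduce the group averages reported in the speech recognition study:

```python
def word_error_rate(substitutions, deletions, insertions, reference_words):
    """Standard WER: (S + D + I) / N, where N is the reference word count."""
    return (substitutions + deletions + insertions) / reference_words

# Illustrative counts per 100 reference words, matching the reported
# averages: 0.35 for Black speakers vs. 0.19 for white speakers.
wer_black = word_error_rate(25, 6, 4, 100)
wer_white = word_error_rate(13, 4, 2, 100)

# A simple disparity ratio an audit might track before deployment:
# here the system makes nearly twice as many errors for one group.
disparity = wer_black / wer_white
```

An audit that tracks a ratio like this per demographic group, before a tool ever reaches a classroom, is one practical way to act on the fairness testing the paragraph above calls for.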
AI and the Future of SEL
As we look ahead, it’s clear that AI has the potential to be a powerful tool for advancing SEL in schools. However, to truly harness its benefits while avoiding its potential pitfalls, careful consideration must be given to how it can best support teachers and students. Below are several areas where AI could significantly impact SEL, from supporting educators to ensuring a balance between technology and human interaction.
Supporting Teachers in the AI Era
For AI to truly leave a positive emotional impact, teachers must be adequately equipped and trained. At the moment, many educators feel uneasy about applying AI in their classrooms. According to an EdWeek Research Center survey of over a thousand district leaders, principals, and teachers, nearly half of educators said they are uncomfortable with the AI technology they have encountered or expect to encounter in the next year.12 Without the appropriate training, these tools might very well turn out to be more of a burden than a blessing—and might even backfire in the process.
To ensure the proper use of AI in teaching SEL, schools should provide professional development that boosts teachers' confidence and AI literacy. This training should cover both the basics of machine learning and ethical considerations regarding students' emotional data,12 helping teachers use AI tools without compromising privacy or fairness.13 Teachers must learn how to integrate AI-generated insights with SEL strategies, ensuring emotional data supports rather than dominates their approach. This holistic training will empower teachers to use AI effectively while maintaining essential human interactions. Additionally, students should be introduced to how AI supports their learning, fostering understanding and comfort with the technology.
Balancing AI with Human Interaction
While AI can take over administrative tasks like grading or tracking attendance, we should keep in mind that nothing can replace the very human elements of teaching, like adding a personal touch to instruction or building emotional connections. The true value that automation adds is facilitating daily logistics, freeing up time that teachers may wish to spend on other aspects of their students' lives.
For example, AI can monitor trends in student behavior and provide early alerts regarding signals of emotional crisis. In turn, this would allow the teacher to reach out with support before issues escalate. This makes AI a valuable tool in the practice of SEL: helping teachers identify emotional needs in real-time so that they can foster meaningful connections with more individualized interventions. In this way, AI would ultimately manage the background tasks while allowing teachers to be at the forefront of SEL implementation and provide the empathy and human contact that is actually needed for real emotional learning.
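One way such an early-alert system could work, sketched here with hypothetical daily mood check-in data (the function, window, and threshold are my own assumptions): flag a student only when a negative trend persists across several days, then leave the outreach decision to the teacher.

```python
def needs_check_in(daily_mood_scores, window=3, threshold=2.5):
    """Alert when the rolling average of recent mood check-ins (1-5 scale)
    drops below a threshold -- a persistent dip, not a single bad day."""
    if len(daily_mood_scores) < window:
        return False  # not enough history to call it a trend
    recent = daily_mood_scores[-window:]
    return sum(recent) / window < threshold

# One rough day never dips the 3-day average below the threshold...
assert not needs_check_in([4, 3, 4, 2, 4])
# ...but three consecutive low days trigger a gentle teacher alert.
assert needs_check_in([4, 4, 2, 2, 2])
```

Requiring a sustained dip rather than a single low score is what keeps the teacher, not the algorithm, at the forefront: the system surfaces a pattern, and the human decides whether and how to reach out.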
Community Efforts
All stakeholders in SEL—educators, policymakers, tech experts, parents, and students alike—must work together to ensure that AI is used ethically and in the best interest of the students. By involving everyone in the process, we can ensure that AI tools are developed and implemented with transparency and trust. When parents are engaged in how AI is used in their child’s education, it builds confidence in the technology and ensures that the tools align with the values of the community. Additionally, students can gain a better understanding of how AI fits into their learning, what it does, and how it supports them. Collaboration among all parties ensures that diverse perspectives are considered, helping to prevent biases that may arise when technology is developed and used in isolation.
Empathy Beyond Algorithms
As I think back to my time in the classroom, I can’t help but imagine how powerful it would have been to have a tool that could provide insights into my students’ emotional well-being—something to help me catch signs of frustration or anxiety before they become barriers to learning. AI has the potential to be just that: a support system that amplifies our ability to connect, to empathize, and to respond with care. But ultimately, it’s not the technology alone that will transform social-emotional learning; it’s how we choose to integrate it, ensuring it complements and enriches the human relationships at the heart of education. The future of SEL lies not in replacing the teacher's touch but in using every tool available to help both teachers and students thrive, emotionally and academically.
References
- Stanford, L., & Meisner, C. (2023, July 27). Social-emotional learning persists despite political backlash. Education Week. https://www.edweek.org/leadership/social-emotional-learning-persists-despite-political-backlash/2023/07
- Prothero, A. (2024, April 16). What’s really holding schools back from implementing SEL? Education Week. https://www.edweek.org/leadership/whats-really-holding-schools-back-from-implementing-sel/2024/04
- Anonymous. (2023). Social Emotional Learning and AI. AI in Education. https://edtechbooks.org/ai_in_education/social_emotional_learning_and_ai
- Cipriano, C., Strambler, M. J., Naples, L. H., Ha, C., Kirk, M., Wood, M., Sehgal, K., Zieher, A. K., Eveleigh, A., McCarthy, M., Funaro, M., Ponnock, A., Chow, J. C., & Durlak, J. (2023). The state of evidence for social and emotional learning: A contemporary meta-analysis of universal school-based SEL interventions. Child Development, 94, 1181–1204. https://doi.org/10.1111/cdev.13968
- Collaborative for Academic, Social, and Emotional Learning. (n.d.). What does the research say? CASEL. https://casel.org/fundamentals-of-sel/what-does-the-research-say/
- Payton, J., Weissberg, R.P., Durlak, J.A., Dymnicki, A.B., Taylor, R.D., Schellinger, K.B., & Pachan, M. (2008). The positive impact of social and emotional learning for kindergarten to eighth-grade students: Findings from three scientific reviews. Chicago, IL: Collaborative for Academic, Social, and Emotional Learning
- Miles, N. C. (2024, June 23). Are you 80% angry and 2% sad? Why ‘emotional AI’ is fraught with problems. The Guardian. https://www.theguardian.com/technology/article/2024/jun/23/emotional-artificial-intelligence-chatgpt-4o-hume-algorithmic-bias
- Rhue, L. (2018, November 9). Racial influence on automated perceptions of emotions. SSRN. https://ssrn.com/abstract=3281765 or http://dx.doi.org/10.2139/ssrn.3281765
- Tatineni, S. (2019). Ethical considerations in AI and data science: Bias, fairness, and accountability. International Journal of Information Technology and Management Information Systems, 10, 11-20.
- Brännström, A., Wester, J., & Nieves, J. C. (2024). A formal understanding of computational empathy in interactive agents. Cognitive Systems Research, 85, 101203. https://doi.org/10.1016/j.cogsys.2023.101203
- Koenecke, A., Nam, A., Lake, E., Nudell, J., Quartey, M., Mengesha, Z., Toups, C., Rickford, J. R., Jurafsky, D., & Goel, S. (2020). Racial disparities in automated speech recognition. Proceedings of the National Academy of Sciences, 117(14), 7684-7689. https://doi.org/10.1073/pnas.1915768117
- Langreo, L. (2023, September 20). What AI training do teachers need most? Here’s what they say. Education Week. https://www.edweek.org/leadership/what-ai-training-do-teachers-need-most-heres-what-they-say/2023/09
- Ng, D. T. K., Leung, J. K. L., Su, J., et al. (2023). Teachers’ AI digital competencies and twenty-first century skills in the post-pandemic world. Educational Technology Research and Development, 71, 137–161. https://doi.org/10.1007/s11423-023-10203-6
About the Author
Mariel Guevara
Mariel Guevara is a Junior Research Analyst at The Decision Lab. She is currently pursuing her MA degree in Developmental Psychology at Ateneo de Manila University. She has held several research positions in the past spanning different technology-mediated interventions tackling issues such as substance use prevention, mental health promotion, and civic engagement. She is especially passionate about making mental health services more accessible in the Philippines. In her free time she enjoys playing video games, going on nature walks, and playing sports.