Produced by:

Sponsored by:

When AI meets emotional intelligence:

reshaping the future of HR

Executive summary

As artificial intelligence reshapes workplace dynamics, HR leaders face a defining moment: how to harness AI's capabilities while preserving the essentially human elements that drive organisational success.

This challenge was explored during HubbubHR's July challenge forum, where senior HR leaders gathered to examine the intersection of artificial intelligence and emotional intelligence with HR technology. The roundtable featured Indi Seehra (Director of Human Resources, London School of Economics), David Leech (former HR Director, RWS IP Services), Per Fredriksson (Senior Advisor, Columbus), Ashley Casey (Global Director of Workplace Experience & Operations, Shutterstock), Lisa Zangari (Chief People Officer, Livingston International), and Jacinta Hennessey (Co-founder, HubbubHR).

The central question, posed by Indi Seehra, was: "How do I best use the integration of emotional intelligence in HR systems? We are getting increasingly closer to the idea that a humanoid could be a line manager, mentor, coach, or at least play a role. How does HR make best use of this?"

The conversation highlighted both the tremendous potential and the significant challenges of navigating the relationship between emotional intelligence and AI in HR. Success requires starting with what matters to people, designing strategy with human outcomes in mind, and building bridges across the organisation to find internal champions.

The problem: How to balance the relationship between EI and AI

The intersection of emotional intelligence and AI systems presents unique challenges that extend far beyond traditional HR technology implementations. Unlike process automation, emotional AI touches the core of human workplace experience—how people feel supported, understood, and valued within their organisations.

As Indi Seehra explained, "emotional AI is where you're getting close to who actually has control—machine or humans. Nevertheless, it's one of the areas of interest to HR professionals because we're interested in how to support people in an emotionally intelligent way."

Cultural acceptance varies dramatically. Indi observed: "There are different parts of the world where the idea of having a coach you can share things with, but actually no human being would ever know, [is] very attractive. There are other parts of the world which says this would be ridiculous if I'm going to be helped through an experience. It's got to be a human being doing it."

The authenticity barrier emerged as a critical concern. Ashley Casey questioned how implementation could be made safe: "Have you considered confidentiality, privacy or people not being able to separate real from intelligence? How are you going to put up the guardrails or protect the employee from an emotional intelligence perspective?"

David Leech highlighted risks of over-reliance, citing research showing "students who rely primarily on AI to write their essays end up with limited cognitive benefit compared with those who write essays for themselves." He questioned whether there might be "a de-skilling effect of AI tools" when applied to emotional intelligence.

The regulatory landscape adds complexity. As Indi explained: "In the UK at the moment, anything you do for AI, effectively a human being has to be responsible for what that AI is doing." This accountability requirement creates implementation challenges whilst the absence of ethical frameworks leaves organisations navigating uncharted territory.

Students who rely primarily on AI to write their essays end up with limited cognitive benefit compared with those who write essays for themselves.

David Leech

former HR Director at RWS IP Services

The shift: human-centred AI integration

Training the AI on the company values and leadership principles, you can ensure that it's actually the company language, and the tonality that we want to have in our conversations with employees.

Per Fredriksson

Senior Advisor at Columbus

Emotional AI represents a shift from technology-led to human-centred implementation. Rather than asking what AI can do, leading organisations are asking what humans need and how AI can enhance those experiences.

This human-first philosophy is reflected in several key principles that emerged from the roundtable discussion. The core principle focuses on augmentation rather than replacement. As Lisa Zangari shared, "The first thought that came to my mind was how we can use AI to enhance emotional intelligence, particularly empathy and self-awareness, ensuring that our leaders are developing and acquiring these essential skills."

A second principle emphasises alignment with organisational values. Per Fredriksson highlighted the importance of "training the AI on the company values and leadership principles, you can ensure that it's actually the company language, and the tonality that we want to have in our conversations with employees." This approach recognises that emotional AI systems must reflect and reinforce human values rather than operating independently of them.

The shift also involves reimagining the role of HR professionals. Rather than simply implementing technology, HR leaders are becoming architects of human-AI collaboration. Jacinta Hennessey noted this evolution: "Emotional intelligence brings another level of context to AI, which I think is beneficial. Perhaps, the inclusion of EI may offer a more nuanced perspective to address some of the fears that are easily associated with AI."

Critically, this new approach requires adaptive technology infrastructure. Organisations implementing emotional AI successfully recognise that rigid, legacy systems cannot support the flexibility required for human-centred AI applications. Modern, API-first platforms enable the kind of iterative development and customisation necessary for emotional AI systems that truly serve human needs.

Indi Seehra described the potential: "If we've got different tools so they can get the same support in different ways, and we've got supports that help the introverts and the extroverts and support that helps the diverse community of people... what it does is it maximises our opportunity for all of them to get the best experience."

This new paradigm requires treating emotional AI as an organisational capability rather than simply a technology deployment, demanding investment in training, change management, and cultural adaptation alongside technical implementation.

Barriers to change

Implementation of emotional AI systems faces distinct obstacles that go beyond typical technology adoption challenges and are deeply rooted in human psychology, organisational culture, and the complex nature of emotional intelligence itself.

The first major barrier is definitional complexity. David Leech noted the challenge of applying regulation "particularly around emotional intelligence, which I think is an elusive concept." This elusiveness makes it difficult to programme, measure, and validate emotional responses in AI systems, unlike cognitive tasks that can be clearly defined and assessed.

Cultural variation creates complexity. Indi's international experience revealed stark acceptance differences, requiring flexible, locally-sensitive implementations rather than one-size-fits-all solutions.

Regulatory uncertainty creates hesitation among decision-makers. As Indi explained, "In the UK at the moment, anything you do for AI, effectively a human being has to be responsible." This accountability framework, whilst important for ethical implementation, creates additional complexity and potential liability concerns.

Employee resistance often stems from fundamental concerns about authenticity and privacy. Ashley Casey emphasised the need for comprehensive safeguards: "How are you going to put up the guardrails or kind of protect the employee from an emotional intelligence perspective?" These concerns are particularly acute in emotional AI applications where personal and sensitive information is involved.

Technical implementation challenges include the difficulty of training AI systems while maintaining security. As Lisa Zangari pointed out, "We are deliberate in ensuring that we train our own language model internally to prioritize security and privacy."

How are you going to put up the guardrails or kind of protect the employee from an emotional intelligence perspective?

Ashley Casey

Global Director of Workplace Experience & Operations at Shutterstock

Practical solutions

We are developing our own internal large language model to ensure the highest standards of data privacy and security, and we discourage the use of ChatGPT. Through our partnership with Google, we are leveraging advanced AI tools to integrate artificial intelligence into our workflows in a secure, controlled, and strategically aligned way.

Lisa Zangari

Chief People Officer at Livingston International

Participants shared specific strategies from their experiences:

Focus on specific use cases with clear value. Per Fredriksson highlighted two key ways AI can support emotional intelligence in the workplace:

  • Employee coaching – When trained on company values, AI can help employees give each other constructive feedback, suggesting tone, language, and phrasing. It provides instant responses and a safe space to ask questions, without fear of them sounding "stupid".
  • Manager support – For difficult conversations, AI can guide managers on the best approach, offering advice on timing, tone, and language in line with company principles. This enables faster preparation and more thoughtful communication.

Implement comprehensive safeguards and transparency. Ashley Casey outlined essential protective measures: "You could put guardrails for employee training, on the company policy for what AI is and best practices as a tool, and reiterating that it's an algorithm, not a real person. With regard to privacy, confidentiality and GDPR, allowing employees to opt in or out of collecting that data and being transparent with what the information is going to be used for."

Maintain human oversight and touchpoints. Ashley also emphasised the importance of human connection: "I think it's important to also follow up with a real person about what had been spoken about. Just because we know that sometimes AI can go off the guardrails. I think it is important to still have that touchpoint of a real person."

Use gradual exposure approaches. David Leech recommended a measured implementation strategy: "I'm thinking about some kind of regulation... drip feeding exposure to AI would be the way I'd be looking at things, whether socially or in terms of new hires and training." He advocated for "enabling people to become adept at using these systems without becoming overly dependent."

Develop internal security and customisation capabilities. Lisa Zangari described her organisation's strategy: "We develop our own language model, Scout, internally to maintain security and privacy, while discouraging the use of ChatGPT. Moreover, we are advancing Scout, our internal LLM, in partnership with Google."

Leverage existing HR touchpoints for implementation. Jacinta Hennessey identified practical starting points: "We're seeing a lot of gains and a lot of measurable results in HR service delivery... looking for efficiency gains in HR service delivery and looking for satisfaction gains from an employee experience perspective. Onboarding is obviously a big piece because the employee's initial experience with the company can be greatly altered depending on how well that process goes."

Case study insights

The roundtable participants shared diverse approaches to AI implementation.

London School of Economics focuses on supporting diverse global communities. Indi described their Microsoft Copilot partnership: "We're one of their sort of pilot organisations that they're working with to introduce Copilot as mainstream into everything we do."

Shutterstock uses behavioural analytics. Ashley shared their innovative approach: "We're looking to invest in AI sensors in our meeting rooms and common areas to assess behavioural patterns [and] space usage patterns in order to potentially reconfigure spaces, whether that's more collaborative areas, more independent work areas, to really meet the employee where they are."

Columbus prioritises early, visible returns. With many competing priorities in HR budgets, AI projects that show results early are more likely to gain traction. Per Fredriksson stressed that this isn't just about budget discipline, but also about building momentum: "When investing in AI, it's important to show what return we can get—and to show it fast."

When investing in AI, it’s important to show what return we can get—and to show it fast.

Per Fredriksson

Senior Advisor at Columbus

Practical recommendations

The visible pain points, especially those that matter to executives. What are those questions that are difficult or can be difficult for HR to answer to executives, to stakeholders.

Jacinta Hennessey

Co-founder of HubbubHR

Based on roundtable insights, HR leaders should:

Start with employee experience pain points. Identify specific situations where current human-only approaches create bottlenecks or inconsistent experiences. As Jacinta Hennessey noted, focus on "the visible pain points, especially those that matter to executives. What are those questions that are difficult or can be difficult for HR to answer to executives, to stakeholders."

Establish ethical frameworks first. Following Indi Seehra's approach of working with international standards bodies, develop clear policies about data usage, employee privacy, and human oversight requirements. Address Ashley Casey's concerns about guardrails and employee protection before technical implementation.

Build gradual exposure. Implement David Leech's "drip feeding" approach by introducing AI capabilities incrementally. Allow employees to develop comfort and competence with AI tools whilst maintaining opportunities to develop human emotional intelligence skills.

Invest in change management. Address Ashley Casey's insights about employee training and transparency. Ensure employees understand "what AI is and best practices as a tool," and keep "reiterating that it's an algorithm, not a real person."

Focus on augmentation. Follow Lisa Zangari's recommendation to use AI to enhance emotional intelligence, "particularly empathy and self-awareness," rather than to replace human capabilities.

Build internal technical capabilities. Consider creating internal language models to address security and privacy concerns whilst ensuring AI responses align with organisational values.

Expand the narrative. Look for opportunities to continue the discussion and evolve the language around emotional AI. As Indi Seehra suggests, the profession can build on its knowledge of the emotional intelligence and AI space and aim for something bigger. Jacinta Hennessey agreed: "It's how we come together to develop the language around this, because it could shift how AI is perceived in the workplace; with the inclusion of EI this could evolve into a more human-centred approach."

Conclusion

The integration of emotional intelligence into AI systems represents both the greatest opportunity and most significant challenge facing HR leaders today. Success depends not on technological sophistication, but on thoughtful implementation that enhances human experience.

As Indi Seehra observed: "There's something in this whole sort of innovation, emotional intelligence, AI space, emotional AI, that we could do something bigger with, for the profession."

Jacinta Hennessey captured the transformative potential: "When you layer emotional intelligence on top of AI, it is just so fascinating to think of how we can use it as people, as a working community, as colleagues, to humanise the experience. If we think of AI, as the predictive, automated, data driven side of the equation. And then you bring in EI, representing the caring, empathetic, human connection side of the equation, and intersecting those two, it almost creates a whole new narrative, a compelling new story for what AI could be."

The future of HR lies not in choosing between human and artificial intelligence, but in thoughtfully integrating both to create more supportive, inclusive, and effective workplace experiences. For HR leaders ready to begin this journey, the key is to start with what matters most to people, design with human outcomes in mind, and build bridges across the organisation to create lasting positive change.

There's something in this whole sort of innovation, emotional intelligence, AI space, emotional AI, that we could do something bigger with, for the profession.

Indi Seehra

Director of Human Resources at London School of Economics

Take action today:

Begin by identifying one specific area where emotional AI could enhance your employee experience. Engage with your team to understand their concerns and aspirations, then design a pilot programme that demonstrates value whilst building trust and capabilities for broader implementation.

About this research

This article is based on insights from HubbubHR's challenge forum "The Future of HR: Reimagining Value Through Human-AI Integration". The discussion featured senior HR leaders from diverse industries sharing experiences and strategies for implementing emotional AI in human-centred ways.


HubbubHR is redefining HR technology for the global mid-market. We help organisations move beyond rigid legacy systems with a flexible, API-first, AI-enhanced full HR suite—built for speed, simplicity, and customer autonomy. Backed by CHRO-led research, our global "people hub" evolves with you, supported by a hands-on team committed to your success. In today's dynamic world of work, HubbubHR's flexible platform, combined with expert service, delivers a solution that adapts to meet your complex needs, empowering leaders to make insightful, data-driven decisions.
