Ethics first: building trust in AI with Jo Stansfield

Artificial intelligence (AI) is transforming how we live and work, but who is it really built for? And how can we make sure it benefits everyone equally?

In this episode of the Learn Better Podcast, hosts Caron and Lee are joined by guest host, Chiraag Swaly, Head of Data and Technology at Kaplan. Chiraag, who previously appeared as a guest on the series, brings nearly two decades of experience in designing and delivering IT qualifications across sectors - offering a practical view on how professionals can thrive in an AI-driven world.

Together, they speak with Jo Stansfield, founder and director of Inclusioneering Ltd, to explore how technology and humanity intersect, and why building inclusion, trust, and accountability into AI systems has never been more urgent.

Seeing through new lenses

As guest host, Chiraag brings a technologist’s curiosity to the conversation, and he opens it by sharing his own experiments with AI – not just as a tool for answers, but as a way to uncover blind spots. By prompting AI to ask him questions, he’s discovered insights about his own thinking and habits. It’s a reminder that AI isn’t only about automation or productivity; it can also be a catalyst for self-reflection and learning.

Caron and Lee respond with humour and curiosity, sharing light-hearted stories about their own encounters with technology. The discussion quickly moves from amusement to awareness: technology can be powerful, but it must be transparent and honest about what it knows, and what it doesn’t.

From algorithms to inclusion

Jo Stansfield joins the conversation as Lee asks her what led her to shift from engineering to people and data. She explains how her career has evolved from software engineering to people analytics and inclusive innovation. After more than 20 years working in technology, she realised that improving engineering isn’t just about refining code - it’s about understanding people.

When the 2020 exam grading crisis exposed how algorithmic decision-making can unintentionally disadvantage certain groups, Jo saw first-hand how technical assumptions can embed inequality. Her focus now is on using data responsibly: helping organisations understand representation, uncover bias, and make evidence-based decisions that support equity and fairness.

Thinking like an engineer

Chiraag asks Jo whether her engineering mindset helps in solving people-focused challenges. She explains how she applies structured, iterative methods to inclusion, such as the ‘plan, do, check, act’ cycle. However, she adds that even proven frameworks like Agile can fall short when human psychology is overlooked.

Her research uncovered that technical tasks are often seen as more valuable than interpersonal or ‘softer’ work, a perception that disproportionately limits progression for underrepresented groups. She also found that some employees from minority backgrounds feel less empowered or trusted to work independently, contradicting Agile’s collaborative principles.

Her advice to firms is to “always be as intentional about building in inclusion as you are about product reflection.” In other words, inclusion doesn’t happen automatically: it must be designed, tested, and refined with the same care as any technical process.

Responsible innovation

Chiraag asks what ethical challenges business leaders should be most aware of as AI adoption accelerates. Jo explains that many organisations still don’t know where or how AI is being used within their systems. The first step is to map it: identify what’s being used, where, and for what purpose. Only then can you begin to understand and manage the ethical risks.

But governance isn’t only about technology. Culture matters too. Jo emphasises the need for psychological safety - workplaces where people feel able to experiment, share ideas, and raise concerns without fear. Building trust within teams is essential for building trustworthy technology.

Auditing AI for humanity

Jo also discusses her work with For Humanity, a global charity developing independent audits for AI systems to ensure they operate responsibly. Inspired by the world of financial auditing, For Humanity aims to create accountability frameworks that assess not just how AI functions, but how it is governed.

As a trustee of the British Computer Society (BCS), Jo continues to champion professional ethics across the technology sector. She reflects on the lessons of the Post Office Horizon scandal, which revealed how lack of oversight and transparency can have devastating real-world consequences. For Jo, these cases reinforce the importance of professional standards, ethical codes, and independent scrutiny.

Learning better, together

As the conversation draws to a close, Lee asks Jo what ‘learn better’ means to her. She and the hosts reflect on the importance of curiosity, openness, and continual learning. For Jo, the key to ‘learning better’ lies in combining human insight with technological progress, using tools like AI to enhance understanding, not replace it.

Across every topic, one theme stands out: inclusion, reflection, and ethics aren’t barriers to innovation - they’re the foundations of it.
