AI learns from past mistakes, and so must we

by Kay Firth-Butterfield, CEO, Good Tech Advisory
28 Feb 2024

Bett 2024 keynote speaker Kay Firth-Butterfield was recently named by Time Magazine as one of only four people recognised for their impact on AI. Here the CEO of Good Tech Advisory LLC reflects on the bias in data that, if not addressed, could restrict AI’s potential to do good.

We value education not only for the knowledge it brings but also because it improves financial outcomes for individuals and allows for social mobility. Through personalised education, Artificial Intelligence has the potential to help more people make even more of their education, resulting in a better future. But will this actually be true for all?

First we should consider data. In 1817 Jane Austen’s protagonist Anne Elliot, in Persuasion, said that she would never refer to books for explaining women’s emotions because they were all written by men. We had a data problem then and still have it today.

Large Language Models have the potential to revolutionise our experience of education, but they rely on data collected mainly from white men from the Global North. This is because they have held the pen longest, meaning they have created the bulk of data available to these models.

Take one personal example. Our daughter is a pilot in the US Air Force. About 6.5% of pilots are women and only about 3% fly fighter jets. The data that underpins much of our daughter’s experience as a female military pilot is overwhelmingly male. When it comes to flying equipment or, importantly, healthcare in the event of injury, the male experience is the norm. If we are to succeed in creating better economic prospects for all with AI, we have to do better with data.

Currently about 3 billion people cannot access the internet, and billions more have not created a sufficiently large data footprint for their contribution to be elevated by generative AI without very precise prompts. Some would argue that this is just how our world is – and there is some truth in that. But such is the potential and peril of AI, we should not accept the limitations and biases of our current world. Indeed, for AI education to work for everyone we need to do more.

By next year it is estimated that we will produce more data every 15 minutes than we have ever created before. Some of this will be human created, but much will be AI cannibalism – data created by large language models (LLMs) when they answer human questions. If we do nothing to correct for bias, this will further marginalise women and minorities.

It is often said that we do not learn from history – a quote attributed to German philosopher Georg Hegel. But if we wish to avoid the tradition of “To the victor the spoils” we must do more, or the communities which have created little or no data will be overlooked, forced to learn from data sets which do not cover their experiences.

For education to truly serve the needs of individuals and communities, we must be sure to design, develop and use AI with great care. Using AI responsibly is the only way to move us all forward to enjoy the full benefits AI can bring.

 
