The role of design in building non-discriminatory AI
Strategy · Research · Writing · Program Management
The UK’s Information Commissioner’s Office (ICO) wanted to better understand the role of service and UX design in creating non-discriminatory AI-driven products and services, including:
- What are the challenges product teams face in mitigating algorithmic discrimination resulting from using personal data?
- What are the harms that occur when discrimination manifests within an AI system as a result of using personal data?
- What support might be needed to help design teams implement practical changes to their products and services?
We began this research with three hypotheses:
- The use of personal data in AI can result in algorithmic discriminatory harm;
- Design interventions, specifically service design and UX design, can help mitigate discriminatory harms created by AI-driven products and services;
- Data protection guidance can help mitigate discriminatory harms created by AI-driven products and services.
We developed the following mixed-methods approach to explore these hypotheses and identify how the ICO might support AI product teams to identify, assess, and address discriminatory harms:
- desk research,
- expert interviews,
- collaborative workshops (hosted by our partners AIxDesign), and
- stakeholder working sessions.
We created this approach for the following reasons:
- An insights-led approach, informed by desk research and community advocates, offered valuable insights into the current state of non-discriminatory algorithmic design and guidance, while remaining time- and cost-effective.
- These insights helped us to identify areas of strategic interest for focused and collaborative workshops with AI product teams, to better understand and evidence the challenges they face, and the support they need.
- Following the lessons of anti-discriminatory practitioners and organisations – including equity X design, Design Justice Network, antiracistby.design, and Hera Hussain, author of trauma-informed design and Founder & CEO of Chayn – we sought to prioritise the people and communities disproportionately impacted by algorithmic systems, namely women and people of colour, because they are best placed to tell us about the human impact of algorithmic harm.