
Building Inclusive AI for 400M Users at Pinterest

Conference:  Transform X 2022

2022-10-19

Authors:   Nadia Fawaz


Summary

The goal of the inclusive AI team at Pinterest is to make its AI systems perform well across diverse sets of users and to reduce bias. The team built skin tone ranges and hair pattern search to give users control over their experience in search and in the AR Try On similar-looks recommendation module.
  • Pinterest's AI system consists of several machine learning models trained to optimize objectives such as predicting the likelihood that a pin would be relevant to a Pinner given a variety of inputs.
  • The AI system takes input from queries, user features, content features, and past interactions with pins and boards (a minimal relevance-scoring sketch follows this list).
  • Developing inclusive AI requires an end-to-end iterative and collaborative development approach.
  • Reducing bias is important to move away from historical patterns of bias in society, prevent harm, put users first, and improve technical craftsmanship.
  • Pinterest built skin tone ranges and hair pattern search to give users control over their experience in search and in the AR Try On similar-looks recommendation module.
  • The closed-box version of the skin tone range system had performance and coverage issues in the darker skin tone ranges.
  • The bias in face detection technology was studied and documented by Joy Buolamwini and Timnit Gebru in the Gender Shades study.
  • Pinterest developed an in-house version of the skin tone range system with several computer vision components and a fairness-aware tuning phase.
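
As a concrete illustration of the first two bullet points, here is a minimal, hypothetical sketch of a relevance scorer that combines the four input families named in the list: query, user features, content features, and past pin/board interactions. The embedding dimensions, variable names, and the simple logistic combination are assumptions for illustration only, not Pinterest's actual models.

```python
# Minimal sketch (not Pinterest's actual model): predicting the likelihood that
# a pin is relevant to a Pinner from query, user, content, and interaction signals.
import numpy as np

rng = np.random.default_rng(0)
DIM = 32  # hypothetical shared embedding dimension


def score_pin(query_emb, user_emb, pin_emb, interaction_emb, weights):
    """Combine the four input families into one predicted relevance probability."""
    features = np.concatenate([query_emb, user_emb, pin_emb, interaction_emb])
    logit = float(features @ weights)
    return 1.0 / (1.0 + np.exp(-logit))  # P(pin is relevant to this Pinner)


# Toy usage: random vectors stand in for learned representations and weights.
weights = rng.normal(scale=0.1, size=4 * DIM)
query_emb = rng.normal(size=DIM)        # search query / request context
user_emb = rng.normal(size=DIM)         # Pinner (user) features
pin_emb = rng.normal(size=DIM)          # pin / content features
interaction_emb = rng.normal(size=DIM)  # summary of past pin and board engagement

print(f"predicted relevance: {score_pin(query_emb, user_emb, pin_emb, interaction_emb, weights):.3f}")
```

In production such a score would come from learned models rather than random weights; the point is only that the relevance objective is a function of all four input families.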
Pinterest's skin tone range system had a coverage issue for images in the darker ranges: the closed-box model could not output any prediction for many images and often failed to identify the skin tone range accurately. The issue was traced to the face detection module, which failed to detect faces in darker-range images at much higher rates than in lighter skin tone images. This type of bias in face detection technology has been studied and documented by Joy Buolamwini and Timnit Gebru in the Gender Shades study.
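
To make that diagnosis concrete, the sketch below shows the kind of disaggregated evaluation that surfaces such a coverage gap: face detection rate computed per skin tone range under a single global confidence threshold, followed by a simple per-group threshold search in the spirit of the fairness-aware tuning phase mentioned above. The records, group labels, grid, and target coverage are all hypothetical; this is not Pinterest's pipeline.

```python
from collections import defaultdict

# Hypothetical evaluation records: (skin_tone_range_label, face_detector_confidence).
# In practice these would come from running the detector over a labeled evaluation set.
records = [
    ("light", 0.96), ("light", 0.91), ("light", 0.88), ("light", 0.93),
    ("medium", 0.84), ("medium", 0.62), ("medium", 0.90), ("medium", 0.71),
    ("dark", 0.55), ("dark", 0.42), ("dark", 0.67), ("dark", 0.38),
]


def coverage_by_group(recs, threshold):
    """Share of images per skin tone range where the detector fires above `threshold`."""
    hits, totals = defaultdict(int), defaultdict(int)
    for group, conf in recs:
        totals[group] += 1
        hits[group] += conf >= threshold
    return {g: hits[g] / totals[g] for g in totals}


def tune_thresholds(recs, target_coverage=0.75):
    """Per-group tuning sketch: the strictest threshold on a small grid that
    still reaches the target coverage for each skin tone range."""
    grid = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3]
    thresholds = {}
    for g in {group for group, _ in recs}:
        group_recs = [r for r in recs if r[0] == g]
        for t in grid:
            if coverage_by_group(group_recs, t)[g] >= target_coverage:
                thresholds[g] = t
                break
        else:
            thresholds[g] = grid[-1]  # fall back to the loosest threshold
    return thresholds


# A single global threshold hides very different coverage per range ...
print("coverage at global threshold 0.8:", coverage_by_group(records, 0.8))
# ... while per-group thresholds can equalize coverage at the target level.
print("per-group thresholds for >=75% coverage:", tune_thresholds(records))
```

On this toy data, the global threshold yields near-complete coverage for the lighter range and almost none for the darker range, which is the shape of the gap described above; tuning thresholds per group is one (assumed) way to frame the trade-off between coverage and detection confidence.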

Abstract

Pinterest’s team knows that people want to feel included. When platforms lack representation, it tells people that the way someone may look or where they come from isn't the ‘norm.’ It’s more important than ever before to design inclusive systems that remove historical biases. Nadia Fawaz is the senior staff applied research scientist and technical lead of inclusive AI at Pinterest, which hosts over 400 million users who speak over 35 languages across 8 billion boards. In this keynote, Fawaz explains how ML learns implicit bias and algorithmic fairness, how to design inclusive systems with cross-functional teams, how Pinterest learned from errors and made its models learn from them as well, and how Pinterest has built inclusive features, especially regarding skin tone and hair patterns, to create a more engaging platform. Fawaz discusses how it is possible to change technology with intent. Fawaz’s research and engineering interests include machine learning for personalization, AI fairness, and data privacy; her work aims at bridging theory and practice. Before joining Pinterest, she was a Staff Software Engineer in Machine Learning at LinkedIn, a Principal Research Scientist at Technicolor Research Lab, and a Postdoctoral Researcher at the Massachusetts Institute of Technology’s Research Laboratory of Electronics.

