Minimizing and Preventing Bias in AI

Abstract: With the rise of algorithmic products, a new design challenge has been introduced into the creation of software: the danger of unintentional bias. It’s easy to overlook inclusion when building a product, and even teams that try to be inclusive can leave gaps. For example, as cities rolled out wheelchair-accessible curb cuts, they inadvertently created a new problem for blind sidewalk users, who lost the signal that they were at an intersection. These unrecognized biases can show up in tech product development.

As developers increasingly build tools to automate decisions, if we don’t build with an eye towards inclusion, we can end up enshrining bias. Artificial Intelligence is alluring because it offers the promise of ‘objectivity’, but the fairness of our algorithms is only as good as the data we select and use to train our models. One criminal risk assessment tool meant to predict recidivism has already been shown to exacerbate human biases when making risk judgments, and unfairness in loan approval, racist bots, and discriminatory hiring practices have all made the news.
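To make the point about training data concrete, here is a minimal, hypothetical sketch (not from the talk) of one common first step: auditing a dataset's historical outcomes by demographic group before training on it. The toy loan-approval records, field names, and threshold are all illustrative assumptions; the idea is simply that if the labels themselves encode a large disparity, a model trained on them will likely reproduce it.

```python
# Hypothetical audit: compare the positive-label (approval) rate per group
# in historical training data. A large gap is a warning sign that the data
# encodes bias a model would learn.

def selection_rates(records, group_key, label_key):
    """Return the positive-label rate for each group in the records."""
    totals, positives = {}, {}
    for r in records:
        g = r[group_key]
        totals[g] = totals.get(g, 0) + 1
        positives[g] = positives.get(g, 0) + r[label_key]
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(rates):
    """Largest difference in selection rate between any two groups."""
    return max(rates.values()) - min(rates.values())

# Toy historical loan data (invented for illustration):
data = [
    {"group": "A", "approved": 1}, {"group": "A", "approved": 1},
    {"group": "A", "approved": 1}, {"group": "A", "approved": 0},
    {"group": "B", "approved": 1}, {"group": "B", "approved": 0},
    {"group": "B", "approved": 0}, {"group": "B", "approved": 0},
]

rates = selection_rates(data, "group", "approved")
gap = demographic_parity_gap(rates)
print(rates)  # {'A': 0.75, 'B': 0.25}
print(gap)    # 0.5 -- training on this data as-is would likely bake in the gap
```

An audit like this doesn't fix bias on its own, but it surfaces it early, before it is enshrined in an automated decision.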

It’s possible to recognize and respond to bias in the development process, and with increased automation it will be ever more critical to do so. Inclusion will be a necessary priority for organizations and a critical component of any risk remediation strategy.

Frances can speak from her experience with this, having driven Pinterest’s recent change to give users the option to filter searches to specific skin tones. Today at Gigster she is bringing the same considerations to her efforts to build AI tools to change the future of work.

In her presentation, Frances will discuss how to design with an eye towards inclusion. She will share strategies to avoid building bias into your algorithms and best practices to limit bias in your machine learning tools.

Bio: Frances Haugen is the Director of Data Product at Gigster, focusing on developer productivity and machine learning. Frances is a passionate advocate for the training of business decision makers in how to use machine learning effectively to transform their businesses and believes that thinking in the patterns of machine learning will be as integral a skill as Excel for 21st century leaders. Frances has taken data-based user experiences from 0->1 in multiple applications across some of the leading tech companies. She co-founded the Computer Vision team at Yelp and built out their data mining team. She was the lead product manager for home feed ranking at Pinterest (a recommender system) and worked on a diverse array of Search experiences at Google including founding the Google+ Search team, launching the first mobile book reader, and co-founding the Boston Search team. Frances loves user-facing big data applications and finding ways to make mountains of information useful and delightful to the user. Frances was a member of the founding class of Olin College and holds an MBA from Harvard.

Open Data Science Conference