Research
How AI Can Foster Inclusion
By: Sarah Hoffman | December 9, 2021
We've spent a lot of time discussing the unintended bias that can easily creep into AI algorithms. But the same technology, properly designed and trained, can also be used to confront biases. A new generation of automated tools seeks to proactively promote inclusion in:

Writing. Text IQ’s Unconscious Bias Detector identifies partiality in performance appraisals, reporting on issues such as male managers giving higher scores to male workers or certain types of employees receiving more personality-focused feedback than work-focused comments.1 Content Moderator and Writers.ai check your writing for language that conflicts with a company’s preferred style or that could come across as biased, flagging terms like “blacklist” and messages like “Is that the best you can do?”2 Grammarly, which offers a service designed to improve written communication, launched a sensitivity feature that detects politically loaded terms, such as “Chinese Virus” used in place of coronavirus or COVID-19.3 These checks can be useful for presentations, résumés, and social media posts, helping users avoid inadvertently offending people. San Francisco now uses a bias mitigation tool that removes race from police reports before prosecutors decide whether to charge suspects.4
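At its simplest layer, this kind of flagging reduces to matching a curated term list against text. Here is a minimal sketch in Python; the term list and suggested replacements are illustrative assumptions, not any vendor’s actual ruleset:

```python
# A minimal, context-blind sketch of term-based flagging, in the spirit of
# tools like Content Moderator and Writers.ai. The term map below is an
# illustrative assumption, not a real product's ruleset.
import re

FLAGGED_TERMS = {
    "blacklist": "denylist",
    "whitelist": "allowlist",
    "master": "main",
}

def flag_noninclusive(text: str) -> list:
    """Return (term, suggestion) pairs for flagged terms found in text."""
    hits = []
    for term, suggestion in FLAGGED_TERMS.items():
        if re.search(rf"\b{term}\b", text, flags=re.IGNORECASE):
            hits.append((term, suggestion))
    return hits

print(flag_noninclusive("Add that domain to the blacklist."))
# [('blacklist', 'denylist')]
```

Real tools add context awareness on top of this, since a word like “master” is unproblematic in phrases such as “master’s degree.”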

Speech. Microsoft’s Group Transcribe app not only transcribes meetings in real time but also translates them, enabling non-native speakers to participate more fully in meetings.5 New versions of PowerPoint catch non-inclusive spoken language in real time, including gender and sexuality bias and ethnic slurs (see Figure 1). Siri no longer defaults to a female voice, a default that critics say perpetuates sexist stereotypes, and can speak in a variety of accents and languages.6 And while Big Tech companies like Microsoft, Google, Amazon, and Apple still lag in reducing racial bias in speech recognition technology (with accuracies of 73%, 69%, 69%, and 55%, respectively), British speech recognition startup Speechmatics recently announced an accuracy rate of 83% for Black voices, after training its AI model with unlabeled data from social media and podcasts to help it learn different aspects of speech, including accent, language, and intonation.7
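Accuracy figures like these are typically derived from word error rate (WER), with accuracy roughly equal to 1 minus WER. A minimal sketch of the standard dynamic-programming WER computation follows; the sample transcripts are made up for illustration:

```python
# Word error rate via word-level edit distance (insertions, deletions,
# substitutions), the standard basis for speech-recognition accuracy claims.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

ref = "she had your dark suit in greasy wash water all year"
hyp = "she had your dark suit in greasy wash water all here"
print(f"WER: {wer(ref, hyp):.2f}, accuracy: {1 - wer(ref, hyp):.2f}")
```

Because WER is averaged over a test set, a system can score well overall while performing far worse for particular groups of speakers, which is exactly the gap the accuracy numbers above expose.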

Imagery. In May, Microsoft launched people filters to help its advertisers find relevant, inclusive imagery in seconds, enabling filtering by characteristics like gender, ethnicity, and age.8 That same month, Google announced changes to its camera and imaging products, improving accuracy for dark skin tones based on a broader data set of images featuring Black and brown faces.9 Snapchat’s “inclusive camera” effort captures a wide range of skin tones and also aims to remove biased assumptions from automatic adjustments to people’s appearances, such as the assumption that smaller, thinner noses are better.10
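Under the hood, attribute-based image filtering like this amounts to querying curated metadata. A minimal sketch, with a hypothetical catalog whose fields and tags are invented for illustration (real systems draw these attributes from professionally curated or model-generated metadata):

```python
# A toy catalog query in the spirit of Microsoft's people filters. The
# StockImage fields and the two sample entries are assumptions for this sketch.
from dataclasses import dataclass

@dataclass
class StockImage:
    url: str
    perceived_age_range: str   # e.g. "25-34"; an assumed metadata field
    tags: frozenset

CATALOG = [
    StockImage("img/team1.jpg", "25-34", frozenset({"woman", "wheelchair", "office"})),
    StockImage("img/team2.jpg", "45-54", frozenset({"man", "outdoors"})),
]

def filter_images(catalog, required_tags, age_range=None):
    """Keep images carrying all required tags and, optionally, an age range."""
    return [
        img for img in catalog
        if required_tags <= img.tags
        and (age_range is None or img.perceived_age_range == age_range)
    ]

print(filter_images(CATALOG, {"woman", "office"}))
```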

Promising Advances Extend Efforts

While these tools advance inclusion for both individuals and corporations, we can take things even further with the help of additional, smarter AI:

Creating new tools for underserved communities. In 2019, Snapchat released a gender-changing AI filter. While many were concerned that the tool made light of a serious matter, some in the transgender community were happy to have a safe and easy way to explore their identities, including projecting what they would look like if they decided to pursue hormone therapy.11 Online gamers have been harassed when their voices don’t match their gender identity. Modulate’s “voice skins,” created to make gaming more fun, use machine learning to analyze a player’s speech and produce new speech with the same emotion, inflection, and cadence, but in the voice of a chosen character, providing a privacy shield for gamers. More than 100 early testers asked whether the technology could be used to ease the dysphoria caused by a mismatch between their voice and gender identity.12 Using the powerful language generator GPT-3, Create Lab Ventures created an Afro-Latina, bilingual AI, which debuted in school systems worldwide in September to inspire and uplift children of color.13
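Production voice conversion like Modulate’s relies on learned models that preserve emotion and cadence, but a deliberately crude sketch can show the basic idea of transforming one voice signal into another. The pitch-shift-by-resampling toy below is a stand-in for illustration only, not Modulate’s method:

```python
# A crude "voice skin" stand-in: shift pitch by resampling. Real systems use
# learned models; this numpy-only toy only changes pitch (and clip length).
import numpy as np

def naive_pitch_shift(samples: np.ndarray, semitones: float) -> np.ndarray:
    """Resample so pitch rises by `semitones` when played at the original rate."""
    factor = 2 ** (semitones / 12)                       # frequency ratio
    positions = np.arange(0, len(samples) - 1, factor)   # fractional read points
    return np.interp(positions, np.arange(len(samples)), samples)

rate = 16_000
t = np.linspace(0, 1, rate, endpoint=False)
voice = np.sin(2 * np.pi * 220 * t)       # a 220 Hz sine as a stand-in "voice"
shifted = naive_pitch_shift(voice, 4.0)   # four semitones higher, and shorter
print(len(voice), len(shifted))
```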

Identifying our own unconscious biases. Spotting bias in AI may help us better appreciate our own. In 2018, Amazon shut down its AI recruiting tool after discovering it was biased against women.14 The model had been trained on 10 years of résumés, most of them from men. It ended up prioritizing résumés with verbs like “executed,” which appeared more often in men’s résumés, and penalizing résumés containing the word “women’s,” as in “women’s chess club.” More recently, a video-analyzing AI recruiting tool was found to favor candidates with bookshelves behind them.15 Do human recruiters have similar biases? As companies develop and use fairness tools to uncover bias in their AI algorithms, those tools could become a window into our own biases.
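One common technique such fairness tools use is a counterfactual audit: score two inputs that differ by a single term and compare. A minimal sketch, where `score_resume` is a deliberately biased toy scorer standing in for a real model (a real audit would call the production system):

```python
# Counterfactual bias probe: swap one gendered term and measure the score gap.
def score_resume(text: str) -> float:
    # Toy model reproducing the reported failure mode: reward "executed",
    # penalize "women's". Purely illustrative; not Amazon's actual model.
    score = 0.5
    score += 0.2 * text.lower().count("executed")
    score -= 0.3 * text.lower().count("women's")
    return score

def counterfactual_gap(template: str, term_a: str, term_b: str) -> float:
    """Score difference when only one term differs between two inputs."""
    return score_resume(template.format(term_a)) - score_resume(template.format(term_b))

gap = counterfactual_gap("captain of the {} chess club", "women's", "men's")
print(f"score gap from a single word swap: {gap:+.2f}")  # nonzero => bias signal
```

The same one-word-swap question can, of course, be asked of a human recruiter, which is what makes these audits a mirror as much as a diagnostic.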

Tracking the changing landscape in real time. Acceptable terminology can change quickly. In June 2020, GitHub replaced the term “master” with “main” to avoid a slavery reference.16 That same year, employees at numerous companies, including more than 50 at Microsoft, initiated efforts to remove certain terminology from their source code, switching terms like whitelist and blacklist to allowlist and denylist, and master-slave to primary-secondary.17 Today, AI is already used to monitor websites for things like product price drops, breaking news stories, and new job postings.18 The same technology could be extended to surface wording changes happening online in real time, as suggestions for a company to consider in its own communications.
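The source-tree sweeps described above amount to a terminology linter. A minimal sketch, assuming Python files under an illustrative src/ directory; the term map mirrors the replacements mentioned in this section:

```python
# Walk a source tree and report lines that use deprecated terms, with
# suggested replacements. Paths and file pattern are illustrative assumptions.
import pathlib
import re

REPLACEMENTS = {"whitelist": "allowlist", "blacklist": "denylist",
                "master": "main", "slave": "secondary"}
PATTERN = re.compile(r"\b(" + "|".join(REPLACEMENTS) + r")\b", re.IGNORECASE)

def lint_tree(root: str) -> None:
    for path in pathlib.Path(root).rglob("*.py"):
        lines = path.read_text(errors="ignore").splitlines()
        for lineno, line in enumerate(lines, start=1):
            for match in PATTERN.finditer(line):
                term = match.group(1).lower()
                print(f"{path}:{lineno}: '{term}' -> consider '{REPLACEMENTS[term]}'")

lint_tree("src")  # no output if there are no matches (or no such directory)
```

A live monitoring version would periodically re-crawl style guides and news sources and diff the results, feeding new term shifts into a map like REPLACEMENTS.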

Self-driving cars might look very different if they were designed with common disabilities in mind. Research has shown that drivers’ reaction times increase as they age.19 Yet unexpected situations are precisely when driverless cars expect a human to intervene.20 Making matters worse, almost no experiments have studied aging drivers and driverless cars together. Likewise, Snapchat’s gender-changing AI mentioned above might be very different if it had been designed specifically as a tool to help transgender people.

AI is a technology that’s coming of age. Although it gets a lot of bad press for both overlooking and reinforcing prejudices, perhaps when properly designed and trained with inclusion considered from the start, AI could more effectively counter our biases and support underserved communities. How might other tools and industries be reimagined if inclusion were considered upfront?

 
1 https://www.axios.com/ai-unconscious-gender-racial-bias-87ff0e67-1e05-4dc6-a8c9-ea3a6e6876aa.html
2 https://www.axios.com/writer-automatic-language-checking-tool-eb22ddd3-c75c-4a0f-a48a-76c62f4435f9.html; https://userway.org/moderator
3 https://www.fastcompany.com/90488139/if-you-write-wuhan-virus-grammarly-will-remind-you-not-to-be-racist
4 https://www.theverge.com/2019/6/12/18663093/ai-sf-district-attorney-police-bias-race-charge-crime
5 https://techcrunch.com/2021/03/03/microsoft-launches-group-transcribe-a-transcription-and-translation-app-for-in-person-meetings/
6 https://techcrunch.com/2021/03/31/apple-adds-two-siri-voices/
7 https://www.cnbc.com/2021/10/26/speech-recognition-firm-speechmatics-beat-tech-giants-at-reducing-bias.html
8 https://about.ads.microsoft.com/en-us/blog/post/may-2021/choosing-inclusive-imagery-just-got-easier-try-people-filters
9 “Google is trying to make its image processing more inclusive,” https://www.theverge.com/2021/5/18/22442515/google-camera-app-inclusive-image-equity-skintones
10 https://www.axios.com/snapchat-cameras-overhaul-racism-6a01b1a0-567e-4660-b19a-7f0a00922374.html
11 https://www.axios.com/snapchats-new-gender-changing-filter-provokes-strong-reactions-7493ab81-63f2-4379-8605-215dc6925c34.html
12 https://www.wired.com/story/deepfake-voices-help-trans-gamers/
13 https://createlabs.io/meet-claira
14 https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
15 https://web.br.de/interaktiv/ki-bewerbung/en/
16 https://www.zdnet.com/article/github-to-replace-master-with-alternative-term-to-avoid-slavery-references/
17 https://www.zdnet.com/article/ibm-microsoft-staff-rally-to-remove-racially-insensitive-language-from-products/
18 https://techcrunch.com/2021/07/16/visualping-raises-6m-to-make-its-website-change-monitoring-service-smarter/
19 Salvia, E., Petit, C., Champely, S., Chomette, R., Di Rienzo, F., & Collet, C. (2016). Effects of age and task load on drivers’ response accuracy and reaction time when responding to traffic lights. Frontiers in Aging Neuroscience, 8, 169. https://www.frontiersin.org/articles/10.3389/fnagi.2016.00169/
20 https://www.wired.com/story/stop-saying-driverless-cars-will-help-old-people/