Why you should care
Because we all have unconscious biases. Here’s how to get around them.
Never judge a book by its cover — so your mother, priest, teacher or somebody surely told your younger self once upon a time. But seriously, have you ever been browsing in a bookstore and had the time to open every novel that piqued your curiosity? And if you’re relying on recommendations from friends or book clubs, how do you know that they don’t also base their selections exclusively on pretty covers?
Books, humans; tomato, to-mahto. The Book of You is judged by its cover almost constantly, especially when it comes to getting a job. Your résumé is essentially one big dust jacket awaiting judgment, wrapped around the human hardback within. The judgment might hinge on your race or your gender, or even your socioeconomic background, accent, sexual orientation or the ethnic origins of your name. Humans are very good at making judgments and inferences from limited information — consciously or not — leading to all manner of explicit and implicit biases. A moral offense? Sure, but in business it’s also a financial liability — a discrimination lawsuit waiting to happen.
In recent years, many employers have sought to limit the role of unconscious bias through so-called blind hiring — hiding names from résumés (and therefore implied gender, race and ethnicity); algorithmically assessing experience; and conducting interviews that are literally blind to candidates’ appearances or deaf to the sounds of their voices. All well and good, but as soon as a new employee walks into the office, they are exposed to their bosses and coworkers as themselves, burdened with the cluttered baggage of identity. Increasingly, though, new solutions — from HR best practices to new pieces of tech — are being adopted that advance the diversity and inclusion battle beyond just hiring.
You’re starting to see a lot of technologies being developed to try to help solve some of these problems.
Ellen Pao, co-founder, Project Include
Beyond hiring, “some companies are starting to think about how they make diversity and inclusion part of their cultures,” says Ellen Pao, co-founder of Project Include, a consulting nonprofit based in the San Francisco Bay Area. That might mean appointing heads of diversity and inclusion who sit outside HR, or promoting the use of data-driven performance reviews in place of subjective evaluations, which, whether conducted by a manager or as a self-review, are prone to bias, according to Project Include.
Replacing humans with algorithms in employee management, including salary and promotion decisions — let’s call it blind management — has been around for several years at multinational enterprises, says Eileen Carey, CEO and founder of Glassbreakers, which works with Fortune 100 companies to create community and mentorship opportunities for underrepresented groups. But it’s also starting to catch on among smaller companies, especially in the tech world, mainly because tech companies are “more open, more forward-thinking and more willing to adopt software platforms” to solve HR problems, says Ilit Raz, co-founder and CEO of Joonko, a tech-driven “diversity and inclusion coach.”
Another way that companies are starting to remove implicit bias from management is by banning salary negotiations, because women are often judged more harshly than men for questioning their paychecks — a step taken by Reddit when it was under the stewardship of Pao as interim CEO. Project Include acknowledges that this proposal may be controversial; Pao asserts that diversity and inclusion policies should not be seen as cookie-cutter approaches that are assumed to work for all companies.
Blindness, though, is not everything. A group of potentially biased human beings decides “which things get weighted” in a supposedly blind algorithm, says Evan Apfelbaum, an MIT professor who researches workplace diversity. And “when the results pop out on the other side, another group of human beings has to subjectively determine what those data mean.” Indeed, objective performance reviews, when the goals implicitly or explicitly favor one group over another, don’t actually help, claims Pao, who is also the chief diversity and inclusion officer at the Kapor Center for Social Impact and a partner at Kapor Capital. So not even our artificially intelligent (and, perhaps, artificially moral) algorithms can check prejudice embedded in company culture or interpersonal interactions. Right?
Not so fast! In fact, “you’re starting to see a lot of technologies being developed to try to help solve some of these problems,” says Pao. Glassbreakers actively promotes diversity by connecting employees with shared identities or personal experiences within multinational companies, which “creates empathy” through face-to-face meetings, says Carey. This helps support the retention and promotion of a diverse workforce, beyond just hiring. In addition, AI startup Joonko can notify managers via email if a certain minority employee seems to be getting overlooked in task assignments, for example, or nudge employees themselves to speak up about their achievements (women are far less likely to do so unprompted than men, according to recent research). Instead of “training people to be blind” to diversity, Raz says, “we need to train people to be more open and inclusive and respectful.”
Of course, “it is misplaced to assume that we’re not going to have to deal with difficult decisions about race and gender [in person] anymore because the computer’s going to take care of it,” Apfelbaum says. “You can’t work in a box all day.” Tech solutions to improve company culture are creative, he says, but that doesn’t mean that “nudges are going to solve all tensions regarding race and gender. I think it can only ever be one tool in the arsenal.”
Silicon Valley futurists often get a bad rap for thinking tech can solve every complex problem facing our messy world, from democracy to disease to poverty. Humans’ tendency toward prejudice could perhaps be the messiest, but it seems the technologists might actually have some answers — and some algorithms that can hold our hands on the way to more enlightened and more diverse workplaces.
- True Story