Six months ago, tech entrepreneur Rohan Gilkes tried to rent a cabin in Idaho over the July 4 weekend, using the website Airbnb. All seemed well, until the host told him her plans had changed: she needed to use the cabin herself. Then a friend of Rohan’s tried to book the same cabin on the same weekend, and his booking was immediately accepted. Rohan’s friend is white; Rohan is black.
This is not a one-off. Late last year, three researchers from Harvard Business School — Benjamin Edelman, Michael Luca and Dan Svirsky — published a working paper
with experimental evidence of discrimination. Using fake profiles to request accommodation, the researchers found that applicants with distinctively African-American names were 16 per cent less likely to have their bookings accepted. Edelman and Luca have also published evidence that black hosts receive lower incomes than whites while letting out very similar properties on Airbnb. The hashtag #AirbnbWhileBlack has started to circulate.
Can anything be done to prevent such discrimination? It’s not a straightforward problem. Airbnb condemns racial discrimination but, by making names and photographs such a prominent feature of its website, it makes discrimination, conscious or unconscious, very easy.
“It’s a cheap way to build trust,” says Luca. But, he adds, it “invites discrimination”.
Of course there’s plenty of discrimination to be found elsewhere. Other studies have used photographs of goods such as iPods and baseball cards being held in a person’s hand. On Craigslist and eBay, such goods sell for less if held in a black hand than a white one. An unpleasant finding — although in such cases it’s easy to use a photograph with no hand visible at all.
The Harvard Business School team have produced a browser plug-in called “Debias Yourself”. People who install the plug-in and then surf Airbnb will find that names and photographs have been hidden. It’s a nice idea, although one suspects that it will not be used by those who need it most. Airbnb could impose the system anyway but that is unlikely to prove tempting.
However, says Luca, there are more subtle ways in which the platform could discourage discrimination. For example, it could make profile portraits less prominent, delaying the appearance of a portrait until further along in the process of making a booking. And it could nudge hosts into using an “instant book” system that accelerates and depersonalises the booking process. (The company recently released a report describing efforts to deal with the problem.)
But if the Airbnb situation has shone a spotlight on unconscious (and conscious) bias, there are even more important manifestations elsewhere in the economy. A classic study by economists Marianne Bertrand and Sendhil Mullainathan used fake CVs to apply for jobs. Some CVs, which used distinctively African-American names, were significantly less likely to lead to an interview than identical applications with names that could be perceived as white.
Perhaps the grimmest feature of the Bertrand/Mullainathan study was the discovery that well-qualified black applicants were treated no better than poorly qualified ones. As a young black student, then, one might ask: why bother studying when nobody will look past your skin colour? And so racism can create a self-reinforcing loop.
What to do?
One approach, as with “Debias Yourself”, is to remove irrelevant information: if a person’s skin colour or gender is irrelevant, then why reveal it to recruiters? The basic idea behind “Debias Yourself” was demonstrated in a celebrated study by economists Cecilia Rouse and Claudia Goldin. Using a careful statistical design, Rouse and Goldin showed that when leading professional orchestras began to audition musicians behind a screen, the recruitment of women surged.
Importantly, blind auditions weren’t introduced to fight discrimination against women — orchestras didn’t think such discrimination was a pressing concern. Instead, they were a way of preventing recruiters from favouring the pupils of influential teachers. Yet a process designed to fight nepotism and favouritism ended up fighting sexism too.
A new start-up, “Applied”, is taking these insights into the broader job market. “Applied” is a spin-off from the UK Cabinet Office, the Behavioural Insights Team and Nesta, a charity that supports innovation; the idea is to use some simple technological fixes to combat a variety of biases.
A straightforward job application form is a breeding ground for discrimination and cognitive error. It starts with a name — giving clues to nationality, ethnicity and gender — and then presents a sequence of answers that are likely to be read as one big stew of facts. A single answer, good or bad, colours our perception of everything else, a tendency called the halo effect.
A recruiter using “Applied” will see “chunked” and “anonymised” details — answers to the application questions from different applicants, presented in a randomised order and without indications of race or gender. Meanwhile, other recruiters will see the same answers, but shuffled differently. As a result, says Kate Glazebrook of “Applied”, various biases simply won’t have a chance to emerge.
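The mechanics of that shuffle are straightforward. Here is a minimal sketch in Python of how chunked, anonymised, per-recruiter views might be built — the function, the data and the field names are all hypothetical illustrations, not Applied’s actual system:

```python
import random

def anonymised_views(applications, question, recruiters, seed=0):
    """For one application question, give each recruiter the full set of
    answers in an independently shuffled order, with identities stripped."""
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    # Keep only the answer text -- names, and anything else that could
    # hint at ethnicity or gender, never reach the recruiter.
    answers = [app["answers"][question] for app in applications]
    views = {}
    for recruiter in recruiters:
        shuffled = answers[:]   # fresh copy, so each order is independent
        rng.shuffle(shuffled)   # a different random order per recruiter
        views[recruiter] = shuffled
    return views

applications = [
    {"name": "Alice", "answers": {"q1": "Answer A"}},
    {"name": "Bob",   "answers": {"q1": "Answer B"}},
    {"name": "Carol", "answers": {"q1": "Answer C"}},
]
views = anonymised_views(applications, "q1", ["recruiter_1", "recruiter_2"])
```

Because every recruiter scores the same pool of answers but in a different order, no single early answer can cast a halo over the rest, and nothing in the view reveals who wrote what.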
When the Behavioural Insights Team ran its last recruitment round, applicants were rated using both the new process and a more traditional CV-based approach. The best of the shuffled, anonymised applications were more diverse, and much better at predicting which candidates would impress on the assessment day. It is too early to declare victory — but it is a promising start.
Written for and first published in the Financial Times.
My new book “Messy” is now out and available online in the US and UK or in good bookshops everywhere.