A Way to Detect Bias - Identifying Bias in Selection Processes · Episode 10

A Way to Detect Bias - Identifying Bias in Selection Processes

· 05:06

This article, written by Paul Graham in October 2015, proposes a method for identifying bias in selection processes, one that can detect bias even without any information about the applicant pool. Graham argues that if a selection process is biased against a certain type of applicant, those applicants have a harder time being selected and therefore must be better than others to be chosen. As a result, applicants of that type who do make it through the process will outperform the other successful applicants. For example, many people believe that venture capital firms are biased against female founders. Graham points out that this is easy to test: among a firm's portfolio companies, do startups with female founders perform better than those without? Questions like this make bias detectable.

---

# A Way to Detect Bias (Identifying Bias in Selection Processes)

October 2015

This will come as a surprise to a lot of people, but in some cases it's possible to detect bias in a selection process without knowing anything about the applicant pool. Which is exciting because among other things it means third parties can use this technique to detect bias whether those doing the selecting want them to or not.

You can use this technique whenever (a) you have at least a random sample of the applicants that were selected, (b) their subsequent performance is measured, and (c) the groups of applicants you're comparing have roughly equal distribution of ability.

How does it work? Think about what it means to be biased. What it means for a selection process to be biased against applicants of type x is that it's harder for them to make it through. Which means applicants of type x have to be better to get selected than applicants not of type x. [1] Which means applicants of type x who do make it through the selection process will outperform other successful applicants. And if the performance of all the successful applicants is measured, you'll know if they do.
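The logic of that paragraph can be checked with a small simulation. This sketch is not from the essay; it assumes a simple model in which both groups draw ability from the same distribution (condition (c) above) and the biased-against group must clear a higher bar. The survivors from that group then have a higher average ability, exactly as the argument predicts:

```python
import random

random.seed(0)

def select(applicants, bias=0.0):
    """Admit an applicant if ability clears the bar; the group
    biased against ("x") must clear a higher bar when bias > 0."""
    selected = []
    for group, ability in applicants:
        bar = 1.0 + (bias if group == "x" else 0.0)
        if ability > bar:
            selected.append((group, ability))
    return selected

# Equal ability distributions for both groups (condition (c)).
applicants = [(g, random.gauss(0, 1))
              for g in ("x", "y") for _ in range(100_000)]

chosen = select(applicants, bias=0.5)

def mean_ability(group):
    vals = [a for g, a in chosen if g == group]
    return sum(vals) / len(vals)

# The group that faced the higher bar outperforms among those selected.
print(mean_ability("x") > mean_ability("y"))  # → True
```

With a large sample the gap is far outside noise: the truncated mean above the higher bar is strictly larger, so the biased-against survivors outperform.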

Of course, the test you use to measure performance must be a valid one. And in particular it must not be invalidated by the bias you're trying to measure. But there are some domains where performance can be measured, and in those detecting bias is straightforward. Want to know if the selection process was biased against some type of applicant? Check whether they outperform the others. This is not just a heuristic for detecting bias. It's what bias means.

For example, many suspect that venture capital firms are biased against female founders. This would be easy to detect: among their portfolio companies, do startups with female founders outperform those without? A couple months ago, one VC firm (almost certainly unintentionally) published a study showing bias of this type. First Round Capital found that among its portfolio companies, startups with female founders [outperformed](http://10years.firstround.com/) those without by 63%. [2]

The reason I began by saying that this technique would come as a surprise to many people is that we so rarely see analyses of this type. I'm sure it will come as a surprise to First Round that they performed one. I doubt anyone there realized that by limiting their sample to their own portfolio, they were producing a study not of startup trends but of their own biases when selecting companies.

I predict we'll see this technique used more in the future. The information needed to conduct such studies is increasingly available. Data about who applies for things is usually closely guarded by the organizations selecting them, but nowadays data about who gets selected is often publicly available to anyone who takes the trouble to aggregate it.

Notes

[1] This technique wouldn't work if the selection process looked for different things from different types of applicants—for example, if an employer hired men based on their ability but women based on their appearance.

[2] As Paul Buchheit points out, First Round excluded their most successful investment, Uber, from the study. And while it makes sense to exclude outliers from some types of studies, studies of returns from startup investing, which is all about hitting outliers, are not one of them.

**Thanks** to Sam Altman, Jessica Livingston, and Geoff Ralston for reading drafts of this.

---

