Diversity quotas suck. Here’s why.

A few days ago, I contributed to a roundtable discussion-style post about diversity quotas (that is, setting specific hiring targets around race and gender) on the Key Values blog. Writing my bit there was a good forcing function for exploring the issue of diversity quotas at a bit more length… and if I’m honest, this is a topic I’ve had really strong opinions about for a while but haven’t had the chance to distill. So, here goes.

I think it’s important to ask ourselves what we want to accomplish with diversity quotas in the first place. Are we trying to level the playing field for marginalized groups? To bring in the requisite diversity of thought that correlates so strongly with a better bottom line? Or to improve our optics so that when the press writes about our company’s diversity numbers, we look good? Unless diversity quotas are truly an exercise in optics, I firmly believe that, in the best case, they’re a band-aid that fails to solve deep, underlying problems with hiring and that, in the worst case, they do more harm than good by keeping us complacent about finding better solutions, and paradoxically, by undermining the very movement they’re meant to help. Instead of trying to manage outcomes by focusing on quotas, we should target root causes and create the kind of hiring process that will, by virtue of being fair and inclusive, bring about the diversity outcomes we want.

Why are quotas bad? If it’s not just about optics, and we are indeed trying to level the playing field for marginalized groups, let’s pretend for a moment that quotas work perfectly and bring us all the desired results. Even in that perfect world, we have to ask ourselves if we did the right thing. Any discussion about leveling the playing field for marginalized groups should not just be about race but should also include socioeconomic status. And age. And a myriad of other marginalized groups in tech.

We often focus on race and gender because those are relatively easy to spot. Socioeconomic status is harder because you can’t tell how someone grew up, and you can’t really ask “Hey, were your parents poor?” on an application form. Age is a bit easier to spot (especially if you spent your 20s lying around in the sun like I did), but it’s illegal to ask about age in job interviews… to prevent discrimination! Surely, that’s a contradiction in terms. So, if we’re leaving out socioeconomic status and age and a whole bunch of other traits when we assign quotas, are we really leveling the playing field? Or are we creating more problems?

One of the downsides of diversity quotas is the tokenization of candidates, which often manifests as stereotype threat, one of the very things we’re trying to prevent. I can’t tell you how many times people have asked me if I thought I got into MIT because I’m a girl. That feels like shit… in large part because I DON’T KNOW if I got into MIT because I’m a girl. Stereotype threat is a real thing that very clearly makes people underperform at their jobs… and then creates a vicious cycle where the groups we’re trying to help end up being tokenized and scrutinized for underperformance caused by the very thing that’s supposed to be helping them.

So, what about diversity of thought? If you’re really going after candidates who can bring fresh perspectives to the table, their lived experience should trump their gender and ethnicity (though of course, those can correlate heavily). If you’re really after diversity of thought, then educational background/pedigree and previous work experience should weigh just as heavily. Before I became a software engineer, I spent 3 years cooking professionally. Seeing how hiring happened in a completely different field (spoiler: it’s a lot fairer) shaped my views on how hiring should be done within tech. And look, if you put a gun to my head and asked me, given absolutely identical abilities to do the job, whether I should hire a woman who came from an affluent background, aced her SATs because of access to a stellar prep program and supportive parents, went to a top school and interned at a top tech company over a man who dropped out of high school and worked a bunch of odd jobs and taught himself to code and had the grit to end up with the requisite skills… I’ll take the man.1

But I’ll also feel shitty about it because I don’t think I should have to make choices like this in the first place. And the fact that I have to is what’s broken. In other words, quotas don’t work from either a moral perspective or a practical one. At best, they’re a band-aid solution covering up the fact that your hiring process sucks, and the real culprit is the unspoken axiom that the way we’re doing hiring is basically fine. I’ve already written at length about how engineering hiring and interviewing need to change to support diversity initiatives, so I won’t rehash it here. The gist: fixing hiring is way harder than instituting quotas, but low-hanging fruit aren’t going to get us to a place of equal opportunity. Better screening and investments in education will. At interviewing.io, because we rely entirely on performance in anonymous technical interviews rather than resumes to surface top-performing candidates, 40% of the hires we’ve made for our customers are people from non-traditional backgrounds and underrepresented groups (and sometimes these are candidates that the same companies had previously rejected based on their resumes). The companies that have benefitted from access to these candidates have been willing to undergo the systemic process change and long-term thinking that effecting this level of change requires. We know our approach works. It’s hard, and it takes time and effort, but it works.


1There was a recent New York Times piece arguing that “diversity of thought” is an excuse that lets us be lazy about working to hire people from underrepresented groups. On the contrary, I believe that the kind of “root cause” approach we’re advocating, where we invest in long-term education and create a fairer hiring process, is significantly harder than instituting something like quotas.

In defense of Palantir… or why the Department of Labor got the wrong man

On September 26th, the U.S. Department of Labor filed a suit against Palantir Technologies, alleging that Palantir’s engineering hiring practices discriminate against Asian applicants. I don’t have any salacious insider information about this suit, but I do have quite a bit of insight into how technical hiring works. Palantir and the DOL are really arguing over using resumes versus employee referrals to screen job candidates, when smart companies of a certain size should primarily rely on neither. In other words, rather than Palantir, standard hiring practices are really what should be on trial.

The DOL’s suit is based on the unfavorable disparity between the number of “qualified” Asian applicants for 3 engineering roles between 2010 and 2011 and the number of resulting Asian hires. Below (as taken from the full complaint), you can see the roles covered by the suit, the number of applicants and hires, and the odds, according to the DOL’s calculations, that these disparities happened by chance:

[Table from the DOL complaint: the three roles at issue, applicant and hire counts, and the DOL’s calculated odds]

My issue with this setup is simple: what does qualified actually mean? According to the complaint, “Palantir used a four-phase hiring process in which Asian applicants were routinely eliminated during the resume screen and telephone interview phases despite being as qualified as white applicants.” A four-phase hiring process is typical in tech companies and refers to a resume screen, followed by a call with a recruiter, followed by a technical phone screen where the applicant writes code while another engineer observes, and concluding with a multi-hour onsite interview.

To determine basic “qualification,” the DOL relied, at least in part (and likely heavily), on the content of applicants’ resumes which, in turn, boils down to a mix of degree and work experience. Resumes are terrible predictors of engineering ability. I’ve looked at tens of thousands of resumes, and in software engineering roles, there is often very little correspondence between how someone looks on paper and whether they can actually do the job.

How did I arrive at this conclusion? I used to run technical hiring at a startup and was having a hell of a time trying to figure out which candidates to let through the resume screen. Over and over, people who looked good on paper (had worked at companies like Google, had gone to schools like MIT, and so on) crashed and burned during technical interviews, whereas candidates without pedigree often knocked it out of the park. So, I decided to examine the resumes of everyone who applied over the course of a year as well as those of past and current employees. After going through hundreds of resumes and looking at everything from years of experience and highest degree earned to GPA and prestige of previous employers, it turned out that the thing that mattered most, by a huge margin, wasn’t any piece of information about the candidate. Rather, it was the number of grammatical errors and typos on their resume.

Don’t believe me that screening for education and experience doesn’t work? Then consider the following experiment. A few years ago, I showed a set of anonymized resumes from my collection to 150 engineers, recruiters, and hiring managers and asked them one question: “Would you interview this candidate?” Not only did participants, across the board, fail at predicting who the strong candidates were (the odds of guessing correctly were roughly 50%, i.e. the same as flipping a coin), but, much more importantly, no one could even agree on what a strong candidate looked like in the first place.

Organizations realize that resumes are noisy and are forced to explore other, more reliable channels. In the case of Palantir and many other companies, this boils down to relying on employee referrals, and that may be the DOL’s strongest argument. According to the complaint, “…the majority of Palantir’s hires into [the three positions listed in the suit] came from an employee referral system that disproportionately excluded Asians.” Using referrals as a hiring channel is an extremely common practice, with the rationale being that it’s a reliable source of high-quality candidates. This makes sense. If resumes were reliable, then referrals wouldn’t be such a valued channel.

Despite its ubiquity, is relying on referrals grounds for a discrimination suit? Perhaps. But referrals were Palantir’s perfectly reasonable attempt to find a better screen than resumes. Palantir just didn’t go far enough in looking at other options. What if, instead of being bound to hire from the quasi-incestuous sliver of your employees’ social graph, you could have reliable, high-signal data about your entire candidate pool?

Until recently, when hiring, you had to rely on proxies like resumes to make value judgments because there simply wasn’t a good way to get at more direct, meaningful, and unbiased data about your candidates.

We now have the technology to change that. A slew of products anonymize candidate names, entirely occluding race. A whole other set of tools enables you to send relevant take-home exercises to all your applicants and automatically score their submissions, using those scores as a more indicative resume substitute. And there’s my company, interviewing.io, which I’m totally plugging right now but which also happens to be the culmination of my attempts to fix everything that’s pissed me off about hiring for years. interviewing.io matches companies with candidates based entirely on how those candidates have been doing in technical interviews up until that point. Moreover, every interview on our platform is blind — by the time you unmask with a candidate, you’ve decided whether you’re going to bring them in for an onsite, and you’ve used nothing but their interview performance to make that decision.

Whichever solution ends up being the right one, one thing is clear. It’s time to shut down outdated, proxy-based hiring practices. That doesn’t mean paying lip service to diversity initiatives. It means fundamentally rethinking how we hire, paring away every factor except whether the candidate in question can do the job well.

Any other kind of hiring practice is potentially discriminatory. But even worse, it’s inefficient and wasteful. And it’s ultimately the thing that, unlike Palantir, truly deserves our wrath.

A founder’s guide to making your first recruiting hire

Recently, a number of founder friends have asked me about how to approach their first recruiting hire, and I’ve found myself repeating the same stuff over and over again. Below are some of my most salient thoughts on the subject. Note that I’ll be talking a lot about engineering hiring because that’s what I know, but I expect a lot of this applies to other fields as well, especially ones where the demand for labor outstrips supply.

Don’t get caught up by flashy employment history; hustle trumps brands

At first glance, hiring someone who’s done recruiting for highly successful tech giants seems like a no-brainer. Google and Facebook are good at hiring great engineers, right? So why not bring in someone who’s had a hand in that success?

There are a couple of reasons why hiring straight out of the Googles and Facebooks of the world isn’t necessarily the best idea. First off, if you look at a typical recruiter’s employment history, you’re going to see a lot of short stints. Very likely this means that they were working as a contractor. While there’s nothing wrong with contract recruiting, per se, large companies often hire contract recruiters in batches, convert the best performers to full-time hires, and ditch the rest.1 That said, some of the best recruiters I know started out at Google. But I am inclined to believe they are exceptions.

The second and much more important reason not to blindly hire out of tech giants is the importance of scrappiness and hustle in this hire. If you work as a recruiter at Google, you’re basically plugged into the matrix. You have a readymade suite of tools that make it much easier to be successful. You have a database of candidates who have previously interviewed that spans a huge portion of the engineering population. Email discovery is easier. Reaching out to people is easier because you have templates that have been proven to work to rely on. And you can lean on the Google brand as a crutch. Who hasn’t been, at one point in their career, flattered by an email from a Google recruiter? As a result, if you’re sending these emails, you don’t have to go out of your way to write personal messages or to convince people that your company is cool and interesting and worth their time. You get that trust for free.

Contrast this setup with being the very first person in the recruiting org. You have no tools. You have no templates. You probably have no brand. You probably have, well, jack shit. You need someone who’s going to think critically about tooling and balance the need for tooling with a shoestring budget, especially in a space where most tooling has a price tag of at least $1K per month. You’re going to need someone whose methods are right for your particular situation rather than someone who does things because that’s just how they’ve always been done. You probably want someone who realizes that paying for a LinkedIn Recruiter seat is a huge fucking waste of money and that sourcing on LinkedIn, in general, is a black hole-level time suck. You want someone who is good at engaging with candidates independently of brand sparkle, which likely means someone who understands the value of personalization in their sourcing efforts. You want someone who compensates for your relatively unknown status with great candidate experience during your interview process. You want someone who won’t just blindly pay tens of thousands of dollars for career fair real estate because that’s just what you do, even though the only companies who get ROI on career fair attendance are ones with preexisting brands. And, apropos, you want someone who can start building a sparkly brand for you from day one because building a brand takes time. (More on brand-building in the last two sections on marketing chops and evangelism.)

Sales chops are hugely important, and you can test for those

People often ask me if having an engineering background is important for technical recruiters. My answer to that is always, “Yes, but.” Yes, it’s useful, but the main reason it’s useful is that it helps build credibility and rapport with candidates. A good salesperson can do that without all the trappings of engineering experience. To put it another way, at the end of the day, this is a sales job. Great engineers who are shitty salespeople will not do well at recruiting. Great salespeople with no engineering background will likely do well.

So, how can you test for sales aptitude? If the candidate is currently an in-house recruiter somewhere, I ask them to pitch me on the company’s product. If they’re an agency recruiter, I ask them to pitch me on one of their clients’ products. Most recruiters do a decent job of pitching the company as a good place to work, but unfortunately, many don’t have a very good understanding of what their company actually does. Given that they’re the first point of contact for candidates, it’s really important to be able to answer basic questions about product-market fit, challenges (both product and engineering), how the company makes money, how much traction there is, what the competition looks like, and so on. Moreover, a lack of interest in something this basic points to a lack of intellectual curiosity in general, and in a technical recruiter, this is a very poor sign because such a huge portion of the job is picking up new concepts and being able to talk about them intelligently to very smart people.

You want someone who can write

I was on the fence about whether to include this section because it sounds kind of obvious, but writing well is important in this role for two reasons. First off, your recruiter is likely going to be the first point of contact with candidates. And if you’re an early-ish company without much brand, correspondence with the recruiter will likely be the first time a candidate ever hears of you. So, you probably want that interaction to shine. And the other reason you want someone who cares about narrative, spelling, and grammar is that they will be the arbiter of these abilities in future recruiting hires. Enough said.

One exercise I like to have candidates for this role go through is writing mock sourcing emails to people at your company, as if they were still at their previous position. This portion of the interview process is probably the best lens into what it’s actually like to work with the candidate. In particular, because candidates are not likely to have a clear idea of what they’re pitching yet, I try to make this part of the process iterative and emphasize that I welcome any number of questions about anything, whether it’s the company’s offering, what companies my firm works with, what certain parts of the engineers’ profiles mean, or anything in between. What questions people ask, how they ask them, and how they deal with the ambiguity inherent in this assignment is part of the evaluation, as is the caliber of the research they did on each mock email recipient.

You want someone with marketing chops

I talked a bit earlier about how you probably have no brand to speak of at this point. I can’t stress enough how much easier having a brand makes hiring. Until you have one, especially in this climate, you’re going to be fighting so fucking hard for every one-off hire. If you can, you ought to put effort into branding such that you end up in the enviable position of smart people coming to you.

So why don’t early-ish companies do this across the board? Brand building is a pain in the ass, it takes time, and not all of your outbound efforts are going to be measurable, which can make it harder to get others in your org to buy in. If you can find someone who’s had even a little bit of marketing experience, they’ll be able to identify channels to get the word out, use their preexisting network to help with outsource-able tasks, and get the ball rolling on things like hosting events, which, if you’ve never done before, can be quite intimidating.

And because recruiting doesn’t live in a vacuum and needs help from other teams to send something high-signal and genuine into the world, someone with some marketing experience will likely have an easier time getting other teams to buy in and put time and resources into this endeavor, which brings me to my next point.

You want someone who can fearlessly evangelize the importance of recruiting… and get you to take an active role even when you don’t feel like it

The harsh reality is that the primary reason companies hire their first recruiter is so that hiring can be taken off the plate of the founders. It’s tempting to have the “set it and forget it” mentality in a founder’s shoes — recruiters aren’t cheap, so presumably if you pay them enough, they’ll just deal with this pesky hiring thing, and then you can get back to work. I get it. Hiring isn’t that fun, and as a founder, despite having been a recruiter myself, there are definitely days when I just want to pay someone to, for the love of god, take this bullshit off my hands so I can get back to talking to users and selling and figuring out what to build next and all sorts of other things.

But it doesn’t work that way. If you’re a founder, no one can sell your vision as well as you. And all that experience you’ve built up that makes you a subject matter expert probably also makes you pretty good at filtering candidates. You might take a lot of what’s in your head for granted, but transferring that over into someone else’s brain is going to take time and iteration. And you can never really dissociate from hiring entirely because the moment you do, the culture of “hiring is just the domain of recruiting” is going to trickle down into your culture, and over time, it will cost you the best people.

In my recruiting days, at a high level, I saw two types of hiring cultures. One had the hiring managers and teams taking an active role, participating in sourcing, tweaking interview questions to make them engaging and reflective of the work, and taking time to hang out with candidates, even if they weren’t interviewing yet. The other type had the recruiting org be largely disjoint from the teams it was hiring for. In this type of setup, team members would view recruiting as a hassle/necessary evil that took them away from their regular job, and most of the remaining trappings of the hiring process would be left in the hands of recruiters alone.

You can guess which type of company ends up with an enviable interview process, a sweet blog, cool, hiring-themed easter eggs in their code, and a wistful, pervading, nose-pressed-against-the-glass refrain of “I wish I could work there”. And you can, in turn, guess which company demands a CS degree and 10 years of [insert recent language name here] experience in their job descriptions.

Despite these realities, founders and hiring managers often forget how critical their role in hiring is because they have a ton of everyday tasks on their plates. This is why having your recruiter be a fearless evangelist is so important. This person needs to cheerfully yet insistently remind the team and especially founders (who are, after all, the ones who dictate culture) that time spent on hiring is part of their jobs. This person needs to be able to march into the CEO’s office and demand that they go and give a talk somewhere or consistently block off time on their calendar every week to send some sourcing emails. Or that they need to write some stuff somewhere on the internet such that people start to realize that their company is a thing. Marching into a CEO’s office and making demands is tough. You need a person who will do this without trepidation and who will be able to convince you, even when the sky is falling, that a few hours a week spent on hiring are a good use of your time.

In addition to these points, all the usual considerations about hiring someone who’s going to be growing a team apply here. Is this person already a strong leader? If not, can they grow into one? Are they going to be able to attract other talent to their team? Are they someone you want around, fighting alongside you in the dark, for a long time to come? And, though in an ideal world I’d choose someone with experience who also meets the criteria I’ve outlined in this guide, if ultimately faced with a choice between experience and someone green with hustle, charisma, writing ability, and smarts, I’ll choose the latter every time.


1As an aside, this process is an unfortunate side effect of employment law meant to protect contractors from being exploited. The thinking is that by capping the length of time that someone can work as a contractor, you can exert pressure on the company to turn them into full-time hires who have to be given benefits. But as with many well-intentioned pieces of legislation, that’s not really what happens in practice. The practical takeaway, though, is that if someone is great at recruiting, they’re probably not going to have a bunch of short contracting stints.

Engineers can’t gauge their own interview performance. And that makes them harder to hire.

Note: This post is cross-posted from interviewing.io’s blog. interviewing.io is a company I founded that tries to make hiring suck less. I included it here because it seems like there’s a good amount of thematic overlap. And because there are some pretty graphs.

interviewing.io is an anonymous technical interviewing platform. We started it because resumes suck and because we believe that anyone, regardless of how they look on paper, should have the opportunity to prove their mettle. In the past few months, we’ve amassed over 600 technical interviews along with their associated data and metadata. Interview questions tend to fall into the category of what you’d encounter at a phone screen for a back-end software engineering role at a top company, and interviewers typically come from a mix of larger companies like Google, Facebook, and Twitter, as well as engineering-focused startups like Asana, Mattermark, KeepSafe, and more.

Over the course of the next few posts, we’ll be sharing some { unexpected, horrifying, amusing, ultimately encouraging } things we’ve learned. In this blog’s heroic maiden voyage, we’ll be tackling people’s surprising inability to gauge their own interview performance and the very real implications this finding has for hiring.

First, a bit about setup

When an interviewer and an interviewee match on our platform, they meet in a collaborative coding environment with voice, text chat, and a whiteboard and jump right into a technical question. After each interview, people leave one another feedback, and each party can see what the other person said about them once they both submit their reviews. If both people find each other competent and pleasant, they have the option to unmask. Overall, interviewees tend to do quite well on the platform, with just under half of interviews resulting in a “yes” from the interviewer.

If you’re curious, you can see what the feedback forms look like below. As you can see, in addition to one direct yes/no question, we also ask about a few different aspects of interview performance using a 1-4 scale. We also ask interviewees some extra questions that we don’t share with their interviewers, and one of those questions is about how well they think they did. In this post, we’ll be focusing on the technical score an interviewer gives an interviewee and the interviewee’s self-assessment (both are circled below). For context, a technical score of 3 or above seems to be the rough cut-off for hirability.

Feedback form for interviewers

Feedback form for interviewees

Perceived versus actual performance

Below, you can see the distribution of people’s actual technical performance (as rated by their interviewers) and the distribution of their perceived performance (how they rated themselves) for the same set of interviews.1

You might notice right away that there is a little bit of disparity, but things get interesting when you plot perceived vs. actual performance for each interview. Below is a heatmap of the data, where the darker areas represent higher interview concentration. For instance, the darkest square represents interviews where both perceived and actual performance were rated as a 3. You can hover over each square to see the exact interview count (denoted by “z”).
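For the curious, the z matrix behind a heatmap like this is just a 2D histogram of (actual, perceived) score pairs. Here’s a minimal sketch with made-up scores (the real data would come from the feedback forms described above); the resulting matrix is what you’d hand to a plotting library like Plotly:

```python
import numpy as np

# Made-up interview scores on the 1-4 scale; real data would come from
# the feedback forms described above.
actual    = [3, 3, 2, 4, 3, 1, 2, 3, 4, 2]   # interviewer's technical score
perceived = [3, 2, 2, 3, 3, 2, 1, 3, 4, 3]   # interviewee's self-assessment

# Count interviews falling in each (actual, perceived) cell. This z
# matrix is exactly what the heatmap plots: darker cell = bigger count.
edges = [0.5, 1.5, 2.5, 3.5, 4.5]            # one bin per score, 1 through 4
z, _, _ = np.histogram2d(actual, perceived, bins=[edges, edges])
print(z.astype(int))
```

With these toy scores, the (3, 3) cell is the densest, mirroring the darkest square described above.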

If you run a regression on this data2, you get an R-squared of only 0.24, and once you take away the worst interviews, it drops even further to 0.16. For context, R-squared is a measure of how well empirical data fit some mathematical model. It’s on a scale from 0 to 1, with 0 meaning that everything is noise and 1 meaning that everything fits perfectly. In other words, even though some small positive relationship between actual and perceived performance does exist, it is not a strong, predictable correspondence.
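To make the R-squared figure concrete, here’s a minimal sketch of the computation for a simple linear fit. The scores below are illustrative, not our data:

```python
import numpy as np

def r_squared(actual, perceived):
    """Least-squares fit of perceived ~ actual, returning R^2: the share
    of variance in perceived scores that the linear fit explains."""
    actual = np.asarray(actual, dtype=float)
    perceived = np.asarray(perceived, dtype=float)
    slope, intercept = np.polyfit(actual, perceived, 1)
    predicted = slope * actual + intercept
    ss_res = ((perceived - predicted) ** 2).sum()         # unexplained variance
    ss_tot = ((perceived - perceived.mean()) ** 2).sum()  # total variance
    return 1 - ss_res / ss_tot

# Perfect agreement gives R^2 = 1; noisy, weakly related scores give
# something closer to the ~0.24 described above.
print(r_squared([1, 2, 3, 4], [1, 2, 3, 4]))            # → 1.0
print(round(r_squared([1, 2, 3, 4], [2, 1, 4, 3]), 2))  # → 0.36
```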

You can also see there’s a non-trivial amount of impostor syndrome going on in the graph above, which probably comes as no surprise to anyone who’s been an engineer.

Gayle Laakmann McDowell of Cracking the Coding Interview fame has written quite a bit about how bad people are at gauging their own interview performance, and it’s something that I had noticed anecdotally when I was doing recruiting, so it was nice to see some empirical data on that front. In her writing, Gayle mentions that it’s the job of a good interviewer to make you feel like you did OK even if you bombed. I was curious about whether that’s what was going on here, but when I ran the numbers, there wasn’t any relationship between how highly an interviewer was rated overall and how off their interviewees’ self-assessments were, in one direction or the other.

Ultimately, this isn’t a big data set, and we will continue to monitor the relationship between perceived and actual performance as we host more interviews, but we did find that this relationship emerged very early on and has continued to persist with more and more interviews — R-squared has never exceeded 0.26 to date.

Why this matters for hiring

Now here’s the actionable and kind of messed up part. As you recall, during the feedback step that happens after each interview, we ask interviewees if they’d want to work with their interviewer. As it turns out, there’s a very statistically significant relationship (p < 0.0008) between whether people think they did well and whether they’d want to work with the interviewer. This means that when people think they did poorly, they may be a lot less likely to want to work with you3. And by extension, it means that in every interview cycle, some portion of interviewees are losing interest in joining your company just because they didn’t think they did well, despite the fact that they actually did.
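Since we haven’t published the underlying counts, here’s a hedged sketch of one standard way to get a p-value for a relationship like this: a Pearson chi-square test on a 2x2 table of self-assessment versus willingness to work with the interviewer. The counts below are invented, loosely shaped like the effect in footnote 3:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]] -- rows: thought they did well / poorly,
    cols: would / wouldn't want to work with the interviewer."""
    n = sum(sum(row) for row in table)
    col_totals = [table[0][j] + table[1][j] for j in range(2)]
    stat = 0.0
    for i in range(2):
        for j in range(2):
            # Expected count under independence: row total * col total / n
            expected = sum(table[i]) * col_totals[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Invented counts: self-assessment in rows, willingness in columns
table = [[90, 10],   # thought they did well: 90 would work with them, 10 wouldn't
         [40, 60]]   # thought they did poorly: 40 would, 60 wouldn't
stat = chi_square_2x2(table)
# With 1 degree of freedom, a statistic above 10.83 means p < 0.001
print(round(stat, 2), stat > 10.83)   # → 54.95 True
```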

How can one mitigate these losses? Give positive, actionable feedback immediately (or as soon as possible)! This way people don’t have time to go through the self-flagellation gauntlet that happens after a perceived poor performance, followed by the inevitable rationalization that they totally didn’t want to work there anyway.

Lastly, a quick shout-out to Statwing and Plotly for making terrific data analysis and graphing tools respectively.

1There are only 254 interviews represented here because not all interviews in our data set had comprehensive, mutual feedback. Moreover, we realize that raw scores don’t tell the whole story and will be focusing on standardization of these scores and the resulting rat’s nest in our next post. That said, though interviewer strictness does vary, we gate interviewers pretty heavily based on their background and experience, so the overall bar is high and comparable to what you’d find at a good company in the wild.

2Here we are referring to linear regression, and though we tried fitting a number of different curves to the data, they all sucked.

3In our data, people were 3 times less likely to want to work with their interviewers when they thought they did poorly.

What happens when you stop relying on resumes

If you’re a regular reader of this blog, you know that I’ve come to rely quite heavily on data. I’ve counted typos on resumes, I’ve sifted through a corpus of engineering offers, and I’ve skimmed thousands of recruiting messages to tag them by personalization level. This post, however, is going to be a bit of a departure. Rather than making broad, sweeping conclusions based on a lot of data points, I’m going to zero in on one story that happened, in part, because of some data I gathered during an experiment. I think it’s a really cool story, and I can only hope that there will be more stories like it that, in time, will enable me to write another post with lots of graphs.

The experiment in question was as follows. Last fall, I showed a set of anonymized engineering resumes to about 150 engineers, recruiters, and hiring managers and asked one question: Would you interview this candidate? It turned out that not only did both recruiters and engineers largely fail at predicting who the strong candidates were, but, much more importantly, no one could even agree on what a strong candidate looked like in the first place.

These results were quite startling, and they left me scratching my head about the implications. After all, resumes are such a huge part of how hiring is done. Of course, it wasn’t always this way. From Chinese civil servant exams to Thomas Edison’s somewhat infamous general knowledge tests, filtering for positions requiring specialized knowledge was, up until the last century, accomplished largely through a variety of aptitude tests. Even as the resume rose in prominence and became a ubiquitous arbiter of potential fit, its format has gone through a number of evolutions. Things that were commonly included on resumes at various times (and still are in certain cultures) — a photo, marital status, age, religion, height, weight, blood type, and political affiliation — are no longer en vogue, and it’s certainly not inconceivable that, in the future, asking for a resume will seem just as silly as these now-outdated practices.

So, if resumes aren’t a good signal for hiring engineers, what is? In my blog, I wondered if, instead of a resume, it might be interesting to have people write a bit about something they built that they were excited about. I was excited, in turn, when my friends at KeepSafe actually decided to try this out — for a month, candidates who applied through KeepSafe’s No Resume campaign were evaluated purely on what they wrote about their projects, and this writing sample would be the only thing used to decide whether someone would get an interview. This is the story of what happened.

KeepSafe, in some ways, came to this experiment out of necessity. Like many other small startups, they were feeling the acute pain of hiring engineers in this market. They had a product people loved, a ridiculous user-to-engineer ratio (39M users for 6 engineers), and the requisite hip South Park office. Along with that, though, they had quite a high hiring bar and consistently found themselves competing for talent with the likes of Google. And more often than not, they’d lose.

To stay in the game, KeepSafe needed to change up the rules and tap into a different pool of people. The hope was that there were plenty of talented engineers out there who were getting overlooked because they lacked resume pedigree but who were passionate and skilled and would be awesome hires, given the chance to show what they could do.

KeepSafe’s experiment struck a chord, and in the first two days, over 400 people submitted descriptions of stuff they’d built. Awesome stuff like this.

Submissions varied in length. Some were just a sentence. Some were multiple pages and supplemented with links to demos. Not all submissions were awesome. Some were generic, copy-pasted cover letters professing interest in the software development life cycle, and some were just disconnected links. Those KeepSafe cut immediately. The rest were a lot tougher to cull.

Though I didn’t participate in the judging, one thing that struck me about the submissions I read was that my normal approach was useless here. Normally, when I look at resumes, I can make a yes/no decision within 10 seconds based on proxies like past employers. With KeepSafe’s submissions, I couldn’t rely on proxies at all. With each person, I had to think about what they built, imagine it, understand it. In addition, I found myself getting a real read on people’s interests and trying to imagine what projects they might want to tackle at KeepSafe, something that’s much harder to gauge with the traditional resume format.

All of this was weird and slow, but it felt really good. And then I realized that unless your achievements fit into a convenient mold, you lose the best parts of you when you try to beat your winding path into a nice clean line. Resumes don’t have an explicit section for building rockets or Minecraft servers, and even if you stick them somewhere in “personal projects”, that’s not where the reader’s eye will go. The sad truth is that if the reader doesn’t like or recognize what they see before they get to the rockets, they will likely never get there at all.

***

By traditional Silicon Valley standards, AJ Alt didn’t look especially good on paper. He hadn’t attended a brand-name school, his GPA wasn’t high enough to stand out, and his only professional programming experience was a multiyear cybersecurity stint at a huge defense contractor, the nature of which he was forbidden from discussing, on a resume or otherwise. His GitHub, full of projects spanning everything from a Python SHA-1 implementation to a tongue-in-cheek “What should I call my bro?” bromanteau generator, hinted at a different story, but most people never got there. While AJ’s government work experience gave him a good amount of cred in the public sector, he found that making the move to industry, and startups especially, was near impossible. It wasn’t that he was blowing interviews. He just couldn’t get through the filter in the first place.

Of the 415 people who applied to KeepSafe through the No Resume campaign, 18 ended up interviewing, and 5 came in for an onsite day of coding. One received an offer. It was AJ, a candidate whom Zouhair Belkoura, KeepSafe’s cofounder and CEO, readily admits he would have overlooked had AJ come in through traditional channels. Since starting a couple of months ago, AJ has built out an open-source Android animation library and co-created a brand-new security feature within the app.

When I asked Zouhair if KeepSafe would consider the experiment a success, his answer was a resounding yes. According to Zouhair, “The overall quality of applications seemed to be a lot higher than through our normal resume channels. We also met more people who seemed to genuinely like programming and wanted to talk about it [rather than people just applying because they wanted a job].”

The team has decided to continue hiring without resumes for the foreseeable future — the funnel numbers for this process weren’t much different from what you’d see in a more traditional setting, and, as the team gets better at making value judgments based on writing and projects, those numbers should only improve. Resumes work well for companies like Google because their strong brand drives a revolving door of inbound applicants. If you’re a smaller brand, however, copying that process can be detrimental. Of course, this experiment had a small sample size, and one hire is more anecdote than gospel, but if replacing the resume with a writing sample is a good way to tap a different, yet highly skilled and engaged, talent pool, then it’s worth a shot. As Zouhair told me, “Our culture is that code speaks louder than credentials, and now we have a hiring process that reflects that. [When we made the hire], we had never even seen AJ’s resume. And we don’t need to.”


Interested in applying to KeepSafe without a resume? They are still accepting not-resumes at https://www.getkeepsafe.com/noresume.html. Think resumes are generally a dumb way to filter for engineering ability? Check out what I’m working on now, interviewing.io.