I’ve been an engineer and a recruiter. Hiring is broken. Here’s why… and what it should be like instead.

I’ve been in and around eng hiring for the past 13 years, as an engineer, a recruiter, and a founder of a technical recruiting marketplace (interviewing.io). Over the course of those 13 years, I’ve become increasingly disgruntled at the state of hiring, and now I’m mad enough to write this blog post.

If you’ve ever been on either end of the table, you’re probably mad at the state of hiring, too. Whether you have given it a lot of thought or whether you just feel it deep down, something about the whole process feels off.

But we’ve been doing it this way for so long that we probably take much of how hiring works as gospel, and it’s really hard to tease apart all the different components of the process and examine why they are the way they are. In this post, I’d like to challenge many of the things we assume about hiring, and, perhaps most importantly, I’d like to lay out my platonic ideal for how eng hiring should work. It’s a simple axiom, really:

It should be easy for smart people to talk to other smart people.

Or, another way to put it … if I’m a good engineer, it should be easy for me to talk to a hiring manager at a company I might be interested in, at a time of my choosing. But that’s simply not possible today. Despite the refrain that we’re in a candidate’s market and that there’s a shortage of good candidates, which should mean that candidates get to call the shots, today’s hiring process couldn’t be further removed from this ideal. And it’s not just broken for a specific type of candidate. It’s broken for everyone.

If you’re reading this, you might be an engineering manager, a senior engineer with stellar credentials, a recent bootcamp grad, an engineer from a background traditionally underrepresented in tech, or some combination of these. What’s truly messed up about the status quo is that, regardless of which of these groups you fall into, your journey will be unnecessarily unpleasant. Though the degree of unpleasantness will not always be the same, it’s not about race, seniority, pedigree, or gender … or even which side of the table you’re on. Hiring, in its current incarnation, is broken for everybody.

Why? Let us go then, you and I, into the bowels of the status quo.

A candidate and a hiring manager, never the twain shall meet

Let’s say that I’m a competent generalist engineer who looks good on paper, and I’m thinking that it’s time to look for a new job. What happens next? The idea of having to mount a full-on job search is so daunting. 

I could try some job boards to see which companies are out there. But what would I filter on? I know a lot of programming languages but am not set on having to work in a specific one. How can I tell if I’ll hit it off with the team? I’m applying via a job board to a position I know next to nothing about — will anyone even respond?

Suppose I find some companies where I might want to work. If I’m lucky enough to know someone there, I’ll have to get them to refer me, even though that may not actually do much to speed things along. And if I don’t know anyone there, applying will be an exhausting long shot. Odds are no one will look at my application, and having to redo my resume — or worse, write cover letters — seems like the most tedious kind of busywork.

I guess I can always dig through the recruiter spam I’ve gotten. But do those recruiters still work at the company? If they do, how long will it actually take to get into the process?

Breaking character for a moment, a friend of mine recently got this recruiting email from Google, which has elevated gaslighting to an art form: somehow the fact that it takes two months to get through their process has become a selling point.

Once I do get into the process, why do I have to endure the same intro call ten times with different recruiters who can’t tell me anything about what I’d be working on at any level of depth?

Do I join some platform, create a profile that I copy-paste everywhere (with writing that was just as painful as the aforementioned resume/cover letter) and sort of hope that decent companies contact me … only to have to begin the same recruiter calls over and over again, as above?

Will I have to take some quizzes that drill me on obscure syntax or make me solve toy problems that have no bearing on my engineering ability before I even get to have the aforementioned inane conversation with a recruiter?

If I’m actually good at my job, why can’t I just set up some conversations with companies I think are cool and see if it’s a fit? Why do I have to subject myself and others to an endless parade of vapid conversations and the inevitable busywork that precedes them?

Here’s the truth. Even if I look good on paper and am well-connected, hiring still sucks because of all the noise, uncertainty, and time wasted … but at least I have options. They might not be exactly the right options for me, but at least they exist. On the other hand, if I’m an engineer without a pedigree or a network, my choices are extremely limited, no matter how good I am. Recruiters aren’t reaching out to me, I’m not well-networked enough to have friends refer me, and I definitely don’t hear back when I apply.


Let’s take a look at the other side of the table. Let’s say I’m an eng manager who needs to hire more competent generalists for my team. Having worked as both an eng manager and a recruiter, I can tell you that what happens next isn’t particularly inspiring.

As an eng manager, I sit down with a recruiter and try to explain what I’m looking for. Nine times out of ten, I want a smart person who can get shit done. But, after a farcical game of telephone, somehow those criteria get warped into years of experience with a specific technology or requirements about where the candidate went to school. I also end up with an uninspired, sterile job description that fails to capture the imagination of any candidates who might unwittingly stumble upon it.

My recruiter then goes to any number of sourcing tools, of which LinkedIn Recruiter is the ubiquitous, lackluster market leader. They type in keywords I didn’t ask for and filter on credentials I don’t care about to come up with the same homogenous list of candidates that every other recruiter at every other tech company is chasing.

They then contact these candidates en masse with generic copy about my team and the hard problems we’re solving. They celebrate single-digit response rates and spend what little time is left giving a cursory glance to the candidates who apply directly.

Why is hiring broken?

So therein lies the ineffectual dance. This is the process we’ve come to accept. As far as I can tease out, the axioms that underlie today’s recruiting best practices go something like this (some of these were even told to me verbatim when I was starting out as a recruiter):

  1. Thou shalt not engage with active candidates. After all, in this market, strong candidates aren’t looking. Good recruiters build relationships so that when a good candidate does decide to enter the market, the recruiter is there, behind the next doorway, ready to spring!
  2. Engineering time is expensive, so it’s critical to do as much top-of-funnel filtering as possible to make sure that it’s spent on the right candidates.

Are these axioms wrong? The sad truth is … not really. I’ve written in a previous post about how market forces rule everything around me, and recruiting best practices are no exception. In an economy with a surplus of jobs and a shortage of talent, it follows that the best talent is going to be harder to find, engineering time will be expensive, and recruiters in their current incarnation are, dare I say it, a necessary evil. 1

The data supports our current world view. According to Lever (one of the two applicant tracking systems widely used by startups; Greenhouse is the other), here’s a breakdown of how many candidates from each source it takes to make a hire. Note that here, larger numbers are bad — for many companies, internal referrals are the best source and inbound applications are the worst.

Source: https://www.lever.co/recruiting-resources/articles/recruitment-process/

Looking at this data, you can see why recruiters simply ignore online applications. The same dynamics also apply to platforms such as AngelList — like any jobs board, it’s noisy and probably full of candidates who don’t have much leverage (e.g., juniors/bootcamp grads and people requiring visa sponsorship).

As for the value of eng time, guarding it carefully isn’t exactly wrong either. In fact, if you look at what a typical hiring process looks like today, you’ll see that most of the time spent is by engineers conducting interviews.

Hiring process stage       | Who does it?          | How long does it take?
---------------------------|-----------------------|-----------------------
Resume review              | Recruiter             | 10-30 seconds
Recruiter screen           | Recruiter             | 45 min
Technical phone screen     | Engineer              | 1 hour
Onsite – Eng portion       | Engineer              | 6 hours
Onsite – Recruiter portion | Recruiter             | 1 hour
Offer                      | Recruiter or eng mgr  | 1 hour

Engineering salaries are high, so given that most of the time spent on a single candidate is with engineers, it’s rational to put some recruiter gates at the top of the funnel to protect eng time. The idea is that recruiters will effectively screen out most candidates and only pass on the most promising ones to the eng team.

Unfortunately, when you look at a typical funnel in practice, you’ll see that despite attempts to gate the top with recruiters filtering resumes and making intro calls, it’s not really working. Below is what a typical funnel looks like.

If you do the math and look at how many hours are spent — not per candidate but per hire (more useful, because hires are ultimately what we want) — you’ll see that despite attempts to save eng time, recruiters spend roughly 15 hours per hire 2 and engineers spend about 40. In a process where half of your onsites don’t end in an offer and only half of your offers convert to hires, these numbers get much worse.
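If you want to play with this math yourself, here’s a quick back-of-the-envelope script. The per-stage hours come from the table above; the pass-through rates are illustrative assumptions on my part (your funnel will differ), not measured data.

```python
# Back-of-the-envelope: hours spent per hire, by role.
# Per-stage hours come from the table above; the pass-through
# rates below are illustrative assumptions, not measured data.

HOURS = {  # stage -> (recruiter hours, engineer hours) per candidate
    "resume review":     (30 / 3600, 0.0),  # ~30 seconds of recruiter time
    "recruiter screen":  (0.75, 0.0),       # 45 min
    "tech phone screen": (0.0, 1.0),
    "onsite":            (1.0, 6.0),        # 1 recruiter hour + 6 eng hours
    "offer":             (1.0, 0.0),        # could also be an eng manager
}

PASS = {  # assumed fraction of candidates who advance past each stage
    "resume review":     0.20,
    "recruiter screen":  0.50,
    "tech phone screen": 0.50,
    "onsite":            0.50,  # onsite -> offer
    "offer":             0.50,  # offer -> accepted
}

def hours_per_hire():
    candidates = 1.0  # one candidate enters the top of the funnel
    recruiter = engineer = 0.0
    for stage, (r, e) in HOURS.items():
        recruiter += candidates * r
        engineer += candidates * e
        candidates *= PASS[stage]  # survivors move to the next stage
    hires = candidates
    return recruiter / hires, engineer / hires

r, e = hours_per_hire()
print(f"recruiter hours per hire: {r:.1f}")  # ~18.7 with these assumptions
print(f"engineer hours per hire:  {e:.1f}")  # ~32.0 with these assumptions
```

With these made-up rates you land at roughly 19 recruiter hours and 32 engineer hours per hire (the same ballpark as the figures above), and nudging the offer or close rates down makes the per-hire numbers balloon quickly.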

But, hey, recruiters are doing their best, and if we put engineers at the top of the funnel without making any foundational changes, we’d lose an order of magnitude more engineering time (and money)! So today’s approach makes some sense, even if it’s not entirely functional.

Although these approaches are rational under existing constraints, they’re neither particularly efficient nor particularly fair to the individuals subject to them. This doesn’t mean that we can’t improve the system … But before we talk about what we have the power to change, let’s dive a bit deeper into the constraints we face today.

Currently, though the market has softened a bit in the wake of COVID, we’re still in the midst of an engineering shortage. This means that eng time is expensive, which also means that, as you saw above, companies want to save eng time as much as possible. As a result, there is a palpable tension between the pain caused by the talent shortage and the expense of interviewing the wrong people.

Stacked against the backdrop of this tension lives the problem of information asymmetry. Companies don’t actually know which candidates are strong, nor do candidates necessarily know which companies are a good fit for them. 

Market forces tell us that the side with less leverage will have to do the work. In a market where candidates have more leverage (they’re the ones in short supply), companies have to do the work of chasing candidates. And chase they do! In the absence of meaningful performance data, companies, nearly to a one, pursue the same small subset of fairly homogenous candidates from MIT/Stanford/Google/Facebook on LinkedIn Recruiter.

And so we end up with a paradox: in the midst of a talent shortage, companies ignore the candidates who apply to them and pay recruiters without domain expertise to chase the same ten engineers with the same credentials. This is a textbook example of an inefficient market.

Perhaps surprisingly, given companies’ current constraints — market forces AND information asymmetry — the paradoxical hiring process of the status quo ends up being the logical, inevitable conclusion. This is why we have come to accept that candidates can’t get in front of companies directly, that everyone has to deal with bad recruiters, and that good non-traditional candidates get overlooked.


That is the world today. But let’s imagine for a moment that you, as an engineer, had a credential based not on where you went to school or where you worked, both of which have repeatedly been shown not to be predictive of actual ability, but on actual coding ability, past performance, and so on. 3 And let’s say that this credential was persistent (once you have it, it doesn’t go away), and that holding it meant you’d be treated well in your job search.

Traditionally, market forces would dictate that I should be the one being courted, but if my “pursuit” isn’t a waste of time but an efficient, useful, signal-gaining interaction, then why shouldn’t I be the one to initiate contact with companies? After all, companies don’t know when I’m looking. Historically, attempts to identify when candidates flip from “passive” to “active” haven’t been effective (the now-defunct Entelo Sonar is an example), so if the act of contacting a company is useful to me and puts me in the driver’s seat, why wouldn’t I do it? Remember, once we’ve solved credentialing and companies know I’m a worthwhile investment, they’re not going to treat me poorly.

In a world with functional credentialing, putting candidates in the driver’s seat makes sense. After all, candidates are the only ones who know when they’re looking, so why shouldn’t they be able to act on that? This isn’t candidates chasing companies; call it non-committal recon. And then, once things get more serious, in a market with a shortage of candidates, companies will still be the ones doing the chasing.

Put simply, when we have functional credentialing, disintermediate recruiting, and put candidates in the driver’s seat, we suddenly have an efficient, liquid marketplace. So, if it’s that easy, do solutions like this exist? And do they work?

Isn’t this problem being fixed already? A brief history of hiring solutions, and how everyone eventually becomes LinkedIn Recruiter

Sadly, no, though it’s a problem that many smart people have tried to tackle by building products. Historically, these products have been variants of LinkedIn Recruiter, some with more window dressing than others. Several companies tried to be different but eventually succumbed to the inevitability of the LinkedIn Recruiter model. Three examples you might be familiar with are:

  • Hired
  • Triplebyte
  • AngelList

What’s the problem with these solutions? There are two common threads:

  • Lack of credentialing: Most of them don’t have reliable performance data, or, if they do, they never got companies to trust it enough (thereby failing to address the problem of information asymmetry).
  • Lack of candidate autonomy… and too many middlemen: Today, hiring is owned by recruiters who sit between companies and candidates, and hiring platforms are no exception: most don’t allow candidates to contact companies when they’re interested. It’s not bad for companies to do outreach, but candidates know best when they’re looking. As we noted earlier, though market forces tell us that companies have to do the chasing, it is far more efficient for candidates to initiate contact. Unfortunately, most of these platforms actually make the process less efficient by adding an extra hurdle: before you ever meet an engineer, you first have to talk to a recruiter who works for the platform, and then to a recruiter at the company in question. Hired called the recruiters who worked for them “talent advocates,” and Triplebyte called theirs “talent managers,” but those are just other names for recruiters, like you’d have at an agency.

These problems feed off each other, and failing to address them makes it impossible to build a solution where it’s easy for smart people to talk to other smart people.

Hired

Hired is a technical recruiting solution that probably everyone has heard of. Hired has gone through a few evolutions, but when it started, it was called DeveloperAuction and only accepted candidates from MIT/Google/Facebook/Stanford, etc., allowing companies to “bid” on engineers before ever interviewing them.

DeveloperAuction’s goal, as I understand it, was to align hiring with market forces. Candidates have more leverage in the market (as we discussed earlier), so DeveloperAuction decided to call a spade a spade and actually give candidates that power by literally having the “weaker” party place bids.

Hired’s/DeveloperAuction’s approach ran into both of the aforementioned roadblocks: lack of credentialing and lack of candidate autonomy.

Lack of credentialing was problematic because: 1) pedigree isn’t a reliable indicator of performance in the first place, and 2) once Hired exhausted their initial Stanford et al. pool, they didn’t have a reliable means of credentialing to separate the good candidates from the bad. In addition, companies weren’t reliably bidding on all candidates, concentrating instead on a small pool of candidates who were often not interested in the companies that were bidding on them.


Hired tried asking job seekers to take quizzes, but of course only the people who needed to take them actually did so, which meant mostly juniors, folks with visa constraints, or people who didn’t look good on paper (some of whom were of course diamonds, but that’s not enough to build a business around).

The second problem, lack of candidate autonomy, came to light because pretty quickly Hired realized that (due to lack of meaningful credentialing) companies typically bid on the same ten people, and those ten people, because of how many options they had, weren’t interested in most companies. To mitigate these issues, Hired brought on an army of “talent advocates” (read: recruiters) whose job it was to prime the pump and ensure that companies were bidding on the “right” candidates behind the scenes.

Now, as a candidate, not only do you have to talk to an in-house recruiter, but you have to talk to Hired’s recruiter before that!

Of course, employing an army of recruiters makes achieving SaaS margins impossible, and, before you know it, you’re basically a tech-enabled recruiting firm.

Eventually, Hired moved away from the auction marketplace model, fired most of their talent advocates, and fulfilled their destiny, becoming a glorified LinkedIn Recruiter clone. Recruiters could search for candidates, just like on LinkedIn Recruiter, based on their pedigree, languages they knew, etc. Being a LinkedIn Recruiter clone is a business with better margins, and it’s a model that makes the middlemen who hold the purse strings — the in-house recruiters — feel empowered, which perpetuates the market inefficiency we identified earlier.

Triplebyte

Triplebyte’s story started out completely differently. They wisely rejected pedigree as a viable means of credentialing and adopted the admirable mission of democratizing access to opportunity in software engineering.

As Triplebyte was a YC company, they started out as “the common application for engineers who wanted to work at a YC startup.” To ensure that candidate quality was high, Triplebyte came up with a two-step vetting process: the first step was a coding quiz anyone could take. If you did well on the quiz, you then did a lengthy technical interview with one of Triplebyte’s contractors. Triplebyte brought on a number of engineers to conduct these interviews, and they also created their own canonical technical interview that every candidate had to complete. If you did well in the interview, regardless of how you looked on paper, you got fast-tracked to onsites at Triplebyte’s customer companies.

Triplebyte, to my mind, did an admirable job of trying to solve the credentialing problem. But their approach was not without shortcomings: 1) not everyone wanted to take their lengthy quiz, even though the quiz was well done, and 2) scaling up an army of interviewers, all of whom had to be trained in exactly the same way, was non-trivial and not cheap. These challenges were surmountable; the one that wasn’t arose from the second issue all of these companies had to face: lack of candidate autonomy, driven in part by a lack of faith in the credential.

Once you passed Triplebyte’s assessment process, just like at Hired, you had to interact with a middleman (Triplebyte called theirs “talent managers,” but again, they’re just recruiters). The talent manager would examine your background and short-list you for some companies of their choosing, where you’d then go onsite. You could have some input into which companies you spoke to, but it was limited, and if you didn’t meet a company’s (often somewhat arbitrary) criteria, no matter how well you did on the assessment, its doors were closed to you.

As with Hired, having an army of recruiters AND interviewers working for you makes achieving SaaS margins impossible, and before long, you’ve essentially become a tech-enabled recruiting firm (albeit this time one with much better performance data!).

Just like Hired, Triplebyte eventually moved away from their original marketplace model, fired most of their talent managers and interviewers, and fulfilled their destiny, becoming a glorified LinkedIn Recruiter clone. Recruiters could search for candidates, just like on LinkedIn Recruiter, based on their pedigree, languages known, and so forth. One thing Triplebyte still does differently, however, is to leverage their aforementioned coding quiz to annotate candidate profiles. (Presumably, they are using their historical interview data, from when candidates had to do BOTH, to predict how people will perform in interviews.) The limitation, of course, is that great people will be unlikely to take the quiz in the first place, especially now that it no longer fast-tracks them to an onsite.

AngelList

My last example is AngelList. AngelList is a bit different from Triplebyte and Hired because they have not fully succumbed to becoming a search aggregator, probably because AngelList’s main revenue driver isn’t their recruiting business but their angel investment marketplace.

The big difference between AngelList and the others is that AngelList does give candidates autonomy — you can apply to any company of your choosing — but without the credentialing piece in place, it’s toothless. As with other inbound channels, companies have learned to ignore AngelList applications because they’re noisy and full of candidates who don’t have much leverage (again, juniors/bootcamp grads and people requiring visa sponsorship, for example).

Currently, AngelList has some early attempts at credentialing in place, mostly quizzes, and the people who don’t need to take them self-select out. Because their credentialing lacks weight, when a candidate reaches out to a company on AngelList, they aren’t fast-tracked. It’s just like applying via an online job board.

When I was doing research for this piece, one quote about AngelList stuck out (specifically, it was about the now-defunct A-List offering), speaking poignantly to the importance of removing the barriers between candidates and companies during the hiring process. It’s nice when it’s easy for engineers to talk to engineers.

How to fix hiring

As you can see, historically, recruiting solutions have been plagued by two limitations: lack of credentialing and lack of candidate autonomy (and the middlemen that result). What happens if you build something free of those restrictions?

In this beautiful world, once I’ve established that I’m smart and can get shit done, companies’ doors are wide open to me. Imagine this. I scroll through a list of employers, pick one that I’m interested in, bypass all the bullshit that typically happens at the top of the funnel — the scheduling, tedious recruiter calls, resume reviews — and just get to talk to an engineer at that company. Maybe I’m not sure I want to work there yet, and that’s fine. But this way, I get signal about whether I want to, in a way that talking to a recruiter or reading a job description simply can’t replicate.

And I don’t have to wait for recruiters to find me on LinkedIn or in Triplebyte or some other search aggregator, then contact me, all while hoping that the chaotic universe somehow delivers a reachout from a recruiter at a company I actually want to work for at the right time. Or that the recruiter who reached out to me when I wasn’t looking a year ago still works there. But odds are they don’t.

There’s no substitute for chemistry, in dating or in hiring. All the carefully crafted job descriptions in the world pale in comparison to talking shop with someone on the team. That’s how it should be.

In this new world, even though there’s still an engineering shortage, with the advent of persistent, meaningful credentialing and candidate autonomy — the two things whose absence has plagued hiring solutions to date — can we now overturn the two “axioms” that have so hampered recruiting? Here they are again:

  1. Thou shalt not engage with active candidates. After all, in this market, strong candidates aren’t looking. Good recruiters build relationships so that when a good candidate does decide to enter the market, the recruiter is there, behind the next doorway, ready to spring!
  2. Engineering time is expensive, so it’s critical to do as much top-of-funnel filtering as possible to make sure that it’s spent on the right candidates.

Let’s look at #1 first. Does this still hold? No! Now, when candidates are active, they can just approach companies they’re interested in. No more skulking and waiting and guessing and writing software that tries to predict when passive candidates are on the move.

How about #2? Nope. Now that candidates come vetted, putting up artificial gates to save eng time doesn’t make sense anymore. If anything, putting up gates would now be an antipattern because it delays the sell — the sooner you can get an engineer talking to an engineer about The Work, the better.

Now, as a candidate I’m actually in the driver’s seat. I can talk to companies I’m interested in learning more about, at any time, without all the hurdles and frustrations that define the current process.

Why do I feel so strongly about all of this? It’s exactly what we’re working on at interviewing.io. At this point, you might roll your eyes and say, what a cheap plug. Or wonder why we think we can succeed where so many others have failed. But, before you do, think on this. I’ve spent the last five years of my life dedicated to bringing this vision of hiring to fruition because I believe, in my heart of hearts, that this is the only way hiring can be both efficient and fair. If I didn’t talk about the thing I made that addresses this very problem, and if I didn’t believe in it to the point of fanaticism, I’d be a raging hypocrite. And if I weren’t proud enough of what my team and I have achieved to promote it, then I’d be doing everything wrong.

Or, hell, we could be completely mistaken about our approach… being contrarian doesn’t necessarily make you right in the long run. Maybe devolving into LinkedIn Recruiter is the only way. But, look, I really hope not.

In any event, on interviewing.io, once you’ve built up your reputation by doing mock interviews (persistent, meaningful credentialing), you can look through a list of companies, and regardless of who you are or how you look on paper, you can book an interview with an engineer at that company as early as the next day (candidate autonomy and a direct line to an engineer).

You can pick companies you’d like to talk to…
…and grab a time slot that works for you. There’s an engineer on the other end. No recruiters or resumes.

It feels like magic when it works, and 40% of our hires have been people whose resumes you probably wouldn’t pick out of a lineup (in fact, many had been rejected due to their resume by the very same company that eventually hired them through us). The remaining 60% of our hires are people who look great on paper but were fed up with the Kafkaesque dog and pony show that traditional hiring has become.

Of course, one of the limitations of our approach is that we’re getting performance data about engineers from the practice interviews they do on our platform. Any seasoned interviewer will tell you that the signal one gets from a technical interview isn’t the whole story. A candidate’s performance can oscillate from interview to interview, some candidates are less familiar with the format, the system can be gamed by memorizing Leetcode problems, and so on. But it’s a start, and we’ve found that data in aggregate (performance in at least three interviews) is much more predictive than a single data point. Interview performance aside, we hope to build a corpus of data about people that goes beyond how they do in interviews and also tracks how they perform on the job.
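To see why aggregation helps, here’s a toy simulation. The ability scale, noise level, and hiring bar are made-up assumptions for illustration, not numbers from our platform: if each interview is a noisy read on the same underlying ability, averaging three reads shrinks the noise by a factor of √3 (about 1.7).

```python
import random

# Toy model (illustrative assumptions, not real platform data):
# a candidate has a fixed underlying ability, and each interview
# score is that ability plus independent Gaussian noise.
random.seed(0)

TRUE_ABILITY = 0.70  # hypothetical candidate, on a 0-1 scale
NOISE_SD = 0.15      # assumed per-interview noise
BAR = 0.60           # hypothetical hiring bar
TRIALS = 100_000

def interview_score():
    return TRUE_ABILITY + random.gauss(0, NOISE_SD)

# How often does a candidate whose true ability clears the bar
# get misclassified as below it?
one_shot = sum(interview_score() < BAR for _ in range(TRIALS)) / TRIALS
avg_of_3 = sum(
    (interview_score() + interview_score() + interview_score()) / 3 < BAR
    for _ in range(TRIALS)
) / TRIALS

print(f"misclassified on a single interview: {one_shot:.1%}")  # ~25%
print(f"misclassified on the average of 3:   {avg_of_3:.1%}")  # ~12%
```

With these made-up numbers, one interview sinks this candidate about a quarter of the time; averaging three interviews roughly halves that. That’s the intuition behind requiring multiple data points, not a claim about our actual error rates.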

All I want is a world where it’s easy for smart people to talk to each other. That’s the world we’re trying to build at interviewing.io.

  1. From what you’ve read up until this point, you might think that I hate recruiters and find them useless. Not so, dear reader! I hate bad recruiters. And, unfortunately, most of them are bad. What’s sad is that the good ones, instead of spending time on tasks for which they’re uniquely qualified and well-suited, are instead stuck at the top of the funnel sourcing engineers whose qualifications they don’t have the domain expertise to evaluate and selling them on roles they don’t have the domain expertise to describe. The best recruiters I’ve worked with are singularly amazing at shepherding candidates through the process, tirelessly stewarding a company’s employer brand, advising hiring managers on the best ways to close, keeping an analytical eye on the funnel to identify issues before they even arise, and much more. 
  2. If we add in time to review resumes, it’s an extra five hours (at most). 
  3. I’ve done a lot of the work around resumes/pedigree not being a useful predictor. Here are some of my favorite pieces: Lessons from a year’s worth of hiring data, Lessons from 3,000 technical interviews… or how what you do after graduation matters way more than where you went to school, and Resumes suck. Here’s the data. But I’m not the only one: Google, too, has found that GPA isn’t predictive of on-the-job success. 

The unvarnished, unbundled guide to hiring tools

I get a lot of questions about which hiring tools do what and how they’re different from each other, so I decided to draw an ugly, yet handy, picture (see below).

By the way, the reason this post has “unbundled” in the title is that many hiring tools, in part because we’re all on the VC funding treadmill, aspire to be more than they are and to ultimately be the one ring that rules them all, all the way from sourcing to interviewing to reference checks to onboarding to god knows what. So far, these attempts at grand unification, much like communism, have not panned out in practice. Though most tools claim to do more, they do those extra things badly 1 and will still try to upsell you on how they can solve all your hiring needs. Therefore, in the picture above, I’ve chosen the primary use case for each tool, i.e. the use case that each tool has actually gotten traction for (and the use case that I believe they’re actually good at).

And one last thing. I run interviewing.io, which means that my take on other tools is unavoidably biased. But, hell, it’s my blog, and I can say whatever I want… and having been in this industry for almost 10 years as an engineer and later as both an in-house and an agency recruiter, and having spent the past 5 years running a successful hiring marketplace, I have acquired my prejudices the honest way, through laboratory experience. 2

All that said, a few of the choices I made in the picture probably require some explanation, so here goes…

I thought HackerRank also had an interview tool. 
Yep, they do; it’s called CodePair. However, last I checked, in order to use it, you also had to buy other things, i.e. you can’t just use CodePair for technical interviews without buying into the broader HackerRank ecosystem. And though CoderPad isn’t paying me for this, I think it’s a superior tool on many levels (and the one we actively chose to use inside of interviewing.io, forsaking all others). CodeSignal also has their own tool, and Codility does as well, but that’s kind of the point of this post: tools unbundled. I’m listing the tools that have the differentiator in question as their core competency, not as an add-on on some enterprise checklist.

Why is Triplebyte in the middle of the sourcing section?
Triplebyte recently pivoted to a new model. Instead of interviewing their candidates before sending them to customers, they now rely on a quiz, the results of which they use to annotate candidate profiles that recruiters can source from. Before that, they had their own interviewers conduct technical interviews with candidates and also had their talent managers match candidates to companies.

I thought Hired, AngelList, and LinkedIn had some kind of skills assessment/technical vetting?
They do, but last I checked, it was an asynchronous test (rather than a human interviewer). In my opinion, asynchronous skills testing on these platforms has some value, but it’s quite limited for a few reasons:

  • Skills testing is optional for candidates on these platforms, which means that 1) not all profiles are vetted and 2) there’s a strong selection bias: the candidates who take the tests tend to be the ones without leverage, e.g. juniors and people who need visa sponsorship
  • They’re much easier to cheat on
  • And of course asynchronous tests are lower fidelity than human interviewers (or at least the ones I’ve seen to date… but I want to be proven wrong)

I thought AngelList had interested candidates come to you?
They do. But like any jobs board, it’ll be noisy and probably full of candidates who don’t have much leverage, e.g. juniors/bootcamp grads and people requiring visa sponsorship.

Should I use take-home tests in my process?
As with many of my answers, it’s a matter of leverage. Candidates who have lots of options probably won’t spend time on take-homes. Candidates who don’t, either because they’re junior or because they don’t get a lot of recruiter outreach for other reasons, will.

Why are you so obsessed with leverage?
Because market forces rule everything around me.

Hey, if you do sourcing, why does your company literally have “interviewing” in the name?
The way we get candidates into our ecosystem is by offering them mock interviews. Then top performers from our practice pool can choose to use us for their job search. The name originally was meant to highlight the practice offering, but yeah, it’s confusing.

Is interviewing.io a good way to source candidates?
Yes. Yes it is.

  1. Not all… some do a decent job of this. 
  2. The bit about acquiring your prejudices the honest way is one of my favorite quotes, and credit goes to James Roberge, electrical engineering professor at MIT. 

Thinking about attending a coding bootcamp? Ask them these questions first.

I get a lot of emails from prospective career changers who’ve read my stuff (especially the one in Forbes where I went off about how MS degrees in computer science are snake oil) asking for advice about breaking into software engineering.

Many of them ask about bootcamps. Almost all are surprised by the harsh reality that, though bootcamps can be a perfectly useful and valid start to your career change journey, they are not the magical panacea that they purport to be… and that a true career change is going to take a lot of blood, sweat, and autodidactic tears after graduation for most people.

Why do I believe this? A few reasons. First, I did a Twitter straw poll about post-bootcamp outcomes a while ago, and it was pretty grim:

[Image: Twitter poll on post-bootcamp employment outcomes]

Of course, Twitter straw polls aren’t science or even really data. What *is* data is how bootcamp grads perform in technical interviews. At interviewing.io, we’ve run pilots with most of the reputable programs at one point or another, hoping that we’d be able to place their students. The sad truth is that almost every current bootcamp student who participated in interviewing.io’s mock interview pool failed. To be fair, our audience is usually senior engineers, but interviewers see candidate seniority and do adjust question difficulty. Despite that, the outcomes were not encouraging.

It’s not that the students don’t have potential. It’s that no program I’ve seen dedicates nearly enough time in the curriculum to interview prep. Technical interviews are hard and scary for everyone, even FAANG engineers with 5+ years of experience. The idea that, in 2 weeks (which is how long I’ve heard the most reputable programs spend on technical interviewing), you can teach big O notation to someone who’s never coded before AND get them proficient at writing efficient code and articulating trade-offs is laughable.

So, without further ado, for those of you considering attending a bootcamp, these are the questions you should ask.

  • Do you ask people to leave before graduation if they’re struggling?
  • Does your job placement rate count only graduates, or everyone who enrolled?
  • Is your job placement rate an average of all cohorts or just a few/the best ones?
  • What is considered “getting a job”? Does it include people who are working on contract or part-time? Does it include people who are now TAs or instructors at the bootcamp? Does it include people who found work doing something other than software engineering?
  • What is the median salary of grads (not just the average)? And do those numbers include part-time work/other job titles or just full-time software engineers?
  • For the students who ended up at FAANG, what were their backgrounds? Were they electrical engineering majors? Physics students? What portion of people who ended up at FAANG didn’t know how to code before doing the bootcamp?
  • What portion of your curriculum is dedicated to technical interview prep?
  • How many mock interviews with industry engineers (not peers or instructors) will I get as part of the course?

Let me know if these questions help you in your adventures, brave heroes. Another resource you can use is CIRR — they’ve created a standardized format that bootcamps can use to report outcomes, and you can see results from participating bootcamps for H2 (the second half) of 2018. It’s not everything, but it’s a start.

How to write stuff that gets on the front page of Hacker News

EDIT: This piece did indeed make it to the front page of Hacker News in a meta victory. Thank God.

Hi. My name is Aline, leeny on Hacker News. My team at interviewing.io and I have written a lot of stuff, and most of it has been on the Hacker News front page — of the 30 blog posts I (and later we) have submitted, 27 have been on the front page, and over the last few years, our writing has been read by millions of people.

We wrote most of these… though some are just great content whose authors I underhandedly beat to submitting, feasting on the ill-gotten karma.

Though the first few things I ever wrote were driven by a feckless mix of intuition, terror, and rage (I write a lot about how engineering hiring is unfair and inefficient and broken), over time I began to notice some common threads among my most successful posts, and these realizations have made it easier for me to weep less, write more, pass on the learnings to my team, and create a somewhat repeatable system of content generation.

I’m not altogether unaware that the title of this post has a whiff of hubris about it and merits some amount of disclaiming. I don’t claim that my way of writing is the only way, nor do I claim that it’s going to work forever. Every time I write something and submit it, I ask myself, “Is this it? Is this the one where I find out the formula no longer works?” It’s terrifying and it’s fickle, and I’m beyond grateful to the HN community for reading interviewing.io’s stuff as long as it has.

What makes content sticky?

This list isn’t exhaustive, and surely there are other strategies for crafting sticky content, but I can only speak to the two that have worked well for us. The most effective, in my experience, is to tap into a controversial opinion that your audience already holds and then back it up with data that only you have, confirming their suspicions.

The second strategy is to share something uniquely helpful with your audience that makes them better at some meaningful aspect of their lives.

I use both of these techniques repeatedly, but in my experience, the “controversial with data” technique is way more effective than being “helpful”. More on that later, but first, here’s how to execute on both.

Confirmation bias, cocktail parties, and data

What is confirmation bias? It’s why people enjoy saying “I told you so!” It’s the tendency to interpret new information in a way that reinforces existing beliefs… preferably controversial beliefs that your audience suspects are true (and are probably frustrated about) but can’t definitively back up.

[Image: comic about confirmation bias]

In our case, it was a bunch of aspects of status-quo hiring, stuff like: resumes suck, LinkedIn endorsements are dumb, technical interviews are being done badly and the results aren’t deterministic, and so on and so forth.

So, you take that kernel of frustration, and then you put some data firepower behind it. Find the data that you have that no one else has, and use it to prove that those controversial beliefs do indeed hold water… lighting up the same parts of our brains that make us fall prey to confirmation bias, in other words.1

Another way to say it is that the best content marketing, in my mind, is the stuff that makes people smugly want to repeat it at cocktail parties. I don’t say that with judgment or derision. I derive much of the pleasure in my short, brutish life from being smug and right. It’s not something I’m necessarily proud of, but it’s true.

So, if you have something controversial to say, why does having data matter? Because no one cares what Aline “Dipshit” Lerner thinks about hiring. You and your readers might hold all sorts of controversial opinions about the world, but until you’re really famous, your opinion doesn’t matter more than anyone else’s. But data (especially if it’s proprietary) can elevate an anecdote to gospel. Data provides you with the credibility that nothing else can at this stage — no matter who you are, if you have compelling data, engineers will listen.

The one thing you really have in your favor in these situations is that, because no one knows who you are, the more sophisticated your audience, the more likely they are to take your good content seriously. You don’t have a comms team or a brand to protect; all you have is the unvarnished truth from the trenches.

With the attributes above in mind, think about what cool stuff you’ve discovered by virtue of working at your company. Do you have a data set you can mine for unique insights? Does having operated in your space at depth put you in a position where you can confirm or deny controversial assumptions about some aspect of human nature or our daily lives? If you’re a founder, what unique insight do you have that made you start this company in the first place? If you’re an employee, what part of the mission/vision/execution really resonated with you, to the exclusion of other options you had in the same space? Then, once you’ve identified the right sticky tidbit, it’s up to you to distill it into plain English and then back it up with data… which in practice means some very clear (and maybe pretty… though clear trumps pretty) graphs or visualizations.

It’s tempting to fall into the trap of creating content that tells rather than shows, and the myriad blog posts out there to the tune of “here’s how we run meetings” or “here’s our product process” are proof of that. Typically, posts like this don’t do very well because frankly, no one cares about how you run your processes until you’re a household name that others are trying to emulate. One exception to this rule is if you want to highlight something polarizing you’re doing. In that case, feel free to shout that directly to the world so it’s loud and clear and makes its way most directly to the fringe community you’re targeting. In other words, if you’re really gung ho about TDD, you can write a blog post called “Why we ALWAYS use TDD with no exceptions”, and it’ll do great because of confirmation bias among TDD evangelists, probably the very people you want to target.2

Being helpful

Though, in my experience, the controversial cocktail party technique is the most effective, you can’t always bust out controversy at the drop of a hat, and you might have plenty of useful, interesting things to say that don’t tickle our desire to be right. If you can’t be controversial, then be helpful. Note that “helpful” means giving your readers specific, actionable advice about things that have a big impact on their lives (love, work, sex, health) rather than general worldviews on these topics.

Also, note that being helpful is not nearly as effective as being controversial. Woe is us.

Controversy is more effective than being helpful… here’s the data

And then there’s this post, which has had one maybe-controversial idea so far (namely that making your readers feel smug is what gets you eyeballs and clicks) but no data to speak of. To right that gross injustice, I looked at all the posts my team and I have contributed to Hacker News over the years and tagged each one with two attributes: whether it was controversial or helpful, and whether it had data.

Below is a graph showing the average number of HN upvotes per post type. I looked at whether a given post was helpful or controversial, and for each type, I broke posts into 2 subcategories: whether or not they had data/graphs.

[Plot: average HN upvotes per post, helpful vs. controversial, with and without data]

I refrained from doing any significance testing because teasing apart independence here would have been an unprincipled nightmare. For instance, most of our helpful posts didn’t have data, so whether a post was helpful and whether it had data weren’t independent. That said, there’s probably still something useful to be learned from just looking at the mean upvotes for each category, namely: if you don’t have data, write helpful stuff. It’ll do OK. If you do have data, controversy reigns supreme.
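If you want to run the same tally on your own posts, the bookkeeping is a few lines. The rows below are made-up placeholders, not our actual post data:

```python
from collections import defaultdict

# One (type, has_data, upvotes) tuple per post. These rows are
# made-up placeholders, not our actual post data.
posts = [
    ("controversial", True, 450),
    ("controversial", False, 120),
    ("helpful", True, 200),
    ("helpful", False, 90),
]

totals = defaultdict(lambda: [0, 0])  # key -> [upvote sum, post count]
for post_type, has_data, upvotes in posts:
    key = (post_type, "with data" if has_data else "no data")
    totals[key][0] += upvotes
    totals[key][1] += 1

for (post_type, data_label), (total, count) in sorted(totals.items()):
    print(f"{post_type:>13} / {data_label:<9}: {total / count:.0f} avg upvotes")
```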

Examples of good content marketing

It’s easy to wax general, and I don’t think this post is going to be helpful without some examples. Here’s some stellar writing that falls into the categories above.

Examples of controversial, data-driven content marketing

For me, the canonical, original, great data-driven posts all live in the OKCupid blog, which served as the lodestar of what good blogging could be. These days, the original posts are buried in a cave where no site nav breadcrumbs will go (they’ve been replaced by a sad facsimile of what they used to be, utterly inoffensive, bland, and humorless), and I had to google to find them. But, you know, gems like these:

  • OKCupid – The lies people tell in online dating where the controversial idea is that people really do lie a lot in online dating (this was controversial in 2010 back when it was socially appropriate to be embarrassed that you were dating online)
  • OKCupid – The case for an older woman where the controversial idea is that women over 30 are viable (very sad for me that this is controversial)
  • Uber early blog – Rides of glory where the controversial idea is that you can guess which of your users are having sex based on their ride usage data (this post was what introduced me to Uber and I expect helped meaningfully build their brand… there’s a reason it’s no longer up and I had to link to the web archive)
  • Priceonomics – The San Francisco drug economy where the controversial idea is that it’s very lucrative (and not very hard) to be a drug dealer in San Francisco, and many users are in tech

But… do posts HAVE to hit a nerve and make some portion of the population uncomfortable? Though those tend to be the most fun, this isn’t necessary to produce great content. Hiring is typically a much more tame subject than sex, but it’s possible to write controversial things about it — I’d be remiss if I didn’t link you to some of the things we’ve written. Here are a few favorites that exemplify our take on controversial, data-driven blogging:

Examples of helpful content marketing

As we discussed earlier, not all good content marketing falls into this controversial-anecdote-backed-up-with-data format. Some successful posts just have really useful content that makes you better at some meaningful part of your life.

And of course, a few of ours:

So, we have some theory about content marketing, and we have some practical examples. What now? Here’s one last controversial idea for you: drinking a little might make you a more prolific writer.

How to actually make yourself write

The “cocktail party anecdote backed by data” premise is reliable and repeatable and it works, and I expect that, as you read this, you already have some ideas about topics you could write about. Ideas are the easy part, however. How do you actually summon the wherewithal to write?

Before he was a hipster text editor, Ernest Hemingway was a churlish, surly alcoholic writer with an allergy to adverbs who coined the phrase “Write drunk, edit sober” and changed my life and liver forever.

When I was maybe halfway through writing Lessons from a year’s worth of hiring data (the first successful post I ever had), I hit what felt like an insurmountable wall. I had already spent months manually counting typos in resumes, had run a logistic regression and a bunch of statistical tests, and was pretty sure that I was onto something — the number of typos and grammatical errors in one’s resume appeared to matter way more than where someone worked or went to school. And there were other surprising findings, too. But when I tried to get the words out, they wouldn’t come. The typos thing was super cool, right? And surely a better, more competent human would do that finding justice when writing about it. In my hands, the work read like an insipid, stilted school assignment. I drew the blinds and sort of curled up in a ball on the ground for I don’t know how long… eventually my ex-husband and his friend who was visiting came home, a few beers in, and peeled me off the floor.

I don’t remember what the two of them said to me exactly, but my brain put it away in memory as something along the lines of, “Stop being dramatic and get out of your head and drink with us, for life is short and brutish.”

So I drank. And maybe then we had a dance party or something… I don’t know. But at some magical, serendipitous moment, Florence + The Machine came on. And I sat back down at my computer and started working myself into a frenzy to the tune of the music… “Hiring isn’t fair, the world isn’t fair… hiring isn’t fair, and the world isn’t fair, and fuck the fact that everyone uses resumes and rejects all manner of good people even though they’re clearly a crock because typos matter 50 kajillion times more than pedigree.”

And in that slightly drunken, fevered frenzy I wrote the rest of the post. It ended up getting cut in half or more by friends who were kind enough to extract a few cogent bits from whatever it was that I produced. The writing in that post isn’t the best, but it’s ok… and it was good enough to get the payload about typos (and generally about how dumb resumes are) across clearly, which is ultimately what mattered most.

Why does wine help me write (please see the footnote before you unleash your wrath)?3 Because, for a brief hour or so, it stills the inexorable pull of self-editing and silences the voices that tell you you’re a piece of shit who can’t write worth a damn. Now, you might still be a piece of shit who can’t write worth a damn, but you’ll never become a piece of shit who can write unless you actually write.

Once the voices are quiet, you can get out whatever is in your head. It doesn’t have to make sense, it doesn’t have to be ordered or flow, and it doesn’t have to be the most important takeaway you anticipate your post will ultimately have. Whatever it is will be raw and real… and then you (and your friends or coworkers if you’re lucky) will prune the drivel and mold it into something good.

So drink your wine (or don’t drink… just do whatever gets you in a good place) and put on whatever music fills your heart with rage (or inspiration if you’re not a broken human like me), and get to it. And do it again and again, until the ritual itself is what gives you comfort and lets you produce.

But, friends, be warned: please do your data analysis sober.

 

1 The folks at Priceonomics succinctly identified “confirmation bias” as the right term for this technique; I first heard it at a workshop they ran, after having used the approach for years without a name for it. They do great work around content marketing and have made a business out of harnessing confirmation bias and data. They’ve also written a much lengthier guide to what makes good content than this post.

2 I have no idea why I picked TDD for this example. I do not have strong feelings about TDD, and there are probably way more controversial things out there… like using JavaScript in server-side production.

3 I’m probably going to catch a lot of flak and vitriol for encouraging drinking. Look, it works for me. It doesn’t have to work for you, and it might be really bad for you in particular because of some unserendipitous mix of genetics and past decisions. So, instead of drinking, let’s use alcohol as metonymy for any number of activities that quiet the voices in the head and let you focus. I hear that among the well-adjusted, meditation is all the rage, as is physical exercise. For those of us on the fringe, we drink in the dark.

Diversity quotas suck. Here’s why.

A few days ago, I contributed to a roundtable discussion-style post about diversity quotas (that is, setting specific hiring targets around race and gender) on the Key Values blog. Writing my bit there was a good forcing function for exploring the issue of diversity quotas at a bit more length… and if I’m honest, this is a topic I’ve had really strong opinions about for a while but haven’t had the chance to distill. So, here goes.

I think it’s important to ask ourselves what we want to accomplish with diversity quotas in the first place. Are we trying to level the playing field for marginalized groups? To bring in the requisite diversity of thought that correlates so strongly with a better bottom line? Or to improve our optics so that when the press writes about our company’s diversity numbers, we look good? Unless diversity quotas are truly an exercise in optics, I firmly believe that, in the best case, they’re a band-aid that fails to solve deep, underlying problems with hiring and that, in the worst case, they do more harm than good by keeping us complacent about finding better solutions, and paradoxically, by undermining the very movement they’re meant to help. Instead of trying to manage outcomes by focusing on quotas, we should target root causes and create the kind of hiring process that will, by virtue of being fair and inclusive, bring about the diversity outcomes we want.

Why are quotas bad? If it’s not just about optics, and we are indeed trying to level the playing field for marginalized groups, let’s pretend for a moment that quotas work perfectly and bring us all the desired results. Even in that perfect world, we have to ask ourselves if we did the right thing. Any discussion about leveling the playing field for marginalized groups should not just be about race but should also include socioeconomic status. And age. And a myriad of other marginalized groups in tech.

We often focus on race and gender because those are relatively easy to spot. Socioeconomic status is harder because you can’t tell how someone grew up, and you can’t really ask “Hey, were your parents poor?” on an application form. Age is a bit easier to spot (especially if you spent your 20s lying around in the sun like I did), but it’s illegal to ask about age in job interviews… to prevent discrimination! Surely, that’s a contradiction in terms. So, if we’re leaving out socioeconomic status and age and a whole bunch of other traits when we assign quotas, are we really leveling the playing field? Or are we creating more problems?

One of the downsides of diversity quotas is the tokenization of candidates, which often manifests as stereotype threat, one of the very things we’re trying to prevent. I can’t tell you how many times people have asked me if I thought I got into MIT because I’m a girl. That feels like shit… in large part because I DON’T KNOW if I got into MIT because I’m a girl. Stereotype threat is a real thing that very clearly makes people underperform at their jobs… and then creates a vicious cycle where the groups we’re trying to help end up being tokenized and scrutinized for underperformance caused by the very thing that’s supposed to be helping them.

So, what about diversity of thought? If you’re really going after candidates who can bring fresh perspectives to the table, their lived experience should trump their gender and ethnicity (though of course, those can correlate heavily). If you’re really after diversity of thought, then educational background/pedigree and previous work experience should weigh just as heavily. Before I became a software engineer, I spent 3 years cooking professionally. Seeing how hiring happened in a completely different field (spoiler: it’s a lot fairer) shaped my views on how hiring should be done within tech. And look, if you put a gun to my head and asked me, given absolutely identical abilities to do the job, whether I should hire a woman who came from an affluent background, aced her SATs because of access to a stellar prep program and supportive parents, went to a top school, and interned at a top tech company over a man who dropped out of high school, worked a bunch of odd jobs, taught himself to code, and had the grit to end up with the requisite skills… I’ll take the man.1

But I’ll also feel shitty about it, because I don’t think I should have to make choices like this in the first place. And the fact that I have to is what’s broken. In other words, quotas don’t work from either a moral perspective or a practical one. At best, they’re a band-aid covering up the fact that your hiring process sucks, and the real culprit is the unspoken axiom that the way we’re doing hiring is basically fine. I’ve already written at length about how engineering hiring and interviewing needs to change to support diversity initiatives, so I won’t do it here. The gist is that fixing hiring is way harder than instituting quotas, but low-hanging fruit aren’t going to get us to a place of equal opportunity. Better screening and investments in education will. At interviewing.io, because we rely entirely on performance in anonymous technical interviews rather than resumes to surface top-performing candidates, 40% of the hires we’ve made for our customers are people from non-traditional backgrounds and underrepresented groups (and sometimes these are candidates that the same companies had previously rejected based on their resumes). The companies we’ve hired for have benefitted from access to these candidates, and they’ve been willing to undergo the systemic process change and long-term thinking that effecting this level of change requires. We know our approach works. It’s hard, and it takes time and effort, but it works.


1 There was a recent New York Times piece about how “diversity of thought” is an excuse that lets us be lazy about working to hire people from underrepresented groups. I believe that the kind of “root cause” approach we’re advocating, where we invest in long-term education and create a fairer hiring process, is significantly harder than doing something like quotas.