False Negatives [Steve Yegge]

Yegge talks interviewing
Watch Steve Yegge's podcast: https://www.youtube.com/watch?v=7GurMGEDHUY


Transcript

[00:00:00] So this week we've been going through Steve Yegge's podcasts and his greatest hits: his updated perspectives on the big clouds, what they're doing right and what they're doing wrong. But the other thing that Steve is really well known for is his views on tech interviewing. And he's done big tech interviews, quite a lot of them.

And we all know they're broken in some way, but this is a very stark reminder of how broken they are. There are two anecdotes here I want you to look out for. The first is the one on Jeff Dean, just look out for that name. The second is the one on them reviewing their own packets and applying too high a bar, saying too many nos. There's a lot of false negatives in the industry.

Both false negatives and false positives are a problem, of course, and he suggests some ways to handle them. But overall, I just think we deserve some reminder of how flawed it is when we do our own interviewing. I thought I had a bad run of it doing two interviews a week. He did multiple a day, sometimes three at once. And I just think this is a fantastic story to go over.

So the thing about interviewing is, it's a terrible signal. It's better than a phone screen, and a phone screen is better than a resume screen. If you just look at someone's resume, how sure are you that they're good? I mean, in any discipline, right? Say you want an airline pilot, and you look at the resume.

Will you just hire them based on the resume? Not usually. So the resume is your first filter. It's the first thing where you basically take a stack of resumes, and there's an art to reviewing resumes and looking for people that are kind of trying to cover up things that they may not know.

And they don't want you to know that they don't know, so they try to cover it up in their resume. So you can look for weasel words, and all kinds of things like that, but basically you're taking the resumes and sorting them into two piles, right? The keeps and the don't-keeps. And there's of course the old running joke in the industry about how you want to take some resumes and just throw them in the trash can, because you don't want to hire unlucky people.

And so if you throw one in the trash can, that person was unlucky. But you do sort the resumes into the ones you're going to follow up on and the ones where you just say pass. So writing a resume is really important, and part of a book on passing technical interviews would be how to write a great resume. And this comes up again when you're writing your so-called resume of what you've accomplished at your company

when it's time to get promoted. So the art of resume writing never gets old. It never leaves you, and it is always an important part of your career: being able to represent yourself. But that's just step one, and it's a bad filter. You don't want to just base your decision on a resume.

Would you marry somebody based on their resume? Maybe, but probably you'd want to meet them first, right? So the next step is a phone screen, and everybody hates doing phone screens. I actually love doing phone screens. I, for some reason, have never really had an issue with them unless there's a bad connection or something. But a lot of people just hate talking on the phone, and they hate even more having to ask people technical questions on the phone.

So I often got stuck with phone screen duty at every company that I ever worked at. Because you can actually do a pretty good job, not a great job, but a pretty good job of predicting whether they're going to pass their interviews based on a phone screen. My phone screens would go for two hours if necessary, to get a comprehensive look at what this candidate is good at, because the general rule is, the longer you spend evaluating somebody, the better

idea you're going to have of whether they're going to work out long-term. Just like the longer you have a relationship with somebody before you decide whether to marry them, the better you're going to know how that marriage is going to go, most likely. There is a point of diminishing returns, and we'll talk about that.

But by and large, the amount of vetting that we do in the industry today is nowhere near enough. And I'm going to talk about the consequences of that, and how we arrived at that conclusion, and so on in this talk. But at a high level, I don't believe in interviewing anymore. I'm a strong skeptic.

I think that interviewing is so flawed that any company that really wants to get ahead of their competitors and succeed needs to spend some time reinventing their interview process, and probably having people spend more time with candidates than they're spending today. It's just not a very good signal.

And I said that at Google once. I said it in an email, replied on some public thread somewhere, in the early days, maybe 2008. And [00:05:00] some director got mad at me and said, oh, we didn't like that, we didn't like that you said, on the record, that you're a skeptic of the interview process.

We were talking about a company that hires scientists. We're talking about a company where one of their mottos is speak truth to authority. And this director was an ass, and he got what was coming to him eventually. At the time, you know, he was just like, well, everybody's upset because you're questioning the sacred interview process.

You farted in church, is what he told me. And so I haven't really been able to tell people this for my entire career, because they feel that it's undermining their ability to attract the best, I guess. But the reality is, if you marry somebody after dating them for four hours, you're probably going to get a surprise.

Maybe it's a good surprise, but most surprises are not so good in that department. And interviewing is the same way. So if you're going to keep your interview panels the exact same way that they've been run since Silicon Valley was invented by that asshole Shockley, then you're going to need a better process for getting rid of people who are no good.

You're going to need to double down on your process for managing people out. That's actually how Amazon gets by and gets such great results: they aggressively manage out underperformance, because they know that underperformers are going to sneak in. And it's because the interview process is flawed.

So it's just a best effort. The problem with the interview process is that it takes a lot of time. It's really miserable for engineers to do more than two or three interviews per week, and most companies try to cap it so that you're not talking to more than maybe two people per week, or three if they're really busy, because it takes

an hour out of your day to interview the person. And you may have interview pre-briefs, where everybody gets together and maybe divides up what people are going to talk about. It's not recommended at some companies, but some companies do it anyway. And then you may have a post-brief, where everyone gets together and discusses the candidate afterwards.

Also not recommended. I could do a whole segment on Google's interview process and how it gets away from a lot of biases and little kids' soccer team situations, where one person says, well, I didn't like him, and everyone else goes, well, I didn't either, because they don't want to rock the boat, and all kinds of bad things can happen.

So we can talk about that, but that's not what today is about. Today it's just about interviewing in general. Because what they found, and this is a bit surprising, is that, well, Google hires a lot of statisticians. We're talking about people who are world-class experts in their field. And since Google has this surfeit of statisticians, they can apply statistical methods to a variety of problem spaces.

Google gets like a million-plus resumes per year, and somebody has to go through all those resumes. It took me five applications to Google, over the course of a year, before they finally noticed my resume. I sent in one resume, and then a couple of months later I sent in another resume, and nothing came of it, and this went on until my fifth resume came in to Google and somebody looked at it and said, oh, we've got to get you in here for an interview.

And I was like, great, you know, you could have done that like a year ago, before the IPO. But whatever. So they get a million resumes a year. That's a lot of data, a lot of statistical data. They do a lot of interviews, they do a lot of phone screens, and they add a lot of follow-up data, like performance data, on everybody who got accepted.

And so they can basically run a bunch of statistics and maybe even do experiments. They can set up special interview loops where they set up conditions and control groups, and kind of try and see what happens when they try this or that to influence the interview process. Okay, sounds pretty interesting.

Right. And you probably think that they came up with really useful insights on how to conduct better interviews. And the number one insight that they came up with, and they shared this broadly with everybody, and they were just like, we're sharing this with you, we don't know what to do about it either,

was this: they found that there was zero correlation between your interview performance and your actual performance. Your interviews did not predict your performance. You could get a 4.0, the highest score at Google, from every interviewer and wind up being a very mediocre performer or even a bad performer.
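To put rough numbers on what a finding like that looks like, here is a minimal Python sketch with invented interview scores and performance ratings; none of this is Google's actual data, it just shows that when the two series are unrelated, the correlation coefficient lands near zero.

```python
# Invented illustration (not Google's data): interview scores on the 1.0-4.0
# scale next to later performance ratings for the same hypothetical people.
# The two lists are deliberately unrelated, so r comes out near zero.
from statistics import mean

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

interview_scores    = [3.8, 3.9, 3.2, 4.0, 3.5, 3.1, 3.7, 3.3]
performance_ratings = [3.0, 3.8, 3.1, 2.9, 3.9, 3.2, 2.8, 3.7]

print(f"correlation: {pearson_r(interview_scores, performance_ratings):+.2f}")
# prints roughly -0.17, i.e. no meaningful relationship in this toy data
```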

And you could barely scrape by, by the skin of your teeth, on your fifth application, because, well, I applied five times, but some people interviewed five times. I know one guy who actually got into Google on his fifth interview. He failed the first four, and they make you wait six months. They're like, well, six months is long enough for you to go learn some stuff.

So come back in six months. Six more months. And he did that four times. On the fifth time, okay, after, you know, two and a [00:10:00] half years had gone by, he finally got accepted to Google, and he was a superstar. So you look at that and you say, well, okay, I've got some good anecdotes, but the statistics actually supported that.

Basically your interview gets you in or out: you're either in or you're out, and after that, your performance is just not predictable from the interview. Another thing that they found, which was equally damning, I think, is that no individual interviewer is better than any other interviewer when it comes to predicting whether the person will get hired or not.

And they specifically called out Jeff Dean, who's the number one engineer not only at Google, but arguably in the whole world. And they said even Jeff Dean's interview scores were no better predictor than random of whether the person was going to get an offer. Which was kind of weird, because everybody thinks they're really good at interviewing, and it turns out nobody's especially good at interviewing. The best predictor was if around four people

all decided that they wanted to hire this person. That's when you get the best signal. And once you start piling on more than four, they found that the curve tapered off and you immediately hit diminishing returns. So starting at five interviewers, you're not getting any better information.

And there are companies that I've been at that have put people through, you know, 8, 9, 10, 11, 12 interviews, trying to figure out, is this person good or not? And they're just not getting any better information after the first four. That's the stark reality. And all these statistics basically backed up something that I had been fretting over and worrying about ever since I joined Google.
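To see the shape of that diminishing-returns curve, here is a toy Monte Carlo sketch under assumed numbers; the 70% per-interviewer accuracy and the majority-vote rule are illustrative assumptions, not anything taken from Google's studies.

```python
# Toy simulation: each interviewer independently reads a candidate correctly
# with probability P_CORRECT, and the panel goes with the majority vote.
# Watch how the marginal gain from each extra interviewer shrinks.
import random

random.seed(0)
P_CORRECT = 0.70     # assumed accuracy of a single interviewer (illustrative)
TRIALS = 50_000      # simulated candidates per panel size

def panel_accuracy(panel_size: int) -> float:
    correct = 0
    for _ in range(TRIALS):
        votes = sum(random.random() < P_CORRECT for _ in range(panel_size))
        if votes > panel_size / 2:   # strict majority gets the call right
            correct += 1
    return correct / TRIALS

for size in (1, 3, 5, 7, 9, 11):     # odd sizes so there are no tied votes
    print(f"{size:2d} interviewers -> {panel_accuracy(size):.3f} correct")
```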

Because when I joined Google, of course, a bunch of people wanted to follow me there, because I had a reputation. I was the canary. When I left, people were like, well, either the ship is sinking and we need to get off the ship, or Steve's found something that's so much better that even though our current ship is good, this new one is something we've got to get onto.

And I had a reputation for that amongst my circle of friends, of, you know, a hundred-plus people. And so a bunch of them tried to get in, and they were good. Some of them were better than I was, and some of them were very clearly better versed in computer science than I was. They had contributed more.

They were just better than me, and they didn't get into Google. And I knew right then that there was something seriously, seriously wrong with the process. Now, of course, we run into situations where, you know, I've said it before: why do CEOs fail? They fail because they put their faith in the wrong people.

And I've certainly done that myself in my career in recent times. I mean, it's very easy to do. Sometimes you're putting your faith in a person who might have all the qualifications on paper, but people just don't like them. And there's nothing you can do about it, because if everybody hates you at your work, you're not going to be able to get stuff done, no matter how good you are.

So, you know, it's a problem, because I'm biased. When I say that the people who got turned away should have gotten jobs at Google, and they got turned down by the interview process, you know, I am biased. They're my friends, and in some cases maybe I was wrong about them. But the ones that I'm thinking of went on, in their careers, to become senior principal engineers at Amazon, of which there are probably only 20 or 25 in existence.

And so these people are not slouches. They basically, you know, helped build the Amazon that you use today, and they didn't get jobs. So I guess one takeaway is, if you interview at Google, and you interview three or four times at Google, and you don't get the job, that doesn't actually mean that you weren't qualified to work at Google.

It potentially means you were what's called a false negative. A false negative is where you do a test, and the result of the test is no, and it should have been yes. So, for example, a false negative for an illness: you take a test for COVID and it says, no, you don't have it, but you actually do have it, and the test missed it.

That's a false negative. False negatives are, you know, a huge, huge problem in the industry, and they're an especially bad problem at Google. Google probably has a higher false negative rate than maybe anybody else in the industry. Maybe Facebook too, since they got a lot of ex-Googlers, and they brought in a lot of people and kind of copied a lot of Google's interview process.

And so I'm sure Facebook has false negatives too, but those two are going to be at the top. And false negatives really hurt. They really hurt the company, more than companies are willing to acknowledge. They hurt because, first of all, they've lost out on a great candidate. The candidate showed up wanting to come and help the company, and the candidate was great,

but their interview process said, no, not great, so we're not going to hire them.
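For the terminology, a minimal sketch with made-up hiring records shows how false negatives and false positives get counted; every decision and outcome below is invented for illustration.

```python
# Each tuple is (decision the company made, whether the person would actually
# have worked out). A false negative is a "no hire" on someone who would have
# done well; a false positive is a "hire" on someone who would not.
records = [
    ("hire",    True),
    ("no_hire", True),    # false negative: a good candidate turned away
    ("hire",    False),   # false positive: has to be managed out later
    ("no_hire", False),
    ("no_hire", True),    # another false negative, now off to a competitor
    ("hire",    True),
]

false_negatives = sum(1 for decision, good in records if decision == "no_hire" and good)
false_positives = sum(1 for decision, good in records if decision == "hire" and not good)
good_candidates = sum(1 for _, good in records if good)

print(f"{false_negatives} of {good_candidates} good candidates were turned away")
print(f"{false_positives} hires will need to be managed out")
```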

So that's problem number one: opportunity cost. But problem number two is, now that person is bitter and resentful, because they know they're good enough to work at Google and they got turned down. And so now they start to hate Google, and they start to bad-mouth Google, or Microsoft, or whoever

[00:15:00] did this. Every company has false negatives, and every company goes through this problem where they create enemies. Microsoft, actually, in the nineties: I interviewed there once for an internship back when I was very early in college, and I didn't get the internship. And they were real jerks during the interview.

At least Google was kind of nice: stressful, but nice. At Microsoft, they were very arrogant, and they would just sit there and make you struggle for 40 minutes without giving you any hints or indicators or anything. And it was just the way they ran things. And at Microsoft, they basically realized, after 10 years of this,

that they had a problem. They did some studies, and they did some interviews with people out in the industry, and they realized they were creating a vast army of people who hated Microsoft because of their terrible interview process, because of the experience they had. People would be jerked around at the Microsoft interview, and they would leave and go, well, screw that company.

And screw the horse they rode in on. And they'd go tell all their friends about it, and now everybody in the industry is like, well, Microsoft's a bunch of jerks. And this was such a big problem that it escalated all the way up to, you know, the senior executive levels, and it became part of their corporate consciousness, where they were like, oh no, we have an existential problem: we've been screwing people over for a decade, and now they're poisoned against us.

And so they turned it around. They made a huge effort, a huge initiative, to be nice to candidates during their interviews, so that even if they don't get the job, they still feel like they got a fair shot. And that, in turn, started to fix the problem of poisoning the well of candidates. So, you know, all very interesting.

Right? What have we learned so far in my, you know, 20 minutes of talking? This goes by fast. We've learned that interviewing is not a very good signal. There's a lot of false negatives. There are a lot of interviewers out there who think they're really good at it, but the statistics show that they're actually not.

We've learned that if you do more than four interviews on a candidate, you're wasting your time, because four is enough to tell you the right answer. And of course the right answer is, if you're not sure, then no hire. Right? That's what everyone says. Is it, though? I don't know.

I don't know. Maybe if in doubt you do hire them, and then you fire them three months later if they're just really, really out of their league, if they're underperforming. Because really, the way it should work is, if they're close, you lean towards hiring, because interviewing is just so flawed and false negatives are such a huge problem.

Because it's opportunity cost, plus it's poisoning the well, not to mention the third problem with false negatives: in addition to embittering everybody and losing you a great candidate, they go to one of your competitors. So it's a triple whammy right there. Turning away somebody who's actually going to do a good job at your company is really, really bad for your company.

But companies are really scared of false positives. They're really scared that they're going to hire somebody who wasn't qualified. They're terrified of it. They would much rather have false negatives than false positives. And most of them can get away with it. Well, they could up until recently, because there were plenty of engineers out there, and they could just keep sifting through the pile until they got one that's just unambiguously a great candidate.

And then they turn out to be mediocre performers anyway, right? Because the whole thing is just a mess. I don't know, folks, I don't know what to tell you, other than you need to do more than just interviews with somebody to really understand what their contribution is going to be. At the very least, somebody should be tasked with putting together a dossier on the person.

Let me tell you a little bit about one thing that Google does that's kind of cool, that most other companies don't do. It's called the hiring committee, and I was on a hiring committee for a long time. The hiring committee is a group of people that reviews the feedback from interviewers. So at Amazon and most other companies, the interviewers, the people who actually interviewed the candidate, get together afterwards.

Maybe it's immediately afterwards, maybe it's later that day, maybe it's the next day, after it's started to fade. It can get kind of bad when a week goes by and you're like, oh, what did they do again? So at least take good notes after you interview someone. So they get together, and they decide together as a group, based on their experience with the candidate, should they hire them or not?

That's how most companies work, and it's actually terribly biased, because these people have biases. Like the little kids' soccer team I mentioned earlier, right, where the ball shoots out and then the whole team runs after the ball. Somebody will say one thing, and then everybody will just be like, oh yeah, this candidate sucks.

Forget it, forget everything. Or they'll say, oh, I had a really good interview with the person, but don't read too much into it, because of what Joe said. And so they wind up exacerbating the false negatives. And in some cases, you know, they can sneak false positives through that way too, if they really [00:20:00] liked the candidate, but usually it's easier to say no.

And so you get this false negative problem. At Google, what they do is everybody writes up their notes. You write up very detailed notes of what you asked the candidate and what their answers were, and you include, if possible, any source code that they wrote as part of their answers, or any drawings they did, everything.

And you present that, you put it into a system. And then later, a hiring committee meets, like once a week, and they go through all of the interview results, and they say, okay, let's decide which of these people to hire. And it's a really cool process, because now you have no idea who the interviewer is, maybe, although you can get statistics on their past, on their distribution of yeses and nos, or their scores.

Right. And, you know, one thing that they found is, if people give a 2.0, meaning they're not sure, you know, I'm just going to punt, I didn't get a good enough signal in my interview, then that's a useless interview and they're a weak interviewer. You should come out with a yes or a no. That, you know, kind of seems obvious.

And so they can tweak and discount somebody. If somebody always says no, they can say, well, this person always says no, and they said no to this person over here who got hired and is doing great. So they can discount it that way, or they can discount it if the person is too easy and always says yes.

So they have a little bit of leeway there, a little bit of wiggle room, if one of the interviewers is a really strong yes or a really strong no. But basically they don't know what's going on. They're blind. They're just looking at feedback and deciding whether to hire the person or not.
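Here is a hedged sketch of that discounting idea, with invented interviewer names and histories; the weighting rule is purely an assumption for illustration, not Google's actual formula.

```python
# An interviewer whose past votes are almost all "no" (or almost all "yes")
# carries less information, so their vote on today's packet counts for less.

# Hypothetical history: interviewer -> past votes (1 = hire, 0 = no hire).
history = {
    "alice": [1, 0, 1, 0, 1, 0, 1, 0],   # balanced record: most informative
    "bob":   [0, 0, 0, 0, 0, 0, 0, 1],   # says no to nearly everyone
    "carol": [1, 1, 1, 1, 1, 1, 1, 0],   # says yes to nearly everyone
}

def weight(votes):
    """1.0 for a 50/50 voter, approaching 0 for an always-yes or always-no voter."""
    yes_rate = sum(votes) / len(votes)
    return 1.0 - abs(yes_rate - 0.5) * 2

# Today's packet: each interviewer's vote on the candidate under review.
packet_votes = {"alice": 1, "bob": 0, "carol": 1}

weighted = sum(weight(history[name]) * vote for name, vote in packet_votes.items())
total    = sum(weight(history[name]) for name in packet_votes)
print(f"weighted hire signal: {weighted / total:.2f}")   # near 1.0 leans hire, near 0.0 leans no
```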

And it seems like a really, really cool process in principle, because it gets rid of that bias of the individual interviewers biasing each other. There is a problem, though. Okay, the recruiters. You work very, very closely with recruiters during this whole process. Tech recruiters. Recruiters are great.

I always say, be kind to your recruiter. Every day is be-nice-to-your-recruiter day, because they are your partner when it comes to getting great people into the company. The recruiter is the one representing the company. They're going to be the ones sweet-talking the candidate. They're going to be the ones telling the candidate,

oh, I'm so sorry that you didn't get it, but you know, they thought very highly of you, giving the candidate a good experience. Recruiters can get you good resumes. They can help steer people in your direction if your team is hurting. Many, many, many reasons to be good to your recruiters.

And a lot of people just treat them as administrators, and they're just like, whatever, get that recruiter over here. Those people are dumb, and they probably kick puppies. So anyway, you're working with the recruiters, and the recruiters came in to us one day in hiring committee and they said, today, before we review the resumes, we're going to do a little exercise.

Okay. What we're going to do is, we're going to give you some packets to review. These packets have been carefully selected. And what we're going to do, this is sort of a calibration exercise: we're going to try to see if your results match with other results across the company.

They gave us some story, and they gave us a bunch of packets to review. And so we reviewed the packets. What's a packet? A packet is the candidate's resume plus the interview feedback, but they wouldn't show us the resume. They would only show us the interviewers' feedback. So we had to make our decision not based on the resume, but based only on what the interviewers said.

Okay, fair enough. I mean, the interviewers give a lot of detail. The resume almost doesn't matter anymore at that point, because what really matters is, how did they do in the interviews, right? So we went through, and we were allowed to ask questions, and they would go and check the notes and stuff.

But basically we were doing this totally blind, and we had to go in as a group and do our regular thing to decide whether or not to hire each one of these packets. And we ended up rejecting 30 or 35% of them. We passed like 60%, if I remember correctly. It was a pretty hefty number that we turned down. And then the recruiters shared the results.

They shared the secret with us. They let us in on the surprise, which was that the packets we were reviewing were ours. We were reviewing our own interview feedback, and we had decided not to hire one-third of ourselves. And again, it did nothing but disillusion me even further, just leading me to believe that this whole process is just garbage.

It's speed dating. It's me making marriage decisions basically based on speed dating. And to this day, I think that the process is flawed. I understand why they do it. I get it. They do it because it's a compromise. It's a compromise between how much time and resources they're willing to devote to it versus how good of a signal they're going to get.

Because [00:25:00] it's really, really hard, like I said, to participate in lots and lots of interviews. Amazon went through a get-big-fast phase back in the early two thousands, where they were just like, oh my God, we've got to grow like crazy. We've got a bunch of funding, and we've got a bunch of things that we want to get done, and we need a bunch of people fast. Get big fast, GBF. And we're going to interview

like crazy. And so we went through a couple of years at Amazon, I don't know if it's still like this, I hope not, where we had to interview many, many people every day. Not just many per week, but I would get many interviews per day, and we had to stay on top of it. You had to interview the people, you had to come up with interview questions, and you had to write up your feedback, and you had to do that while juggling your regular job, which was already hard.

And then it got to the point where I was starting to get double-booked. They would have me interviewing two different people in different conference rooms at the same time, and I'd be running back and forth, apologizing to the candidates, saying, I'm sorry, I've got to run, I'll be back in five minutes, work on this problem.

And I'd run to the other candidate. And then one time I actually got triple-booked, and I was running between three different rooms interviewing three different candidates. And it was just untenable. It was ridiculous. And it was not a great candidate experience. And it certainly wasn't a great experience for us.

And it went on like that for a few years, because we were just trying so hard to vet people, and that's all we had. So you can try to scale up by just making everybody work really, really hard, but that can lead to some serious burnout and really bad outcomes. And at that point you might as well just throw darts to figure out who you're going to hire.

So, is there a better way? Well, yeah, there actually is a better way.

This is Swyx here. So he goes on to talk about pair interviewing and internships, which I didn't think were very controversial, so I cut them out.

Don't believe too much in your interviewing process. Don't believe in it so much that you require somebody to go through 10 interviews, because it won't help. Don't believe that some of your interviewers are better than others, because actually they're not, and that's been shown again and again, statistically, at Google. They tried year after year.

And don't believe that the interview performance, how well somebody does in an interview, shows anything more than how good they are at interviewing. That's what we learned. We learned that if you do really well in an interview, then you can conclude beyond a shadow of a doubt that this person is really good at interviewing, but they may not be any good at actually working and getting stuff done.

So do whatever you can to try to improve the quality of that signal. If you're on the interviewing side, you know, try to get two people into the room and do pair interviews, and see how it works for your company and for your candidates. Because you're going to find that

it's going to fix a lot of problems with bad interviewers. And do your best to try to understand what the person's actual work is like. Go look at their GitHub, go look at any projects that they've got, where you can actually see their work. This is the dossier I started talking about earlier,

where you can see what their contributions are. Like I said, that's what got me my job at Google: my contributions, in the form of what I did for my computer game, were enough to serve as a tiebreaker and get me another round of interviews. So do that. Do try to get a complete picture of the candidate, because, you know, your four speed dates are not going to be enough.

And then finally, maybe try to turn the knob a little bit. Remember how I said we didn't hire ourselves, how 30% of us we decided not to hire? You know, try to turn the knob a little bit, to be a little bit more forgiving, and then have a process by which you tell the candidate, you're in an evaluation period.

And if after three months or six months, you know, you're not performing up to snuff, we're going to down-level you, or we're going to find another role for you, or, unfortunately, we're going to have to find a way to get you a job at another company. Basically, give them an opportunity. However HR and recruiters say this, they have magic ways of saying it, and don't ever let an engineer tell a person that they have to go,

because they'll do a terrible job, like I just did. But have a process for managing out your false positives. And then you're just going to have better outcomes. You're going to have better people, you're going to have happier people, you're going to have more productive teams, and you're going to have a better candidate experience.