May of 2021 is almost over, and while the world… well, the US anyway… has stopped jumping at every random tweet from a particular sociopath, the world continues to spiral into chaos. Between massive COVID infection explosions in various places, open armed conflict, and even state-initiated plane hijackings, things are still dizzying. I don’t comprehend the majority of it all, but I hope that every one of you reading out there is safe.
It seems like only moments ago we were welcoming in 2021, and already we’re on the cusp of being done with half of it. With summer imminent, interview season has come into full swing.
Side note: the team I’m on in the Cloud is hiring a second Quant UX Researcher to work with me. It leans toward the more junior side of the experience scale (roughly a handful of years (3-5ish?) of relevant work experience instead of something more significant (8+)). But due to the complexities and speed of the hiring process, there’s a decent chance any application made to the Quant UXR posting will wind up being forwarded to a team that’s not mine.
Oh, and before we proceed any further: the opinions I write about today don’t particularly reflect any specific hiring practices and guidelines set forth by my various employers, past and present. They all have their specific quirks, and I’m speaking a level up, in the general case. I also don’t discuss the specific questions and things I ask because, again, I’m discussing the topic at a more general level.
Also, let’s get this out of the way: I don’t think I’m a particularly good interviewer. I mean this in the very specific sense that I don’t think I have mastery over being on the interviewing side of the table. It’s still a complex and stressful process to go through, even though I’m neither the hiring manager nor the one applying. Despite my misgivings, I’ve had to interview a decent-ish number of people over the years since 2012; maybe somewhere between 24 and 50 people?
On top of all the complexity, I also have trouble understanding body language and non-verbals without putting energy into looking for and analyzing things, which is something I don’t really have the mental bandwidth to do when I’m trying to assess a candidate for a host of other stuff. I’m not sure if this relative blindness is a net positive or negative for candidates, but it’s still more stuff on my mind.
But since I’ve had to put a fair amount of energy and thought into it, I have some opinions and observations, probably bad ones, about my experiences. I suspect that some people may find it interesting to see inside my head on the topic, despite my relative doubts about my own abilities.
What am I, as interviewer, looking for?
As an interviewer who is NOT the hiring manager, I have one primary job that’s seemingly obvious: make an assessment of the candidate to advise whether that candidate should be hired or not. The nuances of what that statement means get a bit complex, and that’s why I’m writing this.
My job is to assess a candidate with an eye towards answering whether they should be hired or not. Very often this is along two dimensions:
Does the candidate meet whatever semi-objective hiring bars that have been set for the position?
Is the candidate someone I would not mind working with in the future?
While the specific details in the execution are important, these are the two broad things I’m trying to answer while going through the process with someone. We’ll get into them.
1. Semi-objective hiring bars
Typically, when I’m asked to interview someone, I’ve been told what the position is for, and might have been given some context as to what the requirements for the position are. Other times, I’ve been asked to interview someone because they’re potentially going to be a future teammate and I’ve already been part of long discussions about the sort of work we’d like the new hire to do at the beginning. Regardless, some rough sense of what skills are necessary is communicated to me beforehand.
That list of knowledge requirements manifests itself as a short list of semi-objective things a person needs to perform the role. It could be something as simple as “is proficient at SQL” or “has used Tool Y before”. Other times, it’s vaguer, like “has some sufficient experience with Methodology M”.
The problem of course is that measuring proficiency in complex skills such as programming and statistics is extremely difficult. Even asking people to prove that they’re skilled at a given piece of software (let’s say, Excel) can be difficult. The stuff one team uses day-to-day might not bear any resemblance to how another team might use the exact same software. This applies to pretty much everything we’d find of interest — programming languages, algorithms, data tools, research methods, statistics…
Despite the fundamental epistemological issues, people keep trying to find SOME signal in the noise. This leads to the infamous and much-maligned “technical interview” for software engineers. Data scientists usually get the dubious honor of abbreviated technical interviews… but then get multiple ones that touch upon the different pillars of DS: programming, stats, and business stuff.
So what do I do?
Most interviews, I go in knowing I’m supposed to be testing for a specific aspect: programming, stats, communication, working with other functions, etc. So there’s a theme to the chunk of time I’m given with the candidate.
While I’m sure that other people have their own methodologies, mine is fairly simple. I’m trying to document evidence that the candidate knows enough about the topic in question. This information can then be compared to rubrics developed for the position in question, such as “must have basic proficiency in programming: conditionals, loops, and appropriate use of functions and abstraction”.
As a concrete example, if I’m out trying to find evidence that someone knows Excel, I’d note down whether they can use the more common analytics functions (VLOOKUP, SUMPRODUCT, COUNTIF, etc.), as well as progressively more advanced features like pivot tables, Solver, VBA, the mapping feature, etc. Along the way, during the conversation, I’d also note things like how they put together their sheets, and any knowledge of quirks.
Individually none of those little bits of knowledge means that the candidate knows Excel. They’re just little bits of evidence in a mental Bayesian model, and we’re trying to update it as quickly as possible.
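To make that "mental Bayesian model" idea concrete, here's a minimal Python sketch of how one piece of evidence at a time nudges a belief about proficiency. The likelihood numbers are invented for illustration, not anything I actually compute during an interview:

```python
def update(prior, p_evidence_if_proficient, p_evidence_if_not):
    """One Bayes update: returns P(proficient | evidence).

    prior: current belief that the candidate is proficient.
    The two likelihoods are how probable the observed evidence is
    under each hypothesis (proficient vs. not) -- made-up numbers here.
    """
    numerator = p_evidence_if_proficient * prior
    denominator = numerator + p_evidence_if_not * (1 - prior)
    return numerator / denominator

belief = 0.5  # start neutral
# Candidate used VLOOKUP correctly: common knowledge, so mild evidence.
belief = update(belief, 0.9, 0.5)
# Candidate explained a pivot-table quirk unprompted: stronger evidence.
belief = update(belief, 0.6, 0.1)
print(round(belief, 3))  # belief climbs well above the 0.5 starting point
```

The point of the sketch is just that no single observation is decisive; each one shifts the posterior a little, and rarer, more advanced knowledge shifts it more.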
Similarly, the optimal strategy for the interviewee in such a situation is to firehose me with as much evidence as possible, at any opportunity they can find. This includes comments on the side like “Hey, you’re asking me to implement a median function. Normally I’d run stats_pack.median(foo), but for you I’ll write one from scratch.” Just to demonstrate that, yes, I do know how to do this normally, but we’ll explore the N number of ways to write a median from scratch.
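For reference, the most straightforward of those from-scratch medians is the sort-and-take-the-middle version; a minimal Python sketch (a follow-up question might probe faster selection-based approaches, but this is the interview baseline):

```python
def median(xs):
    """Median from scratch: sort a copy, then take the middle element,
    or average the two middle elements for even-length input."""
    if not xs:
        raise ValueError("median of empty sequence")
    s = sorted(xs)
    mid = len(s) // 2
    if len(s) % 2 == 1:
        return s[mid]
    return (s[mid - 1] + s[mid]) / 2

print(median([3, 1, 2]))     # odd length -> 2
print(median([4, 1, 3, 2]))  # even length -> 2.5
```

Even a toy like this gives an interviewer evidence to note: handling the empty case, the odd/even split, and not mutating the caller's list.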
All of those count as some evidence of having knowledge of things that I, as the interviewer, am interested in. For one thing, if someone expresses knowledge of stuff that is more advanced (and thus something not every candidate is likely to know, a.k.a. not appropriate to ask of every candidate), and I happen to know about that same topic, it gives us a chance to explore that more advanced topic instead. Such an excursion is useful because it provides evidence for a lot of the prerequisite knowledge to that advanced topic. Having a meaningful conversation about using multivariate calculus means you can assume knowledge of single-variable calculus, etc.
Obviously, as interviewers, we don’t take all that talk at face value. People can and do just study up on terms and methods to regurgitate at the interview without actual practical experience. So there’s follow-up questions involved, as well as looking for less obvious signs that something is not just mere book knowledge. If anything, my job is to make sure I document the evidence fairly, while not being led to thinking someone knows something they don’t.
Since all this stuff about evidence and knowledge is pretty confusing to someone who’s never been to a technical interview before, I take the somewhat unorthodox step of flat-out telling people up front what I’m trying to get out of them: “Hey, today I’m focused on understanding what you know about programming, so as you answer my questions, do your best to show me as much of what you know about programming.” There’s a lot of room for interpretation, but it’s better than nothing.
My reasoning is that having the candidate know what “the game at hand” is shouldn’t affect the outcome if I’m using anything resembling a fair-ish measurement methodology. Interview questions aren’t a secret handshake for people who’ve been trained in them. I actively WANT every candidate to think out loud and work with me instead of falling silent in a panic for 90 seconds at a question. I’d much rather a candidate be given the best opportunity to fail on the merits of their knowledge instead of on pesky interviewing-protocol issues.
2. Is this someone I wouldn’t mind working with?
Often this is worded as something like “culture fit”, which has a long history of being a questionable device that makes teams look homogeneous, and primarily white and male. So there are quite a few problematic issues if I just blindly go with the “culture fit” way of thinking about this part of interviewing without a thought.
That said, there’s always some aspect of this line of thinking in interviews that I can’t simply write off. It’s important to weed out skeevy sociopaths if at all possible. Other times, there are just occasions where you get a glimpse of a person’s character that makes them someone you wouldn’t want to work with: offhand racist or sexist remarks, contempt for their previous coworkers, etc.
Few people let their guard down during an interview, but every so often you’ll find some people who let that stuff slip through and that’s grounds for rejecting a person, because I don’t personally want to work with such people.
The danger is taking it too far and letting all the unconscious biases and preferences that play into how one human likes other humans come into play. For my part, I seem to get along with most people, so I’m primarily just keeping a careful eye for red flags to make note of. Otherwise I try to ignore most of this aspect since it’s more likely to lead to trouble than not.
So, what can go wrong at interviews?
The most common problem seems to be when someone gets very stuck on something and spends a lot of precious time not providing any new information about themselves. Obviously as an interviewee, that’s not a good situation to be in, and can just be the result of nerves.
At the same time, as an interviewer, you don’t want this to be happening either, because it’s a potential sign that we’ve done something wrong ourselves. Did we miscommunicate? Did we calibrate our question to the wrong level? Did we somehow make a mistake, and actually we’re the ones who have the wrong answer?
Another thing that could go wrong is that the interview is progressing fine and it’s slowly dawning on you that the person has applied to the incorrect job level. Sometimes they’re just too damn good for the posted position, or they’re more suited to being one level lower. Sometimes it’s not their fault because job postings are very vague and can’t fully express what a given team actually needs from a new hire. Very often, I wind up just reporting back that I think the person is more (or less) senior than we originally thought and leave it up to the hiring manager to figure out what they want to do with the information.
Finally, another common failure mode is that the package of skills and experience a candidate brings just does not fit with the specific needs of the team right now. There are tons of people I would have been perfectly excited to work with, but the needs of the organization dictated that we go with another person. While you could argue that all of these issues should have been weeded out long before, at the resume and phone-screen stages, in practice it’s not possible to catch everything, and we needed the full-length interview to even realize there was a mismatch of skills and needs.
So, as you can see, things can go pretty badly at an interview, even from the interviewer’s perspective. A fair number also aren’t easily within direct control. It’s complicated and tough.
Hopefully, this sheds a bit of light on one not-particularly-good, possibly pretty bad, interviewer’s thoughts on the topic.
About this newsletter
I’m Randy Au, currently a Quantitative UX researcher, former data analyst, and general-purpose data and tech nerd. The Counting Stuff newsletter is a weekly data/tech blog about the less-than-sexy aspects of data science, UX research, and tech. With occasional excursions into other fun topics.
All photos/drawings used are taken/created by Randy unless otherwise noted.
Supporting this newsletter:
This newsletter is free, so share it with your friends without guilt! But if you like the content and want to send some love, here are some options:
Tweet me - Comments and questions are always welcome, they often inspire new posts