Welcome to Bytes in Balance, the podcast where we navigate the wild world of software engineering together. I'm Dan and this is Damian. We have been juggling code, teams and sanity for over 35 years combined. From junior devs to principal engineers, we have worn every hat in the industry. In this podcast, we're sharing our journey, lessons learned and mentoring tricks to help you find your own balance.
It's not just about the tech. We dive into people, psychology, communication, and all the messy bits in between. Think of it as group therapy for the digital age. We tend to swap war stories and share what we think is solid advice. Sometimes we even bring guests to shake things up. This podcast is our way of tackling the stress, burnout, and growth pains that come with the job. It's as much a balancing act for us as it is for you. Grab a seat and let's navigate this madness together.
You'll find some interesting links in the episode description if you want to learn more about us or the topics we discuss. All right, let's get started. Hey, Dan. Good to see you again, and good to be having this conversation today. Yeah, good to see you again, too. I enjoyed our last conversation.
Yes, I mean, it was a lot of hard work to put together that first episode. Let's see how this goes. You know what? Funny story. My wife almost gave me a heart attack last night. She said, oh, by the way, I scheduled tree trimmers to come and chop all the branches around the trees. It's literally right outside my office.
But they showed up at 7 a.m. They got their job done and they just finished like 20 minutes ago. So I was really worried. I was thinking there's going to be like chainsaws going on while we record. But I think we're good. I mean, it happens. Same here. All right. Last episode, we talked about, in my mind, thinking of what does it mean to be a good software engineer? What are the traits?
and the things that we expect or that we see in good engineers. And we had a long discussion about how to grow those aspects, the connection with mentorship, and those types of things. Today, we are going to talk about interviewing: how the industry is interviewing, who the industry has historically interviewed. What do we know about that? We did a lot of interviews ourselves as former Amazonians during our tenure at Amazon.
And we saw a lot of things, learned a lot of things in that respect. And we have also interviewed at other companies ourselves. And there are a lot of conversations going on about this topic, particularly with the current industry situation, layoffs, etc. A lot of people trying to interview, and the bar at companies seemingly going up, precisely because there is a large supply of engineers, and things like that. So that is the conversation that we're hoping to have today.
Yeah, I'm really excited to talk about this. I mean, obviously I have a lot of thoughts on it, but yeah, it's one of those things that is obviously super important to a lot of people now, because they're getting laid off, or their friends are getting laid off, or they just want to look for a job and they're finding that it's way more challenging. But it feels like everybody has
really strong opinions on what is the best way to interview, both from a candidate perspective and a company's perspective. So yeah, I think there are a lot of topics that we can dive into. How do you think we're going to start? Because it's a long conversation, right? I think we could talk for a long time about this.
And we have already talked about a potential follow-up episode and things like that. But what do you think should be the starting point? Yeah, well, let's start by segmenting this and breaking down the different types of interviews, because obviously interviewing as a whole is this long, kind of drawn-out process where a company decides if they're going to take the chance to give you a job. But I think there are different parts of it, especially ones that are either
controversial or that we may have interesting opinions on. So I'm thinking about things like: there are coding interviews, there are system design interviews, there are behavioral / culture fit / leadership, whatever you call them, talking interviews, as the high-level categories.
And even within those, there are different subcategories that I think we could dive into. So with coding, you have your more data-structures-and-algorithms-style, LeetCode kind of problems. And then there's more practical, hands-on, build-this-widget-using-React kind of things. With system design, you can ask a candidate to design a greenfield system on a whiteboard.
Or you can just dive into a historical system that the candidate worked on and have them talk about the pros and cons and how it's architected and things like that. And then there are also different ways of administering interviews that we can talk about: live interviews, online assessments, take-home projects, things like that. So I don't know
if any of those kind of jump out at you as ones you want to talk about. I think that's what comes to my mind. I'm trying to think if we are missing something. There were also those very famous brain teasers in the 2000s that are not being used today. Or at least I haven't seen them. Yeah, I remember. I've never been in an interview like that, but I remember
both hearing about them. Actually, my dad is a software engineer and he worked at HP and things like that. And he told me stories of how they'd interview people asking these dumb questions. Well, in retrospect, dumb questions. How many golf balls would it take to fill up the Space Needle, or something?
Stupid stuff like that. And ostensibly, it was a way to test your creative thinking. But I think they really went out of vogue in like the mid-2000s. And I don't know if algorithmic coding interviews were popular before then. That was before my time. But it does seem like the emphasis on grinding LeetCode started around then, right?
Yeah, I've read about them. I also never interviewed with them on either of the two sides, as interviewer or as interviewee. And you have to admit, there are some of these questions that are really fascinating. How much does Mount Fuji weigh? Yeah, it doesn't surprise
me at all that you could fill up an hour of a conversation. I'm sure that's not the only question, but I've talked to some really interesting people, and sometimes it wouldn't matter what I asked them, we could have an interesting discussion about, yeah, how tall Mount Fuji is or how heavy it is, whatever you said, or golf balls or piano tuners
or whatever, and it would still be an interesting deep dive. How relevant it is and how useful it is as a way to filter out candidates is another question. Yeah. And I think that's one of the controversial things, coming back to the regular ways of interviewing today: how
effective and how fair or how reasonable are some of these questions for filtering out candidates and actually assessing capabilities and skills? Like, for example, I've heard things from people like: you're going to ask me these very complex algorithm questions based on a computer science fundamentals course that I maybe took at university, right? And today there are also a lot of people who are self-taught, who maybe don't come from a computer science
or computer engineering background and haven't seen these academic courses. So it's like, you're asking me about all these things, and then what you're going to do is put me to work building UX, or just writing APIs that take an input,
do some transform, write to a database. Exactly, yeah. There used to be this famous joke about Google, that they would hire all these super smart computer science PhDs and then they'd put them to work building CRUD apps all day. You see the point to some extent. Yeah, that 99% of your day, well, some number, some percentage of your day-to-day job,
you don't need to know about hash tables and linked lists and whatever. So it is interesting. I remember reading this famous, let's see, it was a blog post or maybe it was just a Twitter post, but it's been referenced many times: the inventor of Homebrew, I think, the package manager on OS X, interviewed at Google and then apparently was rejected.
Don't know why, but he tweeted publicly and said, oh, I couldn't invert a binary tree on a whiteboard, so fuck off, or something like that. He said that. And I hear that reference, and I see that reference in tons of blog posts about interviewing, and I'm of two minds about it. Like, on the one hand,
yeah, people can have off days, and it's crazy to try to predict someone's job performance based on these silly algorithmic questions. On that specific example, I always wonder, like, he doesn't really know why he was rejected. It may have been a whole bunch of other parts of the interview. That's one thing that I hope to explore in this podcast, really how complicated hiring decisions are. The inputs
tend to be kind of trivial, like these coding exercises. But a lot of people come away from these with assumptions, thinking, oh, I couldn't reverse the linked list, so that's the reason why they didn't hire me. And a lot of times it's a lot more complicated than that. Yeah, absolutely. I think there are a lot of aspects in that sense. The other thing that I have noticed is that people obsess over LeetCode. For example, I was having a conversation with somebody
a couple of days ago, a mentee, and we were talking about LeetCode, and it's like: what do you want to do? Do you want to be good at LeetCode, or do you want to grow as an engineer? And being good at LeetCode does not necessarily guarantee that you're actually going to get a job offer. Yeah, but at the same time, they want to get a job. And that is, to some extent, this sort of artificial hoop that they have to jump through
to get a job. Or, you know, it's more complex than that, but I think that is the feeling, and that's what's causing people to grind. I've also had some mentees who have told me the number of LeetCode problems that they have solved, and I'm always blown away. Like, I haven't done that many.
But yeah, definitely people feel the pressure to grind and practice, and I don't know if all of that is healthy. I agree with the need for practice, etc. But what I see is that people tend to focus just too much on LeetCode. And I don't know to what point it's actually a good way to grow, to learn some of these fundamental concepts and things that you actually need. It's a good way to grind problems, but is it the right way to get the actual skills to
succeed in an interview? I don't know. The other thing that I have found is that people are expecting in these interviews to get this very complex problem that has to be solved with graphs or trees or whatever. And then you just throw a very simple coding problem at them, and what I notice is that people tend to over-engineer a solution for such a simple problem, right? And you hear a lot, oh, that's a depth-first search approach.
No, it's not. It's probably a couple of for loops or something like that. And I have even had candidates that get stuck there, and you cannot move them away from the idea that this is a DFS or something like that. Yeah, I see that also in system design. People have this tendency to want to overcomplicate, to make it more difficult and more complex than it sort of needs to be. And usually the right answer is to actually not do that: simplify, focus on sort of this core use case, and then build out
from there. You know, maybe a good interviewer or a good candidate will eventually design a complex system, but they don't start out of the gate just sort of throwing spaghetti against the wall until the whiteboard looks like, you know, dinner. Yeah, no, totally. Anyhow, it's interesting, the things that you step into when you're talking with people and trying to understand those different approaches. I have a hard time faulting
people, candidates who are trying to get a job, for really grinding LeetCode and focusing on that. Because, you know, there is that feeling that they have to do that. What do you think about the other side of it, as a company hiring someone? Do you think that's an effective way? How effective do you think the current state of things is in terms of finding the best candidates out there? I have conflicting feelings about this, and probably a few biases, because my background is all
computer science, to some extent. So some of those are things that I actually value. And I will agree that you don't use them in your everyday work; like, you are not going to write your own data structure, most likely. And you don't use a lot of these things in your everyday work. But somehow I suspect the thought process and the knowledge of those fundamentals helps you progress through your career. Yeah, it's like, I don't need,
I hope I never have to actually go implement my own hash table that's going to be used in real production code. Hopefully never in my career will I do that. But it's been super useful knowing how they work and knowing, oh, here's a way that you can access something in memory in O(1) time; that concept comes up a lot. And
That's a super important thing, and you can come up with lots of other examples from other data structures or algorithms, whatever, where it's important to know graphs and trees. And there have been a few times where I've been working in systems where you're writing some code that explores a file system, and you're like, oh yeah, this is a tree and you can use
graph algorithms on it. Or maybe there are symlinks, you know, linking different parts of the tree together, and now it's a graph, and you need to be careful not to revisit things that you have already seen, things like that.
In those times, it has been really useful. And I can imagine, I suppose I've worked with people who lacked those fundamentals and really struggled to get over the hump. They would write code that is lots and lots of for loops and if statements, and it would sort of work, but if you don't have the correct high-level approach to something like that, it's really difficult to write code that handles it correctly.
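To make the file-system example concrete, here is a minimal sketch of the idea being described: a tree walk that becomes a graph traversal once symlinks can create cycles, so you track which nodes you have already visited. The function name and structure are just illustrative assumptions, not anything from the episode.

```python
import os

def walk_following_symlinks(root):
    """Yield file paths under root, following symlinks without looping forever.

    A directory hierarchy is a tree, but symlinks can link parts of it
    together, turning it into a graph. A visited set of resolved paths
    keeps us from revisiting nodes (and from cycling endlessly).
    """
    visited = set()          # resolved paths we have already entered
    stack = [root]           # iterative depth-first traversal
    while stack:
        path = stack.pop()
        real = os.path.realpath(path)   # collapse symlinks to the real node
        if real in visited:
            continue                    # already seen this node: skip it
        visited.add(real)
        if os.path.isdir(real):
            for name in os.listdir(real):
                stack.append(os.path.join(path, name))
        else:
            yield path
```

Without the visited set, a symlink pointing back up the tree would send this loop around forever, which is exactly the "careful not to revisit things" point above.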
For example, something that for me has been incredibly helpful, and that I'm grateful I learned very early, is thinking in terms of states and state machines and automata and these types of things. I don't know about you, but I have seen big disasters in some control planes because the person who implemented and designed the thing didn't realize that it's actually a state
machine, and you can transition from one state to another. Yeah, yeah. Have this explicitly defined. And then, eight years down the line, that control plane has a lot of if-then-else blocks, race conditions, and stuff like that, because nobody thought about it this way. So, question for you then, sorry to interrupt. I'm curious, where do you feel like that sunk in for you? Did you have to see those disasters in real production systems, or did you feel like you got that from university?
Were there particular professors who drummed that concept in? I got that from university. Maybe I was lucky, because the people I was working with were very involved in that area. I ended up writing a kind of workflow system as my undergrad thesis, and that was very state-machine oriented. And then in the next years of my career, I saw a couple of interesting disasters other people created that could actually have been solved by using state machines properly. I got it from that.
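As a rough illustration of what "having the states explicitly defined" can look like, here is a minimal sketch. The states and transitions are hypothetical, a generic control-plane-style resource lifecycle invented for this example, not anything from a real system discussed in the episode.

```python
from enum import Enum, auto

class State(Enum):
    PENDING = auto()
    PROVISIONING = auto()
    ACTIVE = auto()
    FAILED = auto()
    DELETED = auto()

# The legal transitions are written down in one place, instead of being
# implied by if-then-else blocks scattered across the codebase.
TRANSITIONS = {
    State.PENDING:      {State.PROVISIONING, State.DELETED},
    State.PROVISIONING: {State.ACTIVE, State.FAILED},
    State.ACTIVE:       {State.DELETED},
    State.FAILED:       {State.PROVISIONING, State.DELETED},
    State.DELETED:      set(),
}

def transition(current: State, target: State) -> State:
    """Move to target only if the state machine allows it; otherwise fail loudly."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition: {current.name} -> {target.name}")
    return target

# Example: a resource goes PENDING -> PROVISIONING -> ACTIVE.
state = State.PENDING
state = transition(state, State.PROVISIONING)
state = transition(state, State.ACTIVE)
```

The point is less the specific code and more that an explicit transition table makes illegal states and races much easier to spot than logic spread across eight years of patches.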
It makes sense, if that answers the question. Yeah, yeah, yeah. Another interesting thing, about that argument of: I get interviewed, I get asked things about algorithms and data structures, and then you're going to put me to work making just APIs and UX. But then people want to do more. They want to do bigger projects, more interesting, more cool, more of that type of project. And they don't realize that if you don't grow that knowledge, you're always going to be stuck
writing APIs and UX and so on and so forth. So I think there is a bit of a self-fulfilling prophecy there. You also need to grow in these concepts and these aspects if you really want to take on more complex, bigger, more scalable, et cetera, projects. Maybe. At least that's the way I reason through it. Yeah, I tend to have the same view. I mean, I think there are things to improve in the current state of interviewing, especially
coding interviews, things that don't detract from the quality of the signal you're trying to get. But I do think there's always going to be some amount of trivial, contrived problem solving, whether that happens offline in take-home projects or live. Take-home projects, for instance, that's a good example, maybe. Take-home projects, if they're designed well, are generally a better experience than sitting there
coding some algorithm while some other person is staring at you on the video call, and you're sweating and trying to finish the code. That's an uncomfortable experience for everyone involved, and it's not really how coding happens in your day job. I think take-home exams are a better example. But yeah, I do think the problem is that it's hard to test data structures and algorithms knowledge in take-home exams, for a lot of reasons. So I do think there's some amount of algorithmic
questioning and problem solving that people would be expected to do. And also because you need to time-box these kinds of things: in your real day-to-day job, it takes days or weeks to get set up in your dev environment and understand the domain you're working in. You just can't have the interview process take that long. You have to be able to put together these contrived problems, and yeah, some of them are going to feel like toy problems to some extent.
I was preparing for this conversation today, and I was thinking in terms of, okay, when you think in terms of coding, there are several types of coding interviews: you have data structures and algorithms, you have code quality. Yeah, your OO design or maintainability kind of stuff.
But what I'm thinking about when I am interviewing somebody in terms of coding is: can this person problem-solve? Which is very interesting. It's a toy problem, and you are seeing all the gears in this person's head working, seeing how this person approaches the problem and solves the problem and comes to a solution and proves the solution. Does it make sense or not? And so on and so forth. And then you're thinking,
okay, once this person has the solution in their head, can they actually translate that to code? Can they confidently sit down and code? It's the can-this-person-code kind of question. And you start seeing red flags. You see people that don't walk you through a solution, don't get to a solution. They just start throwing fors and ifs and elses directly into the code. They jump into the code too early, before they have that,
before they've solved the problem. Yes, and you see all this brute force approach of adding, removing, adding, removing, adding, removing that obviously doesn't go anywhere. Can this person troubleshoot?
Right, something went wrong, something is not working in their code, can they troubleshoot? Can they fight what I call the technical problem, which is like, okay, I'm fighting with the language, now I'm fighting with this compile error, and so on and so forth. And once the problem is solved, or even throughout the process, does this person understand the trade-offs? Say, hey, I solved this this way. It has disadvantages, it obviously has disadvantages, but if we are optimizing only for this, then
it works. Yeah. I think you identified a couple of really important things here, because what you first talked about, separating problem solving from coding, I think that's really important for people to understand. And that's how we looked at things from the interviewing perspective. When we were looking at candidates and how they did on interviews, we would split those two things up. And I'd be the bar raiser on a loop, right, and I would
be talking with people about what questions they're going to ask. And I would tell people: if they can't solve the problem, give them enough hints. Eventually give them enough hints to where they understand the algorithm. Even if you have to give them the solution, give them that, because then you get data on how they can code. A good interviewer doesn't sit back, give you the problem, and just sit and watch you struggle for 30 minutes.
They let you struggle for five minutes and then they step in and say, hey, what about this? Have you thought about this as a graph? Then they let you struggle for a little bit more. And eventually they've got to give you the solution so they can see if you can turn it into code. And when we're discussing the hiring decision, do we hire this person or not,
we might have two or three different coding data points. And let's say they didn't do so well on one of them. If we find out that, oh, they did well in problem solving here but they did poorly in coding on this one, but overall they did pretty well, if we separate out all those data points, we can look at candidates more holistically. Well, even a red flag for me is the capability of the candidate to take hints.
Yes, absolutely. You give the same hint two, three times, and they keep on the original path of solution, and that's a red flag. Imagine that translated to a real-life situation where you are trying to coach or provide feedback, or you're giving some hint in a design review, or offering some suggestion or something like that, and the candidate cannot incorporate it. Yeah, absolutely. What you're trying to test in a system design interview is not whether that person can go off and design a system
by themselves, but rather: can they participate in a design discussion, can you give them feedback and bounce ideas off of them, and can they run with it? And yeah, if they're single-mindedly just ignoring you, that is a red flag. Which is interesting,
at least for us, who have been interviewing people for so long, it feels straightforward. But I suspect there are a lot of people who are interviewing, who are applying to jobs, and who don't look at the situation from this perspective.
It's: this is a LeetCode problem, I need to know trees, I need to know graphs, and so on and so forth. They're not thinking that there is more to the interview than just solving the problem or knowing about X, Y, Z. Yeah. Well, I do think it is valuable, if you are looking to get a job,
to go look up some mock interviews and see how they go. There are bad ones out there, actually, but good ones will show you the whole process: getting the problem, talking through the constraints of the problem, asking about edge cases, asking clarifying questions, starting to solve the problem, coming up with an approach, verbally describing the approach, starting to turn it into code. That whole process is more than just, like, when you're coding on LeetCode by yourself.
It's a totally different experience. You have this kind of sort of IDE-ish environment. You can actually hit run and run your code. When you're talking with an interviewer, you're generally... typing this into just some kind of a shared text editor. There might be syntax highlighting, but they're not running your code, generally speaking.
And so your ability to talk through what you're doing, like you mentioned, talk through the trade-offs of, oh, I might do it this way or I might do it this way. Those things become much more important. Yeah, absolutely. I think it's a broad set of aspects, much broader than people actually think on that side.
To some extent there's a big variable here too, which is people's nervousness, or randomness and luck, in terms of what questions you are asked and how well you do under pressure. Like, for instance, people
ignoring you and ignoring feedback: I think there are some people who are just stubborn and don't listen to feedback, but I think other people get so nervous that they just have their blinders on. And I worry a little bit about that, like, I don't know how to correct for that.
Being on this side, too, it is very important to provide the candidate a great experience. And that's part of introducing yourself, helping the candidate, being friendly, and so on and so forth. It's important. That's kind of fun, actually: I played around on this website where you basically do cross interviews, so you interview somebody and they interview you, just for mock interviews. And the person that was interviewing me was
very rough. A very junior person. It was just: here's the problem. And I was like, OK. I worked through the problem, et cetera. And when it was my turn to interview, I was like, hey, nice to meet you, I'm Damian, I'm going to be
talking to you today, and I have a nice problem for you, I would like us to have this conversation, and so on and so forth. And at the end you give feedback to each other, and that was a piece of feedback for this person: hey, when you are interviewing somebody, you actually have to set the stage. Yeah, your interviewing manner. And it's hard to develop all that. I remember my first few interviews, man, I feel bad for all those first few candidates.
I must have been so awkward and just didn't know how to fill time. All of those are skills that you have to develop: how to fill time, if a candidate answers your questions really quickly, how do you ask more questions. You have to be prepared. You have to think about how to respond in different scenarios. You have to be prepared for different questions the candidate might ask. It's actually difficult to develop all that. But coming back to your point: yes, randomness, luck,
nervousness, how you feel that day. It has probably happened to you too. I have had my bad days in those interviews, where I performed really poorly, and it's a pity, but it is what it is. Yeah, I talk about this with my mentees, it's a big thing that I talk about, just how random it is. And even from when I was at Amazon, from the hiring perspective, we would understand this too.
People would go through the process, we would look at the data that we gathered, and, you know, it just wouldn't be there. We wouldn't be able to justify a hire, but we would think this person may have just had a bad day. We'd put a note in for the recruiter, and they would follow up in three months or six months instead of a year or two. And
maybe they would do better next time. There's definitely that aspect of it. And also, a lot of people go through these and they don't get hired, and that feels crushing. It's a really devastating experience, but I definitely try to illuminate that part of it, that it's not some judgment on you as a person. There's a lot of,
again, randomness and luck involved. And it's unfortunate, but what companies are optimizing for is not always the same as what candidates would want. I think a lot of the focus on LeetCode just comes from big companies needing to scale their hiring, and having to do that by basically having their junior engineers turn around and administer LeetCode problems, because that's pretty easy: you can watch one or two of those and you can start doing it yourself. And
ultimately, you will find people that are good at LeetCode problems, which sort of overlaps with being good at engineering, but not totally. I agree with that. I have even had candidates that did well in the interview. They did well, they feel they did well, we have the conversation and they did well, and they are very frustrated by the rejection and very sad about it.
The answer sometimes is like: hey, particularly with the industry the way it is right now, yeah, you did good, you did great, but in the market how it is right now, there were another five candidates that also did great, and then the companies can just, you know, pick, and maybe there was a better feeling with this other candidate. You could argue that's a bias or whatever, but
it is right: the people who are interviewing are trying to figure out, do I want to work with this person or not? And if they have five great candidates and one of them gives them a better feeling, that's the one that's going to get hired. Yeah, for sure. It's been really interesting how the market has changed in the last couple of years.
I sort of got lucky in my own career. I basically left Amazon sort of before the wave of layoffs came, so I didn't get to experience this from the inside, but all the big tech companies did this and it dramatically changed things. I mean, you've probably seen how easy it was to change jobs back from, I don't know, 2013 to 2021, 2022. I look at people's resumes now as part of working with my clients. The first thing we do is look at the resume.
And a lot of the feedback that I have for people is this is a perfectly good resume. And two years ago, this would have gotten you a job. Now, I don't know. Now here's a bunch of improvements that I think we can make and we go through those. But it's crazy that a couple of years ago, you wouldn't have needed to do this kind of stuff.
And now you have to work a lot harder to get in front of the hiring manager's eyes and, like you said, to be better than those other five candidates that have more experience than you. It's interesting, and I think it's a subject for another podcast for sure, but all the conversation about resumes, I'm very interested in your point of view on that. It would be an interesting conversation for sure. Yeah. I mean, I don't have a whole lot of depth on it,
but basically I start from the premise that hiring managers or recruiters will spend about 10 seconds looking at your resume before they toss it in the bin. So that's about the time that you have to show them what kind of an engineer you are: full stack, backend, mobile, front end, whatever; how much experience you have, years of experience; where did you graduate
and how long ago. Basically, they do that quick filtering of, is this in the right range of years of experience that we're looking for? Do they have the right technologies? They look for keywords, you know, Python, Django, Rails, Hadoop, whatever they're looking for. They look for those things, they scan years of experience, they scan blah, blah, blah. And then they move it to a shortlist to reach out to, or they bin it.
And a lot of times, like, I'm not a resume expert, but a lot of times the feedback that I give is just geared towards making it easy for people to do that: making it easy to scan, highlighting when you graduated, highlighting years of experience, if your resume is three pages, getting it down to one. Easy stuff like that.
Yeah, it makes sense. It's definitely an interesting topic. I mean, 10 seconds is nothing. No, but it's true. I used to sit in these sessions at Amazon when we were hiring. Me and the recruiter, sometimes the hiring manager, would sit there, and the recruiter would pull up resumes, and they'd pull up one, and then we'd all sit and stare at it and go,
yes or no. And then we'd usually explain why, because we're trying to teach the recruiter, like how to do this essentially. And we go, no, I'm looking at these two things. This is exciting. This is interesting to me. Oh, this is a good school. Oh, this is whatever. We'd talk about those things.
But as we were going through, we'd sit there and we'd process 100 resumes in a short session. And this was years ago; I'm sure it's way worse now with the volumes of people that are applying. What do you think is different between small companies and big companies? Yeah, that's interesting. I think I mentioned before that a lot of the focus on LeetCode is this need for big companies to scale their hiring with sort of
armies of less experienced engineers, needing to hire more and more. I think bigger companies have more of a recruiting pipeline; they actually have a recruiting pipeline, and they think about it in a more automated kind of way. So sometimes that's good, in the sense that if you're interviewing with a big company,
the timeline is pretty flexible. Usually if you need more time, they'll say, yeah, sure, we'll wait for you, and you'll interview later. At a smaller company, if you need more time to prepare, they may just go with someone else. They may only have a couple of spots to fill. So there's that. I also think
that bigger companies are generally less picky about the specific technologies and specific languages you'll interview in. I used to say, yeah, use whatever language you're comfortable in, I don't really care. I just want to see general
computer science skills and problem-solving skills. Whereas a small company may specifically be looking for, you know, we need to hire a backend Python engineer, and they need to know these technologies, they need to be able to code in Python. So they're going to be less flexible. Yeah, I think they have less bandwidth for learning certain things on the job. Yeah, they want someone that can hit the ground running. And that sort of makes sense. Startups may be this way as well. Not always.
Big companies will take somebody that maybe has a Java coding background and put them on a team that codes in Python, with a bunch of folks that code in Python. Yeah. We used to interview lots of people from Microsoft who knew C#, and they would code in C# in the interview, and
then maybe jump onto a team that is all 100% Linux and Java and AWS and what have you. And it didn't matter. We assumed they would pick up the skills, and hiring somebody is a long-term investment. That's how big companies will look at it.
And if you actually find data points on other things that you're looking for, like attitude: for example, in interviews you're looking for somebody that has the capability to learn, that has the capability to face something that is different from their comfort zone, and so on and so forth.
So if this person has the right attitudes, they're going to be put in a new environment with new technologies, and they're going to be able to learn and probably to thrive in that environment. That's something we haven't really talked about much today: behavioral interviewing. Interviewing people where you may be talking about technical things, you may be talking about their projects and the code they wrote and problems they solved, but the interview is
non-technical. You're talking about: what did you do? And at Amazon, we heavily focused on the leadership principles. I'm actually still a big fan of the leadership principles, except for the stupid additions, the recent ones that were bolted on, the Earth's greatest employer one or whatever. Yeah, there was internal controversy over those. People were like, these sound like they were written by upper management.
Sounds like a piano. Yeah. Yeah. Whatever. I could go on a rant about that, but the other leadership principles, they're great. They're just a really good way to describe having a good attitude and being a good person to work with: do you dive deep into problems, do you
work well with others, do you address conflicts in a healthy way, and all these things. So I think there's a lot to say around preparing for behavioral interviews, practicing talking about your work and telling your stories and your anecdotes and whatever, but it's also important for companies to do this. I've read about other big companies that put much less emphasis on behavioral interviewing, Google, for instance. I don't know if this is still true or if it was ever true, honestly.
But I always wondered how that worked out. We got so much signal, I felt like, on how someone would be to work with from the behavioral exercises. Yeah, I don't know. I don't know what your perspective was. Yeah, I agree with you on the leadership principles. I remember when the recruiter, you know, sent me the list when I was interviewing at Amazon, and it was,
oh, these are the leadership principles. And I started reading the list and the descriptions, and it's like, wait, these are not leadership principles, this is common sense for me. This is ownership, learn and be curious, dive deep. It's like, okay. And I still think in those terms.
I remember my first manager told me about the leadership principles, that he believed in them so much that he would teach them to his kids. I don't use those literal words with my kids, but yeah, I get the concept. And yes, you get a lot from interviewing and understanding how people align on these aspects.
You definitely don't want to work with somebody who, if something goes wrong, and I think I talked about this in the previous podcast, if something goes wrong, is not capable of treating it as a growth experience, or blames others, or plays victim, or stuff like that. Those are huge red flags to me, working with somebody and in general in life, and those are things that show up in those behavioral interviews.
There is a lot of noise in the process, because it is also tricky for the candidates to actually have those conversations sometimes. Yeah. They may not be used to telling those stories in that way. A big thing I focus on when I teach people how to do this is the STAR format, you know: situation, task, action, result. And a lot of people
approach it as if it's some specific thing: oh, Amazon likes the STAR format, so practice that. And it is true that some companies sort of focus more on that, but if you step back, it's really just a way to tell the story: what was the situation and context? What were you asked to do, or what did you decide you needed to do if nobody explicitly asked you to do something? What did you do? And then how did it turn out?
And when people tell stories without having practiced this, I mean, sometimes this comes naturally to people because they're natural storytellers. And sometimes this really doesn't. I think it's easily learnable, but sometimes people naturally just jump immediately to the action.
And then the result, maybe, but really the interesting part to them was the action. And so they forget all the context: what was I asked to do and why, how did we settle on this decision, and how did we, you know, all this kind of stuff that is sort of in the background for them and assumed. But the interviewer,
they don't know all of that. And so if they only hear the action part, they start to assume that your contribution was really small in scope. They don't understand that you actually conceived of this, that you maybe took initiative. They don't see all of this. They just see:
oh, I jumped in and wrote a bunch of code. Even if that to you was the interesting part, and it probably was, hands on keyboard in a tight loop, was it the right code? How do you know it was the right code? How did it turn out? Yeah. So,
reminding people and teaching people to focus on all those different aspects of the story, it's amazing how differently it comes across to the interviewer. Suddenly you get a sense of the scope that they're working at, and that they can understand the trade-offs and why they did it, and all of those things. And you are also assessing something that,
and companies like Amazon never used this word for it, but you're also assessing communication capabilities. Storytelling, which we discussed in the past episode, and how important that also is. And I agree with you, it's something that you have to train over time. What I have even found is candidates that go and prepare, but not in the sense of thinking about their career and the systems they have developed and their experiences. So when they get asked a random question,
they get caught off guard and they really struggle to come up with one of those examples. And they probably pick one that is not the best. Yeah, I would definitely feel bad for candidates like that. You ask them a question, tell me about a time that you solved a difficult problem, and then they start telling you the problem.
And then they're like, well, actually this was 11 years ago and I don't remember the details, and maybe this other guy solved that part. And maybe it's totally fine, maybe they have other strong examples, but then they've also wasted 25 minutes of an interview sort of diving down this rat hole, and now the interviewer doesn't have good data. So it's unfortunate.
I had a friend who was interviewing for a senior engineer role at Amazon, and they asked him a question like this, and he used, as an example, a project that he
did like 15 years ago, when he was a junior. And then I think part of the feedback was: what is this? This is not complex enough for a senior, or something like that. And then, having a conversation with this person, it was like, do you realize you have 15 years of doing extremely complex things, and the example that you picked was from when you were a junior and you were asked to do this tiny feature, and that's what you talked about? Well, yeah, it's funny, because
actually, some of the projects that I did when I was a junior were the most interesting to me. And I still look back at those fondly, and I can see how someone would fall into that trap. And also, projects from when you're a junior are easier to talk about, because they're sort of neatly wrapped up.
The scope has been set appropriately, someone else has maybe taken care of certain parts of it, I don't know. I actually find that one of the challenges that people have in preparing for these interviews is thinking of the right examples.
And they self-select out lots of projects in their thinking, because they're too messy or too complex or they don't know how to frame them in an interview. Usually it's not as simple as: oh, I was asked to build this thing, and then I built it, and it turned out great, and then I released it and everybody cheered. Usually real life is a lot more complex. They may work
for months on something, and then it sort of gets canned and their work goes down the drain, and they have to sell that as a success, because they take what they built and use it somewhere else. Or maybe they don't; maybe they learned something. But a lot of times people have trouble talking about these examples where stuff didn't go down the sort of rosy golden path.
And those of us who have been there and done that are like, show me the pain. We know it's there. I want to see it. Yeah, exactly. Actually, I was always really happy when people would sort of dive into that complexity, because it showed that it was a real example. They were sort of, I don't know, it just felt like a more complex real-life situation. All right, then. Taking a step back,
because we have talked a lot, and trying to summarize, we have probably mentioned some of these things already: what are we trying to test as interviewers? What are the data points that you think are important? Yeah, I think there are
technical data points around someone's coding and problem-solving abilities, their computer science fundamentals. I think there's some basic stuff there that you're trying to assess. The more senior the candidate, the more important things like their decision-making skills become, their ability to analyze different trade-offs, and their ability to basically apply technical solutions to solve business problems.
And you can kind of scale that up depending on how senior a candidate you're looking for. But at a high level, what I'm looking for is someone that would be good to work with, someone that I could delegate tasks and projects to at the appropriate scope, and they'll jump on those and deliver work, and I can work with them, and I can address conflict with them in a healthy way, and they'll push back on me and I can push back on them, and it can be a kind of two-way thing.
At a high level, those are the kinds of things that I'm generally looking for. What about you? Yeah, I agree with you. And when you translate that to system design, you mentioned coding, but it's more or less the same: you're trying to test fundamentals, capability of problem solving, capability of working on behalf of the business, working with the business. I think it translates pretty much the same there. And when you talk about fundamentals, you're thinking in terms of scalability, reliability,
and so on and so forth. For behavioral, I'm looking for certain key traits like the ones you have mentioned, and I'm looking for the right attitudes. We mentioned some of them: capability to learn, ownership, somebody that can clearly communicate, that can discuss
trade-offs, that doesn't get stuck on one thing. So I think we're on the same page in that sense. How effective do you think the process is? Yeah, that's the million-dollar question. I think it's better than some people think. Again, you go and read all these blogs and people complaining about the current state of interviewing, and there's some truth
to it, that giving people these contrived toy problems is not really accurately predicting how they'll do on the job. On the other hand, at some point you're going to have to write some code, and someone is going to have to look at it and decide, do I like it?
I think there are some things that have improved and some improvements that I like. For instance, I'm a fan of take-home exams and then having an interview after the fact where you look at how the candidate wrote the code and you ask them questions about why they wrote it that way.
You dive into trade-offs and things like that. I'm a fan of that, but that's a complicated interview to design, and it requires a lot of work from the company's perspective. And from the candidate's also. Well, from the candidate's too. The trick is to design a take-home exam that doesn't require the candidate to spend 20 hours doing it. You ideally want them to spend, like,
two to three hours working on it. But it's really hard to come up with a test that someone can only spend that amount of time on. I do think also there was a big improvement in the quality of coding interviews from pre-COVID to after COVID. Pre-COVID, we would invite people in and we'd have them write on a whiteboard. And at the time, it seemed normal, and there were all these people defending the practice and things like that.
Looking back on it, no, I think it was dumb. I think people have a much better experience typing on a keyboard. And this sounds stupid in retrospect. But at the time, there were people that were like, no, they need to code on a whiteboard. And every candidate, we need to have the same experience and blah, blah, blah.
Things got way better when we started interviewing virtually. People could sit at their own desk with their own keyboard and they could write code. Some things got worse; I like interviewing people in person, I haven't done it in a long time, but I like that experience. But for coding interviews in general,
being able to type on a keyboard is a huge improvement. I did my Amazon one on the whiteboard. I did it on a piece of paper with a pen, and then I'd have to keep turning the paper over so the interviewer on the other side of the desk could read my
chicken-scratch code. Looking back on it, it was crazy. It was lined paper too, it was like, whatever. I think the process has flaws, no doubt. And it's easy to pick up the flag of complaining and criticizing the process, etc. That's the easy part. The question is: if it's not this, then what's next? And is that other process really effective at providing the same value that the current process does? That's a hard question.
Because I can say, okay, you mentioned that a take-home problem should be two, three hours or so on and so forth. I've seen companies that do two, three days and things like that. And it's like... I mean, sure, maybe you have higher chances of success on that, but... You're selecting for unemployed candidates, essentially. People who have the time to go take that exam. From a company's perspective, it's really dumb, but they still do it.
Yeah, it's a problem. And if you have two or three of these, that's like 10 days of work or something like that. So I think I come back to the... Sure, there are things that could be better. I know this generates frustration, but what is the alternative? And without having a solid alternative, it's very hard to transition to something like that.
So I think that's where my opinion stands. Okay, well, it feels like we're getting close to the end of the episode here. What do you think about the topics we've talked about so far? Yeah, I think it has been a great conversation. We have briefly talked about the different types of interviews, coming from the times when
companies would use brain teasers to do interviews, and how the whole process has evolved. Even recently, before the pandemic, people were doing coding interviews on whiteboards and paper, and now it doesn't happen that way anymore; we use keyboards and computers to actually do that type of interview. The reality is that I know the process is imperfect. I know it's flawed
to some extent, but I don't have a better alternative at this point. I cannot come up with something, you know, that I could argue is 10 times better. And it's controversial. I know it's hard for some engineers to face this process. I think it ultimately is about learning about it and preparing yourself to actually interview. What do you think?
Yeah, I kind of feel the same way. It's been interesting to see how interviewing techniques have evolved and changed over the years. But really, the goals have always been the same. Companies are trying to find good engineering talent, people with good fundamentals.
good working skills, and trying to find those people and hire them with a minimum amount of effort. And I agree with you too. I can see the flaws. I do have opinions here and there, about how certain types of questions are bad, or how certain kinds of interviews can be done in a certain way to maximize the data you get out of them and the candidate experience.
But overall, yeah, I think it's an imperfect process. There's never going to be a perfect way to see how someone's going to perform on the job, because on the job it can take so long to evaluate your performance. And so, yeah, I do think candidates also just kind of have to accept that it's imperfect. You have to
study a little bit, study to the test. And those are all things that I think we can dive into in later episodes too. Preparing, diving into the meat of different types of interviews, things like that. What do you think? This opens the door to a good set of interesting conversations.
Basically, talking about how you prepare for interviews. How do you prepare for different types of interviews: the coding interview, the behavioral interview, the system design interview, and so on and so forth? What are good strategies? What is the role of mentorship in all this too, which is important. I suspect we're going to keep talking about this topic in the future and expanding on some of these things. All right, folks.
Thank you for listening up to this point. I hope you enjoyed the conversation. We're going to keep talking about these topics and other topics in software engineering. Until the next time, thank you all. And that's it for this episode of Bytes in Balance. We hope you enjoyed our deep dive into the world of software engineering. Thanks for tuning in.
We would love to hear your thoughts, so don't hesitate to reach out. Connect with us on LinkedIn to continue the conversation or simply follow our updates. You'll find the links in the episode description. We aim to release at least one episode a month, but with our busy lives, it might vary. Subscribe to stay updated and you might catch some surprise episodes when we're feeling extra chatty. If you are enjoying the show, please rate, review, and share it.
with your friends and colleagues. It really helps us reach more people in the community. To learn more about the podcast, check out our website. The link is in the episode description. And if you're looking for more personalized guidance, we're available for mentoring through Mentor Crews. And there's a link for that too.
That's all for now. Until next time, keep coding, stay sane, and remember, even when it feels like a total shit show, you got this. Thanks again for listening, and we'll catch you on the next episode.