How to interview
Reading about hiring processes on different teams - and regularly participating in them ourselves - we never stop being amazed at how fundamentally broken things are. People hide their insecurity and fear of responsibility behind meaningless rituals and redundant filters that do more harm than good, and refuse to acknowledge a simple fact: the ability to conduct even the most technical of interviews is a soft skill that can and must be developed by anyone who faces this task.
We decided to share our experience. Perhaps someone will find it useful (we understand it won't resonate with everyone).
Collectively, our team has been through it all - individual contributors, tech leads, and founders of startups with significant investment. We've been on both sides of the table many times: as interviewers and as candidates. We've walked the path from complete self-doubt to building a reputation as the people who can be trusted to assemble a strong engineering team from scratch. You're certainly welcome to take our words with a grain of salt, but as it happens, it's our team that most often ends up being responsible for hiring decisions in the projects we participate in - despite being, first and foremost, engineers.
We should note right away that everything we discuss below is only part of the bigger picture. We understand that. Our team specializes in areas of development where vibe-coders and bootcamp graduates have little to offer. It's a specific sector - let's call it "R&D" - where the rules are somewhat different and where independence, confident expertise, and intellectual flexibility are valued far more.
Bulk Purchasing
We believe firmly that the team - its composition, working energy, and internal engineering culture - is the team lead's (or tech lead's) responsibility. A team lead should be able to fine-tune their team. A team lead should understand, figuratively speaking, how many "archers," "swordsmen," and "cavalry" they need to accomplish the mission. A team lead should feel the balance and clearly understand that developers can be very different, yet each can be valuable in their own way. Someone keeps a steady pace and is predictable. Someone is complex and independent, but delivers genuinely great ideas and thinks at a systems level more often than others. And someone feels completely at home in a domain that is complex and critical to the project. Everyone should be in the right place, and each person requires a slightly different approach.
In many large companies, the hiring process is maximally separated from the development process and resembles a conveyor belt, where candidates are filtered out based on maximally formal criteria - identical for everyone.
Here's the thing: the most interesting candidates, in our experience, ALWAYS don't quite fit in some way. Thus, the "bulk purchasing" approach, on one hand, prevents team leads from building teams according to their own criteria shaped by the specifics of the project, and on the other hand, too often leaves truly experienced and most interesting candidates behind - fostering the dominance of mediocrity.
Memory ≠ Knowledge
Good memory is typically valued highly. A walking encyclopedia is often automatically considered an intellectual. But the ability to FORGET rarely-used things is also a very important and underrated feature of our brains. It's largely what defines neuroplasticity - the ability of neurons to retrain, and therefore adapt to new conditions and seek new solutions.
Far too often, interviews try to test how well a candidate REMEMBERS specific trivialities and details of a tech stack. But good, experienced developers have already forgotten more than newcomers ever knew. That's normal, and that's fine. Restoring knowledge and reminding yourself of something is much easier when you've internalized the fundamental principles through experience.
Therefore, testing knowledge of facts is far LESS valuable than testing understanding of principles. Moreover, answers to the typical "tricky" questions that are supposedly meant to reveal the depth of a candidate's knowledge most likely reveal only one thing: how many typical interviews the candidate has attended before meeting you.
Engineering Culture
But when it comes to testing (and understanding) principles, technical interviewers themselves often struggle. So what do you do in that situation? Our answer: make sure the person has some principles at all, that they've formed their own ideas about engineering culture and their place within it.
For example, if you're a devotee of the functional programming paradigm - explain why. Explain why you're not a devotee of other paradigms. Give a practical example. If a person can maintain a constructive dialogue on such a topic, they're already worth something. Even if you disagree on some points, that's not what matters. You simply can't memorize the "right answers" here, because there are no right or wrong ones. Or rather, the evaluation criterion isn't some factual correctness, but the existence of a formed opinion as such. If a person is just parroting someone else's words - trust us, it shows.
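A dialogue like this is easiest to ground in something concrete. Here is a minimal sketch in Python (our own illustration; neither the language nor the example is prescribed) of the kind of contrast worth asking a candidate to reason about:

```python
prices = [100.0, 250.0, 40.0]
DISCOUNT = 0.1

# Imperative style: an accumulator is mutated step by step.
def total_imperative(items):
    total = 0.0
    for p in items:
        total += p * (1 - DISCOUNT)
    return total

# Functional style: the same computation expressed as a pipeline of
# pure transformations - no mutable state to track while reading it.
def total_functional(items):
    return sum(p * (1 - DISCOUNT) for p in items)

# Both styles compute the same result (up to floating-point noise).
assert abs(total_imperative(prices) - total_functional(prices)) < 1e-9
```

The point isn't which version is "right" - it's whether the candidate can argue, with examples like this in hand, why they reach for one style over the other and what trade-offs they accept in doing so.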
Brick in the Head
There's also a flip side: sometimes we notice experienced developers blindly clinging to certain principles, opinions they're not willing to reconsider even when faced with a stream of counterarguments. We call this "a brick in the head." The mechanism behind such a "brick" is quite understandable: a person feels that spending time discussing something that's been discussed a million times and on which they've reached a comfortable internal consensus is pointless. This can be justified, but in a dynamically changing environment, there's a high probability that some "immutable" things have become outdated or no longer align with the chosen strategy.
In such cases, it's important to verify that a candidate is fundamentally capable of revising their opinion. At minimum - capable of constructive debate on any topic, without exception.
Professional Breadth
This is a crucial point that helps distinguish those who simply "do a job" from those with a calling. The former are often completely indifferent to what happens beyond their immediate responsibilities and chosen stack. Sometimes even the stack itself wasn't chosen deliberately - it just happened to be what the previous project used.
We always try to find out whether a candidate who has chosen React, for instance, has at least a general understanding of how it conceptually differs from other popular libraries and frameworks serving a similar purpose. Conscious technology selection is one of the main hallmarks of a mature specialist.
Furthermore, we believe that the best engineering solutions most often emerge at the intersection of different competencies. Therefore, diverse experience and a general interest in what's happening in the industry are a clear plus for any candidate.
The AI Impact
Regardless of how you feel about it, it's already hard to imagine effective work in IT without AI assistants. You can mock the vibe-coding devotees, but if you're completely ignoring the practical benefits of AI - something is clearly off. In any case, AI is reshaping patterns of effective work, and this can no longer be ignored.
And AI is very much affecting the hiring process. The most obvious impact is two-fold: on one hand, recruiters have gained new tools for automating routine tasks (resume analysis, initial profiling, etc.). On the other hand, candidates now have tools for automatically tailoring their resumes and other personal brand artifacts (portfolio sites, GitHub profiles, etc.) to match filters. Both of these create a separate meta-game where the candidate's real qualities and the original goals of recruiting fade into the background and gradually stop influencing the process as intended.
Once, we stumbled upon a profile of one of our team members compiled by some trendy service, and were quite amazed at the sheer nonsense it contained.
Beyond that, the take-home assignment seems to have completely lost its relevance as an evaluation method. Indeed, what's the point of asking candidates to solve problems that AI solves for them?
Our approach: accept this reality. We dedicate a separate section of the interview to discussing AI. We ask candidates about their experience - successes and failures - and what interaction model they've arrived at. We ask about AI tools and their evolution. We ask how they see their professional growth strategy in light of all this. We ask for an example of a task that AI, at its current stage, would handle poorly. We ask them to reason about how the problem could be solved better.
Ultimately, we might agree on a test assignment that the candidate completes with unrestricted AI assistance, and then review the result together afterward.
All of this is excellent at revealing both how well a candidate fits into the modern paradigm and their general intellectual level.
Proving Superiority
Sometimes (quite often), an interview devolves into the interviewer's attempts to bolster their ego at the candidate's expense. Technical interviews are often conducted by people pulled away from their "primary" work and, consequently, not particularly prepared for the nuances of the hiring process.
Of course, it's nice to know the answers to your own questions in advance. For some, this instills a sense of superiority. Some start behaving arrogantly: dispensing advice from their elevated position, theatrically expressing surprise that a candidate doesn't know some "basics" by heart. Yes, yes, buddy, we get it - your ideal candidate is yourself. But there are exceptional people out there who understand many things better than you do. Identifying and acknowledging this isn't shameful. On the contrary, it's a sign of your own professionalism.
Our advice to interviewers: hold yourselves back. Try putting yourself in the candidate's shoes. One technique we sometimes use: we ask the candidate to imagine that they're interviewing us. We ask them to pose questions that they personally consider important. And then we try to answer them. This can significantly ease the tension, and simultaneously reveals a great deal about the candidate: their priorities, approaches, and ability to lead a dialogue.
Prepare for the Interview
It's not just the candidate who should prepare for an interview. If you're a technical interviewer - set aside at least half an hour before the meeting. Study the resume. Check their GitHub. Skim the blog, if there is one. Think in advance about which questions would be appropriate for this specific person, rather than for an abstract "frontend developer with three years of experience."
An unprepared interviewer is, first and foremost, disrespectful. The candidate sees that you didn't spend a single minute getting acquainted with their background, and immediately understands: people here are treated as disposable resources. A good specialist will turn around and leave, and you won't even realize who you've lost.
Moreover, preparation saves your own time too. You already understand where the candidate's strengths lie and can quickly move to the topics that truly matter, instead of wasting time on ritualistic checklist drills.
Personal Brand
We certainly don't mean a glossy wrapper when we talk about personal brand. What we primarily care about is observing how a person organizes their public space. GitHub profile, personal site, blog, Stack Overflow answers, articles, talks - all of this, taken together, can tell you far more about a candidate than a formalized resume.
Pay attention to whether the candidate writes thoughtful READMEs for their projects. Whether there are tests. How tidy the code is in pet projects where nobody's watching. This is "engineering culture in its natural habitat" - when a person delivers quality not because the process demands it, but because they simply don't know how to do it any other way.
The absence of public artifacts isn't a deal-breaker, but their presence is a powerful signal. A person who spends their free time sharing knowledge or creating things beyond work tasks almost certainly belongs to those with a calling, not just a job.
Saving Time
Time is the scarcest resource. Both yours and the candidate's. Five-round interview marathons with a crowd of participants, half of whom are there "for quorum" - this is absurd, and unfortunately, many have gotten used to it.
Our approach: one thorough technical interview, about an hour long, in a live dialogue format. If that's not enough for us to reach a decision - then the problem is with us, not with the number of rounds. Adding rounds typically doesn't add new information - it merely multiplies subjective opinions from people who saw the candidate for 15 minutes and tend to judge by first impressions.
Respect other people's time. If you realize within the first 10 minutes that a candidate is categorically not a fit - don't drag things out "out of politeness." Honestly explain your concerns and give the person the opportunity to either change your mind or save the remaining time.
Breaking the Ice
The first few minutes of an interview are critically important. The candidate, in most cases, is nervous. A nervous person thinks worse, answers stiffly, and doesn't show their true level. This means your number one task is to help them relax. Not out of kindness, but for the quality of your own assessment.
We usually start with something informal. Ask about something we noticed during preparation - their pet project, an article, a recent talk. This immediately shows that you spent time preparing (see above), and shifts the conversation into dialogue mode rather than interrogation mode.
Another technique: start with yourself. Briefly describe the team, the project, the current priorities. This levels the playing field: you're sharing information, not just demanding it. The candidate gets the feeling of a mutual process, not an exam.
Dream Stack
One of our favorite questions: "If you were starting a project from a clean slate, with no constraints - what stack would you choose? And why?" A variation: "Describe your ideal toolkit."
This question works on multiple levels. First, it tests breadth - a person who only knows one stack won't be able to articulate why they'd choose it. Second, it reveals priorities: someone will talk about DX and development speed, someone about performance, someone about scalability, and someone about simplicity for the team. There's no right answer, but there's a revealing one.
Third, this question often leads to a very lively discussion where both sides can learn something new. We've personally discovered interesting tools during interviews more than once. And if a candidate can teach you something - that's a very good sign.
Anti-Patterns (The Saboteur)
One of our signature techniques. Sometimes it's much easier to build a conversation not around how things should be done, but around how they definitely shouldn't. Show the candidate a rough code snippet and ask: "What's wrong here? What would you change?" Or even simpler: "Tell us about the most horrible code you've ever seen. What exactly was bad about it?"
This works because criticizing the bad is psychologically easier than describing the ideal. The candidate loosens up, starts speaking more confidently, and you gain access to their real value system. What a person considers an anti-pattern directly reveals which principles matter to them.
Moreover, the depth of critique is very telling. A junior will say "there are no comments." A mid-level developer will notice architectural problems. A senior will point out non-obvious consequences of the decisions made, edge cases, and how this code will live six months from now. The very ability to perceive "sabotage" isn't about knowing rules - it's about engineering maturity.
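As a purely hypothetical example of such a "rough snippet" (our own fabrication for illustration - the function name and numbers are invented, not taken from any real codebase), something like this packs several classic Python anti-patterns into a few lines:

```python
# A deliberately flawed snippet to hand a candidate. Problems to spot:
# 1. Mutable default argument: the `cache` list is created once and
#    shared across ALL calls of the function.
# 2. Bare `except`: silently swallows every error, including typos.
# 3. An unexplained magic number.
def get_user_score(user_id, cache=[]):
    try:
        cache.append(user_id)        # grows forever, across unrelated calls
        return user_id * 42          # magic number
    except:                          # hides any real failure silently
        return None

# Demonstrating problem 1: the "empty" default list remembers past calls.
get_user_score(1)
get_user_score(2)
print(get_user_score.__defaults__[0])  # → [1, 2]
```

A snippet this small is enough to see the layered critique described above: one candidate stops at surface style, another spots the shared-state bug, a third starts asking what failure modes the bare `except` is papering over.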
Solving Problems Together
Instead of classic whiteboard exercises, we prefer a different format: let's solve a real problem together. We describe a task close to what the team is working on, and we think through the solution collaboratively. The candidate doesn't write code under our silent observation - we have a dialogue: discussing approaches, debating trade-offs, sketching diagrams.
This format yields incomparably more insight than watching someone sweat over an algorithmic puzzle. You see how the candidate thinks under conditions close to real work. How they ask clarifying questions. How they respond to your suggestions - accepting blindly or arguing with reason. Whether they can decompose a problem. Whether they consider edge cases.
In essence, you're simulating the actual work process. And there's simply no better test for what it will be like to work alongside this person day to day.
Analyzing Results
After the interview, while impressions are fresh, we write brief notes on each candidate. Not five-point scale ratings, but actual notes: what was memorable, what raised concerns, where the strong moments were. Formal scores tend to average out and lose context. But live observations preserve the nuances that later prove decisive when comparing several candidates.
It's important to separate facts from feelings. "The candidate didn't know what the Event Loop is" - that's a fact. "The candidate seemed unsure" - that's a feeling, and it could stem from a dozen reasons unrelated to competence. Consciously separating these two categories is a key skill worth developing.
And one more thing: analyze not just the candidates, but yourselves. If you're systematically rejecting people who then successfully land positions at strong teams - perhaps the problem lies in your criteria, not their qualifications.
Finding Candidates and Initial Screening
The best candidates, as a rule, aren't looking for work. They're already employed, and doing quite well. This means passively waiting for responses to a job posting is a suboptimal strategy by default. Active sourcing, networking, and referrals from the team - all of this works significantly better.
As for initial screening: we've become convinced time and again that resumes are an extremely unreliable source of information. Many strong engineers write terrible resumes, and a polished resume, as we've already discussed, can now be produced by anyone with the help of AI. Therefore, during initial screening, we try to look for more reliable signals: referrals from people we trust, public activity, and code quality in open-source projects.
One of the best sources of candidates is professional communities. Open-source projects, specialized conferences, meetups, professional chat groups - that's where you find people for whom technology is more than just a way to earn a living.
Automation
Automate the routine, but not the decision-making. Automated invitation emails, calendar scheduling, collecting feedback - all of this lends itself perfectly to automation and relieves people of unnecessary overhead.
But automating candidate evaluation is a slippery slope. Automated syntax quizzes, timed algorithmic challenges, resume keyword scoring - all of this creates an illusion of objectivity, but in reality builds filters optimized for the average candidate. And the average candidate is precisely what you don't need in the long run.
Use automation to free up time for genuine human interaction, not to replace it. Machines still don't understand people very well.
In closing, we'd like to say this: the ability to hire well is an engineering skill just like the ability to write code. It requires practice, reflection, and the willingness to acknowledge mistakes. A perfect process doesn't exist, but there is a mindful approach where you genuinely try to see the person behind the pile of formalities. And if you've read this far - it means you care. That's already half the battle.