Hacker News

I'm not going to dive into the specifics of my thoughts on this question. I think a lot of comments here address this.

But does anyone else get embarrassed of their career choice when you read things like this?

I've loved software since I was a kid, but as I get older, and my friends' careers develop in private equity, medicine, law, {basically anything else}, I can tell a distinct difference between their fields and mine. Like, there's no way a grown adult in another field evaluates another grown adult with anything like the mechanism we see here. I know this for a fact.

I just saw a comment last week of a guy who proudly serves millions of webpages off a CSV-powered database, citing only reasons that were also covered by literally any other database.
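For what it's worth, the pattern being mocked is easy to picture. A minimal sketch (purely hypothetical, not that commenter's actual setup) of serving page lookups straight out of a CSV loaded into memory:

```python
import csv
import io

# Hypothetical: load an entire CSV "database" into a dict keyed by id.
# Workable for small, read-only data; everything a real database adds
# (indexes, concurrent writes, transactions) is simply absent.
CSV_DATA = """id,title,body
1,Hello,First page
2,World,Second page
"""

def load_pages(text):
    reader = csv.DictReader(io.StringIO(text))
    return {row["id"]: row for row in reader}

PAGES = load_pages(CSV_DATA)

def render(page_id):
    page = PAGES.get(page_id)
    if page is None:
        return "404 Not Found"
    return f"<h1>{page['title']}</h1><p>{page['body']}</p>"
```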

It just doesn't feel like this is right.



It's because our profession seems to attract a number of intellectually insecure people who thrive on these little puzzles and the associated minutiae as a means of feeling a sense of superiority. They all think they've figured it out: the best way to tell if someone is a Good Dev or a Shit Dev. Of course, administering the test, there's no way they could possibly be anything but the Good Dev. It is embarrassing. Don't believe me? Why can't they help but blog about it?


We have FAANG to thank for this. They pioneer counterproductive interview questions (remember puzzles?), and the industry just copies these trends, like zombies.

Then what happens is these Leetcode heroes, never having built anything from scratch in their lives, create a hazing ritual for new candidates.

What is the point of, say, a system design interview asking to design a planet-scale system when most people never came close to one or, again, never built one because they are just out of school?

And, yes, I know - "how would you interview a recent student?".

Fine, I was a student in 2004, so why are we having the same goddamned test?


> We have FAANG to thank for this.

Not entirely, though FAANG companies certainly didn't do anything to help make it better.

I'm 51 and have been a professional software developer since the early to mid 1990s and tech interviewing was already a strange hazing ritual disaster before FAANG existed.


I distinctly remember interviewing for a QA position in the 90's where I was asked how I would test a soda vending machine. I was dinged because I didn't mention that I would make sure the sodas were cold.


Actually Microsoft started this in the 1990s, long before FAANG was a thing. They just all adopted it.


“Why are manhole covers round?” “How many dry cleaners are in the city of Seattle?” were both infamous Microsoft interview questions.


Previously known as "Fermi questions"- commonly asked of undergrads and grad students in quantitative fields.


I think quant finance is similarly bad


The real problem: How do you accurately evaluate a candidate without these questions? I dislike them as much as the next person, but I don't know a better way. Literally all the best tech firms and ibanks do it. They pay the most and are the most profitable. They must be doing something right. The real trick is finding a "fractally complex" question with many levels. The one in this blog is pretty good! No candidate can solve it perfectly in the allotted time, but you can have a good discussion about the "edges" of the problem -- where good candidates shine.


How do you accurately evaluate candidates with them?

It's an artificial test that doesn't reflect your working environment at all, and so you're not actually getting to see what they'd be capable of doing faced with real world coding work.

It's a discriminatory practice that has been shown to be bad for a lot of neurodivergent candidates, like folks with autism or ADHD.

You end up eliminating a whole lot of good candidates just by the structure, then end up picking a candidate who happens to do well out of that small subset, and it still won't stop you from hiring developers who are terrible.

One of the worst developers I've ever worked with will absolutely sail through leetcode exercises. He's brilliant at it. Something about his brain excels in that environment. If only work was like a leetcode interview, filled with leetcode style questions, and someone watching and evaluating his work as he did it, he'd be a fine hire anywhere.

He can't write actual production software to save his arse; he needs deadline pressure breathing down his neck, and what he produces at the last minute (always technically making the deadline) will probably work, but it's the most unmaintainable nightmare, and it needs to be rejected and redone by another developer.


I have had the exact same experience with the exact same kind of person. We were on the same team for years, and after we collaborated once, I absolutely hated any work interaction related to him or his work. Outside of work we got along well; he was fine, a bit socially stunted, but that's OK.

When layoff time came, everyone had to scramble to move themselves to a small selection of teams. The second I heard that guy was interviewing for the same team as me (I had gotten an offer at that point), I told the hiring manager: me or him. That's how bad working with those kinds of people can be. He ended up elsewhere and I'm still on that team. I just could not deal with him anymore. Perfect at interviews, but couldn't write production code to save his life... He's still employed, by the way, bumbling around from team to team, because the consequences of his incompetence take months or years to be felt...


>> They must be doing something right

What tech companies did right was find a ridiculously profitable business model. It is not clear that their success is correlated with their hiring practices. Likely other hiring practices would have worked fine as well.

>> Literally, all the best tech firms and ibanks do it. They must be doing something right.

Reasoning by first principles isn't exactly the software industry's strong point.


> What tech companies did right was find a ridiculously profitable business model. It is not clear that their success is correlated with hiring practices.

Agreed though I'm not sure I'd be as generous as you are when it comes to their business models being that great in absolute terms.

Strip away all the confirmation and survivorship bias and IMO it is pretty obvious a lot of the success of tech in general for multiple decades running was almost entirely the result of the free money lottery system funded by zero interest rates.


It's always

- someone else's money (zirp, middle-eastern sugar daddies, juicy government contracts)

- adverts


I really counsel everyone to stop thinking like this. Appealing to authority doesn't work when you are talking about different-sized businesses.

All the best banks and tech firms do a lot of things that could be categorized as wasteful, useless, inertia-maintaining, etc.; adopting their practices without a thorough understanding of whether they apply to your business is plainly stupid.

Your business is not structured like those big businesses, you are not as averse to risk as they are (otherwise you wouldn't have created your business in the first place), you don't have their cash, and you don't have their ability to spend an enormous amount of time hiring every single person, because you don't have the profits that cushion them from all their dumb decisions.


This is just my own experience, but I find evaluating interest and intellectual curiosity to be more predictive long term than the ability to rote-memorize problems.

Edit: To add some color, I want a candidate who is excited to program, I don’t care as much about their ability beyond having the basics which I find pretty easy to figure out in an initial conversation. Candidates who are excited for the opportunity are generally the ones who I find to excel in the long run.


If you're involved in the hiring process at your org at all, and they ask these types of questions, I'd encourage you to evaluate, as objectively as possible, how much signal they actually provide.


In my experience the signal is pretty strong.


Are you able to share how you evaluated this? Is this based on gut-feeling or is it data-driven?


Gut feeling based on the generally very high competence of my colleagues when I was at Google.


I trust they were really competent, but it's a bit depressing that these competent people will need to go through the leetcode rituals and 5-10 interviews to get a new job at Meta, Netflix, AWS, or adjacent companies. That's actually the point of the original post: you are never judged by your (years of) experience or even your past companies, only by the results of a test from a random person/company.


I disagree; I don't think a full day of interviews is a huge price to pay for a FAANG job. The stakes are high for the employer, the rewards are high for successful applicants, and if you get to that stage, you've already passed a lower-stakes phone screen.

Put yourself in the employer's shoes: You want a high-quality SWE, you're prepared to pay them top dollar, but if they turn out to be not so great, it's expensive to get rid of them. Would you be satisfied by years of experience at other companies by itself, when you know that (like in many job markets) there's a big market for lemons? I wouldn't. I would want to see the candidate demonstrate some specific skills -- ideally the skills they'd be using day-to-day, but if that's not feasible for time reasons (it usually isn't), then adjacent skills that generally imply (though are not necessarily implied by) them, like recognising the shape of a toy problem, and knowing and applying the right algorithm to solve it.


I am not commenting on the interview effort-to-salary ratio, but on the fact that credentials and experience mean nothing to the tech industry, compared to other professions. I mean, working at Google/Meta/Netflix is like working at the best hospital if you are a doctor, the best construction firm if you are an engineer, the best law firm if you are a lawyer, etc. Imagine having to pass a leetcode or IQ test every time you want to move to the next one. I definitely know that my cousin, who is an exceptional doctor in Greece with only 10 years of experience, laughs about it.


I think it should be the other way round: In an ideal world, everyone, including doctors and executives, should expect to have to demonstrate their skills when applying for a new job.

It's just unfortunate that there isn't (TTBOMK) an easy way of measuring, in a few hours, how good an executive is at executive-ing. Which is why the field is crammed with useless pretenders who nevertheless manage to flit from job to job, soaking up fat compensation packages before their incompetence is exposed.

I'm curious why there doesn't seem to be the same market for lemons for doctors. Or is there? How does a person actually know if a doctor is good?


    > executives, should expect to have to demonstrate their skills when applying for a new job
They do. The interview process is a review of your "wins and losses" (public and verifiable is the gold standard), plus you need to complete some case studies.


That's better than nothing, but not much better -- many people have an outsize ability to "dress up" any arbitrary thing as a win, and in that case what the interviewer is really testing is a mixture of (1) that person's ability to produce wins, and (2) that person's ability to dress things up as wins.

(As far as I can see, there's no obviously better way to interview executives, since in general their actual day-to-day work -- building relationships and making strategic decisions -- takes months or years to prove out, and in any case the outcomes are heavily dependent on external factors and their ability to read them.)


    > experience mean nothing to the tech industry
This is not true. It is basically impossible to hit L7 before 30 years old. A lot of it is indirectly related to experience: What have you accomplished for us. MDs on Wall Street trading floors are similar. It is very hard (nearly impossible) to make MD before 30 in this era.


I am glad that you mentioned Google. At this point, their interview process is legendary. It is so good that many other companies have tried to copy it.


For me too, anyone that does horribly, bad signal. Anyone who does perfectly, bad signal.

I've found that looking for mediocre and sub-par results will give you professionals that spend their time getting good at the profession instead of getting good at leetcoding.

I have never and will never hire code monkeys. AI already takes care of that.


ask deep technical questions and evaluate the response.

I've had better success, by a wide margin, doing this, than any code challenges ever gave.

I don't know why the industry is so averse to this, it really does work, and I know others who also swear by it.

You can find the bullshitters pretty quickly even in this ChatGPT driven world


I think the pushback here is from the following:

1. Because it is so dynamic and subjective, it is very hard to systematize this kind of interview, which makes it very hard to work into a repeatable or scalable process. The need to compare incoming candidates is an unfortunate reality of the industry for many companies.

1b. It is basically impossible to control for bias and things like "was the interviewer having a good day".

2. This kind of interview overly rewards charismatic speakers -- this is partially ok, depending on the role, because being able to speak accurately and cogently about the work you're doing and are going to do is part of the job (especially for staff+ engineering). It isn't necessary for all jobs, however.

3. Many people aren't good at driving this kind of conversation for whatever reason. When it goes well it goes well but when this kind of conversation goes poorly it reflects badly on the company.

4. People Ops want more control than this gives them, across many dimensions.


"Write some code to solve this tricky problem" seems like a deep technical question to me. Can you give some examples of what you have in mind?


define "tricky problem". Given the nature of something being tricky, it could well be too heavily reliant on the 'trick'.


I had in mind the problem given in TFA, but I should have been explicit about that.

I think it's probably too hard as an interview question, but also that it actually is a somewhat realistic problem that I'd expect a senior programmer to (eventually) puzzle out correctly. (As opposed to, say, the archetypal bad interview question, "How do you swap two integer values without using a temporary?", where many people happen to already know the trick, and if you don't, you're unlikely to figure it out on your own in the space of an interview.)
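For reference, the trick being alluded to -- swapping two integers with XOR and no temporary -- rendered in Python:

```python
def xor_swap(x: int, y: int) -> tuple[int, int]:
    # Each XOR folds information about one value into the other;
    # after three XORs the two values have exchanged places.
    x ^= y
    y ^= x  # y now holds the original x
    x ^= y  # x now holds the original y
    return x, y
```

The classic gotcha (in languages with pointers) is that if both operands alias the same memory location, the first XOR zeroes it, which is exactly the kind of trivia a know-the-trick question rewards.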


    > deep technical questions
Can you provide some examples?


not OP, but I was once asked to walk through everything from the HTTP layer down to the PHY layer (a role for an embedded system)


> The real problem: How do you accurately evaluate a candidate without these questions?

The fact this question needs to be asked really reinforces the parent's point.

Perhaps we should examine how other respectable fields validate their candidates and follow suit?

If we don't have any form of standardization for this stuff, I think that speaks more to the lack of maturity of our field than anything else.


Do other fields really have this figured out?

I guess bridges, buildings and houses generally don't fall.

Is that only due to hiring though? It seems more like physics doesn't change. And people who can do audits and inspections are probably pretty good.

I did an audit for my software at work. It's like talking to a child. Talking to a home inspector is a way different experience.


> Literally, all the best tech firms and ibanks do it.

So do all the worst - look at the extent to which hiring this Soham Parekh fellow has become a badge of honour instead of abject failure.


Hiring that guy is a badge of honor because admitting otherwise would be to admit that the interview processes have nothing to do with the job, thus the real incompetence here is with the companies and people following that interviewing style.

Instead of correcting themselves, those interviewers chose to dive deeper into delusion.


Accurately evaluating a candidate is impossible, but surely asking a candidate these questions will yield a lower-quality worker than asking them something related to the work they might actually do.


I disagree. I always go back to the blog posts that Joel Spolsky wrote 20+ years ago about his theory of hiring. The primary goal is to avoid bad hires. Even simple tech problems can easily eliminate (or rank) many software devs. As someone who regularly interviews candidates, I am blown away by how bad many of them are.


I think a lot of fields of engineering have analogous questions, actually. EEs are asked to explain a circuit or draw a new one. MechEs are asked to design some piece of simple hardware with a clever catch. Interviewing is just hard; it's impossible to cover breadth of knowledge in 45 minutes, so we end up with little brain teasers.

This particular question is a bit ill-formed and confusing, I will say. But that might serve as a nice signal to the candidate that they should work elsewhere, so not all is lost.


Lawyers have law school after a degree, a bar exam, legal liability for malpractice, and ongoing licensing requirements.

Medicine has medical school after a degree, a 5+ year residency under close supervision with significant failure rates, legal liability for malpractice, and ongoing licensing requirements.

So explain to us what it is that you "know this for a fact" regarding how they have it easier. Most of the people reading this, myself included, would never have been allowed into this industry, let alone been allowed to stay in it, if the bar were as high as law or medicine.


The difference is that if you fail medicine, it's OK (it's hard). But if you fail to get a job because of the author's stupid "async queue" problem, well, that's depressing.


I'm not so sure. Failing your residency means your medical career is basically done, and you have to basically start a new career from scratch in your late 20s. Chances are you'll have debt from not just undergrad but also med school.

By comparison, failing a leetcode interview means you've got to find a new company to interview with.


How's failing medicine not depressing!


failing medicine is depressing in the original sense of causing personal depression; failing stupid leetcode and not getting the job is depressing in the more modern sense that the world is stupid and not well organized.


That's like, your opinion. Any argument you can make for leetcode, one can make for the medical certification exam as well.


I know it's not a popular opinion here or elsewhere, but since these interviews are so standard, why don't we have a uniform standards body where I can get licensure, do yearly trainings, etc., as a software engineer? It would solve the same problem something like the CPA exam does.


For the same reason we have 2137 libraries, frameworks and languages that do the same thing - because everyone thinks they can do it better.


I don't think the reason we have so many libraries/frameworks is the same reason hiring is copy-pasted throughout the industry


I think we do with OMSCS.


Perhaps the bar should be as high as law and medicine if we want people to take our industry just as seriously.


Nope. In my opinion the Wild West is a much preferable model for software. If one wants to create software and sell it, there should be no barriers. It is one of the very few fields that give ordinary people with no money a chance to break out and live a decent life.


Tbh I think it depends on the domain you are coding for. The field is spread across many different parts of the economy. E-commerce web app? Sure, go for your life -> software for controlling some life support system... yeah, maybe I want someone with qualifications and audited standards, thanks.


Life support and controls system should absolutely have a high standard, but even E-Commerce should have a decent bar. If you're handling my money I expect you to be an adult.


I develop software for various areas. The ones that come anywhere close to regulated areas surely gets audited.


>"software for controlling some life support system..."

I believe there are processes to ensure this kind of software is safe (obviously to a degree).


Sure. But audits/processes only catch so much. In the end the buck stops with a professional. That's what most "professions" are. They aren't just a service -> they are an accreditation with some recourse, which gives practitioners prestige/social status/etc. if they have years of experience (i.e., despite the risk imposed on them as professionals, they have survived/thrived).


>"In the end the buck stops with a professional."

Where did you get the idea that "professionals" do not fuck up? They do it just as much as mere mortals.


Just make sure to save up before ageism kicks in.


It's not common anymore for people in our industry to lack bachelor's degrees. It's also not an industry where I routinely find that the majority of people come from lower economic backgrounds, etc.

I think a fair compromise would be not to require specific degrees to test, but rather a service fee (which could be sponsored). I think a similarly rigorous, standards-based exam would do wonders for our industry, even if it trims who can enter it on the margins.


>"It's not common anymore for people in our industry to lack bachelor's degrees. It's also not an industry where I routinely find that the majority of people come from lower economic backgrounds, etc."

It does not matter what you "routinely find". Live and let live. A person has an inherent right to make a living however they see fit, unless it actively harms others.

If you are so concerned about degrees, why not start with a "decent human" degree and require it from politicians? Those fuckers affect us way more than any software, and they mostly walk free no matter how badly they fuck up someone's life.


Your attitude is completely off-base. Would you get treated by a doctor who was not recognized by the AMA? Would you hire a lawyer who was not called to the bar, or an accountant who was not chartered or equivalent?

Yet somehow a high school education is sufficient to write software for a 4000 lbs vehicle moving at 60 mph.


>"Yet somehow a high school education is sufficient to write software for a 4000 lbs vehicle moving at 60 mph."

Cut the BS please. Safety-critical software gets audited, and other measures are taken to ensure it stays safe, to a degree. However, if one wants to write software for, say, a music synthesizer, the only thing that matters is the person's ability. In that case I would look for experience, a list of completed projects, and other relevant info. I wouldn't give a rat's ass about their diploma. Some of the best / most successful software was created by domain experts who learned how to program.


> Cut the BS please. Safety critical software gets audited and other measures are taken to insure it stays safe to a degree.

Oh? BS is it? Pray tell who's auditing Tesla's software? Or Waymo's for that matter?


I am sure Tesla can show you all the licenses and creds you'll ever need


because our industry would improve massively if we actually removed a barrier to allowing standardized licensure

I also never said it should be held behind a degree, instead I said a fee, which could be sponsored. No degree required, though one certainly would help I imagine.

We live in a society, and we should think beyond the individual in terms of benefits. This would be a big win for society.


>"because our industry would improve massively if we actually removed a barrier to allowing standardized licensure"

I call BS on that, but of course everyone is entitled to their own opinion. Go get your license if you don't have one already.

>"..instead I said a fee, which could be sponsored. No degree required, though one certainly would help I imagine."

We have enough mafia type bloodsuckers. My take on those money collectors: go fuck yourselves.

>"We live in a society, and we should think beyond the individual in terms of benefits. This would be a big win for society."

And who would be thinking? Our masters looking to squeeze yet more money from people? Enough of that "won't anyone think of children" vomit.


What would be on such an exam? Pseudocode, logic puzzles?

Certainly not specifics on any particular technology, right?


those generic screener questions aren't technology-specific. Data structures, algorithms, system design (the top three that show up in interviews) -- none of them technology-specific.

Throw in best practices like TDD, code security, and architectural patterns, and I think you could hit all of the most common technology-agnostic domains.


Many people in software have passed through similarly hard gates in the past. An engineering degree is harder to attain than a law degree for instance. The question isn't about these gates, it is about the interview practice once one is already through. Do law or medical interviews include questions unrelated to the work that they do in a reasonably analogous manner to leetcode? Maybe they do. Perhaps hiring is broken in all fields.


> Many people in software have passed through similarly hard gates in the past.

I didn't. I dropped out of school to work at my first job. That's different from a doctor, nurse, lawyer, CPA or PE who have to meet an industry standard.


Right, certification gate keeping doesn't exist for software. I have an engineering stamp but never got a chance to use it.

The problem is, an engineering stamp or comp sci degree doesn't seem to be particularly predictive of dev capability.


to put it another way, there isn't this much focus on "show me you know this weird problem that I've been studying for 5 years as well as I do -- your 5-minute timer starts now"


Oh yeah. I spoke quite a bit with a doctor who wanted to get into a speciality last year. That shit's hard: an intersection of mentally demanding, hours-demanding, and high bars to entry.

We have it cushy. Really cushy. Unless you are working for a 2025 AI startup that works and pays you like a mule and uses the word "mission" unironically.


> does anyone else get embarrassed of their career choice when you read things like this?

On the contrary, it makes me proud. In private equity, medicine, or law, if you have the right accent and went to a good school and have some good names on your resume, you can get a job even if you're thoroughly incompetent - and if you're a genius but don't have the right credentials you'll probably be overlooked. In programming it still mostly comes down to whether you can actually program. Long may it continue.


Leetcode has nothing to do with actually programming a project that lives long enough to deliver value.


I used to give a code review task, of some particularly egregious python code. I'd provide all help with the syntax, and emphasise strongly upfront I don't expect them to know python or its syntax. It has proven to be a low stress task for candidates. They're not trying to solve a brain teaser, they're just doing something that's going to be part of the job. Reading code, understanding what it is doing, and providing feedback to make it more maintainable.
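To make that concrete, here is a hypothetical snippet in the spirit of such an exercise (not the interviewer's actual code): deliberately rough Python, with the kind of feedback a candidate might give written as comments, followed by one fix a candidate might propose.

```python
# Hypothetical review target, not taken from any real codebase.
def proc(d):                      # opaque name; no docstring or types
    r = []
    for i in range(len(d)):       # index loop where plain iteration would do
        try:
            r.append(int(d[i]) * 2)
        except:                   # bare except swallows every error
            pass                  # silent data loss on bad input
    return r

# A reviewed version a candidate might propose:
def double_valid_ints(items):
    """Double every item that parses as an int, skipping the rest."""
    result = []
    for item in items:
        try:
            result.append(int(item) * 2)
        except (TypeError, ValueError):
            continue
    return result
```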

When all around me in this FAANG type role are engineers giving leet code esque questions, I was trying to be a breath of fresh air for them.

Sadly, I need to rethink this now, because you can throw it in an LLM and it'll give you a nearly complete answer. I've already had one candidate clearly do that.


The most "refreshing" interview I had was one where the guy had a list of like 100 questions about C++, and he would just go through and ask "do you know what SIMD is?" or "are you familiar with smart pointers?" or "tell me about templates" (most were less open ended than the template one). If I responded yes, he'd follow up, and it was more of a discussion. If I said no he'd just move on to the next one (sometimes I'd ask what it was and he'd explain).

At one company I lobbied hard against standardizing our interview on a question designed to test the candidate's multi-threaded knowledge. I insisted that if we needed multithreading, we could just ask the candidate, and ask them to elaborate. Fortunately I won that little battle.

Sometimes in interviews you get tunnel vision and can't see the forest for the trees, and you don't realize that the interviewer is asking you about multithreading because they're being coy. That's the kind of shitty interview we need to avoid, because it leads to the false conclusion that the candidate doesn't know about multithreading when actually you just don't know how to ask.


They build teams in their own broken image of what a good programmer should be, and then get to manager and director and mould entire companies the same way.

They become hotbeds of intellectually rich but functionally and productively inept individuals who value spending resources indulging in esoteric side quests instead of driving businesses forwards in ways that are 'just sustainable enough'.

I've always been on the periphery of FAANG-'level' situations, trying to focus on the surface where tech and humans meet. As the three decades of my career have gone on, software engineering has become more and more ludicrous and bubble-like, to the point where it, and the developers working on it, are now just the annoyance between me and my goals.


The higher layer of people in our industry aren't subjected to those questions. They are evaluated and get jobs more like in law and medicine, i.e., based on connections and track record.

You and I are just not of that higher layer. We're kind of laborers given those simple aptitude tests.

When I was on track to get into the higher layer 15 years ago, I got my last job just by invitation and a half-hour talk with a VP. The next offer and other invitations soon came the same way, yet I got lazy and stuck at my current job, simplemindedly digging the trench deeper and deeper like a laborer.


Of course it's not right. Let's be honest, our profession is in an era where software engineer = factory worker, and the worst part is that we have been playing musical chairs for the last couple of years. All those other professions have some steady status/wealth/QoL progression as people gain years of experience, while in software development it doesn't matter how many years of experience we have, which companies we worked at, their sector, or whether your company is using the SaaS we were working on; we are going to get judged by trivia questions and leetcode.


I think the "sendOnce" question is fine. Software development is just different than other professions, and you get a lot of candidates who talk a good show but can't actually program at all. For a decent dev, this isn't programming, it's typing.

But all the "ok, now add this feature..." stuff is just a sign that the interviewer is an insecure asshole. And you get insecure asshole interviewers in other professions as well, asking about obscure laws or diagnoses.

Software is still a bit of a craft, and it's perfectly reasonable to ask a job seeker on a construction site to go hammer a nail. But nobody is going to follow that up with a bunch of "OK, now do an L-joint" just to show off their own knowledge.
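The thread never restates the blog's exact "sendOnce" spec, so purely as an illustrative guess at the shape of such a question: a wrapper that guarantees the wrapped function fires at most once, with the name `send_once` and its dict-based state being my own inventions.

```python
import threading

def send_once(fn):
    """Wrap fn so only the first call actually runs; later calls
    return the cached result. The lock guards racing threads."""
    lock = threading.Lock()
    state = {"called": False, "result": None}

    def wrapper(*args, **kwargs):
        with lock:
            if not state["called"]:
                state["result"] = fn(*args, **kwargs)
                state["called"] = True
        return state["result"]

    return wrapper
```

The interview mileage comes from the follow-ups: what if `fn` raises, should the args of later calls matter, does the lock need to be held while `fn` runs, and so on.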


I had a favourite interview question when I was three years in. My boss tempered my enthusiasm by letting me ask it but making it not a requirement to be hired -- more of a bonus "extension question". Glad they did that. I was being ridiculous. I assumed "anyone who used this framework must have come across this problem", but that was just an assumption.


I had lunch with a friend who was trying to get a job at a law firm, and he told me his interviews were just vibes, and that it would be refreshing if they actually asked him law questions. So maybe things aren't necessarily greener on the other side?


I heard that private equity does the equivalent. They show you the balance sheet of a potential takeover target, then ask for feedback. I assume good candidates will need to do some Excel to answer the questions. Interviewing for trader roles at ibanks is similar: show a trading scenario, ask what they would do and why. I guess law must be similar as well.


The difference to me is this: those are real analyst questions that you will likely have dealt with before in some detail, not an obscure algorithm I maybe haven't seen since college, or a leetcode brain teaser that is neither presented in a particularly real-world way nor relevant.

They're things you actually do, and that I imagine most people applying to these roles have done, either as an exercise (say, in college) or in previous jobs. It's still a practical assessment.

Where software interviewing is different is that the assessments aren't grounded in being all that practical.


Other fields definitely involve similar lines of questioning in interviews. Medicine and law are special cases because they have their own set of standards that must be passed before you can even get an interview, but private equity interviews definitely include case studies/technical questions in a similar vein to the one shared in this post.


Which part, the fact that you have to answer such questions to get a job? Those other fields are more established and have formal barriers to entry.


> But does anyone else get embarrassed of their career choice when you read things like this?

What specifically is embarrassing about it? None of these questions seem especially hard, and they're exactly the sort of problem that I face on a daily basis in my work. They're also fairly discussion based rather than having one silly trick answer (like the XOR trick that came up recently here). The whole point of an interview is to check that the candidate can do their job. What would you propose instead? We don't bother to interview and just cross our fingers and blindly hope they'll know what they're doing?

I can only assume that the real reason for your objection is that your job actually doesn't involve solving problems like these ones. Well, that's fair enough, and then I'd expect the interview for your position to look different. But why would you assume that there are no jobs that really need these skills?

Your comment about using a CSV file for a database seems unrelated. Maybe I missed the real point of your comment?


> None of these questions seem especially hard, and they're exactly the sort of problem that I face on a daily basis in my work.

Really? Do you invert linked lists all day? When was the last time you had to traverse a binary tree? Genuine questions. I'm sure there has to be a mismatch between what we each define as "those questions".

> They're also fairly discussion based

They're also run wildly differently, with no standards at all. I've had good coding interviews where the coding was a starting point for a conversation. But I've also had them super strict, on rails, with the interviewer silent and just expecting you to one-shot the optimal path. The latter is particularly great at hiring professional interviewers rather than actual professionals at the job.


> I'm sure there has to be a mismatch between what we define as "those questions".

When I said "these questions" in my comment, I meant the questions in the article. That's what this discussion is about! Those are not inverting linked lists or traversing binary trees. They're about networking, asynchronous actions, and timeouts.

And yes, I do deal with those things all the time. Maybe not every day, but certainly multiple times in each project. Ever had to deal with a timer that might still trigger even after you've cancelled it [1] (because its underlying implementation has already fired but the callback is still waiting in the completion queue)? Or even trigger twice, because you re-set it while the first callback was still in the queue? That's just one example, but it's exactly the sort of fiddly condition that permeates every corner of a heavily async or multithreaded / distributed system. If your work involves that, then it's totally fair to ask about it in interviews.

[1] https://think-async.com/Asio/asio-1.10.6/doc/asio/reference/...
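The race is that cancel() cannot remove a handler that has already been dequeued into the completion queue. A minimal Python sketch of the usual generation-counter defense (illustrative only, not Asio; the completion queue is simulated with a plain deque, and all names here are mine):

```python
from collections import deque

completion_queue = deque()  # stands in for the event loop's handler queue

class FakeTimer:
    def __init__(self):
        self.generation = 0

    def start(self, callback):
        gen = self.generation
        # Simulate the race: the timer "expires" and its handler is
        # enqueued immediately, before anyone has a chance to cancel.
        completion_queue.append(
            lambda: callback() if gen == self.generation else None)

    def cancel(self):
        # Too late to pull the handler out of the queue; bump the
        # generation so the stale handler becomes a no-op when it runs.
        self.generation += 1

fired = []
t = FakeTimer()
t.start(lambda: fired.append("boom"))
t.cancel()                        # handler is already sitting in the queue
while completion_queue:
    completion_queue.popleft()()  # drain: stale handler sees gen mismatch
assert fired == []                # cancelled timer never "fires"
```

The same check also covers the double-fire case from the comment: re-setting the timer bumps the generation, so only the newest handler's captured `gen` matches when it runs.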

> ... I've also had it super strict on rails, interviewer silent and just expecting you to one-shot the optimal path. ...

Well, I agree, that's bad. But, as you say, the same questions can go either way depending on the interviewer. The very reason that I mentioned these being "discussion based" in my comment was because I took it as read that silly tricky questions are bad and to make the point that these questions don't seem to be designed for that.

Are we not allowed to ask technical questions in an interview just because some interviewers are bad? Should we be "embarrassed" about the questions in the article, as was said in the comment that I was replying to? That was what I was objecting to.


Agreed. This is just terrible for the field as a whole; it's like we're regressing or something.


When I started writing code for a living 30 years ago, we were mostly left alone to solve problems.

Now it feels like I'm back in high school, including strict irrelevant rules to be followed, people constantly checking in on you, and especially all of the petty drama and popularity contests.


Don't know how much experience you have, but other fields have a lot of these. Accountancy has certificates; finance/quant candidates have to solve mental-math type problems.


> It just doesn't feel like this is right.

I know the feeling.

The author says this is one of their favourite interview questions. I stop to wonder what the others are.

When I'm interviewing a candidate, I'm trying to assess really a few things: 1) the capability of the person I'm interviewing to have a technical conversation with another human being; 2) how this person thinks when they are presented with a problem they have to solve; and 3) can this person be trusted with important work?

For 1) and 2), coding interviews and these artificially constructed, unrealistic scenarios really aren't helpful, in my experience. I care a lot less about the person being able to solve one specific problem I hand them, and a lot more about them being able to handle the much more realistic scenario of being handed an ill-defined thing and picking it apart. Those conversations are typically much more open-ended; the goal is to hear how the person approaches the problem, what assumptions they make about it, and what follow-ups are needed once they realise at least one of their assumptions is wrong.

This is a really hard thing to do. For example, I imagine (but do not know) that when a medical practice hires a doctor for a certain role, there is an expectation that they already know how the human body works. For an ER doctor, you might care more about how well that person can prioritise and triage patients based on their initial symptoms. And you might also care about how that person handles an escalation when a patient presents not too awfully but is in fact seriously ill. For a GP, it's probably more important for a practice to care more about healthcare philosophy and patient care approaches rather than the prioritisation thing I mentioned above. I'm spit-balling here, but the point is these two situations are both hiring doctors. You care less about what the person knows because there is a tacit assumption that they know what they need to know; you're not giving the candidate a trial surgery or differential diagnosis (maybe... again I'm not a doctor so I don't actually know what I'm talking about here).

If I'm hiring a software engineer or performance engineer, I am trying to figure out how you approach a software design problem or a performance problem. I am not trying to figure out if you can design an async queue in a single-threaded client. This problem doesn't even generalise well to a real situation. It would be like asking a doctor to assume that a patient has no allergies.

Item number 3) is "Can this person be trusted with important work?" and this is basically impossible to determine from an interview. It's also impossible to determine from a CV. The only way to find out is to hire them and give them important work. CVs will say that a candidate was responsible for X, Y and Z. They never say what their contribution was, or whether or not that contribution was a group effort or solo. The only way to find out, is to hire. And I've hired candidates that I thought could be trusted and I was wrong. It sucks. You course-correct them and figure out how to get them to a place where they can be trusted.

Hiring is hard. It's a massive risk, and interviews only give you a partial picture. When you get a good hire it's such a blessing and reduces my anxiety. When I hire a candidate I thought would be good and they turn out to be an absolute pain to manage, it causes me many sleepless nights.


My favorite interviews I have taken:

* Give us a presentation about a meaningful amount of work you did at a company (no NDA violations of course) and spend at least an hour talking about what you did, how you did it, what went well and what did not, and then be prepared for questions.

* Actual software development issues that the team sees every day (not hard mode, just normal) submitted as a PR - review it thoroughly.

* Given a running code environment, fix a bug (you have a debugger, full ide, and an hour to look at it) - not so much about the end result as seeing how you work and your process.


Long ago, I worked for a contractor company, so my colleagues and I had a lot of interviews to get sub-hired. As we had many of those interviews, we got pretty good at them, and we shared our experiences, knew popular tricky questions (like "swap using [place here any constraint]") and had names for different types of interviews. "Kick in the balls" was one of them, tricky questions which showed nothing, except that somebody solved that particular narrow problem or was already familiar with that question.

Having that experience, I know that the only reasonable interview is an open conversation. What was your last project? What was the architecture? What was problematic? How did you solve that? How did you come up with the solution? And so on. The candidate is relaxed, drinks coffee, and talks. After 1 hour, you know who you're dealing with.

If there's time, a pair coding session is informative. There are many small hints, like using shortcuts, that give away pros.


I have tried the first two things in that list. I haven't tried the third. I would need to think about how to do that in a way that is general enough that it can be re-used across candidates with different skill sets. I like the idea, though. Thanks for sharing it.


It's very specialized. Since I was hiring for data roles, I used hex.tech with some examples and it was great.


Well, that's because in law, you can't attend a 4-week bootcamp, call yourself a lawyer, and inflate the ranks of the desperate looking for a job. Now, also remote.

Likewise, you can't attend a 4-week bootcamp, call yourself a doctor, and inflate those ranks either. And at the very least you have to show up in person; no remote.

Much of software engineering's problem stems from the highly inflationary nature of its people. There are a trillion of us already and more keep coming. That's what you get when there's a dozen for everything: a dime's worth.


Isn't it because there is a real difference between your field and other fields?

1) Scope - Other fields like law and medicine impact one unit at a time, whereas software impacts a large number of users through your work. I'm sure research interviews go through a similar process?

2) Feedback - In those fields, past work alone gives you a good sense of someone's aptitude. That's very hard to do in programming without spending a lot of time going through their work?

3) Subjectivity - With coding, getting the other person to write code is a very good way to get an objective signal in an interview; you can't do that in medicine, for example?


SW engineering is rather young compared to others, like construction, medicine, law, etc. It doesn't have good established patterns that are followed by everyone.


There are formal regulatory boards for most of these. Lawyers gotta pass the bar, medical folks have their own board exams, finance has things like CPA or CFP certifications, etc.

Most Mechanical / Electrical / Civil Engineers have formal accreditation processes, too.

IT is something of a clusterfuck but even there we have things like CCIE or RHEL or Windows Certs that can prove a minimum level of competency.

The lack of regulation makes it easy for anyone to be a dev but also means there is no formal minimum standard.


> But does anyone else get embarrassed

No.

> there's no way a grown adult in another field evaluates another grown adult …

There is. Demonstrating competency over a common and interesting problem is the baseline.

Are queues commonly needed? Yes. Is task processing commonly async? Yes.

Fantastic, then what precisely is the problem?

What is most puzzling to me is that people are confounded when even low standards are set for each other. This isn’t a high bar.
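To make the "queues plus async task processing" point concrete: the combination is small enough to sketch. An illustrative asyncio version, with a sentinel for shutdown (all names here are mine, not from the article):

```python
import asyncio

async def worker(queue, results):
    """Drain the queue until a None sentinel arrives."""
    while True:
        item = await queue.get()
        if item is None:          # sentinel: shut down cleanly
            queue.task_done()
            break
        results.append(item * 2)  # stand-in for real task processing
        queue.task_done()

async def main():
    queue = asyncio.Queue()
    results = []
    task = asyncio.create_task(worker(queue, results))
    for i in range(3):
        await queue.put(i)        # producer side
    await queue.put(None)
    await queue.join()            # wait until every item is marked done
    await task                    # worker has exited via the sentinel
    return results

print(asyncio.run(main()))        # -> [0, 2, 4]
```

The interview-relevant discussion is around the edges: who owns shutdown, what happens to items enqueued after the sentinel, and whether `task_done`/`join` bookkeeping stays balanced.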


Not really. These other industries are evaluating people somehow. Whether it's vibes, connections, resumes, or some sort of technical evaluation, you'll have a grown adult evaluating you. Hiring will always feel a bit lame and arbitrary: you have limited time and information to pick someone. You're not going to be able to understand candidates fully, and you'll probably pick wrong a fair bit. So we come up with criteria that are a bit arbitrary, but meeting them takes at least some effort and smarts, so they're at least somewhat correlated with a good hire. I don't think the non-technical methods are any more or less dignified.



