Many Hard Leetcode Problems Are Easy Constraint Problems
Key topics
The article discusses how many hard LeetCode problems can be easily solved using constraint solvers, sparking a debate on the relevance of such problems in interviews and the value of constraint solvers in real-world applications.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 8m after posting
Peak period: 120 comments in the first 12 hours
Average per period: 20
Based on 160 loaded comments
Key moments
- Story posted: Sep 12, 2025 at 10:44 AM EDT (4 months ago)
- First comment: Sep 12, 2025 at 10:52 AM EDT (8m after posting)
- Peak activity: 120 comments in the first 12 hours (the hottest window of the conversation)
- Latest activity: Sep 20, 2025 at 8:07 PM EDT (3 months ago)
Now, if they did answer with a constraint solver, I'd probably ask some followup whiteboard questions to make sure they do actually know how to code. But just giving a constraint solver as an answer definitely wouldn't be bad.
A useful trick, if you can't come up with a custom algorithm and libraries aren't allowed during the interview, is to be ready to roll your own DPLL-based solver (it can be done in about 30 lines of code).
Less elegant, but it's a one-size-fits-all solution.
Otherwise, customize DPLL for the particular problem.
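A rough sketch of what such a minimal solver might look like (this is my own bare-bones illustration, not the commenter's code; clause representation and function names are assumptions, using the DIMACS convention where `3` means variable 3 is true and `-3` means it is false):

```python
def simplify(clauses, lit):
    """Assume `lit` is true: drop satisfied clauses, strip -lit from the rest."""
    out = []
    for clause in clauses:
        if lit in clause:
            continue                       # clause satisfied, drop it
        out.append([l for l in clause if l != -lit])
    return out

def dpll(clauses, assignment=None):
    """Return a satisfying assignment (dict var -> bool) or None if UNSAT."""
    if assignment is None:
        assignment = {}
    # Unit propagation: repeatedly assign variables forced by unit clauses.
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if len(clause) == 1:
                lit = clause[0]
                assignment[abs(lit)] = lit > 0
                clauses = simplify(clauses, lit)
                changed = True
                break
    if any(not c for c in clauses):
        return None                        # empty clause -> conflict
    if not clauses:
        return assignment                  # all clauses satisfied
    # Branch on the first literal of the first remaining clause.
    lit = clauses[0][0]
    for choice in (lit, -lit):
        result = dpll(simplify(clauses, choice),
                      {**assignment, abs(choice): choice > 0})
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(dpll([[1, 2], [-1, 3], [-2, -3]]))
```

This is missing the pure-literal rule and any clever branching heuristics, but it captures the core search-plus-propagation loop the comment alludes to.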
At this point, job interviews are far removed from what's actually relevant. Experience and aptitude still matter a lot, but too much experience at one employer can ground people in rigid, limiting ways of thinking and solving problems.
Penalizing interviewees for suggesting quick-and-dirty solutions also reinforces bad habits. "Premature optimization is the root of all evil," after all.
There is some debate about what premature optimization is, but I take it to mean micro-optimizations that are often doing things a modern compiler will do for you, better than you can. All too often such attempts result in unreadable code that is slower, because the optimizer would have done something different but now it can't. Premature optimization is optimization done without a profiler: if you have a profile of your code and can show a change really makes a difference, then it isn't premature.
On the other hand, job interviews imply time pressure. If someone isn't 100% sure how to implement the optimized algorithm without looking it up, brute force is faster and should be chosen. In the real world, if I'm asked to do something, I can sometimes spend days researching algorithms (though the vast majority of the time what I need is already in my language's standard library).
1. Any optimization in a typical web development file where the process is not expected to be particularly complex. Usually a good developer will not write something very inefficient and usually bottlenecks come from other areas
2. Doing stuff like replacing a forEach with a for loop to be 0.5% faster
Sure, if a good algorithm exists and is simple to implement, then go for it. But if it is non-trivial, then you have to make a judgement call whether it is worth the trouble to solve in a more optimal way. You acknowledge yourself that this can take days.
Personally I really have to be disciplined about choosing what to optimize vs what to code up quick-and-dirty. There's always a temptation to write clean, custom solutions because that's more interesting, but it's just not a good use of time for non-performance critical code.
They can also be dreadfully slow (and typically are) compared to just a simple dynamic program.
I guess you mean n^2 (or maybe n**2, if you're a fellow Pythonista). Many of these — especially the ones where the intended solution is characterized as using a "dynamic programming" technique — are reducible to n.
Do you know how few people in this world even know what a constraint solver is, let alone how to correctly define the problem into one?
I used a constraint solver to solve a homework problem once in my CS degree 3rd year. My god just writing the damn constraints was a huge cognitive load!
I do hope you're exaggerating here, but in case you aren't: this is an extremely simplistic view of what (software) engineers have to do, and thus what hiring managers should optimize for. I'd put "ability to work in a team" above "raw academic/reasoning ability" for the vast majority of engineering roles, any day.
Not that the latter doesn't matter, of course, but it's by no means the one and only measure.
In this hypothetical, why do you do leetcode hard interviews?
I thought I already answered that:
>> Not that the latter doesn't matter, of course, but it's by no means the one and only measure.
I don't. I do easy code interviews because there are people who work great on a team and know enough buzzwords to sound like they can write code, but cannot. I pick something that isn't hard to solve in about 20 minutes (I can solve it in 5, but only because I've seen a solution several times and don't have to think about it), yet is different enough that you haven't memorized the solution. If you can't solve an easy problem, then you can't code.
I should have said "if you deemed this a fail on the code interview, you are an idiot".
> If someone solves a leetcode hard with a constraint solver and you don't hire them, you are an idiot
Sometimes you just don't want someone who takes these shortcuts. I think being able to solve the problem without a constraint solver is much more impressive.
In any real engineering situation I can solve 100% of these problems. That's because I can get a cup of coffee, read some papers, look in a textbook, go for a walk somewhere green and think hard about it... and yes, use tooling like a constraint solver. Or an LLM, which knows all these algorithms off by heart!
In an interview, I could solve 0% of these problems, because my brain just doesn't work that way. Or at least, that's my expectation: I've never actually considered working somewhere that does leetcode interviews.
That said, I interview in silicon valley and I'm a mixed race American. (extremely rare here) I think a lot of people just don't want me to pass the interview and will put up the highest bar they can. Mind you, I often still give optimal solutions to everything within good time constraints. But I've practiced 1000+ problems and done several hundred interviews.
Source: I am a hiring manager.
Personally, my experience has been that pre-COVID, the majority of interviewers were assessing your problem-solving ability and whether you could code the algorithm you came up with. Getting the most optimal solution and fixing all edge cases for every problem in every interview was not strictly necessary. But these days, even if you have the best solution coded up for 3 problems and miss one edge case in the 4th, you are not "good enough". At one place, I was dinged for not thinking of an edge case before I wrote the program, even though I caught it while coding it up, and in spite of having the right solution for the other 3 problems across the 2 coding rounds. It is a tough market, and probably tougher for you. Good luck mate.
You could improve that and still avoid any division or modulus by simply keeping track of when the "next fizz" and "next buzz" should occur. (And output "fizzbuzz" when those numbers are the same and you reach them.)
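A minimal sketch of that idea (my own illustration of the suggestion, not the commenter's code): track the next multiple of 3 and of 5 explicitly, so no division or modulus is ever needed.

```python
def fizzbuzz(n):
    """FizzBuzz from 1 to n with no % or //: just counters for the next hits."""
    out = []
    next_fizz, next_buzz = 3, 5
    for i in range(1, n + 1):
        word = ""
        if i == next_fizz:
            word += "Fizz"
            next_fizz += 3
        if i == next_buzz:
            word += "Buzz"
            next_buzz += 5
        out.append(word or str(i))
    return out

print(fizzbuzz(15)[-1])  # -> "FizzBuzz"
```

When both counters land on the same number (15, 30, ...), the two appends naturally produce "FizzBuzz".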
Weird experience. Didn't get that job (probably for the best tbf).
Hrm. So what you're saying is you've never actually taken or given this style of interview. Nor presumably ever worked at a company that did this interview. So if on the off-chance these interviews actually were a somewhat successful tool for filtering candidates you wouldn't actually know it?
That feels like a miss.
I've also never worked at a place that uses beatings to improve employee morale. So, I can't guarantee that beatings aren't an effective technique for doing so.
My deeply unpopular opinion is that "leet code style" interviews are actually pretty decent at avoiding false positives. Obviously some specific questions are gotcha trivia and many interviewers are bad no matter the question. But they're a reasonably accurate proxy. Their issue is false negatives.
End of the day the ONLY question an interview sets out to answer is "will this candidate be successful in this role". Interviews are strictly a proxy for "the real job". So arguments that "it's not reflective of the real job" are utterly irrelevant. It is not possible for ANY interview to fully reflect the real job. And you can't ask someone to quit a steady job to trial for 3 to 6 months to see if they're a good fit or not. So we're stuck with proxies.
I definitely think it's important for people who are hired to write code to demonstrate, in some form, that they are capable of writing code. That seems reasonable. But we can't expect candidates to write a big project for every place they apply. That's too much. And almost all candidates can't share code from their prior job. And solo GitHub side projects are quite frankly not relevant for 99.99% of candidates. (And maybe more).
The one tried and true method of hiring is to hire people you've worked with before who were good. This is not scalable.
Hiring is hard. Really really hard. I find that the vast majority of leetcode complaints come from people who don't hire. If anyone ever cracks the puzzle of how to hire better they'll have a monumental competitive advantage. Many many have tried. So far none have succeeded.
It's been a while since I had to hire outside your "people you've worked with before" bucket, but when I did, here are the two questions which worked best:
1. Tell me about a time you analysed a complex problem and came up with a solution.
2. Tell me something creative you did at work. Something you're proud of.
In my experience, a good candidate typically had excellent answers to those questions, because (1) a good candidate has had to do some real engineering at some point and they can tell you about it, and (2) a good candidate has done technical work which they're really proud of just in and of itself, and they want to tell you about it.
A technical question was useful in reducing false positives, but something a couple of steps above FizzBuzz should be fine for that. You just don't find out anything useful about a candidate by asking them to recreate some esoteric CS algorithm on the spot, unless you really need to select for people who can recreate esoteric CS algorithms on the spot.
I'm generally against using leetcode in interviews, but wherever I've seen it used it's usually for one reason & one reason alone: known dysfunctional hiring processes. These are processes where the participants in the hiring process are aware of the dysfunction in their process but are either powerless or - more often - too disorganised to properly reform the process.
Sometimes this is semi-technical director level staff leveraging HR to "standardise" interview techniques by asking the same questions across a wide range of teams within a large corp. Other times this is a small underresourced team cobbling together interview questions from online resources in a hurry, not having the cycles to write a tailored process for themselves.
In these cases, you're very likely to be dealing with a technical interviewer who is not an advocate of leetcode interviewing & is attempting to "look around" the standardised interview scoring approach to identify innovative stand out candidates. In a lot of cases I'd hazard even displaying an interest in / some knowledge of solvers would count significantly in your favour.
> It's easy to do in O(n^2) time, or if you are clever, you can do it in O(n). Or you could be not clever at all and just write it as a constraint problem
This nails it. The point of these problems is to test your cleverness. That's it. Presenting a not-clever solution of using constraint solvers shows that you have experience and your breadth of knowledge is great. It doesn't show any cleverness.
In my notes I have roughly 30 patterns to leetcode questions btw.
No, it's just memorization of 12 or so specific patterns. The stakes are so high that virtually no one going in will stake passing on their own inherent problem-solving ability. LeetCode has been so thoroughly gamified that it has lost all differentiating utility beyond willingness to prepare.
Will you put up with very long hours of insane grindy nonsense in the spirit of being a team player for a team that doesn't really remember what game they're playing?
Are you sufficiently in need of income to be fighting through this interview dance in preference to other things, such that once you join you'll be desperate to stay?
Those are extremely important questions, and a willingness to have spent a thousand hours memorising leetcode correlates strongly with the attributes sought.
In no case is it a useful signal of whether I can do my job better than someone else. Some people like this type of problem and are good at it anyway, which is a good signal compared to average, but there are also above-average people who don't enjoy this type of problem and so don't practice it. Note that in both cases the people I'm talking about did not memorize the problem and solution.
Like in race? Like in wealth? Like in defection willingness? Like in corruption?
Asking for a friend who is regularly identified as among the most skilled but feels their career has been significantly derailed by this social phenomenon.
You are right, this definition does come from some person with some set of motivations, but that person is some mid/high-level manager who probably hasn't ever written a line of code in their life.
In this case the group is people good at leetcode - the people I know of in that group are perfectly fine with any race so long as they can solve leetcode. There are people who care about race, but I've never had much to do with them so I can't guess how they think.
If somebody grinds LeetCode while hating it, it signals they are really desperate for a job and willing to jump through hoops for you.
If somebody actually enjoys this kind of stuff, that is probably a signal that they are a rare premium nerd and you should hire them. But they probably play Project Euler as well (is that still up?).
If somebody figures out a one-trick to minmax their LeetCode score… I dunno, I guess it means they are aware of the game and want to solve it efficiently. That seems clever to me…
Interviewers learn nothing from an instant epiphany, and they learn next to nothing from someone being stumped.
Unfortunately, this is why we can't have nice things. Problem solving questions in interviews can be immensely useful tools that, sadly, are rarely usefully used.
100% and it's a shame that over time this has become completely lost knowledge, on both sides of the interview table, and "leetcode" is now seen as an arbitrary rote memorization hurdle/hazing ritual that software engineers have to pass to enter a lucrative FAANG career. Interviewees grind problems until they've memorized every question in the FAANG interview bank, and FAANG interviewers will watch a candidate spit out regurgitated code on a whiteboard in silence, shrug, and say "yep, they used the optimal dynamic programming solution, they pass."
I've probably implemented first-order Markov-chain text generation more than a dozen times in different languages, and earlier this week I implemented Newton–Cotes adaptive quadrature just because it sounded awesome (although I missed a standard trick because I didn't know about Richardson extrapolation). I've also recently implemented the Fast Hadamard Transform, roman numerals, Wellons–NRK hash tries, a few different variants of Quicksort (which I was super excited to get down to 17 ARM instructions for the integer case), an arena allocator with an inlined fast path, etc. Recently I wrote a dumb constrained-search optimizer to see if I could get a simpler expression of a word-wrap problem. I learned about the range-minimum-query algorithm during a job interview many years ago and ad-libbed a logarithmic-time solution, and since then I've found a lot of fascinating variants on the problem.
I've never had a job doing this kind of thing, and I don't expect to get one, just like I don't expect to get a job playing go, rendering fractals, reading science fiction, or playing video games. But I think there's a certain amount of transferable skill there. Even if what I need to do this week is figure out how to configure Apache to reverse proxy to the MediaWiki Docker container.
(I know there are people who have jobs hacking out clever algorithms on different platforms. I even know some of them personally. But there are also people who play video games for a living.)
I guess I'd fail your interview process?
But also, interviews are fuzzy and not at all objective, false negatives happen as well as false positives.
If you want people to know about these things you should put them in your resume though. People can't read your mind.
One way to think about this is:
Is a fresh graduate more likely to provide a solid answer to this than a strategic-thinking seasoned engineer? If so, just be conscious of what your question is actually probing.
And, yes, interview candidates are often shocked when I tell them that I’m fine with them using standard libraries or tools that fit the problem. It’s clear that the valley has turned interviewing into a dominance/superiority model, when it really should be a two-way street.
We have to remember that the candidate is interviewing us, too. I’ve had a couple of interviews as the interviewee where the way the interview was conducted was why I said “no” to an offer (no interest in a counter, just a flat “no longer interested” to the recruiter, and, yes, that surprises recruiters, too).
Absolutely agree. When I interview, I start with a simple problem and add complexity as they go. Can they write X? Can they combine it with Y? Do they understand how Z is related?
Interviewers always say this, but consider: would you endorse a candidate who ultimately is unable to solve the problem you've presented them, even if they think, communicate, and decompose problems well? No interview in this industry prizes those things over getting the answer right.
My first boss (a CTO at a start-up) drilled this into us. What you know is far less valuable than how you learn/think and how you function on a team.
Now I give you problems I expect to take 20 minutes if you have never seen them before so you should at least solve 1. I have more than once realized someone was stuck on the wrong track and redirection efforts were not getting them to a good track so I switched to a different problem which they were then able to solve. I've also stopped people when they have 6 of 10 tests passing because it is clear they could get the rest passing but I wouldn't learn anything more so it wasn't worth wasting their time.
In the real world I'm going to give people complex problems that will take days to solve.
Last round I did at Meta it was clearly to test that you grinded their specific set of problems, over and over again, until you could reproduce them without thinking. It's clear because the interviewers are always a bit surprised when you answer with whatever is not the text-book approach on both leetcode and on the interview guide they studied.
Cleverness is definitely not high on the list of things they're looking for.
In my experience, interviewers love going to the Leetcode "Top Interview 150" list and using problems in the "Array String" category. I'm not a fan of these problems for the kind of jobs I've interviewed for (backend Python mostly), as they are almost always a "give me a O(n) runtime O(1) memory algorithm over this array" type challenge that really doesn't resemble my day to day work at all. I do not regularly do in-place array algorithms in Python because those problems are almost always handled by other languages (C, Rust, etc.) where performance is critical.
I wish interviewers would go to the "Hashmap" section for interviews in Python, JavaScript, etc. Those problems are much less about cleverness and more about whether you can demonstrate using the appropriate tools in your language to solve problems that actually resemble ones I encounter regularly.
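The flavor of problem this comment favors can be illustrated with the classic two-sum (my own example, not one the commenter names): the whole test is whether you reach for a dict instead of a nested loop.

```python
def two_sum(nums, target):
    """Return indices (i, j) with nums[i] + nums[j] == target, or None.

    One pass with a dict: O(n) time instead of the O(n^2) nested loop.
    """
    seen = {}                          # value -> index where it was seen
    for i, x in enumerate(nums):
        if target - x in seen:
            return seen[target - x], i
        seen[x] = i
    return None

print(two_sum([2, 7, 11, 15], 9))  # -> (0, 1)
```

No in-place trickery, just knowing your language's hash map, which is exactly the skill a backend Python role actually exercises.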
There's also the problem of difficulty tuning on some of these. Problem 169 (Majority Element) being rated "Easy" for getting a O(n) runtime O(1) memory solution is hilarious to me. The algorithm first described in 1981 that does it (Boyer–Moore majority vote algorithm) has a Wikipedia page. It's not a difficult to implement or understand algorithm, but its correctness is not obvious until you think about it a bit, at which point you're at sufficient "cleverness" to get a Wikipedia page about an algorithm named after you. Seems excessive for an "Easy" problem.
return Counter(nums).most_common(1)[0][0]
And that's 50th percentile for runtime and memory usage. Doing it with another one liner that's 87% percentile for time because it uses builtin Python sorting but is 20th percentile for memory:
return sorted(nums)[len(nums) // 2]
But the interviewer might be looking for the best approach, which beats "100%" of other solutions in runtime per Leetcode's analysis:
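The snippet the commenter is referring to is presumably the Boyer–Moore majority vote mentioned above; a sketch of it (my own reconstruction, assuming a majority element is guaranteed to exist, as in the LeetCode problem statement):

```python
def majority_element(nums):
    """Boyer-Moore majority vote: O(n) time, O(1) extra space.

    Keep a candidate and a counter; matching elements increment it,
    others decrement it. A true majority element always survives.
    """
    candidate, count = None, 0
    for x in nums:
        if count == 0:
            candidate = x
        count += 1 if x == candidate else -1
    return candidate

print(majority_element([2, 2, 1, 1, 1, 2, 2]))  # -> 2
```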
If I were interviewing, I'd be happy with any of these except maybe the sorted() one, as it's only faster because of the native code doing the sort, which doesn't change that it's O(n log n) time and O(n) space. But I've had interviews where I gave answers that were "correct" given the assumptions and constraints I outlined, and they didn't like them because they weren't the one on their rubric. I still remember a Google interview, in which we were supposed to "design to scale to big data", where they wanted some fiddly array-manipulation algorithm like this. I gave one that was O(n log n) but could be done in place with O(1) memory, and the interviewer said it was "incorrect" in favor of a much simpler O(n)-time one using dicts in Python that took O(n) memory. Had the interviewer specified that O(n) memory was fine (not great for "big data", but OK) I would have given him the one-liner that did it with dicts, lol.
I guess my point is that interviewers should be flexible and view it as a dialogue rather than asking for the "right answer". Personally, I much prefer "identify the bug in this self-contained code snippet and fix it" type problems that can be completed in 15-30 minutes, but Leetcode ones can be fine if you choose the right problems for the job.
I would rather work with a flexible data type with suboptimal performance than a brittle data type that maybe squeezes out some extra performance.
Your example of in-place array mutation feels like a good example of such a thing. I feel like there should be a category of interviewing questions for "code-safety" not just performance.
You need to make sure a candidate can program, so asking programming questions makes sense. However, the candidate should not be judged on whether they finish or get an optimal or even correct answer. You need to know whether they write good code that you can understand, and whether they are on a path that, given a reasonable amount of time on a realistic story, would finish and be correct. If someone has seen the problem before they may get the correct answer, but if they have not, they won't know it and shouldn't be expected to get the right answer in an hour.
I will say, IME, it's pretty obvious when people have seen a problem before, and unless you work at a big company that has a small question pool, most people are not regurgitating answers to these questions but actually grappling with them in realtime. I say this as someone who has been on both ends of this, these problems are all solvable de novo in an hour by a reasonable set of people.
Leetcode ability isn't everything, but I have generally found a strong correlation between Leetcode and the coding aspects of on the job performance. It doesn't test everything, but nothing in my experience of hiring has led me to wanting to lower the bar here as much as raise the bar on all other factors that influence job performance.
All of the ones listed can be solved with a top-down dynamic programming algorithm, which just means "write the recursive solution, add caching to memoize it".
For some of these, you can get cleverer. For example the coin change problem is better solved with an A* search.
Still, very few programmers will actually need these algorithms. The top thing we need is to recognize when we accidentally wrote a quadratic algorithm. A quick scan of https://accidentallyquadratic.tumblr.com/ shows that even good people on prominent projects make that mistake on a constant basis. So apparently being able to produce an algorithm on the test, doesn't translate to catching an algorithmic mistake in the wild.
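A hedged illustration of that failure mode (my own example, not one from the linked blog): membership tests against a list inside a loop are linear, so the loop as a whole is quadratic, and the fix is a one-word data-structure change.

```python
def dedupe_slow(items):
    """Accidentally quadratic: `x not in seen` scans a list every iteration."""
    seen, out = [], []
    for x in items:
        if x not in seen:          # O(n) scan -> O(n^2) overall
            seen.append(x)
            out.append(x)
    return out

def dedupe_fast(items):
    """Same logic, O(n): a set makes membership an O(1) average hash lookup."""
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

Both produce identical results; only the second survives a large input, which is exactly the kind of difference that slips through review because the code "looks" the same.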
A top-down solution in this case is in fact strictly worse. Why? Because while both take similar numbers of operations, in the bottom up solution you can throw away a row once you've processed the one above it. But in a top-down solution, you never know when you might call a particular recursive call again.
This is very common. In a bottom up approach, we often know when we're done with data and can throw it away. This memory savings is the reason why people try to learn a bottom up approach. But it does come with the price of trying to figure out various kinds of "tricks". (That in complicated cases, may not be findable.)
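The trade-off can be sketched with subset-sum (my own example; the commenter doesn't name a specific problem): the top-down memo retains every (index, remaining) state it ever visits, while the bottom-up version only needs one row, updated in place.

```python
from functools import lru_cache

def subset_sum_topdown(nums, target):
    """Memoized recursion: cache grows with every distinct state visited."""
    nums = tuple(nums)

    @lru_cache(maxsize=None)
    def can(i, t):
        if t == 0:
            return True
        if i == len(nums) or t < 0:
            return False
        return can(i + 1, t - nums[i]) or can(i + 1, t)

    return can(0, target)

def subset_sum_bottomup(nums, target):
    """Single reusable row: O(target) memory regardless of len(nums)."""
    reachable = [False] * (target + 1)
    reachable[0] = True
    for x in nums:
        # Iterate in reverse so each item is used at most once per row pass.
        for t in range(target, x - 1, -1):
            reachable[t] = reachable[t] or reachable[t - x]
    return reachable[target]
```

The bottom-up version's reverse inner loop is precisely the kind of "trick" the comment mentions: it exists only to let one row safely stand in for two.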
Many formulations scale in a way that is completely unusable in practice.
Knowing how to get tools like Z3 or Gurobi to solve your problems is its own skill, and one that some companies will hire for, but it's not a general-purpose technology you can throw at everything.
This post is the unironic version of "FizzBuzz in TensorFlow": just because you have a big hammer doesn't mean everything is a nail. And I say that as an enjoyer of big hammers, including SMT solvers.
If my wife's blood sugar is high, she takes insulin. If you need to solve a constraint problem, use a constraint solver.
If your company doesn't make and sell constraint solving software, why do you need me to presume that software doesn't exist and invent it from scratch?
At least that's been my experience. I'm sure there are exceptions.
> contractor
Do FAANG companies hire contractors in India?
Really? This kind of interview needs to go away.
However, coding interviews are useful. It's just that "knowing the trick" shouldn't be the point. The point is whether the candidate knows how to code (without AI), can explain themselves and walk through the problem, explain their thought processes, etc. If they do a good enough reasoning job but fail to solve the problem (they run out of time, or they go on an interesting tangent that ultimately proves fruitless) it's still a "passed the test" situation for me.
Failure would mean: "cannot code anything at all, not even a suboptimal solution. Cannot reason about the problem at all. Cannot describe a single pitfall. When told about a pitfall, doesn't understand it nor its implications. Cannot communicate their thoughts."
An interview shouldn't be a university exam.
Even getting an efficient algorithm basically right is no guarantee.
In some cases there might be alternative solutions with different tradeoffs, and you might have to come up with those as well.
Miss a counterexample? Even if you get it after a single hint? Fuck you, you're out. I can find someone who doesn't need the hint.
All I can say is that I do conduct interviews, and that I follow the above philosophy (at least for my round).
> We can solve this with a constraint solver
Ok, using your favorite constraint solver, please write a solution for this.
> [half an hour later]
Ok, now how would you solve it if there were more than 100 data points? E.g. 10^12?
Greedy algorithms tell you nearly nothing about the candidate's ability to code. What are you going to see? A single loop, some comparison, and an assignment. Nearly every problem that can be solved with a greedy algorithm is largely a math problem disguised as programming. The entire question hinges on the candidate finding the right comparison to make.
The author himself finds that these are largely math problems:
> Lots of similar interview questions are this kind of mathematical optimization problem
So we're not optimizing to find good coders, we're optimizing to find mathematicians who have 5 minutes of coding experience.
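The "single loop, some comparison" shape the comment describes, in its most classic form (my own example: interval scheduling, which the commenter doesn't name): sort by end time, then greedily keep any interval that starts at or after the last one kept.

```python
def max_non_overlapping(intervals):
    """Greedy interval scheduling: the whole algorithm is one comparison.

    Sorting by end time and taking every compatible interval is provably
    optimal, but *why* it works is the math; the code itself is trivial.
    """
    count, last_end = 0, float("-inf")
    for start, end in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_end:
            count += 1
            last_end = end
    return count

print(max_non_overlapping([(1, 3), (2, 4), (3, 5)]))  # -> 2
```

All the difficulty lives in the exchange argument that justifies sorting by end time; the loop tells you nothing about whether someone can build software.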
At the risk of self-promotion, I'm fairly opinionated on this subject. I have a podcast episode where I discuss exactly this problem (including discuss greedy algorithms), and make some suggestions where we could go as an industry to avoid these kind of bad-signal interviews:
https://socialengineering.fm/episodes/the-problem-with-techn...
-what tech you worked with and some questions about decisions
-debugging an issue they encountered before
-talking about interests and cultural fit
Instant green flag for me. Too bad that after receiving my offer covid happened and they had a hiring freeze.
This doesn't mean they can't provide a constraint solver solution, but if they do, they'd better be prepared to address the obvious follow-ups. If they're prepared to give an efficient solution afterward in the time left, then more power to them.
373 more comments available on Hacker News