Python Developers Are Embracing Type Hints
Posted 4 months ago · Active 3 months ago
pyrefly.org · Tech story · High profile
Controversial · Mixed
Debate: 80/100
Key topics
Python
Type Hints
Static Typing
The article discusses the growing adoption of type hints in Python, with commenters sharing their experiences and opinions on the benefits and drawbacks of using type hints in Python development.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 3d after posting
Peak period: 97 comments (Days 3-4)
Avg / period: 40 comments
Comment distribution: 160 data points
Based on 160 loaded comments
Key moments
1. Story posted: Sep 24, 2025 at 7:23 AM EDT (4 months ago)
2. First comment: Sep 27, 2025 at 6:32 PM EDT (3d after posting)
3. Peak activity: 97 comments in Days 3-4, the hottest window of the conversation
4. Latest activity: Oct 11, 2025 at 2:13 PM EDT (3 months ago)
ID: 45358841 · Type: story · Last synced: 11/20/2025, 8:09:59 PM
Those can be out of date and are often missing. Might as well use type hints that can be statically checked.
In C++, a variable might be defined in a header or in a parent class somewhere else, and there's no indication of where it came from.
Doing otherwise is just asking for prod incidents.
I'd much rather just work in a statically typed language from the start.
To be clear, I'm not opposed to type hints. I use them everywhere, especially in function signatures. But the primary advantage to Python is speed (or at least perceived speed but that's a separate conversation). It is so popular specifically because you don't have to worry about type checking and can just move. Which is one of the many reasons it's great for prototypes and fucking terrible in production. You turn on strict type checking in a linter and all that goes away.
Worse, Python was not built with this workflow in mind. So with strict typing on, when types start to get complicated, you have to jump through all kinds of weird hoops to make the checker happy. When I'm writing code just to make a linter shut up something is seriously wrong.
Trying to add typing to a dynamic language is, in my opinion, almost always a bad idea. Either do what TypeScript did and write a language that compiles down to the dynamic one, or just leave it dynamic.
And if you want types just use a typed language. In a production setting, working with multiple developers, I would take literally almost any statically typed language over Python.
In my view it's always a mistake to try to tack static typing on top of a dynamic language. I think TS's approach is better than Python's, but still not nearly as good as just using a statically typed language.
The problem we are talking about in both Python and TS comes from the fact that they are (or compile down to) dynamic languages. These aren't issues in statically typed languages... because the code just won't compile if it's wrong, and you don't have to worry about getting data from an untyped library.
I don't know a lot about Zod, but I believe the problem you are referring to is more about JavaScript than TS. JavaScript does a LOT of funky stuff at runtime; Python, thank God, actually enforces some sane type rules at runtime.
My point was not about how these two function at runtime. My point was that if you want to tack static typing onto a dynamic language, TypeScript's approach is the better one, even if it can't fix the underlying issues with JS.
You could take a similar approach in Python. We could make a language called Tython that is statically typed and compiles down to Python. You eliminate an entire class of bugs at compile time, get a far more reliable experience than the current weirdness with gradual typing and linters, and you still get Python's runtime type information to deal with things like interop with existing Python code.
You would never have typing.TYPE_CHECKING to check if type checking is being done in TypeScript, for example, because type hints can't break Javascript code, something that can happen in Python when you have cyclic imports just to add types.
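The `typing.TYPE_CHECKING` workaround mentioned above looks roughly like this; a minimal sketch with hypothetical module and class names. The guarded import exists only for the type checker, which is how annotation-driven import cycles get papered over in Python:

```python
# models.py -- sketch of the TYPE_CHECKING pattern (module names are
# hypothetical). The guarded import runs only under the type checker,
# so a cycle between models.py and services.py never executes at runtime.
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from services import UserService  # cycle exists only for the checker

class User:
    def greet(self, service: UserService) -> str:
        # with `from __future__ import annotations` this annotation is a
        # plain string at runtime, so the real import is never needed
        return "hello"
```

The equivalent contortion simply does not arise in TypeScript, where types are erased at compile time.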
These systems are part of the core banking platform for a bank so I’d rather some initial developer friction over runtime incidents.
And I say initial friction because although developers are sometimes resistant to it initially, I’ve yet to meet one who doesn’t come to appreciate the benefits over the course of working on our system.
Different projects have different requirements, so YMMV but for the ones I’m working on type hints are an essential part of ensuring system reliability.
But it's a fair point. If you truly have no option, it's better than absolutely nothing. I really wish people would stop writing mission critical production code in Python.
For example I work on a python codebase shared by 300+ engineers for a popular unicorn. Typing is an extremely important part of enforcing our contracts between teams within the same repository. For better or for worse, python will likely remain the primary language of the company stack.
Should the founder have chosen a better language during their pre-revenue days? Maybe, but at the same time I think the founder chose wisely -- they just needed something that was _quick_ (Django) and capable of slapping features / ecosystem packages on top of to get the job done.
For every successful company built on a shaky dynamic language, there are probably 10x more companies that failed on top of a perfect and scalable stack using static languages.
However for new projects I find that I'd much rather pick technologies that start me off with a sanity floor which is higher than Python's sanity ceiling. At this point I don't want to touch a dynamically typed language ever again.
Whatever the solution is, it doesn’t include giving up on Python typings.
Wouldn't that just be `object` in Python?
There was a proposal[3] for an unknown type in the Python typing repository, but it was rejected on the grounds that `object` is close enough.
[1]: https://mypy.readthedocs.io/en/stable/error_code_list.html#c...
[2]: https://mypy.readthedocs.io/en/stable/error_code_list.html#c...
[3]: https://github.com/python/typing/issues/1835
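The practical difference between `object` and `Any` can be sketched in a few lines (function names here are ours, for illustration). `object` is Python's top type: it accepts everything but permits nothing until you narrow it, which is roughly what an `unknown` type would do; `Any` accepts everything and also disables checking entirely:

```python
from typing import Any

def handle_object(value: object) -> int:
    # `object` accepts any value, but a checker rejects method calls
    # until you narrow the type with isinstance().
    if isinstance(value, int):
        return value              # narrowed to int: OK
    return len(str(value))        # otherwise convert explicitly

def handle_any(value: Any) -> int:
    # `Any` switches the checker off: this passes type checking even
    # though it crashes at runtime for most non-int inputs.
    return value.bit_length()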
It's "close enough" to a usable type system that it's worth using, but it's full of edge cases and situations where they decided it would be easier to force programmers to make reality match the type system, rather than make the type system match reality.
No wonder a lot of people in the comments here say they don't use it...
You can live with the "close enough" if you're writing a brand new greenfield project and you prevent anyone from ever checking in code mypy doesn't like and also don't use any libraries that mypy doesn't like (and also don't make web requests to APIs that return dictionary data that mypy doesn't like)
Retrofitting an existing project however is like eating glass.
I believe Python's own documentation also recommends the shorthand syntax over `Union`. Linters like Pylint and Ruff also warn if you use the imported `Union`/`Optional` types. The latter even auto-fixes it for you by switching to the shorthand syntax.
[^1]: https://en.wikipedia.org/wiki/Option_type
I'm aware this is just a typo but since a lot of the Python I write is in connection with Airflow I'm now in search of a way to embrace duct typing.
If I have a function that takes an int, and I write down the requirement, why should a JIT have to learn independently of what I wrote down that the input is an int?
I get that it's this way because of how these languages evolved, but it doesn't have to stay this way.
The type hints proved to be useful on their own so the project moved past what was useful for that purpose, but a new JIT (such as the one the upcoming CPython 3.14 lays the groundwork for) could certainly use them.
The extra typing clarification in python makes the code harder to read. I liked python because it was easy to do something quickly and without that cognitive overhead. Type hints, and they feel like they're just hints, don't yield enough of a benefit for me to really embrace them yet.
Perhaps that's just because I don't use advanced features of IDEs. But then I am getting old :P
EDIT: also, this massively depends on what you're doing with the language! I don't have huge customer workloads to consider any longer..!
They don't. And cannot, for compatibility reasons. Aside from setting some dunders on certain objects (which are entirely irrelevant unless you're doing some crazy metaprogramming thing), type annotations have no effect on the code at runtime. The Python runtime will happily bytecode-compile and execute code with incorrect type annotations, and a type-checking tool really can't do anything to prevent that.
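A minimal demonstration of that point (the function is a deliberately mis-annotated toy):

```python
# Deliberately wrong annotations: the interpreter stores them on the
# function object but never checks them, so the int argument below
# sails straight through.
def double(x: str) -> str:
    return x * 2

print(double(21))              # runs fine despite violating the hint
print(double.__annotations__)  # hints are stored metadata, not enforced
```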
My understanding is that currently python can collect type data in test runs and use it to inform the jit during following executions
I'd forgotten about that. Now that you mention it, my understanding is that this is actually the plan.
It’s funny, because for me is quite the opposite: I find myself reading Python more easily when there are type annotations.
One caveat might be: for that to happen, I need to know that type checking is also in place, or else my brain dismissed annotations in that they could just be noise.
I guess this is why in Julia or Rust or C you have this stronger feeling that types are looking after you.
It depends what you mean by "read". If you literally mean you're doing a weird Python poetry night then sure they're sort of "extra stuff" that gets in the way of your reading of `fib`.
But most people think of "reading code" as reading and understanding code, and in that case they definitely make it easier.
And Python always was rather strongly typed, so you anyway had to consider the types. Now you get notes. Which often do help.
In my experience I have seen far too much Python code like
`def func(data, *args, **kwargs)`
with no documentation and I have no clue wtf it's doing. Now I am basically all in on type hints (except cases where it's impossible like pandas).
I use vanilla vim (no plugins) for my editor, and still consider type hints essential.
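The contrast the comment describes, sketched with hypothetical names; the second signature documents itself without any IDE support:

```python
from dataclasses import dataclass

# The opaque style the comment complains about:
def process(data, *args, **kwargs):
    ...

# A hinted equivalent: argument shapes and the return type are readable
# straight from the signature, even in vanilla vim.
@dataclass
class Record:
    user_id: int
    score: float

def process_records(
    records: list[Record], *, normalize: bool = False
) -> dict[int, float]:
    result = {r.user_id: r.score for r in records}
    if normalize and result:
        top = max(result.values())
        result = {k: v / top for k, v in result.items()}
    return result
```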
Dudes it's literally just worse compilation with extra steps.
There's a world of difference between:
> I've been using a different type checker and I like it, you should try it
And
> I'd like to switch our project to a different compiler
The former makes for a more nimble ecosystem.
Turns out they just didn't know any better?
Python has a great experience for a bunch of tasks and with typing you get the developer experience and reliability as well.
Python's 3 traditional weak spots, where almost all statically typed languages do better: performance, parallelism and deployment.
It is a great choice though for many problems where performance isn't critical (or you can hand the hard work off to a non-Python library like Numpy or Torch). Typing just makes it even better.
For any even medium sized project or anything where you work with other developers a statically typed language is always going to be better. We slapped a bunch of crap on Python to make it tolerable, but nothing more.
Yeah it's workable, and better than nothing. But it's not better than having an actual static type system.
1. It's optional. Even if you get your team on board you are inevitably going to have to work with libraries that don't use type hints
2. It's inconsistent, which makes sense given that it's tacked onto a language never intended for it.
3. I have seen some truly goofy shit written to make the linter happy in more complex situations.
I honestly think everything that's been done to try to make Python more sane outside scripting or small projects (and the same applies to JS and TS) is a net negative. Yes, it has made those specific ecosystems better and more useful, but it's removed the incentive to move to better technology/languages actually made to do the job.
The comment about Typescript was really about JavaScript. It's a patch on top of JavaScript, which is a shit show and should have been replaced before it ended up forming the backbone of the internet.
Python, typed or otherwise, isn't good for anything past prototyping, piping a bunch of machine learning libraries together, or maybe a small project. The minute the project gets large or starts to move towards actual production software Python should be dropped.
But it's really valuable documentation! Knowing what types are expected and returned just by looking at a function signature is super useful.
https://old.reddit.com/r/Python/comments/10zdidm/why_type_hi...
Edit: Yes, one can sometimes go with Any, depending on the linter setup, but that's missing the point, isn't it?
That entire Reddit post is a clueless expert beginner rant about something they don't really understand, unfortunate that it's survived as long as it has or that anyone is taking it as any sort of authoritative argument just because it's long.
That's not the issue the reddit post is raising. The reddit post is pointing out that what a "type" is is not as simple as it looks. Particularly in a language like Python where user-defined types proliferate, and can add dunder methods that affect statements that involve built-in operations. "Just use Any" doesn't solve any of those problems.
> just use Any.
All the above said: not putting a type in at all is even easier than using Any, and is semantically equivalent.
But the entire post is built upon the premise that accepting all types is good API design. Which it isn't, at all.
Was Tim Peters also wrong way back in the day when he counseled Guido van Rossum to allow floats to be added to integers without a cast, like other popular languages?
My suggestion -- don't rely on magic methods.
Regardless, none of that bears on the original `slow_add` example from the Reddit page. The entire point is that we have an intuition about what can be "added", but can't express it in the type system in any meaningful way. Because the rule is something like "anything that says it can be added according to the protocol — which in practical terms is probably any two roughly-numeric types except for the exceptions, and also most container types but only with other instances of the same type, and also some third-party things that represent more advanced mathematical constructs where it makes sense".
And saying "don't rely on magic methods" does precisely nothing about the fact that people want the + symbol in their code to work this way. It does suggest that `slow_add` is a bad thing to have in an API (although that was already fairly obvious). But in general you do get these issues cropping up.
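What the best available approximation looks like, a sketch with our own protocol name (not a stdlib type). It runs, and it type-checks the easy cases, but homogeneous `T + T` is all it expresses; the int/float promotion, the container-only-with-same-container rule, and `__radd__` fallbacks described above are exactly what it cannot capture:

```python
from typing import Protocol, TypeVar

class SupportsAdd(Protocol):
    # Minimal "Addable" sketch; the name is ours, not stdlib.
    # Covers only values whose class defines __add__ -- none of the
    # mixed-type arithmetic rules discussed above.
    def __add__(self, other): ...

T = TypeVar("T", bound=SupportsAdd)

def slow_add(a: T, b: T) -> T:
    return a + b
```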
Dynamic typing has its place, and many people really like it, myself included. Type inference (as in the Haskell family) solves the noise problem (for those who consider it a problem rather than something useful) and is elegant in itself, but just not the strictly superior thing that its advocates make it out to be. People still use Lisp family languages, and for good reason.
But maybe Steve Yegge would make the point better.
So no e.g. numpy or torch then?
When I've used Python's type checkers, I have more the feeling that the goal is to create a new, typed subset of the language, that is less capable but also easier to apply types to. Then anything that falls outside that subset gets `Any` applied to it and that's good enough. The problem I find with that is that `Any` is incredibly infective - as soon as it shows up somewhere in a program, it's very difficult to prevent it from leaking all over the place, meaning you're often back in the same place you were before you added types, but now with the added nuisance of a bunch of types as documentation that you can't trust.
No, it doesn't. The desired type is known; it's "Addable" (i.e., "doesn't throw an exception when the built-in add operator is used"). The problem is expressing that in Python's type notation in a way that catches all edge cases.
> If you want to allow users to pass in any objects, try to add and fail at runtime
Which is not what the post author wants to do. They want to find a way to use Python's type notation to catch those errors with the type checker, so they don't happen at runtime.
> the entire post is built upon the premise that accepting all types is good API design
It is based on no such thing. I don't know where you're getting that from.
The mistake both you and the reddit post's author make is treating the `+` operator the same as you would an interface method. Despite Python having __add__/__radd__ methods, this isn't true, nor is it true in many other programming languages. For example, Go doesn't have a way to express "can use the + operator" at all, and "can use comparison operators" is defined as an explicit union between built-in types.[0] In C# you could only do this as of .NET 7, which was released in Nov 2022[1] -- was the C# type system unusable for the 17 years prior, when it didn't support this scenario?
If this were any operation on `a` and `b` other than a built-in operator, such as `a.foo(b)`, it would be trivial to define a Protocol (which the author does in Step 4) and have everything work as expected. It's only because of misunderstanding of basic Python that the author continues to struggle for another 1000 words before concluding that type checking is bad. It's an extremely cherry-picked and unrealistic scenario either from someone who is clueless, or knows what they're doing and is intentionally being malicious in order to engagement bait.[2]
This isn't to say Python (or Go, or C#) has the best type system, and it certainly lacks compared to Rust which is a very valid complaint, but "I can't express 'type which supports the '+' operator'" is an insanely esoteric and unusual case, unsupported in many languages, that it's disingenuous to use it as an excuse for why people shouldn't bother with type hinting at all.
[0] https://pkg.go.dev/cmp#Ordered
[1] https://learn.microsoft.com/en-us/dotnet/standard/generics/m...
[2] actually reading through the reddit comments, the author specifically says they were engagement baiting so... I guess they had enough Python knowledge to trick people into thinking type hinting was bad, fair enough!
In other words, you agree that the Python type hint system does not give you a good, built-in way to express the "Addable" type.
Which means you are contradicting your claims that the type the article wants to express is "unknown" and that the article is advocating using "Any" for this case. The type is not unknown--it's exactly what I said: "doesn't throw an exception when using the + operator". That type is just not expressible in Python's type hint system in the way that would be needed. And "Any" doesn't address this problem, because the article is not saying that every pair of objects should be addable.
> "I can't express 'type which supports the '+' operator'" is an insanely esoteric and unusual case
I don't see why. Addition is a very commonly used operation, and being able to have a type system that can express "this function takes two arguments that can be added using the addition operator" seems like something any type system that delivers the goods it claims to deliver ought to have.
> unsupported in many languages
Yes, which means many languages have type systems that claim to deliver things they can't actually deliver. They can mostly deliver them, but "mostly" isn't what advocates of using type systems in all programs claim. So I think the article is making a useful point about the limitations of type systems.
> it's disingenuous to use it as an excuse for why people shouldn't bother with type hinting at all.
The article never says that either. You are attacking straw men.
If your comparison is Rust, sure, but you can't even express this in Java. No, Java's type system is not great, but it's a type system that's been used for approximately 500 trillion lines of production code powering critical systems and nobody has ever said "Java sucks because I can't express 'supports the + operator' as a generic type". (It sucks for many other reasons.)
Again, it is factually and objectively an esoteric and unusual case. Nobody in the real world is writing generics like this, only academics or people writing programming blogs about esoterica.
If your argument is that all type systems are bad or deficient, fine, but calling out Python for this when it has the exact same deficiency as basically every other mainstream language is asinine.
> The article never says that either. You are attacking straw men.
The article says "Turning even the simplest function that relied on Duck Typing into a Type Hinted function that is useful can be painfully difficult." The subterfuge is that this is not even remotely close to a simple function because the type being expressed, "supports the + operator", is not even remotely close to a simple type.
Sorry, but your unsupported opinion is not "factual and objective".
> If your argument is that all type systems are bad or deficient
I said no such thing, any more than the article did. Again you are attacking a straw man. (If you had said "limited in what they can express", I might buy that. But you didn't.)
I think I've said all I have to say in this subthread.
What the reddit post is demonstrating is that the Python type system is still too naive in many respects (and that there are implementation divergences in behavior). In other languages, this is a solved problem - and very ergonomic and safe.
Python’s type hints are in the second category.
But now that I'm coming back to it, I think that this might be a larger category than I first envisioned, including projects whose build/release processes very reliably include the generation+validation+publication of updated docs. That doesn't imply a specific language or release automation, just a strong track record of doc-accuracy linked to releases.
In other words, if a user can validate/regenerate the docs for a project, that gets it 9/10 points. The remaining point is the squishier "the first party docs are always available and well-validated for accuracy" stuff.
Nobody does that, though. Instead they all auto-publish their OpenAPI schemas through rickety-ass, fail-soft build systems to flaky, unmonitored CDNs. Then they get mad at users who tell them when their API docs don't match their running APIs.
Many old school python developers don't realize how important typing actually is. It's not just documentation. It can actually roughly reduce dev time by 50% and increase safety by roughly 2x.
There are developers who design apis by trying to figure out readable invocations. These developers discover, rather than design, type hierarchies and library interfaces.
> Many old school python developers don't realize how important typing actually is.
I don't think this is true. There's simply a communication breakdown where type-first developers don't see the benefits of disabling static checking to design interfaces, and interface-first developers don't see why they should put static checking ahead of interface iteration speed.
No, they don't. There is nothing about types that makes incremental development harder. They keep having the same benefits when you work incrementally.
Oh, please, this is either lack of imagination or lack of effort to think. You've never wanted to test a subset of a library halfway through a refactor?
I don't think it's a lack of curiosity from others. It's more like a fundamental lack of knowledge from you. Let's hear it. What is it you're actually talking about? Testing a subset of a library halfway through a refactor? How does a lack of types help with that?
My hunch is that the people who see no downsides whatsoever in static typing are those who mostly just consume APIs.
I'm not a consumer of APIs. I've done game programming, robotics, embedded systems development (with C++ and Rust), web frontend development (with and without React, with jQuery, Angular, TypeScript, plain JS, Zod), and web backend development (with Golang, Haskell, Node.js TypeScript, and lots and lots of Python with many of the most popular frameworks: Flask + SQLAlchemy, Django, FastAPI + Pydantic).
I've done a lot. I can tell you: if you don't see how typed languages outweigh untyped ones, you're a programmer with experience heavily weighted toward untyped programming. You don't have balanced experience to make a good judgement. Usually these people have a "data scientist" background: data analyst, data scientist, machine learning engineer, etc. These folks start programming heavily in the Python world WITHOUT types and develop unbalanced opinions shaped by their initial style of programming. If this describes you, then stop and think... I'm probably right.
No, you're one of the old school python developers. Types don't hinder creativity, they augment it. The downside is the slight annoyance of updating a type definition and the run time definition vs. just updating the runtime definition.
Let me give you an example of how it hinders creativity.
Let's say you have a interface that is immensely complex. Many nested structures thousands of keys, and let's say you want to change the design by shifting 3 or 4 things around. Let's also say this interface is utilized by hundreds of other methods and functions.
When you move 3 or 4 things around in a complex interface you're going to break a subset of those hundreds of other methods or functions. You're not going to know where they break if you don't have type checking enabled. You're only going to know if you tediously check every single method/function OR if it crashes during runtime.
With a statically typed definition you can do that change and the type checker will identify EVERY single place where an additional change to the methods that use that type needs to be changed as well. This allows you to be creative and make any willy nilly changes you want because you are confident that ANY change will be caught by the type checker. This Speeds up creativity, while without it, you will be slowed down, and even afraid to make the breaking change.
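A toy version of that refactoring scenario (all names hypothetical): rename or move a field here, say `heading` to `bearing`, and every stale access in functions like `turn_left` becomes a checker error at edit time, instead of an AttributeError in production:

```python
import math
from dataclasses import dataclass

# Hypothetical shared interface used by many call sites.
@dataclass
class Pose:
    x: float
    y: float
    heading: float  # radians

def turn_left(p: Pose) -> Pose:
    # If `heading` were renamed, mypy would flag this line immediately.
    return Pose(p.x, p.y, p.heading + math.pi / 2)
```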
You are basically the stereotype I described. An old school python developer. Likely one who got used to programming without types and now hasn't utilized types extensively enough to see the benefit.
>I don't think this is true. There's simply a communication breakdown where type-first developers don't see the benefits of disabling static checking to design interfaces, and interface-first developers don't see why they should put static checking ahead of interface iteration speed.
This is true. You're it. You just don't know it. When I say these developers don't know I'm literally saying they think like you and believe the same things you believe BECAUSE they lack knowledge and have bad habits.
The habit thing is what causes the warped knowledge. You're actually slowed down by types because you're not used to them: you spent years coding in Python without types, so it's ingrained for you to test and think without them. Adding types becomes a bit of an initial overhead for these people because their programming style is so entrenched.
Once you get used to it, and once you see that it's really just a slight additional effort, then you will get it. But it takes a bit of discipline and practice to get there.
I'd been programming for 20+ years and I genuinely couldn't think of any situations where I'd had a non-trivial bug that I could have avoided if I'd had a type checker - claims like "reduce dev time by 50%" didn't feel credible to me, so I stuck with my previous development habits.
Those habits involved a lot of work performed interactively first - using the Python terminal, Jupyter notebooks, the Firefox/Chrome developer tools console. Maybe that's why I never felt like types were saving me any time (and in fact were slowing me down).
Then I had my "they're just interactive documentation" realization and finally they started to click for me.
But if you aren't familiar with a project then dynamic typing makes it an order of magnitude harder to navigate and understand.
I tried to contribute some features to a couple of big projects - VSCode and Gitlab. VSCode, very easy. I could follow the flow trivially, just click stuff to go to it etc. Where abstract interfaces are used it's a little more annoying but overall wasn't hard and I have contributed a few features & fixes.
Gitlab, absolutely no chance. It's full of magically generated identifiers so even grepping doesn't work. If you find a method like `foo_bar` it's literally impossible to find where it is called without being familiar with the entire codebase (or asking someone who is) and therefore knowing that there's a text file somewhere called `foo.csv` that lists `bar` and the method name is generated from that (or whatever).
In VSCode it was literally right-click->find all references.
I have yet to succeed in modifying Gitlab at all.
I did contribute some features to gitlab-runner, but again that is written in Go so it is possible.
So in some cases those claims are not an exaggeration - static types take you from "I give up" to "not too hard".
Flip side of this is that I hate trying to read code written by teams relying heavily on such features, since typically zero time was spent on neatly organizing the code and naming things to make it actually readable (from top to bottom) or grep-able. Things are randomly spread out in tiny files over countless directories, and it's a maze you stumble around, just clicking identifiers to jump somewhere. Where something is rarely matters, as the IDE will find it. I never develop any kind of mental image of that style of code, and it completely rules out casually browsing the code using simpler tools.
Kind of like how you don't learn an area when you always use satnav as quickly as you do when you manually navigate with paper maps. But do you want to go back to paper maps? I don't.
Type annotations don’t double productivity. What does “increase safety by 2×” even mean? What metric are you tracking there?
In my experience, the main non-documentation benefit of type annotations is warning where the code is assuming a value where None might be present. Mixing up any other kind of types is an extremely rare scenario, but NoneType gets everywhere if you let it.
My own anecdotal metric. Isn't that obvious? The initial post was an anecdotal opinion as well. I don't see a problem here.
>In my experience, the main non-documentation benefit of type annotations is warning where the code is assuming a value where None might be present. Mixing up any other kind of types is an extremely rare scenario, but NoneType gets everywhere if you let it.
It's not just None. Imagine some highly complex object with nested values and you have some function like this:
wtf is direction object? Is it in Cartesian or is it in polar? Is in 2D or 3D? Most old school python devs literally have to find where modify_direction is called and they find this: Ok then you have to find where modify data is called, and so on and so forth until you get to here: And then boom you figure out what it does by actually reading all the complex quaternion math create_quat does.Absolutely insane. If I have a type, I can just look at the type to figure everything out... you can see how much faster it is.
Oh and get this. Let's say there's someone who feels euler angles are better. So he changes create_quat to create_euler. He modifies all the places create_quat is used (which is about 40 places) and he misses 3 or 4 places where it's called.
He then ships it to production. Boom. The extra time debugging production when it crashes, and the extra time tediously finding where create_quat was used, all of that could have been saved by a type checker.
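A hedged reconstruction of the scenario above, with all names hypothetical: once the signature names the type, the reader learns the representation immediately instead of chasing create_quat through the call chain:

```python
from dataclasses import dataclass

# The signature alone now answers the earlier questions: this is a
# quaternion, not Euler angles or a 2D polar pair.
@dataclass
class Quaternion:
    w: float
    x: float
    y: float
    z: float

def modify_direction(direction: Quaternion) -> Quaternion:
    # Conjugate: reverses the rotation the quaternion encodes.
    return Quaternion(direction.w, -direction.x, -direction.y, -direction.z)
```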
I'm a big python guy. But I'm also big into haskell. So I know both the typed world and the untyped world really well. Most people who complain like you have mostly come from a python background where typing isn't used much. Maybe you used types occasionally but not in a big way.
If you used both untyped languages and typed languages extensively you will know that types are intrinsically better. It's not even a contest. Anyone who still debates this stuff just lacks experience.
WTF is “an anecdotal metric”‽ That just sounds like an evasive way to say “I want to make up numbers I can’t justify”.
> wtf is direction object? Is it in Cartesian or is it in polar? Is in 2D or 3D?
This seems very domain-specific.
> Most people who complain like you literally have mostly come from a python background where typing isn't used much. Maybe you used types occasionally but not in a big way.
> If you used both untyped languages and typed languages extensively you will know that types are intrinsically better. It's not even a contest. Anyone who still debates this stuff just lacks experience.
I’ve got many years of experience with static typed languages over a 25 year career. Just because somebody disagrees with you, it doesn’t mean they are a clueless junior.
What, I have to have scientific papers for every fucking opinion I have? The initial parent post was an anecdotal opinion. Your post is an opinion. I can't have opinions here without citing a scientific paper that's 20 pages long and no one is going to read but just blindly trust because it's "science"? Come on. What I'm saying is self-evident to people who know. There are thousands of things like this in the world where people just know even though statistical proof hasn't been measured or established. For example, eating horse shit every day probably isn't healthy even though there isn't SCIENCE that directly proves it's unhealthy. Type checking is just one of those things.
OBVIOUSLY I think development is overall much better, much faster and much safer with types. I can't prove it with metrics, but I'm confident my "anecdotal" metrics, which I prefaced with "roughly", are "roughly" ballpark trueish.
>This seems very domain-specific.
Domain specific? Basic orientation with quaternions and euler angles is specific to reality. Orientation and rotations exist in reality and there are thousands and thousands of domains that use it.
Also the example itself is generic. Replace euler angles and quats with vectors and polar coordinates. Or cats and dogs. Same shit.
>I’ve got many years of experience with static typed languages over a 25 year career. Just because somebody disagrees with you, it doesn’t mean they are a clueless junior.
The amount of years of experience is irrelevant. I know tons of developers with only 5 years of experience who are better than me and tons of developers with 25+ who are horrible.
I got 25 years as well. If someone disagrees with me (on this specific topic), it absolutely doesn't mean they are a junior. It means they lack knowledge and experience. This is a fact. It's not an insult. It just means for a specific thing they don't have experience or knowledge which is typical. I'm sure there's tons of things where you could have more experience. Just not this topic.
If you have experience with static languages it likely isn't that extensive. You're likely more of an old school python guy who spent a ton of time programming without types.
No, but if you’re going to say things like “increase safety by roughly 2x” then if you can’t even identify the unit then you are misleading people.
It’s absolutely fine to have an opinion. It’s not fine to make numbers up.
> I'm confident my "anecdotal" metrics with I prefaced with "roughly" are "roughly" ballpark trueish.
Okay, so if it’s 1.5×, 2.0×, or 2.5×… again, what metric? What unit are we dealing with?
You’re claiming that it’s “in the ballpark”, but what is “in the ballpark”? The problem is not one of accuracy, the problem is that it’s made up.
> If someone disagrees with me (on this specific topic), it absolutely doesn't mean they are a junior. It means they lack knowledge and experience. This is a fact.
It’s not a fact, it’s ridiculous. You genuinely believe that if somebody disagrees with you, it’s a fact that they lack knowledge and experience? It’s not even remotely possible for somebody to have an informed difference of opinion with you?
So when I talk about multipliers I have to have a unit? What is the unit of safety? I can't say something like 2x more safe? I just have to say more safe? What if I want to emphasize that it can DOUBLE safety?
Basically with your insane logic people can't talk about productivity or safety or multipliers at the same time because none of these concepts have units.
Look I told YOU it's anecdotal, EVERYONE can read it. You're no longer "deceived" and no one else is.
>Okay, so if it’s 1.5×, 2.0×, or 2.5×… again, what metric? What unit are we dealing with?
If you don't have the capacity to understand what I'm talking about without me specifying a unit, then I'll make one up:
I call it safety units: the number of errors you catch in production. That's my unit: 1 caught error in prod in a year. For untyped languages let's say you catch about 20 errors a year. With types that goes down to 10.
>It’s not a fact, it’s ridiculous. You genuinely believe that if somebody disagrees with you, it’s a fact that they lack knowledge and experience? It’s not even remotely possible for somebody to have an informed difference of opinion with you?
What? and you think all opinions are equal and everyone has the freedom to have any opinion they want and no one can be right or wrong because everything is just an opinion? Do all opinions need to be fully respected even though it's insane?
Like my example, if you have the opinion that eating horse shit is healthy, I'm going to make a judgement call that your opinion is WRONG. Lack of Typing is one of these "opinions"
> If someone disagrees with me (on this specific topic), it absolutely doesn't mean they are a junior. It means they lack knowledge and experience. This is a fact.
You think it’s impossible for anybody to have an informed opinion that disagrees with yours. You literally think yours is the only possible valid opinion. If that doesn’t set off big warning bells in your head, you are in dire need of a change in attitude.
This conversation is not productive, let’s end it.
I mean do you think we should have a fair and balanced discussion about the merits of child molestation and rape? We should respect other people's opinion and not tell them they are wrong if their opinion differs? That's what I think of your opinion. I think your opinion is utterly wrong, and I do think my opinion is the valid opinion.
Now that doesn't mean I disrespect your opinion. That doesn't mean you're not allowed to have a different opinion. It just means I tell you straight up: you're wrong and you lack experience. You're free to disagree with that and tell me the exact same thing. I'm just blunt, and I welcome you to be just as blunt to me. Which you have.
The thing I don't like about you is that you turned it into discussion about opinions and the nature of holding opinions. Dude. Just talk about the topic. If you think I'm wrong. Tell me straight up. Talk about why I'm wrong. Don't talk about my character and in what manner I should formulate opinions and what I think are facts.
>This conversation is not productive, let’s end it.
I agree let's end it. But let's be utterly clear. YOU chose to end it with your actions by shifting the conversation into saying stuff like "you literally think yours is the only possible opinion." Bro. All you need to do is state why you think my opinion is garbage and prove it wrong. That's the direction of the conversation, you ended it by shifting it to a debate on my character.
It's a metric (how much more productive he is), and anecdotal (base only on his experience). Pretty obvious I would have thought.
> This seems very domain-specific.
It was an example from one domain but all domains have types of things. Are you really trying to say that only 3D games specifically would benefit from static types?
> Just because somebody disagrees with you, it doesn’t mean they are a clueless junior.
Clueless senior then I guess? Honestly I don't know how you can have this much experience and still not come to the obvious conclusion. Perhaps you only write small scripts or solo projects where it's more feasible to get away without static types?
What would you say to someone who said "I have 25 years of experience reading books with punctuation and I think that punctuation is a waste of time. Just because you disagree with me doesn't mean I'm clueless."?
Or have enough experience to have lived e.g. the J2EE and C++ template hells and see where this is going.
In general types outweigh no types EVEN with the above.
Obviously this post is still firmly in made-up statistics land, but I agree with OP; in some cases they absolutely do.
New code written by yourself? No, probably not. But refactoring a hairy old enterprise codebase? Absolutely a 2×, 3× multiplier to productivity / time-to-correctness there.
> But it's really valuable documentation! Knowing what types are expected and returned just by looking at a function signature is super useful.
So ... you didn't have this realisation prior to using Python type hints? Not from any other language you used prior to Python?
Maybe its time you expanded your horizons, then. Try a few statically typed languages.
Even plain C gives you a level of confidence in deployed code that you will not get in Python, PHP or Javascript.
Ironically, the worst production C written in 2025 is almost guaranteed to be better than the average production Python, Javascript, etc.
The only people really choosing C in 2025 are those with a ton of experience under their belt, who are comfortable with the language and its footguns due to decades of experience.
IOW, those people with little experience are not choosing C, and those that do choose it have already, over decades, internalised patterns to mitigate many of the problems.
At the end of the day, in 2025, I'd still rather maintain a system written in a statically typed language than a system written in a dynamically typed language.
Experienced users of C can't be the only people who use it if the language is going to thrive. It's very bad for a language when the only ones who speak it are those who speak it well. The only way you get good C programmers is by cultivating bad C programmers, you can't have one without the other. If you cut off the bad programmers (by shunning or just not appealing to them, or loading your language with too many beginner footguns), there's no pipeline to creating experts, and the language dies when the experts do.
The people who come along to work on their legacy systems are better described as archaeologists than programmers. COBOL of course is the typical example, there's no real COBOL programming community to speak of, just COBOL archeologists who maintain those systems until they too shall die and it becomes someone else's problem, like the old Knight at the end of Indiana Jones.
I don't think it's going to thrive. It's going to die. Slowly, via attrition, but there you go.
I've been dabbling with Go for a few projects and found the type system for that to be pleasant and non-frustrating.
With Python, PHP and Javascript, your only option is "comprehensive tests and no types".
With statically typed languages, you have options other than "types with no tests". For example, static typing with tests.
Don't get me wrong; I like dynamically typed languages. I like Lisp in particular. But, TBH, in statically typed languages I find myself producing tests that test the business logic, while in Python I find myself producing tests that ensure all callers in a runtime call-chain have the correct type.
BTW: You did well to choose Go for dipping your toes into statically typed languages - the testing comes builtin with the tooling.
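Roughly what that looks like in practice (a hypothetical sketch, not from the thread): in untyped Python you end up writing tests like the first one below just to pin down what types flow through a call chain, which an annotation plus a checker like mypy would verify statically, leaving the suite free to test actual business logic.

```python
from decimal import Decimal

def apply_discount(price: Decimal, percent: Decimal) -> Decimal:
    # The business logic actually worth testing.
    return price * (Decimal(1) - percent / Decimal(100))

# The kind of test you write when nothing is annotated: it only asserts the
# type flowing out of the call, something a static checker proves for free.
def test_returns_decimal():
    assert isinstance(apply_discount(Decimal("100"), Decimal("10")), Decimal)

# The test you want either way: does the math do the right thing?
def test_ten_percent_off():
    assert apply_discount(Decimal("100"), Decimal("10")) == Decimal("90")
```

With the signature annotated, the first test adds nothing the checker doesn't already guarantee at every call site.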