I'm Too Dumb for Zig's New Io Interface
Source: openmymind.net · Tech story, high profile · Heated/mixed debate · Posted 4 months ago
Key topics: Zig Programming Language, Io Interface, Software Development
The author struggles to understand Zig's new IO interface, sparking a discussion about the language's design and usability, with some defending the changes and others criticizing the complexity and lack of documentation.
Snapshot generated from the HN discussion
Discussion activity: very active. First comment 24m after posting; peak period 114 comments in 0-12h; average 14.5 comments per period; distribution based on 160 loaded comments.
Key moments
1. Story posted — Aug 23, 2025 at 2:39 AM EDT (4 months ago)
2. First comment — Aug 23, 2025 at 3:03 AM EDT (24m after posting)
3. Peak activity — 114 comments in 0-12h, the hottest window of the conversation
4. Latest activity — Aug 28, 2025 at 3:56 AM EDT
ID: 44993797 · Type: story · Last synced: 11/20/2025, 8:23:06 PM
The weird interface of Go is probably due to the fact that some interfaces can be used to extend the writer, like the hijacker interface (ResponseWriter.(http.Hijacker)), and the request object is used multiple times with different middlewares interacting with it. In short: the request does not need to be extended, but the response can be a websocket, a wrapped TCP connection, or something else.
You can lift/unlift into or out of arbitrary IO; in some languages one direction is called a mock, in other languages the opposite is called unsafeFoo.
Andrew Kelley independently rediscovered on a live stream 30 years of the best minds in Haskell writing papers.
So the future is Zig. He got there first.
Compute is getting tight, lots of trends, the age of C++ is winding down gracefully. The age of Zig is emerging deliberately, and the stuff in the middle will end up in the same historical trash bin as everything else in the Altman Era: the misfortunes of losing sight of the technology.
The age of C++ is going great, despite all its warts and unsafety, thanks to compiler frameworks like GCC and LLVM, games industry, GPGPU and Khronos APIs.
Even if C++ loses everywhere else, it has enough industry mindshare to keep being relevant.
Same applies to C, in the context of UNIX clones, POSIX, Khronos, embedded.
Being like Modula-2 or Object Pascal in safety, with C-like syntax, isn't enough.
Rust makes false promises in practical situations. It invented a notion of safety that is neither well posed, nor particularly useful, nor compatible with ergonomic and efficient computing.
Its speciality is marketing, and we already know the bounding box on its impact or relevance. "Vibe coding" will be a more colorful and better remembered mile marker of this lousy decade in computers than Rust, which will be an obscurity in an appendix in 100 years.
"Makes predictions to within a quantifiable epsilon"? What in the world do you mean? The industry experience with C++ is that it is extremely difficult (i.e., expensive) to get right, and C++20 or newer does not change anything about that. Whatever "epsilon" you are talking about here surely has to be very large for a number bearing that sobriquet.
As for the mindless anti-Rust slander... I'm not sure it's worth addressing, because it reflects a complete lack of the faintest idea about what it actually does, or what problem it solves. Let me just say there's a reason the Rust community is rife with highly competent C++ refugees.
I doubt it.
I'm teaching a course on C this fall. As textbook I've chosen "Modern C" by Jens Gustedt (updated for C23).
I'm asked by students "Why don't you choose K&R like everyone else?"
And while the book is from 1978 (ANSI C edition in 1988), and something I've read joyously more than once, I'm reminded of how decades of C programmers have been doing things "the old way" because that's how they're taught. As a result, the world is made of old C programs.
With this momentum of religiously rewriting things in Rust we've seen in the last few years (how many other languages have rewritten OpenSSL and the GNU coreutils?), the amount of things we depend on that were incidentally rewritten in Rust grows significantly.
Hopefully people won't be writing Rust in 100 years. Since 100 years ago mathematicians were programming mechanical calculators and analog computers, and today kids are making games. But I bet you a whole lot of infrastructure still runs Rust.
In fact, anything that is convenient to Vibe code in the coming years will drown out other languages by volume. Rust ain't so bad for vibe coding.
There is a place to learn about history of computing, and that is where K&R C book belongs to.
Not only is it the old way, it is from the age of dumb C compilers, and it takes no advantage of all the latitude recent standards give compiler writers to take optimizations to the next level, not always with expected results.
Maybe getting students to understand the ISO C draft is also an interesting exercise.
Please stop. Rust's promise is very simple. You get safety without the tracing GC. It also gives you tools to implement your own safe abstraction on top of unsafe, but you are mostly on your own (miri, asan, and ubsan can still be used).
Neither Rust nor Ada nor Lean nor Haskell can guarantee there are no errors in their implementations.
Similarly, none of the listed languages can even try to show that a bad actor can't write bad code or design bad hardware in a way that maintains their promises. If you need that, you need to invent the Omniscient Oracle, not a program.
I hate this oft repeated Nirvana fallacy. Yes, Rust is offering you a car with seatbelts and airbags. It is not offering a car that guarantees immortality in the event of a universe collapse.
Because it's technically true. The best kind of true!
Sorry, I meant to say the opposite of truth. Neither Rust nor Ada/SPARK, which use LLVM as a backend, can prove that they are correct if LLVM has bugs.
In the same way, I can't guarantee tomorrow I won't be killed by a rogue planet hitting Earth at 0.3c. So I should probably start gambling and doing coke, because we might be killed tomorrow.
> Every single project needs to "fix" the same kind of safety issues over and over again
I doubt that's the biggest problem. Each of the unsafe libraries in C/C++/Zig can be perfectly safe given invariants X and Y, respectively. What happens if you have two (or more) libraries with subtly non-compatible invariants? You get non-composable libraries. You end up with the reverse problem of the NPM world.
There are some scary soundness holes in Rust's compiler that will get patched eventually, but in principle you could trip them today. They're often "But why would anybody even do that?" problems, but it's technically legal Rust, and the compiler doesn't reject your program or even ICE; it just miscompiles your input, which is not what we want.
By that logic, we definitely have enough safe languages as it is, as there are many more. But this safe/unsafe dichotomy is silly, and is coloured by languages that are unsafe in some particular ways.
1. Memory safety is important because memory-safety violations are a common cause of dangerous security vulnerabilities. But once you remove out-of-bounds access, as Zig does, memory safety doesn't even make it to the top 5: https://cwe.mitre.org/top25/archive/2024/2024_cwe_top25.html I.e. the same logic that says we should focus on safety would lead us to conclude we should focus on something else.
2. Memory safety has a cost. To get it, you have to give up something else (there could even be a cost to correctness). That means that you have to consider what you're getting and what you're losing in the context of the domain you're targeting, which is not the same for all languages. C++, with its "zero-cost abstractions", believed it could be everything for everyone. That turned out not to be the case at all, and Zig is a very different language, with different goals, than C++ originally had.
Given Zig's safety guarantees (which are stronger than C++'s), and given its goals (which are different from C++'s), the question should be what should we be willing to give up to gain safety from use-after-free given the language's goals. Would more safety be better if it cost nothing? Of course, but that's not an option. Even Java and Rust could prevent many more dangerous bugs - including those that are higher risk than use-after-free - if they had more facilities like those of ATS or Idris. But they don't because their designers think that the gains wouldn't be worth the cost.
If you don't say what Zig programmers should give up to gain more safety, saying "all new languages should be memory-safe" is about as meaningful as saying we should write fewer bugs. That's a nice sentiment, but how and at what cost?
I am a firm believer in the vision of Xerox PARC for computing, and I think the only reasons we aren't there yet are politics, lack of funding from management for doing the right thing and pushing it into the market, always looking to shareholders and the next quarter, and naturally programming language religion.
We were already on the right direction with languages like Modula-3 and Active Oberon, following up on Cedar influences, unfortunately that isn't how the industry goes.
Rational started as a company selling Ada machines, which didn't have such issues with compilation times, but again it comes down to the reasons I listed for why the mainstream keeps ignoring such tools until, finally, governments step in.
What is that in relation to Zig and memory safety? Am I missing some context?
Instead we got UNIX and C.
Notice that this cost, which proponents of Zig scoff at just like C++ programmers before them, is in fact the price of admission. "OK, we're not correct but..." is actually the end of the conversation. Everybody can already do "Not correct", we had "Not correct" without a program, so all effort expended on a program was wasted unless you're correct. Correctness isn't optional.
A language like Rust exists precisely because correctness isn't the only concern, as most software is already written in languages that make at least as many guarantees. Rust exists because some people decide they don't want to pay the price other languages take in exchange for their guarantees, but they can afford to pay Rust's price. But the very same reasoning applies to Rust itself. If Rust exists because not all tradeoffs are attractive to everyone, then clearly its own tradeoffs are not attractive to everyone.
The goal isn't to write the most correct program; it's to write the most correct program under the project's budget and time constraints. If you can't do it in those constraints, it doesn't matter what guarantees you make, because the program won't exist. If you can meet the constraints, then you need to ask whether the program's correctness, performance, user-friendliness etc. are good enough to serve the software's purpose.
And that's how you learn what software correctness researchers have known for a long time: sometimes increasing correctness guarantees can have unintuitive cost/benefit interactions, to the point that they may even end up harming correctness.
There are similar unintuitive results in other disciplines. For example, in software security there's what I call the FP/FN paradox. It's better to have more FN (false negatives, i.e. let some attacks go through) than more FP (false positives, i.e. block interactions that aren't attacks) because FPs are more likely to lead to misconfiguration or even to abandonment of the security mechanism altogether, resulting in weaker security. So, in software security it's a well known thing that to get better security you sometimes need to make fewer guarantees or try less hard to stop all attacks.
In a better, saner world, we'd be writing Ada++ not C++. However, we don't live in a perfect world.
> The goal isn't to write the most correct program; it's to write the most correct program under the project's budget and time constraints.
The goal of ANY software engineer worth their salt should be minimizing errors and defects in their end product.
This goal can be reached by learning to write Rust; practice makes perfect.
If GC is acceptable or you need lower compilation times, then yes, go and write your code in C#, Java, or JavaScript.
As someone who worked on safety-critical air-traffic-control software in the nineties, I can tell you that our reasons for shifting to C++ were completely sane. Ada had some correctness advantages compared to C++, but also disadvantages. It had drastically slower build times, which meant we couldn't test the software as frequently, and the language was so complicated that we had to spend more time digging into the minutiae of the language and less time thinking about the algorithm (C++ was simpler back then than it is now). When Java became good enough, we switched to Java.
Build times and language complexity are important for correctness, and because of them, we were able to get better correctness with C++ than with Ada. I'm not saying this is universal and always the case, but the point is that correctness is impacted by many factors, and different projects may find achieving higher correctness in different ways. Trading off fewer use-after-free for longer build times and a more complex language may be a good tradeoff for the correctness of some projects, and a bad tradeoff for others.
> If GC is acceptable or you
BTW, a tracing GC - whose costs are now virtually entirely limited to a higher RAM footprint - is acceptable much more frequently than you may think. Sometimes, without being aware, languages like C, C++, Rust, or Zig may sacrifice CPU to reduce footprint, even when this tradeoff doesn't make sense. I would strongly recommend watching this talk (from the 2025 International Symposium on Memory Management), and the following Q&A about the CPU/footprint tradeoff in memory management: https://www.youtube.com/watch?v=mLNFVNXbw7I
...to the extent possible within their project budget. Otherwise the product would — as GP already pointed out — not exist at all, because the project wouldn't be undertaken in the first place.
> This goal can be reached by learning to write Rust; practice makes perfect.
Pretty sure it could (at least) equally well be reached by learning to write Ada.
This one-note Rust cult is really getting rather tiresome.
The problem with this statement is that without a memory safety invariant your code doesn't compose. Some code might assume no UAF and other parts could and you'd have a mismatch. Just like borrow checker is viral, so is the unsafety.
> If you don't say what Zig programmers should give up to gain more safety, saying "all new languages should be memory-safe" is about as meaningful as saying we should write fewer bugs.
The goal of all engineering disciplines, including software, should be a minimization of errors and defects.
Here is how engineering in any other non-computer science field takes place. You build something. See where it breaks; try to build it again given time and budget constraints. Eventually you discover certain laws and rules. You learn the rules and commit them to a shared repository of knowledge. You work hard to codify those laws and rules into your tools and practice (via actual government laws). Furthermore, you try to build something again, with all the previous rules, tools, and accumulated knowledge.
How it works in tech. You build something. See where it breaks, say that whoever built it was a cream-for-brain moron and you can do it better and cheaper. Completely forget what you learned building the previous iteration. See where it breaks. Blame the tools for failure; remove any forms of safety. Project cancelled due to excessive deaths. Bemoan the lack of mental power in newer hires or lack of mental swiftness in older hires. Go to step 1.
You'll notice a stark contrast between Engineering and Computer Tech. Computer tech is pop culture. It's a place where people wage wars about whether lang X or lang Y is better. How many times did the programming trend swing from static to dynamic typing and back? How many times did programming learn a valuable lesson, only for everyone to forget it, until decades later another language resurrected it?
Ideally, each successive language would bring us closer and closer to minimizing defects, with more (types of) safety and better guarantees. Is Rust a huge leap compared to Idris? No, but it's better than Ada at memory safety, that's for sure.
But it's managed to capture a lot of attention, and it is a much stricter language than many others. It's a step towards ideal. And how do programmers react to it? With disgust and a desire for less safety.
Sigh. I guess we deserve all the ridicule we can get.
Yes, but that holds for any correctness property, not just the 0.0001% of them that memory safe languages guarantee. That's why we have bugs. The reason memory safety is a focus is because out-of-bounds access is the leading cause of dangerous vulnerabilities.
> The goal of all engineering disciplines, including software, should be a minimization of errors and defects.
Yes, but practical minimisation, not hypothetical minimisation, i.e. how can I get the fewest bugs while keeping all my constraints, including budget. Like I said, a language like Rust exists because minimisation of errors is not the only constraint, because if it were, there are already far more popular languages that do just as much.
> You'll notice a stark contrast between Engineering and Computer Tech.
I'm not sure I buy this, because physical, engineered objects break just as much as software does, certainly when weighted by the impact of the failure. As to learning our lessons, I think we do when they are actually real. Software is a large and competitive economic activity, and where there's a real secret to more valuable software, it spreads like wildfire. For example, high-level programming languages spread like wildfire; unit tests and code review did, too. And when it comes to static and dynamic typing, the studies on the matter were inconclusive except in certain cases such as JS vs TS; and guess what? TS has spread very quickly.
The selective pressures are high enough, and we see how well they work frequently enough that we can actually say that if some idea doesn't spread quickly, then it's likely that its impact isn't as high as its fans may claim.
> And how do programmers react to it? With disgust and a desire for less safety.
I don't think so. In such a large and competitive economic activity, the assumption that the most likely explanation to something is that the majority of practitioners are irrational seems strange to me. Rust has had some measure of adoption and the likeliest explanation for why it doesn't have more is the usual one for any product: it costs too much and delivers too little.
Let's say that the value, within memory safety, between spatial and temporal safety is split 70-30; you know what? let's say 60-40. If I can get 60% of Rust's value for 10% of Rust's cost, that's a very rational thing to do. I may even be able to translate my savings into an investment in correctness that is more valuable than preventing use-after-free.
Rust achieves practical minimization, if not outright eradication, of a set of errors even in practice. And not just memory safety errors.
> Like I said, a language like Rust exists because minimisation of errors is not the only constraints, because if it were, there are already far more popular languages that do just as much.
The reason Rust exists is that the field hasn't matured enough to accept better engineering practices. If everyone could write and think in pre/post/invariant way, we'd see a lot fewer issues.
> I'm not sure I buy this, because physical, engineered objects break just as much as software does, certainly when weighted by the impact of the failure.
Dude, the front page was about how Comet AI browser can be hacked by your page and ordered to empty your bank account. That's like your fork deciding to gut you like a fish.
> the assumption that the most likely explanation to something is that the majority of practitioners are irrational seems strange to me.
Why? Just because you are intelligent doesn't mean you are rational. Plenty of smart people go bonkers. And looking at the state of the field as a whole, I'd have to ask for proof it's rational.
It achieves something in one way, while requiring you to pay some price, while other languages achieve something in a different way, with a different cost. I've been involved with software correctness for many, many years (early in my career I worked on safety-critical, hard realtime software, and then practised and taught formal methods for use in industry), and there is simply no research, none, suggesting that Rust's approach is necessarily the best. Remember that most software these days - not the kind that flies aeroplanes or controls pacemakers, that's written mostly in C, but the kind that runs your bank, telecom supplier, power company, healthcare etc. but also Facebook - is written in languages that offer the same guarantees as Rust, for better or worse.
> If everyone could write and think in pre/post/invariant way, we'd see a lot fewer issues.
Except I've worked with formal methods in a far more rigorous way than Rust offers (well, it offers almost nothing), and in this field there is now the acknowledgement that software correctness can be achieved in many different ways. In the seventies there was a consensus about how to write correct software, and then in the nineties it all got turned around.
> That's like your fork deciding to gut you like a fish.
I don't think so, because everyone uses a fork but this is the first time I hear about Comet AI. The most common way to empty people's bank accounts is still by conning them.
> Why? Just because you are intelligent doesn't mean you are rational.
Intelligence has nothing to do with it. But "rational" here means "in accordance with reality", and software is a major competitive economic activity, which means that if some organisations act irrationally, there's both a strong motivation and an ability to take their lunch money. If they still have it, they're probably not behaving irrationally.
Pet projects are nice, but slow down with the copium intake.
The future is many things, but a love letter to C is definitely not it.
Zig is cute and a fun hobby project that might see above average success for a hobby project. But that's about it. It doesn't address the problems people actually have with C++, not like Rust or Swift do, and it certainly isn't going to attract any attention from the Java, JavaScript, C#, Python, etc... of the world.
> It doesn't address the problems people actually have with C++, not like Rust or Swift do
Speak for yourself. Zig addresses pretty much all of my problems with C++:
- terrible compile times
- overly complex and still underpowered template/metaprogramming
- fragmented/ancient build tools
- no actually good enums/tagged unions
- exceptions
- outdated OOP baggage
I don’t actually mind C++ that much, but Zig checks pretty much all of my boxes. Rust/swift check some of these but not all and add a few of their own.
> and it certainly isn't going to attract any attention from the Java, JavaScript, C#, Python, etc... of the world.
Yeah of course. Zig isn’t trying to be a max ergonomics scripting language…
That doesn't seem that odd to me. It's a trade off: more flexibility, but more manual work. Maybe I have a buffer that I've allocated that I'm not using anymore (say I have a buffer pool) and want to use it again. If the type allocates its own behind the scenes, I can't do that. Or maybe I'm working in an environment where I need to statically allocate all of my resources up-front, and can't allocate later.
The big downside is that if 90% of people are just going to allocate a buffer and pass it in, it sucks that 90% of people need to do more work and understand more minutiae when only 10% of the people actually need to. The holy grail is to give lots of flexibility, but make the simple/common case easy.
A simple improvement to this interface might be to allow the caller to pass a zero-length buffer (or Zig's version of null), and then the type will allocate its own buffer. Of course, there's still a documentation burden so people know they can do that. Another option could be to have a second constructor function that takes no buffer arguments at all, which allocates the buffers and passes them to the fully flexible constructor function.
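This "allocate only if the caller passed nothing" convention already exists in Go's standard library (io.CopyBuffer allocates its own buffer when given nil). A sketch of the same idea for a hypothetical type (`Parser`, `NewParser`, and the 4096-byte default are all made up):

```go
package main

import "fmt"

// Parser keeps a scratch buffer for its work.
type Parser struct {
	buf []byte
}

// NewParser is the fully flexible constructor: the caller may supply a
// reusable buffer (e.g. from a pool, or statically allocated up-front),
// or pass nil to let the library allocate a default one. The common case
// stays easy while the callers who need control keep it.
func NewParser(buf []byte) *Parser {
	if buf == nil {
		buf = make([]byte, 4096) // default for the 90% case
	}
	return &Parser{buf: buf}
}

func main() {
	easy := NewParser(nil)                    // 90% case: library allocates
	tuned := NewParser(make([]byte, 64*1024)) // 10% case: caller controls
	fmt.Println(len(easy.buf), len(tuned.buf))
}
```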
Isn't that the reason why Zig passes around allocators everywhere? If you're using a buffer pool, you should probably be handing out some kind of buffer pool allocator.
Requiring all allocation to have happened before execution is still a good reason to pass buffers around, but I feel like the other situations you describe can be solved by just passing the right allocators.
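Go doesn't thread allocators through APIs the way Zig does, but the buffer-pool case above can be sketched with sync.Pool, which plays a similar recycling role behind a single handle (`process` and the 4 KiB size are illustrative, not from any real API):

```go
package main

import (
	"fmt"
	"sync"
)

// bufPool recycles 4 KiB scratch buffers. Handing this pool to code that
// needs buffers is loosely analogous to passing a pool-backed allocator
// into a Zig API instead of passing raw buffers around.
var bufPool = sync.Pool{
	New: func() any { return make([]byte, 4096) },
}

// process borrows a buffer, uses it, and returns it to the pool.
func process(data string) int {
	buf := bufPool.Get().([]byte)
	defer bufPool.Put(buf) // return the buffer for reuse
	return copy(buf, data)
}

func main() {
	fmt.Println(process("hello"))
}
```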
I would say just stay away from the standard library for now and use your OS API, unless you're willing to be a beta tester.
Personally, I think it is wrong to inflict your experiments on other people and, when you pull the rug out from underneath them, say: well, we told you it was unstable, you shouldn't have depended on us in the first place.
I don't even understand what zig is supposed to be. Matklad seems to think it is a machine level language: https://lobste.rs/s/ntruuu/lobsters_interview_with_matklad. This contrasts with the official language landing page: Zig is a general-purpose programming language and toolchain for maintaining robust, optimal and reusable software. These two definitions are mutually incompatible. Moreover, zig is clearly not a general purpose language because there are plenty of programming problems where manual memory management is neither needed nor desirable.
All of this confusion is manifest in zig's instability and bloated standard library. Indeed a huge standard library is incompatible with the claims of simplicity and generality they frequently make. Async is not a feature that can be implemented universally without adding overhead and indirection because of the fundamental differences in capabilities exposed by the various platforms. Again, they are promising a silver bullet even though their prior attempt, in which they publicly proclaimed function coloring to be solved, has been abandoned. Why would we trust them to get it right a second time?
There are a very small number of assembly primitives that every platform provides that are necessary to implement a compiler: load/store/mov/inc/jeq/jump and perhaps a few others. Luajit implements its parser in pure assembly, and I am not aware of an important platform that zig supports but luajit does not. I do the vast majority of my programming in lua and _never_ run into bugs in the interpreter. I truly cannot think of a single problem that I think zig would solve better than luajit. Even if one did exist, I could embed the zig code in my lua file, use lua to drive the zig compiler, and then call into the specialized code using the lua ffi. But the vast majority of code does not need to be optimized to the level of machine code where it is worth putting up with all of the other headaches that adopting zig will create.
The hype around zig is truly reaching llm levels of disconnection from reality. Again, to believe in zig, one has to believe it will magically develop capacities that it does not presently have and for which there is no plan to actually execute besides vague plans of just wait.
For a lot of simple calls, that works out pretty well, once you know all the tricks to Zig's syntax. A lot of requirements and implications are generally written out in very simple and short functions that are often logically named. Things like Allocators are pretty easy conceptually even if you probably don't want to write one yourself.
It all breaks down when you start dealing with complex concepts. Zig's new I/O system looks a lot like Java's streams and wrappers and readers and writers, all wrapping around each other to make sending encrypted text over a secure channel as simple as output.write("hello").
I think the new I/O system combined with the lack of documentation about how to use it was a mistake. I'm not even sure if expressing a typing system as complicated as this in the Zig standard library is a good idea. The entire language runs on clear, concise, short and readable methods, and the new system doesn't seem idiomatic in that way.
If the tradeoff was absolute performance/avoiding the introduction of load-bearing, performance-lowering abstractions, I think that goal was achieved, but DX may have gone out the window.
Not trying to imply that's an explicit goal (it's probably just a resource problem), just an observation.
Generally speaking I think it is the right tradeoff for now. Purely inferring from Andrew's and the Zig team's online character, as I don't know them in person, I think they do care a lot about DX, things like compile speed and tooling. So once 1.0 comes, I won't be surprised if it has extremely good documentation as well.
And I would argue, writing good, simple, clear, detailed documentation is actually harder than writing code itself.
In 2018, seven years ago, Andrew announced he'd go full-time on Zig and quit his paying job to live off donations instead.
In 2020, so five years ago, Zig's 501(c)3 the ZSF was announced, to create a formal structure to hire more people in addition to the few already on Zig.
So, "most of its time" is just not true. For "most of its time" Zig was a small, largely independently funded project for multiple people, for a tiny period it was a part-time project, and for a while after that it was solo, but those weren't the majority of its existence.
I like to compare this to real world cathedral building. There are some cathedrals that are literally taking centuries to build! It's OK if the important, but difficult thing takes a long time to build.
If you keep up the development pace you're going to approach stability. Unless you're in a manic spiral of rewrites.
Making a programming language from scratch is a long endeavor when it's a one man project.
They unarguably cause confusion for everyone as they change.
But it lets you choose the right abstractions that are going to stick for decades.
If you're going to make a python2 -> python3 transition in your language, make sure it's X0 -> X1.
That said, others have pointed out that writing documentation and tests helps improve quality quite a bit, and in this case it would also increase usability. I think I'd agree with this stance, but there is no way I could make the statement that even most of the code I've written for public consumption had excellent documentation or examples. So I've got no leg to stand on there, just the armchair.
> And I would argue, writing good, simple, clear, detailed documentation is actually harder than writing code itself.
All the more reason why it must be done! A little silly, but from my armchair, maybe it's one of those "start with the interface you want and work backwards" situations; the problem is that approach can be at odds with mechanical sympathy, and we know which side Zig lands on (and arguably should land on, based on its values).
If so, I believe Zig will stay within a niche. Lower entry barriers allow "script kiddies" to easily start with the language, and they eventually become leading engineers. Only a few people go straight for the highest practice without "playing around". IMHO that's the reason why PHP got so popular (it was not good back then, just very, very easy to start with).
Yes.
I think a contributor that really wanted to help the ecosystem would start in the stdlib and then start moving outwards. Even if it was LLM-assisted, I think it could be high value.
IIRC Loris already has an engine for building websites with Zig, but making sure that every Zig library has docs (similar to rustdocs) might be a great start. It is incredibly useful to have a resource like rustdocs, both the tooling and the web sites that are easily browsable.
Again, maybe everyone in the Zig ecosystem just has amazing editor setups and massive brains, but I personally really like the ease of browsing rustdoc.
> If so, I believe Zig will stay within a niche. Lower entry barriers allow "script kiddies" to easily start with the language, and they eventually become leading engineers. Only a few people go straight for the highest practice without "playing around". IMHO that's the reason why PHP got so popular: it was not good back then, just very, very easy to start with.
I agree, but I'd add that the niche they're aiming for is systems programming, so they're probably fine :). The average hacker there is expecting C/C++ or to be near the metal, and I think Zig is a great fit there. They're likely not going to convince people who write Ruby, but it feels reasonable for C hackers.
Also I want to just be clear that I think Zig has a lot of motivating factors! They're doing amazing things like zig cc, unbelievably easy, "can't believe it's not butter" cross-compilation, their new explicit/managed I/O mechanism, explicit allocators as a default, comptime, better type ergonomics. It's a pretty impressive language.
Tbh, this sort of auto-generated docs from source code is not all that useful, since you get that same information right in the IDE via the language server.
The important documentation part that's currently missing is how everything is supposed to work together in the stdlib, not the 'micro-documentation' of what a single type or function does. And for this sort of information it's currently indeed better to look at example code (e.g. the stdlib's testing code).
IMHO it's way too early for this type of high-level documentation, since things change all the time in the stdlib. Putting much work into documenting concepts that are discarded again anyway doesn't make much sense.
I am just editing docs now that Claude Code writes for me. I am fanatic about developer docs (and I guess an exception as I love writing them) but with a set of concise instructions for CC and some writing style examples I get 90% there, sometimes 99%.
If you believe you don't have time for the last 1–10%, you should not be in charge of writing any API used by anyone but yourself. Just my two c.
Lack of docs also cripple AI from understanding, so future adoption becomes even more bleak.
If an API or library developer didn't bother doing even bare-minimum docs, my confidence in the library drops as well.
Did they skip testing as well? Ran the happy path for a day and called it good?
This post soured my interest in Zig. It's now obvious to me why Rust took much of its market.
Zig (programming language) - First appeared 8 February 2016; 9 years ago
Rust (programming language) - First appeared January 19, 2012; 13 years ago
Also, Zig at this point isn't really a brand new language anymore. I have comments on their issues dating back to 2018, so it's been a very active language since at least then.
Just getting started is an even bigger reason to have good docs to clearly communicate how the libraries and APIs work!
I wouldn't even read a pull request containing a new function if the creator didn't bother writing a short description and usage clarification.
Getting started is a good excuse for limited libraries or support (same situation with Rust). But lack of even basic docs is not acceptable if you want user adoption.
Of course, this isn't meant to be a defense of the lack of documentation on Zig's side, but in my experience, Zig's code is definitely much easier to read, simply because Rust's std code is akin to C++'s STL.
One of the personal gripes I have with Zig is that `anytype` makes the function contract kind of meaningless, because you can't see what is expected purely from the function definition.
I think what I’ve come to realize is that when I feel a barrier toward doing that work, it’s a sign that I don’t actually like the underlying API. I don’t want to document it or craft examples or tutorials because in my mind the API sucks, it’s transitional, and documenting it in its current state is going to somehow lock in a bad interface and further increase the effort required to change and fix it up later.
I tried Zig a couple of times and I got that feeling: very powerful and clever language but not really for me, I don't have the headspace, sorry. I need something I can debug after an 8 hours dayjob, a commute and having put the kids to bed. It better be inviting & fun! (Hi, C).
(Loris Cro being a key community figure isn't helping in any way, and it's a good reminder that if you don't clear bullies out of your community from the beginning, they will turn your entire community into a miserable place. And that's a shame because, from what I've seen, Andrew Kelley seems to be a very cool guy in addition to being very smart).
It is the anti-intellectualism from Go culture, gone wild against C++, Rust, Swift, anything modern, or even tools, or using game engines versus doing the whole computer from scratch for a game.
Andrew's talk is here (second event after the two people chatting while sitting on chairs): https://handmadecities.com/media/seattle-2024/hms-day-one/
Here you can see a particularly funny (but also sad) reaction by one of these people https://drive.proton.me/urls/MB1EB4EF34#YZdvmAvBFp1C
> using game engines versus doing the whole computer from scratch for a game
That said you are doing yourself a disservice if you think that not using an engine to make a game is a form of "anti-intellectualism".
https://wiki.xxiivv.com/site/2025.html (the entry under 19b)
It seems this was a right vs. left (or liberal) split.
All this, combined with the fact that Zig at best is still beta quality and at worst amounts to a massive waste of everyone's time, makes it unsurprising that people block you and simply refuse to engage with your loud community efforts, endless churn, and cruft tied to a beta-quality compiler.
> Zig is not really a handmade project, case in point both Andrew and I are blocked on social media by the two gods of the handmade movement (casey and john) and, according to their die hard fans, Andrew gave a talk at the last handmade conference that caused the community to split apart (the reality is a bit more complex than this, but Andrew's talk is certainly one that you wouldn't see at their new "better software" conference).
> Andrew's talk is here (second event after the two people chatting while sitting on chairs): https://handmadecities.com/media/seattle-2024/hms-day-one/
> Here you can see a particularly funny (but also sad) reaction by one of these people https://drive.proton.me/urls/MB1EB4EF34#YZdvmAvBFp1C
Regarding the links you posted:
In the first, at 2:30:40, Andrew Kelley publicly calls out a specific author of a competing technology in an exaggerated, caricatured, and fabricated context.
In the second video, yet another author of yet another competing technology directly points out this unapologetic and concerning behavior on Andrew Kelley's part.
And now you—“VP of Community @ Zig Software Foundation”—assert your “righteous” stance by sharing these videos, while ironically pointing out that some of those same individuals (of competing technologies fame) block you on social media.
Too bad that doing your job probably means being as loud and visible online as possible to spread the molecules of Zig no matter what.
Depends on the attitude. Not using an engine because one wants to learn the whole stack makes all the sense; after all, people need to learn how to make game engines, if nothing else for the next generation.
Not using one out of spite, with a we-do-everything-handmade-over-here attitude, is a completely different matter.
Maybe, or maybe the fact that Zig is a small independent project with limited resources has also something to do with it, and this kind of shaming says less about Zig than you'd think.
When I first joined the Zig project, Zig was still using the bootstrap compiler written in C++ that would not free memory (it took more than 4GB to compile the Zig compiler). Some people at the time were asking us to prioritize work on the package manager but Andrew rightfully wanted to prioritize rewriting the compiler instead. In hindsight this was the obviously right decision: a package manager implies that one can very easily add an order of magnitude more code to their project, stressing the performance of the compiler. If we had not prioritized core infrastructure over giving people what they wanted faster, today we would have people complaining that adding a single dependency to their project makes the build impossible to complete.
The Zig project has a huge scope and we are a small independent organization. This makes us extremely nimble and efficient, but it does mean that we need to do things in the order that makes the most sense for the project, not for what the public wants.
The fact that we develop in the open doesn't mean that the language is ready yet.
People that already have the required domain knowledge (and who have a tolerance for breaking changes) will have the opportunity to be early adopters if they wish to do so, others will have to wait for Zig to become more mature. And we do make this clear in releases and all forms of public communication.
We have gone a long way since the bootstrap compiler days, but we are still missing key infrastructure:
- we have an x86_64 custom backend, but aarch64 is not complete yet
- incremental compilation is showing that we can get instant rebuilds of large projects, but it has missing features and doesn't work on all platforms yet
- we need native fuzzing, since AFL keeps regressing every time a new version of LLVM comes out
- for the longest time we haven't had a strong I/O story; now we're finally working on it
The time for paving the road for a new generation of programmers will come (it's in the ZSF mission statement btw), but first we need to finish the plumbing.
That sounds strange. Modern C++ requires very little manual memory management, at least when you're writing something high-level like a compiler. C++11 had been out for years when development on Zig started. Were they writing C++ old-school as C-with-classes and malloc() everywhere? Why not use a more appropriate language for the first prototype of a compiler for a brand new language?
Why would you care about these kinds of micro-optimizations at that stage of development, when you don't even know what exactly you need to build? We're not talking about serious algorithmic improvements like turning O(n²) into O(n) here.
> Freeing memory can actually cost you performance, so why not just let the OS clean up for you at exit(2)?
Because a compiler is not some simple CLI tool with a fixed upper bound on resource consumption.
It turns out that compiler speed is bound by a bunch of things, and it's death by a thousand cuts. If you have a slow compiler, and it takes forever to compile your compiler, your language becomes sclerotic: no one wants to make changes, and your language gets stuck with shitty choices.
> Because a compiler is not some simple CLI tool with a fixed upper bound on resource consumption
yes, that's right. a compiler is complex and should use several different allocation strategies for different parts. if your language steers you towards using malloc for everything, then your compiler (assuming it's bootstrapped) will suffer, because sometimes there are better choices than malloc.
> When I first joined the Zig project, Zig was still using the bootstrap compiler written in C++ that would not free memory (it took more than 4GB to compile the Zig compiler). Some people at the time were asking us to prioritize work on the package manager but Andrew rightfully wanted to prioritize rewriting the compiler instead. In hindsight this was the obviously right decision: a package manager implies that one can very easily add an order of magnitude more code to their project, stressing the performance of the compiler. If we had not prioritized core infrastructure over giving people what they wanted faster, today we would have people complaining that adding a single dependency to their project makes the build impossible to complete.
[1] https://news.ycombinator.com/item?id=44994886
I think you may have missed that the intention was always to rewrite the compiler to become self hosted. Improving the C++ implementation any more would've been pointless optimization.
Or, maybe it's this kind of redirection and evidence of a victim complex. Part of the reason there's a patina of anti-Rust sentiment includes the dismissive attitude and swipes you, the VP of Community at the Zig Software Foundation, take towards Rust and Rust developers by writing about topics you aren't involved in and don't have a solid grasp of.
https://kristoff.it/blog/raii-rust-linux/ https://lobste.rs/s/hxerht/raii_rust_linux_drama#c_gbn1q8
Or similarly, comments like this one in this thread: https://news.ycombinator.com/context?id=44994749
Loris is a bully, that's the problem, not his opinions.
pcwalton infamously declared zig was "a massive step back for the industry" https://x.com/pcwalton/status/1568306598795431936?s=46&t=OCi.... He and the Rust Core Team had a big reputation for burning bridges. Even to this day, the new Rust leaders are happy to attack other memory safe languages like Go, declaring them "not memory safe" https://news.ycombinator.com/item?id=4467200
I think Kristoff remembers these attacks, and crucially how very few voices within the Rust community push back against Rust supremacism.
> [Golang] "not memory safe"
Both of these are entirely fair assessments, not "attacks". Golang really does have memory safety issues with concurrent code, and a memory-unsafe language like Zig is a step back even compared to Java/C#, let alone Rust.
The end result is that Rust's leaders either avoid interacting with other languages, or engage in flamewars. I think it's a big reason why Java, the most popular and successful memory safe language in the world, has little-to-no formal contacts with the Rust team.
I can't believe you really wrote that.
> Then [Anti Rust person] should have been perma-banned long ago [on an open forum]. Until this is done, we'll have to warn people about engaging with him…
You can hate kristoff, ignore him or attack his arguments. You can also love a piece of software and treat it as sacred. But other people should not be subjected to that love nor should they be canceled on account of it. Flag it or downvote. Beyond that, it's outside of our control.
I do agree though that kristoff should focus on Zig and not indulge in provoking old enemies. His valid points— deferring premature documentation for newbies until concepts are ironed out— are being lost in programming language holy wars.
I'm no language though…
> You can hate kristoff, ignore him or attack his arguments. You can also love a piece of software and treat it as sacred.
It's not about “love” or “being sacred”, or even about Rust or Zig; it is about behaving in society. Most successful communities at some point meet toxic people who want to start holy wars and insult people. Successful communities are the ones that ban those people, or at least coerce them into behaving through social pressure.
When you don't do that, you end up with bullies like Loris occupying prominent positions, and that's very bad for the community because it attracts people like that.
It has nothing to do with programming languages at all.
And sorry to say it bluntly, but your discourse about “supremacism”, “love”, or “sacred” sounds very immature: programming languages are tools and engineering projects, not icons to be worshipped or hated; they all have their strong points and warts (and god knows Rust has its share of annoyances…). Don't get dragged into holy wars by cult leaders like Loris.
What might be the better critique about this, is that any programming language's leadership should not be engaging in that kind of bad behavior. And any ill words coming from them about another language, should always be taken with a grain of salt and seen as likely bias.
Maybe you can draw a distinction between technical criticism like this, or the fact that Go isn't technically memory-safe[1], and Loris' abusive behavior of calling Rust maintainers names like “wankers”…
[1]: which is a criticism mostly coming from the Java crowd, by the way, not Rust, like the criticism of the simplistic garbage collection management in Go
It reflects poorly on any leadership engaging in that kind of bad and unprofessional behavior, and it eventually backfires on any project or person. People eventually notice it and figure out the foul things that have been or is being done, then demand accountability or walk away from the toxicity.
I will hopefully wait for comments from Ghostty, Bun or Tigerbeetle Devs.
On another point that is worth mentioning, which I hope Andrew will at least put out publicly: IMO Zig isn't anti-Rust, but it did attract the type of people who are not too happy with Rust. I don't remember a single time Zig came out to bash anything about Rust. It isn't anti-anything at all. Its goal is to be a decent C replacement, a very high-level assembly language, aiming at data-oriented design, with some interesting features and extremely fast compilation speed. (Which was the reason I was interested in it in the first place. I miss Turbo Pascal.)
Zig reminds me of the old school, traditional projects. It isn't perfect for everything, it never claims to be, if you like it, use it. If not, there are plenty of options out there.
At least Ghostty Dev seems to be enjoying it almost every day.
But as you say, there's no reason why Zig ought to be anti-Rust. Both languages are fresh attempts at low-level programming, both highly opinionated and with very different philosophies and trade-offs, and both can cohabit peacefully (I've heard good things about using the Zig toolchain for cross-compilation of C dependencies in Rust projects, so the existence of Zig has already had a positive impact on the Rust ecosystem).
While he can get petty when talking about memory safety, he was never obsessed by Zig in any way, his tweet that ended up with Loris calling him “Coomer” among other names was a response to someone mentioning that Zig wasn't memory safe. I've never seen him mentioning Zig first on Twitter.
And on the other hand you have Loris, obsessed so much by Rust he can't help spawning in Rust threads or write dumb anti-rust rants on his blog…
(In fairness, I must say that Loris isn't even the worst offender; that title goes to M. Presler, who keeps spawning in every Rust thread telling everyone who wants to hear it that Zig is much better, despite having seemingly no stakes in the game (being a Java guy and having no public Zig contribution or even documented use of the language). Sounds like the “Rust evangelical strike force” emotionally hurt a bunch of people so much they became lifetime haters).
I have not seen much of any anti-Rust sentiment in the community. There's a lot of people in the community who do Rust, like rust, and work on rust projects. If the Zig community has an anti-anything sentiment, it's against C++.
(You won't have to seek long in his HN history to find an instance of such behavior)
What's not very good are the people who don't like Rust, who are uneasy with Rust eating the systems programming world, and are now pushing Zig as the champion of the resistance against Rust.
It happens a lot, unfortunately.
I didn't.
Something - anything. As much as I like Zig, I dread returning to it after a few months of being out of the loop.
Other options include but are not limited to providing minimal, low effort examples, high-level overview, linking to projects using these features, linking to relevant tests, commits, or source code, setting up an official community wiki and encouraging people to contribute.
Where does a beginner go to learn how to use the package manager these days? It looks like they still won't find any clues in the "Learn" section of Zig's website.
There's a promising page titled "Zig Build System" in there which references a "Package Management" section, but when you click on it, it doesn't exist!
But to answer your question, it exists in the comments of the auto-generated build.zig.zon file
The official answer to complaints about missing documentation has always been "ask in Discord". Pretending that this isn't the case is just disingenuous.
> But to answer your question, it exists in the comments of the auto-generated build.zig.zon file
The comments document the file format of build.zig.zon, they don't tell you anything about how to actually use a dependency in your build system.
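For what it's worth, the flow the comments hint at looks roughly like this (a hedged sketch: the dependency name `foo`, the URL, and the truncated hash are placeholders, and the exact field set has changed across Zig versions). In `build.zig.zon`:

```zig
.{
    .name = "myapp",
    .version = "0.0.1",
    .dependencies = .{
        // `zig fetch --save <url>` fills in the url/hash pair for you.
        .foo = .{
            .url = "https://example.com/foo.tar.gz",
            .hash = "1220...", // content hash printed by `zig fetch`
        },
    },
    .paths = .{""},
}
```

Then in `build.zig` you would wire it up with something like `b.dependency("foo", .{ ... })` and add the resulting module to your executable via `addImport`, but as the parent comment says, none of this is spelled out for a beginner in the official docs.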
Pretty funny, coming from someone who went from "false dichotomy" to "the false equivalency".
The zig users on this thread seem to not understand this, and all seem to think documentation is a thing you write later for users when everything settles down. Or is somehow otherwise "in the way" of feature and API development speed.
That is a very strange view.
If writing developer documentation is having a serious effect on your language feature velocity, you are doing something very wrong. Instead, writing it should make things move faster, because, at a minimum, others understand what you are trying to do and how it is going to work, and can help. It also helps you think through it yourself, and whether what you are writing makes any sense, etc.
Yes there are people who can do this all without documentation, but there are 100x as many who can't, but will still give high quality contributions that will move you along faster if you enable them to help you. Throwing out the ability to have these folks help you is, at a minimum, self-defeating.
I learned this the hard way, because I'm one of those folks who can just stare at random undocumented messy code and know what it actually does, what the author was probably trying to do, etc., and it took years until I learned most people were not like this.
While tests aren’t quite as good documentation as actual documentation, they are guaranteed to not be out of date.
Of course documentation is good. But if you have to prioritize either a new feature, or a critical bugfix, or documentation, you often can't have it all
I do very much prefer moving fast though, so I get it, docs-later is obviously a very valid way of doing things.
If someone is excited about Zig and wanted to make a difference I guess it’s now obvious where they could have outsized impact!
151 more comments available on Hacker News