Optimising for Maintainability – Gleam in Production at Strand
Key topics
The quest for a seamless distribution process has sparked a lively debate around packaging Gleam/Erlang executables without bundling the Erlang VM. While some commenters pointed out that compiling to JavaScript is a viable alternative, others noted that this approach results in hefty executables - a "hello world" example clocking in at nearly 50MB. Meanwhile, native binary compilation options for the BEAM are still in the works, with some users turning to tools like Burrito to create single executables for Elixir, and potentially other BEAM languages. As the discussion unfolds, it becomes clear that the search for a streamlined distribution process is an ongoing challenge with multiple potential solutions on the horizon.
Snapshot generated from the HN discussion
Discussion Activity
- Engagement: moderate
- First comment: 54m after posting
- Peak period: 10 comments in 4-6h
- Average per period: 2.9 comments
- Based on 29 loaded comments
Key moments
- Story posted: Aug 28, 2025 at 11:30 AM EDT (4 months ago)
- First comment: Aug 28, 2025 at 12:24 PM EDT (54m after posting)
- Peak activity: 10 comments in the 4-6h window, the hottest stretch of the conversation
- Latest activity: Aug 29, 2025 at 3:46 PM EDT (4 months ago)
There have been some projects for creating self-extracting executable archives for the VM, and some projects for compiling BEAM programs to native code, but nothing has become well established yet.
Personally, I think I'd prefer something that worked without unpacking, but I don't actually need something like this, so my preferences aren't super important :D
You might not get the "handle" you're looking for?
Interesting point and one I haven't seen before. Almost like arguing that AI will work best with things it can learn quickly, rather than things that have lots of examples.
Garbage in, garbage out. If you confuse it with a lot of junior-level code and have a language that constantly changes best practices, the output might not be great.
On the other hand, if you have a language that was carefully designed from the start and avoids making breaking changes, if it has great first-party documentation and a unified code style everyone adheres to, the LLM will have an easier time.
The latter also happens to be better for humans. Honestly, the best bet is to make a good language for humans. Generative AI is still evolving rapidly, so there's no point in designing the language around its current weaknesses.
Like instead of "the entire internet", here's a few hundred best-practice projects, some known up-to-date documentation/tutorials, and a whitelist of 3rd party modules that you're allowed to consider using.
Any language that is difficult for an AI to understand will have to gain popularity by requiring far less boilerplate code for AIs to write in the first place. We may finally start designing better APIs. Or lean into it and make much worse ones that necessitate AI. Look especially to an AI company to create a free razor and sell you the blades.
Well, one glaring issue with the assumption of the quality of LLM output being mostly dependent on a large volume of examples online would be Sturgeon's law.
I think the best programming languages of the future will come with their own LLMs, synthetically trained before release.
I'm curious if I've had the wrong impression of Gleam. My assumption was that it was bringing static types to the BEAM's processes and OTP, but it seems like it's mainly a statically typed language that just happens to be on the BEAM and that it isn't necessarily looking to solve the "static type the messages" in Erlang and Elixir. Is that correct?
I'm not saying either way is bad or good. I'm just trying to get a sense of the language's origins and where it's going compared to Elixir and its gradual typing story. For example, if I know and like F#, Elixir, and Rust, what is the selling point of Gleam?
The language tour covers _the language_ rather than the concurrency framework, so you'd look to the Erlang and Gleam OTP documentation to learn about the framework. Erlang and Elixir have a similar documentation split, for example, the most popular book teaching Erlang, LYSE, has the first half about the language and the second half about the framework.
> it seems like it's mainly a statically typed language that just happens to be on the BEAM and that it isn't necessarily looking to solve the "static type the messages" in Erlang and Elixir. Is that correct?
Not quite! Gleam does have a type-safe derivative of the OTP framework, while maintaining full compatibility with the original untyped OTP. When writing a Gleam application that runs on the Erlang VM you will be using typed actors and messages.
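The thread doesn't show Gleam's typed OTP API, but the core idea of compile-time-checked messages can be sketched by analogy in Rust, where a channel carries exactly one message enum, so malformed messages are rejected at compile time rather than crashing a running actor. All names below are illustrative, not Gleam's actual API.

```rust
use std::sync::mpsc;
use std::thread;

// Illustrative message type: the compiler only lets well-formed variants
// of this enum travel over the channel, analogous to how a typed actor
// mailbox constrains what a process can receive.
enum CounterMsg {
    Increment(i64),
    // Carries a reply channel, like a typed call/reply pattern.
    Get(mpsc::Sender<i64>),
}

// Spawn a counter "actor" on its own thread and return its mailbox.
fn spawn_counter() -> mpsc::Sender<CounterMsg> {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        let mut count = 0;
        for msg in rx {
            match msg {
                CounterMsg::Increment(n) => count += n,
                CounterMsg::Get(reply) => reply.send(count).unwrap(),
            }
        }
    });
    tx
}

fn main() {
    let counter = spawn_counter();
    counter.send(CounterMsg::Increment(2)).unwrap();
    counter.send(CounterMsg::Increment(3)).unwrap();
    let (reply_tx, reply_rx) = mpsc::channel();
    counter.send(CounterMsg::Get(reply_tx)).unwrap();
    println!("{}", reply_rx.recv().unwrap()); // prints 5
}
```

Sending anything other than a `CounterMsg` simply doesn't compile, which is the guarantee an untyped mailbox can't give you.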
> I'm just trying to get a sense of the language's origins and where it's going compared to Elixir and its gradual typing story.
Elixir and Gleam's respective type systems are about as different as you can get while still being type systems. The difference is so large that they offer completely distinct advantages and development experiences, so people who especially value and enjoy one are unlikely to find the other to their liking. I believe this is a real strength! It means that the BEAM ecosystem offers a wider range of programming styles, and so it can attract and serve a wider range of developers, growing the ecosystem as a whole. BEAM languages work together closely.
> if I know and like F#, Elixir, and Rust, what is the selling point of Gleam?
I could give you a list of technical benefits to picking Gleam, but really I think this sort of technology choice is very personal. You may find Gleam to be a language that you enjoy and are productive with, and if that's the case then it's a great tool for you! Your use of Rust and F# suggests you'll appreciate the programming style, and your use of Elixir suggests you'll appreciate using the same BEAM runtime, so perhaps give it a try.