The Death of Utilitarian Programming
A clever and witty bash script running on a Unix server somewhere is also not utilitarian coding; no human ever directly benefited from it.
Libraries can be somewhat utilitarian, at least more so than frameworks. They at least provide some reusable functionality to the user out of the box, like logging, scanning a barcode, or fetching data from a URL. But again, there's a lot of indirection and little of lasting value; what did *you* learn about implementation and life in that process, my friend?
It's my strong belief that our life's purpose isn't just about learning technology but also about the non-technical things in life (such as life itself). By compartmentalizing themselves into libraries, frameworks, specifications, package managers, build systems and tooling, etc., many coders over the last decade have sort of divorced themselves from the intricacies of, and interaction with, life itself.
A decade ago (i.e., circa 2014-15) is when I'd say utilitarian coding came to an end. The kind of programming that prevailed until then (mostly desktop programming) was highly utilitarian in nature. You used to develop a WinForms app for the client, with actual textboxes, dropdowns, and buttons, tailored to their specific requirements and domain knowledge; what could be more utilitarian than that! You used to gain domain expertise, not just technology expertise.
As things started moving to the cloud, the interaction between the end user and the programmer became less and less; that's when utilitarian coding started dying too. As a new breed of specialists called "Agile Experts", "Scrum Masters", "Tech Advocates", "Thought Leaders", etc. started inserting themselves between the coder and the end user, the former's role morphed into an ostrich policy of dealing only with technology and nothing else. We started losing touch with domain expertise and became branded as "Python coder", "PHP scripter", "web developer", "AI developer", etc. That's how folks started churning out more frameworks, libraries, packages, stencils, helper scripts, etc., instead of worrying about actual problem solving with the stakeholders.
This is how things stand right now, for the most part: desktop development and other forms of utilitarian coding have maintained their small niche somewhere, but they're just that, a niche. It's not a healthy development, nor is it sustainable long term. I strongly feel that this bubble is waiting to burst one day soon, and there will be a reversion towards utilitarian coding. Even the cloud itself needs to be more utilitarian; there's a lot of needless clutter out there that could be simplified.
What do you think? Let me know in comments.
The author argues that 'utilitarian programming' is dying, as developers focus on abstract frameworks and tools rather than directly useful code, sparking a discussion on the nature of software development and its relationship to real-world problems.
Snapshot generated from the HN discussion
Discussion activity
- First comment: 27m after posting
- Peak period: 16 comments in 0-3h
- Average per period: 6.4
- Based on 32 loaded comments

Key moments
- Story posted: Sep 28, 2025 at 11:21 AM EDT (3 months ago)
- First comment: Sep 28, 2025 at 11:47 AM EDT (27m after posting)
- Peak activity: 16 comments in 0-3h (hottest window of the conversation)
- Latest activity: Sep 30, 2025 at 2:02 PM EDT (3 months ago)
Software has a deeply ingrained craftsman culture, which emphasizes personal flavor, unique approaches, and stylistic debates over engineering. Surprisingly, this gets worse in large organizations, where adherence to stylistic choices, specific tools, and workflows supersedes engineering at every point. Software is still full of fads: every couple of years a new or old flavor is found, which is now the "right" thing to do and will be defended and attacked until a new fad is found.
The stark difference between embedded development, especially in aerospace, and "normal" software development is really my point.
Fortunately, AI-assisted coding seems to be wresting coding back from developers and re-empowering domain experts and systems analysts. A full recession will probably shake out a lot more of the waste in software development.
Of course they were; being an architect astronaut got you hired.
You have to get past the resume filters and the architecture interview to even have the chance to work on the internal enterprise tool no one uses anyway. People just respond to what will grow their career.
No more easy money = no more engineering for engineering's sake, and companies are increasingly becoming privy to the fact that architecture astronauts are a liability, and that cloud is 95% distraction meant to absorb time, energy, and money from IT orgs within large companies.
I’ve personally switched out of devops and to a domain aligned software team within my company. I am absolutely fed up with how wasteful and distracting the state of devops is to the actual business.
Small, sharp tools, my friend.
I'm not sure many successful engineering orgs did much of that, but the environment our creations live in is also much different now.
It was a huge task just to get our programs to run at all not very long ago. When I first started coding I was working on some code for a VLSI problem and I spent like a month just figuring out how to get the problem to fit on disk (not main memory, but on disk!). Now there are similar things that run on a laptop [0]. And it's not like I'm unique in having to come up with crazy solutions to problems tangent to the primary problem in this space. [1]
Now the infrastructure for our code is amazing and the machines that execute it abound in resources, especially in the cloud. Now that the yoke has been cast off, it makes sense that more time is spent on solving the actual problem you set out to solve in the first place.
[0] https://github.com/The-OpenROAD-Project/OpenLane
[1] How Prince of Persia Defeated Apple II's Memory Limitations: https://www.youtube.com/watch?v=sw0VfmXKq54
Somewhere a bash script on a server might be calculating the interest my bank owes me, I'm directly benefiting from that too.
In other cases I have code that fixes error conditions when they arise. No one has to run it manually, but if it didn't exist, they would end up with the ticket to fix it manually. Even if they forget it's there, it is giving them more time with each run.
> A clever and witty bash script running on a Unix server somewhere is also not utilitarian coding; no human ever directly benefited from it.
Back around 2010, my friend Mat was doing cloud consulting. He wrote some code to screen-scrape the AWS billing and usage page for an account to determine how much had been spent day-over-day. This was, of course, all orchestrated via a bash script that iterated through clients and emailed the results to them (triggered by cron, of course).
He realized he had a startup on his hands when something broke and clients started emailing him asking where their email was. Cloudability was born out of that.
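For flavor, here is a minimal sketch of what that kind of cron-driven orchestration tends to look like. This is not Mat's actual code: the client-list format, the file paths, and the `scrape_aws_spend` helper are hypothetical stand-ins for whatever really fetched the day-over-day spend.

```bash
#!/usr/bin/env bash
# Hypothetical sketch of the cron-driven loop described above; not the real
# Cloudability code. scrape_aws_spend is a placeholder for whatever fetched
# a client's day-over-day AWS spend, and the file paths are made up.
set -euo pipefail

CLIENTS_FILE="/etc/spend-report/clients.txt"   # assumed format: "<name> <email>" per line

while read -r name email; do
    # If one client's scrape fails, skip them rather than aborting the run.
    report="$(scrape_aws_spend "$name")" || continue
    printf '%s\n' "$report" | mail -s "Daily AWS spend for $name" "$email"
done < "$CLIENTS_FILE"
```

Hung off cron with something like `0 7 * * * /usr/local/bin/daily-spend-report.sh`, that is essentially the whole product loop the comment describes.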
I'd say that both the Ruby and bash code involved count as pretty utilitarian despite running on a server and not having a direct user interface.
Several years ago, I was the sysadmin/devops of an on-premises lab whose uplink to the rest of the company (and the proxy by extension) was melting under the CICD load.
When that became so unbearable that it escalated all the way to the top priority of my oversaturated backlog, I took thirty minutes from my hectic day to whip up a Git proxy/cache written in a hundred lines of Bash.
That single-handedly brought back the uplink from being pegged at the redline, cut down time spent cloning/pulling repositories in the CICD pipelines by over two-thirds and improved the workday of over 40 software developers.
That hackjob is still in production right now, years after I left that position. They tried to decommission it at some point thinking that the newly installed fiber uplink was up to the task, only to instantly run into GitHub rate limiting.
It's still load-bearing and strangely enough is the most reliable piece of software I've ever written. It's clever and witty, because it's both easy to understand and hard to come up with. The team would strongly disagree with the statement that they didn't directly benefit from it.
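The commenter doesn't share the script, but as a rough illustration, a local Git mirror cache can be as small as the sketch below. The cache directory, refresh interval, and the idea of CI jobs cloning from a printed local path are assumptions for illustration, not a description of their actual setup.

```bash
#!/usr/bin/env bash
# Minimal sketch of a local Git mirror cache, assuming CI jobs clone from a
# local path printed by this script. Cache location, refresh interval, and
# overall shape are illustrative guesses, not the commenter's script.
set -euo pipefail

CACHE_DIR="${GIT_CACHE_DIR:-/var/cache/git-mirrors}"
MAX_AGE_SECONDS=300   # refresh a mirror at most once every five minutes

upstream="$1"
# Derive a filesystem-safe directory name from the upstream URL.
slug="$(printf '%s' "$upstream" | sha256sum | cut -c1-16)"
mirror="$CACHE_DIR/$slug.git"

mkdir -p "$CACHE_DIR"

if [ ! -d "$mirror" ]; then
    # First request for this repository: create a bare mirror clone.
    git clone --quiet --mirror "$upstream" "$mirror"
else
    # Only hit the upstream if the last fetch is older than MAX_AGE_SECONDS,
    # so a burst of CI jobs doesn't become a burst of upstream fetches.
    last_fetch="$(stat -c %Y "$mirror/FETCH_HEAD" 2>/dev/null || echo 0)"
    now="$(date +%s)"
    if [ $((now - last_fetch)) -ge "$MAX_AGE_SECONDS" ]; then
        git -C "$mirror" fetch --prune --quiet
    fi
fi

# Pipelines clone/pull from this local path instead of the upstream.
echo "$mirror"
```

A CI job would then run something like `git clone "$(git-cache.sh https://github.com/example/repo.git)" repo`. A production version would at least add locking (e.g. with `flock`) so concurrent pipelines don't race on the same mirror, which is presumably part of what those hundred lines buy.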
Also, I think it misses a bit where programming came from. The idea of general computation was an abstract mathematical plaything long before it had concrete use cases.
If I make something utilitarian in Apple Shortcuts, my “code” is sitting on top of countless layers of abstractions and frameworks which make it all possible, which are also abstracted away behind a drag and drop interface.
In a browser, both the DOM and web APIs are abstractions; so are frameworks like React, etc. However, there is usually a lot less anger and attention directed at the former than at the latter.
My theory why this is the case is that the former is a "hard" abstraction boundary while the latter is "soft": For a web site, it's intentionally not possible to peek below the abstraction layer of the web APIs. However, in exchange, the browser also puts in a lot of effort to make sure that web devs don't have to go below the abstraction layer: By providing watertight error handling for everything below the layer and by giving rich debugging tools that work with the concepts given by the abstraction.
In contrast, any frameworks and libraries that build on top of that are "soft", because developers can, and have to, look into the code of the framework itself if some problem requires it.
A few decades later, all the obvious, real-world, low-hanging-fruit applications of technology were filled, and "tech" turned into the weird, more self-contained world of "the internet", social media, advertising, the attention economy, bitcoin, high-frequency trading, and AI, where it's really just your computer/algorithm fighting someone else's computer/algorithm, rather detached from the offline world.
A lot of "academic" code that is pilloried by "real" software engineers is actually a great example of this. Is it the most performant? No. Could a random person off the street make use of this? No. But for those in the field, they know exactly what this tool is conceptually doing even if they don't know how it is made exactly. It is like a very specialized tool for a very specialized tradesman.
What is interesting about the shift away from utilitarian programming is that these "thought leaders" and other middlemen now find themselves in positions to set the narrative, essentially thanks to monopolizing the space, and to change the very meaning of work to suit the software they happen to peddle rather than the other way around. We saw this with enterprise software, and now we see it with AI tooling shoehorned into that enterprise software.
Computer ethics will not improve en-masse in the United States in years to come. We will get more privatization of the public good. We will get more protections for the monopolists. Social media manipulation and mass surveillance are just the beginning.
"How I Learned To Stop Worrying And Love The Palantir."
The world doesn’t need another generic JavaScript framework, but a lot of people have little annoyances everyday that can be made better with code. This is my favorite code to write. Nothing is that impressive, technically speaking, but it changes how people work and makes their job suck a little less. I find this type of work much more fulfilling than working on some silly integration between 2 systems that will be gone in 3 years.
Before agile we had waterfall. Developers didn't interact with users, they got handed requirements by people who didn't know what was even possible.
It's true that software has become more abstract over time, as the need to standardise how things work overrides a bespoke approach most of the time.
What is WinForms itself if not "non-utilitarian"? Most of an OS is non-utilitarian. Compilers, libc, databases, web servers, browser APIs, ffmpeg, OpenGL, Unity, etc., etc., etc...
2014 is a wild year to pin the end of "utilitarian" programming on, since all of the things you appear to complain about already existed by then. If anything the beginning of making programs for other programs and programmers was 1951/52 with the invention of the compiler. It's been downhill from there.
This take just pushes the notion of utilitarian to an extreme. In this framing, primary industry isn't utilitarian and only some subset of secondary industry is (chiefly the B2C industry of products that aren't used as tools or parts of a final product by the consumer: a framemaker isn't utilitarian). The only utilitarian software in this definition is games or other entertainment software, since otherwise the "utility" is a means to an end as a tool, but that is exactly what a library is.
Making transistors is utilitarian. Making a library is utilitarian.
I am otherwise sympathetic to the pain points but I don't think there is an easy label for the layers of non-essential complexity that have built up and cannot easily be stripped away.
"Let me know in comments"
I seem to be seeing a lot more submissions to Ask HN that are basically blog posts. I'm not trying to police anyone, and if the mods are happy with this then ok. But I'm curious about whether there's actually a trend here, and whether HN users lack alternative places to post their thoughts. Something to do with Twitter going down the drain?