A Shift in Developer Culture Is Impacting Innovation and Creativity
Posted 4 months ago. Active 3 months ago.
Source: dayvster.com
Key topics
Developer Culture
Creativity
Innovation
The article discusses how the developer culture is shifting away from curiosity and creativity, with comments debating whether this is a real phenomenon and what its causes and implications are.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion. First comment: 15m after posting. Peak period: 105 comments in 0-2h. Average per period: 20. Comment distribution: 160 data points (based on 160 loaded comments).
Key moments
- Story posted: Sep 19, 2025 at 12:02 PM EDT (4 months ago)
- First comment: Sep 19, 2025 at 12:16 PM EDT (15m after posting)
- Peak activity: 105 comments in 0-2h, the hottest window of the conversation
- Latest activity: Sep 20, 2025 at 5:37 PM EDT (3 months ago)
ID: 45303199. Type: story. Last synced: 11/20/2025, 8:23:06 PM.
E.g. there would be enormous difficulty in replacing the Dewey Decimal System with something else, if only due to its physical inertia, but with a computer system a curious clerk can invent an alternative categorization and retrieval system which inevitably touches on mathematical topics.
As a teacher or many other professions? Forget about it. You need to either marry someone with a more lucrative career, or move somewhere more affordable.
Working on fixing our housing shortage has felt extremely meaningful to me.
I'd like to find some of that idealism in software again.
Disclosure: I work at govstream.ai - we work in this space [we're hiring!]
They have a business tax rate above 106% of profit[]. That is, it is effectively illegal for a business to make a profit.
Yet there is apparently a video out there of a black market cart seller selling wares right in front of the Argentine tax office, totally unbothered.
It made me wonder if this was just an allegory of what's in store for us.
[] https://archive.doingbusiness.org/content/dam/doingBusiness/...
Under that model, 10% of your time is completely up to you (within reason) to work on things that aren't your main project or scope.
Works out well for R&D or more open ended positions, since you can have the flexibility to explore hunches without having to justify a whole project around it.
Unfortunately this is the consequence.
Sure, I was a hobbyist in the 80s, programming in assembly language on four separate architectures by the time I graduated college in 1996 with a CS degree. But absolutely no one in my graduating class did it for the “passion”; we all did it because we needed to exchange money for food and shelter.
The people jumping into tech during the first tech boom were definitely doing it for the money, and the old heads I met back then, programming on DEC VAX and Stratus VOS mainframes, clocked in, did their job, and clocked out.
When I went into comp-sci, it wasn't the cool path to riches it's been marketed as for the past 15 years.
EDIT: you graduated around the same time as me. Sure, everyone wanted jobs. But there were easier paths to getting a good job in '95/96 than cramming late-night comp-sci work. Almost everyone in my class had grown up in the age of hacking around on Apple IIs, their first PCs, etc. No one just randomly ended up in the comp-sci part of the university because they just wanted a job to make money.
And for most of the people in my class at my state school in south GA, their first exposure to computer programming was in college.
I had a focus on programming languages in my undergrad studies and went into an academic R&D programming job, basically making tools for computational sciences. I've basically spent my whole career using and writing open source software.
I've definitely seen a kind of culture shift where the frontier nature of our old R&D culture is getting drowned out by boring process. To me, it is largely the influx of cybersecurity compliance that is killing the old culture. The bureaucratic compliance overhead is antithetical to the small team prototyping approach that drove progress during most of my career. It inspires a sort of cargo cult risk-management process that seems more about appearances and plausible deniability than actual secure systems.
Most developers from the beginning worked at banks, government, defense etc doing boring enterprise work. That has always been the case.
They weren’t doing “research” writing COBOL for banks and the government.
In 30 years, I’ve worked at 10 jobs for startups, boring big enterprise companies, BigTech and I’ve been working in consulting (3 of those were working full time in AWS’s consulting department) for five years working with developers from startups, enterprises and government.
I think my exposure to a wide swath of the industry is a little bit more than working in California for 30 years…
Even when I was younger and single, all of us would hang out after work - men and women - and go to the bar, the club, the strip club (yes, the women too - it was their idea; they were mostly BAs and one programmer), and just really enjoy the money we were making. We were all making $50K-$80K back when you could easily get a house built in the burbs for $150K-$170K.
Even as I got older and changed jobs in my mid-30s, by then my coworkers were mostly involved with other hobbies and our families. We weren't even thinking about computers after work.
But I do think HN has a cultural fixation on the fabled Silicon Valley experience. This includes an attachment to (nostalgia for?) the old university/startup axis, which used to be a more fluid exchange, rather than just the regular enterprise hiring pipeline.
The tech startup scene was a thing between the mid-90s and 2000 with the first dot-com boom, and even most of those companies - outside of hardware companies like Sun, Cisco, and Intel, and software at Netscape - were not doing development that was cutting-edge for the time. The startups were throwing dumb (or sometimes premature) things at the wall with no business plan, backed by VC money. The people at startups were definitely there for the money. No one had a “passion” to deliver pet food or groceries (Webvan) or early web advertising.
Tech before the mid-90s was mostly programmers building software on mainframes, doing boring things.
You had the dark days between 2001-2008, before mobile, web apps, SaaS, and high-speed internet took off, when about all most people could find were boring enterprise jobs. Back then, I was living in Atlanta and the dot-com bust didn't affect the local market at all. The banks, the airlines, and the credit card processing companies were hiring boring Microsoft devs and Java devs like crazy.
I’ve just seen too often that 90% of the devs who spend their career at boring old enterprises have no idea what it’s like for the top 10% working at BigTech and adjacent companies, while at the same time that 10% can’t fathom that there are “Dark Matter developers” living their lives in tier-2 cities with big houses in the burbs, treating a job as just a job, as most people always have.
https://www.hanselman.com/blog/dark-matter-developers-the-un...
I’ve been on both sides (and now in the middle doing strategy cloud consulting). I had my first house built in 2003 for $170K and even my second house built in the “good school system” in the burbs of Atlanta in 2016 for $335K. We moved and downsized three years ago.
There was also a Small World effect where we kept in touch across these various R&D spaces. I don't mean to sound grandiose, but I think my cohort built up a lot of the open source that props up the current web world. I don't quite agree with the article this HN post links to, as I know a lot of that open source was written on salary. It wasn't all hobbyists in moms' basements. Whether we worked in government, university, or corporation, we had figured out ways to work on these things we wanted to work on and release to the world and be paid a wage to do it.
I do feel like our microcosm is dying out. I'm not sure if it is a net change where the tech world is reverting to just the dominant corporate tech you describe, or if there is some replacement microcosm bubbling into existence and I'm just out of the loop now.
And whether startup founders had passion or not, once they took outside funding, money was all that mattered.
Took me half a year to get them to value Sentry, lol.
I’ll just collect my check and go do something else.
Fighting the Product Manager, fighting the Designer, sometimes even fighting some micromanaging stakeholder that won't leave you alone.
These are definitely fights I can win, but do I even have the energy anymore? Development work involves more than meets the eye. While there are some folks who understand the technical intricacies, it's tiring having to join discussions you know won't go anywhere.
I wish it were 2000-2010 again, when my biggest problem was Sales promising features we didn't have, and then having fun with the other devs coding them.
I used to have a lot more mental bandwidth and energy to be "curious" and to tinker once upon a time. But now the world is so literally and figuratively on fire and every executive is so rabidly frothing at the mouth over AI that now I just want things to "just work" with the least amount of bullshit so I can just go home on time and forget about work until the next business day.
I just want this fucked decade to be over already.
The world will not end if a project is delayed by a few weeks. You get time for your own tinkering (never tinker on company stuff, even if that would improve things, unless you are a shareholder).
Apart from maybe a few core infrastructure primitives at the public cloud providers, most IT work today is an API calling an API calling an API, and so on.
That will be the case until humans are out of the loop for most IT work.
The final straw for me was RTO. Quiet quitting and then getting my ticket punched (laid off) was the best thing for me.
Honestly, some of my best jobs were at places that had a nicely balanced practice in place and the backbone to remind execs that if they interrupt makers daily with shiny new asks, they will in effect get nothing (because nothing would ever be completed)...
But obviously we can both have worked at places with those labels with vastly different implementations and thus experiences :)
For me it feels like I have to spend all day fighting with folks who are constantly “holding it wrong,” using libraries and frameworks in really weird/buggy ways, and who then seem completely uninterested in learning.
In my free time I love working on my own projects because I don’t have to deal with this bullshit and I can just craft things finely without any external pressure.
Any non-small company has plenty of people that need to justify their salaries.
Meetings are one of the most effective ways to pretend to be working.
Subsequent governments turned the profession into a captive market, where you can realistically only work for corporations that fix wages by following so-called "market rates," and you cannot create your own job if you disagree with those rates.
The average university grad would be better off, income-wise, in law/finance/medicine in London. This isn't to say the top software devs don't get paid a lot, but they're a minority compared to the legions of highly paid people in finance in London and the surrounding industries.
Jobs in London pay less than peanuts and if you earn six figures in the UK, income tax takes half of it anyway even if you go to FAANG.
Many consultant friends I know and business owners have moved away from the UK to low tax areas in Portugal or UAE.
That's not entirely true. We (society, definitely in the US) pushed going to college HARD for the last 3-4 decades, glamorizing how much money you'd make. Now we have an overabundance of people with college degrees and thousands of dollars in debt from those degrees.
There's plenty of career paths where you could make decent money that don't require a college degree.
We should have been pushing people to figure out what they wanted to do, not "Make lots of money", and figure out the path that gets them there.
It's wild how this site has turned into reddit over the last couple years.
The sad reality is that "everyone learn to code" was by and large a marketing distraction from the severe structural unemployment the fast-and-loose economy is in. No, a coal miner can't just learn to code and get a job in WV, certainly not thousands of other miners in the same position, nor can the millions of people that corporations laid off over those same decades.
Coding was a way out of poverty, but for most people it was just a distraction to keep them from seeing how bad the economy is.
Americans are poor: PNC Bank's annual Financial Wellness in the Workplace Report shows that 67 percent of workers now say they are living paycheck to paycheck, up from 63 percent in 2024. https://www.newsweek.com/2025-rise-americans-living-paycheck...
This has been happening since the 2008 financial crash, when a lot of people who would normally have gone into careers on Wall Street were led by the shrinking Wall Street job market into tech as a high-performing, decent-paying career... (U.S.-biased opinion, of course)
My first trip through college I studied business, and then the economy collapsed. Most people my age eked their way through menial jobs (like me) and survived, found a way to break through, or (like me) went back to school years later, when the economy improved, to try to find another opportunity. For me the choices at that time were CS or nursing, and I have always been good at math and with computers, so I chose CS.
I wouldn't say I ever "loved" development, especially not the current corporate flavor of it. I've had some side projects when I get time and energy. But there's never really been a point in my life where I could ever have afforded getting the level of expertise I possess now just for the "curiosity" of it. Not everyone has a trust fund or safety nets.
I interviewed many people from top universities and they absolutely scream "I couldn't care less about the field, I'm just here to maximize the compensation".
At the same time I get 19 year old self taught kids who are miles better at programming, learning and are genuinely passionate.
It's just that the bar is so high now: so much competition, so many cargo-culting startups that only do bad leetcode interviewing.
It's very hard to both find and get hired at places that want more than a coding monkey to just blindly move Jira tickets.
I got started in the 1980s, and super-curious and technical people were the norm. We were incredibly strongly attracted to computers.
The first real growth in computers in that kind of era was Wall Street and banks. Wall Street in particular started paying huge bonuses to developers because it was clear that software could make huge piles of money. Then we started seeing more people joining in for the money who were not necessarily passionate about technology.
Then came the dot com era and bust, and then the rise of social media, FAANG, and absurd corporate valuations allowing ridiculous total comp to developers, and the needle moved even more towards money.
The net result is the curious and the passionate are still here, but our numbers are massively diluted.
I come places like here to find that passionate niche.
Here's an example from my perspective.
Recently, while developing a way to output a PAL video signal from two digital lines (an endeavour obviously driven by curiosity more than utility), I learned a great deal more than I would have without AI. I wasn't blind to what the AI was emitting: it helped me decide on shifting one output to 0.285 V and the other to 0.715 V, and to write a program that used PyTorch to learn values for a few resistors and capacitors to smooth out the signal, along with a sample of 2-bit data that, when emitted through the filters, produced a close approximation of a sine wave for the color burst. AI also enabled me to automatically emit SPICE code to test the waveform. I had never used SPICE before; now I know the pains of creating a piecewise linear voltage source in it.
Yesterday I used an AI to write a JavaScript function that takes a Float32Array of voltage-level samples and emits a list of timings for all parts of a scanline, along with the min and max voltage levels; it calculates the position and frequency of the color burst and uses the measured burst to perform quadrature decoding, producing a list of YUV values at a lower sample rate.
This should let me verify the simulated waveform so that I can compare the difference between what I intend to emit, and what should be a correct signal. If there turns out to be a discrepancy between what I am emitting and what I think I am emitting, this will come in quite handy.
Perhaps I might learn more if I did all of this myself, unaided by AI, but it would also take much longer and likely not get done at all. The time I save writing tests and wrangling libraries and software is not stored in a bank; I used that time too, doing other things, and learning about those as I did so.
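For readers curious about the quadrature-decoding step described above, here is a minimal sketch, not the commenter's actual code (which was JavaScript): the function name, the moving-average low-pass filter, and the ideal phase-aligned reference are all illustrative assumptions; a real decoder would first lock to the measured color burst's phase and frequency.

```python
import numpy as np

# PAL color subcarrier frequency (this constant is standard; the
# rest of the decoder below is a simplified, hypothetical sketch).
PAL_SUBCARRIER_HZ = 4.43361875e6

def decode_scanline(samples, sample_rate, subcarrier=PAL_SUBCARRIER_HZ):
    """Return rough (Y, U, V) arrays from composite voltage samples."""
    t = np.arange(len(samples)) / sample_rate
    # Mix the composite signal against in-phase and quadrature
    # references at the subcarrier frequency.
    u = samples * np.cos(2 * np.pi * subcarrier * t)
    v = samples * np.sin(2 * np.pi * subcarrier * t)
    # Crude low-pass: moving average over one subcarrier period.
    win = max(1, round(sample_rate / subcarrier))
    kernel = np.ones(win) / win
    u = 2 * np.convolve(u, kernel, mode="same")  # factor 2 restores amplitude
    v = 2 * np.convolve(v, kernel, mode="same")
    # Averaging over a full period also strips chroma, leaving rough luma.
    y = np.convolve(samples, kernel, mode="same")
    return y, u, v
```

Fed a synthetic line of 0.5 V DC plus a 0.3 V in-phase subcarrier, it recovers roughly y ≈ 0.5, u ≈ 0.3, v ≈ 0 away from the line edges.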
They want to produce something without having the skills to produce it. Which, you know, probably isn't uncommon. I'd love to be able to rock out the guitar solo in Avenged Sevenfold's "Bat Country" [0] or "Afterlife" [1] or the first solo in Judas Priest's "Painkiller" [2], but to get to that skill level takes years of practice, which I'm quite frankly not willing to put in.
The difference is the honesty. A vibe coder produces something barely more than "Hello world" and brags about being able to produce software without learning to code. Nobody grabs a guitar, learns two chords, then claims to be a guitarist.
[0] (mildly nsfw) https://youtu.be/IHS3qJdxefY?t=137
[1] https://youtu.be/HIRNdveLnJI?t=168
[2] https://youtu.be/nM__lPTWThU?t=129
That's off by a large factor.
What I could find quickly was an estimate that the top 400 own a little over 4%, not 50%.
What dampens the spirit is the same as for everyone - a treadmill you cannot get off, punishment for independent thinking.
Dev culture is not one thing found across dozens of companies - each company has its own culture - and if that culture is curious and empowering, you get curious and empowered devs, and salespeople and operations and chemists and …
Culture is what we make it
You won't tour for long as a one-hit wonder, and I think what the OP is saying is quite similar.
There are still people taking on new frontiers... even if you don't love crypto (and I don't!), a lot of very curious developers found a home there. AI is tougher (due to the up-front costs of building a model), but discovery is still happening there.
I don't think curious developers are gone... there are just more un-curious developers looking for a paycheck. You just have to look harder now (although I think it only seems like we had a cohort of curious devs because we're looking at it in hindsight, where the outcomes are obvious).
TFS was introduced in 2005 for Microsoft shops for instance.
I'm rather envious of kids today who have access to Google, Wikipedia, YouTube, and (with caveats) ChatGPT when they're truly interested in a topic. They can dive a lot deeper than I had the opportunity to without bringing in adult assistance.
"Can" is a critical word in your comment.
But after acknowledging this, you just said that most people want to be lazy, which is something I was actually agreeing with. Though I should add that I don't think people do this because they are lazy by nature, but rather because they are overwhelmed. There's definitely a faster pace these days, and less time given to have fun and be creative.

One might say that it's work, and work isn't meant to be fun, but this is mental work and, critically, creative work. In those domains, "fun" is crucial to progress; it is more a description of exploration and problem solving. If we're all just "wanting to go to the pub and watch TV" (nothing wrong with that), then we'll just implement the quickest, dirtiest solution to get things done.

I think this can work in the short run but is ineffective in the long run. A little shortcut here and there isn't a big deal, but if you do that every day, every week, every year, those things add up. They are no longer little shortcuts but a labyrinth. Instead of creating shortcuts, people end up navigating the labyrinth they made. It's the difference between being lazy by sitting on the couch all day and being lazy by "working smarter, not harder."
My main concern is with the environment we've created these days. It's the "free time" that davidw mentions. As several people have noted, things shifted towards money. My honest belief is that by focusing too much on money, we've actually given up a lot of potential profit.

Take Apple. Instead of thinking different and trying new things, they've mostly concentrated on making things smaller and thinner. That's good and all, but honestly, I'd more than happily take a thicker laptop to give my screen and keyboard more clearance. It's just so fucking difficult to avoid those smudge marks on my screen.

We take fewer risks because we're profit-focused; the risks just seem far riskier than they are. We've created walled gardens, which undermine what made the computer and smartphone so successful in the first place (the ability to program them!). We hyper-fixate on spreadsheets. We dismiss our power users because they are small in number, ignoring the role they play in the ecosystem; in every ecosystem, a small portion does the most. Everything is just so myopic.
I'd agree that all these problems existed to some extent, but the difference now is scale. I think what changed is the population of developers. In "the old days" there was a good balance between the coding monkeys and the business monkeys, who pushed back against one another. The business people couldn't do it without the coders, and the coders could do it without the business people but were more effective with them. These days, though, the business monkeys just took over and dominated. The paradigm shifted from wanting to build good products (and being happy to get rich while doing so) to focusing on the getting-rich part. We lost the balance.

I think people are still creative, but we do not give them room to breathe or enough room to take their chances. In some ways things are far easier than they've ever been, but in many ways they're also far harder. So are we measuring success by the fact that we have multiple trillion-dollar companies, something never seen in 2010, or by the number of products and technologies that have changed people's lives? We made Android, the iPhone, Maps, YouTube, Twitch, WiFi, Bluetooth, and so much more in such a short time. But in (more than) that same timeframe since, what innovations have we made? There have been some good leaps, don't get me wrong, but even AI is only a small portion of that. During most of that time we saw more vaporware than actual products. For the love of god, there are bitcoin billionaires. Love or hate crypto, it hasn't changed the world in a huge way.
/rant
But since then, the Apple Watch is an innovation both technologically and from a business standpoint. In 2010, I wouldn't have imagined you could have a processor faster than the original iPhone's, with WiFi, Bluetooth, GPS, cellular, satellite communication, and 32 GB of storage, in something that size with that battery life.
While I think the newly announced Meta glasses are ugly and don’t provide enough value for the money, it was risky and not just another social media platform to provide ads.
Gen AI is some real sci-fi shit that, in 2020, I wouldn’t have thought would be as far along as it is.
Self driving cars are a real thing on the road right now. Even Uber itself is innovative and has made travel to different cities much better than dealing with taxi services. As much as I dislike Musk as a person, you can’t deny SpaceX and Starlink are game changers. All of the major tech companies are spending a lot of money on better custom processors and TSMC is doing some wild stuff on the manufacturing side.
The medical industry is also doing some life changing things.
It would probably have been possible to get something via inter-library loan, but I would have been 9 or 10, didn't know this was possible, and didn't think to ask. The handful of topical books I obtained from parents and schoolfriends was a far cry from just scrolling on your phone to the information you want.
It's not better in every way. But it is better in some ways.
MySQL was available for free in 2000 and anyone could download any number of language runtimes for free like Perl and Java. If your corporate overlords weren’t cheap (or you were in college) an MSDN subscription was amazing.
JavaScript has been around for decades. But jQuery made it so much easier, and then React built on top of that even more. And jQuery wasn't the first DOM library, nor was React the first framework – but both were where it seemingly clicked between ideas, usability and whatever else made them successful.
(I will agree that Microsoft had a run of things where anyone who bought in to their ecosystem had a lot of things that worked well together.)
Especially today: while the IDEs are free, people are paying for LLM coding assistants.
Not to mention the dark days of Window GUI development. How exactly is Vim better than a modern IDE?
[1] Yes I know all bets are off when you are using reflection.
We (i.e., people who do not have a safety net) do not have the luxury you people had in the 1990s of experimentation and curiosity. Boomers and leaders, using shitty Reaganomics policies, have decimated our safety nets so much that experimentation is a luxury for the rich and powerful.
Cost of living is higher than ever. Inflation is higher than ever. We are handcuffed to this shitty system in America called “private health insurance.” Get sick? No job? You are fucked m8.
The risks of "curiosity" are much, much higher than they were during your time, buddy.
I was young and didn't have many responsibilities then, and lots of free time. Now I'm a dad with a mortgage and an interest in local politics because I want to 'leave it better than I found it'.
All that said... I do think there have been some shifts over time. I grew up in the era of open source taking off, and it was pretty great in a lot of ways. We changed the world! It felt like over time, software became mainstream, and well-intentioned ideas like PG's writing about startups also signaled a shift towards money. In theory, having F U money is great for a hacker in that they don't have to worry about doing corporate work, but can really dig into satisfying their curiosity. But the reality is that most of us never achieve that kind of wealth.
Now we find ourselves in a time with too much concentrated corporate power, and the possibility that that gets even worse if LLM's become an integral part of developer productivity, as there are only a handful of big ones.
Perhaps it's time for a new direction. At my age I'm not sure I'll be leading that charge, but I'll be cheering on those who are.
It's certainly true that IT has grown vastly since those good old days, but there has always been a proportion of people who're just... not that interested in what they're doing. For example I remember being mildly horrified in around 1998 that a colleague didn't know how to run his compiler from the command line; without an IDE he was lost - but I doubt he was the only one.
Meanwhile the idea that there's a dearth of cool new stuff seems quite quaint to me. There's a whole bunch of cool things that pop up almost daily right here on Hacker News². Just because they haven't spread to ubiquity doesn't mean they're not going to. Linux was not mainstream right out of Linus's Usenet announcement - that took time.
As to corporate power? They ebb and flow and eat each other (Data General, Compaq, DEC ... remember them? Remember when Microsoft was the major enemy? Or IBM?)
¹ https://en.wikipedia.org/wiki/Good_old_days
² Edit: Not to mention, there's also a whole bunch of crap that's not very interesting. But survivor bias means we'll have forgotten those in 20 years time when we're surveying this time period; as Sturgeon's law reminds us, "90 percent of everything is crap."
It just feels like "it's a job" is more of the zeitgeist these days.
And yes, I'm also well aware of what came before 'my time' - mainframes and such were definitely an era where the power was more with the large companies. One of the reasons Linux (and *BSD) was so cool is that finally regular people could get their hands on this powerful OS that previously was the exclusive purview of corporations or, at best, universities.
As to cool projects, sure. They're fun, interesting and creative, but perhaps not part of (a very vague, admittedly) "something bigger", like "the open source movement" was back in the day.
But if you're looking for that spark and excitement again, you need to get back out to the frontier. One frontier that is particularly exciting to me is using AI to speed up the tedious parts of the development process, and to tackle areas where I don't have specialist knowledge. Similarly to how Linux opened up a powerful OS to individuals, AI is enabling individuals to create things that would have previously required large teams.
Perhaps over time it'll get efficient enough to run outside of huge companies; that could be an interesting aspect to keep an eye on.
Though certain novel uses could lead to new individuals or entities gaining power.
I'd like to be hopeful and would like to hear good arguments for how this could happen - but it seems to me that improved technology, on the whole, leads to increased concentration of power; there are exceptions and anomalies, but that is the dominant trend.
It was about how only big companies have the resources to make big computers that take up a whole room that are powerful enough to run smart AI models.
But if tech progress is any indication, in say 50 years or probably less, we will have the power of a modern-day datacenter in our pockets and be able to run smart AI models locally, without it being a large-corp monopoly.
> it seems to me improved technology on the whole leads to increased concentration of power
Which is why we are dominated by IBM, AT&T, Kodak and Xerox.
In all seriousness though there’s plenty of room for improvement both in current models and hardware.
Or, you know, if AI is the mainstream hotness or just doesn't float your boat, look for what the iconoclasts are up to and go dive into that, not whatever the VCs are flinging their gold at today.
But... they're still there. They're a little diluted, but I've not yet worked somewhere where I had no like-minded tinkerers amongst my colleagues. I don't think I'd want to, but it just hasn't come up.
> As to cool projects, sure. They're fun, interesting and creative, but perhaps not part of (a very vague, admittedly) "something bigger", like "the open source movement" was back in the day.
But the free software movement dates back to the early 80s, not the 2000s that we're talking about. Open source itself was being seen as a dilution of the principles of free software in the late 90s/early 2000s. More to the point, free and open source software is still very much here - we're absolutely surrounded by it.
> mainframes and such were definitely an era where the power was more with the large companies
It's oscillated. DEC used to be the zippy young upstart snapping at IBM's heels you know. Microsoft didn't start out big and evil; nor did Google if it comes to that. Put not thy faith in shiny new companies for they shall surely betray thee once they devour the opposition... :D
edit: I hadn't scrolled down to https://news.ycombinator.com/item?id=45303388 when I wrote this
That's a cheap dismissal. There's nothing wrong with "good old days" thinking if the old days were actually better.
>Meanwhile the idea that there's a dearth of cool new stuff seems quite quaint to me. There's a whole bunch of cool things that pop up almost daily right here on Hacker News²
Hardly of the breadth and ambition of the 1998-2012 or so period.
>As to corporate power? They ebb and flow and eat each other (Data General, Compaq, DEC ... remember them? Remember when Microsoft was the major enemy? Or IBM?)
Yes, and also remember when players like Sun did cool stuff in the UNIX space. Or when FOSS wasn't basically billion-dollar corporate-owned wholesale, with corporate employees making up the majority of contributors and IBM, Oracle, Google and co running the show. Even Red Hat was considered too corporate, and now it's IBM...
Hits so much harder as a middle aged adult than when I saw it on tv ~2 decades ago.
That isn't the case anymore. That sort of monoculture where everyone is reading the same stories, discussing the same topics, and reading about shared values and principles, is long gone.
Old man yells at cloud services
But most of the people I went to uni to study computer science with at the end of the nineties were there for the money. Even back then it was all about money for most programmers.
And then there is a generation that grew up knowing that there was money in computers, so many of them learned to use them even if they didn't care about them per se. This generation also contains many hackers, but they are surrounded by at least 10x more people who only do it for money.
Twenty years ago, most programmers were nerds. These days, nerds are a minority among the programmers. Talking about programming during an IT department teambuilding event is now a serious faux pas.
Then again, I did spend some time in e.g. Lisp and Haskell just for the heck of it. And there are still plenty more unsolved problems outside of the mainstream today.
You can't keep that curiosity and at the same time see one of the most wonderful and awe-inspiring technologies of the last decades as something threatening.
I lamented when my career first started (2000 or so) that there were devs I worked with who didn't even own computers at home. While my bookshelves were full of books I aspired to learn and my hard drive was full of half-baked projects, they clocked out and their thinking was done.
I still know a few of those now 25 years after the fact. Some of them have made a career out of software. But they never got curious. It was a means to an end. I don't begrudge them that. But as someone who is internally driven to learn and improve and produce, I can't relate.
My primary frustration today is how many of my software peers are satisfied with updating a Jira status rather than seeking to build excellent software. I've seen it at all levels - engineers, managers, and executives. I'm actualized by shipping good, useful software. They seem to be actualized by appearing busy. They don't appear to deliver much value, but their calendars are full. It has me at my professional wits' end.
Truth be told, the phenomenon of appearing productive without being productive is an epidemic across multiple industries. I've had conversations with people in manufacturing and agriculture and academia and they all echo something similar. Eventually, Stein's law indicates that the productivity charade will end. And I fear it will be ugly.
SWE culture was very different in a low interest rate environment. Teams were over staffed. No new tech came around for a long time so everyone was focused inward on how to improve the quality of the craft. At my big tech company some teams had 2-3 people dedicated to purely writing tests, maintainability, documentation, and this was for a <1m MAU product.
Then boom free money gone. Endless layoffs year over year. Companies pushing “AI” to try and get SWEs to deprecate themselves. It’s basically just trying to survive now.
That wizard that used to nag everyone about obscure C++ semantics or extremely rare race conditions at distributed scale has disappeared. There’s no time for any of that stuff.
Like all cultures, this was all performative. People astutely observed what the people above them said and cared about, then mimicked it all the way to promotions. That doesn't work anymore, so that wizard culture is long gone.
I think it depends on the circles you're in. For example, I see a lot of interest in the "Handmade" way of doing things, largely inspired by Handmade Hero. Almost feels like a comeback of what you consider to be dying. There are people who are interested, but one needs to look for them. I recommend it.
That wave feels definitively over now, making mobile apps in 2025 is much like doing WinForms in 2003. Hopefully something new will come along that shakes things up. In theory that's AI but as a developer I find AI tremendously unsatisfying. It can do interesting things but it's a black box.
For me personally... I'm older and married with kids. My free time is so much more valuable than it was back in the day. I still try to be a curious developer but at the end of the day I need to get my work done and spend time with my family. There's enough of a financial squeeze that if I did find myself with an excess of free time I'd probably try to spend it doing freelance work. So whenever this next wave does arrive I might not be catching it.