Package Managers Keep Using Git as a Database, It Never Works Out
Key topics
The debate rages on about using Git as a database for package managers, with commenters weighing in on the seductive simplicity of Git versus its scalability and performance limitations. While some argue that Git's data model is powerful, others point out that the real issues lie with Git's protocol and GitHub's implementation, not the content-addressed tree itself. Self-hosting solutions like Forgejo are gaining traction as a way to avoid security risks and bot traffic, with some sharing clever workarounds for accessing private repositories. As the discussion unfolds, a consensus emerges that the problems with using Git for package management are often more related to the hosting platform than Git itself.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 53m after posting
- Peak period: 66 comments in 0-3h
- Avg / period: 14.5
- Based on 160 loaded comments
Key moments
- Story posted: Dec 26, 2025 at 7:46 AM EST (15 days ago)
- First comment: Dec 26, 2025 at 8:39 AM EST (53m after posting)
- Peak activity: 66 comments in 0-3h (the hottest window of the conversation)
- Latest activity: Dec 27, 2025 at 10:27 PM EST (14 days ago)
Scaling that data model beyond projects the size of the Linux kernel was not critical for the original implementation. I do wonder if there are fundamental limits to scaling the model for use cases beyond project-based source code management.
Consider vcpkg. It's entirely reasonable to download a tree named by its hash to represent a locked package. Git knows how to store exactly this, but it does not know how to transfer it efficiently.
Naïvely, I'd expect shallow clones to be this, so I was quite surprised by a mention of GitHub asking people not to use them. Perhaps Git tries too hard to make a good packfile?
Incidentally, what Nixpkgs does (and why “release tarballs” were mentioned as a potential culprit in the discussion linked from TFA) is request a gzipped tarball of a particular commit’s files from a GitHub-specific endpoint over HTTP rather than use the Git protocol. So that’s already more or less what you want, except even the tarball is 46 MB at this point :( Either way, I don’t think the current Nixpkgs problems actually support TFA’s thesis.
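For concreteness, here is a minimal sketch of that tarball-over-HTTP pattern, fetching a snapshot of one pinned commit instead of talking the Git protocol at all; the repository and commit hash below are placeholders:

    import tarfile
    import urllib.request

    # Placeholder repo and revision; any pinned commit works the same way.
    owner, repo = "NixOS", "nixpkgs"
    commit = "0123456789abcdef0123456789abcdef01234567"

    # GitHub serves an archive of any ref at this style of URL.
    url = f"https://github.com/{owner}/{repo}/archive/{commit}.tar.gz"

    with urllib.request.urlopen(url) as resp, open("snapshot.tar.gz", "wb") as out:
        out.write(resp.read())

    # The tarball unpacks to a single top-level directory named <repo>-<commit>.
    with tarfile.open("snapshot.tar.gz") as tar:
        tar.extractall("snapshot")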
It turns out Go modules will not accept a package hosted on my Forgejo instance because it asks for a certificate. There are ways to make go get use SSH, but even with that approach the repository needs to be accessible over HTTPS. In the end, I cloned the repository and used it in my project via a replace directive. It's really annoying.
No, that's false. You don't need anything to be accessible over HTTP.
But even if it did, and you had to use mTLS, there's a whole bunch of ways to solve this. How do you solve this for any other software that doesn't present client certs? You use a local proxy.
So when the article says "Package managers keep falling for this. And it keeps not working out", I feel that's untrue.
The biggest issue I have with this is really the "flakes" integration, where the whole recipe folder is copied into the store (which doesn't happen with non-flakes commands), but that's a tooling problem, not an intrinsic problem of using git.
Julia does the same thing, and going by the Rust numbers in the article, Julia has about 1/7th the number of packages that Rust does[1] (95k/13k = 7.3).
It works fine, Julia has some heuristics to not re-download it too often.
But more importantly, there's a simple path to improve. The top-level Registry.toml [1] has a path to each package, and once downloading everything proves unsustainable you can just download that one file and use it to download the rest as needed.
[1] https://github.com/JuliaRegistries/General/blob/master/Regis...
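A rough sketch of that incremental approach, assuming the General registry keeps its current layout (a [packages] table in Registry.toml mapping package UUIDs to name/path entries) and its current location; the package name is just an example:

    import tomllib  # Python 3.11+
    import urllib.request

    # Fetch only the top-level index of the General registry, not the whole repo.
    REGISTRY_TOML = ("https://raw.githubusercontent.com/"
                     "JuliaRegistries/General/master/Registry.toml")
    with urllib.request.urlopen(REGISTRY_TOML) as resp:
        registry = tomllib.loads(resp.read().decode("utf-8"))

    # Assumed layout: [packages] maps UUIDs to { name = ..., path = ... } tables.
    wanted = "DataFrames"  # example package
    for uuid, entry in registry["packages"].items():
        if entry["name"] == wanted:
            # Versions.toml, Deps.toml, etc. can now be fetched lazily from
            # entry["path"] instead of cloning the registry up front.
            print(uuid, entry["path"])
            break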
... Should it be concerning that someone was apparently able to engineer an ID like that?
https://en.wikipedia.org/wiki/Universally_unique_identifier
> 00000000-1111-2222-3333-444444444444
This would technically be version 2 (the DCE Security version), which would be built from the date-time and MAC address.
But overall, if you allow any yahoo to pick a UUID, it's not really a UUID, it's just a random string that looks like one.
universally unique identifier (UUID)
> 00000000-1111-2222-3333-444444444444
It's unique.
Anyway we're talking about a package that doesn't matter. It's abandoned. Furthermore it's also broken, because it uses REPL without importing it. You can't even precompile it.
https://github.com/pfitzseb/REPLTreeViews.jl/blob/969f04ce64...
https://devblogs.microsoft.com/oldnewthing/20120523-00/?p=75...
Right now I don't see the problem because the only criterion for IDs is that they are unique.
Apparently it is the former, and most developers independently generate random IDs because it's easy and is extremely unlikely to result in collisions. But it seems the dev at the top of the list had a sense of vanity instead.
It cannot be the case that engineers are labelled lazy for not building the at-scale solution to start with, while at the same time there are next to no resources for said engineer to actually build the at-scale solution.
> the path of least resistance for themselves.
Yeah because they're investing their own personal time and money, so of course they're going to take the path that is of least resistance for them. If society feels that's "unethical", maybe pony up the cash because you all still want to rely on their work product they are giving out for free.
I like OSS and everything.
Having said that, ethically, should society be paying for these? Maybe that is what should happen. In some places, we have programs to help artists. Should we have the same for software?
You realize, there are people who think differently? Some people would argue that if you keep working on problems you don't have but might have, you end up never finishing anything.
It's a matter of striking a balance, and I think you're way on one end of the spectrum. The vast majority of people using Julia aren't building nuclear plants.
Refusing to fix a problem that hasn't appeared yet, but has been/can be foreseen - that's different. I personally wouldn't call it unethical, but I'd consider it a negative.
Literally anybody could foresee that, _if_ something scales to millions of users, there will be issues. Some of the people who foresee that could even fix it. But they might spend their time optimizing for something that will never hit 1000 users.
Also, the problems discussed here are not that things don't work, it's that they get slow and consume too many resources.
So there is certainly an optimal time to fix such problems, which is, yes, OK, _before_ things get _too_ slow and consume _too_ many resources, but is most assuredly _after_ you have a couple of thousand users.
Most software gets to take it to more of an extreme than many engineering fields, since there isn't physical danger. It's telling that the counterexamples always use the potentially dangerous-sounding problems like medicine or nuclear engineering. The software in those fields is more stringent.
Contrary to the snap conclusion you drew from the article, there are design trade-offs involved when it comes to package managers using Git. The article's favored solution advocates for databases, which in practice, makes the package repository a centralized black box that compromises package reproducibility. It may solve some problems, but still sucks harder in some ways.
The article is also flat-out wrong regarding Nixpkgs. The primary distribution method for Nixpkgs has always been tarballs, not Git. Although the article has attempted to backpedal [1], it hasn't entirely done so. It's now criticizing collaboration over Git while vaguely suggesting that maybe it’s a GitHub problem. And you think what, that collaboration over Git is "unethical"???
On one side, there are open-source maintainers contributing their time and effort as volunteers. On the other, there are people like you attacking them, labeling them "lazy" and bemoaning that you're "forced" to rely on the results of their free labor, which you deride as "slow, bug-riddled garbage" without any real understanding. I know whose side I'm on.
[1]: https://github.com/andrew/nesbitt.io/commit/8e1c21d96f4e7b3c...
Another way to phrase this mindset is "fuck around and find out" in Gen Z speak. It's usually practical to an extent, but I'm personally not a fan.
When you fuck around optimizing prematurely, you find out that you're too late and nobody cares.
Oh, well, optimization is always fun, so there's that.
Building on the same thing people use for code doesn't seem stupid to me, at least initially. You might have to migrate later if you're successful enough, but that's not a sign of bad engineering. It's just building for where you are, not where you expect to be in some distant future
This is too naive. Fixing the problem costs a different amount depending on when you do it. The later you leave it the more expensive it becomes. Very often to the point where it is prohibitively expensive and you just put up with it being a bit broken.
This article even has an example of that - see the vcpkg entry.
[1] https://github.com/JuliaRegistries/General
[2] https://pkgdocs.julialang.org/dev/protocol/
> The problem was that go get needed to fetch each dependency’s source code just to read its go.mod file and resolve transitive dependencies. Cloning entire repositories to get a single file.
I have also had inconsistent performance with go get. Never enough to look closely at it. I wonder if I was running into the same issue?
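For what it's worth, the module proxy protocol addresses exactly this: as I understand it, the proxy exposes each version's go.mod on its own, so nothing has to be cloned just to resolve dependencies. A small illustration (the module path and version are arbitrary examples):

    import urllib.request

    module = "github.com/sirupsen/logrus"  # arbitrary example module
    version = "v1.9.3"                     # arbitrary example version

    # Module proxy protocol: <proxy>/<module>/@v/<version>.mod returns just the
    # go.mod file. (Uppercase letters in module paths need the proxy's escaping.)
    url = f"https://proxy.golang.org/{module}/@v/{version}.mod"

    with urllib.request.urlopen(url) as resp:
        print(resp.read().decode("utf-8"))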
Python used to have this problem as well (technically still does, but a large majority of things are available as a wheel and PyPI generally publishes a separate .metadata file for those wheels), but at least it was only a question of downloading and unpacking an archive file, not cloning an entire repo. Sheesh.
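Similarly, a sketch of pulling just a wheel's detached metadata from PyPI, assuming the PEP 691 JSON index and the PEP 658 .metadata convention (and that the chosen wheel actually has a metadata file published):

    import json
    import urllib.request

    # Ask the simple index for JSON instead of HTML (PEP 691).
    req = urllib.request.Request(
        "https://pypi.org/simple/numpy/",
        headers={"Accept": "application/vnd.pypi.simple.v1+json"},
    )
    with urllib.request.urlopen(req) as resp:
        index = json.load(resp)

    # Pick some wheel and fetch its detached metadata file (PEP 658):
    wheel = next(f for f in index["files"] if f["filename"].endswith(".whl"))
    with urllib.request.urlopen(wheel["url"] + ".metadata") as resp:
        metadata = resp.read().decode("utf-8")
    print("\n".join(metadata.splitlines()[:5]))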
Why would Go need to do that, though? Isn't the go.mod file in a specific place relative to the package root in the repo?
My favorite hill to die on (externality) is user time. Most software houses spend so much time focusing on how expensive engineering time is that they neglect user time. Software houses optimize for feature delivery and not user interaction time. Yet if I spend one hour making my app one second faster for my million users, I save 277 user-hours per year. But since user hours are an externality, such optimization never gets done.
Externalities lead to users downloading extra gigabytes of data (wasted time) and waiting for software, all of which is waste that the developer isn't responsible for and doesn't care about.
Commons would be if it's owned by nobody and everyone benefits from its existence.
The "tragedy", if you absolutely need to find one, is only for unrestricted, free-for-all commons, which is obviously a bad idea.
This is of course a false dichotomy because governance can be done at any level.
Let's Encrypt is a solid example of something you could reasonably model as "tragedy of the commons" (who is going to maintain all this certificate verification and issuance infrastructure?) but then it turns out the value of having it is a million times more than the cost of operating it, so it's quite sustainable given a modicum of donations.
Free software licenses are another example in this category. Software frequently has a much higher value than development cost and incremental improvements decentralize well, so a license that lets you use it for free but requires you to contribute back improvements tends to work well because then people see something that would work for them except for this one thing, and it's cheaper to add that themselves or pay someone to than to pay someone who has to develop the whole thing from scratch.
But that doesn't mean the tragedy of the commons can't happen in other scenarios. If we define commons a bit more generously it does happen very frequently on the internet. It's also not difficult to find cases of it happening in larger cities, or in environments where cutthroat behavior has been normalized
That works while the size of the community is ~100-200 people, when everyone knows everyone else personally. It breaks down rapidly after that. We compensate for that with hierarchies of governance, which give rise to written laws and bureaucracy.
New tribes break off old tribes, form alliances, which form larger alliances, and eventually you end up with countries and counties and voivodeships and cities and districts and villages, in hierarchies that gain a level per ~100x population increase.
This is sociopolitical history of the world in a nutshell.
You say it like this is a law set in stone, because this is what happened in history, but I would argue it happened under different conditions.
Mainly, the advantage of an empire over small villages/tribes is not at all that it has more power than the villages combined, but that it can concentrate its power where it is needed. One village did not stand a chance against the empire, and the villages were not coordinated enough.
But today we would have the internet for better communication and coordination, enabling the small entities to coordinate a defense.
Well, in theory of course. Because we do not really have autonomous small states, but are dominated by the big players. And the small states mostly have the choice of which bloc to align with, or get crushed. But the trend might go towards small again.
(See also cheap drones destroying expensive tanks, battleships etc.)
Canadians need an anti-imperial, Radio-Canada-run alternative. We aren't gonna be able to coordinate against the empire when the empire has the main control over the internet.
When the Americans come a-knocking, we're gonna wish we had Chinese radios.
Yet we regularly observe that working with millions of people: we take care of our young, we organize, and when we see that some action hurts our environment we tend to limit its use.
It's not obvious why some societies break down early and some go on working.
That's more like human universals. These behaviors generally manifest to a smaller or larger degree, depending on how secure people feel. But those are extremely local behaviors. And in fact, one of them is exactly the thing I'm talking about:
> we organize
We organize. We organize for many reasons, "general living" is the main one but we're mostly born into it today (few got the chance to be among the founding people of a new village, city or country). But the same patterns show up in every other organizations people create, from companies to charities, from political interests groups to rural housewives' circles -- groups that grow past ~100 people split up. Sometimes into independent groups, sometimes into levels of hierarchies. Observe how companies have regional HQs and departments and areas and teams; religious groups have circuits and congregations, etc. Independent organizations end up creating joint ventures and partnerships, or merge together (and immediately split into a more complex internal structure).
The key factor here is, IMO, for everyone in a given group to be in regular contact with everyone else. Humans are well evolved for living in such small groups - we come with built-in hardware and software to navigate complex interpersonal situations. Alignment around shared goals and implicit rules is natural at this scale. There's no space for cheaters and free-loaders to thrive, because everyone knows everyone else - including the cheater and their victims. However, once the group crosses this "we're all a big family, in it together" size, coordinating everyone becomes hard, and free-loaders proliferate. That's where explicit laws come into play.
This pattern repeats daily, in organizations people create even today.
But if a significant fraction of the population is barely scraping by then they're not willing to be "good" if it means not making ends meet, and when other people see widespread defection, they start to feel like they're the only one holding up their end of the deal and then the whole thing collapses.
This is why the tendency for people to propose rent-seeking middlemen as a "solution" to the tragedy of the commons is such a diabolical scourge. It extracts the surplus that would allow things to work more efficiently in their absence.
It’s easier to explain in those terms than assumptions about how things work in a tribe.
No it does not. This sentiment, which many people have, is based on a fictional and idealistic notion of what small communities are like, held by people who have never lived in such communities.
Empirically, even in high-trust small villages and hamlets where everyone knows everyone, the same incentives exist and the same outcomes happen. Every single time. I lived in several and I can't think of a counter-example. People are highly adaptive to these situations and their basic nature doesn't change because of them.
Humans are humans everywhere and at every scale.
Commons can fail, but the whole point of Hardin calling commons a "tragedy" is to suggest it necessarily fails.
Compare it to, say, driving. It can fail too, but you wouldn't call it "the tragedy of driving".
We'd be much better off if people didn't throw around this zombie term decades after it's been shown to be unfounded.
That seems like an unreasonable bar, and less useful than "does this system make ToC less frequent than that system?"
Communal management of a resource is still government, though. It just isn’t central government.
The thesis of the tragedy of the commons is that an uncontrolled resource will be abused. The answer is governance at some level, whether individual, collective, or government ownership.
> The "tragedy", if you absolutely need to find one, is only for unrestricted, free-for-all commons, which is obviously a bad idea.
Right. And that’s what people are usually talking about when they say “tragedy of the commons”.
Nonetheless, the concept is still alive, and anthropogenic global warming is here to remind you about this.
But I would make the following clarifications:
1. A private entity is still the steward of the resource and therefore the resource figures into the aims, goals, and constraints of the private entity.
2. The common good is itself under the stewardship of the state, as its function is guardian of the common good.
3. The common good is the default (by natural law) and prior to the private good. The latter is instituted in positive law for the sake of the former by, e.g., reducing conflict over goods.
I think it's both simpler and deeper than that.
Governments and corporations don't exist in nature. Those are just human constructs, mutually-recursive shared beliefs that emulate agents following some rules, as long as you don't think too hard about this.
"Tragedy of the commons" is a general coordination problem. The name itself might've been coined with some specific scenarios in mind, but for the phenomenon itself, it doesn't matter what kind of entities exploit the "commons"; the "private" vs. "public" distinction itself is neither a sharp divide, nor does it exist in nature. All that matters is that there's some resource used by several independent parties, and each of them finds it more beneficial to defect than to cooperate.
But it appears we cannot avoid getting into the weeds a bit…
> Governments and corporations don't exist in nature.
This is not as simple as you seem to think.
The claim “don’t exist in nature” is vague, because the word “nature” in common speech is vague. What is “natural”? Is a beehive “natural”? Is a house “natural”? Is synthetic water “natural”? (I claim that the concept of “nature” concerns what it means to be some kind of thing. Perhaps polystyrene never existed before human beings synthesized it, but it has a nature, that is, it means something to be polystyrene. And it is in the nature of human beings to make materials and artifacts, i.e., to produce technology ordered toward the human good.)
So, what is government? Well, it is an authority whose central purpose is to function as the guardian and steward of the common good. I claim that parenthood is the primordial form of human government and the family as the primordial form of the state. We are intrinsically social and political animals; legitimate societies exist only when joined by a common good. This is real and part of human nature. The capacity to deviate from human nature does not disprove the norm inherent to it.
Now, procedurally we could institute various particular and concrete arrangements through which government is actualized. We could institute a republican form of government or a monarchy, for example. These are historically conditioned. But in all cases, there is a government. Government qua government is not some arbitrary “construct”, but something proper to all forms and levels of human society.
> "Tragedy of the commons" is a general coordination problem.
We can talk about coordination once we establish the ends for which such coordination is needed, but there is something more fundamental that must be said about the framing of the problem of the “tragedy”. The framing does not presume a notion of human beings as moral agents and political and social creatures. In other words, it begins with a highly individualist, homo economicus view of human nature as rationally egoist and oriented toward maximizing utility, full stop. But I claim that is not in accord with human nature and thus the human good, even if people can fall into such pathological patterns of behavior (especially in a culture that routinely reinforces that norm).
As I wrote, human beings are inherently social animals. We cannot flourish outside of societies. A commons that suffers this sort of unhinged extraction is an example of a moral and a political failure. Why? Because it is unjust, intemperate, and a lack of solidarity to maximize resource extraction in that manner. So the tragedy is a matter of a) the moral failure of the users of that resource, and b) the failure of an authority to regulate its use. The typical solution that’s proposed is either privatization or centralization, but both solutions presuppose the false anthropology of homo economicus. (I am not claiming that privatization does not have a place, only that the dichotomy is false.)
Now, I did say that the case with something like github is analogical, because functionally, it is like a common resource, just like how social media functions like a public square in some respects. But analogy is not univocity. Github is not strictly speaking a common good, nor is social media strictly a public square, because in both cases, a private company manages them. And typically, private goods are managed for private benefit, even if they are morally bound not to harm the common good.
That intent, that purpose, is central to determining whether something is public or private, because something public has the common benefit as its aim, while something private has private benefit as its aim.
The idea of the tragedy of the commons relies on this feedback loop of having these unsustainably growing herds (growing because they can exploit the zero-cost-to-them resources of the commons). Feedback loops are notoriously sensitive to small parameter changes. MS could presumably impose some damping if they wanted.
Not linearity but continuity, which I think is a well-founded assumption, given that it's our categorization that simplifies the world by drawing sharp boundaries where no such bounds exist in nature.
> The idea of the tragedy of the commons relies on this feedback loop of having these unsustainably growing herds (growing because they can exploit the zero-cost-to-them resources of the commons)
AIUI, zero-cost is not a necessary condition, a positive return is enough. Fishermen still need to buy fuel and nets and pay off loans for the boats, but as long as their expected profit is greater than that, they'll still overfish and deplete the pond, unless stronger external feedback is introduced.
Given that the solution to tragedy of the commons is having the commons owned by someone who can boss the users around, GitHub being owned by MS makes it more of a commons in practice, not less.
You're fundamentally misunderstanding what the tragedy of the commons is. It's not that it's "zero-cost" for the participants. All it requires is a positive return that has a negative externality that eventually leads to the collapse of the system.
Overfishing and CO2 emissions are very clearly a tragedy of the commons.
GitHub right now is not. People putting all sorts of crap on there is not hurting GitHub. GitHub is not going to collapse if people keep using it unbounded.
Not surprisingly, this is because it’s not a commons and Microsoft oversees it, placing appropriate rate limits and whatnot to make sure it keeps making sense as a business.
The jerks get their free things for a while, then it goes away for everyone.
And out of curiosity, aside from costing more for some people, what’s worse exactly? I’m not a heavy GitHub user, but I haven’t really noticed anything in the core functionality that would justify calling it enshittified.
Probably the worst thing MS did was kill GitHub’s nascent CI project and replace it with Azure DevOps. Though to be fair the fundamental flaws with that approach didn’t really become apparent for a few years. And GitHub’s feature development pace was far too slow compared to its competitors at the time. Of course GitHub used to be a lot more reliable…
Now they’re cramming in half baked AI stuff everywhere but that’s hardly a MS specific sin.
MS GitHub has been worse about DMCA and sanctioned country related takedowns than I remember pre acquisition GitHub being.
Did I miss anything?
As for how the site has become worse, plenty of others have already done a better job than I could there. Other people haven't noticed or don't care and that's ok too I guess.
This isn’t what “commons” means in the term ‘tragedy of the commons’, and the obvious end result of your suggestion to take as much as you can is to cause the loss of access.
Anything that is free to use is a commons, regardless of ownership, and when some people use too much, everyone loses access.
Finite digital resources like bandwidth and database sizes within companies are even listed as examples in the Wikipedia article on Tragedy of the Commons. https://en.wikipedia.org/wiki/Tragedy_of_the_commons
The behavior that you warn against is that of a free rider that makes use of a positive externality of GitHub's offering.
“Commons can also be defined as a social practice of governing a resource not by state or market but by a community of users that self-governs the resource through institutions that it creates.”
https://en.wikipedia.org/wiki/Commons
The actual mechanism by which ownership resolves tragedy of the commons scenarios is by making the resource non-free, by either charging, regulating, or limiting access. The effect still occurs when something is owned but free, and its name is still ‘tragedy of the commons’, even when the resource in question is owned by private interests.
Certainly private property is involved in tragedy of the commons. In the shared cattle ranching example, the individual cattle are private property, only the field is held in common.
I generally think that the tragedy of the commons requires the commons to, well, be held in common. If someone owns the thing that is the commons, it's not a commons but just a bad product. (With, of course, some nitpicking about how things can be de jure private property while being de facto common property.)
If GitHub realizes that the free tier is too generous, they can cut it anytime without it being in any way a "tragedy" for anybody involved; having to pay for stuff or services you want to consume is not the "T" in ToC! The T is that there are no incentives to pay (or use less) without increasing the incentives for everyone else to just increase their relative use! You not using the GitHub free tier doesn't increase GitHub usage for anybody else; if it has any effect at all, it might actually decrease it, because you might not publish something that might in turn attract other users to interact.
Remember how GTA5 took 10 minutes to start and nobody cared? Lots of software is like this.
Some Blizzard games download a 137 MB file every time you run them and take a few minutes to start (and no, this is not due to my computer).
The article mentions that most of these projects did use GitHub as a central repo out of convenience, so there's that, but they could also have used self-hosted repos.
https://clickpy.clickhouse.com/dashboard/numpy
It's a very hacky-feeling add-on, but RKE2 has a distributed internal registry if you enable it and use it in a very specific way.
For the rate at which people love just shipping a Helm chart, it's actually absurdly hard to ship a self-contained installation without it just trying to hit internet resources.
From a compute POV you can serve that with one server, with headroom left.
Bandwidth-wise, given a 100 MB repo size, that would be 3.4 GB/s, which you can also serve with a single server and have headroom left.
The git transport protocol is "smart" in a way that is, in some ways, arguably rather dumb. It's certainly expensive on the server side. All of the smartness of it is aimed at reducing the amount of transfer and number of connections. But to do that, it shifts a considerable amount of work onto the server in choosing which objects to provide you.
If you benchmark the resource loads of this, you probably won't be saying a single server is such an easy win :)
Using the slowest clone method, they measured 8s for a 750 MB repo and 0.45s for a 40 MB repo. It appears to be linear, so 1.1s for 100 MB should be a valid interpolation.
So doing 30 of those per second only takes 33 cores. EPYC servers have 384 cores.
And remember we're using worst-case assumptions in 3 places (a full clone each time, the slowest clone method, and numbers from old hardware). In practice I'd bet a fastish laptop would suffice.
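Spelled out, the same back-of-the-envelope estimate (all inputs are the rough figures quoted in this thread, not fresh measurements):

    # Rough capacity estimate using the numbers quoted above.
    mb_per_core_sec = 750 / 8               # ~94 MB/s per core, from 750 MB in 8 s
    secs_per_clone = 100 / mb_per_core_sec  # ~1.1 s of CPU for a 100 MB repo
    clones_per_sec = 30                     # assumed request rate
    cores_needed = clones_per_sec * secs_per_clone
    print(f"{secs_per_clone:.2f} s/clone -> {cores_needed:.0f} cores at {clones_per_sec}/s")
    # ~1.07 s/clone -> ~32 cores, well within a single large server.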
I've looked into self-hosting a git repo with horizontal scalability, and it is indeed very difficult. I don't have the time to detail it in a comment here, but for anyone who is curious, it's very informative to look at how GitLab handled this with Gitaly. I've also seen some clever attempts to use object storage, though I haven't seen any of those solutions put heavily to the test.
I'd love to hear from others about ideas and approaches they've heard about or tried
Explain to me how you self-host, without spending any money, a git repo that is accessed millions of times a day from CI jobs pulling packages.
You can implement entire features with 10 cents of tokens.
Companies which don't adapt will be left behind this year.
Also, in case you're not aware, accusing people of shilling or astroturfing is against the Hacker News guidelines.
Anyone working in government, banking, or healthcare is still out of luck since the likes of Claude and GPT are (should be) off limits.
The number of companies that have this much respect for the user is vanishingly small.
Native software being an optimum is mostly an engineer fantasy that comes from imagining what you can build.
In reality that means having to install software like Meta’s WhatsApp, Zoom, and other crap I’d rather run in a browser tab.
I want very little software running natively on my machine.
Yes, there are many cases when condoms are indicative of respect between parties. But a great many people would disagree that the best, most respectful relationships involve condoms.
> Meta
Does not sell or operate respectful software. I will agree with you that it's best to run it in a browser (or similar sandbox).
I think this is sad.
I know the browser is convenient, but frankly, it's been a horror show of resource usage, vulnerabilities, and pathetic performance.
The idea that somehow those companies would respect your privacy were they running a native app is extremely naive.
We can already see this problem in video games, where copy protection became resource-heavy enough to cause performance issues.
By contrast as long as you have a native binary, one way or another you can make the thing run and nobody can stop you.
I think companies shifted to online apps because, #1, it solved the copy protection problem. FOSS apps are not in any hurry to become centralized because they don't care about that issue.
Local apps and data are a huge benefit of FOSS and I think every app website should at least mention that.
"Local app. No ads. You own your data."
I have never been convinced by this argument. The aggregate number sounds fantastic but I don't believe that any meaningful work can be done by each user saving 1 second. That 1 second (and more) can simply be taken by me trying to stretch my body out.
OTOH, if the argument is to make software smaller, I can get behind that since it will simply lead to more efficient usage of existing resources and thus reduce the environmental impact.
But we live in a capitalist world and there needs to be external pressure for change to occur. The current RAM shortage, if it lasts, might be one of them. Otherwise, we're only day dreaming for a utopia.
A high-usage one? Absolutely, improve its speed.
Loading the profile page? It isn't done often, so it's not really worth it unless it's a known and vocal issue.
https://xkcd.com/1205/ gives a good estimate.
Even if all you do with it is just stretching, there's a chance it will prevent you pulling a muscle. Or lower your stress and prevent a stroke. Or any number of other beneficial outcomes.
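The xkcd estimate boils down to a one-liner; a throwaway version (the five-year horizon is xkcd's, the usage figures are made up):

    def hours_saved_over_5_years(saved_secs: float, uses_per_day: float) -> float:
        """Total hours saved across xkcd 1205's five-year horizon."""
        return saved_secs * uses_per_day * 365 * 5 / 3600

    # Made-up example: shaving 1 second off something each user does 10x a day.
    print(f"{hours_saved_over_5_years(1, 10):.1f} hours per user")  # ~5.1 hours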
Not all of those externalizing companies abuse your time, but whatever they abuse can be expressed in a $ amount, and $ can be converted to a median person's time via the median wage. Hell, free time is more valuable than whatever you produce during work.
Say all that boils down to companies collectively stealing 20 minutes of your time each day. 140 minutes each week. 7280 (!) minutes each year, which is 5.05 days, which makes it almost a year over the course of 70 years.
So yeah, don't do what you do and sweet-talk the fact that companies externalize costs (privatize the profits, socialize the losses). They're sucking your blood.
I'd see this differently from a user perspective. If the average operation takes one second less, I'd spend a lot less time waiting for my computer. I'd also have fewer idle moments where my mind wanders while waiting for some operation to complete.
First argument would be: take at least two zeros off your estimate. Most applications will have maybe thousands of users, and successful ones will maybe run with tens of thousands. You might get lucky and work on an application that has hundreds of thousands or millions of users, but then you work at a FAANG, not a typical "software house".
Second argument is: most users use 10-20 apps in a typical workday; your application is most likely irrelevant.
Third argument is: most users would save much more time by learning to use the applications they rely on daily (or their computer) properly than from someone optimizing some function from 2s to 1s. But of course that's hard, because they have 10-20 daily apps plus god knows how many others they use less often. Still, I see people doing super silly stuff in tools like Excel, or even not knowing copy-paste, so we're not even talking about command-line magic.
1. Calling OP a liar when they mention their user count.
2. Calling OP a liar and their app irrelevant.
3. Blaming the user for not being smart enough and saying that it's their problem that app developers don't focus on saving user time.
Are any of those arguments actually arguments regarding the scenario OP described? It feels like they were meant for a totally different scenario, perhaps one involving fewer users.
> "The Macintosh boots too slowly. You've got to make it faster!"
https://www.folklore.org/Saving_Lives.html
In 24 years of career I've met a grand total of _two_ such managers. Both got fired not even 6 months after I joined the company, too.
Who's naive here?
So now you and I have both come across such a manager. Why would you make the claim that most engineers don't come across such people?
Because it's bad at this job, and SQLite is also free.
this isn't about "externalities"
This is what people mean about speed being a feature. But "user time" depends on more than the program's performance. UI design is also very important.
289 more comments available on Hacker News