Helldivers 2 Devs Slash Install Size From 154GB to 23GB
Think of it like this: I have two packs for levels.
Creek.level and roboplanet.level
Both use the cyborg enemies. By duplicating the cyborg enemy model and texture data across both files, only the level file needs to be opened to get all the necessary data for a match.
Because a modern OS will let you preallocate contiguous segments and has automatic defragmentation, you can read this level file at max speed, rather than having to stop and seek to find the cyborg.model file because it was referenced by the spawn pool. Engine limitations may prevent other optimisations you think up as a thought exercise after reading this.
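A minimal sketch of that packing idea in Python (file names like cyborg.model, the record layout, and the helper are all invented for illustration, not anything from the actual engine): every level pack carries its own copy of the shared cyborg data, so a match opens exactly one file and reads it front to back.

```python
import struct

# Shared assets deliberately duplicated into every level pack (hypothetical names).
SHARED_ASSETS = ["cyborg.model", "cyborg_diffuse.tex", "cyborg_normal.tex"]

def build_level_pack(level_assets, out_path):
    """Concatenate shared + level-specific assets into one sequential pack file."""
    entries = SHARED_ASSETS + level_assets            # duplication happens here, per pack
    with open(out_path, "wb") as pack:
        for asset in entries:
            with open(asset, "rb") as f:
                data = f.read()
            name = asset.encode()
            pack.write(struct.pack("<I", len(name)))  # [name length]
            pack.write(name)                          # [name]
            pack.write(struct.pack("<Q", len(data)))  # [data length]
            pack.write(data)                          # [data]

# Both packs embed their own copy of the cyborg data:
build_level_pack(["creek_terrain.geo", "creek_spawns.cfg"], "Creek.level")
build_level_pack(["roboplanet_terrain.geo", "roboplanet_spawns.cfg"], "roboplanet.level")
```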
It's similar to how Crash Bandicoot packed its level data to handle the slow speed of the PS1 disc drive.
As to why they had an HDD optimisation in 2024... *shrugs*
Sadly, Valve doesn't publish an HDD vs SSD breakdown in their surveys (https://store.steampowered.com/hwsurvey/?platform=combined), but considering the most popular combo seems to be 16GB RAM, 8GB VRAM, and a 2.3 GHz to 2.69 GHz CPU, I'm getting the impression that the average gaming PC isn't actually that beefy. If someone told me the most common setup paired with those specs was a small SSD for the OS and a medium/large HDD for everything else, I would have believed them.
I think we as (software/developer/technology) professionals with disposable income to spend on our hobbies forget how things are for the average person out there in the world.
More interesting would be to see the specs for users who bought COD (add other popular franchises as you wish) in the last 2 years. That would at least trim the sample set to those who expect to play recent graphics heavy titles.
A filesystem is by itself just one big "file" acting like a file archive.
> 3-D Hardware Accelerator (with 16MB VRAM with full OpenGL® support; Pentium® II 400 Mhz processor or Athlon® processor; English version of Windows® 2000/XP Operating System; 128 MB RAM; 16-bit high color video mode; 800 MB of uncompressed hard disk space for game files (Minimum Install), plus 300 MB for the Windows swap file […]
* https://store.steampowered.com/app/9010/Return_to_Castle_Wol...
* https://en.wikipedia.org/wiki/Return_to_Castle_Wolfenstein
Even older games would be even smaller:
* https://www.oldgames.sk/en/game/ultima-vi-the-false-prophet/...
* https://en.wikipedia.org/wiki/Ultima_VI:_The_False_Prophet
The ever-increasing system requirements of productivity software, however, never cease to amaze me:
Acrobat Exchange 1.0 for Windows (1993) required 4 MB RAM and 6 MB free disk space.
Rough feature parity with the most-used features of modern Acrobat also required Acrobat Distiller, which required 8 MB RAM and another 10 MB or so of disk space.
Acrobat for Windows (2025) requires 2,000 MB RAM and 4,500 MB free disk space.
It's currently over 100GB because of duplicated assets, so this is a game-changer (pun intended).
EDIT: Just checked; 157GB on my SSD.
Those high resolution textures will just take space. You could obviously decrease the graphical fidelity but I'd guess that most players (me very much included) would rather play a very pretty 23GB Helldivers II than a 5GB ugly Helldivers II.
150GB was very annoying, ironically forcing me to install it to a HDD. 23GB isn't even worth thinking about for me.
Because offhand, I know you could do things like cute optimizations of redundant data to minimize seek time on optical media, but with HDDs, you get no promises about layout to optimize around...
The only thing I can think of is if it was literally something as inane as checking the "store deduplicated by hash" option in the build, on a tree with copies of assets scattered everywhere, and it was just that nobody had ever checked whether the fear around the option was borne out in practice.
(I know they said in the original blog post that it was based around fears of client performance impact, but the whole reason I'm staring at that is that if it's just a deduplication table at storage time, the client shouldn't...care? It's not writing to the game data archives, it's just looking stuff up either way...)
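For what it's worth, here's a rough sketch (not Arrowhead's actual pipeline, just my reading of what a "store deduplicated by hash" build option could amount to): unique blobs stored once under their content hash, plus an index from asset path to hash, so lookups on the client side stay exactly the same.

```python
import hashlib

def build_dedup_archive(assets):
    """assets: {path: bytes}. Store each unique blob once, keyed by content hash;
    the index still maps every asset path to its data, so readers don't change."""
    blobs = {}   # hash -> unique blob, written to disk once
    index = {}   # asset path -> hash
    for path, data in assets.items():
        digest = hashlib.sha256(data).hexdigest()
        blobs.setdefault(digest, data)   # duplicates collapse to one entry
        index[path] = digest
    return blobs, index

def read_asset(path, blobs, index):
    # One extra table lookup at load time; nothing is ever written by the client.
    return blobs[index[path]]
```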
Let’s say you have UI textures that you always need, common player models and textures, the battle music, but world geometry and monsters change per stage. Create an archive file (pak, wad, …) for each stage, duplicating UI, player and battle music assets into each archive. This makes it so that you fully utilize HDD pages (some small config file won’t fill 4kb filesystem pages or even the smaller disk sectors). All the data of one stage will be read into disk cache in one fell swoop as well.
On optical media like CDs one would even put some data closer to the middle or on the outer edge of the disc because the reading speed is different due to the linear velocity.
This is an optimization for bandwidth at the cost of size (which often wasn’t a problem because the medium wasn’t filled anyway)
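A sketch of the page-fill part of that, assuming 4 KiB filesystem pages (the function, names, and layout are invented for illustration): each asset starts on a page boundary, and the whole stage archive can be pulled in with one sequential read.

```python
PAGE = 4096  # typical filesystem page size

def write_stage_pak(out_path, assets):
    """assets: list of (name, bytes). Writes each asset at a 4 KiB-aligned offset
    and returns a {name: (offset, size)} table; the whole stage can then be read
    with one large sequential read and sliced up in memory."""
    table = {}
    with open(out_path, "wb") as pak:
        for name, data in assets:
            table[name] = (pak.tell(), len(data))
            pak.write(data)
            remainder = pak.tell() % PAGE
            if remainder:                      # pad to the next page boundary
                pak.write(b"\0" * (PAGE - remainder))
    return table
```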
HDDs also have to deal with fragmentation. I wonder what the odds are that you get to write 150 GB (and then regular updates in the 30 GB range) without breaking it into fragments...
Microsoft has a paper somewhere that shows IO speed starts to drop when fragments of files get below 64MB. So you can split that file up into a few thousand pieces without much performance loss at all.
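Rough arithmetic on that threshold, taking the ~150 GB install mentioned elsewhere in the thread (my numbers, not Microsoft's):

$$\frac{150\ \text{GB}}{64\ \text{MB per fragment}} \approx 2{,}300\ \text{fragments}$$

so even a badly fragmented install stays comfortably within "a few thousand pieces".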
Even if you pack those, there's no guarantee they don't get fragmented by the filesystem.
CDs are different not because of media, but because of who owns the storage media layout.
A single large file is still more likely to be mostly sequential compared to 10,000 tiny files. With a large number of individual files, the filesystem is more likely to opportunistically use the small files to fill previously left holes. Individual files more or less guarantee multiple syscalls per file just to open and read it, plus potentially more indirection and jumping around on the OS side to read each file's metadata. Individual files also increase the chance of accidentally introducing random seeks due to mismatches between the order the updater writes files, the way the filesystem lays things out, and the order in which level description files list and read them.
I am a little curious about the performance of reading several small files concurrently versus reading a large file linearly. I could see small files performing better with concurrent reads if they can be spread around the disk and the IO scheduler is clever enough that the disk is reading through nearly the whole rotation. If the disk is fragmented, the small files should theoretically be striped over basically the entire disk.
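A crude way to test that yourself (the paths and sizes below are made up; on an SSD or with a warm page cache the gap mostly disappears, so a cold HDD is the interesting case):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def read_file(path):
    with open(path, "rb") as f:
        return len(f.read())

def read_small_concurrent(paths, workers=32):
    # Many outstanding requests give the IO scheduler seeks to reorder.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(read_file, paths))

# Hypothetical layout: 10,000 x 16 KiB asset files vs. one ~160 MB pack.
small_files = [f"assets/{i:05d}.bin" for i in range(10_000)]

t0 = time.perf_counter(); read_small_concurrent(small_files); t1 = time.perf_counter()
read_file("assets.pack");                                     t2 = time.perf_counter()
print(f"small, concurrent: {t1 - t0:.2f}s  one big, sequential: {t2 - t1:.2f}s")
```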
They realised, after a lot of players asking, that it wasn't necessary, and probably had less of an impact than they thought.
They removed the duplicates, and drastically cut the install size. I updated last night, and the update alone was larger than the entire game after this deduplication run, so I'll be opting in to the Beta ASAP.
It's been almost a decade since I ran spinning rust in a desktop, and while I admire their efforts to support shitty hardware, who's playing this on a machine good enough to play but can't afford £60 for a basic SSD for their game storage?
It follows in the footsteps of trading in storage for less compute and/or better performance.
An opposite approach in the form of a mod for Monster Hunter: Wilds recently made it possible [0] for end-users to decompress all the game textures ahead of time. This was beneficial there, because GPU decompression was causing stalls, and the trading in of compute for less storage resulted in significantly worse performance.
[0] https://youtu.be/AOxLV2US4Ac
In this case, I don't think it was forgetfulness; unlike us, they have an excuse and they were trying to optimise for disk seek times.
Anyway, I've got a half-dozen cloud accounts I need to go check for unused resources. *waves*
It seems bizarre to me that they'd have accepted such a high cost (150GB+ installation size!) without entirely verifying that it was necessary!
I expect it's a story that'll never get told in enough detail to satisfy curiosity, but it certainly seems strange from the outside for this optimisation to be both possible and acceptable.
Users might be more hesitant to switch to another game if it means uninstalling yours and reinstalling is a big pain in the backside due to long download times.
I looked up the size of the latest one, and Sony's investment in RAD Kraken seems to be paying dividends:
Xbox: 214 GB
PC: 162 GB
PS5: 96 GB
On phone, I bet you see some more effort.
On both phones and PCs storage has just grown, so it's less of an issue. The one thing I have noticed is that Apple does its price windowing around memory, so you pay an absurd amount for an extra 128GB. The ultra-competitive Chinese phone market crams high-end phones with a ton of memory and battery. So some popular Chinese phone games are huge compared to ones made for the iPhone.
Would have saved us from all the people who reject any sort of optimization work because for them it is always "too early" since some product team wanted their new feature in production yesterday, and users waiting 5 seconds for a page load isn't considered bad enough just yet.
It means "We think we have something that could help performance based on a dubiously applicable idea, but we have no real workload to measure it on. But we're going to do it anyway."
So it doesn't save us from anything, it potentially delays launching and gives us the same result that product team would have given us, but more expensive.
the problem is that it doesn't say that directly so people without experience take it at face value.
There's only so much you can do with people who will not even take the complete original sentence, let alone the context. (That said, "premature optimisation is the root of all evil" is much snappier so I do see why it's ended up being quoted in isolation)
Yes, of course you shouldn't optimize before you get your critical path stable and benchmark which parts take too much.
But many, many times it is used as an excuse to delay optimisation so far that it is now hard to do, because it would require rewriting parts that "work just fine", or it is skipped because the slowness stays at a just-tolerable level.
I have a feeling that spending 10-20% more time on a piece of code to give it a glance over whether it couldn't be more optimal would pay for itself very quickly compared to a bigger rewrite months after the code was written.
Sure, they may lose some sales, but I have never seen many numbers on how much it really impacted sales.
Also on the disk side, I can't say I have ever looked at how much space is required for a game before buying it. If I need to clear out some stuff, I will. Especially with it not being uncommon for a game to be in the 100GB realm already.
That all being said, I am actually surprised by the 11% using mechanical hard drives. I figured that NVMe would be a lower percentage and many are using SSDs... but I figured the percentage of machines capable of running modern games in the first place with mechanical drives would be far lower.
I do wonder how long it will be until we see games just saying they are not compatible with mechanical drives.
To be fair, at launch Starfield had pretty shit loading times even with blazing-fast SSDs, and the game has a lot of loading screens, so it makes sense that they'd nip that one in the bud and just say it's unsupported on the slower type of disk.
Because it's a recent 20TB HDD, read speeds approach 250MB/s, and I've also specifically partitioned the beginning of the disk just for games so that it can sustain full transfer speeds without files landing on the slower tracks; the rest of the disk is partitioned for media files that won't care much about the speed loss. It's honestly fine for the vast majority of games.
https://www.romexsoftware.com/en-us/primo-cache/
Yes, because they apparently still duplicate data so that the terrible IOPS of spinning disks does not factor as much. You people need to stop with this so that we can all have smaller games again! ;-) <--- (IT'S A JOKE)
Edit: Forgot it was released recently on Xbox Series consoles but those also have SSDs.
SSD sizes are still only equal to the HDD sizes that were available and common in 2010 (a couple of TB). SSD size increases (availability and price decreases) for consumer form factors have essentially stopped. There is no more progress for SSDs because quad-level cells are as far as the charge-trap tech can be pushed, and most people no longer own computers. They have tablets or phones, or if they have a laptop it has 256GB of storage and everything is done in the cloud or with an octopus of (small) externals.
Sure, there are some limitations in the form factor (you can only shove so many chips onto an M.2 stick), but you can get U.2 drives that are bigger than the biggest HDD (though the price is pretty eye-watering).
I think this is more a symptom of data bloat decelerating than anything else. Consumers just don't have TBs of data. The biggest files most consumers have will be photos and videos that largely live on their phones anyway. Gaming is relatively niche and there just isn't that much demand for huge capacity there, either -- it's relatively easy to live with only ~8 100GB games installed at the same time. Local storage is just acting as a cache in front of Steam, and modern internet connections are fast enough that downloading 100GB isn't that slow (~14 minutes at gigabit speeds).
So when consumers don't have (much) more data on their PCs than they had in 2015, why would they buy any bigger devices than 2015? Instead, as sibling commenter has pointed out, prices have improved dramatically, and device performance has also improved quite a bit.
(But it's also true that the absolute maximum sized devices available are significantly larger than 2015, contradicting your initial claim.)
You can buy 16-32TB consumer SSDs on NewEgg today. Or 8TB in M.2 form factor. In 2015, the largest M.2 SSDs were like 1TB. That's merely a decade. At a decade "plus," SSDs were tiny as recently as 15 years ago.
[1] https://www.microcenter.com/product/659879/inland-platinum-1...
[2] https://www.microcenter.com/product/700777/inland-platinum-8...
(More relevant might be that backups are a largely sequential workload and HDDs are still marginally cheaper per TB than QLC flash.)
Tape, microfilm, and M-DISC are the only modern media that are meant to be left in storage without needing to be babysat, at least in climate-controlled warehousing.
They’re not the ones bearing the cost. Customers are. And I’d wager very few check the hard disk requirements for a game before buying it. So the effect on their bottom line is negligible while the dev effort to fix it has a cost… so it remains unfixed until someone with pride in their work finally carves out the time to do it.
If they were on the hook for 150GB of cloud storage per player this would have been solved immediately.
That's why they did the performance analysis and referred to their telemetry before pushing the fix. The impact is minimal because their game is already spending an equivalent time doing other loading work, and the 5x I/O slowdown only affects 11% of players (perhaps less now that the game fits on a cheap consumer SSD).
If someone "takes pride in their work" and makes my game load five times longer, I'd rather they go find something else to take pride in.
It didn't help their game load noticeably faster. They just hadn't checked if the optimization actually helped.
> Only a few seconds difference?
> Further good news: the change in the file size will result in minimal changes to load times - seconds at most. “Wait a minute,” I hear you ask - “didn’t you just tell us all that you duplicate data because the loading times on HDDs could be 10 times worse?”. I am pleased to say that our worst case projections did not come to pass. These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.
> Now things are different. We have real measurements specific to our game instead of industry data. We now know that the true number of players actively playing HD2 on a mechanical HDD was around 11% during the last week (seems our estimates were not so bad after all). We now know that, contrary to most games, the majority of the loading time in HELLDIVERS 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time. We now know that this is true even for users with mechanical HDDs.
They measured first, accepted the minimal impact, and then changed their game.
I'm just a little baffled by people harping on this decision and deciding that the developers must be stupid or lazy.
I mean, seriously, I do not understand. Like what do you get out of that? That would make you happy or satisfied somehow?
I never called anyone lazy or stupid, I just wondered whether they blindly trusted some stats without actually testing them.
> FWIW, the PC install size was reasonable at launch. It just crept up slowly over time
Wouldn't this mean their optimization mattered even less back then?
It's certainly true that a lot of optimization can and should be done after a software project is largely complete. You can see where the hotspots are, optimize the most common SQL queries, whatever. This is especially true for CRUD apps where you're not even really making fundamental architecture decisions at all, because those have already been made by your framework of choice.
Other sorts of projects (like games or "big data" processing) can be a different beast. You do have to make some of those big, architecture-level performance decisions up front.
Remember, for a game... you are trying to process player inputs, do physics, and render a complex graphical scene in 16.7 milliseconds or less. You need to make some big decisions early on; performance can't entirely just be sprinkled on at the end. Some of those decisions don't pan out.
I don't see a reason to think this. What are you thinking?
To be clear, I'm not misquoting Knuth if that's what you mean. I'm arguing that in this case, specifically, this optimization was premature, as evidenced by the fact that it didn't really have an impact (they explain that other processes running in parallel dominated the load times) and it caused trouble down the line.
> Some of those decisions don't pan out.
Indeed, some premature optimizations will and some won't. I'm not arguing otherwise! In this case, it was a bad call. It happens to all of us.
> I don't see a reason to think this. What are you thinking?
You're right, I got this backwards. While the time savings would have been minimal, the data duplication wasn't that big so the cost (for something that didn't pan out) wasn't that bad either.
No, they measured it now, not first. The very text you pasted is very clear about that, so I'm not sure why you're contradicting it.
If they had measured it first, this post would not exist.
And the others, who wish one single game didn't waste 130GB of their disk space - is it fine to ignore their opinions?
They used up a ton more disk space to apply an ill-advised optimization that didn't have much effect. I don't really understand why you'd consider that a positive thing.
The "problem" is a feature. The "so it remains unfixed until someone with pride in their work finally carves out the time to do it" mindset suggests that they were simply too lazy to ever run fdupes over their install directory, which is simply not the case. The duplication was intentional, and is still intentional in many other games that could but likely won't apply the same data minimization.
I'll gladly take this update because considerable effort was spent on measuring the impact, but not one of those "everyone around me is so lazy, I'll just be the noble hero to sacrifice my time to deduplicate the game files" updates.
Arrowhead is a whole company full of "lazy" developers who just don't like to work very hard?
Or do you think they had their hands full with other optimizations, bug fixes, and a large amount of new content while running a complex multiplatform live service game for millions of players? (Also consider that management was probably deciding priorities there and not the developers)
I put hundreds of hours into HD2 and had a tremendous amount of fun. It's not the product of "lazy" people...
But that's also par for the course with AA+ games these days, where shoving content into an engine is paramount and everything else is 'as long as it works.' Thanks, Bethesda.
Evidenced by the litany of quality of life bug fixes and performance improvements modders hack into EOL games.
In the case of HD2 I'd say the team has done well enough. The game has maintained its player base after nearly two years, including on PC. This is rare in the world of live-service games, and we should ask ourselves what this tells us about the overall technical quality of the game - is the game so amazing that people keep playing despite abysmal technical quality?
The technical quality of the game itself has been somewhat inconsistent, but I put hundreds of hours into it (over 1K, I think) and most of the time it was trouble-free (and fun).
I would also note that the PC install size issue has only become egregious somewhat recently. The issue was always there, but initially the PC install size was small enough that it wasn't a major issue for most players. I actually never noticed the install size bug because I have a $75 1TB drive for games and even at its worst, HD2 consumed only a bit over 10% of that.
It certainly must have been challenging for the developers. There has been a constant stream of new content, and an entirely new platform (Xbox) added since release. Perhaps more frustratingly for the development team, there has also been a neverending parade of rebalancing work which has consumed a lot of cycles. Some of this rebalancing work was unavoidable (in a complex game, millions of players will find bugs and meta strategies that could never be uncovered by testing alone) and some was the result of perhaps-avoidable internal discord regarding the game's creative direction.
The game is also somewhat difficult to balance and test by design. There are 10 difficulty levels and 3 enemy factions. It's almost like 30 separate games. This is an excellent feature of the game, but it would be fair to say Arrowhead perhaps bit off more than any team can chew.
That makes no goddamn sense. I’ve read it three times and to paraphrase Babbage, I cannot apprehend the confusion of thought that would lead to such a conclusion.
A 5x figure gets resources to investigate; it doesn't get assumed to be correct and then doubled. Orders of magnitude change implementations, as we see here. And it sounds like they just manufactured one out of thin air.
HDD and SSD, where SSD is deduplicated.
I'm sure some gamers will develop funny opinions, but for the last 8 years I have not had an HDD in sight inside my gaming or work machines. I'd very much rather save space if the load time is about the same on an SSD. A 150GB install profile is absolute insanity.
Getting rid of 80% of that duplication for a 2x instead of a 5x slowdown would be something.
The optimization was not ill-advised. It is in fact, an industry standard and is strongly advised. Their own internal testing revealed that they are one of the supposedly rare cases where this optimization did not have a noticeably positive effect worth the costs.
> These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not.
It's literally "instead of profiling our own app we profiled competition's app and made decisions based on that".
They made an effort to improve the product, but because everything in tech comes with side effects it turned out to be a bad decision which they rolled back. Sounds like highly professional behavior to me by people doing their best. Not everything will always work out, 100% of the time.
And this might finally reverse the trend of games being >100GB, as other teams will be able to point to this decision as a reason why they shouldn't implement this particular optimization prematurely.
Never trust a report that highlights the outliers before even discussing the mean. Never trust someone who thinks that is a sane way to use statistics. At best they are not very sharp, and at worst they are manipulating you.
> We were being very conservative and doubled that projection again to account for unknown unknowns.
Ok, now that's absolutely ridiculous and treating the reader like a complete idiot. "We took the most extreme scenario reported by something we read somewhere, and doubled it without giving it a second thought, because WTF not? Since this took us 5 seconds to do, we went with that until you started complaining."
Making up completely random numbers on the fly would have made exactly the same amount of sense.
Trying to spin this whole thing into "look at how smart we are that we reverted our own completely brain-dead decision" is the cherry on top.
I'm sure that whatever project you're assigned to has a lot of optimization stuff in the backlog that you'd love to work on but haven't had a chance to visit because bugfixes, new features, etc. I'm sure the process at Arrowhead is not much different.
For sure, duplicating those assets on PC installs turned out to be the wrong call.
But install sizes were still pretty reasonable for the first 12+ months or so. I think it was ~40-60GB at launch. Not great but not a huge deal and they had mountains of other stuff to focus on.
When the documented worst case is 5x you prepare for the potential bad news that you will hit 2.5x to 5x in your own code. Not assume it will be 10x and preemptively act, keeping your users from installing three other games.
In my experience it's always been quite a battle to spend time on perf.
I'll happily take a demotion if I make a 10x performance goof like that. As long as I can get promoted eventually if I make enough 10x wins. :D
People are more likely to thank me after the fact than cheer me on. My point, if I have one, is that gaming has generally been better about this but I don’t really want to work on games. Not the way the industry is. But since we are in fact discussing a game, I’m doing a lot of head scratching on this one.
That’s the sort of mistake that leads to announcing a 4x reduction in install size.
Or, you know, they just didn't really understand industry recommendations or what they were doing.
"Turns out our game actually spends most of its loading time generating terrain on the CPU" is not something you accidentally discover, and should have been known before they even thought about optimizing the game's loading time, since optimizing without knowing your own stats is not optimizing, and they wrote the code that loads the game!
Keep in mind this is the same team that accidentally caused instantly respawning patrols in an update about "Balancing how often enemy patrols spawn"; the same group that couldn't make a rocket launcher lock on for months while insisting "Raycasts are hard"; that released a mech that would shoot itself if you turned wrong; that spent the early days insisting "The game is supposed to be hard" while players struggled with enemy armor calculations that punished you for not shooting around enemy armor because the game was calculating that armor's position incorrectly; and tons of other outright broken functionality that has made it hard to play the game at times.
Not only do Arrowhead have kind of a long history of technical mediocrity (Magicka was pretty crashy on release, and has weird code even after all the bugfixes), but they also demonstrably do not test their stuff very well, and regularly release patches that have obvious broken things that you run into seconds into starting play, or even have outright regressions suggesting an inability to do version control.
"We didn't test whether our game was even slow to load on HDD in the first place before forcing the entire world to download and store 5x the data" is incompetence.
None of this gets into the utterly absurd gameplay decisions they have made, or the time they spent insulting players for wanting a game they spent $60 on to be fun and working.
So many game design crimes have a storage limitation at their core, e.g. levels that are just a few rooms connected by tunnels or elevators.
I have friends who play one or two games and want them to load fast. Others have dozens and want storage space.
Game size is a problem in every new triple A release.
Not what happened. They removed an optimization that, in *some other games* that are not their game, gave a 5x speed boost.
And they are changing it now because it turned out all of that was bogus: the speed boost wasn't as high for loading the data itself, and a good part of level loading wasn't even waiting for the disk, but for terrain generation.
I’ve worked with far too many people who have done the equivalent in non game software and it leads to unhappy customers and salespeople. I’ve come to think of it as a kind of learned helplessness.
Maybe, kinda, sorta, on some games, on some spinning rust hard disks, if you held your hand just right and the Moon was real close to the cusp.
If you're still using spinning rust in a PC that you attempt to run modern software on, please drop me a message. I'll send you a tenner so you can buy yourself an SSD.
However a lot of people have TINY SSDs. Think 500 gigabyte.
I am not the one who loads last in the multiplayer lobbies.
The entire current insistence about "HDD is slow to load" is just cargo cult bullshit.
The Mass Effect remastered collection loads off of a microSD card faster than the animation takes to get into the elevator.
Loading is slow because games have to take all that data streaming in off the disk and do things with it. They have to parse data structures, build up objects in memory, make decisions, pass data off to the GPU etc etc. A staggering amount of games load no faster off a RAM disk.
For instance, Fallout 4 loading is hard locked to the frame rate. The only way to load faster is to turn off the frame limiter, but that breaks physics, so someone made a mod to turn it off only while loading. SSD vs HDD makes zero difference otherwise.
We live in a world where even shaders take a second's worth of processing before you can use them, and they are like hundreds of bytes. Disk performance is not the bottleneck.
Some games will demonstrate a small speedup if you move them to an SSD. Plenty won't. More people should really experiment with this; it's a couple of clicks in Steam to move a game.
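If you want to check where a given game's load time actually goes, the split is easy to measure in miniature; assets.json here is just a stand-in (real engines parse binary formats), but the read-versus-parse breakdown is the same idea:

```python
import json, time

def profile_load(path):
    t0 = time.perf_counter()
    with open(path, "rb") as f:
        raw = f.read()                 # disk (and page cache) time
    t1 = time.perf_counter()
    objects = json.loads(raw)          # pure CPU: parsing and building objects
    t2 = time.perf_counter()
    print(f"read {len(raw)/1e6:.1f} MB in {t1 - t0:.3f}s, parsed in {t2 - t1:.3f}s")
    return objects

# If the second number dwarfs the first, a faster drive won't help much.
profile_load("assets.json")
```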
If bundling assets together to reduce how much filesystem and drive-seek work you have to do multiplies your install size by 5x, your asset management is terrible. Even the original PlayStation, with a seek time of 300-ish ms, a slow-as-hell drive, and more CD space than anyone really wanted, didn't duplicate data that much, and you could rarely afford any in-game loading.
I wish they gave any details. How the hell are you bundling things to get that level of data duplication? Were they bundling literally everything into a single bundle for every map? Did every single map file also include all the assets for all weapons and skins, all character animations, and all enemy types? That would explain how it grew so much over time, as each weapon you added would actually take sizeOfWeapon * NumMaps space, but that's stupid as fuck. Seeking an extra file takes at most one frame longer than loading the same amount of data as one file.
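To put invented numbers on that scaling (nothing here is from Arrowhead; it just illustrates why per-map duplication makes every content drop balloon the install):

```python
# Invented figures, purely to illustrate the growth the parent describes.
weapon_size_mb = 150      # assets for one new weapon
num_maps       = 30       # per-map bundles that would each embed a copy

growth_gb = weapon_size_mb * num_maps / 1024
print(f"~{growth_gb:.1f} GB of install growth per added weapon")   # ~4.4 GB
```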
Every now and then Arrowhead says something that implies they are just utterly clueless. They have such a good handle on how games can be fun, though. At least when they aren't maliciously bullying their players.
They're using the outdated Stingray engine, and that engine was designed for the days of single- or dual-core computers with spinning disks. They developed their game with this target in mind.
Mind you, spinning disks are not only a lot more rare today but also much faster than when Stingray 1.0 was released. Something like 3-4x faster.
The game was never a loading hog and I imagine by the time they launched and realized how big this install would be, the technical debt was too much. The monetary cost of labor hours to undo this must have been significant, so they took the financial decision of "We'll keep getting away with it until we can't."
The community finally got fed up. The SteamDB chart keeps inching lower and lower, and I think they finally got worried enough about permanently losing players that they conceded and did this, hoping to get those players back and to avoid a further exodus.
And let's say this game is now much worse on spinning disks. At the end of the day AH will choose profit. If they lose the 10% of spinning-disk players who won't tolerate the few-seconds change, the game will still please the other players, thus making sure it lives on.
Lastly, this is how it's delivered on consoles, many of which use spinning media. So it's hard to see this as problematic. I'm guessing that for consoles MS and Sony said no to a 150GB install, so AH was invested in keeping it small. They were forced to redo the game for console without this extra data. The time and money there was worth it for them. For PC, there's no one to say no, so they did the cheapest thing they could until they no longer could.
This is one of the downsides of open platforms. There's no 'parent' to yell at you, so you do what you want. It's the classic walled garden vs open bazaar type thing.
I have to check. Your assumption is correct. I am one of very few.
I don't know the numbers and I'm going to check in a sec, but I'm wondering whether the suppliers (publishers or whoever is pinning the price) haven't screwed up big time by driving up prices and requirements without thinking about the potential customers they are going to scare away for good. Theoretically, I have to assume their sales teams account for this potential, but I've seen so much dumb shit in practice over the past 10 years that I have serious doubts most of these suits are worth anything at all, given that grown-up working-class kids (with up to 400+ hours of overtime per year, 1.3 kids on average, and approx. -0.5 books and news read per any unit of time) can come up with the same big-tech, big-media, economic and political agendas as have been in practice in both parts of the world for the better part of our lives, if you play "game master" for half a weekend and become best friends with all the kiosks in your proximity.
> the effect on their bottom line is negligible
Is it, though? My bold, exaggerated assumption is that they would have had 10% more sales AND players.
And the thing is that at any point in time when I, and a few people I know, had the time and desire to play, we would have had to either clean up our drives or invest the game price + SSD price for about 100 hours of fun over the course of months. We would gladly have gotten a taste for it, but no industry promises can make up for asking even more effort of us than enough of us already see and put in at work. As a result, at least 5 buyers and players lost, and at work and elsewhere you hear, "yeah, I would, if I had some guys to play with" ...
My best recollection is that the PC install size was a lot more reasonable at launch. It just crept up over time as they added more content over the last ~2 years.
Should they have addressed this problem sooner? Yes.
I've racked up 700 hours in the game, and I didn't care about the storage requirements.
https://www.autodesk.com/products/stingray/overview
I'm not sure that's necessarily true... Customers have limited space for games; it's a lot easier to justify keeping a 23GB game around for occasional play than it is for a 154GB game, so they likely lost some small fraction of their playerbase they could have retained.
Both entrants in the market are telling you that "install size isn't that important".
If you asked the player base of this game whether they'd prefer a smaller size, or more content - the vast majority would vote content.
If anything, I'd wager this decision was still driven by internal goals for the company, because producing a 154GB artifact and storing it for things like CI/CD is still quite expensive if you have a decent number of builds/engineers. Both in time and money.
You are saying that most users don't check the install size of their games. Which I am not convinced of, but it might even be true. Let's assume it is true for the moment. How does this contradict what I stated? How does users being uninformed or unaware of technical details make it so that suddenly cramming the user's disk is "caring" instead of "not caring"? To me this does not compute. Users will simply have a problem later, when their TBs of disk space have been filled with multiple such disk-space wasters. Wasting this much space is user-hostile.
Next you are talking about _content_, which most likely doesn't factor in that much at all. Most of that stuff is high-resolution textures, not content. It's not like people are getting significantly more content in bigger games. It is a graphics craze that many people don't even need. I am still running around with 2 full-HD screens, and I don't give a damn about 4K textures. I suspect that a big number of users don't have the hardware to run modern games fluently at 4K anyway.
"There is a limited amount of time, money, and effort that will be spent on any project. Successful enterprises focus those limited resources on the things that matter most to their customers. In this case, disk usage in the ~150gb range did not matter much in comparison to the other parts of the game, such as in-game content, missions, gameplay, etc."
We know this, because the game had a very successful release, despite taking 150gb to install.
I'm not saying they should have filled that 100 extra gb with mission content - I'm implying they made the right call in focusing their engineering manpower on creating content for the game (the ACTUAL gameplay) and not on optimizing storage usage for assets. That decision gave them a popular game which eventually had the resources to go optimize storage use.
I mean... A few years ago, 1TB SSDs were still the best buy, and many people still haven't upgraded; wasting 15% of your total storage on just one game is still a pain for many.
I'm already disillusioned and basically done with these devs anyways. They've consistently gone the wrong direction over time. The game's golden age is far behind us, as far as I'm concerned.
And this being primarily a live-service game drawing revenues from micro-transactions, especially a while after launch, and the fact that base console drives are still quite small to encourage an upgrade (does this change apply to consoles too?), there’s probably quite an incentive to make it easy for users to keep the game installed.
You should look at COD install sizes and the almost weekly, ridiculously huge "updates". 150GB for a first install is almost generous considering most AAA games.