Ask HN: What is the current state of the art in BIG (>5TB) cloud backups?
Mood: thoughtful
Sentiment: neutral
Category: tech
Key topics: cloud backup, data storage, large data
The post asks about the current state of the art for backing up large (>5TB) datasets to the cloud; a small discussion suggests an offsite NAS, Glacier-style object storage, and tools such as rclone, borg, and restic.
Snapshot generated from the HN discussion
Discussion Activity
Light discussion
First comment: 1h after posting
Peak period: 4 comments in Hour 2
Avg per period: 2.3
Based on 7 loaded comments
Key moments
- Story posted: 11/19/2025, 4:12:30 PM (3h ago)
- First comment: 11/19/2025, 5:27:54 PM (1h after posting)
- Peak activity: 4 comments in Hour 2 (hottest window of the conversation)
- Latest activity: 11/19/2025, 7:18:12 PM (11m ago)
How quickly do you need to be able to restore? Is it commercial or homelab?
The most cost-effective option by far would be to put a NAS device someplace offsite. You could use Tailscale to connect to it remotely.
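A minimal sketch of that approach, assuming the NAS has joined your tailnet under the hypothetical machine name "nas" and accepts SSH:

    # On the NAS (one-time): join the tailnet
    tailscale up

    # From the machine holding the data: push it over the tailnet via rsync.
    # "nas" and both paths are hypothetical; adjust to your setup.
    rsync -av --delete /data/ backup@nas:/volume1/backups/data/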
After that, depending on your access patterns, either a Glacier-style S3 service (AWS, Backblaze, etc.), or a rented bare-metal server with big disks someplace inexpensive.
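For the Glacier-style route, a hedged sketch with the AWS CLI; the bucket name is hypothetical, and keep in mind that Deep Archive restores must be requested ahead of time and can take hours:

    # Upload straight into the cheapest archival tier.
    # "my-backups" is a hypothetical bucket name.
    aws s3 cp backup-2025-11-19.tar s3://my-backups/ --storage-class DEEP_ARCHIVE

    # Restores are asynchronous: request one, then wait (Bulk can take up to ~48h).
    aws s3api restore-object --bucket my-backups --key backup-2025-11-19.tar \
        --restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Bulk"}}'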
Or to put it another way, why is state of the art important?
You can probably get away with Google Drive + rclone + borg/restic/whatever, but it will be rather clunky. Backblaze might be a nicer backend to use.
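A minimal restic sketch against Backblaze B2, assuming a hypothetical bucket "hn-backups" and placeholder credentials exported in the environment:

    # Credentials for restic's native B2 backend (placeholder values).
    export B2_ACCOUNT_ID="..."
    export B2_ACCOUNT_KEY="..."
    export RESTIC_PASSWORD="..."

    # One-time: create the encrypted repository.
    restic -r b2:hn-backups:restic init

    # Incremental, deduplicated backup of the dataset.
    restic -r b2:hn-backups:restic backup /data

    # Expire old snapshots to keep storage spend in check.
    restic -r b2:hn-backups:restic forget --keep-daily 7 --keep-weekly 8 --prune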
I use rsync.net with borg, but not sure about your budget. Their 1TB lifetime plan is very competitive though.
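For the borg-over-SSH route, a hedged sketch; the account and hostname below are hypothetical stand-ins for whatever rsync.net assigns you:

    # One-time: create an encrypted repo on the remote host.
    borg init --encryption=repokey ssh://user@usw-s001.rsync.net/./backups

    # Timestamped, compressed archive of the dataset.
    borg create --compression zstd \
        ssh://user@usw-s001.rsync.net/./backups::data-{now} /data

    # Thin out old archives.
    borg prune --keep-daily=7 --keep-weekly=8 \
        ssh://user@usw-s001.rsync.net/./backups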