A Global Mining Dataset
Source: tech.marksblogg.com · Posted 3 months ago · Active 3 months ago
Key topics
Mining
Data
Sustainability
The release of a comprehensive global mining dataset has sparked interest in the HN community, with discussions around its potential applications and limitations.
Snapshot generated from the HN discussion
Discussion Activity
- Activity level: light discussion
- First comment: 3 days after posting
- Peak period: 5 comments in the 66-72h window
- Average comments per period: 3.5
Key moments
- 01 Story posted: Oct 6, 2025 at 6:48 AM EDT (3 months ago)
- 02 First comment: Oct 8, 2025 at 11:17 PM EDT (3 days after posting)
- 03 Peak activity: 5 comments in the 66-72h window, the hottest stretch of the conversation
- 04 Latest activity: Oct 9, 2025 at 4:36 AM EDT (3 months ago)
ID: 45489925 · Type: story · Last synced: 11/20/2025, 3:10:53 PM
The dataset is an 8,000-row spreadsheet.
My advice when working with such a small dataset is not to overthink it.
8,000 rows is small, but the typical processing isn't fast, so optimizing it has limited ROI. I use a custom Python library I wrote for this kind of work. It makes things a bit slow, but you constantly run across new kinds of inexplicable geometry issues, so the ability to rapidly write custom routines is paramount, and that is where Python excels.
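As a hypothetical illustration of the kind of quick custom routine the comment describes (not the author's actual library), here is a small stdlib-only check that flags degenerate polygon rings — too few points or effectively zero area — before heavier GIS processing. All names and the sample data are invented for the sketch.

```python
def shoelace_area(ring):
    """Signed area of a polygon ring given as [(x, y), ...] via the shoelace formula."""
    area = 0.0
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return area / 2.0

def find_bad_rings(rings, min_points=3, eps=1e-12):
    """Return indices of rings that are degenerate: too short or near-zero area."""
    bad = []
    for i, ring in enumerate(rings):
        if len(ring) < min_points or abs(shoelace_area(ring)) < eps:
            bad.append(i)
    return bad

rings = [
    [(0, 0), (4, 0), (4, 3), (0, 3)],  # valid rectangle, area 12
    [(0, 0), (1, 1)],                  # too few points to close a ring
    [(0, 0), (2, 2), (4, 4)],          # collinear points, zero area
]
print(find_bad_rings(rings))  # -> [1, 2]
```

Writing a throwaway check like this takes minutes in Python, which is the flexibility the comment is pointing at.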
GIS data is computationally expensive to process, even beyond what its obvious properties would suggest.
The Parquet pattern I'm promoting makes working across a wide variety of datasets much easier. Not every dataset is huge, but having it in Parquet makes it much easier to analyse with a wide range of tooling.
In the web world, you might only have a handful of datasets that your systems produce, so you can pick the formats and schemas ahead of time. In the GIS world, you are forever sourcing new datasets from strangers: GDAL supports 80+ vector GIS formats. Getting more people to publish to Parquet first removes a lot of ETL work for everyone downstream.
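As a sketch of that publish-to-Parquet step: GDAL's `ogr2ogr` tool can convert any of its supported vector formats to (Geo)Parquet in a single command, assuming GDAL 3.5+ built with the Parquet driver. The file names here are placeholders, not files from the dataset.

```shell
# Convert a shapefile (or any GDAL-readable vector format) to GeoParquet.
# Requires a GDAL build that includes the Parquet driver.
ogr2ogr -f Parquet mines.parquet mines.shp
```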
It was nice seeing how these stats can be calculated in SQL, but this analysis would be beaten by a few pivot tables in Excel.
Excel can even draw a map to go along with it (although not as pretty).
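For readers without Excel handy, the same pivot-table-style summary can be reproduced with the stdlib `sqlite3` module. The table and column names below are invented mock data, not the actual dataset's schema.

```python
import sqlite3

# In-memory database with a mock mining table (schema is illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE mines (
        country   TEXT,
        commodity TEXT,
        output_t  REAL
    )
""")
conn.executemany(
    "INSERT INTO mines VALUES (?, ?, ?)",
    [
        ("Australia", "Iron",   900.0),
        ("Australia", "Gold",    12.5),
        ("Chile",     "Copper", 560.0),
        ("Chile",     "Copper", 310.0),
    ],
)

# Pivot-table equivalent: rows = country, values = mine count and total output.
rows = conn.execute("""
    SELECT country, COUNT(*) AS n_mines, SUM(output_t) AS total_t
    FROM mines
    GROUP BY country
    ORDER BY country
""").fetchall()
print(rows)  # -> [('Australia', 2, 912.5), ('Chile', 2, 870.0)]
```

A GROUP BY over one column is exactly what a basic pivot table does, which is why either tool handles an 8,000-row dataset comfortably.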