
Show HN: Fixing a single pointer bug unlocked 1M+ row JSON parsing on Windows

I've been building a cross-platform JSONL viewer app that handles multi-GB files. It worked perfectly on macOS (my development machine), but consistently crashed on Windows at exactly 2,650 KB. Here's the debugging journey and the tiny fix that made all the difference.

The Problem

- macOS: handles 5GB+ files effortlessly
- Windows: crashes at exactly 2,650 KB, every time
- Same codebase, cross-compiled from Apple Silicon to Windows using MinGW

The Investigation

I added detailed logging to trace execution. The crash happened during string interning, after ~6,000 rows had already parsed successfully: not during parsing, not during file I/O, but during the merge phase.

The Root Cause

My StringPool class used std::unordered_map<std::string_view, uint32_t> to deduplicate strings. The string_views pointed into a std::vector<std::string>.

When the vector grew and reallocated, all the string_view keys became dangling pointers. The hash map was full of invalid references.

Why did it work on macOS? Different memory allocator behavior, different default stack sizes (8MB vs 1MB), different reallocation patterns.
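
The failure mode is easy to reproduce in isolation. Here's a minimal stand-alone sketch (illustrative, not the app's code) of how a growing vector invalidates string_view keys:

    #include <iostream>
    #include <string>
    #include <string_view>
    #include <unordered_map>
    #include <vector>

    // Stand-alone illustration: views into a growing vector dangle.
    int main() {
        std::vector<std::string> strings;
        std::unordered_map<std::string_view, size_t> indices;

        for (size_t i = 0; i < 1000; i++) {
            strings.push_back("key_" + std::to_string(i));  // may reallocate...
            indices[std::string_view(strings.back())] = i;  // ...so older keys now dangle
        }

        // Touching a dangling view is undefined behavior: it can happen to
        // "work" with one platform's allocator and crash with another's.
        std::cout << indices.count("key_0") << "\n";
    }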

The Fix

Before (broken):

    uint32_t intern(std::string_view str) {
        auto it = indices_.find(str);
        if (it != indices_.end()) return it->second;

        uint32_t idx = strings_.size();
        strings_.push_back(std::string(str));               // push_back may reallocate strings_,
        indices_[std::string_view(strings_.back())] = idx;  // leaving every existing key dangling. DANGER!
        return idx;
    }
After (fixed):

    uint32_t intern(const std::string& str) {
        auto it = indices_.find(std::string_view(str));
        if (it != indices_.end()) return it->second;

        // Grow before inserting, so the push_back below won't invalidate any
        // views already stored in the map (on the very first insert the map
        // is still empty, so a reallocation there is harmless).
        if (strings_.size() >= strings_.capacity()) {
            strings_.reserve(strings_.capacity() * 2);
            rebuildIndices();  // re-point every string_view at the new buffer
        }

        uint32_t idx = strings_.size();
        strings_.push_back(str);
        indices_[std::string_view(strings_.back())] = idx;
        return idx;
    }

    void rebuildIndices() {
        indices_.clear();
        for (size_t i = 0; i < strings_.size(); i++) {
            indices_[std::string_view(strings_[i])] = i;
        }
    }
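
For anyone who wants to poke at it, here's a self-contained sketch of the pool plus a quick assertion harness. The class wrapper, the zero-capacity guard, and the explicit casts are illustrative additions, not part of the app's code:

    #include <cassert>
    #include <cstdint>
    #include <string>
    #include <string_view>
    #include <unordered_map>
    #include <vector>

    // Illustrative, self-contained version of the pool above.
    class StringPool {
    public:
        uint32_t intern(const std::string& str) {
            auto it = indices_.find(std::string_view(str));
            if (it != indices_.end()) return it->second;

            if (strings_.size() >= strings_.capacity()) {
                // Guard the first insert, where capacity * 2 would still be 0.
                strings_.reserve(strings_.empty() ? 16 : strings_.capacity() * 2);
                rebuildIndices();
            }

            uint32_t idx = static_cast<uint32_t>(strings_.size());
            strings_.push_back(str);  // within capacity: no reallocation
            indices_[std::string_view(strings_.back())] = idx;
            return idx;
        }

    private:
        void rebuildIndices() {
            indices_.clear();
            for (size_t i = 0; i < strings_.size(); i++)
                indices_[std::string_view(strings_[i])] = static_cast<uint32_t>(i);
        }

        std::vector<std::string> strings_;
        std::unordered_map<std::string_view, uint32_t> indices_;
    };

    int main() {
        StringPool pool;
        // Enough unique strings to force many reallocations of strings_.
        for (uint32_t i = 0; i < 100000; i++)
            assert(pool.intern("key_" + std::to_string(i)) == i);
        // Re-interning returns the original index: the views stayed valid.
        assert(pool.intern("key_0") == 0);
    }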
The Result

- 1 million rows: 6 seconds on Windows
- Multi-GB files: no crashes
- ~166,000 rows/second throughput
- Cross-platform stability

Lessons Learned

1. std::string_view is powerful but dangerous - It's a non-owning reference. When the underlying storage moves, you're holding garbage.

2. Cross-platform testing is essential - The bug was invisible on macOS due to different allocator behavior and larger default stack sizes.

3. Structured logging beats debuggers for cross-compilation - I was cross-compiling from Mac to Windows, so stepping through in a debugger wasn't practical. Adding timestamped logging to a file made the crash point obvious immediately (a minimal sketch follows this list).

4. Small changes, huge impact - One function, ~15 lines of code, turned "crashes at 2MB" into "handles 5GB+ files".

5. Performance stayed excellent - The rebuild only runs when the vector reallocates, and with doubling growth the rebuild costs sum to roughly 2n over n inserts, so the amortized cost per string is O(1).
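
On point 3, the pattern needs nothing fancy; here's a minimal sketch (hypothetical helper, not the app's actual logger):

    #include <chrono>
    #include <fstream>
    #include <string>

    // Hypothetical helper: append a timestamped line and flush immediately,
    // so the last line written survives the crash and pinpoints it.
    void logLine(const std::string& msg) {
        static std::ofstream log("debug.log", std::ios::app);
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
            std::chrono::system_clock::now().time_since_epoch()).count();
        log << ms << "ms " << msg << std::endl;  // std::endl forces the flush
    }

Drop calls like this at phase boundaries (read, parse, merge) and the last line in the file names the phase that died.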

The Tech Stack

- simdjson (v4.2.2) for parsing
- Multi-threaded parsing (20 threads on my test machine)
- Columnar storage for memory efficiency
- C++17, cross-compiled with MinGW-w64
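
For anyone who hasn't used simdjson, the core JSONL loop looks roughly like this (a minimal single-threaded sketch; the real app shards work across threads, and the file name is illustrative):

    #include <iostream>
    #include "simdjson.h"

    // Minimal sketch; the app's actual pipeline is multi-threaded.
    int main() {
        simdjson::ondemand::parser parser;
        // padded_string::load reads the whole file with simdjson's required padding.
        simdjson::padded_string json = simdjson::padded_string::load("data.jsonl");

        // iterate_many walks the buffer as a stream of newline-delimited documents.
        simdjson::ondemand::document_stream docs = parser.iterate_many(json);

        size_t rows = 0;
        for (auto doc : docs) {
            (void)doc;  // extract fields into columnar storage here
            ++rows;
        }
        std::cout << "parsed " << rows << " rows\n";
    }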

This was a humbling reminder that the most critical bugs are often the simplest ones, hiding in plain sight behind platform differences.

Happy to discuss the implementation details, simdjson usage, or cross-platform C++ debugging techniques!
