HN: AI File Sorter 1.3 – Add your own Local LLM for offline file organization
Mood: informative
Sentiment: positive
Category: startup_launch
Key topics: AI, File Organization, Local LLM, Offline Tools
Discussion activity: light discussion
First comment: N/A
Peak period: 1 comment in Hour 1
Avg per period: 1 (based on 1 loaded comment)
Key moments
- Story posted: Nov 23, 2025 at 7:44 AM EST (19h ago)
- First comment: Nov 23, 2025 at 7:44 AM EST (0s after posting)
- Peak activity: 1 comment in Hour 1 (hottest window of the conversation)
- Latest activity: Nov 23, 2025 at 7:44 AM EST (19h ago)
What’s new in v1.3:
- Add your own local LLM (custom .gguf models in the Select LLM dialog).
- Two categorization modes: More Refined vs. More Consistent.
- Optional whitelists to limit the allowed category names.
- Multilingual categorization & UI languages (Dutch, French, German, Italian, Polish, Portuguese, Spanish, Turkish).
- Works with LLaMA-based models (llama.cpp / GGML / GGUF) and falls back to a remote model only if you explicitly enable it (see the sketch after this list).
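To make the local-LLM flow above concrete, here is a minimal illustrative sketch of whitelist-constrained categorization against a local .gguf model using the llama-cpp-python bindings. It is not AI File Sorter's actual implementation; the model path, category list, prompt wording, and the categorize helper are assumptions for the example.

```python
# Illustrative sketch only -- not AI File Sorter's code.
# Assumes llama-cpp-python is installed and a .gguf model exists at the given path.
from llama_cpp import Llama

# Hypothetical whitelist of allowed category names, mirroring the
# "Optional whitelists" feature described above.
ALLOWED = ["Documents", "Images", "Music", "Code", "Archives", "Other"]

llm = Llama(
    model_path="models/llama-3.2-3b-instruct.Q4_K_M.gguf",  # assumed local model path
    n_ctx=2048,
    verbose=False,
)

def categorize(filename: str) -> str:
    """Ask the local model for one category, constrained to the whitelist."""
    prompt = (
        "Assign the file below to exactly one category from this list: "
        + ", ".join(ALLOWED)
        + f".\nFile: {filename}\nCategory:"
    )
    out = llm(prompt, max_tokens=8, temperature=0.0, stop=["\n"])
    guess = out["choices"][0]["text"].strip()
    # Fall back to a safe bucket if the model strays outside the whitelist.
    return guess if guess in ALLOWED else "Other"

print(categorize("holiday_photos_2024.zip"))
```

Because everything runs against a local model file, a flow like this never needs a network call, which is the point of the offline mode; a remote model would only come into play if explicitly enabled.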