Transformation Learning: 50+ Continual Learning Experiments (98.3% MNIST, N=5)
Posted 2 months ago
github.com · Research · story
Tone: calm, positive
Debate: 0/100
Key topics
Continual Learning
Machine Learning
Neural Networks
The author shares their research on transformation learning, achieving 98.3% accuracy on MNIST with 5 tasks, and documents their experiments and insights on continual learning.
Snapshot generated from the HN discussion
Discussion Activity
- Light discussion
- First comment: N/A
- Peak period: Start (1 comment)
- Avg / period: 1
Key moments
1. Story posted: Nov 6, 2025 at 7:16 PM EST (2 months ago)
2. First comment: Nov 6, 2025 at 7:16 PM EST (0s after posting)
3. Peak activity: 1 comment in the Start window (hottest window of the conversation)
4. Latest activity: Nov 6, 2025 at 7:16 PM EST (2 months ago)
ID: 45842214 · Type: story · Last synced: 11/17/2025, 7:56:02 AM
Want the full context?
Jump to the original sources
Read the primary article or dive into the live Hacker News thread when you're ready.
Found that transformation learning scales: 100% on XOR/XNOR → 98.3% on MNIST (5 tasks). Key insight: transform features (128-D), not logits (5-D); the feature-level approach gains +16% accuracy.
Documented everything - successes and failures. All experiments verified and reproducible.
Curious about feedback, especially on:
1. Scaling to CIFAR-100 or beyond MNIST
2. More sophisticated routing without task labels
3. Theoretical connections to meta-learning
Happy to answer questions!
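
For readers who want to see the feature-vs-logit distinction concretely, here is a minimal PyTorch sketch. It is not the author's implementation: the class and argument names (TransformationLearner, feature_level, task_id) are hypothetical, and the dimensions simply follow the numbers quoted above (128-D features, 5-D logits, 5 tasks). The point it illustrates is that a per-task linear transform in feature space is a 128x128 map, while the same idea applied to the logits is only a 5x5 map with far less capacity.

    # Hypothetical sketch: per-task transforms at feature level vs. logit level.
    import torch
    import torch.nn as nn

    class TransformationLearner(nn.Module):
        def __init__(self, num_tasks, feature_dim=128, num_classes=5,
                     feature_level=True):
            super().__init__()
            self.feature_level = feature_level
            # Shared backbone: flatten 28x28 MNIST digits into 128-D features.
            self.backbone = nn.Sequential(
                nn.Flatten(), nn.Linear(28 * 28, feature_dim), nn.ReLU()
            )
            self.head = nn.Linear(feature_dim, num_classes)  # shared classifier
            dim = feature_dim if feature_level else num_classes
            # One lightweight linear transform per task, initialized near the
            # identity so a new task starts from the shared solution.
            self.transforms = nn.ModuleList(
                [nn.Linear(dim, dim) for _ in range(num_tasks)]
            )
            for t in self.transforms:
                nn.init.eye_(t.weight)
                nn.init.zeros_(t.bias)

        def forward(self, x, task_id):
            h = self.backbone(x)
            if self.feature_level:
                # 128x128 map: remaps the representation before classification.
                return self.head(self.transforms[task_id](h))
            # 5x5 map: can only mix the final logits.
            return self.transforms[task_id](self.head(h))

    # Route a batch through task 2's transform.
    model = TransformationLearner(num_tasks=5)
    logits = model(torch.randn(32, 1, 28, 28), task_id=2)

The identity initialization is an assumption on my part, but it makes the intuition visible: each task only has to learn a small deviation from the shared 128-D representation, whereas the logit-level variant can do no more than reweight five numbers.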