Xortran – a PDP-11 Neural Network with Backpropagation in Fortran IV
Key topics
Neural Networks
Retrocomputing
Fortran
The post showcases a PDP-11 neural network implementation in Fortran IV, sparking nostalgia and appreciation for the simplicity of old tech among commenters.
Snapshot generated from the HN discussion
Discussion Activity
Light discussion (based on 11 loaded comments)
- First comment: 38m after posting
- Peak period: 3 comments in the 0-2h window
- Average per period: 1.8 comments
Key moments
1. Story posted: Nov 11, 2025 at 3:10 PM EST
2. First comment: Nov 11, 2025 at 3:48 PM EST (38m after posting)
3. Peak activity: 3 comments in the 0-2h window (hottest stretch of the conversation)
4. Latest activity: Nov 12, 2025 at 10:11 PM EST
ID: 45892174 · Type: story · Last synced: 11/20/2025, 12:41:39 PM
There was no backpropagation back then; it only appeared around 1986 with Rumelhart, probably running on VAX machines by that time.
The 11/34 was hardly a powerhouse (roughly comparable to a turbo XT), but it was sturdy, could handle sustained load, and its FPU made all the difference.
Regarding floating point: I realized the code actually works fine without an FPU, so I assume it uses soft-float. There's no switch to enable the FP11 opcodes; maybe that was in their F77 compiler.
It's indeed rough and spartan, but using a 64KB optimizing compiler that needs just 8KB of memory was a refreshing change for me.
Do you have some reading on this? I've used that compiler, but I never read the resulting assembly language.
Nice to see this. It’s a good way to learn the basics without getting bogged down with modern complexities.