Analog Optical Computer for AI Inference and Combinatorial Optimization
Posted 4 months ago · Active 4 months ago
nature.com · Tech story
Sentiment: skeptical / mixed
Debate: 80/100
Key topics
Optical Computing
AI Inference
Analog Computing
A new analog optical computer design for AI inference and combinatorial optimization is presented, sparking debate about its potential to revolutionize computing and how it compares to traditional digital architectures.
Snapshot generated from the HN discussion
Discussion Activity
Moderate engagement
First comment: 3d after posting
Peak period: 9 comments in 90-96h
Avg / period: 5
Comment distribution: 20 data points
Based on 20 loaded comments
Key moments
- 01 Story posted: Sep 4, 2025 at 1:06 PM EDT (4 months ago)
- 02 First comment: Sep 7, 2025 at 8:04 PM EDT (3d after posting)
- 03 Peak activity: 9 comments in 90-96h (hottest window of the conversation)
- 04 Latest activity: Sep 8, 2025 at 1:54 PM EDT (4 months ago)
ID: 45129507 · Type: story · Last synced: 11/20/2025, 2:27:16 PM
Electronic signaling is just so marvelously easy to scale that the right path was clear pretty much from day one. We don't have that path for other operating principles right now. As for synchronous operation, binary signaling, and so forth, they're once again just scaling tools that let us crank out designs with billions of transistors without hand-crafting every piece or making the abstractions more leaky than they already are.
The other difference is that computers are now powerful enough to do the 10^20 calculations required to design efficient optical metamaterials for optical inference.
What part of physics do you have in mind?
https://en.m.wikipedia.org/wiki/Quantum_tunnelling
> The current hardware includes 16 microLEDs and 16 photodetectors, supporting a 16-variable state vector…
I'm not really seeing any ASICs that are built for running optimizers. Meanwhile the "AOC" is really good at solving unconstrained/equality-constrained quadratic programs.
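For intuition on that claim: the unconstrained QP the commenter mentions, minimize ½xᵀQx + cᵀx, can be driven to its minimizer by the same kind of fixed-point feedback update an analog loop implements in continuous time. Below is a minimal digital sketch of that idea, sized at 16 variables to match the hardware quoted above; Q, c, the step size, and the iteration count are illustrative assumptions, not the paper's actual dynamics.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative 16-variable unconstrained QP: minimize 0.5*x^T Q x + c^T x
    # (16 matches the quoted hardware size; Q and c are random placeholders).
    n = 16
    A = rng.standard_normal((n, n))
    Q = A @ A.T + n * np.eye(n)        # symmetric positive definite
    c = rng.standard_normal(n)

    # Emulate the analog feedback loop as discrete gradient flow:
    # x <- x - eta * (Q x + c), which converges to the minimizer -Q^{-1} c.
    x = np.zeros(n)
    eta = 1.0 / np.linalg.norm(Q, 2)   # step size at 1/L, well inside the stable range
    for _ in range(2000):
        x -= eta * (Q @ x + c)

    print(np.allclose(x, -np.linalg.solve(Q, c), atol=1e-6))  # True

A real analog loop runs this update in continuous time with physical noise and finite precision; the sketch ignores both and only shows the shape of the computation.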
Show me a way to keep the network entirely in optics, all the way through - including native positive and negative weights, weights that can actually be larger than 1, and activation functions - without any digital conversion and re-emission.
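One standard trick (not specific to this paper) for the signed, larger-than-1 weight part of that challenge is push-pull encoding: split the weight matrix into two non-negative matrices plus a scalar gain, run two intensity-only passes, and subtract the results. A minimal sketch of that decomposition follows; the function name, the [0, 1] intensity range, and the 16x16 size are assumptions for illustration, and the all-optical activation function the comment asks for is not addressed here.

    import numpy as np

    def encode_pushpull(W):
        """Split a signed weight matrix into two non-negative matrices plus a
        scalar gain so every entry fits an intensity in [0, 1].
        Reconstruction: W == gain * (W_pos - W_neg)."""
        gain = np.max(np.abs(W)) or 1.0          # avoid divide-by-zero for an all-zero W
        W_pos = np.clip(W, 0, None) / gain
        W_neg = np.clip(-W, 0, None) / gain
        return W_pos, W_neg, gain

    rng = np.random.default_rng(1)
    W = rng.uniform(-3, 3, size=(16, 16))        # signed weights, |w| > 1 allowed
    x = rng.uniform(0, 1, size=16)               # non-negative "light intensity" input

    W_pos, W_neg, gain = encode_pushpull(W)
    # Two non-negative optical passes, subtracted (e.g. with a balanced
    # photodetector pair), then rescaled by the gain:
    y = gain * (W_pos @ x - W_neg @ x)

    print(np.allclose(y, W @ x))  # True

This only covers signed weights with magnitudes above one; doing the subtraction and the nonlinearity without leaving the optical domain is the harder part of the commenter's challenge.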