Asynchronous LLM Computations Specifications with LLM::Graph
Posted 3 months ago · Active 3 months ago
rakuforprediction.wordpress.com · Tech story
Key topics
LLM
Graph Theory
Asynchronous Computations
The post introduces LLM::Graph, a specification for asynchronous LLM computations, and discusses its potential applications and benefits, with commenters exploring its implications and potential use cases.
Snapshot generated from the HN discussion
Discussion Activity
- Light discussion
- First comment: N/A
- Peak period: 5 comments in 0-2h
- Avg / period: 2.7
Key moments
1. Story posted: Sep 28, 2025 at 11:29 AM EDT (3 months ago)
2. First comment: Sep 28, 2025 at 11:29 AM EDT (0s after posting)
3. Peak activity: 5 comments in 0-2h (hottest window of the conversation)
4. Latest activity: Sep 29, 2025 at 6:08 AM EDT (3 months ago)
ID: 45405068 · Type: story · Last synced: 11/17/2025, 12:04:30 PM
I personally don’t see what advantages Python as a language (not an ecosystem) would have here.
But antononcube has thicker skin than me, so...
For me, it is a mystery how programmers "decide" as a group that a new tool or language is better than the established and familiar ones. But I would love to see more folks open to trying new tools, like Raku, that could make their lives easier and more fun.
Another point, which I could have mentioned in my previous response: Raku has a more elegant and easier-to-use framework for asynchronous computations.
IMO, Python's introspection matches that of Raku.
Some argue that Python's LLM packages are more numerous and better than Raku's. I agree on the "more" part; I am not sure about the "better" part:
- Generally speaking, different people prefer decomposing computations in different ways.
- When, a few years ago, I re-implemented Raku's LLM packages in Python, Python did not have equally convenient packages.
WL's LLMGraph is more developed and productized, but Raku's "LLM::Graph" is catching up.
I would like to say that "LLM::Graph" was relatively easy to program because of Raku's introspection, wrappers, asynchronous features, and pre-existing LLM functionality packages. As a consequence, the code of "LLM::Graph" is short.
Wolfram Language does not have that level of introspection, but it is otherwise likely the better choice, mostly for its far greater scope of functionality (mathematics, graphics, computable data, etc.).
In principle, a corresponding Python "LLMGraph" package could be developed for comparison purposes. Then the "better choice" question could be answered in a more informed manner. (The Raku packages "LLM::Functions" and "LLM::Prompts" already have corresponding Python packages.)
"LLM::Graph" uses a graph structure to manage dependencies between tasks, where each node represents a computation and edges dictate the flow. Asynchronous behavior is a default feature, with specific options available for control.
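The idea above (nodes are computations, edges dictate the flow, asynchronous by default) can be sketched in Python with `asyncio`. This is an illustrative dependency-graph runner, not the actual "LLM::Graph" API; the node names and the `fake_llm` stand-in are invented for the example.

```python
import asyncio

async def run_graph(nodes, deps):
    """Run each node's coroutine as soon as its dependencies finish.

    nodes: {name: callable taking a dict of dependency results, returning a coroutine}
    deps:  {name: list of dependency node names}
    """
    results = {}
    done = {name: asyncio.Event() for name in nodes}

    async def run_node(name):
        # Wait for every upstream node to complete (edges dictate the flow).
        for d in deps.get(name, []):
            await done[d].wait()
        inputs = {d: results[d] for d in deps.get(name, [])}
        results[name] = await nodes[name](inputs)
        done[name].set()

    # All nodes are scheduled concurrently; only the edges impose ordering.
    await asyncio.gather(*(run_node(n) for n in nodes))
    return results

# Example: two independent "LLM calls" feeding a final summarizer node.
async def fake_llm(prompt):
    await asyncio.sleep(0.01)  # stand-in for a real LLM request
    return f"answer({prompt})"

nodes = {
    "facts":  lambda _: fake_llm("list facts"),
    "style":  lambda _: fake_llm("pick a style"),
    "report": lambda r: fake_llm(f"report from {r['facts']} in {r['style']}"),
}
deps = {"report": ["facts", "style"]}

results = asyncio.run(run_graph(nodes, deps))
print(results["report"])
```

Here "facts" and "style" run concurrently, while "report" starts only after both finish; making the graph the unit of specification is what lets the runner, rather than the caller, decide what can overlap.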