Titan (npmjs.com)
For example, https://github.com/SaynaAI/sayna has been mostly Claude Code, plus me reviewing the output, plus some small manual touches where needed. For the most part, I have found that Claude Code writes far more stable Rust code than JS.
It would be easier and safer to hand the JS code to a translator, have it translated into Rust, and then continue AI development in Rust, than to invest time in an automated JS-to-Rust compiler. IMHO!
However, I don't see it that way at all. I find Claude much more capable of writing large chunks of Python or React/JS frontend code than writing F#, a very statically type-checked language. It's fine, but a lot more hand-holding is needed and a lot more tar pits get visited.
If anything, it seems to be a popularity contest of which language appears most in the training data. If AI assistance is the goal, everyone should write Python and JavaScript.
My experience also points toward compiled languages that give immediate feedback at build time. It's nearly impossible to stop any AI agent from using 'as any' or 'as unknown as X' casts in TypeScript - LLMs will "fix" problems by sweeping them under the rug. The larger the codebase, the more review and supervision is required. A TS codebase rots much faster than Rust/C#/Swift etc.
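To make that concrete, here is a minimal TypeScript sketch of the kind of cast being described; the `User` type and `parseUser` helper are invented for illustration:

```typescript
// Hypothetical illustration of the pattern described above:
// the compiler error is silenced with a cast rather than fixed.
interface User {
  id: number;
  name: string;
}

function parseUser(raw: string): User {
  // The honest fix would be to treat the parsed value as `unknown`
  // and validate its shape at runtime before returning it.
  //
  // The "sweep it under the rug" version an agent tends to produce:
  // the double cast makes any shape pass the type checker.
  return JSON.parse(raw) as unknown as User;
}

const user = parseUser('{"id": 1}');  // compiles cleanly, but `name` is absent
console.log(user.name.toUpperCase()); // runtime TypeError, never a compile error
```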
It's not clear to me how this would have better performance than plain Node.js, which is a C++ binary using the V8 JS engine.
It looks like you're handling routing in Rust, but this seems unlikely to move the needle measurably. In fact, it could be hurting you - you're basically betting that the Rust program (route request + invoke JS interpreter + marshal data) is faster than the much simpler JavaScript program (route request). That doesn't seem likely.
There's a JS-to-Rust transpiler? How? If this is true, this is the most impressive part. The web server/framework is almost irrelevant.
The AI-generated documentation is very confusing.
With this approach, you might be able to do some multithreading to improve throughput.
However, each request is almost guaranteed to be slower, because V8 will be faster than Boa.
You could also achieve this by spinning up multiple Node.js instances and putting an nginx server in front to do load balancing - which is pretty standard practice.
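For reference, a minimal sketch of that multi-process idea using Node's built-in cluster module rather than an external balancer like nginx; the port and handler body are placeholders:

```typescript
// One worker per core; the primary process distributes incoming
// connections across workers (round-robin on most platforms).
import cluster from "node:cluster";
import { createServer } from "node:http";
import { availableParallelism } from "node:os";

if (cluster.isPrimary) {
  for (let i = 0; i < availableParallelism(); i++) {
    cluster.fork();
  }
  // Restart workers that die so the pool stays at full size.
  cluster.on("exit", (worker) => {
    console.log(`worker ${worker.process.pid} exited, forking a replacement`);
    cluster.fork();
  });
} else {
  createServer((_req, res) => {
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end(`handled by pid ${process.pid}\n`);
  }).listen(3000);
}
```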
How does it compare in terms of HW resources?
The worst part of a setup like this is deployment. There are just a lot of little moving pieces - for example, nginx needs to keep track of which frontend servers are up and which are down. How are you doing load balancing? Do you want websocket connections? That makes it more complex. How do you deploy code? Etc. It's great, but it's not at all simple. Configuring nginx feels like it's a little puzzle all of its own.
“Without a clear indicator of the author’s intent, any parodic or sarcastic expression of extreme views can be mistaken by some readers for a sincere expression of those views.”
[1] https://dev.to/denyspotapov/porting-is-odd-npm-to-ocaml-usin...
Rust can go that fast because of guarantees its compiler enforces about what the code is doing, guarantees that JS emphatically doesn't provide.
By all means build your tooling and runtime in Rust if it helps, but "you can write high-performance code in JS with no Rust-like constraints" is fundamentally a nonsense pitch.
But Boa is very, very slow compared to JIT-compiled JavaScript. As soon as your business logic starts trying to stand up and walk, I think you'll start hitting request latency sadness.
Second, how does concurrency (like promises) translate to Rust?
Murderous, wretched and evil Rust proponents will censor, downplay and distract from this post. These murderous Rust proponents already know that they deserve worse than hell.
As for the code and DX: it's not good practice to expose everything through globals; this is something the JS world moved away from long ago. It quickly turns your code into a mess that is hard to debug (a module-based alternative is sketched after the snippet below).
```js
const body = JSON.stringify({ model: "gpt-4.1-mini", messages: [{ role: "user", content: "hii" }] });
const r = t.fetch("https://api.openai.com/v1/chat/completions", { method: "POST", headers: { "Content-Type": "application/json", "Authorization": `Bearer ${API_KEY}` }, body });
const json = JSON.parse(r.body);
```
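To illustrate the globals point, here is a hypothetical sketch of the same snippet with the runtime API exposed through an explicit import instead of a global `t`. The module name `@titan/runtime` is invented for illustration, and `API_KEY` is assumed to be supplied by the environment, as in the original snippet:

```typescript
// Hypothetical: `@titan/runtime` is a made-up module name; the real
// project appears to inject a global `t` instead.
import { fetch as titanFetch } from "@titan/runtime";

// Assumed to come from configuration, as in the snippet above.
declare const API_KEY: string;

const body = JSON.stringify({
  model: "gpt-4.1-mini",
  messages: [{ role: "user", content: "hii" }],
});

// An explicit import lets tooling (and readers) trace where the API
// comes from, which is the debuggability point being made above.
const r = titanFetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${API_KEY}`,
  },
  body,
});

const json = JSON.parse(r.body);
console.log(json);
```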
.. no async? I wonder how they are doing this, and how they plan on more JS interop.
```sh
bun build ./cli.ts --compile --outfile mycli
```
So the value proposition here is debatable.