Ask HN: My friend says he has an AI optimized language
My impression is that having the model write SQL queries is not a typical baseline for RAG. It's unlikely that a company would see a substantial benefit from the claimed reduction, because the cost of their internal LLM usage is probably dominated by tokens that are neither RAG queries nor tabular responses. They also probably can't fall back to a cheaper model, because the "minimum acceptable level of intelligence" of the model is likely set by harder tasks than generating correct RAG queries.
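For reference, the baseline I have in mind is plain vector retrieval: embed the question, pull the top-k chunks by similarity, and paste them into the prompt, with no SQL generation anywhere. Rough sketch below; the embed() stub stands in for a real embedding model, and none of the names or data come from the original post.

    import numpy as np

    def embed(text: str, dim: int = 256) -> np.ndarray:
        # Placeholder for a real embedding model; a hashed bag-of-words
        # is enough to make the sketch runnable.
        v = np.zeros(dim)
        for token in text.lower().split():
            v[hash(token) % dim] += 1.0
        n = np.linalg.norm(v)
        return v / n if n else v

    # Toy corpus of documentation chunks.
    chunks = [
        "Invoices are stored in the billing table, keyed by customer_id.",
        "Refunds must be approved by a manager within 30 days.",
        "The API rate limit is 100 requests per minute per key.",
    ]
    chunk_vecs = np.stack([embed(c) for c in chunks])

    def retrieve(question: str, k: int = 2) -> list[str]:
        # Standard RAG retrieval: cosine similarity against the chunk index,
        # then the top-k chunks go into the prompt as context.
        scores = chunk_vecs @ embed(question)
        return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

    context = retrieve("What is the API rate limit?")
    prompt = "Context:\n" + "\n".join(context) + "\n\nQuestion: What is the API rate limit?"
    print(prompt)  # this prompt goes to the LLM; no SQL is generated anywhere

That's the setup most of the "token savings" claims would have to beat, and the retrieval step itself costs no LLM tokens at all.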
I don't think this is a reasonable thing to make a business around, because the claimed benefits mostly do not exist.
Also, didn't you post nine months ago that you were building this yourself? https://news.ycombinator.com/item?id=42169086