Runprompt
github.com
Functions require you to specify them on the command line every time they're invoked. I would prefer a tool like this to default to reading the functions from a hierarchy where it reads e.g. .llm-functions in the current folder, then ~/.config/llm-functions or something like that.
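Something like this lookup order is what I'd want, sketched in shell. To be clear, the `.llm-functions` name and the fallback path are my own suggestion, not anything the tool actually does:

```shell
# Hypothetical lookup: walk upward from the current directory for a
# project-level .llm-functions file, then fall back to the user config dir.
find_functions() {
  dir="$PWD"
  while [ "$dir" != "/" ]; do
    if [ -f "$dir/.llm-functions" ]; then
      printf '%s\n' "$dir/.llm-functions"
      return 0
    fi
    dir=$(dirname "$dir")
  done
  # user-level fallback, following the XDG convention
  fallback="${XDG_CONFIG_HOME:-$HOME/.config}/llm-functions"
  [ -f "$fallback" ] && printf '%s\n' "$fallback"
}
```

Walking upward also means a functions file at the repo root applies in every subdirectory, the same way `.gitignore` discovery works.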
In general I found myself baffled when trying to figure out where and how to configure things. That's probably me being impatient but I have found other tools to have more straightforward setup and less indirection.
Basically I like things to be less centralized, less magical, and less controlled by the tool.
Another thing, which is not the fault of llm at all, is that I find Python-based tools annoying to install. I have to remember the env where I set them up. Contrast with a Go application, which is generally a single file I can put in ~/bin. That's the reason I don't want to introduce a dep to runprompt if I can avoid it.
The final thing I found frustrating was the name 'llm', which makes it difficult to search for, since it's the generic name for the very thing it is.
It is an amazing piece of engineering and I am a huge fan of simonw's work, but I don't use llm much for these reasons.
Seems like it would be; just swap the OpenAI URL here or add a new one.
If you curl/wget a script, you still need to chmod +x it. Git doesn't have this issue, as it preserves the executable bit.
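A quick illustration of the difference; the local file here just stands in for whatever you'd actually fetch:

```shell
# A freshly written file, like one fetched with curl/wget, has no exec bit
# under a normal umask, so you can't run it directly until you chmod +x it.
printf '#!/bin/sh\necho hello\n' > fetched.sh   # stand-in for: curl -o fetched.sh <url>
[ -x fetched.sh ] || echo "not executable yet"
chmod +x fetched.sh
./fetched.sh   # now runs and prints: hello
```

A `git clone` skips the chmod step because git stores a 100755 mode for executable files and restores it on checkout.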
#!/usr/bin/env runprompt
---
.frontmatter...
---
The prompt.
Would be a lot nicer, as then you can just +x the prompt file itself.

  #!/bin/bash
  file="$1"
  # line 2 of the prompt file names the model, e.g. "# claude-sonnet-4-20250514"
  model=$(sed -n '2p' "$file" | sed 's/^# *//')
  # everything from line 3 onward is the prompt body
  prompt=$(tail -n +3 "$file")
  curl -s https://api.anthropic.com/v1/messages \
    -H "x-api-key: $ANTHROPIC_API_KEY" \
    -H "content-type: application/json" \
    -H "anthropic-version: 2023-06-01" \
    -d "{
      \"model\": \"$model\",
      \"max_tokens\": 1024,
      \"messages\": [{\"role\": \"user\", \"content\": $(echo "$prompt" | jq -Rs .)}]
    }" | jq -r '.content[0].text'
hello.prompt:

  #!/usr/local/bin/promptrun
  # claude-sonnet-4-20250514
  Write a haiku about terminal commands.

#!/usr/bin/env runprompt
I wasn't aware of the whole ".prompt" format, but it makes a lot of sense.
I wasn't aware of the whole ".prompt" format, but it makes a lot of sense.
Very neat. These are the kinds of tools I love to see. Functional and useful, not trying to be "the next big thing".
"Chain Prompts Like Unix Tools with Dotprompt"
https://pythonic.ninja/blog/2025-11-27-dotprompt-unix-pipes/
"One-liner code review from staged changes" - love this example.
That's typically how we expect bash pipelines to work, right?
- arrow up
- append a stage to the pipeline
- repeat until output is as desired
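With ordinary tools the loop looks like this; the commands are just stand-ins for whatever prompt stages you'd chain:

```shell
# Grow the pipeline one stage at a time: run it, arrow up, append, repeat.
printf 'beta\nalpha\nalpha\n' > words.txt
sort words.txt                   # stage 1: ordered, duplicates remain
sort words.txt | uniq            # stage 2: duplicates collapsed
sort words.txt | uniq | wc -l    # stage 3: reduced to a count
```

Each iteration re-runs the whole pipeline from scratch, which is fine for cheap commands but gets slow (and expensive) when every stage is an LLM call.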
If you're gonna write to some named location and later read from it, you're drifting towards a different mode of usage where you might as well write a Python script.
https://microsoft.github.io/promptflow/how-to-guides/develop...
It certainly doesn't intuitively sound like it matches the "Do one thing" part of the Unix philosophy, but it does seem to match the "and do it well" part.
I think you mentioned elsewhere that you don't want to have a lot of dependencies, but as the format evolves, using the reference impl will allow you to work on real features.
I've been using mlflow to store my prompts, but wanted something lightweight on the cli to version and manage prompts. I setup pmp so you can have different storage backends (file, sqlite, mlflow etc.).
I wasn't aware of dotprompt, I might build that in too.