Show HN: Dreamtap – Make your AI more creative
dreamtap.xyz

Notably, GPT-5-Codex is significantly worse at this; it has a much stronger drive to produce one specific style of website.
In terms of polish, I just tell it that Steve Jobs will review the final result, that seems to make it work a lot harder.
It's very interesting how current AIs seem to generate simplistic websites with similar quirks on the first iteration. I guess it's intentional, since simple is a good starting point.
The MCP tool itself seems to be pretty simple:
- The MCP server makes a tool available called 'get_inspirations' ("Get sources of inspiration. Always use this when working on creative tasks - call this tool before anything else.")
- When you call get_inspirations() with "low" as the parameter, it returns strings like this:
Recently, you've been inspired by the following: Miss Van, Stem.
Recently, you've been inspired by the following: Mass-energy equivalence, Jonas Salk.
etc.
- 'High' inspiration returns strings like this (more keywords):
Recently, you've been inspired by the following: Charles Babbage, Beethoven Moonlight Sonata, Eagles Take It Easy, Blue Spruce.
Recently, you've been inspired by the following: Missy Elliott Supa Dupa Fly, Design Patterns, Flowey, Titanic.
etc.

Simple tool. It seems that adding a few keywords as 'inspirations' is what makes the LLMs generate more varied text.
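For anyone curious, here's roughly what that boils down to as code. This is just my own sketch based on the example strings above, not the actual Dreamtap source: the inspiration pool and the low/high keyword counts are assumptions.

```typescript
// Rough sketch of a get_inspirations-style tool.
// The pool contents and the low/high counts are assumptions; the real
// server presumably draws from a much larger corpus of names and topics.

type InspirationLevel = "low" | "high";

const POOL: string[] = [
  "Miss Van",
  "Stem",
  "Mass-energy equivalence",
  "Jonas Salk",
  "Charles Babbage",
  "Beethoven Moonlight Sonata",
  "Eagles Take It Easy",
  "Blue Spruce",
  "Missy Elliott Supa Dupa Fly",
  "Design Patterns",
  "Flowey",
  "Titanic",
];

// Pick `count` distinct entries from the pool at random.
function sample(count: number): string[] {
  const shuffled = [...POOL].sort(() => Math.random() - 0.5);
  return shuffled.slice(0, count);
}

// "low" returns a couple of keywords, "high" returns more of them,
// matching the example strings quoted above.
export function getInspirations(level: InspirationLevel): string {
  const count = level === "high" ? 4 : 2;
  return `Recently, you've been inspired by the following: ${sample(count).join(", ")}.`;
}
```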
I haven't tested them much, but I did make a tiny GUI for talking to OpenAI's legacy Davinci-002. It's a completion model, so you prompt it by giving it a piece of text and it continues it in the same style. (If you ask it a question, it's likely to just respond with a question.)
https://jsfiddle.net/yxaL5z3c/
I've been wondering if it would be possible to create a chat-like experience on top of a model like this, but the trick is that the data has to be formatted similarly to what it has seen in training. E.g., when I used "user:" and "assistant:", it defaulted to making the output sound like an AI, because that's what the context implied. I tried "John" and "Steve" instead, which worked better...
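For what it's worth, here's a minimal sketch of how that could look against the legacy completions endpoint, using plain speaker names instead of chat roles. The framing text, the names, and the stop sequence are my own guesses at a reasonable setup, not what the fiddle actually does.

```typescript
// Sketch of a chat-like loop on top of the legacy completions endpoint.
// The model just continues the raw transcript; the stop sequence keeps it
// from also writing the human's next line.
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// The "chat history" is nothing but a growing piece of text.
let transcript = "The following is a conversation between John and Steve.\n\n";

async function say(message: string): Promise<string> {
  transcript += `John: ${message}\nSteve:`;

  const completion = await client.completions.create({
    model: "davinci-002",
    prompt: transcript,
    max_tokens: 150,
    temperature: 0.8,
    stop: ["\nJohn:"], // cut off before the model speaks for John
  });

  const reply = completion.choices[0].text.trim();
  transcript += ` ${reply}\n`;
  return reply;
}

say("What's your favorite kind of music?").then(console.log);
```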