Psychological tricks can get LLMs to respond to "forbidden" prompts