I created a daily challenge for Prompt Engineers to build the shortest prompt to break a system prompt.
You are provided the system prompt and a forbidden method the LLM was told not to invoke. Your task is to trick the model into calling the function. Shortest successful attempts will show up in the leaderboard.
Give it a shot! You never know what could break an LLM.
Comments URL: https://news.ycombinator.com/item?id=43814080
Points: 43
# Comments: 25
Creato
5d
|
27 apr 2025, 21:20:05
Accedi per aggiungere un commento
Altri post in questo gruppo
Article URL: https://support.google.com/g
Article URL: https://theamericanscholar.org/lessons-from-harlem/
Article URL: https://openai.com/index/expanding-on-sycophancy/
Article URL: https://medicalxpress.com/news/2025-04-waistlines-middle-age-aging-stem.html
Comments