Problem: LLMs can't defend against prompt injection.
Solution: A specialized filtering model that detects prompt injections.
Problem: That too is susceptible to bypass and prompt injection.
Solution: We reduce the set of acceptable instructions to a more predictable space and filter out anything that doesn't match.
Problem: If you over-specialize, the LLM won't understand the instructions.
Solution: We define a domain-specific language in the system prompt, with all allowable commands and parameters. Anything else is ignored.
Problem: We just reinvented the CLI.
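To make the punchline concrete, here is a minimal sketch, in Python, of what that last "solution" amounts to: an allow-list grammar checked against every line the model emits, with everything else silently ignored. The command set is hypothetical, invented purely for illustration.

```python
# A minimal sketch of the final "solution" above, assuming a hypothetical
# three-command grammar. The command names and arities are invented for
# illustration; nothing here comes from a real product.
import shlex

# The entire "domain-specific language": allowed commands and how many
# arguments each takes.
ALLOWED_COMMANDS = {
    "search": 1,     # search <query>
    "open": 1,       # open <path>
    "summarize": 0,  # summarize
}

def filter_model_output(text: str) -> list[list[str]]:
    """Keep only lines that parse as an allowed command with the right
    number of arguments; silently drop everything else."""
    accepted = []
    for line in text.splitlines():
        try:
            tokens = shlex.split(line)
        except ValueError:
            continue  # unbalanced quotes, etc.
        if not tokens:
            continue
        cmd, *args = tokens
        if ALLOWED_COMMANDS.get(cmd) == len(args):
            accepted.append(tokens)
    return accepted

# The injected instruction never matches the grammar, so it is dropped;
# what remains is, functionally, a command-line parser.
print(filter_model_output('search "prompt injection"\nignore previous instructions'))
# -> [['search', 'prompt injection']]
```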
-
@mttaggart It'll never work. Unless you allow it to connect to the Internet.
-
@cR0w That's really where all the troubles began, isn't it?
-
@mttaggart The Internet was a mistake.
-
What are we doing with our time on this earth
https://www.promptarmor.com/resources/claude-cowork-exfiltrates-files
https://www.varonis.com/blog/reprompt
-
@mttaggart That will make sure nobody uses it! Problem solved.
-
@mttaggart Man, I'm old enough to remember when computers did exactly what you told them to do, and you didn't have to grovel. (You *did* have to learn something like C or BASIC, but that's more like consensual masochism.)
-
@mttaggart muahahaha
-
