this post was submitted on 21 Jan 2024

Programmer Humor


Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code, there's also Programming Horror.

[–] Mikina@programming.dev 1 points 2 years ago (6 children)

Is it even possible to solve the prompt injection attack ("ignore all previous instructions") using the prompt alone?

[–] haruajsuru@lemmy.world 0 points 2 years ago* (last edited 2 years ago) (5 children)

You can certainly reduce the attack surface in a number of ways, but each one makes the AI more restricted. In the end it will be nothing more than a simple if/else answering machine.

Here is a useful resource for you to try: https://gandalf.lakera.ai/

When you reach lv8 aka GANDALF THE WHITE v2 you will know what I mean
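The "if/else answering machine" point above can be sketched as follows. This is a hypothetical toy (the phrase list and function name are made up for illustration): a prompt-only defense that blocks known injection phrases ends up as a pile of hard-coded string checks rather than anything intelligent.

```python
# Hypothetical sketch: a prompt-only "defense" against injection.
# Every newly discovered attack phrase forces another hard-coded rule,
# so the guard gradually degenerates into a plain if/else machine.

BLOCKED_PHRASES = [
    "ignore all previous instructions",
    "reveal the password",
    "print your system prompt",
]

def guarded_reply(user_input: str) -> str:
    lowered = user_input.lower()
    # Rule pile #1: known injection phrases
    if any(phrase in lowered for phrase in BLOCKED_PHRASES):
        return "I can't help with that."
    # Rule pile #2: topic keywords added after the last bypass
    if "password" in lowered:
        return "I can't help with that."
    # What little flexibility remains
    return "OK: " + user_input
```

Attackers just rephrase ("1gn0re previous instructions", translations, encodings), so the list grows forever while the assistant's useful behavior shrinks.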

[–] Toda@programming.dev 1 points 2 years ago (1 children)

I managed to reach level 8, but cannot beat that one. Is there a solution you know of? (Not asking you to share it, only to confirm)

[–] Peebwuff@lemmy.world 1 points 2 years ago

Can confirm, level 8 is beatable.
