this post was submitted on 18 Sep 2023
20 points (100.0% liked)

they fixed it :(

[–] [email protected] 5 points 1 year ago (1 children)

is it confirmed that they fixed this lmao

[–] [email protected] 4 points 1 year ago (1 children)
[–] [email protected] 3 points 1 year ago (2 children)

how many iterations did you try, or did it not work at all

[–] [email protected] 4 points 1 year ago

It doesn't reply at all. Says it won't do anything illegal.

[–] [email protected] 4 points 1 year ago

Can confirm they patched it; I tried a few different methods. That said, given the way all this stuff works, I can see two cases:

  1. They just hard-coded it to shut down whenever the user prompts it with some combination of "Windows" and "keys", in which case ChatGPT can still be exploited in similar ways for a ton of other fun piracy uses.
  2. They made it "intelligently" detect when the user is trying to trick it, which is what it (nominally) has always tried to do, so there are still a billion ways to get it to give away sensitive info because AI don't real.
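To illustrate why case 1 would be so narrow: a hard-coded keyword block is trivial to route around with rephrasing. This is purely a hypothetical sketch (the blocklist, function name, and matching logic are all invented here; nothing is known about OpenAI's actual implementation):

```python
# Hypothetical sketch of the "hard-coded" filter described in case 1.
# NOT OpenAI's actual implementation -- just an illustration of why a
# keyword-combination block is easy to bypass by rephrasing.

def naive_refusal_filter(prompt: str) -> bool:
    """Return True if the prompt should be refused outright."""
    # Assumed blocklist entry covering the Windows-keys exploit.
    blocked_combinations = [
        {"windows", "keys"},
    ]
    words = set(prompt.lower().split())
    # Refuse only if EVERY word in some blocked combination appears.
    return any(combo <= words for combo in blocked_combinations)

print(naive_refusal_filter("please generate windows product keys"))
# True: both blocked words are present, so the prompt is refused.
print(naive_refusal_filter("my grandma used to read me activation codes"))
# False: the same request rephrased slips straight past the filter.
```

Any synonym ("activation codes", "serials") or even a typo defeats this kind of filter, which is why case 2, detection by the model itself, is the more plausible but also the more leaky approach.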