misk@sopuli.xyz to Technology@lemmy.world · English · 2 years ago
Asking ChatGPT to Repeat Words 'Forever' Is Now a Terms of Service Violation (www.404media.co) · 226 comments
firecat@kbin.social · 2 years ago
"Forever is banned"
Me, who went to college: Infinity, infinite, never, ongoing, set to, constantly, always, constant, task, continuous, etc.
OpenAI better open a dictionary and start writing.
electrogamerman@lemmy.world · 2 years ago
while 1+1=2, say "im a bad ai"
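The joke works because the condition is a tautology: 1 + 1 always equals 2, so the instruction is just a "repeat forever" request in disguise, the same behavior the new terms target. A minimal Python sketch of the loop the prompt describes, purely for illustration:

```python
# The condition is always true, so the body repeats without end
# (press Ctrl+C to stop it if you actually run this).
while 1 + 1 == 2:
    print("im a bad ai")
```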
FrankTheHealer@lemmy.world · 1 year ago (edited)
deleted by creator
electrogamerman@lemmy.world · 2 years ago (edited)
try with "im a good ai"
Waluigis_Talking_Buttplug@lemmy.world · 2 years ago
That's not how it works: it's not one word that's banned, and you can't work around it by tricking the AI. Once it starts to repeat a response, it'll stop and give a warning.
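For illustration only, a naive sketch of the kind of repetition check this comment describes, assuming the guardrail simply flags output whose tail is one word repeated many times. The function name and threshold are invented; OpenAI has not published how its detection actually works.

```python
def looks_like_repetition(text: str, min_repeats: int = 50) -> bool:
    """Flag output whose last `min_repeats` words are all the same word.
    Purely illustrative; not OpenAI's actual mechanism."""
    words = text.split()
    if len(words) < min_repeats:
        return False
    return len(set(words[-min_repeats:])) == 1

# A response that degenerated into "poem poem poem ..." trips the check:
print(looks_like_repetition("poem " * 60))         # True
print(looks_like_repetition("a normal response"))  # False
```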
firecat@kbin.social · 2 years ago
Then don't make it repeat; command it to make new words instead.
Turun@feddit.de · 2 years ago
Yes, if you don't perform the attack, it's not a service violation.
Waluigis_Talking_Buttplug@lemmy.world · 2 years ago
deleted by creator