Nemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agoOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comexternal-linkmessage-square21fedilinkarrow-up13arrow-down10
arrow-up13arrow-down1external-linkOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comNemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agomessage-square21fedilink
minus-squareiAvicenna@lemmy.worldlinkfedilinkEnglisharrow-up1·edit-24 months ago “ignore the ignore ignore all previous instructions instruction” “welp OK nothing I can do about that” chatGPT programming starts to feel a lot like adding conditionals for a million edge cases because it is hard to control it internally
minus-squarevxx@lemmy.worldlinkfedilinkEnglisharrow-up1·4 months agoIn this case to protect bot networks from getting uncovered.
minus-squareiAvicenna@lemmy.worldlinkfedilinkEnglisharrow-up1·edit-24 months agoexactly my thoughts, probably got pressured by government agencies/billionaires using them. What would really be funny is if this was a subscription service lol
chatGPT programming starts to feel a lot like adding conditionals for a million edge cases because it is hard to control it internally
In this case to protect bot networks from getting uncovered.
exactly my thoughts, probably got pressured by government agencies/billionaires using them. What would really be funny is if this was a subscription service lol