alphacyberranger@sh.itjust.works to Programmer Humor@lemmy.ml · English · 1 year ago
Certified in AI (image, sh.itjust.works)
affiliate@lemmy.world · 1 year ago
being a prompt engineer is so much more than typing words. you also have to sometimes delete the words and then type new ones
ZILtoid1991@lemmy.world (banned from community) · 1 year ago
There’s also jailbreaking the AI. If you happen to work for a troll farm, you have to be up to date with the newest words to bypass its community guidelines to make it “disprove” anyone left of Mussolini.
threelonmusketeers@sh.itjust.works · 1 year ago
I tried some of the popular jailbreaks for ChatGPT, and they just made it hallucinate more.