People tend to become dependent upon AI chatbots when their personal lives are lacking. In other words, the neediest people are developing the deepest parasocial relationships with AI.
Preying on the vulnerable is a feature, not a bug.
I kind of see it more as a sign of utter desperation on the human’s part. They lack connection with others to such a degree that anything similar can serve as a replacement. Kind of reminiscent of Harlow’s experiments with baby monkeys. The videos from that study are interesting but make me feel pretty bad about what we do to nature. Anywho, there you have it.
And the amount of connections and friends the average person has has been in free fall for decades…
I dunno. I connected with more people on reddit and Twitter than irl tbh.
Different connection but real and valid nonetheless.
I’m thinking places like r/stopdrinking, petioles, bipolar, shits been therapy for me tbh.
At least you’re not using chatgpt to figure out the best way to talk to people, like my brother in finance tech does now.
These same people would be dating a body pillow or trying to marry a video game character.
The issue here isn’t AI, it’s losers using it to replace human contact that they can’t get themselves.
More ways to be an addict means more hooks means more addicts.
That was clear from GPT-3, day 1.
I read a Reddit post about a woman who used GPT-3 to effectively replace her husband, who had passed away not long before. She used it as a way to grieve, I suppose? She ended up noticing that she was getting too attached to it, and had to leave him behind a second time…
Ugh, that hit me hard. Poor lady. I hope it helped in some way.