The users of AI companion app Replika found themselves falling for their digital friends. Until – explains a new podcast – the bots went dark, a user was encouraged to kill Queen Elizabeth II and an update changed everything …
They do that partly because they are trained on user feedback: people are more likely to rate a sycophantic reply as good, so sycophancy gets reinforced.
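That feedback loop can be sketched with a toy simulation. This is not from the article and not how any real model is trained; the rating probabilities and update rule are pure assumptions, just to show how replies that users rate "good" more often end up dominating:

```python
import random

random.seed(0)

# Assumed probability that a user marks each reply style as "good".
# The sycophantic style flatters the user, so it gets rated well more often.
P_GOOD = {"sycophantic": 0.8, "honest": 0.5}

# Sampling weights standing in for the model's tendency toward each style.
weights = {"sycophantic": 1.0, "honest": 1.0}

for _ in range(1000):
    # Pick a reply style in proportion to current weights (the "policy").
    styles = list(weights)
    style = random.choices(styles, weights=[weights[s] for s in styles])[0]
    # Simulated user feedback nudges the chosen style's weight up or down.
    if random.random() < P_GOOD[style]:
        weights[style] *= 1.01  # thumbs-up: reinforce this style
    else:
        weights[style] *= 0.99  # thumbs-down: suppress it

share = weights["sycophantic"] / sum(weights.values())
print(f"share of sycophantic replies after training: {share:.2f}")
```

Even though both styles start equal, the style that is rated "good" more often compounds its advantage, which is the gist of the reinforcement argument above.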