• relevants@feddit.de · 1 year ago

    It’s because humans rated candidate responses during training, and ChatGPT was tuned to generate the kind of responses that most consistently received the preferred rating. You can imagine how an AI trained to say what people want to hear would become a people pleaser.
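
    Roughly, that “preferred rating” step works like training a reward model on pairwise comparisons: raters pick which of two responses they like better, and the model is nudged so the preferred one scores higher. A toy sketch of the idea (all names and numbers here are made up for illustration, not OpenAI’s actual code):

    ```python
    import math

    def preference_loss(score_preferred: float, score_rejected: float) -> float:
        # Bradley-Terry style loss: small when the preferred response
        # already outscores the rejected one, large otherwise.
        return -math.log(1.0 / (1.0 + math.exp(-(score_preferred - score_rejected))))

    # Hypothetical reward-model scores for two candidate replies to the same prompt.
    polite_agreeable = 2.3   # raters tend to prefer this style
    blunt_correction = 0.4   # technically fine, but rated lower

    print(preference_loss(polite_agreeable, blunt_correction))  # small loss: behavior reinforced
    print(preference_loss(blunt_correction, polite_agreeable))  # large loss: behavior discouraged
    ```

    Optimize that loss over millions of comparisons and the model drifts toward whatever style the raters reward, which is exactly how you end up with a people pleaser.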