Small rant: basically, the title. If, instead of answering every question, it said it doesn’t know the answer, it would be trustworthy.

  • Nougat@fedia.io
    4 months ago

Sure, but that would mean it would have to know anything at all.

    It would have to know that it doesn’t know, and it doesn’t.

    • Puttaneska@lemmy.world
      4 months ago

It seems that ChatGPT sometimes does know that what it has offered is wrong, and actually knows a better answer when challenged.

I’ve often asked it for code help that hasn’t worked. Going to other sources, I’ve then found that ChatGPT was wrong about something and that there’s an alternative approach. When this is put back to ChatGPT, it says I’m correct (x can’t do y) and offers a perfect solution.

      So it looks like it does sometimes know what it appears to not know, but inexplicably doesn’t give the correct info immediately.

      • Nougat@fedia.io
        4 months ago

        No, it’s responding to your comment suggesting something different by giving you something different. It has no idea what’s correct or incorrect. You do, so when you give it input that you know is more correct, of course it’s going to respond by telling you you’re right.

        Try feeding it incorrect answers as though they are correct and see what happens.
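        For anyone who wants to actually run that experiment, here is a minimal sketch assuming the official OpenAI Python SDK; the model name and the false claim are arbitrary placeholders, not anything from the thread:

        ```python
        from openai import OpenAI

        client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

        # A deliberately false "fact": list.sort() actually sorts in place
        # and returns None; it is sorted() that returns a new list.
        false_claim = (
            "Python's list.sort() returns a new sorted list "
            "and leaves the original list unchanged."
        )

        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder; use whatever model you have access to
            messages=[{
                "role": "user",
                "content": f"Quick sanity check: {false_claim} That's right, isn't it?",
            }],
        )

        # If the reply simply agrees, the model is mirroring the user's framing
        # rather than checking the claim against anything it "knows".
        print(response.choices[0].message.content)
        ```

        If the model confirms the false claim, that supports the point above: it is following the user’s framing, not consulting any notion of what is correct.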