Small rant: basically, the title. If it said it doesn’t know the answer instead of answering every question, it would be trustworthy.

  • pyre@lemmy.world · 6 months ago

    it’s just a glorified autocomplete. it doesn’t know that it doesn’t know the answer because it doesn’t know anything. so if what you wanted happened, chatgpt would not answer any question, because it doesn’t know anything.

    chatgpt doesn’t look for information, it looks for the most likely words that will follow the previous ones.
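
    To make "most likely words that will follow the previous ones" concrete, here is a minimal toy sketch in Python. The probability table and vocabulary are entirely made up for illustration (a real LLM computes these probabilities with a neural network over a huge vocabulary), but the selection step is the same idea: the model always emits *some* most-probable continuation, and there is no "I don't know" branch.

    ```python
    # Toy sketch of greedy next-token prediction.
    # The probabilities below are invented for illustration only.
    toy_probs = {
        ("the", "capital"): {"of": 0.9, "city": 0.1},
        ("capital", "of"): {"france": 0.5, "mars": 0.2, "atlantis": 0.3},
        ("of", "france"): {"is": 0.95, "was": 0.05},
        ("france", "is"): {"paris": 0.8, "lyon": 0.2},
        ("of", "mars"): {"is": 0.9, "was": 0.1},
        ("mars", "is"): {"olympus": 0.6, "unknown": 0.4},
    }

    def next_token(context):
        """Pick the most probable next token given the last two words.

        Note there is no 'I don't know' branch: if the context is in the
        table, something gets emitted, whether or not it is true.
        """
        dist = toy_probs.get(tuple(context[-2:]))
        if dist is None:
            return None  # out of the toy vocabulary; a real model never stops here
        return max(dist, key=dist.get)

    def complete(prompt, max_tokens=4):
        tokens = prompt.lower().split()
        for _ in range(max_tokens):
            tok = next_token(tokens)
            if tok is None:
                break
            tokens.append(tok)
        return " ".join(tokens)

    print(complete("the capital of"))       # the capital of france is paris
    print(complete("the capital of mars"))  # the capital of mars is olympus
    ```

    The second prompt shows the point of the comment: the model confidently continues with a plausible-sounding answer about a capital that doesn't exist, because picking the likeliest next word is all it ever does.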