ChatGPT

pyre, in ChatGPT would be so much more useful and trustworthy if it were able to admit that it doesn't know an answer.

it's just glorified autocomplete. it doesn't know that it doesn't know the answer, because it doesn't know anything. so if what you wanted actually happened, chatgpt would never answer any question at all.

chatgpt doesn't look for information; it looks for the most likely words to follow the previous ones.
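You can see this directly with any open-weights causal language model: its entire output is a probability distribution over the next token, with no lookup step and no notion of "knowing". Below is a minimal sketch assuming the Hugging Face transformers library and the small GPT-2 checkpoint (both illustrative stand-ins; ChatGPT's own weights aren't public), printing the five most likely next tokens for a prompt.

```python
# Minimal sketch of next-token prediction, assuming the Hugging Face
# transformers library and the small "gpt2" checkpoint. The prompt is
# illustrative; nothing here is ChatGPT-specific.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# The only output is a score for every token in the vocabulary;
# turning the last position's scores into probabilities gives the
# model's "answer": whichever continuation is statistically likeliest.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(token_id.item()):>10}  p={prob.item():.3f}")
```

Nothing in this loop can report "I don't know"; a low-confidence distribution still has a most likely token, and sampling it produces fluent text either way.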

phoneymouse, in ChatGPT would be so much more useful and trustworthy if it were able to admit that it doesn't know an answer.

Claude does this
