mozz (Admin)

This wasn't an intentional feature; they're actually trying to add saying "I don't know" as an ability through fine-tuning. It's one area that highlights the difference between imitating the text it's been seeing and actually understanding what it's saying: since its training data is overwhelmingly of the form "(ask a question) (response to question)" rather than "(ask a question) (say you don't know, the end)", it tries to be a good imitator and do the same, coming up with plausible nonsense even when it doesn't know the answer.
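To make the imbalance concrete, here's a minimal sketch (assuming a generic chat-style fine-tuning pipeline and JSON Lines data files; the filenames and prompts are made up for illustration) of what deliberately added "I don't know" examples might look like next to ordinary question-answer pairs:

```python
import json

# Toy illustration of the point above: ordinary training text pairs
# questions with confident answers, so refusal examples have to be
# added on purpose during fine-tuning.
examples = [
    {"prompt": "What is the capital of France?",
     "completion": "Paris."},
    {"prompt": "Who wrote Hamlet?",
     "completion": "William Shakespeare."},
    {"prompt": "What did Napoleon eat for breakfast on 3 May 1807?",
     "completion": "I don't know; I don't have reliable information about that."},
]

# Write the examples as JSON Lines, a common input format for fine-tuning jobs.
with open("finetune_data.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")
```

Without examples like the third one, the model only ever sees questions followed by answers, so "answer something" is the pattern it learns to imitate.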
