hedgehog,

"Glue is not pizza sauce" seems like a common fact to me but Googles llm disagrees for example.

That wasn’t something an LLM came up with, though. That was done by a system that uses an LLM. My guess is the system retrieves a small set of results and then just uses the LLM to phrase a response to the user’s query by referencing the links in question.
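A minimal sketch of that kind of retrieve-then-rephrase pipeline, just to make the shape of the guess concrete. `search` and `llm_complete` here are hypothetical stand-ins, not any real API:

```python
def search(query: str) -> list[dict]:
    """Hypothetical web search returning a few results as {url, text} dicts."""
    raise NotImplementedError

def llm_complete(prompt: str) -> str:
    """Hypothetical call to an LLM completion endpoint."""
    raise NotImplementedError

def answer(query: str) -> str:
    # 1. Retrieve a small set of results for the query.
    results = search(query)[:3]

    # 2. Ask the LLM to rephrase the retrieved text as an answer,
    #    referencing the links. If the top result is a joke, the
    #    "answer" will be too.
    sources = "\n\n".join(f"[{r['url']}]\n{r['text']}" for r in results)
    prompt = (
        "Rephrase the relevant parts of these documents to answer the "
        f"user's question.\n\nDocuments:\n{sources}\n\nQuestion: {query}"
    )
    return llm_complete(prompt)
```

Nothing in that flow checks whether the retrieved text is serious, which is exactly the failure mode described below.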

It’d be like saying to someone “rephrase the relevant parts of this document to answer the user’s question” when the only relevant part is a joke. There’s not much else you can do there.
