KeenFlame ,

It will mimic the reasoning, just as an intelligence would, with far more nuance and perspective than you seem to realise. It's just not very good at it.

What most people who try to explain how LLMs work don't grasp is that even the scientists and developers themselves don't fully understand why and how they work. We keep discovering novel behaviour all the time.

As a side note, sorry you got downvoted. I like the discussion.
