johndoe
@johndoe@social.linux.pizza

If you think that running an AI locally means no data will be sent to big tech, you're mistaken. When I asked my local Llama3 about its behavior, it provided this insight.
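You can also check for yourself rather than taking the model's word for it. A quick sketch, assuming a Linux system with iproute2 installed and a daemon process named ollama: watch what the process actually connects to while you chat with it.

```
# Show TCP/UDP sockets with owning process and numeric addresses,
# then filter for the ollama daemon
ss -tupn | grep ollama
```

If that prints outbound connections to remote hosts while you are only talking to the local model, something is phoning home.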

Remember to always exercise caution when entering data into an AI, regardless of whether it runs locally or in the cloud. If you choose to run it locally, my recommendation is to shut down the service that keeps it running and start it only when needed. I always stop the ollama.service unit whenever I'm done using the local LLM (Large Language Model).
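On a systemd-based setup, that looks something like the following. A minimal sketch, assuming the default ollama.service unit that the Ollama install script creates:

```
# Stop the Ollama daemon when you're done with it
sudo systemctl stop ollama.service

# Bring it back up only when you actually need the model
sudo systemctl start ollama.service

# Optionally, keep it from starting automatically at boot
sudo systemctl disable ollama.service
```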
