Most cost-efficient way to run a local LLM (lemmy.ml)
So I'm playing around with Open WebUI running llama3:70b and I like it a lot....
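For anyone wanting to try the same setup, a minimal sketch of one common way to run it is Ollama plus Open WebUI via Docker Compose. This is an assumption about the poster's stack, not a description of their exact setup; the image names and the `OLLAMA_BASE_URL` variable come from the two projects' published images and docs, and you would still need to pull the model (e.g. `ollama pull llama3:70b`) and add GPU passthrough for usable speed on a 70b model.

```yaml
# Hypothetical docker-compose.yml: Ollama backend + Open WebUI frontend.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama   # persist downloaded model weights

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"            # UI served on http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama:
```

A 70b model needs roughly 40 GB+ of memory even at 4-bit quantization, so whether this is "cost efficient" depends heavily on the hardware it lands on.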