WalnutLum ,

There are VERY FEW fully open LLMs. Most are the equivalent of source-available in licensing, and at best they're only partially open source because all they provide you with is the pretrained model.

To be fully open source they need to publish both the model and the training data. The point is for the model to be fully reproducible, which is what makes it trustworthy.

In that vein there's at least one project that's turning out great so far:

https://www.llm360.ai/

FaceDeer ,
@FaceDeer@fedia.io

Fortunately, LLMs don't really need to be fully open source to get almost all of the benefits of open source. From a safety and security perspective it's fine because the model weights don't really do anything on their own; all of the actual work is done by the framework code that's running them, and if you can trust that code because it's open source, you're 99% of the way there. The LLM model just sits there transforming the input text into the output text.
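To make that concrete, here's a minimal sketch of what "the weights just sit there" looks like in practice: the weights are an inert blob of tensors on disk, and every step of loading, tokenizing, and generating is done by open source framework code you can inspect. This assumes the Hugging Face `transformers` library; the model name is only an example of an open-weight model, not a recommendation.

```python
# Sketch: open-weight model + open source framework code.
# The weights file contributes nothing but numbers; the code below
# (all open source) does the actual work of running them.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Meta-Llama-3-8B-Instruct"  # example open-weight model (assumption)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Explain what 'open weights' means in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```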

From a customization standpoint it's a little worse, but we're coming up with a lot of neat tricks for retraining and fine-tuning model weights in powerful ways. The most recent big development I've heard of is abliteration, a technique that lets you isolate a particular "feature" of an LLM and either enhance it or remove it. The first big use of it is to modify various "censored" LLMs to remove their ability to refuse to comply with instructions, so that all those "safe" and "responsible" AIs like Goody-2 can be turned into something that's actually useful. A more fun example is MopeyMule, a LLaMA3 model that has had all of its hope and joy abliterated.
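For anyone curious what abliteration looks like under the hood, here's a very rough sketch of the core idea: estimate a "feature direction" (e.g. refusal) from the difference in activations between prompts that trigger the feature and prompts that don't, then project that direction out of the weights so the model can no longer write it into the residual stream. The prompt sets, layer choice, and which matrices get edited are all illustrative assumptions here, not any particular project's recipe.

```python
import torch

def feature_direction(acts_with: torch.Tensor, acts_without: torch.Tensor) -> torch.Tensor:
    """Mean-difference direction between activations that show the feature
    (e.g. refusals) and activations that don't, normalised to unit length."""
    direction = acts_with.mean(dim=0) - acts_without.mean(dim=0)
    return direction / direction.norm()

def ablate_direction(weight: torch.Tensor, direction: torch.Tensor) -> torch.Tensor:
    """Remove the component of the layer's output along `direction`, so the
    layer can no longer push the residual stream along that feature."""
    # weight: (d_out, d_in); direction: (d_out,)
    projection = torch.outer(direction, direction)  # (d_out, d_out)
    return weight - projection @ weight

# Usage sketch (hypothetical names): collect residual-stream activations
# for refusal-triggering vs. normal prompts, derive the direction, then
# orthogonalise e.g. an MLP output projection against it.
# refusal_dir = feature_direction(acts_refusal, acts_normal)
# layer.mlp.down_proj.weight.data = ablate_direction(
#     layer.mlp.down_proj.weight.data, refusal_dir)
```

Enhancing a feature instead of removing it is the same trick in reverse: add a multiple of the direction back in rather than projecting it out.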

So I'm willing to accept open-weight models as being "nearly as good" as a full-blown open source model. I'd like to see full-blown open source models develop more, sure, but I'm not terribly concerned about having to rely on an open-weight model to make an AI system work in the immediate term.
