hireillo, to random
@hireillo@illo.social

PSA: There's an update for Glaze out today; don't forget to update your copy, as there's no auto-update. Grab the latest version at https://glaze.cs.uchicago.edu/downloads.html

#Glaze #NoAI

sunguramy, to random
@sunguramy@flipping.rocks

Hmm. So I tried Glaze, a program you run your images through to keep AI scraping from feeding on your work. I thought this would be a good idea to start doing, especially since I plan (for now) to keep my Facebook page for my photography; many followers there won't follow me anywhere else, and it's the longest-running page I've had.

It took me 2 hours to download the program (on high-speed fiber).

It took 20 minutes to install on my computer.

I had to close everything on my computer for it "to have enough memory to run".

THEN: on the lowest settings (fast run, so worse rendering, more visible artifacts, PLUS fewer writeovers to stop AI scraping) it estimated 30 minutes for a small 1600 x 1200 pixel image I was going to test it on. Annnnnd then it choked on my computer and I said F-it.

I mean. I appreciate the effort going into designing these things, but this is NOT a workable solution....

(1/2)

abucci, to random
@abucci@buc.ci

Nightshade 1.0 is out: https://nightshade.cs.uchicago.edu/index.html

From their "What is Nightshade?" page:

Since their arrival, generative AI models and their trainers have demonstrated their ability to download any online content for model training. For content owners and creators, few tools can prevent their content from being fed into a generative AI model against their will. Opt-out lists have been disregarded by model trainers in the past, and can be easily ignored with zero consequences. They are unverifiable and unenforceable, and those who violate opt-out lists and do-not-scrape directives can not be identified with high confidence.

In an effort to address this power asymmetry, we have designed and implemented Nightshade, a tool that turns any image into a data sample that is unsuitable for model training. More precisely, Nightshade transforms images into "poison" samples, so that models training on them without consent will see their models learn unpredictable behaviors that deviate from expected norms, e.g. a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space.
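For anyone wondering what a "poison sample" means in practice, here is a very rough conceptual sketch in Python of the general idea only, not Nightshade's actual algorithm: the ResNet feature extractor, the eps budget, and the optimizer settings below are all placeholder assumptions. The gist is to nudge an image's pixels so that its features drift toward a different concept while the visible change stays within a small budget.

```python
# Conceptual sketch of feature-space "poisoning" (NOT Nightshade's actual method).
# Assumptions: a stand-in ResNet feature extractor, an L-infinity pixel budget (eps),
# and Adam optimization of the perturbation. Requires torch, torchvision, Pillow.
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

def poison(img_path: str, target_path: str, eps: float = 0.03, steps: int = 50):
    # Stand-in feature extractor; a real attack would target the generative
    # model's own image encoder, not an ImageNet classifier.
    net = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
    extractor = torch.nn.Sequential(*list(net.children())[:-1])  # drop classifier head

    def feats(x):
        return extractor(x).flatten(1)

    # Load the image to protect and an image of the "decoy" concept.
    x = TF.to_tensor(Image.open(img_path).convert("RGB").resize((224, 224))).unsqueeze(0)
    t = TF.to_tensor(Image.open(target_path).convert("RGB").resize((224, 224))).unsqueeze(0)

    with torch.no_grad():
        target_feats = feats(t)

    delta = torch.zeros_like(x, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=0.005)
    for _ in range(steps):
        opt.zero_grad()
        # Pull the perturbed image's features toward the decoy concept's features.
        loss = torch.nn.functional.mse_loss(feats((x + delta).clamp(0, 1)), target_feats)
        loss.backward()
        opt.step()
        # Keep the perturbation visually small (L-infinity budget).
        with torch.no_grad():
            delta.clamp_(-eps, eps)

    # Returns an image that looks almost unchanged to a person but whose
    # features (under this extractor) lean toward the decoy concept.
    return (x + delta).clamp(0, 1).squeeze(0)
```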

-E
