555_1 ,

So that means they don’t need the same budget and we can use that money for socialized healthcare, free college education, and raising social security payments?

mozz OP Admin ,

Or! Hear me out…

disguy_ovahea ,

I hope the militant radicals bent on revolution are taking notes. They probably just need more guns, right? 2A4eva?

this ,

Yeah, let's just slap the missile equivalent of ChatGPT on a bunch of drones, what could go wrong? /s

Seriously though, what happens if the AI driving the drone hallucinates? I wouldn't want to be anywhere near these things when they're testing them.

mozz OP Admin ,

I highly doubt they are putting LLMs on their little throwaway drones. The US military has actually been working on “let’s figure out what that thing is and blow it up automatically” technology since at least as far back as the 90s; e.g. modern warship defense systems use it to react to an incoming missile faster than a human ever could.

Personally I am much more worried about it working exactly as intended.

disguy_ovahea , (edited )

You are correct. Large language models like ChatGPT are a subset of deep learning, which is a subset of machine learning. Common examples of simple machine learning software are facial recognition, social media algorithms, speech-to-text, and predictive text.

There is no reason to include software as complex, resource-intensive, or experimental as an LLM when dedicated ML will suffice.
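To make the “simple ML” point concrete: predictive text, one of the examples above, can be approximated with nothing more than word-pair frequency counts. This is a toy sketch, not any real keyboard's implementation; the corpus and function names here are invented for illustration:

```python
from collections import Counter, defaultdict

def train_bigram_model(corpus):
    """Count how often each word follows each other word."""
    model = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Suggest the most frequent follower of `word`, or None if unseen."""
    followers = model.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Tiny made-up corpus, just to exercise the model
corpus = "the drone sees the target the drone tracks the target"
model = train_bigram_model(corpus)
print(predict_next(model, "sees"))  # "the" — its only observed follower
```

A real predictive-text system is fancier (smoothing, personalization, sometimes a small neural net), but it lives at this end of the complexity spectrum: a lookup table of statistics, nothing like the billions of parameters an LLM carries.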
