• 2 Posts
  • 184 Comments
Joined 1 year ago
Cake day: July 4th, 2023

  • Why is that? The whole point of generative AI is that it can combine concepts.

    You train it on the concept of a chair using only red chairs. You train it on the color red, and the color blue. With this info and some repetition, you can have it output a blue chair.

    The same applies to any other concepts. Larger, smaller, older, younger. Man, boy, woman, girl, clothed, nude, etc. You can train them each individually, gradually, and generate things that then combine these concepts.

    Obviously this is harder than just using training data of exactly what you want. It’s slower, it takes more effort, and results are inconsistent, but they are results. You can then curate the most viable of the images created this way to train a new, refined model.





  • The issue is that they’re taking a tool with actual legitimate use cases, particularly maintenance and repair uses, and turning it into something that just pushes their own service. It’d be like a doctor saying you can only be healthy if you use his brand of fuckin… Vitamins or some shit, I don’t know. It’s got nothing to do with Microsoft being automatically the worst thing in existence, it’s that Microsoft CONSISTENTLY does this kind of garbage, and it’s one of those things that isn’t even overtly a bad thing, you just have to look a bit.

    So in short, I agree it is (was?) a useful tool. I don’t agree that everyone is rabidly anti-Microsoft, any more than anyone’s rabidly anti-get-punched-in-the-taint.



  • None of that is relevant. The issue being discussed here isn’t one of whether or not it’s currently possible to create fake nudes.

    The original post being replied to claimed that, since an AI, an artist, a photoshopper, whoever, is just creating an imaginary set of genitalia with no way of knowing whether it’s accurate, no damage is being done. That’s what people are arguing about.





  • There’s nothing to be done, nor should be done, about anything someone individually creates, for their own individual use, never to see the light of day. Anything else is about one step removed from thought policing - after all, what’s the difference between a personally created, private image and the thoughts in your head?

    The other side of that is, we have to have protection for the people this has been or will be used against. Strict laws regarding posting or sharing material. Easy and fast removal of abusive material. Actual enforcement. I know we have these things in place already, but they need to be stronger and more robust. The one absolute truth with generative AI, versus Photoshop etc., is that it’s significantly faster and easier, so there will likely be an uptick in this kind of material, hence the need to re-examine current laws.


  • I think the biggest thing with that is Trump and Putin live public lives. They live lives scrutinized by media and the public. They bought into those lives, they chose them. Because of that, there are certain things we let slide when done to them that wouldn’t necessarily be acceptable if done to a normal, private citizen; since their lives are already public, we turn a bit of a blind eye. And yes, this applies to celebrities, too.

    I don’t necessarily think the above is a good thing; I think everyone should be entitled to some privacy. Having the same thing done to a normal person living a private life is a MUCH clearer violation of privacy.