Also, don’t leave your account unused; delete it. User and follower counts matter.

And at least as important: reply (if necessary from another corporate mail address) to every email with Twitter/X in the footer, with a kind request to stop promoting and facilitating X.

https://bio.link/everyonehateselon

  • araneae@beehaw.org · 14 days ago

    A computer program trained on millions of pictures of people that the program’s parent company acquired by hook or by crook, so that now any photo you posted of your kids is fair game to be recycled into the most heinous shit imaginable at the press of a button by the kind of slug still using Twitter. There is abuse happening there: they decided to build a machine that could take family photos put online by well-meaning people and statistically morph them, ship-of-Theseus style, into pornography with no safeguards.

    If I were a parent and even theoretically one pixel of hair on my child’s head were used as aggregate data for this mathematical new form of abuse by proxy, I’d be Old Testament mad. I would liken it to a kind of theft or even rape that we have no clear word for or legal concept of yet.

    I would suggest simply not defending this stuff in any way, because you’re going to lose, or, from your perspective, be dogpiled by people having what you perceive to be an extreme moral panic over this issue.

      • araneae@beehaw.org · 14 days ago

        I understand, albeit in layman’s terms, more or less what LLMs and image generators are doing, and used the ship of Theseus as shorthand for the processes by which real photos are laundered into data sets.

        I am aware of the literal difference between an individual model and the data it trains on, and understand that Grok and its like are divorced from their output. I have even played with running a local model. This level of concern would be unwarranted if humans were decent and only trained Grok on, and only asked Grok to generate, puppies playing in open fields.

        > That doesn’t mean any image created by it henceforth is in any meaningful way a picture of you.

        Of course not; it is a picture of hundreds of thousands, or maybe millions, of people who offered up varying degrees of consent* for the use of their bodies to make any kind of porn.

        *Usually 0%, as illegally scraped data is not covered by a hostile TOS agreement when the scraper uploads it as training data without even the knowledge or consent of the company originally hosting it.

        > Like anyone who has ever seen a child or a depiction of a child is producing sexually explicit illustrations of some sort ever after, then? Because even human artists do not create ex nihilo; they start from a foundation of their cumulative experiences, which means anyone who has ever laid eyes on your child or a photo of your child is at some level incorporating some aspect of that experience, however small, into any artwork they create henceforth, and should they ever draw anything explicit…

        I think this largely speaks for itself as some of the worst words ever put together in any order. Comparing human creativity, and how we draw inspiration from our forebears, to the present subject is abhorrent. Imagining human minds turning every single speck of human flesh they see into jackoff material, because you assume that is how the mind works because you learned it from a robot, is beyond everything.

        Die on some other hill unless you’re being paid well. This is a neo-Nazi’s CSAM and propaganda machine. If they wanted to fix it, they’d scrub their training data, rework the weights, and do a massive ban wave on the abusers. It is not on this world to suffer excuses for this hideous fucking bullshit.