• Tywèle [she|her]@lemmy.dbzer0.com · 1 year ago

    So the author of the WaPo article types in anorexia keywords to generate anorexia images, gets anorexia images back, and is surprised by that?

    • ojmcelderry@lemmy.one · 1 year ago

      Yep 🤦🏻‍♂️

      This isn’t even about AI. Regular search engines will also provide results reflecting the thing you asked for.

      • PostmodernPythia@beehaw.org · 1 year ago

        Some search engines and social media platforms make at least half-assed efforts to prevent or add warnings to this stuff, because anorexia in particular has a very high mortality rate and the age of onset tends to be young. The people advocating that AI models be altered to prevent this say the same about other tech. It’s not techphobia to want to try to reduce the chances of teenagers developing what is often a terminal illness, and AI programmers have the same responsibility on that as everyone else.

    • Schedar@beehaw.org · 1 year ago

      Exactly what I was thinking.

      I mean, it is important that this kind of thing is thought about when designing these models, but it’s going to be a whack-a-mole situation, and we shouldn’t be surprised that targeted prompting easily finds gaps that generate stuff like this.

      Making articles out of each controversial or immoral prompt isn’t helpful at all. It’s just spam.