• Apathy Tree@lemmy.dbzer0.com · 9 months ago

    Significant racist bias is an understatement.

    I asked a generator to make me a “queen monkey in a purple gown sitting on a throne” and got maybe two pictures of actual monkeys. I even tried rewording it several times to make clear I meant a real monkey, describing the hair and everything.

    The rest were all women of color.

    Very disturbing. Pretty ladies, but very racist.

    • AeroLemming@lemm.ee · 9 months ago

      I just tried it with Copilot and all four results matched the prompt. No humans. Which one did you use?

      • Apathy Tree@lemmy.dbzer0.com · 9 months ago

        Stable Diffusion, the online version, several weeks ago. Might not be the same situation anymore, idk how often that stuff gets updated, and I’m not able to test it at the moment.

        It’s also possible that some sort of “sticky idea” got into its head and made it keep generating them that way after the first one came out like that. I’ve heard that sort of thing isn’t uncommon.

        • Ookami38@sh.itjust.works · 9 months ago

          To be clear, Stable Diffusion isn’t one model, it’s the generation platform. From there, you have models that sit on top of it. Online generators can use any model, depending on how they’re set up. Each model includes different training data, which means the same prompt can produce different results, sometimes vastly different ones (rough sketch below).

          It’s a bit like driving somewhere, having someone ask how you found the place, and saying “my phone.” Technically a correct answer, but they’re probably looking for something more specific, like GPS or a map. Not trying to nitpick, just adding a bit of information.
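
          Rough sketch of what I mean, in Python with the Hugging Face diffusers library. The checkpoint names here are just common public examples, not whatever any particular online generator actually runs: same prompt, same seed, two different models built on the Stable Diffusion base, and the outputs can look nothing alike because each model carries its own training data and biases.

          ```python
          import torch
          from diffusers import StableDiffusionPipeline

          prompt = "queen monkey in a purple gown sitting on a throne"

          # Two different checkpoints, each trained on its own data mix.
          # (Example model IDs only; availability on the Hub can change.)
          checkpoints = [
              "CompVis/stable-diffusion-v1-4",
              "stabilityai/stable-diffusion-2-1",
          ]

          for repo_id in checkpoints:
              pipe = StableDiffusionPipeline.from_pretrained(repo_id, torch_dtype=torch.float16)
              pipe = pipe.to("cuda")
              # Fix the seed so the model is the only thing that changes between runs.
              generator = torch.Generator("cuda").manual_seed(0)
              image = pipe(prompt, generator=generator).images[0]
              image.save(repo_id.split("/")[-1] + ".png")
          ```

          Same prompt, same seed, potentially two very different pictures. That’s why “Stable Diffusion” alone doesn’t pin down which model actually produced a result.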