Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

  • Player2@lemm.ee · 4 months ago

    There is a difference between having actually diverse data sources and secretly adding the word “diverse” to each image generation prompt.
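
    The distinction here is between curating the training data itself and silently rewriting the user’s prompt at generation time. Below is a minimal sketch of the latter approach, purely for illustration: the helper name, the term list, and the whole flow are invented assumptions, not Google’s actual pipeline.

```python
import random

# Hypothetical sketch of silent prompt rewriting: the training data is left
# untouched, and diversity terms are appended to the user's prompt before it
# reaches the image generator. All names here are invented for illustration;
# this is not Google's actual implementation.

DIVERSITY_TERMS = ["diverse", "of various ethnicities", "of different genders"]

def augment_prompt(user_prompt: str) -> str:
    """Return the prompt the generator actually sees, not the one the user typed."""
    return f"{user_prompt}, {random.choice(DIVERSITY_TERMS)}"

if __name__ == "__main__":
    prompt = "a group of 1940s German soldiers"
    # The user never sees the rewritten prompt, which is the crux of the
    # complaint: the output reflects neither the data nor the request.
    print(augment_prompt(prompt))
```

    The point of the sketch is that the rewrite happens outside the model, so it applies regardless of whether historical accuracy was part of the request.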

    • Dayroom7485@lemmy.world · 4 months ago

      Never claimed they had diverse data sources - they probably don’t.

      My point is that when minorities are underrepresented, which is the default case in GenAI, the (white, male) public tends to accept that.

      I like that they tried to fix the issue of GenAI being racist and sexist. Even though the solution is obviously flawed, it’s better than a racist model.

      • StereoTrespasser@lemmy.world · 4 months ago

        I can’t believe someone has to spell this out for you, but here we go: an accurate picture of people from an era in which there was no diversity will, by definition, not be diverse.