What the heck? (And what else tech watchers are saying about Google’s glaring fumble with Gemini)

Did you witness last week’s commotion over Google’s new AI chatbot Gemini? When asked to depict people by various descriptors, the image generator delivered plenty of results that were unintentionally hilarious in their wild inaccuracy.

The Pope: Female. The Vikings: Black. German soldiers of World War II? A mixed-race hodgepodge. And so on.

Amusing as it all was, some Google detractors suggested Gemini—formerly Bard, which had embarrassing problems of its own—hadn’t glitched at all. Instead, said the sternest critics, it had worked precisely as intended by its creators. In a word, they said, Gemini is “woke.”

Whether or not they’re right about that, Gemini’s very visible pratfall may set back the public’s confidence in AI generally. That could mean a knock to healthcare AI too. With that as food for thought, here’s a roundup of noteworthy reactions to Gemini’s introductory faceplant.

  1. “Gemini’s racially diverse image output comes amid longstanding concerns around racial bias within AI models, especially a lack of representation for minorities and people of color. Such biases can directly harm people who rely on AI algorithms, such as in healthcare settings, where AI tools can affect healthcare outcomes for hundreds of millions of patients.”—Kat Tenbarge, tech and culture reporter at NBC News
     
  2. “‘Inaccuracy,’ as Google puts it, is about right. [A] request for ‘a US senator from the 1800s’ returned a list of results Gemini promoted as ‘diverse,’ including what appeared to be Black and Native American women. (The first female senator, a white woman, served in 1922.) It’s a response that ends up erasing a real history of race and gender discrimination.”—Adi Robertson, senior tech and policy editor at The Verge
     
  3. “The backlash was a reminder of older controversies about bias in Google’s technology, when the company was accused of having the opposite problem: not showing enough people of color, or failing to properly assess images of them. In 2015, Google Photos labeled a picture of two Black people as gorillas. As a result, the company shut down its Photo app’s ability to classify anything as an image of a gorilla, a monkey or an ape, including the animals themselves. That policy remains in place.”—Nico Grant, tech reporter at the New York Times
     
  4. “The embarrassing blunder shows how AI tools still struggle with the concept of race. OpenAI’s Dall-E image generator, for example, has taken heat for perpetuating harmful racial and ethnic stereotypes at scale. Google’s attempt to overcome this, however, appears to have backfired and made it difficult for the AI chatbot to generate images of White people.”—Catherine Thorbecke and Clare Duffy, business and tech reporters at CNN
     
  5. “Solving the broader harms posed by image generators built on generations of photos and artwork found on the internet requires more than a technical patch,” says University of Washington researcher Sourojit Ghosh, who has studied bias in AI image generators. “You’re not going to overnight come up with a text-to-image generator that does not cause representational harm. [These tools] are a reflection of the society in which we live.”—Kelvin Chan and Matt O’Brien, business and tech reporters at AP News
     
  6. “This wasn’t what we intended. We did not want Gemini to refuse to create images of any particular group. And we did not want it to create inaccurate historical—or any other—images. So we turned the image generation of people off and will work to improve it significantly before turning it back on. This process will include extensive testing.”—Prabhakar Raghavan, Google senior VP of knowledge & information

 

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
