Last week, Google’s brand-new image recognition program suffered a major failure when it identified two black people as gorillas. The episode was a stark reminder that image recognition algorithms still have much to learn about context and sensitivity.
The misfire first surfaced in a screenshot posted last week by Twitter user @jackyalcine, showing that the recently released Google Photos app had sorted a picture of a black man and a black woman into a category labeled “Gorillas.”
@jackyalcine responded with some foul language, understandably so, when the app labeled his friend an ape, a term long used as a racial slur against black people.
Within two hours, a Google engineer had responded to @jackyalcine, asking for access to his account to troubleshoot the high-profile mistake. Yonatan Zunger, chief architect of Google’s social products, later tweeted about the problem: “Sheesh. High on my list of bugs you never want to see happen. Shudder.”
Clearly, this was an accident on Google’s part. “We’re appalled and genuinely sorry that this happened,” said company spokeswoman Katie Watson, referring to the imperfect image recognition algorithm. “We are taking immediate action to prevent this type of result from appearing.”
Despite the apology, the mistake comes at an inopportune time for the tech giant, which, like much of the Silicon Valley elite, has been defending itself against accusations of discriminatory hiring. Like Facebook and other tech giants, Google employs a workforce that is mostly white or Asian.
Racial tensions in the US are already running high, following the recent unjust police killings of black people, not to mention the murder in June of nine black people in a church in Charleston, South Carolina. But of course, that’s the problem with computers: they don’t understand context, and they don’t realize that this is a particularly bad time to make a racially insensitive mistake.
Of course, the app has made other mistakes, something Google executives warned ahead of time would happen. Some people have inadvertently been categorized as seals or even horses. However, there’s not much of an issue of historical sensitivity surrounding mistaking people for seals, is there? More similar to Google’s error was one made by Yahoo’s Flickr service, which also tagged black people as apes and even accidentally labeled a Nazi concentration camp a “jungle gym.” But luckily for Yahoo, not nearly as many people are paying attention to it.
After the gorilla incident, some commentators on social media wondered aloud whether the flaws in Google’s image recognition algorithms stemmed from the company’s demographics: roughly one in one hundred of Google’s technology workers is black, while about 94% are white or Asian.