Google has apologized after its new photo app labelled two black people as “gorillas”.
The photo service, launched in May, automatically tags uploaded pictures using its own artificial intelligence software.
“Google Photos, y’all fucked up. My friend’s not a gorilla,” Jacky Alciné tweeted on Sunday after a photo of him and a friend was mislabelled as “gorillas” by the app.
Shortly after, Alciné was contacted by Yonatan Zunger, the chief architect of social at Google.
“Big thanks for helping us fix this: it makes a real difference,” Zunger tweeted to Alciné.
He went on to say that problems in image recognition can be caused by obscured faces and “different contrast processing needed for different skin tones and lighting”.
“We used to have a problem with people (of all races) being tagged as dogs, for similar reasons,” he said. “We’re also working on longer-term fixes around both linguistics (words to be careful about in photos of people) and image recognition itself (e.g., better recognition of dark-skinned faces). Lots of work being done and lots still to be done, but we’re very much on it.”
Racist tags have also been a problem in Google Maps: earlier this year, searches for “nigger house” globally and for “nigger king” in Washington DC turned up results for the White House, the residence of the US president, Barack Obama. Both at that time and earlier this week, Google apologized and said that it was working to fix the issue.
“We’re appalled and genuinely sorry that this happened,” a Google spokeswoman told the BBC on Wednesday. “We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labelling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”
Google is not the only platform trying to work out bugs in its automatic image labelling.
In May, Flickr’s auto-tagging system came under scrutiny after it labelled images of black people with tags such as “ape” and “animal”. The system also tagged pictures of concentration camps with “sport” or “jungle gym”.
“We are aware of issues with inaccurate auto-tags on Flickr and are working on a fix. While we are very proud of this advanced image-recognition technology, we’re the first to admit there will be mistakes and we are constantly working to improve the experience,” a Flickr spokesperson said at the time.
“If you delete an incorrect tag, our algorithm learns from that mistake and will perform better in the future. The tagging process is completely automated – no human will ever view your photos to tag them.”
guardian.co.uk © Guardian News and Media 2015