
Google has apologized after its new photo app labelled two black people as “gorillas”.

The photo service, launched in May, automatically tags uploaded pictures using its own artificial intelligence software.

“Google Photos, y’all fucked up. My friend’s not a gorilla,” Jacky Alciné tweeted on Sunday after a photo of him and a friend was mislabelled as “gorillas” by the app.

Shortly after, Alciné was contacted by Yonatan Zunger, the chief architect of social at Google.

“Big thanks for helping us fix this: it makes a real difference,” Zunger tweeted to Alciné.

He went on to say that problems in image recognition can be caused by obscured faces and “different contrast processing needed for different skin tones and lighting”.

“We used to have a problem with people (of all races) being tagged as dogs, for similar reasons,” he said. “We’re also working on longer-term fixes around both linguistics (words to be careful about in photos of people) and image recognition itself (e.g., better recognition of dark-skinned faces). Lots of work being done and lots still to be done, but we’re very much on it.”
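Zunger's distinction between a "linguistics" fix and better image recognition maps onto a common pattern in auto-tagging systems: a safety filter applied after the classifier, separate from the model itself. The sketch below is purely illustrative and assumes a hypothetical classifier output format, detector signal and hand-picked blocklist; Google did not disclose how its own safeguard works.

```python
# Illustrative sketch only: one way a "words to be careful about" safeguard
# could sit on top of a classifier's raw output. The label names, threshold
# and classifier interface here are hypothetical, not Google's actual code.

# Labels an auto-tagger should never apply to a photo that may contain a person.
SENSITIVE_LABELS = {"gorilla", "ape", "monkey", "animal", "dog"}

def filter_tags(raw_predictions, person_likely, confidence_threshold=0.6):
    """Drop low-confidence tags and suppress sensitive labels near people.

    raw_predictions: list of (label, confidence) pairs from an image classifier.
    person_likely:   True if a separate face/person detector fired on the image.
    """
    safe_tags = []
    for label, confidence in raw_predictions:
        if confidence < confidence_threshold:
            continue  # not confident enough to show the tag at all
        if person_likely and label in SENSITIVE_LABELS:
            continue  # err on the side of omitting the tag entirely
        safe_tags.append(label)
    return safe_tags

# Example: the classifier is unsure whether the photo shows a person or a gorilla.
print(filter_tags([("person", 0.55), ("gorilla", 0.72), ("outdoors", 0.9)],
                  person_likely=True))
# -> ['outdoors']
```

A filter like this only masks the symptom; the second part of Zunger's quote, better recognition of dark-skinned faces, is aimed at the underlying model error.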

Racist tags have also been a problem in Google Maps. Earlier this year, searches for “nigger house” worldwide and for “nigger king” in Washington DC returned results for the White House, the residence of the US president, Barack Obama. Google apologized both then and earlier this week, saying it was working to fix the issue.

“We’re appalled and genuinely sorry that this happened,” a Google spokeswoman told the BBC on Wednesday. “We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labelling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

Google is not the only platform trying to iron out bugs in its automatic image labelling.

In May, Flickr’s auto-tagging system came under scrutiny after it labelled images of black people with tags such as “ape” and “animal”. The system also tagged pictures of concentration camps with “sport” or “jungle gym”.

“We are aware of issues with inaccurate auto-tags on Flickr and are working on a fix. While we are very proud of this advanced image-recognition technology, we’re the first to admit there will be mistakes and we are constantly working to improve the experience,” a Flickr spokesperson said at the time.

“If you delete an incorrect tag, our algorithm learns from that mistake and will perform better in the future. The tagging process is completely automated – no human will ever view your photos to tag them.”
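The mechanism Flickr describes, learning from a deleted tag, is typically implemented as a feedback log whose entries later become negative training examples. The sketch below is a minimal, hypothetical illustration of that idea; the function name, storage format and identifiers are assumptions, not Flickr's actual pipeline.

```python
# Illustrative sketch only: recording a user's tag deletion so it can be used
# as a negative example when the auto-tagging model is next retrained.
import json
from datetime import datetime, timezone

FEEDBACK_LOG = "tag_feedback.jsonl"  # hypothetical append-only log file

def record_deleted_tag(photo_id, deleted_label):
    """Log a deleted auto-tag as a negative signal for future retraining."""
    event = {
        "photo_id": photo_id,
        "label": deleted_label,
        "signal": "negative",  # the model should not predict this label here
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    with open(FEEDBACK_LOG, "a") as log:
        log.write(json.dumps(event) + "\n")

# Example: a user removes the incorrect "jungle gym" tag from a photo.
record_deleted_tag(photo_id="12345", deleted_label="jungle gym")
```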
