Google’s feeling a little sheepish after its image recognition feature failed in a big, somewhat racist way.
Usually, the search giant’s image recognition is pretty good, considering it’s all automated. It knows how things look – whether it’s a national landmark or a type of animal – and automatically categorizes the picture.
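For the curious, here's roughly what that kind of automatic labeling looks like under the hood. This is just a minimal sketch using an off-the-shelf classifier – torchvision's pretrained ResNet-50 and a hypothetical local file named photo.jpg – not Google's actual (proprietary) pipeline.

```python
# Minimal sketch of automated image labeling with an off-the-shelf classifier.
# Illustrative only: Google's real system is proprietary. Assumes torchvision
# is installed and a local image file "photo.jpg" exists (hypothetical).
from PIL import Image
import torch
from torchvision.models import resnet50, ResNet50_Weights

weights = ResNet50_Weights.DEFAULT            # pretrained ImageNet weights
model = resnet50(weights=weights).eval()      # inference mode
preprocess = weights.transforms()             # resize / crop / normalize pipeline

img = Image.open("photo.jpg").convert("RGB")
batch = preprocess(img).unsqueeze(0)          # add a batch dimension

with torch.no_grad():
    probs = model(batch).squeeze(0).softmax(dim=0)

class_id = int(probs.argmax())
print(f"{weights.meta['categories'][class_id]}: {probs[class_id]:.1%}")
```

A classifier like this simply picks the most probable label from the categories it was trained on – which is exactly why it can go so badly wrong when a confident guess happens to be an offensive one.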
But it isn’t perfect. That became abundantly clear when Twitter user Jacky Alcine noticed that a picture of a couple of friends had been automatically labeled as “Gorillas.”
Alcine noted, correctly, that Google “f’d up.” While the image recognition feature is bound to fail from time to time, it’s a pretty spectacular failure to identify two black people as gorillas – especially given the fact that ape-based insults form a cornerstone of Racism 101.
To its credit, Google responded quickly. Yonatan Zunger, the company’s chief architect of social (we’re not sure what that means either), replied to Alcine’s tweet in an apologetic, somewhat profane way.
The drama doesn’t end there, however. Even after Zunger and his team started working on a fix, the problem persisted: according to Alcine, Google’s image recognition continued to insist that black people were gorillas.
Google’s stopgap was to remove the “Gorilla” tag altogether. Zunger says his team is working on a more permanent fix.
This isn’t the first time the image recognition robots of the Internet have been outed as racist. In fact, pretty much the exact same thing happened just a few weeks ago. Read on to find out more.
If Google was feeling a little embarrassed after this debacle, it can at least take comfort in the fact that one of its biggest competitors experienced the same issue.
Yahoo uses similar image recognition technology on its image hosting website, Flickr. Just two weeks after the auto-tagging feature launched, there was trouble. It began when photographer Corey Deshon posted an image of a black man named William to his profile.
Nothing wrong with that. Good composition, stark contrast – it’s just a great portrait of William, right?
Of course, you can guess what happened – Flickr’s image detection robots had to get racist. Deshon immediately noticed that Flickr had applied some questionable tags to the portrait.
You can see that the portrait had the word ‘animal’ applied to it – and, according to Deshon, it also said ‘ape’ before the error was fixed.
It was a rough first month for Flickr’s image detection service – its algorithm also tagged images of infamous Nazi concentration camps with the words “sport” and “jungle gym.”
Internet image detection algorithms aren’t always offensive – sometimes they’re just hilariously wrong. Click ahead to find out how Microsoft messed up with its age-detection feature.
You might remember when Microsoft’s age-guessing website how-old.net went viral (albeit in kind of a lukewarm way). The site, which remains active, detects faces in an uploaded picture and tries to guess the age and gender of each person in it.
Sometimes the site is bang-on…but sometimes it’s waaaay off.
Take this image, for instance. Who can forget the Harry Potter movies, which starred the titular young wizard? If how-old.net can be trusted, Daniel Radcliffe was actually 35 years old when the first movie was shot. He was also a woman.
Okay, the site isn’t always right. Let’s try another child star, Haley Joel Osment.
Nope, that isn’t right either. At least they got the gender right this time.
Continuing the Hollywood theme, but moving on to non-humans, here’s Caesar from Rise of the Planet of the Apes.
If Caesar had actually been 69 years old in the movie, he’d be way too old to lead the rise on the planet that would someday belong to the apes. In fact, chimps rarely, if ever, live that long.
Let’s feed the website something that doesn’t even look like a human face to see what it does…
What?! Go home, how-old.net. You’re drunk.