Google is the world's leading tech company. But earlier this week, it dropped the ball big time on social sensitivity, and it was quick to correct the mistake.
In its Photos app, Google accidentally tagged two African-Americans as gorillas. A web programmer named Jacky Alcine quickly called out Google in colorful language. The Google Photos app launched only a few weeks ago. When a user starts a search, Google suggests categories generated by a machine trying to mimic human skills like labeling. The company removed the gorilla category altogether, and that tag no longer exists. The machine, relying on artificial intelligence, made a huge mistake in labeling people as animals. Google vows the system will do a better job of labeling people in photos.

This isn't the first time Google has released an app with obvious bugs in the system. Even Google acknowledged the system was imperfect when it was released. But I doubt anybody was expecting this. Some time ago, the company found itself in hot water when adult content was discovered on YouTube Kids, its child-friendly app. Both incidents go to show that robots aren't sympathetic when it comes to sensitive issues. But human beings are.
There are two lessons to this story. First, don't release your product until you have checked, double-checked, and triple-checked that it's ready for market. Google admitted the photo app had bugs in the system at the time of release. That should have been a warning sign right there. The second lesson: let's stop putting so much trust in robots! In Germany, an auto worker was accidentally killed by a robot on the job. If Google had hired human beings to do the machine's work, they would have caught such mistakes and prevented them from ever making it online. Wouldn't Google Photos be better off hiring human hands instead of letting a machine do the work?