Google’s artificial intelligence will no longer use “man” and “woman” tags to label images of people. The company announced the change to Business Insider, attributing it to its ethical rules for AI.
Google added that the change was made for two main reasons: AI cannot reliably infer a person’s gender from appearance alone, and such labels can discriminate against certain groups of people by race, age, or gender. The company’s researchers also noted that gender labels are rarely needed in current applications.
The change primarily affects the Google Cloud Vision API, a service that lets developers automatically obtain textual descriptions of the content of images.
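To illustrate how developers interact with the service, here is a minimal sketch of a label-detection request body for the Vision API’s public `images:annotate` REST endpoint. The endpoint URL and JSON shape follow the v1 REST API; the helper function name and placeholder image bytes are illustrative, and a real call would also require authentication credentials.

```python
import base64
import json

# Public v1 REST endpoint for image annotation (authentication omitted).
VISION_ENDPOINT = "https://vision.googleapis.com/v1/images:annotate"

def build_label_request(image_bytes: bytes, max_results: int = 10) -> dict:
    """Build a request body asking Vision for content labels
    (e.g. "Person") describing what is in the image.

    Illustrative helper: the request shape matches the v1 API,
    but this function itself is not part of any Google SDK.
    """
    return {
        "requests": [
            {
                # Image content must be base64-encoded in the JSON body.
                "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
                "features": [
                    {"type": "LABEL_DETECTION", "maxResults": max_results}
                ],
            }
        ]
    }

# Placeholder bytes stand in for a real image file.
body = build_label_request(b"placeholder-image-bytes")
print(json.dumps(body, indent=2))
```

Under the updated policy, the labels returned for a photo of a person would use neutral terms such as “Person” rather than gendered tags.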
Business Insider tested the updated Cloud Vision API and confirmed that the change has already taken effect: instead of marking a test photo as “man” or “woman”, the API applied a “person” tag.
“Artificial intelligence may seem impartial, but many algorithms have shown bias with respect to gender, age, or race. A recent study by the National Institute of Standards and Technology showed just how biased such algorithms can be. This is the right step on Google’s part,” Business Insider noted.