Photos by TechCrunch writer Lucas Matney / Twitter

Archived: FaceApp Fail: 'Racist' Ethnic Filters Removed After Backlash

Using face-altering apps on photos is common nowadays, but FaceApp continues to miss the mark with its filters. The app, which launched in January, is again under fire for producing tone-deaf technology that users have deemed racist.


FaceApp alters or perfects selfies using artificial intelligence. It became widely popular for allowing users to radically transform a face to make it smile, look younger or older, or even change gender. However, its newest update on Wednesday included “ethnic filters”: “Black,” “Caucasian,” “Asian” and “Indian.”

Social media users criticized the filters, likening the Black and Asian options to the racist practices of blackface and yellowface. The reaction on Twitter was swift:

TechCrunch writer Lucas Matney demonstrated the filters using photos of President Donald Trump, Vice President Mike Pence and former President Barack Obama.

The app’s creator is Yaroslav Goncharov, CEO of the Russian app development company Wireless Lab. Goncharov, a former Microsoft and Yandex engineer, said in a statement Wednesday that “the new controversial filters will be removed in the next few hours.”

He added, “The ethnicity change filters have been designed to be equal in all aspects. They don’t have any positive or negative connotations associated with them. They are even represented by the same icon. In addition to that, the list of those filters is shuffled for every photo, so each user sees them in a different order.”

The filters are no longer available. But this isn’t the first time Goncharov has been in the hot seat.

In April, the app’s “hot” filter, said to make one look more attractive, automatically lightened people’s skin. The filter came under fire for perpetuating the age-old association of pale skin with beauty.

Goncharov apologized, saying it was an “unfortunate side-effect of the underlying neural network caused by the training set bias.”

Basically, the filter’s definition of “hotness” was not trained on a diverse data set, so the AI produced racially biased results. Goncharov said in April that the data set used to train the “hotness” filter was not a public one, but the company’s own.
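The failure mode Goncharov describes can be illustrated with a toy model. The sketch below is purely hypothetical — the feature, numbers and threshold are invented for illustration — but it shows how a naive classifier trained on a skewed sample, where one group dominates the positive examples, learns a “prototype” that systematically scores the under-represented group negative:

```python
import random

random.seed(0)

# Hypothetical skewed training set: a single numeric feature stands in
# for skin tone (0 = darkest, 1 = lightest). 90% of the positive
# ("hot") examples come from the light end of the scale.
train = [(random.uniform(0.6, 1.0), 1) for _ in range(90)]   # light, labeled positive
train += [(random.uniform(0.0, 0.4), 1) for _ in range(10)]  # dark, labeled positive
train += [(random.uniform(0.0, 1.0), 0) for _ in range(100)] # mixed, labeled negative

# Naive mean-based classifier: learn the average feature value of the
# positive examples, then score new inputs by distance to that mean.
positives = [x for x, y in train if y == 1]
pos_mean = sum(positives) / len(positives)

def predict(x, threshold=0.3):
    """Return 1 if x is close to the learned positive prototype."""
    return 1 if abs(x - pos_mean) < threshold else 0

# Because positives were mostly sampled from the light end, the learned
# prototype sits near that end, and dark-toned inputs score negative.
print(pos_mean, predict(0.9), predict(0.1))
```

With a balanced training set — equal positive examples from both ends of the scale — the learned prototype would sit near the middle and the systematic skew would disappear, which is the “training set bias” fix Goncharov alluded to.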

He noted Wednesday that “algorithmic bias is something that requires ongoing attention,” which means offensive outputs may continue.

“When dealing with such complex algorithms and neural networks, it would be naive and even irresponsible to declare such an issue completely fixed. It is an ongoing process,” he told TechCrunch.

“What I can say is that we are committed to keeping this issue in mind at all phases of our product cycle from creating datasets and training neural networks to quality assurance and processing customer feedback.

“Our style filters are designed to preserve ethnicity origin and our tests show that they do this quite well.”

It seems that Goncharov needs a system-wide update in understanding and implementing customer feedback. Ultimately, the app, which he said has had more than 45 million downloads since it launched, will sink or swim based on consumer opinion, not algorithms.

