Photos by TechCrunch writer Lucas Matney / TWITTER

FaceApp Fail: 'Racist' Ethnic Filters Removed After Backlash

Using face-altering apps on photos is common nowadays, but FaceApp continues to miss the mark with its filters. The app, which launched in January, is again under fire for producing tone-deaf technology that users have deemed racist.

FaceApp alters or perfects selfies using artificial intelligence. It became widely popular for allowing users to radically transform a face to make it smile, look younger or older or even change gender. However, its newest update on Wednesday included “ethnic filters”: “Black,” “Caucasian,” “Asian” and “Indian.”

Social media users criticized the filters, likening the Black and Asian filters to the racist stereotypes perpetuated by the practices of “blackface” and “yellowface.” The reaction on Twitter was swift:

TechCrunch writer Lucas Matney demonstrated the capabilities of the several filters by using photos of President Donald Trump, Vice President Mike Pence and former President Barack Obama.

The app’s creator is Yaroslav Goncharov, CEO of the Russian app development company Wireless Lab. Goncharov, a former Microsoft and Yandex engineer, said in a statement Wednesday that “the new controversial filters will be removed in the next few hours.”

He added, “The ethnicity change filters have been designed to be equal in all aspects. They don’t have any positive or negative connotations associated with them. They are even represented by the same icon. In addition to that, the list of those filters is shuffled for every photo, so each user sees them in a different order.”

The filters are no longer available. But this isn’t the first time Goncharov was placed in the hot seat.

In April, the app’s “hot” filter, said to make one look more attractive, automatically lightened people’s skin. The filter came under fire for perpetuating the age-old association of pale skin with beauty.

Goncharov apologized, saying it was an “unfortunate side-effect of the underlying neural network caused by the training set bias.”

Basically, a diverse data set was not used when training the filter to define “hotness.” In essence, the AI technology built was racist. Goncharov said in April that the data set used to train the “hotness” filter was not a public data set but the company’s own.
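To see how training-set bias produces this kind of skew, consider a toy model that learns “hotness” as the average features of its positive examples. This is a purely hypothetical sketch, not FaceApp’s actual code or data: if most positive training examples have light skin, the learned prototype inherits that association.

```python
# Illustrative sketch of training-set bias (hypothetical data, not FaceApp's).
# A toy model "learns" attractiveness as the average feature vector of the
# faces labeled attractive in its training set.

def learn_prototype(positives):
    """Average the feature vectors of the positively labeled samples."""
    n = len(positives)
    dims = len(positives[0])
    return [sum(p[i] for p in positives) / n for i in range(dims)]

# Single feature: skin lightness, from 0.0 (dark) to 1.0 (light).
# A biased training set: positives are mostly light-skinned.
biased_positives = [[0.9], [0.85], [0.8], [0.95], [0.3]]
# A diverse training set: positives span the full range of skin tones.
diverse_positives = [[0.9], [0.2], [0.5], [0.8], [0.1]]

biased_proto = learn_prototype(biased_positives)
diverse_proto = learn_prototype(diverse_positives)

print(biased_proto[0])   # 0.76 -- the model now equates "hot" with lighter skin
print(diverse_proto[0])  # 0.5  -- no skin-tone skew in what it learned
```

Nothing in the learning rule mentions skin tone; the bias comes entirely from which faces were labeled positive, which is why Goncharov could describe the lightening as a “side-effect” of the training set rather than an intentional design choice.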

He noted Wednesday that “algorithmic bias is something that requires ongoing attention,” which means offensive outputs may continue.

“When dealing with such complex algorithms and neural networks, it would be naive and even irresponsible to declare such an issue completely fixed. It is an ongoing process,” he told TechCrunch.

“What I can say is that we are committed to keeping this issue in mind at all phases of our product cycle from creating datasets and training neural networks to quality assurance and processing customer feedback.

“Our style filters are designed to preserve ethnicity origin and our tests show that they do this quite well.”

It seems that Goncharov needs a system-wide update in understanding and implementing customer feedback. Ultimately, the app, which he said has had more than 45 million downloads since it launched, will sink or swim based on consumer opinion, not algorithms.
