Images of child sexual abuse are not just present on the darknet and other obscure corners of the internet. A New York Times report found that child abusers run rampant on surface web platforms like Google Drive, Dropbox, Amazon and Apple, which struggle to balance privacy and security and have only limited abilities to detect illegal images.
The report features the story of F. and E., two sisters whose father sexually abused them — and shared images and videos of the abuse on the internet — when they were 7 and 11 years old. These horrifying images of the sisters are still circulating on the web. This year alone, the Times reported, images of the sisters were found in more than 130 child sexual abuse investigations. These photos and videos are found on phones, computers and cloud storage accounts like Google Drive, Dropbox and Microsoft OneDrive. Because offenders can continue to victimize survivors once they are adults, F. and E. told the Times that they live in constant fear of being recognized, which is why they opted to remain anonymous.
Tech companies are largely looking the other way. The Times’ investigation found that Apple does not scan its cloud storage for illegal images, and it encrypts its messaging apps, making detection nearly impossible. Amazon does not scan its cloud service either. Dropbox, Google and Microsoft scan images on their drive services only when they are shared, not when they are uploaded. Facebook, which has historically been the most thorough in its scans, recently announced that it will encrypt its Messenger app. On all platforms, videos — especially live videos — present challenges: the Times found that no major tech company can detect illegal imagery in livestreamed content.
Zoom, the business video chatting application, has been a cesspool of live videos of this abuse. A 2015 case in Pennsylvania involved the livestreamed abuse of a 6-year-old boy, and the abusers were caught only because an officer was conducting an undercover investigation. Livestreams are harder to detect and leave no record, which is why abusers flock to them, Austin Berry, the federal prosecutor in the case, said in his closing remarks. He called Zoom the “Netflix of child pornography,” and apps like FaceTime, Facebook, Omegle, Skype and YouNow have similar problems. Automated imagery analysis does not work on live videos.
Last year, Joshua Gonzalez, a Texas computer technician, was arrested with 400 images of child sexual abuse on his computer, including some pictures of F. and E. He told authorities he had used Microsoft’s search engine, Bing, to find much of the illegal content. Microsoft created the PhotoDNA detection tool 10 years ago to compare digital fingerprints of images against known images of abuse. However, a TechCrunch report found that search terms like “porn kids” still returned results on Bing. In response, Microsoft said it would ban that search term and similar ones, but in a recent experiment the Times found that when a known child abuse website was entered into the search box, Bing suggested related searches.
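PhotoDNA itself is proprietary, but the fingerprint-matching workflow it supports can be sketched. The example below is a minimal illustration, not PhotoDNA: it stands in an exact SHA-256 digest for PhotoDNA’s robust perceptual hash (which, unlike SHA-256, survives resizing and re-encoding), and the image bytes and hash database are placeholders.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint: an exact SHA-256 digest of the image bytes.
    Real PhotoDNA computes a perceptual hash robust to re-encoding."""
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder database of fingerprints of previously logged illegal images.
known_hashes = {
    fingerprint(b"placeholder-logged-image-1"),
    fingerprint(b"placeholder-logged-image-2"),
}

def is_known_match(image_bytes: bytes) -> bool:
    """Check a candidate image's fingerprint against the known-hash set."""
    return fingerprint(image_bytes) in known_hashes

print(is_known_match(b"placeholder-logged-image-1"))  # True
print(is_known_match(b"some-novel-image"))            # False
```

The key point of the design is that platforms never need to store or view the original illegal images: they compare fingerprints of uploaded content against a database of fingerprints of already-identified material.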
The Times created a computer program that searched Bing and other engines for these images without displaying them. It used more than three dozen search terms, some of which Bing suggested. The program sent the resulting web addresses to PhotoDNA, which identified many of the images that turned up as matching already logged illegal images. The Times reported these web addresses to the National Center for Missing and Exploited Children and the Canadian Centre for Child Protection.
The Times says abusers know about Bing’s weaknesses and use them to find this imagery.
There is no exhaustive list of all child sexual abuse content out there. Even if there were, it would not help flag livestreams or new content being shared.
Tech companies are struggling to balance their users’ privacy with security. Amazon does not scan its users’ content at all, with a spokesperson telling the Times, “privacy of customer data is critical to earning our customers’ trust,” but that the company had a policy prohibiting illegal content.
Dropbox and Google representatives also told the Times they believed scanning users’ content would raise privacy concerns.
The Canadian Centre for Child Protection said it had also found images on Google last year and had battled with the company to get them taken down. Google initially said three pictures depicting child sexual abuse did not meet its threshold for removal, the Times reports, but eventually removed them.