Google scans Gmail and Drive for child sexual abuse cartoons

Over the past two decades, tech giants have faced an ever-increasing deluge of child sexual abuse videos and images on their platforms. As Apple recently discovered, it is a difficult problem to solve: scanning people’s devices and online accounts for illegal content can raise serious privacy concerns.

But it’s not just explicit photos and videos of minors that the biggest companies in Silicon Valley are trying to find and erase from their servers. They are also looking for cartoons depicting graphic acts involving children, as revealed by a recent search warrant asking Google to provide information on a suspect who allegedly possessed such animations.

This type of content is potentially illegal under US law and can be detected by Google’s systems for finding child sexual abuse material (CSAM), a fact that had not previously been discussed in public, the warrant reveals. Google has long acknowledged that it can detect child abuse imagery using two technologies. The first uses software, originally designed by the YouTube team, that searches for “hashes” of previously identified illegal content. A hash is an alphanumeric representation of a file, which means a computer can parse the files in, say, a Gmail message and raise a flag if one of them has the same hash as a known illegal photo or video. Google also uses machine learning tools that examine files and analyze them for any signs of child abuse.
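To illustrate the hash-matching idea described above, here is a minimal Python sketch. Two caveats: the file names and the known-hash set are hypothetical, invented for the demo; and production systems such as YouTube’s CSAI Match or Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas this sketch uses an exact cryptographic hash, which only matches byte-identical copies.

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_known_files(paths, known_hashes):
    """Yield the paths whose digest appears in the known-hash set."""
    for path in paths:
        if file_digest(path) in known_hashes:
            yield path

if __name__ == "__main__":
    # Hypothetical demo: pretend one local file is "previously known"
    # content by putting its digest in the match list, then scan a folder.
    sample = Path("sample_attachment.bin")
    sample.write_bytes(b"demo bytes standing in for an image file")
    known_hashes = {file_digest(sample)}

    for hit in flag_known_files(Path(".").glob("*.bin"), known_hashes):
        print(f"flagged for human review: {hit}")
```

The appeal of the design is that the scanner never needs to store or look at the original illegal images, only their digests; the trade-off, which perceptual hashing addresses, is that an exact hash misses any copy that has been cropped, recompressed, or otherwise altered.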

It is not known which of these two technologies was used in late 2020 in Kansas, when Google detected “digital art or cartoons depicting children engaging in sexually explicit behavior or having sex” in a Drive account, according to the warrant. The warrant goes on to detail the graphic images, which included what appeared to be sexually explicit cartoons of underage boys. In line with its legal obligations, Google passed information about what it had found, along with the IP addresses used to access the images, to the National Center for Missing and Exploited Children (NCMEC), which in turn forwarded the findings to the Department of Homeland Security’s Homeland Security Investigations unit. Investigators used the IP addresses provided by Google to identify the suspect as the alleged owner of the cartoons, and searched his Google account, obtaining information about emails sent to and from the accused.

The suspect appears to be a well-known artist. Because no charges have been filed, Forbes is not publishing his name, but the man identified in the warrant had won several small Midwestern art competitions, and one of his works from the 1990s was mentioned in a major West Coast newspaper.

This may worry some artists who draw or depict nudes. But the law covering cartoon imagery is worded in such a way as to offer some protection to anyone sharing animations of children engaged in sexual acts for artistic or scientific purposes. A prosecutor trying to convict anyone possessing such material would have to prove that the images in question were “obscene” or lacked “serious literary, artistic, political or scientific value.”

Google, meanwhile, has published transparency reports in recent years showing how often it reports issues to NCMEC. The figures show a worrying trend. In the first six months of 2021, it found more than 3.4 million pieces of potentially illegal content across 410,000 separate reports. That was up from 2.9 million pieces in 365,000 reports in the last six months of 2020, and more than double the figure from January to June 2020, when 1.5 million pieces of CSAM were discovered and reported to NCMEC in 180,000 reports.

At the time of publication, Google had not commented on the cartoon case or on how it detects this kind of imagery in Google accounts. The Justice Department in Kansas did not respond to a request for comment. Attempts to email the addresses listed for the defendant in the warrant failed, the Gmail account appearing to have been disabled.

Given the recent furor over Apple’s plan to scan iPhone photos for CSAM, a move it ultimately delayed after criticism that it compromised user privacy, the spotlight has turned to how other tech giants deal with the problem. Since Google does not end-to-end encrypt its communications tools, like Gmail, or its file storage technology, like Drive, it remains possible for the company to scan for illegal content. And since it has no plans to introduce those features, law enforcement can continue to rely on Google to alert NCMEC to abuse on its servers.

Whether the majority of users want Google to scan people’s accounts so it can help catch child abusers, or to improve privacy with end-to-end encryption instead, the Mountain View, California-based company will have to grapple with that balance in perpetuity. The same goes for its rivals.

A version of this story appeared in my newsletter, The Wiretap, earlier today. Sign up to receive cybersecurity news and exclusives here.
