Take it Down video from Nat’l Center for Missing & Exploited Children
by Larry Magid
This post first appeared in the Mercury News
Non-consensual sharing of intimate images (NCII) can be devastating no matter what the victim’s age. But any sharing of nude or sexually explicit images of people under 18 is illegal and can be especially traumatic. Sometimes those images are an unintended byproduct of sexting – a young person shares their image with someone, who then posts it online without the victim’s permission, a violation of trust.
Sometimes the images are hacked or taken in secret. And even if the person knowingly allowed the image to be posted, the presence of those images can harm that person now and in the future.
On Monday, the National Center for Missing & Exploited Children (NCMEC), whose board I was on for 20 years, launched a tool to “help kids remove their sexually explicit images from the internet.” It’s important to point out that a person who reports that their image has been or is at risk of being posted online will not get into trouble. This tool, and the laws against distributing sexually explicit images of minors, are designed to punish those who exploit children.
The new tool, called Take it Down, was launched with support from Meta, the parent company of Facebook and Instagram. It “allows users from around the world to submit a report that can help remove online nude, partially nude, or sexually explicit photos and videos depicting a child under 18 years old,” according to NCMEC. “Having explicit content online can be scary and very traumatizing, especially for young people,” said Gavin Portnoy, vice president of Communications & Brand at NCMEC. “The adage of ‘you can’t take back what is already out there’ is something we want to change. The past does not define the future and help is available.”
The Take it Down online tool assigns a unique hash value – a kind of digital fingerprint – to images and videos. These hashes are provided to participating technology companies so that they can identify and remove the content on both public and encrypted sites and apps. This all happens without the image or video ever leaving the device: regardless of whether the image has already been shared online, the victim uses the tool to select the images or videos, and only the hash is computed and uploaded. It’s important to emphasize that this tool in no way shares the young person’s picture or video. That content never leaves their device, and no one at NCMEC or the tech companies receives a copy of it.
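The on-device step can be illustrated with a simplified sketch. This example uses an exact cryptographic hash (SHA-256) purely to show the principle that only a fingerprint, never the file itself, would be transmitted; systems like Take it Down use their own hashing methods, which this does not attempt to reproduce.

```python
import hashlib

def hash_image(path: str) -> str:
    """Compute a SHA-256 digest of an image file on the device.

    Illustrative only: the hex digest returned here – never the image
    bytes – is what would be sent off-device, so the content itself
    stays private.
    """
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large videos don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest()
```

In practice, matching services often rely on perceptual hashes rather than exact ones, so that a resized or re-encoded copy of the same image still matches; an exact digest like the one above changes completely if even one byte differs.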
So far, participating companies include Meta (Facebook & Instagram), MG Freesites (Pornhub, MindGeek), OnlyFans and Yubo.
How to use the Take it Down tool
To report content, you go to takeitdown.ncmec.org, click “get started” and answer a few simple questions, starting with age. If the victim is over 18, they’re directed to Stopncii.org. If they’re younger, they are asked to verify that “your images and videos include nude, partially nude or sexually explicit content.” If yes, the victim is asked whether “any of your intimate image(s)/video(s) have already been shared online?” If the answer to that is yes or “I don’t know,” the tool suggests that the victim also report the incident to CyberTipline.org.
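The intake questions above amount to a simple decision flow, sketched below. The function name, parameters, and return values are my own invention for illustration; only the branching logic comes from the article’s description of the tool.

```python
def route_report(age: int, explicit: bool, already_shared: str) -> list[str]:
    """Hypothetical sketch of the Take it Down intake questions.

    already_shared is one of "yes", "no", or "i don't know".
    Returns the site(s) the reporter would be directed to.
    """
    if age >= 18:
        # Adult victims are directed to a separate service.
        return ["Stopncii.org"]
    if not explicit:
        # The tool targets nude or sexually explicit content.
        return []
    destinations = ["takeitdown.ncmec.org"]
    if already_shared in ("yes", "i don't know"):
        # The tool also suggests filing a CyberTipline report.
        destinations.append("CyberTipline.org")
    return destinations
```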
The CyberTipline is where individuals and tech companies can report child sex abuse images, which are sometimes colloquially referred to as “child porn.” It was initially announced by then Vice President Al Gore in December 1997 and officially launched by NCMEC at a press conference in 1998, where I spoke along with NCMEC CEO Ernie Allen, FBI Director Louis Freeh, America Online CEO Steve Case, Treasury Undersecretary for Enforcement Raymond Kelly, and Senators Judd Gregg (R-New Hampshire) and Ernest Hollings (D-South Carolina). The tipline has come a long way in the ensuing 25 years. In 2021, it processed more than 29 million reports, most of which came from social media companies. Those reports included 85 million files, but it’s also important to know that there are many repeat reports of the same images, and some of them are very old. Still, NCMEC says that it “alerted law enforcement to over 4,260 potential new child victims,” which is an alarming number.
Take it Down is an important new tool, but it’s only one part of the solution to ending the scourge of online child sex abuse. Law enforcement plays an important role in finding and arresting people who abuse children, but it’s also essential to support prevention programs like those run by NCMEC and other organizations. People need to know that these images harm children in multiple ways. In some cases, the mere taking of the pictures is abusive, especially if an adult or even another minor is coercing or forcing a child to take off their clothes or engage in a sexual act. But the abuse doesn’t end there. The sharing of the images can scar a child for life, and once online, those images can continue to harm the child for years. Anything that can be done to take the images down is welcome, and I’m especially happy that several companies are participating in this program.
Disclosure: Larry Magid is CEO of ConnectSafely.org, a nonprofit internet safety organization that receives financial support from Meta. He is also a former member of the National Center for Missing & Exploited Children’s board of directors.