Meta backs a new system that allows minors to stop their intimate images from being posted online
Source: https://techcrunch.com/2023/02/27/meta-backs-a-new-system-that-allows-minors-to-stop-their-intimate-images-from-being-posted-online/ | 2023-02-27

Meta announced today it’s helping finance a new system called Take It Down that, with the support of the National Center for Missing and Exploited Children (NCMEC), will help young people under 18 prevent intimate images of themselves from spreading online. The system, available as an online tool, works similarly to an earlier Facebook initiative designed to prevent the spread of non-consensual intimate imagery, sometimes referred to as “revenge porn.”

Alongside the launch, Meta says it’s also rolling out new tools that will make it more difficult for “suspicious adults” to interact with teens on Instagram.

The company claims the new takedown system for non-consensual intimate imagery is designed to protect user privacy, as it won’t require young people to actually share the images themselves with Meta or another organization. Instead, the system assigns a unique hash value — a numerical code — to the image or video directly on the user’s own device. The hash is then submitted to NCMEC, allowing any participating company to find copies of those images, take them down automatically, and prevent them from being posted in the future.
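To make the matching step concrete, here’s a minimal, hypothetical sketch in TypeScript of how a participating platform could check new uploads against NCMEC’s hash list. The names (`reportedHashes`, `shouldBlockUpload`) and the flat in-memory set are assumptions for illustration only; the actual hash formats, sync mechanisms, and APIs used by Take It Down’s partners aren’t described in this article.

```typescript
// Hypothetical sketch of the platform side of hash matching.
// All names and data structures here are assumed for illustration;
// the real Take It Down integration details are not public.

// Hash values of reported images, periodically synced from NCMEC
// (the sync feed itself is an assumption for this sketch).
const reportedHashes = new Set<string>();

// Compare a freshly computed upload hash against the reported set.
// A match means the content was reported and should be blocked or
// removed rather than published.
function shouldBlockUpload(uploadHash: string): boolean {
  return reportedHashes.has(uploadHash);
}
```

Because only hash values move between parties, a platform can recognize a reported image without anyone having to re-share the file itself.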

The original, so-called “revenge porn” system was criticized during its pilot for requiring users to upload their images before the hash was created; security experts argued that asking victims to hand over intimate content was not a responsible approach. It has since been retooled to create hashes locally, with Meta’s help documentation noting that “your images will never leave your computer.” The new Take It Down system appears to use the same methodology.

“Having a personal intimate image shared with others can be scary and overwhelming, especially for young people,” writes Meta’s Global Head of Safety Antigone Davis in the announcement. “It can feel even worse when someone tries to use those images as a threat for additional images, sexual contact or money — a crime known as sextortion.”

Though the system is aimed at young people whose intimate images are being shared non-consensually — and illegally — Meta notes it can also be used by adults, including parents or guardians acting on a young person’s behalf, or adults worried about non-consensual images taken of them when they were younger.

The Take It Down website also connects people to other NCMEC resources, including tools to search for your own explicit imagery on the web, the CyberTipline for reporting anyone who threatens you over images or engages in other forms of online exploitation, and more.

While Meta provided initial funding for the system and will use it across Facebook and Instagram, other companies that have signed up to use the technology include social network Yubo, as well as adult sites OnlyFans and Pornhub (MindGeek). Notably absent from the list are other big tech companies like Twitter and Snapchat.


NCMEC says the system actually launched in December 2022 — ahead of this public announcement — and has since seen more than 200 cases submitted. A new PSA created by ad agency VCCP will appear on platforms used by kids to ensure it is seen.

In our limited testing, TechCrunch found that submitting an image to the tool instantly returns its hash value in the browser without uploading the image to the internet, as promised. However, because the system is delivered as a web app, users should be aware that any browser extension with access to the page (as many have) could potentially access the images. For added security, we’d recommend using a Guest profile in Google Chrome, which opens a clean browser window without extensions.
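For a sense of how hashing can happen entirely in the browser, here’s a minimal sketch using the Web Crypto API with SHA-256. This is an illustration rather than Take It Down’s actual code: the element ID is invented, and the real tool may well use a different algorithm (for instance, a perceptual hash that survives resizing and re-encoding, which a plain cryptographic hash does not).

```typescript
// Minimal sketch: hash a user-selected image entirely in the browser,
// so the file itself never leaves the device. SHA-256 via the Web
// Crypto API is used here for illustration only; Take It Down's actual
// hashing algorithm isn't specified in this article.
async function hashImageLocally(file: File): Promise<string> {
  // Read the file's raw bytes into memory; no network request is made.
  const buffer = await file.arrayBuffer();

  // Compute the digest locally with the browser's built-in crypto.
  const digest = await crypto.subtle.digest("SHA-256", buffer);

  // Render the digest as a hex string, the only value that would
  // ever need to be submitted.
  return Array.from(new Uint8Array(digest))
    .map((byte) => byte.toString(16).padStart(2, "0"))
    .join("");
}

// Hypothetical wiring: compute and log the hash when a user picks a
// file via an <input type="file" id="image-input"> element.
document.querySelector<HTMLInputElement>("#image-input")
  ?.addEventListener("change", async (event) => {
    const file = (event.target as HTMLInputElement).files?.[0];
    if (file) console.log(await hashImageLocally(file));
  });
```

A caveat on this sketch: an exact-match hash like SHA-256 changes completely if an image is re-compressed or resized, which is why production image-matching systems typically favor perceptual hashes.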

The system could be a useful tool for people who are aware of, or in possession of, the non-consensual images being shared, presuming they know this takedown option exists. While companies are already legally bound to report child sexual abuse material, or CSAM, to NCMEC, the systems and processes for detecting this material are left up to them to implement. Current federal law does not mandate whether or how they must search for this type of imagery on their platforms, which has allowed CSAM to spread across platforms. Not surprisingly, given the 2.96 billion monthly users on Facebook alone, Meta is a large contributor to this growing problem. Meanwhile, attempts to pass new legislation that would close this loophole, like EARN IT, have not yet been successful. (Critics have argued that particular bill was itself controversial because of its potential unintended consequences for freedom of speech and consumer privacy.)

But the lack of legislation in this area has forced platforms like Meta to self-regulate when it comes to whether and how they’ll manage this type of content and others. With Congress seemingly unable to pass new laws designed for the internet age, legal liability concerns about big tech’s responsibility for the content on their platforms have now made their way to the Supreme Court. There, the justices are reviewing Section 230 of the Communications Decency Act, which was created in the internet’s early days to protect websites from legal liability for the content their users post. New cases involving Twitter and Google’s YouTube — and their accompanying recommendation algorithms — will determine whether those decades-old protections should be rolled back or even overturned. Though not related to CSAM, they’re another example of how the overall system of platform regulation is broken in the U.S.

Without guidance from the law, platforms like Meta have been making up their own rules and policies in areas around algorithm choice, design, recommendation technology and end user protections.


In recent months, Meta has been ramping up its protections for teens in anticipation of coming regulations, setting new teen accounts to private by default, applying its most restrictive settings, and rolling out a variety of safety tools and parental controls. Among these updates were features aimed at restricting adult users from contacting teens they don’t know and warning teens about adults engaged in suspicious behavior, such as sending a large number of friend requests to teen users.

Today, Meta says suspicious adults will no longer be able to see teen accounts when scrolling through the list of people who have liked a post or when looking at an account’s Followers or Following list, further cutting off their access. If a suspicious adult follows a teen on Instagram, the teen will receive a notification prompting them to review the follower and remove them. Instagram will also prompt teens to review and restrict their privacy settings, and will surface the same prompt when someone comments on their posts, tags or mentions them in a post, or includes them in Reels Remixes or Guides.

 
