New Meta privacy updates for teens
Meta, the parent company of Facebook and Instagram, has released new privacy updates for everyone under 16 years old (under 18 in some countries).
A new privacy default. Teens will now get more private default settings when they sign up for Facebook. Meta suggests that teens already using Facebook update these settings manually. The affected privacy settings are:
- Who can see their friends list
- Who can see the people, Pages, and lists they follow
- Who can see posts they're tagged in
- Reviewing posts they're tagged in before those posts appear on their profile
- Who is allowed to comment on their public posts
Restricting connections. Meta is currently testing ways to protect teens from messaging suspicious adults. Those adults will not be displayed in teens' People You May Know recommendations. Meta clarifies that a suspicious account is one belonging to an adult that may have been recently blocked or reported by a young person. Meta is also testing removing the message button on teens' Instagram accounts when those accounts are viewed by suspicious adults.
New safety tools. Meta is working on new safety tools. Meta states that it encourages teens to report accounts after they block someone, and sends them safety notices with instructions on how to handle inappropriate messages from adults. More than 100 million Messenger users saw safety notices in a single month in 2021. Meta also says it has made its reporting tools easier to find, and as a result saw more than a 70% increase in reports sent by minors in Q1 2022 compared to the previous quarter on Messenger and Instagram DMs.
Stopping the spread of sensitive images. Meta is also developing new tools to stop teens from having intimate images spread online. Meta states:
We are collaborating with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried that intimate images they have created might be shared on public internet platforms without their consent. This platform will work similarly to our existing tools for preventing the non-consensual sharing of adults' intimate images, and will help keep teens' intimate images from being uploaded online. It will also be available to other companies across the tech industry. We have been working closely with NCMEC experts, academics, and parents to help teens regain control over their content in these horrific situations. We'll share more information about this new resource in the coming weeks.
Meta is also partnering with Thorn and its NoFiltr brand to create educational materials that reduce the shame and stigma around intimate images and empower teens to seek help if they have shared such images or are being sextorted.
Dig deeper. Meta advises anyone looking for support or information about sextortion to visit its education and awareness resources, including the Stop Sextortion hub on the Facebook Safety Center. The full announcement can be found on Meta's blog.
Why we care. It is hard to fault Meta for taking steps to protect teens from harm. Teens who sign up will be defaulted into the new settings automatically, though they can opt out at any time. Teens already on the platform may have to select the new options manually, which some may not do.
Parents of teens should be aware of these changes and can take appropriate precautions to protect their children.
The post New Meta privacy updates for teens first appeared on Search Engine Land.