Meta’s huge call on sending nude images announced

Meta has announced some huge changes regarding the sending and receiving of nude images over Facebook and Instagram.

Meta has made a huge call. Picture: iStock

Meta is introducing measures that will make it harder for people to send and receive nudes.

The new safety tool is aimed at teens, but adults will also be able to turn it on across their Facebook and Instagram accounts.

The move comes after Meta faced criticism from governments and police for encrypting Messenger chats by default, which some say makes it more difficult for authorities to detect child abuse.

The company claims the new feature targeting nude images has been designed solely to protect users, particularly women and teenagers, from being sent nude images or being pressured into sending them.

Children aged under 13 are not allowed to use any of Meta’s platforms.

The new measures will mean teenagers will be unable to receive explicit material via Messenger on Facebook or Instagram.

Meta said the tool will also “discourage” this age group from sending such images, but did not specify exactly how this would occur.

Adults will also be able to use the new tools to keep themselves safe online.

Meta made the announcement on its blog. Picture: iStock

The company also announced that minors would, by default, no longer be able to receive messages on Instagram or Facebook Messenger from strangers.

“Today we’re announcing an additional step to help protect teens from unwanted contact by turning off their ability to receive DMs from anyone they don’t follow or aren’t connected to on Instagram – including other teens – by default,” Meta said in a statement.

“Under this new default setting, teens can only be messaged or added to group chats by people they already follow or are connected to, helping teens and their parents feel even more confident that they won’t hear from people they don’t know in their DMs.

“Teens in supervised accounts will need to get their parent’s permission to change this setting.

“This default setting will apply to all teens under the age of 16 (or under 18 in certain countries).

“In addition, we’re planning to launch a new feature designed to help protect teens from seeing unwanted and potentially inappropriate images in their messages from people they’re already connected to, and to discourage them from sending these types of images themselves.”

There will be big changes for the way people send and receive explicit content. Picture: iStock

Legal filings that were recently made public as part of a US lawsuit against Meta allege that company documents show an estimated 100,000 teen users of Facebook and Instagram are sexually harassed online every day.

The company responded by stating that the lawsuit was ‘mischaracterising’ its work.

This new safety tool will also work in encrypted chats, with more details to be released later this year.

Meta’s recent decision to protect Facebook Messenger chats with end-to-end encryption (E2EE) by default has been fiercely criticised by governments, police and children’s charities.

End-to-end encryption means only the sender and recipient can read a message, which critics say leaves Meta unable to spot and report child abuse material in chats.
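To illustrate the basic idea only (this is a generic sketch using the open-source PyNaCl library, not Meta’s actual system, and the key names are invented for the example), a message encrypted with a recipient’s public key can only be read by the holder of that recipient’s private key:

from nacl.public import PrivateKey, Box

# Hypothetical sender and recipient each generate a key pair on their own devices
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts using their private key and the recipient's public key
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"photo bytes or message text")

# Anyone in the middle, including the platform, sees only ciphertext;
# only the recipient's private key can recover the original message
receiving_box = Box(recipient_key, sender_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"photo bytes or message text"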

Other messaging apps, such as Apple’s iMessage, Signal and Meta-owned WhatsApp, already use end-to-end encryption and have strongly defended the technology.

However, some critics say such platforms should use a technique known as ‘client-side scanning’ to detect child abuse material in encrypted apps.

The changes will be rolled out on Instagram and Facebook. Picture: iStock

This system would scan messages for any matches with known child abuse images before they are encrypted and sent.

It would also mean an automatic report of any suspected illegal activity would be sent to the company.
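As a rough illustration only (real systems such as PhotoDNA rely on perceptual hashing and dedicated reporting pipelines; the exact SHA-256 match, hash list and function names below are simplifications invented for this sketch), client-side scanning checks an outgoing image against a list of known hashes on the device, before anything is encrypted:

import hashlib

# Hypothetical database of hashes of known abuse images (placeholder value only)
KNOWN_IMAGE_HASHES = {
    "placeholder_hash_value",
}

def matches_known_image(image_bytes: bytes) -> bool:
    # Hash the image locally, on the sender's device, before encryption
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES

def send_image(image_bytes: bytes) -> None:
    if matches_known_image(image_bytes):
        # A real deployment would automatically file a report with the company
        print("Match found: image blocked and reported")
        return
    # Only after the local check passes would the image be encrypted and sent
    print("No match: image encrypted and sent")

send_image(b"example image bytes")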

On its blog, Meta says it has introduced more than 30 tools and resources to help keep children safe, and claims more measures will be introduced over time.

According to Australia’s eSafety Commissioner, it is estimated that in 50% to 70% of cases of online child sexual abuse, the abuser is someone who is known to the child.

The Commissioner states that children who have been sexually abused online are four times more likely to experience mental health problems, both immediately after the abuse and throughout their lives.
