Meta tool to block nude images in teens’ private messages

Meta has said it will launch a new safety tool later this year to block children from receiving nude images, and to discourage them from sending them, including in encrypted chats.

The tool is likely to be optional and available to adults too on Instagram and Facebook.

It follows criticism from government and police after Meta started to encrypt Messenger chats by default.

They say encryption will make it harder for the firm to detect child abuse.

According to Meta, the new feature is designed solely to protect users, particularly women and teenagers – under-13s are not allowed to use its platforms – from being sent nude images or being pressured into sending them.

It also announced that minors would, by default, be unable to receive messages on Instagram and Messenger from strangers.

Earlier this month, police chiefs said youngsters sending nude images contributed to a rise in sexual offences committed by children in England and Wales.

And legal filings recently made public as part of a US lawsuit against Meta allege company documents show an estimated 100,000 teenage users of Facebook and Instagram are sexually harassed online every day. Meta says the lawsuit mischaracterises its work.

But on Thursday the tech giant revealed a planned new feature to help protect teenagers from inappropriate images in their messages.

This system will also work in encrypted chats, with more details to be revealed later this year.

Meta’s recent decision to protect Facebook Messenger chats by default with end-to-end encryption (e2ee) has been fiercely criticised by government, police and leading children’s charities.

E2ee means only the sender and recipient can read messages, which, critics say, means Meta cannot spot and report child abuse material sent in them.

Other messaging apps, such as Apple’s iMessage, Signal and Meta-owned WhatsApp, already use e2ee and have strongly defended the technology.

However, some critics say platforms should deploy a technique called client-side scanning to detect child abuse being sent via encrypted apps.

Client-side scanning refers to systems on a user’s device that scan messages for matches with known child abuse images before they are encrypted and sent, and report any that contain suspected illegal content to the company.
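To make the idea concrete, here is a minimal, hypothetical sketch of that on-device check. It is not Meta’s implementation: real systems use perceptual hashes (such as Microsoft’s PhotoDNA) so that resized or re-encoded copies still match, whereas the plain SHA-256 lookup, the placeholder hash list and the encrypt_and_send/report_to_platform callbacks below are illustrative assumptions only.

```python
import hashlib
from typing import Callable, Set

# Hypothetical on-device list of hashes of known abuse images.
# Real deployments use perceptual hashes (e.g. PhotoDNA) so that resized or
# re-encoded copies still match; a plain SHA-256 only catches byte-identical
# files and is used here purely to illustrate the flow.
KNOWN_IMAGE_HASHES: Set[str] = {"0" * 64}  # placeholder entry

def matches_known_image(image_bytes: bytes) -> bool:
    """Check the outgoing image against the local hash list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_IMAGE_HASHES

def send_image(image_bytes: bytes,
               encrypt_and_send: Callable[[bytes], None],
               report_to_platform: Callable[[bytes], None]) -> None:
    """The scan happens on the sender's device, before the message
    enters the end-to-end encrypted pipeline."""
    if matches_known_image(image_bytes):
        report_to_platform(image_bytes)  # suspected matches are reported to the company
    else:
        encrypt_and_send(image_bytes)    # everything else is encrypted and sent as normal
```

The key point critics make is that the check runs before encryption, so the platform can still act on illegal content; the key point opponents make is that the same mechanism could, in principle, be pointed at anything on the device.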

Children’s charity the NSPCC has suggested Meta’s new system “shows that compromises that balance the safety and privacy rights of users in end-to-end encrypted environments are possible”.

According to Meta, its new feature is not client-side scanning, which it believes undermines the chief privacy-protecting property of encryption: that only the sender and recipient know the contents of messages.

It will use machine learning only to identify nudity and will work entirely on the device, the BBC understands. According to Meta, using machine learning to identify child abuse is much harder, and attempting it across its billions of users would carry a serious risk of errors, potentially leading to innocent people being reported, with grave consequences.
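Meta has not published how its nudity detector works beyond saying it runs on the device, so the sketch below is only an illustration of that general pattern, written in Python with PyTorch. The model file nudity_classifier.pt, the single-logit output and the 0.8 threshold are assumptions; the point is simply that the photo is analysed locally and never leaves the handset unencrypted for analysis.

```python
import torch
from PIL import Image
from torchvision import transforms

# Hypothetical stand-ins: Meta has not published its model or settings.
MODEL_PATH = "nudity_classifier.pt"  # compact TorchScript model shipped with the app
THRESHOLD = 0.8                      # assumed decision threshold

preprocess = transforms.Compose([
    transforms.Resize(224),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])

model = torch.jit.load(MODEL_PATH)
model.eval()

def probably_nude(image_path: str) -> bool:
    """Run the on-device classifier and compare against the threshold."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)   # shape (1, 3, 224, 224)
    with torch.no_grad():
        logit = model(batch)                 # assumed to return a single logit
    return torch.sigmoid(logit).item() > THRESHOLD

def handle_incoming(image_path: str) -> None:
    # A real client would blur the image behind a warning rather than print.
    if probably_nude(image_path):
        print("Nudity suspected: image hidden behind a warning screen.")
    else:
        print("Image displayed normally.")
```

In a shipping app the flagged image would be blurred behind a tappable warning with safety advice; the print statements above only mark where that interface decision would sit.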

Instead, Meta says, a range of systems that do not undermine privacy are used to protect children, including:

  • Systems that identify adults behaving suspiciously and stop them from interacting with under-18s or from finding and following other suspect adults.
  • Measures that prevent adults from contacting minors, such as limits on over-18s’ ability to message teenagers who do not follow them.

New tools

Meta argues it has introduced over 30 tools and resources to help keep children safe, and on Thursday it also revealed a number of new child safety features.

By default children will be unable to receive messages on Instagram or Facebook Messenger from people they do not follow or are not connected to, it announced.

Meta policy already stops adults from messaging teenagers who do not follow them.

“Under this new default setting, teens can only be messaged or added to group chats by people they already follow or are connected to,” Meta blogged.

The parental supervision tools will now also give parents the ability to deny a teenager’s request to change their default safety settings – such as who can direct message them or whether they can view more sensitive content. Previously, parents were merely notified of such a change.
