Apple launching Messages feature that warns kids about nude images

A new feature in Apple’s iOS 15.2 update will warn users under 18 years of age about incoming messages that could contain nudity.

The feature is designed to warn children before receiving or sending nude images in the Messages app.

Parents can enable the feature on their child’s phone. If the child receives a photo that may contain nudity, the image is blurred and a warning appears before the child can open it. The feature does not, however, prevent the image from being opened.

According to Apple’s website, “Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.”
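
Apple has not published the code behind this check, but the general shape of an on-device scan can be sketched. The hypothetical Swift example below uses SensitiveContentAnalysis, a separate public framework Apple later shipped in iOS 17 that performs the same kind of on-device nudity detection for third-party apps; the function name and blur-decision logic are illustrative assumptions, not Apple’s actual Messages code.

```swift
import Foundation
import SensitiveContentAnalysis

// Hypothetical sketch only: NOT Apple's private Messages implementation.
// SensitiveContentAnalysis (iOS 17+) exposes a comparable on-device check
// to third-party apps and requires the
// com.apple.developer.sensitivecontentanalysis.client entitlement.
@available(iOS 17.0, *)
func shouldBlurIncomingImage(at fileURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If no sensitive-content setting is enabled on the device,
    // the policy is .disabled and there is nothing to analyze.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis runs entirely on-device; the image is never sent to
        // Apple, consistent with the statement quoted above.
        let analysis = try await analyzer.analyzeImage(at: fileURL)
        return analysis.isSensitive
    } catch {
        // If analysis fails, default to showing the image un-blurred.
        return false
    }
}
```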

A companion notification feature was initially designed to let parents of children under 13 opt in to receive an alert if their child received a nude image in Messages. However, Apple has dropped that parental notification.

We spoke with cybersecurity expert Alan Crowetz, CEO of Infostream, Inc., about the tool. He said, “I personally think they should have left it in. What’s the point of having this unless the parents are somehow notified? Maybe not in detail, but maybe at least a red flag should go up? Parents should be checking the phone.”

Elissa Redmiles, a privacy scholar at the Max Planck Institute for Software Systems, explained, “Not all children have parental relationships that are safe. And so if a parent finds out something about their child, like sexual identity, as a result of that notification, that can be really dangerous. It could lead to them getting kicked out of the home or physical violence.”

This Messages feature is separate from a controversial scanning feature that was intended to detect child sexual abuse imagery in photos on your phone and in iCloud Photos.

The company previously announced a broader child-safety initiative that included several more features and tools to detect child sexual abuse material (CSAM). Following an outcry from security and privacy experts, who warned the technology could be exploited for other surveillance purposes by hackers and intrusive governments, much of the plan was delayed.

Apple insisted its technology had been developed in a way that would protect the privacy of iPhone owners in the U.S. But the Cupertino, California, company was swamped with criticism from security experts, human rights groups and customers worried that the scanning technology would open a peephole exposing personal and sensitive information.

Apple traditionally has rejected government demands for data and access to devices that it believes are fishing expeditions or risk compromising the security of its customers or devices.

There has been no update since on when, or whether, that scanning feature will ever be implemented.

Copyright ©2024 Fort Myers Broadcasting. All rights reserved.

This material may not be published, broadcast, rewritten, or redistributed without prior written consent.