Apple Is Now Scanning Your Photos To Check For Child Abuse

Apple is doubling down on its efforts to combat child abuse. The company is now more vigilant than ever and is poised to scan every photo uploaded to iCloud to check for child sexual abuse material and to help identify potential perpetrators. Apple is not alone in facing this pressure; other tech companies that offer cloud services are under similar scrutiny.

Apple unveiled the initiative at a tech conference. From this point on, any images backed up to the company’s online storage service, iCloud, will be screened for illegal material. This marks a notable shift for Apple, which has frequently clashed with authorities over its refusal to break into suspects’ phones to make investigations easier.

“Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. As part of this commitment, Apple uses image-matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled,” according to Jane Horvath, Apple’s chief privacy officer.

Horvath further explained that while removing encryption from messaging to catch criminals is not an approach Apple favors, the company is willing to use other technologies to help screen for child sexual abuse material.

It’s unclear how the company checks for child abuse images as it did not elaborate on the process.
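As a rough illustration of how known-content detection generally works in the industry, services compute a signature (hash) of each uploaded image and compare it against a database of signatures for previously identified abuse material, much as spam filters match known patterns. The sketch below is a hypothetical, simplified Swift example using an exact SHA-256 match; real systems rely on perceptual hashing (such as Microsoft’s PhotoDNA) that tolerates resizing and re-encoding, and nothing here reflects Apple’s actual, undisclosed implementation. The signature values and helper names are invented for illustration.

```swift
import Foundation
import CryptoKit

// Hypothetical illustration only: exact-hash matching against a set of
// known signatures. Production systems use perceptual hashes that survive
// image edits; Apple has not disclosed how its screening works.

/// Signatures of previously identified illegal images, as would be supplied
/// by a clearinghouse. The value below is a placeholder.
let knownSignatures: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Computes a hex-encoded SHA-256 signature of an image's raw bytes.
func signature(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Returns true if the uploaded image matches a known signature.
func matchesKnownContent(_ imageData: Data) -> Bool {
    knownSignatures.contains(signature(of: imageData))
}

// Example: screen an upload before it is stored.
let upload = Data([0x00, 0x01, 0x02]) // stand-in for real image bytes
if matchesKnownContent(upload) {
    print("Match found: flag account for review")
} else {
    print("No match")
}
```

The key design point of this kind of screening is that the service never needs to interpret the image itself; it only checks whether the signature appears in a curated database, which is why Apple compares it to spam filtering rather than to breaking encryption.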
