
Apple Surprises - Apple Is Finally Listening to the Public (CSAM and Future Features and Products)

Apple surprised pretty much everyone when it announced that it would scan photos for evidence of child sexual abuse. My first thought, and I am sure it was the same for most other Apple followers, was that it seemed like a bad idea, one that would quickly be followed by "the road to hell is paved with good intentions". Now, another surprise: Apple has backed down from following through, for now. It shows that Apple has finally learned a valuable lesson. Perhaps Apple could still have implemented the photo scanning for Child Sexual Abuse Material (CSAM). Last month, Apple announced an upcoming feature to scan photos stored on iPhones for CSAM. You have likely seen CSAM all over the news and on the Internet, but it really has not been talked about enough in the context of what it could offer users in general. For something like this, it means that Apple has developed algorithms that can recognize certain elements in photos that go beyond identifying people, obje...