Friday, September 3, 2021

Apple Is Listening To You - CSAM Delayed (Or Shelved), Other Ways Apple Heeded Public Opinions

Apple surprised pretty much everyone when it announced that it would scan photos for evidence of child sexual abuse. My first thought, and I am sure it was the same for most other Apple followers, was that it seemed like a bad idea, one quickly followed by "the road to hell is paved with good intentions". Now, another surprise: Apple has backed down from following through, for now. It shows that Apple has finally learned a valuable lesson. Perhaps Apple could have found a better way to implement scanning for Child Sexual Abuse Material (CSAM).

Last month, Apple announced an upcoming feature to scan photos stored on iPhones for child sexual abuse imagery. You have likely seen CSAM all over the news and on the Internet. What has not been talked about enough is what this technology could offer users in general. For something like this, it means that Apple has developed algorithms that can recognize certain elements in photos, going beyond identifying people, objects, and places.
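At its core, the scheme Apple described is hash matching: a fingerprint of each photo is compared against a database of fingerprints of known illegal images. The sketch below is a deliberately simplified illustration of that idea; the `KNOWN_HASHES` set and the use of a cryptographic hash are stand-ins I made up for demonstration. Apple's actual system (NeuralHash) uses perceptual hashes that tolerate resizing and re-encoding, plus cryptographic machinery so matches are only revealed past a threshold.

```python
import hashlib

# Hypothetical blocklist of digests of known images. In a real system this
# would be a database of perceptual hashes supplied by child-safety
# organizations, not cryptographic hashes computed here.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True if this image's digest appears in the blocklist.

    A cryptographic hash only matches byte-for-byte identical files;
    it is used here purely to keep the illustration self-contained.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_material(b"example-known-image-bytes"))  # True
print(matches_known_material(b"some-other-photo"))           # False
```

The key design point, and the source of much of the controversy, is that the matching was to happen on the user's device rather than on Apple's servers.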

For many, I suppose it can be a great feature to have. I have gone through my photo library by the people in it or the places I have visited over the years. Not once had it dawned on me that Apple might know more about me than I expected, though not to the degree that Google does, which arguably knows more about me than I know about myself. Given Apple's stance on privacy, I still think that Apple has kept certain doors closed to itself about us.

What remains true is that it is Apple, not the user, who still holds the keys to the privacy door. At any time, whether through corporate needs, government decrees, or anything else that could compel Apple to change its policy or view on privacy, that door can be opened. Perhaps a little at a time.

Apple's about-face here shows that it is listening to users. What we do not know is why. Has Apple decided this is a feature it is still hashing out and needs more time to think through, or will it bring it back later once all the uproar has died down?

How about other features and products that Apple makes? Will Apple now be more receptive to what the public wants, at least to a point? It will be interesting to see. There are times when it seems that Apple figures it knows what is best for consumers and will not add a feature that many have requested, or adds it only after years of waiting. One example is being able to chat with multiple users on FaceTime. It took years for Apple to finally implement that.

Then there was the MacBook keyboard debacle of 2016-2018. Apple finally gave up on the butterfly keyboards and moved on to give users what they want - you know, keys that can actually be typed on. And there is the matter of ports, or the lack thereof, on the MacBooks. Not only did Apple take away ports, but doing so disrupted critical workflows for professional users who had come to depend on them.

That Apple finally came around on CSAM should not be surprising. Apple under Tim Cook has always charted a course to avoid controversy, whether in the US or in places like China. I believe Apple does think it is doing good with CSAM scanning. I think it just needs to take public opinion into account and really make sure the feature is fully baked.
