Saturday, February 20, 2016

FBI Mishandled iPhone Belonging To San Bernardino Terrorist and Locked Itself Out

Apple and the FBI, and, by extension, the rest of the federal government, have been locked in an epic battle that will define what privacy in America means. Parsing the carefully worded statements from both sides and cutting through the smoke created by the media and tech pundits, it is clear the government wants a backdoor, a key to the whole iOS walled garden, so it can access user data whenever it wants, while Apple is fighting to prevent that from happening.

Despite claims to the contrary, making tech companies build backdoors into their devices and platforms is the ultimate goal of the government. However, a recent development in the fight between Apple and the FBI shows that the government cannot be trusted to use the technology correctly, much less responsibly.

The government has a tough job fighting crime and keeping the public safe in light of recent acts of terror on both sides of the Atlantic. With both the Paris and San Bernardino attacks fresh in the minds of the public, the government is making a push to gain a technical advantage over the increasing use of encryption in operating systems and apps that prevents government intrusion.

While Apple believes it stands on the right side of the issue, the bottom line figures into Apple's opposition as well. Apple's CEO, Tim Cook, has made privacy a theme that Apple takes to heart and a reason users should buy iPhones and iPads over competing devices where intrusion on almost every level is expected (whether that claim holds up is still subject to debate).

For the government, in a presidential election year, this seems like the right time to change the dynamics of this discussion over backdoor access, use of encryption, and striking the right balance between privacy and protecting the public from acts of terrorism.

What Tim Cook said about backdoors is correct: if the government has access, any other government, criminal syndicate, hacker group, or anyone else, regardless of intent, will find their way to these backdoors as well. On the whole, it makes user data vulnerable. It would give access not only to the good guys with good intentions but to the bad guys as well. Any claim that the government would protect the key as closely as Apple or Google, should they ultimately be forced to hand over the pertinent information, is false.

Consider the latest development in the fight over the iPhone 5C used by one of the San Bernardino terrorists: the government mishandled the iPhone by resetting the Apple ID password associated with it, which prevented the device from making a fresh iCloud backup, locked the government out of the latest data on the device, and led it to go to Apple. If the government's own experts could not even handle a routine, everyday task the majority of iPhone users are capable of performing, how can Apple expect the government to safeguard a backdoor and make sure it is not misused or mishandled?
