Tim Cook, CEO of Apple, says that an iPhone is an extension of ourselves. The amount of personal data that we send out through our smartphones is what makes Cook’s statement so true. The phones then provide countless companies with a picture of who we are.
How are companies supposed to handle this information?
In the past I’ve examined cases of how data is collected and used through either a Utilitarian or Deontological lens: the former considers what’s best for all parties, the latter focuses on people’s basic rights. This paper will analyze Apple’s controversy with the FBI, judging Apple’s response through the lens of virtue ethics, and will explore what “privacy” really means.
On December 2nd, 2015, a man and woman committed mass murder in San Bernardino, California. The man was named Syed Rizwan Farook.
Because the attack was thought to be tied to a terrorist organization, the FBI conducted further investigation, which meant getting access to the information on Farook’s iPhone. Unfortunately, the iPhone was locked, so the FBI asked Apple for help. One would think that a company as great as Apple would comply, right? No, wrong.
Apple refused to help the FBI bypass the security feature that disables an iPhone after the passcode is entered incorrectly too many times. Instead, they released a letter to their customers, published on their website.
Is it okay?
By virtue ethics, meaning to judge a situation by the motives involved, Apple’s response appears to be in the interest of their customers’ security; however, their delivery is questionable.
The strong opposition to the FBI’s demands has been denounced by many as a marketing ploy. That may be, and so be it – it’s a great one. Apple’s well-worded response plays on arguably the largest concern in today’s tech industry: cybersecurity.
However, they use fear tactics in pitting their customers against the US government. Using a fear approach in order to defend yourself during a terrorist investigation doesn’t exactly lighten the mood. On the other hand, what better way to ease people’s concerns about overarching data access than to publicly decline government access to user data?
In their statement, Apple states, “the order would set a legal precedent that would expand the powers of the government and we simply don’t know where that would lead us.” This strikes a note that a lot of Americans love to hear: keep the government out.
While Apple could have phrased this to be a little less daunting, it does raise a good point. The US has a “free-market” mentality, and the amount of power that the government would have due to access to corporate data would be absolutely unprecedented.
In the publication on their website, Apple explains that the government never asked for direct access to the actual data within Farook’s phone. In fact, Apple can’t even see that data without the user’s consent to decrypt it. Instead, the FBI wanted Apple to create a new operating system.
“They have asked us to build a backdoor for the iPhone,” says Tim Cook in his announcement. The requested software would also allow passcodes to be entered electronically rather than by hand. This would give law enforcement – and criminals – the ability to use computing power to try passcode after passcode until one works (actually a short process).
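To see why electronic entry makes this such a short process, here is a minimal sketch (hypothetical code, not Apple’s actual mechanism) of brute-forcing a four-digit passcode. There are only 10,000 possible combinations, so even with a hardware-enforced delay on each attempt, every code falls quickly:

```python
from itertools import product

def brute_force(target: str) -> tuple[str, int]:
    """Try every 4-digit passcode in ascending order until one matches.

    Returns the passcode found and the number of attempts it took.
    """
    attempts = 0
    for digits in product("0123456789", repeat=4):
        attempts += 1
        guess = "".join(digits)
        if guess == target:
            return guess, attempts
    raise ValueError("passcode not found")

# Example with a made-up passcode:
code, tries = brute_force("7294")
# At most 10,000 combinations exist. Even assuming roughly 80 ms of
# key-derivation delay per attempt (the figure Apple cites in its iOS
# security documentation), exhausting all of them takes under 15 minutes
# once the retry limit and hand-entry requirement are removed.
```

The security feature Apple refused to bypass exists precisely because the math above is so unforgiving: the only thing standing between a thief and 10,000 guesses is the lockout.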
Apple points out that “the only way to guarantee that such a powerful tool isn’t abused and doesn’t fall into the wrong hands is to never create it.” Once again, more fear tactics here.
Privacy is becoming obsolete
While Apple is injecting fear into the what-if scenario of the government being able to peek into your iPhone, they are also doing a good job of thinking about how to handle all of the personal data they hold. Their statement was very clear: by saying no to the FBI, they were assuring consumers that they were “protecting” their privacy.
But what does that even mean? Facebook, Google, Twitter, Yahoo, Slack, and a bunch of other successful tech startups and corporations have all come out and sided with Apple. It’s as if they all took this as their chance to get on the right side of a big emerging issue: consumer privacy.
Does consumer privacy require all of these companies to keep their data to themselves? Does it require them to keep the government away from it? Apple says our privacy is protected because only they can see our data. In that case, our sense of privacy may shift from whether we are vulnerable to whom we are vulnerable.
Your data’s being used. Deal with it.
Consumers need to accept that their personal data is in the hands of companies. This is the first step in adjusting our perception of “privacy” and “protection”.
In chapter 8 of his book Big Data, Viktor Mayer-Schönberger describes a “paradigmatic” shift in our approach toward personal data moving forward. Noting that not all data is personal data, he points out that non-personal data can easily become personal when you pair it with enough other pieces of information. Suddenly, items that normally seem trivial are costing you money on insurance rates because of data points that you didn’t know existed.
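A toy sketch (with entirely made-up records) shows how this pairing works: neither table below names anyone on its own, but joining them on a couple of “trivial” fields links a purchase to a person.

```python
# Hypothetical data: an "anonymous" purchase log and a public roll.
purchases = [
    {"zip": "92408", "birth_year": 1987, "item": "cigarettes"},
    {"zip": "92410", "birth_year": 1990, "item": "vitamins"},
]
public_roll = [
    {"zip": "92408", "birth_year": 1987, "name": "J. Doe"},
]

# Join on (zip, birth_year): two non-personal fields, combined,
# are enough to re-identify a record.
linked = [
    {**p, "name": v["name"]}
    for p in purchases
    for v in public_roll
    if (p["zip"], p["birth_year"]) == (v["zip"], v["birth_year"])
]
# The cigarette purchase is now tied to a named individual - exactly
# the kind of inference an insurer could price into a premium.
```

This is only an illustration, but it captures the mechanism Mayer-Schönberger warns about: each dataset is harmless alone, and personal only in combination.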
Nobody wants this sort of technology to be used against them, but most people would agree that this sort of analysis could be great for the justice department. Then why didn’t Apple help the FBI? Well, they did.
Apple has complied with all search warrants for any data that Apple has access to. As for personal data inside someone’s cell phone, Apple doesn’t have immediate access to it because it’s encrypted, hence the reason the FBI needs help. As mentioned before, the fact that Apple declined to create something that would put consumer privacy at risk shows where Apple stands in terms of putting customers first. By virtue ethics, this is a good move.
Consumer knowledge needs work
Education will play a key role in developing our ethics moving forward. Mayer-Schönberger raises critical questions about keeping consumers informed at a time when predictive analytics finds new applications every day. He asks, “how can companies provide notice for a purpose that has yet to exist? How can individuals give informed consent to the unknown?” These questions are hard to answer, especially since most of America is already under-informed. As Mayer-Schönberger notes, moving the responsibility from the consumer to the company could make things better.
Apple’s decision to refrain from creating a new operating system for the FBI, as well as the support from Google, Facebook, Twitter, Yahoo, and other tech giants, set a precedent for all companies: put your customers first. Even under extreme circumstances, like the San Bernardino shooting, it’s important to consider the short- and long-term effects on your customers. By virtue ethics, Tim Cook gave the right response to the FBI.