Apple CEO Tim Cook leans on the Founding Fathers to suggest the company did the right thing when asked by the FBI to unlock a terrorist's iPhone. It's an issue that affects IT professionals who need to protect company data, as well as consumers and their personal information.
Apple caused quite a stir earlier this year when it refused a request from the US Department of Justice to unlock a suspected terrorist's iPhone. At the time, Apple argued that the request would affect more than a single smartphone, that it was unconstitutional, and that it would weaken security for everyone.
Now, Tim Cook, Apple's CEO, tells The Washington Post that the issue boils down to civil rights.
In a wide-ranging interview that covers a host of topics, the Post's Jena McGregor dug deep into the FBI's fight with Apple and Apple's response.
"We knew it was going to be very, very difficult. And that the cards were stacked against us," responded Cook. "But we spent a lot of time on 'what is right here?'"
First, Apple set out to determine if it even could unlock the iPhone per the FBI's request. Cook explains that exploring the technical issue helped determine its response to the FBI. As Cook told the Post:
The lightbulb went off, and it became clear what was right: Could we create a tool to unlock the phone? After a few days, we had determined yes, we could. Then the question was, ethically, should we? We thought, you know, that depends on whether we could contain it or not. Other people were involved in this, too -- deep security experts and so forth, and it was apparent from those discussions that we couldn't be assured. The risk of what happens if it got out, we felt, could be incredibly terrible for public safety.
Cook goes on to suggest that people should not need to be computer experts to set up privacy and security protections -- including protections from the government. From that point of view, consumers and businesses rely on companies such as Apple to do some of the work for them. Apple provides the encryption, and iPhone owners then create their own key.
Apple doesn't believe the government should have access to that or any other key.
"In this case, it was unbelievably uncomfortable and not something that we wished for, wanted -- we didn't even think it was right. Honestly? I was shocked that [the FBI] would even ask for this," explained Cook. "That was the thing that was so disappointing that I think everybody lost. There are 200-plus other countries in the world. Zero of them had ever asked [Apple to do] this."
The FBI eventually figured out how to unlock the iPhone in question without Apple's help.
Privacy is a right to be protected, believes Cook: "In my point of view, [privacy] is a civil liberty that our Founding Fathers thought of a long time ago and concluded it was an essential part of what it was to be an American. Sort of on the level, if you will, with freedom of speech, freedom of the press."
Clearly there is room to disagree with Apple's stance, as the FBI did.
The argument becomes more interesting when one considers how it should apply to phones owned by businesses and used by executives.
Everyone knows smartphones contain troves of personal and business data. What if the government wants or needs access to a phone that contains trade secrets or other sensitive company information?
Should the phone maker be obligated to hand that data over to the government? Is it possible to protect business data while complying with law enforcement requests?
It's certainly a topic IT and company leadership should be prepared to discuss.