Almost lost among the deluge of new features in the upcoming version of iOS Apple touted last week was the company's announcement about privacy.
"All this great work in iOS 10 would be meaningless to us if it came at the expense of your privacy," Craig Federighi, Apple's senior vice president for software engineering, told attendees at the company's Worldwide Developers Conference in San Francisco.
"We believe you should have great features and great privacy," he said. "You demand it, and we are dedicated to providing it."
Apple offers end-to-end encryption by default in apps like FaceTime, Messages and HomeKit, and it performs data crunching at the device level, with the data remaining under a user's control.
When gathering data about how its customers use their devices, Apple uses a technology called "differential privacy."
"Differential privacy is a research topic in the area of statistics and data analytics that uses hashing, subsampling and noise injection to enable this kind of crowdsourced learning while keeping the information of each individual user completely private," Federighi explained.
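Apple has not published its implementation, but the "noise injection" Federighi mentions is typically done with the standard Laplace mechanism from the differential-privacy literature. The sketch below is illustrative, not Apple's code; the function names and the example counting query are assumptions:

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse transform sampling."""
    u = random.random()
    while u == 0.0:          # avoid log(0) on the zero-probability edge case
        u = random.random()
    u -= 0.5                 # now uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(bits, epsilon):
    """Release a count of 1-bits with noise calibrated to sensitivity 1.

    Adding or removing one user changes the count by at most 1, so
    Laplace noise with scale 1/epsilon gives epsilon-differential privacy:
    no single user's bit is recoverable from the released number.
    """
    return sum(bits) + laplace_noise(1.0 / epsilon)

# Hypothetical per-user bits, e.g. "uses feature X":
reports = [1, 0, 1, 1, 0, 1]
noisy = private_count(reports, epsilon=0.5)  # true count 4, plus noise
```

Smaller values of epsilon mean more noise and stronger privacy; the aggregate statistic stays useful because the noise averages out over many users.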
Despite its potential benefits, differential privacy is not free of controversy.
"What Apple is doing is really neat -- they're trying to make things more private. But if they're going to be collecting a lot of data, it's good to know what they're going to do with that data, and we don't," noted Matthew Green, a computer science professor at Johns Hopkins University.
"We don't know much about what Apple is doing. They seem to be doing something very much like what Google is doing," he told the E-Commerce Times. "Hopefully, they'll publish more details as we get closer to the release of iOS 10, but right now there are ways to get it wrong and ways to get it right, and we just don't know how Apple is doing it."
Google has been using differential privacy in its RAPPOR (Randomized Aggregatable Privacy-Preserving Ordinal Response) project since the fall of 2014.
"Building on the concept of randomized response, RAPPOR enables learning statistics about the behavior of users' software while guaranteeing client privacy," noted Úlfar Erlingsson, Google's tech lead manager for security research.
"The guarantees of differential privacy, which are widely accepted as being the strongest form of privacy, have almost never been used in practice despite intense research in academia," he continued. "RAPPOR introduces a practical method to achieve those guarantees."
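In broad strokes, a RAPPOR client encodes a value into a small Bloom filter and then randomizes the bits before reporting, so any single report is plausibly deniable. This is a heavily simplified sketch of that idea, not Google's actual implementation; the parameters and hashing scheme here are assumptions:

```python
import hashlib
import random

def bloom_bits(value, num_bits=16, num_hashes=2):
    """Set the bits of a small Bloom filter for a string value."""
    bits = [0] * num_bits
    for i in range(num_hashes):
        digest = hashlib.sha256(f"{i}:{value}".encode()).digest()
        bits[digest[0] % num_bits] = 1
    return bits

def rappor_style_report(value, f=0.5):
    """Randomize each Bloom bit before sending it to the collector.

    With probability f the true bit is replaced by a fair coin flip,
    so the server cannot tell whether any individual bit is real.
    Population statistics are recovered only by aggregating and
    correcting for the known randomization rate.
    """
    report = []
    for bit in bloom_bits(value):
        if random.random() < f:
            report.append(random.randint(0, 1))  # coin flip
        else:
            report.append(bit)                   # truthful bit
    return report
```

With `f=0` the report is the raw Bloom filter; as `f` approaches 1, individual reports become pure noise and only very large samples yield usable statistics.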
Differential privacy has its roots in survey methods developed in the 1960s to get honest answers to sensitive questions, according to Joseph Lorenzo Hall, chief technologist at the Center for Democracy & Technology.
For example, if you wanted to find out how many people in a sample ever had a sexually transmitted disease, a respondent would be told to flip a coin. If heads appeared, the respondent would answer yes. If tails appeared, the respondent would answer truthfully.
"Effectively, this meant that any given Yes response was completely deniable by the respondent, preserving the privacy of those who answered Yes," Hall wrote.
However, because roughly half the sample answered yes by mandate, surveyors could still extract meaningful data. If 100 people were asked the STD question, about 50 yes answers could be discounted as coin-mandated; the remaining half answered truthfully, so the yes rate among them closely tracks the true prevalence.
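The survey protocol Hall describes can be simulated directly. This sketch follows the coin-flip rule from the article (heads: answer yes; tails: answer truthfully) and inverts the known bias to recover the true rate; the simulated 10 percent prevalence is an illustrative assumption:

```python
import random

def respond(has_condition):
    """One respondent's deniable answer: heads -> yes, tails -> truthful."""
    if random.random() < 0.5:   # heads
        return True
    return has_condition        # tails: honest answer

def estimate_prevalence(responses):
    """Recover the true rate from the deniable answers.

    Expected yes-rate = 0.5 + 0.5 * true_rate, so invert:
    true_rate ~= 2 * yes_rate - 1 (clamped at zero for small samples).
    """
    yes_rate = sum(responses) / len(responses)
    return max(0.0, 2 * yes_rate - 1)

# Simulate 100,000 respondents with a true 10% prevalence.
random.seed(1)
population = [random.random() < 0.10 for _ in range(100_000)]
answers = [respond(p) for p in population]
estimate = estimate_prevalence(answers)
```

No individual yes answer reveals anything, yet the aggregate estimate lands close to the true 10 percent, which is exactly the trade-off differential privacy formalizes.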
"To me, the question is how effectively does it enhance privacy?" asked Ben Desjardins, director of security solutions at Radware.
"If it is unproven, using it as a primary means of protecting privacy could create some risk," he told the E-Commerce Times.
It's too soon to tell whether differential privacy will work the way Apple envisions, said Bob Ertl, senior director of product management at Accellion.
"However, that also means harsh criticisms against Apple and this technology are premature," he told the E-Commerce Times.
"What we do know is that Apple has arguably overdemonstrated that it is a passionate and vociferous advocate for consumer privacy," Ertl continued. "Therefore, I don't think it's very likely that they are going to undermine the trust they have generated among the more than 1 billion people using their devices with this technology," he added. "Privacy has become Apple's brand, and I'm sure the company will take every measure necessary not to compromise such a core principle."