Caught off guard by a huge backlash, Evernote recently abandoned its plan to let staffers read customer notes under certain circumstances.
The plan, which was scheduled to go into effect in January, would have allowed staffers to review private customer notes as a means of assessing the accuracy of its new machine learning technology.
The company made an error in judgment and failed to provide the kind of transparent decision making that would let customers trust the process, Evernote CEO Chris O'Neil acknowledged.
"We communicated poorly" regarding the planned change, "and it resulted in some understandable confusion," he said, apologizing for any resulting angst.
"The change we were going to make was ... intended to allow a select set of trained data scientists to verify the veracity of the machine learning-based algorithms we would use in the future," explained Evernote spokesperson Greg Chiemingo.
"To our knowledge, there is extensive use of machine learning across many Internet services and applications. We have announced we won't move forward as planned, and we are revising the policy in the coming months," he told the E-Commerce Times.
"We will have no human review of any content without express user permission, as we do today when users ask for assistance," Chiemingo emphasized.
Evernote has espoused three laws of data protection: The data is yours; the data is protected; the data is portable.
In agreeing to Evernote's policies, customers did give the company permission to back up data, send data over a network, index data for search, and display it over various devices. The data was private by default, meaning the company would not try to make money by selling customer data. Further, the data was portable, meaning there was no lock on the content.
Evernote plans to implement machine learning technology that will automate a lot of what customers now do manually -- for example, creating to-do lists or putting together travel itineraries.
Under the newly revised policy, select Evernote employees will review random content to make sure the features are working properly, but they will not know which customers the data belongs to, O'Neil said. If the system identifies any of a customer's personal information, that information will be masked from employees.
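In broad strokes, the review process O'Neil describes amounts to sampling content at random, detaching it from account identity, and masking personal details before a human sees it. The sketch below is purely illustrative -- the function names, the regex-based masking, and the data layout are assumptions for the sake of the example, not Evernote's actual implementation.

```python
import random
import re

# Hypothetical PII patterns -- a real system would use far more robust detection.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def mask_pii(text):
    """Replace obvious personal identifiers with placeholders."""
    text = EMAIL.sub("[EMAIL]", text)
    return PHONE.sub("[PHONE]", text)

def sample_for_review(notes, k, seed=None):
    """Pick k random notes for reviewers, stripped of owner identity and PII."""
    rng = random.Random(seed)
    chosen = rng.sample(notes, k)
    # Reviewers see only the masked text -- no account identifiers attached.
    return [mask_pii(note["text"]) for note in chosen]

notes = [
    {"owner": "user-17", "text": "Call Dana at 555-123-4567 re: itinerary"},
    {"owner": "user-42", "text": "Email receipts to pat@example.com"},
    {"owner": "user-99", "text": "Draft to-do list for the conference"},
]
masked = sample_for_review(notes, k=3, seed=1)
# A sampled note now reads e.g. "Call Dana at [PHONE] re: itinerary"
```

The key design point is that identity and content are separated before review: the reviewer receives masked text only, never the `owner` field.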
Rivals offer apps that compete with Evernote, and each firm has its own policies regarding the handling of customer data and the level of artificial intelligence involved in the process.
"Microsoft is committed to maintaining our customers' privacy and preserving the ability of customers (including OneNote) to control their data," the company said in a statement provided to the E-Commerce Times by company rep Lenette Larson.
"We never review the contents of our customers' data," said Rich Siegel, CEO of Bare Bones Software.
"We also provide the option for the user to encrypt individual items using AES-256 encryption. When enabled at the customer's discretion, this ensures that their encrypted items cannot be read at all, by humans or machines, without the user's own passphrase, which we don't have access to," he told the E-Commerce Times.
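The guarantee Siegel describes rests on deriving the encryption key from a passphrase only the user knows. The sketch below illustrates that idea with standard-library PBKDF2 key stretching; it is not Bare Bones' actual code, and the AES-256 encryption step itself is omitted -- the point is simply that the server, holding only salt and ciphertext, can never reconstruct the key.

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    """Stretch a passphrase into a 32-byte (256-bit) key via PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

salt = os.urandom(16)           # stored alongside the ciphertext
key = derive_key("correct horse battery staple", salt)

assert len(key) == 32           # 256 bits, suitable for AES-256
# A different passphrase yields an unrelated key, so neither humans nor
# machines on the server side can decrypt the item without the passphrase.
assert derive_key("wrong guess", salt) != key
```

Because the derivation is deterministic, the user can always regenerate the key from the passphrase, while the service itself stores nothing from which the key can be recovered.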
"Our customers' data belongs exclusively to our customers. We do not consider our customers' data to be an asset, and it is not for sale or trade," Siegel emphasized. "Our business is helping our customers to do their work and achieve their goals. We have no aspirations to do data mining of customer data, and we can't see any benefit, in our business model, that would make mining our customer data beneficial for them, or us."
Although some Evernote users were riled by the privacy controversy, others viewed it as a tempest in a teapot.
Tirias Research Principal Analyst Paul Teich is one Evernote premium customer who assumed his content would be subject to review.
"The whole point of Evernote is that I scrape and collect content I find interesting so that I have a very rich search environment for information I value that may disappear from the Web at any time," he told the E-Commerce Times. "It's great for archiving product and service prices and features with very little effort."
For any service to know whether a machine learning model is working, a human must read through a representative sample of the content -- at least for the next 10 years or so, Teich estimated.
"At some point, machines will be able to train machines, but we aren't there yet," said Rob Enderle, principal analyst at the Enderle Group. "However, any time you have an employee review a customer's information, the customer must opt in. Otherwise, it's a clear violation of their privacy."