EU Data Protection Law May End The Unknowable Algorithm

Slated to take effect as law across the EU in 2018, the General Data Protection Regulation could require companies to explain their algorithms to avoid unlawful discrimination.

Europe's data protection rules have established a "right to be forgotten," to the consternation of technology companies like Google that have built businesses on computational memory. The rules also outline a "right to explanation," by which people can seek clarification about algorithmic decisions that affect them.

In a paper published last month, Bryce Goodman, Clarendon Scholar at the Oxford Internet Institute, and Seth Flaxman, a post-doctoral researcher in Oxford's Department of Statistics, describe the challenges this right poses to businesses and the opportunities it presents to machine learning researchers to design algorithms that are open to evaluation and scrutiny.

The rationale for requiring companies to explain their algorithms is to avoid unlawful discrimination. In his 2015 book The Black Box Society, University of Maryland law professor Frank Pasquale describes the problem with opaque programming.

"Credit raters, search engines, major banks, and the TSA take in data about us and convert it into scores, rankings, risk calculations, and watch lists with vitally important consequences," Pasquale wrote. "But the proprietary algorithms by which they do so are immune from scrutiny."

Several academic studies have already explored the potential for algorithmic discrimination.

A 2015 study by researchers at Carnegie Mellon University, for example, found that Google showed ads for high-income jobs to men more frequently than to women.

That's not to say Google did so intentionally. But as other researchers have suggested, algorithmic discrimination can be an unintended consequence of reliance on inaccurate or biased data.

Google did not immediately respond to a request to discuss whether it changed its advertising algorithm in response to the research findings. 

A 2014 paper from the Data & Society Research Institute echoes the finding that inappropriate algorithmic bias tends to be inadvertent. It states, "Although most companies do not intentionally engage in discriminatory hiring practices (particularly on the basis of protected classes), their reliance on automated systems, algorithms, and existing networks systematically benefits some at the expense of others, often without employers even recognizing the biases of such mechanisms."

Between Europe's General Data Protection Regulation (GDPR), which is scheduled to take effect in 2018, and existing regulations, companies would do well to pay more attention to the way they implement algorithms and machine learning.

But adhering to the rules won't necessarily be easy, according to Goodman and Flaxman. They note that excluding sensitive data having to do with race or religion, for example, doesn't guarantee that algorithms will return unbiased results. That's because other, apparently neutral data points, like geographic area of residence, can correlate with sensitive data.
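
To make the proxy problem concrete, here is a minimal sketch using synthetic data and scikit-learn. The scenario and every variable name are illustrative, not drawn from Goodman and Flaxman's paper: a model is trained without the sensitive attribute, yet still produces different outcomes across groups, because a retained feature correlates with that attribute.

    # Illustrative only: synthetic data showing how a proxy feature
    # reintroduces bias after the sensitive attribute is excluded.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    # Hypothetical sensitive attribute; never given to the model.
    group = rng.integers(0, 2, n)

    # "Postal area" matches group membership 80% of the time.
    postal_area = np.where(rng.random(n) < 0.8, group, 1 - group)

    # Historical outcomes that were biased against group 1.
    income = rng.normal(55 - 5 * group, 10, n)
    approved = (income > 52).astype(int)

    # Train only on features that look neutral.
    X = np.column_stack([postal_area, rng.normal(0, 1, n)])
    model = LogisticRegression().fit(X, approved)

    for g in (0, 1):
        rate = model.predict(X)[group == g].mean()
        print(f"predicted approval rate, group {g}: {rate:.2f}")

Even though the model never sees the group label, its predictions differ systematically between the two groups, which is exactly the correlation problem the authors describe.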

What's more, the researchers observe that many large data sets are the product of multiple smaller data sets, making it difficult, if not impossible, for organizations to vouch for the integrity, accuracy, and neutrality of their data.

"The GDPR thus presents us with a dilemma with two horns: Under one interpretation the non-discrimination requirement is ineffective, under the other it is infeasible," write Goodman and Flaxman.

In a phone interview, Lokke Moerel, senior of counsel at Morrison & Foerster, said the provision on automated decision-making is not new. Under the current Data Protection Directive, the 1995 rules the GDPR will replace, companies already have to inform individuals about the underlying logic involved in their automated decisions.

Moerel acknowledged the difficulties of the rules, noting that in an era when algorithms are dynamic and self-learning, it's very difficult to know how an algorithm made a decision at any given point in time, let alone communicate this to an individual in a meaningful manner. If that logic is incomprehensible to the vast majority of people, the question becomes: What is the added value of providing this information in the first place?
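
One engineering response to that reconstruction problem, offered here as an assumption rather than anything Moerel or the regulation prescribes, is to record the exact inputs and a fingerprint of the model version alongside every automated decision, so the decision can be traced even after the model has retrained. A minimal sketch:

    # Illustrative sketch: log enough context with each automated
    # decision to reconstruct it later, even after the model changes.
    import hashlib
    import json
    import pickle
    from datetime import datetime, timezone

    def log_decision(model, features: dict, outcome) -> dict:
        """Append an audit record tying a decision to an exact model version."""
        record = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            # Hash of the serialized model identifies its state at decision time.
            "model_version": hashlib.sha256(pickle.dumps(model)).hexdigest(),
            "features": features,   # the inputs the decision was based on
            "outcome": outcome,
        }
        # Append-only file; a production system would need durable storage.
        with open("decision_log.jsonl", "a") as f:
            f.write(json.dumps(record) + "\n")
        return record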

Moerel said she found it troubling that algorithms can end up being discriminatory through data correlation. As an example, she noted that an insurance company charging higher premiums in a certain region because of higher accident rates could end up discriminating against a specific ethnic group that happens to live in that area. She also suggested there's a risk that companies may try to hide such discriminatory correlations by performing further analytics and finding other non-sensitive data points that they know are correlated with the sensitive data. Requiring the disclosure of algorithmic logic guards against such action, she said.

To avoid being questioned about algorithmic logic, Moerel suggested companies give individuals affected by their decisions more control over how their data are used (e.g., by giving them control over their ad preferences, letting them view and adjust the indicators that triggered a particular advertisement).
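
A rough sketch of what such a control could look like in code follows; the article describes no specific system, so the indicator names and interface are hypothetical.

    # Hypothetical sketch of user-controllable ad-targeting indicators,
    # in the spirit of Moerel's suggestion. All names are illustrative.
    from dataclasses import dataclass, field

    @dataclass
    class AdPreferences:
        # indicator name -> enabled? Users can inspect and adjust each one.
        indicators: dict[str, bool] = field(default_factory=dict)

        def view(self) -> dict[str, bool]:
            """Show the user every indicator used to target them."""
            return dict(self.indicators)

        def opt_out(self, indicator: str) -> None:
            """Let the user disable an indicator they object to."""
            if indicator in self.indicators:
                self.indicators[indicator] = False

        def active(self) -> list[str]:
            """Indicators the ad engine may still use."""
            return [k for k, v in self.indicators.items() if v]

    prefs = AdPreferences({"recent_travel_search": True, "home_region": True})
    prefs.opt_out("home_region")
    print(prefs.active())  # ['recent_travel_search']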

"It will help to avoid individuals questioning your logic if you give them control of the triggers that matter to them," she said. "If people are looking at a black box, it won't be acceptable for European regulators."

Goodman and Flaxman say that work is already underway to make algorithms more easily subject to inspection. And they remain optimistic that technical code can coexist with the legal code.

"We believe that, properly applied, algorithms can not only make more accurate predictions, but offer increased transparency and fairness over their human counterparts," they conclude.
