pith. machine review for the scientific record.

arxiv: 1606.08813 · v3 · submitted 2016-06-28 · 📊 stat.ML · cs.CY · cs.LG

Recognition: unknown

European Union regulations on algorithmic decision-making and a "right to explanation"

Authors on Pith no claims yet
classification 📊 stat.ML · cs.CY · cs.LG
keywords explanation · will · algorithms · algorithmic · decision-making · european · right · take
0 comments
Original abstract

We summarize the potential impact that the European Union's new General Data Protection Regulation will have on the routine use of machine learning algorithms. Slated to take effect as law across the EU in 2018, it will restrict automated individual decision-making (that is, algorithms that make decisions based on user-level predictors) which "significantly affect" users. The law will also effectively create a "right to explanation," whereby a user can ask for an explanation of an algorithmic decision that was made about them. We argue that while this law will pose large challenges for industry, it highlights opportunities for computer scientists to take the lead in designing algorithms and evaluation frameworks which avoid discrimination and enable explanation.

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Towards A Rigorous Science of Interpretable Machine Learning

    stat.ML 2017-02 unverdicted novelty 6.0

    The authors define interpretability for machine learning, specify when it is required, and propose a taxonomy for its rigorous evaluation while identifying open research questions.