The EU General Data Protection Regulation (GDPR), which came into force on 25 May 2018, gives individuals the right to information about how their own data is used in fully automated algorithmic decision-making (ADM) systems, i.e. systems that operate without human intervention. However, these individual rights to information under the GDPR cannot uncover systematic deficiencies or discrimination against entire groups of people.
Bank lending, the pre-selection of job applications, police work – algorithms evaluate people and make decisions about them, yet so far almost without social oversight. It is not known which algorithmic decision-making systems are used for which purposes and to what effect. One example of a fully automated ADM system with no human participation in decision-making is the pre-selection of job applications. In some companies, software programmes screen CVs and sort out applicants without an employee ever having looked at their documents. The GDPR ensures that an unsuccessful applicant can find out which of their data was decisive for the negative decision. However, for most systems in which people are involved in the decision-making process, the ADM-specific obligations of the GDPR to provide information and explanations do not apply. This is shown by an analysis commissioned by the Bertelsmann Stiftung.