Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679
http://ec.europa.eu/newsroom/just/document.cfm?doc_id=47963
Example
A data broker collects data from different public and private sources, either on behalf of its clients or
for its own purposes. The data broker compiles the data to develop profiles on the individuals and
places them into segments. It sells this information to companies who wish to improve the targeting of
their goods and services. The data broker carries out profiling by placing a person into a certain
category according to their interests.
Whether or not there is automated decision-making as defined in Article 22(1) will depend upon the
circumstances.
Example
Imposing speeding fines purely on the basis of evidence from speed cameras is an automated decision-making process that does not necessarily involve profiling.
It would, however, become a decision based on profiling if the driving habits of the individual were
monitored over time and, for example, the amount of the fine imposed is the outcome of an assessment
involving other factors, such as whether the speeding is a repeat offence or whether the driver has had
other recent traffic violations.
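To make the distinction concrete, the following minimal sketch (written in Python, with invented speed limits, fine amounts and history rules that are purely hypothetical) contrasts a fine computed solely from the camera reading with one that also assesses the driver’s monitored history:

# Hypothetical illustration only: thresholds and amounts are invented for this example.

def fine_without_profiling(measured_speed_kmh, speed_limit_kmh):
    """Purely automated decision: the fine depends only on the camera reading."""
    excess = measured_speed_kmh - speed_limit_kmh
    if excess <= 0:
        return 0.0
    return 50.0 + 10.0 * excess  # fixed tariff per km/h over the limit

def fine_with_profiling(measured_speed_kmh, speed_limit_kmh,
                        prior_speeding_offences, recent_violations):
    """Decision based on profiling: the driver's monitored history shapes the outcome."""
    base = fine_without_profiling(measured_speed_kmh, speed_limit_kmh)
    if base == 0.0:
        return 0.0
    # The penalty rises with an assessment of the individual's behaviour over time.
    multiplier = 1.0 + 0.5 * prior_speeding_offences + 0.25 * recent_violations
    return base * multiplier

print(fine_without_profiling(68, 50))          # same fine for every driver in these circumstances
print(fine_with_profiling(68, 50,
                          prior_speeding_offences=2,
                          recent_violations=1))  # higher fine for this particular driver

In the first function every driver caught in the same circumstances receives the same fine; in the second, the outcome depends on an evaluation of the individual’s past conduct, which is what brings the decision within profiling.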
Example
An automated process produces what is in effect a recommendation concerning a data subject. If a
human being reviews and takes account of other factors in making the final decision, that decision
would not be ‘based solely’ on automated processing.
Example
Hypothetically, a credit card company might reduce a customer’s card limit, based not on that
customer’s own repayment history, but on non-traditional credit criteria, such as an analysis of other
customers living in the same area who shop at the same stores.
This could mean that someone is deprived of opportunities based on the actions of others.
In a different context, using these types of characteristics might have the advantage of extending credit
to those without a conventional credit history, who would otherwise have been denied it.
Example
A controller uses credit scoring to assess and reject an individual’s loan application. The score may
have been provided by a credit reference agency, or calculated directly based on information held by
the controller.
Regardless of the source (and information on the source must be provided to the data subject under
Article 14(2)(f) where the personal data have not been obtained from the data subject), if the
controller is reliant upon this score it must be able to explain the score and the rationale behind it to the data subject.
The controller explains that this process helps them make fair and responsible lending decisions. It
provides details of the main characteristics considered in reaching the decision, the source of this
information and its relevance. This may include, for example:
• the information provided by the data subject on the application form;
• information about previous account conduct, including any payment arrears; and
• official public records information such as fraud record information and insolvency records.
The controller also includes information to advise the data subject that the credit scoring methods used
are regularly tested to ensure they remain fair, effective and unbiased.
The controller provides contact details for the data subject to request that any declined decision be
reconsidered, in line with the provisions of Article 22(3).
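As a purely hypothetical sketch of how a controller might keep such a decision explainable, the Python fragment below (with invented characteristics, weights and threshold that do not describe any real scoring model) returns the score together with the contribution of each main characteristic, so that the rationale can be set out for the data subject:

# Hypothetical illustration only: characteristics, weights and threshold are invented.

from dataclasses import dataclass

@dataclass
class Application:
    declared_income: float        # information provided on the application form
    months_in_arrears: int        # previous account conduct
    has_insolvency_record: bool   # official public records information

def score_application(app):
    """Return a score and the contribution of each main characteristic,
    so the controller can explain the rationale of the decision."""
    contributions = {
        "declared_income": min(app.declared_income / 1000.0, 40.0),
        "months_in_arrears": -5.0 * app.months_in_arrears,
        "insolvency_record": -30.0 if app.has_insolvency_record else 0.0,
    }
    return sum(contributions.values()), contributions

score, reasons = score_application(Application(32000.0, 3, False))
decision = "approve" if score >= 20.0 else "decline"
print(decision, round(score, 1), reasons)  # the breakdown supports the explanation

Keeping the per-characteristic breakdown alongside the score is one way of supporting both the explanation given to the data subject and the reconsideration of a declined decision under Article 22(3).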
Example
An insurance company uses an automated decision-making process to set motor insurance premiums
based on monitoring customers’ driving behaviour. To illustrate the significance and envisaged
consequences of the processing, it explains that dangerous driving may result in higher insurance
payments and provides an app comparing fictional drivers, including one with dangerous driving
habits such as fast acceleration and last-minute braking.
It uses graphics to give tips on how to improve these habits and consequently how to lower insurance
premiums.
Example
Some insurers offer insurance rates and services based on an individual’s driving behaviour. Elements
taken into account in these cases could include the distance travelled, the time spent driving and the
journey undertaken as well as predictions based on other data collected by the sensors in a (smart) car.
The data collected is used for profiling to identify bad driving behaviour (such as fast acceleration,
sudden braking, and speeding). This information can be cross-referenced with other sources (for
example the weather, traffic, type of road) to better understand the driver’s behaviour.
The controller must ensure that they have a lawful basis for this type of processing. The controller
must also provide the data subject with information about the collected data, the existence of
automated decision-making, the logic involved, and the significance and envisaged consequences of
such processing.
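The following sketch (in Python, using invented acceleration thresholds and a hypothetical weather adjustment rather than any insurer’s actual logic) illustrates how sensor samples might be profiled into ‘bad driving’ events and cross-referenced with context such as a wet road:

# Hypothetical illustration only: the thresholds and the weather adjustment are invented.

HARSH_ACCEL_MS2 = 3.0    # assumed threshold for "fast acceleration"
HARSH_BRAKE_MS2 = -4.0   # assumed threshold for "sudden braking"

def count_harsh_events(accelerations_ms2, wet_road):
    """Count samples the profile would treat as bad driving behaviour,
    cross-referenced with context (here, a wet road tightens the braking threshold)."""
    brake_threshold = HARSH_BRAKE_MS2 + (1.0 if wet_road else 0.0)
    events = 0
    for a in accelerations_ms2:
        if a >= HARSH_ACCEL_MS2 or a <= brake_threshold:
            events += 1
    return events

samples = [0.5, 3.4, -1.2, -4.5, 2.9, -3.2]
print(count_harsh_events(samples, wet_road=False))   # 2 events on a dry road
print(count_harsh_events(samples, wet_road=True))    # 3 events: braking at -3.2 now counts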
Example
A data broker sells consumer profiles to financial companies without consumer permission or
knowledge of the underlying data. The profiles place consumers into categories (with titles such as
“Rural and Barely Making It,” “Ethnic Second-City Strugglers,” and “Tough Start: Young Single
Parents”) or “score” them, focusing on consumers’ financial vulnerability. The financial companies
offer these consumers payday loans and other “non-traditional” financial services (high-cost loans and
other financially risky products).
Example
Some mobile applications provide location services allowing the user to find nearby restaurants
offering discounts. However, the data collected is also used to build a profile on the data subject for
marketing purposes – to identify their food preferences, or lifestyle in general. The data subject expects
their data will be used to find restaurants, but not to receive adverts for pizza delivery just because the
app has identified that they arrive home late. This further use of the location data may not be
compatible with the purposes for which it was collected in the first place, and may thus require the
consent of the individual concerned.
Example
A user buys some items from an on-line retailer. In order to fulfil the contract, the retailer must
process the user’s credit card information for payment purposes and the user’s address to deliver the
goods. Completion of the contract is not dependent upon building a profile of the user’s tastes and
lifestyle choices based on his or her visits to the website. Even if profiling is specifically mentioned in
the small print of the contract, this fact alone does not make it ‘necessary’ for the performance of the
contract.
Example
A data broker undertakes profiling of personal data. In line with their Article 13 and 14 obligations,
the data broker should inform the individual about the processing, including whether it intends to share
the profile with any other organisations. The data broker should also present separately details of the
right to object under Article 21(1).
The data broker shares the profile with another company. This company uses the profile to send the
individual direct marketing.
The company should inform the individual (Article 14(1)(c)) about the purposes for using this profile,
and from what source they obtained the information (Article 14(2)(f)). The company must also advise the
data subject about their right to object to processing, including profiling, for direct marketing purposes
(Article 21(2)).
The data broker and the company should allow the data subject the right to access the information
used (Article 15), to correct any erroneous information (Article 16) and, in certain circumstances, to erase
the profile or personal data used to create it (Article 17). The data subject should also be given
information about their profile, for example in which ‘segments’ or ‘categories’ they are placed.
If the company uses the profile as part of a solely automated decision-making process with legal or
similarly significant effects on the data subject, the company is the controller subject to the Article 22
provisions. (This does not exclude the data broker from Article 22 if the processing meets the relevant
threshold.)
Example
A local surgery’s computer system places an individual into a group that is most likely to get heart
disease. This ‘profile’ is not necessarily inaccurate even if he or she never suffers from heart disease.
The profile merely states that he or she is more likely to get it. That may be factually correct as a
matter of statistics.
Nevertheless, the data subject has the right, taking into account the purpose of the processing, to
provide a supplementary statement. In the above scenario, this could be based, for example, on a more
advanced medical computer system (and statistical model) that carries out more detailed examinations
and factors in additional data than the system at the local surgery, which has more limited capabilities.