Impact of Italian profiling ruling on digital health


Anne Sophie Dil

Co-founder of NAALA

Published on 25 August 2021

The Italian data protection authority (DPA), the Garante per la protezione dei dati personali (in short: the Garante), issued a €2.5 million fine to the online meal delivery service Deliveroo for violating the General Data Protection Regulation (GDPR). This case follows an earlier one in which the Garante fined the online meal delivery service Foodinho €2.6 million for violating the GDPR. Why did the two meal delivery services receive such high fines? And what lessons can be learned from this for, say, providers of digital healthcare solutions?

Foodinho is the Italian branch of the Glovo Group, a Spanish courier service that (mainly) delivers meals. Meal deliverers are called riders, because they typically deliver orders by bicycle.

In cooperation with the Spanish DPA, the Garante investigated the platform’s modus operandi. The investigation was initiated because there were suspicions that Foodinho was using algorithms to closely monitor riders, without being transparent about this to the riders.

The investigation found that riders were ranked based on customer and partner reviews. Riders with a low review score were placed lower on the priority list and were therefore less likely to be offered orders (and thus: work). This ranking was done by the algorithm, without any human intervention. Hence, riders could not challenge the decision to offer them fewer job opportunities.

Nonconformities were noted on several topics:

  • Insufficiently transparent information was provided to the riders; for example, it was not mentioned that automated decision-making was used for offering orders.
  • Insufficient distinction was made between the retention periods of the different types of personal data collected for different purposes, and the retention periods were not (all) appropriate for the purposes for which the data had been collected.
  • The confidentiality, integrity, availability, and resilience of the systems could not be ensured on an ongoing basis, because a significant number of system administrators had unnecessary access to personal data.
  • No Data Protection Impact Assessment (DPIA) had been carried out, even though one was required given the innovative nature of the technology used and, in particular, the use of automated processing (including profiling) that could have significant consequences for the riders.
  • Insufficient appropriate measures had been taken “to safeguard the rights and freedoms and legitimate interests” of the riders, which at a minimum include the right to human intervention in automated decision-making.

In addition to the €2.6 million fine, the Italian DPA has imposed a number of corrective measures on Foodinho to ensure that the company will act in line with the GDPR after all.

Deliveroo Italy (part of Roofoods Ltd) uses a centralized computer system to book riders into predetermined time slots until those slots are fully booked. Each rider (approximately 8,000 at the time of the inspection) is assigned a factor in the system. This factor is determined based on:

  • Availability of the riders, i.e., effective participation in the “super peak” services (from 7 p.m. to 9 p.m. on Friday, Saturday, and Sunday),
  • Reliability of the riders, i.e., effective participation in the booked services, and
  • Speed of the delivery.

The higher the resulting factor, the earlier the rider is given access to booking by the computer system. As in the Foodinho case, this can lead to riders with a lower score being offered fewer orders (and thus: work): the available time slots run out as the riders with priority access to the weekly calendar express their preferences, gradually reducing the remaining access to shifts for the other riders.
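
To make this mechanism concrete, below is a minimal, hypothetical sketch of how such a priority factor could be computed and used to order riders for slot booking. The weights, field names, and formula are assumptions made purely for illustration; the actual algorithm used by Deliveroo has not been published.

```python
from dataclasses import dataclass

@dataclass
class Rider:
    name: str
    super_peak_participation: float  # share of "super peak" sessions actually worked (0-1)
    booked_participation: float      # share of booked sessions actually worked (0-1)
    delivery_speed: float            # normalized delivery-speed score (0-1)

# Hypothetical weights -- the real formula has not been disclosed.
WEIGHTS = {"availability": 0.4, "reliability": 0.4, "speed": 0.2}

def priority_factor(r: Rider) -> float:
    """Combine the three criteria named in the decision into a single score."""
    return (
        WEIGHTS["availability"] * r.super_peak_participation
        + WEIGHTS["reliability"] * r.booked_participation
        + WEIGHTS["speed"] * r.delivery_speed
    )

def booking_order(riders: list[Rider]) -> list[Rider]:
    """Riders with a higher factor get earlier access to the weekly calendar;
    lower-scored riders may find the attractive slots already taken."""
    return sorted(riders, key=priority_factor, reverse=True)

if __name__ == "__main__":
    riders = [Rider("A", 0.9, 0.95, 0.8), Rider("B", 0.2, 0.60, 0.7)]
    for rank, r in enumerate(booking_order(riders), start=1):
        print(rank, r.name, round(priority_factor(r), 2))
```

Note that nothing in this ordering step involves a human check: the score alone determines who gets early access to shifts, which is exactly the kind of fully automated, consequential decision the Garante took issue with.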

During an (ex officio) inspection at Deliveroo, the Garante once again had concerns about the lack of transparency regarding the use of the algorithm. In addition, the Garante found that an unnecessary amount of personal data was being collected, such as the riders’ location, which was recorded every 12 seconds. Furthermore, some personal data were stored for an unnecessarily long time; the riders’ routes, for example, were kept for six months.

Deliveroo has been given 60 days to correct the violations. For example, appropriate measures must be identified for periodically verifying the correctness and accuracy of the results of the algorithm, in order to prevent errors and discrimination on any basis. On top of this comes the €2.5 million fine for the current violations of the GDPR.

Article 22 of the GDPR provides that anyone whose personal data is processed has the right not to be subject to a decision based solely on automated processing if that decision significantly affects him or her. Put more simply, decision-making based solely on the automated processing of personal data is, in principle, not allowed.

As always, the law provides exceptions:

  • if the decision is necessary for entering into or performing a contract,
  • if the decision is authorized by a legal provision (e.g., for the purpose of monitoring and preventing tax fraud and evasion), or
  • if the decision is based on the explicit consent of the person to whom the decision relates.

For all exceptions, the condition is that the rights, freedoms, and legitimate interests of the person to whom the decision and personal data relate must be protected. This includes at least the right to have a decision reviewed by a human being before it is final (human intervention) and the right to challenge a decision. The latter requires that the person to whom the decision applies is informed that automated decision-making is used, what it entails, and what consequences it can have for them.

Automated decision-making is the process of making a decision by automated means, without human intervention. It often involves profiling, but it does not have to.

Profiling is the automated processing of personal data in which certain personal aspects of a natural person are evaluated in order to analyze or predict, for example, their job performance, reliability, or location.

In both Foodinho’s and Deliveroo’s cases, profiling was involved. The riders were automatically given a score that ranked them in order of priority for booking the time slots set by Foodinho and Deliveroo, respectively, and thus for being offered orders. No one verified this score, ranking, or decision before it was put into effect. According to the Garante, this could create discrimination against ‘subordinate’ riders, reducing their employment opportunities. Such consequences are considered significant, which means that the riders were subjected to automated decision-making that significantly affected them.
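
As an illustration of the safeguard that was missing, here is a minimal sketch of how a human-review step could sit between an automated ranking decision and its effect. The names, structure, and review logic are hypothetical and are not drawn from either company’s systems or from the Garante’s orders; the sketch only illustrates the two safeguards mentioned above: informing the affected person and having a human review the decision before it becomes final.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class AutomatedDecision:
    rider_id: str
    proposed_rank: int
    rationale: str  # which inputs drove the score, so the rider can understand and challenge it

def apply_with_human_review(
    decision: AutomatedDecision,
    reviewer_approves: Callable[[AutomatedDecision], bool],
    notify_rider: Callable[[AutomatedDecision], None],
) -> bool:
    """Put an automated decision into effect only after notifying the affected
    rider and obtaining approval from a human reviewer."""
    notify_rider(decision)            # transparency: the rider learns that, and why, a decision was made
    if reviewer_approves(decision):   # human intervention before the decision is final
        return True                   # the decision may take effect
    return False                      # the decision is rejected and must be reconsidered

if __name__ == "__main__":
    d = AutomatedDecision("rider-42", proposed_rank=180, rationale="low super-peak participation")
    effected = apply_with_human_review(
        d,
        reviewer_approves=lambda dec: dec.proposed_rank < 200,  # stand-in for a real human check
        notify_rider=lambda dec: print(f"Notified {dec.rider_id}: {dec.rationale}"),
    )
    print("Decision applied:", effected)
```

For such a step to count as meaningful human intervention under the GDPR, the reviewer must have the authority and competence to actually change the outcome; a purely symbolic sign-off is not enough.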

The GDPR is a European Regulation. This means that its provisions apply equally throughout the European Economic Area. However, it does not stop there: non-EU organizations are also affected by the GDPR if they process personal data of individuals in the EU.

Because of this broad scope of the GDPR, the rulings of the Italian DPA are relevant well beyond Italy. The Garante gives an interpretation of (among other things) Article 22 of the GDPR, and this interpretation should be considered by all organizations processing personal data under the GDPR. DPAs in other countries are, after all, likely to adopt it.

In the healthcare sector, particularly sensitive personal data is handled. Personal data relating to health must be especially well protected, because a leak, loss, or unavailability of this data can have far-reaching consequences for the persons to whom it relates. Moreover, (automated) decisions based on data in healthcare can themselves have far-reaching consequences, especially if those decisions relate to the diagnosis or treatment of a patient.

Article 22 of the GDPR therefore emphasizes that health data may only be used for automated decision-making if one of the aforementioned exceptions applies (conclusion of a contract, authorization by a legal provision, or explicit consent of the data subject). On top of that:

  • the person to whom the personal data relate must have given (additional) explicit consent to the use of his or her health data for automated decision-making, or
  • there must be a substantial public interest, such as national security or the protection of public health.

The Foodinho and Deliveroo cases emphasize the importance of transparency and proper disclosure of information to affected individuals. This is particularly important when sensitive information such as health data is used.

Profiling and automated decision-making can be particularly useful, including in healthcare. By using technology and analyzing data, faster and more consistent decisions can be made. When these decisions affect the data subject, for example when a patient’s treatment is adjusted, it is necessary to ensure the safety of the data subject and the security of their data.

Are you curious about the impact of the Italian DPA on your activities? Or would you like to know how you can organize your (automatic) data processing in a safe and compliant way? Please feel free to contact us.

Please note that all details and listings do not claim to be complete, are without guarantee and are for information purposes only. Changes in legal or regulatory requirements may occur at short notice, which we cannot reflect on a daily basis. 

